OpenCourseWare Resources for Advanced High School Study
ERIC Educational Resources Information Center
Carson, Steve
2008-01-01
In 2000, the Massachusetts Institute of Technology (MIT) faculty first proposed putting the course materials from all 1,800 MIT classes online, free of charge. The idea behind MIT OpenCourseWare (OCW) was to use the Internet for more than just distance learning. When MIT began placing the course materials online in 2002 and 2003, the audience…
The OpenCourseWare Story: New England Roots, Global Reach
ERIC Educational Resources Information Center
Carson, Stephen
2008-01-01
The OpenCourseWare movement has its roots in New England. The concept emerged in 2000 at Massachusetts Institute of Technology (MIT) where then-President Charles Vest charged a faculty committee with answering two questions: "How is the Internet going to change education?" and "What should MIT do about it?" MIT moved quickly to…
OpenCourseWare and Open Educational Resources: The Next Big Thing in Technology-Enhanced Education?
ERIC Educational Resources Information Center
Caudill, Jason G.
2012-01-01
OpenCourseWare (OCW) and Open Educational Resources (OER) are two new and closely related educational technologies. Both provide open access to learning materials for students and instructors via the Internet. These are for the moment still very young technologies. While they have grown dramatically in just ten years there is still relatively…
SeqWare Query Engine: storing and searching sequence data in the cloud.
O'Connor, Brian D; Merriman, Barry; Nelson, Stanley F
2010-12-21
Since the introduction of next-generation DNA sequencers, the rapid increase in sequencer throughput, and the associated drop in costs, has resulted in more than a dozen human genomes being resequenced over the last few years. These efforts are merely a prelude to a future in which genome resequencing will be commonplace for both biomedical research and clinical applications. The dramatic increase in sequencer output strains all facets of computational infrastructure, especially databases and query interfaces. The advent of cloud computing, and a variety of powerful tools designed to process petascale datasets, provides a compelling solution to these ever-increasing demands. In this work, we present the SeqWare Query Engine, which has been created using modern cloud computing technologies and designed to support databasing information from thousands of genomes. Our backend implementation was built using the highly scalable, NoSQL HBase database from the Hadoop project. We also created a web-based frontend that provides both a programmatic and an interactive query interface and integrates with widely used genome browsers and tools. Using the query engine, users can load and query variants (SNVs, indels, translocations, etc.) with a rich level of annotations including coverage and functional consequences. As a proof of concept we loaded several whole-genome datasets including the U87MG cell line. We also used a glioblastoma multiforme tumor/normal pair to both profile performance and provide an example of using the Hadoop MapReduce framework within the query engine. This software is open source and freely available from the SeqWare project (http://seqware.sourceforge.net). The SeqWare Query Engine provided an easy way to make the U87MG genome accessible to programmers and non-programmers alike. This enabled a faster and more open exploration of results, quicker tuning of parameters for heuristic variant calling filters, and a common data interface to simplify development of analytical tools. The range of data types supported, the ease of querying and integrating with existing tools, and the robust scalability of the underlying cloud-based technologies make the SeqWare Query Engine a natural fit for storing and searching ever-growing genome sequence datasets.
SeqWare Query Engine: storing and searching sequence data in the cloud
2010-01-01
Background: Since the introduction of next-generation DNA sequencers, the rapid increase in sequencer throughput, and the associated drop in costs, has resulted in more than a dozen human genomes being resequenced over the last few years. These efforts are merely a prelude to a future in which genome resequencing will be commonplace for both biomedical research and clinical applications. The dramatic increase in sequencer output strains all facets of computational infrastructure, especially databases and query interfaces. The advent of cloud computing, and a variety of powerful tools designed to process petascale datasets, provides a compelling solution to these ever-increasing demands. Results: In this work, we present the SeqWare Query Engine, which has been created using modern cloud computing technologies and designed to support databasing information from thousands of genomes. Our backend implementation was built using the highly scalable, NoSQL HBase database from the Hadoop project. We also created a web-based frontend that provides both a programmatic and an interactive query interface and integrates with widely used genome browsers and tools. Using the query engine, users can load and query variants (SNVs, indels, translocations, etc.) with a rich level of annotations including coverage and functional consequences. As a proof of concept we loaded several whole-genome datasets including the U87MG cell line. We also used a glioblastoma multiforme tumor/normal pair to both profile performance and provide an example of using the Hadoop MapReduce framework within the query engine. This software is open source and freely available from the SeqWare project (http://seqware.sourceforge.net). Conclusions: The SeqWare Query Engine provided an easy way to make the U87MG genome accessible to programmers and non-programmers alike. This enabled a faster and more open exploration of results, quicker tuning of parameters for heuristic variant calling filters, and a common data interface to simplify development of analytical tools. The range of data types supported, the ease of querying and integrating with existing tools, and the robust scalability of the underlying cloud-based technologies make the SeqWare Query Engine a natural fit for storing and searching ever-growing genome sequence datasets. PMID:21210981
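The two records above describe storing genomic variants in HBase and querying them with coverage and functional-consequence annotations. The sketch below is only an illustration of that pattern using the happybase Python client; the table name, row-key layout and column families are assumptions made for this example, not SeqWare's actual schema.

```python
# Illustrative sketch only: a variant store on HBase, loosely in the spirit of the
# SeqWare Query Engine abstract. Table and column names are invented for this example.
import happybase

connection = happybase.Connection('hbase-master.example.org')  # hypothetical host
table = connection.table('variants')

# Store one SNV keyed by genome + chromosome + zero-padded position so rows sort by coordinate.
row_key = b'u87mg:chr7:00055249071'
table.put(row_key, {
    b'call:ref': b'T',
    b'call:alt': b'G',
    b'ann:consequence': b'missense',   # functional consequence annotation
    b'ann:coverage': b'42',            # read depth at this site
})

# Range-scan a region of chromosome 7 and keep only well-covered variants.
for key, data in table.scan(row_start=b'u87mg:chr7:00055000000',
                            row_stop=b'u87mg:chr7:00056000000'):
    if int(data[b'ann:coverage']) >= 20:
        print(key, data[b'call:ref'], data[b'call:alt'], data[b'ann:consequence'])
```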
Configuring a Context-Aware Middleware for Wireless Sensor Networks
Gámez, Nadia; Cubo, Javier; Fuentes, Lidia; Pimentel, Ernesto
2012-01-01
In the Future Internet, applications based on Wireless Sensor Networks will have to support reconfiguration with minimum human intervention, depending on dynamic context changes in their environment. These situations create a need for building these applications as adaptive software and including techniques that allow context acquisition and decisions about adaptation. However, contexts tend to be made up of complex information acquired from heterogeneous devices and user characteristics, making them difficult to manage. So, instead of building context-aware applications from scratch, we propose to use FamiWare, a family of middleware for Ambient Intelligence specifically designed to be aware of contexts in sensor and smartphone devices. It provides both several monitoring services to acquire contexts from devices and users, and a context-awareness service to analyze and detect context changes. However, the current version of FamiWare does not allow the automatic incorporation of new contexts into the FamiWare family. To overcome this shortcoming, in this work, we first present how to model the context using a metamodel to define the contexts that must be taken into account in an instantiation of FamiWare for a certain Ambient Intelligence system. Then, to configure a new context-aware version of FamiWare and to generate code ready to install within heterogeneous devices, we define a mapping that automatically transforms metamodel elements defining contexts into elements of the FamiWare family, and we also use the FamiWare configuration process to customize the new context-aware variant. Finally, we evaluate the benefits of our process, verifying both that the new version of the middleware works as expected and that it manages contexts in an efficient way. PMID:23012505
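As a rough illustration of the mapping idea in the abstract above (context definitions transformed into middleware configuration), the following sketch maps simple context elements onto monitoring and context-awareness service entries. The class and service names are invented for illustration and do not reflect FamiWare's real metamodel or API.

```python
# Minimal sketch: context definitions (a stand-in for a context metamodel) are mapped
# onto middleware service entries that configure a context-aware variant.
from dataclasses import dataclass

@dataclass
class ContextElement:
    name: str          # e.g. "room_temperature"
    source: str        # sensing device that provides the raw data
    change_rule: str   # condition that counts as a context change

def map_to_services(contexts):
    """Transform each context element into a monitoring + awareness service entry."""
    configuration = []
    for ctx in contexts:
        configuration.append({'service': 'MonitoringService',
                              'device': ctx.source,
                              'context': ctx.name})
        configuration.append({'service': 'ContextAwarenessService',
                              'context': ctx.name,
                              'notify_when': ctx.change_rule})
    return configuration

contexts = [ContextElement('room_temperature', 'sensor_node_3', 'value > 30'),
            ContextElement('user_location', 'smartphone_gps', 'moved > 50m')]
for entry in map_to_services(contexts):
    print(entry)
```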
Intelligent Middle-Ware Architecture for Mobile Networks
NASA Astrophysics Data System (ADS)
Rayana, Rayene Ben; Bonnin, Jean-Marie
Recent advances in the electronic and automotive industries as well as in wireless telecommunication technologies have drawn a new picture in which each vehicle becomes "fully networked". Multiple stakeholders (network operators, drivers, car manufacturers, service providers, etc.) will participate in this emerging market, which could grow following various models. To free the market from technical constraints, it is important to return to the basics of the Internet, i.e., providing on-board devices with fully operational Internet connectivity (IPv6).
NASA Astrophysics Data System (ADS)
Harrington, Philip S.
2002-05-01
Praise for the Second Edition of Star Ware "Star Ware is still a tour de force that any experienced amateur will find invaluable, and which hardware-minded beginners will thoroughly enjoy." -Robert Burnham, Sky & Telescope magazine "Star Ware condenses between two covers what would normally take a telescope buyer many months to accumulate." -John Shibley, Astronomy magazine Now more than ever, the backyard astronomer has a dazzling array of choices when it comes to telescope shopping, which can make choosing just the right sky-watching equipment a formidable challenge. In this revised and updated edition of Star Ware, the essential guide to buying astronomical equipment, award-winning astronomy writer Philip Harrington does the work for you, analyzing and exploring today's astronomy market and offering point-by-point comparisons of everything you need. Whether you're an experienced amateur astronomer or just getting started, Star Ware, Third Edition will prepare you to explore the farthest reaches of space with: extensive, expanded reviews of leading models and accessories, including dozens of new products, to help you buy smart…
NAFFS: network attached flash file system for cloud storage on portable consumer electronics
NASA Astrophysics Data System (ADS)
Han, Lin; Huang, Hao; Xie, Changsheng
Cloud storage technology has become a research hotspot in recent years, but existing cloud storage services are mainly designed for data storage needs with a stable, high-speed Internet connection. Mobile Internet connections are often unstable and relatively slow. These characteristics of the mobile Internet limit the use of cloud storage in portable consumer electronics. The Network Attached Flash File System (NAFFS) presents the idea of using the portable device's built-in NAND flash memory as a front-end cache for a virtualized cloud storage device. Modern Internet-connected portable devices have more than 1 GB of built-in NAND flash, which is quite enough for daily data storage. The data transfer rate of a NAND flash device is much higher than that of mobile Internet connections [1], and its non-volatile nature makes it well suited as a cache for Internet cloud storage on portable devices, which often have an unstable power supply and intermittent Internet connectivity. In the present work, NAFFS is evaluated with several benchmarks, and its performance is compared with traditional network attached file systems, such as NFS. Our evaluation results indicate that NAFFS achieves an average access speed of 3.38 MB/s, which is about three times faster than directly accessing cloud storage over a mobile Internet connection, and offers a more stable interface than directly using a cloud storage API. Unstable Internet connections and sudden power-off conditions are tolerated, and no cached data is lost in such situations.
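The NAFFS abstract above rests on a write-back caching idea: writes go to local NAND flash first and are synchronized to the cloud when the mobile link allows. The following is a minimal, assumption-laden sketch of such a cache in Python; the upload callable stands in for whatever cloud storage API is used and is not part of NAFFS itself.

```python
# Toy write-back cache: writes land in local flash immediately and are pushed to cloud
# storage only when the (possibly intermittent) connection is available.
import os, queue

class FlashBackedCache:
    def __init__(self, cache_dir, upload_to_cloud):
        self.cache_dir = cache_dir
        self.upload = upload_to_cloud          # placeholder for a cloud-storage API call
        self.pending = queue.Queue()           # files written locally but not yet synced
        os.makedirs(cache_dir, exist_ok=True)

    def write(self, name, data: bytes):
        path = os.path.join(self.cache_dir, name)
        with open(path, 'wb') as f:            # durable local copy first
            f.write(data)
            f.flush()
            os.fsync(f.fileno())               # survive sudden power loss
        self.pending.put(path)

    def sync(self, online: bool):
        """Drain the pending queue whenever the mobile link happens to be up."""
        while online and not self.pending.empty():
            path = self.pending.get()
            with open(path, 'rb') as f:
                self.upload(os.path.basename(path), f.read())
```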
Research on Influence of Cloud Environment on Traditional Network Security
NASA Astrophysics Data System (ADS)
Ming, Xiaobo; Guo, Jinhua
2018-02-01
Cloud computing is a symbol of the progress of the modern information network; it provides a great deal of convenience to Internet users, but it also exposes them to considerable risk. Moreover, one of the main reasons Internet users choose cloud computing is its strong network security performance, which is also the cornerstone of cloud computing applications. This paper briefly explores the impact of the cloud environment on traditional network security and puts forward corresponding solutions.
On Study of Building Smart Campus under Conditions of Cloud Computing and Internet of Things
NASA Astrophysics Data System (ADS)
Huang, Chao
2017-12-01
Cloud computing and the Internet of Things are two new concepts of the information era; although they are defined differently, they are closely related. Building a smart campus by virtue of cloud computing, the Internet of Things and other Internet technologies is a new way to realize the leap-forward development of a campus. Centering on the construction of the smart campus, this paper analyzes and compares the differences between the network of a traditional campus and that of a smart campus, and finally makes proposals on how to build a smart campus from the perspectives of cloud computing and the Internet of Things.
NASA Astrophysics Data System (ADS)
Khan, Kashif A.; Wang, Qi; Luo, Chunbo; Wang, Xinheng; Grecos, Christos
2014-05-01
Mobile cloud computing is gaining worldwide momentum, with ubiquitous on-demand cloud services provided to mobile users by Amazon, Google and others at low capital cost. However, Internet-centric clouds introduce wide area network (WAN) delays that are often intolerable for real-time applications such as video streaming. One promising approach to addressing this challenge is to deploy decentralized mini-cloud facilities known as cloudlets to enable localized cloud services. When supported by local wireless connectivity, a wireless cloudlet is expected to offer low-cost and high-performance cloud services to users. In this work, we implement a realistic framework that comprises both a popular Internet cloud (Amazon Cloud) and a real-world cloudlet (based on Ubuntu Enterprise Cloud (UEC)) for mobile cloud users in a wireless mesh network. We focus on real-time video streaming over the HTTP standard and implement a typical application. We further perform a comprehensive comparative analysis and empirical evaluation of the application's performance when it is delivered over the Internet cloud and the cloudlet respectively. The study quantifies the influence of the two different cloud networking architectures on supporting real-time video streaming. We also enable movement of the users in the wireless mesh network and investigate the effect of user mobility on mobile cloud computing over the cloudlet and the Amazon cloud respectively. Our experimental results demonstrate the advantages of the cloudlet paradigm over its Internet cloud counterpart in supporting the quality of service of real-time applications.
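A simple way to reproduce the kind of comparison described above is to time retrieval of the same HTTP video segment from an Internet cloud endpoint and from a local cloudlet endpoint. The sketch below does exactly that with the Python standard library; both URLs are placeholders rather than addresses from the study.

```python
# Rough sketch: fetch the same HTTP video segment from two endpoints and compare the
# average retrieval time, in the spirit of the cloudlet-vs-Internet-cloud comparison above.
import time
import urllib.request

ENDPOINTS = {
    'internet_cloud': 'http://cloud.example.com/stream/segment_001.ts',  # placeholder URL
    'cloudlet':       'http://10.0.0.2/stream/segment_001.ts',           # placeholder URL
}

def fetch_time(url, tries=5):
    elapsed = []
    for _ in range(tries):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()                      # download the whole segment
        elapsed.append(time.perf_counter() - start)
    return sum(elapsed) / len(elapsed)

for name, url in ENDPOINTS.items():
    print(name, f'{fetch_time(url):.3f} s per segment')
```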
ERIC Educational Resources Information Center
Okopi, Fidel Onjefu; Odeyemi, Olajumoke Janet; Adesina, Adewale
2015-01-01
The study has identified the areas of strengths and weaknesses in the current use of Computer Based Learning (CBL) tools in Open and Distance Learning (ODL) institutions in Nigeria. To achieve these objectives, the following research questions were proposed: (i) What are the computer-based learning tools (software and hardware) that are actually in…
Web N.0, the New Development Trend of Internet
NASA Astrophysics Data System (ADS)
Sun, Zhiguo; Wang, Wensheng
This article analyzes changes in basic Internet theory, the underlying network environment, and user behavior, and examines the development tendencies of some existing Internet products in the future Internet environment. The article also takes up the currently hot concept of cloud computing, demonstrates the relationship between cloud computing and Web 2.0 from the angle of cloud-based end-user applications, and discusses possible killer applications of the future.
Problematic Internet use and physical health.
Kelley, Kevin J; Gruber, Elon M
2013-06-01
Background and aims: A considerable body of literature has emerged over the past two decades assessing the relationship between problematic or addictive use of the Internet and various indices of psychological well-being. Conversely, comparatively little research has assessed the relationship between problematic or addictive use of the Internet and one's physical health. Method: The current study assesses this relationship using a sample of college students (N = 133) who responded online to two questionnaires: the Problematic Internet Use Questionnaire (PIUQ; Demetrovics, Szeredi & Rózsa, 2008) and the SF-36v2 Health Survey (Ware et al., 2008). Results: The findings indicate that problematic Internet use is associated with poorer physical health. These results are consistent with other data that assessed the relationship between these two variables. Furthermore, this relationship supersedes the influence of the number of hours spent online per day. Conclusions: The findings are discussed in terms of the limitations of the study design and conclusions that can be drawn from this preliminary empirical effort.
Issues Affecting Internet Use in Afghanistan and Developing Countries in the Middle East
2003-03-10
…access to and use of information and communication technologies (ICT).[1] This paper examines some recent literature to identify the fundamental…
Intimate partner violence, technology, and stalking.
Southworth, Cynthia; Finn, Jerry; Dawson, Shawndell; Fraser, Cynthia; Tucker, Sarah
2007-08-01
This research note describes the use of a broad range of technologies in intimate partner stalking, including cordless and cellular telephones, fax machines, e-mail, Internet-based harassment, global positioning systems, spyware, video cameras, and online databases. The concept of "stalking with technology" is reviewed, and the need for an expanded definition of cyberstalking is presented. Legal issues and advocacy-centered responses, including training, legal remedies, public policy issues, and technology industry practices, are discussed.
New Technologies and the World Ahead: The Top 20 Plus 5
2011-01-01
Specialized Agent Software Programs. Bots represent the next great milestone in software development. The general deployment of bots is projected to be in… knowledge and areas of interest. Powerful personal-agent programs will search the Internet and its databases based on… language-capable chatbot and avatar interfaces that can control electronic data and also change and manipulate things in the physical world. These…
CyberTerrorism: Cyber Prevention vs Cyber Recovery
2007-12-01
appropriate available security measures (i.e., appropriate levels of anti-spyware, IDS, and antivirus protection software installed) are unaffected by worm attacks… a worm is a form of virus designed to copy itself by utilizing e-mail or other software applications. The main goal of using this technique is… to permeate the network or portions of the Internet with malicious code that will affect the performance of certain software applications or will…
Future internet architecture and cloud ecosystem: A survey
NASA Astrophysics Data System (ADS)
Wan, Man; Yin, Shiqun
2018-04-01
The Internet has gradually become a social infrastructure, but the existing TCP/IP architecture faces many challenges, so future Internet architecture has become a hot research topic. This paper introduces two lines of thinking about future research on Internet architecture, and probes into the future Internet architecture and the cloud ecosystem environment. Finally, we focus on the related research and discuss the basic principles and problems of OpenStack.
ERIC Educational Resources Information Center
Chudnov, Daniel
2010-01-01
Cloud computing is definitely a thing now, but it's not new and it's not even novel. Back when people were first learning about the Internet in the 1990s, every diagram that one saw showing how the Internet worked had a big cloud in the middle. That cloud represented the diverse links, routers, gateways, and protocols that passed traffic around in…
The monitoring and managing application of cloud computing based on Internet of Things.
Luo, Shiliang; Ren, Bin
2016-07-01
Cloud computing and the Internet of Things are two hot topics in the Internet application field. The application of these two new technologies is under intense discussion and research, but much less so in the field of medical monitoring and management. Thus, in this paper, we study and analyze the application of cloud computing and the Internet of Things in the medical field, and we combine the two techniques for medical monitoring and management. The model architecture for a remote monitoring cloud platform of healthcare information (RMCPHI) was established first. Then the RMCPHI architecture was analyzed. Finally, an efficient PSOSAA algorithm was proposed for the medical monitoring and management application of cloud computing. Simulation results showed that our proposed scheme can improve efficiency by about 50%. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
On Using Home Networks and Cloud Computing for a Future Internet of Things
NASA Astrophysics Data System (ADS)
Niedermayer, Heiko; Holz, Ralph; Pahl, Marc-Oliver; Carle, Georg
In this position paper we state four requirements for a Future Internet and sketch our initial concept. The requirements: (1) more comfort, (2) integration of home networks, (3) resources like service clouds in the network, and (4) access anywhere on any machine. The Future Internet needs future quality and future comfort. There need to be new possibilities for everyone. Our focus is on the higher layers and relates to the many overlay proposals; we consider them to run on top of a basic Future Internet core. A new user experience means including all user devices. Home networks and services should be a fundamental part of the Future Internet. Home networks extend access and allow interaction with the environment. Cloud Computing can provide reliable resources beyond local boundaries. For access anywhere, we also need secure storage for data and profiles in the network, in particular for access with non-personal devices (Internet terminal, ticket machine, ...).
NASA Astrophysics Data System (ADS)
Yang, Wei; Hall, Trevor
2012-12-01
The Internet is entering an era of cloud computing to provide more cost-effective, eco-friendly and reliable services to consumer and business users, and the nature of the Internet traffic will undergo a fundamental transformation. Consequently, the current Internet will no longer suffice for serving cloud traffic in metro areas. This work proposes an infrastructure with a unified control plane that integrates simple packet aggregation technology with optical express through the interoperation between IP routers and electrical traffic controllers in optical metro networks. The proposed infrastructure provides flexible, intelligent, and eco-friendly bandwidth on demand for cloud computing in metro areas.
NASA Astrophysics Data System (ADS)
Yang, Wei; Hall, Trevor J.
2013-12-01
The Internet is entering an era of cloud computing to provide more cost effective, eco-friendly and reliable services to consumer and business users. As a consequence, the nature of the Internet traffic has been fundamentally transformed from a pure packet-based pattern to today's predominantly flow-based pattern. Cloud computing has also brought about an unprecedented growth in the Internet traffic. In this paper, a hybrid optical switch architecture is presented to deal with the flow-based Internet traffic, aiming to offer flexible and intelligent bandwidth on demand to improve fiber capacity utilization. The hybrid optical switch is capable of integrating IP into optical networks for cloud-based traffic with predictable performance, for which the delay performance of the electronic module in the hybrid optical switch architecture is evaluated through simulation.
State of the Art of Network Security Perspectives in Cloud Computing
NASA Astrophysics Data System (ADS)
Oh, Tae Hwan; Lim, Shinyoung; Choi, Young B.; Park, Kwang-Roh; Lee, Heejo; Choi, Hyunsang
Cloud computing is now regarded as a social phenomenon that satisfies customers' needs. Arguably, customers' needs and the primary principle of economy (gaining maximum benefit from minimum investment) are reflected in the realization of cloud computing. We live in a connected society flooded with information, and without computers connected to the Internet our daily activities and work would be impossible. Cloud computing is able to provide customers with custom-tailored application software features and user environments based on the customer's needs, by adopting on-demand outsourcing of computing resources through the Internet. It also provides cloud computing users with high-end computing power and expensive application software packages, so that users can access their data and application software located at a remote system. As cloud computing systems are connected to the Internet, the network security issues of cloud computing must be considered before real-world service. In this paper, a survey of and issues on network security in cloud computing are discussed from the perspective of real-world service environments.
Analysis of Laogang energy internet and construction of the cloud platform
NASA Astrophysics Data System (ADS)
Wang, Selan; Nie, Jianwen; Zhang, Daiyue; Li, Xia; Tai, Jun; Yu, Zhaohui; Lu, Yiqi; Xie, Da
2018-02-01
The Laogang solid waste recycling base deals with about 70% of Shanghai's domestic garbage every day. By recycling the garbage, a great amount of energy, including electricity, heat and gas, can be produced. Meanwhile, the base itself consumes much energy as well. Therefore, an energy internet has been designed for the base to analyse the output and usage of the energy so that the energy utilization rate can be enhanced. In addition, a cloud platform has been established based on the three layers of cloud technology: IaaS, PaaS and SaaS. This cloud platform, mainly analysing electricity, will judge from all sides whether the energy has been used suitably and, furthermore, improve the operation of the whole energy internet in the base.
Open Course Ware, Distance Education, and 21st Century Geoscience Education
NASA Astrophysics Data System (ADS)
Connors, M. G.
2010-12-01
Open Course Ware (OCW) allows the highest quality educational materials (including videos of lectures from the best classroom lecturers) to find a wide audience. This audience may include many who wish to obtain credentials for formal study yet who are unable to be campus-based students. This opens a role for formal, credentialed and accredited distance education (DE) to efficiently integrate OCW into DE courses. In this manner, OCW materials can be used to educate credential-seeking students who would not otherwise benefit from them. Modern presentation methods using the Internet and video (including mobile device) technologies may offer pedagogical advantages over even traditional classroom learning. A detailed analysis of the development of Athabasca University’s PHYS 302 Vibrations and Waves course (based mainly on MIT’s OCW), and application of lessons learned to development of PHYS 305 Electromagnetism, is presented. These courses are relevant to the study of geophysics, but examples of GEOL (Geology) courses will also be mentioned, along with a broad overview of OCW resources in Geoscience.
Analysis of the new health management based on health internet of things and cloud computing
NASA Astrophysics Data System (ADS)
Liu, Shaogang
2018-05-01
With the development and application of the Internet of Things and cloud technology in the medical field, a higher level of exploration space is opened up for human health management. By analyzing Internet of Things technology and cloud technology, this paper studies a new form of health management system that fits the current social and technical level, and explores its system architecture, characteristics and applications. The new health management platform based on the Internet of Things and the cloud can achieve real-time monitoring and prediction of human health: information gathered by a variety of sensors is transmitted over wireless networks to the monitoring system, which then, through a software analysis model, gives targeted prevention and treatment measures, achieving real-time, intelligent health management.
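The pipeline sketched in the abstract above (sensor readings flowing to a monitoring system that applies an analysis model and returns targeted measures) can be illustrated with a deliberately small rule-based check. The thresholds and advice strings below are illustrative assumptions only, not the paper's analysis model.

```python
# Tiny rule-based stand-in for the "software analysis model" described above:
# readings from wearable sensors are checked against reference ranges and alerts returned.
THRESHOLDS = {'heart_rate': (50, 110), 'systolic_bp': (90, 140)}  # assumed reference ranges

def analyse(reading):
    """Return a list of (metric, advice) pairs for out-of-range values."""
    alerts = []
    for metric, value in reading.items():
        low, high = THRESHOLDS.get(metric, (float('-inf'), float('inf')))
        if not low <= value <= high:
            alerts.append((metric, f'{metric}={value} outside {low}-{high}, notify clinician'))
    return alerts

reading = {'heart_rate': 128, 'systolic_bp': 135}   # e.g. pushed from a wearable sensor
for metric, advice in analyse(reading):
    print(advice)
```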
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-29
... Stainless Steel Cooking Ware From the Republic of Korea: Final Results of Sunset Reviews and Revocation of... reviews of the antidumping and countervailing duty orders on top of the stove stainless steel cooking ware... the stove stainless steel cooking ware from Korea includes all non-electric cooking ware of stainless...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-14
.... 701- TA-267 and 731-TA-304 (Third Review)] Porcelain-on-Steel Cooking Ware From Taiwan; Top-of-the-Stove Stainless Steel Cooking Ware From Korea AGENCY: United States International Trade Commission...-steel cooking ware from Taiwan and the antidumping and countervailing duty orders on imports of top-of...
Yang, Shu; Qiu, Yuyan; Shi, Bo
2016-09-01
This paper explores methods of building the Internet of Things for regional ECG monitoring, focusing on the implementation of an ECG monitoring center based on a cloud computing platform. It analyzes implementation principles of automatic identification of arrhythmia types. It also studies the system architecture and key techniques of the cloud computing platform, including server load balancing technology, reliable storage of massive numbers of small files, and the implementation of a quick search function.
76 FR 2920 - Porcelain-on-Steel Cooking Ware From China
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-18
... Cooking Ware From China AGENCY: United States International Trade Commission. ACTION: Scheduling of an expedited five-year review concerning the antidumping duty order on porcelain-on-steel cooking ware from... revocation of the antidumping duty order on porcelain-on-steel cooking ware from China would be likely to...
76 FR 12369 - Porcelain-on-Steel Cooking Ware From China
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-07
... Cooking Ware From China Determination On the basis of the record \\1\\ developed in the subject five-year... porcelain-on-steel cooking ware from China would be likely to lead to continuation or recurrence of material... Cooking Ware from China: Investigation No. 731-TA- 298 (Third Review). Issued: February 28, 2011. By order...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-29
... DEPARTMENT OF COMMERCE International Trade Administration [A-583-508] Porcelain-on-Steel Cooking... duty order on porcelain-on-steel cooking ware (POS cooking ware) from Taiwan pursuant to section 751(c... the antidumping duty order on POS cooking ware from Taiwan. DATES: Effective Date: November 22, 2010...
The Unwalled Garden: Growth of the OpenCourseWare Consortium, 2001-2008
ERIC Educational Resources Information Center
Carson, Steve
2009-01-01
This article traces the development of the OpenCourseWare movement, including the origin of the concept at the Massachusetts Institute of Technology (MIT), the implementation of the MIT OpenCourseWare project, and the idea's spread into the global educational community, ultimately resulting in the formation of the OpenCourseWare Consortium. The…
Effect of Iron Oxide and Phase Separation on the Color of Blue Jun Ware Glaze.
Wang, Fen; Yang, Changan; Zhu, Jianfeng; Lin, Ying
2015-09-01
Based on the traditional Jun ware glaze, imitated Jun ware glazes were prepared by adding iron oxide and introducing the phase-separation agent apatite through a four-angle method. The effects of iron oxide content, phase separation and firing temperature on the color of Jun ware glazes were investigated by a neutral-atmosphere experiment, optical microscopy and scanning electron microscopy. The results showed that the colorant, mainly Fe2O3, contributed the blue and cyan colors of Jun ware glaze. The light scattering caused by the small droplets in the phase-separation structure only influenced the shade of the glaze color, intensifying or weakening it, and thus gave the glaze its elegant opal visual effect, but was not the origin of the general blue or cyan colors of Jun ware glaze. In addition, the firing temperature and the basic glaze composition affected the glaze colors to some extent.
Dinh, Thanh; Kim, Younghan; Lee, Hyukjoon
2017-03-01
This paper presents a location-based interactive model of Internet of Things (IoT) and cloud integration (IoT-cloud) for mobile cloud computing applications, in comparison with the periodic sensing model. In the latter, sensing collections are performed without awareness of sensing demands. Sensors are required to report their sensing data periodically regardless of whether or not there are demands for their sensing services. This leads to unnecessary energy loss due to redundant transmission. In the proposed model, IoT-cloud provides sensing services on demand based on the interest and location of mobile users. By taking advantage of the cloud as a coordinator, the sensing scheduling of sensors is controlled by the cloud, which knows when and where mobile users request sensing services. Therefore, when there is no demand, sensors are put into an inactive mode to save energy. Through extensive analysis and experimental results, we show that the location-based model achieves a significant improvement in terms of network lifetime compared to the periodic model.
Dinh, Thanh; Kim, Younghan; Lee, Hyukjoon
2017-01-01
This paper presents a location-based interactive model of Internet of Things (IoT) and cloud integration (IoT-cloud) for mobile cloud computing applications, in comparison with the periodic sensing model. In the latter, sensing collections are performed without awareness of sensing demands. Sensors are required to report their sensing data periodically regardless of whether or not there are demands for their sensing services. This leads to unnecessary energy loss due to redundant transmission. In the proposed model, IoT-cloud provides sensing services on demand based on the interest and location of mobile users. By taking advantage of the cloud as a coordinator, the sensing scheduling of sensors is controlled by the cloud, which knows when and where mobile users request sensing services. Therefore, when there is no demand, sensors are put into an inactive mode to save energy. Through extensive analysis and experimental results, we show that the location-based model achieves a significant improvement in terms of network lifetime compared to the periodic model. PMID:28257067
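The on-demand model described in the two records above can be illustrated with a toy coordinator that keeps sensors inactive until a mobile user's request falls inside a sensor's coverage area. The classes, coordinates and coverage radius below are invented for illustration and are not taken from the paper.

```python
# Sketch of location-based, on-demand sensing: the cloud coordinator wakes up only the
# sensors whose coverage contains the requesting mobile user; all others stay inactive.
import math

class Sensor:
    def __init__(self, sensor_id, x, y, coverage=100.0):
        self.sensor_id, self.x, self.y, self.coverage = sensor_id, x, y, coverage
        self.active = False                      # inactive by default to save energy

class CloudCoordinator:
    def __init__(self, sensors):
        self.sensors = sensors

    def handle_request(self, user_x, user_y, interest):
        """Activate only sensors whose coverage contains the requesting user."""
        served = []
        for s in self.sensors:
            in_range = math.hypot(s.x - user_x, s.y - user_y) <= s.coverage
            s.active = in_range
            if in_range:
                served.append((s.sensor_id, interest))
        return served

coordinator = CloudCoordinator([Sensor('s1', 0, 0), Sensor('s2', 500, 500)])
print(coordinator.handle_request(30, 40, 'air_quality'))   # only s1 is woken up
```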
The Education Value of Cloud Computing
ERIC Educational Resources Information Center
Katzan, Harry, Jr.
2010-01-01
Cloud computing is a technique for supplying computer facilities and providing access to software via the Internet. Cloud computing represents a contextual shift in how computers are provisioned and accessed. One of the defining characteristics of cloud software service is the transfer of control from the client domain to the service provider.…
Cloud Computing. Technology Briefing. Number 1
ERIC Educational Resources Information Center
Alberta Education, 2013
2013-01-01
Cloud computing is Internet-based computing in which shared resources, software and information are delivered as a service that computers or mobile devices can access on demand. Cloud computing is already used extensively in education. Free or low-cost cloud-based services are used daily by learners and educators to support learning, social…
Suciu, George; Suciu, Victor; Martian, Alexandru; Craciunescu, Razvan; Vulpe, Alexandru; Marcu, Ioana; Halunga, Simona; Fratu, Octavian
2015-11-01
Big data storage and processing are considered one of the main applications for cloud computing systems. Furthermore, the development of the Internet of Things (IoT) paradigm has advanced the research on Machine to Machine (M2M) communications and enabled novel tele-monitoring architectures for E-Health applications. However, there is a need to converge current decentralized cloud systems, general software for processing big data, and IoT systems. The purpose of this paper is to analyze existing components and methods of securely integrating big data processing with cloud M2M systems based on Remote Telemetry Units (RTUs) and to propose a converged E-Health architecture built on Exalead CloudView, a search-based application. Finally, we discuss the main findings of the proposed implementation and future directions.
Impact of different cloud deployments on real-time video applications for mobile video cloud users
NASA Astrophysics Data System (ADS)
Khan, Kashif A.; Wang, Qi; Luo, Chunbo; Wang, Xinheng; Grecos, Christos
2015-02-01
The latest trend of accessing mobile cloud services through wireless network connectivity has grown globally among both entrepreneurs and home end users. Although existing public cloud service vendors such as Google, Microsoft Azure etc. are providing on-demand cloud services at affordable cost for mobile users, there are still a number of challenges to achieving high-quality mobile cloud based video applications, especially due to the bandwidth-constrained and error-prone mobile network connectivity, which is the communication bottleneck for end-to-end video delivery. In addition, existing accessible cloud networking architectures differ in terms of their implementation, services, resources, storage, pricing, support and so on, and these differences have a varied impact on the performance of cloud-based real-time video applications. Nevertheless, these challenges and impacts have not been thoroughly investigated in the literature. In our previous work, we implemented a mobile cloud network model that integrates localized and decentralized cloudlets (mini-clouds) and wireless mesh networks. In this paper, we deploy a real-time framework consisting of various existing Internet cloud networking architectures (Google Cloud, Microsoft Azure and Eucalyptus Cloud) and a cloudlet based on Ubuntu Enterprise Cloud over wireless mesh networking technology for mobile cloud end users. It is noted that the increasing trend of accessing real-time video streaming over HTTP/HTTPS is gaining popularity among both research and industrial communities to leverage the existing web services and HTTP infrastructure in the Internet. To study the performance under different deployments using different public and private cloud service providers, we employ real-time video streaming over the HTTP/HTTPS standard, and conduct experimental evaluation and in-depth comparative analysis of the impact of different deployments on the quality of service for mobile video cloud users. Empirical results are presented and discussed to quantify and explain the different impacts resulting from various cloud deployments, video applications and wireless/mobile network settings, and user mobility. Additionally, this paper analyses the advantages, disadvantages, limitations and optimization techniques in various cloud networking deployments, in particular the cloudlet approach compared with the Internet cloud approach, with recommendations of optimized deployments highlighted. Finally, federated clouds and inter-cloud collaboration challenges and opportunities are discussed in the context of supporting real-time video applications for mobile users.
Evaluation of Future Internet Technologies for Processing and Distribution of Satellite Imagery
NASA Astrophysics Data System (ADS)
Becedas, J.; Perez, R.; Gonzalez, G.; Alvarez, J.; Garcia, F.; Maldonado, F.; Sucari, A.; Garcia, J.
2015-04-01
Satellite imagery data centres are designed to operate a defined number of satellites. Difficulties therefore appear when new satellites have to be incorporated into the system, because traditional infrastructures are neither flexible nor scalable. With the appearance of Future Internet technologies, new solutions can be provided to manage large and variable amounts of data on demand. These technologies optimize resources and facilitate the appearance of new applications and services in the traditional Earth Observation (EO) market. The use of Future Internet technologies for the EO sector was validated with the GEO-Cloud experiment, part of the Fed4FIRE FP7 European project. This work presents the final results of the project, in which a constellation of satellites records the whole Earth surface on a daily basis. The satellite imagery is downloaded into a distributed network of ground stations and ingested into a cloud infrastructure, where the data is processed, stored, archived and distributed to the end users. The processing and transfer times inside the cloud, workload of the processors, automatic cataloguing and accessibility through the Internet are evaluated to validate whether Future Internet technologies present advantages over traditional methods. The applicability of these technologies to providing high added-value services is evaluated. Finally, the advantages of using federated testbeds to carry out large-scale, industry-driven experiments are analysed, evaluating the feasibility of an experiment developed in the European infrastructure Fed4FIRE and its migration to a commercial cloud: SoftLayer, an IBM Company.
NASA Astrophysics Data System (ADS)
Wang, Xi Vincent; Wang, Lihui
2017-08-01
Cloud computing is the new enabling technology that offers centralised computing, flexible data storage and scalable services. In the manufacturing context, it is possible to utilise the Cloud technology to integrate and provide industrial resources and capabilities in terms of Cloud services. In this paper, a function block-based integration mechanism is developed to connect various types of production resources. A Cloud-based architecture is also deployed to offer a service pool which maintains these resources as production services. The proposed system provides a flexible and integrated information environment for the Cloud-based production system. As a specific type of manufacturing, Waste Electrical and Electronic Equipment (WEEE) remanufacturing experiences difficulties in system integration, information exchange and resource management. In this research, WEEE is selected as the example of Internet of Things to demonstrate how the obstacles and bottlenecks are overcome with the help of Cloud-based informatics approach. In the case studies, the WEEE recycle/recovery capabilities are also integrated and deployed as flexible Cloud services. Supporting mechanisms and technologies are presented and evaluated towards the end of the paper.
1. EXTERIOR VIEW OF 209 WARE STREET LOOKING SOUTH. THIS ...
1. EXTERIOR VIEW OF 209 WARE STREET LOOKING SOUTH. THIS STRUCTURE WAS ONE OF APPROXIMATELY SEVENTEEN DUPLEXES BUILT AS THE ORIGINAL WORKER HOUSING FOR THE LaGRANGE COTTON MILLS, LATER KNOWN AS CALUMET MILL. LaGRANGE MILLS (1888-89) WAS THE FIRST COTTON MILL IN LaGRANGE. NOTE THE GABLE-ON-HIP ROOF FORM AND TWO IDENTICAL STRUCTURES VISIBLE TO THE LEFT. - 209 Ware Street (House), 209 Ware Street, La Grange, Troup County, GA
Study on ancient Chinese imitated GE ware by INAA and WDXRF
NASA Astrophysics Data System (ADS)
Xie, Guoxi; Feng, Songlin; Feng, Xiangqian; Wang, Yanqing; Zhu, Jihao; Yan, Lingtong; Li, Yongqiang; Han, Hongye
2007-11-01
Imitated GE ware was one of the most famous products of the Jingdezhen porcelain field in the Ming dynasty (AD 1368-1644). The exterior features of its body and glaze are marvelous: black foot, purple mouth and crazing glaze are the main features of imitated GE ware. Until now, the key conditions producing these features have not been clearly identified. In order to find the critical elements for firing these features, instrumental neutron activation analysis (INAA) and wavelength-dispersive X-ray fluorescence (WDXRF) were used to determine the element abundance patterns of imitated GE ware body and glaze. The experimental data was compared with that of imitated Longquan celadon and of Longquan celadon. The analytical results indicated that Fe, Ti and Na were the critical elements. The high Fe and Ti content of the imitated GE ware body is the basic condition for firing its black body, black foot and purple mouth. The high Na content of the glaze is the main condition for producing its crazing glaze: Na is the critical element that enlarges the difference in expansion coefficients between the glaze and body of imitated GE ware. Furthermore, Zijin soil was added to the kaolin to make the body rich in Fe and Ti, and a Na-rich material was used to produce the crazing glaze in the manufacturing process of imitated GE ware.
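The comparison described above, between element abundance patterns of imitated GE ware and Longquan celadon, can be sketched as a simple group-wise mean comparison for the elements the study singles out (Fe, Ti, Na). The numbers below are made-up placeholders, not the INAA/WDXRF measurements reported in the paper.

```python
# Illustrative group-wise comparison of element abundances (values are placeholders).
samples = {
    'imitated_GE_body':      [{'Fe': 2.9, 'Ti': 0.9, 'Na': 0.8},
                              {'Fe': 3.1, 'Ti': 1.0, 'Na': 0.7}],
    'Longquan_celadon_body': [{'Fe': 1.2, 'Ti': 0.3, 'Na': 0.9},
                              {'Fe': 1.0, 'Ti': 0.2, 'Na': 1.0}],
}

def mean_abundance(group, element):
    values = [s[element] for s in samples[group]]
    return sum(values) / len(values)

for element in ('Fe', 'Ti', 'Na'):
    ratio = mean_abundance('imitated_GE_body', element) / mean_abundance('Longquan_celadon_body', element)
    print(f'{element}: imitated GE / Longquan ratio = {ratio:.2f}')
```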
2010-07-01
Cloud computing, an emerging form of computing in which users have access to scalable, on-demand capabilities that are provided through Internet… cloud computing, (2) the information security implications of using cloud computing services in the Federal Government, and (3) federal guidance and… efforts to address information security when using cloud computing. The complete report is titled Information Security: Federal Guidance Needed to…
Effect of natural and synthetic organics on the processing of ceramics
NASA Astrophysics Data System (ADS)
Schulz, Brett M.
Dry pressing has been shown to be an efficient and cost-effective method of manufacturing ceramic ware. Dry pressed parts are typically manufactured with a low moisture content, which has the further advantage of eliminating the drying step that is necessary for plastic-formed ware, i.e., jiggered or ram pressed. Problems associated with the use of dry pressing in an industrial setting involve the high loss rate during the bisque firing process and the poor surface finish of the green (unfired) ware. It was the goal of this research to improve the surface finish of dry pressed ware to a level that is satisfactory for decorating the bisque-fired ware. The adsorption of organic additives, specifically dispersants, on the surface of particles is an important aspect of ceramic processing. The interactions between organic additives, specifically sodium poly[acrylic acid] and poly[vinyl alcohol], have been demonstrated to result in phase separation into distinct domains during the spray-drying process. This phase separation leads to a poly[vinyl alcohol]-rich film on the surface of the granulate which will increase the P1 value, the pressure at the onset of granule deformation, of the granulate. This negative interaction between the organics increases the surface roughness of the dry pressed ware. The roughness of the industrially prepared ware was determined using an optical interferometer to set a baseline for improvements in the surface finish of the dry pressed ware. Blending of dried granulate was determined to significantly improve the surface finish of the ware. Alternative binders to replace a plasticized poly[vinyl alcohol] were observed to show improvements in the surface finish of the ware dry pressed in a semi-isostatic die. In summary, the most important aspect of improving the surface finish of dry pressed ware, i.e. facilitating compaction, is the selection of the organic additives. Additives which are observed to have a negative interaction, i.e. to phase separate into distinct domains, will result in an organic-rich film at the surface of the granule, thus increasing the P1 value of the granulate.
Chen, Shang-Liang; Chen, Yun-Yao; Hsu, Chiang
2014-01-01
Cloud computing is changing the ways software is developed and managed in enterprises, which is changing the way of doing business in that dynamically scalable and virtualized resources are regarded as services over the Internet. Traditional manufacturing systems such as supply chain management (SCM), customer relationship management (CRM), and enterprise resource planning (ERP) are often developed case by case. However, effective collaboration between different systems, platforms, programming languages, and interfaces has been suggested by researchers. In cloud-computing-based systems, distributed resources are encapsulated into cloud services and centrally managed, which allows high automation, flexibility, fast provision, and ease of integration at low cost. The integration between physical resources and cloud services can be improved by combining Internet of things (IoT) technology and Software-as-a-Service (SaaS) technology. This study proposes a new approach for developing cloud-based manufacturing systems based on a four-layer SaaS model. There are three main contributions of this paper: (1) enterprises can develop their own cloud-based logistic management information systems based on the approach proposed in this paper; (2) a case study based on literature reviews with experimental results is proposed to verify that the system performance is remarkable; (3) challenges encountered and feedback collected from T Company in the case study are discussed in this paper for the purpose of enterprise deployment. PMID:24686728
Chen, Shang-Liang; Chen, Yun-Yao; Hsu, Chiang
2014-03-28
Cloud computing is changing the ways software is developed and managed in enterprises, which is changing the way of doing business in that dynamically scalable and virtualized resources are regarded as services over the Internet. Traditional manufacturing systems such as supply chain management (SCM), customer relationship management (CRM), and enterprise resource planning (ERP) are often developed case by case. However, effective collaboration between different systems, platforms, programming languages, and interfaces has been suggested by researchers. In cloud-computing-based systems, distributed resources are encapsulated into cloud services and centrally managed, which allows high automation, flexibility, fast provision, and ease of integration at low cost. The integration between physical resources and cloud services can be improved by combining Internet of things (IoT) technology and Software-as-a-Service (SaaS) technology. This study proposes a new approach for developing cloud-based manufacturing systems based on a four-layer SaaS model. There are three main contributions of this paper: (1) enterprises can develop their own cloud-based logistic management information systems based on the approach proposed in this paper; (2) a case study based on literature reviews with experimental results is proposed to verify that the system performance is remarkable; (3) challenges encountered and feedback collected from T Company in the case study are discussed in this paper for the purpose of enterprise deployment.
Practising cloud-based telemedicine in developing countries.
Puustjärvi, Juha; Puustjärvi, Leena
2013-01-01
In industrialised countries, telemedicine has proven to be a valuable tool for enabling access to knowledge and allowing information exchange, and for showing that it is possible to provide good quality healthcare to isolated communities. However, there are many barriers to the widespread implementation of telemedicine in rural areas of developing countries. These include deficient Internet connectivity and a lack of sophisticated peripheral medical devices. Furthermore, developing countries have very high patient-per-doctor ratios. In this paper, we report our work on developing a cloud-based health information system, which promotes telemedicine and patient-centred healthcare by exploiting modern information and communication technologies such as OWL ontologies and SQL triggers. The reason for using cloud technology is twofold. First, cloud service models are easily adaptable for sharing patients' health information, which is of prime importance in patient-centred healthcare as well as in telemedicine. Second, the cloud and the consulting physicians may be located anywhere on the Internet.
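The abstract above mentions SQL triggers as one of the technologies behind the cloud-based health information system. As a small, self-contained illustration of that mechanism (not the authors' actual schema), the sketch below uses SQLite from Python to create a trigger that records a notification whenever a new measurement is inserted.

```python
# Illustrative SQL-trigger sketch: every new measurement automatically produces a
# notification row that a consulting physician could read. Table names and columns
# are assumptions for this example only.
import sqlite3

db = sqlite3.connect(':memory:')
db.executescript("""
CREATE TABLE measurements (patient_id TEXT, metric TEXT, value REAL, taken_at TEXT);
CREATE TABLE notifications (patient_id TEXT, message TEXT);

CREATE TRIGGER notify_physician AFTER INSERT ON measurements
BEGIN
    INSERT INTO notifications (patient_id, message)
    VALUES (NEW.patient_id, 'New ' || NEW.metric || ' reading: ' || NEW.value);
END;
""")

db.execute("INSERT INTO measurements VALUES ('p-001', 'glucose', 8.4, '2013-01-01T09:00')")
print(db.execute("SELECT * FROM notifications").fetchall())
```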
Welch, Jennifer M; Hoffius, Susan D; Fox, E. Brooke
2011-01-01
Question/Objective: How can a special collection maintain or increase its profile in its parent institution, when that parent institution emphasizes scientific and clinical learning? Setting/Context: The Waring Historical Library, Medical University of South Carolina (MUSC), preserves and promotes the history of health sciences at MUSC and in South Carolina. As a state entity, MUSC has suffered significant budget cuts for the past several years. In this climate, the Waring had to find ways to maintain relevance in the MUSC community. Methods: The Waring partnered with the MUSC College of Nursing to explore new ways to build institutional allies. By combining traditional archival administration with innovative uses of digital collections aimed at institutional promotion and outreach, the Waring's digital library became an advocacy tool that led to the Waring's enhanced value to its parent institution. Outcomes: The Waring Library is a resource for MUSC development and alumni relations. Tangible outcomes include additional funding from grants, increased staff, no loss of institutional funding, increased access to collections, increased accessions, cultivation of institutional allies for long-term support of the Waring, and development of a template for future partnerships. PMID:21243056
Research and development of intelligent controller for high-grade sanitary ware
NASA Astrophysics Data System (ADS)
Bao, Kongjun; Shen, Qingping
2013-03-01
With social and economic development and the improvement of people's living standards, modern society places more and more emphasis on the quality of family life, and intelligent controllers are increasingly applied in high-grade sanitary ware with physiotherapy functions. This paper analyses the common functions of high-grade physiotherapy sanitary ware, points out possible risks in production and use, proposes an implementation of the matching system hardware, and gives the system software implementation process. The intelligent controller for high-grade physiotherapy sanitary ware is not only elegant and beautiful but also simple to use, providing physiotherapy, water-power and deodorant functions under multi-function intelligent control, so as to meet the strong consumer demand in the high-end sanitary ware market, accelerate enterprise product upgrading and improve enterprise competitiveness.
An Internet-Based Accounting Information Systems Project
ERIC Educational Resources Information Center
Miller, Louise
2012-01-01
This paper describes a student project assignment used in an accounting information systems course. We are now truly immersed in the internet age, and while many required accounting information systems courses and textbooks introduce database design, accounting software development, cloud computing, and internet security, projects involving the…
Navigating the Challenges of the Cloud
ERIC Educational Resources Information Center
Ovadia, Steven
2010-01-01
Cloud computing is increasingly popular in education. Cloud computing is "the delivery of computer services from vast warehouses of shared machines that enables companies and individuals to cut costs by handing over the running of their email, customer databases or accounting software to someone else, and then accessing it over the internet."…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-14
... DEPARTMENT OF COMMERCE International Trade Administration [A-570-506] Porcelain-on-Steel Cooking... cooking ware (``POS cookware'') from the People's Republic of China (``PRC'') would likely lead to a... should the order be revoked. See Porcelain-on-Steel Cooking Ware from the People's Republic of China...
Cloud Infrastructure & Applications - CloudIA
NASA Astrophysics Data System (ADS)
Sulistio, Anthony; Reich, Christoph; Doelitzscher, Frank
The idea behind Cloud Computing is to deliver Infrastructure-as-a-Service and Software-as-a-Service over the Internet on an easy pay-per-use business model. To harness the potential of Cloud Computing for e-Learning and research purposes, and for small- and medium-sized enterprises, the Hochschule Furtwangen University has established a new project, called Cloud Infrastructure & Applications (CloudIA). The CloudIA project is a market-oriented cloud infrastructure that leverages different virtualization technologies, by supporting Service-Level Agreements for various service offerings. This paper describes the CloudIA project in detail and mentions our early experiences in building a private cloud using an existing infrastructure.
Towards Practical Privacy-Preserving Internet Services
ERIC Educational Resources Information Center
Wang, Shiyuan
2012-01-01
Today's Internet offers people a vast selection of data centric services, such as online query services, the cloud, and location-based services, etc. These internet services bring people a lot of convenience, but at the same time raise privacy concerns, e.g., sensitive information revealed by the queries, sensitive data being stored and…
Exploring the Strategies for a Community College Transition into a Cloud-Computing Environment
ERIC Educational Resources Information Center
DeBary, Narges
2017-01-01
The use of the Internet has resulted in the birth of an innovative virtualization technology called cloud computing. Virtualization can tremendously improve the instructional and operational systems of a community college. Although the incidental adoption of the cloud solutions in the community colleges of higher education has been increased,…
NASA Astrophysics Data System (ADS)
Xiong, Ting; He, Zhiwen
2017-06-01
Cloud computing was first proposed by Google in the United States as an Internet-centred, standard and open approach to network sharing services. With the rapid development of higher education in China, the educational resources provided by colleges and universities fall far short of actual teaching needs. Cloud computing, which uses Internet technology to provide shared resources, has therefore become an important means of sharing digital education resources in current higher education. Based on a cloud computing environment, the paper analyses the existing problems in sharing digital educational resources among the independent colleges of Jiangxi Province. Drawing on the mass storage, efficient operation and low cost that characterize cloud computing, the author explores and studies the design of a sharing model for the digital educational resources of higher education in independent colleges. Finally, the shared model design is put into practical application.
75 FR 49549 - ABC & D Recycling, Inc.-Lease and Operation Exemption-a Line of Railroad in Ware, MA
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-13
... DEPARTMENT OF TRANSPORTATION Surface Transportation Board [Docket No. FD 35397] ABC & D Recycling, Inc.--Lease and Operation Exemption--a Line of Railroad in Ware, MA ABC & D Recycling, Inc. (ABC & D..., ABC & D Recycling, Inc.--Lease and Operation Exemption--a Line of Railroad in Ware, Massachusetts (STB...
Cloud Computing Security Issue: Survey
NASA Astrophysics Data System (ADS)
Kamal, Shailza; Kaur, Rajpreet
2011-12-01
Cloud computing has been a growing field in the IT industry since it was proposed by IBM in 2007; other companies such as Google, Amazon, and Microsoft provide further cloud computing products. Cloud computing is Internet-based computing that shares resources and information on demand. It provides services such as SaaS, IaaS and PaaS, with services and resources shared through virtualization, which runs multiple applications on the cloud. This paper surveys the challenges and security issues of cloud computing and describes some standards and protocols that show how security can be managed.
ERIC Educational Resources Information Center
Dominguez, Alfredo
2013-01-01
Cloud computing has emerged as a new paradigm for on-demand delivery and consumption of shared IT resources over the Internet. Research has predicted that small and medium organizations (SMEs) would be among the earliest adopters of cloud solutions; however, this projection has not materialized. This study set out to investigate if behavior…
ERIC Educational Resources Information Center
Liao, Yuan
2011-01-01
The virtualization of computing resources, as represented by the sustained growth of cloud computing, continues to thrive. Information Technology departments are building their private clouds due to the perception of significant cost savings by managing all physical computing resources from a single point and assigning them to applications or…
Bourgeois, Quentin; Kroon, Erik
2017-01-01
The emergence of Corded Ware Groups throughout Europe in the 3rd millennium BC is one of the most defining events in European history. From the Volga to the Rhine, communities started to speak Indo-European languages and bury their dead in an extremely similar fashion. Recent ancient DNA analyses identify a massive migration from the Eurasian steppe as the prime cause of this event. However, there is a fundamental difference between expressing a Corded Ware identity (the sharing of world views and ideas) and having a specific DNA profile. Therefore, we argue that investigating the exchange of cultural information on burial rites between these communities serves as a crucial complement to the exchange of biological information. By adopting a practice perspective on 1161 Corded Ware burials throughout north-western Europe, combined with similarity indexes and network representations, we demonstrate a high degree of information sharing on the burial ritual between different regions. Moreover, we show that male burials are much more international in character than female burials and as such can be considered the vector along which cultural information and Corded Ware identity were transmitted. This finding highlights an underlying complex societal organization of Corded Ware burial rites in which gender roles had a significant impact on the composition and transmission of cultural information. Our findings corroborate recent studies that suggest the Corded Ware was a male-focused society.
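The combination of similarity indexes and network representations mentioned above can be illustrated with a minimal sketch; the grave identifiers, burial attributes and the choice of the Jaccard index here are illustrative assumptions, not the authors' actual dataset or metric.

```python
from itertools import combinations

# Hypothetical burials described as sets of recorded burial practices
# (body position, orientation, grave goods); names and attributes are invented.
burials = {
    "grave_A": {"crouched_left", "facing_south", "battle_axe", "beaker"},
    "grave_B": {"crouched_left", "facing_south", "beaker", "flint_blade"},
    "grave_C": {"crouched_right", "facing_north", "amber_beads"},
}

def jaccard(a, b):
    """Share of burial practices common to two graves (0 = none, 1 = identical)."""
    return len(a & b) / len(a | b)

# Pairwise similarities: the kind of table a network representation is built from.
for (g1, s1), (g2, s2) in combinations(burials.items(), 2):
    print(f"{g1} vs {g2}: {jaccard(s1, s2):.2f}")
```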
Analysis on the security of cloud computing
NASA Astrophysics Data System (ADS)
He, Zhonglin; He, Yuhua
2011-02-01
Cloud computing is a new technology arising from the fusion of computer technology and Internet development, and it will lead a revolution in the IT and information fields. However, in cloud computing, data and application software are stored at large data centers, and the management of data and services is not completely trustworthy, resulting in safety problems that are a difficult obstacle to improving the quality of cloud services. This paper briefly introduces the concept of cloud computing. Considering the characteristics of cloud computing, it constructs a security architecture for cloud computing. At the same time, with an eye toward the security threats cloud computing faces, several corresponding strategies are provided from the perspectives of cloud computing users and service providers.
ERIC Educational Resources Information Center
Aaron, Lynn S.; Roche, Catherine M.
2012-01-01
"Cloud computing" refers to the use of computing resources on the Internet instead of on individual personal computers. The field is expanding and has significant potential value for educators. This is discussed with a focus on four main functions: file storage, file synchronization, document creation, and collaboration--each of which has…
Cloud Based Drive Forensic and DDoS Analysis on Seafile as Case Study
NASA Astrophysics Data System (ADS)
Bahaweres, R. B.; Santo, N. B.; Ningsih, A. S.
2017-01-01
The rapid development of the Internet, driven by increasing data rates over both broadband cable networks and 4G wireless mobile, makes everyone easily connected to the internet. Storage as a Service (StaaS) is increasingly popular, and many users want to store their data in one place so that they can easily access it anywhere, any place and anytime in the cloud. This popularity also makes such services vulnerable to misuse: someone may commit a crime through them or mount a Denial of Service (DoS) attack on cloud storage services, and criminals can use cloud storage to store, upload and download illegal files or documents. In this study, we implement a private cloud storage using Seafile on a Raspberry Pi and perform simulations in Local Area Network and Wi-Fi environments to analyze forensically whether a criminal act can be traced and proven. We also identify, collect and analyze the artifacts of server and client, such as the registry of the desktop client, the file system, the Seafile log, the browser cache, and the database.
Migrating EO/IR sensors to cloud-based infrastructure as service architectures
NASA Astrophysics Data System (ADS)
Berglie, Stephen T.; Webster, Steven; May, Christopher M.
2014-06-01
The Night Vision Image Generator (NVIG), a product of US Army RDECOM CERDEC NVESD, is a visualization tool used widely throughout Army simulation environments to provide fully attributed, synthesized, full-motion video using physics-based sensor and environmental effects. The NVIG relies heavily on contemporary hardware-based acceleration and GPU processing techniques, which push the envelope of both enterprise and commodity-level hypervisor support for providing virtual machines with direct access to hardware resources. The NVIG has successfully been integrated into fully virtual environments where system architectures leverage cloud-based technologies to various extents in order to streamline infrastructure and service management. This paper details the challenges presented to engineers seeking to migrate GPU-bound processes, such as the NVIG, to virtual machines and, ultimately, cloud-based Infrastructure-as-a-Service architectures. In addition, it presents the path that led to success for the NVIG. A brief overview of cloud-based infrastructure management tool sets is provided, and several virtual desktop solutions are outlined. A distinction is made between general-purpose virtual desktop technologies and technologies that expose GPU-specific capabilities, including direct rendering and hardware-based video encoding. Candidate hypervisor/virtual machine configurations that nominally satisfy the virtualized hardware-level GPU requirements of the NVIG are presented, and each is subsequently reviewed in light of its implications for higher-level cloud management techniques. Implementation details are included from the hardware level, through the operating system, to the 3D graphics APIs required by the NVIG and similar GPU-bound tools.
KeyWare: an open wireless distributed computing environment
NASA Astrophysics Data System (ADS)
Shpantzer, Isaac; Schoenfeld, Larry; Grindahl, Merv; Kelman, Vladimir
1995-12-01
Deployment of distributed applications in the wireless domain lacks the equivalent tools, methodologies, architectures, and network management that exist in LAN-based applications. A wireless distributed computing environment (KeyWareTM), based on intelligent agents within a multiple-client multiple-server scheme, was developed to resolve this problem. KeyWare renders concurrent application services to wireline and wireless client nodes encapsulated in multiple paradigms such as message delivery, database access, e-mail, and file transfer. These services and paradigms are optimized to cope with temporal and spatial radio coverage, high latency, limited throughput and transmission costs. A unified network management paradigm for both wireless and wireline facilitates seamless extension of LAN-based management tools to include wireless nodes. A set of object-oriented tools and methodologies enables direct asynchronous invocation of agent-based services, supplemented by tool-sets matched to supported KeyWare paradigms. The open architecture embodiment of KeyWare enables a wide selection of client node computing platforms, operating systems, transport protocols, radio modems and infrastructures while maintaining application portability.
Generating DEM from LIDAR data - comparison of available software tools
NASA Astrophysics Data System (ADS)
Korzeniowska, K.; Lacka, M.
2011-12-01
In recent years many software tools and applications have appeared that offer procedures, scripts and algorithms to process and visualize ALS data. This variety of software tools and of "point cloud" processing methods motivated the aim of this study: to assess the algorithms available in various software tools used to classify LIDAR "point cloud" data, through a careful examination of Digital Elevation Models (DEMs) generated from LIDAR data on the basis of these algorithms. The work focused on the most important available software tools, both commercial and open source. Two sites in a mountain area were selected for the study; the area of each site is 0.645 sq km. DEMs generated with the analysed software tools were compared with a reference dataset generated using manual methods to eliminate non-ground points. Surfaces were analysed using raster analysis. Minimum, maximum and mean differences between the reference DEM and the DEMs generated with the analysed software tools were calculated, together with the Root Mean Square Error. Differences between DEMs were also examined visually using transects along the grid axes in the test sites.
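As a sketch of the comparison described above, the snippet below computes minimum, maximum and mean differences plus the Root Mean Square Error between a reference DEM and a tool-generated DEM; the small NumPy arrays stand in for the actual rasters, which are not part of this abstract.

```python
import numpy as np

# Toy elevation grids standing in for the reference DEM and a tool-generated DEM (metres).
reference_dem = np.array([[512.3, 513.1], [514.0, 515.2]])
generated_dem = np.array([[512.5, 512.9], [514.4, 515.0]])

diff = generated_dem - reference_dem
print("min difference :", diff.min())
print("max difference :", diff.max())
print("mean difference:", diff.mean())
print("RMSE           :", np.sqrt(np.mean(diff ** 2)))
```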
Formal Specification and Analysis of Cloud Computing Management
2012-01-24
Chapter excerpt, "Cloud Computing in a Nutshell": the report opens its introduction to Cloud Computing with a well-known quote by Oracle CEO Larry Ellison [106] and, in view of that statement, summarizes the essential aspects of Cloud Computing.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-03
... Concrete Steel Wire Strand A-580-852 1/1/10--12/31/10 Top-of-the Stove Stainless Steel Cooking Ware\\2\\ A... South Korea: Top-of-the-Stove Stainless Steel Cooking Ware \\3\\ C-580-602 1/1/10--11/21/10 The People's... antidumping duty order on Top-of-the Stove Stainless Steel Cooking Ware was revoked due to sunset review...
2010-09-01
Cloud computing describes a new distributed computing paradigm for IT data and services that involves over-the-Internet provision of dynamically scalable and often virtualized resources. While cost reduction and flexibility in storage, services, and maintenance are important considerations when deciding on whether or how to migrate data and applications to the cloud, large organizations like the Department of Defense need to consider the organization and structure of data on the cloud and the operations on such data in order to reap the full benefit of cloud
Advanced cloud fault tolerance system
NASA Astrophysics Data System (ADS)
Sumangali, K.; Benny, Niketa
2017-11-01
Cloud computing has become a prevalent on-demand service on the internet to store, manage and process data. A pitfall that accompanies cloud computing is the failures that can be encountered in the cloud. To overcome these failures, we require a fault tolerance mechanism to abstract faults from users. We have proposed a fault tolerant architecture, which is a combination of proactive and reactive fault tolerance. This architecture essentially increases the reliability and the availability of the cloud. In the future, we would like to compare evaluations of our proposed architecture with existing architectures and further improve it.
2014-09-01
resources, and generate large amounts of food and solid waste daily. Almost all Contingency Basecamp (CB) DFACs provide individual paper and plastic ware...which is costly in terms of purchase, transportation, and disposal. This work analyzed the effects of replacing paper and plastic ware with...reusable materials, and of adding industrial dishwashers to reduce the logistical burden of using paper and plastic ware. Additional enhancements
Where the Cloud Meets the Commons
ERIC Educational Resources Information Center
Ipri, Tom
2011-01-01
Changes presented by cloud computing--shared computing services, applications, and storage available to end users via the Internet--have the potential to seriously alter how libraries provide services, not only remotely, but also within the physical library, specifically concerning challenges facing the typical desktop computing experience.…
Machine Learning for Knowledge Extraction from PHR Big Data.
Poulymenopoulou, Michaela; Malamateniou, Flora; Vassilacopoulos, George
2014-01-01
Cloud computing, Internet of Things (IoT) and NoSQL database technologies can support a new generation of cloud-based PHR services that contain heterogeneous (unstructured, semi-structured and structured) patient data (health, social and lifestyle) from various sources, including automatically transmitted data from Internet-connected devices in the patient's living space (e.g. medical devices connected to patients at home care). The patient data stored in such PHR systems constitute big data whose analysis, with the use of appropriate machine learning algorithms, is expected to improve diagnosis and treatment accuracy, to cut healthcare costs and, hence, to improve the overall quality and efficiency of healthcare provided. This paper describes a health data analytics engine which uses machine learning algorithms for analyzing cloud-based PHR big health data towards knowledge extraction to support better healthcare delivery as regards disease diagnosis and prognosis. This engine comprises the data preparation, model generation and data analysis modules and runs on the cloud, taking advantage of the map/reduce paradigm provided by Apache Hadoop.
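The engine's code is not given in the abstract; the pure-Python sketch below only illustrates the map/reduce pattern it relies on, aggregating diagnosis codes across hypothetical PHR records. The record structure and codes are invented for illustration.

```python
from collections import Counter
from functools import reduce

# Hypothetical PHR records; field names and diagnosis codes are invented.
records = [
    {"patient": "p1", "diagnoses": ["E11", "I10"]},
    {"patient": "p2", "diagnoses": ["I10"]},
    {"patient": "p3", "diagnoses": ["E11", "E11", "J45"]},
]

def map_phase(record):
    """Emit a partial count of diagnosis codes for one record (the mapper's job)."""
    return Counter(record["diagnoses"])

def reduce_phase(total, partial):
    """Merge partial counts, as a reducer would across mapper outputs."""
    total.update(partial)
    return total

frequencies = reduce(reduce_phase, map(map_phase, records), Counter())
print(frequencies.most_common())  # e.g. [('E11', 3), ('I10', 2), ('J45', 1)]
```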
Cloud Computing Based E-Learning System
ERIC Educational Resources Information Center
Al-Zoube, Mohammed; El-Seoud, Samir Abou; Wyne, Mudasser F.
2010-01-01
Cloud computing technologies although in their early stages, have managed to change the way applications are going to be developed and accessed. These technologies are aimed at running applications as services over the internet on a flexible infrastructure. Microsoft office applications, such as word processing, excel spreadsheet, access database…
USDA-ARS?s Scientific Manuscript database
Service oriented architectures allow modelling engines to be hosted over the Internet abstracting physical hardware configuration and software deployments from model users. Many existing environmental models are deployed as desktop applications running on user's personal computers (PCs). Migration ...
Wink, Diane M
2012-01-01
In this bimonthly series, the author examines how nurse educators can use Internet and Web-based technologies such as search, communication, and collaborative writing tools; social networking and social bookmarking sites; virtual worlds; and Web-based teaching and learning programs. This article describes how cloud computing can be used in nursing education.
Learning in the cloud: a new challenge for a global teaching system in optics and photonics
NASA Astrophysics Data System (ADS)
Sultana, Razia; Christ, Andreas; Feisst, Markus; Curticapean, Dan
2014-07-01
Nowadays, many applications, companies and parts of society are assumed to be always available online. However, according to [Times, Oct. 31, 2011], 73% of the world population do not use the internet and thus are not "online" at all. The most common reasons for not being "online" are expensive personal computer equipment and high costs for data connections, especially in developing countries that comprise most of the world's population (e.g. parts of Africa, Asia, Central and South America). However, these countries seem to be leap-frogging the "PC and landline" age and moving directly to the "mobile" age. Decreasing prices for smart phones with internet connectivity and PC-like operating systems make it more affordable for these parts of the world population to join the "always-online" community. Storing learning content in a way accessible to everyone, including on mobile and smart phones, therefore seems beneficial. This way, learning content can be accessed by personal computers as well as by mobile and smart phones and thus be accessible to a wide range of devices and users. A new trend in Internet technologies is to go to "the cloud". This paper discusses the changes, challenges and risks of storing learning content in the "cloud". The experiences were gathered during the evaluation of the changes necessary to make our solutions and systems "cloud-ready".
Klonoff, David C
2017-07-01
The Internet of Things (IoT) is generating an immense volume of data. With cloud computing, medical sensor and actuator data can be stored and analyzed remotely by distributed servers. The results can then be delivered via the Internet. The number of devices in IoT includes such wireless diabetes devices as blood glucose monitors, continuous glucose monitors, insulin pens, insulin pumps, and closed-loop systems. The cloud model for data storage and analysis is increasingly unable to process the data avalanche, and processing is being pushed out to the edge of the network closer to where the data-generating devices are. Fog computing and edge computing are two architectures for data handling that can offload data from the cloud, process it nearby the patient, and transmit information machine-to-machine or machine-to-human in milliseconds or seconds. Sensor data can be processed near the sensing and actuating devices with fog computing (with local nodes) and with edge computing (within the sensing devices). Compared to cloud computing, fog computing and edge computing offer five advantages: (1) greater data transmission speed, (2) less dependence on limited bandwidths, (3) greater privacy and security, (4) greater control over data generated in foreign countries where laws may limit use or permit unwanted governmental access, and (5) lower costs because more sensor-derived data are used locally and less data are transmitted remotely. Connected diabetes devices almost all use fog computing or edge computing because diabetes patients require a very rapid response to sensor input and cannot tolerate delays for cloud computing.
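A minimal sketch of the trade-off described above: raw glucose readings are handled at the edge within milliseconds and only a compact summary is forwarded upstream. The thresholds, reading values and function names are assumptions for illustration, not part of any particular device.

```python
import statistics

# Hypothetical continuous glucose monitor readings (mg/dL) buffered on the edge device.
readings = [102, 108, 95, 88, 181, 175, 169]

def process_at_edge(values, low=70, high=180):
    """React locally to out-of-range readings, then build a small summary for the cloud."""
    alerts = [v for v in values if v < low or v > high]
    if alerts:
        print("local alert, no round-trip to the cloud needed:", alerts)
    return {"mean": statistics.mean(values), "min": min(values), "max": max(values)}

summary = process_at_edge(readings)
print("forwarded to cloud:", summary)  # far less data than the raw stream
```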
Modelling operations and security of cloud systems using Z-notation and Chinese Wall security policy
NASA Astrophysics Data System (ADS)
Basu, Srijita; Sengupta, Anirban; Mazumdar, Chandan
2016-11-01
Enterprises are increasingly using cloud computing for hosting their applications. The availability of fast Internet and cheap bandwidth is causing a greater number of people to use cloud-based services. This has the advantage of lower cost and minimum maintenance. However, ensuring the security of user data and proper management of cloud infrastructure remain major areas of concern. Existing techniques are either too complex or fail to properly represent the actual cloud scenario. This article presents a formal cloud model using the constructs of Z-notation. Principles of the Chinese Wall security policy have been applied to design secure cloud-specific operations. The proposed methodology will enable users to safely host their services, as well as process sensitive data, on the cloud.
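The article's formal model is written in Z-notation and is not reproduced here; the sketch below only illustrates the Chinese Wall idea it applies, namely that a subject may not access datasets belonging to two competing organizations. The conflict classes and access history are invented for illustration.

```python
# Conflict-of-interest classes: datasets within one class belong to competitors.
conflict_classes = [
    {"bank_A", "bank_B"},
    {"oil_X", "oil_Y"},
]

access_history = {}  # subject -> set of datasets already accessed

def may_access(subject, dataset):
    """Chinese Wall rule: deny if the subject already accessed a competitor's dataset."""
    seen = access_history.setdefault(subject, set())
    for cls in conflict_classes:
        if dataset in cls and (seen & cls) - {dataset}:
            return False
    seen.add(dataset)
    return True

print(may_access("alice", "bank_A"))  # True: first access in that conflict class
print(may_access("alice", "oil_X"))   # True: a different conflict class
print(may_access("alice", "bank_B"))  # False: bank_A was already accessed
```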
Study on the application of mobile internet cloud computing platform
NASA Astrophysics Data System (ADS)
Gong, Songchun; Fu, Songyin; Chen, Zheng
2012-04-01
The innovative development of computer technology promotes the application of the cloud computing platform, which is in effect a substitution and exchange of a kind of resource service model and meets users' needs for different resources after changes and adjustments in multiple respects. Cloud computing has advantages in many aspects: it not merely reduces the difficulty of operating the system but also makes it easy for users to search, acquire and process resources. Accordingly, the author takes the management of digital libraries as the research focus in this paper and analyzes the key technologies of the mobile internet cloud computing platform in its operation. The popularization and promotion of computer technology drive people to create digital library models, whose core idea is to strengthen the optimal management of library resource information through computers and to construct a high-performance inquiry and search platform that allows users to access the necessary information resources at any time. Cloud computing is able to distribute the computations of one computer across a large number of distributed computers, and hence implement a connected service of multiple computers. Digital libraries, as a typical representative of cloud computing applications, can be used to carry out an analysis of the key technologies of cloud computing.
Universal Keyword Classifier on Public Key Based Encrypted Multikeyword Fuzzy Search in Public Cloud
Munisamy, Shyamala Devi; Chokkalingam, Arun
2015-01-01
Cloud computing has pioneered the emerging world by manifesting itself as a service through the internet and facilitating third-party infrastructure and applications. While customers have no visibility into how their data is stored on the service provider's premises, it offers greater benefits in lowering infrastructure costs and delivering more flexibility and simplicity in managing private data. The opportunity to use cloud services on a pay-per-use basis provides comfort for private data owners in managing costs and data. With the pervasive usage of the internet, the focus has now shifted towards effective data utilization on the cloud without compromising security concerns. In the pursuit of increasing data utilization on public cloud storage, the key is to make data access effective through several fuzzy searching techniques. In this paper, we discuss the existing fuzzy searching techniques and focus on reducing the searching time on the cloud storage server for effective data utilization. Our proposed Asymmetric Classifier Multikeyword Fuzzy Search method provides a classifier search server that creates a universal keyword classifier for the multiple keyword request, which greatly reduces the searching time by learning the search path pattern for all the keywords in the fuzzy keyword set. The objective of using the BTree fuzzy searchable index is to resolve typos and representation inconsistencies and also to facilitate effective data utilization. PMID:26380364
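The paper's BTree-indexed universal keyword classifier is not reproduced here; the sketch below only illustrates the underlying fuzzy-keyword idea, expanding a keyword into a wildcard-based fuzzy set for edit distance 1, a technique commonly used in fuzzy searchable indexes. The function and example keywords are illustrative assumptions.

```python
def wildcard_fuzzy_set(keyword):
    """Wildcard patterns covering edit-distance-1 variants of a keyword."""
    variants = {keyword}
    for i in range(len(keyword)):
        variants.add(keyword[:i] + "*" + keyword[i + 1:])  # substitution at position i
    for i in range(len(keyword) + 1):
        variants.add(keyword[:i] + "*" + keyword[i:])      # insertion before position i
    return variants

# The index stores patterns; a typo such as "clowd" still hits "cloud".
index = wildcard_fuzzy_set("cloud")
query = wildcard_fuzzy_set("clowd")
print(sorted(index & query))  # shared patterns, e.g. ['clo*d']
```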
75 FR 80042 - Information Privacy and Innovation in the Internet Economy
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-21
... statistics that provide evidence of concern--or comments explaining why concerns are unwarranted--about cloud computing data privacy and security in the commercial context. We also seek data that links any such concerns to decisions to adopt, or refrain from adopting, cloud computing services. (41) The Task Force...
ERIC Educational Resources Information Center
Bull, Glen; Garofalo, Joe
2010-01-01
The ability to move from one representation of data to another is one of the key characteristics of expert mathematicians and scientists. Cloud computing will offer more opportunities to create and display multiple representations of data, making this skill even more important in the future. The advent of the Internet led to widespread…
Basken, Robyn; Bazzell, Charles M; Smith, Richard; Janardhanan, Rajesh; Khalpey, Zain
2017-07-01
Device thrombosis is a devastating complication of left ventricular assist devices. The definitive treatment has been device exchange or explant. Evidence of increasing morbidity and mortality with device exchange has shifted strategies toward conservative management. In this report, we detail the use of thrombolytics as salvage therapy in a patient with an occlusive HeartWare ventricular assist device (HeartWare Inc., Framingham, MA) thrombus, resulting in long-term survival without further intervention. © 2017 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Murata, K. T.
2014-12-01
Data-intensive or data-centric science is the 4th paradigm, after observational and/or experimental science (1st paradigm), theoretical science (2nd paradigm) and numerical science (3rd paradigm). A science cloud is an infrastructure for this 4th methodology. The NICT science cloud is designed for big data sciences of Earth, space and other fields based on modern informatics and information technologies [1]. Data flow on the cloud passes through the following three techniques: (1) data crawling and transfer, (2) data preservation and stewardship, and (3) data processing and visualization. Original tools and applications for these techniques have been designed and implemented. We mash up these tools and applications on the NICT Science Cloud to build customized systems for each project. In this paper, we discuss science data processing through these three steps. For big data science, data file deployment on a distributed storage system should be well designed in order to save storage cost and transfer time. We developed a high-bandwidth virtual remote storage system (HbVRS) and data crawling tools, NICTY/DLA and the Wide-area Observation Network Monitoring (WONM) system, respectively. Data files are saved on the cloud storage system according to both the data preservation policy and the data processing plan. The storage system is built on distributed file system middleware (Gfarm: GRID datafarm). It is effective since disaster recovery (DR) and parallel data processing are carried out simultaneously without moving these big data from storage to storage. Data files are managed in our Web application, WSDBank (World Science Data Bank). The big data on the cloud are processed via Pwrake, a workflow tool with high I/O bandwidth. There are several visualization tools on the cloud: VirtualAurora for the magnetosphere and ionosphere, VDVGE for Google Earth, STICKER for urban environment data and STARStouch for multi-disciplinary data. There are 30 projects running on the NICT Science Cloud for Earth and space science. In 2003, 56 refereed papers were published. At the end, we introduce a couple of successful results in Earth and space sciences obtained using these three techniques on the NICT Science Cloud. [1] http://sc-web.nict.go.jp
A Novel Market-Oriented Dynamic Collaborative Cloud Service Platform
NASA Astrophysics Data System (ADS)
Hassan, Mohammad Mehedi; Huh, Eui-Nam
In today's world, emerging Cloud computing (Weiss, 2007) offers a new computing model in which resources such as computing power, storage, online applications and networking infrastructures can be shared as "services" over the internet. Cloud providers (CPs) are incentivized by the profits to be made by charging consumers for accessing these services. Consumers, such as enterprises, are attracted by the opportunity to reduce or eliminate costs associated with "in-house" provision of these services.
Drews, Thorsten; Potapov, Evgenij; Weng, Yugo; Pasic, Miralem; Hetzer, Roland
2014-01-01
Objective This manuscript summarizes our surgical experience with the implantation of recent continuous-flow left ventricular assist devices (LVADs), with special emphasis on the HeartWare HVAD pump. Methods In our experience, the HeartWare HVAD is currently implanted using four different techniques: (I) "classical" LVAD implantation with heart-lung machine and median sternotomy; (II) "minimally invasive" implantation without sternotomy and without heart-lung machine; (III) "lateral" implantation to the descending aorta; (IV) use of two continuous-flow LVADs for implantable biventricular support. Results Five hundred and four HeartWare HVADs have been implanted using the described techniques in our institution to date. Conclusions The HeartWare HVAD is a versatile device. It has been found to be eminently suited to these four different modes of implantation. PMID:25452906
Designing Albaha Internet of Farming Architecture
NASA Astrophysics Data System (ADS)
Alahmadi, A.
2017-04-01
Up to now, most farmers in Albaha, Saudi Arabia are still practicing traditional methods, which are not optimized in terms of water usage, quality of product, etc. At the same time, ICT nowadays is becoming a key driver for innovation in farming. In this project, we propose a smart Internet of farming system to assist farmers in Albaha in optimizing their farm productivity by providing accurate information on the right time to harvest, fertilize, water and carry out other activities related to farming/agriculture technology. The proposed system utilizes a wireless sensor cloud to remotely capture important data such as temperature, humidity and soil condition (moisture, water level), which are then sent to storage servers in the Albaha University cloud. An adaptive knowledge engine processes the captured data into knowledge, and the farmers can retrieve the knowledge using their smartphones via the Internet.
Mobile Crowd Sensing for Traffic Prediction in Internet of Vehicles
Wan, Jiafu; Liu, Jianqi; Shao, Zehui; Vasilakos, Athanasios V.; Imran, Muhammad; Zhou, Keliang
2016-01-01
The advances in wireless communication techniques, mobile cloud computing, automotive and intelligent terminal technology are driving the evolution of vehicle ad hoc networks into the Internet of Vehicles (IoV) paradigm. This leads to a change in the vehicle routing problem from a calculation based on static data towards real-time traffic prediction. In this paper, we first address the taxonomy of cloud-assisted IoV from the viewpoint of the service relationship between cloud computing and IoV. Then, we review the traditional traffic prediction approaches used by both Vehicle to Infrastructure (V2I) and Vehicle to Vehicle (V2V) communications. On this basis, we propose a mobile crowd sensing technology to support the creation of dynamic route choices for drivers wishing to avoid congestion. Experiments were carried out to verify the proposed approaches. Finally, we discuss the outlook for reliable traffic prediction. PMID:26761013
Cloud-based image sharing network for collaborative imaging diagnosis and consultation
NASA Astrophysics Data System (ADS)
Yang, Yuanyuan; Gu, Yiping; Wang, Mingqing; Sun, Jianyong; Li, Ming; Zhang, Weiqiang; Zhang, Jianguo
2018-03-01
In this presentation, we present a new approach to designing a cloud-based image sharing network for collaborative imaging diagnosis and consultation over the Internet, which enables radiologists, specialists and physicians located at different sites to collaboratively and interactively perform imaging diagnosis or consultation for difficult or emergency cases. The designed network combines a regional RIS, grid-based image distribution management, an integrated video conferencing system and multi-platform interactive image display devices together with secured messaging and data communication. There are three kinds of components in the network: edge servers, a grid-based imaging document registry and repository, and multi-platform display devices. This network has been deployed on a public cloud platform of Alibaba through the Internet since March 2017 and used for small lung nodule and early-stage lung cancer diagnosis services between the Radiology departments of Huadong Hospital in Shanghai and the First Hospital of Jiaxing in Zhejiang Province.
Job Scheduling with Efficient Resource Monitoring in Cloud Datacenter
Loganathan, Shyamala; Mukherjee, Saswati
2015-01-01
Cloud computing is an on-demand computing model which uses virtualization technology to provide cloud resources to users in the form of virtual machines through the internet. Being an adaptable technology, cloud computing is an excellent alternative for organizations forming their own private cloud. Since resources are limited in these private clouds, maximizing the utilization of resources and giving guaranteed service to the user are the ultimate goals. For that, efficient scheduling is needed. This research reports on an efficient data structure for resource management and a resource scheduling technique in a private cloud environment and discusses a cloud model. The proposed scheduling algorithm considers the types of jobs and the resource availability in its scheduling decision. Finally, we conducted simulations using CloudSim and compared our algorithm with other existing methods, like V-MCT and priority scheduling algorithms. PMID:26473166
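The paper's own data structure and CloudSim experiments are not reproduced here; the toy scheduler below only sketches the stated idea of matching job type against current resource availability. The job types, VM capacities and scoring rule are assumptions for illustration.

```python
# Available VMs in a private cloud with remaining CPU cores and memory (GB).
vms = [
    {"name": "vm1", "cpu": 4, "mem": 8},
    {"name": "vm2", "cpu": 2, "mem": 16},
]

# Jobs with a declared type indicating their dominant resource demand.
jobs = [
    {"id": "j1", "type": "cpu_bound", "cpu": 3, "mem": 2},
    {"id": "j2", "type": "mem_bound", "cpu": 1, "mem": 12},
]

def schedule(job, vms):
    """Pick a VM that fits the job, preferring headroom in the resource the job stresses."""
    key = "cpu" if job["type"] == "cpu_bound" else "mem"
    candidates = [vm for vm in vms if vm["cpu"] >= job["cpu"] and vm["mem"] >= job["mem"]]
    if not candidates:
        return None  # no VM currently has enough free resources
    best = max(candidates, key=lambda vm: vm[key])
    best["cpu"] -= job["cpu"]
    best["mem"] -= job["mem"]
    return best["name"]

for job in jobs:
    print(job["id"], "->", schedule(job, vms))
```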
A Platform for Learning Internet of Things
ERIC Educational Resources Information Center
Bogdanovic, Zorica; Simic, Konstantin; Milutinovic, Miloš; Radenkovic, Božidar; Despotovic-Zrakic, Marijana
2014-01-01
This paper presents a model for conducting Internet of Things (IoT) classes based on a web-service oriented cloud platform. The goal of the designed model is to provide university students with knowledge about IoT concepts, possibilities, and business models, and allow them to develop basic system prototypes using general-purpose microdevices and…
Cloud Computing and Validated Learning for Accelerating Innovation in IoT
ERIC Educational Resources Information Center
Suciu, George; Todoran, Gyorgy; Vulpe, Alexandru; Suciu, Victor; Bulca, Cristina; Cheveresan, Romulus
2015-01-01
Innovation in Internet of Things (IoT) requires more than just creation of technology and use of cloud computing or big data platforms. It requires accelerated commercialization or aptly called go-to-market processes. To successfully accelerate, companies need a new type of product development, the so-called validated learning process.…
NASA Astrophysics Data System (ADS)
Qian, Ling; Luo, Zhiguo; Du, Yujian; Guo, Leitao
In order to support the maximum number of users and elastic service with the minimum resources, Internet service providers invented cloud computing. Within a few years, emerging cloud computing has become the hottest technology. From the publication of core papers by Google since 2003 to the commercialization of Amazon EC2 in 2006, and to the service offering of AT&T Synaptic Hosting, cloud computing has evolved from internal IT systems to a public service, from a cost-saving tool to a revenue generator, and from ISPs to telecoms. This paper introduces the concept, history, pros and cons of cloud computing as well as the value chain and standardization efforts.
Superior accuracy of model-based radiostereometric analysis for measurement of polyethylene wear
Stilling, M.; Kold, S.; de Raedt, S.; Andersen, N. T.; Rahbek, O.; Søballe, K.
2012-01-01
Objectives The accuracy and precision of two new methods of model-based radiostereometric analysis (RSA) were hypothesised to be superior to a plain radiograph method in the assessment of polyethylene (PE) wear. Methods A phantom device was constructed to simulate three-dimensional (3D) PE wear. Images were obtained consecutively for each simulated wear position for each modality. Three commercially available packages were evaluated: model-based RSA using laser-scanned cup models (MB-RSA), model-based RSA using computer-generated elementary geometrical shape models (EGS-RSA), and PolyWare. Precision (95% repeatability limits) and accuracy (Root Mean Square Errors) for two-dimensional (2D) and 3D wear measurements were assessed. Results The precision for 2D wear measures was 0.078 mm, 0.102 mm, and 0.076 mm for EGS-RSA, MB-RSA, and PolyWare, respectively. For the 3D wear measures the precision was 0.185 mm, 0.189 mm, and 0.244 mm for EGS-RSA, MB-RSA, and PolyWare respectively. Repeatability was similar for all methods within the same dimension, when compared between 2D and 3D (all p > 0.28). For the 2D RSA methods, accuracy was below 0.055 mm and at least 0.335 mm for PolyWare. For 3D measurements, accuracy was 0.1 mm, 0.2 mm, and 0.3 mm for EGS-RSA, MB-RSA and PolyWare respectively. PolyWare was less accurate compared with RSA methods (p = 0.036). No difference was observed between the RSA methods (p = 0.10). Conclusions For all methods, precision and accuracy were better in 2D, with RSA methods being superior in accuracy. Although less accurate and precise, 3D RSA defines the clinically relevant wear pattern (multidirectional). PolyWare is a good and low-cost alternative to RSA, despite being less accurate and requiring a larger sample size. PMID:23610688
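To make the two reported statistics concrete, the snippet below computes the Root Mean Square Error against a known simulated wear value and a 95% repeatability limit from repeated measurements. The numbers are invented, and the repeatability formula used (1.96 times the standard deviation of the repeated measurements) is one common convention, not necessarily the exact one used in this study.

```python
import math
import statistics

# Hypothetical repeated 2D wear measurements (mm) of one simulated wear position.
measured = [0.512, 0.498, 0.505, 0.521, 0.494]
true_wear = 0.500  # known phantom displacement

rmse = math.sqrt(sum((m - true_wear) ** 2 for m in measured) / len(measured))
repeatability_95 = 1.96 * statistics.stdev(measured)

print(f"accuracy (RMSE): {rmse:.3f} mm")
print(f"precision (95% repeatability limit): {repeatability_95:.3f} mm")
```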
Cho, Sung M; Moazami, Nader; Frontera, Jennifer A
2017-08-01
Ischemic stroke and intracranial hemorrhage (ICH) following left ventricular assist device (LVAD) placement are major causes of morbidity. The incidence and mortality associated with these events stratified by device type have not been systematically explored. A systematic review of PubMed was conducted from January 2007 through June 2016 for all English-language articles involving HeartMate II (HMII) and HeartWare LVAD patients. Ischemic stroke and/or ICH incidence (events per patient-year) and associated mortality rates were abstracted for each device type. Of 735 articles reviewed, 48 (11,310 patients) met inclusion criteria (33 HMII, six HeartWare, eight both devices, and one unspecified). The median duration of device support was 112 days (total 13,723 patient-years). Overall, ischemic stroke or ICH occurred in 9.8% (1110 persons and 0.08 events per patient year [EPPY]). Ischemic stroke occurred in a median of 6.0% or 0.06 EPPY (range 0-16% or 0-0.21 EPPY) of HMII patients versus 7.5% or 0.09 EPPY (range 4-17.1% or 0.01-0.94 EPPY) of HeartWare patients. ICH occurred in a median of 3.0% or 0.04 EPPY (range 0-13.5% or 0-0.13 EPPY) of HMII and 8.0% or 0.08 EPPY (range 3-23% or 0.01-0.56 EPPY) of HeartWare patients. The median mortality rate for LVAD-associated ischemic stroke was 31% (HMII: 33%, [range 2.4-75%] and HeartWare: 11.5% [range 3.9-40%]), and the median mortality rate following ICH was 71% (HMII: 75%, [range 3.9-100%] and HeartWare: 44%, [range 3.1-88%]). Ischemic stroke and ICH are common after LVAD placement, but heterogeneous event rates are reported in the literature. Given the high associated mortality, further prospective study is warranted.
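The event rates above are expressed in events per patient-year (EPPY); a minimal illustration of that arithmetic, with invented numbers, follows.

```python
# Invented example: 12 ischemic strokes observed across a cohort
# supported for a combined total of 150 patient-years.
events = 12
patient_years = 150.0

eppy = events / patient_years
print(f"{eppy:.2f} events per patient-year")  # 0.08 EPPY
```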
The Role of Networks in Cloud Computing
NASA Astrophysics Data System (ADS)
Lin, Geng; Devine, Mac
The confluence of technology advancements and business developments in Broadband Internet, Web services, computing systems, and application software over the past decade has created a perfect storm for cloud computing. The "cloud model" of delivering and consuming IT functions as services is poised to fundamentally transform the IT industry and rebalance the inter-relationships among end users, enterprise IT, software companies, and the service providers in the IT ecosystem (Armbrust et al., 2009; Lin, Fu, Zhu, & Dasmalchi, 2009).
mPano: cloud-based mobile panorama view from single picture
NASA Astrophysics Data System (ADS)
Li, Hongzhi; Zhu, Wenwu
2013-09-01
Panorama views provide people an informative and natural user experience for representing a whole scene. Advances in mobile augmented reality, mobile-cloud computing, and the mobile internet can enable panorama views on mobile phones with new functionalities, such as querying anytime and anywhere where a landmark picture is and what the whole scene looks like. Generating and exploring panorama views on mobile devices faces significant challenges due to the limited computing capacity, battery life, and memory size of mobile phones, as well as the bandwidth of mobile Internet connections. To address these challenges, this paper presents a novel cloud-based mobile panorama view system, namely "mPano", that can generate and view panoramas on mobile devices from a single picture. In our system, first, we propose a novel iterative multi-modal image retrieval (IMIR) approach to get spatially adjacent images using both tag and content information from the single picture. Second, we propose a cloud-based parallel server synthing approach to generate the panorama view in the cloud, against today's local-client synthing approach that is almost impossible for mobile phones. Third, we propose a predictive-cache solution to reduce the latency of image delivery from the cloud server to the mobile client. We have built a real mobile panorama view system and performed experiments. The experimental results demonstrated the effectiveness of our system and the proposed key component technologies, especially for landmark images.
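The predictive-cache component is described only at a high level; the sketch below illustrates one plausible reading of it, prefetching images adjacent to the tile currently viewed so the client rarely waits on the mobile link. The class, neighbour function and fetch callback are assumptions, not the authors' implementation.

```python
from collections import OrderedDict

class PredictiveCache:
    """Keep recently viewed tiles and prefetch their spatial neighbours."""

    def __init__(self, fetch, capacity=32):
        self.fetch = fetch            # callable that downloads a tile from the cloud
        self.capacity = capacity
        self.store = OrderedDict()    # tile id -> image data, in LRU order

    def get(self, tile, neighbours):
        if tile not in self.store:
            self.store[tile] = self.fetch(tile)   # cache miss: pay the latency once
        self.store.move_to_end(tile)
        for n in neighbours(tile):                # prefetch what the user likely pans to next
            if n not in self.store:
                self.store[n] = self.fetch(n)
        while len(self.store) > self.capacity:
            self.store.popitem(last=False)        # evict the least recently used tile
        return self.store[tile]

# Usage with stand-in fetch and neighbour functions (both hypothetical):
cache = PredictiveCache(fetch=lambda t: f"<image {t}>")
left_right = lambda t: [t - 1, t + 1]
print(cache.get(10, left_right))  # fetches 10, prefetches 9 and 11
print(cache.get(11, left_right))  # 11 was prefetched, so it is served locally
```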
Cloud Computing for radiologists
Kharat, Amit T; Safvi, Amjad; Thind, SS; Singh, Amarjit
2012-01-01
Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software and hardware on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, clients, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets radiology free from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future. PMID:23599560
XpressWare Installation User guide
NASA Astrophysics Data System (ADS)
Duffey, K. P.
XpressWare is a set of X terminal software, released by Tektronix Inc., that accommodates the X Window System on a range of host computers. The software comprises boot files (the X server image), configuration files, fonts, and font tools to support the X terminal. The files can be installed on one host or distributed across multiple hosts. The purpose of this guide is to present the system or network administrator with a step-by-step account of how to install XpressWare, and how subsequently to configure the X terminals appropriately for the environment in which they operate.
Power laws for the backscattering matrices in the case of lidar sensing of cirrus clouds
NASA Astrophysics Data System (ADS)
Kustova, Natalia V.; Konoshonkin, Alexander V.; Borovoi, Anatoli; Okamoto, Hajime; Sato, Kaori; Katagiri, Shuichiro
2017-11-01
The data bank of backscattering matrices of cirrus clouds that was calculated earlier by the authors and made available on the internet for free access has been replaced, in the case of randomly oriented crystals, by simple analytic equations. Four microphysical ratios conventionally measured by lidars have been calculated for different shapes and effective sizes of the crystals. These values could be used for retrieving the shapes of the crystals in cirrus clouds.
NASA Astrophysics Data System (ADS)
Delipetrev, Blagoj
2016-04-01
Presently, most existing software is desktop-based and designed to work on a single computer, which is a major limitation in many ways, starting from limited processing and storage power, accessibility, availability, etc. The only feasible solution lies in the web and the cloud. This abstract presents the research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first one (VM1) runs on Amazon Web Services (AWS) and the second one (VM2) runs on a Xen cloud platform. The presented cloud application is developed using free and open source software, open standards and prototype code, and provides a framework for developing specialized cloud geospatial applications that need only a web browser to be used. This cloud application is the ultimate collaboration geospatial platform because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available at all times, accessible from everywhere, scalable, works in a distributed computing environment, creates a real-time multi-user collaboration platform, uses interoperable programming language code and components, and is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services run on two VMs that communicate over the internet, providing services to users. The application was tested on the Zletovica river basin case study with concurrent multiple users. The application is a state-of-the-art cloud geospatial collaboration platform. The presented solution is a prototype and can be used as a foundation for developing any specialized cloud geospatial application. Further research will focus on distributing the cloud application across additional VMs and testing the scalability and availability of services.
Strueber, Martin; Larbalestier, Robert; Jansz, Paul; Zimpfer, Daniel; Fiane, Arnt E; Tsui, Steven; Simon, André; Schmitto, Jan D; Khaghani, Asghar; Wieselthaler, George M; Najarian, Kevin; Schueler, Stephan
2014-05-01
The post-market Registry to Evaluate the HeartWare Left Ventricular Assist System (ReVOLVE) is an investigator-initiated registry established to collect post-CE Mark Trial clinical data on patients receiving a HeartWare ventricular assist device (HVAD) in the European Union and Australia. The ReVOLVE is a multi-center, prospective, single-arm registry performed at seven centers in Europe and two in Australia. Herein we describe a total of 254 commercial HVAD implants according to labeled indications between February 2009 and November 2012. Summary statistics included patients' demographics, adverse events, length of support and outcomes. Compared with the clinical trial supporting the CE Mark of the HeartWare system, patient selection differed in that patients were older, and there were higher proportions of females and patients with idiopathic cardiomyopathies in the ReVOLVE cohort. Duration of support ranged from 1 to 1,057 days, with a mean of 363 ± 280 days (median 299.5 days). Transplantation was done in 56 patients (22%), explant for recovery was performed in 3 patients (1%), 43 died while on support (17%), and 152 (60%) remain on the device. Success in patients with the HeartWare system was 87% at 6 months, 85% at 1 year, 79% at 2 years and 73% at 3 years. Adverse event rates were low, comparable or improved when compared to the CE Mark Trial. Real-world use of the HeartWare system continues to demonstrate excellent clinical outcomes in patients supported with the device. Copyright © 2014 International Society for Heart and Lung Transplantation. Published by Elsevier Inc. All rights reserved.
Cloud Computing - A Unified Approach for Surveillance Issues
NASA Astrophysics Data System (ADS)
Rachana, C. R.; Banu, Reshma, Dr.; Ahammed, G. F. Ali, Dr.; Parameshachari, B. D., Dr.
2017-08-01
Cloud computing describes highly scalable resources provided as an external service via the Internet on a pay-per-use basis. From the economic point of view, the main attractiveness of cloud computing is that users only use what they need, and only pay for what they actually use. Resources are available for access from the cloud at any time, and from any location through networks. Cloud computing is gradually replacing the traditional Information Technology Infrastructure. Securing data is one of the leading concerns and biggest issues for cloud computing. Privacy of information is always a crucial point, especially when an individual's personal information or sensitive information is being stored in the organization. It is indeed true that today, cloud authorization systems are not robust enough. This paper presents a unified approach for analyzing the various security issues and techniques to overcome the challenges in the cloud environment.
Security Risks of Cloud Computing and Its Emergence as 5th Utility Service
NASA Astrophysics Data System (ADS)
Ahmad, Mushtaq
Cloud Computing is being projected by major cloud service providers such as IBM, Google, Yahoo, Amazon and others as the fifth utility, where clients will have access to processing for applications and software projects that need very high processing speed and huge data capacity for compute-intensive scientific and engineering research problems, as well as for e-business and data content network applications. These services for different types of clients are provided under DASM (Direct Access Service Management), based on virtualization of hardware, software and very high bandwidth Internet (Web 2.0) communication. The paper reviews these developments for Cloud Computing and the hardware/software configuration of the cloud paradigm. The paper also examines the vital aspects of security risks projected by IT industry experts and cloud clients, and highlights the cloud providers' response to cloud security risks.
Generic Module for Collecting Data in Smart Cities
NASA Astrophysics Data System (ADS)
Martinez, A.; Ramirez, F.; Estrada, H.; Torres, L. A.
2017-09-01
The Future Internet brings new technologies into people's everyday lives, such as the Internet of Things, Cloud Computing and Big Data. All these technologies have changed the way people communicate and the way devices interact with their context, giving rise to new paradigms such as smart cities. Currently, mobile devices represent one of the main sources of information for new applications that take the user context into account, such as apps for mobility, health, or security. Several platforms have been proposed for developing Future Internet applications; however, no generic modules can be found that implement the collection of context data from smartphones. In this research work we present a generic module to collect data from the different sensors of mobile devices and to send, in a standard manner, this data to the open FIWARE Cloud to be stored or analyzed by software tools. The proposed module enables the human-as-a-sensor approach for the FIWARE Platform.
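To make the "standard manner" concrete, the following minimal sketch posts one smartphone sensor reading to a FIWARE Orion Context Broker using the NGSI-v2 entity format. The broker address, entity type and attribute names are assumptions for illustration, not the module's actual schema.

import requests

ORION_URL = "http://orion.example.org:1026"   # assumed context broker address

def publish_reading(device_id, lat, lon, noise_db):
    # NGSI-v2 entity: an id, a type, and a set of typed attributes.
    entity = {
        "id": f"urn:ngsi-ld:SmartphoneSensor:{device_id}",
        "type": "SmartphoneSensor",
        "location": {
            "type": "geo:json",
            "value": {"type": "Point", "coordinates": [lon, lat]},
        },
        "noiseLevel": {"type": "Number", "value": noise_db},
    }
    # POST /v2/entities creates the entity in the context broker.
    r = requests.post(f"{ORION_URL}/v2/entities", json=entity)
    r.raise_for_status()

publish_reading("phone-42", 19.43, -99.13, 62.5)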
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-11
... KOREA: Top-of-the Stove Stainless Steel Cooking Ware A-580-601 1/1/09 - 12/31/09 THAILAND: Prestressed...-of-the-Stove Stainless Steel Cooking Ware C-580-602 1/1/09 - 12/31/09 Suspension Agreements MEXICO...
HeartWare HVAD for Biventricular Support in Children and Adolescents: The Stanford Experience.
Stein, Mary Lyn; Yeh, Justin; Reinhartz, Olaf; Rosenthal, David N; Kaufman, Beth D; Almond, Chris S; Hollander, Seth A; Maeda, Katsuhide
2016-01-01
Despite increasing use of mechanical circulatory support in children, experience with biventricular device implantation remains limited. We describe our experience using the HeartWare HVAD to provide biventricular support to three patients and compare these patients with five patients supported with HeartWare left ventricular assist device (LVAD). At the end of the study period, all three biventricular assist device (BiVAD) patients had been transplanted and were alive. LVAD patients were out of bed and ambulating a median of 10.5 days postimplantation. The BiVAD patients were out of bed a median of 31 days postimplantation. Pediatric patients with both left ventricular and biventricular heart failure can be successfully bridged to transplantation with the HeartWare HVAD. Rapid improvement in functional status following HVAD implantation for isolated left ventricular support is seen. Patients supported with BiVAD also demonstrate functional recovery, albeit more modestly. In the absence of infection, systemic inflammatory response raises concern for inadequate support.
ERIC Educational Resources Information Center
Ekufu, ThankGod K.
2012-01-01
Organizations are finding it difficult in today's economy to implement the vast information technology infrastructure required to effectively conduct their business operations. Despite the fact that some of these organizations are leveraging on the computational powers and the cost-saving benefits of computing on the Internet cloud, others…
46 CFR 390.5 - Agreement vessels.
Code of Federal Regulations, 2010 CFR
2010-10-01
... water-borne carriage of men, materials, goods or wares between: (i) Two points in the United States; (ii..., Great Lakes, noncontiguous domestic, or short sea transportation trade. (iv) Engaged primarily in the water-borne carriage of men, materials, goods or wares; and (v) Designated in the agreement as a...
46 CFR 390.5 - Agreement vessels.
Code of Federal Regulations, 2012 CFR
2012-10-01
... water-borne carriage of men, materials, goods or wares between: (i) Two points in the United States; (ii..., Great Lakes, noncontiguous domestic, or short sea transportation trade. (iv) Engaged primarily in the water-borne carriage of men, materials, goods or wares; and (v) Designated in the agreement as a...
46 CFR 390.5 - Agreement vessels.
Code of Federal Regulations, 2013 CFR
2013-10-01
... water-borne carriage of men, materials, goods or wares between: (i) Two points in the United States; (ii..., Great Lakes, noncontiguous domestic, or short sea transportation trade. (iv) Engaged primarily in the water-borne carriage of men, materials, goods or wares; and (v) Designated in the agreement as a...
46 CFR 390.5 - Agreement vessels.
Code of Federal Regulations, 2014 CFR
2014-10-01
... water-borne carriage of men, materials, goods or wares between: (i) Two points in the United States; (ii..., Great Lakes, noncontiguous domestic, or short sea transportation trade. (iv) Engaged primarily in the water-borne carriage of men, materials, goods or wares; and (v) Designated in the agreement as a...
46 CFR 390.5 - Agreement vessels.
Code of Federal Regulations, 2011 CFR
2011-10-01
... water-borne carriage of men, materials, goods or wares between: (i) Two points in the United States; (ii..., Great Lakes, noncontiguous domestic, or short sea transportation trade. (iv) Engaged primarily in the water-borne carriage of men, materials, goods or wares; and (v) Designated in the agreement as a...
78 FR 60245 - Privacy Act Systems of Records; LabWare Laboratory Information Management System
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-01
... Services Laboratories (NVSL). Diagnostic testing provides official test results for animal imports, exports.... Diagnostic testing is also done in connection with suspected foreign animal disease investigations and... of Records; LabWare Laboratory Information Management System AGENCY: Animal and Plant Health...
NASA Astrophysics Data System (ADS)
Smith, R.; Kasprzyk, J. R.; Zagona, E. A.
2015-12-01
Instead of building new infrastructure to increase their supply reliability, water resource managers are often tasked with better management of current systems. These managers often have existing simulation models that aid their planning, but lack methods for efficiently generating and evaluating planning alternatives. This presentation discusses how multiobjective evolutionary algorithm (MOEA) decision support can be used with the sophisticated water infrastructure model, RiverWare, in highly constrained water planning environments. We first discuss a study that performed a many-objective tradeoff analysis of water supply in the Tarrant Regional Water District (TRWD) in Texas. RiverWare is combined with the Borg MOEA to solve a seven-objective problem that includes systemwide performance objectives and individual reservoir storage reliability. Decisions within the formulation balance supply among multiple reservoirs and control pumping between the eastern and western parts of the system. The RiverWare simulation model is forced by two stochastic hydrology scenarios to show how management changes in wet versus dry conditions. The second part of the presentation suggests how a broader set of RiverWare-MOEA studies can inform tradeoffs in other systems, especially in political situations where multiple actors are in conflict over finite water resources. By incorporating quantitative representations of diverse parties' objectives during the search for solutions, MOEAs may provide support for negotiations and lead to more widely beneficial water management outcomes.
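The core of such a many-objective tradeoff analysis is identifying the nondominated (Pareto-optimal) solutions among the candidates a search algorithm produces. The sketch below shows only that filtering step on random toy data, assuming all seven objectives are to be minimized; the actual study couples the Borg MOEA with the RiverWare simulation, neither of which is reproduced here.

import numpy as np

def pareto_front(objectives):
    """objectives: (n_solutions, n_objectives) array, smaller is better."""
    n = objectives.shape[0]
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        if not keep[i]:
            continue
        # Solution j dominates i if it is <= in every objective and < in at least one.
        dominates = np.all(objectives <= objectives[i], axis=1) & \
                    np.any(objectives < objectives[i], axis=1)
        if dominates.any():
            keep[i] = False
    return np.where(keep)[0]

# Toy example: 200 random candidate policies scored on 7 objectives.
scores = np.random.rand(200, 7)
print("nondominated policies:", pareto_front(scores))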
Lu, Heqing; Zhang, Xiaofeng; Li, Bin
2017-09-30
By illustrating the design of a high-risk pregnancy maternal-fetal monitoring system based on the Internet of Things, this paper introduces the specific application of wearable medical devices to provide maternal-fetal mobile medical services. With the help of big data and a cloud obstetrics platform, the monitoring and warning network is further improved, level-to-level administration of high-risk pregnancy is realized, the level of perinatal health care is enhanced, and the risk of critical pregnancy emergencies is decreased.
CD-ROM Networking: Navigating through VINES and NetWare and the New Software Technologies.
ERIC Educational Resources Information Center
Lieberman, Paula
1995-01-01
Provides an overview of developments in CD-ROM networking technology and describes products offered by Axis, Banyan (VINES--network operating environment), CD Connection, Celerity, Data/Ware, Document Imaging Systems Corporation (DISC), Imagery, Jodian, Meridian, Micro Design International, Microsoft, Microtest, Novell, OnLine Computer Systems,…
Research on Data Mining of the Internet of Things Based on Cloud Computing Platform
NASA Astrophysics Data System (ADS)
Zhang, Wenqing
2018-02-01
With the development of society and the progress of information technology, China's information industry has made great progress and has gradually become an important pillar of national economic development. In this context, the gradual integration of information technology has promoted the construction of Internet of Things systems, pushing human life toward intelligent modernization. At present, developing the Internet of Things first requires fully mining the available data, which in turn provides users with better services and supports large-scale development of the Internet. This paper analyzes the meaning of the Internet of Things and discusses the characteristics of the Internet of Things and of data mining, with the aim of promoting improvement of the Internet of Things system in China and thereby higher efficiency.
A novel wearable device for continuous, non-invasion blood pressure measurement.
Xin, Qin; Wu, Jianping
2017-08-01
In this paper, we have developed a wearable cuffless device for daily blood pressure (BP) measurement. We incorporated a light-based sensor and other hardware in a small volume for BP detection. With an optimized algorithm, real-time BP readings can be achieved; the data can be presented on the screen and transmitted via the Internet of Things (IoT) for historical data comparison and multi-terminal viewing. Further analysis thus provides the possibility of diet or exercise suggestions and alarms. We measured BP in more than 60 subjects; compared with a traditional mercury blood pressure meter, no obvious error in either systolic blood pressure (SBP) or diastolic blood pressure (DBP) was detected. Such a device can be used for continuous non-invasive BP detection, and further data integration and health analysis can be achieved. Copyright © 2017. Published by Elsevier Ltd.
Cloudweaver: Adaptive and Data-Driven Workload Manager for Generic Clouds
NASA Astrophysics Data System (ADS)
Li, Rui; Chen, Lei; Li, Wen-Syan
Cloud computing denotes the latest trend in application development for parallel computing on massive data volumes. It relies on clouds of servers to handle tasks that used to be managed by an individual server. With cloud computing, software vendors can provide business intelligence and data analytic services for internet-scale data sets. Many open source projects, such as Hadoop, offer various software components that are essential for building a cloud infrastructure. Current Hadoop (and many others) requires users to configure cloud infrastructures via programs and APIs, and such configuration is fixed during runtime. In this chapter, we propose a workload manager (WLM), called CloudWeaver, which provides automated configuration of a cloud infrastructure for runtime execution. The workload management is data-driven and can adapt to the dynamic nature of operator throughput during different execution phases. CloudWeaver works for a single job and for a workload consisting of multiple jobs running concurrently, aiming at maximum throughput using a minimum set of processors.
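A minimal sketch of the data-driven adaptation idea described above: measure per-operator throughput during execution and shift a processor from the fastest stage to the current bottleneck. The operator names and numbers are hypothetical, not CloudWeaver's implementation.

def rebalance(workers, throughput):
    """workers, throughput: dicts keyed by operator name; returns updated workers."""
    bottleneck = min(throughput, key=throughput.get)   # slowest stage
    surplus = max(throughput, key=throughput.get)      # fastest stage
    if surplus != bottleneck and workers[surplus] > 1:
        workers[surplus] -= 1
        workers[bottleneck] += 1
    return workers

workers = {"scan": 4, "join": 4, "aggregate": 4}
throughput = {"scan": 900.0, "join": 220.0, "aggregate": 610.0}  # rows/s, measured
print(rebalance(workers, throughput))  # one worker shifts from scan to join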
A Cloud-Based Internet of Things Platform for Ambient Assisted Living
Cubo, Javier; Nieto, Adrián; Pimentel, Ernesto
2014-01-01
A common feature of ambient intelligence is that many objects are inter-connected and act in unison, which is also a challenge in the Internet of Things. There has been a shift in research towards integrating both concepts, considering the Internet of Things as representing the future of computing and communications. However, the efficient combination and management of heterogeneous things or devices in the ambient intelligence domain is still a tedious task, and it presents crucial challenges. Therefore, to appropriately manage the inter-connection of diverse devices in these systems requires: (1) specifying and efficiently implementing the devices (e.g., as services); (2) handling and verifying their heterogeneity and composition; and (3) standardizing and managing their data, so as to tackle large numbers of systems together, avoiding standalone applications on local servers. To overcome these challenges, this paper proposes a platform to manage the integration and behavior-aware orchestration of heterogeneous devices as services, stored and accessed via the cloud, with the following contributions: (i) we describe a lightweight model to specify the behavior of devices, to determine the order of the sequence of exchanged messages during the composition of devices; (ii) we define a common architecture using a service-oriented standard environment, to integrate heterogeneous devices by means of their interfaces, via a gateway, and to orchestrate them according to their behavior; (iii) we design a framework based on cloud computing technology, connecting the gateway in charge of acquiring the data from the devices with a cloud platform, to remotely access and monitor the data at run-time and react to emergency situations; and (iv) we implement and generate a novel cloud-based IoT platform of behavior-aware devices as services for ambient intelligence systems, validating the whole approach in real scenarios related to a specific ambient assisted living application. PMID:25093343
A cloud-based Internet of Things platform for ambient assisted living.
Cubo, Javier; Nieto, Adrián; Pimentel, Ernesto
2014-08-04
A common feature of ambient intelligence is that many objects are inter-connected and act in unison, which is also a challenge in the Internet of Things. There has been a shift in research towards integrating both concepts, considering the Internet of Things as representing the future of computing and communications. However, the efficient combination and management of heterogeneous things or devices in the ambient intelligence domain is still a tedious task, and it presents crucial challenges. Therefore, to appropriately manage the inter-connection of diverse devices in these systems requires: (1) specifying and efficiently implementing the devices (e.g., as services); (2) handling and verifying their heterogeneity and composition; and (3) standardizing and managing their data, so as to tackle large numbers of systems together, avoiding standalone applications on local servers. To overcome these challenges, this paper proposes a platform to manage the integration and behavior-aware orchestration of heterogeneous devices as services, stored and accessed via the cloud, with the following contributions: (i) we describe a lightweight model to specify the behavior of devices, to determine the order of the sequence of exchanged messages during the composition of devices; (ii) we define a common architecture using a service-oriented standard environment, to integrate heterogeneous devices by means of their interfaces, via a gateway, and to orchestrate them according to their behavior; (iii) we design a framework based on cloud computing technology, connecting the gateway in charge of acquiring the data from the devices with a cloud platform, to remotely access and monitor the data at run-time and react to emergency situations; and (iv) we implement and generate a novel cloud-based IoT platform of behavior-aware devices as services for ambient intelligence systems, validating the whole approach in real scenarios related to a specific ambient assisted living application.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-08-01
An estimated 85% of the installed base of software is a custom application with a production quantity of one. In practice, almost 100% of military software systems are custom software. Paradoxically, the marginal costs of producing additional units are near zero. So why hasn't the software market, a market with high design costs and low production costs, evolved like other similar custom widget industries, such as automobiles and hardware chips? The military software industry seems immune to market pressures that have motivated a multilevel supply chain structure in other widget industries: design cost recovery, improved quality through specialization, and rapid assembly from purchased components. The primary goal of the ComponentWare Consortium (CWC) technology plan was to overcome barriers to building and deploying mission-critical information systems by using verified, reusable software components (ComponentWare). The adoption of the ComponentWare infrastructure is predicated upon a critical mass of the leading platform vendors' inevitable adoption of emerging, object-based, distributed computing frameworks--initially CORBA and COM/OLE. The long-range goal of this work is to build and deploy military systems from verified reusable architectures. The promise of component-based applications is to enable developers to snap together new applications by mixing and matching prefabricated software components. A key result of this effort is the concept of reusable software architectures. A second important contribution is the notion that a software architecture is something that can be captured in a formal language and reused across multiple applications. The formalization and reuse of software architectures provide major cost and schedule improvements. The Unified Modeling Language (UML) is fast becoming the industry standard for object-oriented analysis and design notation for object-based systems. However, the lack of a standard real-time distributed object operating system, the lack of a standard Computer-Aided Software Environment (CASE) tool notation, and the lack of a standard CASE tool repository have limited the realization of component software. The approach to fulfilling this need is the software component factory innovation. The factory approach takes advantage of emerging standards such as UML, CORBA, Java and the Internet. The key technical innovation of the software component factory is the ability to assemble and test new system configurations, as well as to assemble new tools on demand from existing tools and architecture design repositories.
Developing cloud-based Business Process Management (BPM): a survey
NASA Astrophysics Data System (ADS)
Mercia; Gunawan, W.; Fajar, A. N.; Alianto, H.; Inayatulloh
2018-03-01
In today's highly competitive business environment, modern enterprises face difficulties in cutting unnecessary costs, eliminating waste and delivering large benefits for the organization. Companies are increasingly turning to a more flexible IT environment to help them realize this goal. For this reason, the article applies cloud-based Business Process Management (BPM), which enables a focus on modeling, monitoring and process management. Cloud-based BPM consists of business processes, business information and IT resources, which help build real-time intelligence systems based on business management and cloud technology. Cloud computing is a paradigm that involves procuring dynamically measurable resources over the internet as an IT resource service. A cloud-based BPM service makes it possible to address common problems faced by traditional BPM, especially in promoting flexible, event-driven business processes to exploit opportunities in the marketplace.
Impact of OpenCourseWare Publication on Higher Education Participation and Student Recruitment
ERIC Educational Resources Information Center
Carson, Stephen; Kanchanaraksa, Sukon; Gooding, Ira; Mulder, Fred; Schuwer, Robert
2012-01-01
The free and open publication of course materials (OpenCourseWare or OCW) was initially undertaken by Massachusetts Institute of Technology (MIT) and other universities primarily to share educational resources among educators (Abelson, 2007). OCW, however, and more in general open educational resources (OER), have also provided well-documented…
The OpenCourseWare Model: High-Impact Open Educational Content
ERIC Educational Resources Information Center
Carson, Stephen
2007-01-01
OpenCourseWare (OCW) is one among several models for offering open educational resources (OER). This article explains the OCW model and its position within the broader OER context. OCW primarily represents publication of existing course materials already in use for teaching purposes. OCW projects are most often institutional, carrying the…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-01
...-605... 12/1/09-11/30/10 Porcelain-On-Steel Cooking Ware A-583-508........ 12/1/09-11/30/10 Welded Astm... Fittings A-570-881...... 12/1/09-11/30/10 Porcelain-on-Steel Cooking Ware A-570-506........ 12/1/09-11/30...
NASA Astrophysics Data System (ADS)
Shichkina, Y. A.; Kupriyanov, M. S.; Moldachev, S. O.
2018-05-01
Today, descriptions of various Internet devices appear on the Internet very often. For the efficient operation of the Industrial Internet of Things, it is necessary to provide a modern level of data processing, from collecting data from devices to returning it to devices in processed form. Current Internet of Things solutions are mainly focused on centralized designs, projecting the Internet of Things onto a set of cloud-based platforms that are open but limit the ability of Internet of Things participants to adapt these systems to their own problems. Therefore, it is often necessary to create specialized software for specific areas of the Internet of Things. This article describes a solution to the problem of virtualizing a system of devices based on Docker. The solution allows developers to test any software on any number of devices forming a mesh.
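A minimal sketch of the described virtualization approach, assuming the Docker SDK for Python (the docker package): each container emulates one device, and any number of them can be attached to a shared network to form a test mesh. The image and network names are placeholders, not the article's actual setup.

import docker

client = docker.from_env()
client.networks.create("device-mesh", driver="bridge")

devices = []
for i in range(5):
    # Each container would run the device firmware/emulator image.
    c = client.containers.run(
        "device-emulator:latest",      # hypothetical image name
        name=f"device-{i}",
        network="device-mesh",
        detach=True,
    )
    devices.append(c)

print([d.name for d in devices])   # five virtual devices on one shared network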
Fine bakery wares with label claims in Europe and their categorisation by nutrient profiling models.
Trichterborn, J; Harzer, G; Kunz, C
2011-03-01
This study assesses a range of commercially available fine bakery wares with nutrition or health related on-pack communication against the criteria of selected nutrient profiling models. Different purposes of the application of nutrient profiles were considered, including front-of-pack signposting and the regulation of claims or advertising. More than 200 commercially available fine bakery wares carrying claims were identified in Germany, France, Spain, Sweden and United Kingdom and evaluated against five nutrient profiling models. All models were assessed regarding their underlying principles, generated results and inter-model agreement levels. Total energy, saturated fatty acids, sugars, sodium and fibre were critical parameters for the categorisation of products. The Choices Programme was the most restrictive model in this category, while the Food and Drug Administration model allowed the highest number of products to qualify. According to all models, more savoury than sweet products met the criteria. On average, qualifying products contained less than half the amounts of nutrients to limit and more than double the amount of fibre compared with all the products in the study. None of the models had a significant impact on the average energy contents. Nutrient profiles can be applied to identify fine bakery wares with a significantly better nutritional composition than the average range of products positioned as healthier. Important parameters to take into account include energy, saturated fatty acids, sugars, sodium and fibre. Different criteria sets for subcategories of fine bakery wares do not seem necessary.
NASA Astrophysics Data System (ADS)
Reddy, A.; Attaelmanan, A. G.; Mouton, M.
2012-07-01
The identification of more than 25% of the pottery sherds from the late PIR.D period (ca. 2nd - mid. 3rd c. AD) assemblage from the recently excavated building H at Mleiha as Indian is based on form and fabric, but using only visual assessment. Petrographic analysis of the fabrics can provide more precise indicators of the geographical origin of the wares. In this study, a total of 21 sherds from various key sites in Western India were compared with 7 different 'Indian' coarse-ware vessels sampled at Mleiha using X-ray fluorescence (XRF) spectrometry. The analyses were conducted on powdered samples collected from the core of each sherd. Each sample was irradiated for 1000 seconds using a 1.2 mm diameter X-ray beam. The resulting spectra were used for quantification of the X-ray intensity and elemental concentration. Levels of correlation in the elemental ratios of the sherds were statistically tested using an F-test as well as a Chi-test. Initial review of the XRF results indicates that the Maharashtra and Gujarat regions of India are probable source areas for at least two of the types of wares. Collection of additional samples from these areas and other regions of India, and further statistical analysis through methods such as Principal Component Analysis will help to isolate groups of wares from India and correlate them with types of vessels imported into the Oman peninsula in antiquity.
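As a sketch of the proposed follow-up analysis, the snippet below standardizes a small table of elemental concentrations and projects it onto two principal components with scikit-learn, so that compositionally similar wares cluster together. The element columns and values are illustrative, not the measured XRF data.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Rows = sherds, columns = element concentrations (e.g. Fe, Ti, Ca, K, Zr).
concentrations = np.array([
    [6.1, 0.80, 2.1, 1.9, 0.021],
    [5.8, 0.76, 2.4, 2.0, 0.019],
    [3.9, 0.55, 7.8, 1.2, 0.012],
    [4.1, 0.60, 7.1, 1.3, 0.013],
])

X = StandardScaler().fit_transform(concentrations)      # normalize each element
scores = PCA(n_components=2).fit_transform(X)           # project to 2-D
print(scores)   # compositionally similar wares should cluster in this space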
A new Information publishing system Based on Internet of things
NASA Astrophysics Data System (ADS)
Zhu, Li; Ma, Guoguang
2018-03-01
A new information publishing system based on the Internet of Things is proposed, composed of a four-level hierarchical structure including the screen identification layer, the network transport layer, the service management layer and the publishing application layer. In this architecture, the screen identification layer realizes an internet of screens in which geographically dispersed independent screens are connected to the internet by customized set-top boxes. The service management layer uses the MQTT protocol to implement a lightweight, broker-based publish/subscribe messaging mechanism suited to constrained environments such as the Internet of Things, to solve the bandwidth bottleneck. Meanwhile, cloud-based storage is used to store and manage the rapidly increasing volume of multimedia publishing information. The paper designs and realizes a prototype, SzIoScreen, and gives some related test results.
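A minimal sketch of the broker-based publish/subscribe step in the service management layer, using the Eclipse Paho MQTT client for Python. The broker address and topic layout are assumptions for illustration, not the SzIoScreen configuration.

import json
import paho.mqtt.client as mqtt

BROKER = "broker.example.org"          # hypothetical MQTT broker
TOPIC = "screens/lobby-01/playlist"    # hypothetical per-screen topic

def on_message(client, userdata, msg):
    # A set-top box subscribed to its screen's topic receives new content.
    playlist = json.loads(msg.payload)
    print("new playlist:", playlist)

# paho-mqtt 1.x style construction; 2.x additionally requires a callback_api_version argument.
subscriber = mqtt.Client()
subscriber.on_message = on_message
subscriber.connect(BROKER, 1883)
subscriber.subscribe(TOPIC)
subscriber.loop_start()

# The publishing application pushes updated content to the per-screen topic.
publisher = mqtt.Client()
publisher.connect(BROKER, 1883)
publisher.publish(TOPIC, json.dumps({"items": ["promo.mp4", "news.html"]}))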
Geometric Data Perturbation-Based Personal Health Record Transactions in Cloud Computing
Balasubramaniam, S.; Kavitha, V.
2015-01-01
Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud. PMID:25767826
Geometric data perturbation-based personal health record transactions in cloud computing.
Balasubramaniam, S; Kavitha, V
2015-01-01
Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud.
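A minimal numpy sketch of geometric data perturbation in the spirit described above: the numeric record matrix is multiplied by a random orthogonal (rotation) matrix, translated, and lightly noised before being outsourced; only the perturbed matrix leaves the client. This illustrates the general technique, not the authors' exact scheme or parameters.

import numpy as np

def geometric_perturbation(X, noise_scale=0.01, seed=0):
    """X: (n_records, n_attributes) numeric matrix; returns perturbed data and the key."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Random orthogonal (rotation) matrix obtained from a QR decomposition.
    Q, _ = np.linalg.qr(rng.normal(size=(d, d)))
    t = rng.normal(size=d)                              # random translation vector
    noise = rng.normal(scale=noise_scale, size=X.shape)
    return X @ Q + t + noise, (Q, t)

records = np.array([[63.0, 140.0, 82.0],                # e.g. age, weight, glucose
                    [45.0, 121.0, 95.0]])
perturbed, key = geometric_perturbation(records)
print(perturbed)   # only this perturbed matrix would be uploaded to the cloud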
NASA Astrophysics Data System (ADS)
Marinos, Alexandros; Briscoe, Gerard
Cloud Computing is rising fast, with its data centres growing at an unprecedented rate. However, this has come with concerns over privacy, efficiency at the expense of resilience, and environmental sustainability, because of the dependence on Cloud vendors such as Google, Amazon and Microsoft. Our response is an alternative model for the Cloud conceptualisation, providing a paradigm for Clouds in the community, utilising networked personal computers for liberation from the centralised vendor model. Community Cloud Computing (C3) offers an alternative architecture, created by combing the Cloud with paradigms from Grid Computing, principles from Digital Ecosystems, and sustainability from Green Computing, while remaining true to the original vision of the Internet. It is more technically challenging than Cloud Computing, having to deal with distributed computing issues, including heterogeneous nodes, varying quality of service, and additional security constraints. However, these are not insurmountable challenges, and with the need to retain control over our digital lives and the potential environmental consequences, it is a challenge we must pursue.
Modeling Dissolved Solids in the Rincon Valley, New Mexico Using RiverWare
NASA Astrophysics Data System (ADS)
Abudu, S.; Ahn, S. R.; Sheng, Z.
2017-12-01
Simulating the transport and storage of dissolved solids in surface water and the underlying alluvial aquifer is essential to evaluate the impacts of surface water operations, groundwater pumping, and climate variability on the spatial and temporal variability of salinity in the Rio Grande Basin. In this study, we developed a monthly RiverWare water quantity and quality model to simulate both the concentration and loads of dissolved solids for the Rincon Valley, New Mexico, from Caballo Reservoir to the Leasburg Dam segment of the Rio Grande. The measured flows, concentrations and loads of dissolved solids in the main stream and drains were used to develop the RiverWare model, using 1980-1988 data for calibration and 1989-1995 data for validation. The transport of salt is tracked using discretized salt and post-process approaches. Flow and salt exchange between the surface water and adjacent groundwater objects is computed using the "soil moisture salt with supplemental flow" method in RiverWare. In the groundwater objects, the "layered salt" method is used to simulate the concentration of dissolved solids in shallow groundwater storage. In addition, local inflows estimated under different weather conditions by a calibrated Soil and Water Assessment Tool (SWAT) were fed into RiverWare to refine the simulation of flow and dissolved solids. The results show that salt concentration and loads increased at Leasburg Dam, which indicates that the river collects salts from agricultural return flow and the underlying aquifer. The RiverWare model with the local inflow fed by SWAT delivered a better quantification of temporal and spatial salt exchange patterns between the river and the underlying aquifer. The results from the proposed modeling approach can be used to refine the current mass-balance budgets for dissolved-solids transport in the Rio Grande, and provide guidelines for planning and decision-making to control salinity in an arid river environment.
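The bookkeeping behind such a salinity simulation is flow-weighted salt mass balance. The sketch below mixes an upstream release, a drain return flow and a groundwater inflow to obtain the downstream concentration; the numbers are illustrative, not Rincon Valley data, and RiverWare's actual salt-transport methods are of course far richer.

def mix(flows_m3s, concentrations_mg_l):
    """Return (total flow, flow-weighted concentration) for a set of merging inflows."""
    load = sum(q * c for q, c in zip(flows_m3s, concentrations_mg_l))  # salt load = Q * C
    q_total = sum(flows_m3s)
    return q_total, load / q_total

# Upstream release, agricultural drain return flow, shallow groundwater inflow.
q_out, c_out = mix([12.0, 1.5, 0.8], [500.0, 1800.0, 2600.0])
print(f"downstream: {q_out:.1f} m3/s at {c_out:.0f} mg/L dissolved solids")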
Outcomes of HeartWare Ventricular Assist System support in 141 patients: a single-centre experience.
Wu, Long; Weng, Yu-Guo; Dong, Nian-Guo; Krabatsch, Thomas; Stepanenko, Alexander; Hennig, Ewald; Hetzer, Roland
2013-07-01
A third-generation ventricular assist device, the HeartWare Ventricular Assist System, has demonstrated its reliability and durability in animal models and clinical experience. However, studies of a large series of applications are still lacking. We evaluate the safety and efficacy of the HeartWare pump in 141 patients with end-stage heart failure at a single centre. A total of 141 patients (116 men and 25 women with a mean age of 52 years) in New York Heart Association (NYHA) Class IV received implantation of the HeartWare Ventricular Assist System between August 2009 and April 2011 at the Deutsches Herzzentrum Berlin. The outcomes were measured in terms of laboratory data, adverse events, NYHA functional class and survival during device support. The HeartWare system provided an adequate haemodynamic support for patients both inside and outside the hospital. NYHA class improved to I-II. Organ function and pulmonary vascular resistance improved significantly. In this cohort of patients, 14 patients underwent heart transplantation, one had had the device explanted following myocardial recovery, one had changed to another assist device, 81 were on ongoing support and 44 died. The overall actuarial survival rates at 6 and 12 months were 70 and 67%, respectively, and the 3-, 6- and 12-month survival rates on a left ventricular assist device (LVAD) support for bridge to transplantation patients were 82, 81 and 79%, respectively. Infection and bleeding were the main adverse events. Four patients underwent an LVAD exchange for pump thrombosis. The HeartWare system provides a safe and effective circulatory support in a population with a wide range of body surface areas, with a satisfactory actuarial survival time and an improved quality of life. It can be used for univentricular or biventricular support, being implanted into the pericardial space with simplified surgical techniques.
Bioinformatics clouds for big data manipulation.
Dai, Lin; Gao, Xin; Guo, Yan; Xiao, Jingfa; Zhang, Zhang
2012-11-28
As advances in life sciences and information technology bring profound influences on bioinformatics due to its interdisciplinary nature, bioinformatics is experiencing a new leap-forward from in-house computing infrastructure into utility-supplied cloud computing delivered over the Internet, in order to handle the vast quantities of biological data generated by high-throughput experimental technologies. Albeit relatively new, cloud computing promises to address big data storage and analysis issues in the bioinformatics field. Here we review extant cloud-based services in bioinformatics, classify them into Data as a Service (DaaS), Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS), and present our perspectives on the adoption of cloud computing in bioinformatics. This article was reviewed by Frank Eisenhaber, Igor Zhulin, and Sandor Pongor.
Fingerprinting Reverse Proxies Using Timing Analysis of TCP Flows
2013-09-01
This hidden traffic concept supports network access control, security protection through obfuscation, and performance boosts at the Internet facing...
Dynamic Optical Networks for Future Internet Environments
NASA Astrophysics Data System (ADS)
Matera, Francesco
2014-05-01
This article reports an overview on the evolution of the optical network scenario taking into account the exponential growth of connected devices, big data, and cloud computing that is driving a concrete transformation impacting the information and communication technology world. This hyper-connected scenario is deeply affecting relationships between individuals, enterprises, citizens, and public administrations, fostering innovative use cases in practically any environment and market, and introducing new opportunities and new challenges. The successful realization of this hyper-connected scenario depends on different elements of the ecosystem. In particular, it builds on connectivity and functionalities allowed by converged next-generation networks and their capacity to support and integrate with the Internet of Things, machine-to-machine, and cloud computing. This article aims at providing some hints of this scenario to contribute to analyze impacts on optical system and network issues and requirements. In particular, the role of the software-defined network is investigated by taking into account all scenarios regarding data centers, cloud computing, and machine-to-machine and trying to illustrate all the advantages that could be introduced by advanced optical communications.
A Self-Provisioning Mechanism in OpenStack for IoT Devices.
Solano, Antonio; Dormido, Raquel; Duro, Natividad; Sánchez, Juan Miguel
2016-08-17
The aim of this paper is to introduce a plug-and-play mechanism for an Internet of Things (IoT) device to instantiate a Software as a Service (SaaS) application in a private cloud built up with OpenStack. The SaaS application is the digital avatar of a physical object connected to the Internet. As a proof of concept, a vending machine is retrofitted and connected to the Internet with an Arduino Open Hardware device. Once the self-configuration mechanism is completed, it is possible to order a product from a mobile communication device.
A Self-Provisioning Mechanism in OpenStack for IoT Devices
Solano, Antonio; Dormido, Raquel; Duro, Natividad; Sánchez, Juan Miguel
2016-01-01
The aim of this paper is to introduce a plug-and-play mechanism for an Internet of Things (IoT) device to instantiate a Software as a Service (SaaS) application in a private cloud built up with OpenStack. The SaaS application is the digital avatar of a physical object connected to the Internet. As a proof of concept, a vending machine is retrofitted and connected to the Internet with an Arduino Open Hardware device. Once the self-configuration mechanism is completed, it is possible to order a product from a mobile communication device. PMID:27548166
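A minimal sketch of the self-provisioning step, assuming the openstacksdk package and a pre-configured cloud entry named "private": once the device announces itself, a server hosting its SaaS avatar is created. Image, flavor and network names are hypothetical, not the paper's configuration.

import openstack

# Credentials/endpoints are resolved from clouds.yaml or environment variables.
conn = openstack.connect(cloud="private")

server = conn.create_server(
    name="vending-machine-avatar-001",
    image="saas-avatar-image",       # hypothetical Glance image
    flavor="m1.small",
    network="iot-net",               # hypothetical tenant network
    wait=True,
)
print(server.status, server.name)    # the device's digital avatar is now running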
Positioning New Patterns of Privilege in Learning: A Response to Ware
ERIC Educational Resources Information Center
Paxton-Buursma, Debra J.; Mariage, Troy V.
2011-01-01
This special series represents collective courage because what is willing to be risked may be profound. At center is a willingness to reach out and cultivate new conversations on disability. Indeed, the artists who contribute to Ware's article are key co-authors; their art ushers us into a new disability literacy that extends and challenges…
On the Waring problem for polynomial rings
Fröberg, Ralf; Ottaviani, Giorgio; Shapiro, Boris
2012-01-01
In this note we discuss an analog of the classical Waring problem for polynomial rings. Namely, we show that a general homogeneous polynomial of degree divisible by k≥2 can be represented as a sum of at most k^n k-th powers of homogeneous polynomials in C[x_0, ..., x_n]. Noticeably, k^n coincides with the number obtained by a naive dimension count. PMID:22460787
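For context, a sketch (in our own words, not the paper's text) of where the naive dimension count mentioned in the abstract comes from:

% A sum of s k-th powers of degree-d forms in n+1 variables depends on at
% most s\binom{n+d}{n} parameters, while the space of forms of degree kd
% has dimension \binom{n+kd}{n}; comparing the two gives
\[
  s \;\ge\; \frac{\binom{n+kd}{n}}{\binom{n+d}{n}}
  \;\longrightarrow\; k^{n} \quad (d \to \infty),
\]
% which is the k^n appearing in the statement above.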
Reconciliation of the cloud computing model with US federal electronic health record regulations
2011-01-01
Cloud computing refers to subscription-based, fee-for-service utilization of computer hardware and software over the Internet. The model is gaining acceptance for business information technology (IT) applications because it allows capacity and functionality to increase on the fly without major investment in infrastructure, personnel or licensing fees. Large IT investments can be converted to a series of smaller operating expenses. Cloud architectures could potentially be superior to traditional electronic health record (EHR) designs in terms of economy, efficiency and utility. A central issue for EHR developers in the US is that these systems are constrained by federal regulatory legislation and oversight. These laws focus on security and privacy, which are well-recognized challenges for cloud computing systems in general. EHRs built with the cloud computing model can achieve acceptable privacy and security through business associate contracts with cloud providers that specify compliance requirements, performance metrics and liability sharing. PMID:21727204
Reconciliation of the cloud computing model with US federal electronic health record regulations.
Schweitzer, Eugene J
2012-01-01
Cloud computing refers to subscription-based, fee-for-service utilization of computer hardware and software over the Internet. The model is gaining acceptance for business information technology (IT) applications because it allows capacity and functionality to increase on the fly without major investment in infrastructure, personnel or licensing fees. Large IT investments can be converted to a series of smaller operating expenses. Cloud architectures could potentially be superior to traditional electronic health record (EHR) designs in terms of economy, efficiency and utility. A central issue for EHR developers in the US is that these systems are constrained by federal regulatory legislation and oversight. These laws focus on security and privacy, which are well-recognized challenges for cloud computing systems in general. EHRs built with the cloud computing model can achieve acceptable privacy and security through business associate contracts with cloud providers that specify compliance requirements, performance metrics and liability sharing.
The Optical Gravitational Lensing Experiment. Eclipsing Binary Stars in the Small Magellanic Cloud
NASA Astrophysics Data System (ADS)
Wyrzykowski, L.; Udalski, A.; Kubiak, M.; Szymanski, M. K.; Zebrun, K.; Soszynski, I.; Wozniak, P. R.; Pietrzynski, G.; Szewczyk, O.
2004-03-01
We present a new version of the OGLE-II catalog of eclipsing binary stars detected in the Small Magellanic Cloud, based on the Difference Image Analysis catalog of variable stars in the Magellanic Clouds containing data collected from 1997 to 2000. We found 1351 eclipsing binary stars in the central 2.4 square degree area of the SMC; 455 stars are newly discovered objects not found in the previous release of the catalog. The eclipsing objects were selected with an automatic search algorithm based on an artificial neural network. The full catalog is accessible from the OGLE Internet archive.
Students as Ground Observers for Satellite Cloud Retrieval Validation
NASA Technical Reports Server (NTRS)
Chambers, Lin H.; Costulis, P. Kay; Young, David F.; Rogerson, Tina M.
2004-01-01
The Students' Cloud Observations On-Line (S'COOL) Project was initiated in 1997 to obtain student observations of clouds coinciding with the overpass of the Clouds and the Earth's Radiant Energy System (CERES) instruments on NASA's Earth Observing System satellites. Over the past seven years we have accumulated more than 9,000 cases worldwide where student observations are available within 15 minutes of a CERES observation. This paper reports on comparisons between the student and satellite data as one facet of the validation of the CERES cloud retrievals. Available comparisons include cloud cover, cloud height, cloud layering, and cloud visual opacity. The large volume of comparisons allows some assessment of the impact of surface cover, such as snow and ice, reported by the students. The S'COOL observation database, accessible via the Internet at http://scool.larc.nasa.gov, contains over 32,000 student observations and is growing by over 700 observations each month. Some of these observations may be useful for assessment of other satellite cloud products. In particular, some observing sites have been making hourly observations of clouds during the school day to learn about the diurnal cycle of cloudiness.
Deng, Yong-Yuan; Chen, Chin-Ling; Tsaur, Woei-Jiunn; Tang, Yung-Wen; Chen, Jung-Hsuan
2017-12-15
As sensor networks and cloud computation technologies have rapidly developed over recent years, many services and applications integrating these technologies into daily life have come together as an Internet of Things (IoT). At the same time, aging populations have increased the need for expanded and more efficient elderly care services. Fortunately, elderly people can now wear sensing devices which relay data to a personal wireless device, forming a body area network (BAN). These personal wireless devices collect and integrate patients' personal physiological data, and then transmit the data to the backend of the network for related diagnostics. However, a great deal of the information transmitted by such systems is sensitive data, and must therefore be subject to stringent security protocols. Protecting this data from unauthorized access is thus an important issue in IoT-related research. In regard to a cloud healthcare environment, scholars have proposed a secure mechanism to protect sensitive patient information. Their schemes provide a general architecture; however, these previous schemes still have some vulnerability, and thus cannot guarantee complete security. This paper proposes a secure and lightweight body-sensor network based on the Internet of Things for cloud healthcare environments, in order to address the vulnerabilities discovered in previous schemes. The proposed authentication mechanism is applied to a medical reader to provide a more comprehensive architecture while also providing mutual authentication, and guaranteeing data integrity, user untraceability, and forward and backward secrecy, in addition to being resistant to replay attack.
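A minimal sketch of a challenge-response mutual authentication step of the kind the scheme above aims to provide between a body sensor and the medical reader, using a pre-shared key and HMAC from the Python standard library. This illustrates the general idea only; it is not the paper's protocol and omits key agreement, untraceability and forward/backward secrecy.

import hmac, hashlib, os

SHARED_KEY = os.urandom(32)   # provisioned into both sensor and reader beforehand

def respond(key, challenge):
    # Prove knowledge of the shared key without revealing it.
    return hmac.new(key, challenge, hashlib.sha256).digest()

# Reader authenticates the sensor with a fresh challenge...
reader_challenge = os.urandom(16)
sensor_response = respond(SHARED_KEY, reader_challenge)
assert hmac.compare_digest(sensor_response, respond(SHARED_KEY, reader_challenge))

# ...and the sensor authenticates the reader with its own challenge (mutual authentication).
sensor_challenge = os.urandom(16)
reader_response = respond(SHARED_KEY, sensor_challenge)
assert hmac.compare_digest(reader_response, respond(SHARED_KEY, sensor_challenge))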
Bioinformatics clouds for big data manipulation
2012-01-01
As advances in life sciences and information technology bring profound influences on bioinformatics due to its interdisciplinary nature, bioinformatics is experiencing a new leap-forward from in-house computing infrastructure into utility-supplied cloud computing delivered over the Internet, in order to handle the vast quantities of biological data generated by high-throughput experimental technologies. Albeit relatively new, cloud computing promises to address big data storage and analysis issues in the bioinformatics field. Here we review extant cloud-based services in bioinformatics, classify them into Data as a Service (DaaS), Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS), and present our perspectives on the adoption of cloud computing in bioinformatics. This article was reviewed by Frank Eisenhaber, Igor Zhulin, and Sandor Pongor. PMID:23190475
Second Language Vocabulary Acquisition from Language Input and from Form-Focused Activities
ERIC Educational Resources Information Center
Laufer, Batia
2009-01-01
Interest in L2 vocabulary learning and teaching started long before the nineteen-eighties (for references to earlier studies, see Rob Waring's database http://www1.harenet.ne.jp/~waring/vocab/vocrefs/vocref.html) but it declined with the advent of generative linguistics to the point of discrimination and neglect (Meara 1980). In 1986, I argued…
ERIC Educational Resources Information Center
Sheu, Feng-Ru; Shih, Meilun
2017-01-01
As freely adoptable digital resources, OpenCourseWare (OCW) have become a prominent form of Open Educational Resources (OER). More than 275 institutions in the worldwide OCW consortium have committed to creating free access open course materials. Despite the resources and efforts to create OCW worldwide, little understanding of its use exists.…
Challenges in the Adoption and Use of OpenCourseWare: Experience of the United Nations University
ERIC Educational Resources Information Center
Barrett, Brendan F. D.; Grover, Velma I.; Janowski, Tomasz; van Lavieren, Hanneke; Ojo, Adegboyega; Schmidt, Philipp
2009-01-01
This paper provides insights on the adoption or use of OpenCourseWare (OCW) to support broader research, training and institutional capacity development goals, based on the experience of the United Nations University. Specifically, it explains the strategic context for the use of OCW in the university through its related efforts in the area of…
75 FR 11991 - ABC & D Recycling, Inc.-Lease and Operation Exemption-a Line of Railroad in Ware, MA
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-12
... DEPARTMENT OF TRANSPORTATION Surface Transportation Board [STB Finance Docket No. 35356] ABC & D Recycling, Inc.--Lease and Operation Exemption--a Line of Railroad in Ware, MA ABC & D Recycling, Inc. (ABC & D), a noncarrier, has filed a verified notice of exemption under 49 CFR 1150.31 to lease from O...
OpenCourseWare, Global Access and the Right to Education: Real Access or Marketing Ploy?
ERIC Educational Resources Information Center
Huijser, Henk; Bedford, Tas; Bull, David
2008-01-01
This paper explores the potential opportunities that OpenCourseWare (OCW) offers in providing wider access to tertiary education, based on the ideal of "the right to education." It first discusses the wider implications of OCW, and its underlying philosophy, before using a case study of a tertiary preparation program (TPP) at the…
ERIC Educational Resources Information Center
Parker, Preston Paul
2011-01-01
This study examines perceived benefits and costs of instructors who contributed to the Massachusetts Institute of Technology (MIT) OpenCourseWare (OCW) project. While previous research has investigated the benefits and costs of OCW from the perspectives of the users and institution, the instructor's perspective is the focus of this qualitative…
ERIC Educational Resources Information Center
Bonk, Curtis J.; Lee, Mimi Miyoung; Kou, Xiaojing; Xu, Shuya; Sheu, Feng-Ru
2015-01-01
This research targeted the learning preferences, goals and motivations, achievements, challenges, and possibilities for life change of self-directed online learners who subscribed to the monthly OpenCourseWare (OCW) e-newsletter from MIT. Data collection included a 25-item survey of 1,429 newsletter subscribers; 613 of whom also completed an…
NASA Astrophysics Data System (ADS)
Aneri, Parikh; Sumathy, S.
2017-11-01
Cloud computing provides services over the internet, delivering application resources and data to users based on their demand. The basis of cloud computing is the consumer-provider model: the cloud provider offers resources that consumers can access through the cloud computing model in order to build applications according to their demand. A cloud data center is a large pool of shared resources for cloud users to access. Virtualization is the heart of the cloud computing model; it provides virtual machines with application-specific configurations, and those applications are free to choose their own configuration. On one hand there is a huge number of resources, and on the other hand a huge number of requests must be served effectively. Therefore, the resource allocation policy and scheduling policy play a very important role in allocating and managing resources in this cloud computing model. This paper proposes a load balancing policy using the Hungarian algorithm. The Hungarian algorithm provides a dynamic load balancing policy with a monitor component. The monitor component helps to increase cloud resource utilization by managing the Hungarian algorithm, monitoring its state and altering its state based on artificial intelligence. CloudSim, used in this proposal, is an extensible toolkit that simulates the cloud computing environment.
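A minimal sketch of the assignment step at the heart of such a policy, using SciPy's linear_sum_assignment (an implementation of the Hungarian algorithm) to map incoming requests to hosts at minimum total cost. The cost matrix here is invented; in the proposal it would be derived from the monitor component's measurements inside CloudSim.

import numpy as np
from scipy.optimize import linear_sum_assignment

# cost[i, j] = cost (e.g. expected load imbalance) of placing request i on host j.
cost = np.array([
    [4.0, 1.0, 3.0],
    [2.0, 0.5, 5.0],
    [3.0, 2.0, 2.0],
])

requests, hosts = linear_sum_assignment(cost)   # optimal one-to-one assignment
for r, h in zip(requests, hosts):
    print(f"request {r} -> host {h} (cost {cost[r, h]})")
print("total cost:", cost[requests, hosts].sum())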
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malik, Saif Ur Rehman; Khan, Samee U.; Ewen, Sam J.
2015-03-14
As we delve deeper into the 'Digital Age', we witness an explosive growth in the volume, velocity, and variety of the data available on the Internet. For example, in 2012 about 2.5 quintillion bytes of data was created on a daily basis, originating from a myriad of sources and applications including mobile devices, sensors, individual archives, social networks, the Internet of Things, enterprises, cameras, software logs, etc. Such 'data explosions' have led to one of the most challenging research issues of the current Information and Communication Technology era: how to optimally manage (e.g., store, replicate, filter, and the like) such large amounts of data and identify new ways to analyze large amounts of data for unlocking information. It is clear that such large data streams cannot be managed by setting up on-premises enterprise database systems, as that leads to a large up-front cost in buying and administering the hardware and software systems. Therefore, next generation data management systems must be deployed on the cloud. The cloud computing paradigm provides scalable and elastic resources, such as data and services, accessible over the Internet. Every Cloud Service Provider must assure that data is efficiently processed and distributed in a way that does not compromise end-users' Quality of Service (QoS) in terms of data availability, data search delay, data analysis delay, and the like. In the aforementioned perspective, data replication is used in the cloud for improving the performance (e.g., read and write delay) of applications that access data. Through replication, a data-intensive application or system can achieve high availability, better fault tolerance, and data recovery. In this paper, we survey data management and replication approaches (from 2007 to 2011) developed by both industrial and research communities. The focus of the survey is to discuss and characterize the existing approaches to data replication and management that tackle resource usage and QoS provisioning with different levels of efficiency. Moreover, the breakdown of both influential expressions (data replication and management) to provide different QoS attributes is deliberated. Furthermore, the performance advantages and disadvantages of data replication and management approaches in cloud computing environments are analyzed. Open issues and future challenges related to data consistency, scalability, load balancing, processing and placement are also reported.
Reviews on Security Issues and Challenges in Cloud Computing
NASA Astrophysics Data System (ADS)
An, Y. Z.; Zaaba, Z. F.; Samsudin, N. F.
2016-11-01
Cloud computing is an Internet-based computing service provided by third parties, allowing the sharing of resources and data among devices. It is widely used in many organizations nowadays and is becoming more popular because it changes the way the Information Technology (IT) of an organization is organized and managed. It provides many benefits such as simplicity and lower costs, almost unlimited storage, least maintenance, easy utilization, backup and recovery, continuous availability, quality of service, automated software integration, scalability, flexibility and reliability, easy access to information, elasticity, quick deployment and a lower barrier to entry. While the use of cloud computing services is increasing in this new era, the security issues of cloud computing become a challenge. Cloud computing must be safe and secure enough to ensure the privacy of its users. This paper first lists the architecture of cloud computing, then discusses the most common security issues of using the cloud and some solutions to these security issues, since security is one of the most critical aspects of cloud computing due to the sensitivity of users' data.
2014-09-01
becoming a more and more prevalent technology in the business world today. According to Syal and Goswami (2012), cloud technology is seen as a ... use of computing resources, applications, and personal files without reliance on a single computer or system (Syal & Goswami, 2012). By operating in ... cloud services largely being web-based, which can be retrieved through most systems with access to the Internet (Syal & Goswami, 2012). The end user can ...
ERIC Educational Resources Information Center
Yang, Hui-Chi; Sun, Yu-Chih
2013-01-01
OpenCourseWare (OCW) has received increasing attention over the past few years in higher education. These courses provide appealing opportunities to view classes taught in well-established universities worldwide. The current study aims to examine how OCW lectures can serve as authentic learning materials to facilitate vocabulary acquisition for…
Enabling Secure XMPP Communications in Federated IoT Clouds Through XEP 0027 and SAML/SASL SSO
Celesti, Antonio; Fazio, Maria; Villari, Massimo
2017-01-01
Nowadays, in the panorama of the Internet of Things (IoT), finding the right compromise between interactivity and security is not trivial at all. Currently, most pervasive communication technologies are designed to work locally. As a consequence, the development of large-scale Internet services and applications is not easy for IoT Cloud providers. The main issue is that both IoT architectures and services started out simple but are becoming more and more complex. Consequently, the web service technology is often inappropriate. Recently, many operators in both academia and industry are considering the adoption of the eXtensible Messaging and Presence Protocol (XMPP) for the implementation of IoT Cloud communication systems. In fact, XMPP offers many advantages in terms of real-time capabilities, efficient data distribution, service discovery and inter-domain communication compared to other technologies. Nevertheless, the protocol lacks native security, data confidentiality and trustworthy federation features. In this paper, considering an XMPP-based IoT Cloud architectural model, we discuss how message signing/encryption and Single Sign-On (SSO) authentication can be enforced for secure inter-module and inter-domain communications, respectively, in a federated environment. Experiments prove that the security mechanisms introduce an acceptable overhead, considering the obvious advantages achieved in terms of data trustworthiness and privacy. PMID:28178214
IoT-based flood embankments monitoring system
NASA Astrophysics Data System (ADS)
Michta, E.; Szulim, R.; Sojka-Piotrowska, A.; Piotrowski, K.
2017-08-01
In this paper, a concept for a flood embankment monitoring system based on the Internet of Things approach and cloud computing technologies is presented. The proposed system consists of sensors, IoT nodes, gateways and cloud-based services. Nodes communicate with the sensors measuring physical parameters that describe the state of the embankments and communicate with the gateways. Gateways are specialized active devices responsible for direct communication with the nodes; they collect sensor data, preprocess the data, apply local rules, and communicate with the cloud services using the communication API delivered by cloud service providers. An architecture for all of the system components is proposed, covering the IoT device functionalities, their communication model, and the software modules and services based on a public cloud computing platform such as Microsoft Azure. The most important aspects of maintaining the communication in a secure way are also shown.
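A hedged sketch of the gateway role described above: poll sensors, apply a local alert rule, and forward readings to a cloud endpoint. The endpoint URL, the sensor-reading function and the threshold are assumptions for illustration, not details from the paper.

```python
# Illustrative gateway loop: poll sensors, apply a local rule, forward to the cloud.
# The endpoint URL, read_sensors() contents and threshold are invented for the sketch.
import json, time, random
import urllib.request

CLOUD_ENDPOINT = "https://example-cloud.invalid/api/embankment/readings"  # hypothetical
WATER_LEVEL_ALERT_M = 4.5  # hypothetical local-rule threshold

def read_sensors():
    # Placeholder for real water-level / pore-pressure sensors attached to a node.
    return {"water_level_m": round(random.uniform(3.0, 5.0), 2),
            "pore_pressure_kpa": round(random.uniform(10, 40), 1),
            "timestamp": time.time()}

def forward(reading):
    body = json.dumps(reading).encode()
    req = urllib.request.Request(CLOUD_ENDPOINT, data=body,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req, timeout=10)  # fails without a real endpoint

for _ in range(3):  # a real gateway would run this loop continuously
    reading = read_sensors()
    reading["local_alert"] = reading["water_level_m"] > WATER_LEVEL_ALERT_M
    try:
        forward(reading)
    except OSError:
        pass  # buffer locally and retry in a real deployment
    time.sleep(1)
```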
Ahmed, Abdulghani Ali; Xue Li, Chua
2018-01-01
Cloud storage services allow users to store their data online, so that they can remotely access, maintain, manage, and back up data from anywhere via the Internet. Although helpful, this storage creates a challenge for digital forensic investigators and practitioners in collecting, identifying, acquiring, and preserving evidential data. This study proposes an investigation scheme for analyzing data remnants and determining probative artifacts in a cloud environment. Using pCloud as a case study, this research collected the data remnants available on end-user device storage following the storing, uploading, and accessing of data in the cloud storage. Data remnants were collected from several sources, including client software files, directory listings, prefetch, registry, network PCAP, browser, memory, and link files. Results demonstrate that the collected remnant data are beneficial in determining a sufficient number of artifacts about the investigated cybercrime. © 2017 American Academy of Forensic Sciences.
Cloud@Home: A New Enhanced Computing Paradigm
NASA Astrophysics Data System (ADS)
Distefano, Salvatore; Cunsolo, Vincenzo D.; Puliafito, Antonio; Scarpa, Marco
Cloud computing is a distributed computing paradigm that mixes aspects of Grid computing ("… hardware and software infrastructure that provides dependable, consistent, pervasive, and inexpensive access to high-end computational capabilities" (Foster, 2002)), Internet computing ("… a computing platform geographically distributed across the Internet" (Milenkovic et al., 2003)), Utility computing ("a collection of technologies and business practices that enables computing to be delivered seamlessly and reliably across multiple computers, ... available as needed and billed according to usage, much like water and electricity are today" (Ross & Westerman, 2004)), Autonomic computing ("computing systems that can manage themselves given high-level objectives from administrators" (Kephart & Chess, 2003)), Edge computing ("… provides a generic template facility for any type of application to spread its execution across a dedicated grid, balancing the load …" (Davis, Parikh, & Weihl, 2004)) and Green computing (a new frontier of ethical computing starting from the assumption that in the near future energy costs will be tied to environmental pollution).
Cyber physical systems role in manufacturing technologies
NASA Astrophysics Data System (ADS)
Al-Ali, A. R.; Gupta, Ragini; Nabulsi, Ahmad Al
2018-04-01
Empowered by recent developments in single System-on-Chip, Internet of Things, and cloud computing technologies, cyber physical systems are evolving as major controllers during and after the product manufacturing process. In addition to their real physical space, cyber products nowadays have a virtual space. A product's virtual space is a digital twin attached to it that enables manufacturers and their clients to better manufacture, monitor, maintain and operate it throughout its life cycle, i.e. from the product's manufacturing date, through operation, to the end of its lifespan. Each product is equipped with a tiny microcontroller that has a unique identification number, an access code and WiFi connectivity so it can be accessed anytime and anywhere during its life cycle. This paper presents the cyber physical systems architecture and its role in manufacturing. It also highlights the role of the Internet of Things and cloud computing in industrial manufacturing and factory automation.
Research on cloud-based remote measurement and analysis system
NASA Astrophysics Data System (ADS)
Gao, Zhiqiang; He, Lingsong; Su, Wei; Wang, Can; Zhang, Changfan
2015-02-01
The promising potential of cloud computing and its convergence with technologies such as cloud storage, cloud push and mobile computing allows for the creation and delivery of new types of cloud service. Building on the ideas of cloud computing, this paper presents a cloud-based remote measurement and analysis system. The system consists of three parts: a signal acquisition client, a web server deployed in the cloud, and a remote client. The system is a dedicated website developed using ASP.NET and Flex RIA technology, which resolves the trade-off between the two common monitoring modes, B/S and C/S. The platform supplies condition monitoring and data analysis services to customers over the Internet and is deployed on a cloud server. The signal acquisition device is responsible for collecting data (sensor data, audio, video, etc.) and pushes the monitoring data to the cloud storage database at regular intervals. Data acquisition equipment in this system only needs data collection and network capability, such as a smartphone or a smart sensor. The system's scale can adjust dynamically according to the number of applications and users, so it does not waste resources. As a representative case study, we developed a prototype system based on the Ali cloud service using a rotor test rig as the research object. Experimental results demonstrate that the proposed system architecture is feasible.
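A small sketch of the kind of analysis a cloud-side service might run on vibration data pushed from a rotor test rig; the sampling rate, signal and rotation frequency are synthetic assumptions, not values from the paper.

```python
# Toy spectral analysis of a pushed vibration record; all numbers are made up.
import numpy as np

fs = 2048  # Hz, assumed sampling rate
t = np.arange(0, 1.0, 1.0 / fs)
# Synthetic rotor signal: 50 Hz rotation frequency, a 2x harmonic, plus noise.
signal = (np.sin(2 * np.pi * 50 * t)
          + 0.3 * np.sin(2 * np.pi * 100 * t)
          + 0.05 * np.random.randn(t.size))

spectrum = np.abs(np.fft.rfft(signal)) / t.size
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)

peak = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
print(f"Dominant vibration component: {peak:.1f} Hz")
```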
Design for Run-Time Monitor on Cloud Computing
NASA Astrophysics Data System (ADS)
Kang, Mikyung; Kang, Dong-In; Yun, Mira; Park, Gyung-Leen; Lee, Junghoon
Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications, as well as infrastructure, as services over the Internet. A cloud is a type of parallel and distributed system consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. Large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design a Run-Time Monitor (RTM), which is system software that monitors application behavior at run-time, analyzes the collected information, and optimizes resources on cloud computing. RTM monitors application software through library instrumentation, as well as the underlying hardware through performance counters, optimizing its computing configuration based on the analyzed data.
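A toy illustration of the instrumentation idea: a decorator that records call counts and execution time for monitored functions. This is far simpler than the RTM's library instrumentation and hardware performance counters; the function and metric names are invented.

```python
# Toy run-time monitor: instrument functions with a decorator and keep simple counters.
import time
from collections import defaultdict

metrics = defaultdict(lambda: {"calls": 0, "total_s": 0.0})

def monitored(fn):
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            m = metrics[fn.__name__]
            m["calls"] += 1
            m["total_s"] += time.perf_counter() - start
    return wrapper

@monitored
def render_frames(n):
    return sum(i * i for i in range(n))

render_frames(100_000)
render_frames(200_000)
for name, m in metrics.items():
    print(name, m["calls"], f"{m['total_s']:.4f}s")
```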
Background Characterization Techniques For Pattern Recognition Applications
NASA Astrophysics Data System (ADS)
Noah, Meg A.; Noah, Paul V.; Schroeder, John W.; Kessler, Bernard V.; Chernick, Julian A.
1989-08-01
The Department of Defense has a requirement to investigate technologies for the detection of air and ground vehicles in a clutter environment. The use of autonomous systems using infrared, visible, and millimeter wave detectors has the potential to meet DOD's needs. In general, however, the hardware technology (large detector arrays with high sensitivity) has outpaced the development of processing techniques and software. In a complex background scene the "problem" is as much one of clutter rejection as it is target detection. The work described in this paper has investigated a new, and innovative, methodology for background clutter characterization, target detection and target identification. The approach uses multivariate statistical analysis to evaluate a set of image metrics applied to infrared cloud imagery and terrain clutter scenes. The techniques are applied to two distinct problems: the characterization of atmospheric water vapor cloud scenes for the Navy's Infrared Search and Track (IRST) applications to support the Infrared Modeling Measurement and Analysis Program (IRAMMP); and the detection of ground vehicles for the Army's Autonomous Homing Munitions (AHM) problems. This work was sponsored under two separate Small Business Innovative Research (SBIR) programs by the Naval Surface Warfare Center (NSWC), White Oak MD, and the Army Material Systems Analysis Activity at Aberdeen Proving Ground MD. The software described in this paper will be available from the respective contract technical representatives.
Deng, Yong-Yuan; Chen, Chin-Ling; Tsaur, Woei-Jiunn; Tang, Yung-Wen; Chen, Jung-Hsuan
2017-01-01
As sensor networks and cloud computing technologies have rapidly developed over recent years, many services and applications integrating these technologies into daily life have come together as an Internet of Things (IoT). At the same time, aging populations have increased the need for expanded and more efficient elderly care services. Fortunately, elderly people can now wear sensing devices which relay data to a personal wireless device, forming a body area network (BAN). These personal wireless devices collect and integrate patients' personal physiological data, and then transmit the data to the backend of the network for related diagnostics. However, a great deal of the information transmitted by such systems is sensitive data, and must therefore be subject to stringent security protocols. Protecting this data from unauthorized access is thus an important issue in IoT-related research. In regard to a cloud healthcare environment, scholars have proposed a secure mechanism to protect sensitive patient information. Their schemes provide a general architecture; however, these previous schemes still have some vulnerabilities, and thus cannot guarantee complete security. This paper proposes a secure and lightweight body-sensor network based on the Internet of Things for cloud healthcare environments, in order to address the vulnerabilities discovered in previous schemes. The proposed authentication mechanism is applied to a medical reader to provide a more comprehensive architecture while also providing mutual authentication, and guaranteeing data integrity, user untraceability, and forward and backward secrecy, in addition to being resistant to replay attack. PMID:29244776
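A hedged sketch of generic challenge-response mutual authentication with a pre-shared key, to illustrate the "mutual authentication" goal in plain code; this is a textbook pattern, not the specific lightweight protocol proposed in the paper.

```python
# Generic mutual authentication between a body sensor and a medical reader using a
# pre-shared key and HMAC challenge-response; not the paper's actual protocol.
import hmac, hashlib, os

SHARED_KEY = os.urandom(32)  # provisioned to both sensor and reader in advance

def respond(key, challenge):
    return hmac.new(key, challenge, hashlib.sha256).digest()

# Reader authenticates the sensor.
reader_challenge = os.urandom(16)
sensor_response = respond(SHARED_KEY, reader_challenge)
assert hmac.compare_digest(sensor_response, respond(SHARED_KEY, reader_challenge))

# Sensor authenticates the reader (the mutual direction).
sensor_challenge = os.urandom(16)
reader_response = respond(SHARED_KEY, sensor_challenge)
assert hmac.compare_digest(reader_response, respond(SHARED_KEY, sensor_challenge))

print("Mutual authentication succeeded")
```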
[Chapter 2. Internet of Things help to collect Big Data].
Brouard, Benoît
2017-10-27
According to the report "The Internet of Things Market", the number of connected devices will reach 68 billion in 2020. In 2012, the total amount of data was 500 petabytes. So, after the race to increase computational power, the challenge now lies in the capacity to store all these data in the cloud, to open their access and to analyze them properly. The use of these data is a major challenge for medical research and public health.
"Cloud" functions and templates of engineering calculations for nuclear power plants
NASA Astrophysics Data System (ADS)
Ochkov, V. F.; Orlov, K. A.; Ko, Chzho Ko
2014-10-01
The article deals with an important problem of setting up computer-aided design calculations of various circuit configurations and power equipment carried out using the templates and standard computer programs available in the Internet. Information about the developed Internet-based technology for carrying out such calculations using the templates accessible in the Mathcad Prime software package is given. The technology is considered taking as an example the solution of two problems relating to the field of nuclear power engineering.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vang, Leng; Prescott, Steven R; Smith, Curtis
In a collaborative scientific research arena it is important to have an environment where analysts have access to a shared repository of information, documents and software tools, and are able to accurately maintain and track historical changes in models. A new cloud-based environment would be accessible remotely from anywhere, regardless of computing platform, provided the platform has Internet access and proper browser capabilities. Information stored in this environment would be restricted based on user-assigned credentials. This report reviews the development of Cloud-based Architecture Capabilities (CAC) as a web portal for PRA tools.
Cloud-Based Data Sharing Connects Emergency Managers
NASA Technical Reports Server (NTRS)
2014-01-01
Under an SBIR contract with Stennis Space Center, Baltimore-based StormCenter Communications Inc. developed an improved interoperable platform for sharing geospatial data over the Internet in real time, information that is critical for decision makers in emergency situations.
Capturing and analyzing wheelchair maneuvering patterns with mobile cloud computing.
Fu, Jicheng; Hao, Wei; White, Travis; Yan, Yuqing; Jones, Maria; Jan, Yih-Kuen
2013-01-01
Power wheelchairs have been widely used to provide independent mobility to people with disabilities. Despite great advancements in power wheelchair technology, research shows that wheelchair related accidents occur frequently. To ensure safe maneuverability, capturing wheelchair maneuvering patterns is fundamental to enable other research, such as safe robotic assistance for wheelchair users. In this study, we propose to record, store, and analyze wheelchair maneuvering data by means of mobile cloud computing. Specifically, the accelerometer and gyroscope sensors in smart phones are used to record wheelchair maneuvering data in real-time. Then, the recorded data are periodically transmitted to the cloud for storage and analysis. The analyzed results are then made available to various types of users, such as mobile phone users, traditional desktop users, etc. The combination of mobile computing and cloud computing leverages the advantages of both techniques and extends the smart phone's capabilities of computing and data storage via the Internet. We performed a case study to implement the mobile cloud computing framework using Android smart phones and Google App Engine, a popular cloud computing platform. Experimental results demonstrated the feasibility of the proposed mobile cloud computing framework.
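A simplified sketch of the kind of feature extraction a cloud backend could run on batched accelerometer and gyroscope samples from a smartphone; the sample values, the features and the sharp-turn threshold are invented for illustration and are not taken from the study.

```python
# Illustrative feature extraction from batched accelerometer/gyroscope samples.
import math

samples = [  # (ax, ay, az in m/s^2, yaw_rate in deg/s) - synthetic values
    (0.1, 0.2, 9.8, 2.0), (0.3, -0.1, 9.7, 15.0), (0.2, 0.0, 9.9, 30.0),
    (0.0, 0.1, 9.8, 12.0), (-0.1, 0.2, 9.8, 1.0),
]

def features(batch):
    mags = [math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az, _ in batch]
    yaw = [abs(w) for *_, w in batch]
    return {"mean_accel_mag": sum(mags) / len(mags),
            "max_turn_rate_dps": max(yaw),
            "sharp_turns": sum(1 for w in yaw if w > 20)}  # threshold is an assumption

print(features(samples))
```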
Generic-distributed framework for cloud services marketplace based on unified ontology.
Hasan, Samer; Valli Kumari, V
2017-11-01
Cloud computing is a model for delivering ubiquitous, on-demand computing resources based on a pay-as-you-use financial model. Typically, cloud providers advertise cloud service descriptions in various formats on the Internet. Cloud consumers, on the other hand, use available search engines (Google and Yahoo) to explore cloud service descriptions and find an adequate service. Unfortunately, general-purpose search engines are not designed to return a small and complete set of results, which makes the process a big challenge. This paper presents a generic distributed framework for a cloud services marketplace to automate the cloud service discovery and selection process and remove the barriers between service providers and consumers. Additionally, this work implements two instances of the generic framework by adopting two different matching algorithms: a dominant and recessive attributes algorithm borrowed from gene science, and a semantic similarity algorithm based on a unified cloud service ontology. Finally, this paper presents a unified cloud services ontology and models real-life cloud services according to the proposed ontology. To the best of the authors' knowledge, this is the first attempt to build a cloud services marketplace where cloud providers and cloud consumers can trade cloud services as utilities. In comparison with existing work, the semantic approach reduced the execution time by 20% and maintained the same values for all other parameters. On the other hand, the dominant and recessive attributes approach reduced the execution time by 57% but showed a lower value for recall.
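A stand-in illustration of matching consumer requirements against advertised service descriptions, using plain Jaccard similarity over attribute sets; the service names and attributes are invented, and this is not a reproduction of the paper's semantic or dominant/recessive algorithms.

```python
# Simplified service matching: rank advertised descriptions by overlap with a requirement.
def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

services = {  # hypothetical advertised descriptions
    "ProviderA-VM": {"iaas", "linux", "ssd", "eu-region", "pay-per-hour"},
    "ProviderB-DB": {"paas", "sql", "backup", "eu-region"},
    "ProviderC-VM": {"iaas", "windows", "hdd", "us-region", "pay-per-hour"},
}
requirement = {"iaas", "linux", "eu-region"}

ranked = sorted(services.items(),
                key=lambda kv: jaccard(requirement, kv[1]), reverse=True)
for name, attrs in ranked:
    print(name, round(jaccard(requirement, attrs), 2))
```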
Analysis of the frontier technology of agricultural IoT and its predication research
NASA Astrophysics Data System (ADS)
Han, Shuqing; Zhang, Jianhua; Zhu, Mengshuai; Wu, Jianzhai; Shen, Chen; Kong, Fantao
2017-09-01
Agricultural IoT (Internet of Things) is developing rapidly. Nanotechnology, biotechnology and optoelectronic technology have been successfully integrated into agricultural sensor technology. Big data, cloud computing and artificial intelligence technology have also been successfully used in the IoT. This paper studies the integration of agricultural sensor technology with nanotechnology, biotechnology and optoelectronic technology, and the application of big data, cloud computing and artificial intelligence technology in the agricultural IoT. The advantages and development of integrating nanotechnology, biotechnology and optoelectronic technology with agricultural sensor technology are discussed. The application of big data, cloud computing and artificial intelligence technology in the IoT and their development trends are analysed.
Cloud-based robot remote control system for smart factory
NASA Astrophysics Data System (ADS)
Wu, Zhiming; Li, Lianzhong; Xu, Yang; Zhai, Jingmei
2015-12-01
With the development of Internet technologies and the wide application of robots, there is a trend toward integration between networks and robots. A cloud-based robot remote control system over networks for the smart factory is proposed, which enables remote users to control robots and thereby realize intelligent production. To achieve this, a three-layer system architecture is designed, comprising a user layer, a service layer and a physical layer. The remote control applications running on the cloud server are developed on Microsoft Azure. Moreover, DIV+CSS technologies are used to design the human-machine interface, lowering maintenance cost and improving development efficiency. Finally, an experiment is implemented to verify the feasibility of the approach.
Research on Key Technologies of Cloud Computing
NASA Astrophysics Data System (ADS)
Zhang, Shufen; Yan, Hongcan; Chen, Xuebin
With the development of multi-core processors, virtualization, distributed storage, broadband Internet and automatic management, a new computing mode named cloud computing has emerged. It distributes computation tasks across a resource pool consisting of massive numbers of computers, so that application systems can obtain computing power, storage space and software services according to their demand. It can concentrate all the computing resources and manage them automatically through software without human intervention. This frees application providers from tedious infrastructure details so they can concentrate on their business, which is advantageous for innovation and cost reduction. The ultimate goal of cloud computing is to provide calculation, services and applications as a public facility, so that people can use computing resources just like water, electricity, gas and telephone. Currently, the understanding of cloud computing is still developing and changing, and cloud computing has no unanimous definition. This paper describes the three main service forms of cloud computing (SaaS, PaaS and IaaS), compares the definitions of cloud computing given by Google, Amazon, IBM and other companies, summarizes the basic characteristics of cloud computing, and emphasizes key technologies such as data storage, data management, virtualization and the programming model.
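A minimal in-memory illustration of the map/shuffle/reduce programming model referred to above; real deployments would run the same three phases on a framework such as Hadoop rather than in a single Python process, and the input documents here are made up.

```python
# Toy word count showing the map, shuffle (group by key) and reduce phases.
from collections import defaultdict

documents = ["cloud computing delivers storage and services",
             "cloud services scale with demand"]

# Map: emit (word, 1) pairs.
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle: group values by key.
groups = defaultdict(list)
for word, count in mapped:
    groups[word].append(count)

# Reduce: sum the counts for each word.
word_counts = {word: sum(counts) for word, counts in groups.items()}
print(word_counts)
```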
Cloud computing in pharmaceutical R&D: business risks and mitigations.
Geiger, Karl
2010-05-01
Cloud computing provides information processing power and business services, delivering these services over the Internet from centrally hosted locations. Major technology corporations aim to supply these services to every sector of the economy. Deploying business processes 'in the cloud' requires special attention to the regulatory and business risks assumed when running on both hardware and software that are outside the direct control of a company. The identification of risks at the correct service level allows a good mitigation strategy to be selected. The pharmaceutical industry can take advantage of existing risk management strategies that have already been tested in the finance and electronic commerce sectors. In this review, the business risks associated with the use of cloud computing are discussed, and mitigations achieved through knowledge from securing services for electronic commerce and from good IT practice are highlighted.
1982-02-01
Tabulated counts of decorated and white ware porcelain vessel fragments, porcelain "china" doll fragments, porcelain door knob and insulator fragments, decorated and white ware ironstone vessel fragments, stoneware or kaolin pipe fragments, porcelain insulator fragments, and glazed brick fragments; the column structure of the original table is not recoverable from this snippet.
Localized soft elasticity in liquid crystal elastomers (POSTPRINT)
2016-02-23
AFRL-RX-WP-JA-2016-0280, interim report covering 31 January 2014 – 11 July 2015. "Localized soft elasticity in liquid crystal elastomers," Taylor H. Ware, John S. Biggins, Andreas F. Shick, Mark Warner and Timothy J. White.
Metalorganic vapor phase epitaxy of AlN on sapphire with low etch pit density
NASA Astrophysics Data System (ADS)
Koleske, D. D.; Figiel, J. J.; Alliman, D. L.; Gunning, B. P.; Kempisty, J. M.; Creighton, J. R.; Mishima, A.; Ikenaga, K.
2017-06-01
Using metalorganic vapor phase epitaxy, methods were developed to achieve AlN films on sapphire with low etch pit density (EPD). Key to this achievement was using the same AlN growth recipe and only varying the pre-growth conditioning of the quartz-ware. After AlN growth, the quartz-ware was removed from the growth chamber and either exposed to room air or moved into the N2 purged glove box and exposed to H2O vapor. After the quartz-ware was exposed to room air or H2O, the AlN film growth was found to be more reproducible, resulting in films with (0002) and (10-12) x-ray diffraction (XRD) rocking curve linewidths of 200 and 500 arc sec, respectively, and EPDs < 100 cm-2. The EPD was found to correlate with (0002) linewidths, suggesting that the etch pits are associated with open core screw dislocations similar to GaN films. Once reproducible AlN conditions were established using the H2O pre-treatment, it was found that even small doses of trimethylaluminum (TMAl)/NH3 on the quartz-ware surfaces generated AlN films with higher EPDs. The presence of these residual TMAl/NH3-derived coatings in metalorganic vapor phase epitaxy (MOVPE) systems and their impact on the sapphire surface during heating might explain why reproducible growth of AlN on sapphire is difficult.
An Efficient Virtual Machine Consolidation Scheme for Multimedia Cloud Computing.
Han, Guangjie; Que, Wenhui; Jia, Gangyong; Shu, Lei
2016-02-18
Cloud computing has transformed the IT industry in recent years, as it can deliver subscription-based services to users in the pay-as-you-go model. Meanwhile, multimedia cloud computing is emerging on top of cloud computing to provide a variety of media services on the Internet. However, with the growing popularity of multimedia cloud computing, its large energy consumption can not only contribute to greenhouse gas emissions, but also raise cloud users' costs. Therefore, multimedia cloud providers should try to minimize energy consumption as much as possible while satisfying consumers' resource requirements and guaranteeing quality of service (QoS). In this paper, we propose a remaining utilization-aware (RUA) algorithm for virtual machine (VM) placement, and a power-aware (PA) algorithm to find proper hosts to shut down for energy saving. These two algorithms are combined and applied to cloud data centers to complete the process of VM consolidation. Simulation results show that there is a trade-off between the cloud data center's energy consumption and service-level agreement (SLA) violations. Moreover, the RUA algorithm is able to deal with variable workloads, preventing hosts from overloading after VM placement and reducing SLA violations dramatically.
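A generic best-fit consolidation heuristic, shown only to make the placement problem concrete: each VM goes on the active host with the least remaining capacity that still fits, and a new host is "powered on" only when needed. The capacity units and demands are invented, and this is not the paper's RUA/PA pair.

```python
# Simplified consolidation-style VM placement (best-fit decreasing spirit).
HOST_CAPACITY = 100  # arbitrary CPU units per host

def place(vm_demands, capacity=HOST_CAPACITY):
    hosts = []  # each entry is a host's remaining capacity
    for demand in vm_demands:
        candidates = [i for i, free in enumerate(hosts) if free >= demand]
        if candidates:
            best = min(candidates, key=lambda i: hosts[i])  # tightest fit
            hosts[best] -= demand
        else:
            hosts.append(capacity - demand)  # "power on" another host
    return hosts

remaining = place([45, 30, 25, 60, 10, 20])
print(f"Hosts used: {len(remaining)}, remaining capacity per host: {remaining}")
```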
Yap, Kevin Yi-Lwern; Ho, Yasmin Xiu Xiu; Chui, Wai Keung; Chan, Alexandre
2010-11-01
Concomitant use of anticancer drugs (ACDs) and antidepressants (ADs) in the treatment of depression in patients with cancer may result in potentially harmful drug-drug interactions (DDIs). It is crucial that clinicians make timely, accurate, safe and effective decisions regarding drug therapies in patients. The ubiquitous nature of the internet or "cloud" has enabled easy dissemination of DDI information, but there is currently no database dedicated to allow searching of ACD interactions by chemotherapy regimens. We describe the implementation of an AD interaction module to a previously published oncology-specific DDI database for clinicians which focuses on ACDs, single-agent and multiple-agent chemotherapy regimens. Drug- and DDI-related information were collated from drug information handbooks, databases, package inserts, and published literature from PubMed, Scopus and Science Direct. Web documents were constructed using Adobe software and programming scripts, and mounted on a domain served from the internet cloud. OncoRx is an oncology-specific DDI database whose structure is designed around all the major classes of ACDs and their frequently prescribed chemotherapy regimens. There are 117 ACDs and 256 regimens in OncoRx, and it can detect over 1,500 interactions with 21 ADs. Clinicians are provided with the pharmacokinetic parameters of the drugs, information on the regimens and details of the detected DDIs during an interaction search. OncoRx is the first database of its kind which allows detection of ACD and chemotherapy regimen interactions with ADs. This tool will assist clinicians in improving clinical response and reducing adverse effects based on the therapeutic and toxicity profiles of the drugs.
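A toy illustration of regimen-level interaction screening, i.e. expanding a regimen into its component anticancer drugs and checking each against an interaction table. The regimen contents, the antidepressant name and the interaction notes below are deliberately invented placeholders and are not taken from OncoRx.

```python
# Toy regimen-level DDI screening; all data below is invented for illustration only.
REGIMENS = {"EXAMPLE-REGIMEN": ["drug_a", "drug_b", "drug_c"]}  # hypothetical
INTERACTIONS = {  # (anticancer drug, antidepressant) -> note; invented entries
    ("drug_a", "examplepram"): "possible increased toxicity (illustrative)",
    ("drug_b", "examplepram"): "possible reduced efficacy (illustrative)",
}

def screen(regimen, antidepressant):
    hits = []
    for acd in REGIMENS.get(regimen, []):
        note = INTERACTIONS.get((acd, antidepressant))
        if note:
            hits.append((acd, antidepressant, note))
    return hits

for acd, ad, note in screen("EXAMPLE-REGIMEN", "examplepram"):
    print(f"{acd} + {ad}: {note}")
```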
Design and Development of a Run-Time Monitor for Multi-Core Architectures in Cloud Computing
Kang, Mikyung; Kang, Dong-In; Crago, Stephen P.; Park, Gyung-Leen; Lee, Junghoon
2011-01-01
Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications as services over the Internet as well as infrastructure. A cloud is a type of parallel and distributed system consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. The large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design and develop a Run-Time Monitor (RTM) which is a system software to monitor the application behavior at run-time, analyze the collected information, and optimize cloud computing resources for multi-core architectures. RTM monitors application software through library instrumentation as well as underlying hardware through a performance counter optimizing its computing configuration based on the analyzed data. PMID:22163811
Application-oriented offloading in heterogeneous networks for mobile cloud computing
NASA Astrophysics Data System (ADS)
Tseng, Fan-Hsun; Cho, Hsin-Hung; Chang, Kai-Di; Li, Jheng-Cong; Shih, Timothy K.
2018-04-01
Nowadays, Internet applications have become so complicated that mobile devices need more computing resources for shorter execution times, yet they are restricted by limited battery capacity. Mobile cloud computing (MCC) has emerged to tackle the finite resources of mobile devices. MCC offloads the tasks and jobs of mobile devices to cloud and fog environments by using an offloading scheme. It is vital to MCC which tasks should be offloaded and how to offload them efficiently. In this paper, we formulate the offloading problem between mobile devices and the cloud data center and propose two application-oriented algorithms for minimum execution time, i.e. the Minimum Offloading Time for Mobile device (MOTM) algorithm and the Minimum Execution Time for Cloud data center (METC) algorithm. The MOTM algorithm minimizes offloading time by selecting appropriate offloading links based on application categories. The METC algorithm minimizes execution time in the cloud data center by selecting virtual and physical machines that match the resource requirements of applications. Simulation results show that the proposed mechanism not only minimizes the total execution time for mobile devices but also decreases their energy consumption.
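A generic offloading decision rule, shown to make the trade-off concrete: offload when estimated upload, remote execution and download together beat local execution. The numbers and the rule itself are illustrative assumptions; the paper's MOTM/METC algorithms are considerably more involved.

```python
# Simple offload-or-not decision based on estimated completion times.
def should_offload(instructions, local_mips, cloud_mips,
                   data_mb, up_mbps, down_mbps, result_mb=0.1):
    local_time = instructions / (local_mips * 1e6)
    remote_time = (data_mb * 8 / up_mbps) \
                  + instructions / (cloud_mips * 1e6) \
                  + (result_mb * 8 / down_mbps)
    return remote_time < local_time, local_time, remote_time

offload, t_local, t_remote = should_offload(
    instructions=5e9, local_mips=1000, cloud_mips=20000,
    data_mb=2.0, up_mbps=10, down_mbps=40)
print(f"offload={offload}, local={t_local:.2f}s, remote={t_remote:.2f}s")
```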
Parisi, A V; Downs, N; Turner, J; Amar, A
2016-09-01
A set of online activities for children and the community that are based on an integrated real-time solar UV and cloud measurement system are described. These activities use the functionality of the internet to provide an educative tool for school children and the public on the influence of cloud and the angle of the sun above the horizon on the global erythemal UV or sunburning UV, the diffuse erythemal UV, the global UVA (320-400nm) and the vitamin D effective UV. Additionally, the units of UV exposure and UV irradiance are investigated, along with the meaning and calculation of the UV index (UVI). This research will help ensure that children and the general public are better informed about sun safety by improving their personal understanding of the daily and the atmospheric factors that influence solar UV radiation and the solar UV exposures of the various wavebands in the natural environment. The activities may correct common misconceptions of children and the public about UV irradiances and exposure, utilising the widespread reach of the internet to increase the public's awareness of the factors influencing UV irradiances and exposures in order to provide clear information for minimizing UV exposure, while maintaining healthy, outdoor lifestyles. Copyright © 2016 Elsevier B.V. All rights reserved.
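For the UV index calculation mentioned in the activities, the standard definition is the erythemally weighted irradiance in W/m^2 multiplied by 40 (equivalently, 1 UVI per 25 mW/m^2); a one-line sketch is below, with the sample irradiance value chosen arbitrarily.

```python
# UV index from erythemally weighted irradiance (standard scaling factor of 40).
def uv_index(erythemal_irradiance_w_m2):
    return erythemal_irradiance_w_m2 * 40.0

print(uv_index(0.25))  # 0.25 W/m^2 of erythemal UV corresponds to UVI 10
```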
SECURE INTERNET OF THINGS-BASED CLOUD FRAMEWORK TO CONTROL ZIKA VIRUS OUTBREAK.
Sareen, Sanjay; Sood, Sandeep K; Gupta, Sunil Kumar
2017-01-01
Zika virus (ZikaV) is currently one of the most important emerging viruses in the world; it has caused outbreaks and epidemics and has also been associated with severe clinical manifestations and congenital malformations. Traditional approaches to combat the ZikaV outbreak are not effective for detection and control. The aim of this study is to propose a cloud-based system to prevent and control the spread of Zika virus disease using the integration of mobile phones and the Internet of Things (IoT). A Naive Bayesian Network (NBN) is used to diagnose possibly infected users, and the Google Maps Web service is used to provide geographic positioning system (GPS)-based risk assessment to prevent the outbreak. Each ZikaV-infected user, mosquito-dense site, and breeding site is represented on the Google map, which helps government healthcare authorities to control such risk-prone areas effectively and efficiently. The performance and accuracy of the proposed system are evaluated using a dataset of 2 million users. Our system provides high accuracy for initial diagnosis of different users according to their symptoms and appropriate GPS-based risk assessment. The cloud-based proposed system contributed to the accurate NBN-based classification of infected users and accurate identification of risk-prone areas using Google Maps.
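A hedged sketch of the naive Bayes classification step: predicting "possibly infected" from binary symptom vectors. The feature set, training rows and labels below are invented for illustration and are not the study's data or exact model.

```python
# Toy Bernoulli naive Bayes over binary symptom features (scikit-learn).
from sklearn.naive_bayes import BernoulliNB

# Columns: fever, rash, conjunctivitis, joint pain, recent travel to affected area
X_train = [[1, 1, 1, 1, 1],
           [1, 1, 0, 1, 1],
           [0, 0, 0, 0, 1],
           [0, 1, 0, 0, 0],
           [1, 0, 0, 1, 0],
           [0, 0, 0, 0, 0]]
y_train = [1, 1, 0, 0, 0, 0]  # 1 = possibly infected (illustrative labels)

clf = BernoulliNB()
clf.fit(X_train, y_train)

new_user = [[1, 1, 1, 0, 1]]
print("P(possibly infected) =", round(clf.predict_proba(new_user)[0][1], 3))
```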
Where are the bariatric bypass ads? The answer, my friend, is bloomin' on the Web.
Botvin, Judith D
2003-01-01
If you look up bariatric, or gastric bypass surgery on the Internet, you'll find 9,000 entries or more, depending on the search engine used. Some of these, but not all, are from hospitals and healthcare systems, most of which have more requests than they can handle. We look at the publicity for bariatric surgery presented by St. Cloud Hospital in St. Cloud, Minn.; Holy Name Hospital, Teaneck, N.J.; Kuakini Medical Center, Honolulu, Hawaii; and Cincinnati Children's Medical Center, Cincinnati, Ohio.
Using a cloud to replenish parched groundwater modeling efforts.
Hunt, Randall J; Luchette, Joseph; Schreuder, Willem A; Rumbaugh, James O; Doherty, John; Tonkin, Matthew J; Rumbaugh, Douglas B
2010-01-01
Groundwater models can be improved by introduction of additional parameter flexibility and simultaneous use of soft-knowledge. However, these sophisticated approaches have high computational requirements. Cloud computing provides unprecedented access to computing power via the Internet to facilitate the use of these techniques. A modeler can create, launch, and terminate "virtual" computers as needed, paying by the hour, and save machine images for future use. Such cost-effective and flexible computing power empowers groundwater modelers to routinely perform model calibration and uncertainty analysis in ways not previously possible.
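One concrete way to script the "create, launch, terminate, and save machine images" workflow described above is through a cloud provider SDK such as boto3 for AWS; the AMI ID, region, instance type and image name below are placeholders, credentials are assumed to be configured, and the paper does not prescribe any particular SDK or provider.

```python
# Sketch of launching, imaging and terminating a pool of calibration workers with boto3.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Launch a small pool of workers for a calibration / uncertainty-analysis run.
resp = ec2.run_instances(ImageId="ami-0123456789abcdef0",  # placeholder image ID
                         InstanceType="c5.xlarge",
                         MinCount=4, MaxCount=4)
instance_ids = [i["InstanceId"] for i in resp["Instances"]]

# ... run the model calibration jobs on the instances, paying by the hour ...

# Save one configured worker as a reusable machine image, then shut everything down.
ec2.create_image(InstanceId=instance_ids[0], Name="groundwater-model-worker")
ec2.terminate_instances(InstanceIds=instance_ids)
```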
Framework Development Supporting the Safety Portal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prescott, Steven Ralph; Kvarfordt, Kellie Jean; Vang, Leng
2015-07-01
In a collaborative scientific research arena it is important to have an environment where analysts have access to a shared repository of information, documents, and software tools, and are able to accurately maintain and track historical changes in models. The new Safety Portal cloud-based environment will be accessible remotely from anywhere, regardless of computing platform, provided the platform has Internet access and proper browser capabilities. Information stored in this environment would be restricted based on user-assigned credentials. This report discusses the current development of a cloud-based web portal for PRA tools.
Cloud Computing: Virtual Clusters, Data Security, and Disaster Recovery
NASA Astrophysics Data System (ADS)
Hwang, Kai
Dr. Kai Hwang is a Professor of Electrical Engineering and Computer Science and Director of Internet and Cloud Computing Lab at the Univ. of Southern California (USC). He received the Ph.D. in Electrical Engineering and Computer Science from the Univ. of California, Berkeley. Prior to joining USC, he has taught at Purdue Univ. for many years. He has also served as a visiting Chair Professor at Minnesota, Hong Kong Univ., Zhejiang Univ., and Tsinghua Univ. He has published 8 books and over 210 scientific papers in computer science/engineering.
[Study on extraction technology of soyasaponins from residual of bean ware].
Lu, Rumei; Zhang, Yizhen; Bi, Yi
2003-04-01
To find the optimum technology for extracting soyasaponins from the residue of bean ware. The optimum extraction conditions were investigated by orthogonal design, and the content of soyasaponins was determined by UV spectrophotometry. The optimum extraction technology was A3B1C1, that is, adding 7 and then 6 times the amount of 70% alcohol and refluxing twice, each time for 1.0 h. The selected technology gave a higher yield of soyasaponins, good stability and high efficiency.
Wearable Internet of Things - from human activity tracking to clinical integration.
Kumari, Poonam; Lopez-Benitez, Miguel; Gyu Myoung Lee; Tae-Seong Kim; Minhas, Atul S
2017-07-01
Wearable devices for human activity tracking have been emerging rapidly. Most of them are capable of sending health statistics to smartphones, smartwatches or smart bands. However, they only provide the data for individual analysis and their data is not integrated into clinical practice. Leveraging on the Internet of Things (IoT), edge and cloud computing technologies, we propose an architecture which is capable of providing cloud based clinical services using human activity data. Such services could supplement the shortage of staff in primary healthcare centers thereby reducing the burden on healthcare service providers. The enormous amount of data created from such services could also be utilized for planning future therapies by studying recovery cycles of existing patients. We provide a prototype based on our architecture and discuss its salient features. We also provide use cases of our system in personalized and home based healthcare services. We propose an International Telecommunication Union based standardization (ITU-T) for our design and discuss future directions in wearable IoT.
A Brief Analysis of Development Situations and Trend of Cloud Computing
NASA Astrophysics Data System (ADS)
Yang, Wenyan
2017-12-01
In recent years, the rapid development of Internet technology has radically changed people's work, learning and lifestyles. More and more activities are completed by means of computers and networks. The amount of information and data generated grows day by day, and people rely more and more on computers, so the computing power of a single computer fails to meet people's demands for accuracy and speed. Cloud computing technology has experienced fast development and is widely applied in the computer industry thanks to its advantages of high precision, fast computing and ease of use. Moreover, it has become a focus of information research at present. In this paper, the development situation and trends of cloud computing are analyzed and researched.
Making Spatial Statistics Service Accessible On Cloud Platform
NASA Astrophysics Data System (ADS)
Mu, X.; Wu, J.; Li, T.; Zhong, Y.; Gao, X.
2014-04-01
Web services can bring together applications running on diverse platforms; users can access and share various data, information and models more effectively and conveniently from a web service platform. Cloud computing has emerged as a paradigm of Internet computing in which dynamic, scalable and often virtualized resources are provided as services. With the rapid growth of massive data and the limitations of the network, traditional web service platforms face prominent problems such as computational efficiency, maintenance cost and data security. In this paper, we offer a spatial statistics service based on the Microsoft cloud. An experiment was carried out to evaluate the availability and efficiency of this service. The results show that this spatial statistics service is conveniently accessible to the public, with high processing efficiency.
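An example of the kind of spatial statistic such a service might expose is global Moran's I; the sketch below computes it for a handful of regions, with the attribute values and the adjacency weight matrix invented for illustration (the paper does not specify which statistics its service implements).

```python
# Global Moran's I for a small, made-up set of regions.
import numpy as np

x = np.array([4.0, 6.0, 5.0, 9.0, 10.0])   # attribute value per region
W = np.array([[0, 1, 1, 0, 0],              # binary adjacency weights (symmetric)
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)

n = x.size
z = x - x.mean()
s0 = W.sum()
morans_i = (n / s0) * (z @ W @ z) / (z @ z)
print(f"Moran's I = {morans_i:.3f}")
```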
A Secure and Verifiable Outsourced Access Control Scheme in Fog-Cloud Computing.
Fan, Kai; Wang, Junxiong; Wang, Xin; Li, Hui; Yang, Yintang
2017-07-24
With the rapid development of big data and the Internet of Things (IoT), the number of networked devices and the data volume are increasing dramatically. Fog computing, which extends cloud computing to the edge of the network, can effectively relieve the bottlenecks of data transmission and data storage. However, security and privacy challenges also arise in the fog-cloud computing environment. Ciphertext-policy attribute-based encryption (CP-ABE) can be adopted to realize data access control in fog-cloud computing systems. In this paper, we propose a verifiable outsourced multi-authority access control scheme, named VO-MAACS. In our construction, most encryption and decryption computations are outsourced to fog devices, and the computation results can be verified using our verification method. Meanwhile, to address the revocation issue, we design an efficient user and attribute revocation method. Finally, analysis and simulation results show that our scheme is both secure and highly efficient.
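The cryptographic machinery of CP-ABE (pairings, key shares, outsourced decryption) is out of scope for a short sketch; the code below only illustrates the policy side that CP-ABE enforces, i.e. checking whether a user's attribute set satisfies a ciphertext's AND/OR access-policy tree. The policy structure and attribute names are illustrative, not from the paper.

```python
# Evaluate an AND/OR attribute-policy tree against a user's attribute set.
def satisfies(policy, attrs):
    op = policy[0]
    if op == "ATTR":
        return policy[1] in attrs
    if op == "AND":
        return all(satisfies(p, attrs) for p in policy[1:])
    if op == "OR":
        return any(satisfies(p, attrs) for p in policy[1:])
    raise ValueError(f"unknown node {op!r}")

# Policy: (cardiology AND doctor) OR auditor
policy = ("OR",
          ("AND", ("ATTR", "dept:cardiology"), ("ATTR", "role:doctor")),
          ("ATTR", "role:auditor"))

print(satisfies(policy, {"dept:cardiology", "role:doctor"}))  # True
print(satisfies(policy, {"dept:oncology", "role:doctor"}))    # False
```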
Military clouds: utilization of cloud computing systems at the battlefield
NASA Astrophysics Data System (ADS)
Süleyman, Sarıkürk; Volkan, Karaca; İbrahim, Kocaman; Ahmet, Şirzai
2012-05-01
Cloud computing is known as a novel information technology (IT) concept, which involves facilitated and rapid access to networks, servers, data storage media, applications and services via the Internet with minimum hardware requirements. The use of information systems and technologies on the battlefield is not new. Information superiority is a force multiplier and is crucial to mission success. Recent advances in information systems and technologies provide new means for decision makers and users to gain information superiority. These developments in information technologies lead to a new term, known as network centric capability. Like other network centric capable systems, cloud computing systems are operational today. In the near future, extensive use of military clouds on the battlefield is predicted. Integrating cloud computing logic into network centric applications will increase the flexibility, cost-effectiveness, efficiency and accessibility of network-centric capabilities. In this paper, cloud computing and network centric capability concepts are defined. Some commercial cloud computing products and applications are mentioned. Network centric capable applications are covered. Cloud computing supported battlefield applications are analyzed. The effects of cloud computing systems on network centric capability and on the information domain in future warfare are discussed. Battlefield opportunities and novelties that cloud computing systems might introduce to network centric capability are researched. The role of military clouds in future warfare is proposed in this paper. It is concluded that military clouds will be indispensable components of the future battlefield. Military clouds have the potential to improve network centric capabilities, increase situational awareness on the battlefield and facilitate the attainment of information superiority.
Cenozoic Antarctic DiatomWare/BugCam: An aid for research and teaching
Wise, S.W.; Olney, M.; Covington, J.M.; Egerton, V.M.; Jiang, S.; Ramdeen, D.K.; ,; Schrader, H.; Sims, P.A.; Wood, A.S.; Davis, A.; Davenport, D.R.; Doepler, N.; Falcon, W.; Lopez, C.; Pressley, T.; Swedberg, O.L.; Harwood, D.M.
2007-01-01
Cenozoic Antarctic DiatomWare/BugCam© is an interactive, icon-driven digital-image database/software package that displays over 500 illustrated Cenozoic Antarctic diatom taxa along with original descriptions (including over 100 generic and 20 family-group descriptions). This digital catalog is designed primarily for use by micropaleontologists working in the field (at sea or on the Antarctic continent) where hard-copy literature resources are limited. The new package will also be useful for classroom/lab teaching as well as for any paleontologists making or refining taxonomic identifications at the microscope. The database (Cenozoic Antarctic DiatomWare) is displayed via a custom software program (BugCam) written in Visual Basic for use on PCs running Windows 95 or later operating systems. BugCam is a flexible image display program that utilizes an intuitive thumbnail "tree" structure for navigation through the database. The data are stored in Microsoft Excel spreadsheets, hence no separate relational database program is necessary to run the package.
The medium is NOT the message or Indefinitely long-term file storage at Leeds University
NASA Technical Reports Server (NTRS)
Holdsworth, David
1996-01-01
Approximately 3 years ago we implemented an archive file storage system which embodies experiences gained over more than 25 years of using and writing file storage systems. It is the third in-house system that we have written, and all three systems have been adopted by other institutions. This paper discusses the requirements for long-term data storage in a university environment, and describes how our present system is designed to meet these requirements indefinitely. Particular emphasis is laid on experiences from past systems, and their influence on current system design. We also look at the influence of the IEEE-MSS standard. We currently have the system operating in five UK universities. The system operates in a multi-server environment, and is currently operational with UNIX (SunOS4, Solaris2, SGI-IRIX, HP-UX), NetWare3 and NetWare4. PCs logged on to NetWare can also archive and recover files that live on their hard disks.
76 FR 34965 - Cybersecurity, Innovation, and the Internet Economy
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-15
... disrupt computing systems. These threats are exacerbated by the interconnected and interdependent architecture of today's computing environment. Theoretically, security deficiencies in one area may provide... does the move to cloud-based services have on education and research efforts in the I3S? 45. What is...
ERIC Educational Resources Information Center
Walter, Virginia A.
1997-01-01
Virtual field trips can provide experiences beyond the reach of average K-12 students. Describes multimedia products for school use: Africa Trail, Dinosaur Hunter, Louvre Museum, Magic School Bus Explores the Rainforest, and Up to the Himalayas: Kingdoms in the Clouds and provides book and Internet connections for additional learning, highlighting…
Finete, Virginia de Lourdes Mendes; Gouvêa, Marcos Martins; Marques, Flávia Ferreira de Carvalho; Netto, Annibal Duarte Pereira
2014-06-01
Experimental studies of the natural photoluminescence of melamine in aqueous solutions showed that its fluorescence intensity (at 250/365 nm) was appropriate for analytical purposes. The exploitation of this property of melamine provided the basis for the development of a new, simple, precise and accurate method based on high performance liquid chromatography with fluorescence detection (HPLC-Fluo) to determine melamine in kitchen plastic ware following aqueous extraction using a microwave oven. Optimization of analytical parameters such as solvent composition, pH and extraction conditions led to limits of detection and quantification of melamine of 0.0081 and 0.027 μg mL(-1), respectively, with a linear range up to 10 μg mL(-1). Sample extracts fortified with melamine at three concentration levels produced an average recovery of 98±6%, which was in agreement with the results achieved with a reference HPLC-UV method. Different samples of kitchen plastic ware analyzed by the developed and optimized method showed melamine concentrations in the aqueous extract up to 17 µg mL(-1), which corresponded to 86.0 mg kg(-1) in these utensils. The results obtained indicate that the use of kitchen plastic ware made of melamine can contaminate food with this compound after heating in a microwave oven. Copyright © 2014. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Passmore, P.; Zimakov, L.; Rozhkov, M.
The 3rd Generation Seismic Recorder, Model 130-01, has been designed to be easier to use: more compact, lighter in weight, lower power, and requiring less maintenance than other recorders. Not only is the hardware optimized for field deployments, software tools as well have been specially developed to support both field and base station operation. The 130's case is a clamshell design, inherently waterproof, with easy access to all user features on the top of the unit. The 130 has 6 input/output connectors, an LCD display, and a removable lid on top of the case. There are two channel input connectors on a 6-channel unit (only one on a 3-channel unit), a Terminal connector for setup and control, a Net connector combining Ethernet and Serial PPP for network access, a 12 VDC Power connector, and a GPS receiver connector. The LCD display allows the user to monitor the status of various subsystems within the 130 without having a terminal device attached. For storing large amounts of data the IBM Microdrive™ is offered. User setup, control and status monitoring is done either with a Personal Digital Assistant (PDA) (Palm OS compatible) using our Palm Field Controller (PFC) software or from a PC/workstation using our REF TEK Network Controller (RNC) GUI interface. StarBand VSAT is the premier two-way, always-on, high-speed satellite Internet service. StarBand means high-speed Internet without the constraints and congestion of land-based cable or telephone networks. StarBand uses a single satellite dish antenna for receiving and for sending data; no telephone connection is needed. The hardware cost is much less than standard VSAT equipment with double or single hop transmission. REF TEK protocol (RTP) provides end-to-end error-correcting data transmission and command/control. StarBand's low-cost VSAT provides two-way, always-on, high-speed satellite Internet data availability. REF TEK and StarBand create the most advanced real-time seismological data acquisition system. Results on data transmission and data availability are discussed.
Assessing Teaching Skills with a Mobile Simulation
ERIC Educational Resources Information Center
Gibson, David
2013-01-01
Because mobile technologies are overtaking personal computers as the primary tools of Internet access, and cloud-based resources are fundamentally transforming the world's knowledge, new forms of teaching and assessment are required to foster 21st century literacies, including those needed by K-12 teachers. A key feature of mobile technology…
Computer Security Primer: Systems Architecture, Special Ontology and Cloud Virtual Machines
ERIC Educational Resources Information Center
Waguespack, Leslie J.
2014-01-01
With the increasing proliferation of multitasking and Internet-connected devices, security has reemerged as a fundamental design concern in information systems. The shift of IS curricula toward a largely organizational perspective of security leaves little room for focus on its foundation in systems architecture, the computational underpinnings of…
Chan, Teresa; Sennik, Serena; Zaki, Amna; Trotter, Brendon
2015-03-01
Cloud-based applications such as Google Docs, Skype, Dropbox, and SugarSync are revolutionizing the way that we interact with the world. Members of the millennial generation (those born after 1980) are now becoming senior residents and junior attending physicians. We describe a novel technique combining Internet- and cloud-based methods to digitally augment the classic study group used by final-year residents studying for the Royal College of Physicians and Surgeons of Canada examination. This material was developed by residents and improved over the course of 18 months. This is an innovation report on a process for enhanced communication and collaboration; to date, there has been little research regarding the augmentation of learner-driven initiatives with virtual resources.
Open Reading Frame Phylogenetic Analysis on the Cloud
2013-01-01
Phylogenetic analysis has become essential in researching the evolutionary relationships among viruses. These relationships are depicted on phylogenetic trees, in which viruses are grouped based on sequence similarity. Viral evolutionary relationships are identified from open reading frames rather than from complete sequences. Recently, cloud computing has become popular for developing internet-based bioinformatics tools. Biocloud is an efficient, scalable, and robust bioinformatics computing service. In this paper, we propose a cloud-based open reading frame phylogenetic analysis service. The proposed service integrates the Hadoop framework, virtualization technology, and phylogenetic analysis methods to provide a high-availability, large-scale bioservice. In a case study, we analyze the phylogenetic relationships among Norovirus strains. Evolutionary relationships are elucidated by aligning different open reading frame sequences. The proposed platform correctly identifies the evolutionary relationships among members of the Norovirus genus. PMID:23671843
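As a toy illustration of the kind of analysis described above (ORF extraction followed by sequence comparison), the sketch below extracts open reading frames from a few made-up sequences and builds an alignment-free pairwise distance matrix; it is not the authors' Hadoop-based Biocloud pipeline, and the sequences, minimum ORF length and k-mer size are illustrative assumptions.

```python
# Minimal sketch (not the authors' Hadoop pipeline): extract open reading frames
# from viral sequences and build a simple pairwise distance matrix from which a
# phylogenetic tree could be derived. The sequences below are toy examples.

from itertools import product

STOP = {"TAA", "TAG", "TGA"}

def orfs(seq: str, min_len: int = 9):
    """Yield ORFs (ATG..stop) found in the three forward reading frames."""
    seq = seq.upper()
    for frame in range(3):
        start = None
        for i in range(frame, len(seq) - 2, 3):
            codon = seq[i:i + 3]
            if codon == "ATG" and start is None:
                start = i
            elif codon in STOP and start is not None:
                if i + 3 - start >= min_len:
                    yield seq[start:i + 3]
                start = None

def kmer_distance(a: str, b: str, k: int = 4) -> float:
    """1 - Jaccard similarity of k-mer sets; a crude alignment-free distance."""
    ka = {a[i:i + k] for i in range(len(a) - k + 1)}
    kb = {b[i:i + k] for i in range(len(b) - k + 1)}
    return 1.0 - len(ka & kb) / max(len(ka | kb), 1)

if __name__ == "__main__":
    toy = {
        "strainA": "ATGAAACCCGGGTTTTAAATGCCC",
        "strainB": "ATGAAACCCGGATTTTAA",
        "strainC": "ATGTTTGGGCCCAAATAA",
    }
    # Use the longest ORF of each toy genome as the unit of comparison.
    longest = {name: max(orfs(s), key=len, default=s) for name, s in toy.items()}
    for x, y in product(longest, repeat=2):
        print(x, y, round(kmer_distance(longest[x], longest[y]), 3))
```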
Virtualized Networks and Virtualized Optical Line Terminal (vOLT)
NASA Astrophysics Data System (ADS)
Ma, Jonathan; Israel, Stephen
2017-03-01
The success of the Internet and the proliferation of Internet of Things (IoT) devices are forcing telecommunications carriers to re-architect the central office as a datacenter (CORD) so as to bring datacenter economics and cloud agility to the central office (CO). The Open Network Operating System (ONOS) is the first open-source software-defined network (SDN) operating system capable of managing and controlling network, computing, and storage resources to support the CORD infrastructure and network virtualization. The virtualized Optical Line Termination (vOLT) is one of the key components in such virtualized networks.
The Optical Gravitational Lensing Experiment. Eclipsing Binary Stars in the Large Magellanic Cloud
NASA Astrophysics Data System (ADS)
Wyrzykowski, L.; Udalski, A.; Kubiak, M.; Szymanski, M.; Zebrun, K.; Soszynski, I.; Wozniak, P. R.; Pietrzynski, G.; Szewczyk, O.
2003-03-01
We present a catalog of 2580 eclipsing binary stars detected in a 4.6 square degree area of the central parts of the Large Magellanic Cloud. The photometric data were collected during the second phase of the OGLE microlensing search from 1997 to 2000. The eclipsing objects were selected with an automatic search algorithm based on an artificial neural network. Basic statistics of the eclipsing stars are presented. Also provided is a list of 36 candidate detached eclipsing binaries for spectroscopic study and precise LMC distance determination. The full catalog is accessible from the OGLE Internet archive.
The Spacecraft Emergency Response System (SERS) for Autonomous Mission Operations
NASA Technical Reports Server (NTRS)
Breed, Julia; Chu, Kai-Dee; Baker, Paul; Starr, Cynthia; Fox, Jeffrey; Baitinger, Mick
1998-01-01
Today, most mission operations are geared toward lowering cost through unmanned operations. 7-day/24-hour operations are reduced to either 5-day/8-hour operations or become totally autonomous, especially for deep-space missions. Proper and effective notification during a spacecraft emergency could mean success or failure for an entire mission. The Spacecraft Emergency Response System (SERS) is a tool designed for autonomous mission operations. The SERS automatically contacts on-call personnel as needed when crises occur, either on board the spacecraft or within the automated ground systems. In addition, the SERS provides a group-ware solution to facilitate the work of the person(s) contacted. The SERS is independent of the spacecraft's automated ground system. It receives and catalogues reports from various ground system components in near real time. Then, based on easily configurable parameters, the SERS determines whom, if anyone, should be alerted. Alerts may be issued via Sky-Tel 2-way pager, telephony, or e-mail. The alerted personnel can then review and respond to the spacecraft anomalies through the Netscape Internet Web Browser, or directly review and respond from the Sky-Tel 2-way pager.
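A minimal sketch of the configurable routing idea described above, in which reports are catalogued and matched against rules that decide who gets alerted, is given below. The subsystem names, severity scale, contact identifiers and report format are illustrative assumptions, not the actual SERS configuration.

```python
# Minimal sketch of rule-driven emergency notification: ground-system reports are
# catalogued and, based on a configurable routing table, on-call personnel are
# alerted. Subsystems, severities and contacts here are illustrative assumptions.

from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Report:
    subsystem: str      # e.g. "spacecraft", "ground_antenna"
    severity: int       # 0 = info ... 3 = critical
    message: str

# Configurable routing table: (subsystem, minimum severity) -> contacts to alert.
ON_CALL = {
    ("spacecraft", 2): ["pager:flight-ops", "email:ops-lead@example.org"],
    ("ground_antenna", 3): ["email:ground-systems@example.org"],
}

def route(report: Report) -> list[str]:
    """Return the contacts that should be alerted for this report, if any."""
    contacts: list[str] = []
    for (subsystem, min_severity), targets in ON_CALL.items():
        if report.subsystem == subsystem and report.severity >= min_severity:
            contacts.extend(targets)
    return contacts

def catalogue_and_alert(report: Report) -> None:
    stamp = datetime.now(timezone.utc).isoformat()
    print(f"[{stamp}] logged: {report}")          # stand-in for the report catalogue
    for contact in route(report):
        print(f"  -> alerting {contact}: {report.message}")

if __name__ == "__main__":
    catalogue_and_alert(Report("spacecraft", 3, "Attitude control anomaly"))
```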
An integrated system for land resources supervision based on the IoT and cloud computing
NASA Astrophysics Data System (ADS)
Fang, Shifeng; Zhu, Yunqiang; Xu, Lida; Zhang, Jinqu; Zhou, Peiji; Luo, Kan; Yang, Jie
2017-01-01
Integrated information systems are important safeguards for the utilisation and development of land resources. Information technologies, including the Internet of Things (IoT) and cloud computing, are inevitable requirements for the quality and efficiency of land resources supervision tasks. In this study, an economical and highly efficient supervision system for land resources has been established based on IoT and cloud computing technologies; a novel online and offline integrated system with synchronised internal and field data that includes the entire process of 'discovering breaches, analysing problems, verifying fieldwork and investigating cases' was constructed. The system integrates key technologies, such as the automatic extraction of high-precision information based on remote sensing, semantic ontology-based technology to excavate and discriminate public sentiment on the Internet that is related to illegal incidents, high-performance parallel computing based on MapReduce, uniform storing and compressing (bitwise) technology, global positioning system data communication and data synchronisation mode, intelligent recognition and four-level ('device, transfer, system and data') safety control technology. The integrated system based on a 'One Map' platform has been officially implemented by the Department of Land and Resources of Guizhou Province, China, and was found to significantly increase the efficiency and level of land resources supervision. The system promoted the overall development of informatisation in fields related to land resource management.
Angiuoli, Samuel V; Matalka, Malcolm; Gussman, Aaron; Galens, Kevin; Vangala, Mahesh; Riley, David R; Arze, Cesar; White, James R; White, Owen; Fricke, W Florian
2011-08-30
Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged and pre-configured software. We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing.
Cloud Service Selection Using Multicriteria Decision Analysis
Anuar, Nor Badrul; Shiraz, Muhammad; Haque, Israat Tanzeena
2014-01-01
Cloud computing (CC) has recently been receiving tremendous attention from the IT industry and academic researchers. CC leverages its unique services to cloud customers in a pay-as-you-go, anytime, anywhere manner. Cloud services provide dynamically scalable services through the Internet on demand. Therefore, service provisioning plays a key role in CC. The cloud customer must be able to select appropriate services according to his or her needs. Several approaches have been proposed to solve the service selection problem, including multicriteria decision analysis (MCDA). MCDA enables the user to choose from among a number of available choices. In this paper, we analyze the application of MCDA to service selection in CC. We identify and synthesize several MCDA techniques and provide a comprehensive analysis of this technology for general readers. In addition, we present a taxonomy derived from a survey of the current literature. Finally, we highlight several state-of-the-art practical aspects of MCDA implementation in cloud computing service selection. The contributions of this study are four-fold: (a) focusing on the state-of-the-art MCDA techniques, (b) highlighting the comparative analysis and suitability of several MCDA methods, (c) presenting a taxonomy through extensive literature review, and (d) analyzing and summarizing the cloud computing service selections in different scenarios. PMID:24696645
Cloud service selection using multicriteria decision analysis.
Whaiduzzaman, Md; Gani, Abdullah; Anuar, Nor Badrul; Shiraz, Muhammad; Haque, Mohammad Nazmul; Haque, Israat Tanzeena
2014-01-01
Cloud computing (CC) has recently been receiving tremendous attention from the IT industry and academic researchers. CC leverages its unique services to cloud customers in a pay-as-you-go, anytime, anywhere manner. Cloud services provide dynamically scalable services through the Internet on demand. Therefore, service provisioning plays a key role in CC. The cloud customer must be able to select appropriate services according to his or her needs. Several approaches have been proposed to solve the service selection problem, including multicriteria decision analysis (MCDA). MCDA enables the user to choose from among a number of available choices. In this paper, we analyze the application of MCDA to service selection in CC. We identify and synthesize several MCDA techniques and provide a comprehensive analysis of this technology for general readers. In addition, we present a taxonomy derived from a survey of the current literature. Finally, we highlight several state-of-the-art practical aspects of MCDA implementation in cloud computing service selection. The contributions of this study are four-fold: (a) focusing on the state-of-the-art MCDA techniques, (b) highlighting the comparative analysis and suitability of several MCDA methods, (c) presenting a taxonomy through extensive literature review, and (d) analyzing and summarizing the cloud computing service selections in different scenarios.
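To make one of the surveyed MCDA techniques concrete, the sketch below applies simple additive weighting (one of the most basic MCDA methods) to rank hypothetical cloud services; the criteria, weights and scores are illustrative assumptions rather than data from the paper.

```python
# Minimal sketch of one MCDA technique covered by the survey (simple additive
# weighting): normalise each criterion, weight it, and rank candidate cloud
# services. Criteria, weights and scores are illustrative assumptions.

providers = {
    # criteria: cost ($/month, lower is better), availability (%), throughput (MB/s)
    "cloud_a": {"cost": 120.0, "availability": 99.95, "throughput": 250.0},
    "cloud_b": {"cost": 90.0,  "availability": 99.50, "throughput": 180.0},
    "cloud_c": {"cost": 150.0, "availability": 99.99, "throughput": 320.0},
}
weights = {"cost": 0.4, "availability": 0.35, "throughput": 0.25}
benefit = {"cost": False, "availability": True, "throughput": True}  # False => lower is better

def saw_scores(alternatives, weights, benefit):
    """Return alternatives ranked by their weighted, min-max normalised score."""
    scores = {}
    for name, crits in alternatives.items():
        total = 0.0
        for c, w in weights.items():
            values = [a[c] for a in alternatives.values()]
            lo, hi = min(values), max(values)
            norm = (crits[c] - lo) / (hi - lo) if hi > lo else 1.0
            if not benefit[c]:
                norm = 1.0 - norm            # invert cost-type criteria
            total += w * norm
        scores[name] = round(total, 3)
    return dict(sorted(scores.items(), key=lambda kv: -kv[1]))

if __name__ == "__main__":
    print(saw_scores(providers, weights, benefit))
```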
Survey on Security Issues in Cloud Computing and Associated Mitigation Techniques
NASA Astrophysics Data System (ADS)
Bhadauria, Rohit; Sanyal, Sugata
2012-06-01
Cloud Computing holds the potential to eliminate the requirement to set up high-cost computing infrastructure for the IT-based solutions and services that the industry uses. It promises to provide a flexible IT architecture, accessible through the Internet from lightweight portable devices. This would allow a multi-fold increase in the capacity and capabilities of existing and new software. In a cloud computing environment, all data reside on a set of networked resources, enabling the data to be accessed through virtual machines. Since these data centers may lie in any corner of the world beyond the reach and control of users, there are multifarious security and privacy challenges that need to be understood and taken care of. Also, one can never deny the possibility of a server breakdown, which has been witnessed rather often in recent times. There are various security and privacy issues that need to be dealt with in a cloud computing scenario. This extensive survey paper aims to elaborate and analyze the numerous unresolved issues threatening cloud computing adoption and diffusion and affecting the various stakeholders linked to it.
Cloud-based adaptive exon prediction for DNA analysis.
Putluri, Srinivasareddy; Zia Ur Rahman, Md; Fathima, Shaik Yasmeen
2018-02-01
Cloud computing offers significant research and economic benefits to healthcare organisations. Cloud services provide a safe place for storing and managing large amounts of sensitive data. Under the conventional flow of gene information, gene sequence laboratories send out raw and inferred information via the Internet to several sequence libraries. DNA sequencing storage costs will be minimised by the use of cloud services. In this study, the authors put forward a novel genomic informatics system using Amazon Cloud Services, where genomic sequence information is stored and accessed for processing. True identification of exon regions in a DNA sequence is a key task in bioinformatics, which helps in disease identification and drug design. The three-base periodicity property of exons forms the basis of all exon identification techniques. Adaptive signal processing techniques have been found to be promising in comparison with several other methods. Several adaptive exon predictors (AEPs) are developed using variable normalised least mean square and its maximum normalised variants to reduce computational complexity. Finally, performance evaluation of the various AEPs is done based on measures such as sensitivity, specificity and precision using various standard genomic datasets taken from the National Center for Biotechnology Information genomic sequence database.
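A minimal sketch of a normalised-LMS adaptive filter of the general kind the abstract builds its exon predictors from is shown below; the binary base mapping, filter length, step size and test sequence are illustrative assumptions, and the variable/maximum normalised variants mentioned in the paper are not reproduced.

```python
# Minimal sketch of a normalised LMS (NLMS) adaptive filter tracking the
# three-base periodicity of a numerically mapped DNA sequence. Mapping, filter
# length and step size below are illustrative assumptions.

import numpy as np

def nlms(x, d, taps=12, mu=0.5, eps=1e-6):
    """Adapt weights w so that w . x_window approximates d; return squared-error trace."""
    w = np.zeros(taps)
    err = np.zeros(len(x))
    for n in range(taps, len(x)):
        window = x[n - taps:n][::-1]
        y = w @ window
        e = d[n] - y
        w += (mu / (eps + window @ window)) * e * window   # normalised update
        err[n] = e * e
    return err

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    seq = "".join(rng.choice(list("ACGT"), size=300))
    x = np.array([1.0 if b == "G" else 0.0 for b in seq])   # binary indicator mapping
    d = np.roll(x, -3)                                      # target: the base three positions ahead
    e = nlms(x, d)
    # For sequences with strong period-3 structure (as in exon regions), this
    # prediction error would drop; for this random test sequence it stays high.
    print("mean squared error:", float(e[12:].mean()))
```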
ASTER cloud coverage reassessment using MODIS cloud mask products
NASA Astrophysics Data System (ADS)
Tonooka, Hideyuki; Omagari, Kunjuro; Yamamoto, Hirokazu; Tachikawa, Tetsushi; Fujita, Masaru; Paitaer, Zaoreguli
2010-10-01
In the Advanced Spaceborne Thermal Emission and Reflection radiometer (ASTER) Project, two kinds of algorithms are used for cloud assessment in Level-1 processing. The first algorithm, based on the LANDSAT-5 TM Automatic Cloud Cover Assessment (ACCA) algorithm, is used for the part of the daytime scenes observed with only VNIR bands and for all nighttime scenes, and the second algorithm, based on the LANDSAT-7 ETM+ ACCA algorithm, is used for most daytime scenes observed with all spectral bands. However, the first algorithm does not work well because it lacks some spectral bands sensitive to cloud detection, and the two algorithms have been less accurate over snow/ice covered areas since April 2008, when the SWIR subsystem developed problems. In addition, they perform less well for some combinations of surface type and sun elevation angle. We have therefore developed the ASTER cloud coverage reassessment system using MODIS cloud mask (MOD35) products, and have reassessed cloud coverage for all ASTER archived scenes (>1.7 million scenes). All of the new cloud coverage data are included in the Image Management System (IMS) databases of the ASTER Ground Data System (GDS) and NASA's Land Process Data Active Archive Center (LP DAAC) and are used for ASTER product searches by users, and cloud mask images are distributed to users through the Internet. Daily upcoming scenes (about 400 scenes per day) are reassessed and inserted into the IMS databases within 5 to 7 days of each scene's observation date. Some validation studies for the new cloud coverage data and some mission-related analyses using those data are also demonstrated in the present paper.
Eye Safe, Visible Wavelength Lidar Systems: Design and Operational Advances, Results and Potential
NASA Technical Reports Server (NTRS)
Spinhirne, James; Welton, Ellsworth J.; Berkoff, Timothy; Campbell, James
2007-01-01
In the early nineties the first of the eye safe visible wavelength lidar systems now known as Micro Pulse Lidar (MPL) became operational. The important advance of the design was a system that, unlike most existing lidar, operated at eye safe energy densities and could thus operate unattended for full time monitoring. Since that time, many dozens of these systems have been produced and applied for full time profiling of atmospheric cloud and aerosol structure. There is currently an observational network of MPL sites to support global climate research. In the course of application of these instruments there have been significant improvements in the design and performance of the systems. In the last half decade particularly there has been significant application and technical development of MPL systems. In this paper we review progress. The current MPL systems in use are all single wavelength systems designed for cloud and aerosol applications. For these applications, both lidar depolarization and multi-wavelength measurements have significant uses, and these can be accomplished with the MPL approach. The main current challenges for the lidar network activity are in the areas of reliability, repeatability and efficiency of data processing. The network makes use of internet data downloads and automated processing. The heights of all cloud and aerosol layers are needed. The recent emphasis has been on operationally deriving aerosol extinction cross section. Future emphasis will include adding cirrus optical parameters. For operational effectiveness, improvements to simplify routine data signal calibration are being researched. Overall the MPL systems have proven very effective. A large database of results from globally distributed sites can be easily accessed through the internet. Applications have included atmospheric model development. Validation of current global satellite observations of aerosol and clouds, now including orbital lidar observations, was a primary goal for NASA. Although sampling issues require careful consideration, results have proven useful.
Learning from the past: Rare ε-Fe2O3 in the ancient black-glazed Jian (Tenmoku) wares
Dejoie, Catherine; Sciau, Philippe; Li, Weidong; Noé, Laure; Mehta, Apurva; Chen, Kai; Luo, Hongjie; Kunz, Martin; Tamura, Nobumichi; Liu, Zhi
2014-01-01
Ancient Jian wares are famous for their lustrous black glaze that exhibits unique colored patterns. Some striking examples include the brownish colored "Hare's Fur" (HF) strips and the silvery "Oil Spot" (OS) patterns. Herein, we investigated the glaze surface of HF and OS samples using a variety of characterization methods. Contrary to the commonly accepted theory, we identified the presence of ε-Fe2O3, a rare metastable polymorph of Fe2O3 with unique magnetic properties, in both HF and OS samples. We found that surface crystals of OS samples are up to several micrometers in size and exclusively made of ε-Fe2O3. Interestingly, these ε-Fe2O3 crystals on the OS sample surface are organized in a periodic two-dimensional fashion. These results shed new light on the actual mechanisms and kinetics of the polymorphous transitions of Fe2O3. Deciphering the technologies behind the fabrication of ancient Jian wares can thus potentially help researchers improve ε-Fe2O3 synthesis. PMID:24820819
Connected car: Engines diagnostic via Internet of Things (IoT)
NASA Astrophysics Data System (ADS)
Hamid, A. F. A.; Rahman, M. T. A.; Khan, S. F.; Adom, A. H.; Rahim, M. A.; Rahim, N. A.; Ismail, M. H. N.; Norizan, A.
2017-10-01
This paper describes an experiment in performing engine diagnostics using wireless sensing and the Internet of Things (IoT). The study aims to overcome the limitation of the current standard On-Board Diagnostics (OBD-II) data acquisition method, which can only be performed offline or over a wired connection. The paper shows how data from the engine can be collected, converted into a form easily understood by humans, and sent over a wireless Internet connection via an IoT platform. The study is separated into three stages: CAN-bus data collection, CAN data conversion, and sending data to cloud storage. Every stage is tested with two different methods and covers five data parameters: revolutions per minute (RPM), manifold air pressure (MAP), load-fuel, barometric pressure and engine temperature. The experiment uses an Arduino Uno as the microcontroller, a CAN-bus converter, and an ESP8266 WiFi board as the medium for transferring data to the Internet.
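As an illustration of the decode-and-upload stages described above, the sketch below converts a few OBD-II mode-01 responses into engineering units and posts them as JSON to a cloud endpoint. The raw frames and the endpoint URL are hypothetical; in a real setup the frames would arrive from the CAN-bus converter rather than being hard-coded.

```python
# Minimal sketch of the three stages in the abstract, with the CAN layer stubbed
# out: decode a few OBD-II mode-01 PID responses into engineering units and post
# them to a cloud endpoint. The endpoint URL and the raw frames are illustrative
# assumptions; real use would read frames from the CAN bus instead.

import json
import urllib.request

def decode_pid(pid: int, a: int, b: int = 0) -> tuple[str, float]:
    """Standard OBD-II mode-01 scaling for a few common PIDs."""
    if pid == 0x0C:                       # engine RPM
        return "rpm", (256 * a + b) / 4.0
    if pid == 0x0B:                       # manifold absolute pressure (kPa)
        return "map_kpa", float(a)
    if pid == 0x05:                       # coolant temperature (deg C)
        return "engine_temp_c", a - 40.0
    raise ValueError(f"unsupported PID 0x{pid:02X}")

def post_to_cloud(readings: dict, url: str = "https://example.org/engine") -> None:
    """Upload one JSON document of readings; no retry or authentication logic."""
    req = urllib.request.Request(url, data=json.dumps(readings).encode(),
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)

if __name__ == "__main__":
    # Hypothetical responses: (pid, data bytes) pairs as they might arrive over CAN.
    frames = [(0x0C, (0x1A, 0xF8)), (0x0B, (0x63,)), (0x05, (0x7B,))]
    readings = dict(decode_pid(pid, *data) for pid, data in frames)
    print(readings)                       # {'rpm': 1726.0, 'map_kpa': 99.0, 'engine_temp_c': 83.0}
    # post_to_cloud(readings)             # uncomment with a real endpoint
```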
Internet-based computer technology on radiotherapy.
Chow, James C L
2017-01-01
Recent rapid development of Internet-based computer technologies has made possible many novel applications in radiation dose delivery. However, the speed of translating these new technologies into radiotherapy can hardly keep pace, owing to the complex commissioning process and quality assurance protocols. Implementing a novel Internet-based technology in radiotherapy requires corresponding design of the application's algorithms and infrastructure, setup of related clinical policies, purchase and development of software and hardware, computer programming and debugging, and national-to-international collaboration. Although such implementation processes are time consuming, some recent computing advancements in radiation dose delivery are still noticeable. In this review, we will present the background and concept of some recent Internet-based computer technologies such as cloud computing, big data processing and machine learning, followed by their potential applications in radiotherapy, such as treatment planning and dose delivery. We will also discuss the current progress of these applications and their impacts on radiotherapy. We will explore and evaluate the expected benefits and challenges in implementation as well.
Using Web Speech Technology with Language Learning Applications
ERIC Educational Resources Information Center
Daniels, Paul
2015-01-01
In this article, the author presents the history of human-to-computer interaction based upon the design of sophisticated computerized speech recognition algorithms. Advancements such as the arrival of cloud-based computing and software like Google's Web Speech API allows anyone with an Internet connection and Chrome browser to take advantage of…
Google Wave: Collaboration Reworked
ERIC Educational Resources Information Center
Rethlefsen, Melissa L.
2010-01-01
Over the past several years, Internet users have become accustomed to Web 2.0 and cloud computing-style applications. It's commonplace and even intuitive to drag and drop gadgets on personalized start pages, to comment on a Facebook post without reloading the page, and to compose and save documents through a web browser. The web paradigm has…
University Internet Services: Problems and Opportunities.
ERIC Educational Resources Information Center
Phan, Dien D.; Chen, Jim Q.
This paper presents the findings of a study on the use of World Wide Web among students at St. Cloud State University, Minnesota, USA. The paper explores problems and challenges on campus Web computing and the relationships among the extent of Web usage, class level, and overall student academic performance. Specifically, the purposes of this…
A Review of Cloud Application Assessment Practices at the University of Ballarat
ERIC Educational Resources Information Center
Wilmott, Deirdre; Knox, Ian
2012-01-01
It has been suggested that traditional assessment practices in tertiary institutions tend not to equip students well for the processes of effective learning in a learning society [1]. This paper reviews alternative Internet based assessment practices used in Library, Business and Education courses at the University of Ballarat, Victoria, Australia…
Remote sensing for detection of termite infestations—Proof of Concept
Frederick Green III; Rachel A. Arango; Charles R. Boardman; Keith J. Bourne; John C. Hermanson; Robert A. Munson
2015-01-01
This paper reports the results of a search to discover the most cost effective and robust method of detecting Reticulitermes flavipes infestations in structural members of remote bridges, homes and other wooden structures and transmitting these results to internet cloud storage thus obviating routine travel to these structures for periodic visual...
ERIC Educational Resources Information Center
Godwin-Jones, Robert
2008-01-01
Creating effective electronic tools for language learning frequently requires large data sets containing extensive examples of actual human language use. Collections of authentic language in spoken and written forms provide developers the means to enrich their applications with real world examples. As the Internet continues to expand…
NASA Astrophysics Data System (ADS)
Parikh, Ashesh; Mehta, Nihal
2015-03-01
Recent advances in Internet browser technologies make it possible to incorporate the advanced functionality of a traditional PACS for viewing DICOM medical images in standard web browsers without the need to pre-install any plug-ins, apps or software. We demonstrate some of the capabilities of standard web browsers, setting the stage for a cloud-based PACS.
Design and Implementation of Marine Information System, and Analysis of Learners' Intention toward
ERIC Educational Resources Information Center
Pan, Yu-Jen; Kao, Jui-Chung; Yu, Te-Cheng
2016-01-01
The goal of this study is to conduct further research and discussion on applying the internet on marine education, utilizing existing technologies such as cloud service, social network, data collection analysis, etc. to construct a marine environment education information system. The content to be explored includes marine education information…
A Secure and Verifiable Outsourced Access Control Scheme in Fog-Cloud Computing
Fan, Kai; Wang, Junxiong; Wang, Xin; Li, Hui; Yang, Yintang
2017-01-01
With the rapid development of big data and Internet of things (IOT), the number of networking devices and data volume are increasing dramatically. Fog computing, which extends cloud computing to the edge of the network can effectively solve the bottleneck problems of data transmission and data storage. However, security and privacy challenges are also arising in the fog-cloud computing environment. Ciphertext-policy attribute-based encryption (CP-ABE) can be adopted to realize data access control in fog-cloud computing systems. In this paper, we propose a verifiable outsourced multi-authority access control scheme, named VO-MAACS. In our construction, most encryption and decryption computations are outsourced to fog devices and the computation results can be verified by using our verification method. Meanwhile, to address the revocation issue, we design an efficient user and attribute revocation method for it. Finally, analysis and simulation results show that our scheme is both secure and highly efficient. PMID:28737733
Human face recognition using eigenface in cloud computing environment
NASA Astrophysics Data System (ADS)
Siregar, S. T. M.; Syahputra, M. F.; Rahmat, R. F.
2018-02-01
Recognizing a single face does not take long to process, but implementing an attendance or security system for a company with many faces to recognize can take a long time. Cloud computing is a computing service performed not on a local device but on Internet-connected data center infrastructure. Cloud computing also provides a scalability solution, as it can increase the resources needed when doing larger data processing. In this research, the eigenface method is applied, training data are collected, and the REST concept is used to expose the resources so that the server can process the data through the required stages. After developing the application, it can be concluded that by implementing eigenfaces and applying the REST concept as the endpoint for sending and receiving the related information as a resource, a model can be formed to perform face recognition.
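A minimal sketch of the eigenface step itself (principal components of face vectors, projection, nearest-neighbour matching) is given below using plain numpy; the random "faces", image size and number of components are placeholders, and the REST/cloud layer described in the abstract is not included.

```python
# Minimal sketch of the eigenface technique: compute principal components of a
# set of face vectors, project a probe image, and match it to the closest
# training face. The random "faces" stand in for real training data.

import numpy as np

def eigenfaces(train: np.ndarray, n_components: int = 8):
    """train: (n_samples, n_pixels). Returns the mean face and component matrix."""
    mean = train.mean(axis=0)
    centered = train - mean
    # SVD of the centered data gives the principal axes (eigenfaces) in Vt.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_components]

def project(face: np.ndarray, mean: np.ndarray, components: np.ndarray) -> np.ndarray:
    """Coordinates of a face image in eigenface space."""
    return components @ (face - mean)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    faces = rng.random((20, 32 * 32))              # 20 fake 32x32 "face" images
    labels = [f"person_{i % 5}" for i in range(20)]
    mean, comps = eigenfaces(faces)
    train_coords = np.array([project(f, mean, comps) for f in faces])

    probe = faces[7] + 0.01 * rng.random(32 * 32)  # slightly perturbed known face
    probe_coords = project(probe, mean, comps)
    nearest = int(np.argmin(np.linalg.norm(train_coords - probe_coords, axis=1)))
    print("matched:", labels[nearest])             # expected: person_2
```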
COMBAT: mobile-Cloud-based cOmpute/coMmunications infrastructure for BATtlefield applications
NASA Astrophysics Data System (ADS)
Soyata, Tolga; Muraleedharan, Rajani; Langdon, Jonathan; Funai, Colin; Ames, Scott; Kwon, Minseok; Heinzelman, Wendi
2012-05-01
The amount of data processed annually over the Internet has crossed the zettabyte boundary, yet this Big Data cannot be efficiently processed or stored using today's mobile devices. Parallel to this explosive growth in data, a substantial increase in mobile compute capability and the advances in cloud computing have brought the state of the art in mobile-cloud computing to an inflection point, where the right architecture may allow mobile devices to run applications utilizing Big Data and intensive computing. In this paper, we propose the MObile Cloud-based Hybrid Architecture (MOCHA), which formulates a solution to permit mobile-cloud computing applications such as object recognition in the battlefield by introducing a mid-stage compute and storage layer, called the cloudlet. MOCHA is built on the key observation that many mobile-cloud applications have the following characteristics: 1) they are compute-intensive, requiring the compute power of a supercomputer, and 2) they use Big Data, requiring a communications link to cloud-based database sources in near real time. In this paper, we describe the operation of MOCHA in battlefield applications, by formulating the aforementioned mobile and cloudlet to be housed within a soldier's vest and inside a military vehicle, respectively, and enabling access to the cloud through high-latency satellite links. We provide simulations using the traditional mobile-cloud approach as well as utilizing MOCHA with a mid-stage cloudlet to quantify the utility of this architecture. We show that the MOCHA platform for mobile-cloud computing promises a future for critical battlefield applications that access Big Data, which is currently not possible using existing technology.
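To illustrate the kind of comparison the simulations above quantify, the toy model below contrasts average request latency for direct mobile-to-cloud access over a satellite link with access via a nearby cloudlet; all latency figures and the cloud-lookup fraction are illustrative assumptions, not results from the paper.

```python
# Toy latency comparison: a direct mobile-to-cloud round trip over a satellite
# link versus offloading to a nearby cloudlet that only contacts the cloud
# occasionally. All figures (ms) are illustrative assumptions.

import random

def direct_to_cloud(n_requests: int) -> float:
    # every request crosses the high-latency satellite link plus cloud processing
    return sum(random.gauss(600, 80) + random.gauss(40, 10) for _ in range(n_requests))

def via_cloudlet(n_requests: int, cloud_fraction: float = 0.1) -> float:
    total = 0.0
    for _ in range(n_requests):
        total += random.gauss(15, 5)                 # local link to the cloudlet
        if random.random() < cloud_fraction:         # occasional lookup in the cloud
            total += random.gauss(600, 80)
    return total

if __name__ == "__main__":
    random.seed(0)
    n = 1000
    print("direct   (avg ms):", round(direct_to_cloud(n) / n, 1))
    print("cloudlet (avg ms):", round(via_cloudlet(n) / n, 1))
```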
Biswas, Amitava; Liu, Chen; Monga, Inder; ...
2016-01-01
For the last few years, there has been tremendous growth in data traffic due to the high adoption rate of mobile devices and cloud computing. The Internet of Things (IoT) will stimulate even further growth. This is increasing the scale and complexity of telecom/internet service provider (SP) and enterprise data centre (DC) compute and network infrastructures. As a result, managing these large network-compute converged infrastructures is becoming complex and cumbersome. To cope, network and DC operators are trying to automate network and system operations, administration and management (OAM) functions. OAM includes all non-functional mechanisms which keep the network running.
Applying analytic hierarchy process to assess healthcare-oriented cloud computing service systems.
Liao, Wen-Hwa; Qiu, Wan-Li
2016-01-01
Numerous differences exist between the healthcare industry and other industries. Difficulties in the business operation of the healthcare industry have continually increased because of the volatility and importance of health care, changes to and requirements of health insurance policies, and the statuses of healthcare providers, which are typically considered not-for-profit organizations. Moreover, because of the financial risks associated with constant changes in healthcare payment methods and constantly evolving information technology, healthcare organizations must continually adjust their business operation objectives; therefore, cloud computing presents both a challenge and an opportunity. As a response to aging populations and the prevalence of the Internet in fast-paced contemporary societies, cloud computing can be used to facilitate the task of balancing the quality and costs of health care. To evaluate cloud computing service systems for use in health care, providing decision makers with a comprehensive assessment method for prioritizing decision-making factors is highly beneficial. Hence, this study applied the analytic hierarchy process, compared items related to cloud computing and health care, executed a questionnaire survey, and then classified the critical factors influencing healthcare cloud computing service systems on the basis of statistical analyses of the questionnaire results. The results indicate that the primary factor affecting the design or implementation of optimal cloud computing healthcare service systems is cost effectiveness, with the secondary factors being practical considerations such as software design and system architecture.
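A minimal sketch of the AHP weighting step applied in the study, deriving criterion priorities from a pairwise comparison matrix and checking consistency, is shown below; the comparison values are illustrative, although the criterion names follow the factors mentioned in the abstract.

```python
# Minimal sketch of the analytic hierarchy process: derive criterion weights
# from a pairwise comparison matrix via its principal eigenvector and check
# consistency. The comparison values below are illustrative assumptions.

import numpy as np

def ahp_weights(pairwise: np.ndarray):
    """Return (priority weights, consistency ratio) for a pairwise comparison matrix."""
    eigvals, eigvecs = np.linalg.eig(pairwise)
    k = int(np.argmax(eigvals.real))
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    n = pairwise.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)            # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)    # Saaty's random index (selected sizes)
    return w, ci / ri

if __name__ == "__main__":
    # Criteria: cost effectiveness, software design, system architecture.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])
    weights, cr = ahp_weights(A)
    print("weights:", np.round(weights, 3), "consistency ratio:", round(cr, 3))
```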
2011-01-01
Background Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged and pre-configured software. Results We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. Conclusion The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing. PMID:21878105
Jia, La-jiang; Jin, Pu-jun
2015-01-01
The present paper analyzes the internal corrosion that occurred in bronze alloy samples from 24 pieces of Early Qin bronze ware. First, samples were processed by grinding, polishing and ultrasonic cleaning to produce a mirror surface. Then, a confocal micro-Raman spectrometer was employed to carry out a spectroscopic study of the inclusions in the samples. The results indicate that the corrosion phases are PbCO3, PbO and Cu2O, which are common corrosion products on bronze alloys. The light-colored circular or massive irregular areas in the metallographic structure of the samples are shown to be Cu2O, indicating that bronze wares are not only easily covered by a red Cu2O corrosion layer, but that their alloy is also easily eroded by atomic oxygen. In other words, Cu2O corrosion takes place in both the interior and exterior parts of the bronze alloy. In addition, Raman spectral analysis shows that the dark grey materials are the lead corrosion products PbCO3 and PbO, indicating a corrosion sequence for lead of Pb --> PbO --> PbCO3. In the as-cast texture of the bronze alloy, lead is usually distributed as independent particles between the different alloy phases. The lead particles in the bronze alloy undergo oxidation to form PbO when buried in the soil, and then react with CO3(2-) dissolved in groundwater to form PbCO3, which is a rather stable lead corrosion product. It can be concluded that external corrosive agents (water, dissolved oxygen, carbonate, etc.) can enter the interior of the bronze ware through passageways between the different phases and cause the alloy to corrode gradually.
Hillebrand, Julia; Hoffmeier, Andreas; Djie Tiong Tjan, Tonny; Sindermann, Juergen R; Schmidt, Christoph; Martens, Sven; Scherer, Mirela
2017-05-01
Left ventricular assist device (LVAD) implantation is a well-established therapy to support patients with end-stage heart failure. However, the operative procedure is associated with severe trauma. Third generation LVADs like the HeartWare assist device (HeartWare, Inc., Framingham, MA, USA) are characterized by enhanced technology despite smaller size. These devices offer new minimally invasive surgical options. Tricuspid regurgitation requiring valve repair is frequent in patients with the need for mechanical circulatory support as it is strongly associated with ischemic and nonischemic cardiomyopathy. We report on HeartWare LVAD implantation and simultaneous tricuspid valve reconstruction through minimally invasive access by partial upper sternotomy to the fifth left intercostal space. Four male patients (mean age 51.72 ± 11.95 years) suffering from chronic heart failure due to dilative (three patients) and ischemic (one patient) cardiomyopathy and also exhibiting concomitant tricuspid valve insufficiency due to annular dilation underwent VAD implantation and tricuspid valve annuloplasty. Extracorporeal circulation was established via the ascending aorta, superior vena cava, and right atrium. In all four cases the LVAD implantation and tricuspid valve repair via partial median sternotomy was successful. During the operative procedure, no conversion to full sternotomy was necessary. One patient needed postoperative re-exploration because of pericardial effusion. No postoperative focal neurologic injury was observed. New generation VADs are advantageous because of the possibility of minimally invasive implantation procedure which can therefore minimize surgical trauma. Concomitant tricuspid valve reconstruction can also be performed simultaneously through partial upper sternotomy. Nevertheless, minimally invasive LVAD implantation is a challenging operative technique. © 2016 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.
An Adaptive Multilevel Security Framework for the Data Stored in Cloud Environment
Dorairaj, Sudha Devi; Kaliannan, Thilagavathy
2015-01-01
Cloud computing is renowned for delivering information technology services based on the internet. Nowadays, organizations are interested in moving their massive data and computations into the cloud to reap its significant benefits of on-demand service, resource pooling, and rapid elasticity, which help to satisfy dynamically changing infrastructure demand without the burden of owning, managing, and maintaining it. Since the data need to be secured throughout their life cycle, security of the data in the cloud is a major challenge to focus on, because the data reside on a third party's premises. Any uniform simple or high-level security method applied to all the data either compromises the sensitive data or proves to be too costly with increased overhead. Any common multiple method for all data becomes vulnerable when the common security pattern is identified in the event of a successful attack on any information, and also encourages more attacks on all other data. This paper suggests an adaptive multilevel security framework based on cryptography techniques that provides adequate security for classified data stored in the cloud. The proposed security system acclimates well to the cloud environment and is also customizable and better able to meet the required level of security for data of different sensitivities, which change with business needs and commercial conditions. PMID:26258165
An Adaptive Multilevel Security Framework for the Data Stored in Cloud Environment.
Dorairaj, Sudha Devi; Kaliannan, Thilagavathy
2015-01-01
Cloud computing is renowned for delivering information technology services based on the internet. Nowadays, organizations are interested in moving their massive data and computations into the cloud to reap its significant benefits of on-demand service, resource pooling, and rapid elasticity, which help to satisfy dynamically changing infrastructure demand without the burden of owning, managing, and maintaining it. Since the data need to be secured throughout their life cycle, security of the data in the cloud is a major challenge to focus on, because the data reside on a third party's premises. Any uniform simple or high-level security method applied to all the data either compromises the sensitive data or proves to be too costly with increased overhead. Any common multiple method for all data becomes vulnerable when the common security pattern is identified in the event of a successful attack on any information, and also encourages more attacks on all other data. This paper suggests an adaptive multilevel security framework based on cryptography techniques that provides adequate security for classified data stored in the cloud. The proposed security system acclimates well to the cloud environment and is also customizable and better able to meet the required level of security for data of different sensitivities, which change with business needs and commercial conditions.
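A minimal sketch of the level-based dispatch idea is given below: data classified at different sensitivity levels is routed to different protection routines before storage. The level names are illustrative, and the use of Fernet from the third-party cryptography package is an assumption made for the example; the paper's actual cryptographic techniques are not reproduced here.

```python
# Minimal sketch of an adaptive multilevel idea: each sensitivity level is mapped
# to a different protection routine before the data is uploaded to the cloud.
# The level names and the use of Fernet (third-party "cryptography" package) are
# illustrative assumptions, not the paper's actual scheme.

from cryptography.fernet import Fernet

# One key per sensitivity level; higher levels could layer on extra measures
# (key splitting, re-encryption, stricter key rotation) on top of this.
KEYS = {"public": None,
        "internal": Fernet(Fernet.generate_key()),
        "confidential": Fernet(Fernet.generate_key())}

def protect(record: bytes, level: str) -> bytes:
    """Return the record as it would be stored in the cloud for a given level."""
    cipher = KEYS[level]
    if cipher is None:                 # public data is stored as-is
        return record
    return cipher.encrypt(record)      # symmetric authenticated encryption

if __name__ == "__main__":
    for level in ("public", "internal", "confidential"):
        blob = protect(b"customer-record-2024-09-01", level)
        print(level, blob[:16], "...")
```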
Tag Clouds in the Blogosphere: Electronic Literacy and Social Networking
ERIC Educational Resources Information Center
Godwin-Jones, Robert
2006-01-01
Electronic literacy today is a moving target. How and why people read and write online are evolving at the fast pace of Internet time. One of the most striking developments in the past few years has been how new social networking phenomena on the Web like community tagging, shared bookmarking, and blogs have created convergences between consumers…
ERIC Educational Resources Information Center
Halac, Hicran Hanim; Cabuk, Alper
2013-01-01
Depending on the evolving technological possibilities, distance and online education applications have gradually gained more significance in the education system. Regarding the issues, such as advancements in the server services, disc capacity, cloud computing opportunities resulting from the increase in the number of the broadband internet users,…
Collaborative Cloud: A New Model for e-Learning
ERIC Educational Resources Information Center
Liao, Jian; Wang, Minhong; Ran, Weijia; Yang, Stephen J. H.
2014-01-01
The number of learners using e-learning has been increasing at an enormous rate in the past decade due to easy access to higher educational resources via the Internet. On the other hand, the number of teachers in most universities is growing slowly. As a result, instructional problems have emerged due to the lack of sufficient support to learners…
An Innovative Research on the Usage of Facebook in the Higher Education Context of Hong Kong
ERIC Educational Resources Information Center
Lam, Louis
2012-01-01
Teaching and learning are undergoing a dramatic change due to advancements in telecommunications and IT. Increasingly, online learning platforms are playing an important role in higher education. The maturity of the Internet and the emergence of various cloud services catalyse the development of these platforms and shape student learning behaviour. An example is…
Software Simplifies the Sharing of Numerical Models
NASA Technical Reports Server (NTRS)
2014-01-01
To ease the sharing of climate models with university students, Goddard Space Flight Center awarded SBIR funding to Reston, Virginia-based Parabon Computation Inc., a company that specializes in cloud computing. The firm developed a software program capable of running climate models over the Internet, and also created an online environment for people to collaborate on developing such models.
Combining Fog Computing with Sensor Mote Machine Learning for Industrial IoT.
Lavassani, Mehrzad; Forsström, Stefan; Jennehag, Ulf; Zhang, Tingting
2018-05-12
Digitalization is a global trend becoming ever more important to our connected and sustainable society. This trend also affects industry where the Industrial Internet of Things is an important part, and there is a need to conserve spectrum as well as energy when communicating data to a fog or cloud back-end system. In this paper we investigate the benefits of fog computing by proposing a novel distributed learning model on the sensor device and simulating the data stream in the fog, instead of transmitting all raw sensor values to the cloud back-end. To save energy and to communicate as few packets as possible, the updated parameters of the learned model at the sensor device are communicated in longer time intervals to a fog computing system. The proposed framework is implemented and tested in a real world testbed in order to make quantitative measurements and evaluate the system. Our results show that the proposed model can achieve a 98% decrease in the number of packets sent over the wireless link, and the fog node can still simulate the data stream with an acceptable accuracy of 97%. We also observe an end-to-end delay of 180 ms in our proposed three-layer framework. Hence, the framework shows that a combination of fog and cloud computing with a distributed data modeling at the sensor device for wireless sensor networks can be beneficial for Industrial Internet of Things applications.
Combining Fog Computing with Sensor Mote Machine Learning for Industrial IoT
Lavassani, Mehrzad; Jennehag, Ulf; Zhang, Tingting
2018-01-01
Digitalization is a global trend becoming ever more important to our connected and sustainable society. This trend also affects industry where the Industrial Internet of Things is an important part, and there is a need to conserve spectrum as well as energy when communicating data to a fog or cloud back-end system. In this paper we investigate the benefits of fog computing by proposing a novel distributed learning model on the sensor device and simulating the data stream in the fog, instead of transmitting all raw sensor values to the cloud back-end. To save energy and to communicate as few packets as possible, the updated parameters of the learned model at the sensor device are communicated in longer time intervals to a fog computing system. The proposed framework is implemented and tested in a real world testbed in order to make quantitative measurements and evaluate the system. Our results show that the proposed model can achieve a 98% decrease in the number of packets sent over the wireless link, and the fog node can still simulate the data stream with an acceptable accuracy of 97%. We also observe an end-to-end delay of 180 ms in our proposed three-layer framework. Hence, the framework shows that a combination of fog and cloud computing with a distributed data modeling at the sensor device for wireless sensor networks can be beneficial for Industrial Internet of Things applications. PMID:29757227
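A toy version of the sensor-side modelling idea evaluated above — keep a small incremental model on the device and transmit only its parameters at long intervals, letting the fog node simulate the stream — is sketched below; the signal, reporting interval and choice of a simple mean/variance model are illustrative assumptions.

```python
# Toy illustration: instead of sending every raw sample to the back-end, the
# sensor keeps a small running model (here a mean/variance via Welford's
# algorithm) and transmits only its parameters at long intervals; the fog node
# then simulates the stream from those parameters. Values are illustrative.

import random

class RunningModel:
    """Sensor-side incremental mean and variance."""
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
    def update(self, x: float):
        self.n += 1
        d = x - self.mean
        self.mean += d / self.n
        self.m2 += d * (x - self.mean)
    def params(self):
        var = self.m2 / (self.n - 1) if self.n > 1 else 0.0
        return self.mean, var

def fog_simulate(mean: float, var: float, n: int):
    """Fog node regenerates a plausible stream from the transmitted parameters."""
    return [random.gauss(mean, var ** 0.5) for _ in range(n)]

if __name__ == "__main__":
    random.seed(0)
    model, sent_packets, raw_packets = RunningModel(), 0, 0
    mean = var = 0.0
    for t in range(1, 1001):
        model.update(20.0 + random.gauss(0, 0.5))   # e.g. a temperature reading
        raw_packets += 1
        if t % 100 == 0:                            # transmit model parameters only
            mean, var = model.params()
            sent_packets += 1
    print(f"packets: raw={raw_packets}, model-updates={sent_packets}")
    print("fog-simulated sample:", [round(v, 2) for v in fog_simulate(mean, var, 3)])
```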
NASA Astrophysics Data System (ADS)
Ham, J. M.
2016-12-01
New microprocessor boards, open-source sensors, and cloud infrastructure developed for the Internet of Things (IoT) can be used to create low-cost monitoring systems for environmental research. This project describes two applications in soil science and hydrology: 1) remote monitoring of the soil temperature regime near oil and gas operations to detect the thermal signature associated with the natural source zone degradation of hydrocarbon contaminants in the vadose zone, and 2) remote monitoring of soil water content near the surface as part of a global citizen science network. In both cases, prototype data collection systems were built around the cellular (2G/3G) "Electron" microcontroller (www.particle.io). This device allows connectivity to the cloud using a low-cost global SIM and data plan. The systems have cellular connectivity in over 100 countries and data can be logged to the cloud for storage. Users can view data in real time over any internet connection or via their smartphones. For both projects, data logging, storage, and visualization were done using IoT services like ThingSpeak (thingspeak.com). The soil thermal monitoring system was tested on experimental plots in Colorado, USA, to evaluate the accuracy and reliability of different temperature sensors and 3D printed housings. The soil water experiment included comparing open-source capacitance-based sensors with commercial versions. Results demonstrate the power of leveraging IoT technology for field research.
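A minimal sketch of logging a reading to an IoT dashboard channel over HTTP, in the spirit of the ThingSpeak-based setup described above, is shown below; the write API key is a placeholder and the field assignments are assumptions that should be checked against the service's current API documentation.

```python
# Minimal sketch of posting a field reading to an IoT dashboard service over
# HTTP. The write API key is a placeholder; field numbering follows ThingSpeak's
# field1..field8 convention but should be verified against current documentation.

import urllib.parse
import urllib.request

def log_reading(api_key: str, soil_temp_c: float, soil_vwc: float) -> None:
    params = urllib.parse.urlencode({
        "api_key": api_key,
        "field1": f"{soil_temp_c:.2f}",   # soil temperature
        "field2": f"{soil_vwc:.3f}",      # volumetric water content
    })
    with urllib.request.urlopen(f"https://api.thingspeak.com/update?{params}") as resp:
        print("entry id:", resp.read().decode())   # "0" typically means the update was rejected

if __name__ == "__main__":
    log_reading("YOUR_WRITE_KEY", soil_temp_c=14.7, soil_vwc=0.231)
```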
Internet Hospitals in China: Cross-Sectional Survey
Lin, Lingyan; Fan, Si; Lin, Fen; Wang, Long; Guo, Tongjun; Ma, Chuyang; Zhang, Jingkun; Chen, Yixin
2017-01-01
Background The Internet hospital, an innovative approach to providing health care, is rapidly developing in China because it has the potential to provide widely accessible outpatient service delivery via Internet technologies. To date, China’s Internet hospitals have not been systematically investigated. Objective The aim of this study was to describe the characteristics of China’s Internet hospitals, and to assess their health service capacity. Methods We searched Baidu, the popular Chinese search engine, to identify Internet hospitals, using search terms such as “Internet hospital,” “web hospital,” or “cloud hospital.” All Internet hospitals in mainland China were eligible for inclusion if they were officially registered. Our search was carried out until March 31, 2017. Results We identified 68 Internet hospitals, of which 43 have been put into use and 25 were under construction. Of the 43 established Internet hospitals, 13 (30%) were in the hospital informatization stage, 24 (56%) were in the Web ward stage, and 6 (14%) were in full Internet hospital stage. Patients accessed outpatient service delivery via website (74%, 32/43), app (42%, 18/43), or offline medical consultation facility (37%, 16/43) from the Internet hospital. Furthermore, 25 (58%) of the Internet hospitals asked doctors to deliver health services at a specific Web clinic, whereas 18 (42%) did not. The consulting methods included video chat (60%, 26/43), telephone (19%, 8/43), and graphic message (28%, 12/43); 13 (30%) Internet hospitals cannot be consulted online any more. Only 6 Internet hospitals were included in the coverage of health insurance. The median number of doctors available online was zero (interquartile range [IQR] 0 to 5; max 16,492). The median consultation fee per time was ¥20 (approximately US $2.90, IQR ¥0 to ¥200). Conclusions Internet hospitals provide convenient outpatient service delivery. However, many of the Internet hospitals are not yet mature and are faced with various issues such as online doctor scarcity and the unavailability of health insurance coverage. China’s Internet hospitals are heading in the right direction to improve provision of health services, but much more remains to be done. PMID:28676472
Cloud-based adaptive exon prediction for DNA analysis
Putluri, Srinivasareddy; Fathima, Shaik Yasmeen
2018-01-01
Cloud computing offers significant research and economic benefits to healthcare organisations. Cloud services provide a safe place for storing and managing large amounts of sensitive data. Under the conventional flow of gene information, gene sequence laboratories send out raw and inferred information via the Internet to several sequence libraries. DNA sequencing storage costs will be minimised by the use of cloud services. In this study, the authors put forward a novel genomic informatics system using Amazon Cloud Services, where genomic sequence information is stored and accessed for processing. True identification of exon regions in a DNA sequence is a key task in bioinformatics, which helps in disease identification and drug design. The three-base periodicity property of exons forms the basis of all exon identification techniques. Adaptive signal processing techniques have been found to be promising in comparison with several other methods. Several adaptive exon predictors (AEPs) are developed using variable normalised least mean square and its maximum normalised variants to reduce computational complexity. Finally, performance evaluation of the various AEPs is done based on measures such as sensitivity, specificity and precision using various standard genomic datasets taken from the National Center for Biotechnology Information genomic sequence database. PMID:29515813
Sukič, Primož; Štumberger, Gorazd
2017-05-13
Clouds moving at a high speed in front of the Sun can cause step changes in the output power of photovoltaic (PV) power plants, which can lead to voltage fluctuations and stability problems in the connected electricity networks. These effects can be reduced effectively by proper short-term cloud passing forecasting and suitable PV power plant output power control. This paper proposes a low-cost Internet of Things (IoT)-based solution for intra-minute cloud passing forecasting. The hardware consists of a Raspberry Pi 3 Model B with a WiFi connection and an OmniVision OV5647 sensor with a mounted wide-angle lens, a circular polarizing (CPL) filter and a neutral density (ND) filter. The completely new algorithm for cloud passing forecasting uses the green and blue colors in the photo to determine the position of the Sun, to recognize the clouds, and to predict their movement. The image processing is performed in several stages, selectively considering only the small part of the photo relevant to the movement of the clouds in the vicinity of the Sun over the next minute. The proposed algorithm is compact, fast and suitable for implementation on low-cost processors with low computation power. The speed of the cloud parts closest to the Sun is used to predict when the clouds will cover the Sun. WiFi communication is used to transmit this data to the PV power plant control system in order to decrease the output power slowly and smoothly.
Sukič, Primož; Štumberger, Gorazd
2017-01-01
Clouds moving at a high speed in front of the Sun can cause step changes in the output power of photovoltaic (PV) power plants, which can lead to voltage fluctuations and stability problems in the connected electricity networks. These effects can be reduced effectively by proper short-term cloud passing forecasting and suitable PV power plant output power control. This paper proposes a low-cost Internet of Things (IoT)-based solution for intra-minute cloud passing forecasting. The hardware consists of a Raspberry Pi 3 Model B with a WiFi connection and an OmniVision OV5647 sensor with a mounted wide-angle lens, a circular polarizing (CPL) filter and a neutral density (ND) filter. The completely new algorithm for cloud passing forecasting uses the green and blue colors in the photo to determine the position of the Sun, to recognize the clouds, and to predict their movement. The image processing is performed in several stages, selectively considering only the small part of the photo relevant to the movement of the clouds in the vicinity of the Sun over the next minute. The proposed algorithm is compact, fast and suitable for implementation on low-cost processors with low computation power. The speed of the cloud parts closest to the Sun is used to predict when the clouds will cover the Sun. WiFi communication is used to transmit this data to the PV power plant control system in order to decrease the output power slowly and smoothly. PMID:28505078
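A toy version of the colour-based idea in the algorithm above is sketched below: clear sky shows a high blue-to-green ratio while clouds do not, so a cloud mask can be derived from the two channels and the approach of the nearest cloud toward the Sun estimated from two frames. The threshold, synthetic frames and timing are illustrative assumptions, not the published algorithm.

```python
# Toy sketch of colour-based cloud passing forecasting: build a cloud mask from
# the blue and green channels and estimate how soon the nearest cloud reaches
# the Sun from two consecutive frames. Thresholds and frames are illustrative.

import numpy as np

def cloud_mask(rgb: np.ndarray, ratio_threshold: float = 1.3) -> np.ndarray:
    """True where the pixel looks like cloud (blue/green ratio close to 1)."""
    g = rgb[..., 1].astype(float) + 1e-6
    b = rgb[..., 2].astype(float)
    return (b / g) < ratio_threshold

def time_to_cover_sun(mask_t0, mask_t1, sun_xy, dt_s: float = 5.0) -> float:
    """Estimate seconds until the nearest cloud reaches the Sun position."""
    def nearest_cloud_distance(mask):
        ys, xs = np.nonzero(mask)
        if len(xs) == 0:
            return np.inf
        return np.min(np.hypot(xs - sun_xy[0], ys - sun_xy[1]))
    d0, d1 = nearest_cloud_distance(mask_t0), nearest_cloud_distance(mask_t1)
    speed = (d0 - d1) / dt_s                     # pixels per second toward the Sun
    return d1 / speed if speed > 0 else np.inf   # inf => cloud not approaching

if __name__ == "__main__":
    h, w = 120, 160
    sky = np.zeros((h, w, 3), dtype=np.uint8)
    sky[..., 1], sky[..., 2] = 90, 200           # clear sky: strong blue channel
    frame0, frame1 = sky.copy(), sky.copy()
    frame0[10:30, 10:40] = 180                   # grey cloud patch, far from the Sun
    frame1[10:30, 30:60] = 180                   # same cloud, 20 px closer after 5 s
    sun = (150, 20)                              # (x, y) position of the Sun in the image
    t = time_to_cover_sun(cloud_mask(frame0), cloud_mask(frame1), sun)
    print(f"estimated time until the Sun is covered: {t:.0f} s")
```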
Bhavani, Selvaraj Rani; Senthilkumar, Jagatheesan; Chilambuchelvan, Arul Gnanaprakasam; Manjula, Dhanabalachandran; Krishnamoorthy, Ramasamy; Kannan, Arputharaj
2015-03-27
The Internet has greatly enhanced health care, helping patients stay up-to-date on medical issues and general knowledge. Many cancer patients use the Internet for cancer diagnosis and related information. Recently, cloud computing has emerged as a new way of delivering health services, but currently there is no generic and fully automated cloud-based self-management intervention for breast cancer patients, as practical guidelines are lacking. We investigated the prevalence and predictors of cloud use for medical diagnosis among women with breast cancer to gain insight into meaningful usage parameters for evaluating the use of a generic, fully automated cloud-based self-intervention, by assessing how breast cancer survivors use a generic self-management model. This goal was implemented and evaluated with a new prototype called "CIMIDx", based on representative association rules that support the diagnosis of medical images (mammograms). The proposed Cloud-Based System Support Intelligent Medical Image Diagnosis (CIMIDx) prototype includes two modules. The first is the design and development of the CIMIDx training and test cloud services. Deployed in the cloud, the prototype can be used for diagnosis and screening mammography by assessing the cancers detected, tumor sizes, histology, and stage classification accuracy. To analyze the prototype's classification accuracy, we conducted an experiment with data provided by clients. Second, by monitoring cloud server requests, the CIMIDx usage statistics were recorded for the cloud-based self-intervention groups. We conducted an evaluation of the CIMIDx cloud service usage, in which browsing functionalities were evaluated from the end-user's perspective. We performed several experiments to validate the CIMIDx prototype for breast health issues. The first set of experiments evaluated the diagnostic performance of the CIMIDx framework. We collected medical information from 150 breast cancer survivors from hospitals and health centers. The CIMIDx prototype achieved high sensitivity of up to 99.29% and accuracy of up to 98%. The second set of experiments evaluated CIMIDx use for breast health issues, using t tests and Pearson chi-square tests to assess differences, and binary logistic regression to estimate the odds ratio (OR) for the predictors of CIMIDx use. For the prototype usage statistics, we interviewed 114 (76.0%) of the same 150 breast cancer survivors through self-report questionnaires from the CIMIDx blogs. The frequency of log-ins per person ranged from 0 to 30, and the total duration per person from 0 to 1500 minutes (25 hours). The 114 participants continued logging in to all phases, resulting in an intervention adherence rate of 44.3% (95% CI 33.2-55.9). The overall performance of the prototype fell into the good category for reported usefulness (P=.77), overall satisfaction (P=.31), ease of navigation (P=.89), and user friendliness (P=.31). Positive evaluations given by 100 participants via a Web-based questionnaire supported our hypothesis. The present study shows that women felt favorably about the use of a generic, fully automated cloud-based self-management prototype. The study also demonstrated that the CIMIDx prototype resulted in the detection of more cancers in screening and diagnosing patients, with an increased accuracy rate.
Development of a cloud-based system for remote monitoring of a PVT panel
NASA Astrophysics Data System (ADS)
Saraiva, Luis; Alcaso, Adérito; Vieira, Paulo; Ramos, Carlos Figueiredo; Cardoso, Antonio Marques
2016-10-01
The paper presents a monitoring system developed for a solar energy conversion system known as a photovoltaic-thermal (PVT) panel. The project was implemented using two embedded microcontroller platforms (Arduino Leonardo and Arduino Yún), wireless transmission systems (Wi-Fi and XBee) and cloud computing (Google Cloud). The main objective of the project is to provide remote access and real-time monitoring of data such as electrical current, electrical voltage, input fluid temperature, output fluid temperature, backward fluid temperature, upper PV glass temperature, lower PV glass temperature, ambient temperature, solar radiation, wind speed, wind direction and fluid mass flow. This project demonstrates the feasibility of using inexpensive microcontroller platforms and free Internet services on the Web to support the remote study of renewable energy systems, eliminating the need to acquire dedicated systems that are typically more expensive and more limited in the kind of processing proposed.
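As a rough illustration of the data path described above (microcontroller readings pushed to a cloud service), the following Python sketch posts a set of PVT sensor readings to a hypothetical HTTP endpoint. The URL and field names are assumptions, not the authors' actual Google Cloud configuration.

```python
# Minimal sketch, assuming a hypothetical HTTP endpoint that accepts JSON
# sensor readings; not the authors' actual Google Cloud setup.
import json
import time
import urllib.request

ENDPOINT = "https://example.com/pvt/readings"   # placeholder URL

def post_reading(reading: dict) -> int:
    body = json.dumps(reading).encode("utf-8")
    req = urllib.request.Request(
        ENDPOINT, data=body,
        headers={"Content-Type": "application/json"}, method="POST")
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.status

reading = {
    "timestamp": time.time(),
    "pv_current_a": 3.2,          # electrical current
    "pv_voltage_v": 31.5,         # electrical voltage
    "fluid_in_c": 24.1,           # input fluid temperature
    "fluid_out_c": 38.7,          # output fluid temperature
    "ambient_c": 21.0,
    "irradiance_w_m2": 812.0,     # solar radiation
}
print(post_reading(reading))
```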
[Antimony and other heavy metals in metallic kitchen ware].
Ishiwata, H; Sugita, T; Yoshihira, K
1989-01-01
The antimony in metallic kitchen ware was determined. The content of this element in metals used for the production or repairing of utensils, containers and packaging which come in contact with foods is regulated and should be less than 5% under the Japanese Food Sanitation Law. In eight metallic samples, antimony was detected in solder used for the production of a can for green tea and an eggbeater. The contents were 1.30% in the former and 1.90% in the latter. No antimony was detected in solder used for a cookie cutter. A sample of solder used for electric work, not for food utensils, contained 0.81% of antimony. In other metallic utensils which come in contact with food, such as aluminum foil, a brass spoon, a stainless steel fork, a wire netting, and an iron rock for vegetable color stabilizing, antimony was not detected at a 0.05% detection limit. A qualitative test using rhodamine B also showed positive results in only three solder samples. Lead concentrations in solder used for the kitchen ware ranged from 39.3% to 51.3%. These concentrations were higher than the 20% limit on lead content set by the Law. No cadmium was detected in any samples.
NASA Astrophysics Data System (ADS)
Pérez-Consuegra, Nicolás; Parra, Mauricio; Jaramillo, Carlos; Silvestro, Daniele; Echeverri, Sebastián; Montes, Camilo; Jaramillo, José María; Escobar, Jaime
2018-01-01
The Cocinetas Basin in the Guajira Peninsula, the northernmost tip of South America, today has a dry climate with low rainfall (<500 mm/yr), a long dry season (>ten months) and no year-long rivers or permanent standing bodies of fresh water. In contrast, the fossil and geological record indicate that the Cocinetas Basin was much wetter during the Miocene-Pliocene (∼17-2.8 Ma). Water needed to sustain the paleofauna could either have originated from local sources or been brought by a larger river system (e.g. proto Magdalena/Orinoco river) with headwaters either in Andean ranges or the Guyana shield. We present a provenance study of the Pliocene Ware Formation, using petrographic analysis of conglomerate clasts and heavy minerals, and U-Pb dating of 140 detrital zircons. Clasts and heavy minerals are typical of ensialic metamorphic and igneous sources. The detrital zircon age distribution indicates the Guajira ranges as the most probable sediment source. The overall results indicate that the fluvial system of the Ware Formation drained the surrounding ranges. The water was probably derived by local precipitation onto the Guajira peninsula.
NASA ROVER, Tackling Citizen Science With Grand Challenges and Everyday Problems
NASA Technical Reports Server (NTRS)
Crecelius, Sarah; Chambers, Lin; Rogerson, Tina
2015-01-01
ROVER is the Citizen Science arm of the NASA Clouds and the Earth's Radiant Energy System (CERES) Students' Cloud Observations On-Line (S'COOL) Project. Since 2007, participants around the world have been making and reporting ground truth observations of clouds to assist in the validation of the NASA CERES satellite instrument. NASA scientists are very interested in learning how clouds affect our atmosphere, weather, and climate (relating to climate change). It is the clouds, in part, that affect the overall temperature and energy balance of the Earth. The more we know about clouds, the more we will know about our Earth as a system and citizen scientists are an important piece of that puzzle! As a ROVER cloud observer, all participants follow simple online tutorials to collect data on cloud type, height, cover and related conditions. Observations are sent to NASA to be matched to similar information obtained from satellites and sent back to participants for comparison and analysis. The supporting ROVER website houses a searchable database archiving all participant reports and matching satellite data. By involving Citizen Scientists in cloud observations and reporting we can gain a valuable set of data that would have been previously unavailable to science teams due to funding, manpower, and resource limitations or would have taken an unreasonable amount of time to collect. Reports from a wide range of Citizen Scientist locations are helpful to assess the satellite data under different conditions. With nothing more than their eyes and an internet connection participants provide a different perspective and analysis of clouds, adding to a more complete picture of what's happening in the atmosphere in which we live.
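The matching step described above (pairing a ground report with satellite data taken near the same time and place) can be summarized with a simple time-and-distance window. The sketch below is a generic illustration under assumed thresholds, not the S'COOL/ROVER matching code.

```python
# Generic illustration of matching ground cloud reports to satellite
# overpasses by time and distance; the thresholds are assumptions.
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Observation:
    time_utc: float   # seconds since epoch
    lat: float
    lon: float

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def match(report, overpasses, max_dt_s=900, max_km=20.0):
    """Return overpasses within 15 min and 20 km of the ground report."""
    return [o for o in overpasses
            if abs(o.time_utc - report.time_utc) <= max_dt_s
            and haversine_km(report.lat, report.lon, o.lat, o.lon) <= max_km]
```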
The CERES S'COOL Project: Development and Operational Phases
NASA Technical Reports Server (NTRS)
Chambers, Lin H.; Young, David F.; Racel, Anne M.
1998-01-01
As part of NASA's Mission to Planet Earth, the first Clouds and the Earth's Radiant Energy System (CERES) instrument will be launched on the Tropical Rainfall Measuring Mission (TRMM) spacecraft from the Tanegashima launch site in Japan in November 1997. The instrument will measure the radiation budget (incoming and outgoing radiant energy) of the Earth. The major feature of interest is clouds, which play a very strong role in regulating our climate. CERES will identify clear and cloudy regions and determine cloud physical and microphysical properties using imager data from a companion instrument. Validation efforts for the remote sensing algorithms will be intensive. As one component of the validation, the S'COOL (Students' Cloud Observations On-Line) project will involve school children around the globe in making ground truth measurements at the time of a CERES overpass. They will report cloud type, height, fraction, and opacity, as well as the local surface conditions. Their observations will be collected at the NASA Langley Distributed Active Archive Center (DAAC) and made available over the Internet for educational purposes as well as for use by the CERES Science Team in validation efforts. Pilot testing of the S'COOL project began in January 1997 with two local schools in Southeastern Virginia and one remote site in Montana. National testing in April 1997 involved 8 schools (grades 3 to high school) across the United States. Global testing will be carried out in October 1997. Details of the S'COOL project, which is mainly Internet-based, are being developed in each of these phases according to feedback received from participants. In 1998, when the CERES instrument is operational, a global observer network should be in place providing useful information to the scientists and learning opportunities to the students. Broad participation in the S'COOL project is planned, both to obtain data from a wide range of geographic areas, and to involve as many students as possible in learning about clouds and atmospheric science. This paper reports on the development phase of the S'COOL project, including the reaction of the teachers and students who have been involved. It describes the operational state of the S'COOL network, and identifies opportunities for additional participants.
Deception Using an SSH Honeypot
2017-09-01
the device itself but also the device’s cloud and mobile infrastructure. This increase in unsecured devices connected to the Internet presents...have SSH enabled on their systems without knowledge that this service is running. Computer -security professionals use several techniques to gain...early 2000s. Honeypots are decoy computer systems intended for no other purpose than to collect data on attackers. They gather information about
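The report excerpt above describes honeypots as decoy systems that exist only to record attacker activity. As a loose illustration (not the honeypot used in the report, and not a real SSH implementation), the sketch below listens on an assumed port, logs the source address and whatever banner the client sends, and then disconnects.

```python
# Loose illustration of a low-interaction "honeypot" listener: it logs the
# remote address and the client's banner, then disconnects. It does not
# implement the SSH protocol and is not the system described in the report.
import socket
import datetime

HOST, PORT = "0.0.0.0", 2222   # assumed non-privileged port

def run():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen()
        while True:
            conn, addr = srv.accept()
            with conn:
                conn.sendall(b"SSH-2.0-OpenSSH_7.4\r\n")   # decoy banner
                conn.settimeout(5.0)
                try:
                    banner = conn.recv(256)
                except socket.timeout:
                    banner = b""
                print(datetime.datetime.utcnow().isoformat(),
                      addr[0], banner.decode(errors="replace").strip())

if __name__ == "__main__":
    run()
```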
ERIC Educational Resources Information Center
Crearie, Linda
2016-01-01
Technological advances over the last decade have had a significant impact on the teaching and learning experiences students encounter today. We now take technologies such as Web 2.0, mobile devices, cloud computing, podcasts, social networking, super-fast broadband, and connectedness for granted. So what about the student use of these types of…
CIRA: Cooperative Institute for Research in the Atmosphere Newsletter, Volume 28, Fall 2007
NASA Technical Reports Server (NTRS)
McInnis-Efaw, Mary (Editor); Leinen, Laura (Editor)
2007-01-01
The articles in this issue of the Cooperative Institute for Research in the Atmosphere (CIRA) Newsletter are: "Unmanned Aerial Systems: An Overview of NOAA's Unmanned Aircraft System Program," "International Activities: Weather Briefings and Training Via the Internet," "Cloudsat's One-Year Anniversary: An Abundance of Exciting New Cloud Observations," and "The Migration of NCAR'S Auto-Nowcaster into NWS AWIPS."
2014-09-30
fingerprint sensor etc. Secure application execution; trust established outwards with normal world apps, with internet/cloud apps...Xilinx Zynq Security Components and Capabilities: security features inherited from FPGAs, Zynq Secure Boot, TrustZone integration...Device DNA and User
A scoping review of cloud computing in healthcare.
Griebel, Lena; Prokosch, Hans-Ulrich; Köpcke, Felix; Toddenroth, Dennis; Christoph, Jan; Leb, Ines; Engel, Igor; Sedlmayr, Martin
2015-03-19
Cloud computing is a recent and fast growing area of development in healthcare. Ubiquitous, on-demand access to virtually endless resources in combination with a pay-per-use model allow for new ways of developing, delivering and using services. Cloud computing is often used in an "OMICS-context", e.g. for computing in genomics, proteomics and molecular medicine, while other fields of application still seem to be underrepresented. Thus, the objective of this scoping review was to identify the current state and hot topics in research on cloud computing in healthcare beyond this traditional domain. MEDLINE was searched in July 2013 and in December 2014 for publications containing the terms "cloud computing" and "cloud-based". Each journal and conference article was categorized and summarized independently by two researchers who consolidated their findings. 102 publications have been analyzed and 6 main topics have been found: telemedicine/teleconsultation, medical imaging, public health and patient self-management, hospital management and information systems, therapy, and secondary use of data. Commonly used features are broad network access for sharing and accessing data and rapid elasticity to dynamically adapt to computing demands. Eight articles favor the pay-for-use characteristics of cloud-based services avoiding upfront investments. Nevertheless, while 22 articles present very general potentials of cloud computing in the medical domain and 66 articles describe conceptual or prototypic projects, only 14 articles report on successful implementations. Further, in many articles cloud computing is seen as an analogy to internet-/web-based data sharing, and the characteristics of the particular cloud computing approach are unfortunately not really illustrated. Even though cloud computing in healthcare is of growing interest, only a few successful implementations exist so far, and many papers just use the term "cloud" synonymously for "using virtual machines" or "web-based" with no described benefit of the cloud paradigm. The biggest threat to adoption in the healthcare domain comes from involving external cloud partners: many issues of data safety and security are still to be solved. Until then, cloud computing is favored more for singular, individual features such as elasticity, pay-per-use and broad network access, rather than as the cloud paradigm on its own.
Public Auditing with Privacy Protection in a Multi-User Model of Cloud-Assisted Body Sensor Networks
Li, Song; Cui, Jie; Zhong, Hong; Liu, Lu
2017-01-01
Wireless Body Sensor Networks (WBSNs) are gaining importance in the era of the Internet of Things (IoT). The modern medical system is a particular area where the WBSN techniques are being increasingly adopted for various fundamental operations. Despite such increasing deployments of WBSNs, issues such as the limited size, capabilities, and data processing capacities of the sensor devices restrain their adoption in resource-demanding applications. Though providing computing and storage supplements from cloud servers can potentially enrich the capabilities of WBSN devices, data security is one of the prevailing issues that affect the reliability of cloud-assisted services. Sensitive applications such as modern medical systems demand assurance of the privacy of the users' medical records stored in distant cloud servers. Since it is economically impossible to set up private cloud servers for every client, auditing data security managed in the remote servers has necessarily become an integral requirement of WBSNs' applications relying on public cloud servers. To this end, this paper proposes a novel certificateless public auditing scheme with integrated privacy protection. The multi-user model in our scheme supports groups of users in storing and sharing data, thus exhibiting the potential for WBSNs' deployments within community environments. Furthermore, our scheme enriches user experiences by offering public verifiability, forward security mechanisms and revocation of illegal group members. Experimental evaluations demonstrate the security effectiveness of our proposed scheme under the Random Oracle Model (ROM) by outperforming existing cloud-assisted WBSN models. PMID:28475110
Now and Next-Generation Sequencing Techniques: Future of Sequence Analysis Using Cloud Computing
Thakur, Radhe Shyam; Bandopadhyay, Rajib; Chaudhary, Bratati; Chatterjee, Sourav
2012-01-01
Advances in the field of sequencing techniques have resulted in the greatly accelerated production of huge sequence datasets. This presents immediate challenges in database maintenance at datacenters. It provides additional computational challenges in data mining and sequence analysis. Together these represent a significant overburden on traditional stand-alone computer resources, and to reach effective conclusions quickly and efficiently, the virtualization of the resources and computation on a pay-as-you-go concept (together termed “cloud computing”) has recently appeared. The collective resources of the datacenter, including both hardware and software, can be available publicly, being then termed a public cloud, the resources being provided in a virtual mode to the clients who pay according to the resources they employ. Examples of public companies providing these resources include Amazon, Google, and Joyent. The computational workload is shifted to the provider, which also implements required hardware and software upgrades over time. A virtual environment is created in the cloud corresponding to the computational and data storage needs of the user via the internet. The task is then performed, the results transmitted to the user, and the environment finally deleted after all tasks are completed. In this discussion, we focus on the basics of cloud computing, and go on to analyze the prerequisites and overall working of clouds. Finally, the applications of cloud computing in biological systems, particularly in comparative genomics, genome informatics, and SNP detection are discussed with reference to traditional workflows. PMID:23248640
Now and next-generation sequencing techniques: future of sequence analysis using cloud computing.
Thakur, Radhe Shyam; Bandopadhyay, Rajib; Chaudhary, Bratati; Chatterjee, Sourav
2012-01-01
Advances in the field of sequencing techniques have resulted in the greatly accelerated production of huge sequence datasets. This presents immediate challenges in database maintenance at datacenters. It provides additional computational challenges in data mining and sequence analysis. Together these represent a significant overburden on traditional stand-alone computer resources, and to reach effective conclusions quickly and efficiently, the virtualization of the resources and computation on a pay-as-you-go concept (together termed "cloud computing") has recently appeared. The collective resources of the datacenter, including both hardware and software, can be available publicly, being then termed a public cloud, the resources being provided in a virtual mode to the clients who pay according to the resources they employ. Examples of public companies providing these resources include Amazon, Google, and Joyent. The computational workload is shifted to the provider, which also implements required hardware and software upgrades over time. A virtual environment is created in the cloud corresponding to the computational and data storage needs of the user via the internet. The task is then performed, the results transmitted to the user, and the environment finally deleted after all tasks are completed. In this discussion, we focus on the basics of cloud computing, and go on to analyze the prerequisites and overall working of clouds. Finally, the applications of cloud computing in biological systems, particularly in comparative genomics, genome informatics, and SNP detection are discussed with reference to traditional workflows.
Li, Song; Cui, Jie; Zhong, Hong; Liu, Lu
2017-05-05
Wireless Body Sensor Networks (WBSNs) are gaining importance in the era of the Internet of Things (IoT). The modern medical system is a particular area where the WBSN techniques are being increasingly adopted for various fundamental operations. Despite such increasing deployments of WBSNs, issues such as the limited size, capabilities, and data processing capacities of the sensor devices restrain their adoption in resource-demanding applications. Though providing computing and storage supplements from cloud servers can potentially enrich the capabilities of WBSN devices, data security is one of the prevailing issues that affect the reliability of cloud-assisted services. Sensitive applications such as modern medical systems demand assurance of the privacy of the users' medical records stored in distant cloud servers. Since it is economically impossible to set up private cloud servers for every client, auditing data security managed in the remote servers has necessarily become an integral requirement of WBSNs' applications relying on public cloud servers. To this end, this paper proposes a novel certificateless public auditing scheme with integrated privacy protection. The multi-user model in our scheme supports groups of users in storing and sharing data, thus exhibiting the potential for WBSNs' deployments within community environments. Furthermore, our scheme enriches user experiences by offering public verifiability, forward security mechanisms and revocation of illegal group members. Experimental evaluations demonstrate the security effectiveness of our proposed scheme under the Random Oracle Model (ROM) by outperforming existing cloud-assisted WBSN models.
Internet Hospitals in China: Cross-Sectional Survey.
Xie, Xiaoxu; Zhou, Weimin; Lin, Lingyan; Fan, Si; Lin, Fen; Wang, Long; Guo, Tongjun; Ma, Chuyang; Zhang, Jingkun; He, Yuan; Chen, Yixin
2017-07-04
The Internet hospital, an innovative approach to providing health care, is rapidly developing in China because it has the potential to provide widely accessible outpatient service delivery via Internet technologies. To date, China's Internet hospitals have not been systematically investigated. The aim of this study was to describe the characteristics of China's Internet hospitals, and to assess their health service capacity. We searched Baidu, the popular Chinese search engine, to identify Internet hospitals, using search terms such as "Internet hospital," "web hospital," or "cloud hospital." All Internet hospitals in mainland China were eligible for inclusion if they were officially registered. Our search was carried out until March 31, 2017. We identified 68 Internet hospitals, of which 43 had been put into use and 25 were under construction. Of the 43 established Internet hospitals, 13 (30%) were in the hospital informatization stage, 24 (56%) were in the Web ward stage, and 6 (14%) were in the full Internet hospital stage. Patients accessed outpatient service delivery via website (74%, 32/43), app (42%, 18/43), or offline medical consultation facility (37%, 16/43) from the Internet hospital. Furthermore, 25 (58%) of the Internet hospitals asked doctors to deliver health services at a specific Web clinic, whereas 18 (42%) did not. The consulting methods included video chat (60%, 26/43), telephone (19%, 8/43), and graphic message (28%, 12/43); 13 (30%) of the Internet hospitals could no longer be consulted online. Only 6 Internet hospitals were included in health insurance coverage. The median number of doctors available online was zero (interquartile range [IQR] 0 to 5; max 16,492). The median consultation fee per consultation was ¥20 (approximately US $2.90, IQR ¥0 to ¥200). Internet hospitals provide convenient outpatient service delivery. However, many of the Internet hospitals are not yet mature and are faced with various issues such as online doctor scarcity and the unavailability of health insurance coverage. China's Internet hospitals are heading in the right direction to improve provision of health services, but much more remains to be done. ©Xiaoxu Xie, Weimin Zhou, Lingyan Lin, Si Fan, Fen Lin, Long Wang, Tongjun Guo, Chuyang Ma, Jingkun Zhang, Yuan He, Yixin Chen. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 04.07.2017.
Murray, P.; Denton, I.; Wilkinson, D.
1957-10-01
The production of thoria ware of very low porosity by the slip casting of pure thoria is described. It comprises dry milling calcined thoria to obtain particles ranging up to 11 microns in size and having 60% of particles less than 2 microns, forming an aqueous slip of the milled thoria, casting the slip, and firing the dry cast at a sintering temperature of from 1600 to 1825°C. The preferred composition of the slip is 1600 grams of thoria in each liter of slip. The preferred pH of the slip is 1. When thoria of 99.9% purity is used the slip is suitable for casting for as long as six weeks after preparation.
Network science in Egyptology.
Coulombe, Patrick; Qualls, Clifford; Kruszynski, Robert; Nerlich, Andreas; Bianucci, Raffaella; Harris, Richard; Mermier, Christine; Appenzeller, Otto
2012-01-01
Egyptology relies on traditional descriptive methods. Here we show that modern, Internet-based science and statistical methods can be applied to Egyptology. Two four-thousand-year-old sarcophagi in one tomb, one within the other, with skeletal remains of a woman, gave us the opportunity to diagnose a congenital nervous system disorder in the absence of a living nervous system. The sarcophagi were discovered near Thebes, Egypt. They were well preserved and meticulously restored. The skeletal remains suggested that the woman, aged between 50 and 60 years, was Black, possibly of Nubian descent, and suffered from syringobulbia, a congenital cyst in the brain stem and upper spinal cord. We employed crowdsourcing, the anonymous responses of 204 Facebook users who performed a task matching living persons' iris color with the iris color of the Udjat eyes, a decoration found on Egyptian sarcophagi, to confirm the ethnicities of the sarcophagus occupants. We used modern fMRI techniques to illustrate the putative extent of her lesion in the brain stem and upper spinal cord deduced from her skeletal remains. We compared, statistically, the right/left ratios, a non-dimensional number, of the orbit height, orbit width, malar height and the infraorbital foramina with the same measures obtained from 32 ancient skulls excavated from the Fayum, North of Thebes. We found that these ratios were significantly different in this skull, indicating atrophy of cranial bones on the left. In this instance, Internet science and the use of modern neurologic research tools showed that ancient sarcophagus makers shaped and decorated their wares to fit the ethnicity of the prospective occupants of the sarcophagi. We also showed that, occasionally, human nervous system disease may be recognizable in the absence of a living nervous system.
Implementing eco friendly highly reliable upload feature using multi 3G service
NASA Astrophysics Data System (ADS)
Tanutama, Lukas; Wijaya, Rico
2017-12-01
The current trend favors eco-friendly Internet access. In this research, eco-friendly is understood as minimum power consumption. The devices that are selected have operationally low power consumption and normally consume no power because they hibernate when idle. To achieve reliability, a router with an internal load-balancing feature provides an improvement over previous research on multi-3G services for broadband lines. Previous studies emphasized accessing and downloading information files from Web servers residing in the public cloud. The demand is not only for speed but also for high reliability of access. High reliability mitigates both the direct and indirect costs of repeated attempts to upload and download large files. Nomadic and mobile computer users need a viable solution. A solution for downloading information has previously been proposed and tested, and the solution is promising. The result is now extended to providing a reliable access line, by means of redundancy and automatic reconfiguration, for uploading and downloading large information files to a Web server in the cloud. The technique takes advantage of the internal load-balancing feature to provision a redundant line acting as a backup. A router that has the ability to provide load balancing to several WAN lines is chosen. The WAN lines are constructed using multiple 3G lines. The router supports Internet access over more than one 3G line, which increases the reliability and availability of the Internet access because the second line immediately takes over if the first line is disturbed.
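As a rough sketch of the redundancy idea described above (a backup line that takes over when the primary is disturbed), the following code tries an upload over a primary endpoint and falls back to a secondary one. The URLs and retry policy are assumptions, and in the paper the failover is handled by the router's internal load balancing rather than by application code.

```python
# Minimal sketch of upload failover between two access paths; the URLs and
# retry policy are assumptions, not the configuration used in the paper.
import urllib.request
import urllib.error

PRIMARY = "https://example.com/upload"          # placeholder: line 1
BACKUP = "https://backup.example.com/upload"    # placeholder: line 2

def upload(data: bytes, retries_per_line: int = 2) -> str:
    for url in (PRIMARY, BACKUP):
        for _ in range(retries_per_line):
            try:
                req = urllib.request.Request(url, data=data, method="POST")
                with urllib.request.urlopen(req, timeout=15) as resp:
                    if resp.status == 200:
                        return url              # upload succeeded on this line
            except (urllib.error.URLError, TimeoutError):
                continue                        # retry, then fall back
    raise RuntimeError("upload failed on both lines")
```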
2015-01-01
Background The Internet has greatly enhanced health care, helping patients stay up-to-date on medical issues and general knowledge. Many cancer patients use the Internet for cancer diagnosis and related information. Recently, cloud computing has emerged as a new way of delivering health services, but currently there is no generic and fully automated cloud-based self-management intervention for breast cancer patients, as practical guidelines are lacking. Objective We investigated the prevalence and predictors of cloud use for medical diagnosis among women with breast cancer to gain insight into meaningful usage parameters to evaluate the use of a generic, fully automated cloud-based self-intervention, by assessing how breast cancer survivors use a generic self-management model. This goal was implemented and evaluated in a new prototype called "CIMIDx", based on representative association rules that support the diagnosis of medical images (mammograms). Methods The proposed Cloud-Based System Support Intelligent Medical Image Diagnosis (CIMIDx) prototype includes two modules. The first is the design and development of the CIMIDx training and test cloud services. Deployed in the cloud, the prototype can be used for diagnosis and screening mammography by assessing the cancers detected, tumor sizes, histology, and stage of classification accuracy. To analyze the prototype's classification accuracy, we conducted an experiment with data provided by clients. Second, by monitoring cloud server requests, CIMIDx usage statistics were recorded for the cloud-based self-intervention groups. We conducted an evaluation of CIMIDx cloud service usage, in which browsing functionalities were evaluated from the end-user's perspective. Results We performed several experiments to validate the CIMIDx prototype for breast health issues. The first set of experiments evaluated the diagnostic performance of the CIMIDx framework. We collected medical information from 150 breast cancer survivors from hospitals and health centers. The CIMIDx prototype achieved high sensitivity of up to 99.29% and accuracy of up to 98%. The second set of experiments evaluated CIMIDx use for breast health issues, using t tests and Pearson chi-square tests to assess differences, and binary logistic regression to estimate the odds ratio (OR) for predictors of CIMIDx use. For the prototype usage statistics, of the same 150 breast cancer survivors we interviewed 114 (76.0%) through self-report questionnaires from CIMIDx blogs. The frequency of log-ins per person ranged from 0 to 30, and total duration per person from 0 to 1500 minutes (25 hours). The 114 participants continued logging in through all phases, resulting in an intervention adherence rate of 44.3% (95% CI 33.2-55.9). The overall performance of the prototype was rated in the good category, with reported usefulness of the prototype (P=.77), overall satisfaction with the prototype (P=.31), ease of navigation (P=.89), user friendliness (P=.31), and overall satisfaction (P=.31). Positive evaluations given by 100 participants via a Web-based questionnaire supported our hypothesis. Conclusions The present study shows that women felt favorably about the use of a generic, fully automated cloud-based self-management prototype. The study also demonstrated that the CIMIDx prototype resulted in the detection of more cancers in screening and diagnosing patients, with an increased accuracy rate. PMID:25830608
NASA Astrophysics Data System (ADS)
Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young
2016-07-01
Cloud radio access network (C-RAN) is becoming a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, optical network and processing unit cloud have been decoupled from each other, so that their resources are controlled independently. Traditional architectures cannot implement resource optimization and scheduling for high-level service guarantees, owing to the communication obstacles among these domains and the growing number of mobile Internet users. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. The MDRI can enhance the responsiveness to dynamic end-to-end user demands and globally optimize radio frequency, optical network and processing resources effectively to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on an OpenFlow-based enhanced SDN testbed. The performance of the RIP scheme under a heavy traffic load scenario is also quantitatively evaluated to demonstrate the efficiency of the proposal based on the MDRI architecture in terms of resource utilization, path blocking probability, network cost and path provisioning latency, compared with other provisioning schemes.
Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young
2016-07-28
Cloud radio access network (C-RAN) is becoming a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, optical network and processing unit cloud have been decoupled from each other, so that their resources are controlled independently. Traditional architectures cannot implement resource optimization and scheduling for high-level service guarantees, owing to the communication obstacles among these domains and the growing number of mobile Internet users. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. The MDRI can enhance the responsiveness to dynamic end-to-end user demands and globally optimize radio frequency, optical network and processing resources effectively to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on an OpenFlow-based enhanced SDN testbed. The performance of the RIP scheme under a heavy traffic load scenario is also quantitatively evaluated to demonstrate the efficiency of the proposal based on the MDRI architecture in terms of resource utilization, path blocking probability, network cost and path provisioning latency, compared with other provisioning schemes.
Accelerating statistical image reconstruction algorithms for fan-beam x-ray CT using cloud computing
NASA Astrophysics Data System (ADS)
Srivastava, Somesh; Rao, A. Ravishankar; Sheinin, Vadim
2011-03-01
Statistical image reconstruction algorithms potentially offer many advantages to x-ray computed tomography (CT), e.g. lower radiation dose. But, their adoption in practical CT scanners requires extra computation power, which is traditionally provided by incorporating additional computing hardware (e.g. CPU-clusters, GPUs, FPGAs etc.) into a scanner. An alternative solution is to access the required computation power over the internet from a cloud computing service, which is orders-of-magnitude more cost-effective. This is because users only pay a small pay-as-you-go fee for the computation resources used (i.e. CPU time, storage etc.), and completely avoid purchase, maintenance and upgrade costs. In this paper, we investigate the benefits and shortcomings of using cloud computing for statistical image reconstruction. We parallelized the most time-consuming parts of our application, the forward and back projectors, using MapReduce, the standard parallelization library on clouds. From preliminary investigations, we found that a large speedup is possible at a very low cost. But, communication overheads inside MapReduce can limit the maximum speedup, and a better MapReduce implementation might become necessary in the future. All the experiments for this paper, including development and testing, were completed on the Amazon Elastic Compute Cloud (EC2) for less than $20.
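To make the parallelization idea above concrete, the sketch below splits a forward projection over projection angles and processes the chunks with a process pool, standing in for the map stage. It is a simplified illustration using a toy pixel-driven projector, not the authors' MapReduce implementation on EC2.

```python
# Simplified stand-in for the "map" stage: the forward projection is split by
# projection angle and the chunks are computed in parallel. This is not the
# authors' Hadoop/EC2 implementation, and the projector is a toy pixel-driven
# version included only to make the data flow visible.
import numpy as np
from multiprocessing import Pool

def project_angles(args):
    image, angles = args
    n = image.shape[0]
    coords = np.indices(image.shape).reshape(2, -1) - n / 2.0
    sino = []
    for theta in angles:
        # signed distance of each pixel centre from the detector axis
        t = coords[0] * np.cos(theta) + coords[1] * np.sin(theta)
        bins = np.clip((t + n / 2).astype(int), 0, n - 1)
        sino.append(np.bincount(bins, weights=image.ravel(), minlength=n))
    return np.array(sino)

if __name__ == "__main__":
    image = np.zeros((128, 128))
    image[48:80, 48:80] = 1.0                                  # toy phantom
    angles = np.linspace(0, np.pi, 180, endpoint=False)
    chunks = [(image, a) for a in np.array_split(angles, 4)]   # 4 "map" tasks
    with Pool(4) as pool:
        sinogram = np.vstack(pool.map(project_angles, chunks))
    print(sinogram.shape)                                      # (180, 128)
```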
Data Privacy in Cloud-assisted Healthcare Systems: State of the Art and Future Challenges.
Sajid, Anam; Abbas, Haider
2016-06-01
The widespread deployment and utility of Wireless Body Area Networks (WBANs) in healthcare systems has required new technologies such as the Internet of Things (IoT) and cloud computing that are able to deal with the storage and processing limitations of WBANs. This amalgamation of WBAN-based healthcare systems into cloud-based healthcare systems gave rise to serious privacy concerns about sensitive healthcare data. Hence, there is a need for proactive identification of, and effective mitigation mechanisms for, these patient data privacy concerns, which pose continuous threats to the integrity and stability of the healthcare environment. For this purpose, a systematic literature review has been conducted that presents a clear picture of the privacy concerns of patient data in cloud-assisted healthcare systems and analyzes the mechanisms that have recently been proposed by the research community. The methodology used for conducting the review was based on the Kitchenham guidelines. Results from the review show that most of the patient data privacy techniques do not fully address the privacy concerns and therefore require more effort. The summary presented in this paper would help in setting research directions for the techniques and mechanisms that are needed to address patient data privacy concerns in a balanced and lightweight manner by considering all the aspects and limitations of cloud-assisted healthcare systems.
Investigation of cloud properties and atmospheric stability with MODIS
NASA Technical Reports Server (NTRS)
Menzel, Paul
1995-01-01
In the past six months several milestones were accomplished. The MODIS Airborne Simulator (MAS) was flown in a 50 channel configuration for the first time in January 1995 and the data were calibrated and validated; in the same field campaign the approach for validating MODIS radiances using the MAS and High resolution Interferometer Sounder (HIS) instruments was successfully tested on GOES-8. Cloud masks for two scenes (one winter and the other summer) of AVHRR local area coverage from the Gulf of Mexico to Canada were processed and forwarded to the SDST for MODIS Science Team investigation; a variety of surface and cloud scenes were evident. Beta software preparations continued with incorporation of the EOS SDP Toolkit. SCAR-C data was processed and presented at the biomass burning conference. Preparations for SCAR-B accelerated with generation of a home page for access to real time satellite data related to biomass burning; this will be available to the scientists in Brazil via internet on the World Wide Web. The CO2 cloud algorithm was compared to other algorithms that differ in their construction of clear radiance fields. The HIRS global cloud climatology was completed for six years. The MODIS science team meeting was attended by five of the UW scientists.
Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young
2016-01-01
Cloud radio access network (C-RAN) is becoming a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, optical network and processing unit cloud have been decoupled from each other, so that their resources are controlled independently. Traditional architectures cannot implement resource optimization and scheduling for high-level service guarantees, owing to the communication obstacles among these domains and the growing number of mobile Internet users. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. The MDRI can enhance the responsiveness to dynamic end-to-end user demands and globally optimize radio frequency, optical network and processing resources effectively to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on an OpenFlow-based enhanced SDN testbed. The performance of the RIP scheme under a heavy traffic load scenario is also quantitatively evaluated to demonstrate the efficiency of the proposal based on the MDRI architecture in terms of resource utilization, path blocking probability, network cost and path provisioning latency, compared with other provisioning schemes. PMID:27465296
Adsorption of Ten Microcystin Congeners to Common Laboratory-Ware Is Solvent and Surface Dependent.
Altaner, Stefan; Puddick, Jonathan; Wood, Susanna A; Dietrich, Daniel R
2017-04-06
Cyanobacteria can produce heptapeptides called microcystins (MC) which are harmful to humans due to their ability to inhibit cellular protein phosphatases. Quantitation of these toxins can be hampered by their adsorption to common laboratory-ware during sample processing and analysis. Because of their structural diversity (>100 congeners) and different physico-chemical properties, they vary in their adsorption to surfaces. In this study, the adsorption of ten different MC congeners (encompassing non-arginated to doubly-arginated congeners) to common laboratory-ware was assessed using different solvent combinations. Sample handling steps were mimicked with glass and polypropylene pipettes and vials with increasing methanol concentrations at two pH levels, before analysis by liquid chromatography-tandem mass spectrometry. We demonstrated that MC adsorb to polypropylene surfaces irrespective of pH. After eight successive pipette actions using polypropylene tips, ca. 20% of the MC were lost to the surface material, which increased to 25%-40% when solutions were acidified. The observed loss was alleviated by changing the methanol (MeOH) concentration in the final solvent. The required MeOH concentration varied depending on which congener was present. Microcystins only adsorbed to glass pipettes (loss up to 30% after eight pipette actions) when in acidified aqueous solutions. The latter appeared largely dependent on the presence of ionizable groups, such as arginine residues.
The use of high technology in STEM education
NASA Astrophysics Data System (ADS)
Lakshminarayanan, Vasudevan; McBride, Annette C.
2015-10-01
There has been a huge increase in the use of high technology in education. In this paper we discuss some aspects of technology that have major applications in STEM education, namely, (a) virtual reality systems, (b) personal electronic response systems aka "clickers", (c) flipped classrooms, (d) mobile learning "m-Learning", (e) massive open online courses "MOOCs", (f) the Internet of Things and (g) cloud computing.
ERIC Educational Resources Information Center
Kumar, Vikas; Sharma, Deepika
2016-01-01
Students in the digital era are habitual of using digital devices not only for playing and interacting with their friends and peers, but also as a tool for education and learning. These digital natives are highly obsessed with the internet driven portable devices and always demand for a multimedia rich content. This specific demand needs to be…
Welton, Michael; Rodriguez-Lainz, Alfonso; Loza, Oralia; Brodine, Stephanie; Fraga, Miguel
2018-03-01
Lead exposure from lead-glazed ceramics (LGCs) and traditional folk remedies have been identified as significant sources of elevated blood lead levels in Mexico and the United States. This study took place from 2005 to 2012 in a rural community in Baja California, Mexico. The objectives were to 1) investigate the knowledge, attitudes, and practices related to lead and lead exposure from LGCs and two lead-based folk remedies (azarcon and greta), and 2) evaluate a pilot intervention to provide alternative lead-safe cookware. A baseline household survey was conducted in 2005, followed by the pilot intervention in 2006, and follow-up surveys in 2007 and 2012. For the pilot intervention, families who reported using LGCs were given lead-safe alternative cookware to try, and its acceptance was evaluated in the following year. The community was mostly of indigenous background from Oaxaca and a high proportion of households had young children. In 2006, all participants using traditional ceramic ware at the time (n = 48) accepted lead-safe alternative cookware to try, and 97% reported that they were willing to exchange traditional ceramic ware for lead-safe alternatives. The use of ceramic cookware decreased from over 90% during respondents' childhood household use in Oaxaca to 47% in 2006 among households in Baja California, and fell further to 16.8% in 2012. While empacho, a folk illness, was widely recognized as an intestinal disorder, there was almost universal unfamiliarity with the use and knowledge of azarcon and greta for its treatment. This pilot evaluation provides evidence 1) for an effective and innovative strategy to reduce lead exposure from LGCs and 2) of the feasibility of substituting lead-free alternative cookware for traditional ceramic ware in a rural indigenous community, when delivered in a culturally appropriate manner with health education. This strategy could complement other approaches to reduce exposure to lead from LGCs.
Secure Cloud-Based Solutions for Different eHealth Services in Spanish Rural Health Centers.
de la Torre-Díez, Isabel; Lopez-Coronado, Miguel; Garcia-Zapirain Soto, Begonya; Mendez-Zorrilla, Amaia
2015-07-27
The combination of eHealth applications and/or services with cloud technology provides health care staff with sufficient mobility and accessibility to transparently check any data they may need without having to worry about its physical location. The main aim of this paper is to put forward secure cloud-based solutions for a range of eHealth services such as electronic health records (EHRs), telecardiology, teleconsultation, and telediagnosis. The scenario chosen for introducing the services is a set of four rural health centers located within the same Spanish region. iCanCloud software was used to perform simulations in the proposed scenario. We chose online traffic and the cost per unit of time as the parameters for choosing the secure solution on the most suitable cloud for each service. We suggest that load balancers always be fitted for all solutions in communication, together with several Internet service providers, and that smartcards be used to maintain identity to an appropriate extent. The solutions offered via private cloud for EHRs, teleconsultation, and telediagnosis services require a volume of online traffic calculated to be capable of reaching 2 Gbps per consultation. This may entail an average cost of €500/month. The security solutions put forward for each eHealth service constitute an attempt to centralize all information on the cloud, thus offering greater accessibility to medical information in the case of EHRs, alongside more reliable diagnoses and treatment for telecardiology, telediagnosis, and teleconsultation services. Therefore, better health care for the rural patient can be obtained at a reasonable cost.
NASA Astrophysics Data System (ADS)
Li, Cunbin; Wang, Yi; Lin, Shuaishuai
2017-09-01
With the rapid development of the energy Internet and the deepening of electric power reform, the traditional marketing mode of electric power no longer suits most electric power enterprises, which must therefore seek a breakthrough. However, in the face of increasingly complex marketing information, how to transform quickly and reasonably, and how to make the assessment of electric power marketing competitiveness more accurate and objective, becomes a major problem. In this paper, a method combining the cloud model and TOPSIS is proposed. First, an evaluation index system for electric power marketing competitiveness is built. Then, the cloud model is used to transform qualitative evaluations of the marketing data into quantitative values, and the entropy weight method is used to weaken the subjective factors in the index weights. Finally, the closeness degrees of the alternatives are obtained with the TOPSIS method. This method provides a novel solution for evaluating electric power marketing competitiveness. Through a case analysis, the effectiveness and feasibility of the model are verified.
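For readers unfamiliar with the two quantitative steps named above, the sketch below shows a generic entropy-weight calculation and the TOPSIS closeness degree for a small decision matrix. The index values are made up, and this is a textbook formulation rather than the authors' exact model.

```python
# Generic entropy-weight + TOPSIS closeness computation (textbook form);
# the decision matrix values are made up for illustration.
import numpy as np

def entropy_weights(X):
    P = X / X.sum(axis=0)                       # column-normalized proportions
    k = 1.0 / np.log(X.shape[0])
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)
    e = -k * (P * logs).sum(axis=0)             # entropy per criterion
    d = 1 - e                                   # degree of divergence
    return d / d.sum()

def topsis_closeness(X, weights):
    R = X / np.sqrt((X ** 2).sum(axis=0))       # vector-normalized matrix
    V = R * weights                             # weighted normalized matrix
    ideal, anti = V.max(axis=0), V.min(axis=0)  # assumes all benefit criteria
    d_plus = np.sqrt(((V - ideal) ** 2).sum(axis=1))
    d_minus = np.sqrt(((V - anti) ** 2).sum(axis=1))
    return d_minus / (d_plus + d_minus)         # closeness to the ideal

X = np.array([[0.7, 0.6, 0.8],                  # alternative 1
              [0.5, 0.9, 0.6],                  # alternative 2
              [0.8, 0.5, 0.7]])                 # alternative 3
w = entropy_weights(X)
print(topsis_closeness(X, w))                   # rank alternatives by closeness
```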
A service based adaptive U-learning system using UX.
Jeong, Hwa-Young; Yi, Gangman
2014-01-01
In recent years, traditional development techniques for e-learning systems have been changing to become more convenient and efficient. One new technology in the development of application systems includes both cloud and ubiquitous computing. Cloud computing can support learning system processes by using services while ubiquitous computing can provide system operation and management via a high performance technical process and network. In the cloud computing environment, a learning service application can provide a business module or process to the user via the internet. This research focuses on providing the learning material and processes of courses by learning units using the services in a ubiquitous computing environment. We also investigate functions that tailor materials to users according to their learning style. That is, we analyzed users' data and characteristics in accordance with their user experience. We subsequently adapted the learning process to fit their learning performance and preferences. Finally, we demonstrate that the proposed system delivers better learning effects to learners than existing techniques.
A Service Based Adaptive U-Learning System Using UX
Jeong, Hwa-Young
2014-01-01
In recent years, traditional development techniques for e-learning systems have been changing to become more convenient and efficient. One new technology in the development of application systems includes both cloud and ubiquitous computing. Cloud computing can support learning system processes by using services while ubiquitous computing can provide system operation and management via a high performance technical process and network. In the cloud computing environment, a learning service application can provide a business module or process to the user via the internet. This research focuses on providing the learning material and processes of courses by learning units using the services in a ubiquitous computing environment. We also investigate functions that tailor materials to users according to their learning style. That is, we analyzed users' data and characteristics in accordance with their user experience. We subsequently adapted the learning process to fit their learning performance and preferences. Finally, we demonstrate that the proposed system delivers better learning effects to learners than existing techniques. PMID:25147832
A Cloud-Based Global Flood Disaster Community Cyber-Infrastructure: Development and Demonstration
NASA Technical Reports Server (NTRS)
Wan, Zhanming; Hong, Yang; Khan, Sadiq; Gourley, Jonathan; Flamig, Zachary; Kirschbaum, Dalia; Tang, Guoqiang
2014-01-01
Flood disasters have significant impacts on the development of communities globally. This study describes a public cloud-based flood cyber-infrastructure (CyberFlood) that collects, organizes, visualizes, and manages several global flood databases for authorities and the public in real time, providing location-based eventful visualization as well as statistical analysis and graphing capabilities. In order to expand and update the existing flood inventory, a crowdsourcing data collection methodology is employed so that members of the public with smartphones or Internet access can report new flood events; it is also intended to engage citizen-scientists so that they may become motivated and educated about the latest developments in satellite remote sensing and hydrologic modeling technologies. Our shared vision is to better serve the global water community with comprehensive flood information, aided by state-of-the-art cloud computing and crowdsourcing technology. The CyberFlood presents an opportunity to eventually modernize the existing paradigm used to collect, manage, analyze, and visualize water-related disasters.
A Cloud-Based Car Parking Middleware for IoT-Based Smart Cities: Design and Implementation
Ji, Zhanlin; Ganchev, Ivan; O'Droma, Máirtín; Zhao, Li; Zhang, Xueji
2014-01-01
This paper presents the generic concept of using cloud-based intelligent car parking services in smart cities as an important application of the Internet of Things (IoT) paradigm. This type of services will become an integral part of a generic IoT operational platform for smart cities due to its pure business-oriented features. A high-level view of the proposed middleware is outlined and the corresponding operational platform is illustrated. To demonstrate the provision of car parking services, based on the proposed middleware, a cloud-based intelligent car parking system for use within a university campus is described along with details of its design, implementation, and operation. A number of software solutions, including Kafka/Storm/Hbase clusters, OSGi web applications with distributed NoSQL, a rule engine, and mobile applications, are proposed to provide ‘best’ car parking service experience to mobile users, following the Always Best Connected and best Served (ABC&S) paradigm. PMID:25429416
A cloud-based car parking middleware for IoT-based smart cities: design and implementation.
Ji, Zhanlin; Ganchev, Ivan; O'Droma, Máirtín; Zhao, Li; Zhang, Xueji
2014-11-25
This paper presents the generic concept of using cloud-based intelligent car parking services in smart cities as an important application of the Internet of Things (IoT) paradigm. This type of services will become an integral part of a generic IoT operational platform for smart cities due to its pure business-oriented features. A high-level view of the proposed middleware is outlined and the corresponding operational platform is illustrated. To demonstrate the provision of car parking services, based on the proposed middleware, a cloud-based intelligent car parking system for use within a university campus is described along with details of its design, implementation, and operation. A number of software solutions, including Kafka/Storm/Hbase clusters, OSGi web applications with distributed NoSQL, a rule engine, and mobile applications, are proposed to provide 'best' car parking service experience to mobile users, following the Always Best Connected and best Served (ABC&S) paradigm.
Umakanthan, Ramanan; Haglund, Nicholas A; Stulak, John M; Joyce, Lyle D; Ahmad, Rashid; Keebler, Mary E; Maltais, Simon
2013-01-01
Advances in mechanical circulatory support have been critical in bridging patients awaiting heart transplantation. In addition, improvements in device durability have enabled left ventricular assist device therapy to be applied as destination therapy in those not felt to be transplant candidates. Because of the increasing complexity of patients, there continues to be a need for alternative strategies for device implantation to bridge high-risk patients awaiting heart transplantation, wherein the risks of numerous previous sternotomies may be prohibitive. We present a unique technique for placement of the HeartWare ventricular assist device via left anterior thoracotomy to the descending aorta in a patient awaiting heart transplantation with a history of multiple previous sternotomies.
Roman mosaic glass: a study of production processes, using PIXE spectrometry
NASA Astrophysics Data System (ADS)
Fleming, S. J.; Swann, C. P.
1999-04-01
The most attractive Roman glass produced during the early part of the 1st century A.D. was mosaic ware - bowls and dishes molded from arrays of multi-colored canes that created abstract floral and geometric designs. Yet ancient literature tells us little about the organization of the glassworking industry in which such wares were produced. We have focused upon two kinds of mosaic decoration that include a component of white glass in their cane construction and have purple glass as their matrix. A consistent pattern in the minor levels of lead in each kind of glass suggests that they were the products of two separate workshops, each with separate sources of supply for their glass stock.
Rajan, J Pandia; Rajan, S Edward
2018-01-01
Designing a wireless physiological signal monitoring system with secure data communication for the health care system is an important and dynamic process. We propose a signal monitoring system using NI myRIO connected to the wireless body sensor network through a multi-channel signal acquisition method. After server-side validation of the signal, the data held on the local server are updated to the cloud. The Internet of Things (IoT) architecture is used to give healthcare service providers mobility and fast access to patient data. This research work proposes a novel architecture for a wireless physiological signal monitoring system offering ubiquitous healthcare services through a virtual Internet of Things. We show an improvement in the method of access and in real-time dynamic monitoring of physiological signals in this remote monitoring system using the virtual Internet of Things approach. The remote monitoring and access system is evaluated in terms of conventional measures. The proposed system is envisioned for the modern smart health care system, offering high utility and user friendliness in clinical applications. We claim that the proposed scheme significantly improves the accuracy of the remote monitoring system compared to other wireless communication methods in a clinical system.
Rosnell, Tomi; Honkavaara, Eija
2012-01-01
The objective of this investigation was to develop and investigate methods for point cloud generation by image matching using aerial image data collected by quadrocopter type micro unmanned aerial vehicle (UAV) imaging systems. Automatic generation of high-quality, dense point clouds from digital images by image matching is a recent, cutting-edge step forward in digital photogrammetric technology. The major components of the system for point cloud generation are a UAV imaging system, an image data collection process using high image overlaps, and post-processing with image orientation and point cloud generation. Two post-processing approaches were developed: one of the methods is based on BAE Systems' SOCET SET classical commercial photogrammetric software and another is built using Microsoft®'s Photosynth™ service available on the Internet. Empirical testing was carried out in two test areas. Photosynth processing showed that it is possible to orient the images and generate point clouds fully automatically without any a priori orientation information or interactive work. The photogrammetric processing line provided dense and accurate point clouds that followed the theoretical principles of photogrammetry, but some artifacts were also detected. The point clouds from the Photosynth processing were sparser and noisier, largely because the method is not optimized for dense point cloud generation. Careful photogrammetric processing with self-calibration is required to achieve the highest accuracy. Our results demonstrate the high performance potential of the approach and show that with rigorous processing it is possible to reach results that are consistent with theory. We also point out several further research topics. Based on theoretical and empirical results, we give recommendations for properties of the imaging sensor, data collection, and processing of UAV image data to ensure accurate point cloud generation. PMID:22368479
NASA Astrophysics Data System (ADS)
Sproles, E. A.; Crumley, R. L.; Nolin, A. W.; Mar, E.; Lopez-Moreno, J. J.
2017-12-01
Streamflow in snowy mountain regions is extraordinarily challenging to forecast, and prediction efforts are hampered by the lack of timely snow data, particularly in data-sparse regions. SnowCloud is a prototype web-based framework that integrates remote sensing, cloud computing, interactive mapping tools, and a hydrologic model to offer a new paradigm for delivering key data to water resource managers. We tested the skill of SnowCloud to forecast monthly streamflow with one month lead time in three snow-dominated headwaters. These watersheds represent a range of precipitation/runoff schemes: the Río Elqui in northern Chile (200 mm/yr, entirely snowmelt); the John Day River, Oregon, USA (635 mm/yr, primarily snowmelt); and the Río Aragon in northern Spain (850 mm/yr, snowmelt dominated). Model skill corresponded to snowpack contribution, with Nash-Sutcliffe Efficiencies of 0.86, 0.52, and 0.21, respectively. SnowCloud does not require the user to possess advanced programming skills or proprietary software. We access NASA's MOD10A1 snow cover product to calculate the snow metrics globally using Google Earth Engine's geospatial analysis and cloud computing service. The analytics and forecast tools are provided through a web-based portal that requires only internet access and minimal training. To test the efficacy of SnowCloud we provided the tools and a series of tutorials in English and Spanish to water resource managers in Chile, Spain, and the United States. Participants assessed their user experience and provided feedback, and the results of our multi-cultural assessment are also presented. While our results focus on SnowCloud, they outline methods to develop cloud-based tools that function effectively across cultures and languages. Our approach also addresses the primary challenges of science-based computing: human resource limitations, infrastructure costs, and expensive proprietary software. These challenges are particularly problematic in developing countries.
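A minimal sketch of the kind of server-side snow metric SnowCloud computes, using the Earth Engine Python API on the MOD10A1 product; the dataset and band identifiers, date range, and watershed geometry are assumptions for illustration and an authenticated Earth Engine account is required.

```python
# Minimal sketch: monthly mean NDSI snow cover over a stand-in basin geometry.
import ee

ee.Initialize()  # assumes prior Earth Engine authentication

basin = ee.Geometry.Rectangle([-120.5, 44.0, -118.5, 45.5])   # placeholder watershed
snow = (ee.ImageCollection("MODIS/006/MOD10A1")               # assumed dataset ID
        .filterDate("2017-01-01", "2017-02-01")
        .select("NDSI_Snow_Cover")
        .mean())

stats = snow.reduceRegion(reducer=ee.Reducer.mean(),
                          geometry=basin, scale=500, maxPixels=1e9)
print(stats.getInfo())   # basin-mean snow cover for the month
```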
NASA Technical Reports Server (NTRS)
Velden, Christopher S.
1994-01-01
The thrust of the proposed effort under this contract is aimed at improving techniques to track water vapor data in sequences of imagery from geostationary satellites. In regards to this task, significant testing, evaluation, and progress was accomplished during this period. Sets of winds derived from Meteosat data were routinely produced during Atlantic hurricane events in the 1993 season. These wind sets were delivered via Internet in real time to the Hurricane Research Division in Miami for their evaluation in a track forecast model. For eighteen cases in which 72-hour forecasts were produced, thirteen resulted in track forecast improvements (some quite significant). In addition, quality-controlled Meteosat water vapor winds produced by NESDIS were validated against rawinsondes, yielding an 8 m/s RMS. This figure is comparable to upper-level cloud drift wind accuracies. Given the complementary horizontal coverage in cloud-free areas, we believe that water vapor vectors can supplement cloud-drift wind information to provide good full-disk coverage of the upper tropospheric flow. The impact of these winds on numerical analysis and forecasts will be tested in the next reporting period.
Centralized Duplicate Removal Video Storage System with Privacy Preservation in IoT.
Yan, Hongyang; Li, Xuan; Wang, Yu; Jia, Chunfu
2018-06-04
In recent years, the Internet of Things (IoT) has found wide application and attracted much attention. Since most of the end-terminals in IoT have limited capabilities for storage and computing, it has become a trend to outsource data from local devices to cloud computing. To further reduce communication bandwidth and storage space, data deduplication has been widely adopted to eliminate redundant data. However, since data collected in IoT are sensitive and closely related to users' personal information, protecting the privacy of users' information becomes a challenge. As the channels, like the wireless channels between the terminals and the cloud servers in IoT, are public and the cloud servers are not fully trusted, data have to be encrypted before being uploaded to the cloud. However, encryption makes deduplication by the cloud server difficult because the ciphertexts will differ even if the underlying plaintext is identical. In this paper, we build a centralized privacy-preserving duplicate removal storage system, which supports both file-level and block-level deduplication. In order to avoid the leakage of statistical information about the data, Intel Software Guard Extensions (SGX) technology is utilized to protect the deduplication process on the cloud server. The results of the experimental analysis demonstrate that the new scheme can significantly improve deduplication efficiency and enhance security. It is envisioned that this duplicate removal system with privacy preservation will be of great use in the centralized storage environment of IoT.
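The core duplicate check can be illustrated with a simple content-fingerprint sketch; real deployments, including the SGX-assisted scheme above, operate on encrypted data and protect the fingerprint index, which this toy example omits.

```python
# Minimal sketch of file-level deduplication by content fingerprint.
import hashlib
from pathlib import Path

known_fingerprints = set()          # in practice, an index held by the storage server

def upload_if_new(path: str) -> bool:
    digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
    if digest in known_fingerprints:
        return False                # duplicate: only a reference needs to be stored
    known_fingerprints.add(digest)
    # ... transfer the (encrypted) file to the cloud here ...
    return True
```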
NASA Technical Reports Server (NTRS)
Green, Carolyn J.; Chambers, Lin H.
1998-01-01
The Students' Cloud Observations On-Line (S'COOL) project was piloted in 1997. It was created with the idea of using students to serve as one component of the validation for the Clouds and the Earth's Radiant Energy System (CERES) instrument, which was launched with the Tropical Rainfall Measuring Mission (TRMM) in November 1997. As part of NASA's Earth Science Enterprise, CERES is interested in the role clouds play in regulating our climate. Over thirty schools became involved in the initial thrust of the project. The CERES instrument detects the location of clouds and identifies their physical properties. S'COOL students coordinate their ground-truth observations with the exact overpass of the satellite at their location. Their findings regarding cloud type, height, fraction, and opacity, as well as surface conditions, are then reported to the NASA Langley Distributed Active Archive Center (DAAC). The data are then accessible via the Internet both to the CERES team for validation and to schools for educational application. By March of 1998, ninety-three schools in nine countries had enrolled in the S'COOL project. Joining the United States were participants from schools in Australia, Canada, France, Germany, Norway, Spain, Sweden, and Switzerland. The project is gradually becoming the global project envisioned by its creators. As students obtain the data requested by the scientists, it is hoped that, with guidance from their instructors, they will have the opportunity and motivation to learn more about clouds and atmospheric science as well.
Cloud Optimized Image Format and Compression
NASA Astrophysics Data System (ADS)
Becker, P.; Plesea, L.; Maurer, T.
2015-04-01
Cloud based image storage and processing requires re-evaluation of formats and processing methods. For the true value of the massive volumes of earth observation data to be realized, the image data needs to be accessible from the cloud. Traditional file formats such as TIF and NITF were developed in the heyday of the desktop and assumed fast, low-latency file access. Other formats such as JPEG2000 provide streaming protocols for pixel data, but still require a server to have file access. These concepts no longer truly hold in cloud based elastic storage and computation environments. This paper will provide details of a newly evolving image storage format (MRF) and compression that is optimized for cloud environments. Although the cost of storage continues to fall for large data volumes, there is still significant value in compression. For imagery data to be used in analysis and to exploit the extended dynamic range of the new sensors, lossless or controlled lossy compression is of high value. Compression decreases the data volumes stored and reduces the data transferred, but the reduced data size must be balanced with the CPU required to decompress. The paper also outlines a new compression algorithm (LERC) for imagery and elevation data that optimizes this balance. Advantages of the compression include its simple-to-implement algorithm, which enables it to be efficiently accessed using JavaScript. Combining this new cloud based image storage format and compression will help resolve some of the challenges of big image data on the internet.
Marine Corps Private Cloud Computing Environment Strategy
2012-05-15
leveraging economies of scale through the MCEITS PCCE, the Marine Corps will measure consumed IT resources more effectively, increase or decrease... flexible broad network access, resource pooling, elastic provisioning and measured services. By leveraging economies of scale the Marine Corps will be able... [Diagram residue omitted: notional SaaS/IaaS layers within the MCEN, separated from the GIG/Internet and DISN by a security boundary.]
A secure online image trading system for untrusted cloud environments.
Munadi, Khairul; Arnia, Fitri; Syaryadhi, Mohd; Fujiyoshi, Masaaki; Kiya, Hitoshi
2015-01-01
In conventional image trading systems, images are usually stored unprotected on a server, rendering them vulnerable to untrusted server providers and malicious intruders. This paper proposes a conceptual image trading framework that enables secure storage and retrieval over Internet services. The process involves three parties: an image publisher, a server provider, and an image buyer. The aim is to facilitate secure storage and retrieval of original images for commercial transactions, while preventing untrusted server providers and unauthorized users from gaining access to the true contents. The framework exploits the Discrete Cosine Transform (DCT) coefficients and the moment invariants of images. Original images are visually protected in the DCT domain, and stored on a repository server. Small representations of the original images, called thumbnails, are generated and made publicly accessible for browsing. When a buyer is interested in a thumbnail, he/she sends a query to retrieve the visually protected image. The thumbnails and protected images are matched using the DC component of the DCT coefficients and the moment invariant feature. After the matching process, the server returns the corresponding protected image to the buyer. However, the image remains visually protected unless a key is granted. Our target application is the online market, where publishers sell their stock images over the Internet using public cloud servers.
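A small sketch of the DC-coefficient matching idea, comparing the DC terms of the 2-D DCTs of a thumbnail and a candidate image; the array and function names are illustrative, and the full framework additionally uses moment invariants, which this sketch omits.

```python
# Minimal sketch: compare the DCT DC components of two grayscale images.
import numpy as np
from scipy.fft import dctn

def dc_component(img: np.ndarray) -> float:
    # The [0, 0] coefficient of the 2-D DCT is proportional to the mean intensity.
    return float(dctn(img.astype(float), norm="ortho")[0, 0])

thumbnail = np.random.rand(64, 64)          # stand-ins for real image data
candidate = np.random.rand(64, 64)
match_score = abs(dc_component(thumbnail) - dc_component(candidate))
print(match_score)                          # smaller score = closer match
```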
Unidata Cyberinfrastructure in the Cloud
NASA Astrophysics Data System (ADS)
Ramamurthy, M. K.; Young, J. W.
2016-12-01
Data services, software, and user support are critical components of geosciences cyber-infrastructure that help researchers advance science. With its maturity and significant advances, cloud computing has recently emerged as an alternative new paradigm for developing and delivering a broad array of services over the Internet. Cloud computing is now mature enough in usability in many areas of science and education, bringing the benefits of virtualized and elastic remote services to infrastructure, software, computation, and data. Cloud environments reduce the amount of time and money spent to procure, install, and maintain new hardware and software, and reduce costs through resource pooling and shared infrastructure. Given the enormous potential of cloud-based services, Unidata has been moving to augment its software, services, and data delivery mechanisms to align with the cloud-computing paradigm. To realize the above vision, Unidata has worked toward: * Providing access to many types of data from a cloud (e.g., via the THREDDS Data Server, RAMADDA and EDEX servers); * Deploying data-proximate tools to easily process, analyze, and visualize those data in a cloud environment for consumption by anyone, on any device, from anywhere, at any time; * Developing and providing a range of pre-configured and well-integrated tools and services that can be deployed by any university in their own private or public cloud settings. Specifically, Unidata has developed Docker images for "containerized applications", making them easy to deploy. Docker helps to create "disposable" installs and eliminates many configuration challenges. Containerized applications include tools for data transport, access, analysis, and visualization: THREDDS Data Server, Integrated Data Viewer, GEMPAK, Local Data Manager, RAMADDA Data Server, and Python tools; * Leveraging Jupyter as a central platform and hub with its powerful set of interlinking tools to interactively connect data servers, Python scientific libraries, scripts, and workflows; * Exploring end-to-end modeling and prediction capabilities in the cloud; * Partnering with NOAA and public cloud vendors (e.g., Amazon and OCC) on the NOAA Big Data Project to harness their capabilities and resources for the benefit of the academic community.
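As one concrete example of consuming such services from Python, the sketch below lists datasets from a THREDDS Data Server catalog with Unidata's Siphon library; the catalog URL is an assumption, and any reachable TDS catalog endpoint would serve the same purpose.

```python
# Minimal sketch: browse a THREDDS catalog from Python with Siphon.
from siphon.catalog import TDSCatalog

cat = TDSCatalog("https://thredds.ucar.edu/thredds/catalog/"   # assumed endpoint
                 "grib/NCEP/GFS/Global_0p25deg/catalog.xml")
print(list(cat.datasets)[:5])   # names of the first few datasets in the catalog
```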
NASA Astrophysics Data System (ADS)
Blodgett, D. L.; Booth, N.; Walker, J.; Kunicki, T.
2012-12-01
The U.S. Geological Survey Center for Integrated Data Analytics (CIDA), in keeping with the President's Digital Government Strategy and the Department of Interior's IT Transformation initiative, has evolved its data center and application architecture toward the "cloud" paradigm. In this case, "cloud" refers to a goal of developing services that may be distributed to infrastructure anywhere on the Internet. This transition has taken place across the entire data management spectrum, from data center location to physical hardware configuration to software design and implementation. In CIDA's case, physical hardware resides in Madison at the Wisconsin Water Science Center, in South Dakota at the Earth Resources Observation and Science Center (EROS), and in the near future at a DOI-approved commercial vendor. Tasks normally conducted on desktop-based GIS software with local copies of data in proprietary formats are now done using browser-based interfaces to web processing services drawing on a network of standard data-source web services. Organizations are gaining economies of scale through data center consolidation and the creation of private cloud services as well as taking advantage of the commoditization of data processing services. Leveraging open standards for data and data management takes advantage of this commoditization and provides the means to reliably build distributed, service-based systems. This presentation will use CIDA's experience as an illustration of the benefits and hurdles of moving to the cloud. Replicating, reformatting, and processing large data sets, such as downscaled climate projections, traditionally present a substantial challenge to environmental science researchers who need access to data subsets and derived products. The USGS Geo Data Portal (GDP) project uses cloud concepts to help earth system scientists access subsets, spatial summaries, and derivatives of commonly needed, very large datasets. The GDP project has developed a reusable architecture and advanced processing services that currently access archives hosted at Lawrence Livermore National Lab, Oregon State University, the University Corporation for Atmospheric Research, and the U.S. Geological Survey, among others. Several examples of how the GDP project uses cloud concepts will be highlighted in this presentation: 1) The high bandwidth network connectivity of large data centers reduces the need for data replication and storage local to processing services. 2) Standard data serving web services, like OPeNDAP, Web Coverage Services, and Web Feature Services, allow GDP services to remotely access custom subsets of data in a variety of formats, further reducing the need for data replication and reformatting. 3) The GDP services use standard web service APIs to allow browser-based user interfaces to run complex and compute-intensive processes for users from any computer with an Internet connection. The combination of physical infrastructure and application architecture implemented for the Geo Data Portal project offers an operational example of how distributed data and processing on the cloud can be used to aid earth system science.
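The data-proximate subsetting pattern described above can be sketched with an OPeNDAP request that pulls only the needed slice of a remote dataset; the URL and variable name below are placeholders, not an actual GDP-hosted archive.

```python
# Minimal sketch: subset a remote dataset through its OPeNDAP endpoint so that
# only the requested slice crosses the network (requires netCDF4 built with DAP support).
from netCDF4 import Dataset

ds = Dataset("http://example.org/thredds/dodsC/downscaled/tasmax.nc")  # hypothetical endpoint
tasmax_subset = ds.variables["tasmax"][0:10, 100:120, 200:220]          # time, lat, lon slice
print(tasmax_subset.shape)
ds.close()
```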
Low-cost real-time 3D PC distributed-interactive-simulation (DIS) application for C4I
NASA Astrophysics Data System (ADS)
Gonthier, David L.; Veron, Harry
1998-04-01
A 3D Distributed Interactive Simulation (DIS) application was developed and demonstrated in a PC environment. The application is capable of running in the stealth mode or as a player which includes battlefield simulations, such as ModSAF. PCs can be clustered together, but not necessarily collocated, to run a simulation or training exercise on their own. A 3D perspective view of the battlefield is displayed that includes terrain, trees, buildings, and other objects supported by the DIS application. Screen update rates of 15 to 20 frames per second have been achieved with fully lit and textured scenes, thus providing high-quality and fast graphics. A complete PC system can be configured for under $2,500. The software runs under Windows95 and WindowsNT. It is written in C++ and uses a commercial API called RenderWare for 3D rendering. The software uses Microsoft Foundation Classes and Microsoft DirectPlay for joystick input. The RenderWare libraries enhance performance through optimization for MMX and the Pentium Pro processor. RenderWare was used with the Righteous 3D graphics board from Orchid Technologies, which has an advertised rendering rate of up to 2 million texture-mapped triangles per second. A low-cost PC DIS simulator that can partake in a real-time collaborative simulation with other platforms is thus achieved.
Adsorption of Ten Microcystin Congeners to Common Laboratory-Ware Is Solvent and Surface Dependent
Altaner, Stefan; Puddick, Jonathan; Wood, Susanna A.; Dietrich, Daniel R.
2017-01-01
Cyanobacteria can produce heptapeptides called microcystins (MC), which are harmful to humans due to their ability to inhibit cellular protein phosphatases. Quantitation of these toxins can be hampered by their adsorption to common laboratory-ware during sample processing and analysis. Because of their structural diversity (>100 congeners) and different physico-chemical properties, they vary in their adsorption to surfaces. In this study, the adsorption of ten different MC congeners (encompassing non-arginated to doubly-arginated congeners) to common laboratory-ware was assessed using different solvent combinations. Sample handling steps were mimicked with glass and polypropylene pipettes and vials with increasing methanol concentrations at two pH levels, before analysis by liquid chromatography-tandem mass spectrometry. We demonstrated that MC adsorb to polypropylene surfaces irrespective of pH. After eight successive pipet actions using polypropylene tips, ca. 20% of the MC were lost to the surface material, which increased to 25%–40% when solutions were acidified. The observed loss was alleviated by changing the methanol (MeOH) concentration in the final solvent. The required MeOH concentration varied depending on which congener was present. Microcystins only adsorbed to glass pipettes (loss up to 30% after eight pipet actions) when in acidified aqueous solutions. The latter appeared largely dependent on the presence of ionizable groups, such as arginine residues. PMID:28383495
Tamez, Daniel; LaRose, Jeffrey A.; Shambaugh, Charles; Chorpenning, Katherine; Soucy, Kevin G; Sobieski, Michael A; Sherwood, Leslie; Giridharan, Guruprasad A; Monreal, Gretel; Koenig, Steven C; Slaughter, Mark S
2014-01-01
Implantation of ventricular assist devices (VADs) for treatment of end-stage heart failure (HF) falls decidedly short of clinical demand, which exceeds 100,000 HF patients per year. VAD implantation often requires major surgical intervention with associated risk of adverse events and long recovery periods. To address these limitations, HeartWare, Inc. (Miami Lakes, FL) has developed a platform of miniature ventricular devices with progressively reduced surgical invasiveness and innovative patient peripherals. One surgical implant concept is a transapical version of the miniaturized left ventricular assist device (MVAD). The HeartWare MVAD Pump® is a small, continuous-flow, full-support device that has a displacement volume of 22 mL. A new cannula configuration has been developed for transapical implantation, where the outflow cannula is positioned across the aortic valve. The two primary objectives for this feasibility study were to evaluate anatomic fit and surgical approach and efficacy of the transapical MVAD configuration. Anatomic fit and surgical approach were demonstrated using human cadavers (n = 4). Efficacy was demonstrated in acute (n = 2) and chronic (n = 1) bovine model experiments and assessed by improvements in hemodynamics, biocompatibility, flow dynamics, and histopathology. Potential advantages of the MVAD Pump include flow support in the same direction as the native ventricle, elimination of cardiopulmonary bypass, and minimally invasive implantation. PMID:24399057
Souza, Eliana Pereira Salles de; Cabrera, Eliana Márcia Sotello; Braile, Domingo Marcolino
2010-01-01
Technological advances and the Internet have contributed to the increased dissemination and updating of knowledge and science. Scientific papers are considered the best form of disclosure of information and have been undergoing many changes, not in how they are developed but in the structure of their publication. The Future Paper, a name for this new structure, uses hypermedia resources, allowing quick, easy, and organized online access to these items. The exchange of information, comments, and criticism can be performed in real time, making the dissemination of science more agile. The trend for the future of documents, whether from professionals or enterprises, is cloud computing, in which all documents will be developed and updated using a variety of devices (computer, palmtop, netbook, iPad) without the software having to be installed on one's own computer, requiring only an Internet connection.
NASA Astrophysics Data System (ADS)
Puche, William S.; Sierra, Javier E.; Moreno, Gustavo A.
2014-08-01
The convergence of new technologies in the digital world has made devices with Internet connectivity, such as televisions, smartphones, tablets, Blu-ray players, and game consoles, increasingly common. Major research centers are therefore working to improve network performance and mitigate the bottleneck phenomenon affecting capacity and high transmission rates for information and data. This includes the implementation of the HbbTV (Hybrid Broadcast Broadband TV) standard and OTT (Over the Top) technology platforms capable of distributing video, audio, TV, and other Internet services to devices connected directly to the cloud. We therefore propose a model, based on high-capacity optical networks, to improve the transmission capacity required by content distribution networks (CDN) for online TV.
Research on the application of wisdom technology in smart city
NASA Astrophysics Data System (ADS)
Li, Juntao; Ma, Shuai; Gu, Weihua; Chen, Weiyi
2015-12-01
This paper first analyzes the concept of wisdom (smart) technology and its relationship to the smart city, and discusses the practical application of the IoT (Internet of Things) in smart cities to explore better ways of realizing them. It then introduces the basic concepts of cloud computing and the smart city and explains the relationship between the two, and discusses five advantages of cloud computing as applied to smart city construction: unified and highly efficient large-scale management of infrastructure software and hardware; service scheduling and resource management; security control and management; energy conservation and management at the platform layer; and promotion of service development of practical significance, accelerating regional social and economic development. Finally, a brief description of wisdom technology and smart city management is presented.
An expert fitness diagnosis system based on elastic cloud computing.
Tseng, Kevin C; Wu, Chia-Chuan
2014-01-01
This paper presents an expert diagnosis system based on cloud computing. It classifies a user's fitness level based on supervised machine learning techniques. This system is able to learn and make customized diagnoses according to the user's physiological data, such as age, gender, and body mass index (BMI). In addition, an elastic algorithm based on the Poisson distribution is presented to allocate computation resources dynamically. It predicts the resources required in the future according to the exponential moving average of past observations. The experimental results show that Naïve Bayes is the best classifier with the highest accuracy (90.8%) and that the elastic algorithm is able to closely capture the trend of requests generated from the Internet and thus assign corresponding computation resources to ensure the quality of service.
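A minimal sketch of the prediction step described above: an exponential moving average of past request counts drives the amount of capacity to provision. The smoothing factor and the requests-per-node figure are illustrative choices, not values from the paper.

```python
# Minimal sketch: EMA-based forecast of the next period's load and a naive
# mapping from forecast load to number of nodes to provision.
def predict_next_load(history, alpha=0.3):
    ema = history[0]
    for x in history[1:]:
        ema = alpha * x + (1 - alpha) * ema
    return ema

requests_per_min = [120, 135, 150, 170, 160, 180]   # observed arrivals
forecast = predict_next_load(requests_per_min)
nodes_needed = int(forecast / 50) + 1                # e.g., one node per 50 requests/min
print(forecast, nodes_needed)
```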
Microstructure and properties of ceramics
NASA Technical Reports Server (NTRS)
Hamano, K.
1984-01-01
The history of research into the microstructure and properties of ceramic ware is discussed; methods of producing ceramics with particular characteristics are investigated. Bubbles, sintering, cracks, and electron microscopy are discussed.
Applied analysis of lacquer films based on pyrolysis-gas chromatography/mass spectrometry.
Lu, Rong; Kamiya, Yukio; Miyakoshi, Tetsuo
2006-09-15
Ancient lacquer film, a Nanban lacquer film, an old lacquer-ware object imported from an Asian country, and the Baroque and Rococo lacquer films were analyzed by pyrolysis-gas chromatography/mass spectrometry. Compared with the results of the natural lacquer film, it was revealed that the ancient lacquer film and Nanban lacquer film were made from Rhus vernicifera, and the old lacquer-ware imported from an Asian country was made from Melanorrhoea usitata. However, the Baroque and Rococo lacquer films obtained from the Doerner Institute in Munich, Germany were made from natural resins. 3-Pentadecylcatechol (MW=320) (urushiol), 3-heptadecylcatechol (MW=348) (laccol), and 4-heptadecylcatechol (MW=348) (thitsiol) were the main products of the pyrolysis of R. vernicifera, Rhus succedanea, and M. usitata.
Secure Cloud-Based Solutions for Different eHealth Services in Spanish Rural Health Centers
2015-01-01
Background The combination of eHealth applications and/or services with cloud technology gives health care staff sufficient mobility and accessibility to transparently check any data they may need without having to worry about its physical location. Objective The main aim of this paper is to put forward secure cloud-based solutions for a range of eHealth services such as electronic health records (EHRs), telecardiology, teleconsultation, and telediagnosis. Methods The scenario chosen for introducing the services is a set of four rural health centers located within the same Spanish region. iCanCloud software was used to perform simulations in the proposed scenario. We chose online traffic and the cost per unit in terms of time as the parameters for choosing the secure solution on the most optimum cloud for each service. Results We suggest that load balancers always be fitted for all solutions, in communication together with several Internet service providers, and that smartcards be used to maintain identity to an appropriate extent. The solutions offered via private cloud for EHRs, teleconsultation, and telediagnosis services require a volume of online traffic calculated to reach up to 2 Gbps per consultation. This may entail an average cost of €500/month. Conclusions The security solutions put forward for each eHealth service constitute an attempt to centralize all information on the cloud, thus offering greater accessibility to medical information in the case of EHRs alongside more reliable diagnoses and treatment for telecardiology, telediagnosis, and teleconsultation services. Therefore, better health care for the rural patient can be obtained at a reasonable cost. PMID:26215155
Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure.
Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei
2011-09-07
Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed.
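The master/worker decomposition used for such cloud MC runs can be sketched with mpi4py, here using a toy Monte Carlo estimate in place of the EGS5 transport code: each rank simulates a share of the histories and the partial results are reduced on rank 0. Launch with mpirun/mpiexec across the allocated nodes.

```python
# Minimal sketch: distribute Monte Carlo "histories" over MPI ranks and
# aggregate on rank 0 (toy pi estimate standing in for particle transport).
import random
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

histories_per_rank = 1_000_000 // size
hits = sum(1 for _ in range(histories_per_rank)
           if random.random() ** 2 + random.random() ** 2 <= 1.0)

total = comm.reduce(hits, op=MPI.SUM, root=0)
if rank == 0:
    print("pi ~", 4.0 * total / (histories_per_rank * size))
```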
NASA Astrophysics Data System (ADS)
Nishi, N.; Hamada, A.
2012-12-01
Stratiform clouds (nimbostratus and cirriform clouds) in the upper troposphere accompanying cumulonimbus activity cover a large part of the tropical region and strongly affect the radiation and water vapor budgets there. The recent satellites CloudSat and CALIPSO can give us information on cloud height and cloud ice amount even over the open ocean. However, their coverage is limited to just below the satellite paths; it is difficult to capture the whole shape and to trace the life cycle of each cloud system using these datasets alone. As a complementary product, we made a dataset of cloud top height and visible optical thickness with one-hour resolution over a wide region, using infrared split-window data from the geostationary satellites (AGU Fall Meeting 2011), and released it on the Internet (http://database.rish.kyoto-u.ac.jp/arch/ctop/). We made lookup tables for estimating cloud top height from geostationary infrared observations alone by comparing them with direct cloud observations by CloudSat (Hamada and Nishi, 2010, JAMC). We picked out simultaneous observations by MTSAT and CloudSat and regressed the CloudSat cloud top height observations onto the 11 μm brightness temperature (Tb) and the difference between the 11 μm and 12 μm Tb. We refer to our estimated cloud top height as "CTOP" below. Our coverage is 85E-155W (MTSAT2) and 80E-160W (MTSAT1R), and 20S-20N. The accuracy of the estimation with the IR split-window observations is best in the upper tropospheric height range. We analyzed the formation and maintenance of cloud systems whose top height is in the upper troposphere with our CTOP analysis, CloudSat 2B-GEOPROF, and GSMaP (Global Satellite Mapping of Precipitation) precipitation data. Most of the upper tropospheric stratiform clouds have their cloud top within the 13-15 km range. The cloud top height decreases slowly as the system dissipates but remains high to the end. However, we sometimes observe that a somewhat lower cloud top height (6-10 km) is maintained for one to two days. A typical example was observed on 5 January 2011 in a dissipating cloud system of 1000-km scale. This cluster was located between 0-10N just west of the International Date Line and moved westward while keeping a relatively low cloud top (6-10 km) for over one day. This top height is lower than that of the ubiquitous upper-tropospheric stratiform clouds but higher than that of the so-called 'congestus' clouds, whose top height is around the 0°C level. CloudSat data show the presence of convective rainfall, suggesting that this cloud system continuously kept making new anvil clouds at a somewhat lower height than usual. We examined the seasonal variation of the distribution of cloud systems with somewhat lower cloud top heights (6-11 km) during 2010-11. The number of such cloud systems is not constant across seasons but increases markedly in some specific seasons. Over the equatorial ocean region (east of 150E), they were frequently observed during the northern winter.
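A sketch of how such a lookup table could be applied to split-window observations; the bin edges and table values below are placeholders standing in for the regression coefficients derived from matched CloudSat profiles, not the released CTOP coefficients.

```python
# Minimal sketch: bin the 11-um brightness temperature and the 11-12 um
# difference, then index a precomputed cloud-top-height table.
import numpy as np

tb11_edges = np.arange(190.0, 300.0, 5.0)        # K (illustrative bins)
dtb_edges = np.arange(-1.0, 6.0, 0.5)            # K, Tb11 - Tb12 (illustrative bins)
ctop_table = np.zeros((len(tb11_edges) + 1, len(dtb_edges) + 1))  # would come from regression

def estimate_ctop(tb11, tb12):
    i = np.digitize(tb11, tb11_edges)
    j = np.digitize(tb11 - tb12, dtb_edges)
    return ctop_table[i, j]                      # estimated cloud top height (km)

print(estimate_ctop(215.0, 213.5))
```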
Code of Federal Regulations, 2011 CFR
2011-07-01
... FISHING AND OPERATIONS ON AQUATIC PRODUCTS General Some Basic Definitions § 784.14 “Goods.” The definition... marine equipment), wares, products, commodities, merchandise, or articles or subjects of commerce of any...
Code of Federal Regulations, 2010 CFR
2010-07-01
... FISHING AND OPERATIONS ON AQUATIC PRODUCTS General Some Basic Definitions § 784.14 “Goods.” The definition... marine equipment), wares, products, commodities, merchandise, or articles or subjects of commerce of any...
NASA Astrophysics Data System (ADS)
Sareen, Sanjay; Gupta, Sunil Kumar; Sood, Sandeep K.
2017-10-01
Zika virus is a mosquito-borne virus that spreads very quickly in different parts of the world. In this article, we propose a system to prevent and control the spread of Zika virus disease using an integration of fog computing, cloud computing, mobile phones, and Internet of Things (IoT)-based sensor devices. Fog computing is used as an intermediary layer between the cloud and end users to reduce the latency time and extra communication cost that are usually high in cloud-based systems. A fuzzy k-nearest neighbour classifier is used to diagnose possibly infected users, and the Google Maps web service is used to provide geographic positioning system (GPS)-based risk assessment to prevent the outbreak. Each Zika virus (ZikaV)-infected user, mosquito-dense site, and breeding site is represented on the Google map, helping government healthcare authorities control such risk-prone areas effectively and efficiently. The proposed system is deployed on the Amazon EC2 cloud to evaluate its performance and accuracy using a data set of 2 million users. Our system provides a high accuracy of 94.5% for the initial diagnosis of different users according to their symptoms and appropriate GPS-based risk assessment.
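The fuzzy k-nearest-neighbour step can be sketched as follows (after Keller et al., 1985), with class memberships weighted by inverse distance to the k nearest labelled cases; the feature vectors and labels are placeholders for symptom data, not the authors' dataset.

```python
# Minimal sketch: fuzzy k-NN membership and crisp decision for one query case.
import numpy as np

def fuzzy_knn(train_x, train_y, query, k=5, m=2):
    d = np.linalg.norm(train_x - query, axis=1)
    idx = np.argsort(d)[:k]
    w = 1.0 / np.maximum(d[idx], 1e-9) ** (2.0 / (m - 1))   # inverse-distance weights
    classes = np.unique(train_y)
    membership = np.array([w[train_y[idx] == c].sum() for c in classes]) / w.sum()
    return classes[np.argmax(membership)], membership

train_x = np.random.rand(100, 4)                 # 4 symptom features (illustrative)
train_y = np.random.randint(0, 2, size=100)      # 0 = uninfected, 1 = possibly infected
label, memberships = fuzzy_knn(train_x, train_y, query=np.random.rand(4))
print(label, memberships)
```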
IAServ: an intelligent home care web services platform in a cloud for aging-in-place.
Su, Chuan-Jun; Chiang, Chang-Yu
2013-11-12
As the elderly population has been rapidly expanding and the core tax-paying population has been shrinking, the need for adequate elderly health and housing services continues to grow while the resources to provide such services are becoming increasingly scarce. Thus, increasing the efficiency of the delivery of healthcare services through the use of modern technology is a pressing issue. The seamless integration of such enabling technologies as ontology, intelligent agents, web services, and cloud computing is transforming healthcare from hospital-based treatments to home-based self-care and preventive care. A ubiquitous healthcare platform based on this technological integration, which synergizes service providers with patients' needs, needs to be developed to provide personalized healthcare services at the right time, in the right place, and in the right manner. This paper presents the development and overall architecture of IAServ (the Intelligent Aging-in-place Home care Web Services Platform) to provide personalized healthcare service ubiquitously in a cloud computing setting to support the most desirable and cost-efficient method of care for the aged: aging in place. The IAServ is expected to offer intelligent, pervasive, accurate and contextually-aware personal care services. Architecturally the implemented IAServ leverages web services and cloud computing to provide economic, scalable, and robust healthcare services over the Internet.
Reducing Time to Science: Unidata and JupyterHub Technology Using the Jetstream Cloud
NASA Astrophysics Data System (ADS)
Chastang, J.; Signell, R. P.; Fischer, J. L.
2017-12-01
Cloud computing can accelerate scientific workflows, discovery, and collaborations by reducing research and data friction. We describe the deployment of Unidata and JupyterHub technologies on the NSF-funded XSEDE Jetstream cloud. With the aid of virtual machines and Docker technology, we deploy a Unidata JupyterHub server co-located with a Local Data Manager (LDM), THREDDS data server (TDS), and RAMADDA geoscience content management system. We provide Jupyter Notebooks and the pre-built Python environments needed to run them. The notebooks can be used for instruction and as templates for scientific experimentation and discovery. We also supply a large quantity of NCEP forecast model results to allow data-proximate analysis and visualization. In addition, users can transfer data using Globus command line tools, and perform their own data-proximate analysis and visualization with Notebook technology. These data can be shared with others via a dedicated TDS server for scientific distribution and collaboration. There are many benefits of this approach. Not only is the cloud computing environment fast, reliable and scalable, but scientists can analyze, visualize, and share data using only their web browser. No local specialized desktop software or a fast internet connection is required. This environment will enable scientists to spend less time managing their software and more time doing science.
Hostetter, Jason; Khanna, Nishanth; Mandell, Jacob C
2018-06-01
The purpose of this study was to integrate web-based forms with a zero-footprint cloud-based Picture Archiving and Communication System (PACS) to create a tool of potential benefit to radiology research and education. Web-based forms were created with a front-end and back-end architecture utilizing common programming languages including Vue.js, Node.js and MongoDB, and integrated into an existing zero-footprint cloud-based PACS. The web-based forms application can be accessed in any modern internet browser on desktop or mobile devices and allows the creation of customizable forms consisting of a variety of question types. Each form can be linked to an individual DICOM examination or a collection of DICOM examinations. Several uses are demonstrated through a series of case studies, including implementation of a research platform for multi-reader multi-case (MRMC) studies and other imaging research, and creation of an online Objective Structured Clinical Examination (OSCE) and an educational case file. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
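One way such a form response could be linked to a DICOM examination is sketched below with MongoDB via pymongo; the connection string, collection, and field names are illustrative and not the authors' schema.

```python
# Minimal sketch: store a reader's form response keyed to a DICOM StudyInstanceUID.
from pymongo import MongoClient

db = MongoClient("mongodb://localhost:27017")["forms_demo"]     # hypothetical server/DB

db.responses.insert_one({
    "form_id": "mrmc-reader-study-01",
    "study_instance_uid": "1.2.840.113619.2.55.3.1234",         # hypothetical DICOM UID
    "reader": "reader_03",
    "answers": {"lesion_present": True, "confidence": 4},
})
print(db.responses.count_documents({"form_id": "mrmc-reader-study-01"}))
```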
Providing Assistive Technology Applications as a Service Through Cloud Computing.
Mulfari, Davide; Celesti, Antonio; Villari, Massimo; Puliafito, Antonio
2015-01-01
Users with disabilities interact with Personal Computers (PCs) using Assistive Technology (AT) software solutions. Such applications run on a PC that a person with a disability commonly uses. However, the configuration of AT applications is not trivial at all, especially whenever the user needs to work on a PC that does not allow him/her to rely on his/her AT tools (e.g., at work, at university, in an Internet point). In this paper, we discuss how cloud computing provides a valid technological solution to enhance such a scenario. With the emergence of cloud computing, many applications are executed on top of virtual machines (VMs). Virtualization allows us to achieve a software implementation of a real computer able to execute a standard operating system and any kind of application. In this paper we propose to build personalized VMs running AT programs and settings. By using remote desktop technology, our solution enables users to control their customized virtual desktop environment by means of an HTML5-based web interface running on any computer equipped with a browser, wherever they are.
Public health practice course using Google Plus.
Wu, Ting-Ting; Sung, Tien-Wen
2014-03-01
In recent years, mobile device-assisted clinical education has become popular among nursing school students. The introduction of mobile devices saves manpower and reduces errors while enhancing nursing students' professional knowledge and skills. To respond to the demands of various learning strategies and to maintain existing systems of education, the concept of Cloud Learning is gradually being introduced to instructional environments. Cloud computing facilitates learning that is personalized, diverse, and virtual. This study involved assessing the advantages of mobile devices and Cloud Learning in a public health practice course, in which Google+ was used as the learning platform, integrating various application tools. Users could save and access data by using any wireless Internet device. The platform was student centered and based on resource sharing and collaborative learning. With the assistance of highly flexible and convenient technology, certain obstacles in traditional practice training can be resolved. Our findings showed that the students who adopted Google+ learned more effectively compared with those who were limited to traditional learning systems. Most students and the nurse educator expressed a positive attitude toward and were satisfied with the innovative learning method.
Performing quantum computing experiments in the cloud
NASA Astrophysics Data System (ADS)
Devitt, Simon J.
2016-09-01
Quantum computing technology has reached a second renaissance in the past five years. Increased interest from both the private and public sector combined with extraordinary theoretical and experimental progress has solidified this technology as a major advancement in the 21st century. As anticipated by many, some of the first realizations of quantum computing technology have occurred over the cloud, with users logging onto dedicated hardware over the classical internet. Recently, IBM has released the Quantum Experience, which allows users to access a five-qubit quantum processor. In this paper we take advantage of this online availability of actual quantum hardware and present four quantum information experiments. We utilize the IBM chip to realize protocols in quantum error correction, quantum arithmetic, quantum graph theory, and fault-tolerant quantum computation by accessing the device remotely through the cloud. While the results are subject to significant noise, the correct results are returned from the chip. This demonstrates the power of experimental groups opening up their technology to a wider audience and will hopefully allow for the next stage of development in quantum information technology.
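For readers unfamiliar with programmatic access to such devices, the sketch below builds a two-qubit Bell state, assuming a current Qiskit installation with the Aer simulator; the paper's experiments instead ran on IBM's cloud-hosted five-qubit chip accessed remotely.

```python
# Minimal sketch: prepare and measure a Bell state on a local simulator.
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

qc = QuantumCircuit(2, 2)
qc.h(0)                     # superposition on qubit 0
qc.cx(0, 1)                 # entangle qubits 0 and 1
qc.measure([0, 1], [0, 1])

counts = AerSimulator().run(qc, shots=1024).result().get_counts()
print(counts)               # ideally only '00' and '11' outcomes
```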
Back table outflow graft anastomosis technique for HeartWare HVAD implantation.
Basher, S; Bick, J; Maltais, S
2015-12-01
The management of concomitant aortic and aortic valve disease with left ventricular assist device (LVAD) implantation for patients with severe cardiomyopathy is challenging, and has not been established given the complexity of LVAD surgery with concomitant aortic interventions. A 45-year-old patient presented to our institution with end-stage heart failure symptoms and non-ischemic cardiomyopathy. The patient was found to have a bicuspid aortic valve, severe native aortic regurgitation, a significant ascending aortic aneurysm, and severely depressed left ventricular (LV) function requiring two inotropes. He underwent a successful hemiarch repair of the ascending aortic aneurysm using a back table outflow graft anastomosis technique, and subsequent placement of a HeartWare Ventricular Assist Device (HVAD) with concomitant aortic valve closure with a modified Park's stitch. The patient did well postoperatively and is currently listed for heart transplantation.
[Aluminum--occurrence and toxicity for organisms].
Ochmański, W; Barabasz, W
2000-01-01
Aluminium (Al) is a ubiquitous element found in every food product. Particular sources of Al include corn, yellow cheese, salt, herbs, spices, tea, and tap water. In the household, Al-made ware is a major source of the element. Al may cause disease in humans; in particular, it hampers many metabolic processes, especially the turnover of calcium, phosphorus, and iron. Salts of Al may bind to DNA and RNA and inhibit enzymes such as hexokinase, acid and alkaline phosphatases, phosphodiesterase, and phosphooxydase. Al salts are especially harmful to the nervous and hematopoietic systems and to the skeleton. Al enters the organism with food, water, and cosmetics, and from aluminium ware and containers. Its toxicity comes from substitution of Mg and Fe ions, resulting in disturbances of intracellular signaling, excretory functions, and cellular growth. The neurotoxic action of Al probably comes from substitution of Mg ions in ATP, which ultimately affects the function of every ATP-using enzyme. Observations in experimental models indicate that Al salts are involved in the development of Alzheimer disease. The toxicity of Al to the skeletal system results in diminished strength and thus a tendency to fracture, owing to lower collagen synthesis and slowed mineralisation. Low erythropoietin production, inhibition of heme-synthesizing enzymes, and binding of Al to transferrin result in anaemia. Carcinogenic effects of Al have been neither proved nor disproved, but high concentrations of Al have been found in many neoplastic cells. In conclusion, prophylactic measures should be introduced to reduce Al intake, especially avoiding the use of Al-made ware and controlling the Al content of food.
Maximally Permissive Composition of Actors in Ptolemy II
2013-03-20
into our physical world by means of sensors and actuators. This global network of Cyber-Physical Systems (i.e., integrations of computation with... physical processes [Lee, 2008]) is often referred to as the "Internet of Things" (IoT). This term was coined by Kevin Ashton [Ashton, 2009] in 1999 to... processing capabilities. A newly emerging outermost peripheral layer of the Cloud that is key to the full realization of the IoT is identified as "The
2016-03-01
Representational state transfer Java messaging service Java application programming interface (API) Internet relay chat (IRC)/extensible messaging and...JBoss application server or an Apache Tomcat servlet container instance. The relational database management system can be either PostgreSQL or MySQL ... Java library called direct web remoting. This library has been part of the core CACE architecture for quite some time; however, there have not been
Lackey, Amanda E; Pandey, Tarun; Moshiri, Mariam; Lalwani, Neeraj; Lall, Chandana; Bhargava, Puneet
2014-06-01
It is an opportune time for radiologists to focus on personal productivity. The ever increasing reliance on computers and the Internet has significantly changed the way we work. Myriad software applications are available to help us improve our personal efficiency. In this article, the authors discuss some tools that help improve collaboration and personal productivity, maximize e-learning, and protect valuable digital data. Published by Elsevier Inc.
Workload Model Based Dynamic Adaptation of Social Internet of Vehicles
Alam, Kazi Masudul; Saini, Mukesh; El Saddik, Abdulmotaleb
2015-01-01
Social Internet of Things (SIoT) has gained much interest among different research groups in recent times. As a key member of a smart city, the vehicular domain of SIoT (SIoV) is also undergoing steep development. In the SIoV, vehicles work as sensor hubs that capture surrounding information using in-vehicle and smartphone sensors and later publish it for consumers. A cloud-centric cyber-physical system best describes the SIoV model, in which the physical sensing-actuation process affects cloud-based service sharing or computation in a feedback loop, and vice versa. The cyber-based social relationship abstraction enables distributed, easily navigable, and scalable peer-to-peer communication among the SIoV subsystems. These cyber-physical interactions involve a huge amount of data, and it is difficult to form a real instance of the system to test the feasibility of SIoV applications. In this paper, we propose an analytical model to measure the workloads of the various subsystems involved in the SIoV process. We present the basic model, which is further extended to incorporate complex scenarios. We provide extensive simulation results for different parameter settings of the SIoV system. The findings of the analyses are further used to design example adaptation strategies for the SIoV subsystems which would foster deployment of intelligent transport systems. PMID:26389905
Secure Dynamic access control scheme of PHR in cloud computing.
Chen, Tzer-Shyong; Liu, Chia-Hui; Chen, Tzer-Long; Chen, Chin-Sheng; Bau, Jian-Guo; Lin, Tzu-Ching
2012-12-01
With the development of information technology and medical technology, medical information has evolved from traditional paper records into electronic medical records, which are now widely used. A new style of medical information exchange, the personal health record (PHR), is gradually being developed. A PHR is a health record maintained and recorded by the individual. An ideal personal health record integrates personal medical information from different sources and provides a complete and correct personal health and medical summary through the Internet or portable media, subject to security and privacy requirements. Many personal health records are already in use. A patient-centered PHR information exchange system allows the public to autonomously maintain and manage personal health records, which is convenient for storing, accessing, and sharing personal medical records. With the emergence of cloud computing, PHR services have moved to storing data on cloud servers so that resources can be used flexibly and operating costs reduced. Nevertheless, patients face privacy problems when storing PHR data in the cloud, and a secure protection scheme is required to encrypt each patient's medical records before they are stored on a cloud server. In the encryption process, it is a challenge to provide accurate access to medical records while remaining flexible and efficient. A new PHR access control scheme for cloud computing environments is proposed in this study. Using a Lagrange interpolation polynomial to establish a secure and effective PHR information access scheme, it allows accurate access to PHRs with security and is suitable for very large numbers of users. Moreover, the scheme dynamically supports multiple users in cloud computing environments while preserving personal privacy, and grants legitimate authorities access to PHRs. Security and effectiveness analyses show that the proposed PHR access scheme for cloud computing environments is flexible and secure and can effectively support real-time addition and removal of user access authorization as well as appending and revising PHR records.
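The abstract does not give the construction details, but Lagrange-interpolation access schemes are generally built on threshold secret sharing over a prime field. The sketch below is a minimal, generic illustration of that building block (not the authors' scheme); the prime, threshold and share counts are arbitrary illustrative values.

```python
# Minimal sketch (not the authors' scheme): Lagrange interpolation over a prime
# field, the standard building block for threshold-based access-key recovery.
import random

PRIME = 2_147_483_647  # a Mersenne prime large enough for a toy example

def make_shares(secret: int, threshold: int, n_users: int):
    """Split an access key into n shares; any `threshold` shares recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    poly = lambda x: sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, n_users + 1)]

def recover(shares):
    """Reconstruct the key by Lagrange interpolation at x = 0."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

shares = make_shares(secret=123456789, threshold=3, n_users=5)
assert recover(shares[:3]) == 123456789   # any 3 of 5 shares unlock the record key
```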
Li, Chun-Ta; Shih, Dong-Her; Wang, Chun-Cheng
2018-04-01
With the rapid development of wireless communication technologies and the growing prevalence of smart devices, the telecare medical information system (TMIS) allows patients to receive medical treatment from doctors over the Internet without visiting hospitals in person. By adopting a mobile device, a cloud-assisted platform and a wireless body area network, patients can collect their physiological data and upload them to the medical cloud via their mobile devices, enabling caregivers or doctors to provide appropriate treatment anytime and anywhere. In order to protect the medical privacy of the patient and guarantee the reliability of the system, all system participants must be authenticated before accessing the TMIS. Mohit et al. recently suggested a lightweight authentication protocol for a cloud-based health care system. They claimed their protocol is resilient to all well-known security attacks and has several important features such as mutual authentication and patient anonymity. In this paper, we demonstrate that Mohit et al.'s authentication protocol has various security flaws, and we introduce an enhanced version of their protocol for cloud-assisted TMIS, which ensures patient anonymity and patient unlinkability and prevents report revelation and report forgery attacks. The security analysis shows that our enhanced protocol is secure against various known attacks, including those found in Mohit et al.'s protocol. Compared with existing related protocols, our enhanced protocol retains all desirable security properties while maintaining efficiency in terms of computation cost for cloud-assisted TMIS. We propose a more secure mutual authentication and privacy preservation protocol for cloud-assisted TMIS that fixes the security weaknesses found in Mohit et al.'s protocol. According to our analysis, our authentication protocol satisfies most functionality features for privacy preservation and effectively copes with cloud-assisted TMIS with better efficiency. Copyright © 2018 Elsevier B.V. All rights reserved.
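The enhanced protocol itself is not reproduced in the abstract. As a generic illustration of what nonce-based mutual authentication looks like, the following sketch shows two parties sharing a key K proving knowledge of it to each other and deriving a session key; it is not Li et al.'s protocol, and all names are placeholders.

```python
# Generic sketch of nonce-based mutual authentication with a pre-shared key K
# (illustrative only; not the protocol proposed in the paper).
import hashlib, hmac, os

def h(*parts: bytes) -> bytes:
    return hashlib.sha256(b"|".join(parts)).digest()

K = os.urandom(32)                      # key shared by patient device and cloud

n_p = os.urandom(16)                    # patient -> cloud: fresh nonce
n_s = os.urandom(16)                    # cloud -> patient: its nonce + proof of K
proof_server = h(K, n_p, n_s, b"server")

# patient verifies the cloud, then answers with its own proof
assert hmac.compare_digest(proof_server, h(K, n_p, n_s, b"server"))
proof_patient = h(K, n_s, n_p, b"patient")

# cloud verifies the patient; both sides derive the same fresh session key
assert hmac.compare_digest(proof_patient, h(K, n_s, n_p, b"patient"))
session_key = h(K, n_p, n_s, b"session")
print("mutual authentication succeeded, session key established")
```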
Volunteered Cloud Computing for Disaster Management
NASA Astrophysics Data System (ADS)
Evans, J. D.; Hao, W.; Chettri, S. R.
2014-12-01
Disaster management relies increasingly on interpreting earth observations and running numerical models, which require significant computing capacity - usually on short notice and at irregular intervals. Peak computing demand during event detection, hazard assessment, or incident response may exceed agency budgets; however, some of it can be met through volunteered computing, which distributes subtasks to participating computers via the Internet. This approach has enabled large projects in mathematics, basic science, and climate research to harness the slack computing capacity of thousands of desktop computers. This capacity is likely to diminish as desktops give way to battery-powered mobile devices (laptops, smartphones, tablets) in the consumer market; but as cloud computing becomes commonplace, it may offer significant slack capacity -- if its users are given an easy, trustworthy mechanism for participating. Such a "volunteered cloud computing" mechanism would also offer several advantages over traditional volunteered computing: tasks distributed within a cloud have fewer bandwidth limitations; granular billing mechanisms allow small slices of "interstitial" computing at no marginal cost; and virtual storage volumes allow in-depth, reversible machine reconfiguration. Volunteered cloud computing is especially suitable for "embarrassingly parallel" tasks, including ones requiring large data volumes: examples in disaster management include near-real-time image interpretation, pattern / trend detection, or large model ensembles. In the context of a major disaster, we estimate that cloud users (if suitably informed) might volunteer hundreds to thousands of CPU cores across a large provider such as Amazon Web Services. To explore this potential, we are building a volunteered cloud computing platform and targeting it to a disaster management context. Using a lightweight, fault-tolerant network protocol, this platform helps cloud users join parallel computing projects; automates reconfiguration of their virtual machines; ensures accountability for donated computing; and optimizes the use of "interstitial" computing. Initial applications include fire detection from multispectral satellite imagery and flood risk mapping through hydrological simulations.
Development of the cloud sharing system for residential earthquake responses using smartphones
NASA Astrophysics Data System (ADS)
Shohei, N.; Fujiwara, H.; Azuma, H.; Hao, K. X.
2015-12-01
Earthquake response at a residence depends on its building structure, site amplification, epicentral distance, etc. Until recently, it was impossible to obtain individual residential responses with conventional seismometers because of cost. However, current technology makes it possible with the Micro Electro Mechanical Systems (MEMS) sensors inside mobile terminals such as smartphones. We developed a cloud sharing system for residential earthquake response in local communities utilizing mobile terminals, such as the iPhone, iPad and iPod touch, as a collaboration between NIED and Hakusan Corp. Triggered earthquake acceleration waveforms are recorded at a sampling frequency of 100 Hz and stored in device memory once a threshold value is exceeded or an alert is received from the Earthquake Early Warning system. The recorded data are automatically transmitted to and archived on the cloud server once wireless communication is available. Users can easily retrieve the uploaded data with a web browser through the Internet. The cloud sharing system is designed for residences and is shared only within the local community. Residents can freely add sensors and register information about installation points in each region. If an earthquake occurs, they can easily view the local distribution of seismic intensities and even analyze the waveforms. To verify this cloud-based seismic wave sharing system, we have performed on-site experiments in cooperation with several local communities. The system and experimental results will be introduced and demonstrated in the presentation.
Comparing Networks from a Data Analysis Perspective
NASA Astrophysics Data System (ADS)
Li, Wei; Yang, Jing-Yu
To probe network characteristics, two predominant approaches to network comparison are global property statistics and subgraph enumeration. However, they suffer from limited information and exhaustive computation. Here, we present an approach to comparing networks from the perspective of data analysis. The approach first projects each node of the original network as a high-dimensional data point, so that the network is seen as a cloud of data points. The dispersion information of the principal component analysis (PCA) projection of the generated data clouds can then be used to distinguish networks. We applied this node projection method to yeast protein-protein interaction networks and Internet Autonomous System networks, two types of networks with several similar higher-order properties. The method can efficiently distinguish one from the other. The identical result obtained for different datasets from independent sources also indicates that the method is a robust and universal framework.
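As a rough illustration of the node-projection idea, the sketch below treats each node's adjacency row as its high-dimensional point (the paper's exact projection may differ) and compares the dispersion of the PCA spectra of two synthetic networks.

```python
# Sketch under assumptions: use each node's adjacency row as its high-dimensional
# point (the paper's exact projection may differ) and compare the dispersion of
# the PCA spectrum of the resulting data clouds.
import numpy as np

def pca_dispersion(adjacency: np.ndarray, k: int = 5) -> np.ndarray:
    """Return the top-k explained-variance ratios of the node point cloud."""
    X = adjacency - adjacency.mean(axis=0)          # center the data cloud
    _, s, _ = np.linalg.svd(X, full_matrices=False) # PCA via SVD
    var = s**2
    return (var / var.sum())[:k]

rng = np.random.default_rng(0)
er = (rng.random((200, 200)) < 0.05).astype(float)          # Erdos-Renyi-like
er = np.triu(er, 1); er = er + er.T
hub = np.zeros((200, 200)); hub[0, 1:] = hub[1:, 0] = 1.0   # star (hub) network
print("ER dispersion :", pca_dispersion(er))
print("Hub dispersion:", pca_dispersion(hub))
```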
An Atlas of Computed Equivalent Widths of Quasar Broad Emission Lines
NASA Astrophysics Data System (ADS)
Korista, Kirk; Baldwin, Jack; Ferland, Gary; Verner, Dima
We present graphically the results of several thousand photoionization calculations of broad emission-line clouds in quasars, spanning 7 orders of magnitude in hydrogen ionizing flux and particle density. The equivalent widths of 42 quasar emission lines are presented as contours in the particle density-ionizing flux plane for a typical incident continuum shape, solar chemical abundances, and a cloud column density of N(H) = 10²³ cm⁻². Results are similarly given for a small subset of emission lines for two other column densities (10²² and 10²⁴ cm⁻²), five other incident continuum shapes, and a gas metallicity of 5 Z⊙. These graphs should prove useful in the analysis of quasar emission-line data and in the detailed modeling of quasar broad emission-line regions. The digital results of these emission-line grids and many more are available over the Internet.
Volunteer Clouds and Citizen Cyberscience for LHC Physics
NASA Astrophysics Data System (ADS)
Aguado Sanchez, Carlos; Blomer, Jakob; Buncic, Predrag; Chen, Gang; Ellis, John; Garcia Quintas, David; Harutyunyan, Artem; Grey, Francois; Lombrana Gonzalez, Daniel; Marquina, Miguel; Mato, Pere; Rantala, Jarno; Schulz, Holger; Segal, Ben; Sharma, Archana; Skands, Peter; Weir, David; Wu, Jie; Wu, Wenjing; Yadav, Rohit
2011-12-01
Computing for the LHC, and for HEP more generally, is traditionally viewed as requiring specialized infrastructure and software environments, and therefore not compatible with the recent trend in "volunteer computing", where volunteers supply free processing time on ordinary PCs and laptops via standard Internet connections. In this paper, we demonstrate that with the use of virtual machine technology, at least some standard LHC computing tasks can be tackled with volunteer computing resources. Specifically, by presenting volunteer computing resources to HEP scientists as a "volunteer cloud", essentially identical to a Grid or dedicated cluster from a job submission perspective, LHC simulations can be processed effectively. This article outlines both the technical steps required for such a solution and the implications for LHC computing as well as for LHC public outreach and for participation by scientists from developing regions in LHC research.
1. Historic American Buildings Survey E. W. Russell, Photographer, March ...
1. Historic American Buildings Survey E. W. Russell, Photographer, March 14, 1936 REAR VIEW, NORTH OF SLAVE QUARTERS - Waring House, Slave Quarters, 351 Government Street (now South Claiborne Street), Mobile, Mobile County, AL
38 CFR 3.810 - Clothing allowance.
Code of Federal Regulations, 2010 CFR
2010-07-01
... wears or uses certain prosthetic or orthopedic appliances which tend to wear or tear clothing (including... or used which tends to ware or tear the veteran's clothing, or that because of the use of a physician...
Research on information security in big data era
NASA Astrophysics Data System (ADS)
Zhou, Linqi; Gu, Weihong; Huang, Cheng; Huang, Aijun; Bai, Yongbin
2018-05-01
Big data is becoming another hotspot in the field of information technology after cloud computing and the Internet of Things. However, existing information security methods can no longer meet the information security requirements of the big data era. This paper analyzes the challenges to data security brought by big data and their causes, discusses the development trend of network attacks against the background of big data, and puts forward our own views on the development of security defense in terms of technology, strategy and products.
Worldwide Report, Arms Control
1985-09-21
Intersoftware, Logic Control, Control y Aplicaciones, Dielsa, Eliop, EISA, Sainco CTNE, Fagor Electronica, Standard CTNE, Fagor Electronica, Piher... regulation systems, industrial turbine of advanced design, Alfa Sewing Machines, Control y Aplicaciones, Danobat, Etxe-Tar, CO2, CO and ultraviolet lasers
Comparison of methods for measuring travel time at Florida freeways and arterials : [summary].
DOT National Transportation Integrated Search
2014-07-01
In this project, University of Florida researchers collected field data along several highways to evaluate travel time measurements from several sources: STEWARD, BlueTOAD, INRIX, and HERE. STEWARD (Statewide Transportation Engineering Ware...
40 CFR 63.11440 - What are the monitoring requirements for new and existing sources?
Code of Federal Regulations, 2010 CFR
2010-07-01
... Clay Ceramics Manufacturing Area Sources Standards, Compliance, and Monitoring Requirements § 63.11440... ceramic ware, you must conduct a daily check of the peak firing temperature. If the peak temperature...
Knowledge Retrieval Solutions.
ERIC Educational Resources Information Center
Khan, Kamran
1998-01-01
Excalibur RetrievalWare offers true knowledge retrieval solutions. Its fundamental technologies, Adaptive Pattern Recognition Processing and Semantic Networks, have capabilities for knowledge discovery and knowledge management of full-text, structured and visual information. The software delivers a combination of accuracy, extensibility,…
47 CFR 90.614 - Segments of the 806-824/851-869 MHz band for non-border areas.
Code of Federal Regulations, 2013 CFR
2013-10-01
..., Jasper, Jeff Davis, Jefferson, Jenkins, Johnson, Jones, Lamar, Lanier, Laurens, Lee, Liberty, Lincoln..., Thomas, Tift, Toombs, Towns, Treutlen, Troup, Turner, Twiggs, Union, Upson, Walker, Walton, Ware, Warren...
47 CFR 90.614 - Segments of the 806-824/851-869 MHz band for non-border areas.
Code of Federal Regulations, 2014 CFR
2014-10-01
..., Jasper, Jeff Davis, Jefferson, Jenkins, Johnson, Jones, Lamar, Lanier, Laurens, Lee, Liberty, Lincoln..., Thomas, Tift, Toombs, Towns, Treutlen, Troup, Turner, Twiggs, Union, Upson, Walker, Walton, Ware, Warren...
47 CFR 90.614 - Segments of the 806-824/851-869 MHz band for non-border areas.
Code of Federal Regulations, 2012 CFR
2012-10-01
..., Jasper, Jeff Davis, Jefferson, Jenkins, Johnson, Jones, Lamar, Lanier, Laurens, Lee, Liberty, Lincoln..., Thomas, Tift, Toombs, Towns, Treutlen, Troup, Turner, Twiggs, Union, Upson, Walker, Walton, Ware, Warren...
Lesaffre, E; Asefa, M; Verbeke, G
1999-04-15
The Jimma Infant Survival Differential Longitudinal Study is an Ethiopian study set up to establish risk factors affecting infant survival and to investigate the socio-economic, maternal and infant-rearing factors that contribute most to a child's early survival. Here, a subgroup of about 1500 children born in Jimma town is examined for their first year's weight gain. Of special interest is the impact on a child's weight gain of certain cultural practices, such as uvulectomy, milk teeth extraction and butter swallowing, which have never been thoroughly investigated in any study. In this context, the linear mixed model (Laird and Ware) is employed. The purpose of this paper is to illustrate the practical issues that arise when constructing the longitudinal model. Recently developed diagnostics are used for this purpose. Finally, special attention is paid to the two-stage interpretation of the linear mixed model.
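For readers unfamiliar with the Laird-Ware formulation, the following hedged sketch fits a linear mixed model of infant weight gain with statsmodels; the file name and column names (weight_kg, age_months, uvulectomy, child_id) are hypothetical stand-ins for the study's variables.

```python
# Hedged sketch: fitting a Laird-Ware linear mixed model for infant weight gain
# with statsmodels. Column names (weight_kg, age_months, uvulectomy, child_id)
# are hypothetical stand-ins for the study's actual variables.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("jimma_growth.csv")  # hypothetical file with one row per visit

model = smf.mixedlm(
    "weight_kg ~ age_months + uvulectomy + age_months:uvulectomy",  # fixed effects
    data=df,
    groups=df["child_id"],          # stage 2: child-level random effects
    re_formula="~age_months",       # random intercept and slope per child
)
result = model.fit()
print(result.summary())             # fixed-effect estimates and variance components
```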
NASA Astrophysics Data System (ADS)
Fleming, S. J.; Swann, C. P.
1992-02-01
The early 9th century A.D. trade links between China's Tang dynasty and the Western world, by land along the caravan routes of the Silk Road and by sea via India and the Gulf, encouraged a taste for many kinds of Chinese ceramics among the affluent societies of the Islamic empire. Using PIXE analysis to determine the primary composition and minor element patterns of the clay fabrics of a wide range of pottery recovered from the Islamic city of Siraf (in southern Iran), we have established clear differences between Chinese wares imported to that region and a variety of imitative wares of local origin. Parallels and contrasts are drawn between our data and those obtained by Chinese scholars in recent years for the products of various Tang kiln sites, particularly for a series of distinctive stonewares from Tongguan (Hunan province).
Real-Time Mapping: Contemporary Challenges and the Internet of Things as the Way Forward
NASA Astrophysics Data System (ADS)
Bęcek, Kazimierz
2016-12-01
The Internet of Things (IoT) is an emerging technology that was conceived in 1999. The key components of the IoT are intelligent sensors, which represent objects of interest. The adjective `intelligent' is used here in the information gathering sense, not the psychological sense. Some 30 billion sensors that `know' the current status of objects they represent are already connected to the Internet. Various studies indicate that the number of installed sensors will reach 212 billion by 2020. Various scenarios of IoT projects show sensors being able to exchange data with the network as well as between themselves. In this contribution, we discuss the possibility of deploying the IoT in cartography for real-time mapping. A real-time map is prepared using data harvested through querying sensors representing geographical objects, and the concept of a virtual sensor for abstract objects, such as a land parcel, is presented. A virtual sensor may exist as a data record in the cloud. Sensors are identified by an Internet Protocol address (IP address), which implies that geographical objects through their sensors would also have an IP address. This contribution is an updated version of a conference paper presented by the author during the International Federation of Surveyors 2014 Congress in Kuala Lumpur. The author hopes that the use of the IoT for real-time mapping will be considered by the mapmaking community.
NASA Astrophysics Data System (ADS)
Adedayo, Bada; Wang, Qi; Alcaraz Calero, Jose M.; Grecos, Christos
2015-02-01
The recent explosion in video-related Internet traffic has been driven by the widespread use of smart mobile devices, particularly smartphones with advanced cameras that are able to record high-quality videos. Although many of these devices offer the facility to record videos at different spatial and temporal resolutions, primarily with local storage considerations in mind, most users only ever use the highest quality settings. The vast majority of these devices are optimised for compressing the acquired video using a single built-in codec and have neither the computational resources nor battery reserves to transcode the video to alternative formats. This paper proposes a new low-complexity dynamic resource allocation engine for cloud-based video transcoding services that are both scalable and capable of being delivered in real-time. Firstly, through extensive experimentation, we establish resource requirement benchmarks for a wide range of transcoding tasks. The set of tasks investigated covers the most widely used input formats (encoder type, resolution, amount of motion and frame rate) associated with mobile devices and the most popular output formats derived from a comprehensive set of use cases, e.g. a mobile news reporter directly transmitting videos to a TV audience with varying video format requirements, with minimal usage of resources both at the reporter's end and at the cloud infrastructure end for transcoding services.
Collaborative Working Architecture for IoT-Based Applications.
Mora, Higinio; Signes-Pont, María Teresa; Gil, David; Johnsson, Magnus
2018-05-23
The new sensing applications need enhanced computing capabilities to handle the requirements of complex and huge data processing. The Internet of Things (IoT) concept brings processing and communication features to devices. In addition, the Cloud Computing paradigm provides resources and infrastructures for performing the computations and outsourcing the work from the IoT devices. This scenario opens new opportunities for designing advanced IoT-based applications; however, there is still much research to be done to properly gear all the systems for working together. This work proposes a collaborative model and an architecture to take advantage of the available computing resources. The resulting architecture involves a novel network design with different levels which combines sensing and processing capabilities based on the Mobile Cloud Computing (MCC) paradigm. An experiment is included to demonstrate that this approach can be used in diverse real applications. The results show the flexibility of the architecture to perform complex computational tasks of advanced applications.
Malavasi, Massimiliano; Turri, Enrico; Atria, Jose Joaquin; Christensen, Heidi; Marxer, Ricard; Desideri, Lorenzo; Coy, Andre; Tamburini, Fabio; Green, Phil
2017-01-01
Making better use of the increasing functional capabilities of home automation systems and Internet of Things (IoT) devices to support the needs of users with disabilities is the subject of a research project currently conducted by Area Ausili (Assistive Technology Area), a department of Polo Tecnologico Regionale Corte Roncati of the Local Health Trust of Bologna (Italy), in collaboration with the AIAS Ausilioteca Assistive Technology (AT) Team. The main aim of the project is to develop experimental low-cost systems for environmental control through simplified and accessible user interfaces. Many of the activities focus on automatic speech recognition and are developed in the framework of the CloudCAST project. In this paper we report on the first technical achievements of the project and discuss possible future developments and applications within and outside CloudCAST.
A Standard Mutual Authentication Protocol for Cloud Computing Based Health Care System.
Mohit, Prerna; Amin, Ruhul; Karati, Arijit; Biswas, G P; Khan, Muhammad Khurram
2017-04-01
The Telecare Medical Information System (TMIS) provides a standard platform for patients to receive necessary medical treatment from doctor(s) via Internet communication. Security protection is important for the medical records (data) of patients because they contain very sensitive information. Besides, patient anonymity is another important property that must be protected. Most recently, Chiou et al. suggested an authentication protocol for TMIS utilizing a cloud environment. They claimed that their protocol preserves patient anonymity and is well protected. We reviewed their protocol and found that it fails to preserve patient anonymity, and that the same protocol is not protected against stolen mobile device attacks. In order to improve the security level and complexity, we design a lightweight authentication protocol for the same environment. Our security analysis ensures resilience against all possible security attacks. The performance of our protocol is comparable to that of related previous research.
NASA Astrophysics Data System (ADS)
Cruz, Febus Reidj G.; Padilla, Dionis A.; Hortinela, Carlos C.; Bucog, Krissel C.; Sarto, Mildred C.; Sia, Nirlu Sebastian A.; Chung, Wen-Yaw
2017-02-01
This study concerns the determination of the moisture content of milled rice using an image processing technique and a perceptron neural network algorithm. The algorithm takes several inputs and produces an output, which is the moisture content of the milled rice. Several types of milled rice are used in this study, namely: Jasmine, Kokuyu, 5-Star, Ifugao, Malagkit, and NFA rice. The captured images are processed using MATLAB R2013a software. A USB dongle connected to the router provides the Internet connection for online web access. The GizDuino IOT-644 handles the temperature and humidity sensor and sends and receives data between the computer and cloud storage. The result is compared to the actual moisture content range obtained with a moisture tester for milled rice. Based on the results, this study provides accurate data for determining the moisture content of milled rice.
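The trained network itself is not described in detail; the sketch below is a minimal single-neuron perceptron on hypothetical image-derived colour features, illustrating the kind of classifier the abstract refers to rather than the authors' actual model.

```python
# Minimal perceptron sketch (illustrative, not the authors' trained network).
# Inputs are hypothetical image features (e.g. mean R, G, B of milled-rice grains),
# and the target is a binarised moisture class (dry vs. moist).
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=100):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])   # append a bias input
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for xi, target in zip(Xb, y):
            pred = 1 if xi @ w > 0 else 0
            w += lr * (target - pred) * xi           # classic perceptron update
    return w

# toy data: 4 samples x 3 colour features, labels 1 = above moisture threshold
X = np.array([[0.61, 0.55, 0.40], [0.70, 0.66, 0.52],
              [0.35, 0.30, 0.22], [0.40, 0.38, 0.25]])
y = np.array([1, 1, 0, 0])
w = train_perceptron(X, y)
print("decision for a new grain image:", int(np.r_[0.65, 0.60, 0.45, 1.0] @ w > 0))
```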
Design and implementation of website information disclosure assessment system.
Cho, Ying-Chiang; Pan, Jen-Yi
2015-01-01
Internet application technologies, such as cloud computing and cloud storage, have increasingly changed people's lives. Websites contain vast amounts of personal privacy information. In order to protect this information, network security technologies, such as database protection and data encryption, attract many researchers. The most serious problems concerning web vulnerability are e-mail address and network database leakages. These leakages have many causes. For example, malicious users can steal database contents, taking advantage of mistakes made by programmers and administrators. In order to mitigate this type of abuse, a website information disclosure assessment system is proposed in this study. This system utilizes a series of technologies, such as web crawler algorithms, SQL injection attack detection, and web vulnerability mining, to assess a website's information disclosure. Thirty websites, randomly sampled from the top 50 world colleges, were used to collect leakage information. This testing showed the importance of increasing the security and privacy of website information for academic websites.
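One of the assessment steps described, crawling a site and flagging exposed e-mail addresses, can be illustrated with a few lines of Python; the seed URL is a placeholder, and the full system described in the paper additionally performs SQL-injection detection and vulnerability mining.

```python
# Sketch of one disclosure check described above: crawl pages within one site and
# flag exposed e-mail addresses. The seed URL is a placeholder.
import re
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def scan_site(seed: str, max_pages: int = 20):
    domain, queue, seen, leaks = urlparse(seed).netloc, [seed], set(), {}
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = requests.get(url, timeout=5).text
        except requests.RequestException:
            continue
        found = set(EMAIL_RE.findall(html))
        if found:
            leaks[url] = found                        # record the disclosure
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"])
            if urlparse(link).netloc == domain:       # stay inside the target site
                queue.append(link)
    return leaks

print(scan_site("https://www.example.edu/"))          # placeholder seed URL
```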
Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure
NASA Astrophysics Data System (ADS)
Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei
2011-09-01
Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. This work was presented in part at the 2010 Annual Meeting of the American Association of Physicists in Medicine (AAPM), Philadelphia, PA.
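The embarrassingly parallel split behind the reported 47x speed-up can be illustrated with a toy example; the sketch below divides photon histories among local worker processes (standing in for the paper's MPI-connected cloud nodes) and aggregates the tallies, using a trivial slab-transmission model rather than EGS5.

```python
# Stand-in sketch of the embarrassingly parallel split used for cloud MC dose
# calculation: histories are divided among workers and the tallies aggregated.
# Local processes replace the paper's MPI-connected cloud nodes; the "physics"
# is a toy photon-transmission problem, not EGS5.
import math, random
from multiprocessing import Pool

MU = 0.2     # toy attenuation coefficient (1/cm)
DEPTH = 5.0  # slab thickness (cm)

def run_histories(args):
    seed, n = args
    rng = random.Random(seed)
    # count photons whose sampled free path exceeds the slab depth (transmitted)
    return sum(1 for _ in range(n) if rng.expovariate(MU) > DEPTH)

if __name__ == "__main__":
    total, workers = 1_000_000, 8
    chunks = [(seed, total // workers) for seed in range(workers)]
    with Pool(workers) as pool:
        transmitted = sum(pool.map(run_histories, chunks))  # gather partial tallies
    print("MC transmission:", transmitted / total,
          "analytic:", math.exp(-MU * DEPTH))
```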
Planning and management of cloud computing networks
NASA Astrophysics Data System (ADS)
Larumbe, Federico
The evolution of the Internet has a great impact on a large part of the population. People use it to communicate, query information, receive news, work, and for entertainment. Its extraordinary usefulness as a communication medium made the number of applications and technological resources explode. However, that network expansion comes at the cost of significant power consumption. If the power consumption of telecommunication networks and data centers were considered as that of a country, it would rank 5th in the world. Furthermore, the number of servers in the world is expected to grow by a factor of 10 between 2013 and 2020. This context motivates us to study techniques and methods to allocate cloud computing resources in an optimal way with respect to cost, quality of service (QoS), power consumption, and environmental impact. The results we obtained from our test cases show that besides minimizing capital expenditures (CAPEX) and operational expenditures (OPEX), the response time can be reduced by up to 6 times, power consumption by 30%, and CO2 emissions by a factor of 60. Cloud computing provides dynamic access to IT resources as a service. In this paradigm, programs are executed on servers connected to the Internet that users access from their computers and mobile devices. The first advantage of this architecture is reduced application deployment time and improved interoperability, because a new user only needs a web browser and does not need to install software on local computers with specific operating systems. Second, applications and information are available from everywhere and from any device with Internet access. Also, servers and IT resources can be dynamically allocated depending on the number of users and the workload, a feature called elasticity. This thesis studies the resource management of cloud computing networks and is divided into three main stages. We start by analyzing the planning of cloud computing networks to get a comprehensive view. The first question to be solved is what the optimal data center locations are. We found that the location of each data center has a big impact on cost, QoS, power consumption, and greenhouse gas emissions. An optimization problem with a multi-criteria objective function is proposed to decide jointly the optimal location of data centers and software components, link capacities, and information routing. Once the network planning has been analyzed, the problem of dynamic resource provisioning in real time is addressed. In this context, virtualization is a key technique in cloud computing because each server can be shared by multiple Virtual Machines (VMs) and the total power consumption can be reduced. Along the same line of location problems, we propose a Green Cloud Broker that optimizes VM placement across multiple data centers. In fact, when multiple data centers are considered, response time can be reduced by placing VMs close to users, cost can be minimized, power consumption can be optimized by using energy-efficient data centers, and CO2 emissions can be decreased by choosing data centers supplied with renewable energy sources. The third stage of the analysis is the short-term management of a cloud data center. In particular, a method is proposed to assign VMs to servers by considering the communication traffic among VMs. Cloud data centers receive new applications over time and these applications need on-demand resource provisioning.
Each application is composed of multiple types of VMs that interact among themselves. A program called the scheduler must place each new VM on a server, and this placement impacts QoS and power consumption. Our method places VMs that communicate among themselves on servers that are close to each other in the network topology, thus reducing communication delay and increasing the throughput available among VMs. Furthermore, the power consumption of each type of server is considered and the most efficient ones are chosen to host the VMs. The number of VMs of each application can be dynamically changed to match the workload, and servers not needed in a particular period can be suspended to save energy. The methodology developed is based on Mixed Integer Programming (MIP) models to formalize the problems and uses state-of-the-art optimization solvers. Heuristics are then developed to solve cases with more than 1,000 potential data center locations for the planning problem, 1,000 nodes for the cloud broker, and 128,000 servers for the VM placement problem. Solutions with very small optimality gaps, between 0% and 1.95%, are obtained, with execution times on the order of minutes for the planning problem and less than a second for real-time cases. We consider that this thesis on resource provisioning of cloud computing networks makes important contributions to this research area, and innovative commercial applications based on the proposed methods have a promising future.
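A greatly simplified, greedy version of the communication-aware VM placement idea is sketched below (the thesis formalises it as a MIP and solves much larger instances); traffic volumes, VM sizes and server capacities are made-up illustrative values.

```python
# Greedy sketch of communication-aware VM placement (illustrative only; the
# thesis formalises this as a MIP). VM pairs with the heaviest mutual traffic
# are co-located first, subject to server CPU capacity.
traffic = {("web", "app"): 90, ("app", "db"): 60, ("web", "db"): 10}   # Mbps
vm_size = {"web": 2, "app": 4, "db": 4}                                # vCPUs
capacity = {"s1": 8, "s2": 8}                                          # free vCPUs

placement = {}
for (a, b), _ in sorted(traffic.items(), key=lambda kv: -kv[1]):
    for vm, partner in ((a, b), (b, a)):
        if vm in placement:
            continue
        # prefer the server already hosting the partner VM, else the emptiest one
        candidates = ([placement[partner]] if partner in placement else []) + \
                     sorted(capacity, key=lambda s: -capacity[s])
        for srv in candidates:
            if capacity[srv] >= vm_size[vm]:
                placement[vm] = srv
                capacity[srv] -= vm_size[vm]
                break
print(placement)   # e.g. {'web': 's1', 'app': 's1', 'db': 's2'}
```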
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-17
... District, Roughly bounded by S. Philpot St., East Ave, E. Ware & Park Sts., Cedartown, 11000776 IDAHO Ada... following resource: MAINE Androscoggin County Cowan Mill Island Mill St., Lewiston, 85001656 [FR Doc. 2011...
The use of tags and tag clouds to discern credible content in online health message forums.
O'Grady, Laura; Wathen, C Nadine; Charnaw-Burger, Jill; Betel, Lisa; Shachak, Aviv; Luke, Robert; Hockema, Stephen; Jadad, Alejandro R
2012-01-01
Web sites with health-oriented content are potentially harmful if inaccurate or inappropriate medical information is used to make health-related decisions. Checklists, rating systems and guidelines have been developed to help people determine what is credible, but recent Internet technologies emphasize applications that are collaborative in nature, including tags and tag clouds, where site users 'tag' or label online content, each using their own labelling system. Concepts such as the date, reference, author, testimonial and quotations are considered predictors of credible content. An understanding of these descriptive tools, how they relate to the depiction of credibility and how this relates to overall efforts to label data in relation to the semantic web has yet to emerge. This study investigates how structured (pre-determined) and unstructured (user-generated) tags and tag clouds with a multiple word search feature are used by participants to assess credibility of messages posted in online message forums. The targeted respondents were those using web sites message forums for disease self-management. We also explored the relevancy of our findings to the labelling or indexing of data in the context of the semantic web. Diabetes was chosen as the content area in this study, since (a) this is a condition with increasing prevalence and (b) diabetics have been shown to actively use the Internet to manage their condition. From January to March 2010 participants were recruited using purposive sampling techniques. A screening instrument was used to determine eligibility. The study consisted of a demographic and computer usage survey, a series of usability tests and an interview. We tested participants (N=22) on two scenarios, each involving tasks that assessed their ability to tag content and search using a tag cloud that included six structured credibility terms (statistics, date, reference, author, testimonial and quotations). MORAE Usability software (version 3.1) was employed to record participants' use of the study environment. The surveys were analyzed using SPSS version 17. Interviews with participants were transcribed, coded and analyzed using thematic text analysis with the aid of NVivo8. Most participants had experience with Internet resources. However, less than one quarter of this sample had seen or used tags or a tag clouds. The ways in which participants used tags to label the content posted in the message forums varied. Some participants were tagging the information for their own subsequent use, whereas others viewed this process from the perspective of others: they tagged the content in ways that they thought other users would find beneficial. Many participants did not use the structured credibility tags when asked to search for credible content. The interviews corroborated these findings by confirming participants were not considering credibility foremost when tagging. Many participants in this study focused on assessing whether the information was relevant to their current circumstances, after which they would proceed to determine its credibility by corroborating with other sources. The use of structured tags to label information may not be a useful way to encourage the use of tagging, or to indicate credibility in this context. Current applications used in the semantic web automate this process. Therefore it may be useful to engage consumers of online content, in particular health-related content, to be more directly involved in the annotation of this content. 
Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Veenendaal, B.; Brovelli, M. A.; Li, S.; Ivánová, I.
2017-09-01
Although maps have been around for a very long time, web maps are still very young. Despite their relatively short history, web maps have developed very rapidly over the past few decades. The use, users and usability of web maps have rapidly expanded along with developments in web technologies and new ways of mapping. In the process of these developments, the terms and terminology surrounding web mapping have also changed and evolved, often relating to the new technologies or new uses. Examples include web mapping, web GIS, cloud mapping, internet mapping, internet GIS, geoweb, map mashup, online mapping etc., not to mention those with prefixes such as "web-based" and "internet-based". So, how do we keep track of these terms, relate them to each other and have a common understanding of their meanings so that references to them are not ambiguous, misunderstood or interpreted differently? This paper explores the terms surrounding web mapping and web GIS, and the development of their meaning over time. The paper then suggests the current context in which these terms are used and provides meanings that may assist in better understanding and communicating using these terms in the future.
NASA Astrophysics Data System (ADS)
Martin, C.; Dye, M. J.; Daniels, M. D.; Keiser, K.; Maskey, M.; Graves, S. J.; Kerkez, B.; Chandrasekar, V.; Vernon, F.
2015-12-01
The Cloud-Hosted Real-time Data Services for the Geosciences (CHORDS) project tackles the challenges of collecting and disseminating geophysical observational data in real-time, especially for researchers with limited IT budgets and expertise. The CHORDS Portal is a component that allows research teams to easily configure and operate a cloud-based service which can receive data from dispersed instruments, manage a rolling archive of the observations, and serve these data to any client on the Internet. The research group (user) creates a CHORDS portal simply by running a prepackaged "CHORDS appliance" on Amazon Web Services. The user has complete ownership and management of the portal. Computing expenses are typically very small. RESTful protocols are employed for delivering and fetching data from the portal, which means that any system capable of sending an HTTP GET message is capable of accessing the portal. A simple API is defined, making it straightforward for non-experts to integrate a diverse collection of field instruments. Languages with network access libraries, such as Python, sh, Matlab, R, IDL, Ruby and JavaScript (and most others) can retrieve structured data from the portal with just a few lines of code. The user's private portal provides a browser-based system for configuring, managing and monitoring the health of the integrated real-time system. This talk will highlight the design goals, architecture and agile development of the CHORDS Portal. A running portal, with operational data feeds from across the country, will be presented.
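Because the portal accepts plain HTTP GETs, pushing and retrieving observations takes only a few lines in any language with a network library; the sketch below uses Python requests with a hypothetical portal address and hypothetical query parameters rather than the actual CHORDS API.

```python
# Hedged sketch of pushing and fetching a measurement with plain HTTP GETs, as
# the abstract describes. The host name, paths and query parameters below are
# hypothetical placeholders, not the actual CHORDS API specification.
import requests

PORTAL = "http://my-chords-portal.example.org"    # hypothetical portal address

# push one observation from a field instrument
resp = requests.get(
    f"{PORTAL}/measurements/create",
    params={"instrument_id": 7, "temperature": 21.4, "key": "SECRET"},
    timeout=10,
)
resp.raise_for_status()

# fetch the rolling archive for the same instrument as structured data
data = requests.get(f"{PORTAL}/instruments/7.json", timeout=10).json()
print(len(data.get("measurements", [])), "records retrieved")
```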
NASA Astrophysics Data System (ADS)
Wei, Wang; Chongchao, Pan; Yikai, Liang; Gang, Li
2017-11-01
With the rapid development of information technology, the scale of data centers is increasing quickly, and the energy consumption of computer rooms is also rising rapidly; air conditioning and cooling account for a large proportion of this consumption. How to apply new technology to reduce the energy consumption of the computer room has become an important topic in current energy-saving research. This paper studies Internet of Things technology and designs a green computer room environmental monitoring system. In the system, real-time environmental data are obtained through wireless sensor network technology and displayed with a three-dimensional visualization. The environment monitor provides a computer room assets view, a temperature cloud view, a humidity cloud view, a microenvironment view and so on. According to the condition of the microenvironment, the air volume, temperature and humidity parameters of the air conditioning can be adjusted for individual equipment cabinets to achieve precise cooling. This reduces the energy consumption of the air conditioning and, as a result, greatly reduces the overall energy consumption of the green computer room. We applied this project in the computer center of Weihai, and after a year of testing and operation, we found that it achieved a good energy-saving effect, which fully verifies the effectiveness of this project for computer room energy conservation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rudkevich, Aleksandr; Goldis, Evgeniy
This research conducted by the Newton Energy Group, LLC (NEG) is dedicated to the development of pCloud: a Cloud-based Power Market Simulation Environment. pCloud offers power industry stakeholders the capability to model electricity markets and is organized around the Software as a Service (SaaS) concept -- a software application delivery model in which software is centrally hosted and provided to many users via the internet. During Phase I of this project, NEG developed a prototype design for pCloud as a SaaS-based commercial service offering and a system architecture supporting that design, ensured the feasibility of the architecture's key elements, formed technological partnerships and negotiated commercial agreements with partners, conducted market research and other related activities, and secured funding to continue development of pCloud between the end of Phase I and the beginning of Phase II, if awarded. Based on the results of Phase I activities, NEG has established that the development of a cloud-based power market simulation environment within the Windows Azure platform is technologically feasible and can be accomplished within the budget and timeframe available through the Phase II SBIR award with additional external funding. NEG believes that pCloud has the potential to become a game-changing technology for the modeling and analysis of electricity markets. This potential is due to the following critical advantages of pCloud over its competition: - Standardized access to advanced and proven power market simulators offered by third parties. - Automated parallelization of simulations and dynamic provisioning of computing resources on the cloud. This combination of automation and scalability dramatically reduces turn-around time while offering the capability to increase the number of analyzed scenarios by a factor of 10, 100 or even 1000. - Access to ready-to-use data and to cloud-based resources, leading to a reduction in software, hardware, and IT costs. - Competitive pricing structure, which will make high-volume usage of simulation services affordable. - Availability and affordability of high quality power simulators, which presently only large corporate clients can afford, will level the playing field in developing regional energy policies, determining prudent cost recovery mechanisms and assuring just and reasonable rates to consumers. - Users that presently do not have the resources to internally maintain modeling capabilities will now be able to run simulations. This will invite more players into the industry, ultimately leading to more transparent and liquid power markets.
Maestro: an orchestration framework for large-scale WSN simulations.
Riliskis, Laurynas; Osipov, Evgeny
2014-03-18
Contemporary wireless sensor networks (WSNs) have evolved into large and complex systems and are one of the main technologies used in cyber-physical systems and the Internet of Things. Extensive research on WSNs has led to the development of diverse solutions at all levels of software architecture, including protocol stacks for communications. This multitude of solutions is due to the limited computational power and restrictions on energy consumption that must be accounted for when designing typical WSN systems. It is therefore challenging to develop, test and validate even small WSN applications, and this process can easily consume significant resources. Simulations are inexpensive tools for testing, verifying and generally experimenting with new technologies in a repeatable fashion. Consequently, as the size of the systems to be tested increases, so does the need for large-scale simulations. This article describes a tool called Maestro for the automation of large-scale simulations and investigates the feasibility of using cloud computing facilities for such a task. Using tools that are built into Maestro, we demonstrate a feasible approach for benchmarking cloud infrastructure in order to identify cloud Virtual Machine (VM) instances that provide an optimal balance of performance and cost for a given simulation.
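The benchmarking outcome described, identifying the instance type with the best balance of performance and cost, reduces to a simple comparison once runtimes and prices are measured; the sketch below uses made-up values for illustration.

```python
# Toy sketch of the benchmarking outcome described in the abstract: pick the VM
# instance type with the lowest cost per simulation run.
# Instance names, runtimes and prices below are made-up illustrative values.
benchmarks = {
    # instance: (benchmark runtime in seconds, price in $/hour)
    "small":  (5200, 0.05),
    "medium": (2600, 0.12),
    "large":  (1300, 0.27),
}

def cost_of_run(runtime_s: float, price_per_hour: float) -> float:
    return runtime_s / 3600.0 * price_per_hour

best = min(benchmarks, key=lambda k: cost_of_run(*benchmarks[k]))
for name, (rt, price) in benchmarks.items():
    print(f"{name:>6}: {rt:5d} s, ${cost_of_run(rt, price):.4f} per simulation")
print("cheapest per simulation:", best)
```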
Cost-aware request routing in multi-geography cloud data centres using software-defined networking
NASA Astrophysics Data System (ADS)
Yuan, Haitao; Bi, Jing; Li, Bo Hu; Tan, Wei
2017-03-01
Current geographically distributed cloud data centres (CDCs) incur enormous energy and bandwidth costs to provide multiple cloud applications to users around the world. Previous studies focus only on energy cost minimisation in distributed CDCs. However, a CDC provider needs to deliver huge volumes of data between users and distributed CDCs through internet service providers (ISPs). The geographical diversity of bandwidth and energy costs raises the highly challenging problem of how to minimise the total cost of a CDC provider. With recently emerging software-defined networking, we study the total cost minimisation problem for a CDC provider by exploiting the geographical diversity of energy and bandwidth costs. We formulate the total cost minimisation problem as a mixed integer non-linear program (MINLP). We then develop heuristic algorithms to solve the problem and to provide cost-aware request routing for joint optimisation of the selection of ISPs and the number of servers in distributed CDCs. In addition, to tackle the dynamic workload in distributed CDCs, this article proposes a regression-based workload prediction method to obtain future incoming workload. Finally, this work evaluates the cost-aware request routing by trace-driven simulation and compares it with existing approaches to demonstrate its effectiveness.
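The regression-based workload prediction step can be illustrated with a minimal autoregressive model fitted by least squares on a sliding window of past request rates; the window length and synthetic trace below are illustrative assumptions, not the paper's configuration.

```python
# Minimal sketch of regression-based workload prediction: fit a linear model on
# a sliding window of past request rates to forecast the next interval.
import numpy as np

def fit_predict(history, window=6):
    X = np.array([history[i:i + window] for i in range(len(history) - window)])
    y = np.array(history[window:])
    coef, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)
    last = np.r_[history[-window:], 1.0]      # most recent window plus intercept
    return float(last @ coef)

trace = [120, 135, 150, 170, 165, 180, 200, 210, 205, 220, 240, 250]  # req/s
print("predicted next-interval workload:", round(fit_predict(trace), 1))
```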
GPU-based cloud service for Smith-Waterman algorithm using frequency distance filtration scheme.
Lee, Sheng-Ta; Lin, Chun-Yuan; Hung, Che Lun
2013-01-01
As the conventional means of analyzing the similarity between a query sequence and database sequences, the Smith-Waterman algorithm is feasible for database searches owing to its high sensitivity. However, the algorithm is still quite time consuming. CUDA programming can improve computational efficiency by using the computational power of massively parallel hardware such as graphics processing units (GPUs). This work presents a novel Smith-Waterman implementation with a frequency-based filtration method on GPUs, rather than merely accelerating the comparisons while still expending computational resources on unnecessary ones. A user-friendly interface is also designed for potential cloud server applications with GPUs. Additionally, two data sets, H1N1 protein sequences (query sequence set) and a human protein database (database set), are selected, followed by a comparison of CUDA-SW and CUDA-SW with the filtration method, referred to herein as CUDA-SWf. Experimental results indicate that eliminating unnecessary sequence alignments can reduce the computational time by up to 41%. Importantly, by using CUDA-SWf as a cloud service, this application can be accessed from any computing environment of a device with an Internet connection without time constraints.
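The filtration idea, skipping full alignments for sequences whose residue composition is far from the query's, can be sketched on the CPU in a few lines (the paper's CUDA-SWf performs this on GPUs); the distance measure and threshold below are assumptions.

```python
# CPU sketch of a frequency-distance pre-filter for Smith-Waterman (illustrative;
# the paper's CUDA-SWf runs this on GPUs). Sequences whose residue-composition
# distance from the query exceeds a threshold are skipped before full alignment.
from collections import Counter

AMINO = "ACDEFGHIKLMNPQRSTVWY"

def freq_vector(seq: str):
    counts = Counter(seq)
    total = max(len(seq), 1)
    return [counts.get(a, 0) / total for a in AMINO]

def freq_distance(a: str, b: str) -> float:
    return sum(abs(x - y) for x, y in zip(freq_vector(a), freq_vector(b)))

def filtered_candidates(query: str, database: list, threshold: float = 0.5):
    # only sequences passing the cheap filter proceed to full Smith-Waterman
    return [s for s in database if freq_distance(query, s) <= threshold]

db = ["MKTAYIAKQR", "GGGGGGGGGG", "MKTAYIAKQL"]
print(filtered_candidates("MKTAYIAKQR", db))   # the glycine run is filtered out
```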
Shi, Xiaobo; Li, Wei; Song, Jeungeun; Hossain, M Shamim; Mizanur Rahman, Sk Md; Alelaiwi, Abdulhameed
2016-10-01
With the development of the IoT (Internet of Things), big data analysis and cloud computing, traditional medical information systems are integrating with these new technologies, and the establishment of cloud-based smart healthcare applications is receiving more and more attention. In this paper, semi-physical simulation technology is applied to a cloud-based smart healthcare system. The body sensor network (BSN) of the system has two modes of data collection and transmission: one uses a practical BSN to collect data and transmit it to the data center; the other transmits real medical data to a practical data center by simulating the BSN. In order to transmit real medical data to a practical data center through a simulated BSN in a semi-physical simulation environment, this paper designs an OPNET packet structure, defines a gateway node model between the simulated BSN and the practical data center, and builds a custom protocol stack. Moreover, this paper conducts extensive simulations of real data transmission through a simulated network connected to a practical network. The simulation results provide a reference for the parameter settings of a fully practical network and reduce the cost of the devices and personnel involved.
An Intelligent Cloud Storage Gateway for Medical Imaging.
Viana-Ferreira, Carlos; Guerra, António; Silva, João F; Matos, Sérgio; Costa, Carlos
2017-09-01
Historically, medical imaging repositories have been supported by indoor infrastructures. However, the amount of diagnostic imaging procedures has continuously increased over the last decades, imposing several challenges associated with the storage volume, data redundancy and availability. Cloud platforms are focused on delivering hardware and software services over the Internet, becoming an appealing solution for repository outsourcing. Although this option may bring financial and technological benefits, it also presents new challenges. In medical imaging scenarios, communication latency is a critical issue that still hinders the adoption of this paradigm. This paper proposes an intelligent Cloud storage gateway that optimizes data access times. This is achieved through a new cache architecture that combines static rules and pattern recognition for eviction and prefetching. The evaluation results, obtained from experiments over a real-world dataset, show that cache hit ratios can reach around 80%, leading to reductions of image retrieval times by over 60%. The combined use of eviction and prefetching policies proposed can significantly reduce communication latency, even when using a small cache in comparison to the total size of the repository. Apart from the performance gains, the proposed system is capable of adjusting to specific workflows of different institutions.
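A minimal illustration of combining an eviction policy with prefetching is given below: an LRU cache plus a naive "next image in the same study" prefetch rule. The real gateway combines static rules with pattern recognition, so this is only a sketch, and all identifiers are hypothetical.

```python
# Minimal sketch of an LRU cache with a naive prefetch rule ("next image in the
# same study"), standing in for the paper's combined eviction/prefetch policies.
# The fetch_from_cloud function and study/image identifiers are hypothetical.
from collections import OrderedDict

def fetch_from_cloud(key):                 # placeholder for a slow cloud request
    return f"pixels-of-{key}"

class GatewayCache:
    def __init__(self, capacity=3):
        self.capacity, self.store = capacity, OrderedDict()

    def _put(self, key):
        if key not in self.store:
            self.store[key] = fetch_from_cloud(key)
            if len(self.store) > self.capacity:
                self.store.popitem(last=False)     # evict least recently used
        self.store.move_to_end(key)

    def get(self, study, image_no):
        key = (study, image_no)
        hit = key in self.store
        self._put(key)
        self._put((study, image_no + 1))           # prefetch the likely next image
        return self.store[key], hit

cache = GatewayCache()
for n in (1, 2, 3):
    print(cache.get("CT-chest-001", n)[1])        # False, True, True: prefetch hits
```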
A Survey of Fuel and Energy Information Sources: Volume II
Real-Time High-Dynamic Range Texture Mapping
2001-01-01
the renderings produced by radiosity and global illumination algorithms. As a particular example, Greg Ward's RADIANCE synthetic imaging system [32...in software only. [26] presented a technique for performing Ward's tone reproduction algorithm interactively to visualize radiosity solutions
The development of the time-keeping clock with TS-1 single chip microcomputer.
NASA Astrophysics Data System (ADS)
Zhou, Jiguang; Li, Yongan
The authors have developed a time-keeping clock based on the Intel 8751 single-chip microcomputer that has been successfully used in a time-keeping station. The hardware and software design and the performance of the clock are introduced.
Image-Based Airborne LiDAR Point Cloud Encoding for 3d Building Model Retrieval
NASA Astrophysics Data System (ADS)
Chen, Yi-Chen; Lin, Chao-Hung
2016-06-01
With the development of Web 2.0 and cyber city modeling, an increasing number of 3D models have become available on web-based model-sharing platforms, supporting applications such as navigation, urban planning, and virtual reality. Based on the concept of data reuse, a 3D model retrieval system is proposed to retrieve building models similar to a user-specified query. The basic idea behind this system is to reuse existing 3D building models instead of reconstructing them from point clouds. To retrieve models efficiently, the models in databases are generally encoded compactly by using a shape descriptor. However, most of the geometric descriptors in related works are applied to polygonal models. In this study, the input query of the model retrieval system is a point cloud acquired by Light Detection and Ranging (LiDAR) systems because of their efficient scene scanning and spatial information collection. Using point clouds with sparse, noisy, and incomplete sampling as input queries is more difficult than using 3D models. Because the building roof is more informative than other parts of an airborne LiDAR point cloud, an image-based approach is proposed to encode both point clouds from input queries and 3D models in databases. The main goal of the data encoding is that the models in the database and the input point clouds can be encoded consistently. Firstly, top-view depth images of buildings are generated to represent the geometric surface of a building roof. Secondly, geometric features are extracted from the depth images based on the height, edges and planes of the building. Finally, descriptors are extracted using spatial histograms and used in the 3D model retrieval system. For data retrieval, the models are retrieved by matching the encoding coefficients of point clouds and building models. In experiments, a database including about 900,000 3D models collected from the Internet is used for the evaluation of data retrieval. The results of the proposed method show a clear superiority over related methods.
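The first encoding step can be sketched with numpy: rasterise a roof point cloud into a top-view height image and summarise it with a normalised height histogram. Grid resolution and bin count below are arbitrary assumptions, and the full method also extracts edge and plane features.

```python
# Sketch of the first encoding step under assumptions: rasterise a roof point
# cloud into a top-view height image and summarise it with a height histogram.
import numpy as np

def topview_depth_image(points: np.ndarray, grid=32):
    """points: (N, 3) array of x, y, z; returns a grid x grid max-height image."""
    xy = points[:, :2]
    mins, maxs = xy.min(axis=0), xy.max(axis=0)
    idx = ((xy - mins) / (maxs - mins + 1e-9) * (grid - 1)).astype(int)
    img = np.zeros((grid, grid))
    for (i, j), z in zip(idx, points[:, 2]):
        img[j, i] = max(img[j, i], z)           # keep the highest return per cell
    return img

def height_descriptor(img: np.ndarray, bins=16):
    hist, _ = np.histogram(img[img > 0], bins=bins, range=(0, img.max() + 1e-9))
    return hist / max(hist.sum(), 1)            # normalised spatial histogram

pts = np.random.default_rng(1).uniform([0, 0, 3], [10, 8, 9], size=(5000, 3))
print(height_descriptor(topview_depth_image(pts)))
```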
1993-06-08
size, with some of its industrial wares entering international markets. Overall, however, because of the shortage of capital and some irrational ...time. Japan’s exuberance makes it a formidable adversary of the United States and the USSR in the scientific and high technology fields. The grim
Joseph Nowak, a resident of Ware, Mass., and Chief Operator of the Upper Blackstone Water Pollution Abatement District (District) in Millbury, Mass., was honored by EPA with a 2016 Regional Wastewater Treatment Plant Operator of the Year Excellence Award.
ERIC Educational Resources Information Center
Germain, Carol Anne, Ed.
2009-01-01
Many professional library associations and affiliates "strongly encourage" the marketing of libraries and their wares. These organizations present awards, certificates, and monetary enticements to honor outstanding marketing programs. For most individuals or teams who are already working on and implementing marketing programs, the added effort of…
Banker, John G.; Holcombe, Jr., Cressie E.
1977-01-01
A method of limiting carbon contamination from graphite ware used in induction melting of uranium alloys is provided comprising coating the graphite surface with a suspension of Y.sub.2 O.sub.3 particles in water containing about 1.5 to 4% by weight sodium carboxymethylcellulose.
Banker, J.G.; Holcombe, C.E. Jr.
1975-11-06
A method of limiting carbon contamination from graphite ware used in induction melting of uranium alloys is provided. The graphite surface is coated with a suspension of Y/sub 2/O/sub 3/ particles in water containing about 1.5 to 4 percent by weight sodium carboxymethylcellulose.
Marine Debris and Plastic Source Reduction Toolkit
Many plastic food service ware items originate on college and university campuses—in cafeterias, snack rooms, cafés, and eateries with take-out dining options. This Campus Toolkit is a detailed “how to” guide for reducing plastic waste on college campuses.
Categorical regression dose-response modeling
The goal of this training is to instruct participants in the use of the U.S. EPA's Categorical Regression software (CatReg) and its application to risk assessment. Categorical regression fits mathematical models to toxicity data that have been assigned ord...
Featured Molecules: Sucrose and Vanillin
NASA Astrophysics Data System (ADS)
Coleman, William F.; Wildman, Randall J.
2003-04-01
The WebWare molecules of the month for April relate to the sense of taste. Apple Fool, the JCE Classroom Activity, mentions sucrose and vanillin and their use as flavorings. Fully manipulable (Chime) versions of these and other molecules are available at Only@JCE Online.
The big data processing platform for intelligent agriculture
NASA Astrophysics Data System (ADS)
Huang, Jintao; Zhang, Lichen
2017-08-01
Big data technology is another popular technology after the Internet of Things and cloud computing. Big data is widely used in many fields, such as social platforms, e-commerce, and financial analysis. Intelligent agriculture produces large amounts of data with complex structure in the course of its operation, and fully mining the value of these data will be very meaningful for the development of agriculture. This paper proposes an intelligent data processing platform based on Storm and Cassandra to realize the storage and management of big data for intelligent agriculture.
The Technological Growth in eHealth Services.
Srivastava, Shilpa; Pant, Millie; Abraham, Ajith; Agrawal, Namrata
2015-01-01
The infusion of information communication technology (ICT) into health services is emerging as an active area of research. It has several advantages, but perhaps the most important one is providing medical benefits to one and all, irrespective of geographic boundaries, in a cost-effective and timely manner, with global expertise and holistic services. This paper provides a systematic review of technological growth in eHealth services. The present study reviews and analyzes the role of four important technologies, namely, satellite, internet, mobile, and cloud, for providing health services.
Lazzari, Marisa; Pereyra Domingorena, Lucas; Stoner, Wesley D; Scattolin, María Cristina; Korstanje, María Alejandra; Glascock, Michael D
2017-05-16
The circulation and exchange of goods and resources at various scales have long been considered central to the understanding of complex societies, and the Andes have provided a fertile ground for investigating this process. However, long-standing archaeological emphasis on typological analysis, although helpful to hypothesize the direction of contacts, has left important aspects of ancient exchange open to speculation. To improve understanding of ancient exchange practices and their potential role in structuring alliances, we examine material exchanges in northwest Argentina (part of the south-central Andes) during 400 BC to AD 1000 (part of the regional Formative Period), with a multianalytical approach (petrography, instrumental neutron activation analysis, laser ablation inductively coupled plasma mass spectrometry) to artifacts previously studied separately. We assess the standard centralized model of interaction vs. a decentralized model through the largest provenance database available to date in the region. The results show: ( i ) intervalley heterogeneity of clays and fabrics for ordinary wares; ( ii ) intervalley homogeneity of clays and fabrics for a wide range of decorated wares (e.g., painted Ciénaga); ( iii ) selective circulation of two distinct polychrome wares (Vaquerías and Condorhuasi); ( iv ) generalized access to obsidian from one major source and various minor sources; and ( v ) selective circulation of volcanic rock tools from a single source. These trends reflect the multiple and conflicting demands experienced by people in small-scale societies, which may be difficult to capitalize by aspiring elites. The study undermines centralized narratives of exchange for this period, offering a new platform for understanding ancient exchange based on actual material transfers, both in the Andes and beyond.
Pereyra Domingorena, Lucas; Stoner, Wesley D.; Scattolin, María Cristina; Korstanje, María Alejandra; Glascock, Michael D.
2017-01-01
The circulation and exchange of goods and resources at various scales have long been considered central to the understanding of complex societies, and the Andes have provided a fertile ground for investigating this process. However, long-standing archaeological emphasis on typological analysis, although helpful to hypothesize the direction of contacts, has left important aspects of ancient exchange open to speculation. To improve understanding of ancient exchange practices and their potential role in structuring alliances, we examine material exchanges in northwest Argentina (part of the south-central Andes) during 400 BC to AD 1000 (part of the regional Formative Period), with a multianalytical approach (petrography, instrumental neutron activation analysis, laser ablation inductively coupled plasma mass spectrometry) to artifacts previously studied separately. We assess the standard centralized model of interaction vs. a decentralized model through the largest provenance database available to date in the region. The results show: (i) intervalley heterogeneity of clays and fabrics for ordinary wares; (ii) intervalley homogeneity of clays and fabrics for a wide range of decorated wares (e.g., painted Ciénaga); (iii) selective circulation of two distinct polychrome wares (Vaquerías and Condorhuasi); (iv) generalized access to obsidian from one major source and various minor sources; and (v) selective circulation of volcanic rock tools from a single source. These trends reflect the multiple and conflicting demands experienced by people in small-scale societies, which may be difficult to capitalize by aspiring elites. The study undermines centralized narratives of exchange for this period, offering a new platform for understanding ancient exchange based on actual material transfers, both in the Andes and beyond. PMID:28461485
MERIS albedo climatology and its effect on the FRESCO+ O2 A-band cloud retrieval from SCIAMACHY data
NASA Astrophysics Data System (ADS)
Popp, Christoph; Wang, Ping; Brunner, Dominik; Stammes, Piet; Zhou, Yipin
2010-05-01
Accurate cloud information is an important prerequisite for the retrieval of atmospheric trace gases from spaceborne UV/VIS sensors. Errors in the estimated cloud fraction and cloud height (pressure) result in an erroneous air mass factor and thus can lead to inaccuracies in the vertical column densities of the retrieved trace gas. In ESA's TEMIS (Tropospheric Emission Monitoring Internet Service) project, the FRESCO+ (Fast Retrieval Scheme for Clouds from the Oxygen A-band) cloud retrieval is applied to, amongst others, SCIAMACHY (SCanning Imaging Absorption SpectroMeter for Atmospheric CartograpHY) data to determine these quantities. Effective cloud fraction and pressure are inverted by (i) radiative transfer simulations of top-of-atmosphere reflectance based on O2 absorption, single Rayleigh scattering, surface and cloud albedo in three spectral windows covering the O2 A-band and (ii) a subsequent fitting of the simulated to the measured spectrum. However, FRESCO+ relies on a relatively coarse resolution surface albedo climatology (1° x 1°) compiled from GOME (Global Ozone Monitoring Experiment) measurements in the 1990s, which introduces several artifacts, e.g. an overestimation of cloud fraction at coastlines or over some mountainous regions. Therefore, we test the substitution of the GOME climatology with a new land surface albedo climatology compiled for every month from MEdium Resolution Imaging Spectrometer (MERIS) Albedomap data (0.05° x 0.05°) covering the period January 2003 to October 2006. The MERIS channels at 754 nm and 775 nm are located spectrally close to the corresponding GOME channels (758 nm and 772 nm) on both sides of the O2 A-band. Further, the increased spatial resolution of the MERIS product makes it possible to better account for SCIAMACHY's pixel size of approximately 30 x 60 km. The aim of this study is to describe and assess (i) the compilation and quality of the MERIS climatology, (ii) the differences to the GOME climatology, and (iii) possible enhancements of the SCIAMACHY cloud retrieval after integrating the MERIS climatology into FRESCO+. First results indicate that in areas where FRESCO+ is overestimating cloud fraction using the GOME climatology, MERIS generally reveals higher albedo values, which in turn will lead to lower cloud fractions, e.g. at coastlines and some arid or mountainous areas. The differences between the two data sets are also higher in winter than in summer. It can therefore be expected that the new database with increased spatial resolution improves SCIAMACHY cloud retrieval with FRESCO+. The most limiting factors for the compilation of the MERIS climatology can be attributed to inappropriate snow cover masking and occasionally unfavorable illumination conditions in high northern latitudes during winter.
ACR Imaging IT Reference Guide: Image Sharing: Evolving Solutions in the Age of Interoperability
Erickson, Bradley J.; Choy, Garry
2014-01-01
Interoperability is a major focus of the quickly evolving world of Health Information Technology. Easy, yet secure and confidential, exchange of imaging exams and the associated reports must be a part of the solutions that are implemented. The availability of historical exams is essential in providing a quality interpretation and reducing inappropriate utilization of imaging services. Today, exchange of imaging exams is most often achieved via a CD. We describe the virtues of this solution as well as challenges that have surfaced. Internet- and cloud-based technologies employed for many consumer services can provide a better solution. Vendors are making these solutions available. Standards for internet-based exchange are emerging. Just as Radiology converged on DICOM as a standard to store and view images, we need a common exchange standard. We will review the existing standards and how they are organized into useful workflows through Integrating the Healthcare Enterprise (IHE) profiles. IHE and standards development processes are discussed. Healthcare and the domain of Radiology must stay current with quickly evolving internet standards. The successful use of the "cloud" will depend upon both the technologies we discuss and the policies put into place around these technologies. We discuss both aspects. The Radiology community must lead the way and provide a solution that works for radiologists and clinicians in the Electronic Medical Record (EMR). Lastly, we describe the features we believe radiologists should weigh when considering the addition of internet-based exchange solutions to their practice. PMID:25467903
Examining the Role of Metadata in Testing IED Detection Systems
2009-09-01
energy management, crop assessment, weather forecasting, disaster alerts, and endangered species assessment [Biagioni and Bridges 2002; Mainwaring et...accessed June 10, 2008). Biagioni, E. and K. Bridges. 2002. The application of remote sensor technology to assist the recovery of rare endangered species
Long-Term Storage Studies on Dehydrated Ration Items and Food Packets
1976-06-01
and onions, bacon, corn, fruit salad, steamed fruitcake, chocolate nut bread, plums, rolled oats, raspberry and strawberry jams, and one brand of... strawberry jam, biscuit spread, and canned peaches were unacceptable; scores for chocolate bars, beef and spaghetti, chicken soup, canned plums
NASA Astrophysics Data System (ADS)
Chen, He; Yang, Yueguang; Su, Guolei; Wang, Xiaoqing; Zhang, Hourong; Sun, Xiaoyu; Fan, Youping
2017-09-01
Electrocorrosion of insulator hardware caused by direct current transmission has become increasingly serious due to the wide deployment of extra-high-voltage direct current transmission projects in China. Steel foot corrosion is the main form of corrosion for insulators on the positive-polarity side of transmission lines. On one hand, the corrosion reduces the steel foot diameter, directly affecting the mechanical properties of insulators; on the other hand, when corrosion occurs on the part of the steel foot wrapped in porcelain, the volume of the corrosion product is at least 50% greater than that of the original steel foot, leading to bursting of the porcelain and threatening safe operation of transmission lines. Therefore, it is necessary to study this phenomenon and to propose feasible measures for corrosion inhibition. Starting from the corrosion mechanism, this article proposes two corrosion-inhibition measures and verifies their effect under laboratory conditions, providing a reference for engineering application.
Design and Implementation of Cloud-Centric Configuration Repository for DIY IoT Applications
Ahmad, Shabir; Kim, Do Hyeun
2018-01-01
The Do-It-Yourself (DIY) vision for the design of a smart and customizable IoT application demands the involvement of the general public in its development process. The general public lacks the technical knowledge for programming state-of-the-art prototyping and development kits. The latest IoT kits, for example, Raspberry Pi, are revolutionizing the DIY paradigm for IoT, and more than ever, a DIY intuitive programming interface is required to enable the masses to interact with and customize the behavior of remote IoT devices on the Internet. However, in most cases, these DIY toolkits store the resultant configuration data in local storage and, thus, cannot be accessed remotely. This paper presents the novel implementation of such a system, which not only enables the general public to customize the behavior of remote IoT devices through a visual interface, but also makes the configuration available everywhere and anytime by leveraging the power of cloud-based platforms. The interface enables the visualization of the resources exposed by remote embedded resources in the form of graphical virtual objects (VOs). These VOs are used to create the service design through simple operations like drag-and-drop and the setting of properties. The configuration created as a result is maintained as an XML document, which is ingested by the cloud platform, thus making it available to be used anywhere. We use the HTTP approach for the communication between the cloud and IoT toolbox and the cloud and real devices, but for communication between the toolbox and actual resources, CoAP is used. Finally, a smart home case study has been implemented and presented in order to assess the effectiveness of the proposed work. PMID:29415450
Design and Implementation of Cloud-Centric Configuration Repository for DIY IoT Applications.
Ahmad, Shabir; Hang, Lei; Kim, Do Hyeun
2018-02-06
The Do-It-Yourself (DIY) vision for the design of a smart and customizable IoT application demands the involvement of the general public in its development process. The general public lacks the technical knowledge for programming state-of-the-art prototyping and development kits. The latest IoT kits, for example, Raspberry Pi, are revolutionizing the DIY paradigm for IoT, and more than ever, a DIY intuitive programming interface is required to enable the masses to interact with and customize the behavior of remote IoT devices on the Internet. However, in most cases, these DIY toolkits store the resultant configuration data in local storage and, thus, cannot be accessed remotely. This paper presents the novel implementation of such a system, which not only enables the general public to customize the behavior of remote IoT devices through a visual interface, but also makes the configuration available everywhere and anytime by leveraging the power of cloud-based platforms. The interface enables the visualization of the resources exposed by remote embedded resources in the form of graphical virtual objects (VOs). These VOs are used to create the service design through simple operations like drag-and-drop and the setting of properties. The configuration created as a result is maintained as an XML document, which is ingested by the cloud platform, thus making it available to be used anywhere. We use the HTTP approach for the communication between the cloud and IoT toolbox and the cloud and real devices, but for communication between the toolbox and actual resources, CoAP is used. Finally, a smart home case study has been implemented and presented in order to assess the effectiveness of the proposed work.
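A minimal sketch of the HTTP leg described in the two records above: a virtual object's properties are serialized as a small XML document and pushed to a cloud endpoint so the configuration is reachable from anywhere. The endpoint URL, element names, and payload layout are illustrative assumptions, not the authors' schema.

```python
import xml.etree.ElementTree as ET
import requests  # HTTP leg between the DIY toolbox and the cloud

def build_config(device_id, properties):
    """Serialize a virtual object's properties into a small XML document."""
    root = ET.Element("virtualObject", id=device_id)
    for name, value in properties.items():
        ET.SubElement(root, "property", name=name).text = str(value)
    return ET.tostring(root, encoding="unicode")

def push_config(cloud_url, xml_doc):
    """Upload the configuration so it is available anywhere, not just locally."""
    resp = requests.post(cloud_url, data=xml_doc,
                         headers={"Content-Type": "application/xml"})
    resp.raise_for_status()
    return resp.status_code

# example (hypothetical endpoint):
# push_config("https://example-iot-cloud/configs", build_config("lamp-1", {"state": "on"}))
```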
Facilitating Secure Sharing of Personal Health Data in the Cloud.
Thilakanathan, Danan; Calvo, Rafael A; Chen, Shiping; Nepal, Surya; Glozier, Nick
2016-05-27
Internet-based applications are providing new ways of promoting health and reducing the cost of care. Although data can be kept encrypted in servers, the user does not have the ability to decide whom the data are shared with. Technically this is linked to the problem of who owns the data encryption keys required to decrypt the data. Currently, cloud service providers, rather than users, have full rights to the key. In practical terms this makes the users lose full control over their data. Trust and uptake of these applications can be increased by allowing patients to feel in control of their data, generally stored in cloud-based services. This paper addresses this security challenge by providing the user a way of controlling encryption keys independently of the cloud service provider. We provide a secure and usable system that enables a patient to share health information with doctors and specialists. We contribute a secure protocol for patients to share their data with doctors and others on the cloud while keeping complete ownership. We developed a simple, stereotypical health application and carried out security tests, performance tests, and usability tests with both students and doctors (N=15). We developed the health application as an app for Android mobile phones. We carried out the usability tests on potential participants and medical professionals. Of 20 participants, 14 (70%) either agreed or strongly agreed that they felt safer using our system. Using mixed methods, we show that participants agreed that privacy and security of health data are important and that our system addresses these issues. We presented a security protocol that enables patients to securely share their eHealth data with doctors and nurses and developed a secure and usable system that enables patients to share mental health information with doctors.
Cross stratum resources protection in fog-computing-based radio over fiber networks for 5G services
NASA Astrophysics Data System (ADS)
Guo, Shaoyong; Shao, Sujie; Wang, Yao; Yang, Hui
2017-09-01
In order to meet the requirements of the Internet of Things (IoT) and 5G, the cloud radio access network is a paradigm that converges all base stations' computational resources into a cloud baseband unit (BBU) pool, while the distributed radio frequency signals are collected by remote radio heads (RRH). A precondition for centralized processing in the BBU pool is an interconnection fronthaul network with high capacity and low delay. However, the interaction between RRH and BBU, and resource scheduling among BBUs in the cloud, have become more complex and frequent. A cloud radio over fiber network has already been proposed in our previous work. In order to overcome the complexity and latency, in this paper we first present a novel cross stratum resources protection (CSRP) architecture in fog-computing-based radio over fiber networks (F-RoFN) for 5G services. Additionally, a cross stratum protection (CSP) scheme considering network survivability is introduced in the proposed architecture. The CSRP with the CSP scheme can effectively pull remote processing resources to the local site to implement cooperative radio resource management, enhance the responsiveness and resilience to dynamic end-to-end 5G service demands, and globally optimize optical network, wireless, and fog resources. The feasibility and efficiency of the proposed architecture with the CSP scheme are verified on our software defined networking testbed in terms of service latency, transmission success rate, resource occupation rate, and blocking probability.
Examining the Use of the Cloud for Seismic Data Centers
NASA Astrophysics Data System (ADS)
Yu, E.; Meisenhelter, S.; Clayton, R. W.
2011-12-01
The Southern California Earthquake Data Center (SCEDC) archives seismic and station sensor metadata related to earthquake activity in southern California. It currently archives nearly 8400 data streams continuously from over 420 stations in near real time, at a rate of 584 GB/month, to a repository approximately 18 TB in size. Triggered waveform data from an average of 12,000 earthquakes/year are also archived. Data are archived on mirrored disk arrays that are maintained and backed up locally. These data are served over the Internet to scientists and the general public in many countries. The data demand has a steady component, largely needed for ambient noise correlation studies, and an impulsive component that is driven by earthquake activity. Designing a reliable, cost-effective system architecture equipped to handle periods of relatively low steady demand punctuated by unpredictable sharp spikes in demand immediately following a felt earthquake remains a major challenge. To explore an alternative paradigm, we have put one month of data in the "cloud" and have developed a user interface with the Google App Engine. The purpose is to assess the modifications in data structures that are necessary to make efficient searches. To date we have determined that the database schema must be "denormalized" to take advantage of the dynamic computational capabilities, and that it is likely advantageous to preprocess the waveform data to remove overlaps, gaps, and other artifacts. The final purpose of this study is to compare the cost of the cloud with that of ground-based centers. The major motivations for this study are the security and dynamic load capabilities of the cloud. In the cloud, multiple copies of the data are held in distributed centers, thus eliminating the single point of failure associated with one center. The cloud can dynamically increase the level of computational resources during an earthquake, and the major tasks of managing a disk farm are eliminated. The center can also be managed from anywhere and is not bound to a particular location.
Original Courseware for Introductory Psychology: Implementation and Evaluation.
ERIC Educational Resources Information Center
Slotnick, Robert S.
1988-01-01
Describes the implementation and field testing of PsychWare, a courseware package for introductory psychology developed and field tested at New York Institute of Technology. Highlights include the courseware package (10 software programs, a faculty manual, and a student workbook), and instructional design features (simulations, real-time…
Language Development and Scaffolding in a Sino-American Telecollaborative Project
ERIC Educational Resources Information Center
Jin, Li
2013-01-01
Previous research (e.g., Belz & Thorne, 2006; Ware & O'Dowd, 2008) has discovered that language learning can be afforded through intercultural telecollaboration. From a sociocultural theoretical perspective, the current study investigated the language development outcomes and process in a 10-week Sino-American telecollaborative project…
Code of Federal Regulations, 2011 CFR
2011-07-01
... without containers or labels, and that is received and handled without mark or count. Bunkers means a..., part A. Cargo means any goods, wares, or merchandise carried, or to be carried, for consideration... interested in the vessel, facility, or OCS facility, except dredge spoils. Cargo vessel means a vessel that...
Code of Federal Regulations, 2012 CFR
2012-07-01
... without containers or labels, and that is received and handled without mark or count. Bunkers means a..., part A. Cargo means any goods, wares, or merchandise carried, or to be carried, for consideration... interested in the vessel, facility, or OCS facility, except dredge spoils. Cargo vessel means a vessel that...
Code of Federal Regulations, 2013 CFR
2013-07-01
... without containers or labels, and that is received and handled without mark or count. Bunkers means a..., part A. Cargo means any goods, wares, or merchandise carried, or to be carried, for consideration... interested in the vessel, facility, or OCS facility, except dredge spoils. Cargo vessel means a vessel that...
Code of Federal Regulations, 2014 CFR
2014-07-01
... without containers or labels, and that is received and handled without mark or count. Bunkers means a..., part A. Cargo means any goods, wares, or merchandise carried, or to be carried, for consideration... interested in the vessel, facility, or OCS facility, except dredge spoils. Cargo vessel means a vessel that...
An IoT-cloud Based Wearable ECG Monitoring System for Smart Healthcare.
Yang, Zhe; Zhou, Qihao; Lei, Lei; Zheng, Kan; Xiang, Wei
2016-12-01
Public healthcare has received increasing attention given the exponential growth of the human population and of medical expenses. It is well known that an effective health monitoring system can detect abnormalities in health conditions in time and make diagnoses according to the gleaned data. As a vital approach to diagnosing heart diseases, ECG monitoring is widely studied and applied. However, nearly all existing portable ECG monitoring systems cannot work without a mobile application, which is responsible for data collection and display. In this paper, we propose a new method for ECG monitoring based on Internet-of-Things (IoT) techniques. ECG data are gathered using a wearable monitoring node and are transmitted directly to the IoT cloud using Wi-Fi. Both the HTTP and MQTT protocols are employed in the IoT cloud in order to provide visual and timely ECG data to users. Nearly all smart terminals with a web browser can acquire ECG data conveniently, which greatly alleviates the cross-platform issue. Experiments are carried out on healthy volunteers in order to verify the reliability of the entire system. Experimental results reveal that the proposed system is reliable in collecting and displaying real-time ECG data, which can aid in the primary diagnosis of certain heart diseases.
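A compact sketch of the MQTT leg from a wearable node to an IoT cloud, written against the paho-mqtt client API; the broker address, topic layout, and payload format are assumptions, not the authors' implementation.

```python
import json
import time
import paho.mqtt.client as mqtt  # MQTT leg from the wearable node to the IoT cloud

BROKER = "iot-cloud.example.org"   # assumed broker address
TOPIC = "ecg/patient-001"          # assumed topic layout

def publish_ecg(sample_batches, period_s=1.0):
    """Push batches of ECG samples to the cloud so any browser-based client
    subscribed to the topic can render them in near real time."""
    # paho-mqtt 1.x style; version 2.x also needs a CallbackAPIVersion argument
    client = mqtt.Client()
    client.connect(BROKER, 1883)
    client.loop_start()
    for batch in sample_batches:
        payload = json.dumps({"t": time.time(), "ecg": batch})
        client.publish(TOPIC, payload, qos=1)   # at-least-once delivery
        time.sleep(period_s)
    client.loop_stop()
    client.disconnect()
```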
An innovative privacy preserving technique for incremental datasets on cloud computing.
Aldeen, Yousra Abdul Alsahib S; Salleh, Mazleena; Aljeroudi, Yazan
2016-08-01
Cloud computing (CC) is a magnificent service-based delivery model with gigantic computer processing power and data storage across connected communication channels. It has imparted an overwhelming technological impetus to the internet-mediated IT industry, where users can easily share private data for further analysis and mining. Furthermore, user-friendly CC services make it possible to deploy sundry applications economically. Meanwhile, simple data sharing has impelled various phishing attacks and malware-assisted security threats. Some privacy-sensitive applications, such as health services on the cloud, that are built with several economic and operational benefits necessitate enhanced security. Thus, absolute cyberspace security and mitigation against phishing attacks became mandatory to protect overall data privacy. Typically, diverse application datasets are anonymized with better privacy for owners without providing all secrecy requirements for newly added records. Some proposed techniques have addressed this issue by re-anonymizing the datasets from scratch. The utmost privacy protection over incremental datasets on CC is far from being achieved. Certainly, the distribution of huge dataset volumes across multiple storage nodes limits privacy preservation. In this view, we propose a new anonymization technique to attain better privacy protection with high data utility over distributed and incremental datasets on CC. The proficiency of data privacy preservation and improved confidentiality requirements is demonstrated through performance evaluation.
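The abstract does not spell out the authors' technique, so the sketch below only illustrates the general idea of anonymizing an increment without re-anonymizing the whole published dataset, using a generic k-anonymity-style generalization over assumed quasi-identifiers (age and ZIP code). All names and thresholds are hypothetical.

```python
def generalize(record, level):
    """Coarsen assumed quasi-identifiers: wider age bands and ZIP truncation."""
    band = 10 * level
    lo = (record["age"] // band) * band
    out = dict(record)
    out["age"] = f"{lo}-{lo + band - 1}"
    out["zip"] = record["zip"][: max(0, 5 - level)] + "*" * min(5, level)
    return out

def anonymize_increment(new_records, published, k=3, max_level=4):
    """Anonymize only the newly added records, searching for the smallest
    generalization level at which every group of quasi-identifiers has at
    least k members, instead of re-anonymizing everything from scratch."""
    for level in range(1, max_level + 1):
        gen = [generalize(r, level) for r in new_records]
        groups = {}
        for g in gen:
            key = (g["age"], g["zip"])
            groups[key] = groups.get(key, 0) + 1
        if groups and all(count >= k for count in groups.values()):
            return published + gen
    return published  # withhold the increment if k-anonymity cannot be met
```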
Corporate knowledge repository: Adopting academic LMS into corporate environment
NASA Astrophysics Data System (ADS)
Bakar, Muhamad Shahbani Abu; Jalil, Dzulkafli
2017-10-01
The growth of the knowledge economy has made human capital the vital asset of business organizations in the 21st century. Arguably, due to its white-collar nature, knowledge-based industry is more favorable than traditional manufacturing business. However, over-dependency on human capital can also be a major challenge, as any worker will inevitably leave the company or retire. This situation can create a knowledge gap that may impact the business continuity of the enterprise. Knowledge retention in the corporate environment has been the subject of much research interest. A Learning Management System (LMS) refers to a system that provides the delivery, assessment, and management tools for an organization to handle its knowledge repository. Drawing on a proven LMS implemented in an academic environment, this paper proposes an LMS model that can be used to enable peer-to-peer knowledge capture and sharing in a knowledge-based organization. Cloud Enterprise Resource Planning (ERP), referring to an ERP solution delivered in the internet cloud environment, was chosen as the knowledge domain. The complexity of the cloud ERP business and its knowledge make it very vulnerable to the knowledge retention problem. This paper discusses how the company's essential knowledge can be retained using an LMS system derived from the academic environment in a corporate model.
EDGE COMPUTING AND CONTEXTUAL INFORMATION FOR THE INTERNET OF THINGS SENSORS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klein, Levente
Interpreting sensor data requires knowledge about sensor placement and the surrounding environment. For a single sensor measurement, it is easy to document the context by visual observation; however, for millions of sensors reporting data back to a server, the contextual information needs to be extracted automatically, either from data analysis or by leveraging complementary data sources. Data layers that overlap spatially or temporally with sensor locations can be used to extract the context and to validate the measurement. To minimize the amount of data transmitted through the internet while preserving signal information content, two methods are explored: computation at the edge and compressed sensing. We validate these methods on wind and chemical sensor data by (1) eliminating redundant measurements from wind sensors and (2) extracting the peak value of a chemical sensor measuring a methane plume. We present a general cloud-based framework to validate sensor data based on statistical and physical modeling and on contextual data extracted from geospatial data.
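A minimal sketch of the edge-side reduction for the chemical sensor case: rather than streaming every sample, the node reports only the peak of a plume event. The threshold and the report format are assumptions for illustration, not the authors' parameters.

```python
def peak_report(readings, threshold):
    """Edge-side reduction: instead of streaming every sample, report only the
    peak value (and its index) whenever the signal exceeds a threshold."""
    peak_value, peak_index = None, None
    for i, value in enumerate(readings):
        if value >= threshold and (peak_value is None or value > peak_value):
            peak_value, peak_index = value, i
    if peak_value is None:
        return None                      # nothing above threshold: send nothing
    return {"peak": peak_value, "sample_index": peak_index}

# example: peak_report([1.1, 1.3, 9.8, 4.2, 1.0], threshold=2.0)
# -> {'peak': 9.8, 'sample_index': 2}
```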
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blink, F D; Blink, J A
Elementary students are using the internet to experience virtual field trips to learn about areas that they are not able to experience in person. This poster presentation describes a virtual field trip taken by Mendoza Elementary School, Las Vegas, Nevada classes during the summer of 2003. The authors, who are DataStreme Learning Implementation Team members, drove from Las Vegas to Seattle for the annual DataStreme Summer Workshop. During the trip and in Seattle, the authors communicated through the internet with classrooms in Las Vegas. Weather information, pictures, and pertinent information about Seattle or the en route area were sent to the classes each day. The students then compared the weather in Las Vegas with the weather and clouds from the communication. Fourth grade students were studying volcanoes and were excited to hear about, and see pictures of, Mt. Shasta, Mt. Lassen, Mt. St. Helens, and Mt. Rainier during the virtual field trip. Classes were able to track the route taken on a map during the virtual field trip.
Psiha, Maria M; Vlamos, Panayiotis
2017-01-01
5G is the next generation of mobile communication technology. The current generation of wireless technologies is evolving toward 5G to better serve end users and transform our society. Supported by 5G cloud technology, personal devices will extend their capabilities to various applications, supporting smart life. They will have a significant role in health, medical tourism, security, safety, and social life applications. The next wave of mobile communication is to mobilize and automate industries and industry processes via Machine-Type Communication (MTC) and the Internet of Things (IoT). The current key performance indicators for the 5G infrastructure for the fully connected society are sufficient to satisfy most of the technical requirements in the healthcare sector. Thus, 5G can be considered a door opener for new possibilities and use cases, many of which are as yet unknown. In this paper we present heterogeneous use cases in the medical tourism sector, based on 5G infrastructure technologies and third-party cloud services.
An Efficient Mutual Authentication Framework for Healthcare System in Cloud Computing.
Kumar, Vinod; Jangirala, Srinivas; Ahmad, Musheer
2018-06-28
The increasing role of Telecare Medicine Information Systems (TMIS) makes it possible for patients to explore medical treatment and to accumulate and access medical data through internet connectivity. Security and privacy preservation are necessary for patients' medical data in TMIS because of its highly sensitive nature. Recently, Mohit et al. proposed a mutual authentication protocol for TMIS in the cloud computing environment. In this work, we reviewed their protocol and found that it is not secure against the stolen verifier attack, the many-logged-in-patients attack, and the impersonation attack, does not provide patient anonymity, and fails to protect the session key. To enhance the security level, we propose a new mutual authentication protocol for the same environment. The presented framework is also more efficient in terms of computation cost. In addition, the security evaluation shows that the protocol resists all the considered security attacks, and we also provide a formal security evaluation based on the random oracle model. The performance of the proposed protocol is much better in comparison to the existing protocol.
Leveraging Cloud Computing to Improve Storage Durability, Availability, and Cost for MER Maestro
NASA Technical Reports Server (NTRS)
Chang, George W.; Powell, Mark W.; Callas, John L.; Torres, Recaredo J.; Shams, Khawaja S.
2012-01-01
The Maestro for MER (Mars Exploration Rover) software is the premiere operation and activity planning software for the Mars rovers, and it is required to deliver all of the processed image products to scientists on demand. These data span multiple storage arrays sized at 2 TB, and a backup scheme ensures data is not lost. In a catastrophe, these data would currently recover at 20 GB/hour, taking several days for a restoration. A seamless solution provides access to highly durable, highly available, scalable, and cost-effective storage capabilities. This approach also employs a novel technique that enables storage of the majority of data on the cloud and some data locally. This feature is used to store the most recent data locally in order to guarantee utmost reliability in case of an outage or disconnect from the Internet. This also obviates any changes to the software that generates the most recent data set, as it still has the same interface to the file system as it did before the updates.
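A toy illustration of the tiering rule implied above, keeping the newest products on local disk and the rest in cloud storage; the one-week window is an invented threshold, not a value from the project.

```python
import time

def storage_target(product_time, now=None, recent_window_s=7 * 24 * 3600):
    """Decide where an image product lives: the newest products stay on local
    disk for guaranteed access, everything else goes to durable cloud storage."""
    now = time.time() if now is None else now
    return "local" if (now - product_time) <= recent_window_s else "cloud"

# example: storage_target(time.time() - 3600) -> 'local'
```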
VizieR Online Data Catalog: OGLE II SMC eclipsing binaries (Wyrzykowski+, 2004)
NASA Astrophysics Data System (ADS)
Wyrzykowski, L.; Udalski, A.; Kubiak, M.; Szymanski, M. K.; Zebrun, K.; Soszinski, I.; Wozniak, P. R.; Pietrzynski, G.; Szewczyk, O.
2009-03-01
We present a new version of the OGLE-II catalog of eclipsing binary stars detected in the Small Magellanic Cloud, based on the Difference Image Analysis catalog of variable stars in the Magellanic Clouds containing data collected from 1997 to 2000. We found 1351 eclipsing binary stars in the central 2.4 square degree area of the SMC; 455 stars are newly discovered objects, not found in the previous release of the catalog. The eclipsing objects were selected with an automatic search algorithm based on an artificial neural network. The full catalog with individual photometry is accessible from the OGLE INTERNET archive, at ftp://sirius.astrouw.edu.pl/ogle/ogle2/var_stars/smc/ecl . Regular observations of the SMC fields started on June 26, 1997 and covered about 2.4 square degrees of the central parts of the SMC. Reductions of the photometric data collected up to the end of May 2000 were performed with the Difference Image Analysis (DIA) package. (1 data file).
Design and Implementation of Website Information Disclosure Assessment System
Cho, Ying-Chiang; Pan, Jen-Yi
2015-01-01
Internet application technologies, such as cloud computing and cloud storage, have increasingly changed people’s lives. Websites contain vast amounts of personal privacy information. In order to protect this information, network security technologies, such as database protection and data encryption, attract many researchers. The most serious problems concerning web vulnerability are e-mail address and network database leakages. These leakages have many causes. For example, malicious users can steal database contents, taking advantage of mistakes made by programmers and administrators. In order to mitigate this type of abuse, a website information disclosure assessment system is proposed in this study. This system utilizes a series of technologies, such as web crawler algorithms, SQL injection attack detection, and web vulnerability mining, to assess a website’s information disclosure. Thirty websites, randomly sampled from the top 50 world colleges, were used to collect leakage information. This testing showed the importance of increasing the security and privacy of website information for academic websites. PMID:25768434
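A small sketch of the crawling step used in this kind of disclosure assessment: a breadth-first crawl of one site that collects e-mail addresses exposed in the HTML. The regular expressions, page limit, and same-site check are illustrative simplifications, not the paper's implementation.

```python
import re
from urllib.parse import urljoin
from urllib.request import urlopen

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
LINK_RE = re.compile(r'href=["\'](.*?)["\']', re.IGNORECASE)

def crawl_for_emails(start_url, max_pages=20):
    """Breadth-first crawl of a site, collecting e-mail addresses exposed in HTML."""
    seen, queue, leaks = set(), [start_url], set()
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")
        except Exception:
            continue                              # skip pages that fail to load
        leaks.update(EMAIL_RE.findall(html))      # record exposed addresses
        for link in LINK_RE.findall(html):
            absolute = urljoin(url, link)
            if absolute.startswith(start_url):    # stay within the assessed site
                queue.append(absolute)
    return leaks
```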
A Systematic Process for Developing High Quality SaaS Cloud Services
NASA Astrophysics Data System (ADS)
La, Hyun Jung; Kim, Soo Dong
Software-as-a-Service (SaaS) is a type of cloud service which provides software functionality through the Internet. Its benefits are well received in academia and industry. To fully utilize the benefits, there should be effective methodologies to support the development of SaaS services which provide high reusability and applicability. Conventional approaches such as object-oriented methods do not effectively support SaaS-specific engineering activities such as modeling common features, variability, and designing quality services. In this paper, we present a systematic process for developing high quality SaaS and highlight the essentiality of commonality and variability (C&V) modeling to maximize the reusability. We first define criteria for designing the process model and provide a theoretical foundation for SaaS; its meta-model and C&V model. We clarify the notion of commonality and variability in SaaS, and propose a SaaS development process which is accompanied with engineering instructions. Using the proposed process, SaaS services with high quality can be effectively developed.
ERIC Educational Resources Information Center
Johnson, David L.
2005-01-01
After decades of research in artificial intelligence (AI) and cognitive psychology, a number of companies have emerged that offer intelligent tutor system (ITS) software to schools. These systems try to mimic the help that a human tutor would provide to an individual student, something nearly impossible for teachers to accomplish in the…
MOOCs, High Technology, and Higher Learning
ERIC Educational Resources Information Center
Rhoads, Robert A.
2015-01-01
In "MOOCs, High Technology, and Higher Learning," Robert A. Rhoads places the OpenCourseWare (OCW) movement into the larger context of a revolution in educational technology. In doing so, he seeks to bring greater balance to increasingly polarized discussions of massively open online courses (MOOCs) and show their ongoing relevance to…
26 CFR 301.7304-1 - Penalty for fraudulently claiming drawback.
Code of Federal Regulations, 2010 CFR
2010-04-01
... drawback on goods, wares, or merchandise on which no internal tax shall have been paid, or fraudulently claims any greater allowance of drawback than the tax actually paid, he shall forfeit triple the amount....7304-1 Section 301.7304-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY...
Developing Intercultural Communicative Competence through Online Exchanges
ERIC Educational Resources Information Center
Chun, Dorothy M.
2011-01-01
Based on Byram's (1997) definition of intercultural communicative competence (ICC) and on specific types of discourse analysis proposed by Kramsch and Thorne (2002) and Ware and Kramsch (2005), this article explores how online exchanges can play a role in second language learners' development of pragmatic competence and ICC. With data obtained…
ERIC Educational Resources Information Center
Sun, Jerry Chih-Yuan; Wu, Yu-Ting
2016-01-01
This study aimed to investigate the effectiveness of two different teaching methods on learning effectiveness. OpenCourseWare was integrated into the flipped classroom model (experimental group) and distance learning (control group). Learning effectiveness encompassed learning achievement, teacher-student interactions, and learning satisfaction.…
7 CFR 2902.52 - Disposable tableware.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 15 2011-01-01 2011-01-01 false Disposable tableware. 2902.52 Section 2902.52 Agriculture Regulations of the Department of Agriculture (Continued) OFFICE OF ENERGY POLICY AND NEW USES... and used in dining, such as drink ware and dishware, including but not limited to cups, plates, bowls...
Contributory Factors to Teachers' Sense of Community in Public Urban Elementary Schools
ERIC Educational Resources Information Center
Kirkhus, Debra
2011-01-01
The purpose of this study was to investigate factors that contribute to teachers' sense of community within public, urban, elementary schools. Because previous research has touted the benefits of teacher communities within schools (Kruse, 2001; Leana & Pil, 2006; Ware & Kitsantas, 2007), educational leaders are challenged with creating…
ERIC Educational Resources Information Center
Demski, Jennifer
2012-01-01
When it comes to implementing a large-scale 1-to-1 computing initiative, deciding which device students will use every day to support their learning requires a significant amount of thought and research. Laptop, netbook, Chromebook, tablet--each device has enough similarities to make the decision seem easy, but enough differences to make a big…
40 CFR 63.11436 - What parts of my plant does this subpart cover?
Code of Federal Regulations, 2010 CFR
2010-07-01
... CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Clay Ceramics... subpart cover? (a) This subpart applies to any existing or new affected source located at a clay ceramics... glazed ceramic ware located at a clay ceramics manufacturing facility. (c) An affected source is existing...
40 CFR 63.11441 - What are the notification requirements?
Code of Federal Regulations, 2010 CFR
2010-07-01
... (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Clay Ceramics Manufacturing Area... glazed ceramic ware, you must certify that you are maintaining the peak temperature below 1540 °C (2800... ceramics manufacturing facility that uses more than 227 Mg/yr (250 tpy) of wet glaze(s), you must certify...
40 CFR 63.11435 - Am I subject to this subpart?
Code of Federal Regulations, 2010 CFR
2010-07-01
...) National Emission Standards for Hazardous Air Pollutants for Clay Ceramics Manufacturing Area Sources... if you own or operate a clay ceramics manufacturing facility (as defined in § 63.11444), with an atomized glaze spray booth or kiln that fires glazed ceramic ware, that processes more than 45 megagrams...
Using Simulation Module, PCLAB, for Steady State Disturbance Sensitivity Analysis in Process Control
ERIC Educational Resources Information Center
Ali, Emad; Idriss, Arimiyawo
2009-01-01
Recently, chemical engineering education has moved towards utilizing simulation software to enhance the learning process, especially in the field of process control. These training simulators provide interactive learning through visualization and practicing which will bridge the gap between the theoretical abstraction of textbooks and the…
Parents' Perception on De La Salle University-Dasmarinas Services
ERIC Educational Resources Information Center
Cortez-Antig, Carmelyn
2011-01-01
The study was conducted to find out the parents' perception on the De La Salle University-Dasmarinas services which are grouped as follows: (1) Academic instruction factor; (2) Quality of human ware (includes faculty, administration, staff support through medical services, guidance and discipline); (3) Quality of hardware (dorm facilities,…
Marketing: Giveaways that Give Back--Marketing with Promotional Items
ERIC Educational Resources Information Center
Germain, Carol Anne
2006-01-01
In marketing one's library wares, an effective publicity strategy is to use promotional giveaway materials, mainly personalized items that promote one's resources and services. As with any marketing effort, it is important to be clear about the promotional initiative and whom librarians are attempting to reach. This author emphasizes that…
Pressures on TV Programs: Coalition for Better Television's Case.
ERIC Educational Resources Information Center
Shipman, John M., Jr.
In 1981, the conservative Coalition for Better Television (CBTV) threatened an economic boycott against advertisers who marketed their wares on programs that the coalition felt had excessive sex and violence. Because television networks are dependent on advertising, the coalition believed economic pressure on advertisers would force a…
77 FR 52313 - Notice of Scope Rulings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-29
..., 2012. A-570-506: Porcelain-On-Steel Cooking Ware from the People's Republic of China Requestor: The Coleman Company, Inc.; the stockpot locking lid of its Signature Series All-In-One Cooking System And Max Series is within the scope of the antidumping duty order; the stockpot cooking base of its Signature...
Effectiveness of Existing Eye Safety Legislation in Arizona.
ERIC Educational Resources Information Center
Gillaspy, Roy Eugene
This study was designed to ascertain the current practices of eye safety in Arizona high school industrial education laboratories, including the enforcement of eye safety legislation, use of eye protection devices, how the eyewear meets the American National Standards Institute specifications, and the teachers' interpretations of the existing eye…
Self-Taught Visually-Guided Pointing for a Humanoid Robot
2006-01-01
Brooks, R., Bryson, J., Marjanovic, M., Stein, L. A., & Wessler, M. (1996), Humanoid Software, Technical report, MIT Artificial Intelligence Lab...', Journal of Biomechanics 19, 231-238. Marjanovic, M. (1995), Learning Functional Maps Between Sensorimotor Systems on a Humanoid Robot, Master's thesis, MIT
Computer-Mediated Corrective Feedback and Language Accuracy in Telecollaborative Exchanges
ERIC Educational Resources Information Center
Vinagre, Margarita; Munoz, Beatriz
2011-01-01
Recent studies illustrate the potential that intercultural telecollaborative exchanges entail for language development through the use of corrective feedback from collaborating partners (Kessler, 2009; Lee, 2008; Sauro, 2009; Ware & O'Dowd, 2008). We build on this growing body of research by presenting the findings of a three-month-long…
An Analysis of Hope as a Psychological Strength
ERIC Educational Resources Information Center
Valle, Michael F.; Huebner, E. Scott; Suldo, Shannon M.
2006-01-01
Psychologists have placed an increased emphasis on identifying psychological strengths that foster healthy development. Hope, as operationalized in Snyder's hope theory [Snyder, C. R., Hoza, B., Pelham, W. E., Rapoff, M., Ware, L., & Danovsky, M., et al. (1997). The development and validation of the children's hope scale. "Journal of Pediatric…
ERIC Educational Resources Information Center
Feintuch, Howard
2009-01-01
OpenCourseWare (OCW) program, offered at the Massachusetts Institute of Technology (MIT), provides open access to course materials for a large number of MIT classes. From this resource, American Megan Brewster, a recent graduate working in Guatemala, was able to formulate and implement a complete protocol to tackle Guatemala's need for a plastics…
5 CFR Appendix C to Subpart B of... - Appropriated Fund Wage and Survey Areas
Code of Federal Regulations, 2012 CFR
2012-01-01
... Thomas Tift Turner Ware Atlanta Survey Area Georgia: Butts Cherokee Clayton Cobb De Kalb Douglas Fayette... Elbert Emanuel Glascock Hart Jefferson Jenkins Lincoln Taliaferro Warren Wilkes South Carolina: Allendale... Thomas Trego Wallace Wichita Wilson Woodson Kentucky Lexington Survey Area Kentucky: Bourbon Clark...
5 CFR Appendix C to Subpart B of... - Appropriated Fund Wage and Survey Areas
Code of Federal Regulations, 2014 CFR
2014-01-01
... Thomas Tift Turner Ware Atlanta Survey Area Georgia: Butts Cherokee Clayton Cobb De Kalb Douglas Fayette... Elbert Emanuel Glascock Hart Jefferson Jenkins Lincoln Taliaferro Warren Wilkes South Carolina: Allendale... Thomas Trego Wallace Wichita Wilson Woodson Kentucky Lexington Survey Area Kentucky: Bourbon Clark...
5 CFR Appendix C to Subpart B of... - Appropriated Fund Wage and Survey Areas
Code of Federal Regulations, 2013 CFR
2013-01-01
... Thomas Tift Turner Ware Atlanta Survey Area Georgia: Butts Cherokee Clayton Cobb De Kalb Douglas Fayette... Elbert Emanuel Glascock Hart Jefferson Jenkins Lincoln Taliaferro Warren Wilkes South Carolina: Allendale... Thomas Trego Wallace Wichita Wilson Woodson Kentucky Lexington Survey Area Kentucky: Bourbon Clark...
Internet-Based Solutions for a Secure and Efficient Seismic Network
NASA Astrophysics Data System (ADS)
Bhadha, R.; Black, M.; Bruton, C.; Hauksson, E.; Stubailo, I.; Watkins, M.; Alvarez, M.; Thomas, V.
2017-12-01
The Southern California Seismic Network (SCSN), operated by Caltech and USGS, leverages modern Internet-based computing technologies to provide timely earthquake early warning for damage reduction, event notification, ShakeMap, and other data products. Here we present recent and ongoing innovations in telemetry, security, cloud computing, virtualization, and data analysis that have allowed us to develop a network that runs securely and efficiently. Earthquake early warning systems must process seismic data within seconds of being recorded, and SCSN maintains a robust and resilient network of more than 350 digital strong motion and broadband seismic stations to achieve this goal. We have continued to improve the path diversity and fault tolerance within our network, and have also developed new tools for latency monitoring and archiving. Cyberattacks are in the news almost daily, and with most of our seismic data streams running over the Internet, it is only a matter of time before SCSN is targeted. To ensure system integrity and availability across our network, we have implemented strong security, including encryption and Virtual Private Networks (VPNs). SCSN operates its own data center at Caltech, but we have also installed real-time servers on Amazon Web Services (AWS), to provide an additional level of redundancy, and eventually to allow full off-site operations continuity for our network. Our AWS systems receive data from Caltech-based import servers and directly from field locations, and are able to process the seismic data, calculate earthquake locations and magnitudes, and distribute earthquake alerts, directly from the cloud. We have also begun a virtualization project at our Caltech data center, allowing us to serve data from Virtual Machines (VMs), making efficient use of high-performance hardware and increasing flexibility and scalability of our data processing systems. Finally, we have developed new monitoring of station average noise levels at most stations. Noise monitoring is effective at identifying anthropogenic noise sources and malfunctioning acquisition equipment. We have built a dynamic display of results with sorting and mapping capabilities that allow us to quickly identify problematic sites and areas with elevated noise.
Community Seismic Network (CSN)
NASA Astrophysics Data System (ADS)
Clayton, R. W.; Heaton, T. H.; Kohler, M. D.; Cheng, M.; Guy, R.; Chandy, M.; Krause, A.; Bunn, J.; Olson, M.; Faulkner, M.; Liu, A.; Strand, L.
2012-12-01
We report on developments in sensor connectivity, architecture, and data fusion algorithms executed in Cloud computing systems in the Community Seismic Network (CSN), a network of low-cost sensors housed in homes and offices by volunteers in the Pasadena, CA area. The network has over 200 sensors continuously reporting anomalies in local acceleration through the Internet to a Cloud computing service (the Google App Engine) that continually fuses sensor data to rapidly detect shaking from earthquakes. The Cloud computing system consists of data centers geographically distributed across the continent and is likely to be resilient even during earthquakes and other local disasters. The region of Southern California is partitioned in a multi-grid style into sets of telescoping cells called geocells. Data streams from sensors within a geocell are fused to detect anomalous shaking across the geocell. Temporal spatial patterns across geocells are used to detect anomalies across regions. The challenge is to detect earthquakes rapidly with an extremely low false positive rate. We report on two data fusion algorithms, one that tessellates the surface so as to fuse data from a large region around Pasadena and the other, which uses a standard tessellation of equal-sized cells. Since September 2011, the network has successfully detected earthquakes of magnitude 2.5 or higher within 40 Km of Pasadena. In addition to the standard USB device, which connects to the host's computer, we have developed a stand-alone sensor that directly connects to the internet via Ethernet or wifi. This bypasses security concerns that some companies have with the USB-connected devices, and allows for 24/7 monitoring at sites that would otherwise shut down their computers after working hours. In buildings we use the sensors to model the behavior of the structures during weak events in order to understand how they will perform during strong events. Visualization models of instrumented buildings ranging between five and 22 stories tall have been constructed using Google SketchUp. Ambient vibration records are used to identify the first set of horizontal vibrational modal frequencies of the buildings. These frequencies are used to compute the response on every floor of the building, given either observed data or scenario ground motion input at the buildings' base.
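To make the geocell fusion concrete, here is a toy sketch that maps sensor reports to telescoping grid cells and flags cells where enough independent sensors report shaking. The one-degree base cell, the halving per level, and the report threshold are assumptions for illustration, not CSN's actual parameters.

```python
def geocell_id(lat, lon, level):
    """Map a coordinate to a telescoping grid cell id; cell size halves per level."""
    size = 1.0 / (2 ** level)        # assumed 1-degree base cell at level 0
    return (level, int(lat // size), int(lon // size))

def fuse_picks(picks, level, min_reports=5):
    """Count anomaly reports per geocell and flag cells where enough independent
    sensors report shaking to declare a detection at that spatial scale."""
    counts = {}
    for lat, lon in picks:
        cell = geocell_id(lat, lon, level)
        counts[cell] = counts.get(cell, 0) + 1
    return {cell: n for cell, n in counts.items() if n >= min_reports}

# example: fuse_picks([(34.15, -118.14)] * 6, level=3) flags one cell near Pasadena
```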
Real-time video streaming in mobile cloud over heterogeneous wireless networks
NASA Astrophysics Data System (ADS)
Abdallah-Saleh, Saleh; Wang, Qi; Grecos, Christos
2012-06-01
Recently, the concept of Mobile Cloud Computing (MCC) has been proposed to offload the resource requirements in computational capabilities, storage and security from mobile devices into the cloud. Internet video applications such as real-time streaming are expected to be ubiquitously deployed and supported over the cloud for mobile users, who typically encounter a range of wireless networks of diverse radio access technologies during their roaming. However, real-time video streaming for mobile cloud users across heterogeneous wireless networks presents multiple challenges. The network-layer quality of service (QoS) provision to support high-quality mobile video delivery in this demanding scenario remains an open research question, and this in turn affects the application-level visual quality and impedes mobile users' perceived quality of experience (QoE). In this paper, we devise a framework to support real-time video streaming in this new mobile video networking paradigm and evaluate the performance of the proposed framework empirically through a lab-based yet realistic testing platform. One particular issue we focus on is the effect of users' mobility on the QoS of video streaming over the cloud. We design and implement a hybrid platform comprising a test-bed and an emulator, on which our concepts of mobile cloud computing, video streaming and heterogeneous wireless networking are implemented and integrated to allow the testing of our framework. As representative heterogeneous wireless networks, the popular WLAN (Wi-Fi) and MAN (WiMAX) networks are incorporated in order to evaluate the effects of handovers between these different radio access technologies. The H.264/AVC (Advanced Video Coding) standard is employed for real-time video streaming from a server to mobile users (client nodes) in the networks. Mobility support is introduced to enable a continuous streaming experience for a mobile user across the heterogeneous wireless networks. Real-time video stream packets are captured for analytical purposes on the mobile user node. Experimental results are obtained and analysed. Future work is identified towards further improvement of the current design and implementation. With this new mobile video networking concept and paradigm implemented and evaluated, the results and observations obtained from this study form the basis of a more in-depth, comprehensive understanding of the various challenges and opportunities in supporting high-quality real-time video streaming in mobile cloud over heterogeneous wireless networks.
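As an illustration of the kind of per-stream analysis such captured packets support, the Python sketch below computes packet loss and inter-arrival statistics from hypothetical (sequence number, arrival time) records; the record format and the trace values are assumptions, not the paper's capture format.

```python
import statistics

def qos_metrics(packets):
    """packets: list of (seq, arrival_time_s) in capture order.
    Returns (loss_ratio, mean_gap_s, jitter_s) for one stream."""
    seqs = [s for s, _ in packets]
    times = [t for _, t in packets]
    expected = max(seqs) - min(seqs) + 1
    loss = 1.0 - len(set(seqs)) / expected
    gaps = [t2 - t1 for t1, t2 in zip(times, times[1:])]
    return loss, statistics.mean(gaps), statistics.pstdev(gaps)

# e.g. a gap in sequence numbers and a delay spike around a handover
trace = [(1, 0.00), (2, 0.04), (3, 0.08), (6, 0.45), (7, 0.49)]
print(qos_metrics(trace))
```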
Toward a Big Data Science: A challenge of "Science Cloud"
NASA Astrophysics Data System (ADS)
Murata, Ken T.; Watanabe, Hidenobu
2013-04-01
Over the past 50 years, along with the appearance and development of high-performance computers (and supercomputers), numerical simulation has come to be considered a third methodology for science, following the theoretical (first) and experimental/observational (second) approaches. The variety of data yielded by the second approach has kept growing, owing to progress in experimental and observational technologies, and the amount of data generated by the third methodology has kept growing as well, owing to the tremendous development of supercomputers and programming techniques. Most of the data files created by experiments/observations and numerical simulations are saved in digital formats and analyzed on computers. Researchers (domain experts) are interested not only in how to carry out experiments, observations, or numerical simulations, but in what information (new findings) can be extracted from the data. However, data do not usually tell anything about the science by themselves; the science is implicitly hidden in the data, and researchers have to extract information from the data files to find it. This is the basic concept of data-intensive (data-oriented) science for Big Data. As the scale of experiments, observations, and numerical simulations grows, new techniques and facilities are required to extract information from large amounts of data. This technique, called informatics, is a fourth methodology for new sciences. Every methodology needs its own facility: in space science, for example, the space environment is observed via spacecraft and numerical simulations are performed on supercomputers. The facility for informatics, which deals with large-scale data, is a computational cloud system for science. This paper proposes a cloud system for informatics that has been developed at NICT (National Institute of Information and Communications Technology), Japan. The NICT science cloud, named OneSpaceNet (OSN), is the first open cloud system for scientists who intend to carry out informatics for their own science. The science cloud is not for simple uses; many functions are expected of it, such as data standardization, data collection and crawling, large and distributed data storage, security and reliability, databases and meta-databases, data stewardship, long-term data preservation, data rescue, data mining, parallel processing, data publication and provision, the semantic web, 3D and 4D visualization, outreach and in-reach, and capacity building. A schematic picture of the NICT science cloud (figure not shown here) illustrates how both observational and simulation data are stored in the cloud's storage system. Note that observational data are of two types: data downloaded through the Internet from archive sites outside the cloud, and data from equipment directly connected to the science cloud, often called sensor clouds. In the present talk, we first introduce the NICT science cloud and then demonstrate its efficiency by showing several scientific results achieved with this cloud system. Through these discussions and demonstrations, the potential of the science cloud for any research field will be revealed.
Flynn, Allen J; Boisvert, Peter; Gittlen, Nate; Gross, Colin; Iott, Brad; Lagoze, Carl; Meng, George; Friedman, Charles P
2018-01-01
The Knowledge Grid (KGrid) is a research and development program toward infrastructure capable of greatly decreasing latency between the publication of new biomedical knowledge and its widespread uptake into practice. KGrid comprises digital knowledge objects, an online Library to store them, and an Activator that uses them to provide Knowledge-as-a-Service (KaaS). KGrid's Activator enables computable biomedical knowledge, held in knowledge objects, to be rapidly deployed at Internet-scale in cloud computing environments for improved health. Here we present the Activator, its system architecture and primary functions.
The remote infrared remote control system based on LPC1114
NASA Astrophysics Data System (ADS)
Ren, Yingjie; Guo, Kai; Xu, Xinni; Sun, Dayu; Wang, Li
2018-05-01
In view of shortcomings such as the short control distance of traditional air-conditioner remote controllers on the market, and in combination with the current "Cloud + Terminal" smart home model, an Internet-based smart home system is designed that takes full advantage of the simplicity and reliability of the LPC1114 chip. The controller is extended with a temperature control module, a timing module, and other modules. Actual testing shows that it achieves reliable and stable remote control of air conditioning and brings great convenience to people's lives.
[A Maternal Health Care System Based on Mobile Health Care].
Du, Xin; Zeng, Weijie; Li, Chengwei; Xue, Junwei; Wu, Xiuyong; Liu, Yinjia; Wan, Yuxin; Zhang, Yiru; Ji, Yurong; Wu, Lei; Yang, Yongzhe; Zhang, Yue; Zhu, Bin; Huang, Yueshan; Wu, Kai
2016-02-01
Wearable devices are used in the new design of the maternal health care system to detect electrocardiogram and oxygen saturation signals, while smart terminals are used to perform assessments and to input maternal clinical information. All results, combined with biochemical analyses from the hospital, are uploaded to a cloud server over the mobile Internet. Machine learning algorithms are used for data mining of all subject information. This system can support the assessment and care of maternal physical health as well as mental health. Moreover, the system can send the results and health guidance to the smart terminals.
Rule-Based Event Processing and Reaction Rules
NASA Astrophysics Data System (ADS)
Paschke, Adrian; Kozlenkov, Alexander
Reaction rules and event processing technologies play a key role in making business and IT / Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in almost real-time, reaction rules are concerned with the invocation of actions in response to events and actionable situations. They state the conditions under which actions must be taken. In the last decades various reaction rule and event processing approaches have been developed, which for the most part have been advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.
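A minimal event-condition-action (ECA) sketch in Python illustrates the reaction-rule idea described above; the rule, event fields, and threshold are invented for the example and are not drawn from any of the surveyed languages.

```python
# Minimal event-condition-action (ECA) sketch; rule and event names are invented.
class RuleEngine:
    def __init__(self):
        self.rules = []          # list of (condition, action)

    def on(self, condition, action):
        self.rules.append((condition, action))

    def publish(self, event):
        """Evaluate every rule's condition against the event and react."""
        for condition, action in self.rules:
            if condition(event):
                action(event)

engine = RuleEngine()
# ON temperature event IF value > 75 DO raise an alert.
engine.on(lambda e: e["type"] == "temperature" and e["value"] > 75,
          lambda e: print("ALERT: overheating at", e["sensor"]))
engine.publish({"type": "temperature", "sensor": "rack-3", "value": 82})
```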
Teaching surgical skills using video internet communication in a resource-limited setting.
Autry, Amy M; Knight, Sharon; Lester, Felicia; Dubowitz, Gerald; Byamugisha, Josaphat; Nsubuga, Yosam; Muyingo, Mark; Korn, Abner
2013-07-01
To study the feasibility and acceptability of using video Internet communication to teach and evaluate surgical skills in a low-resource setting. This case-controlled study used video Internet communication for surgical skills teaching and evaluation. We randomized intern physicians rotating in the Obstetrics and Gynecology Department at Mulago Hospital at Makerere University in Kampala, Uganda, to the control arm (usual practice) or intervention arm (three video teaching sessions with University of California, San Francisco faculty). We made preintervention and postintervention videos of all interns tying knots using a small video camera and uploaded the files to a file hosting service that offers cloud storage. A blinded faculty member graded all of the videos. Both groups completed a survey at the end of the study. We randomized 18 interns; complete data were available for eight in the intervention group and seven in the control group. We found score improvement of 50% or more in six of eight (75%) interns in the intervention group compared with one of seven (14%) in the control group (P=.04). Scores declined in five of the seven (71%) controls but in none in the intervention group. Both intervention and control groups used attendings, colleagues, and the Internet as sources for learning about knot-tying. The control group was less likely to practice knot-tying than the intervention group. The trainees and the instructors felt this method of training was enjoyable and helpful. Remote teaching in low-resource settings, where faculty time is limited and access to visiting faculty is sporadic, is feasible, effective, and well-accepted by both learner and teacher. Level of evidence: II.
Evaluating the accuracy of orthophotos and 3D models from UAV photogrammetry
NASA Astrophysics Data System (ADS)
Julge, Kalev; Ellmann, Artu
2015-04-01
Rapid development of unmanned aerial vehicles (UAV) in recent years has made their use for various applications more feasible. This contribution evaluates the accuracy and quality of different UAV remote sensing products (i.e. orthorectified images, point clouds and 3D models). Two different autonomous fixed-wing UAV systems were used to collect the aerial photographs. One is a mass-produced commercial UAV system, the other is a similar state-of-the-art UAV system. Three different study areas with varying sizes and characteristics (including urban areas, forests, fields, etc.) were surveyed. The UAV point clouds, 3D models and orthophotos were generated with three different commercial and freeware software packages, and the performance of each of these was evaluated. The effect of flying height on the accuracy of the results was explored, as well as the optimum number and placement of ground control points. The results achieved when the only georeferencing data originate from the UAV system's on-board GNSS and inertial measurement unit are also investigated. Problems regarding the alignment of certain types of aerial photos (e.g. captured over forested areas) are discussed. The quality and accuracy of UAV photogrammetry products are evaluated by comparing them with control measurements made with GNSS on the ground, as well as with high-resolution airborne laser scanning data and other available orthophotos (e.g. those acquired for large-scale national mapping). Vertical comparisons are made on surfaces that have remained unchanged in all campaigns, e.g. paved roads. Planar comparisons are performed by control surveys of objects that are clearly identifiable on orthophotos. The statistics of these differences are used to evaluate the accuracy of UAV remote sensing. Some recommendations are given on how to conduct UAV mapping campaigns cost-effectively and with minimal time consumption while still ensuring the quality and accuracy of the UAV data products. The benefits and drawbacks of UAV remote sensing compared with more traditional methods (e.g. national mapping from airplanes or direct measurements on the ground with GNSS devices or total stations) are also outlined.
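A small sketch of the kind of accuracy statistic used in such vertical comparisons: the root-mean-square error of UAV-derived heights against GNSS control measurements on unchanged surfaces. The coordinates and heights below are made up for illustration.

```python
import math

# Hypothetical check points: (easting, northing, gnss_height, uav_model_height)
check_points = [
    (538201.3, 6584410.7, 45.12, 45.05),
    (538250.9, 6584388.2, 44.98, 45.10),
    (538310.4, 6584402.5, 45.30, 45.21),
]

def vertical_rmse(points):
    """Root-mean-square vertical error of the UAV surface vs GNSS control."""
    residuals = [uav - gnss for _, _, gnss, uav in points]
    return math.sqrt(sum(r * r for r in residuals) / len(residuals))

print(f"vertical RMSE: {vertical_rmse(check_points):.3f} m")
```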
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-13
.... Princess Cruise Lines, Ltd. (Corp), Carnival PLC, and Carnival Corporation; Notice of Filing of Complaint...,'' against Princess Cruise Lines, Ltd (Corp), Carnival plc, and Carnival Corporation hereinafter... from ports in the United States;'' Respondent Carnival plc ``is a corporation established under the...
VFDs: Are They Electrical Parasites?
ERIC Educational Resources Information Center
Frank, Ned
2013-01-01
Variable Frequency Drives (VFDs) are electronic speed controllers used mainly to modulate and reduce the overall speed and power consumption of an electrical motor. They can be used as soft starters for equipment that has a large rotational mass, thus reducing belt wear and large electrical peaks when starting large pieces of equipment. VFDs have…
Johns Hopkins Bloomberg School of Public Health OpenCourseWare
ERIC Educational Resources Information Center
Kanchanaraksa, Sukon; Gooding, Ira; Klaas, Brian; Yager, James D.
2009-01-01
The need for public health knowledge is ever increasing, but the educational options have been limited to coursework delivered by academics to individuals who can afford the cost of tuition at public health institutions. To overcome this disparity, Johns Hopkins Bloomberg School of Public Health (JHSPH) has joined the Massachusetts Institute of…
Fiestaware as an Icon in the Popular Culture of America.
ERIC Educational Resources Information Center
Dale, Sharon; Dale, J. Alexander
Fiestaware, the brightly colored dinnerware first introduced in the United States in 1936, has been a cultural phenomenon from its inception. This paper seeks to explain the extraordinary popularity of Fiestaware and to understand the role the ware occupies in U.S. popular culture. Fiestaware achieved enormous success, in spite of its introduction…
The Impact of Standards-Based Reform: Applying Brantlinger's Critique of "Hierarchical Ideologies"
ERIC Educational Resources Information Center
Bacon, Jessica; Ferri, Beth
2013-01-01
Brantlinger's [2004b. "Ideologies Discerned, Values Determined: Getting past the Hierarchies of Special Education." In "Ideology and the Politics of (in)Exclusion," edited by L. Ware, 11-31. New York: Peter Lang Publishing] critique of hierarchical ideologies lays bare the logics embedded in standards-based reform. Drawing on…
Range Environmental Assessment for Test Area C-52 Complex, Eglin Air Force Base, Florida
2014-10-31
tourism in the region. Promotes and develops general business, trade, and tourism components of the state economy. Chapter 334 Transportation...sale in BWB. 124 Canterbury Circle. Sat. Aug. 23rd, 8am-2pm. Sports equip, housewares, books, puzzles, linens, glassware, all must go! Indoor sale
Teaching Undergraduate Software Engineering Using Open Source Development Tools
2012-01-01
ware. Some example appliances are: a LAMP stack, Redmine, MySQL database, Moodle, Tomcat on Apache, and Bugzilla. Some of the important features...Ada, C, C++, PHP, Python, etc., and also supports a wide range of SDKs such as Google's Android SDK and the Google Web Toolkit SDK. Additionally
CrossTalk: The Journal of Defense Software Engineering. Volume 19, Number 11
2006-11-01
8>. 7. Wallace, Delores R. Practical Software Reliability Modeling. Proc. of the 26th Annual NASA Goddard Software Engineering Workshop, Nov. 2001...STAR WARS TO STAR TREK To Request Back Issues on Topics Not Listed Above, Please Contact <stsc.customerservice@hill.af.mil>. About the Authors Kym
The Educational Situation in Utopia: Why "What Is," Is
ERIC Educational Resources Information Center
Seaman, Jayson; Quay, John
2015-01-01
In this response to Molly Ware's review of our 2013 book, "John Dewey and Education Outdoors," we extend her suggestion that complexity be regarded as an important, generative force in education reform. Drawing on Dewey's 1933 "Utopian Schools" speech, we discuss the "level deeper" that Dewey sought as he…
1993-07-01
dimensions such as timing (e.g., Gibbon & Allan, 1984), numerosity (e.g., Gallistel, 1989), and interresponse-time duration (e.g., Alle-...). References include: ...Behavior, 51, 145-162; Snapper, A. G., & Inglis, G. B. (1985), SKED-11 software system, Kalamazoo; Gallistel, C. R. (1989), Animal cognition: The repre-...
Used Jmol to Help Students Better Understand Fluxional Processes
ERIC Educational Resources Information Center
Coleman, William F.; Fedosky, Edward W.
2006-01-01
This new WebWare combines instructional text and Jmol interactive, animated illustrations that help students visualize the mechanism. It is concluded that by animating the fluxional behavior of a simple model for chiral metal catalyst Sn(amidinate)[subscript 2], in which axial/equatorial exchange within the amidinate rings occurs through a Berry…
Why Do People Share Content? Identifying Why Students Support Sharing Course Material
ERIC Educational Resources Information Center
Tromp, Gerhard Wieger; Long, Phillip D.
2013-01-01
To establish which factors predict student intentions to contribute towards an OpenCourseWare site, an online questionnaire was distributed among University of Queensland students via email. The 320 participants completed items that were based on the theory of planned behaviour and were designed to measure attitudes, subjective norms and perceived…
EXTERIOR VIEW, LOOKING SOUTHEAST, WITH FRONT FACADE AND PORCH. FREE ...
EXTERIOR VIEW, LOOKING SOUTHEAST, WITH FRONT FACADE AND PORCH. FREE STANDING BRICK GABLED ROOF SHOWS EVIDENCE OF RECENT FIRE WHICH PARTIALLY DESTROYED THE PROPERTY WHICH WAS BUILT IN THE 1840S FOR THE THEN IRON MASTER HORACE WARE. - Shelby Iron Works, Iron Master's House, County Road 42, Shelby, Shelby County, AL
Hydrogeologic data from the US Geological Survey test wells near Waycross, Ware County, Georgia
Matthews, S.E.; Krause, R.E.
1983-01-01
Two wells were constructed near Waycross, Ware County, Georgia, from July 1980 to May 1981 to collect stratigraphic, structural, geophysical, hydrologic, hydraulic, and geochemical information for the U.S. Geological Survey Tertiary Limestone Regional Aquifer-System Analysis. Data collection included geologic sampling and coring, borehole geophysical logging, packer testing, water-level measuring, water-quality sampling, and aquifer testing. In the study area, the Tertiary limestone aquifer system is about 1,300 feet thick and is confined and overlain by about 610 feet of clastic sediments. The aquifer system consists of limestone, dolomite, and minor evaporites and has high porosity and permeability. A 4-day continuous discharge aquifer test was conducted, from which a transmissivity of about 1 million feet squared per day and a storage coefficient of 0.0001 were calculated. Water from the upper part of the aquifer is of a calcium bicarbonate type. The deeper highly mineralized zone produces a sodium bicarbonate type water in which concentrations of magnesium, sulfate, chloride, sodium, and some trace metals increase with depth. (USGS)
Marinoni, Nicoletta; D'Alessio, Daniela; Diella, Valeria; Pavese, Alessandro; Francescon, Ferdinando
2013-07-30
The effects of soda-lime waste glass, from the recovery of bottle glass cullet, in partial replacement of Na-feldspar for sanitary-ware ceramic production are discussed. Attention is paid to the mullite growth kinetics and to the macroscopic properties of the final output, the latter depending on the developed microstructures and the vitrification grade. Measurements have been performed by in situ high-temperature X-ray powder diffraction, scanning electron microscopy, thermal dilatometry, water absorption and mechanical testing. Substituting glass for feldspar at 30 to 50 wt% allows one (i) to accelerate the mullite growth reaction kinetics, and (ii) to achieve macroscopic features of the ceramic output that comply with the latest technical requirements. The introduction of waste glass leads to (i) a general saving of fuel and a reduction of CO2 emissions during the firing stage, (ii) a preservation of mineral resources in terms of feldspars, and (iii) an efficient management of the bottle glass refuse by redirecting a part of it into sanitary-ware manufacturing. Copyright © 2013 Elsevier Ltd. All rights reserved.
The educational situation in Utopia: why what is, is
NASA Astrophysics Data System (ADS)
Seaman, Jayson; Quay, John
2015-03-01
In this response to Molly Ware's review of our 2013 book, John Dewey and Education Outdoors, we extend her suggestion that complexity be regarded as an important, generative force in education reform. Drawing on Dewey's 1933 Utopian Schools speech, we discuss the "level deeper" that Dewey sought as he criticized the method/subject matter dichotomy, which he saw as an artifact of social class carried forward in the form of a curricular debate rather than a natural source of tension that would be productive to democratic education. Dewey radically argued that learning itself contained similar anti-democratic potential. Eschewing the false child versus curriculum dichotomy, Dewey believed complexity as a catalyst for educational action would be achieved by engaging children in historically formed occupations, harnessing the forces that drive technological and cultural evolution in order to spur interest, effort, and the formation of social attitudes among students. Following Ware, we suggest that reformers should seek to understand at a level deeper the many sources of complexity they encounter as they both challenge and honor what is.
The role of turbulent suppression in the triggering of ITBs on C-Mod
NASA Astrophysics Data System (ADS)
Zhurovich, K.; Fiore, C. L.; Ernst, D. R.; Bonoli, P. T.; Greenwald, M. J.; Hubbard, A. E.; Hughes, J. W.; Marmar, E. S.; Mikkelsen, D. R.; Phillips, P.; Rice, J. E.
2007-11-01
Internal transport barriers can be routinely produced in C-Mod steady EDA H-mode plasmas by applying ICRF at |r/a| >= 0.5. Access to the off-axis ICRF heated ITBs may be understood within the paradigm of marginal stability. Analysis of the Te profiles shows a decrease of R/LTe in the ITB region as the RF resonance is moved off axis. Ti profiles broaden as the ICRF power deposition changes from on-axis to off-axis. TRANSP calculations of the Ti profiles support this trend. Linear GS2 calculations do not reveal any difference in ETG growth rate profiles for ITB vs. non-ITB discharges. However, they do show that the region of stability to ITG modes widens as the ICRF resonance is moved outward. Non-linear simulations show that the outward turbulent particle flux exceeds the Ware pinch by a factor of 2 in the outer plasma region. Reducing the temperature gradient significantly decreases the diffusive flux and allows the Ware pinch to peak the density profile. Details of these experiments and simulations will be presented.
Migration of formaldehyde from melamine-ware: UK 2008 survey results.
Potter, E L J; Bradley, E L; Davies, C R; Barnes, K A; Castle, L
2010-06-01
Fifty melamine-ware articles were tested for the migration of formaldehyde - with hexamethylenetetramine (HMTA) expressed as formaldehyde - to see whether the total specific migration limit (SML(T)) was being observed. The SML(T), given in European Commission Directive 2002/72/EC as amended, is 15 mg kg⁻¹. Fourier transform-infrared (FT-IR) spectroscopy was carried out on the articles to confirm the plastic type. Articles were exposed to the food simulant 3% (w/v) aqueous acetic acid under conditions representing their worst foreseeable use. Formaldehyde and HMTA in food simulants were determined by a spectrophotometric derivatization procedure. Positive samples were confirmed by a second spectrophotometric procedure using an alternative derivatization agent. As all products purchased were intended for repeat use, three sequential exposures to the simulant were carried out. Formaldehyde was detected in the simulant exposed to 43 samples. Most of the levels found were well below the limits set in law, such that 84% of the samples tested were compliant. However, eight samples had formaldehyde levels that were clearly above the legal maximum, at six to 65 times the SML(T).
Hu, Yu-Chen
2018-01-01
The emergence of smart Internet of Things (IoT) devices has highly favored the realization of smart homes in a down-stream sector of a smart grid. The underlying objective of Demand Response (DR) schemes is to actively engage customers to modify their energy consumption on domestic appliances in response to pricing signals. Domestic appliance scheduling is widely accepted as an effective mechanism to manage domestic energy consumption intelligently, but for residential customers implementing DR, maintaining a balance between energy consumption cost and comfort satisfaction is a challenge. Hence, in this paper, a constrained Particle Swarm Optimization (PSO)-based residential consumer-centric load-scheduling method is proposed. The method can further be deployed with edge computing. In contrast with cloud computing, edge computing, an emerging trend in engineering technology that optimizes cloud computing by pushing computing capability to the IoT edge of the Internet, addresses bandwidth-intensive content and latency-sensitive applications by performing data analytics at or near the source of the data. A non-intrusive load-monitoring technique proposed previously is utilized for automatic determination of the physical characteristics of power-intensive home appliances from users' life patterns. The swarm intelligence method, constrained PSO, is used to minimize the energy consumption cost while considering users' comfort satisfaction for DR implementation. The residential consumer-centric load-scheduling method proposed in this paper is evaluated under real-time pricing with inclining block rates and is demonstrated in a case study. The experimentation reported in this paper shows that the proposed method can reshape home appliance loads in response to DR signals; moreover, a 13.97% reduction in peak power consumption is achieved. PMID:29702607
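A minimal sketch of PSO-based appliance scheduling in the spirit of the method described above; the hourly prices, appliance parameters, comfort weight, and PSO coefficients are all invented for illustration and are not the paper's settings.

```python
import random

# Hedged sketch: schedule one shiftable appliance's start hour to trade off
# energy cost against deviation from the user's preferred start time.
PRICE = [0.08]*7 + [0.15]*4 + [0.22]*6 + [0.15]*5 + [0.08]*2   # $/kWh per hour (invented)
POWER_KW, RUN_HOURS, PREFERRED_START, COMFORT_W = 2.0, 3, 18, 0.05

def cost(start):
    start = int(round(start)) % 24
    energy = sum(PRICE[(start + h) % 24] for h in range(RUN_HOURS)) * POWER_KW
    discomfort = COMFORT_W * abs(start - PREFERRED_START)
    return energy + discomfort

def pso(n=20, iters=60):
    xs = [random.uniform(0, 23) for _ in range(n)]
    vs = [0.0] * n
    pbest = xs[:]                                   # personal bests
    gbest = min(xs, key=cost)                       # global best
    for _ in range(iters):
        for i in range(n):
            r1, r2 = random.random(), random.random()
            vs[i] = 0.7*vs[i] + 1.5*r1*(pbest[i]-xs[i]) + 1.5*r2*(gbest-xs[i])
            xs[i] = min(23.0, max(0.0, xs[i] + vs[i]))   # keep within the day
            if cost(xs[i]) < cost(pbest[i]):
                pbest[i] = xs[i]
        gbest = min(pbest, key=cost)
    return int(round(gbest)) % 24, cost(gbest)

print(pso())
```

In a fuller implementation each appliance would contribute its own decision variables and constraints, and the comfort term would reflect the non-intrusive load-monitoring results rather than a fixed preferred hour.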
Grid and Cloud for Developing Countries
NASA Astrophysics Data System (ADS)
Petitdidier, Monique
2014-05-01
The European Grid e-infrastructure has shown the capacity to connect geographically distributed, heterogeneous compute resources in a secure way, taking advantage of a robust and fast REN (Research and Education Network). In many countries, as in Africa, the first step has been to implement a REN, with regional organizations such as Ubuntunet, WACREN or ASREN coordinating the development and improvement of the network and its interconnection. Internet connectivity in those countries is still growing rapidly. The second step has been to meet the computing needs of scientists. Even though many of them have their own laptops, multi-core or not, for more and more applications this is not enough, because they face intensive computing driven by the large amounts of data to be processed and/or complex codes. So far, one solution has been to go abroad, to Europe or America, to run large applications, or simply not to participate in international communities. The Grid is very attractive for connecting geographically distributed heterogeneous resources, aggregating new ones, and creating new sites on the REN with secure access. All users get the same services even if they have no resources in their own institute. With faster and more robust Internet connections they will be able to take advantage of the European Grid. There are various initiatives to provide resources and training, such as the UNESCO/HP Brain Gain initiative and EUMEDGrid. Nowadays Cloud computing is also becoming very attractive and is starting to be developed in some of these countries. This talk presents the challenges these countries face in implementing such e-infrastructures and in developing, in parallel, scientific and technical research and education in the new technologies, illustrated by examples.
Birschmann, Ingvild; Dittrich, Marcus; Eller, Thomas; Wiegmann, Bettina; Reininger, Armin J; Budde, Ulrich; Strüber, Martin
2014-01-01
Thromboembolic and bleeding events in patients with a left ventricular assist device (LVAD) are still a major cause of complications. Therefore, the balance between anti-coagulant and pro-coagulant factors needs to be tightly controlled. The principal hypothesis of this study is that different pump designs may have an effect on hemolysis and activation of the coagulation system. Referring to this, the HeartMate II (HMII; Thoratec Corp, Pleasanton, CA) and the HeartWare HVAD (HeartWare International Inc, Framingham, MA) were investigated. For 20 patients with LVAD support (n = 10 each), plasma coagulation, full blood count, and clinical chemistry parameters were measured. Platelet function was monitored using platelet aggregometry, the platelet function analyzer-100 system (Siemens, Marburg, Germany), the vasodilator-stimulated phosphoprotein phosphorylation assay, immature platelet fraction, platelet-derived microparticles, and von Willebrand diagnostics. Acquired von Willebrand syndrome could be detected in all patients. Signs of hemolysis, as measured by lactate dehydrogenase levels (mean, 470 U/liter HMII, 250 U/liter HVAD; p < 0.001), were more pronounced in the HMII patients. In contrast, D-dimer analysis indicated a significantly higher activation of the coagulation system in HVAD patients (mean, 0.94 mg/liter HMII, 2.01 mg/liter HVAD; p < 0.01). The efficacy of anti-platelet therapy using clopidogrel was not sufficient in more than 50% of the patients. Our results support the finding that all patients with rotary blood pumps suffered from von Willebrand syndrome. In addition, a distinct footprint of effects on hemolysis and the coagulation system can be attributed to the different devices. As a consequence, the individual status of the coagulation system needs to be controlled in long-term patients. © 2013 Published by International Society for the Heart and Lung Transplantation on behalf of International Society for Heart and Lung Transplantation.
Jorde, Ulrich P; Aaronson, Keith D; Najjar, Samer S; Pagani, Francis D; Hayward, Christopher; Zimpfer, Daniel; Schlöglhofer, Thomas; Pham, Duc T; Goldstein, Daniel J; Leadley, Katrin; Chow, Ming-Jay; Brown, Michael C; Uriel, Nir
2015-11-01
The study sought to characterize patterns in the HeartWare (HeartWare Inc., Framingham, Massachusetts) ventricular assist device (HVAD) log files associated with successful medical treatment of device thrombosis. Device thrombosis is a serious adverse event for mechanical circulatory support devices and is often preceded by increased power consumption. Log files of the pump power are easily accessible on the bedside monitor of HVAD patients and may allow early diagnosis of device thrombosis. Furthermore, analysis of the log files may be able to predict the success rate of thrombolysis or the need for pump exchange. The log files of 15 ADVANCE trial patients (algorithm derivation cohort) with 16 pump thrombus events treated with tissue plasminogen activator (tPA) were assessed for changes in the absolute and rate of increase in power consumption. Successful thrombolysis was defined as a clinical resolution of pump thrombus including normalization of power consumption and improvement in biochemical markers of hemolysis. Significant differences in log file patterns between successful and unsuccessful thrombolysis treatments were verified in 43 patients with 53 pump thrombus events implanted outside of clinical trials (validation cohort). The overall success rate of tPA therapy was 57%. Successful treatments had significantly lower measures of percent of expected power (130.9% vs. 196.1%, p = 0.016) and rate of increase in power (0.61 vs. 2.87, p < 0.0001). Medical therapy was successful in 77.7% of the algorithm development cohort and 81.3% of the validation cohort when the rate of power increase and percent of expected power values were <1.25% and 200%, respectively. Log file parameters can potentially predict the likelihood of successful tPA treatments and if validated prospectively, could substantially alter the approach to thrombus management. Copyright © 2015 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
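A schematic rendering of the reported decision rule (percent of expected power below 200% and rate of power increase below 1.25) as code; how those two features are extracted from the raw log files, and the units of the rate, are simplified assumptions.

```python
# Schematic use of the reported cut-offs; not a clinical tool, and the feature
# extraction from raw HVAD logs is deliberately left out of this sketch.
PCT_EXPECTED_CUTOFF = 200.0
RATE_CUTOFF = 1.25

def tpa_likely_successful(pct_expected_power, power_rise_rate):
    """Return True when both log-file features fall below the cut-offs
    associated with successful thrombolysis in the study."""
    return pct_expected_power < PCT_EXPECTED_CUTOFF and power_rise_rate < RATE_CUTOFF

print(tpa_likely_successful(130.9, 0.61))   # pattern like the successful group
print(tpa_likely_successful(196.1, 2.87))   # pattern like the unsuccessful group
```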
Analysis of melamine migration from melamine food contact articles.
Chik, Z; Haron, D E Mohamad; Ahmad, E D; Taha, H; Mustafa, A M
2011-01-01
Migration of melamine has been determined for 41 types of retail melamine-ware products in Malaysia. This study was initiated by the Ministry of Health, Malaysia, in the midst of public anxiety over the possibility of melamine leaching into foods that come into contact with the melamine-ware. Thus, the objective of this study was to investigate the level of melamine migration from melamine utensils available on the market. Samples of melamine tableware, including cups and plates, forks and spoons, tumblers, bowls, etc., were collected from various retail outlets. Following the test guidelines for melamine migration set by the European Committee for Standardisation (CEN 2004), with some modifications, the samples were exposed to two types of food simulants (3% acetic acid and distilled water) at three test conditions (25°C (room temperature), 70 and 100°C) for 30 min. Melamine analysis was carried out using LC-MS/MS with a HILIC column and a mobile phase consisting of ammonium acetate/formic acid (0.05%) in water and ammonium acetate/formic acid (0.05%) in acetonitrile (95 : 5, v/v). The limit of quantification (LOQ) was 5 ng/ml. Melamine migration was detected from all samples. For the articles tested with distilled water, melamine migration was [median (interquartile range)] 22.2 (32.6), 49.3 (50.9), 84.9 (89.9) ng/ml at room temperature (25°C), 70 and 100°C, respectively. In 3% acetic acid, melamine migration was 31.5 (35.7), 81.5 (76.2), 122.0 (126.7) ng/ml at room temperature (25°C), 70 and 100°C, respectively. This study suggests that excessive heat and acidity may directly affect melamine migration from melamine-ware products. However, the results showed that melamine migration from the tested items was well below the specific migration limit (SML) of 30 mg/kg (30,000 ng/ml) set out in European Commission Directive 2002/72/EC.
Implementing Internet of Things in a military command and control environment
NASA Astrophysics Data System (ADS)
Raglin, Adrienne; Metu, Somiya; Russell, Stephen; Budulas, Peter
2017-05-01
While the term Internet of Things (IoT) has been coined relatively recently, it has deep roots in multiple other areas of research including cyber-physical systems, pervasive and ubiquitous computing, embedded systems, mobile ad-hoc networks, wireless sensor networks, cellular networks, wearable computing, cloud computing, big data analytics, and intelligent agents. As the Internet of Things, these technologies have created a landscape of diverse heterogeneous capabilities and protocols that will require adaptive controls to effect linkages and changes that are useful to end users. In the context of military applications, it will be necessary to integrate disparate IoT devices into a common platform that necessarily must interoperate with proprietary military protocols, data structures, and systems. In this environment, IoT devices and data will not be homogeneous and provenance-controlled (i.e. single vendor/source/supplier owned). This paper presents a discussion of the challenges of integrating varied IoT devices and related software in a military environment. A review of contemporary commercial IoT protocols is given and as a practical example, a middleware implementation is proffered that provides transparent interoperability through a proactive message dissemination system. The implementation is described as a framework through which military applications can integrate and utilize commercial IoT in conjunction with existing military sensor networks and command and control (C2) systems.
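As a rough illustration of proactive message dissemination through a middleware layer, the Python sketch below implements a minimal topic-based publish/subscribe broker; the topic names and payload fields are hypothetical and do not correspond to the implementation described in the paper.

```python
from collections import defaultdict

# Minimal topic-based dissemination sketch; topics and payloads are invented.
class Broker:
    def __init__(self):
        self.subscribers = defaultdict(list)   # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        """Proactively push each message to every consumer of the topic."""
        for callback in self.subscribers[topic]:
            callback(topic, message)

broker = Broker()
broker.subscribe("sensors/acoustic", lambda t, m: print("C2 display:", t, m))
broker.subscribe("sensors/acoustic", lambda t, m: print("archive:", t, m))
broker.publish("sensors/acoustic", {"node": 17, "db": 74.2, "ts": 1690000000})
```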
Hybrid teaching method for undergraduate student in Marine Geology class in Indonesia
NASA Astrophysics Data System (ADS)
Yusuf Awaluddin, M.; Yuliadi, Lintang
2016-04-01
Bridging the Geosciences to future generations in interesting and interactive ways is challenging for lecturers and teachers. In the past, one-way 'classic' face-to-face teaching was the only method used for the undergraduate Marine Geology class at Padjadjaran University, Indonesia. Internet use in Indonesia has increased significantly, notably among young people and students, yet the Internet is still little used as a teaching medium for the Geosciences in Indonesia. Here we have combined the classic and online methods for undergraduate teaching. The case study was the Marine Geology class at Padjadjaran University, with 70 students as participants and 2 instructors. We used the Edmodo platform as the primary tool in our teaching and Dropbox as cloud storage. All online teaching activities, such as assignments, quizzes, discussions and examinations, were carried out alongside the classic ones, in a proportion of roughly 60% and 40% respectively. We found that the students had a different experience with this hybrid teaching method, as shown in their feedback through the platform. The hybrid method offers interactive ways of working, not only between the lecturers and the students but also among the students themselves. Classroom meetings are still needed for students to present their work and for general discussion. The only problem was the lack of Internet access on campus when all our students accessed the platform at the same time.
CloVR-Comparative: automated, cloud-enabled comparative microbial genome sequence analysis pipeline.
Agrawal, Sonia; Arze, Cesar; Adkins, Ricky S; Crabtree, Jonathan; Riley, David; Vangala, Mahesh; Galens, Kevin; Fraser, Claire M; Tettelin, Hervé; White, Owen; Angiuoli, Samuel V; Mahurkar, Anup; Fricke, W Florian
2017-04-27
The benefit of increasing genomic sequence data to the scientific community depends on easy-to-use, scalable bioinformatics support. CloVR-Comparative combines commonly used bioinformatics tools into an intuitive, automated, and cloud-enabled analysis pipeline for comparative microbial genomics. CloVR-Comparative runs on annotated complete or draft genome sequences that are uploaded by the user or selected via a taxonomic tree-based user interface and downloaded from NCBI. CloVR-Comparative runs reference-free multiple whole-genome alignments to determine unique, shared and core coding sequences (CDSs) and single nucleotide polymorphisms (SNPs). Output includes short summary reports and detailed text-based results files, graphical visualizations (phylogenetic trees, circular figures), and a database file linked to the Sybil comparative genome browser. Data up- and download, pipeline configuration and monitoring, and access to Sybil are managed through CloVR-Comparative web interface. CloVR-Comparative and Sybil are distributed as part of the CloVR virtual appliance, which runs on local computers or the Amazon EC2 cloud. Representative datasets (e.g. 40 draft and complete Escherichia coli genomes) are processed in <36 h on a local desktop or at a cost of <$20 on EC2. CloVR-Comparative allows anybody with Internet access to run comparative genomics projects, while eliminating the need for on-site computational resources and expertise.
Web-Based Satellite Products Database for Meteorological and Climate Applications
NASA Technical Reports Server (NTRS)
Phan, Dung; Spangenberg, Douglas A.; Palikonda, Rabindra; Khaiyer, Mandana M.; Nordeen, Michele L.; Nguyen, Louis; Minnis, Patrick
2004-01-01
The need for ready access to satellite data and associated physical parameters such as cloud properties has been steadily growing. Air traffic management, weather forecasters, energy producers, and weather and climate researchers, among others, can utilize more satellite information than in the past. Thus, it is essential that such data are made available in near real-time and as archival products in an easy-access and user-friendly environment. A host of Internet web sites currently provide a variety of satellite products for various applications. Each site has a unique contribution with appeal to a particular segment of the public and scientific community. This is no less true for NASA Langley's Clouds and Radiation (NLCR) website (http://www-pm.larc.nasa.gov), which has been evolving over the past 10 years to support a variety of research projects. This website was originally developed to display cloud products derived from the Geostationary Operational Environmental Satellite (GOES) over the Southern Great Plains for the Atmospheric Radiation Measurement (ARM) Program. It has evolved into a site providing a comprehensive database of near real-time and historical satellite products used for meteorological, aviation, and climate studies. To encourage the user community to take advantage of the site, this paper summarizes the various products and projects supported by the website and discusses future options for new datasets.
Looking at Earth from Space: Teacher's Guide with Activities for Earth and Space Science
NASA Technical Reports Server (NTRS)
Steele, Colleen (Editor); Steele, Colleen; Ryan, William F.
1995-01-01
The Maryland Pilot Earth Science and Technology Education Network (MAPS-NET) project was sponsored by the National Aeronautics and Space Administration (NASA) to enrich teacher preparation and classroom learning in the area of Earth system science. This publication includes a teacher's guide that replicates material taught during a graduate-level course of the project and activities developed by the teachers. The publication was developed to provide teachers with a comprehensive approach to using satellite imagery to enhance science education. The teacher's guide is divided into topical chapters and enables teachers to expand their knowledge of the atmosphere, common weather patterns, and remote sensing. Topics include: weather systems and satellite imagery including mid-latitude weather systems; wave motion and the general circulation; cyclonic disturbances and baroclinic instability; clouds; additional common weather patterns; satellite images and the internet; environmental satellites; orbits; and ground station set-up. Activities are listed by suggested grade level and include the following topics: using weather symbols; forecasting the weather; cloud families and identification; classification of cloud types through infrared Automatic Picture Transmission (APT) imagery; comparison of visible and infrared imagery; cold fronts; to ski or not to ski (imagery as a decision making tool), infrared and visible satellite images; thunderstorms; looping satellite images; hurricanes; intertropical convergence zone; and using weather satellite images to enhance a study of the Chesapeake Bay. A list of resources is also included.
NASA Astrophysics Data System (ADS)
Xu, Chong-Yao; Zheng, Xin; Xiong, Xiao-Ming
2017-02-01
With the development of the Internet of Things (IoT) and the popularity of intelligent mobile terminals, smart home systems have come into public view. However, due to high cost, complex installation and inconvenience, as well as network security issues, smart home systems have not been widely adopted. In this paper, combining Wi-Fi technology, the Android system, a cloud server and the SSL security protocol, a new smart home system is designed, with low cost, easy operation, and high security and stability. The system consists of Wi-Fi smart nodes (WSNs), an Android client and a cloud server. In order to reduce system cost and installation complexity, the Wi-Fi transceiver, appliance control logic and data conversion in each WSN are implemented on a single chip. In addition, all WSN data can be uploaded to the server through the home router, without having to pass through a separate gateway. All appliance status information and environmental information are preserved in the cloud server. Furthermore, to ensure the security of information, the Secure Sockets Layer (SSL) protocol is used in the WSN communication with the server. Moreover, to improve comfort and simplify operation, the Android client is designed around a room layout, making control of home appliances more realistic and more convenient.
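A minimal sketch of a node reporting status to a cloud server over an encrypted channel, using Python's standard ssl module for illustration; the server address, port, and JSON message format are assumptions, not the protocol of the system described above.

```python
import json, socket, ssl

# Hedged sketch of a node-to-cloud status report over TLS; the server address,
# port, and message format are assumptions, not the paper's design.
SERVER, PORT = "cloud.example.com", 8883

def report_status(payload):
    context = ssl.create_default_context()         # verifies the server certificate
    with socket.create_connection((SERVER, PORT)) as raw:
        with context.wrap_socket(raw, server_hostname=SERVER) as tls:
            tls.sendall(json.dumps(payload).encode("utf-8"))
            return tls.recv(1024)                   # e.g. an acknowledgement

# report_status({"node": "living-room-ac", "power": "on", "temp_c": 26.5})
```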
Usability evaluation of cloud-based mapping tools for the display of very large datasets
NASA Astrophysics Data System (ADS)
Stotz, Nicole Marie
The elasticity and on-demand nature of cloud services have made it easier to create web maps. Users only need access to a web browser and the Internet to utilize cloud-based web maps, eliminating the need for specialized software. To encourage a wide variety of users, a map must be well designed; usability is a very important concept in designing a web map. Fusion Tables, a new product from Google, is one example of newer cloud-based distributed GIS services. It allows for easy spatial data manipulation and visualization within the Google Maps framework. ESRI has also introduced a cloud-based version of their software, called ArcGIS Online, built on Amazon's EC2 cloud. Utilizing a user-centered design framework, two prototype maps were created with data from the San Diego East County Economic Development Council. One map was built on Fusion Tables, and another on ESRI's ArcGIS Online. A usability analysis was conducted and used to compare both map prototypes in terms of design and functionality. Load tests were also run, and performance metrics gathered on both map prototypes. The usability analysis was completed by 25 geography students and consisted of time-based tasks and questions on map design and functionality. Survey participants completed the time-based tasks for the Fusion Tables map prototype more quickly than those of the ArcGIS Online map prototype. While response was generally positive towards the design and functionality of both prototypes, overall the Fusion Tables map prototype was preferred. For the load tests, the data set was broken into 22 groups for a total of 44 tests. While the Fusion Tables map prototype performed more efficiently than the ArcGIS Online prototype, the differences were almost unnoticeable. A SWOT analysis was conducted for each prototype. The results from this research point to the Fusion Tables map prototype as the better option. A redesign of this prototype would incorporate design suggestions from the usability survey, while some functionality would need to be dropped. This is a free product and would therefore be the best option if cost is an issue, but this map may not be supported in the future.
Facilitating Secure Sharing of Personal Health Data in the Cloud
Nepal, Surya; Glozier, Nick
2016-01-01
Background Internet-based applications are providing new ways of promoting health and reducing the cost of care. Although data can be kept encrypted in servers, the user does not have the ability to decide whom the data are shared with. Technically this is linked to the problem of who owns the data encryption keys required to decrypt the data. Currently, cloud service providers, rather than users, have full rights to the key. In practical terms this makes the users lose full control over their data. Trust and uptake of these applications can be increased by allowing patients to feel in control of their data, generally stored in cloud-based services. Objective This paper addresses this security challenge by providing the user a way of controlling encryption keys independently of the cloud service provider. We provide a secure and usable system that enables a patient to share health information with doctors and specialists. Methods We contribute a secure protocol for patients to share their data with doctors and others on the cloud while keeping complete ownership. We developed a simple, stereotypical health application and carried out security tests, performance tests, and usability tests with both students and doctors (N=15). Results We developed the health application as an app for Android mobile phones. We carried out the usability tests on potential participants and medical professionals. Of 20 participants, 14 (70%) either agreed or strongly agreed that they felt safer using our system. Using mixed methods, we show that participants agreed that privacy and security of health data are important and that our system addresses these issues. Conclusions We presented a security protocol that enables patients to securely share their eHealth data with doctors and nurses and developed a secure and usable system that enables patients to share mental health information with doctors. PMID:27234691
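A minimal sketch of the patient-held-key idea using the Fernet primitive from the Python cryptography package; this is an illustration of client-side encryption with user-controlled keys, not the paper's actual protocol, and the record contents are invented.

```python
from cryptography.fernet import Fernet

# Hedged sketch of patient-held keys: the record is encrypted on the patient's
# device, the cloud stores only ciphertext, and the data key is released to a
# clinician by the patient, not by the cloud provider.
data_key = Fernet.generate_key()                  # stays with the patient
record = b'{"phq9": 14, "note": "sleep disturbance"}'

ciphertext = Fernet(data_key).encrypt(record)     # what the cloud stores
# ... upload ciphertext to the cloud service ...

# When the patient chooses to share, the data key (ideally wrapped under the
# doctor's public key) lets the doctor decrypt locally:
print(Fernet(data_key).decrypt(ciphertext))
```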
NASA Astrophysics Data System (ADS)
Snyder, P. L.; Brown, V. W.
2017-12-01
IBM has created a general purpose, data-agnostic solution that provides high performance, low data latency, high availability, scalability, and persistent access to the captured data, regardless of source or type. This capability is hosted on commercially available cloud environments and uses much faster, more efficient, reliable, and secure data transfer protocols than the more typically used FTP. The design incorporates completely redundant data paths at every level, including at the cloud data center level, in order to provide the highest assurance of data availability to the data consumers. IBM has been successful in building and testing a Proof of Concept instance on our IBM Cloud platform to receive and disseminate actual GOES-16 data as it is being downlinked. This solution leverages the inherent benefits of a cloud infrastructure configured and tuned for continuous, stable, high-speed data dissemination to data consumers worldwide at the downlink rate. It also is designed to ingest data from multiple simultaneous sources and disseminate data to multiple consumers. Nearly linear scalability is achieved by adding servers and storage. The IBM Proof of Concept system has been tested with our partners to achieve in excess of 5 Gigabits/second over public Internet infrastructure. In tests with live GOES-16 data, the system routinely achieved 2.5 Gigabits/second pass-through to The Weather Company from the University of Wisconsin-Madison SSEC. Simulated data was also transferred from the Cooperative Institute for Climate and Satellites — North Carolina to The Weather Company, as well. The storage node allocated to our Proof of Concept system as tested was sized at 480 Terabytes of RAID protected disk as a worst case sizing to accommodate the data from four GOES-16 class satellites for 30 days in a circular buffer. This shows that an abundance of performance and capacity headroom exists in the IBM design that can be applied to additional missions.
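A quick back-of-the-envelope check of that buffer sizing, as a sketch: the sustained per-satellite rate below is derived only from the 480 TB, 4-satellite, 30-day figures quoted above, not from any mission specification.

```python
# Back-of-the-envelope check of the 480 TB, 4-satellite, 30-day circular buffer.
BUFFER_TB, SATELLITES, DAYS = 480, 4, 30

buffer_bits = BUFFER_TB * 1e12 * 8                 # decimal terabytes to bits
seconds = DAYS * 86400
aggregate_gbps = buffer_bits / seconds / 1e9       # sustained ingest the buffer implies
per_satellite_mbps = aggregate_gbps * 1000 / SATELLITES

print(f"aggregate ingest ~{aggregate_gbps:.2f} Gb/s, "
      f"~{per_satellite_mbps:.0f} Mb/s per satellite")   # ~1.48 Gb/s, ~370 Mb/s
```

That implied aggregate rate of roughly 1.5 Gb/s sits well below the 2.5 Gb/s observed with live GOES-16 data and the 5 Gb/s achieved in testing, consistent with the headroom claim in the abstract.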