Sample records for personal computer platform

  1. Discovery and analysis of time delay sources in the USGS personal computer data collection platform (PCDCP) system

    USGS Publications Warehouse

    White, Timothy C.; Sauter, Edward A.; Stewart, Duff C.

    2014-01-01

    Intermagnet is an international oversight group that exists to establish a global network of geomagnetic observatories. This group establishes data standards and standard operating procedures for members and prospective members. Intermagnet has proposed a new One-Second Data Standard for that emerging geomagnetic product. The standard specifies that all data collected must have a time stamp accuracy within ±10 milliseconds of the top-of-the-second Coordinated Universal Time. Therefore, the U.S. Geological Survey Geomagnetism Program has designed and executed several tests on its current data collection system, the Personal Computer Data Collection Platform. The tests are designed to measure the time shifts introduced by individual components within the data collection system, as well as the time shift introduced by the entire Personal Computer Data Collection Platform. Additional testing designed for Intermagnet will be used to further validate such measurements. Current results showed a 5.0–19.9 millisecond lag for the vertical channel (Z) of the Personal Computer Data Collection Platform and a 13.0–25.8 millisecond lag for the horizontal channels (H and D) of the collection system. These measurements represent a dynamically changing delay introduced within the U.S. Geological Survey Personal Computer Data Collection Platform.
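
    A channel lag of the kind reported above can be estimated by cross-correlating a recorded channel against a reference timing signal. The sketch below is not the USGS test procedure (the record does not detail it); it simply recovers a known 15 ms delay from synthetic data sampled at 1 kHz.

```python
import numpy as np

def estimate_lag_ms(reference, measured, sample_rate_hz):
    """Estimate the delay of `measured` relative to `reference` (in ms)
    from the peak of their full cross-correlation."""
    ref = reference - np.mean(reference)
    meas = measured - np.mean(measured)
    corr = np.correlate(meas, ref, mode="full")
    # Re-centre the peak index so that 0 means "no shift".
    shift_samples = np.argmax(corr) - (len(ref) - 1)
    return 1000.0 * shift_samples / sample_rate_hz

# Synthetic check: a 1 Hz sine sampled at 1 kHz, delayed by 15 samples (15 ms).
t = np.arange(0, 2, 0.001)
ref = np.sin(2 * np.pi * 1.0 * t)
meas = np.roll(ref, 15)
print(estimate_lag_ms(ref, meas, 1000))   # 15.0
```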

  2. Traffic information computing platform for big data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duan, Zongtao, E-mail: ztduan@chd.edu.cn; Li, Ying; Zheng, Xibin

    A big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the characteristics of big data and of traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee safe and efficient traffic operation, and more intelligent and personalized traffic information services can be offered to traffic information users.

  3. To Mac or Not To Mac? One Apple Devotee's Excruciating Purchase Dilemma.

    ERIC Educational Resources Information Center

    Shenk, David

    1998-01-01

    Discusses the pros and cons of selecting Apple Macintosh computers versus a personal computer that runs the Windows platform. Graphical user interfaces, current and future support, and aesthetics are considered, as well as personal preferences. (LRW)

  4. A Dedicated Computational Platform for Cellular Monte Carlo T-CAD Software Tools

    DTIC Science & Technology

    2015-07-14

    computer that establishes an encrypted Virtual Private Network (OpenVPN [44]) based on the Secure Socket Layer (SSL) paradigm. Each user is given a...security certificate for each device used to connect to the computing nodes. Stable OpenVPN clients are available for Linux, Microsoft Windows, Apple OSX...platform is granted by an encrypted connection based on the Secure Socket Layer (SSL) protocol, and implemented in the OpenVPN Virtual Private Network

  5. Design Tools for Accelerating Development and Usage of Multi-Core Computing Platforms

    DTIC Science & Technology

    2014-04-01

    Government formulated or supplied the drawings, specifications, or other data does not license the holder or any other person or corporation; or convey...multicore PDSP platforms. The GPU-based capabilities of TDIF are currently oriented towards NVIDIA GPUs, based on the Compute Unified Device Architecture...(CUDA) programming language [NVIDIA 2007], which can be viewed as an extension of C. The multicore PDSP capabilities currently in TDIF are oriented

  6. Homomorphic encryption experiments on IBM's cloud quantum computing platform

    NASA Astrophysics Data System (ADS)

    Huang, He-Liang; Zhao, You-Wei; Li, Tan; Li, Feng-Guang; Du, Yu-Tao; Fu, Xiang-Qun; Zhang, Shuo; Wang, Xiang; Bao, Wan-Su

    2017-02-01

    Quantum computing has undergone rapid development in recent years. Owing to limitations on scalability, personal quantum computers still seem slightly unrealistic in the near future. The first practical quantum computer for ordinary users is likely to be on the cloud. However, the adoption of cloud computing is possible only if security is ensured. Homomorphic encryption is a cryptographic protocol that allows computation to be performed on encrypted data without decrypting them, so it is well suited to cloud computing. Here, we first applied homomorphic encryption on IBM's cloud quantum computer platform. In our experiments, we successfully implemented a quantum algorithm for linear equations while protecting our privacy. This demonstration opens a feasible path to the next stage of development of cloud quantum information technology.
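
    The protocol in this record is quantum, but the homomorphic property it relies on, computing on ciphertexts without ever decrypting them, can be illustrated classically: textbook (unpadded) RSA is multiplicatively homomorphic. The toy parameters below are for illustration only; real deployments use padded RSA (which is not homomorphic) or dedicated schemes such as Paillier or BGV.

```python
# Classical illustration only: Enc(a) * Enc(b) mod n decrypts to a * b mod n.
p, q = 61, 53
n = p * q                  # 3233
phi = (p - 1) * (q - 1)    # 3120
e = 17
d = pow(e, -1, phi)        # modular inverse of e, here 2753 (Python 3.8+)

def enc(m): return pow(m, e, n)
def dec(c): return pow(c, d, n)

a, b = 7, 12
c = (enc(a) * enc(b)) % n   # the server multiplies ciphertexts only
print(dec(c))               # 84 == a * b
```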

  7. IAServ: an intelligent home care web services platform in a cloud for aging-in-place.

    PubMed

    Su, Chuan-Jun; Chiang, Chang-Yu

    2013-11-12

    As the elderly population has been rapidly expanding and the core tax-paying population has been shrinking, the need for adequate elderly health and housing services continues to grow while the resources to provide such services are becoming increasingly scarce. Thus, increasing the efficiency of the delivery of healthcare services through the use of modern technology is a pressing issue. The seamless integration of such enabling technologies as ontology, intelligent agents, web services, and cloud computing is transforming healthcare from hospital-based treatments to home-based self-care and preventive care. A ubiquitous healthcare platform based on this technological integration, which synergizes service providers with patients' needs, needs to be developed to provide personalized healthcare services at the right time, in the right place, and in the right manner. This paper presents the development and overall architecture of IAServ (the Intelligent Aging-in-place Home care Web Services Platform), which provides personalized healthcare services ubiquitously in a cloud computing setting to support the most desirable and cost-efficient method of care for the aged: aging in place. IAServ is expected to offer intelligent, pervasive, accurate and contextually-aware personal care services. Architecturally, the implemented IAServ leverages web services and cloud computing to provide economical, scalable, and robust healthcare services over the Internet.

  8. IAServ: An Intelligent Home Care Web Services Platform in a Cloud for Aging-in-Place

    PubMed Central

    Su, Chuan-Jun; Chiang, Chang-Yu

    2013-01-01

    As the elderly population has been rapidly expanding and the core tax-paying population has been shrinking, the need for adequate elderly health and housing services continues to grow while the resources to provide such services are becoming increasingly scarce. Thus, increasing the efficiency of the delivery of healthcare services through the use of modern technology is a pressing issue. The seamless integration of such enabling technologies as ontology, intelligent agents, web services, and cloud computing is transforming healthcare from hospital-based treatments to home-based self-care and preventive care. A ubiquitous healthcare platform based on this technological integration, which synergizes service providers with patients' needs, needs to be developed to provide personalized healthcare services at the right time, in the right place, and in the right manner. This paper presents the development and overall architecture of IAServ (the Intelligent Aging-in-place Home care Web Services Platform), which provides personalized healthcare services ubiquitously in a cloud computing setting to support the most desirable and cost-efficient method of care for the aged: aging in place. IAServ is expected to offer intelligent, pervasive, accurate and contextually-aware personal care services. Architecturally, the implemented IAServ leverages web services and cloud computing to provide economical, scalable, and robust healthcare services over the Internet. PMID:24225647

  9. The Personal Motion Platform

    NASA Technical Reports Server (NTRS)

    Park, Brian Vandellyn

    1993-01-01

    The Neutral Body Posture experienced in microgravity creates a biomechanical equilibrium by enabling the internal forces within the body to find their own balance. A patented reclining chair based on this posture provides a minimal-stress environment for interfacing with computer systems for extended periods. When the chair is mounted on a 3- or 6-axis motion platform, a generic motion simulator for simulated digital environments is created. The Personal Motion Platform provides motion feedback to the occupant in synchronization with their movements inside the digital world, which enhances the simulation experience. Existing HMD-based simulation systems can be integrated into the turnkey system. Future developments are discussed.

  10. NEW GIS WATERSHED ANALYSIS TOOLS FOR SOIL CHARACTERIZATION AND EROSION AND SEDIMENTATION MODELING

    EPA Science Inventory

    A comprehensive procedure for computing soil erosion and sediment delivery metrics has been developed which utilizes a suite of automated scripts and a pair of processing-intensive executable programs operating on a personal computer platform.

  11. Climate@Home: Crowdsourcing Climate Change Research

    NASA Astrophysics Data System (ADS)

    Xu, C.; Yang, C.; Li, J.; Sun, M.; Bambacus, M.

    2011-12-01

    Climate change deeply impacts human wellbeing. Significant amounts of resources have been invested in building super-computers that are capable of running advanced climate models, which help scientists understand climate change mechanisms and predict its trend. Although climate change influences all human beings, the general public is largely excluded from the research. On the other hand, scientists are eagerly seeking communication mediums for effectively enlightening the public on climate change and its consequences. The Climate@Home project is devoted to connecting the two ends with an innovative solution: crowdsourcing climate computing to the general public by harvesting volunteered computing resources from participants. A distributed web-based computing platform will be built to support climate computing, and the general public can 'plug in' their personal computers to participate in the research. People contribute the spare computing power of their computers to run a climate model, which scientists use to predict climate change. Traditionally, only super-computers could handle such a large processing load. By orchestrating massive numbers of personal computers to perform atomized data processing tasks, the investment in new super-computers, the energy consumed by super-computers, and the carbon released by super-computers are all reduced. Meanwhile, the platform forms a social network of climate researchers and the general public, which may be leveraged to raise climate awareness among participants. A portal is to be built as the gateway to the Climate@Home project. Three types of roles and the corresponding functionalities are designed and supported. The end users include citizen participants, climate scientists, and project managers. Citizen participants connect their computing resources to the platform by downloading and installing a computing engine on their personal computers. Climate models are defined at the server side. Climate scientists configure model parameters through the portal user interface. After model configuration, scientists launch the computing task. Next, data is atomized and distributed to the computing engines running on citizen participants' computers. Scientists receive notifications on the completion of computing tasks and examine modeling results via the visualization modules of the portal. Computing tasks, computing resources, and participants are managed by project managers via portal tools. A portal prototype has been built as a proof of concept. Three forums have been set up for different groups of users to share information on the science, technology, and educational outreach aspects. A Facebook account has been set up to distribute messages via the most popular social networking platform. New threads are synchronized from the forums to Facebook. A mapping tool displays the geographic locations of participants and the status of tasks on each client node. A group of users has been invited to test functions such as forums, blogs, and computing resource monitoring.
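
    The "atomize and distribute" workflow described above can be sketched as a simple task queue. All names below are hypothetical, since the record does not specify the platform's actual interfaces; each queue checkout stands in for one volunteer computing engine pulling work.

```python
import queue

def atomize(run_id, n_cells):
    """Split one model run into independent per-cell tasks (hypothetical)."""
    return [{"run": run_id, "cell": i} for i in range(n_cells)]

def simulate(task):
    """Stand-in for the climate model kernel run on a volunteer's machine."""
    return task["cell"] * 2

tasks = queue.Queue()
for t in atomize("run-001", 4):
    tasks.put(t)

results = {}
while not tasks.empty():        # each iteration = one volunteer checkout
    task = tasks.get()
    results[task["cell"]] = simulate(task)

print(sorted(results.items()))  # [(0, 0), (1, 2), (2, 4), (3, 6)]
```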

  12. MAPPER: A personal computer map projection tool

    NASA Technical Reports Server (NTRS)

    Bailey, Steven A.

    1993-01-01

    MAPPER is a set of software tools designed to let users create and manipulate map projections on a personal computer (PC). The capability exists to generate five popular map projections: azimuthal, cylindrical, Mercator, Lambert, and sinusoidal. Data for projections are contained in five coordinate databases at various resolutions. MAPPER is managed by a system of pull-down windows. This interface allows the user to intuitively create, view and export maps to other platforms.
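
    The record does not show MAPPER's internals; for reference, the standard forward spherical Mercator projection (one of the five projections listed) maps longitude λ and latitude φ to x = Rλ, y = R ln tan(π/4 + φ/2):

```python
import math

def mercator(lat_deg, lon_deg, radius=1.0):
    """Forward spherical Mercator projection (standard formulation;
    MAPPER's own implementation is not shown in the record)."""
    lam = math.radians(lon_deg)
    phi = math.radians(lat_deg)
    x = radius * lam
    y = radius * math.log(math.tan(math.pi / 4 + phi / 2))
    return x, y

print(mercator(0.0, 0.0))    # approximately (0.0, 0.0) at the origin
print(mercator(45.0, 0.0))   # y at 45 deg latitude is ln(1 + sqrt(2))
```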

  13. The cloud services innovation platform- enabling service-based environmental modelling using infrastructure-as-a-service cloud computing

    USDA-ARS?s Scientific Manuscript database

    Service oriented architectures allow modelling engines to be hosted over the Internet abstracting physical hardware configuration and software deployments from model users. Many existing environmental models are deployed as desktop applications running on user's personal computers (PCs). Migration ...

  14. Comparison of scientific computing platforms for MCNP4A Monte Carlo calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendricks, J.S.; Brockhoff, R.C.

    1994-04-01

    The performance of seven computer platforms is evaluated with the widely used and internationally available MCNP4A Monte Carlo radiation transport code. All results are reproducible and are presented in such a way as to enable comparison with computer platforms not in the study. The authors observed that the HP/9000-735 workstation runs MCNP 50% faster than the Cray YMP 8/64. Compared with the Cray YMP 8/64, the IBM RS/6000-560 is 68% as fast, the Sun Sparc10 is 66% as fast, the Silicon Graphics ONYX is 90% as fast, the Gateway 2000 model 4DX2-66V personal computer is 27% as fast, and the Sun Sparc2 is 24% as fast. In addition to comparing the timing performance of the seven platforms, the authors observe that changes in compilers and software over the past 2 yr have resulted in only modest performance improvements, hardware improvements have enhanced performance by less than a factor of approximately 3, timing studies are very problem dependent, and MCNP4A runs about as fast as MCNP4.
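
    The "X% as fast" figures are runtime ratios normalized to the Cray YMP: speed = t_Cray / t_platform. The runtimes below are hypothetical, chosen only to reproduce two of the reported ratios; the paper's actual timings are not given in this record.

```python
# Hypothetical runtimes in arbitrary units; only the ratios matter.
t_cray = 100.0
runtimes = {"IBM RS/6000-560": 147.0, "Sun Sparc2": 416.7}

for name, t in runtimes.items():
    pct = 100 * t_cray / t
    print(f"{name}: {pct:.0f}% as fast as the Cray YMP 8/64")
```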

  15. Trusted Computing Management Server Making Trusted Computing User Friendly

    NASA Astrophysics Data System (ADS)

    Sothmann, Sönke; Chaudhuri, Sumanta

    Personal Computers (PC) with built-in Trusted Computing (TC) technology are already well known and widely distributed. Nearly every new business notebook now contains a Trusted Platform Module (TPM) and could be used with increased trust and security features in daily application and use scenarios. However, in real life the number of notebooks and PCs where the TPM is actually activated and used is still very small.

  16. The Cybermobile: A Gateway for Public Access to Network-Based Information.

    ERIC Educational Resources Information Center

    Drumm, John E.; Groom, Frank M.

    1997-01-01

    Though the bookmobile has fallen on hard times, the cybermobile, a technology platform combining personal computing, CD-ROMs, fiber network, and wireless access to the Internet, may be the next step in mobile library services. Discusses standard vehicle, computer hardware, software, wireless access, and alliances with users, vendors, and community…

  17. Tele-Medicine Applications of an ISDN-Based Tele-Working Platform

    DTIC Science & Technology

    2001-10-25

    developed over the Hellenic Integrated Services Digital Network (ISDN), is based on user terminals (personal computers), networking apparatus, and a...key infrastructure, ready to offer enhanced message switching and translation in response to market trends [8]. Three (3) years ago, the Hellenic PTT...should yield both an integrated Tele-Working platform, a main central database (complete with maintenance facilities), and a ready-to-be

  18. An optimized and low-cost FPGA-based DNA sequence alignment--a step towards personal genomics.

    PubMed

    Shah, Hurmat Ali; Hasan, Laiq; Ahmad, Nasir

    2013-01-01

    DNA sequence alignment is a cardinal process in computational biology, but it is also computationally expensive when performed on traditional computational platforms such as CPUs. Of the many off-the-shelf platforms explored for speeding up the computation, the FPGA stands out as the best candidate due to its performance per dollar spent and performance per watt. These two advantages make the FPGA the most appropriate choice for realizing the aim of personal genomics. Previous implementations of DNA sequence alignment did not take into consideration the price of the device on which the optimization was performed. This paper presents optimizations over a previous FPGA implementation that improve both the overall speed-up achieved and the price incurred by the platform being optimized. The optimizations are: (1) the array of processing elements runs on changes in input value rather than on a clock, eliminating the need for tight clock synchronization; (2) the implementation is unconstrained by the size of the sequences to be aligned; (3) the waiting time required to load the sequences into the FPGA is reduced to the minimum possible; and (4) an efficient method is devised to store the output matrix that makes it possible to save the diagonal elements for use in the next pass, in parallel with the computation of the output matrix. Implemented on a Spartan3 FPGA, this design achieved a 20 times performance improvement in terms of CUPS over a GPP implementation.
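
    For context, the computation being accelerated is the Smith-Waterman dynamic program, whose cell-update rate is what the CUPS (cell updates per second) metric counts. A minimal software reference, not the paper's FPGA design, might look like:

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    """Minimal Smith-Waterman local-alignment score: the recurrence that
    FPGA processing-element arrays evaluate in parallel along anti-diagonals
    (software reference only; scoring parameters are illustrative)."""
    H = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    best = 0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            score = match if a[i - 1] == b[j - 1] else mismatch
            H[i][j] = max(0, H[i - 1][j - 1] + score,
                          H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best

def cups(m, n, seconds):
    """Cell updates per second for an m x n alignment matrix."""
    return m * n / seconds

print(smith_waterman("ACACACTA", "AGCACACA"))   # 12
print(cups(10_000, 10_000, 2.0))                # 5e7, i.e. 50 MCUPS
```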

  19. Utilising Raspberry Pi as a cheap and easy do it yourself streaming device for astronomy

    NASA Astrophysics Data System (ADS)

    Maulana, F.; Soegijoko, W.; Yamani, A.

    2016-11-01

    Recent developments in personal computing platforms have been revolutionary. With the advent of the Raspberry Pi series and the Arduino series, sub-USD-100 computing platforms have changed the playing field altogether. It used to be that you would need a PC or an FPGA platform costing thousands of USD to create a dedicated device for a dedicated task. Combining a PiCam with the Raspberry Pi allows those on smaller budgets to stream live images to the internet and to the public in general. This paper traces our path in designing and adapting the PiCam to a common-sized eyepiece and telescope in preparation for the TSE in Indonesia this past March.

  20. A PC-Based Controller for Dextrous Arms

    NASA Technical Reports Server (NTRS)

    Fiorini, Paolo; Seraji, Homayoun; Long, Mark

    1996-01-01

    This paper describes the architecture and performance of a PC-based controller for 7-DOF dextrous manipulators. The computing platform is a 486-based personal computer equipped with a bus extender to access the robot Multibus controller, together with a single board computer as the graphical engine, and with a parallel I/O board to interface with a force-torque sensor mounted on the manipulator wrist.

  1. Wireless Technologies, Ubiquitous Computing and Mobile Health: Application to Drug Abuse Treatment and Compliance with HIV Therapies.

    PubMed

    Boyer, Edward W; Smelson, David; Fletcher, Richard; Ziedonis, Douglas; Picard, Rosalind W

    2010-06-01

    Beneficial advances in the treatment of substance abuse and compliance with medical therapies, including HAART, are possible with new mobile technologies related to personal physiological sensing and computational methods. When incorporated into mobile platforms that allow for ubiquitous computing, these technologies have great potential for extending the reach of behavioral interventions from clinical settings where they are learned into natural environments.

  2. Development of embedded real-time and high-speed vision platform

    NASA Astrophysics Data System (ADS)

    Ouyang, Zhenxing; Dong, Yimin; Yang, Hua

    2015-12-01

    Currently, high-speed vision platforms are widely used in many applications, such as robotics and the automation industry. However, a personal computer (PC), whose large size makes it unsuitable for compact systems, is an indispensable component for human-computer interaction in traditional high-speed vision platforms. Therefore, this paper develops an embedded real-time and high-speed vision platform, ER-HVP Vision, which is able to work entirely without a PC. In this new platform, an embedded CPU-based board is designed as a substitute for the PC, and a DSP and FPGA board is developed for implementing image-parallel algorithms in the FPGA and image-sequential algorithms in the DSP. Hence, ER-HVP Vision, with a size of 320 mm x 250 mm x 87 mm, is considerably more compact. Experimental results are also given to indicate that real-time detection and counting of a moving target at a frame rate of 200 fps at 512 x 512 pixels under this newly developed vision platform are feasible.
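
    The record does not specify the detection algorithm; a minimal frame-differencing kernel of the kind such platforms typically accelerate might look like the following (threshold and frame sizes are illustrative, not the paper's):

```python
import numpy as np

def moving_pixels(prev, curr, thresh=30):
    """Count pixels whose intensity changed by more than `thresh` between
    consecutive frames (a minimal motion-detection kernel; the paper's
    actual FPGA/DSP algorithm is not specified in the record)."""
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    return int(np.count_nonzero(diff > thresh))

prev = np.zeros((512, 512), dtype=np.uint8)
curr = prev.copy()
curr[100:110, 200:220] = 255          # a synthetic 10 x 20 "moving target"
print(moving_pixels(prev, curr))      # 200 changed pixels
```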

  3. CloVR: a virtual machine for automated and portable sequence analysis from the desktop using cloud computing.

    PubMed

    Angiuoli, Samuel V; Matalka, Malcolm; Gussman, Aaron; Galens, Kevin; Vangala, Mahesh; Riley, David R; Arze, Cesar; White, James R; White, Owen; Fricke, W Florian

    2011-08-30

    Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged, pre-configured software. We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. The CloVR VM and associated architecture lower the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing.

  4. Historical review of computer-assisted cognitive retraining.

    PubMed

    Lynch, Bill

    2002-10-01

    This article details the introduction and development of the use of microcomputers as adjuncts to traditional cognitive rehabilitation of persons with acquired brain injury. The initial application of video games as therapeutic recreation in the late 1970s was soon followed in the early 1980s by the use of the first personal computers and available educational software. By the mid-1980s, both the IBM PC and Macintosh platforms were established, along with simplified programming languages that allowed individuals without extensive technical expertise to develop their own software. Several rehabilitation clinicians began to produce and market specially written cognitive retraining software for one or the other platform. Their work is detailed and reviewed, as is recently released software from commercial sources. The latter discussion includes the latest developments in rehabilitation applications of personal digital assistants and related organizing, reminding, and dictation devices. A summary of research on the general and specific efficacy of computer-assisted cognitive retraining illustrates the lingering controversy and skepticism that have been associated with this field since its inception. Computer-assisted cognitive retraining (CACR) can be an effective adjunct to a comprehensive program of cognitive rehabilitation. Training needs to be focused, structured, monitored, and as ecologically relevant as possible for optimum effect. Transfer of training, or generalizability of skills, remains a key issue in the field and should be considered the key criterion in evaluating whether to initiate or continue CACR.

  5. The community FabLab platform: applications and implications in biomedical engineering.

    PubMed

    Stephenson, Makeda K; Dow, Douglas E

    2014-01-01

    Skill development in science, technology, engineering and math (STEM) education presents one of the most formidable challenges of modern society. The Community FabLab platform presents a viable solution. Each FabLab contains a suite of modern computer numerical control (CNC) equipment, electronics and computing hardware, and design, programming, computer aided design (CAD) and computer aided machining (CAM) software. FabLabs are community and educational resources and are open to the public. Development of STEM-based workforce skills such as digital fabrication and advanced manufacturing can be enhanced using this platform. Particularly notable is the potential of the FabLab platform in STEM education. The active learning environment engages and supports a diversity of learners, while the iterative learning supported by the FabLab rapid prototyping platform facilitates depth of understanding, creativity, innovation and mastery. The product- and project-based learning that occurs in FabLabs develops in the student a personal sense of accomplishment, self-awareness, and command of the material and technology. This helps build the interest and confidence necessary to excel in STEM and throughout life. Finally, the introduction and use of relevant technologies at every stage of the education process ensures technical familiarity and the broad knowledge base needed for work in STEM-based fields. Biomedical engineering education strives to cultivate broad technical adeptness, creativity, interdisciplinary thought, and an ability to form deep conceptual understanding of complex systems. The FabLab platform is well designed to enhance biomedical engineering education.

  6. Embedded-Based Graphics Processing Unit Cluster Platform for Multiple Sequence Alignments

    PubMed Central

    Wei, Jyh-Da; Cheng, Hui-Jun; Lin, Chun-Yuan; Ye, Jin; Yeh, Kuan-Yu

    2017-01-01

    High-end graphics processing units (GPUs), such as the NVIDIA Tesla/Fermi/Kepler series cards with thousands of cores per chip, have been widely applied to high-performance computing over the past decade. These desktop GPU cards must be installed in personal computers/servers with desktop CPUs, and the cost and power consumption of constructing such a GPU cluster platform are very high. In recent years, NVIDIA released an embedded board, called the Jetson Tegra K1 (TK1), which contains 4 ARM Cortex-A15 CPUs and 192 Compute Unified Device Architecture cores (belonging to the Kepler GPU family). The Jetson Tegra K1 has several advantages, such as low cost, low power consumption, and high applicability, and it has been applied in several specific applications. In our previous work, a bioinformatics platform with a single TK1 (STK platform) was constructed; that work also showed that Web and mobile services can be implemented on the STK platform with a good cost-performance ratio, by comparing the STK platform with desktop CPUs and GPUs. In this work, an embedded GPU cluster platform is constructed with multiple TK1s (MTK platform). Complex system installation and setup were necessary first. Then, 2 job assignment modes were designed for the MTK platform to provide services to users. Finally, ClustalW v2.0.11 and ClustalWtk were ported to the MTK platform. The experimental results showed that the speedup ratios achieved 5.5 and 4.8 times for ClustalW v2.0.11 and ClustalWtk, respectively, when comparing 6 TK1s with a single TK1. The MTK platform is proven useful for multiple sequence alignments. PMID:28835734

  7. Embedded-Based Graphics Processing Unit Cluster Platform for Multiple Sequence Alignments.

    PubMed

    Wei, Jyh-Da; Cheng, Hui-Jun; Lin, Chun-Yuan; Ye, Jin; Yeh, Kuan-Yu

    2017-01-01

    High-end graphics processing units (GPUs), such as the NVIDIA Tesla/Fermi/Kepler series cards with thousands of cores per chip, have been widely applied to high-performance computing over the past decade. These desktop GPU cards must be installed in personal computers/servers with desktop CPUs, and the cost and power consumption of constructing such a GPU cluster platform are very high. In recent years, NVIDIA released an embedded board, called the Jetson Tegra K1 (TK1), which contains 4 ARM Cortex-A15 CPUs and 192 Compute Unified Device Architecture cores (belonging to the Kepler GPU family). The Jetson Tegra K1 has several advantages, such as low cost, low power consumption, and high applicability, and it has been applied in several specific applications. In our previous work, a bioinformatics platform with a single TK1 (STK platform) was constructed; that work also showed that Web and mobile services can be implemented on the STK platform with a good cost-performance ratio, by comparing the STK platform with desktop CPUs and GPUs. In this work, an embedded GPU cluster platform is constructed with multiple TK1s (MTK platform). Complex system installation and setup were necessary first. Then, 2 job assignment modes were designed for the MTK platform to provide services to users. Finally, ClustalW v2.0.11 and ClustalWtk were ported to the MTK platform. The experimental results showed that the speedup ratios achieved 5.5 and 4.8 times for ClustalW v2.0.11 and ClustalWtk, respectively, when comparing 6 TK1s with a single TK1. The MTK platform is proven useful for multiple sequence alignments.

  8. Embedded systems for supporting computer accessibility.

    PubMed

    Mulfari, Davide; Celesti, Antonio; Fazio, Maria; Villari, Massimo; Puliafito, Antonio

    2015-01-01

    Nowadays, customized assistive technology (AT) software solutions allow their users to interact with various kinds of computer systems. Such tools are generally available on personal devices (e.g., smartphones, laptops and so on) commonly used by a person with a disability. In this paper, we investigate a way of using these AT tools to access many different devices that lack assistive preferences. The solution takes advantage of open-source hardware, and its core component is an affordable Linux embedded system: it grabs data coming from the assistive software, which runs on the user's personal device, and then, after processing, it generates native keyboard and mouse HID commands for the target computing device controlled by the end user. This process supports any operating system available on the target machine and requires no specialized software installation; therefore the user with a disability can rely on a single assistive tool to control a wide range of computing platforms, including conventional computers and many kinds of mobile devices, which receive input commands through the USB HID protocol.
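
    The USB HID reports mentioned at the end have a standard shape: a boot-protocol keyboard report is 8 bytes (a modifier bitmask, a reserved byte, then up to six key usage IDs). A sketch of building one follows; the paper's actual firmware is not shown, so this is only an illustration of the report format.

```python
# USB HID boot-keyboard report: [modifiers, reserved, key1..key6].
# Per the HID Usage Tables, usage ID 0x04 is "Keyboard a";
# bit 1 of the modifier byte is Left Shift.
def keyboard_report(key_usage_id, left_shift=False):
    modifiers = 0x02 if left_shift else 0x00
    return bytes([modifiers, 0x00, key_usage_id, 0, 0, 0, 0, 0])

report = keyboard_report(0x04, left_shift=True)   # types 'A'
print(report.hex())                               # 0200040000000000
```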

  9. A cell-phone-based brain-computer interface for communication in daily life

    NASA Astrophysics Data System (ADS)

    Wang, Yu-Te; Wang, Yijun; Jung, Tzyy-Ping

    2011-04-01

    Moving a brain-computer interface (BCI) system from a laboratory demonstration to real-life applications still poses severe challenges to the BCI community. This study aims to integrate a mobile and wireless electroencephalogram (EEG) system and a signal-processing platform based on a cell phone into a truly wearable and wireless online BCI. Its practicality and implications in a routine BCI are demonstrated through the realization and testing of a steady-state visual evoked potential (SSVEP)-based BCI. This study implemented and tested online signal processing methods in both time and frequency domains for detecting SSVEPs. The results of this study showed that the performance of the proposed cell-phone-based platform was comparable, in terms of the information transfer rate, with other BCI systems using bulky commercial EEG systems and personal computers. To the best of our knowledge, this study is the first to demonstrate a truly portable, cost-effective and miniature cell-phone-based platform for online BCIs.
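
    The frequency-domain detection idea can be sketched as follows: score each candidate stimulus frequency by its spectral power and pick the strongest. This is a minimal stand-alone illustration on synthetic data, not the authors' signal-processing code; real pipelines add filtering, windowing, and artifact rejection.

    ```python
    import math

    def band_power(signal, fs, freq):
        """Power of one frequency component via correlation with sin/cos."""
        n = len(signal)
        re = sum(s * math.cos(2 * math.pi * freq * i / fs) for i, s in enumerate(signal))
        im = sum(s * math.sin(2 * math.pi * freq * i / fs) for i, s in enumerate(signal))
        return (re * re + im * im) / n

    def detect_ssvep(signal, fs, candidates):
        """Return the candidate stimulus frequency with the most power."""
        return max(candidates, key=lambda f: band_power(signal, fs, f))

    # Synthetic 2-second 'EEG': a 12 Hz SSVEP plus a weaker 10 Hz distractor.
    fs = 256
    sig = [math.sin(2 * math.pi * 12 * i / fs) + 0.3 * math.sin(2 * math.pi * 10 * i / fs)
           for i in range(2 * fs)]
    print(detect_ssvep(sig, fs, [9, 10, 11, 12, 13]))   # -> 12
    ```

    Each candidate frequency maps to one command (e.g., one phone-dialing digit), so detection reduces to an argmax over a handful of spectral bins, which is cheap enough for a phone-class CPU.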

  10. A cell-phone-based brain-computer interface for communication in daily life.

    PubMed

    Wang, Yu-Te; Wang, Yijun; Jung, Tzyy-Ping

    2011-04-01

    Moving a brain-computer interface (BCI) system from a laboratory demonstration to real-life applications still poses severe challenges to the BCI community. This study aims to integrate a mobile and wireless electroencephalogram (EEG) system and a signal-processing platform based on a cell phone into a truly wearable and wireless online BCI. Its practicality and implications in a routine BCI are demonstrated through the realization and testing of a steady-state visual evoked potential (SSVEP)-based BCI. This study implemented and tested online signal processing methods in both time and frequency domains for detecting SSVEPs. The results of this study showed that the performance of the proposed cell-phone-based platform was comparable, in terms of the information transfer rate, with other BCI systems using bulky commercial EEG systems and personal computers. To the best of our knowledge, this study is the first to demonstrate a truly portable, cost-effective and miniature cell-phone-based platform for online BCIs.

  11. Informatics in radiology (infoRAD): free DICOM image viewing and processing software for the Macintosh computer: what's available and what it can do for you.

    PubMed

    Escott, Edward J; Rubinstein, David

    2004-01-01

    It is often necessary for radiologists to use digital images in presentations and conferences. Most imaging modalities produce images in the Digital Imaging and Communications in Medicine (DICOM) format. The image files tend to be large and thus cannot be directly imported into most presentation software, such as Microsoft PowerPoint; the large files also consume storage space. There are many free programs that allow viewing and processing of these files on a personal computer, including conversion to more common file formats such as the Joint Photographic Experts Group (JPEG) format. Free DICOM image viewing and processing software for computers running on the Microsoft Windows operating system has already been evaluated. However, many people use the Macintosh (Apple Computer) platform, and a number of programs are available for these users. The World Wide Web was searched for free DICOM image viewing or processing software that was designed for the Macintosh platform or is written in Java and is therefore platform independent. The features of these programs and their usability were evaluated. There are many free programs for the Macintosh platform that enable viewing and processing of DICOM images. (c) RSNA, 2004.
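
    The core step in converting a DICOM image to an 8-bit format such as JPEG is window/level mapping of the raw (often 12- to 16-bit) pixel values. The sketch below is a simplified linear window, not any particular program's code; the exact DICOM formula differs slightly in its half-pixel offsets.

    ```python
    def window_to_8bit(pixels, center, width):
        """Map raw pixel values to 0-255 with a linear window (clamped at both ends)."""
        lo = center - width / 2.0
        out = []
        for p in pixels:
            v = (p - lo) / width * 255.0   # linear ramp across the window
            out.append(int(min(255, max(0, round(v)))))
        return out

    # Values below the window clamp to black, above it to white.
    print(window_to_8bit([0, 40, 80], center=40, width=80))   # -> [0, 128, 255]
    ```

    Choosing the window (e.g., lung vs. soft-tissue settings in CT) determines which anatomy survives the conversion, which is why viewers expose center/width controls before export.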

  12. CloVR: A virtual machine for automated and portable sequence analysis from the desktop using cloud computing

    PubMed Central

    2011-01-01

    Background Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged, pre-configured software. Results We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome, and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources, and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports the use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. Conclusion The CloVR VM and associated architecture lower the barrier to entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing. PMID:21878105

  13. Bioinformatics and Microarray Data Analysis on the Cloud.

    PubMed

    Calabrese, Barbara; Cannataro, Mario

    2016-01-01

    High-throughput platforms such as microarray, mass spectrometry, and next-generation sequencing are producing an increasing volume of omics data that needs large data storage and computing power. Cloud computing offers massively scalable computing and storage, data sharing, and on-demand anytime-and-anywhere access to resources and applications, and thus it may represent the key technology for facing those issues. In fact, in recent years it has been adopted for the deployment of different bioinformatics solutions and services, both in academia and in industry. Despite this, cloud computing presents several issues regarding the security and privacy of data, which are particularly important when analyzing patients' data, such as in personalized medicine. This chapter reviews the main academic and industrial cloud-based bioinformatics solutions, with a special focus on microarray data analysis solutions, and underlines the main issues and problems related to the use of such platforms for the storage and analysis of patients' data.

  14. [Construction and application of bioinformatic analysis platform for aquatic pathogen based on the MilkyWay-2 supercomputer].

    PubMed

    Fang, Xiang; Li, Ning-qiu; Fu, Xiao-zhe; Li, Kai-bin; Lin, Qiang; Liu, Li-hui; Shi, Cun-bin; Wu, Shu-qin

    2015-07-01

    As a key component of life science, bioinformatics has been widely applied in genomics, transcriptomics, and proteomics. However, the requirement of high-performance computers rather than common personal computers for constructing a bioinformatics platform has significantly limited the application of bioinformatics in aquatic science. In this study, we constructed a bioinformatic analysis platform for aquatic pathogens based on the MilkyWay-2 supercomputer. The platform consists of three functional modules: genomic and transcriptomic sequencing data analysis, protein structure prediction, and molecular dynamics simulations. To validate the practicability of the platform, we performed bioinformatic analyses of aquatic pathogenic organisms. For example, genes of Flavobacterium johnsoniae M168 were identified and annotated via BLAST searches and GO and InterPro annotations. Protein structural models for five small segments of grass carp reovirus HZ-08 were constructed by homology modeling. Molecular dynamics simulations were performed on outer membrane protein A of Aeromonas hydrophila, and the changes in system temperature, total energy, root mean square deviation, and loop conformations during equilibration were observed. These results showed that the bioinformatic analysis platform for aquatic pathogens has been successfully built on the MilkyWay-2 supercomputer. This study will provide insights into the construction of bioinformatic analysis platforms for other subjects.

  15. Potential of a suite of robot/computer-assisted motivating systems for personalized, home-based, stroke rehabilitation.

    PubMed

    Johnson, Michelle J; Feng, Xin; Johnson, Laura M; Winters, Jack M

    2007-03-01

    There is a need to improve semi-autonomous stroke therapy in home environments often characterized by low supervision by clinical experts and low extrinsic motivation. Our distributed device approach to this problem consists of an integrated suite of low-cost robotic/computer-assistive technologies driven by a novel universal access software framework called UniTherapy. Our design strategy for personalizing the therapy, providing extrinsic motivation, and assessing outcomes is presented and evaluated. Three studies were conducted to evaluate the potential of the suite. A conventional force-reflecting joystick, a modified joystick therapy platform (TheraJoy), and a steering wheel platform (TheraDrive) were tested separately with the UniTherapy software. Stroke subjects with hemiparesis and able-bodied subjects completed tracking activities with the devices in different positions. We quantified motor performance across subject groups and device platforms, and muscle activation across devices at two positions in the arm workspace. Trends in the assessment metrics were consistent across devices, with able-bodied and high-functioning stroke subjects being significantly more accurate and quicker in their motor performance than low-functioning subjects. Muscle activation patterns were different for shoulder and elbow across different devices and locations. The Robot/CAMR suite has potential for stroke rehabilitation. By manipulating hardware and software variables, we can create personalized therapy environments that engage patients, address their therapy needs, and track their progress. A larger longitudinal study is still needed to evaluate these systems in under-supervised environments such as the home.

  16. A Wireless Monitoring Sub-nA Resolution Test Platform for Nanostructure Sensors

    PubMed Central

    Jang, Chi Woong; Byun, Young Tae; Lee, Taikjin; Woo, Deok Ha; Lee, Seok; Jhon, Young Min

    2013-01-01

    We have constructed a wireless monitoring test platform with a sub-nA resolution signal amplification/processing circuit (SAPC) and a wireless communication network to test the real-time remote monitoring of the signals from carbon nanotube (CNT) sensors. The operation characteristics of the CNT sensors can also be measured by the ISD-VSD curve with the SAPC. The SAPC signals are transmitted to a personal computer by Bluetooth communication and the signals from the computer are transmitted to smart phones by Wi-Fi communication, in such a way that the signals from the sensors can be remotely monitored through a web browser. Successful remote monitoring of signals from a CNT sensor was achieved with the wireless monitoring test platform for detection of 0.15% methanol vapor with 0.5 nA resolution and 7 Hz sampling rate. PMID:23783735
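
    As a small illustration of the stated 0.5 nA resolution, a raw amplifier reading could be snapped to the nearest resolution step before transmission. This quantizer is an assumption for illustration, not the SAPC's actual firmware logic.

    ```python
    def quantize_nA(current_nA, resolution=0.5):
        """Snap a raw current reading (nA) to the nearest resolution step."""
        return round(current_nA / resolution) * resolution

    samples = [0.23, 0.74, 1.31]                    # hypothetical raw readings in nA
    print([quantize_nA(s) for s in samples])        # -> [0.0, 0.5, 1.5]
    ```

    At the reported 7 Hz sampling rate, each quantized sample would then be forwarded over the Bluetooth link roughly every 143 ms.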

  17. Grace: A cross-platform micromagnetic simulator on graphics processing units

    NASA Astrophysics Data System (ADS)

    Zhu, Ru

    2015-12-01

    A micromagnetic simulator running on graphics processing units (GPUs) is presented. Unlike the GPU implementations of other research groups, which predominantly run on NVIDIA's CUDA platform, this simulator is developed with C++ Accelerated Massive Parallelism (C++ AMP) and is hardware-platform independent. It runs on GPUs from vendors including NVIDIA, AMD, and Intel, and achieves a significant performance boost, up to two orders of magnitude, compared to previous central processing unit (CPU) simulators. The simulator paves the way for running large micromagnetic simulations on both high-end workstations with dedicated graphics cards and low-end personal computers with integrated graphics cards, and is freely available to download.

  18. What does media use reveal about personality and mental health? An exploratory investigation among German students.

    PubMed

    Brailovskaia, Julia; Margraf, Jürgen

    2018-01-01

    The present study aimed to investigate the relationship between personality traits, mental health variables, and media use among German students. Data from 633 participants were collected. Results indicate a positive association between general Internet use, general use of social platforms, and Facebook use, on the one hand, and self-esteem, extraversion, narcissism, life satisfaction, social support, and resilience, on the other hand. Use of computer games was found to be negatively related to these personality and mental health variables. The use of platforms that focus more on written interaction (Twitter, Tumblr) was found to be negatively associated with positive mental health variables and significantly positively associated with depression, anxiety, and stress symptoms. In contrast, Instagram use, which focuses more on photo-sharing, correlated positively with positive mental health variables. Possible practical implications of the present results for mental health, as well as the limitations of the present work, are discussed.

  19. Development and Validation of an Interactive Internet Platform for Older People: The Healthy Ageing Through Internet Counselling in the Elderly Study.

    PubMed

    Jongstra, Susan; Beishuizen, Cathrien; Andrieu, Sandrine; Barbera, Mariagnese; van Dorp, Matthijs; van de Groep, Bram; Guillemont, Juliette; Mangialasche, Francesca; van Middelaar, Tessa; Moll van Charante, Eric; Soininen, Hilkka; Kivipelto, Miia; Richard, Edo

    2017-02-01

    A myriad of Web-based applications on self-management have been developed, but few focus on older people. In the face of global aging, older people form an important target population for cardiovascular prevention. This article describes the full development of an interactive Internet platform for older people, which was designed for the Healthy Ageing Through Internet Counselling in the Elderly (HATICE) study. We provide recommendations to design senior-friendly Web-based applications for a new approach to multicomponent cardiovascular prevention. The development of the platform followed five phases: (1) conceptual framework; (2) platform concept and functional design; (3) platform building (software and content); (4) testing and pilot study; and (5) final product. We performed a meta-analysis, reviewed guidelines for cardiovascular diseases, and consulted end users, experts, and software developers to create the platform concept and content. The software was built in iterative cycles. In the pilot study, 41 people aged ≥65 years used the platform for 8 weeks. Participants used the interactive features of the platform and appreciated the coach support. During all phases adjustments were made to incorporate all improvements from the previous phases. The final platform is a personal, secured, and interactive platform supported by a coach. When carefully designed, an interactive Internet platform is acceptable and feasible for use by older people with basic computer skills. To improve acceptability by older people, we recommend involving the end users in the process of development, to personalize the platform and to combine the application with human support. The interactive HATICE platform will be tested for efficacy in a multinational randomized controlled trial (ISRCTN48151589).

  20. Digital Storytelling as an Interactive Digital Media Context

    ERIC Educational Resources Information Center

    Anderson, Kate T.; Chua, Puay Hoe

    2010-01-01

    Digital storytelling involves the creation of short, personal narratives combining images, sounds, and text in a multimedia computer-based platform. In education, digital storytelling has been used to foster learning in formal and informal spaces worldwide. The authors offer a critical discussion of claims about digital storytelling's usefulness…

  1. VISUAL PLUMES CONCEPTS TO POTENTIALLY ADAPT OR ADOPT IN MODELING PLATFORMS SUCH AS VISJET

    EPA Science Inventory

    Windows-based programs share many familiar features and components. For example, file dialogue windows are familiar to most Windows-based personal computer users. Such program elements are desirable because the user is already familiar with how they function, obviating the need f...

  2. Direct Comparison of a Tablet Computer and a Personal Digital Assistant for Point-of-Care Documentation in Eye Care

    PubMed Central

    Silvey, Garry M.; Macri, Jennifer M.; Lee, Paul P.; Lobach, David F.

    2005-01-01

    New mobile computing devices including personal digital assistants (PDAs) and tablet computers have emerged to facilitate data collection at the point of care. Unfortunately, little research has been reported regarding which device is optimal for a given care setting. In this study we created and compared functionally identical applications on a Palm operating system-based PDA and a Windows-based tablet computer for point-of-care documentation of clinical observations by eye care professionals when caring for patients with diabetes. Eye-care professionals compared the devices through focus group sessions and through validated usability surveys. We found that the application on the tablet computer was preferred over the PDA for documenting the complex data related to eye care. Our findings suggest that the selection of a mobile computing platform depends on the amount and complexity of the data to be entered; the tablet computer functions better for high volume, complex data entry, and the PDA, for low volume, simple data entry. PMID:16779128

  3. Potential of a suite of robot/computer-assisted motivating systems for personalized, home-based, stroke rehabilitation

    PubMed Central

    Johnson, Michelle J; Feng, Xin; Johnson, Laura M; Winters, Jack M

    2007-01-01

    Background There is a need to improve semi-autonomous stroke therapy in home environments often characterized by low supervision by clinical experts and low extrinsic motivation. Our distributed device approach to this problem consists of an integrated suite of low-cost robotic/computer-assistive technologies driven by a novel universal access software framework called UniTherapy. Our design strategy for personalizing the therapy, providing extrinsic motivation, and assessing outcomes is presented and evaluated. Methods Three studies were conducted to evaluate the potential of the suite. A conventional force-reflecting joystick, a modified joystick therapy platform (TheraJoy), and a steering wheel platform (TheraDrive) were tested separately with the UniTherapy software. Stroke subjects with hemiparesis and able-bodied subjects completed tracking activities with the devices in different positions. We quantified motor performance across subject groups and device platforms, and muscle activation across devices at two positions in the arm workspace. Results Trends in the assessment metrics were consistent across devices, with able-bodied and high-functioning stroke subjects being significantly more accurate and quicker in their motor performance than low-functioning subjects. Muscle activation patterns were different for shoulder and elbow across different devices and locations. Conclusion The Robot/CAMR suite has potential for stroke rehabilitation. By manipulating hardware and software variables, we can create personalized therapy environments that engage patients, address their therapy needs, and track their progress. A larger longitudinal study is still needed to evaluate these systems in under-supervised environments such as the home. PMID:17331243

  4. An u-Service Model Based on a Smart Phone for Urban Computing Environments

    NASA Astrophysics Data System (ADS)

    Cho, Yongyun; Yoe, Hyun

    In urban computing environments, all services should be based on the interaction between humans and the environments around them, which occurs frequently and ordinarily in homes and offices. This paper proposes a u-service model based on a smart phone for urban computing environments. The suggested service model includes a context-aware and personalized service scenario development environment that can instantly describe a user's u-service demand or situation information with smart devices. To this end, the architecture of the suggested service model consists of a graphical service editing environment for smart devices, a u-service platform, and an infrastructure with sensors and WSN/USN. The graphical editor expresses contexts as execution conditions of a new service through an ontology-based context model. The service platform executes the service scenario according to the contexts. With the suggested service model, a user in an urban computing environment can quickly and easily compose u-services or new services using smart devices.

  5. A Computing Platform for Parallel Sparse Matrix Computations

    DTIC Science & Technology

    2016-01-05

    Sameh, Ahmed H.; Klinvex, Alicia; Zhu, Yao

  6. A Perspective on Implementing a Quantitative Systems Pharmacology Platform for Drug Discovery and the Advancement of Personalized Medicine

    PubMed Central

    Stern, Andrew M.; Schurdak, Mark E.; Bahar, Ivet; Berg, Jeremy M.; Taylor, D. Lansing

    2016-01-01

    Drug candidates exhibiting well-defined pharmacokinetic and pharmacodynamic profiles that are otherwise safe often fail to demonstrate proof-of-concept in phase II and III trials. Innovation in drug discovery and development has been identified as a critical need for improving the efficiency of drug discovery, especially through collaborations between academia, government agencies, and industry. To address the innovation challenge, we describe a comprehensive, unbiased, integrated, and iterative quantitative systems pharmacology (QSP)–driven drug discovery and development strategy and platform that we have implemented at the University of Pittsburgh Drug Discovery Institute. Intrinsic to QSP is its integrated use of multiscale experimental and computational methods to identify mechanisms of disease progression and to test predicted therapeutic strategies likely to achieve clinical validation for appropriate subpopulations of patients. The QSP platform can address biological heterogeneity and anticipate the evolution of resistance mechanisms, which are major challenges for drug development. The implementation of this platform is dedicated to gaining an understanding of mechanism(s) of disease progression to enable the identification of novel therapeutic strategies as well as repurposing drugs. The QSP platform will help promote the paradigm shift from reactive population-based medicine to proactive personalized medicine by focusing on the patient as the starting and the end point. PMID:26962875

  7. Surveillance in the Information Age: Text Quantification, Anomaly Detection, and Empirical Evaluation

    ERIC Educational Resources Information Center

    Lu, Hsin-Min

    2010-01-01

    Deep penetration of personal computers, data communication networks, and the Internet has created a massive platform for data collection, dissemination, storage, and retrieval. Large amounts of textual data are now available at a very low cost. Valuable information, such as consumer preferences, new product developments, trends, and opportunities,…

  8. Three-Dimensional Space to Assess Cloud Interoperability

    DTIC Science & Technology

    2013-03-01

    …a collection of network-enabled services that guarantees to provide a scalable, easily accessible, reliable, and personalized computing infrastructure… Terms such as SaaS (Software as a Service), PaaS (Platform as a Service), and IaaS (Infrastructure as a Service) are used in research to describe cloud models.

  9. xdamp Version 6 : an IDL-based data and image manipulation program.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ballard, William Parker

    2012-04-01

    The original DAMP (DAta Manipulation Program) was written by Mark Hedemann of Sandia National Laboratories and used the CA-DISSPLA™ graphics package (available from Computer Associates International, Inc., Garden City, NY) as its engine. It was used to plot, modify, and otherwise manipulate the one-dimensional data waveforms (data vs. time) from a wide variety of accelerators. With the waning of CA-DISSPLA and the increasing popularity of Unix®-based workstations, a replacement was needed. This package uses the IDL® software, available from Research Systems Incorporated, a Xerox company, in Boulder, Colorado, as the engine, and creates a set of widgets to manipulate the data in a manner similar to the original DAMP and earlier versions of xdamp. IDL is currently supported on a wide variety of Unix platforms such as IBM® workstations, Hewlett-Packard workstations, SUN® workstations, Microsoft® Windows™ computers, Macintosh® computers, and Digital Equipment Corporation VMS® and Alpha® systems. Thus, xdamp is portable across many platforms. We have verified operation, albeit with some minor IDL bugs, on personal computers using Windows 7 and Windows Vista, Unix platforms, and Macintosh computers. Version 6 is an update that uses the IDL Virtual Machine to resolve the need for licensing IDL.

  10. What does media use reveal about personality and mental health? An exploratory investigation among German students

    PubMed Central

    Margraf, Jürgen

    2018-01-01

    The present study aimed to investigate the relationship between personality traits, mental health variables, and media use among German students. Data from 633 participants were collected. Results indicate a positive association between general Internet use, general use of social platforms, and Facebook use, on the one hand, and self-esteem, extraversion, narcissism, life satisfaction, social support, and resilience, on the other hand. Use of computer games was found to be negatively related to these personality and mental health variables. The use of platforms that focus more on written interaction (Twitter, Tumblr) was found to be negatively associated with positive mental health variables and significantly positively associated with depression, anxiety, and stress symptoms. In contrast, Instagram use, which focuses more on photo-sharing, correlated positively with positive mental health variables. Possible practical implications of the present results for mental health, as well as the limitations of the present work, are discussed. PMID:29370275

  11. Embedded Web Technology: Internet Technology Applied to Real-Time System Control

    NASA Technical Reports Server (NTRS)

    Daniele, Carl J.

    1998-01-01

    The NASA Lewis Research Center is developing software tools to bridge the gap between the traditionally non-real-time Internet technology and the real-time, embedded-controls environment for space applications. Internet technology has been expanding at a phenomenal rate. The simple World Wide Web browsers (such as earlier versions of Netscape, Mosaic, and Internet Explorer) that resided on personal computers just a few years ago only enabled users to log into and view a remote computer site. With current browsers, users not only view but also interact with remote sites. In addition, the technology now supports numerous computer platforms (PCs, Macs, and Unix platforms), thereby providing platform independence. In contrast, the development of software to interact with a microprocessor (embedded controller) that is used to monitor and control a space experiment has generally been a unique development effort. For each experiment, a specific graphical user interface (GUI) has been developed. This procedure works well for a single-user environment. However, the interface for the International Space Station (ISS) Fluids and Combustion Facility will have to enable scientists throughout the world and astronauts onboard the ISS, using different computer platforms, to interact with their experiments in the Fluids and Combustion Facility. Developing a specific GUI for all these users would be cost prohibitive. An innovative solution to this requirement, developed at Lewis, is to use Internet technology, where the general problem of platform independence has already been partially solved, and to leverage this expanding technology as new products are developed. This approach led to the development of the Embedded Web Technology (EWT) program at Lewis, which has the potential to significantly reduce software development costs for both flight and ground software.
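
    The EWT idea of serving a controller's data to any browser can be sketched with a minimal HTTP endpoint. The /telemetry path and payload fields here are invented for illustration and are not NASA's actual interface.

    ```python
    import json
    from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

    TELEMETRY = {"experiment": "demo-rig", "temperature_C": 21.4}   # placeholder data

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/telemetry":
                # Serve the controller's state as JSON so any browser can read it.
                body = json.dumps(TELEMETRY).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)
            else:
                self.send_error(404)

        def log_message(self, *args):   # silence per-request logging for the demo
            pass

    def serve(port=8080):
        """Run the embedded HTTP endpoint (call from the controller's main loop)."""
        ThreadingHTTPServer(("", port), Handler).serve_forever()
    ```

    Because the client side is just HTTP and JSON, a scientist on a PC, Mac, or Unix workstation needs only a browser, which is exactly the platform-independence argument the abstract makes.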

  12. The Cell Collective: Toward an open and collaborative approach to systems biology

    PubMed Central

    2012-01-01

    Background Despite decades of new discoveries in biomedical research, the overwhelming complexity of cells has been a significant barrier to a fundamental understanding of how cells work as a whole. As such, the holistic study of biochemical pathways requires computer modeling. Due to the complexity of cells, it is not feasible for one person or group to model the cell in its entirety. Results The Cell Collective is a platform that allows the world-wide scientific community to create these models collectively. Its interface enables users to build and use models without specifying any mathematical equations or computer code, addressing one of the major hurdles in computational research. In addition, this platform allows scientists to simulate and analyze the models in real time on the web, including the ability to simulate loss/gain of function and test what-if scenarios in real time. Conclusions The Cell Collective is a web-based platform that enables laboratory scientists from across the globe to collaboratively build large-scale models of various biological processes and simulate/analyze them in real time. In this manuscript, we show examples of its application to a large-scale model of signal transduction. PMID:22871178
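
    The "models without equations" style such a platform supports is essentially logical (Boolean) modeling: each node's next state is a logic rule over the current states of other nodes. Below is a toy sketch with invented node names, including a knocked-out node to mimic the loss-of-function simulations the abstract mentions; it is not the Cell Collective's code.

    ```python
    # A tiny Boolean signaling network: rules map the current state to each
    # node's next value. Node names and wiring are hypothetical.
    RULES = {
        "ligand":      lambda s: s["ligand"],                        # external input, held fixed
        "receptor":    lambda s: s["ligand"],                        # active when ligand present
        "kinase":      lambda s: s["receptor"] and not s["phosphatase"],
        "phosphatase": lambda s: False,                              # loss-of-function: knocked out
        "gene":        lambda s: s["kinase"],                        # downstream readout
    }

    def step(state):
        """Synchronous update: evaluate every rule on the current state."""
        return {node: bool(rule(state)) for node, rule in RULES.items()}

    state = {n: False for n in RULES}
    state["ligand"] = True          # what-if scenario: apply the ligand
    for _ in range(3):              # iterate until the signal reaches the gene
        state = step(state)
    print(state["gene"])            # -> True
    ```

    Flipping the "phosphatase" rule back on would let it suppress the kinase, which is how gain-of-function versus knockout what-if scenarios differ in this formalism.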

  13. Dueler’s Dilemma: A One-Person Computer Gaming Platform for Decision Making

    DTIC Science & Technology

    2007-11-01

    Tikuisis, Peter; Keefe, Allan A.; DRDC Toronto TM 2007-091; Defence R&D Canada – Toronto; November 2007. "Dueler's Dilemma" is a game… The possibility that both the subject and opponent are hit is negligible and therefore ignored (see text).

  14. The Effect of Web-Based Portfolio Use on Academic Achievement and Retention

    ERIC Educational Resources Information Center

    Guzeller, Cem Oktay

    2012-01-01

    The web-based portfolio emerged as a result of the influence of technological developments on educational practices. In this study, the effect of the web-based portfolio building process on academic achievement and retention is explored. For this purpose, a study platform known as a computer-assisted personal development portfolio was designed for…

  15. A Perspective on Implementing a Quantitative Systems Pharmacology Platform for Drug Discovery and the Advancement of Personalized Medicine.

    PubMed

    Stern, Andrew M; Schurdak, Mark E; Bahar, Ivet; Berg, Jeremy M; Taylor, D Lansing

    2016-07-01

    Drug candidates exhibiting well-defined pharmacokinetic and pharmacodynamic profiles that are otherwise safe often fail to demonstrate proof-of-concept in phase II and III trials. Innovation in drug discovery and development has been identified as a critical need for improving the efficiency of drug discovery, especially through collaborations between academia, government agencies, and industry. To address the innovation challenge, we describe a comprehensive, unbiased, integrated, and iterative quantitative systems pharmacology (QSP)-driven drug discovery and development strategy and platform that we have implemented at the University of Pittsburgh Drug Discovery Institute. Intrinsic to QSP is its integrated use of multiscale experimental and computational methods to identify mechanisms of disease progression and to test predicted therapeutic strategies likely to achieve clinical validation for appropriate subpopulations of patients. The QSP platform can address biological heterogeneity and anticipate the evolution of resistance mechanisms, which are major challenges for drug development. The implementation of this platform is dedicated to gaining an understanding of mechanism(s) of disease progression to enable the identification of novel therapeutic strategies as well as repurposing drugs. The QSP platform will help promote the paradigm shift from reactive population-based medicine to proactive personalized medicine by focusing on the patient as the starting and the end point. © 2016 Society for Laboratory Automation and Screening.

  16. Multimodal browsing using VoiceXML

    NASA Astrophysics Data System (ADS)

    Caccia, Giuseppe; Lancini, Rosa C.; Peschiera, Giuseppe

    2003-06-01

With the increasing development of devices such as personal computers, WAP phones, and personal digital assistants connected to the World Wide Web, end users feel the need to browse the Internet through multiple modalities. We investigate how to create a user interface and a service distribution platform granting the user access to the Internet through standard I/O modalities and voice simultaneously. Different architectures are evaluated, suggesting the most suitable for each client terminal (PC or WAP). In particular, the design of the multimodal user-machine interface considers the synchronization issue between graphical and voice contents.

  17. A low cost, high precision extreme/harsh cold environment, autonomous sensor data gathering and transmission platform.

    NASA Astrophysics Data System (ADS)

    Chetty, S.; Field, L. A.

    2014-12-01

SWIMS III is a low-cost, autonomous sensor data-gathering platform developed specifically for extreme/harsh cold environments. The Arctic Ocean's continuing decrease in summer-time ice is related to rapidly diminishing multi-year ice caused by the effects of climate change. Ice911 Research aims to develop environmentally inert materials that, when deployed, will increase the albedo, enabling the formation and/or preservation of multi-year ice. SWIMS III's sophisticated autonomous sensors are designed to measure the albedo, weather, water temperature, and other environmental parameters. The platform uses low-cost, high-accuracy/precision sensors and an extreme-environment command and data handling computer system with satellite and terrestrial wireless communications. The system also incorporates tilt sensors and sonar-based ice thickness sensors. It is lightweight and can be deployed by hand by a single person. This presentation covers the technical and design challenges in developing and deploying these platforms.

  18. Mining Personal Data Using Smartphones and Wearable Devices: A Survey

    PubMed Central

    Rehman, Muhammad Habib ur; Liew, Chee Sun; Wah, Teh Ying; Shuja, Junaid; Daghighi, Babak

    2015-01-01

The staggering growth in smartphone and wearable device use has led to massive-scale generation of personal (user-specific) data. To explore, analyze, and extract useful information and knowledge from this deluge of personal data, one has to leverage these devices as data-mining platforms in ubiquitous, pervasive, and big data environments. This study presents the personal ecosystem where all computational resources, communication facilities, and storage and knowledge management systems are available in the user's proximity. An extensive review of recent literature has been conducted and a detailed taxonomy is presented. The performance evaluation metrics and their empirical evidence are summarized in this paper. Finally, we have highlighted some future research directions and potentially emerging application areas for personal data mining using smartphones and wearable devices. PMID:25688592

  19. Lessons learned from the usability assessment of home-based telemedicine systems.

    PubMed

    Agnisarman, Sruthy Orozhiyathumana; Chalil Madathil, Kapil; Smith, Kevin; Ashok, Aparna; Welch, Brandon; McElligott, James T

    2017-01-01

At-home telemedicine visits are quickly becoming an acceptable alternative to in-person patient visits. However, little work has been done to understand the usability of home-based telemedicine solutions. It is critical for user acceptance and real-world applicability to evaluate available telemedicine solutions within the context-specific needs of the users of this technology. To address this need, this study evaluated the usability of four home-based telemedicine software platforms: Doxy.me, Vidyo, VSee, and Polycom. Using a within-subjects experimental design, twenty participants were asked to complete a telemedicine session involving several tasks using the four platforms. Upon completion of these tasks for each platform, participants completed the IBM Computer System Usability Questionnaire (CSUQ) and the NASA Task Load Index. After completing the tasks on all four platforms, the participants completed a final post-test subjective questionnaire ranking the platforms based on their preference. Of the twenty participants, 19 completed the study. Statistically significant differences among the telemedicine software platforms were found for task completion time, total workload, mental demand, effort, frustration, preference ranking, and computer system usability scores. Usability problems with installation and account creation led to high mental demand and long task completion times, suggesting that participants preferred a system without such requirements. The majority of usability issues were identified at the telemedicine initiation phase. The findings from this study can be used by software developers to develop user-friendly telemedicine systems. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. E-Learning Application of Tarsier with Virtual Reality using Android Platform

    NASA Astrophysics Data System (ADS)

    Oroh, H. N.; Munir, R.; Paseru, D.

    2017-01-01

The spectral tarsier is a primitive primate found only in the province of North Sulawesi. These primates can be studied with an e-learning application based on Augmented Reality technology, which uses a marker held up to a computer camera to interact with a three-dimensional tarsier object. However, that application only shows the tarsier object in three dimensions, without its habitat, and requires substantial resources because it runs on a personal computer. Virtual Reality is another technology that can display three-dimensional objects; its advantage is that it can immerse the user in a virtual world on the Android platform, which requires fewer resources. We therefore applied Virtual Reality technology on the Android platform so that users can view and interact not only with the tarsiers but also with their habitat. The results of this research indicate that users can learn about the tarsier and its habitat well. Thus, the use of Virtual Reality technology in the tarsier e-learning application can help people to see, know, and learn about the spectral tarsier.

  1. Reactions of Standing Bipeds on Moving Platforms to Keep Their Balance May Increase the Amplitude of Oscillations of Platforms Satisfying Hooke’s Law

    PubMed Central

    Goldsztein, Guillermo H.

    2016-01-01

    Consider a person standing on a platform that oscillates laterally, i.e. to the right and left of the person. Assume the platform satisfies Hooke’s law. As the platform moves, the person reacts and moves its body attempting to keep its balance. We develop a simple model to study this phenomenon and show that the person, while attempting to keep its balance, may do positive work on the platform and increase the amplitude of its oscillations. The studies in this article are motivated by the oscillations in pedestrian bridges that are sometimes observed when large crowds cross them. PMID:27304857
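The mechanism described in this abstract can be illustrated with a minimal numerical sketch (not the authors' actual model): a spring-mounted platform driven by a pedestrian force assumed proportional to the platform's velocity, so the person does positive work each cycle and the amplitude grows. All parameter values are illustrative.

```python
# Minimal sketch (not the authors' model): a platform obeying Hooke's law,
# driven by a pedestrian force assumed proportional to platform velocity.
# A velocity-proportional force does positive work every cycle, so the
# oscillation amplitude grows. All parameter values are illustrative.
M, k = 1000.0, 4000.0        # platform mass (kg) and spring stiffness (N/m)
g = 50.0                     # assumed pedestrian force gain (N*s/m)
dt, steps = 1e-3, 20000      # 1 ms time step, 20 s of simulated time

x, v = 0.01, 0.0             # small initial displacement (m), starting at rest
amplitudes = []
for _ in range(steps):
    force = g * v            # person pushes in phase with platform velocity
    a = (-k * x + force) / M
    v += a * dt              # semi-implicit Euler preserves energy behavior
    x += v * dt
    amplitudes.append(abs(x))

# the oscillation grew while the "person" reacted to the moving platform
print(max(amplitudes[:1000]) < max(amplitudes[-1000:]))
```

With the gain set to zero the amplitude stays constant, which makes the role of the in-phase pedestrian force easy to see.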

  2. Reactions of Standing Bipeds on Moving Platforms to Keep Their Balance May Increase the Amplitude of Oscillations of Platforms Satisfying Hooke's Law.

    PubMed

    Goldsztein, Guillermo H

    2016-01-01

    Consider a person standing on a platform that oscillates laterally, i.e. to the right and left of the person. Assume the platform satisfies Hooke's law. As the platform moves, the person reacts and moves its body attempting to keep its balance. We develop a simple model to study this phenomenon and show that the person, while attempting to keep its balance, may do positive work on the platform and increase the amplitude of its oscillations. The studies in this article are motivated by the oscillations in pedestrian bridges that are sometimes observed when large crowds cross them.

  3. A new mobile ubiquitous computing application to control obesity: SapoFit.

    PubMed

    Rodrigues, Joel J P C; Lopes, Ivo M C; Silva, Bruno M C; Torre, Isabel de La

    2013-01-01

    The objective of this work was the proposal, design, construction and validation of a mobile health system for dietetic monitoring and assessment, called SapoFit. This application may be personalized to keep a daily personal health record of an individual's food intake and daily exercise and to share this with a social network. The initiative is a partnership with SAPO - Portugal Telecom. SapoFit uses Web services architecture, a relatively new model for distributed computing and application integration. SapoFit runs on a range of mobile platforms, and it has been implemented successfully in a range of mobile devices and has been evaluated by over 100 users. Most users strongly agree that SapoFit has an attractive design, the environment is user-friendly and intuitive, and the navigation options are clear.

  4. Solving optimization problems on computational grids.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wright, S. J.; Mathematics and Computer Science

    2001-05-01

Multiprocessor computing platforms, which have become more and more widely available since the mid-1980s, are now heavily used by organizations that need to solve very demanding computational problems. Parallel computing is now central to the culture of many research communities. Novel parallel approaches were developed for global optimization, network optimization, and direct-search methods for nonlinear optimization. Activity was particularly widespread in parallel branch-and-bound approaches for various problems in combinatorial and network optimization. As the cost of personal computers and low-end workstations has continued to fall, while the speed and capacity of processors and networks have increased dramatically, 'cluster' platforms have become popular in many settings. A somewhat different type of parallel computing platform known as a computational grid (alternatively, metacomputer) has arisen in comparatively recent times. Broadly speaking, this term refers not to a multiprocessor with identical processing nodes but rather to a heterogeneous collection of devices that are widely distributed, possibly around the globe. The advantage of such platforms is obvious: they have the potential to deliver enormous computing power. Just as obviously, however, the complexity of grids makes them very difficult to use. The Condor team, headed by Miron Livny at the University of Wisconsin, was among the pioneers in providing infrastructure for grid computations. More recently, the Globus project has developed technologies to support computations on geographically distributed platforms consisting of high-end computers, storage and visualization devices, and other scientific instruments. In 1997, we started the metaneos project as a collaborative effort between optimization specialists and the Condor and Globus groups.
Our aim was to address complex, difficult optimization problems in several areas, designing and implementing the algorithms and the software infrastructure needed to solve these problems on computational grids. This article describes some of the results we have obtained during the first three years of the metaneos project. Our efforts have led to development of the runtime support library MW for implementing algorithms with master-worker control structure on Condor platforms. This work is discussed here, along with work on algorithms and codes for integer linear programming, the quadratic assignment problem, and stochastic linear programming. Our experiences in the metaneos project have shown that cheap, powerful computational grids can be used to tackle large optimization problems of various types. In an industrial or commercial setting, the results demonstrate that one may not have to buy powerful computational servers to solve many of the large problems arising in areas such as scheduling, portfolio optimization, or logistics; the idle time on employee workstations (or, at worst, an investment in a modest cluster of PCs) may do the job. For the optimization research community, our results motivate further work on parallel, grid-enabled algorithms for solving very large problems of other types. The fact that very large problems can be solved cheaply allows researchers to better understand issues of 'practical' complexity and of the role of heuristics.
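The master-worker control structure mentioned in this abstract can be sketched as follows. This is an illustrative stand-in only: a local thread pool plays the role of the distributed MW/Condor workers, and the workload is a toy partitioned search whose objective function and ranges are invented for illustration.

```python
from concurrent.futures import ThreadPoolExecutor

# Minimal sketch of the master-worker pattern: the master partitions a
# search space into tasks, workers solve their partition independently,
# and the master merges the partial results. A local thread pool stands
# in for distributed workers; the toy objective is invented.
def worker(task):
    lo, hi = task
    # each worker maximizes f(x) = -(x - 37)**2 over its integer range
    best = max(range(lo, hi), key=lambda x: -(x - 37) ** 2)
    return best, -(best - 37) ** 2

tasks = [(i, i + 25) for i in range(0, 100, 25)]     # partitioned search space
with ThreadPoolExecutor(max_workers=4) as pool:      # master farms tasks out
    results = list(pool.map(worker, tasks))
best_x, best_val = max(results, key=lambda r: r[1])  # master merges results
print(best_x, best_val)
```

In the real MW library the same division of labor holds, but tasks are scheduled onto opportunistically available machines rather than local threads.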

  5. Los Alamos radiation transport code system on desktop computing platforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Briesmeister, J.F.; Brinkley, F.W.; Clark, B.A.

The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. These codes were originally developed many years ago and have undergone continual improvement. With a large initial effort and continued vigilance, the codes are easily portable from one type of hardware to another. The performance of scientific workstations (SWS) has evolved to the point that such platforms can be used routinely to perform sophisticated radiation transport calculations. As personal computer (PC) performance approaches that of the SWS, the hardware options for desktop radiation transport calculations expand considerably. The current status of the radiation transport codes within the LARTCS is described: MCNP, SABRINA, LAHET, ONEDANT, TWODANT, TWOHEX, and ONELD. Specifically, the authors discuss hardware systems on which the codes run and present code performance comparisons for various machines.

  6. FARSITE: a fire area simulator for fire managers

    Treesearch

    Mark A. Finney

    1995-01-01

    A fire growth model (FARSITE) has been developed for use on personal computers (PC’s). Because PC’s are commonly used by land and fire managers, this portable platform would be an accustomed means to bring fire growth modeling technology to management applications. The FARSITE model is intended for use in projecting the growth of prescribed natural fires for wilderness...

  7. Benefits and Limitations of iPads in the High School Science Classroom and a Trophic Cascade Lesson Plan

    ERIC Educational Resources Information Center

    Ward, Nicholas D.; Finley, Rachel J.; Keil, Richard G.; Clay, Tansy G.

    2013-01-01

    This study explores the utility of a set of tablet-based personal computers in the K-12 science, technology, engineering, and mathematics classroom. Specifically, a lesson on food-chain dynamics and predator-prey population controls was designed on the Apple® iPad platform and delivered to three sophomore-level ecology classes (roughly 30 students…

  8. Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays.

    PubMed

    Padmanaban, Nitish; Konrad, Robert; Stramer, Tal; Cooper, Emily A; Wetzstein, Gordon

    2017-02-28

    From the desktop to the laptop to the mobile device, personal computing platforms evolve over time. Moving forward, wearable computing is widely expected to be integral to consumer electronics and beyond. The primary interface between a wearable computer and a user is often a near-eye display. However, current generation near-eye displays suffer from multiple limitations: they are unable to provide fully natural visual cues and comfortable viewing experiences for all users. At their core, many of the issues with near-eye displays are caused by limitations in conventional optics. Current displays cannot reproduce the changes in focus that accompany natural vision, and they cannot support users with uncorrected refractive errors. With two prototype near-eye displays, we show how these issues can be overcome using display modes that adapt to the user via computational optics. By using focus-tunable lenses, mechanically actuated displays, and mobile gaze-tracking technology, these displays can be tailored to correct common refractive errors and provide natural focus cues by dynamically updating the system based on where a user looks in a virtual scene. Indeed, the opportunities afforded by recent advances in computational optics open up the possibility of creating a computing platform in which some users may experience better quality vision in the virtual world than in the real one.

  9. Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays

    NASA Astrophysics Data System (ADS)

    Padmanaban, Nitish; Konrad, Robert; Stramer, Tal; Cooper, Emily A.; Wetzstein, Gordon

    2017-02-01

    From the desktop to the laptop to the mobile device, personal computing platforms evolve over time. Moving forward, wearable computing is widely expected to be integral to consumer electronics and beyond. The primary interface between a wearable computer and a user is often a near-eye display. However, current generation near-eye displays suffer from multiple limitations: they are unable to provide fully natural visual cues and comfortable viewing experiences for all users. At their core, many of the issues with near-eye displays are caused by limitations in conventional optics. Current displays cannot reproduce the changes in focus that accompany natural vision, and they cannot support users with uncorrected refractive errors. With two prototype near-eye displays, we show how these issues can be overcome using display modes that adapt to the user via computational optics. By using focus-tunable lenses, mechanically actuated displays, and mobile gaze-tracking technology, these displays can be tailored to correct common refractive errors and provide natural focus cues by dynamically updating the system based on where a user looks in a virtual scene. Indeed, the opportunities afforded by recent advances in computational optics open up the possibility of creating a computing platform in which some users may experience better quality vision in the virtual world than in the real one.

  10. Incorporating CLIPS into a personal-computer-based Intelligent Tutoring System

    NASA Technical Reports Server (NTRS)

    Mueller, Stephen J.

    1990-01-01

A large number of Intelligent Tutoring Systems (ITS's) have been built since they were first proposed in the early 1970's. Research conducted on the use of the best of these systems has demonstrated their effectiveness in tutoring in selected domains. Computer Sciences Corporation, Applied Technology Division, Houston Operations has been tasked by the Spacecraft Software Division at NASA/Johnson Space Center (NASA/JSC) to develop a number of ITS's in a variety of domains and on many different platforms. This paper will address issues facing the development of an ITS on a personal computer using the CLIPS (C Language Integrated Production System) language. For an ITS to be widely accepted, not only must it be effective, flexible, and very responsive, it must also be capable of functioning on readily available computers. There are many issues to consider when using CLIPS to develop an ITS on a personal computer. Some of these issues are the following: when to use CLIPS and when to use a procedural language such as C, how to maximize speed and minimize memory usage, and how to decrease the time required to load your rule base once you are ready to deliver the system. Based on experiences in developing the CLIPS Intelligent Tutoring System (CLIPSITS) on an IBM PC clone and an intelligent Physics Tutor on a Macintosh II, this paper reports results on how to address some of these issues. It also suggests approaches for maintaining a powerful learning environment while delivering robust performance within the speed and memory constraints of the personal computer.

  11. Good clean fun? A content analysis of profanity in video games and its prevalence across game systems and ratings.

    PubMed

    Ivory, James D; Williams, Dmitri; Martins, Nicole; Consalvo, Mia

    2009-08-01

    Although violent video game content and its effects have been examined extensively by empirical research, verbal aggression in the form of profanity has received less attention. Building on preliminary findings from previous studies, an extensive content analysis of profanity in video games was conducted using a sample of the 150 top-selling video games across all popular game platforms (including home consoles, portable consoles, and personal computers). The frequency of profanity, both in general and across three profanity categories, was measured and compared to games' ratings, sales, and platforms. Generally, profanity was found in about one in five games and appeared primarily in games rated for teenagers or above. Games containing profanity, however, tended to contain it frequently. Profanity was not found to be related to games' sales or platforms.

  12. The Architecture of an Automatic eHealth Platform With Mobile Client for Cerebrovascular Disease Detection

    PubMed Central

    Wang, Xingce; Bie, Rongfang; Wu, Zhongke; Zhou, Mingquan; Cao, Rongfei; Xie, Lizhi; Zhang, Dong

    2013-01-01

Background In recent years, cerebrovascular disease has been the leading cause of death and adult disability in the world. This study describes an efficient approach to detecting cerebrovascular disease. Objective In order to improve cerebrovascular treatment, prevention, and care, an automatic cerebrovascular disease detection eHealth platform is designed and studied. Methods We designed an automatic eHealth platform for cerebrovascular disease detection with a four-level architecture: object control layer, data transmission layer, service supporting layer, and application service layer. The platform has eight main functions: cerebrovascular database management, preprocessing of cerebral image data, image viewing and adjustment, image cropping, compression, and measurement, cerebrovascular segmentation, 3-dimensional cerebrovascular reconstruction, cerebrovascular rendering, cerebrovascular virtual endoscopy, and automatic detection. Several key technologies were employed in the implementation of the platform. An anisotropic diffusion model was used to reduce noise. Statistical segmentation with a Gaussian-Markov random field (G-MRF) model and the Stochastic Expectation Maximization (SEM) parameter estimation method was used to realize the cerebrovascular segmentation. A ball B-spline curve was proposed to model the cerebral blood vessels. Compute Unified Device Architecture (CUDA)-based ray-casting volume rendering with curvature and boundary enhancement was used to realize the volume rendering model. We implemented the platform with a network client and a mobile phone client to fit different users. Results The implemented platform runs on a common personal computer. Experiments on 32 patients' brain computed tomography or magnetic resonance imaging data stored in the system verified the feasibility and validity of each model we proposed.
The platform is partly used in cranial nerve surgery at the First Hospital Affiliated to the General Hospital of the People's Liberation Army and in radiology at Beijing Navy General Hospital. It is also used in medical imaging teaching at Tianjin Medical University. The application results have been validated by our neurosurgeon and radiologist. Conclusions The platform appears beneficial in diagnosis of cerebrovascular disease. The long-term benefits and additional applications of this technology warrant further study. This research built a diagnosis and treatment platform for human tissue with complex geometry and topology, such as brain vessels, based on the Internet of Things. PMID:25098861
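The anisotropic diffusion step named in the Methods can be sketched as a minimal Perona-Malik iteration; the parameters, edge-stopping function, and test image below are illustrative choices, not the paper's.

```python
import numpy as np

# Minimal sketch of Perona-Malik anisotropic diffusion: smooth within
# homogeneous regions while diffusing little across strong gradients.
# kappa, lam, and niter are illustrative, not the paper's values.
def anisotropic_diffusion(img, niter=20, kappa=30.0, lam=0.2):
    img = img.astype(float).copy()
    for _ in range(niter):
        # finite differences toward the four neighbours (wrap-around edges)
        dn = np.roll(img, -1, axis=0) - img
        ds = np.roll(img, 1, axis=0) - img
        de = np.roll(img, -1, axis=1) - img
        dw = np.roll(img, 1, axis=1) - img
        # exponential edge-stopping weight suppresses flow across edges
        img += lam * sum(np.exp(-(d / kappa) ** 2) * d for d in (dn, ds, de, dw))
    return img

# synthetic test image: a sharp step edge plus Gaussian noise
noisy = np.zeros((32, 32)); noisy[:, 16:] = 100.0
noisy += np.random.default_rng(0).normal(0, 5, noisy.shape)
smooth = anisotropic_diffusion(noisy)
# homogeneous regions get smoother while the step edge survives
print(smooth[:, :8].std() < noisy[:, :8].std())
```

The point of the edge-stopping weight is what makes this preferable to plain Gaussian blurring for vessel imagery: vessel boundaries are preserved while noise inside tissue regions is averaged away.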

  13. Real-time software-based end-to-end wireless visual communications simulation platform

    NASA Astrophysics Data System (ADS)

    Chen, Ting-Chung; Chang, Li-Fung; Wong, Andria H.; Sun, Ming-Ting; Hsing, T. Russell

    1995-04-01

Wireless channel impairments pose many challenges to real-time visual communications. In this paper, we describe a real-time, software-based wireless visual communications simulation platform which can be used for performance evaluation in real time. This simulation platform consists of two personal computers serving as hosts. The major components of each PC host include a real-time programmable video codec, a wireless channel simulator, and a network interface for data transport between the two hosts. The three major components are interfaced in real time to show the interaction of various wireless channels and video coding algorithms. The programmable features of these components allow users to evaluate the performance of user-controlled wireless channel effects without physically carrying out experiments, which would be limited in scope, time-consuming, and costly. Using this simulation platform as a testbed, we have experimented with several wireless channel effects including Rayleigh fading, antenna diversity, channel filtering, symbol timing, modulation, and packet loss.
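Of the channel effects listed, Rayleigh fading is straightforward to sketch: under a flat-fading assumption each transmitted symbol is multiplied by a complex Gaussian tap whose magnitude is Rayleigh-distributed. This is a simplified stand-in (Doppler correlation from Clarke/Jakes-type models and additive noise are omitted), not the platform's actual simulator.

```python
import numpy as np

# Minimal sketch of flat Rayleigh fading: each symbol sees a zero-mean
# complex Gaussian channel tap, so the received envelope is Rayleigh.
# Doppler correlation and additive noise are deliberately omitted.
rng = np.random.default_rng(1)
n = 100_000
h = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)  # E[|h|^2] = 1
symbols = np.ones(n, dtype=complex)     # placeholder transmitted symbols
received = h * symbols                  # faded signal
envelope = np.abs(received)
print(round(float(np.mean(envelope**2)), 2))  # average power close to 1.0
```

Normalizing the tap so its mean power is 1 keeps the fading from changing the average signal level, which makes deep fades (envelope near zero) the quantity of interest.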

  14. Toward ubiquitous healthcare services with a novel efficient cloud platform.

    PubMed

    He, Chenguang; Fan, Xiaomao; Li, Ye

    2013-01-01

Ubiquitous healthcare services are becoming more and more popular, especially under the urgent demand of the global aging issue. Cloud computing has pervasive and on-demand service-oriented characteristics that fit the needs of healthcare services very well. However, dealing with multimodal, heterogeneous, and nonstationary physiological signals to provide persistent personalized services, while sustaining highly concurrent online analysis for the public, is a challenge for a general-purpose cloud. In this paper, we propose a private cloud platform architecture which includes six layers according to these specific requirements. The platform uses a message queue as a cloud engine, and each layer thereby achieves relative independence through this loosely coupled means of communication with a publish/subscribe mechanism. Furthermore, a plug-in algorithm framework is also presented, and massive semistructured or unstructured medical data are accessed adaptively by this cloud architecture. As the testing results show, the proposed cloud platform is robust, stable, and efficient, and can satisfy highly concurrent requests from ubiquitous healthcare services.
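The loosely coupled publish/subscribe communication described above can be sketched with an in-process stand-in for the message-queue engine; the topic name and payload are invented for illustration, and a production system would use a real broker rather than this toy bus.

```python
from collections import defaultdict
from queue import Queue

# Minimal sketch of publish/subscribe decoupling: layers never call each
# other directly; they only share topic names on a message bus, so each
# layer stays relatively independent. Topic and payload are invented.
class MessageBus:
    def __init__(self):
        self.subscribers = defaultdict(list)   # topic -> list of queues

    def subscribe(self, topic):
        q = Queue()
        self.subscribers[topic].append(q)
        return q

    def publish(self, topic, message):
        for q in self.subscribers[topic]:      # fan out to every subscriber
            q.put(message)

bus = MessageBus()
analysis_layer = bus.subscribe("ecg/raw")      # two layers, one topic
storage_layer = bus.subscribe("ecg/raw")
bus.publish("ecg/raw", {"patient": 42, "samples": [0.1, 0.2]})
print(analysis_layer.get()["patient"], storage_layer.qsize())
```

Because the publisher only knows the topic, new layers (e.g. an alerting service) can subscribe later without touching existing code, which is the independence property the abstract attributes to the message-queue engine.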

  15. PHIT for Duty, a Personal Health Intervention Tool for Psychological Health and Traumatic Brain Injury

    DTIC Science & Technology

    2016-06-01

The intervention runs on smartphone and tablet computer platforms, including both Google Android™ and Apple iOS based devices. Recruiting for the pilot study was very… Subject terms: PTSD, post-traumatic stress disorder, mobile health, self-help, iOS, Android, mindfulness, relaxation. The record describes a pilot study and subsequent randomized controlled trial (RCT) with post-deployed personnel, and adapting the developed system for several popular…

  16. The VISPA internet platform for outreach, education and scientific research in various experiments

    NASA Astrophysics Data System (ADS)

    van Asseldonk, D.; Erdmann, M.; Fischer, B.; Fischer, R.; Glaser, C.; Heidemann, F.; Müller, G.; Quast, T.; Rieger, M.; Urban, M.; Welling, C.

    2015-12-01

VISPA provides a graphical front-end to computing infrastructures, giving its users all the functionality needed for working conditions comparable to a personal computer. It is a framework that can be extended with custom applications to support individual needs, e.g. graphical interfaces for experiment-specific software. By design, VISPA serves as a multipurpose platform for many disciplines and experiments, as demonstrated by the following use-cases: a GUI to the analysis framework OFFLINE of the Pierre Auger collaboration, submission and monitoring of computing jobs, university teaching of hundreds of students, and outreach activity, especially in CERN's open data initiative. Serving heterogeneous user groups and applications has given us substantial experience, which helps us mature the system, i.e. improve its robustness and responsiveness and the interplay of its components. Among the lessons learned are the choice of a file system, the implementation of websockets, efficient load balancing, and the fine-tuning of existing technologies like RPC over SSH. We present the improved server setup in detail and report on the performance, user acceptance, and realized applications of the system.

  17. Optimization of image processing algorithms on mobile platforms

    NASA Astrophysics Data System (ADS)

    Poudel, Pramod; Shirvaikar, Mukul

    2011-03-01

    This work presents a technique to optimize popular image processing algorithms on mobile platforms such as cell phones, net-books and personal digital assistants (PDAs). The increasing demand for video applications like context-aware computing on mobile embedded systems requires the use of computationally intensive image processing algorithms. The system engineer has a mandate to optimize them so as to meet real-time deadlines. A methodology to take advantage of the asymmetric dual-core processor, which includes an ARM and a DSP core supported by shared memory, is presented with implementation details. The target platform chosen is the popular OMAP 3530 processor for embedded media systems. It has an asymmetric dual-core architecture with an ARM Cortex-A8 and a TMS320C64x Digital Signal Processor (DSP). The development platform was the BeagleBoard with 256 MB of NAND RAM and 256 MB SDRAM memory. The basic image correlation algorithm is chosen for benchmarking as it finds widespread application for various template matching tasks such as face-recognition. The basic algorithm prototypes conform to OpenCV, a popular computer vision library. OpenCV algorithms can be easily ported to the ARM core which runs a popular operating system such as Linux or Windows CE. However, the DSP is architecturally more efficient at handling DFT algorithms. The algorithms are tested on a variety of images and performance results are presented measuring the speedup obtained due to dual-core implementation. A major advantage of this approach is that it allows the ARM processor to perform important real-time tasks, while the DSP addresses performance-hungry algorithms.
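The basic image-correlation benchmark described above can be sketched as normalized cross-correlation in plain NumPy. This is a naive reference version for clarity, not the optimized ARM/DSP or OpenCV implementations; the image and template are synthetic.

```python
import numpy as np

# Minimal sketch of template matching via normalized cross-correlation:
# slide the template over the image and score each window by the
# correlation of mean-subtracted pixels (a naive reference version).
def match_template(image, template):
    th, tw = template.shape
    t = template - template.mean()
    best, best_pos = -np.inf, None
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            w = image[y:y + th, x:x + tw]
            w = w - w.mean()
            denom = np.sqrt((w * w).sum() * (t * t).sum())
            score = (w * t).sum() / denom if denom else 0.0
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos

img = np.random.default_rng(2).random((40, 40))
tmpl = img[12:20, 25:33].copy()       # template cut from a known location
print(match_template(img, tmpl))      # recovers (12, 25)
```

The nested Python loops are exactly the computational hot spot the paper offloads: on the OMAP platform this correlation is the part worth handing to the DSP, since it is a dense multiply-accumulate workload amenable to DFT-based acceleration.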

  18. Atlas2 Cloud: a framework for personal genome analysis in the cloud

    PubMed Central

    2012-01-01

    Background Until recently, sequencing has primarily been carried out in large genome centers, which have invested heavily in developing the computational infrastructure that enables genomic sequence analysis. Recent advances in next generation sequencing (NGS) have led to a wide dissemination of sequencing technologies and data to highly diverse research groups, and it is expected that clinical sequencing will become part of diagnostic routines shortly. However, limited access to computational infrastructure and high-quality bioinformatic tools, and the demand for personnel skilled in data analysis and interpretation, remain serious bottlenecks. To this end, cloud computing and Software-as-a-Service (SaaS) technologies can help address these issues. Results We successfully enabled the Atlas2 Cloud pipeline for personal genome analysis on two different cloud service platforms: a community cloud via the Genboree Workbench, and a commercial cloud via Amazon Web Services using the Software-as-a-Service model. We report a case study of personal genome analysis using our Atlas2 Genboree pipeline. We also outline a detailed cost structure for running Atlas2 Amazon on whole-exome capture data, providing cost projections in terms of storage, compute and I/O when running Atlas2 Amazon on a large data set. Conclusions We find that providing a web interface and an optimized pipeline clearly facilitates the use of cloud computing for personal genome analysis, but for it to be routinely used for large-scale projects there needs to be a paradigm shift in the way we develop tools, in standard operating procedures, and in funding mechanisms. PMID:23134663

  19. Atlas2 Cloud: a framework for personal genome analysis in the cloud.

    PubMed

    Evani, Uday S; Challis, Danny; Yu, Jin; Jackson, Andrew R; Paithankar, Sameer; Bainbridge, Matthew N; Jakkamsetti, Adinarayana; Pham, Peter; Coarfa, Cristian; Milosavljevic, Aleksandar; Yu, Fuli

    2012-01-01

    Until recently, sequencing has primarily been carried out in large genome centers, which have invested heavily in developing the computational infrastructure that enables genomic sequence analysis. Recent advances in next generation sequencing (NGS) have led to a wide dissemination of sequencing technologies and data to highly diverse research groups, and it is expected that clinical sequencing will become part of diagnostic routines shortly. However, limited access to computational infrastructure and high-quality bioinformatic tools, and the demand for personnel skilled in data analysis and interpretation, remain serious bottlenecks. To this end, cloud computing and Software-as-a-Service (SaaS) technologies can help address these issues. We successfully enabled the Atlas2 Cloud pipeline for personal genome analysis on two different cloud service platforms: a community cloud via the Genboree Workbench, and a commercial cloud via Amazon Web Services using the Software-as-a-Service model. We report a case study of personal genome analysis using our Atlas2 Genboree pipeline. We also outline a detailed cost structure for running Atlas2 Amazon on whole-exome capture data, providing cost projections in terms of storage, compute and I/O when running Atlas2 Amazon on a large data set. We find that providing a web interface and an optimized pipeline clearly facilitates the use of cloud computing for personal genome analysis, but for it to be routinely used for large-scale projects there needs to be a paradigm shift in the way we develop tools, in standard operating procedures, and in funding mechanisms.
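The cost structure the authors outline decomposes into storage, compute and I/O terms. A minimal sketch of such a projection, where every rate is a hypothetical placeholder rather than a figure from the paper:

```python
# Hedged sketch of the kind of cloud cost projection the authors outline;
# all rates below are invented placeholders, not figures from the paper.
def project_cost(n_samples, gb_per_sample, cpu_hours_per_sample,
                 storage_rate=0.10,   # $/GB-month (assumed)
                 compute_rate=0.50,   # $/CPU-hour (assumed)
                 io_rate=0.09):       # $/GB transferred (assumed)
    storage = n_samples * gb_per_sample * storage_rate
    compute = n_samples * cpu_hours_per_sample * compute_rate
    io = n_samples * gb_per_sample * io_rate
    return {"storage": storage, "compute": compute, "io": io,
            "total": storage + compute + io}

print(project_cost(n_samples=100, gb_per_sample=10,
                   cpu_hours_per_sample=8))  # ≈ $590 total under these rates
```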

  20. Strategies for Sharing Seismic Data Among Multiple Computer Platforms

    NASA Astrophysics Data System (ADS)

    Baker, L. M.; Fletcher, J. B.

    2001-12-01

    Seismic waveform data is readily available from a variety of sources, but it often comes in a distinct, instrument-specific data format. For example, data may come from portable seismographs, such as those made by Refraction Technology or Kinemetrics; from permanent seismograph arrays, such as the USGS Parkfield Dense Array; from public data centers, such as the IRIS Data Center; or from personal communication with other researchers through e-mail or ftp. A computer must be selected to import the data, usually whichever is most suitable for reading the originating format. However, the computer best suited for a specific analysis may not be the same one. When copies of the data are then made for analysis, a proliferation of copies of the same data results, possibly in incompatible, computer-specific formats. In addition, if an error is detected and corrected in one copy, or some other change is made, all the other copies must be updated to preserve their validity. Keeping track of what data is available, where it is located, and which copy is authoritative requires an effort that is easy to neglect. We solve this problem by importing waveform data to a shared network file server that is accessible to all the computers on our campus LAN. We use a Network Appliance file server running Sun's Network File System (NFS) software. Using an NFS client software package on each analysis computer, waveform data can then be read by our MATLAB or Fortran applications without first copying the data. Since there is a single copy of the waveform data in a single location, the NFS file system hierarchy provides an implicit, complete waveform data catalog, and the single copy is inherently authoritative.
Another part of our solution is to convert the original data into a blocked-binary format (known historically as USGS DR100 or VFBB format) that is interpreted by MATLAB or Fortran library routines available on each computer, so that the idiosyncrasies of each machine are not visible to the user. Commercial software packages, such as MATLAB, also have the ability to share data in their own formats across multiple computer platforms. Our Fortran applications can create plot files in Adobe PostScript, Illustrator, and Portable Document Format (PDF) formats; vendor support for reading these files is readily available on multiple computer platforms. We will illustrate by example our strategies for sharing seismic data among our multiple computer platforms, and we will discuss our positive and negative experiences. We will include our solutions for handling the different byte ordering, floating-point formats, and text file "end-of-line" conventions on the various computer platforms we use (6 different operating systems on 5 processor architectures).
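The byte-ordering problem mentioned above is typically solved by fixing the on-disk byte order instead of using the host's native one, so every platform reads and writes the same layout. A minimal sketch in Python (the record layout here is illustrative, not the actual DR100/VFBB format):

```python
# Portable binary I/O sketch: always pack/unpack with an explicit
# (here big-endian, ">") byte order so the file reads identically on any
# host architecture. The record layout is invented for illustration.
import struct

def pack_samples(samples):
    """Big-endian: 4-byte count followed by 32-bit float samples."""
    return struct.pack(">I", len(samples)) + \
           struct.pack(f">{len(samples)}f", *samples)

def unpack_samples(blob):
    (n,) = struct.unpack_from(">I", blob, 0)
    return list(struct.unpack_from(f">{n}f", blob, 4))

blob = pack_samples([1.0, -2.5, 0.25])
print(unpack_samples(blob))  # [1.0, -2.5, 0.25]
```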

  1. Dance-the-Music: an educational platform for the modeling, recognition and audiovisual monitoring of dance steps using spatiotemporal motion templates

    NASA Astrophysics Data System (ADS)

    Maes, Pieter-Jan; Amelynck, Denis; Leman, Marc

    2012-12-01

    In this article, a computational platform entitled "Dance-the-Music" is presented that can be used in a dance educational context to explore and learn the basics of dance steps. By introducing a method based on spatiotemporal motion templates, the platform makes it possible to train basic step models from sequentially repeated dance figures performed by a dance teacher. Movements are captured with an optical motion capture system. The teachers' models can be visualized from a first-person perspective to instruct students how to perform the specific dance steps correctly. Moreover, recognition algorithms based on a template matching method can determine the quality of a student's performance in real time by means of multimodal monitoring techniques. The results of an evaluation study suggest that Dance-the-Music is effective in helping dance students master the basics of dance figures.

  2. Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays

    PubMed Central

    Padmanaban, Nitish; Konrad, Robert; Stramer, Tal; Wetzstein, Gordon

    2017-01-01

    From the desktop to the laptop to the mobile device, personal computing platforms evolve over time. Moving forward, wearable computing is widely expected to be integral to consumer electronics and beyond. The primary interface between a wearable computer and a user is often a near-eye display. However, current generation near-eye displays suffer from multiple limitations: they are unable to provide fully natural visual cues and comfortable viewing experiences for all users. At their core, many of the issues with near-eye displays are caused by limitations in conventional optics. Current displays cannot reproduce the changes in focus that accompany natural vision, and they cannot support users with uncorrected refractive errors. With two prototype near-eye displays, we show how these issues can be overcome using display modes that adapt to the user via computational optics. By using focus-tunable lenses, mechanically actuated displays, and mobile gaze-tracking technology, these displays can be tailored to correct common refractive errors and provide natural focus cues by dynamically updating the system based on where a user looks in a virtual scene. Indeed, the opportunities afforded by recent advances in computational optics open up the possibility of creating a computing platform in which some users may experience better quality vision in the virtual world than in the real one. PMID:28193871
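The focus-cue logic described above reduces to simple lens optics: to place the virtual image at the depth the user is gazing at, the display sets an optical power of 1/distance (in diopters), offset by the user's refractive error. A sketch with an invented function name (the formula is standard optics; the API is not from the paper):

```python
# Hedged sketch of the control rule behind a gaze-contingent adaptive
# focus display: drive the focus-tunable lens so the virtual image sits
# at the gazed depth, compensating the user's refractive error.
# power (diopters) = 1 / distance (meters) + refractive error (diopters)
def lens_power_diopters(fixation_depth_m, refractive_error_d=0.0):
    """Optical power needed to place focus at the gazed depth."""
    if fixation_depth_m <= 0:
        raise ValueError("depth must be positive")
    return 1.0 / fixation_depth_m + refractive_error_d

# A 2 m fixation for an emmetrope vs. a -1.5 D myope:
print(lens_power_diopters(2.0))        # 0.5 D
print(lens_power_diopters(2.0, -1.5))  # -1.0 D
```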

  3. Patient-Centered e-Health Record over the Cloud.

    PubMed

    Koumaditis, Konstantinos; Themistocleous, Marinos; Vassilacopoulos, George; Prentza, Andrianna; Kyriazis, Dimosthenis; Malamateniou, Flora; Maglaveras, Nicos; Chouvarda, Ioanna; Mourouzis, Alexandros

    2014-01-01

    The purpose of this paper is to introduce the Patient-Centered e-Health (PCEH) conceptual aspects alongside a multidisciplinary project that combines state-of-the-art technologies such as cloud computing. By combining several aspects of PCEH, such as (a) an electronic Personal Healthcare Record (e-PHR), (b) homecare telemedicine technologies, and (c) e-prescribing, e-referral and e-learning, with advanced technologies like cloud computing and Service Oriented Architecture (SOA), the project will lead to an innovative integrated e-health platform with many benefits for society, the economy, industry, and the research community. To achieve this, a consortium of experts, both from industry (two companies, one hospital and one healthcare organization) and academia (three universities), was formed to investigate, analyse, design, build and test the new platform. This paper provides insights into the PCEH concept and the current stage of the project. In doing so, we aim at increasing awareness of this important endeavor and sharing the lessons learned so far throughout our work.

  4. Address-event-based platform for bioinspired spiking systems

    NASA Astrophysics Data System (ADS)

    Jiménez-Fernández, A.; Luján, C. D.; Linares-Barranco, A.; Gómez-Rodríguez, F.; Rivas, M.; Jiménez, G.; Civit, A.

    2007-05-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity between a huge number of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timings), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Neurons generate "events" according to their activity levels: more active neurons generate more events per unit time and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. When building multi-chip, multi-layered AER systems, it is absolutely necessary to have a computer interface that allows (a) reading AER interchip traffic into the computer and visualizing it on the screen, and (b) converting a conventional frame-based video stream in the computer into AER and injecting it at some point of the AER structure. This is necessary for testing and debugging complex AER systems. On the other hand, the use of a commercial personal computer implies depending on software tools and operating systems that can make the system slower and less robust. This paper addresses the problem of connecting several AER-based chips to compose a powerful processing system. The problem was discussed in the Neuromorphic Engineering Workshop of 2006. The platform is based on an embedded computer, a powerful FPGA and serial links, making the system faster and stand-alone (independent of a PC). A new platform is presented that allows connecting up to eight AER-based chips to a Spartan 3 4000 FPGA. The FPGA is responsible for the Address-Event network communication and, at the same time, for mapping and transforming the address space of the traffic to implement preprocessing.
A microprocessor with an MMU (an Intel XScale 400 MHz Gumstix Connex computer) is also connected to the FPGA, allowing the platform to implement event-based algorithms that interact with the AER system, such as control algorithms, network connectivity, USB support, etc. The LVDS transceiver allows a bandwidth of up to 1.32 Gbps, around 66 mega events per second (Mevps).
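A back-of-the-envelope check of the quoted link figures, together with a toy encoder that time-multiplexes spikes into (timestamp, address) events; the implied 20-bit event size is inferred from the quoted numbers, and the event layout is invented, not that of the actual chips:

```python
# Sanity check of the quoted link budget: 1.32 Gbps at ~66 Mevps implies
# roughly 20 bits per address event (inferred, not stated in the abstract).
link_bps = 1.32e9            # LVDS bandwidth quoted above
events_per_s = 66e6          # ~66 Mevps quoted above
print(link_bps / events_per_s)  # 20.0 bits per event

def to_aer(spike_trains):
    """Flatten per-neuron spike times into a time-sorted (t, address)
    event stream -- more active neurons naturally emit more events."""
    events = [(t, addr)
              for addr, times in enumerate(spike_trains)
              for t in times]
    return sorted(events)

# Neuron 0 fires twice, neuron 1 once; the channel carries them in order.
print(to_aer([[0.002, 0.009], [0.005]]))
```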

  5. A novel collaborative e-learning platform for medical students - ALERT STUDENT

    PubMed Central

    2014-01-01

    Background The increasing complexity of medical curricula would benefit from adaptive computer-supported collaborative learning systems that support study management using instructional design and learning object principles. However, to our knowledge, there are scarce reports regarding applications developed to meet this goal that encompass the complete medical curriculum. The aim of this study was to develop and assess the usability of an adaptive computer-supported collaborative learning system for medical students to manage study sessions. Results A study platform named ALERT STUDENT was built as a free web application. Content chunks are represented as Flashcards that hold knowledge and open-ended questions. These can be created in a collaborative fashion. Multiple Flashcards can be combined into custom stacks called Notebooks that can be accessed in study Groups that belong to the user's institution. The system provides a Study Mode that features text markers, text notes, timers and color-coded content prioritization based on self-assessment of open-ended questions presented in a Quiz Mode. Time spent studying and perception of knowledge are displayed for each student and peers using charts. Computer-supported collaborative learning is achieved by allowing simultaneous creation of Notebooks and self-assessment questions by many users in a pre-defined Group. Past personal performance data is retrieved when studying new Notebooks containing previously studied Flashcards. Self-report surveys showed that students highly agreed that the system was useful and were willing to use it as a reference tool. Conclusions The platform employs various instructional design and learning object principles in a computer-supported collaborative learning platform for medical students that allows for study management. The application broadens student insight into learning results and supports informed decisions based on past learning performance.
It serves as a potential educational model for the medical education setting and has gathered strong positive feedback from students at our school. This platform provides a case study on how instructional design and learning object principles can be blended effectively to manage study, and takes an important step towards bringing information management tools to support study decisions and improve learning outcomes. PMID:25017028

  6. A novel collaborative e-learning platform for medical students - ALERT STUDENT.

    PubMed

    Taveira-Gomes, Tiago; Saffarzadeh, Areo; Severo, Milton; Guimarães, M Jorge; Ferreira, Maria Amélia

    2014-07-14

    The increasing complexity of medical curricula would benefit from adaptive computer-supported collaborative learning systems that support study management using instructional design and learning object principles. However, to our knowledge, there are scarce reports regarding applications developed to meet this goal that encompass the complete medical curriculum. The aim of this study was to develop and assess the usability of an adaptive computer-supported collaborative learning system for medical students to manage study sessions. A study platform named ALERT STUDENT was built as a free web application. Content chunks are represented as Flashcards that hold knowledge and open-ended questions. These can be created in a collaborative fashion. Multiple Flashcards can be combined into custom stacks called Notebooks that can be accessed in study Groups that belong to the user's institution. The system provides a Study Mode that features text markers, text notes, timers and color-coded content prioritization based on self-assessment of open-ended questions presented in a Quiz Mode. Time spent studying and perception of knowledge are displayed for each student and peers using charts. Computer-supported collaborative learning is achieved by allowing simultaneous creation of Notebooks and self-assessment questions by many users in a pre-defined Group. Past personal performance data is retrieved when studying new Notebooks containing previously studied Flashcards. Self-report surveys showed that students highly agreed that the system was useful and were willing to use it as a reference tool. The platform employs various instructional design and learning object principles in a computer-supported collaborative learning platform for medical students that allows for study management. The application broadens student insight into learning results and supports informed decisions based on past learning performance.
It serves as a potential educational model for the medical education setting and has gathered strong positive feedback from students at our school. This platform provides a case study on how instructional design and learning object principles can be blended effectively to manage study, and takes an important step towards bringing information management tools to support study decisions and improve learning outcomes.
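The Flashcard/Notebook structure and color-coded prioritization described above can be sketched as follows (illustrative only, not the authors' implementation):

```python
# Minimal sketch of the data model described in the abstract: flashcards
# grouped into notebooks and prioritized by the student's self-assessed
# score on open-ended questions (weakest first, mimicking the
# color-coded prioritization). Names and scores are invented.
from dataclasses import dataclass, field

@dataclass
class Flashcard:
    topic: str
    question: str
    self_score: int = 0          # 0 (unknown) .. 5 (mastered)

@dataclass
class Notebook:
    name: str
    cards: list = field(default_factory=list)

    def study_order(self):
        """Weakest cards first, as a color-coded priority list would show."""
        return sorted(self.cards, key=lambda c: c.self_score)

nb = Notebook("Cardiology", [
    Flashcard("ECG", "Name the ECG waves.", self_score=4),
    Flashcard("Murmurs", "Grade systolic murmurs.", self_score=1),
])
print([c.topic for c in nb.study_order()])  # ['Murmurs', 'ECG']
```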

  7. Portable Computer Technology (PCT) Research and Development Program Phase 2

    NASA Technical Reports Server (NTRS)

    Castillo, Michael; McGuire, Kenyon; Sorgi, Alan

    1995-01-01

    This project report focuses on: (1) design and development of two Advanced Portable Workstation 2 (APW 2) units. These units incorporate advanced technology features such as a low-power Pentium processor, a high-resolution color display, National Television Standards Committee (NTSC) video handling capabilities, a Personal Computer Memory Card International Association (PCMCIA) interface, and Small Computer System Interface (SCSI) and Ethernet interfaces. (2) Use of these units to integrate and demonstrate advanced wireless network and portable video capabilities. (3) Qualification of the APW 2 systems for use in specific experiments aboard the Mir Space Station. A major objective of the PCT Phase 2 program was to help guide future choices in computing platforms and techniques for meeting National Aeronautics and Space Administration (NASA) mission objectives, with a focus on developing optimal configurations of computing hardware, software applications, and network technologies for use on NASA missions.

  8. Computing Platforms for Big Biological Data Analytics: Perspectives and Challenges.

    PubMed

    Yin, Zekun; Lan, Haidong; Tan, Guangming; Lu, Mian; Vasilakos, Athanasios V; Liu, Weiguo

    2017-01-01

    The last decade has witnessed an explosion in the amount of available biological sequence data, due to the rapid progress of high-throughput sequencing projects. However, the amount of biological data is becoming so great that traditional data analysis platforms and methods can no longer meet the need to rapidly perform data analysis tasks in the life sciences. As a result, both biologists and computer scientists face the challenge of gaining profound insight into the deepest biological functions from big biological data, which in turn requires massive computational resources. Therefore, high performance computing (HPC) platforms are greatly needed, as are efficient and scalable algorithms that can take advantage of these platforms. In this paper, we survey the state-of-the-art HPC platforms for big biological data analytics. We first list the characteristics of big biological data and popular computing platforms. Then we provide a taxonomy of different biological data analysis applications and a survey of the way they have been mapped onto various computing platforms. After that, we present a case study to compare the efficiency of different computing platforms for handling the classical biological sequence alignment problem. Finally, we discuss the open issues in big biological data analytics.
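The case study's benchmark, the classical sequence alignment problem, is a dynamic-programming kernel. As an illustration (not code from the survey), a plain Smith-Waterman local alignment score in Python; the HPC platforms surveyed parallelize exactly this kind of recurrence:

```python
# Baseline Smith-Waterman local alignment score (scoring parameters are
# illustrative defaults, not from the survey). The O(len(a)*len(b))
# dynamic-programming table is what GPU/FPGA/cluster versions accelerate.
def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
            # Local alignment: scores never drop below zero.
            H[i][j] = max(0, diag, H[i-1][j] + gap, H[i][j-1] + gap)
            best = max(best, H[i][j])
    return best

print(smith_waterman("ACGT", "ACGT"))  # identical sequences: 4 matches * 2 = 8
print(smith_waterman("ACACACTA", "AGCACACA"))
```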

  9. A scalable parallel black oil simulator on distributed memory parallel computers

    NASA Astrophysics Data System (ADS)

    Wang, Kun; Liu, Hui; Chen, Zhangxin

    2015-11-01

    This paper presents our work on developing a parallel black oil simulator for distributed memory computers based on our in-house parallel platform. The parallel simulator is designed to overcome the performance issues of common simulators that are implemented for personal computers and workstations. The finite difference method is applied to discretize the black oil model. In addition, some advanced techniques are employed to strengthen the robustness and parallel scalability of the simulator, including an inexact Newton method, matrix decoupling methods, and algebraic multigrid methods. A new multi-stage preconditioner is proposed to accelerate the solution of linear systems from the Newton methods. Numerical experiments show that our simulator is scalable and efficient, and is capable of simulating extremely large-scale black oil problems with tens of millions of grid blocks using thousands of MPI processes on parallel computers.
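The inexact Newton method mentioned above accepts an approximate linear solve at each nonlinear step, requiring only that the linear residual satisfy ||F + J dx|| <= eta ||F||. A scalar sketch (a toy equation, not the black oil system; the early-stopped iterative solver, e.g. multigrid-preconditioned, is simulated by damping the exact step):

```python
# Inexact Newton sketch: damping the exact step by (1 - eta) leaves a
# linear-model residual of exactly eta*|F|, i.e. it satisfies the
# inexact-Newton forcing condition ||F + J dx|| <= eta ||F|| that an
# early-stopped iterative linear solver would meet in the real simulator.
def inexact_newton(F, J, x0, eta=0.1, tol=1e-10, max_iter=50):
    x = x0
    for _ in range(max_iter):
        f = F(x)
        if abs(f) < tol:
            break
        dx = -(f / J(x)) * (1 - eta)   # approximate linear solve
        x += dx
    return x

# Toy problem: solve x^2 - 2 = 0.
root = inexact_newton(F=lambda x: x * x - 2.0, J=lambda x: 2 * x, x0=1.0)
print(root)  # converges to sqrt(2) ≈ 1.4142
```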

  10. The design of an m-Health monitoring system based on a cloud computing platform

    NASA Astrophysics Data System (ADS)

    Xu, Boyi; Xu, Lida; Cai, Hongming; Jiang, Lihong; Luo, Yang; Gu, Yizhi

    2017-01-01

    Compared to traditional medical services provided within hospitals, m-Health monitoring systems (MHMSs) face more challenges in personalised health data processing. To achieve personalised and high-quality health monitoring by means of new technologies, such as mobile network and cloud computing, in this paper, a framework of an m-Health monitoring system based on a cloud computing platform (Cloud-MHMS) is designed to implement pervasive health monitoring. Furthermore, the modules of the framework, which are Cloud Storage and Multiple Tenants Access Control Layer, Healthcare Data Annotation Layer, and Healthcare Data Analysis Layer, are discussed. In the data storage layer, a multiple tenant access method is designed to protect patient privacy. In the data annotation layer, linked open data are adopted to augment health data interoperability semantically. In the data analysis layer, the process mining algorithm and similarity calculating method are implemented to support personalised treatment plan selection. These three modules cooperate to implement the core functions in the process of health monitoring, which are data storage, data processing, and data analysis. Finally, we study the application of our architecture in the monitoring of antimicrobial drug usage to demonstrate the usability of our method in personal healthcare analysis.
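The similarity-calculating step in the analysis layer might, for example, rank historical cases by cosine similarity to a patient's feature vector when selecting a treatment plan. A hedged sketch (features, vectors and plan names below are illustrative only, not from the paper):

```python
# Toy version of a similarity-based treatment plan ranking: compare a
# patient's feature vector against historical cases with cosine
# similarity and rank the associated plans. All data is invented.
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

patient = [1.0, 0.2, 0.7]   # e.g. age band, dosage history, lab index
cases = {"plan_A": [0.9, 0.1, 0.8],
         "plan_B": [0.1, 1.0, 0.0]}
ranked = sorted(cases, key=lambda k: cosine(patient, cases[k]), reverse=True)
print(ranked)  # plan_A ranks first for this patient
```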

  11. A PC-based multispectral scanner data evaluation workstation: Application to Daedalus scanners

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary J.; James, Mark W.; Smith, Matthew R.; Atkinson, Robert J.

    1991-01-01

    In late 1989, a personal computer (PC)-based data evaluation workstation was developed to support postflight processing of Multispectral Atmospheric Mapping Sensor (MAMS) data. The MAMS Quick View System (QVS) is an image analysis and display system designed to provide the capability to evaluate Daedalus scanner data immediately after an aircraft flight. Even in its original form, the QVS offered the portability of a personal computer with the advanced analysis and display features of a mainframe image analysis system. It was recognized, however, that the original QVS had its limitations, both in speed and in processing of MAMS data. Recent efforts are presented that focus on overcoming these earlier limitations and adapting the system to a new data tape structure. In doing so, the enhanced Quick View System (QVS2) will accommodate data from any of the four spectrometers used with the Daedalus scanner on the NASA ER-2 platform. The QVS2 is designed around an AST 486/33 MHz personal computer with 10 EISA expansion slots, a keyboard, and 4.0 Mbytes of memory. Specialized PC-McIDAS software provides the main image analysis and display capability for the system, and image analysis and display of the digital scanner data are accomplished with it.

  12. Programmable personality interface for the dynamic infrared scene generator (IRSG2)

    NASA Astrophysics Data System (ADS)

    Buford, James A., Jr.; Mobley, Scott B.; Mayhall, Anthony J.; Braselton, William J.

    1998-07-01

    As scene generator platforms begin to rely specifically on commercial off-the-shelf (COTS) hardware and software components, high-speed programmable personality interfaces (PPIs) are required for interfacing to infrared (IR) flight computers/processors and complex IR projectors in hardware-in-the-loop (HWIL) simulation facilities. Recent technological advances and innovative applications of established technologies are beginning to allow development of cost-effective PPIs to interface to COTS scene generators. At the U.S. Army Aviation and Missile Command (AMCOM) Missile Research, Development, and Engineering Center (MRDEC), researchers have developed such a PPI to reside between the AMCOM MRDEC IR Scene Generator (IRSG) and either a missile flight computer or the dynamic Laser Diode Array Projector (LDAP). AMCOM MRDEC has developed several PPIs for the first and second generation IRSGs (IRSG1 and IRSG2), which are based on Silicon Graphics Incorporated (SGI) Onyx and Onyx2 computers with Reality Engine 2 (RE2) and Infinite Reality (IR/IR2) graphics engines. This paper provides an overview of PPIs designed, integrated, tested, and verified at AMCOM MRDEC, specifically the IRSG2's PPI.

  13. Mobile computing in critical care.

    PubMed

    Lapinsky, Stephen E

    2007-03-01

    Handheld computing devices are increasingly used by health care workers, and offer a mobile platform for point-of-care information access. Improved technology, with larger memory capacity, higher screen resolution, faster processors, and wireless connectivity has broadened the potential roles for these devices in critical care. In addition to the personal information management functions, handheld computers have been used to access reference information, management guidelines and pharmacopoeias as well as to track the educational experience of trainees. They can act as an interface with a clinical information system, providing rapid access to patient information. Despite their popularity, these devices have limitations related to their small size, and acceptance by physicians has not been uniform. In the critical care environment, the risk of transmitting microorganisms by such a portable device should always be considered.

  14. IVAG: An Integrative Visualization Application for Various Types of Genomic Data Based on R-Shiny and the Docker Platform.

    PubMed

    Lee, Tae-Rim; Ahn, Jin Mo; Kim, Gyuhee; Kim, Sangsoo

    2017-12-01

    Next-generation sequencing (NGS) technology has become a trend in the genomics research area. There are many software programs and automated pipelines to analyze NGS data, which can ease the pain for traditional scientists who are not familiar with computer programming. However, downstream analyses, such as finding differentially expressed genes or visualizing linkage disequilibrium maps and genome-wide association study (GWAS) data, still remain a challenge. Here, we introduce a dockerized web application written in R using the Shiny platform to visualize pre-analyzed RNA sequencing and GWAS data. In addition, we have integrated a genome browser based on the JBrowse platform and an automated intermediate parsing process required for custom track construction, so that users can easily build and navigate their personal genome tracks with in-house datasets. This application will help scientists perform a series of downstream analyses and obtain a more integrative understanding of various types of genomic data by interactively visualizing them with customizable options.

  15. A Web Browsing System by Eye-gaze Input

    NASA Astrophysics Data System (ADS)

    Abe, Kiyohiko; Owada, Kosuke; Ohi, Shoichi; Ohyama, Minoru

    We have developed an eye-gaze input system for people with severe physical disabilities, such as amyotrophic lateral sclerosis (ALS) patients. This system utilizes a personal computer and a home video camera to detect eye-gaze under natural light. The system detects both vertical and horizontal eye-gaze by simple image analysis, and does not require special image processing units or sensors. We also developed a platform for eye-gaze input based on our system. In this paper, we propose a new web browsing system for physically disabled computer users as an application of this platform. The proposed web browsing system uses a method of direct indicator selection: indicators are categorized by function and organized hierarchically, and users select the desired function by switching between indicator groups. The system also analyzes the locations of selectable objects on a web page, such as hyperlinks, radio buttons, and edit boxes, and stores them; the mouse cursor can then skip directly to a candidate input object. This enables web browsing at a faster pace.
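The cursor-skipping behavior described above amounts to snapping the pointer to the nearest stored selectable object. A minimal sketch (not the authors' code; coordinates and object records are invented):

```python
# Toy version of cursor snapping for gaze input: given an estimated gaze
# point, jump the cursor to the nearest stored selectable object
# (hyperlink, radio button, edit box) on the page.
import math

def snap_cursor(gaze, objects):
    """Return the selectable object closest to the gaze estimate."""
    return min(objects, key=lambda o: math.dist(gaze, o["center"]))

objects = [
    {"kind": "hyperlink", "center": (120, 40)},
    {"kind": "edit box",  "center": (300, 200)},
]
print(snap_cursor((115, 50), objects)["kind"])  # 'hyperlink'
```

Snapping to discrete targets is what makes a noisy gaze estimate usable: the cursor never has to land exactly on a link, only nearer to it than to any other target.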

  16. Integration of the virtual model of a Stewart platform with the avatar of a vehicle in a virtual reality

    NASA Astrophysics Data System (ADS)

    Herbuś, K.; Ociepka, P.

    2016-08-01

    The development of computer-aided design and engineering methods allows conducting virtual tests, among others concerning the motion simulation of technical equipment. The paper presents a method of integrating a virtual model of a Stewart platform with an avatar of a vehicle moving in a virtual environment. The problem area includes issues related to the fidelity with which the behavior of the analyzed technical equipment is mapped. The main object of investigation is a 3D model of a Stewart platform, a subsystem of a simulator intended for driver training for disabled persons. The model of the platform, prepared for motion simulation, was created in the "Motion Simulation" module of the CAD/CAE-class system Siemens PLM NX, whereas the virtual environment in which the avatar of the passenger car moves was elaborated in the VR-class system EON Studio. The element integrating the two software environments is a developed application that reads information from the virtual reality (VR) environment concerning the current position of the car avatar and then, based on the adopted algorithm, sends control signals to the respective joints of the Stewart platform model (CAD).
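The control signals sent to the platform's joints ultimately come from inverse kinematics: each actuator length follows from the platform's pose. A toy sketch with invented geometry (translation only; rotation omitted for brevity, and none of the numbers are from the simulator):

```python
# Inverse-kinematics sketch for a Stewart platform: given a pose of the
# moving plate, each of the six actuator lengths is just the distance
# between its base anchor and its (displaced) platform anchor.
# The anchor geometry below is a toy hexagonal layout.
import math

BASE = [(math.cos(a), math.sin(a), 0.0)
        for a in (math.radians(60 * i) for i in range(6))]
PLATFORM = [(0.5 * math.cos(a), 0.5 * math.sin(a), 0.0)
            for a in (math.radians(60 * i + 30) for i in range(6))]

def leg_lengths(tx, ty, tz):
    """Actuator lengths for a pure translation (tx, ty, tz) of the plate."""
    lengths = []
    for (bx, by, bz), (px, py, pz) in zip(BASE, PLATFORM):
        dx, dy, dz = px + tx - bx, py + ty - by, pz + tz - bz
        lengths.append(math.sqrt(dx * dx + dy * dy + dz * dz))
    return lengths

# Raising the platform straight up lengthens every leg equally (symmetry).
print([round(l, 4) for l in leg_lengths(0.0, 0.0, 1.0)])
```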

  17. SARANA: language, compiler and run-time system support for spatially aware and resource-aware mobile computing.

    PubMed

    Hari, Pradip; Ko, Kevin; Koukoumidis, Emmanouil; Kremer, Ulrich; Martonosi, Margaret; Ottoni, Desiree; Peh, Li-Shiuan; Zhang, Pei

    2008-10-28

    Increasingly, spatial awareness plays a central role in many distributed and mobile computing applications. Spatially aware applications rely on information about the geographical position of compute devices and their supported services in order to support novel functionality. While many spatial application drivers already exist in mobile and distributed computing, very little systems research has explored how best to program these applications, to express their spatial and temporal constraints, and to allow efficient implementations on highly dynamic real-world platforms. This paper proposes the SARANA system architecture, which includes language and run-time system support for spatially aware and resource-aware applications. SARANA allows users to express spatial regions of interest, as well as trade-offs between quality of result (QoR), latency and cost. The goal is to produce applications that use resources efficiently and that can be run on diverse resource-constrained platforms ranging from laptops to personal digital assistants and to smart phones. SARANA's run-time system manages QoR and cost trade-offs dynamically by tracking resource availability and locations, brokering usage/pricing agreements and migrating programs to nodes accordingly. A resource cost model permeates the SARANA system layers, permitting users to express their resource needs and QoR expectations in units that make sense to them. Although we are still early in the system development, initial versions have been demonstrated on a nine-node system prototype.
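    SARANA's QoR/latency/cost brokering can be caricatured as a constrained selection problem: among candidate nodes that satisfy a latency bound, pick the one offering the best quality of result per unit cost. The sketch below is a deliberately simplified model of that trade-off; the field names and selection rule are assumptions for illustration, not SARANA's actual API.

```python
def pick_node(nodes, max_latency):
    """Resource-aware node selection: among nodes meeting the latency
    bound, choose the one maximizing quality-of-result (QoR) per unit
    cost. Returns None if no node is feasible."""
    feasible = [n for n in nodes if n["latency"] <= max_latency]
    return max(feasible, key=lambda n: n["qor"] / n["cost"], default=None)

candidates = [
    {"name": "laptop", "latency": 0.2, "qor": 0.9, "cost": 5.0},
    {"name": "pda",    "latency": 0.1, "qor": 0.5, "cost": 1.0},
    {"name": "phone",  "latency": 0.8, "qor": 0.7, "cost": 0.5},
]
print(pick_node(candidates, 0.5)["name"])  # pda
```

A real broker would also track dynamic resource availability and migrate running programs, but the core trade-off reduces to a feasibility filter plus a utility ranking of this kind.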

  18. Energy Consumption Management of Virtual Cloud Computing Platform

    NASA Astrophysics Data System (ADS)

    Li, Lin

    2017-11-01

    Research on energy consumption management for virtual cloud computing platforms requires a deeper understanding of the energy consumption of both virtual machines and the cloud platform itself; only then can the problems facing energy consumption management be solved. The key problem lies in data centers with high energy consumption, which creates a pressing need for new scientific techniques. Virtualization technology and cloud computing have become powerful tools in daily life, work and production because of their strengths and many advantages, including high resource-utilization rates, and both are developing rapidly. The presence of virtualization and cloud computing technologies is therefore essential in the constantly developing information age. This paper summarizes, explains and further analyzes the energy consumption management questions of the virtual cloud computing platform, giving readers a clearer understanding of the topic and offering help to various aspects of life, work and production.

  19. SPAIDE: A Real-time Research Platform for the Clarion CII/90K Cochlear Implant

    NASA Astrophysics Data System (ADS)

    Van Immerseel, L.; Peeters, S.; Dykmans, P.; Vanpoucke, F.; Bracke, P.

    2005-12-01

    SPAIDE (sound-processing algorithm integrated development environment) is a real-time platform of Advanced Bionics Corporation (Sylmar, Calif, USA) to facilitate advanced research on sound-processing and electrical-stimulation strategies with the Clarion CII and 90K implants. The platform is meant for testing in the laboratory. SPAIDE is conceptually based on a clear separation of the sound-processing and stimulation strategies and, in particular, on the distinction between sound-processing and stimulation channels and electrode contacts. The development environment has a user-friendly interface for specifying sound-processing and stimulation strategies, and includes the possibility to simulate the electrical stimulation. SPAIDE allows real-time sound capture from file or audio input on a PC, sound processing and application of the stimulation strategy, and streaming of the results to the implant. The platform covers a broad range of research applications, from noise reduction and mimicking of normal hearing, through complex (simultaneous) stimulation strategies, to psychophysics. The hardware setup consists of a personal computer, an interface board, and a speech processor. The software is both expandable and to a great extent reusable in other applications.

  20. FORCEnet Net Centric Architecture - A Standards View

    DTIC Science & Technology

    2006-06-01

    [Figure: layered service architecture — USER-FACING SERVICES and SHARED SERVICES atop a SERVICE FRAMEWORK and SERVICE PLATFORM, with NETWORKING/COMMUNICATIONS, STORAGE, COMPUTING PLATFORM, DATA INTERCHANGE/INTEGRATION, DATA MANAGEMENT, and APPLICATION layers.]

  1. Person detection and tracking with a 360° lidar system

    NASA Astrophysics Data System (ADS)

    Hammer, Marcus; Hebel, Marcus; Arens, Michael

    2017-10-01

    360° LiDAR (Light Detection and Ranging) sensors, available for a number of years now, make it easy to generate dense point clouds of a sensor's environment. Interpreting these data is much more challenging: for automated data evaluation, the detection and classification of objects is a fundamental task. Especially in urban scenarios, moving objects such as persons or vehicles are of particular interest, for instance in automatic collision avoidance, mobile sensor platforms or surveillance tasks. The literature offers several approaches to automated person detection in point clouds. While most techniques show acceptable detection results, computation time is often the crucial factor: the runtime can be problematic, especially given the amount of data in panoramic 360° point clouds, yet most applications require object detection and classification in real time. The paper presents a proposal for a fast, real-time capable algorithm for person detection, classification and tracking in panoramic point clouds.

  2. Pre-Hardware Optimization of Spacecraft Image Processing Algorithms and Hardware Implementation

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Petrick, David J.; Flatley, Thomas P.; Hestnes, Phyllis; Jentoft-Nilsen, Marit; Day, John H. (Technical Monitor)

    2002-01-01

    Spacecraft telemetry rates and telemetry product complexity have steadily increased over the last decade, presenting a problem for real-time processing by ground facilities. This paper proposes a solution to a related problem for the Geostationary Operational Environmental Satellite (GOES-8) image data processing and color picture generation application. Although large supercomputer facilities are the obvious heritage solution, they are very costly, making it imperative to seek a feasible alternative engineering solution at a fraction of the cost. The proposed solution is based on a Personal Computer (PC) platform and a synergy of optimized software algorithms and reconfigurable computing (RC) hardware technologies, such as Field Programmable Gate Arrays (FPGA) and Digital Signal Processors (DSP). It has been shown that this approach can provide superior inexpensive performance for a chosen application on the ground station or on board a spacecraft.

  3. Intelligent Systems for Assessing Aging Changes: Home-Based, Unobtrusive, and Continuous Assessment of Aging

    PubMed Central

    Maxwell, Shoshana A.; Mattek, Nora; Hayes, Tamara L.; Dodge, Hiroko; Pavel, Misha; Jimison, Holly B.; Wild, Katherine; Boise, Linda; Zitzelberger, Tracy A.

    2011-01-01

    Objectives. To describe a longitudinal community cohort study, Intelligent Systems for Assessing Aging Changes, which has deployed an unobtrusive home-based assessment platform in many seniors' homes in the community. Methods. Several types of sensors have been installed in the homes of 265 elderly persons for an average of 33 months. Metrics assessed by the sensors include total daily activity, time out of home, and walking speed. Participants were given a computer as well as training, and computer usage was monitored. Participants are assessed annually with health and function questionnaires, physical examinations, and neuropsychological testing. Results. Mean age was 83.3 years, mean years of education was 15.5, and 73% of the cohort were women. During a 4-week snapshot, participants left their home twice a day on average for a total of 208 min per day. Mean in-home walking speed was 61.0 cm/s. Participants spent 43% of days on the computer, averaging 76 min per day. Discussion. These results demonstrate for the first time the feasibility of engaging seniors in a large-scale deployment of in-home activity assessment technology and the successful collection of these activity metrics. We plan to use this platform to determine whether continuous unobtrusive monitoring may detect incident cognitive decline. PMID:21743050

  4. Seqcrawler: biological data indexing and browsing platform.

    PubMed

    Sallou, Olivier; Bretaudeau, Anthony; Roult, Aurelien

    2012-07-24

    Seqcrawler takes its roots in software like SRS or Lucegene. It provides an indexing platform to ease the search of data and meta-data in biological banks, and it can scale to face the current flow of data. While many biological bank search tools are available on the Internet, mainly provided by large organizations to search their own data, there is a lack of free and open-source solutions for browsing one's own set of data with a flexible query system, able to scale from a single computer to a cloud system. A personal index platform will help labs and bioinformaticians search their meta-data and also build a larger information system with custom subsets of data. The software is scalable from a single computer to a cloud-based infrastructure. It has been successfully tested in a private cloud with 3 index shards (pieces of index) hosting ~400 million sequence records (whole GenBank, UniProt, PDB and others) for a total size of 600 GB in a fault-tolerant (high-availability) architecture. It has also been successfully integrated with software that adds extra meta-data from BLAST results to enhance users' result analysis. Seqcrawler provides a complete open-source search-and-store solution for labs or platforms needing to manage large amounts of data/meta-data with a flexible and customizable web interface. All components (search engine, visualization and data storage), though independent, share a common and coherent data system that can be queried through a simple HTTP interface. The solution scales easily and can also provide a high-availability infrastructure.

  5. Using e-Learning Platforms for Mastery Learning in Developmental Mathematics Courses

    ERIC Educational Resources Information Center

    Boggs, Stacey; Shore, Mark; Shore, JoAnna

    2004-01-01

    Many colleges and universities have adopted e-learning platforms to utilize computers as an instructional tool in developmental (i.e., beginning and intermediate algebra) mathematics courses. An e-learning platform is a computer program used to enhance course instruction via computers and the Internet. Allegany College of Maryland is currently…

  6. Macedonian journal of chemistry and chemical engineering: open journal systems--editor's perspective.

    PubMed

    Zdravkovski, Zoran

    2014-01-01

    The development and availability of personal computers, software and printing techniques over the last twenty years have profoundly changed the publication of scientific journals. Additionally, the Internet has revolutionized the publication process over the last decade, to the point of changing the basic paradigm of printed journals. The Macedonian Journal of Chemistry and Chemical Engineering in its 40-year history has adopted and adapted to all these transformations. In order to keep up with the inevitable changes, as editor-in-chief I felt it was my responsibility to introduce electronic editorial management of the journal. The choice was between commercial and open-source platforms, and because of the journal's limited funding we chose the latter. We decided on Open Journal Systems, which provides online submission and management of all content; offers flexible configuration of requirements, sections, the review process, etc.; has options for comprehensive indexing; offers various reading tools; provides email notification and commenting for readers; has an option for thesis abstracts; and can be installed locally. However, since support is limited, it requires moderate computer skills and some effort to set up. Overall, it is an excellent editorial platform and a convenient solution for journals with a low budget, journals that do not want to spend their resources on commercial platforms, or journals that simply support the idea of open-source software.

  7. Novel droplet platforms for the detection of disease biomarkers.

    PubMed

    Zec, Helena; Shin, Dong Jin; Wang, Tza-Huei

    2014-09-01

    Personalized medicine - healthcare based on individual genetic variation - has the potential to transform the way healthcare is delivered to patients. The promise of personalized medicine has been predicated on the predictive and diagnostic power of genomic and proteomic biomarkers. Biomarker screening may help improve health outcomes, for example, by identifying individuals' susceptibility to diseases and predicting how patients will respond to drugs. Microfluidic droplet technology offers an exciting opportunity to revolutionize the accessibility of personalized medicine. A framework for the role of droplet microfluidics in biomarker detection can be based on two main themes. Emulsion-based microdroplet platforms can provide new ways to measure and detect biomolecules. In addition, microdroplet platforms facilitate high-throughput screening of biomarkers. Meanwhile, surface-based droplet platforms provide an opportunity to develop miniaturized diagnostic systems. These platforms may function as portable benchtop environments that dramatically shorten the transition of a benchtop assay into a point-of-care format.

  8. Boutiques: a flexible framework to integrate command-line applications in computing platforms.

    PubMed

    Glatard, Tristan; Kiar, Gregory; Aumentado-Armstrong, Tristan; Beck, Natacha; Bellec, Pierre; Bernard, Rémi; Bonnet, Axel; Brown, Shawn T; Camarasu-Pop, Sorina; Cervenansky, Frédéric; Das, Samir; Ferreira da Silva, Rafael; Flandin, Guillaume; Girard, Pascal; Gorgolewski, Krzysztof J; Guttmann, Charles R G; Hayot-Sasson, Valérie; Quirion, Pierre-Olivier; Rioux, Pierre; Rousseau, Marc-Étienne; Evans, Alan C

    2018-05-01

    We present Boutiques, a system to automatically publish, integrate, and execute command-line applications across computational platforms. Boutiques applications are installed through software containers described in a rich and flexible JSON language. A set of core tools facilitates the construction, validation, import, execution, and publishing of applications. Boutiques is currently supported by several distinct virtual research platforms, and it has been used to describe dozens of applications in the neuroinformatics domain. We expect Boutiques to improve the quality of application integration in computational platforms, to reduce redundancy of effort, to contribute to computational reproducibility, and to foster Open Science.
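    The JSON-described command-line integration can be sketched as follows. The field names below mirror Boutiques' published descriptor schema, but this toy descriptor and the substitution helper are illustrative only, not the project's actual tooling.

```python
# Minimal illustrative descriptor in the spirit of Boutiques' JSON format
# (field names follow the published schema; treat details as approximate).
descriptor = {
    "name": "echo_tool",
    "description": "Toy example",
    "tool-version": "1.0",
    "schema-version": "0.5",
    "command-line": "echo [MESSAGE]",
    "inputs": [
        {"id": "message", "name": "Message", "type": "String",
         "value-key": "[MESSAGE]", "optional": False}
    ],
}

def render_command(desc, values):
    """Substitute concrete input values into the command-line template
    by replacing each input's value-key placeholder."""
    cmd = desc["command-line"]
    for inp in desc["inputs"]:
        if inp["id"] in values:
            cmd = cmd.replace(inp["value-key"], str(values[inp["id"]]))
    return cmd

print(render_command(descriptor, {"message": "hello"}))  # echo hello
```

Keeping the invocation template declarative in this way is what lets a platform validate, publish, and execute a containerized tool without tool-specific glue code.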

  9. A high performance scientific cloud computing environment for materials simulations

    NASA Astrophysics Data System (ADS)

    Jorissen, K.; Vila, F. D.; Rehr, J. J.

    2012-09-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and monitoring performance, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability in a JAVA-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.

  10. Playable stories: Making programming and 3D role-playing game design personally and socially relevant

    NASA Astrophysics Data System (ADS)

    Ingram-Goble, Adam

    This is an exploratory design study of a novel system for learning programming and 3D role-playing game design as tools for social change. The study was conducted at two sites; participants were ages 9-14 and worked for up to 15 hours with the platform to learn how to program and design video games with personally or socially relevant narratives. The first study was successful in that students learned to program a narrative game, and they viewed the social-problem framing of the practices as an interesting aspect of the experience. The second study provided illustrative examples of how providing less general structure up-front afforded players the opportunity to produce the necessary structures as needed for their particular designs, and therefore to develop a richer understanding of what those structures represented. This study demonstrates that participants were not only able to use computational thinking skills such as Boolean and conditional logic, planning, modeling, abstraction, and encapsulation, but were also able to bridge these skills to social domains they cared about. In particular, participants created stories about socially relevant topics without explicit pushes by the instructors. The findings also suggest that the rapid uptake and successful creation of personally and socially relevant narratives may have been facilitated by close alignment between the conceptual tools represented in the platform and the domain of 3D role-playing games.

  11. A vector-product information retrieval system adapted to heterogeneous, distributed computing environments

    NASA Technical Reports Server (NTRS)

    Rorvig, Mark E.

    1991-01-01

    Vector-product information retrieval (IR) systems produce retrieval results superior to all other searching methods but presently have no commercial implementations beyond the personal computer environment. The NASA Electronic Library System (NELS) provides a ranked list of the most likely relevant objects in collections in response to a natural language query. Additionally, the system is constructed using standards and tools (Unix, X Windows, Motif, and TCP/IP) that permit its operation in organizations that possess many different hosts, workstations, and platforms. There are no known commercial equivalents to this product at this time. The product has applications in all corporate management environments, particularly those that are information intensive, such as finance, manufacturing, biotechnology, and research and development.

  12. Medical diagnostics with mobile devices: Comparison of intrinsic and extrinsic sensing.

    PubMed

    Kwon, L; Long, K D; Wan, Y; Yu, H; Cunningham, B T

    2016-01-01

    We review the recent development of mobile detection instruments used for medical diagnostics, and consider the relative advantages of approaches that utilize the internal sensing capabilities of commercially available mobile communication devices (such as smartphones and tablet computers) compared to those that utilize a custom external sensor module. In this review, we focus specifically upon mobile medical diagnostic platforms that are being developed to serve the need in global health, personalized medicine, and point-of-care diagnostics. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. A novel sensor platform for the rapid hydraulic characterisation of freshwater ecosystems

    NASA Astrophysics Data System (ADS)

    Kriechbaumer, Thomas; Blackburn, Kim; Breckon, Toby; Gill, Andrew; Everard, Nick; Wright, Ros; Rivas Casado, Monica

    2014-05-01

    The spatially explicit quantification of hydraulic features provides valuable information for the physical habitat assessment of freshwater ecosystems. Collecting data on water velocities and depths with in-situ current meters or acoustic sensors on tethered boats is time-consuming and requires good site accessibility. Moreover, on smaller rivers precise spatial data referencing can be challenging, as river-bank vegetation can block the sky view to navigation satellites over a considerable proportion of the water surface. This paper describes the development and testing of a new, small remote-controlled sensor platform and a novel approach to spatial data referencing based on computer vision, enabling the rapid hydraulic characterisation of habitats in small rivers. It highlights the manifold opportunities that recent achievements in computer science and electronics can create for the environmental sciences. The platform carries an acoustic Doppler current profiler (ADCP) to rapidly collect large amounts of data on water velocities and river depths, from which the spatial and temporal water velocity distributions can be derived. The 1.30 m long and 0.60 m wide platform hull has been designed for single-person deployment. Platform pitch and roll magnitudes and periods are quantified at 512 Hz by a low-cost on-board inertial measurement unit, allowing quantification of the errors these platform motions can cause in the ADCP data. Jet propulsion and a tail thruster ensure high manoeuvrability, minimum-draught operation and greater safety than propellers. An on-board Raspberry Pi computer enables time-synchronised logging of data from a GPS unit, the ADCP and further sensors that may be added to the platform. Real-time serial communication between the Raspberry Pi and the embedded propulsion-system control (an Arduino Uno microcontroller) builds the basis for future platform autonomy, which can enable the autonomous implementation of pre-defined data collection strategies. Through field experiments, a set of technologies for positioning the platform in the river environment has been evaluated. Simultaneous localisation and mapping (SLAM) based on frames from a stereo camera has been identified as a promising alternative to satellite-based platform positioning. In terrestrial environments, SLAM has recently achieved high position accuracies, comparable with those of differential GPS. Software implementing SLAM for the river environment is currently under development; this constitutes the first application of visual SLAM on water and, to the authors' knowledge, its first application in the context of environmental research. Furthermore, platform tracking with a motorised total station has been found to be a highly accurate (cm-level) positioning technique despite fast platform movements, as long as line of sight to the tracked object is maintained. In the near future, the platform will be used to characterise the hydraulic conditions downstream of fish passes in order to rapidly assess the attractiveness of these facilities to migrating fish species. Several of the applied technologies (e.g. Raspberry Pi, Arduino) are cheap and easily accessible, providing a multitude of opportunities to facilitate data collection and prototype development in the environmental sciences.
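    The IMU-based error accounting mentioned above can be illustrated with a first-order tilt correction: a nominally vertical acoustic beam tilted by pitch and roll overestimates depth, and the vertical component is recovered by projecting the slant range. This is a generic small-angle sketch, not the platform's actual processing chain.

```python
import math

def tilt_corrected_depth(slant_depth, pitch_deg, roll_deg):
    """Project a slant-range depth reading onto the vertical, given
    platform pitch and roll in degrees (first-order approximation)."""
    pitch = math.radians(pitch_deg)
    roll = math.radians(roll_deg)
    return slant_depth * math.cos(pitch) * math.cos(roll)

print(tilt_corrected_depth(10.0, 0.0, 0.0))  # 10.0
```

At the few-degree tilts typical of a small hull in moving water, the correction is small but systematic, which is why logging pitch and roll at high rate alongside the ADCP data matters.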

  14. Boutiques: a flexible framework to integrate command-line applications in computing platforms

    PubMed Central

    Glatard, Tristan; Kiar, Gregory; Aumentado-Armstrong, Tristan; Beck, Natacha; Bellec, Pierre; Bernard, Rémi; Bonnet, Axel; Brown, Shawn T; Camarasu-Pop, Sorina; Cervenansky, Frédéric; Das, Samir; Ferreira da Silva, Rafael; Flandin, Guillaume; Girard, Pascal; Gorgolewski, Krzysztof J; Guttmann, Charles R G; Hayot-Sasson, Valérie; Quirion, Pierre-Olivier; Rioux, Pierre; Rousseau, Marc-Étienne; Evans, Alan C

    2018-01-01

    We present Boutiques, a system to automatically publish, integrate, and execute command-line applications across computational platforms. Boutiques applications are installed through software containers described in a rich and flexible JSON language. A set of core tools facilitates the construction, validation, import, execution, and publishing of applications. Boutiques is currently supported by several distinct virtual research platforms, and it has been used to describe dozens of applications in the neuroinformatics domain. We expect Boutiques to improve the quality of application integration in computational platforms, to reduce redundancy of effort, to contribute to computational reproducibility, and to foster Open Science. PMID:29718199

  15. A Lightweight Remote Parallel Visualization Platform for Interactive Massive Time-varying Climate Data Analysis

    NASA Astrophysics Data System (ADS)

    Li, J.; Zhang, T.; Huang, Q.; Liu, Q.

    2014-12-01

    Today's climate datasets feature large volume, a high degree of spatiotemporal complexity, and fast evolution over time. Visualizing large distributed climate datasets is computationally intensive, and traditional desktop-based visualization applications cannot handle this computational load. Recently, scientists have developed remote visualization techniques to address the issue; such techniques usually leverage server-side parallel computing capabilities to perform visualization tasks and deliver the results to clients through the network. In this research, we aim to build a remote parallel visualization platform for visualizing and analyzing massive climate data. Our platform is built on ParaView, one of the most popular open-source remote visualization and analysis applications. To further enhance the scalability and stability of the platform, we have employed cloud computing techniques to support its deployment. All climate datasets in the platform are regular grid data stored in NetCDF format. Three data access methods are supported: accessing remote datasets provided by OPeNDAP servers, accessing datasets hosted on the web visualization server, and accessing local datasets. Regardless of the access method, all visualization tasks are completed on the server side to reduce the workload of clients. As a proof of concept, we have implemented a set of scientific visualization methods to show the feasibility of the platform. Preliminary results indicate that the framework can address the computational limitations of desktop-based visualization applications.

  16. Hybrid 3D printing: a game-changer in personalized cardiac medicine?

    PubMed

    Kurup, Harikrishnan K N; Samuel, Bennett P; Vettukattil, Joseph J

    2015-12-01

    Three-dimensional (3D) printing in congenital heart disease has the potential to increase procedural efficiency and patient safety by improving interventional and surgical planning and reducing radiation exposure. Cardiac magnetic resonance imaging and computed tomography are usually the source datasets to derive 3D printing. More recently, 3D echocardiography has been demonstrated to derive 3D-printed models. The integration of multiple imaging modalities for hybrid 3D printing has also been shown to create accurate printed heart models, which may prove to be beneficial for interventional cardiologists, cardiothoracic surgeons, and as an educational tool. Further advancements in the integration of different imaging modalities into a single platform for hybrid 3D printing and virtual 3D models will drive the future of personalized cardiac medicine.

  17. Intra-Personal and Inter-Personal Kinetic Synergies During Jumping.

    PubMed

    Slomka, Kajetan; Juras, Grzegorz; Sobota, Grzegorz; Furmanek, Mariusz; Rzepko, Marian; Latash, Mark L

    2015-12-22

    We explored synergies between two legs and two subjects during preparation for a long jump into a target. Synergies were expected during one-person jumping. No such synergies were expected between two persons jumping in parallel without additional contact, while synergies were expected to emerge with haptic contact and become stronger with strong mechanical contact. Subjects performed jumps either alone (each foot standing on a separate force platform) or in dyads (parallel to each other, each person standing on a separate force platform) without any contact, with haptic contact, and with strong coupling. Strong negative correlations between pairs of force variables (strong synergies) were seen in the vertical force in one-person jumps and weaker synergies in two-person jumps with the strong contact. For other force variables, only weak synergies were present in one-person jumps and no negative correlations between pairs of force variables for two-person jumps. Pairs of moment variables from the two force platforms at steady state showed positive correlations, which were strong in one-person jumps and weaker, but still significant, in two-person jumps with the haptic and strong contact. Anticipatory synergy adjustments prior to action initiation were observed in one-person trials only. We interpret the different results for the force and moment variables at steady state as reflections of postural sway.
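    The synergies reported here rest on negative co-variation between the two force platforms' readings, which can be checked with a plain Pearson correlation. The sketch below uses illustrative, perfectly compensating left/right vertical forces; the data and function name are not from the study.

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# Perfectly compensating left/right vertical forces -> strong negative r
left = [510, 495, 505, 490, 500]
right = [490, 505, 495, 510, 500]
print(pearson_r(left, right))  # -1.0
```

A strongly negative r between the two platforms' forces is what the authors would call a strong synergy: trial-to-trial variation in one leg is compensated by the other, stabilizing the total.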

  18. Intra-Personal and Inter-Personal Kinetic Synergies During Jumping

    PubMed Central

    Slomka, Kajetan; Juras, Grzegorz; Sobota, Grzegorz; Furmanek, Mariusz; Rzepko, Marian; Latash, Mark L.

    2015-01-01

    We explored synergies between two legs and two subjects during preparation for a long jump into a target. Synergies were expected during one-person jumping. No such synergies were expected between two persons jumping in parallel without additional contact, while synergies were expected to emerge with haptic contact and become stronger with strong mechanical contact. Subjects performed jumps either alone (each foot standing on a separate force platform) or in dyads (parallel to each other, each person standing on a separate force platform) without any contact, with haptic contact, and with strong coupling. Strong negative correlations between pairs of force variables (strong synergies) were seen in the vertical force in one-person jumps and weaker synergies in two-person jumps with the strong contact. For other force variables, only weak synergies were present in one-person jumps and no negative correlations between pairs of force variables for two-person jumps. Pairs of moment variables from the two force platforms at steady state showed positive correlations, which were strong in one-person jumps and weaker, but still significant, in two-person jumps with the haptic and strong contact. Anticipatory synergy adjustments prior to action initiation were observed in one-person trials only. We interpret the different results for the force and moment variables at steady state as reflections of postural sway. PMID:26839608

  19. An environment for representing and using medical checklists on mobile devices.

    PubMed

    Losiouk, Eleonora; Lanzola, Giordano; Visetti, Enrico; Quaglini, Silvana

    2015-01-01

    Checklists have recently been introduced into medical practice, playing the role of summarized guidelines streamlined for rapid consultation. However, some barriers still prevent their widespread diffusion. These concern the representation, dissemination and updating of their underlying knowledge, as well as the means currently adopted for their actual use, which is still mostly paper-based. In this paper we propose a new platform for the implementation and use of checklists. First, an editor supports domain experts in porting a checklist from the traditional paper-based format into an electronic one. Then, an application allows the distribution and use of checklists on portable devices such as smartphones and tablets, exploiting their additional features in comparison with those made available by personal computers. The platform will be illustrated through some examples designed to support volunteers and paramedic staff in dealing with emergency situations.

  20. Formal design and verification of a reliable computing platform for real-time control. Phase 1: Results

    NASA Technical Reports Server (NTRS)

    Divito, Ben L.; Butler, Ricky W.; Caldwell, James L.

    1990-01-01

    A high-level design is presented for a reliable computing platform for real-time control applications. Design tradeoffs and analyses related to the development of the fault-tolerant computing platform are discussed. The architecture is formalized and shown to satisfy a key correctness property. The reliable computing platform uses replicated processors and majority voting to achieve fault tolerance. Under the assumption of a majority of processors working in each frame, it is shown that the replicated system computes the same results as a single processor system not subject to failures. Sufficient conditions are obtained to establish that the replicated system recovers from transient faults within a bounded amount of time. Three different voting schemes are examined and proved to satisfy the bounded recovery time conditions.
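    The replicated-processor voting scheme summarized above can be conveyed with a minimal sketch (illustrative Python only; `majority_vote` is an invented name, not code from the verified design):

```python
from collections import Counter

def majority_vote(outputs):
    """Return the value reported by a strict majority of replicas,
    or None when no strict majority exists (too many simultaneous faults)."""
    value, count = Counter(outputs).most_common(1)[0]
    return value if count > len(outputs) // 2 else None

# Three replicas, one transient fault: voting masks the faulty output.
print(majority_vote([42, 42, 7]))  # -> 42
# All three replicas disagree: no majority, so the fault is not masked.
print(majority_vote([42, 7, 9]))   # -> None
```

    Under the paper's assumption that a majority of processors work in each frame, the voted result matches the output of a fault-free single-processor system.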

  1. The Identity Mapping Project: Demographic differences in patterns of distributed identity.

    PubMed

    Gilbert, Richard L; Dionisio, John David N; Forney, Andrew; Dorin, Philip

    2015-01-01

    The advent of cloud computing and a multi-platform digital environment is giving rise to a new phase of human identity called "The Distributed Self." In this conception, aspects of the self are distributed into a variety of 2D and 3D digital personas with the capacity to reflect any number of combinations of now malleable personality traits. In this way, the source of human identity remains internal and embodied, but the expression or enactment of the self becomes increasingly external, disembodied, and distributed on demand. The Identity Mapping Project (IMP) is an interdisciplinary collaboration between psychology and computer science designed to empirically investigate the development of distributed forms of identity. Methodologically, it collects a large database of "identity maps" - computerized graphical representations of how active someone is online and how their identity is expressed and distributed across 7 core digital domains: email, blogs/personal websites, social networks, online forums, online dating sites, character-based digital games, and virtual worlds. The current paper reports on gender and age differences in online identity based on an initial database of distributed identity profiles.

  2. Evaluation of Emerging Energy-Efficient Heterogeneous Computing Platforms for Biomolecular and Cellular Simulation Workloads.

    PubMed

    Stone, John E; Hallock, Michael J; Phillips, James C; Peterson, Joseph R; Luthey-Schulten, Zaida; Schulten, Klaus

    2016-05-01

    Many of the continuing scientific advances achieved through computational biology are predicated on the availability of ongoing increases in computational power required for detailed simulation and analysis of cellular processes on biologically-relevant timescales. A critical challenge facing the development of future exascale supercomputer systems is the development of new computing hardware and associated scientific applications that dramatically improve upon the energy efficiency of existing solutions, while providing increased simulation, analysis, and visualization performance. Mobile computing platforms have recently become powerful enough to support interactive molecular visualization tasks that were previously only possible on laptops and workstations, creating future opportunities for their convenient use for meetings, remote collaboration, and as head mounted displays for immersive stereoscopic viewing. We describe early experiences adapting several biomolecular simulation and analysis applications for emerging heterogeneous computing platforms that combine power-efficient system-on-chip multi-core CPUs with high-performance massively parallel GPUs. We present low-cost power monitoring instrumentation that provides sufficient temporal resolution to evaluate the power consumption of individual CPU algorithms and GPU kernels. We compare the performance and energy efficiency of scientific applications running on emerging platforms with results obtained on traditional platforms, identify hardware and algorithmic performance bottlenecks that affect the usability of these platforms, and describe avenues for improving both the hardware and applications in pursuit of the needs of molecular modeling tasks on mobile devices and future exascale computers.
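    The power-monitoring measurements described above ultimately reduce to integrating a sampled power trace over time. A minimal sketch of that step (illustrative only, assuming uniform sampling; `energy_joules` is an invented name, not the paper's instrumentation code):

```python
def energy_joules(power_watts, dt_seconds):
    """Trapezoidal integration of a uniformly sampled power trace (W)
    into energy (J): E = integral of P dt."""
    p = list(power_watts)
    return dt_seconds * (sum(p) - 0.5 * (p[0] + p[-1]))

# A kernel drawing a steady 10 W for 2 s, sampled every 0.5 s:
trace = [10.0, 10.0, 10.0, 10.0, 10.0]
print(energy_joules(trace, 0.5))  # -> 20.0 (joules)
```

    Energy-efficiency comparisons then divide useful work (e.g., simulated timesteps) by the joules consumed per kernel or algorithm.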

  3. AIRE-Linux

    NASA Astrophysics Data System (ADS)

    Zhou, Jianfeng; Xu, Benda; Peng, Chuan; Yang, Yang; Huo, Zhuoxi

    2015-08-01

    AIRE-Linux is a dedicated Linux system for astronomers. Modern astronomy faces two big challenges: massive observed raw data which covers the whole electromagnetic spectrum, and overmuch professional data processing skill which exceeds personal or even a small team's abilities. AIRE-Linux, which is a specially designed Linux and will be distributed to users by Virtual Machine (VM) images in Open Virtualization Format (OVF), is to help astronomers confront the challenges. Most astronomical software packages, such as IRAF, MIDAS, CASA, Heasoft etc., will be integrated into AIRE-Linux. It is easy for astronomers to configure and customize the system and use what they just need. When incorporated into cloud computing platforms, AIRE-Linux will be able to handle data intensive and computing consuming tasks for astronomers. Currently, a Beta version of AIRE-Linux is ready for download and testing.

  4. Apps seeking theories: results of a study on the use of health behavior change theories in cancer survivorship mobile apps.

    PubMed

    Vollmer Dahlke, Deborah; Fair, Kayla; Hong, Y Alicia; Beaudoin, Christopher E; Pulczinski, Jairus; Ory, Marcia G

    2015-03-27

    Thousands of mobile health apps are now available for use on mobile phones for a variety of uses and conditions, including cancer survivorship. Many of these apps appear to deliver health behavior interventions but may fail to consider design considerations grounded in human-computer interface and health behavior change theories. This study is designed to assess the presence of, and the manner in which, health behavior change and health communication theories are applied in mobile phone cancer survivorship apps. The research team selected a set of criteria-based health apps for mobile phones and evaluated each app using qualitative coding methods to assess the application of health behavior change and communication theories. Each app was assessed using a coding scheme derived from the taxonomy of 26 health behavior change techniques by Abraham and Michie, with a few important changes based on the characteristics of mHealth apps that are specific to information processing and human computer interaction, such as control theory and feedback systems. A total of 68 mobile phone apps and games built on the iOS and Android platforms were coded, 65 of them unique. Using Cohen's kappa, the inter-rater reliability for the iOS apps was 86.1 (P<.001) and for the Android apps, 77.4 (P<.001). For the most part, the scores for inclusion of theory-based health behavior change characteristics in the iOS platform cancer survivorship apps were consistently higher than those of the Android platform apps. For personalization and tailoring, 67% of the iOS apps (24/36) had these elements, compared to 38% of the Android apps (12/32). In the area of prompting for intention formation, 67% of the iOS apps (34/36) indicated these elements, compared to 16% (5/32) of the Android apps. Mobile apps are rapidly emerging as a way to deliver health behavior change interventions that can be tailored or personalized for individuals. As these apps and games continue to evolve and include interactive and adaptive sensors and other forms of dynamic feedback, their content and interventional elements need to be grounded in human-computer interface design and health behavior and communication theory and practice.
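    The inter-rater reliability figures above are Cohen's kappa values (apparently reported as percentages). The statistic itself is simple to compute for two raters; a minimal sketch, assuming nominal codes and equal-length rating lists:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: (p_o - p_e) / (1 - p_e), where p_o is observed
    agreement and p_e is the agreement expected by chance."""
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_e = sum((rater_a.count(l) / n) * (rater_b.count(l) / n)
              for l in labels)
    return (p_o - p_e) / (1 - p_e)

# Perfect agreement gives kappa = 1; chance-level agreement gives 0.
print(cohens_kappa([1, 1, 0, 0], [1, 1, 0, 0]))  # -> 1.0
print(cohens_kappa([1, 1, 0, 0], [1, 0, 1, 0]))  # -> 0.0
```

    Unlike raw percent agreement, kappa discounts the agreement two coders would reach by guessing, which is why it is the standard reliability measure for qualitative coding studies like this one.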

  5. Apps Seeking Theories: Results of a Study on the Use of Health Behavior Change Theories in Cancer Survivorship Mobile Apps

    PubMed Central

    Fair, Kayla; Hong, Y Alicia; Beaudoin, Christopher E; Pulczinski, Jairus; Ory, Marcia G

    2015-01-01

    Background: Thousands of mobile health apps are now available for use on mobile phones for a variety of uses and conditions, including cancer survivorship. Many of these apps appear to deliver health behavior interventions but may fail to consider design considerations grounded in human-computer interface and health behavior change theories. Objective: This study is designed to assess the presence of, and the manner in which, health behavior change and health communication theories are applied in mobile phone cancer survivorship apps. Methods: The research team selected a set of criteria-based health apps for mobile phones and evaluated each app using qualitative coding methods to assess the application of health behavior change and communication theories. Each app was assessed using a coding scheme derived from the taxonomy of 26 health behavior change techniques by Abraham and Michie, with a few important changes based on the characteristics of mHealth apps that are specific to information processing and human computer interaction, such as control theory and feedback systems. Results: A total of 68 mobile phone apps and games built on the iOS and Android platforms were coded, 65 of them unique. Using Cohen’s kappa, the inter-rater reliability for the iOS apps was 86.1 (P<.001) and for the Android apps, 77.4 (P<.001). For the most part, the scores for inclusion of theory-based health behavior change characteristics in the iOS platform cancer survivorship apps were consistently higher than those of the Android platform apps. For personalization and tailoring, 67% of the iOS apps (24/36) had these elements, compared to 38% of the Android apps (12/32). In the area of prompting for intention formation, 67% of the iOS apps (34/36) indicated these elements, compared to 16% (5/32) of the Android apps. Conclusions: Mobile apps are rapidly emerging as a way to deliver health behavior change interventions that can be tailored or personalized for individuals. As these apps and games continue to evolve and include interactive and adaptive sensors and other forms of dynamic feedback, their content and interventional elements need to be grounded in human-computer interface design and health behavior and communication theory and practice. PMID:25830810

  6. Development of a computer model to predict platform station keeping requirements in the Gulf of Mexico using remote sensing data

    NASA Technical Reports Server (NTRS)

    Barber, Bryan; Kahn, Laura; Wong, David

    1990-01-01

    Offshore operations such as oil drilling and radar monitoring require semisubmersible platforms to remain stationary at specific locations in the Gulf of Mexico. Ocean currents, wind, and waves in the Gulf of Mexico tend to move platforms away from their desired locations. A computer model was created to predict the station keeping requirements of a platform. The computer simulation uses remote sensing data from satellites and buoys as input. A background of the project, alternate approaches to the project, and the details of the simulation are presented.
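    A station-keeping model of this kind ultimately balances environmental loads against thruster or mooring capacity, and the core load estimate is a quadratic drag law. An illustrative sketch with made-up numbers (not the actual NASA model; `drag_force` and all parameter values are hypothetical):

```python
def drag_force(rho, cd, area, speed):
    """Quadratic drag: F = 0.5 * rho * Cd * A * v^2, in newtons."""
    return 0.5 * rho * cd * area * speed ** 2

# Hypothetical platform: 500 m^2 wetted area, Cd = 1.0, in a 1 m/s
# Gulf current (seawater density ~1025 kg/m^3).
thrust_needed = drag_force(rho=1025.0, cd=1.0, area=500.0, speed=1.0)
print(thrust_needed)  # -> 256250.0 (N), roughly 256 kN to hold station
```

    The quadratic dependence means a doubled current speed quadruples the required thrust, which is why remotely sensed current and wind estimates feed directly into the station-keeping prediction.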

  7. Smart adaptable system for older adults' Daily Life Activities Management - The ABLE platform.

    PubMed

    Giokas, Kostas; Anastasiou, Athanasios; Tsirmpas, Charalampos; Koutsouri, Georgia; Koutsouris, Dimitris; Iliopoulou, Dimitra

    2014-01-01

    In this paper we propose a system (ABLE) that will act as the main platform for a number of low-cost, mature technologies to be integrated in order to create a dynamically adaptive Daily Life Activities Management environment that facilitates the everyday life of senior (but not exclusively) citizens at home. While the main target group of ABLE's users is the ageing population, its use can be extended to all people who are vulnerable or atypical in body, intellect or emotions and are categorized by society as disabled. The classes of assistive products that are well defined in the international standard ISO 9999, such as assistive products for personal medical treatment, personal care and protection, communication, information and reaction, and personal mobility, can be easily incorporated in our proposed platform. Furthermore, our platform could integrate and implement the above classes under several service models that will be analyzed further.

  8. Ethical sharing of health data in online platforms - which values should be considered?

    PubMed

    Riso, Brígida; Tupasela, Aaro; Vears, Danya F; Felzmann, Heike; Cockbain, Julian; Loi, Michele; Kongsholm, Nana C H; Zullo, Silvia; Rakic, Vojin

    2017-08-21

    Intensified and extensive data production and data storage are characteristics of contemporary western societies. Health data sharing is increasing with the growth of Information and Communication Technology (ICT) platforms devoted to the collection of personal health and genomic data. However, the sensitive and personal nature of health data poses ethical challenges when data is disclosed and shared, even if for scientific research purposes. With this in mind, the Science and Values Working Group of the COST Action CHIP ME 'Citizen's Health through public-private Initiatives: Public health, Market and Ethical perspectives' (IS 1303) identified six core values they considered to be essential for the ethical sharing of health data using ICT platforms. We believe that using this ethical framework will promote respectful scientific practices in order to maintain individuals' trust in research. We use these values to analyse five ICT platforms and explore how emerging data sharing platforms are reconfiguring the data sharing experience from a range of perspectives. We discuss which types of values, rights and responsibilities they entail and enshrine within their philosophy or outlook on what it means to share personal health information. Through this discussion we address issues of the design and the development process of personal health data and patient-oriented infrastructures, as well as new forms of technologically-mediated empowerment.

  9. Open source acceleration of wave optics simulations on energy efficient high-performance computing platforms

    NASA Astrophysics Data System (ADS)

    Beck, Jeffrey; Bos, Jeremy P.

    2017-05-01

    We compare several modifications to the open-source wave optics package WavePy intended to improve execution time. Specifically, we compare the relative performance of the Intel MKL, a CPU-based OpenCV distribution, and a GPU-based version. Performance is compared between distributions both on the same compute platform and between a fully featured computing workstation and the NVIDIA Jetson TX1 platform. Comparisons are drawn in terms of both execution time and power consumption. We have found that substituting in the Fast Fourier Transform operation from OpenCV provides a marked improvement on all platforms. In addition, we show that embedded platforms offer the possibility of extensive improvement in efficiency compared to a fully featured workstation.
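    The FFT dominates runtime in split-step wave-optics codes such as WavePy, which is why swapping FFT backends pays off. A minimal angular-spectrum propagation step showing where the two FFTs sit (a NumPy sketch under idealized assumptions: square grid, scalar field; this is not WavePy's actual implementation):

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a complex optical field a distance z (units: meters).
    The fft2/ifft2 pair is the hot spot a faster backend would replace."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)                 # spatial frequencies
    fxx, fyy = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * fxx) ** 2 - (wavelength * fyy) ** 2
    kz = (2 * np.pi / wavelength) * np.sqrt(np.maximum(arg, 0.0))
    # Multiply the angular spectrum by the free-space transfer function.
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * z))
```

    Profiling a function like this isolates the FFT cost, so MKL, OpenCV, and GPU backends can be compared like-for-like in both runtime and watts.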

  10. Pre-Hardware Optimization of Spacecraft Image Processing Software Algorithms and Hardware Implementation

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Flatley, Thomas P.; Hestnes, Phyllis; Jentoft-Nilsen, Marit; Petrick, David J.; Day, John H. (Technical Monitor)

    2001-01-01

    Spacecraft telemetry rates have steadily increased over the last decade presenting a problem for real-time processing by ground facilities. This paper proposes a solution to a related problem for the Geostationary Operational Environmental Spacecraft (GOES-8) image processing application. Although large super-computer facilities are the obvious heritage solution, they are very costly, making it imperative to seek a feasible alternative engineering solution at a fraction of the cost. The solution is based on a Personal Computer (PC) platform and synergy of optimized software algorithms and re-configurable computing hardware technologies, such as Field Programmable Gate Arrays (FPGA) and Digital Signal Processing (DSP). It has been shown in [1] and [2] that this configuration can provide superior inexpensive performance for a chosen application on the ground station or on-board a spacecraft. However, since this technology is still maturing, intensive pre-hardware steps are necessary to achieve the benefits of hardware implementation. This paper describes these steps for the GOES-8 application, a software project developed using Interactive Data Language (IDL) (Trademark of Research Systems, Inc.) on a Workstation/UNIX platform. The solution involves converting the application to a PC/Windows/RC platform, selected mainly by the availability of low cost, adaptable high-speed RC hardware. In order for the hybrid system to run, the IDL software was modified to account for platform differences. It was interesting to examine the gains and losses in performance on the new platform, as well as unexpected observations before implementing hardware. After substantial pre-hardware optimization steps, the necessity of hardware implementation for bottleneck code in the PC environment became evident and solvable beginning with the methodology described in [1], [2], and implementing a novel methodology for this specific application [6]. 
The PC-RC interface bandwidth problem for the class of applications with moderate input-output data rates but large intermediate multi-thread data streams has been addressed and mitigated. This opens a new class of satellite image processing applications to bottleneck-problem solutions using RC technologies. The level of abstraction of a science algorithm necessary for RC hardware implementation is also described. Selected Matlab functions already implemented in hardware were investigated for their direct applicability to the GOES-8 application, with the intent to create a library of Matlab and IDL RC functions for ongoing work. A complete class of spacecraft image processing applications using embedded re-configurable computing technology to meet real-time requirements, including performance results and comparison with the existing system, is described in this paper.

  11. MyDiabetesMyWay: An Evolving National Data Driven Diabetes Self-Management Platform.

    PubMed

    Wake, Deborah J; He, Jinzhang; Czesak, Anna Maria; Mughal, Fezan; Cunningham, Scott G

    2016-09-01

    MyDiabetesMyWay (MDMW) is an award-winning national electronic personal health record and self-management platform for diabetes patients in Scotland. This platform links multiple national institutional and patient-recorded data sources to provide a unique resource for patient care and self-management. This review considers the current evidence for online interventions in diabetes and discusses these in the context of current and ongoing developments for MDMW. Evaluation of MDMW through patient-reported outcomes demonstrates a positive impact on self-management. User feedback has highlighted barriers to uptake and has guided platform evolution from an educational resource website to an electronic personal health record now encompassing remote monitoring, communication tools and personalized education links. Challenges in delivering digital interventions for long-term conditions include integrating data between institutional and personally recorded sources to perform big data analytics, and facilitating technology use by those with disabilities, low digital literacy, low socioeconomic status and in minority groups. The potential for technology-supported health improvement is great, but awareness and adoption by health workers and patients remain a significant barrier. © 2016 Diabetes Technology Society.

  12. Ubiquitous Accessibility for People with Visual Impairments: Are We There Yet?

    PubMed Central

    Billah, Syed Masum; Ashok, Vikas; Porter, Donald E.; Ramakrishnan, IV

    2017-01-01

    Ubiquitous access is an increasingly common vision of computing, wherein users can interact with any computing device or service from anywhere, at any time. In the era of personal computing, users with visual impairments required special-purpose, assistive technologies, such as screen readers, to interact with computers. This paper investigates whether technologies like screen readers have kept pace with, or have created a barrier to, the trend toward ubiquitous access, with a specific focus on desktop computing as this is still the primary way computers are used in education and employment. Towards that, the paper presents a user study with 21 visually-impaired participants, specifically involving the switching of screen readers within and across different computing platforms, and the use of screen readers in remote access scenarios. Among the findings, the study shows that, even for remote desktop access—an early forerunner of true ubiquitous access—screen readers are too limited, if not unusable. The study also identifies several accessibility needs, such as uniformity of navigational experience across devices, and recommends potential solutions. In summary, assistive technologies have not made the jump into the era of ubiquitous access, and multiple, inconsistent screen readers create new practical problems for users with visual impairments. PMID:28782061

  13. Ubiquitous Accessibility for People with Visual Impairments: Are We There Yet?

    PubMed

    Billah, Syed Masum; Ashok, Vikas; Porter, Donald E; Ramakrishnan, I V

    2017-05-01

    Ubiquitous access is an increasingly common vision of computing, wherein users can interact with any computing device or service from anywhere, at any time. In the era of personal computing, users with visual impairments required special-purpose, assistive technologies, such as screen readers, to interact with computers. This paper investigates whether technologies like screen readers have kept pace with, or have created a barrier to, the trend toward ubiquitous access, with a specific focus on desktop computing as this is still the primary way computers are used in education and employment. Towards that, the paper presents a user study with 21 visually-impaired participants, specifically involving the switching of screen readers within and across different computing platforms, and the use of screen readers in remote access scenarios. Among the findings, the study shows that, even for remote desktop access (an early forerunner of true ubiquitous access), screen readers are too limited, if not unusable. The study also identifies several accessibility needs, such as uniformity of navigational experience across devices, and recommends potential solutions. In summary, assistive technologies have not made the jump into the era of ubiquitous access, and multiple, inconsistent screen readers create new practical problems for users with visual impairments.

  14. Interactive Computer-Assisted Instruction in Acid-Base Physiology for Mobile Computer Platforms

    ERIC Educational Resources Information Center

    Longmuir, Kenneth J.

    2014-01-01

    In this project, the traditional lecture hall presentation of acid-base physiology in the first-year medical school curriculum was replaced by interactive, computer-assisted instruction designed primarily for the iPad and other mobile computer platforms. Three learning modules were developed, each with ~20 screens of information, on the subjects…

  15. Evaluation of Emerging Energy-Efficient Heterogeneous Computing Platforms for Biomolecular and Cellular Simulation Workloads

    PubMed Central

    Stone, John E.; Hallock, Michael J.; Phillips, James C.; Peterson, Joseph R.; Luthey-Schulten, Zaida; Schulten, Klaus

    2016-01-01

    Many of the continuing scientific advances achieved through computational biology are predicated on the availability of ongoing increases in computational power required for detailed simulation and analysis of cellular processes on biologically-relevant timescales. A critical challenge facing the development of future exascale supercomputer systems is the development of new computing hardware and associated scientific applications that dramatically improve upon the energy efficiency of existing solutions, while providing increased simulation, analysis, and visualization performance. Mobile computing platforms have recently become powerful enough to support interactive molecular visualization tasks that were previously only possible on laptops and workstations, creating future opportunities for their convenient use for meetings, remote collaboration, and as head mounted displays for immersive stereoscopic viewing. We describe early experiences adapting several biomolecular simulation and analysis applications for emerging heterogeneous computing platforms that combine power-efficient system-on-chip multi-core CPUs with high-performance massively parallel GPUs. We present low-cost power monitoring instrumentation that provides sufficient temporal resolution to evaluate the power consumption of individual CPU algorithms and GPU kernels. We compare the performance and energy efficiency of scientific applications running on emerging platforms with results obtained on traditional platforms, identify hardware and algorithmic performance bottlenecks that affect the usability of these platforms, and describe avenues for improving both the hardware and applications in pursuit of the needs of molecular modeling tasks on mobile devices and future exascale computers. PMID:27516922

  16. Autonomous self-organizing resource manager for multiple networked platforms

    NASA Astrophysics Data System (ADS)

    Smith, James F., III

    2002-08-01

    A fuzzy logic based expert system for resource management has been developed that automatically allocates electronic attack (EA) resources in real time over many dissimilar autonomous naval platforms defending their group against attackers. The platforms can be very general, e.g., ships, planes, robots, land-based facilities, etc. Potential foes the platforms deal with can also be general. This paper provides an overview of the resource manager, including the four fuzzy decision trees that make up the resource manager; the fuzzy EA model; genetic algorithm based optimization; co-evolutionary data mining through gaming; and mathematical, computational and hardware-based validation. Methods of automatically designing new multi-platform EA techniques are considered. The expert system runs on each defending platform, rendering it an autonomous system requiring no human intervention. There is no commanding platform. Instead, the platforms work cooperatively as a function of battlespace geometry; sensor data such as range, bearing, ID, and uncertainty measures for sensor output; intelligence reports; etc. Computational experiments will show the defending networked platforms' ability to self-organize. The platforms' ability to self-organize is illustrated through the output of the scenario generator, a software package that automates the underlying data mining problem and creates a computer movie of the platforms' interaction for evaluation.
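    The flavor of the fuzzy inference underlying such a resource manager can be conveyed with a toy rule (illustrative only; the membership parameters and the `threat_priority` rule are invented for this sketch, not taken from the paper):

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function: 0 outside [a, c], 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def threat_priority(range_km, id_confidence):
    """Toy rule: a threat is high priority if it is NEAR and its sensor
    ID is CONFIDENT; min() plays the role of fuzzy AND."""
    near = tri(range_km, 0.0, 10.0, 60.0)  # peaks at 10 km, gone by 60 km
    return min(near, id_confidence)        # id_confidence already in [0, 1]

print(threat_priority(10.0, 1.0))   # -> 1.0  (close and positively identified)
print(threat_priority(35.0, 0.9))   # -> 0.5  (priority limited by distance)
print(threat_priority(100.0, 1.0))  # -> 0.0  (outside the NEAR set)
```

    A real manager chains many such rules through fuzzy decision trees and tunes the membership parameters, in this paper's case by genetic algorithms.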

  17. Uncover the Cloud for Geospatial Sciences and Applications to Adopt Cloud Computing

    NASA Astrophysics Data System (ADS)

    Yang, C.; Huang, Q.; Xia, J.; Liu, K.; Li, J.; Xu, C.; Sun, M.; Bambacus, M.; Xu, Y.; Fay, D.

    2012-12-01

    Cloud computing is emerging as the future infrastructure for providing computing resources to support and enable scientific research, engineering development, and application construction, as well as workforce education. On the other hand, there is considerable doubt about the readiness of cloud computing to support a variety of scientific research, development and education. This research, a project funded by NASA SMD, investigates through holistic studies how ready cloud computing is to support geosciences. Four applications with different computing characteristics, including data, computing, concurrent, and spatiotemporal intensities, are used to test the readiness of cloud computing to support geosciences. Three popular and representative cloud platforms, Amazon EC2, Microsoft Azure, and NASA Nebula, as well as a traditional cluster, are utilized in the study. Results illustrate that the cloud is ready to some degree, but more research needs to be done to fully realize the cloud benefits as advertised by many vendors and defined by NIST. Specifically, 1) most cloud platforms can help stand up new computing instances, a new computer, in a few minutes as envisioned, and are therefore ready to support most computing needs in an on-demand fashion; 2) load balance and elasticity, a defining characteristic, is ready in some cloud platforms, such as Amazon EC2, to support bigger jobs, e.g., those needing a response in minutes, while some are not ready to support elasticity and load balance well.
All cloud platforms need further research and development to support real-time applications at the sub-minute level; 3) the user interface and functionality of cloud platforms vary a lot; some are very professional and well supported/documented, such as Amazon EC2, while some need significant improvement before the general public can adopt cloud computing without professional training or knowledge about computing infrastructure; 4) security is a big concern in cloud computing; given the sharing spirit of cloud computing, it is very hard to ensure a higher level of security unless a private cloud is built for a specific organization without public access; public cloud platforms do not support the FISMA medium level yet and may never be able to support the FISMA high level; 5) HPC job needs are not well supported in the cloud, with only Amazon EC2 supporting them well. The research is being used by NASA and other agencies to consider cloud computing adoption. We hope the publication of this research will also help the public to adopt cloud computing.

  18. Formulation Predictive Dissolution (fPD) Testing to Advance Oral Drug Product Development: an Introduction to the US FDA Funded '21st Century BA/BE' Project.

    PubMed

    Hens, Bart; Sinko, Patrick; Job, Nicholas; Dean, Meagan; Al-Gousous, Jozef; Salehi, Niloufar; Ziff, Robert M; Tsume, Yasuhiro; Bermejo, Marival; Paixão, Paulo; Brasseur, James G; Yu, Alex; Talattof, Arjang; Benninghoff, Gail; Langguth, Peter; Lennernäs, Hans; Hasler, William L; Marciani, Luca; Dickens, Joseph; Shedden, Kerby; Sun, Duxin; Amidon, Gregory E; Amidon, Gordon L

    2018-06-23

    Over the past decade, formulation predictive dissolution (fPD) testing has gained increasing attention. A new mindset is emerging in which scientists in our field are more confident exploring the in vivo behavior of an oral drug product by performing predictive in vitro dissolution studies. Similarly, there is increasing interest in the application of modern computational fluid dynamics (CFD) frameworks and high-performance computing platforms to study the local processes underlying absorption within the gastrointestinal (GI) tract. In that way, CFD and computing platforms can both inform future PBPK-based in silico frameworks and determine the GI-motility-driven hydrodynamic impacts that should be incorporated into in vitro dissolution methods for in vivo relevance. Current compendial dissolution methods are not always reliable for predicting in vivo behavior, especially for biopharmaceutics classification system (BCS) class 2/4 compounds suffering from low aqueous solubility. A predictive dissolution test will be more reliable, cost-effective, and less time-consuming, as long as its predictive power is sufficiently strong. There is a need to develop a biorelevant, predictive dissolution method that can be applied by pharmaceutical drug companies to facilitate market access for generic and novel drug products. In 2014, Prof. Gordon L. Amidon and his team initiated a far-ranging research program designed to integrate (1) in vivo studies in humans to further improve the understanding of the intraluminal processing of oral dosage forms and dissolved drug along the gastrointestinal (GI) tract, (2) advancement of in vitro methodologies that incorporate higher levels of in vivo relevance, and (3) computational experiments to study the local processes underlying dissolution, transport, and absorption within the intestines, performed with a new and unique CFD-based framework.
Of particular importance is revealing the physiological variables determining the variability in in vivo dissolution and GI absorption from person to person in order to address (potential) in vivo BE failures. This paper provides an introduction to this multidisciplinary project, informs the reader about current achievements and outlines future directions. Copyright © 2018. Published by Elsevier B.V.

  19. Perspectives of health and self-care among older persons-To be implemented in an interactive information and communication technology-platform.

    PubMed

    Göransson, Carina; Wengström, Yvonne; Ziegert, Kristina; Langius-Eklöf, Ann; Eriksson, Irene; Kihlgren, Annica; Blomberg, Karin

    2017-12-01

    To acquire knowledge regarding the contents to be implemented in an interactive information and communication technology-platform perceived to be relevant to health and self-care among older persons, based on the literature, healthcare professionals and the older persons themselves. The growing ageing population places demands on the healthcare system to promote healthy ageing and to strengthen the older person's self-care ability. This requires innovative approaches to facilitate communication between the older person and healthcare professionals, and to increase the older person's participation in their care. An information and communication technology-platform could be used for this purpose, but the content needs to be relevant to both the older persons and the healthcare professionals. Descriptive qualitative design. This study was based on three samplings: a scoping review of the literature (n = 20 articles), interviews with healthcare professionals (n = 5) and a secondary analysis of interviews with older persons (n = 8) and nursing assistants (n = 7). The data were analysed using qualitative content analysis. Four areas were identified to be of relevance to older persons' perceived health: frame of mind, having relationships and social activities, physical ability and concerns, and maintaining self-care. Self-care was described in the literature and by the healthcare professionals more than by the older persons. The results show a concordance in the data samplings that gives a clear indication of the areas relevant to older persons' health and self-care that can be integrated in an interactive information and communication technology-platform for use in regular daily care assessments. Descriptions of self-care were limited, indicating a possible gap in knowledge that requires further research. Areas relevant to older persons' health and self-care could be used for regular assessment to support and promote healthy ageing. © 2017 John Wiley & Sons Ltd.

  20. GREEN SUPERCOMPUTING IN A DESKTOP BOX

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HSU, CHUNG-HSING; FENG, WU-CHUN; CHING, AVERY

    2007-01-17

    The computer workstation, introduced by Sun Microsystems in 1982, was the tool of choice for scientists and engineers as an interactive computing environment for the development of scientific codes. However, by the mid-1990s, the performance of workstations began to lag behind high-end commodity PCs. This, coupled with the disappearance of BSD-based operating systems in workstations and the emergence of Linux as an open-source operating system for PCs, arguably led to the demise of the workstation as we knew it. Around the same time, computational scientists started to leverage PCs running Linux to create a commodity-based (Beowulf) cluster that provided dedicated computer cycles, i.e., supercomputing for the rest of us, as a cost-effective alternative to large supercomputers, i.e., supercomputing for the few. However, as the cluster movement has matured, with respect to cluster hardware and open-source software, these clusters have become much more like their large-scale supercomputing brethren - a shared (and power-hungry) datacenter resource that must reside in a machine-cooled room in order to operate properly. Consequently, the above observations, when coupled with the ever-increasing performance gap between the PC and cluster supercomputer, provide the motivation for a 'green' desktop supercomputer - a turnkey solution that provides an interactive and parallel computing environment with the approximate form factor of a Sun SPARCstation 1 'pizza box' workstation. In this paper, the authors present the hardware and software architecture of such a solution as well as its prowess as a developmental platform for parallel codes. In short, imagine a 12-node personal desktop supercomputer that achieves 14 Gflops on Linpack but sips only 185 watts of power at load, resulting in a performance-power ratio that is over 300% better than the reference SMP platform.
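    The headline figures above can be turned into a performance-per-watt number directly. A minimal sketch of the arithmetic (reading "over 300% better" as roughly a 4x ratio is our assumption, not a figure stated in the record):

```python
# Performance-per-watt arithmetic for the 12-node desktop supercomputer above.
desktop_gflops = 14.0   # Linpack performance, Gflops
desktop_watts = 185.0   # measured power at load, W

ratio = desktop_gflops / desktop_watts   # Gflops per watt
print(f"{ratio * 1000:.1f} Mflops/W")    # ~75.7 Mflops/W

# If this is "over 300% better" than the reference SMP platform (i.e. ~4x),
# the reference would sit somewhere below ~19 Mflops/W.
reference_upper_bound = ratio / 4.0
print(f"{reference_upper_bound * 1000:.1f} Mflops/W")
```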

  1. Rotating Desk for Collaboration by Two Computer Programmers

    NASA Technical Reports Server (NTRS)

    Riley, John Thomas

    2005-01-01

    A special-purpose desk has been designed to facilitate collaboration by two computer programmers sharing one desktop computer or computer terminal. The impetus for the design is a trend toward what is known in the software industry as extreme programming, an approach intended to ensure high quality without sacrificing the quantity of computer code produced. Programmers working in pairs is a major feature of extreme programming. The present desk design minimizes the stress of the collaborative work environment. It supports both quality and work flow by making it unnecessary for programmers to get in each other's way. The desk (see figure) includes a rotating platform that supports a computer video monitor, keyboard, and mouse. The desk enables one programmer to work on the keyboard for any amount of time and then the other programmer to take over without breaking the train of thought. The rotating platform is supported by a turntable bearing that, in turn, is supported by a weighted base. The platform contains weights to improve its balance. The base includes a stand for a computer, and is shaped and dimensioned to provide adequate foot clearance for both users. The platform includes an adjustable stand for the monitor, a surface for the keyboard and mouse, and spaces for work papers, drinks, and snacks. The heights of the monitor, keyboard, and mouse are set to minimize stress. The platform can be rotated through an angle of 40° to give either user a straight-on view of the monitor and full access to the keyboard and mouse. Magnetic latches keep the platform preferentially at either of the two extremes of rotation. To switch between users, one simply grabs the edge of the platform and pulls it around. The magnetic latch is easily released, allowing the platform to rotate freely to the position of the other user.

  2. Micromagnetics on high-performance workstation and mobile computational platforms

    NASA Astrophysics Data System (ADS)

    Fu, S.; Chang, R.; Couture, S.; Menarini, M.; Escobar, M. A.; Kuteifan, M.; Lubarda, M.; Gabay, D.; Lomakin, V.

    2015-05-01

    The feasibility of using high-performance desktop and embedded mobile computational platforms for micromagnetic simulations is presented, including a multi-core Intel central processing unit, Nvidia desktop graphics processing units, and the Nvidia Jetson TK1 platform. The FastMag finite element method-based micromagnetic simulator is used as a testbed, showing high efficiency on all the platforms. Aspects of optimizing the performance of the mobile systems are discussed. The high performance, low cost, low power consumption, and rapid performance increase of embedded mobile systems make them a promising candidate for micromagnetic simulations. Such architectures can be used as standalone systems or can be built into low-power computing clusters.

  3. Continuous measurement of breast tumor hormone receptor expression: a comparison of two computational pathology platforms

    PubMed Central

    Ahern, Thomas P.; Beck, Andrew H.; Rosner, Bernard A.; Glass, Ben; Frieling, Gretchen; Collins, Laura C.; Tamimi, Rulla M.

    2017-01-01

    Background Computational pathology platforms incorporate digital microscopy with sophisticated image analysis to permit rapid, continuous measurement of protein expression. We compared two computational pathology platforms on their measurement of breast tumor estrogen receptor (ER) and progesterone receptor (PR) expression. Methods Breast tumor microarrays from the Nurses' Health Study were stained for ER (n=592) and PR (n=187). One expert pathologist scored cases as positive if ≥1% of tumor nuclei exhibited stain. ER and PR were then measured with the Definiens Tissue Studio (automated) and Aperio Digital Pathology (user-supervised) platforms. Platform-specific measurements were compared using boxplots, scatter plots, and correlation statistics. Classification of ER and PR positivity by platform-specific measurements was evaluated with areas under receiver operating characteristic curves (AUC) from univariable logistic regression models, using expert pathologist classification as the standard. Results Both platforms showed considerable overlap in continuous measurements of ER and PR between positive and negative groups classified by expert pathologist. Platform-specific measurements were strongly and positively correlated with one another (rho ≥ 0.77). The user-supervised Aperio workflow performed slightly better than the automated Definiens workflow at classifying ER positivity (AUC(Aperio) = 0.97; AUC(Definiens) = 0.90; difference = 0.07, 95% CI: 0.05, 0.09) and PR positivity (AUC(Aperio) = 0.94; AUC(Definiens) = 0.87; difference = 0.07, 95% CI: 0.03, 0.12). Conclusion Paired hormone receptor expression measurements from two different computational pathology platforms agreed well with one another. The user-supervised workflow yielded better classification accuracy than the automated workflow. Appropriately validated computational pathology algorithms enrich molecular epidemiology studies with continuous protein expression data and may accelerate tumor biomarker discovery.
PMID:27729430
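    The AUC figures in the record above come from standard ROC analysis. As an illustrative sketch (not the authors' pipeline, and with invented scores), the AUC of a continuous stain measurement against a binary pathologist label equals the normalized Mann-Whitney rank statistic:

```python
def auc(scores_pos, scores_neg):
    """AUC = probability that a randomly chosen positive case scores higher
    than a randomly chosen negative case, counting ties as half a win."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical platform measurements for pathologist-positive/negative tumors.
positive = [0.9, 0.8, 0.75, 0.6]
negative = [0.4, 0.3, 0.65]
print(auc(positive, negative))  # 11 of 12 pairs ordered correctly
```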

  4. Study on the application of mobile internet cloud computing platform

    NASA Astrophysics Data System (ADS)

    Gong, Songchun; Fu, Songyin; Chen, Zheng

    2012-04-01

    The innovative development of computer technology promotes the application of the cloud computing platform, which is in essence a model of resource services that, after adjustments in multiple respects, meets users' needs for the utilization of different resources. Cloud computing offers advantages in many respects: it not only reduces the difficulty of operating the system but also makes it easy for users to search, acquire, and process resources. Accordingly, the author takes the management of digital libraries as the research focus of this paper and analyzes the key technologies of the mobile internet cloud computing platform in operation. The popularization and promotion of computer technology have driven the creation of digital library models, whose core idea is to strengthen the management of library resource information through computers and to construct a high-performance inquiry and search platform that allows users to access the necessary information resources at any time. Cloud computing, in turn, can distribute computations across a large number of distributed computers and thereby implement connected services over multiple computers. Digital libraries, as a typical representative of cloud computing applications, can thus be used to analyze the key technologies of cloud computing.

  5. [The Key Technology Study on Cloud Computing Platform for ECG Monitoring Based on Regional Internet of Things].

    PubMed

    Yang, Shu; Qiu, Yuyan; Shi, Bo

    2016-09-01

    This paper explores the methods of building a regional internet of things for ECG monitoring, focusing on the implementation of an ECG monitoring center based on a cloud computing platform. It analyzes the implementation principles of automatic identification of arrhythmia types. It also studies the system architecture and key techniques of the cloud computing platform, including server load balancing technology, reliable storage of massive numbers of small files, and the implementation of a quick search function.
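    One of the key techniques named above, server load balancing, can be illustrated with a minimal round-robin dispatcher (a generic sketch, not the paper's implementation; the server names are hypothetical):

```python
import itertools

class RoundRobinBalancer:
    """Distribute incoming ECG upload requests evenly across backend servers."""
    def __init__(self, servers):
        self._cycle = itertools.cycle(servers)

    def pick(self):
        # Return the next server in strict rotation.
        return next(self._cycle)

lb = RoundRobinBalancer(["ecg-node-1", "ecg-node-2", "ecg-node-3"])
assignments = [lb.pick() for _ in range(6)]
print(assignments)  # each node receives two of the six requests
```

Production balancers typically weight servers by load or use consistent hashing so that a node failure only remaps a fraction of the traffic; round robin is the simplest baseline.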

  6. Single software platform used for high speed data transfer implementation in a 65k pixel camera working in single photon counting mode

    NASA Astrophysics Data System (ADS)

    Maj, P.; Kasiński, K.; Gryboś, P.; Szczygieł, R.; Kozioł, A.

    2015-12-01

    Integrated circuits designed for specific applications generally use non-standard communication methods. Hybrid pixel detector readout electronics produces a huge amount of data as a result of the number of frames per second. The data needs to be transmitted to a higher level system without limiting the ASIC's capabilities. Nowadays, the Camera Link interface is still one of the fastest communication methods, allowing transmission speeds up to 800 MB/s. In order to communicate between a higher level system and the ASIC with a dedicated protocol, an FPGA with dedicated code is required. The configuration data is received from the PC and written to the ASIC. At the same time, the same FPGA should be able to transmit the data from the ASIC to the PC at very high speed. The camera should be an embedded system enabling autonomous operation and self-monitoring. In the presented solution, at least three different hardware platforms are used: an FPGA, a microprocessor with a real-time operating system, and a PC with end-user software. We present the use of a single software platform for high speed data transfer from a 65k pixel camera to a personal computer.
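    The bandwidth pressure described above is easy to quantify. A back-of-the-envelope sketch (the counter depth and frame rate below are hypothetical, chosen only to illustrate how quickly a 65k pixel detector approaches the Camera Link ceiling):

```python
pixels = 256 * 256          # a 65k pixel detector
bits_per_pixel = 12         # hypothetical counter depth read out per frame
frames_per_second = 6000    # hypothetical continuous frame rate

bytes_per_second = pixels * bits_per_pixel * frames_per_second / 8
mb_per_second = bytes_per_second / 1e6
print(f"{mb_per_second:.0f} MB/s")  # must stay below Camera Link's ~800 MB/s
```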

  7. Systems Medicine: The Future of Medical Genomics, Healthcare, and Wellness.

    PubMed

    Saqi, Mansoor; Pellet, Johann; Roznovat, Irina; Mazein, Alexander; Ballereau, Stéphane; De Meulder, Bertrand; Auffray, Charles

    2016-01-01

    Recent advances in genomics have led to the rapid and relatively inexpensive collection of patient molecular data including multiple types of omics data. The integration of these data with clinical measurements has the potential to impact on our understanding of the molecular basis of disease and on disease management. Systems medicine is an approach to understanding disease through an integration of large patient datasets. It offers the possibility for personalized strategies for healthcare through the development of a new taxonomy of disease. Advanced computing will be an important component in effectively implementing systems medicine. In this chapter we describe three computational challenges associated with systems medicine: disease subtype discovery using integrated datasets, obtaining a mechanistic understanding of disease, and the development of an informatics platform for the mining, analysis, and visualization of data emerging from translational medicine studies.

  8. Cloud computing for comparative genomics with windows azure platform.

    PubMed

    Kim, Insik; Jung, Jae-Yoon; Deluca, Todd F; Nelson, Tristan H; Wall, Dennis P

    2012-01-01

    Cloud computing services have emerged as a cost-effective alternative for cluster systems as the number of genomes and required computation power to analyze them increased in recent years. Here we introduce the Microsoft Azure platform with detailed execution steps and a cost comparison with Amazon Web Services.

  9. Cloud Computing for Comparative Genomics with Windows Azure Platform

    PubMed Central

    Kim, Insik; Jung, Jae-Yoon; DeLuca, Todd F.; Nelson, Tristan H.; Wall, Dennis P.

    2012-01-01

    Cloud computing services have emerged as a cost-effective alternative for cluster systems as the number of genomes and required computation power to analyze them increased in recent years. Here we introduce the Microsoft Azure platform with detailed execution steps and a cost comparison with Amazon Web Services. PMID:23032609

  10. Analysis of scalability of high-performance 3D image processing platform for virtual colonoscopy

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli

    2014-03-01

    One of the key challenges in three-dimensional (3D) medical imaging is to enable fast turn-around times, which are often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth, due to the massive amount of data that needs to be processed. For this purpose, we previously developed a software platform for high-performance 3D medical image processing, called the HPC 3D-MIP platform, which employs increasingly available and affordable commodity computing systems such as multicore, cluster, and cloud computing systems. To achieve scalable high-performance computing, the platform employed size-adaptive, distributable block volumes as a core data structure for efficient parallelization of a wide range of 3D-MIP algorithms, supported task scheduling for efficient load distribution and balancing, and consisted of layered parallel software libraries that allow image processing applications to share common functionalities. We evaluated the performance of the HPC 3D-MIP platform by applying it to computationally intensive processes in virtual colonoscopy. Experimental results showed a 12-fold performance improvement on a workstation with 12-core CPUs over the original sequential implementation of the processes, indicating the efficiency of the platform. Analysis of performance scalability based on Amdahl's law for symmetric multicore chips showed the potential for high performance scalability of the HPC 3D-MIP platform when a larger number of cores is available.
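    The scalability analysis cited above rests on Amdahl's law. A minimal sketch (the 0.95 parallel fraction is illustrative, not a value from the paper):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: speedup = 1 / ((1 - p) + p / n) for parallel fraction p."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# A 12-fold speedup on 12 cores, as reported above, requires an essentially
# fully parallelizable workload; even p = 0.95 caps the speedup well below 12.
print(round(amdahl_speedup(1.0, 12), 2))   # 12.0
print(round(amdahl_speedup(0.95, 12), 2))  # ~7.74
```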

  11. Cloud Based Applications and Platforms (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brodt-Giles, D.

    2014-05-15

    Presentation to the Cloud Computing East 2014 Conference, where we are highlighting our cloud computing strategy, describing the platforms on the cloud (including Smartgrid.gov), and defining our process for implementing cloud based applications.

  12. A cloud computing based platform for sleep behavior and chronic diseases collaborative research.

    PubMed

    Kuo, Mu-Hsing; Borycki, Elizabeth; Kushniruk, Andre; Huang, Yueh-Min; Hung, Shu-Hui

    2014-01-01

    The objective of this study is to propose a Cloud Computing based platform for sleep behavior and chronic disease collaborative research. The platform consists of two main components: (1) a sensing bed sheet with textile sensors to automatically record the patient's sleep behaviors and vital signs, and (2) a service-oriented cloud computing architecture (SOCCA) that provides a data repository and allows for sharing and analysis of collected data. We also describe our systematic approach to implementing the SOCCA. We believe that the new cloud-based platform can provide nursing and other health professional researchers located in differing geographic locations with a cost-effective, flexible, secure and privacy-preserving research environment.

  13. Learner Characteristic Based Learning Effort Curve Mode: The Core Mechanism on Developing Personalized Adaptive E-Learning Platform

    ERIC Educational Resources Information Center

    Hsu, Pi-Shan

    2012-01-01

    This study aims to develop the core mechanism for realizing the development of personalized adaptive e-learning platform, which is based on the previous learning effort curve research and takes into account the learner characteristics of learning style and self-efficacy. 125 university students from Taiwan are classified into 16 groups according…

  14. Privacy and information security risks in a technology platform for home-based chronic disease rehabilitation and education.

    PubMed

    Henriksen, Eva; Burkow, Tatjana M; Johnsen, Elin; Vognild, Lars K

    2013-08-09

    Privacy and information security are important for all healthcare services, including home-based services. We have designed and implemented a prototype technology platform for providing home-based healthcare services. It supports a personal electronic health diary and enables secure and reliable communication and interaction with peers and healthcare personnel. The platform runs on a small computer with a dedicated remote control. It is connected to the patient's TV and to a broadband Internet connection. The platform has been tested with home-based rehabilitation and education programs for chronic obstructive pulmonary disease and diabetes. As part of our work, a risk assessment of privacy and security aspects has been performed, to reveal actual risks and to ensure adequate information security in this technical platform. Risk assessment was performed in an iterative manner during the development process. Thus, security solutions have been incorporated into the design from an early stage instead of being included as an add-on to a nearly completed system. We have adapted existing risk management methods to our own environment, thus creating our own method. Our method conforms to ISO's standard for information security risk management. A total of approximately 50 threats and possible unwanted incidents were identified and analysed. Among the threats to the four information security aspects (confidentiality, integrity, availability, and quality), confidentiality threats were identified as the most serious, with one threat given an unacceptable High risk level. This is because health-related personal information is regarded as sensitive. Availability threats were assessed as low risk, as the aim of the home programmes is to provide education and rehabilitation services, not use in acute situations or continuous health monitoring. Most of the identified threats apply to healthcare services intended for patients or citizens in their own homes.
Confidentiality risks in the home differ from those in a more controlled environment such as a hospital, and electronic equipment located in private homes and communicating via the Internet is more exposed to unauthorised access. By implementing the proposed measures, it has been possible to design a home-based service that ensures the necessary level of information security and privacy.

  15. 49 CFR 571.404 - Standard No. 404; Platform lift installations in motor vehicles.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 6 2010-10-01 2010-10-01 false Standard No. 404; Platform lift installations in... VEHICLE SAFETY STANDARDS Federal Motor Vehicle Safety Standards § 571.404 Standard No. 404; Platform lift... platform lifts used to assist persons with limited mobility in entering or leaving a vehicle. S2. Purpose...

  16. 49 CFR 571.404 - Standard No. 404; Platform lift installations in motor vehicles.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 6 2011-10-01 2011-10-01 false Standard No. 404; Platform lift installations in... VEHICLE SAFETY STANDARDS Federal Motor Vehicle Safety Standards § 571.404 Standard No. 404; Platform lift... platform lifts used to assist persons with limited mobility in entering or leaving a vehicle. S2. Purpose...

  17. CBRAIN: a web-based, distributed computing platform for collaborative neuroimaging research

    PubMed Central

    Sherif, Tarek; Rioux, Pierre; Rousseau, Marc-Etienne; Kassis, Nicolas; Beck, Natacha; Adalat, Reza; Das, Samir; Glatard, Tristan; Evans, Alan C.

    2014-01-01

    The Canadian Brain Imaging Research Platform (CBRAIN) is a web-based collaborative research platform developed in response to the challenges raised by data-heavy, compute-intensive neuroimaging research. CBRAIN offers transparent access to remote data sources, distributed computing sites, and an array of processing and visualization tools within a controlled, secure environment. Its web interface is accessible through any modern browser and uses graphical interface idioms to reduce the technical expertise required to perform large-scale computational analyses. CBRAIN's flexible meta-scheduling has allowed the incorporation of a wide range of heterogeneous computing sites, currently including nine national research High Performance Computing (HPC) centers in Canada, one in Korea, one in Germany, and several local research servers. CBRAIN leverages remote computing cycles and facilitates resource-interoperability in a transparent manner for the end-user. Compared with typical grid solutions available, our architecture was designed to be easily extendable and deployed on existing remote computing sites with no tool modification, administrative intervention, or special software/hardware configuration. As of October 2013, CBRAIN serves over 200 users spread across 53 cities in 17 countries. The platform is built as a generic framework that can accept data and analysis tools from any discipline. However, its current focus is primarily on neuroimaging research and studies of neurological diseases such as Autism, Parkinson's and Alzheimer's diseases, and Multiple Sclerosis, as well as on normal brain structure and development. This technical report presents the CBRAIN Platform, its current deployment and usage, and future directions. PMID:24904400

  18. CBRAIN: a web-based, distributed computing platform for collaborative neuroimaging research.

    PubMed

    Sherif, Tarek; Rioux, Pierre; Rousseau, Marc-Etienne; Kassis, Nicolas; Beck, Natacha; Adalat, Reza; Das, Samir; Glatard, Tristan; Evans, Alan C

    2014-01-01

    The Canadian Brain Imaging Research Platform (CBRAIN) is a web-based collaborative research platform developed in response to the challenges raised by data-heavy, compute-intensive neuroimaging research. CBRAIN offers transparent access to remote data sources, distributed computing sites, and an array of processing and visualization tools within a controlled, secure environment. Its web interface is accessible through any modern browser and uses graphical interface idioms to reduce the technical expertise required to perform large-scale computational analyses. CBRAIN's flexible meta-scheduling has allowed the incorporation of a wide range of heterogeneous computing sites, currently including nine national research High Performance Computing (HPC) centers in Canada, one in Korea, one in Germany, and several local research servers. CBRAIN leverages remote computing cycles and facilitates resource-interoperability in a transparent manner for the end-user. Compared with typical grid solutions available, our architecture was designed to be easily extendable and deployed on existing remote computing sites with no tool modification, administrative intervention, or special software/hardware configuration. As of October 2013, CBRAIN serves over 200 users spread across 53 cities in 17 countries. The platform is built as a generic framework that can accept data and analysis tools from any discipline. However, its current focus is primarily on neuroimaging research and studies of neurological diseases such as Autism, Parkinson's and Alzheimer's diseases, and Multiple Sclerosis, as well as on normal brain structure and development. This technical report presents the CBRAIN Platform, its current deployment and usage, and future directions.

  19. Acceleration of Cherenkov angle reconstruction with the new Intel Xeon/FPGA compute platform for the particle identification in the LHCb Upgrade

    NASA Astrophysics Data System (ADS)

    Faerber, Christian

    2017-10-01

    The LHCb experiment at the LHC will upgrade its detector by 2018/2019 to a 'triggerless' readout scheme, in which all the readout electronics and several sub-detector parts will be replaced. The new readout electronics will be able to read out the detector at 40 MHz. This increases the data bandwidth from the detector down to the Event Filter farm to 40 TBit/s, which also has to be processed to select the interesting proton-proton collisions for later storage. Designing the architecture of a computing farm that can process this amount of data as efficiently as possible is a challenging task, and several compute accelerator technologies are being considered for use inside the new Event Filter farm. In the high performance computing sector, more and more FPGA compute accelerators are used to improve compute performance and reduce power consumption (e.g., in the Microsoft Catapult project and the Bing search engine). For the LHCb upgrade as well, the use of an experimental FPGA-accelerated computing platform in the Event Building or in the Event Filter farm is being considered and therefore tested. This platform from Intel hosts a general-purpose CPU and a high-performance FPGA linked via a high-speed link, which for this platform is a QPI link. An accelerator is implemented on the FPGA. The system used is a two-socket platform from Intel with a Xeon CPU and an FPGA. The FPGA has cache-coherent memory access to the main memory of the server and can collaborate with the CPU. As a first step, a computationally intensive algorithm to reconstruct Cherenkov angles for LHCb RICH particle identification was successfully ported in Verilog to the Intel Xeon/FPGA platform and accelerated by a factor of 35. The same algorithm was then ported to the Intel Xeon/FPGA platform with OpenCL, and the implementation work and the performance are compared. Another FPGA accelerator, the Nallatech 385 PCIe accelerator with the same Stratix V FPGA, was also tested for performance.
The results show that the Intel Xeon/FPGA platforms, which are built in general for high performance computing, are also very interesting for the High Energy Physics community.

  20. Instrumentino: An Open-Source Software for Scientific Instruments.

    PubMed

    Koenka, Israel Joel; Sáiz, Jorge; Hauser, Peter C

    2015-01-01

    Scientists often need to build dedicated computer-controlled experimental systems. For this purpose, it is becoming common to employ open-source microcontroller platforms, such as the Arduino. These boards and associated integrated software development environments provide affordable yet powerful solutions for the implementation of hardware control of transducers and acquisition of signals from detectors and sensors. It is, however, a challenge to write programs that allow interactive use of such arrangements from a personal computer. This task is particularly complex if some of the included hardware components are connected directly to the computer and not via the microcontroller. A graphical user interface framework, Instrumentino, was therefore developed to allow the creation of control programs for complex systems with minimal programming effort. By writing a single code file, a powerful custom user interface is generated, which enables the automatic running of elaborate operation sequences and observation of acquired experimental data in real time. The framework, which is written in Python, allows extension by users, and is made available as an open source project.
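The abstract does not show Instrumentino's actual API, so the kind of signal acquisition such a framework automates is sketched here generically. The `name:value` line protocol below is hypothetical, chosen only to resemble what a simple Arduino sketch might print over a serial connection:

```python
def parse_reading(line):
    """Parse one 'name:value' reading line, e.g. 'temp:23.5'.
    Returns a (channel name, float value) pair."""
    name, _, value = line.strip().partition(":")
    return name, float(value)

def acquire(lines):
    """Collect the latest value per channel from a stream of readings."""
    latest = {}
    for line in lines:
        name, value = parse_reading(line)
        latest[name] = value
    return latest

readings = acquire(["temp:23.5", "ph:7.1", "temp:24.0"])
```

In practice the lines would come from a serial port (e.g. via pySerial's `serial.Serial`) rather than a list; a framework like Instrumentino layers a generated graphical interface and sequencing logic on top of exactly this sort of low-level traffic.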

  1. Computer Assisted REhabilitation (CARE) Lab: A novel approach towards Pediatric Rehabilitation 2.0.

    PubMed

    Olivieri, Ivana; Meriggi, Paolo; Fedeli, Cristina; Brazzoli, Elena; Castagna, Anna; Roidi, Marina Luisa Rodocanachi; Angelini, Lucia

    2018-01-01

    Pediatric Rehabilitation therapists have always worked using a variety of off-the-shelf or custom-made objects and devices, more recently including computer-based systems. These Information and Communication Technology (ICT) solutions vary widely in complexity, from easy-to-use interactive videogame consoles originally intended for entertainment purposes to sophisticated systems specifically developed for rehabilitation. This paper describes the principles underlying an innovative "Pediatric Rehabilitation 2.0" approach, based on the combination of suitable ICT solutions and traditional rehabilitation, which has been progressively refined while building up and using a computer-assisted rehabilitation laboratory. These principles are summarized in the acronym EPIQ, for Ecological, Personalized, Interactive and Quantitative. The paper also presents the laboratory, which has been designed to meet children's rehabilitation needs and to empower therapists in their work. The laboratory is equipped with commercial hardware and specially developed software called VITAMIN: a virtual reality platform for motor and cognitive rehabilitation.

  2. Image processing for navigation on a mobile embedded platform

    NASA Astrophysics Data System (ADS)

    Preuss, Thomas; Gentsch, Lars; Rambow, Mark

    2006-02-01

    Mobile computing devices such as PDAs or cellular phones may act as "Personal Multimedia Exchanges", but they are limited in both processing power and connectivity. Sensors, cellular phones and PDAs are able to gather multimedia data, e.g. images, but lack the computing power to process that data on their own. These devices must therefore connect to more powerful devices that provide, for example, image processing services. In this paper, a generic approach is presented that connects different kinds of clients with each other and allows them to interact with more powerful devices. This architecture, called BOSPORUS, is a communication framework for dynamic peer-to-peer computing. Each peer offers and uses services in this network and communicates with the others in a loosely coupled, asynchronous fashion. These features make BOSPORUS a service-oriented network architecture (SONA). A mobile embedded system that uses external services for image processing is shown as an application of the BOSPORUS framework.

  3. FIREDOC users manual, 3rd edition

    NASA Astrophysics Data System (ADS)

    Jason, Nora H.

    1993-12-01

    FIREDOC is the on-line bibliographic database which reflects the holdings (published reports, journal articles, conference proceedings, books, and audiovisual items) of the Fire Research Information Services (FRIS) at the Building and Fire Research Laboratory (BFRL), National Institute of Standards and Technology (NIST). This manual provides step-by-step procedures for entering and exiting the database via telecommunication lines, as well as a number of techniques for searching the database and processing the results of the searches. This Third Edition is necessitated by the change to a UNIX platform. The new computer allows for faster response time when searching via a modem and, in addition, offers Internet accessibility. FIREDOC may be used with personal computers running DOS or Windows, or with Macintosh computers and workstations. New sections are included on how to access the Internet and how to obtain the references of interest. Appendix F: Quick Guide to Getting Started will be useful to both modem and Internet users.

  4. Evaluation of a Smartphone Platform as a Wireless Interface Between Tongue Drive System and Electric-Powered Wheelchairs

    PubMed Central

    Kim, Jeonghee; Huo, Xueliang; Minocha, Julia; Holbrook, Jaimee; Laumann, Anne; Ghovanloo, Maysam

    2013-01-01

    The tongue drive system (TDS) is a new wireless assistive technology (AT) for people with mobility impairments. It provides users with the ability to drive powered wheelchairs (PWC) and access computers using their unconstrained tongue motion. Migration of the TDS processing unit and user interface platform from a bulky personal computer to a smartphone (iPhone) has significantly facilitated its usage by turning it into a truly wireless and wearable AT. After implementation of the necessary interfacing hardware and software to allow the smartphone to act as a bridge between the TDS and PWC, the wheelchair navigation performance and associated learning were evaluated in nine able-bodied subjects in five sessions over a 5-week period. Subjects wore magnetic tongue studs for the duration of the study and drove the PWC in an obstacle course with their tongue using three different navigation strategies: unlatched, latched, and semiproportional. Qualitative aspects of using the TDS–iPhone–PWC interface were also evaluated via a five-point Likert scale questionnaire. Subjects showed more than 20% improvement in overall completion time between the first and second sessions, and maintained a modest improvement of ~9% per session over the following three sessions. PMID:22531737

  5. Person and gesture tracking with smart stereo cameras

    NASA Astrophysics Data System (ADS)

    Gordon, Gaile; Chen, Xiangrong; Buck, Ron

    2008-02-01

    Physical security increasingly involves sophisticated, real-time visual tracking of a person's location inside a given environment, often in conjunction with biometrics and other security-related technologies. However, demanding real-world conditions like crowded rooms, changes in lighting and physical obstructions have proved incredibly challenging for 2D computer vision technology. In contrast, 3D imaging technology is not affected by constant changes in lighting and apparent color, and thus allows tracking accuracy to be maintained in dynamically lit environments. In addition, person tracking with a 3D stereo camera can provide the location and movement of each individual very precisely, even in a very crowded environment. 3D vision only requires that the subject be partially visible to a single stereo camera to be correctly tracked; multiple cameras are used to extend the system's operational footprint, and to contend with heavy occlusion. A successful person tracking system must not only perform visual analysis robustly, but also be small, cheap and consume relatively little power. The TYZX Embedded 3D Vision systems are well suited to the low power, small footprint, and low cost required by these types of volume applications. Several security-focused organizations, including the U.S. Government, have deployed TYZX 3D stereo vision systems in security applications. 3D image data is also advantageous in the related application area of gesture tracking. Visual (uninstrumented) tracking of natural hand gestures and movement provides new opportunities for interactive control including: video gaming, location-based entertainment, and interactive displays. 2D images have been used to extract the location of hands within a plane, but 3D hand location enables a much broader range of interactive applications.
In this paper, we provide some background on the TYZX smart stereo cameras platform, describe the person tracking and gesture tracking systems implemented on this platform, and discuss some deployed applications.

  6. The Cyborg Astrobiologist: testing a novelty detection algorithm on two mobile exploration systems at Rivas Vaciamadrid in Spain and at the Mars Desert Research Station in Utah

    NASA Astrophysics Data System (ADS)

    McGuire, P. C.; Gross, C.; Wendt, L.; Bonnici, A.; Souza-Egipsy, V.; Ormö, J.; Díaz-Martínez, E.; Foing, B. H.; Bose, R.; Walter, S.; Oesker, M.; Ontrup, J.; Haschke, R.; Ritter, H.

    2010-01-01

    In previous work, a platform was developed for testing computer-vision algorithms for robotic planetary exploration. This platform consisted of a digital video camera connected to a wearable computer for real-time processing of images at geological and astrobiological field sites. The real-time processing included image segmentation and the generation of interest points based upon uncommonness in the segmentation maps. Also in previous work, this platform for testing computer-vision algorithms has been ported to a more ergonomic alternative platform, consisting of a phone camera connected via the Global System for Mobile Communications (GSM) network to a remote-server computer. The wearable-computer platform has been tested at geological and astrobiological field sites in Spain (Rivas Vaciamadrid and Riba de Santiuste), and the phone camera has been tested at a geological field site in Malta. In this work, we (i) apply a Hopfield neural-network algorithm for novelty detection based upon colour, (ii) integrate a field-capable digital microscope on the wearable computer platform, (iii) test this novelty detection with the digital microscope at Rivas Vaciamadrid, (iv) develop a Bluetooth communication mode for the phone-camera platform, in order to allow access to a mobile processing computer at the field sites, and (v) test the novelty detection on the Bluetooth-enabled phone camera connected to a netbook computer at the Mars Desert Research Station in Utah. This systems engineering and field testing have together allowed us to develop a real-time computer-vision system that is capable, for example, of identifying lichens as novel within a series of images acquired in semi-arid desert environments. We acquired sequences of images of geologic outcrops in Utah and Spain consisting of various rock types and colours to test this algorithm. 
The algorithm robustly recognized previously observed units by their colour, while requiring only a single image or a few images to learn colours as familiar, demonstrating its fast learning capability.

  7. Platform-independent method for computer aided schematic drawings

    DOEpatents

    Vell, Jeffrey L [Slingerlands, NY; Siganporia, Darius M [Clifton Park, NY; Levy, Arthur J [Fort Lauderdale, FL

    2012-02-14

    A CAD/CAM method is disclosed for a computer system to capture and interchange schematic drawing and associated design information. The schematic drawing and design information are stored in an extensible, platform-independent format.

  8. A Big Data Platform for Storing, Accessing, Mining and Learning Geospatial Data

    NASA Astrophysics Data System (ADS)

    Yang, C. P.; Bambacus, M.; Duffy, D.; Little, M. M.

    2017-12-01

    Big Data is becoming a norm in geoscience domains. A platform that can efficiently manage, access, analyze, mine, and learn from big data to produce new information and knowledge is desired. This paper introduces our latest effort to develop such a platform, based on our past years' experience with cloud and high-performance computing, analyzing big data, comparing big data containers, and mining big geospatial data for new information. The platform includes four layers: a) the bottom layer is a computing infrastructure with proper network, computer, and storage systems; b) the 2nd layer is a cloud computing layer, based on virtualization, that provides on-demand computing services to the upper layers; c) the 3rd layer consists of big data containers customized for dealing with different types of data and functionalities; d) the 4th layer is a big data presentation layer that supports the efficient management, access, analysis, mining and learning of big geospatial data.

  9. The Efficacy of the Internet-Based Blackboard Platform in Developmental Writing Classes

    ERIC Educational Resources Information Center

    Shudooh, Yusuf M.

    2016-01-01

    The application of computer-assisted platforms in writing classes is a relatively new paradigm in education. The adoption of computer-assisted writing classes is gaining ground in many western and non-western universities. Numerous issues can be addressed when conducting computer-assisted classes (CAC). However, a few studies conducted to assess…

  10. Transforming clinical imaging and 3D data for virtual reality learning objects: HTML5 and mobile devices implementation.

    PubMed

    Trelease, Robert B; Nieder, Gary L

    2013-01-01

    Web deployable anatomical simulations or "virtual reality learning objects" can easily be produced with QuickTime VR software, but their use for online and mobile learning is being limited by the declining support for web browser plug-ins for personal computers and unavailability on popular mobile devices like Apple iPad and Android tablets. This article describes complementary methods for creating comparable, multiplatform VR learning objects in the new HTML5 standard format, circumventing platform-specific limitations imposed by the QuickTime VR multimedia file format. Multiple types or "dimensions" of anatomical information can be embedded in such learning objects, supporting different kinds of online learning applications, including interactive atlases, examination questions, and complex, multi-structure presentations. Such HTML5 VR learning objects are usable on new mobile devices that do not support QuickTime VR, as well as on personal computers. Furthermore, HTML5 VR learning objects can be embedded in "ebook" document files, supporting the development of new types of electronic textbooks on mobile devices that are increasingly popular and self-adopted for mobile learning. © 2012 American Association of Anatomists.

  11. Distributing Data to Hand-Held Devices in a Wireless Network

    NASA Technical Reports Server (NTRS)

    Hodges, Mark; Simmons, Layne

    2008-01-01

    ADROIT is a developmental computer program for real-time distribution of complex data streams for display on Web-enabled, portable terminals held by members of an operational team of a spacecraft-command-and-control center who may be located away from the center. Examples of such terminals include personal digital assistants, laptop computers, and cellular telephones. ADROIT would make it unnecessary to equip each terminal with platform-specific software for access to the data streams or with software that implements the information-sharing protocol used to deliver telemetry data to clients in the center. ADROIT is a combination of middleware plus software specific to the center. (Middleware enables one application program to communicate with another by performing such functions as conversion, translation, consolidation, and/or integration.) ADROIT translates a data stream (voice, video, or alphanumerical data) from the center into Extensible Markup Language, effectuates a subscription process to determine who gets what data when, and presents the data to each user in real time. Thus, ADROIT is expected to enable distribution of operations and to reduce the cost of operations by reducing the number of persons required to be in the center.

  12. Integration of Personalized Healthcare Pathways in an ICT Platform for Diabetes Management: A Small-Scale Exploratory Study.

    PubMed

    Fico, Giuseppe; Fioravanti, Alessio; Arredondo, Maria Teresa; Gorman, Joe; Diazzi, Chiara; Arcuri, Giovanni; Conti, Claudio; Pirini, Giampiero

    2016-01-01

    The availability of new tools able to support patient monitoring and personalized care may substantially improve the quality of chronic disease management. A personalized healthcare pathway (PHP) has been developed for diabetes disease management and integrated into an information and communication technology system to accomplish a shift from organization-centered care to patient-centered care. A small-scale exploratory study was conducted to test the platform. Preliminary results are presented that shed light on how the PHP influences system usage and performance outcomes.

  13. CANDO and the infinite drug discovery frontier

    PubMed Central

    Minie, Mark; Chopra, Gaurav; Sethi, Geetika; Horst, Jeremy; White, George; Roy, Ambrish; Hatti, Kaushik; Samudrala, Ram

    2014-01-01

    The Computational Analysis of Novel Drug Opportunities (CANDO) platform (http://protinfo.org/cando) uses similarity of compound–proteome interaction signatures to infer homology of compound/drug behavior. We constructed interaction signatures for 3733 human ingestible compounds covering 48,278 protein structures mapping to 2030 indications based on basic science methodologies to predict and analyze protein structure, function, and interactions developed by us and others. Our signature comparison and ranking approach yielded benchmarking accuracies of 12–25% for 1439 indications with at least two approved compounds. We prospectively validated 49/82 ‘high value’ predictions from nine studies covering seven indications, with comparable or better activity to existing drugs, which serve as novel repurposed therapeutics. Our approach may be generalized to compounds beyond those approved by the FDA, and can also consider mutations in protein structures to enable personalization. Our platform provides a holistic multiscale modeling framework of complex atomic, molecular, and physiological systems with broader applications in medicine and engineering. PMID:24980786
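The exact similarity metric behind CANDO's signature comparison and ranking is not given in the abstract; the sketch below uses cosine similarity over hypothetical compound-proteome interaction vectors purely to illustrate the ranking idea (one score per protein structure, values invented):

```python
import math

def cosine(a, b):
    """Cosine similarity between two interaction-signature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def rank_candidates(query, library):
    """Rank library compounds by signature similarity to a query signature,
    most similar first."""
    return sorted(library, key=lambda name: cosine(query, library[name]),
                  reverse=True)

# Toy signatures for two compounds (hypothetical interaction scores).
sigs = {"drugA": [0.9, 0.1, 0.8], "drugB": [0.1, 0.9, 0.2]}
ranking = rank_candidates([0.8, 0.2, 0.7], sigs)  # drugA ranks first
```

In the real platform each signature spans tens of thousands of protein structures rather than three entries, but the inference step is the same: compounds whose signatures rank close to a drug with known behavior are predicted to behave similarly.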

  14. Topological insulator bismuth selenide as a theranostic platform for simultaneous cancer imaging and therapy.

    PubMed

    Li, Juan; Jiang, Fei; Yang, Bo; Song, Xiao-Rong; Liu, Yan; Yang, Huang-Hao; Cao, Dai-Rong; Shi, Wen-Rong; Chen, Guo-Nan

    2013-01-01

    Employing theranostic nanoparticles, which combine both therapeutic and diagnostic capabilities in one dose, holds promise to propel the biomedical field toward personalized medicine. Here we investigate the theranostic properties of the topological insulator bismuth selenide (Bi2Se3) in in vitro and in vivo systems for the first time. We show that Bi2Se3 nanoplates can absorb near-infrared (NIR) laser light and effectively convert laser energy into heat. This photothermal conversion property may be due to the unique physical properties of topological insulators. Furthermore, localized and irreversible photothermal ablation of tumors in the mouse model is successfully achieved by using Bi2Se3 nanoplates and NIR laser irradiation. In addition, we also demonstrate that Bi2Se3 nanoplates exhibit strong X-ray attenuation and can be utilized for enhanced X-ray computed tomography imaging of tumor tissue in vivo. This study highlights that Bi2Se3 nanoplates could serve as a promising platform for cancer diagnosis and therapy.

  15. Design and Evaluation of a Personal Digital Assistant-based Research Platform for Cochlear Implants

    PubMed Central

    Ali, Hussnain; Lobo, Arthur P.; Loizou, Philipos C.

    2014-01-01

    This paper discusses the design, development, features, and clinical evaluation of a personal digital assistant (PDA)-based platform for cochlear implant research. This highly versatile and portable research platform allows researchers to design and perform complex experiments with cochlear implants manufactured by Cochlear Corporation with great ease and flexibility. The research platform includes a portable processor for implementing and evaluating novel speech processing algorithms, a stimulator unit which can be used for electrical stimulation and neurophysiologic studies with animals, and a recording unit for collecting electroencephalogram/evoked potentials from human subjects. The design of the platform for real-time and offline stimulation modes is discussed for electric-only and electric plus acoustic stimulation, followed by results from an acute study with implant users for speech intelligibility in quiet and noisy conditions. The results are comparable with users’ clinical processors and very promising for undertaking chronic studies. PMID:23674422

  16. Power Efficient Hardware Architecture of SHA-1 Algorithm for Trusted Mobile Computing

    NASA Astrophysics Data System (ADS)

    Kim, Mooseop; Ryou, Jaecheol

    The Trusted Mobile Platform (TMP) is developed and promoted by the Trusted Computing Group (TCG), an industry standards body that works to enhance the security of the mobile computing environment. The built-in SHA-1 engine in TMP is one of the most important circuit blocks and contributes to the performance of the whole platform, because it is used as a key primitive supporting platform integrity and command authentication. Mobile platforms have very stringent limitations with respect to available power, physical circuit area, and cost. Therefore, special architectures and design methods for a low-power SHA-1 circuit are required. In this paper, we present a novel and efficient hardware architecture of a low-power SHA-1 design for TMP. Our low-power SHA-1 hardware can compute a 512-bit data block using fewer than 7,000 gates and draws about 1.1 mA on a 0.25 μm CMOS process.
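The paper describes a hardware engine, but its functional behavior (SHA-1 iterated over 512-bit message blocks) can be checked in software against Python's standard `hashlib`, for instance with the well-known FIPS 180 "abc" test vector:

```python
import hashlib

# SHA-1 consumes its input in 512-bit (64-byte) blocks; hashlib applies
# the standard padding and block iteration internally, so a hardware
# engine's output for a padded block can be verified against it.
message = b"abc"  # standard FIPS 180 test vector
digest = hashlib.sha1(message).hexdigest()
# Expected FIPS test-vector digest:
# a9993e364706816aba3e25717850c26c9cd0d89d
```

Any conforming implementation, whether a 7,000-gate circuit or a software library, must reproduce this digest exactly.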

  17. An Evaluation of Architectural Platforms for Parallel Navier-Stokes Computations

    NASA Technical Reports Server (NTRS)

    Jayasimha, D. N.; Hayder, M. E.; Pillay, S. K.

    1996-01-01

    We study the computational, communication, and scalability characteristics of a computational fluid dynamics application, which solves the time accurate flow field of a jet using the compressible Navier-Stokes equations, on a variety of parallel architecture platforms. The platforms chosen for this study are a cluster of workstations (the LACE experimental testbed at NASA Lewis), a shared memory multiprocessor (the Cray YMP), and distributed memory multiprocessors with different topologies - the IBM SP and the Cray T3D. We investigate the impact of various networks connecting the cluster of workstations on the performance of the application and the overheads induced by popular message passing libraries used for parallelization. The work also highlights the importance of matching the memory bandwidth to the processor speed for good single processor performance. By studying the performance of an application on a variety of architectures, we are able to point out the strengths and weaknesses of each of the example computing platforms.

  18. Parallelizing Navier-Stokes Computations on a Variety of Architectural Platforms

    NASA Technical Reports Server (NTRS)

    Jayasimha, D. N.; Hayder, M. E.; Pillay, S. K.

    1997-01-01

    We study the computational, communication, and scalability characteristics of a Computational Fluid Dynamics application, which solves the time accurate flow field of a jet using the compressible Navier-Stokes equations, on a variety of parallel architectural platforms. The platforms chosen for this study are a cluster of workstations (the LACE experimental testbed at NASA Lewis), a shared memory multiprocessor (the Cray YMP), and distributed memory multiprocessors with different topologies: the IBM SP and the Cray T3D. We investigate the impact of various networks connecting the cluster of workstations on the performance of the application and the overheads induced by popular message passing libraries used for parallelization. The work also highlights the importance of matching the memory bandwidth to the processor speed for good single processor performance. By studying the performance of an application on a variety of architectures, we are able to point out the strengths and weaknesses of each of the example computing platforms.

  19. A Web Tool for Research in Nonlinear Optics

    NASA Astrophysics Data System (ADS)

    Prikhod'ko, Nikolay V.; Abramovsky, Viktor A.; Abramovskaya, Natalia V.; Demichev, Andrey P.; Kryukov, Alexandr P.; Polyakov, Stanislav P.

    2016-02-01

    This paper presents a project to develop a web platform called WebNLO for computer modeling of nonlinear optics phenomena. We discuss a general scheme of the platform and a model for interaction between the platform modules. The platform is built as a set of interacting RESTful web services (SaaS approach). Users can interact with the platform through a web browser or a command line interface. Such a resource has no analogues in the field of nonlinear optics and is being created for the first time, allowing researchers to access high-performance computing resources and significantly reducing the cost of the research and development process.

  20. Programmable Hydrogels for Cell Encapsulation and Neo-Tissue Growth to Enable Personalized Tissue Engineering.

    PubMed

    Bryant, Stephanie J; Vernerey, Franck J

    2018-01-01

    Biomimetic and biodegradable synthetic hydrogels are emerging as a promising platform for cell encapsulation and tissue engineering. Notably, synthetic-based hydrogels offer highly programmable macroscopic properties (e.g., mechanical, swelling and transport properties) and degradation profiles through control over several tunable parameters (e.g., the initial network structure, degradation kinetics and behavior, and polymer properties). One component to success is the ability to maintain structural integrity as the hydrogel transitions to neo-tissue. This seamless transition is complicated by the fact that cellular activity is highly variable among donors. Thus, computational models provide an important tool in tissue engineering due to their unique ability to explore the coupled processes of hydrogel degradation and neo-tissue growth across multiple length scales. In addition, such models provide new opportunities to develop predictive computational tools to overcome the challenges with designing hydrogels for different donors. In this report, programmable properties of synthetic-based hydrogels and their relation to the hydrogel's structural properties and their evolution with degradation are reviewed. This is followed by recent progress on the development of computational models that describe hydrogel degradation with neo-tissue growth when cells are encapsulated in a hydrogel. Finally, the potential for predictive models to enable patient-specific hydrogel designs for personalized tissue engineering is discussed. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Continuous measurement of breast tumour hormone receptor expression: a comparison of two computational pathology platforms.

    PubMed

    Ahern, Thomas P; Beck, Andrew H; Rosner, Bernard A; Glass, Ben; Frieling, Gretchen; Collins, Laura C; Tamimi, Rulla M

    2017-05-01

    Computational pathology platforms incorporate digital microscopy with sophisticated image analysis to permit rapid, continuous measurement of protein expression. We compared two computational pathology platforms on their measurement of breast tumour oestrogen receptor (ER) and progesterone receptor (PR) expression. Breast tumour microarrays from the Nurses' Health Study were stained for ER (n=592) and PR (n=187). One expert pathologist scored cases as positive if ≥1% of tumour nuclei exhibited stain. ER and PR were then measured with the Definiens Tissue Studio (automated) and Aperio Digital Pathology (user-supervised) platforms. Platform-specific measurements were compared using boxplots, scatter plots and correlation statistics. Classification of ER and PR positivity by platform-specific measurements was evaluated with areas under receiver operating characteristic curves (AUC) from univariable logistic regression models, using expert pathologist classification as the standard. Both platforms showed considerable overlap in continuous measurements of ER and PR between positive and negative groups classified by expert pathologist. Platform-specific measurements were strongly and positively correlated with one another (r≥0.77). The user-supervised Aperio workflow performed slightly better than the automated Definiens workflow at classifying ER positivity (AUC_Aperio=0.97; AUC_Definiens=0.90; difference=0.07, 95% CI 0.05 to 0.09) and PR positivity (AUC_Aperio=0.94; AUC_Definiens=0.87; difference=0.07, 95% CI 0.03 to 0.12). Paired hormone receptor expression measurements from the two computational pathology platforms agreed well with one another. The user-supervised workflow yielded better classification accuracy than the automated workflow. Appropriately validated computational pathology algorithms enrich molecular epidemiology studies with continuous protein expression data and may accelerate tumour biomarker discovery.
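The platform comparison above rests on AUC computed against the pathologist's binary calls. As an illustration (not the authors' analysis, which used univariable logistic regression), AUC can be computed directly from continuous expression scores via the Mann-Whitney formulation; the labels and scores below are hypothetical:

```python
def auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a randomly chosen positive case scores higher
    than a randomly chosen negative case (ties counted as 1/2)."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical pathologist calls (1 = ER-positive) and continuous
# platform expression scores for six tumour cores.
labels = [1, 1, 1, 0, 0, 0]
scores = [0.92, 0.80, 0.35, 0.40, 0.10, 0.05]
```

With these toy values the third positive case scores below one negative case, giving an AUC of 8/9 rather than a perfect 1.0, which mirrors how overlap between positive and negative score distributions pulls the AUC below 1.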

  2. 47 CFR 64.617 - Neutral Video Communication Service Platform.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 3 2013-10-01 2013-10-01 false Neutral Video Communication Service Platform... Related Customer Premises Equipment for Persons With Disabilities § 64.617 Neutral Video Communication... Neutral Video Communication Service Platform to process VRS calls. Each VRS CA service provider shall be...

  3. 47 CFR 64.617 - Neutral Video Communication Service Platform.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 3 2014-10-01 2014-10-01 false Neutral Video Communication Service Platform... Related Customer Premises Equipment for Persons With Disabilities § 64.617 Neutral Video Communication... Neutral Video Communication Service Platform to process VRS calls. Each VRS CA service provider shall be...

  4. Design of platform for removing screws from LCD display shields

    NASA Astrophysics Data System (ADS)

    Tu, Zimei; Qin, Qin; Dou, Jianfang; Zhu, Dongdong

    2017-11-01

    Removing the screws on the sides of a shield is a necessary step in disassembling a computer LCD display. To automate this process, a platform has been designed for removing the screws from display shields. This platform uses virtual instrument technology, with LabVIEW as the development environment, to design the mechanical structure, combining the technologies of motion control, human-computer interaction and target recognition. The platform removes the screws from the sides of the shield of an LCD display mechanically, guaranteeing subsequent separation and recycling.

  5. On the performances of computer vision algorithms on mobile platforms

    NASA Astrophysics Data System (ADS)

    Battiato, S.; Farinella, G. M.; Messina, E.; Puglisi, G.; Ravì, D.; Capra, A.; Tomaselli, V.

    2012-01-01

    Computer Vision enables mobile devices to extract the meaning of the observed scene from the information acquired with the onboard sensor cameras. Nowadays, there is a growing interest in Computer Vision algorithms able to work on mobile platforms (e.g., phone cameras, point-and-shoot cameras, etc.). Indeed, bringing Computer Vision capabilities to mobile devices opens new opportunities in different application contexts. The implementation of vision algorithms on mobile devices is still a challenging task, since these devices have poor image sensors and optics as well as limited processing power. In this paper we consider different algorithms covering classic Computer Vision tasks: keypoint extraction, face detection, and image segmentation. Several tests have been done to compare the performance of the mobile platforms involved: Nokia N900, LG Optimus One, Samsung Galaxy SII.
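The benchmarked algorithms themselves are not listed in the abstract; as a stand-in for the simplest of the three tasks, image segmentation, here is a minimal global-threshold sketch on a toy grayscale image (not one of the paper's algorithms):

```python
def threshold_segment(image, threshold):
    """Binary segmentation of a grayscale image, given as a list of rows
    of 0-255 intensities: 1 = foreground, 0 = background."""
    return [[1 if px >= threshold else 0 for px in row] for row in image]

def midpoint_threshold(image):
    """Crude global threshold: midpoint between min and max intensity.
    (A real segmenter would use, e.g., Otsu's method, which maximizes
    between-class variance.)"""
    pixels = [px for row in image for px in row]
    return (min(pixels) + max(pixels)) / 2

# Toy 3x3 image with a bright region in the upper-right corner.
img = [[10, 12, 200],
       [11, 198, 205],
       [9, 14, 13]]
mask = threshold_segment(img, midpoint_threshold(img))
```

Even such a trivial per-pixel pass illustrates why the paper's benchmarks matter on phones: segmentation cost scales with resolution, so sensor megapixels multiply directly against the limited processing power of the platform.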

  6. 3D Slicer as an Image Computing Platform for the Quantitative Imaging Network

    PubMed Central

    Fedorov, Andriy; Beichel, Reinhard; Kalpathy-Cramer, Jayashree; Finet, Julien; Fillion-Robin, Jean-Christophe; Pujol, Sonia; Bauer, Christian; Jennings, Dominique; Fennessy, Fiona; Sonka, Milan; Buatti, John; Aylward, Stephen; Miller, James V.; Pieper, Steve; Kikinis, Ron

    2012-01-01

    Quantitative analysis has tremendous but mostly unrealized potential in healthcare to support objective and accurate interpretation of clinical imaging. In 2008, the National Cancer Institute began building the Quantitative Imaging Network (QIN) initiative with the goal of advancing quantitative imaging in the context of personalized therapy and evaluation of treatment response. Computerized analysis is an important component contributing to the reproducibility and efficiency of quantitative imaging techniques. The success of quantitative imaging is contingent on robust analysis methods and software tools to bring these methods from bench to bedside. 3D Slicer is a free open-source software application for medical image computing. As a clinical research tool, 3D Slicer is similar to a radiology workstation that supports versatile visualizations but also provides advanced functionality such as automated segmentation and registration for a variety of application domains. Unlike a typical radiology workstation, 3D Slicer is free and is not tied to specific hardware. As a programming platform, 3D Slicer facilitates translation and evaluation of new quantitative methods by allowing the biomedical researcher to focus on the implementation of the algorithm while providing abstractions for the common tasks of data communication, visualization and user interface development. Compared to other tools that provide aspects of this functionality, 3D Slicer is fully open source and can be readily extended and redistributed. In addition, 3D Slicer is designed to facilitate the development of new functionality in the form of 3D Slicer extensions. In this paper, we present an overview of 3D Slicer as a platform for prototyping, development and evaluation of image analysis tools for clinical research applications. To illustrate the utility of the platform in the scope of QIN, we discuss several use cases of 3D Slicer by the existing QIN teams, and we elaborate on the future directions that can further facilitate development and validation of imaging biomarkers using 3D Slicer. PMID:22770690

  7. Implementation of High Speed Distributed Data Acquisition System

    NASA Astrophysics Data System (ADS)

    Raju, Anju P.; Sekhar, Ambika

    2012-09-01

    This paper introduces a high speed distributed data acquisition system based on a field programmable gate array (FPGA). The aim is to develop a "distributed" data acquisition interface. The development of instruments such as personal computers and engineering workstations based on "standard" platforms is the motivation behind this effort. Using standard platforms as the controlling unit allows independence in hardware from a particular vendor and hardware platform. The distributed approach also has advantages from a functional point of view: acquisition resources become available to multiple instruments, and the acquisition front-end can be physically remote from the rest of the instrument. The high speed data acquisition system transmits data to a remote computer system through an Ethernet interface. The data are acquired through 16 analog input channels, multiplexed and digitized, and then stored in a 1K buffer for each input channel. The main control unit in this design is a 16 bit processor implemented in the FPGA. This processor is used to set up and initialize the data source and the Ethernet controller, as well as to control the flow of data from the memory element to the NIC; the different configuration registers in the Ethernet controller can thus be initialized and controlled in an easy manner. The data packets are then sent to the remote PC through the Ethernet interface. The main advantages of using an FPGA as the standard platform are its flexibility, low power consumption, short design duration, fast time to market, programmability and high density. The main advantages of using the AX88796 Ethernet controller over others are its non-PCI interface, the presence of embedded SRAM where the transmit and reception buffers are located, and its high-performance SRAM-like interface. The paper presents the implementation of the distributed data acquisition on an FPGA in VHDL. The main advantages of this system are high accuracy, high speed, and real-time monitoring.
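    The receiving side of the packet path described above (16 channels, per-channel buffers, Ethernet transmission to a remote PC) can be sketched as follows. The UDP port and the packet layout are illustrative assumptions; the abstract does not specify the actual framing used by the FPGA front-end.

    ```python
    import socket
    import struct

    # Hypothetical packet layout: one 16-bit big-endian sample per channel,
    # for 16 analog input channels (32 bytes per packet).
    NUM_CHANNELS = 16
    PACKET_FMT = ">" + "H" * NUM_CHANNELS  # 16 unsigned 16-bit samples
    PACKET_SIZE = struct.calcsize(PACKET_FMT)

    def parse_packet(payload: bytes) -> list:
        """Unpack one acquisition packet into per-channel sample values."""
        if len(payload) != PACKET_SIZE:
            raise ValueError(f"expected {PACKET_SIZE} bytes, got {len(payload)}")
        return list(struct.unpack(PACKET_FMT, payload))

    def receive_samples(host="0.0.0.0", port=5005, packets=10):
        """Collect a few packets from the FPGA front-end over UDP (port is an assumption)."""
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.bind((host, port))
            for _ in range(packets):
                payload, _addr = sock.recvfrom(PACKET_SIZE)
                yield parse_packet(payload)
    ```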

  8. Elastic Cloud Computing Architecture and System for Heterogeneous Spatiotemporal Computing

    NASA Astrophysics Data System (ADS)

    Shi, X.

    2017-10-01

    Spatiotemporal computation implements a variety of different algorithms. When big data are involved, a desktop computer or standalone application may not be able to complete the computation task due to limited memory and computing power. Now that a variety of hardware accelerators and computing platforms are available to improve the performance of geocomputation, different algorithms may behave differently on different computing infrastructures and platforms. Some are perfect for implementation on a cluster of graphics processing units (GPUs), while GPUs may not be useful for certain kinds of spatiotemporal computation. The same holds when utilizing a cluster of Intel's many-integrated-core (MIC) processors, or Xeon Phi, as well as Hadoop or Spark platforms, to handle big spatiotemporal data. Furthermore, considering the energy efficiency required in general computation, a Field Programmable Gate Array (FPGA) may be the better solution when its performance is similar to or better than that of GPUs and MICs. It is expected that an elastic cloud computing architecture and system that integrates GPUs, MICs, and FPGAs could be developed and deployed to support spatiotemporal computing over heterogeneous data types and computational problems.

  9. SU-D-BRD-02: A Web-Based Image Processing and Plan Evaluation Platform (WIPPEP) for Future Cloud-Based Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chai, X; Liu, L; Xing, L

    Purpose: Visualization and processing of medical images and radiation treatment plan evaluation have traditionally been constrained to local workstations with limited computation power and limited ability for data sharing and software updates. We present a web-based image processing and plan evaluation platform (WIPPEP) for radiotherapy applications with high efficiency, ubiquitous web access, and real-time data sharing. Methods: This software platform consists of three parts: web server, image server and computation server. Each independent server communicates with the others through HTTP requests. The web server is the key component: it provides visualizations and the user interface through front-end web browsers and relays information to the backend to process user requests. The image server serves as a PACS system. The computation server performs the actual image processing and dose calculation. The web server backend is developed using Java Servlets and the frontend is developed using HTML5, JavaScript, and jQuery. The image server is based on the open source DCM4CHEE PACS system. The computation server can be written in any programming language as long as it can send and receive HTTP requests. Our computation server was implemented in Delphi, Python and PHP, which can process data directly or via a C++ program DLL. Results: This software platform runs on a 32-core CPU server that virtually hosts the web server, image server, and computation servers separately. Users can visit our internal website with the Chrome browser, select a specific patient, visualize the images and RT structures belonging to this patient, and perform image segmentation on the Delphi computation server and Monte Carlo dose calculation on the Python or PHP computation server. Conclusion: We have developed a web-based image processing and plan evaluation platform prototype for radiotherapy. This system has clearly demonstrated the feasibility of performing image processing and plan evaluation through a web browser and exhibited potential for future cloud-based radiotherapy.
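    The three-server design, in which independent services communicate only through HTTP requests, can be sketched in miniature. The endpoint and JSON payload shape below are hypothetical; the abstract does not describe the actual WIPPEP protocol.

    ```python
    import json
    import threading
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.request import Request, urlopen

    class ComputationHandler(BaseHTTPRequestHandler):
        """Stands in for a computation server; a real one would run
        segmentation or dose calculation before replying."""
        def do_POST(self):
            length = int(self.headers["Content-Length"])
            task = json.loads(self.rfile.read(length))
            reply = json.dumps({"task": task.get("task"), "status": "done"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(reply)))
            self.end_headers()
            self.wfile.write(reply)

        def log_message(self, *args):  # keep the demo quiet
            pass

    def submit_task(port, task):
        """The web-server backend relaying a user request to the computation server."""
        req = Request(f"http://127.0.0.1:{port}/",
                      data=json.dumps({"task": task}).encode(),
                      headers={"Content-Type": "application/json"})
        with urlopen(req) as resp:
            return json.loads(resp.read())

    server = HTTPServer(("127.0.0.1", 0), ComputationHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    result = submit_task(server.server_port, "monte_carlo_dose")
    server.shutdown()
    ```

    Because the tiers share nothing but HTTP, the computation server really can be swapped for a Delphi, Python or PHP implementation without touching the web server.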

  10. Digital imaging of root traits (DIRT): a high-throughput computing and collaboration platform for field-based root phenomics.

    PubMed

    Das, Abhiram; Schneider, Hannah; Burridge, James; Ascanio, Ana Karine Martinez; Wojciechowski, Tobias; Topp, Christopher N; Lynch, Jonathan P; Weitz, Joshua S; Bucksch, Alexander

    2015-01-01

    Plant root systems are key drivers of plant function and yield. They are also under-explored targets to meet global food and energy demands. Many new technologies have been developed to characterize crop root system architecture (CRSA). These technologies have the potential to accelerate the progress in understanding the genetic control and environmental response of CRSA. Putting this potential into practice requires new methods and algorithms to analyze CRSA in digital images. Most prior approaches have solely focused on the estimation of root traits from images, yet no integrated platform exists that allows easy and intuitive access to trait extraction and analysis methods from images combined with storage solutions linked to metadata. Automated high-throughput phenotyping methods are increasingly used in laboratory-based efforts to link plant genotype with phenotype, whereas similar field-based studies remain predominantly manual and low-throughput. Here, we present an open-source phenomics platform, "DIRT", as a means to integrate scalable supercomputing architectures into field experiments and analysis pipelines. DIRT is an online platform that enables researchers to store images of plant roots, measure dicot and monocot root traits under field conditions, and share data and results within collaborative teams and the broader community. The DIRT platform seamlessly connects end-users with large-scale compute "commons", enabling the estimation and analysis of root phenotypes from field experiments of unprecedented size. DIRT is an automated high-throughput computing and collaboration platform for field-based crop root phenomics. The platform is accessible at http://www.dirt.iplantcollaborative.org/ and hosted on the iPlant cyber-infrastructure using high-throughput grid computing resources of the Texas Advanced Computing Center (TACC). DIRT is a high-volume central repository and high-throughput RSA trait computation platform for plant scientists working on crop roots. It enables scientists to store, manage and share crop root images with metadata and to compute RSA traits from thousands of images in parallel. It makes high-throughput RSA trait computation available to the community with just a few button clicks, enabling plant scientists to spend more time on science rather than on technology. All stored and computed data are easily accessible to the public and the broader scientific community. We hope that easy data accessibility will attract new tool developers and spur creative data usage that may even be applied to other fields of science.

  11. GenomicTools: a computational platform for developing high-throughput analytics in genomics.

    PubMed

    Tsirigos, Aristotelis; Haiminen, Niina; Bilal, Erhan; Utro, Filippo

    2012-01-15

    Recent advances in sequencing technology have resulted in a dramatic increase of sequencing data, which, in turn, requires efficient management of computational resources such as computing time and memory, as well as rapid prototyping of computational pipelines. We present GenomicTools, a flexible computational platform, comprising both a command-line set of tools and a C++ API, for the analysis and manipulation of high-throughput sequencing data such as DNA-seq, RNA-seq, ChIP-seq and MethylC-seq. GenomicTools implements a variety of mathematical operations between sets of genomic regions, thereby enabling the prototyping of computational pipelines that can address a wide spectrum of tasks ranging from pre-processing and quality control to meta-analyses. Additionally, the GenomicTools platform is designed to analyze large datasets of any size by minimizing memory requirements. In practical applications, where comparable, GenomicTools outperforms existing tools in terms of both time and memory usage. The GenomicTools platform (version 2.0.0) was implemented in C++. The source code, documentation, user manual, example datasets and scripts are available online at http://code.google.com/p/ibm-cbc-genomic-tools.
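    A toy illustration of one such mathematical operation between sets of genomic regions, region intersection, might look as follows. This sketches the idea only; it is not GenomicTools' actual C++ API or command-line syntax.

    ```python
    from collections import defaultdict

    def intersect_regions(a, b):
        """Return the overlapping pieces of the regions in `a` and `b`.
        Each region is a (chromosome, start, end) tuple, half-open coordinates."""
        by_chrom = defaultdict(list)
        for chrom, start, end in b:
            by_chrom[chrom].append((start, end))
        out = []
        for chrom, start, end in a:
            for b_start, b_end in by_chrom[chrom]:
                lo, hi = max(start, b_start), min(end, b_end)
                if lo < hi:  # non-empty overlap
                    out.append((chrom, lo, hi))
        return sorted(out)

    # Hypothetical inputs: ChIP-seq peaks intersected with gene bodies.
    peaks = [("chr1", 100, 200), ("chr2", 50, 80)]
    genes = [("chr1", 150, 300), ("chr2", 90, 120)]
    # only the chr1 regions overlap: [("chr1", 150, 200)]
    ```

    Composing a handful of such set operations (intersect, union, subtract, extend) is what lets short pipelines cover tasks from quality control to meta-analysis.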

  12. Design and Evaluation of a Pervasive Coaching and Gamification Platform for Young Diabetes Patients.

    PubMed

    Klaassen, Randy; Bul, Kim C M; Op den Akker, Rieks; van der Burg, Gert Jan; Kato, Pamela M; Di Bitonto, Pierpaolo

    2018-01-30

    Self-monitoring, personal goal-setting and coaching, education and social support are strategies to help patients with chronic conditions in their daily care. Various tools have been developed to this end, e.g., mobile digital coaching systems connected with wearable sensors, serious games and patient web portals to personal health records, all aiming to support patients with chronic conditions and their caregivers in realizing the ideal of self-management. We describe a platform that integrates these tools to support young patients in diabetes self-management through educational game playing, monitoring and motivational feedback. We describe the design of the platform with reference to principles from healthcare, persuasive system design and serious game design. The virtual coach is a game guide that can also provide personalized feedback about the user's daily care-related activities, which have value for making progress in the game world. User evaluations with patients under pediatric supervision revealed that the use of mobile technology in combination with web-based elements is feasible, but some assumptions made about how users would connect to the platform were not satisfied in reality, resulting in less than optimal user experiences. We discuss challenges and suggestions for further development of integrated pervasive coaching and gamification platforms in medical practice.

  13. Design and Evaluation of a Pervasive Coaching and Gamification Platform for Young Diabetes Patients †

    PubMed Central

    Klaassen, Randy; Bul, Kim C. M.; op den Akker, Rieks; van der Burg, Gert Jan; Di Bitonto, Pierpaolo

    2018-01-01

    Self-monitoring, personal goal-setting and coaching, education and social support are strategies to help patients with chronic conditions in their daily care. Various tools have been developed to this end, e.g., mobile digital coaching systems connected with wearable sensors, serious games and patient web portals to personal health records, all aiming to support patients with chronic conditions and their caregivers in realizing the ideal of self-management. We describe a platform that integrates these tools to support young patients in diabetes self-management through educational game playing, monitoring and motivational feedback. We describe the design of the platform with reference to principles from healthcare, persuasive system design and serious game design. The virtual coach is a game guide that can also provide personalized feedback about the user’s daily care-related activities, which have value for making progress in the game world. User evaluations with patients under pediatric supervision revealed that the use of mobile technology in combination with web-based elements is feasible, but some assumptions made about how users would connect to the platform were not satisfied in reality, resulting in less than optimal user experiences. We discuss challenges and suggestions for further development of integrated pervasive coaching and gamification platforms in medical practice. PMID:29385750

  14. OpenDrop: An Integrated Do-It-Yourself Platform for Personal Use of Biochips

    PubMed Central

    Alistar, Mirela; Gaudenz, Urs

    2017-01-01

    Biochips, or digital labs-on-chip, are developed with the purpose of being used by laboratory technicians or biologists in laboratories or clinics. In this article, we expand this vision with the goal of enabling everyone, regardless of their expertise, to use biochips for their own personal purposes. We developed OpenDrop, an integrated electromicrofluidic platform that allows users to develop and program their own bio-applications. We address the main challenges that users may encounter: accessibility, bio-protocol design and interaction with microfluidics. OpenDrop consists of a do-it-yourself biochip, an automated software tool with a visual interface and a detailed technique for at-home operation of microfluidics. We report on two years of use of OpenDrop, released as an open-source platform. Our platform attracted a highly diverse user base, with participants originating from maker communities, academia and industry. Our findings show that 47% of attempts to replicate OpenDrop were successful, with the main remaining challenge being the assembly of the device. In terms of usability, the users managed to operate their platforms at home and are working on designing their own bio-applications. Our work provides a step towards a future in which everyone will be able to create microfluidic devices for their personal applications, thereby democratizing parts of health care. PMID:28952524

  15. An open platform for personal health record apps with platform-level privacy protection.

    PubMed

    Van Gorp, P; Comuzzi, M; Jahnen, A; Kaymak, U; Middleton, B

    2014-08-01

    One of the main barriers to the adoption of Personal Health Records (PHR) systems is their closed nature. It has been argued in the literature that this barrier can be overcome by introducing an open market of substitutable PHR apps. The requirements introduced by such an open market on the underlying platform have also been derived. In this paper, we argue that MyPHRMachines, a cloud-based PHR platform recently developed by the authors, satisfies these requirements better than its alternatives. The MyPHRMachines platform leverages Virtual Machines as flexible and secure execution sandboxes for health apps. MyPHRMachines does not prevent pushing hospital- or patient-generated data to one of its instances, nor does it prevent patients from sharing data with their trusted caregivers. External software developers have minimal barriers to contribute innovative apps to the platform, since apps are only required to avoid pushing patient data outside a MyPHRMachines cloud. We demonstrate the potential of MyPHRMachines by presenting two externally contributed apps. Both apps provide functionality going beyond the state-of-the-art in their application domain, while they did not require any specific MyPHRMachines platform extension. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. The performance of low-cost commercial cloud computing as an alternative in computational chemistry.

    PubMed

    Thackston, Russell; Fortenberry, Ryan C

    2015-05-05

    The growth of commercial cloud computing (CCC) as a viable means of computational infrastructure is largely unexplored for the purposes of quantum chemistry. In this work, the PSI4 suite of computational chemistry programs is installed on five different types of Amazon Web Services CCC platforms. The performance for a set of electronically excited state single-point energies is compared between these CCC platforms and typical "in-house" physical machines. Further considerations are made for the number of cores, or virtual CPUs (vCPUs, for the CCC platforms), but no considerations are made for full parallelization of the program (even though parallelization of the BLAS library is implemented), complete high-performance computing cluster utilization, or steal time. Even with this most pessimistic view of the computations, CCC resources are shown to be more cost effective for significant numbers of typical quantum chemistry computations. Large numbers of large computations are still best handled by more traditional means, but smaller-scale research may be more effectively undertaken through CCC services. © 2015 Wiley Periodicals, Inc.

  17. A Platform-Independent Plugin for Navigating Online Radiology Cases.

    PubMed

    Balkman, Jason D; Awan, Omer A

    2016-06-01

    Software methods that enable navigation of radiology cases on various digital platforms differ between handheld devices and desktop computers. This has resulted in poor compatibility of online radiology teaching files across mobile smartphones, tablets, and desktop computers. A standardized, platform-independent, or "agnostic" approach for presenting online radiology content was produced in this work by leveraging modern hypertext markup language (HTML) and JavaScript web software technology. We describe the design and evaluation of this software, demonstrate its use across multiple viewing platforms, and make it publicly available as a model for future development efforts.

  18. Exploring the Potential of the iPad and Xbox Kinect for Cognitive Science Research.

    PubMed

    Rolle, Camarin E; Voytek, Bradley; Gazzaley, Adam

    2015-06-01

    Many studies have validated consumer-facing hardware platforms as efficient, cost-effective, and accessible data collection instruments. However, there are few reports that have assessed the reliability of these platforms as assessment tools compared with traditional data collection platforms. Here we evaluated performance on a spatial attention paradigm obtained by our standard in-lab data collection platform, the personal computer (PC), and compared performance with that of two widely adopted consumer technology devices: the Apple (Cupertino, CA) iPad® 2 and Microsoft (Redmond, WA) Xbox® Kinect®. The task assessed spatial attention, a fundamental ability that we use to navigate the complex sensory input we face daily in order to effectively engage in goal-directed activities. Participants were presented with a central spatial cue indicating where on the screen a stimulus would appear. We manipulated spatial cueing such that, on a given trial, the cue presented one of four levels of information indicating the upcoming target location. Based on previous research, we hypothesized that as information about the cued spatial area decreased (i.e., a larger area of possible target locations) there would be a parametric decrease in performance, as revealed by slower response times and lower accuracies. Identical paradigm parameters were used for each of the three platforms, and testing was performed in a single session with a counterbalanced design. We found that performance on the Kinect and iPad showed a stronger parametric effect across the cued-information levels than that on the PC. Our results suggest not only that the Kinect and iPad can be reliably used as assessment tools to yield research-quality behavioral data, but also that these platforms exploit mechanics that could be useful in building more interactive, and therefore effective, cognitive assessment and training designs. We include a discussion of the possible contributing factors to the differential effects between platforms, as well as potential confounds of the study.

  19. Trust and Decision Making: An Empirical Platform

    DTIC Science & Technology

    2008-06-01

    13th ICCRTS, “C2 for Complex Endeavors.” Trust and Decision Making: An Empirical Platform. Topic(s): Cognitive and Social Issues. ... Dr. Joseph B

  20. Unified, Cross-Platform, Open-Source Library Package for High-Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kozacik, Stephen

    Compute power is continually increasing, but this increased performance is largely found in sophisticated computing devices and supercomputer resources that are difficult to use, resulting in under-utilization. We developed a unified set of programming tools that will allow users to take full advantage of the new technology by allowing them to work at a level abstracted away from the platform specifics, encouraging the use of modern computing systems, including government-funded supercomputer facilities.

  1. Functional requirements document for the Earth Observing System Data and Information System (EOSDIS) Scientific Computing Facilities (SCF) of the NASA/MSFC Earth Science and Applications Division, 1992

    NASA Technical Reports Server (NTRS)

    Botts, Michael E.; Phillips, Ron J.; Parker, John V.; Wright, Patrick D.

    1992-01-01

    Five scientists at MSFC/ESAD have EOS SCF investigator status. Each SCF has unique tasks which require the establishment of a computing facility dedicated to accomplishing those tasks. A SCF Working Group was established at ESAD with the charter of defining the computing requirements of the individual SCFs and recommending options for meeting these requirements. The primary goal of the working group was to determine which computing needs can be satisfied using either shared resources or separate but compatible resources, and which needs require unique individual resources. The requirements investigated included CPU-intensive vector and scalar processing, visualization, data storage, connectivity, and I/O peripherals. A review of computer industry directions and a market survey of computing hardware provided information regarding important industry standards and candidate computing platforms. It was determined that the total SCF computing requirements might be most effectively met using a hierarchy consisting of shared and individual resources. This hierarchy is composed of five major system types: (1) a supercomputer class vector processor; (2) a high-end scalar multiprocessor workstation; (3) a file server; (4) a few medium- to high-end visualization workstations; and (5) several low- to medium-range personal graphics workstations. Specific recommendations for meeting the needs of each of these types are presented.

  2. University Students' Use of Computers and Mobile Devices for Learning and Their Reading Speed on Different Platforms

    ERIC Educational Resources Information Center

    Mpofu, Bongeka

    2016-01-01

    This research was aimed at the investigation of mobile device and computer use at a higher learning institution. The goal was to determine the current use of computers and mobile devices for learning and the students' reading speed on different platforms. The research was contextualised in a sample of students at the University of South Africa.…

  3. Application verification research of cloud computing technology in the field of real time aerospace experiment

    NASA Astrophysics Data System (ADS)

    Wan, Junwei; Chen, Hongyan; Zhao, Jing

    2017-08-01

    According to the requirements of real-time operation, reliability and safety for aerospace experiments, a single-center cloud computing application verification platform was constructed. At the IaaS level, the feasibility of applying cloud computing technology to the field of aerospace experiments is tested and verified. Based on an analysis of the test results, a preliminary conclusion is reached: the cloud computing platform can be applied to computation-intensive aerospace experiment workloads; for I/O-intensive workloads, traditional physical machines are recommended.

  4. GATECloud.net: a platform for large-scale, open-source text processing on the cloud.

    PubMed

    Tablan, Valentin; Roberts, Ian; Cunningham, Hamish; Bontcheva, Kalina

    2013-01-28

    Cloud computing is increasingly being regarded as a key enabler of the 'democratization of science', because on-demand, highly scalable cloud computing facilities enable researchers anywhere to carry out data-intensive experiments. In the context of natural language processing (NLP), algorithms tend to be complex, which makes their parallelization and deployment on cloud platforms a non-trivial task. This study presents a new, unique, cloud-based platform for large-scale NLP research: GATECloud.net. It enables researchers to carry out data-intensive NLP experiments by harnessing the vast, on-demand compute power of the Amazon cloud. Important infrastructural issues are dealt with by the platform, completely transparently for the researcher: load balancing, efficient data upload and storage, deployment on the virtual machines, security and fault tolerance. We also include a cost-benefit analysis and usage evaluation.

  5. Personalized health care system with virtual reality rehabilitation and appropriate information for seniors.

    PubMed

    Gachet Páez, Diego; Aparicio, Fernando; de Buenaga, Manuel; Padrón, Víctor

    2012-01-01

    The concept of the information society is now a common one, as opposed to the industrial society that dominated the economy in past decades. It is assumed that all sectors should have access to information and reap its benefits. Elderly people are, in this respect, a major challenge, due to their lack of interest in technological progress and their lack of knowledge regarding the potential benefits that information society technologies might have on their lives. The Naviga Project (An Open and Adaptable Platform for the Elderly and Persons with Disability to Access the Information Society) is a European effort whose main goal is to design and develop a technological platform allowing elderly people and persons with disabilities to access the internet and the information society. Naviga also allows the creation of services targeted at social networks, mind training and personalized health care. In this paper we focus on the health care and information services designed in the project, the technological platform developed, and details of two representative elements: the virtual reality hand rehabilitation and the health information intelligent system.

  6. Personalized Health Care System with Virtual Reality Rehabilitation and Appropriate Information for Seniors

    PubMed Central

    Páez, Diego Gachet; Aparicio, Fernando; de Buenaga, Manuel; Padrón, Víctor

    2012-01-01

    The concept of the information society is now a common one, as opposed to the industrial society that dominated the economy in past decades. It is assumed that all sectors should have access to information and reap its benefits. Elderly people are, in this respect, a major challenge, due to their lack of interest in technological progress and their lack of knowledge regarding the potential benefits that information society technologies might have on their lives. The Naviga Project (An Open and Adaptable Platform for the Elderly and Persons with Disability to Access the Information Society) is a European effort whose main goal is to design and develop a technological platform allowing elderly people and persons with disabilities to access the internet and the information society. Naviga also allows the creation of services targeted at social networks, mind training and personalized health care. In this paper we focus on the health care and information services designed in the project, the technological platform developed, and details of two representative elements: the virtual reality hand rehabilitation and the health information intelligent system. PMID:22778598

  7. MyDiabetesMyWay

    PubMed Central

    Wake, Deborah J.; He, Jinzhang; Czesak, Anna Maria; Mughal, Fezan; Cunningham, Scott G.

    2016-01-01

    MyDiabetesMyWay (MDMW) is an award-winning national electronic personal health record and self-management platform for diabetes patients in Scotland. This platform links multiple national institutional and patient-recorded data sources to provide a unique resource for patient care and self-management. This review considers the current evidence for online interventions in diabetes and discusses these in the context of current and ongoing developments for MDMW. Evaluation of MDMW through patient-reported outcomes demonstrates a positive impact on self-management. User feedback has highlighted barriers to uptake and has guided platform evolution from an education resource website to an electronic personal health record now encompassing remote monitoring, communication tools and personalized education links. Challenges in delivering digital interventions for long-term conditions include integration of data between institutional and personally recorded sources to perform big data analytics, and facilitating technology use in those with disabilities, low digital literacy, low socioeconomic status and in minority groups. The potential for technology-supported health improvement is great, but awareness and adoption by health workers and patients remains a significant barrier. PMID:27162192

  8. Simulation of complex pharmacokinetic models in Microsoft Excel.

    PubMed

    Meineke, Ingolf; Brockmöller, Jürgen

    2007-12-01

    With the arrival of powerful personal computers in the office, numerical methods are accessible to everybody. Simulation of complex processes has therefore become an indispensable tool in research and education. In this paper Microsoft EXCEL is used as a platform for a universal differential equation solver. The software is designed as an add-in aiming at a minimum of required user input to perform a given task. Four examples are included to demonstrate both the simplicity of use and the versatility of possible applications. While the layout of the program is admittedly geared to the needs of pharmacokineticists, it can be used in any field where sets of differential equations are involved. The software package is available upon request.
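    The kind of simulation this add-in performs can be illustrated outside Excel as well. Below is a minimal Python sketch of a one-compartment oral-dosing pharmacokinetic model integrated with a fixed-step Euler scheme; the model choice and all parameter values (dose, ka, ke, V) are illustrative assumptions, not taken from the paper's software.

```python
# One-compartment oral absorption: the drug amount in the gut (A_gut)
# decays with absorption rate ka, while the plasma concentration C rises
# by absorption and falls by first-order elimination (rate ke).

def simulate_one_compartment(dose=100.0, ka=1.0, ke=0.2, V=10.0,
                             dt=0.01, t_end=24.0):
    """Return (times, concentrations) for a single oral dose."""
    A_gut = dose   # amount in the gut (mg)
    C = 0.0        # plasma concentration (mg/L)
    times, concs = [0.0], [0.0]
    t = 0.0
    while t < t_end:
        dA = -ka * A_gut * dt                 # absorption out of the gut
        dC = (ka * A_gut / V - ke * C) * dt   # absorption in, elimination out
        A_gut += dA
        C += dC
        t += dt
        times.append(t)
        concs.append(C)
    return times, concs

times, concs = simulate_one_compartment()
cmax = max(concs)                   # peak plasma concentration
tmax = times[concs.index(cmax)]     # time of the peak
```

A smaller step size or a higher-order integrator (e.g. Runge-Kutta) would reduce the discretization error; the Euler step is kept here only for transparency.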

  9. CheD: chemical database compilation tool, Internet server, and client for SQL servers.

    PubMed

    Trepalin, S V; Yarkov, A V

    2001-01-01

    An efficient program, which runs on a personal computer, for the storage, retrieval, and processing of chemical information is presented. The program can work either as a stand-alone application or in conjunction with a specifically written Web server application or with some standard SQL servers, e.g., Oracle, Interbase, and MS SQL. New types of data fields are introduced, e.g., arrays for spectral information storage, HTML and database links, and user-defined functions. CheD has an open architecture; thus, custom data types, controls, and services may be added. A WWW server application for chemical data retrieval features an easy and user-friendly installation on Windows NT or 95 platforms.
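    The storage pattern the abstract describes, chemical records with non-scalar fields such as spectra held in an SQL store, can be sketched as follows. The schema and field names are illustrative assumptions, not CheD's actual schema, and SQLite stands in for the Oracle/Interbase/MS SQL back ends mentioned.

```python
import json
import sqlite3

# An in-memory database with one table of compounds; the spectral array
# field is serialized to a JSON string, one simple way to keep an
# array-valued field inside a relational column.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE compound (
    id INTEGER PRIMARY KEY,
    name TEXT,
    smiles TEXT,
    ir_spectrum TEXT
)""")

peaks = [3300.0, 1650.0, 1050.0]  # hypothetical IR peak positions, cm^-1
conn.execute(
    "INSERT INTO compound (name, smiles, ir_spectrum) VALUES (?, ?, ?)",
    ("ethanol", "CCO", json.dumps(peaks)))

row = conn.execute("SELECT name, ir_spectrum FROM compound").fetchone()
spectrum = json.loads(row[1])     # recover the array on retrieval
```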

  10. Ultra-Compact Transputer-Based Controller for High-Level, Multi-Axis Coordination

    NASA Technical Reports Server (NTRS)

    Zenowich, Brian; Crowell, Adam; Townsend, William T.

    2013-01-01

    The design of machines that rely on arrays of servomotors such as robotic arms, orbital platforms, and combinations of both, imposes a heavy computational burden to coordinate their actions to perform coherent tasks. For example, the robotic equivalent of a person tracing a straight line in space requires enormously complex kinematics calculations, and complexity increases with the number of servo nodes. A new high-level architecture for coordinated servo-machine control enables a practical, distributed transputer alternative to conventional central processor electronics. The solution is inherently scalable, dramatically reduces bulkiness and number of conductor runs throughout the machine, requires only a fraction of the power, and is designed for cooling in a vacuum.

  11. OVERGRID: A Unified Overset Grid Generation Graphical Interface

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Akien, Edwin W. (Technical Monitor)

    1999-01-01

    This paper presents a unified graphical interface and gridding strategy for performing overset grid generation. The interface, called OVERGRID, has been specifically designed to follow an efficient overset gridding strategy, and contains general grid manipulation capabilities as well as modules that are specifically suited for overset grids. General grid utilities include functions for grid redistribution, smoothing, concatenation, extraction, extrapolation, projection, and many others. Modules specially tailored for overset grids include a seam curve extractor, hyperbolic and algebraic surface grid generators, a hyperbolic volume grid generator, and a Cartesian box grid generator. Grid visualization is achieved using OpenGL while widgets are constructed with Tcl/Tk. The software is portable between various platforms from UNIX workstations to personal computers.

  12. Electronic screen media for persons with autism spectrum disorders: results of a survey.

    PubMed

    Shane, Howard C; Albert, Patti Ducoff

    2008-09-01

    Social and anecdotal reports suggest a predilection for visual media among individuals on the autism spectrum, yet no formal investigation has explored the extent of that use. Using a distributed questionnaire design, parents and caregivers report on time allotted toward media, including observable behaviors and communicative responses. More time was spent engaged with electronic screen media (ESM) than any other leisure activity. Television and movie viewing was more popular than computer usage. Across media platforms, animated programs were more highly preferred. Prevalent verbal and physical imitation was reported to occur during and following exposure to ESM. Clinical implications to strategically incorporate ESM into learning approaches for children with autism spectrum disorders (ASD) are provided.

  13. A miniaturized NQR spectrometer for a multi-channel NQR-based detection device

    NASA Astrophysics Data System (ADS)

    Beguš, Samo; Jazbinšek, Vojko; Pirnat, Janez; Trontelj, Zvonko

    2014-10-01

    A low frequency (0.5-5 MHz) battery-operated sensitive pulsed NQR spectrometer with a transmitter power up to 5 W and a total mass of about 3 kg, aimed at detecting 14N NQR signals, predominantly of illicit materials, was designed and assembled. This spectrometer uses a standard software defined radio (SDR) platform for the data acquisition unit. Signal processing is done with a LabView virtual instrument on a personal computer. We successfully tested the spectrometer by measuring 14N NQR signals from aminotetrazole monohydrate (ATMH), potassium nitrate (PN), paracetamol (PCM) and trinitrotoluene (TNT). Such a spectrometer is a feasible component of a portable single- or multi-channel 14N NQR-based detection device.
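    A common first step in PC-side processing of such a digitized NQR response is locating the resonance line in the spectrum of the recorded free-induction decay. The sketch below is generic, with an invented 3.41 MHz line and assumed sample rate, not the spectrometer's actual LabView processing chain.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 20e6                         # sample rate in Hz, an assumed value
t = np.arange(0, 200e-6, 1 / fs)  # a 200 microsecond record
f0, T2 = 3.41e6, 50e-6            # hypothetical resonance frequency and decay
# Simulated free-induction decay: exponentially damped cosine plus noise.
fid = (np.exp(-t / T2) * np.cos(2 * np.pi * f0 * t)
       + 0.05 * rng.standard_normal(t.size))

spectrum = np.abs(np.fft.rfft(fid))      # magnitude spectrum
freqs = np.fft.rfftfreq(t.size, 1 / fs)
f_peak = freqs[np.argmax(spectrum)]      # estimated resonance line
```

With a 200 microsecond record the frequency resolution is 5 kHz, so the peak bin lands within a few kilohertz of the true line.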

  14. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application.

    PubMed

    Hanwell, Marcus D; de Jong, Wibe A; Harris, Christopher J

    2017-10-30

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, and all source code is hosted on the GitHub platform, with automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances that can be customized to the sites/research performed. Data are stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and send data to a JavaScript-based web client.
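    A JSON record for a chemical structure, of the kind the abstract contrasts with earlier XML approaches, might look like the sketch below. The field names are illustrative assumptions, not the platform's actual schema; the point is only that such a record round-trips cleanly between languages.

```python
import json

# A hypothetical structure record for a water molecule: element symbols,
# Cartesian coordinates in angstroms, and bonds as index pairs.
water = {
    "name": "water",
    "formula": "H2O",
    "atoms": {
        "elements": ["O", "H", "H"],
        "coords": [[0.0, 0.0, 0.0],
                   [0.757, 0.586, 0.0],
                   [-0.757, 0.586, 0.0]],
    },
    "bonds": [[0, 1], [0, 2]],
}

text = json.dumps(water)     # serialize for storage or an HTTP response
restored = json.loads(text)  # trivially parsed in JavaScript, Python, ...
```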

  15. Authentication of Smartphone Users Based on Activity Recognition and Mobile Sensing.

    PubMed

    Ehatisham-Ul-Haq, Muhammad; Azam, Muhammad Awais; Loo, Jonathan; Shuang, Kai; Islam, Syed; Naeem, Usman; Amin, Yasar

    2017-09-06

    Smartphones are context-aware devices that provide a compelling platform for ubiquitous computing and assist users in accomplishing many of their routine tasks anytime and anywhere, such as sending and receiving emails. The nature of tasks conducted with these devices has evolved with the exponential increase in the sensing and computing capabilities of a smartphone. Due to the ease of use and convenience, many users tend to store their private data, such as personal identifiers and bank account details, on their smartphone. However, this sensitive data can be vulnerable if the device gets stolen or lost. A traditional approach for protecting this type of data on mobile devices is to authenticate users with mechanisms such as PINs, passwords, and fingerprint recognition. However, these techniques are vulnerable to user compliance and a plethora of attacks, such as smudge attacks. The work in this paper addresses these challenges by proposing a novel authentication framework, which is based on recognizing the behavioral traits of smartphone users using the embedded sensors of smartphone, such as Accelerometer, Gyroscope and Magnetometer. The proposed framework also provides a platform for carrying out multi-class smart user authentication, which provides different levels of access to a wide range of smartphone users. This work has been validated with a series of experiments, which demonstrate the effectiveness of the proposed framework.
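    Behavioral-trait recognition pipelines of this kind typically window the motion-sensor stream and extract simple statistical features per window before classification. The sketch below is a generic example of that step, not the paper's actual feature set.

```python
import math

def window_features(ax, ay, az):
    """Mean and standard deviation per axis, plus mean magnitude,
    for one window of accelerometer samples."""
    feats = []
    for axis in (ax, ay, az):
        n = len(axis)
        mean = sum(axis) / n
        var = sum((v - mean) ** 2 for v in axis) / n
        feats += [mean, math.sqrt(var)]
    mags = [math.sqrt(x * x + y * y + z * z)
            for x, y, z in zip(ax, ay, az)]
    feats.append(sum(mags) / len(mags))
    return feats  # 7 features per window, fed to a classifier

# A tiny three-sample window (phone roughly upright, gravity on the y axis).
feats = window_features([0.1, 0.2, 0.1], [9.8, 9.7, 9.9], [0.0, 0.1, -0.1])
```

Gyroscope and magnetometer windows would contribute analogous features; the classifier behind them is where user-specific behavior is actually learned.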

  16. Authentication of Smartphone Users Based on Activity Recognition and Mobile Sensing

    PubMed Central

    Ehatisham-ul-Haq, Muhammad; Azam, Muhammad Awais; Loo, Jonathan; Shuang, Kai; Islam, Syed; Naeem, Usman; Amin, Yasar

    2017-01-01

    Smartphones are context-aware devices that provide a compelling platform for ubiquitous computing and assist users in accomplishing many of their routine tasks anytime and anywhere, such as sending and receiving emails. The nature of tasks conducted with these devices has evolved with the exponential increase in the sensing and computing capabilities of a smartphone. Due to the ease of use and convenience, many users tend to store their private data, such as personal identifiers and bank account details, on their smartphone. However, this sensitive data can be vulnerable if the device gets stolen or lost. A traditional approach for protecting this type of data on mobile devices is to authenticate users with mechanisms such as PINs, passwords, and fingerprint recognition. However, these techniques are vulnerable to user compliance and a plethora of attacks, such as smudge attacks. The work in this paper addresses these challenges by proposing a novel authentication framework, which is based on recognizing the behavioral traits of smartphone users using the embedded sensors of smartphone, such as Accelerometer, Gyroscope and Magnetometer. The proposed framework also provides a platform for carrying out multi-class smart user authentication, which provides different levels of access to a wide range of smartphone users. This work has been validated with a series of experiments, which demonstrate the effectiveness of the proposed framework. PMID:28878177

  17. Smartloss: A Personalized Mobile Health Intervention for Weight Management and Health Promotion

    PubMed Central

    Gilmore, L. Anne; Apolzan, John W; Myers, Candice A; Thomas, Diana M

    2016-01-01

    Background Synonymous with increased use of mobile phones has been the development of mobile health (mHealth) technology for improving health, including weight management. Behavior change theory (eg, the theory of planned behavior) can be effectively encapsulated into mobile phone-based health improvement programs, which is fostered by the ability of mobile phones and related devices to collect and transmit objective data in near real time and for health care or research professionals and clients to communicate easily. Objective To describe SmartLoss, a semiautomated mHealth platform for weight loss. Methods We developed and validated a dynamic energy balance model that determines the amount of weight an individual will lose over time if they are adherent to an energy intake prescription. This model was incorporated into computer code that enables adherence to a prescribed caloric prescription determined from the change in body weight of the individual. Data from the individual are then used to guide personalized recommendations regarding weight loss and behavior change via a semiautomated mHealth platform called SmartLoss, which consists of 2 elements: (1) a clinician dashboard and (2) a mobile phone app. SmartLoss includes and interfaces with a network-connected bathroom scale and a Bluetooth-connected accelerometer, which enables automated collection of client information (eg, body weight change and physical activity patterns), as well as the systematic delivery of preplanned health materials and automated feedback that is based on client data and is designed to foster prolonged adherence with body weight, diet, and exercise goals. The clinician dashboard allows for efficient remote monitoring of all clients simultaneously, which may further increase adherence, personalization of treatment, treatment fidelity, and efficacy. Results Evidence of the efficacy of the SmartLoss approach has been reported previously. 
The present report provides a thorough description of the SmartLoss Virtual Weight Management Suite, a professionally programmed platform that facilitates treatment fidelity and the ability to customize interventions and disseminate them widely. Conclusions SmartLoss functions as a virtual weight management clinic that relies upon empirical weight loss research and behavioral theory to promote behavior change and weight loss. PMID:26983937

  18. Smartloss: A Personalized Mobile Health Intervention for Weight Management and Health Promotion.

    PubMed

    Martin, Corby K; Gilmore, L Anne; Apolzan, John W; Myers, Candice A; Thomas, Diana M; Redman, Leanne M

    2016-03-16

    Synonymous with increased use of mobile phones has been the development of mobile health (mHealth) technology for improving health, including weight management. Behavior change theory (eg, the theory of planned behavior) can be effectively encapsulated into mobile phone-based health improvement programs, which is fostered by the ability of mobile phones and related devices to collect and transmit objective data in near real time and for health care or research professionals and clients to communicate easily. To describe SmartLoss, a semiautomated mHealth platform for weight loss. We developed and validated a dynamic energy balance model that determines the amount of weight an individual will lose over time if they are adherent to an energy intake prescription. This model was incorporated into computer code that enables adherence to a prescribed caloric prescription determined from the change in body weight of the individual. Data from the individual are then used to guide personalized recommendations regarding weight loss and behavior change via a semiautomated mHealth platform called SmartLoss, which consists of 2 elements: (1) a clinician dashboard and (2) a mobile phone app. SmartLoss includes and interfaces with a network-connected bathroom scale and a Bluetooth-connected accelerometer, which enables automated collection of client information (eg, body weight change and physical activity patterns), as well as the systematic delivery of preplanned health materials and automated feedback that is based on client data and is designed to foster prolonged adherence with body weight, diet, and exercise goals. The clinician dashboard allows for efficient remote monitoring of all clients simultaneously, which may further increase adherence, personalization of treatment, treatment fidelity, and efficacy. Evidence of the efficacy of the SmartLoss approach has been reported previously. 
The present report provides a thorough description of the SmartLoss Virtual Weight Management Suite, a professionally programmed platform that facilitates treatment fidelity and the ability to customize interventions and disseminate them widely. SmartLoss functions as a virtual weight management clinic that relies upon empirical weight loss research and behavioral theory to promote behavior change and weight loss.
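    The dynamic energy-balance idea behind such weight projections can be sketched generically: weight change follows the gap between prescribed intake and weight-dependent expenditure. The coefficients below (energy density of tissue, expenditure per kilogram) are illustrative assumptions, not the validated SmartLoss model.

```python
def project_weight(w0_kg, intake_kcal, days,
                   kcal_per_kg=7700.0, expend_per_kg=22.0):
    """Daily Euler steps: dW = (intake - expenditure(W)) / energy density."""
    w = w0_kg
    for _ in range(days):
        expenditure = expend_per_kg * w   # kcal/day, scales with body weight
        w += (intake_kcal - expenditure) / kcal_per_kg
    return w

# A 90 kg client prescribed 1800 kcal/day for 90 days.
w_final = project_weight(w0_kg=90.0, intake_kcal=1800.0, days=90)
```

Because expenditure shrinks as weight falls, the projection flattens toward an equilibrium weight; comparing measured weights against this curve is what lets a platform detect non-adherence to the intake prescription.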

  19. GPU-based High-Performance Computing for Radiation Therapy

    PubMed Central

    Jia, Xun; Ziegenhein, Peter; Jiang, Steve B.

    2014-01-01

    Recent developments in radiotherapy demand high computation power to solve challenging problems in a timely fashion in a clinical environment. The graphics processing unit (GPU), as an emerging high-performance computing platform, has been introduced to radiotherapy. It is particularly attractive due to its high computational power, small size, and low cost for facility deployment and maintenance. Over the past few years, GPU-based high-performance computing in radiotherapy has experienced rapid development. A tremendous number of studies have been conducted, in which large acceleration factors compared with the conventional CPU platform have been observed. In this article, we will first give a brief introduction to the GPU hardware structure and programming model. We will then review the current applications of GPUs in major imaging-related and therapy-related problems encountered in radiotherapy. A comparison of GPUs with other platforms will also be presented. PMID:24486639

  20. OpenACC performance for simulating 2D radial dambreak using FVM HLLE flux

    NASA Astrophysics Data System (ADS)

    Gunawan, P. H.; Pahlevi, M. R.

    2018-03-01

    The aim of this paper is to investigate the performance of the OpenACC platform for computing a 2D radial dambreak. Here, the shallow water equations are used to describe and simulate the 2D radial dambreak with a finite volume method (FVM) using the HLLE flux. OpenACC is a parallel computing platform based on GPU cores; in this research it is used to minimize the computational time of the numerical scheme. The results show that using OpenACC reduces the computational time. For the dry and wet radial dambreak simulations using 2048 grids, the parallel computational times are 575.984 s and 584.830 s, respectively. These results demonstrate the success of OpenACC when compared with the serial times of the dry and wet radial dambreak simulations, which are 28047.500 s and 29269.40 s, respectively.
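    The timings quoted in the abstract imply speedups of roughly 48-50x; the arithmetic, using only the reported numbers:

```python
# Reported wall-clock times (seconds) for the 2048-grid simulations.
serial = {"dry": 28047.500, "wet": 29269.40}   # single-core runs
parallel = {"dry": 575.984, "wet": 584.830}    # OpenACC on GPU

# Speedup = serial time / parallel time for each case.
speedup = {case: serial[case] / parallel[case] for case in serial}
```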

  1. MACBenAbim: A Multi-platform Mobile Application for searching keyterms in Computational Biology and Bioinformatics.

    PubMed

    Oluwagbemi, Olugbenga O; Adewumi, Adewole; Esuruoso, Abimbola

    2012-01-01

    Computational biology and bioinformatics are gradually gaining ground in Africa and other developing nations of the world. However, in these countries, some of the challenges of computational biology and bioinformatics education are inadequate infrastructure and a lack of readily available complementary and motivational tools to support learning as well as research. This has discouraged many promising undergraduates, postgraduates and researchers from aspiring to undertake future study in these fields. In this paper, we developed and described MACBenAbim (Multi-platform Mobile Application for Computational Biology and Bioinformatics), a flexible user-friendly tool to search for, define and describe the meanings of keyterms in computational biology and bioinformatics, thus expanding the frontiers of knowledge of the users. This tool also has the capability of visualizing results in a mobile multi-platform context. MACBenAbim is available from the authors for non-commercial purposes.

  2. 2000 Numerical Propulsion System Simulation Review

    NASA Technical Reports Server (NTRS)

    Lytle, John; Follen, Greg; Naiman, Cynthia; Veres, Joseph; Owen, Karl; Lopez, Isaac

    2001-01-01

    The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia, and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. Major accomplishments include the first formal release of the NPSS object-oriented architecture (NPSS Version 1) and the demonstration of a one order of magnitude reduction in computing cost-to-performance ratio using a cluster of personal computers. The paper also describes the future NPSS milestones, which include the simulation of space transportation propulsion systems in response to increased emphasis on safe, low cost access to space within NASA's Aerospace Technology Enterprise. 
In addition, the paper contains a summary of the feedback received from industry partners on the fiscal year 1999 effort and the actions taken over the past year to respond to that feedback. NPSS was supported in fiscal year 2000 by the High Performance Computing and Communications Program.

  3. 2001 Numerical Propulsion System Simulation Review

    NASA Technical Reports Server (NTRS)

    Lytle, John; Follen, Gregory; Naiman, Cynthia; Veres, Joseph; Owen, Karl; Lopez, Isaac

    2002-01-01

    The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. Major accomplishments include the first formal release of the NPSS object-oriented architecture (NPSS Version 1) and the demonstration of a one order of magnitude reduction in computing cost-to-performance ratio using a cluster of personal computers. The paper also describes the future NPSS milestones, which include the simulation of space transportation propulsion systems in response to increased emphasis on safe, low cost access to space within NASA's Aerospace Technology Enterprise. 
In addition, the paper contains a summary of the feedback received from industry partners on the fiscal year 2000 effort and the actions taken over the past year to respond to that feedback. NPSS was supported in fiscal year 2001 by the High Performance Computing and Communications Program.

  4. 46 CFR 163.002-13 - Construction.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... a rigid ladder or a lift platform on which a person being raised or lowered may stand. (b) Spreader. Each hoist must have a spreader or other device to prevent twisting of its ladder or lift platform. If.... The rigid ladder or lift platform on a pilot hoist and the ends of its spreader (if a spreader is...

  5. 46 CFR 163.002-13 - Construction.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... a rigid ladder or a lift platform on which a person being raised or lowered may stand. (b) Spreader. Each hoist must have a spreader or other device to prevent twisting of its ladder or lift platform. If.... The rigid ladder or lift platform on a pilot hoist and the ends of its spreader (if a spreader is...

  6. Workload Characterization of CFD Applications Using Partial Differential Equation Solvers

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Workload characterization is used for modeling and evaluation of computing systems at different levels of detail. We present workload characterization for a class of Computational Fluid Dynamics (CFD) applications that solve Partial Differential Equations (PDEs). This workload characterization focuses on three high-performance computing platforms: SGI Origin2000, IBM SP-2, and a cluster of Intel Pentium Pro-based PCs. We execute extensive measurement-based experiments on these platforms to gather statistics of system resource usage, which result in the workload characterization. Our workload characterization approach yields a coarse-grain resource utilization behavior that is being applied to performance modeling and evaluation of distributed high-performance metacomputing systems. In addition, this study enhances our understanding of the interactions between PDE solver workloads and high-performance computing platforms and is useful for tuning these applications.

  7. QuantifyMe: An Open-Source Automated Single-Case Experimental Design Platform.

    PubMed

    Taylor, Sara; Sano, Akane; Ferguson, Craig; Mohan, Akshay; Picard, Rosalind W

    2018-04-05

    Smartphones and wearable sensors have enabled unprecedented data collection, with many products now providing feedback to users about recommended step counts or sleep durations. However, these recommendations do not provide personalized insights that have been shown to be best suited for a specific individual. A scientific way to find individualized recommendations and causal links is to conduct experiments using single-case experimental design; however, properly designed single-case experiments are not easy to conduct on oneself. We designed, developed, and evaluated a novel platform, QuantifyMe, for novice self-experimenters to conduct proper-methodology single-case self-experiments in an automated and scientific manner using their smartphones. We provide software for the platform that we used (available for free on GitHub), which provides the methodological elements to run many kinds of customized studies. In this work, we evaluate its use with four different kinds of personalized investigations, examining how variables such as sleep duration and regularity, activity, and leisure time affect personal happiness, stress, productivity, and sleep efficiency. We conducted a six-week pilot study (N = 13) to evaluate QuantifyMe. We describe the lessons learned developing the platform and recommendations for its improvement, as well as its potential for enabling personalized insights to be scientifically evaluated in many individuals, reducing the high administrative cost for advancing human health and wellbeing.
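    A single-case experiment of the kind described assigns the participant different levels of one independent variable across successive phases and compares outcomes between phases. The toy analysis below illustrates that comparison; the variable, levels, and diary values are invented, not data from the QuantifyMe pilot.

```python
from statistics import mean

# week -> (prescribed sleep hours, mean self-reported happiness, 1-7 scale)
diary = [
    (6.0, 4.1), (6.0, 4.3),   # phase A: short sleep
    (8.0, 5.2), (8.0, 5.4),   # phase B: long sleep
    (6.0, 4.2), (8.0, 5.1),   # reversal weeks strengthen the inference
]

# Group outcomes by the level of the independent variable.
by_level = {}
for hours, happiness in diary:
    by_level.setdefault(hours, []).append(happiness)

# Phase-mean difference: the simplest single-case effect estimate.
effect = mean(by_level[8.0]) - mean(by_level[6.0])
```

Real single-case designs add safeguards (randomized phase order, enough observations per phase) that an automated platform can enforce for a novice self-experimenter.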

  8. QuantifyMe: An Open-Source Automated Single-Case Experimental Design Platform †

    PubMed Central

    Sano, Akane; Ferguson, Craig; Mohan, Akshay; Picard, Rosalind W.

    2018-01-01

    Smartphones and wearable sensors have enabled unprecedented data collection, with many products now providing feedback to users about recommended step counts or sleep durations. However, these recommendations do not provide personalized insights that have been shown to be best suited for a specific individual. A scientific way to find individualized recommendations and causal links is to conduct experiments using single-case experimental design; however, properly designed single-case experiments are not easy to conduct on oneself. We designed, developed, and evaluated a novel platform, QuantifyMe, for novice self-experimenters to conduct proper-methodology single-case self-experiments in an automated and scientific manner using their smartphones. We provide software for the platform that we used (available for free on GitHub), which provides the methodological elements to run many kinds of customized studies. In this work, we evaluate its use with four different kinds of personalized investigations, examining how variables such as sleep duration and regularity, activity, and leisure time affect personal happiness, stress, productivity, and sleep efficiency. We conducted a six-week pilot study (N = 13) to evaluate QuantifyMe. We describe the lessons learned developing the platform and recommendations for its improvement, as well as its potential for enabling personalized insights to be scientifically evaluated in many individuals, reducing the high administrative cost for advancing human health and wellbeing. PMID:29621133

  9. Future perspectives of personalized medicine in traditional Chinese medicine: a systems biology approach.

    PubMed

    Zhang, Aihua; Sun, Hui; Wang, Ping; Han, Ying; Wang, Xijun

    2012-01-01

    Deconstruction of molecular pathways and advances in enabling technology platforms have opened new horizons for disease management, exploring therapeutic solutions for each individual patient beyond the one-size-fits-all practice. Application of personalized medicine paradigms aims to achieve the right diagnosis and right treatment for the right patient at the right time at the right cost. With the potential to transform medical practice across global communities, personalized medicine is emerging as the flagship of modern medicine. In recent years, the health care paradigm has shifted from a focus on diseases to a major focus on personalized traditional Chinese medicine (TCM) with a holistic approach. TCM focuses on health maintenance, emphasizes enhancing the body's resistance to diseases, and shows particular advantages in early intervention, personalized and combination therapies, etc. Systems biology, a new science of the 21st century, is now practically available and resembles TCM in many aspects, such as study method and design; it provides key component technologies that serve as the major driving force for translating the personalized medicine revolution of TCM principles into practice, and will advance personalized therapy principles into health care management tools for individuals and populations. Such systems-approach concepts are transforming TCM principles into modern therapeutic approaches, enabling predictive and preventive medicine, and will lead to personalized medicine. To realise the full potential of personalized TCM, we describe the current status of principles and practice of TCM integrated with a systems biology platform. Some characteristic examples are presented to highlight the application of this platform to personalized TCM research and development, as well as some of the necessary milestones for moving TCM into mainstream health care. Copyright © 2011 Elsevier Ltd. All rights reserved.

  10. Demagnetization Analysis in Excel (DAIE) - An open source workbook in Excel for viewing and analyzing demagnetization data from paleomagnetic discrete samples and u-channels

    NASA Astrophysics Data System (ADS)

    Sagnotti, Leonardo

    2013-04-01

    Modern rock magnetometers and stepwise demagnetization procedures produce large datasets that require versatile and fast software for display and analysis. Various software packages for paleomagnetic analyses have recently been developed to overcome the problems linked to the limited capability and loss of operability of early codes written in obsolete computer languages and/or for platforms incompatible with modern 64-bit processors. The Demagnetization Analysis in Excel (DAIE) workbook is a new software tool designed to make the analysis of demagnetization data easy and accessible within an application (Microsoft Excel) that is widely diffused and available on both the Microsoft Windows and Mac OS X operating systems. The widespread diffusion of Excel should guarantee a long working life, since the compatibility and functionality of current Excel files will most likely be maintained as new processors and operating systems are developed. DAIE is designed for viewing and analyzing stepwise demagnetization data of both discrete and u-channel samples. It consists of a single file and has an open modular structure organized in 10 distinct worksheets. The standard demagnetization diagrams and various parameters in common use are shown on the same worksheet, together with selectable parameters and user choices. The remanence characteristic components may be computed by principal component analysis (PCA) on a selected interval of demagnetization steps. PCA data can be saved either sample by sample or automatically, by applying the selected choices to all the samples included in the file. The DAIE open structure allows easy personalization, development, and improvement. The workbook has the following features, which may be valuable for various users: - Operability on nearly all computers and platforms; - Easy input of demagnetization data by "copy and paste" from ASCII files; - Easy export of computed parameters and demagnetization plots; - Complete control of the whole workflow, and the possibility for any user to extend the workbook; - A modular structure with distinct worksheets for each type of analysis and plot, to make implementation and personalization easier; - Suitability for educational purposes, since all computations and analyses are easily traceable and accessible; - Automatic, fast analysis of large batches of demagnetization data, such as those measured on u-channel samples. The DAIE workbook and the "User manual" are available for download on a dedicated web site (http://roma2.rm.ingv.it/en/facilities/software/49/daie).
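
    The PCA component fit that DAIE performs can be illustrated with a short sketch. The following pure-Python example is not taken from DAIE (the data and function name are invented for illustration): it recovers the characteristic remanence direction of a synthetic demagnetization path as the dominant eigenvector of the 3×3 scatter matrix of the step vectors, found by power iteration.

```python
# Sketch of PCA component fitting on (x, y, z) demagnetization steps:
# the best-fit direction is the dominant eigenvector of the 3x3 scatter
# matrix, obtained here by power iteration. Pure Python; illustrative data.

def characteristic_direction(steps, iters=200):
    """Return a unit vector along the best-fit line through the step vectors."""
    n = len(steps)
    mean = [sum(s[i] for s in steps) / n for i in range(3)]
    centered = [[s[i] - mean[i] for i in range(3)] for s in steps]
    # 3x3 scatter (covariance) matrix of the centered step vectors
    cov = [[sum(c[i] * c[j] for c in centered) for j in range(3)] for i in range(3)]
    v = [1.0, 1.0, 1.0]
    for _ in range(iters):  # power iteration converges to the dominant eigenvector
        w = [sum(cov[i][j] * v[j] for j in range(3)) for i in range(3)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Synthetic demagnetization path decaying toward the origin along (0.6, 0.8, 0.01)
demag = [(0.6 * k, 0.8 * k, 0.01 * k) for k in (10, 8, 6, 4, 2, 1)]
direction = [round(x, 3) for x in characteristic_direction(demag)]
print(direction)  # → [0.6, 0.8, 0.01]
```

    On real data the fit would be restricted to the user-selected interval of demagnetization steps, exactly as DAIE's worksheet interface allows.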

  11. Web Program for Development of GUIs for Cluster Computers

    NASA Technical Reports Server (NTRS)

    Czikmantory, Akos; Cwik, Thomas; Klimeck, Gerhard; Hua, Hook; Oyafuso, Fabiano; Vinyard, Edward

    2003-01-01

    WIGLAF (a Web Interface Generator and Legacy Application Facade) is a computer program that provides a Web-based, distributed, graphical-user-interface (GUI) framework that can be adapted to any of a broad range of application programs, written in any programming language, that are executed remotely on any cluster computer system. WIGLAF enables the rapid development of a GUI for controlling and monitoring a specific application program running on the cluster and for transferring data to and from the application program. The only prerequisite for the execution of WIGLAF is a Web-browser program on a user's personal computer connected with the cluster via the Internet. WIGLAF has a client/server architecture: The server component is executed on the cluster system, where it controls the application program and serves data to the client component. The client component is an applet that runs in the Web browser. WIGLAF utilizes the Extensible Markup Language to hold all data associated with the application software, Java to enable platform-independent execution on the cluster system and the display of a GUI generator through the browser, and the Java Remote Method Invocation software package to provide simple, effective client/server networking.

  12. Fine-grained parallel RNAalifold algorithm for RNA secondary structure prediction on FPGA

    PubMed Central

    Xia, Fei; Dou, Yong; Zhou, Xingming; Yang, Xuejun; Xu, Jiaqing; Zhang, Yang

    2009-01-01

    Background In the field of RNA secondary structure prediction, the RNAalifold algorithm is one of the most popular methods using free-energy minimization. However, general-purpose computers, including parallel and multi-core computers, exhibit a parallel efficiency of no more than 50%. Field-Programmable Gate Array (FPGA) chips provide a new approach to accelerating RNAalifold by exploiting fine-grained custom design. Results RNAalifold shows complicated data dependences, in which the dependence distance is variable and the dependence direction spans two dimensions. We propose a systolic array structure including one master Processing Element (PE) and multiple slave PEs for fine-grained hardware implementation on FPGA. We exploit data-reuse schemes to reduce the need to load energy matrices from external memory. We also propose several methods to reduce the energy-table parameter size by 80%. Conclusion To our knowledge, our implementation with 16 PEs is the only FPGA accelerator implementing the complete RNAalifold algorithm. The experimental results show a factor of 12.2 speedup over the RNAalifold (Vienna Package 1.6.5) software for a group of aligned RNA sequences of 2981 residues, running on a Personal Computer (PC) platform with a Pentium 4 2.6 GHz CPU. PMID:19208138
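
    As a quick plausibility check of the figures reported above (illustrative arithmetic only, using the standard definition of parallel efficiency as speedup divided by the number of processing units):

```python
# Parallel efficiency implied by the reported 12.2x speedup on 16 PEs.
def parallel_efficiency(speedup, n_units):
    """Standard definition: efficiency = speedup / number of processing units."""
    return speedup / n_units

eff = parallel_efficiency(12.2, 16)
print(f"{eff:.1%}")  # → 76.2%, well above the ~50% the abstract cites for general-purpose parallel computers
```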

  13. An integrated compact airborne multispectral imaging system using embedded computer

    NASA Astrophysics Data System (ADS)

    Zhang, Yuedong; Wang, Li; Zhang, Xuguo

    2015-08-01

    An integrated compact airborne multispectral imaging system using an embedded-computer-based control system was developed for small-aircraft multispectral imaging applications. The multispectral imaging system integrates a CMOS camera, a filter wheel with eight filters, a two-axis stabilized platform, a miniature POS (position and orientation system), and an embedded computer. The embedded computer has excellent universality and expansibility, and has advantages in volume and weight for an airborne platform, so it can meet the requirements of the control system of the integrated airborne multispectral imaging system. The embedded computer controls camera parameter settings, the operation of the filter wheel and stabilized platform, and image and POS data acquisition, and it stores the images and data. The airborne multispectral imaging system can connect peripheral devices through the ports of the embedded computer, which simplifies system operation and management of the stored image data. This airborne multispectral imaging system has the advantages of small volume, multiple functions, and good expansibility. The imaging experiment results show that this system has potential for multispectral remote sensing in applications such as resource investigation and environmental monitoring.

  14. The Design of a High Performance Earth Imagery and Raster Data Management and Processing Platform

    NASA Astrophysics Data System (ADS)

    Xie, Qingyun

    2016-06-01

    This paper summarizes the general requirements and specific characteristics of both geospatial raster database management system and raster data processing platform from a domain-specific perspective as well as from a computing point of view. It also discusses the need of tight integration between the database system and the processing system. These requirements resulted in Oracle Spatial GeoRaster, a global scale and high performance earth imagery and raster data management and processing platform. The rationale, design, implementation, and benefits of Oracle Spatial GeoRaster are described. Basically, as a database management system, GeoRaster defines an integrated raster data model, supports image compression, data manipulation, general and spatial indices, content and context based queries and updates, versioning, concurrency, security, replication, standby, backup and recovery, multitenancy, and ETL. It provides high scalability using computer and storage clustering. As a raster data processing platform, GeoRaster provides basic operations, image processing, raster analytics, and data distribution featuring high performance computing (HPC). Specifically, HPC features include locality computing, concurrent processing, parallel processing, and in-memory computing. In addition, the APIs and the plug-in architecture are discussed.

  15. Privacy and information security risks in a technology platform for home-based chronic disease rehabilitation and education

    PubMed Central

    2013-01-01

    Background Privacy and information security are important for all healthcare services, including home-based services. We have designed and implemented a prototype technology platform for providing home-based healthcare services. It supports a personal electronic health diary and enables secure and reliable communication and interaction with peers and healthcare personnel. The platform runs on a small computer with a dedicated remote control. It is connected to the patient’s TV and to a broadband Internet connection. The platform has been tested with home-based rehabilitation and education programs for chronic obstructive pulmonary disease and diabetes. As part of our work, a risk assessment of privacy and security aspects has been performed, to reveal actual risks and to ensure adequate information security in this technical platform. Methods Risk assessment was performed in an iterative manner during the development process. Thus, security solutions have been incorporated into the design from an early stage instead of being included as an add-on to a nearly completed system. We have adapted existing risk management methods to our own environment, thus creating our own method. Our method conforms to ISO’s standard for information security risk management. Results A total of approximately 50 threats and possible unwanted incidents were identified and analysed. Among the threats to the four information-security aspects (confidentiality, integrity, availability, and quality), confidentiality threats were identified as the most serious, with one threat assigned an unacceptable High risk level. This is because health-related personal information is regarded as sensitive. Availability threats were analysed as low risk, since the aim of the home programmes is to provide education and rehabilitation services, not acute care or continuous health monitoring. Conclusions Most of the identified threats are applicable to healthcare services intended for patients or citizens in their own homes. Confidentiality risks in the home differ from those in a more controlled environment such as a hospital, and electronic equipment located in private homes and communicating via the Internet is more exposed to unauthorised access. By implementing the proposed measures, it has been possible to design a home-based service which ensures the necessary level of information security and privacy. PMID:23937965
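
    The kind of qualitative analysis described above is commonly organized as a likelihood-consequence matrix. The sketch below is a hypothetical illustration in that spirit, not the paper's actual method or data; the threat names, ratings, and thresholds are all invented:

```python
# Hypothetical ISO-style qualitative risk matrix: a risk level is derived
# from 1-3 likelihood and consequence ratings for each identified threat.
# Thresholds and example threats are invented for illustration.

def risk_level(likelihood, consequence):
    """Map 1-3 likelihood and consequence ratings to a qualitative risk level."""
    score = likelihood * consequence  # combined score in 1..9
    if score <= 2:
        return "Low"
    if score <= 4:
        return "Medium"
    return "High"

# Invented threats with (likelihood, consequence) ratings
threats = {
    "disclosure of personal health diary (confidentiality)": (2, 3),
    "home platform unavailable during a session (availability)": (2, 1),
}
for name, (likelihood, consequence) in threats.items():
    print(f"{name}: {risk_level(likelihood, consequence)}")
```

    With these invented ratings the confidentiality threat lands at High and the availability threat at Low, mirroring the pattern the authors report.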

  16. Flexible and Stretchable Physical Sensor Integrated Platforms for Wearable Human-Activity Monitoring and Personal Healthcare.

    PubMed

    Trung, Tran Quang; Lee, Nae-Eung

    2016-06-01

    Flexible and stretchable physical sensors that can measure and quantify electrical signals generated by human activities are attracting a great deal of attention as they have unique characteristics, such as ultrathinness, low modulus, light weight, high flexibility, and stretchability. These flexible and stretchable physical sensors conformally attached on the surface of organs or skin can provide a new opportunity for human-activity monitoring and personal healthcare. Consequently, in recent years there has been considerable research effort devoted to the development of flexible and stretchable physical sensors to fulfill the requirements of future technology, and much progress has been achieved. Here, the most recent developments of flexible and stretchable physical sensors are described, including temperature, pressure, and strain sensors, and flexible and stretchable sensor-integrated platforms. The latest successful examples of flexible and stretchable physical sensors for the detection of temperature, pressure, and strain, as well as their novel structures, technological innovations, and challenges, are reviewed first. In the next section, recent progress regarding sensor-integrated wearable platforms is overviewed in detail. Some of the latest achievements regarding self-powered sensor-integrated wearable platform technologies are also reviewed. Further research direction and challenges are also proposed to develop a fully sensor-integrated wearable platform for monitoring human activity and personal healthcare in the near future. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Personality Traits, Motivations, and Emotional Consequences of Social Media Usage.

    PubMed

    Lin, Jhih-Syuan; Lee, Yen-I; Jin, Yan; Gilbreath, Bob

    2017-10-01

    This study explores social media users' personality traits and motivations for using two different social media platforms, Facebook and Pinterest, as well as how the varied uses affect users' negative emotional experiences. The findings suggest that the intensity of social media usage is positively related to negative emotions. For Facebook users, socialization, entertainment, and information-seeking motivations significantly influence their platform use intensity and, subsequently, lead to negative emotions. Self-status seeking also has a direct effect on Facebook users' negative emotions. For Pinterest users, socialization is not a significant motivation for usage of that platform. However, entertainment, information seeking, and self-status seeking significantly predict their platform use intensity, which subsequently leads to negative emotions. Similarly, all four motivations for Facebook and Pinterest use are influenced by users' personality traits: extraversion and openness. Yet openness has a greater impact on using Pinterest than Facebook in terms of fulfilling socialization needs. Neuroticism has a positive impact on socialization and information-seeking motives for use of both platforms, while conscientiousness and agreeableness have a negative influence on fulfilling self-status-seeking needs. In addition, agreeable social networking site users are less likely to use Facebook than Pinterest for fulfilling self-status-related gratifications, while they are likely to use Pinterest instead of Facebook for entertainment and information needs. Implications of the findings and suggestions for future research are discussed.

  18. A Platform for Designing Genome-Based Personalized Immunotherapy or Vaccine against Cancer

    PubMed Central

    Gupta, Sudheer; Chaudhary, Kumardeep; Dhanda, Sandeep Kumar; Kumar, Rahul; Kumar, Shailesh; Sehgal, Manika; Nagpal, Gandharva

    2016-01-01

    Due to advances in sequencing technology, the genomes of thousands of cancer tissues or cell lines have been sequenced. Identification of cancer-specific epitopes or neoepitopes from cancer genomes is one of the major challenges in the field of immunotherapy and vaccine development. This paper describes Cancertope, a platform developed for designing genome-based immunotherapy or vaccines against a cancer cell. Broadly, the integrated resources on this platform are organized into three sections. The first section describes a cancer-specific database of neoepitopes generated from the genomes of 905 cancer cell lines. This database harbors a wide range of epitopes (e.g., B-cell, CD8+ T-cell, HLA class I, HLA class II) against 60 cancer-specific vaccine antigens. The second section describes a partially personalized module developed for predicting potential neoepitopes against a user-specified cancer genome. Finally, we describe a fully personalized module developed for identification of neoepitopes from the genomes of cancerous and healthy cells of a cancer patient. In order to assist the scientific community, a wide range of tools is incorporated into this platform, including screening of epitopes against the human reference proteome (http://www.imtech.res.in/raghava/cancertope/). PMID:27832200

  19. The Generation Challenge Programme Platform: Semantic Standards and Workbench for Crop Science

    PubMed Central

    Bruskiewich, Richard; Senger, Martin; Davenport, Guy; Ruiz, Manuel; Rouard, Mathieu; Hazekamp, Tom; Takeya, Masaru; Doi, Koji; Satoh, Kouji; Costa, Marcos; Simon, Reinhard; Balaji, Jayashree; Akintunde, Akinnola; Mauleon, Ramil; Wanchana, Samart; Shah, Trushar; Anacleto, Mylah; Portugal, Arllet; Ulat, Victor Jun; Thongjuea, Supat; Braak, Kyle; Ritter, Sebastian; Dereeper, Alexis; Skofic, Milko; Rojas, Edwin; Martins, Natalia; Pappas, Georgios; Alamban, Ryan; Almodiel, Roque; Barboza, Lord Hendrix; Detras, Jeffrey; Manansala, Kevin; Mendoza, Michael Jonathan; Morales, Jeffrey; Peralta, Barry; Valerio, Rowena; Zhang, Yi; Gregorio, Sergio; Hermocilla, Joseph; Echavez, Michael; Yap, Jan Michael; Farmer, Andrew; Schiltz, Gary; Lee, Jennifer; Casstevens, Terry; Jaiswal, Pankaj; Meintjes, Ayton; Wilkinson, Mark; Good, Benjamin; Wagner, James; Morris, Jane; Marshall, David; Collins, Anthony; Kikuchi, Shoshi; Metz, Thomas; McLaren, Graham; van Hintum, Theo

    2008-01-01

    The Generation Challenge programme (GCP) is a global crop research consortium directed toward crop improvement through the application of comparative biology and genetic resources characterization to plant breeding. A key consortium research activity is the development of a GCP crop bioinformatics platform to support GCP research. This platform includes the following: (i) shared, public platform-independent domain models, ontology, and data formats to enable interoperability of data and analysis flows within the platform; (ii) web service and registry technologies to identify, share, and integrate information across diverse, globally dispersed data sources, as well as to access high-performance computational (HPC) facilities for computationally intensive, high-throughput analyses of project data; (iii) platform-specific middleware reference implementations of the domain model integrating a suite of public (largely open-access/-source) databases and software tools into a workbench to facilitate biodiversity analysis, comparative analysis of crop genomic data, and plant breeding decision making. PMID:18483570

  20. Social Computing as Next-Gen Learning Paradigm: A Platform and Applications

    NASA Astrophysics Data System (ADS)

    Margherita, Alessandro; Taurino, Cesare; Del Vecchio, Pasquale

    As a field at the intersection of computer science and human behavior, social computing can contribute significantly to the endeavor of innovating how individuals and groups interact for learning and working purposes. In particular, the generation of Internet applications tagged as Web 2.0 provides an opportunity to create new “environments” where people can exchange knowledge and experience, create new knowledge, and learn together. This chapter illustrates the design and application of a prototype platform that embeds tools such as blogs, wikis, folksonomies, and RSS feeds in a single web-based system. The platform has been developed to support a case-based and project-driven learning strategy for the development of business and technology management competencies in undergraduate and graduate education programs. A set of illustrative scenarios is described to show how a learning community can be promoted, created, and sustained through the technological platform.

  1. A generic, cost-effective, and scalable cell lineage analysis platform

    PubMed Central

    Biezuner, Tamir; Spiro, Adam; Raz, Ofir; Amir, Shiran; Milo, Lilach; Adar, Rivka; Chapal-Ilani, Noa; Berman, Veronika; Fried, Yael; Ainbinder, Elena; Cohen, Galit; Barr, Haim M.; Halaban, Ruth; Shapiro, Ehud

    2016-01-01

    Advances in single-cell genomics enable commensurate improvements in methods for uncovering lineage relations among individual cells. Current sequencing-based methods for cell lineage analysis depend on low-resolution bulk analysis or rely on extensive single-cell sequencing, which is not scalable and could be biased by functional dependencies. Here we show an integrated biochemical-computational platform for generic single-cell lineage analysis that is retrospective, cost-effective, and scalable. It consists of a biochemical-computational pipeline that inputs individual cells, produces targeted single-cell sequencing data, and uses it to generate a lineage tree of the input cells. We validated the platform by applying it to cells sampled from an ex vivo grown tree and analyzed its feasibility landscape by computer simulations. We conclude that the platform may serve as a generic tool for lineage analysis and thus pave the way toward large-scale human cell lineage discovery. PMID:27558250

  2. An interactive parallel programming environment applied in atmospheric science

    NASA Technical Reports Server (NTRS)

    vonLaszewski, G.

    1996-01-01

    This article introduces an interactive parallel programming environment (IPPE) that simplifies the generation and execution of parallel programs. One of the tasks of the environment is to generate message-passing parallel programs for homogeneous and heterogeneous computing platforms. The parallel programs are represented by using visual objects. This is accomplished with the help of a graphical programming editor that is implemented in Java and enables portability to a wide variety of computer platforms. In contrast to other graphical programming systems, reusable parts of the programs can be stored in a program library to support rapid prototyping. In addition, runtime performance data on different computing platforms is collected in a database. A selection process determines dynamically the software and the hardware platform to be used to solve the problem in minimal wall-clock time. The environment is currently being tested on a Grand Challenge problem, the NASA four-dimensional data assimilation system.

  3. Scalable and massively parallel Monte Carlo photon transport simulations for heterogeneous computing platforms

    NASA Astrophysics Data System (ADS)

    Yu, Leiming; Nina-Paravecino, Fanny; Kaeli, David; Fang, Qianqian

    2018-01-01

    We present a highly scalable Monte Carlo (MC) three-dimensional photon transport simulation platform designed for heterogeneous computing systems. Through the development of a massively parallel MC algorithm using the Open Computing Language framework, this research extends our existing graphics processing unit (GPU)-accelerated MC technique to a highly scalable vendor-independent heterogeneous computing environment, achieving significantly improved performance and software portability. A number of parallel computing techniques are investigated to achieve portable performance over a wide range of computing hardware. Furthermore, multiple thread-level and device-level load-balancing strategies are developed to obtain efficient simulations using multiple central processing units and GPUs.
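
    The kernel that such simulations parallelize is the repeated sampling of exponential free paths with progressive absorption of the photon packet's weight. The following single-threaded pure-Python sketch shows that kernel only; the coefficients are illustrative, and it is not the authors' OpenCL implementation:

```python
# Minimal Monte Carlo photon transport kernel: sample an exponential free
# path, advance the photon, and attenuate its weight by the scattering
# albedo until it falls below a cutoff. Illustrative coefficients; the
# platform above runs this kernel massively in parallel on GPUs via OpenCL.
import random

def simulate_photon(mu_a=0.01, mu_s=1.0, weight_cutoff=1e-4, rng=None):
    """Return the total path length (mm) travelled before weight termination."""
    rng = rng or random.Random(42)
    mu_t = mu_a + mu_s          # total interaction coefficient (1/mm)
    albedo = mu_s / mu_t        # fraction of weight surviving each interaction
    weight, path = 1.0, 0.0
    while weight > weight_cutoff:
        path += rng.expovariate(mu_t)   # exponential free-path sample
        weight *= albedo                # deposit the absorbed fraction
    return path

# Average over many photons with distinct seeds (scattering directions are
# omitted here; a full simulation would also sample a phase function)
paths = [simulate_photon(rng=random.Random(seed)) for seed in range(100)]
print(f"mean total path: {sum(paths) / len(paths):.0f} mm")
```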

  4. Cellular computational platform and neurally inspired elements thereof

    DOEpatents

    Okandan, Murat

    2016-11-22

    A cellular computational platform is disclosed that includes a multiplicity of functionally identical, repeating computational hardware units that are interconnected electrically and optically. Each computational hardware unit includes a reprogrammable local memory and has interconnections to other such units that have reconfigurable weights. Each computational hardware unit is configured to transmit signals into the network for broadcast in a protocol-less manner to other such units in the network, and to respond to protocol-less broadcast messages that it receives from the network. Each computational hardware unit is further configured to reprogram the local memory in response to incoming electrical and/or optical signals.

  5. 78 FR 14914 - Addition of Certain Persons to the Entity List

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-08

    .... This person will be listed on the Entity List under Germany, Russia, and Taiwan. DATES: Effective date... different countries for the person being added to the Entity List, consist of single entries in Russia... one person being listed under three separate entries, T-Platforms, a company headquartered in Russia...

  6. Human genomics projects and precision medicine.

    PubMed

    Carrasco-Ramiro, F; Peiró-Pastor, R; Aguado, B

    2017-09-01

    The completion of the Human Genome Project (HGP) in 2001 opened the floodgates to a deeper understanding of medicine. Dozens of HGP-like projects, involving from a few tens to several million genomes, are currently in progress; they vary from having specialized goals to taking a more general approach. However, data generation, storage, management, and analysis in public and private cloud computing platforms have raised concerns about privacy and security. The knowledge gained from further research has changed the field of genomics and is now slowly permeating clinical medicine. The new precision (personalized) medicine, in which genome sequencing and data analysis are essential components, allows tailored diagnosis and treatment according to information from the patient's own genome and specific environmental factors. P4 (predictive, preventive, personalized, and participatory) medicine is introducing new concepts, challenges, and opportunities. This review summarizes current sequencing technologies, concentrates on ongoing human genomics projects, and provides some examples in which precision medicine has already demonstrated clinical impact in diagnosis and/or treatment.

  7. The Effect of In-Service Training of Computer Science Teachers on Scratch Programming Language Skills Using an Electronic Learning Platform on Programming Skills and the Attitudes towards Teaching Programming

    ERIC Educational Resources Information Center

    Alkaria, Ahmed; Alhassan, Riyadh

    2017-01-01

    This study was conducted to examine the effect of in-service training of computer science teachers in Scratch language using an electronic learning platform on acquiring programming skills and attitudes towards teaching programming. The sample of this study consisted of 40 middle school computer science teachers. They were assigned into two…

  8. Worldwide Telescope as an earth and planetary science educational platform

    NASA Astrophysics Data System (ADS)

    Fatland, D. R.; Rush, K.; van Ingen, C.; Wong, C.; Fay, J.; Xu, Y.; Fay, D.

    2009-12-01

    Worldwide Telescope (WWT), available at no cost from Microsoft Research as both Windows desktop and web browser applications, enables personal computers to function as virtual telescopes for viewing the earth, the solar system, and the cosmos across many wavelengths. Bringing together imagery from ground- and space-based telescopes as well as photography from Mars rovers and Apollo astronauts, WWT is designed to work both as a research tool and as a platform for educational exploration. Central to the latter purpose is the Tour authoring facility, which enables a student or educator to create narrative stories with dynamic perspective, voice-over narration, background sound, and superimposed content. We describe here the application of recent developments in WWT, particularly the 2009 updates, to planetary science education, with particular emphasis on WWT earth models. Two core themes informing this development are the notion of enabling social networking through WWT Communities and that of including the earth as part of the bigger picture, in effect swinging the telescope around from the deep sky to look back at our observatory.

  9. Design and Evaluation of a Scalable and Reconfigurable Multi-Platform System for Acoustic Imaging

    PubMed Central

    Izquierdo, Alberto; Villacorta, Juan José; del Val Puente, Lara; Suárez, Luis

    2016-01-01

    This paper proposes a scalable and multi-platform framework for signal acquisition and processing, which allows for the generation of acoustic images using planar arrays of MEMS (Micro-Electro-Mechanical Systems) microphones with low development and deployment costs. Acoustic characterization of MEMS sensors was performed, and the beam pattern of a module, based on an 8 × 8 planar array and of several clusters of modules, was obtained. A flexible framework, formed by an FPGA, an embedded processor, a computer desktop, and a graphic processing unit, was defined. The processing times of the algorithms used to obtain the acoustic images, including signal processing and wideband beamforming via FFT, were evaluated in each subsystem of the framework. Based on this analysis, three frameworks are proposed, defined by the specific subsystems used and the algorithms shared. Finally, a set of acoustic images obtained from sound reflected from a person are presented as a case study in the field of biometric identification. These results reveal the feasibility of the proposed system. PMID:27727174
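
    The beamforming step such arrays perform can be sketched in a few lines. The example below is a simplified, hypothetical illustration, not the paper's system: it uses a 1-D linear array with integer-sample time-domain delay-and-sum, whereas the paper uses an 8 × 8 planar array and frequency-domain (FFT) wideband beamforming. All parameters are invented.

```python
# Time-domain delay-and-sum beamforming on a uniform linear microphone
# array: delay each channel to align a plane wavefront from a candidate
# angle, sum, and pick the steering angle with maximum output power.
import math

C = 343.0        # speed of sound (m/s)
FS = 48000       # sampling rate (Hz)
D = 0.02         # microphone spacing (m), below half a wavelength at 8 kHz
N_MICS = 8
F_TONE = 8000.0  # test-tone frequency (Hz)

def steering_delay(mic, angle_deg):
    """Plane-wave delay (s) at microphone `mic` relative to mic 0."""
    return mic * D * math.sin(math.radians(angle_deg)) / C

def delay_and_sum_power(signals, angle_deg):
    """Steer the array toward angle_deg and return the output signal power."""
    n = len(signals[0])
    out = [0.0] * n
    for m, sig in enumerate(signals):
        shift = round(steering_delay(m, angle_deg) * FS)  # integer-sample delay
        for i in range(n):
            j = i + shift       # advance each channel to align the wavefront
            if 0 <= j < n:
                out[i] += sig[j]
    return sum(x * x for x in out)

def tone(mic, n=512):
    """Sampled 8 kHz plane wave arriving from 30 degrees at microphone `mic`."""
    tau = steering_delay(mic, 30.0)
    return [math.sin(2 * math.pi * F_TONE * (i / FS - tau)) for i in range(n)]

signals = [tone(m) for m in range(N_MICS)]
powers = {a: delay_and_sum_power(signals, a) for a in range(-90, 91, 10)}
best = max(powers, key=powers.get)
print("steering angle with maximum power:", best)  # → 30
```

    Scanning the steering angle over a 2-D grid instead of a 1-D arc is what turns this power map into an acoustic image.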

  10. Public health practice course using Google Plus.

    PubMed

    Wu, Ting-Ting; Sung, Tien-Wen

    2014-03-01

    In recent years, mobile device-assisted clinical education has become popular among nursing school students. The introduction of mobile devices saves manpower and reduces errors while enhancing nursing students' professional knowledge and skills. To respond to the demands of various learning strategies and to maintain existing systems of education, the concept of Cloud Learning is gradually being introduced into instructional environments. Cloud computing facilitates learning that is personalized, diverse, and virtual. This study involved assessing the advantages of mobile devices and Cloud Learning in a public health practice course, in which Google+ was used as the learning platform, integrating various application tools. Users could save and access data by using any wireless Internet device. The platform was student centered and based on resource sharing and collaborative learning. With the assistance of highly flexible and convenient technology, certain obstacles in traditional practice training can be resolved. Our findings showed that the students who adopted Google+ learned more effectively than those who were limited to traditional learning systems. Most students and the nurse educator expressed a positive attitude toward, and were satisfied with, the innovative learning method.

  11. Detectable Warning Surfaces : Color, Contrast, and Reflectance

    DOT National Transportation Integrated Search

    1994-09-01

    The visual contrast of ten detectable warning surface/platform pairs was measured on an interior platform illuminated at 20 foot-candles, as recommended by the Americans with Disabilities Act Accessibility Guidelines (ADAAG) A4.429.2, by 24 persons h...

  12. Environmental Detectives--The Development of an Augmented Reality Platform for Environmental Simulations

    ERIC Educational Resources Information Center

    Klopfer, Eric; Squire, Kurt

    2008-01-01

    The form factors of handheld computers make them increasingly popular among K-12 educators. Although some compelling examples of educational software for handhelds exist, we believe that the potential of this platform is just being discovered. This paper reviews innovative applications of mobile computing for both education and entertainment…

  13. Beam Dynamics Simulation Platform and Studies of Beam Breakup in Dielectric Wakefield Structures

    NASA Astrophysics Data System (ADS)

    Schoessow, P.; Kanareykin, A.; Jing, C.; Kustov, A.; Altmark, A.; Gai, W.

    2010-11-01

    A particle-Green's function beam dynamics code (BBU-3000) to study beam breakup effects is incorporated into a parallel computing framework based on the Boinc software environment, and supports both task farming on a heterogeneous cluster and local grid computing. User access to the platform is through a web browser.

  14. Assessing the Decision Process towards Bring Your Own Device

    ERIC Educational Resources Information Center

    Koester, Richard F.

    2017-01-01

    Information technology continues to evolve to the point where mobile technologies--such as smart phones, tablets, and ultra-mobile computers--have the embedded flexibility and power to be a ubiquitous platform that fulfills all of a user's computing needs. Mobile technology users view these platforms as adaptable enough to be the single solution for…

  15. Multivariate Gradient Analysis for Evaluating and Visualizing a Learning System Platform for Computer Programming

    ERIC Educational Resources Information Center

    Mather, Richard

    2015-01-01

    This paper explores the application of canonical gradient analysis to evaluate and visualize student performance and acceptance of a learning system platform. The subject of evaluation is a first year BSc module for computer programming. This uses "Ceebot," an animated and immersive game-like development environment. Multivariate…

  16. A Parallel Point Matching Algorithm for Landmark Based Image Registration Using Multicore Platform

    PubMed Central

    Yang, Lin; Gong, Leiguang; Zhang, Hong; Nosher, John L.; Foran, David J.

    2013-01-01

    Point matching is crucial for many computer vision applications. Establishing the correspondence between a large number of data points is a computationally intensive process. Some point matching related applications, such as medical image registration, require real time or near real time performance if applied to critical clinical applications like image assisted surgery. In this paper, we report a new multicore platform based parallel algorithm for fast point matching in the context of landmark based medical image registration. We introduce a non-regular data partition algorithm which utilizes K-means clustering to group the landmarks based on the number of available processing cores, optimizing memory usage and data transfer. We have tested our method using the IBM Cell Broadband Engine (Cell/B.E.) platform. The results demonstrated a significant speedup over the sequential implementation. The proposed data partition and parallelization algorithm, though tested only on one multicore platform, is generic by design. Therefore the parallel algorithm can be extended to other computing platforms, as well as other point matching related applications. PMID:24308014
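    The K-means-based landmark partition described above can be sketched as follows. This is an illustrative re-implementation in plain NumPy, not the authors' Cell/B.E. code; using the cluster count as a stand-in for the number of processing cores is an assumption drawn from the abstract.

    ```python
    import numpy as np

    def kmeans_partition(points, n_cores, n_iter=20, seed=0):
        """Group 2-D landmark points into one cluster per available core
        using a plain K-means loop (illustrative sketch only)."""
        rng = np.random.default_rng(seed)
        centers = points[rng.choice(len(points), n_cores, replace=False)]
        for _ in range(n_iter):
            # assign each landmark to its nearest center
            d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
            labels = d.argmin(axis=1)
            # move each center to the mean of its members
            for k in range(n_cores):
                if (labels == k).any():
                    centers[k] = points[labels == k].mean(axis=0)
        return [points[labels == k] for k in range(n_cores)]

    landmarks = np.random.default_rng(1).random((1000, 2))
    chunks = kmeans_partition(landmarks, n_cores=4)
    # every landmark lands in exactly one per-core chunk
    assert sum(len(c) for c in chunks) == len(landmarks)
    ```

    Because clusters are spatially compact, each core works on neighboring landmarks, which is what makes this "non-regular" partition friendlier to memory and data transfer than a fixed-size split.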

  17. Portability and Cross-Platform Performance of an MPI-Based Parallel Polygon Renderer

    NASA Technical Reports Server (NTRS)

    Crockett, Thomas W.

    1999-01-01

    Visualizing the results of computations performed on large-scale parallel computers is a challenging problem, due to the size of the datasets involved. One approach is to perform the visualization and graphics operations in place, exploiting the available parallelism to obtain the necessary rendering performance. Over the past several years, we have been developing algorithms and software to support visualization applications on NASA's parallel supercomputers. Our results have been incorporated into a parallel polygon rendering system called PGL. PGL was initially developed on tightly-coupled distributed-memory message-passing systems, including Intel's iPSC/860 and Paragon, and IBM's SP2. Over the past year, we have ported it to a variety of additional platforms, including the HP Exemplar, SGI Origin2000, Cray T3E, and clusters of Sun workstations. In implementing PGL, we have had two primary goals: cross-platform portability and high performance. Portability is important because (1) our manpower resources are limited, making it difficult to develop and maintain multiple versions of the code, and (2) NASA's complement of parallel computing platforms is diverse and subject to frequent change. Performance is important in delivering adequate rendering rates for complex scenes and ensuring that parallel computing resources are used effectively. Unfortunately, these two goals are often at odds. In this paper we report on our experiences with portability and performance of the PGL polygon renderer across a range of parallel computing platforms.

  18. Genomics Virtual Laboratory: A Practical Bioinformatics Workbench for the Cloud

    PubMed Central

    Afgan, Enis; Sloggett, Clare; Goonasekera, Nuwan; Makunin, Igor; Benson, Derek; Crowe, Mark; Gladman, Simon; Kowsar, Yousef; Pheasant, Michael; Horst, Ron; Lonie, Andrew

    2015-01-01

    Background Analyzing high throughput genomics data is a complex and compute intensive task, generally requiring numerous software tools and large reference data sets, tied together in successive stages of data transformation and visualisation. A computational platform enabling best practice genomics analysis ideally meets a number of requirements, including: a wide range of analysis and visualisation tools, closely linked to large user and reference data sets; workflow platform(s) enabling accessible, reproducible, portable analyses, through a flexible set of interfaces; highly available, scalable computational resources; and flexibility and versatility in the use of these resources to meet demands and expertise of a variety of users. Access to an appropriate computational platform can be a significant barrier to researchers, as establishing such a platform requires a large upfront investment in hardware, experience, and expertise. Results We designed and implemented the Genomics Virtual Laboratory (GVL) as a middleware layer of machine images, cloud management tools, and online services that enable researchers to build arbitrarily sized compute clusters on demand, pre-populated with fully configured bioinformatics tools, reference datasets and workflow and visualisation options. The platform is flexible in that users can conduct analyses through web-based (Galaxy, RStudio, IPython Notebook) or command-line interfaces, and add/remove compute nodes and data resources as required. Best-practice tutorials and protocols provide a path from introductory training to practice. The GVL is available on the OpenStack-based Australian Research Cloud (http://nectar.org.au) and the Amazon Web Services cloud. The principles, implementation and build process are designed to be cloud-agnostic. Conclusions This paper provides a blueprint for the design and implementation of a cloud-based Genomics Virtual Laboratory. 
We discuss scope, design considerations and technical and logistical constraints, and explore the value added to the research community through the suite of services and resources provided by our implementation. PMID:26501966

  19. Performance of MCNP4A on seven computing platforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendricks, J.S.; Brockhoff, R.C.

    1994-12-31

    The performance of seven computer platforms has been evaluated with the MCNP4A Monte Carlo radiation transport code. For the first time we report timing results using MCNP4A and its new test set and libraries. Comparisons are made on platforms not available to us in previous MCNP timing studies. By using MCNP4A and its 25-problem test set, a widely used and readily available physics production code is used; the timing comparison is not limited to a single "typical" problem, demonstrating the problem dependence of timing results; the results are reproducible at the more than 100 installations around the world using MCNP; comparison of the performance of other computer platforms to the ones tested in this study is possible because we present raw data rather than normalized results; and a measure of the increase in performance of computer hardware and software over the past two years is possible. The computer platforms reported are the Cray-YMP 8/64, IBM RS/6000-560, Sun Sparc10, Sun Sparc2, HP/9000-735, 4-processor 100 MHz Silicon Graphics ONYX, and Gateway 2000 model 4DX2-66V PC. In 1991 a timing study of MCNP4, the predecessor to MCNP4A, was conducted using ENDF/B-V cross-section libraries, which are export protected. The new study is based upon the new MCNP 25-problem test set, which utilizes internationally available data. MCNP4A, its test problems and the test data library are available from the Radiation Shielding and Information Center in Oak Ridge, Tennessee, or from the NEA Data Bank in Saclay, France. Anyone with the same workstation and compiler can get the same test problem sets, the same library files, and the same MCNP4A code from RSIC or NEA and replicate our results. And, because we report raw data, comparison of the performance of other computer platforms and compilers can be made.
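    The abstract's point about publishing raw rather than normalized timings can be illustrated with a short calculation; the run times below are hypothetical placeholders, not figures from the study.

    ```python
    import math

    # Hypothetical raw run times (seconds) for three test problems on two
    # platforms; the study's actual numbers are in its own tables.
    times = {
        "platform_a": [12.0, 45.0, 30.0],
        "platform_b": [18.0, 60.0, 51.0],
    }

    def relative_speed(slower, faster):
        """Per-problem ratios plus their geometric mean. Keeping the raw
        per-problem data visible preserves the problem dependence that a
        single normalized score would hide."""
        ratios = [s / f for s, f in zip(times[slower], times[faster])]
        gmean = math.exp(sum(math.log(r) for r in ratios) / len(ratios))
        return ratios, gmean

    ratios, gmean = relative_speed("platform_b", "platform_a")
    # ratios vary from problem to problem (1.33x to 1.7x here), which is
    # exactly why raw data is more informative than one averaged number.
    ```

    Anyone with their own raw timings can extend the comparison to a new platform simply by adding a row, which is the reproducibility argument the authors make.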

  20. Particle Identification on an FPGA Accelerated Compute Platform for the LHCb Upgrade

    NASA Astrophysics Data System (ADS)

    Färber, Christian; Schwemmer, Rainer; Machen, Jonathan; Neufeld, Niko

    2017-07-01

    The current LHCb readout system will be upgraded in 2018 to a “triggerless” readout of the entire detector at the Large Hadron Collider collision rate of 40 MHz. The corresponding bandwidth from the detector down to the foreseen dedicated computing farm (event filter farm), which acts as the trigger, has to be increased by a factor of almost 100, from currently 500 Gb/s up to 40 Tb/s. The event filter farm will preanalyze the data and select events on an event-by-event basis. This will reduce the bandwidth to a manageable size so that the interesting physics data can be written to tape. The design of such a system is a challenging task, which is why different new technologies are being considered and investigated for the different parts of the system. For use in the event building farm or in the event filter farm (trigger), an experimental field-programmable gate array (FPGA) accelerated computing platform is considered and tested. FPGA compute accelerators are increasingly used in standard servers, for example for Microsoft Bing search or Baidu search. The platform we use hosts a general Intel CPU and a high-performance FPGA linked via the high-speed Intel QuickPath Interconnect, with an accelerator implemented on the FPGA. It is very likely that these platforms, which are built, in general, for high-performance computing, are also very interesting for the high-energy physics community. First, the performance results of smaller test cases are presented. Afterward, a part of the existing LHCb RICH particle identification is ported to the experimental FPGA accelerated platform and tested. We compare the performance of the LHCb RICH particle identification running on a normal CPU with the performance of the same algorithm running on the Xeon-FPGA compute accelerator platform.

  1. Genomics Virtual Laboratory: A Practical Bioinformatics Workbench for the Cloud.

    PubMed

    Afgan, Enis; Sloggett, Clare; Goonasekera, Nuwan; Makunin, Igor; Benson, Derek; Crowe, Mark; Gladman, Simon; Kowsar, Yousef; Pheasant, Michael; Horst, Ron; Lonie, Andrew

    2015-01-01

    Analyzing high throughput genomics data is a complex and compute intensive task, generally requiring numerous software tools and large reference data sets, tied together in successive stages of data transformation and visualisation. A computational platform enabling best practice genomics analysis ideally meets a number of requirements, including: a wide range of analysis and visualisation tools, closely linked to large user and reference data sets; workflow platform(s) enabling accessible, reproducible, portable analyses, through a flexible set of interfaces; highly available, scalable computational resources; and flexibility and versatility in the use of these resources to meet demands and expertise of a variety of users. Access to an appropriate computational platform can be a significant barrier to researchers, as establishing such a platform requires a large upfront investment in hardware, experience, and expertise. We designed and implemented the Genomics Virtual Laboratory (GVL) as a middleware layer of machine images, cloud management tools, and online services that enable researchers to build arbitrarily sized compute clusters on demand, pre-populated with fully configured bioinformatics tools, reference datasets and workflow and visualisation options. The platform is flexible in that users can conduct analyses through web-based (Galaxy, RStudio, IPython Notebook) or command-line interfaces, and add/remove compute nodes and data resources as required. Best-practice tutorials and protocols provide a path from introductory training to practice. The GVL is available on the OpenStack-based Australian Research Cloud (http://nectar.org.au) and the Amazon Web Services cloud. The principles, implementation and build process are designed to be cloud-agnostic. This paper provides a blueprint for the design and implementation of a cloud-based Genomics Virtual Laboratory. 
We discuss scope, design considerations and technical and logistical constraints, and explore the value added to the research community through the suite of services and resources provided by our implementation.

  2. Multipath for Agricultural and Rural Information Services in China

    NASA Astrophysics Data System (ADS)

    Ge, Ningning; Zang, Zhiyuan; Gao, Lingwang; Shi, Qiang; Li, Jie; Xing, Chunlin; Shen, Zuorui

    The internet cannot yet provide adequate information services for farmers in rural regions of China, because most farmers there can hardly access it. However, the wide coverage of the mobile signal, telephone line, and television networks offers a way to solve the problem. The integrated pest management platform for northern fruit trees was developed on an integration technology that combines the internet, mobile and fixed-line telephone networks, and the television network, to provide integrated pest management (IPM) information services to farmers in rural regions in e-mail, telephone-voice, short message, voice mail, videoconference or other formats, delivered to users' telephones, cell phones, personal computers, personal digital assistants (PDAs), televisions, etc. The architecture and functions of the system are introduced in this paper. The system can manage field monitoring data on agricultural pests; handle enquiries, providing the necessary information to farmers through its interactive voice response (IVR) component with experts on-line or off-line; and issue early warnings about fruit tree pests when necessary, according to analysis of the monitoring data, in a variety of ways including SMS, fax, voice and intersystem e-mail. The system provides a platform and a new pattern for agricultural technology extension with a high coverage rate in rural regions, and it can help solve the 'last kilometer' problem of agricultural information service in China. The effectiveness of the system was verified.

  3. Semi-physical Simulation Platform of a Parafoil Nonlinear Dynamic System

    NASA Astrophysics Data System (ADS)

    Gao, Hai-Tao; Yang, Sheng-Bo; Zhu, Er-Lin; Sun, Qing-Lin; Chen, Zeng-Qiang; Kang, Xiao-Feng

    2013-11-01

    Focusing on the problems in simulation and experimentation on a parafoil nonlinear dynamic system, such as limited methods, high cost and low efficiency, we present a semi-physical simulation platform. It is designed by connecting physical components to a computer, and remedies the defect that a pure computer simulation is entirely divorced from the real environment. The main components of the platform and its functions, as well as the simulation flow, are introduced. Its feasibility and validity are verified through a simulation experiment. The experimental results show that the platform is significant for improving the quality of the parafoil fixed-point airdrop system, shortening the development cycle and saving cost.

  4. Load monitoring of aerospace structures utilizing micro-electro-mechanical systems for static and quasi-static loading conditions

    NASA Astrophysics Data System (ADS)

    Martinez, M.; Rocha, B.; Li, M.; Shi, G.; Beltempo, A.; Rutledge, R.; Yanishevsky, M.

    2012-11-01

    The National Research Council Canada (NRC) has worked on the development of structural health monitoring (SHM) test platforms for assessing the performance of sensor systems for load monitoring applications. The first SHM platform consists of a 5.5 m cantilever aluminum beam that provides an optimal scenario for evaluating the ability of a load monitoring system to measure bending, torsion and shear loads. The second SHM platform contains an added level of structural complexity, consisting of aluminum skins with bonded/riveted stringers, typical of an aircraft lower wing structure. These two load monitoring platforms are well characterized and documented, providing loading conditions similar to those encountered during service. In this study, a micro-electro-mechanical system (MEMS) for acquiring data from triads of gyroscopes, accelerometers and magnetometers is described. The system was used to compute changes in angles at discrete stations along the platforms. The angles obtained from the MEMS were used to compute a second-, third- or fourth-order polynomial surface from which displacements at every point could be computed. The use of a new Kalman filter was evaluated for angle estimation, from which displacements in the structure were computed. The outputs of the newly developed algorithms were then compared to the displacements obtained from the linear variable displacement transducers connected to the platforms. The displacement curves were subsequently post-processed either analytically, or with the help of a finite element model of the structure, to estimate strains and loads. The estimated strains were compared with baseline strain gauge instrumentation installed on the platforms. This new approach for load monitoring was able to provide accurate estimates of applied strains and shear loads.
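    The angle-to-displacement step described above (fit a polynomial to slope measurements at discrete stations, then integrate from the clamped root) can be sketched as follows. The beam shape, station positions, and polynomial degree are illustrative assumptions, not the NRC platform's parameters, and the sketch uses a simple 1-D small-angle model rather than the full surface fit.

    ```python
    import numpy as np

    # Hypothetical cantilever: stations x (m) where MEMS triads report the
    # local slope dy/dx (rad); true shape y = c*x^2 chosen for illustration.
    c = 1e-4
    x = np.linspace(0.0, 5.5, 8)      # sensor stations along a 5.5 m beam
    slopes = 2.0 * c * x              # measured angles (small-angle approx.)

    # Fit a polynomial to the slopes, then integrate once with y(0) = 0
    # at the clamped root to recover displacements at every station.
    slope_poly = np.polynomial.Polynomial.fit(x, slopes, deg=3)
    disp_poly = slope_poly.integ(lbnd=0.0)
    displacements = disp_poly(x)
    ```

    With noisy angle data the fit degree trades smoothing against bias, which is presumably why the study evaluated second- through fourth-order surfaces.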

  5. Commercial Smartphone-Based Devices and Smart Applications for Personalized Healthcare Monitoring and Management

    PubMed Central

    Vashist, Sandeep Kumar; Schneider, E. Marion; Luong, John H.T.

    2014-01-01

    Smartphone-based devices and applications (SBDAs) with cost effectiveness and remote sensing are the most promising and effective means of delivering mobile healthcare (mHealthcare). Several SBDAs have been commercialized for the personalized monitoring and/or management of basic physiological parameters, such as blood pressure, weight, body analysis, pulse rate, electrocardiograph, blood glucose, blood oxygen saturation, sleeping and physical activity. With advances in Bluetooth technology, software, cloud computing and remote sensing, SBDAs provide real-time on-site analysis and telemedicine opportunities in remote areas. This scenario is of utmost importance for developing countries, where the number of smartphone users is about 70% of 6.8 billion cell phone subscribers worldwide with limited access to basic healthcare service. The technology platform facilitates patient-doctor communication and the patients to effectively manage and keep track of their medical conditions. Besides tremendous healthcare cost savings, SBDAs are very critical for the monitoring and effective management of emerging epidemics and food contamination outbreaks. The next decade will witness pioneering advances and increasing applications of SBDAs in this exponentially growing field of mHealthcare. This article provides a critical review of commercial SBDAs that are being widely used for personalized healthcare monitoring and management. PMID:26852680

  6. Commercial Smartphone-Based Devices and Smart Applications for Personalized Healthcare Monitoring and Management.

    PubMed

    Vashist, Sandeep Kumar; Schneider, E Marion; Luong, John H T

    2014-08-18

    Smartphone-based devices and applications (SBDAs) with cost effectiveness and remote sensing are the most promising and effective means of delivering mobile healthcare (mHealthcare). Several SBDAs have been commercialized for the personalized monitoring and/or management of basic physiological parameters, such as blood pressure, weight, body analysis, pulse rate, electrocardiograph, blood glucose, blood oxygen saturation, sleeping and physical activity. With advances in Bluetooth technology, software, cloud computing and remote sensing, SBDAs provide real-time on-site analysis and telemedicine opportunities in remote areas. This scenario is of utmost importance for developing countries, where the number of smartphone users is about 70% of 6.8 billion cell phone subscribers worldwide with limited access to basic healthcare service. The technology platform facilitates patient-doctor communication and the patients to effectively manage and keep track of their medical conditions. Besides tremendous healthcare cost savings, SBDAs are very critical for the monitoring and effective management of emerging epidemics and food contamination outbreaks. The next decade will witness pioneering advances and increasing applications of SBDAs in this exponentially growing field of mHealthcare. This article provides a critical review of commercial SBDAs that are being widely used for personalized healthcare monitoring and management.

  7. Personalized Services Oriented towards Commercial Establishments

    NASA Astrophysics Data System (ADS)

    Marin Díaz, David; Rico Zuluaga, Alejandro; Carrillo-Ramos, Angela

    This paper presents the platform PlaSerEs, whose main objective is to provide information about the products and/or services offered by commercial establishments to their clients in a personalized way. The platform is structured in four layers: i) the adaptation layer, composed of four modules: context, access device, user, and wireless connection, the latter being one of the main contributions of this work; ii) the general services layer; iii) the personalized services layer; and iv) the application layer. To validate and evaluate how PlaSerEs works, we developed a functional prototype for a restaurant.

  8. Consumer and provider responses to a computerized version of the Illness Management and Recovery Program.

    PubMed

    Wright-Berryman, Jennifer L; Salyers, Michelle P; O'Halloran, James P; Kemp, Aaron S; Mueser, Kim T; Diazoni, Amanda J

    2013-12-01

    To explore mental health consumer and provider responses to a computerized version of the Illness Management and Recovery (IMR) program. Semistructured interviews were conducted to gather data from 6 providers and 12 consumers who participated in a computerized prototype of the IMR program. An inductive, consensus-based approach was used to analyze the interview responses. Qualitative analysis revealed that consumers perceived various personal benefits and ease of use afforded by the new technology platform. Consumers also highly valued provider assistance and offered several suggestions to improve the program. The largest perceived barriers to future implementation were lack of computer skills and access to computers. Similarly, IMR providers commented on its ease and convenience, and the reduction of time-intensive material preparation. Providers also expressed that the use of technology creates more options for the consumer to access treatment. The technology was acceptable, easy to use, and well-liked by consumers and providers. Clinician assistance with technology was viewed as helpful to get clients started with the program, as lack of computer skills and access to computers was a concern. Access to materials between sessions appears to be desired; however, given perceived barriers of computer skills and computer access, additional supports may be needed for consumers to achieve the full benefits of a computerized version of IMR. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  9. MEGA X: Molecular Evolutionary Genetics Analysis across Computing Platforms.

    PubMed

    Kumar, Sudhir; Stecher, Glen; Li, Michael; Knyaz, Christina; Tamura, Koichiro

    2018-06-01

    The Molecular Evolutionary Genetics Analysis (Mega) software implements many analytical methods and tools for phylogenomics and phylomedicine. Here, we report a transformation of Mega to enable cross-platform use on Microsoft Windows and Linux operating systems. Mega X does not require virtualization or emulation software and provides a uniform user experience across platforms. Mega X has additionally been upgraded to use multiple computing cores for many molecular evolutionary analyses. Mega X is available in two interfaces (graphical and command line) and can be downloaded from www.megasoftware.net free of charge.

  10. Wireless sensor platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joshi, Pooran C.; Killough, Stephen M.; Kuruganti, Phani Teja

    A wireless sensor platform and methods of manufacture are provided. The platform involves providing a plurality of wireless sensors, where each of the sensors is fabricated on flexible substrates using printing techniques and low temperature curing. Each of the sensors can include planar sensor elements and planar antennas defined using the printing and curing. Further, each of the sensors can include a communications system configured to encode the data from the sensors into a spread spectrum code sequence that is transmitted to one or more central computers for use in monitoring an area associated with the sensors.
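    The spread-spectrum encoding mentioned above can be illustrated with a minimal direct-sequence sketch; the pseudo-noise (PN) chip sequence and the XOR scheme here are generic textbook choices, not the platform's actual code design.

    ```python
    import numpy as np

    # Illustrative direct-sequence spreading: each data bit is XORed with a
    # fixed PN chip sequence, so each bit becomes len(PN) chips on the air.
    PN = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)  # 8 chips/bit

    def spread(bits):
        """Encode each bit into len(PN) chips (bit XOR chip)."""
        bits = np.asarray(bits, dtype=np.uint8)
        return np.bitwise_xor(bits[:, None], PN[None, :]).ravel()

    def despread(chips):
        """Decode by XORing chips with the PN sequence and majority-voting
        the len(PN) chips belonging to each bit."""
        c = chips.reshape(-1, len(PN))
        return (np.bitwise_xor(c, PN[None, :]).mean(axis=1) > 0.5).astype(np.uint8)

    data = np.array([1, 0, 1, 1], dtype=np.uint8)
    assert np.array_equal(despread(spread(data)), data)
    ```

    The majority vote in `despread` is what gives spread-spectrum links their noise tolerance: a few flipped chips per bit still decode correctly.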

  11. Melanie II--a third-generation software package for analysis of two-dimensional electrophoresis images: I. Features and user interface.

    PubMed

    Appel, R D; Palagi, P M; Walther, D; Vargas, J R; Sanchez, J C; Ravier, F; Pasquali, C; Hochstrasser, D F

    1997-12-01

    Although two-dimensional electrophoresis (2-DE) computer analysis software packages have existed ever since 2-DE technology was developed, it is only now that the hardware and software technology allows large-scale studies to be performed on low-cost personal computers or workstations, and that setting up a 2-DE computer analysis system in a small laboratory is no longer considered a luxury. After a first attempt in the seventies and early eighties to develop 2-DE analysis software systems on hardware that had poor or even no graphical capabilities, followed in the late eighties by a wave of innovative software developments that were possible thanks to new graphical interface standards such as XWindows, a third generation of 2-DE analysis software packages has now come to maturity. It can be run on a variety of low-cost, general-purpose personal computers, thus making the purchase of a 2-DE analysis system easily attainable for even the smallest laboratory that is involved in proteome research. Melanie II 2-D PAGE, developed at the University Hospital of Geneva, is such a third-generation software system for 2-DE analysis. Based on unique image processing algorithms, this user-friendly object-oriented software package runs on multiple platforms, including Unix, MS-Windows 95 and NT, and Power Macintosh. It provides efficient spot detection and quantitation, state-of-the-art image comparison, statistical data analysis facilities, and is Internet-ready. Linked to proteome databases such as those available on the World Wide Web, it represents a valuable tool for the "Virtual Lab" of the post-genome era.

  12. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application

    DOE PAGES

    Hanwell, Marcus D.; de Jong, Wibe A.; Harris, Christopher J.

    2017-10-30

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction - connecting distributed teams to this software platform on their own terms. The platform was developed openly, and all source code hosted on the GitHub platform with automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web - going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances, that can be customized to the sites/research performed. Data gets stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and send data to a JavaScript-based web client.
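    A JSON record for a computational-chemistry result, in the spirit of the JSON-over-XML approach described above, might look something like the following sketch. The field names and values are hypothetical illustrations, not the project's actual schema.

    ```python
    import json

    # Hypothetical record: molecule geometry plus calculation metadata.
    # JSON round-trips identically in Python, JavaScript, and most other
    # languages, which is the cross-language argument made in the abstract.
    record = {
        "molecule": {
            "formula": "H2O",
            "atoms": [
                {"element": "O", "coords": [0.000, 0.000, 0.117]},
                {"element": "H", "coords": [0.000, 0.757, -0.469]},
                {"element": "H", "coords": [0.000, -0.757, -0.469]},
            ],
        },
        "calculation": {"code": "NWChem", "theory": "DFT", "basis": "6-31G*"},
        "properties": {"totalEnergy": -76.4, "units": "Hartree"},
    }

    text = json.dumps(record, indent=2)   # serialized form sent to a web client
    roundtrip = json.loads(text)          # parses back to the same structure
    ```

    Nested objects like these are also straightforward to index and query in document stores, which plain XML attribute soup makes harder.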

  13. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanwell, Marcus D.; de Jong, Wibe A.; Harris, Christopher J.

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction - connecting distributed teams to this software platform on their own terms. The platform was developed openly, and all source code hosted on the GitHub platform with automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web - going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances, that can be customized to the sites/research performed. Data gets stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and send data to a JavaScript-based web client.

  14. The Relationship between Chief Information Officer Transformational Leadership and Computing Platform Operating Systems

    ERIC Educational Resources Information Center

    Anderson, George W.

    2010-01-01

    The purpose of this study was to relate the strength of Chief Information Officer (CIO) transformational leadership behaviors to 1 of 5 computing platform operating systems (OSs) that may be selected for a firm's Enterprise Resource Planning (ERP) business system. Research shows executive leader behaviors may promote innovation through the use of…

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, C.; Yu, G.; Wang, K.

    The physical design of new-concept reactors, which have complex structures, various materials, and wide neutron energy spectra, has greatly raised the requirements for calculation methods and the corresponding computing hardware. Along with widely used parallel algorithms, heterogeneous platform architectures have been introduced into numerical computation in reactor physics. Because of its natural parallel characteristics, the CPU-FPGA architecture is often used to accelerate numerical computation. This paper studies the application and features of this kind of heterogeneous platform in the numerical calculation of reactor physics through practical examples. A neutron diffusion module designed for the CPU-FPGA architecture achieves an 11.2x speedup, demonstrating that it is feasible to apply this kind of heterogeneous platform to reactor physics. (authors)

  16. Modular HPC I/O characterization with Darshan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snyder, Shane; Carns, Philip; Harms, Kevin

    2016-11-13

    Contemporary high-performance computing (HPC) applications encompass a broad range of distinct I/O strategies and are often executed on a number of different compute platforms in their lifetime. These large-scale HPC platforms employ increasingly complex I/O subsystems to provide a suitable level of I/O performance to applications. Tuning I/O workloads for such a system is nontrivial, and the results generally are not portable to other HPC systems. I/O profiling tools can help to address this challenge, but most existing tools instrument only specific components within the I/O subsystem, providing a limited perspective on I/O performance. The increasing diversity of scientific applications and computing platforms calls for greater flexibility and scope in I/O characterization.

  17. Research on private cloud computing based on analysis on typical opensource platform: a case study with Eucalyptus and Wavemaker

    NASA Astrophysics Data System (ADS)

    Yu, Xiaoyuan; Yuan, Jian; Chen, Shi

    2013-03-01

    Cloud computing is one of the most popular topics in the IT industry and is being adopted by many companies. It has four deployment models: public cloud, community cloud, hybrid cloud, and private cloud. Unlike the others, a private cloud can be implemented within a private network, delivering some of the benefits of cloud computing while avoiding some of its pitfalls. This paper compares typical open-source platforms with which a private cloud can be implemented. After this comparison, we choose Eucalyptus and Wavemaker for a case study on the private cloud. We also estimate the performance of cloud platform services and develop prototype software delivered as cloud services.

  18. Using the High-Level Based Program Interface to Facilitate the Large Scale Scientific Computing

    PubMed Central

    Shang, Yizi; Shang, Ling; Gao, Chuanchang; Lu, Guiming; Ye, Yuntao; Jia, Dongdong

    2014-01-01

    This paper extends research on facilitating large-scale scientific computing on grid and desktop grid platforms. The related issues include the programming method, the overhead of middleware based on a high-level program interface, and anticipatory data migration. The block-based Gauss-Jordan algorithm, as a real example of large-scale scientific computing, is used to evaluate these issues. The results show that the high-level program interface makes complex scientific applications on large-scale platforms easier to build, though a little overhead is unavoidable. Also, the anticipatory data migration mechanism can improve the efficiency of the platform when processing big-data-based scientific applications. PMID:24574931
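
    The benchmark named above, Gauss-Jordan elimination, can be sketched in its plain single-machine form; the blocked grid variant evaluated in the paper distributes tiles of the augmented matrix across workers, but the per-tile arithmetic is the same:

```python
def gauss_jordan_inverse(a):
    """Invert a square matrix via Gauss-Jordan elimination with partial pivoting."""
    n = len(a)
    # Augment with the identity matrix.
    aug = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
           for i, row in enumerate(a)]
    for col in range(n):
        # Partial pivoting: pick the remaining row with the largest pivot.
        pivot = max(range(col, n), key=lambda r: abs(aug[r][col]))
        if abs(aug[pivot][col]) < 1e-12:
            raise ValueError("matrix is singular")
        aug[col], aug[pivot] = aug[pivot], aug[col]
        # Normalize the pivot row.
        p = aug[col][col]
        aug[col] = [x / p for x in aug[col]]
        # Eliminate the pivot column from every other row.
        for r in range(n):
            if r != col:
                f = aug[r][col]
                aug[r] = [x - f * y for x, y in zip(aug[r], aug[col])]
    # The right half of the augmented matrix is now the inverse.
    return [row[n:] for row in aug]
```

    In the blocked version, each elimination step becomes a sequence of tile-level multiplications and updates, which is what makes the algorithm a natural stress test for grid middleware.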

  19. Design Strategy for a Formally Verified Reliable Computing Platform

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Caldwell, James L.; DiVito, Ben L.

    1991-01-01

    This paper presents a high-level design for a reliable computing platform for real-time control applications. The design tradeoffs and analyses related to the development of a formally verified reliable computing platform are discussed. The design strategy advocated in this paper requires the use of techniques that can be completely characterized mathematically as opposed to more powerful or more flexible algorithms whose performance properties can only be analyzed by simulation and testing. The need for accurate reliability models that can be related to the behavior models is also stressed. Tradeoffs between reliability and voting complexity are explored. In particular, the transient recovery properties of the system are found to be fundamental to both the reliability analysis as well as the "correctness" models.

  20. Analysis of outcomes in radiation oncology: An integrated computational platform

    PubMed Central

    Liu, Dezhi; Ajlouni, Munther; Jin, Jian-Yue; Ryu, Samuel; Siddiqui, Farzan; Patel, Anushka; Movsas, Benjamin; Chetty, Indrin J.

    2009-01-01

    Radiotherapy research and outcome analyses are essential for evaluating new methods of radiation delivery and for assessing the benefits of a given technology on locoregional control and overall survival. In this article, a computational platform is presented to facilitate radiotherapy research and outcome studies in radiation oncology. This computational platform consists of (1) an infrastructural database that stores patient diagnosis, IMRT treatment details, and follow-up information, (2) an interface tool that is used to import and export IMRT plans in DICOM RT and AAPM/RTOG formats from a wide range of planning systems to facilitate reproducible research, (3) a graphical data analysis and programming tool that visualizes all aspects of an IMRT plan including dose, contour, and image data to aid the analysis of treatment plans, and (4) a software package that calculates radiobiological models to evaluate IMRT treatment plans. Given the limited number of general-purpose computational environments for radiotherapy research and outcome studies, this computational platform represents a powerful and convenient tool that is well suited for analyzing dose distributions biologically and correlating outcomes with the delivered radiation dose distributions and other patient-related clinical factors. In addition, the database is web-based and accessible by multiple users, facilitating its convenient application and use. PMID:19544785
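
    As an illustration of the kind of radiobiological model such a platform evaluates (the abstract does not specify which models are implemented), the generalized equivalent uniform dose (gEUD) reduces a dose-volume histogram to a single biologically weighted dose:

```python
def geud(doses, volumes, a):
    """Generalized equivalent uniform dose from a differential DVH.

    doses:   dose (Gy) in each histogram bin
    volumes: fractional volume in each bin (must sum to 1)
    a:       tissue-specific parameter (large positive a emphasizes hot spots)
    gEUD = (sum_i v_i * d_i**a) ** (1/a)
    """
    assert abs(sum(volumes) - 1.0) < 1e-9, "fractional volumes must sum to 1"
    return sum(v * d ** a for d, v in zip(doses, volumes)) ** (1.0 / a)
```

    With a = 1 the gEUD reduces to the mean dose, which is a convenient sanity check for any implementation.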

  1. Superconducting Optoelectronic Circuits for Neuromorphic Computing

    NASA Astrophysics Data System (ADS)

    Shainline, Jeffrey M.; Buckley, Sonia M.; Mirin, Richard P.; Nam, Sae Woo

    2017-03-01

    Neural networks have proven effective for solving many difficult computational problems, yet implementing complex neural networks in software is computationally expensive. To explore the limits of information processing, it is necessary to implement new hardware platforms with large numbers of neurons, each with a large number of connections to other neurons. Here we propose a hybrid semiconductor-superconductor hardware platform for the implementation of neural networks and large-scale neuromorphic computing. The platform combines semiconducting few-photon light-emitting diodes with superconducting-nanowire single-photon detectors to behave as spiking neurons. These processing units are connected via a network of optical waveguides, and variable weights of connection can be implemented using several approaches. The use of light as a signaling mechanism overcomes fanout and parasitic constraints on electrical signals while simultaneously introducing physical degrees of freedom which can be employed for computation. The use of supercurrents achieves the low power density (1 mW/cm² at a 20-MHz firing rate) necessary to scale to systems with enormous entropy. Estimates comparing the proposed hardware platform to a human brain show that with the same number of neurons (10¹¹) and 700 independent connections per neuron, the hardware presented here may achieve an order of magnitude improvement in synaptic events per second per watt.
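
    The spiking-neuron behavior described above can be illustrated with a generic leaky integrate-and-fire model; this is a textbook sketch, not the superconducting-optoelectronic dynamics of the proposed hardware:

```python
def lif_spikes(current, dt=1e-3, tau=0.02, v_th=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron: integrate input, spike at threshold.

    current: input drive per time step; returns the indices of spike times.
    Membrane update: dv/dt = -v/tau + i_in, Euler-integrated with step dt.
    """
    v, spikes = 0.0, []
    for step, i_in in enumerate(current):
        v += dt * (-v / tau + i_in)
        if v >= v_th:
            spikes.append(step)  # emit a spike and reset the membrane
            v = v_reset
    return spikes
```

    In the proposed platform the "membrane" integration is performed by superconducting electronics and the "spike" is a few-photon optical pulse, but the computational abstraction is the same.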

  2. TDat: An Efficient Platform for Processing Petabyte-Scale Whole-Brain Volumetric Images.

    PubMed

    Li, Yuxin; Gong, Hui; Yang, Xiaoquan; Yuan, Jing; Jiang, Tao; Li, Xiangning; Sun, Qingtao; Zhu, Dan; Wang, Zhenyu; Luo, Qingming; Li, Anan

    2017-01-01

    Three-dimensional imaging of whole mammalian brains at single-neuron resolution has generated terabyte (TB)- and even petabyte (PB)-sized datasets. Due to their size, processing these massive image datasets can be hindered by the computer hardware and software typically found in biological laboratories. To fill this gap, we have developed an efficient platform named TDat, which adopts a novel data reformatting strategy that reads cuboid data and employs parallel computing. In data reformatting, TDat is more efficient than other available software. In data accessing, we adopted parallelization to fully exploit the data-transmission capability of computers. We applied TDat to large-volume rigid registration and to neuron tracing in whole-brain data at single-neuron resolution, which had not been demonstrated in previous studies. We also showed its compatibility with various computing platforms, image processing software, and imaging systems.
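
    The cuboid-reading strategy amounts to tiling the volume into blocks that fit in memory and can be processed in parallel; a minimal index-generation sketch (the block shape here is a free parameter, not TDat's actual choice):

```python
import itertools

def cuboid_blocks(shape, block):
    """Yield (start, stop) index ranges tiling a 3-D volume into cuboids.

    shape: full volume dimensions, e.g. (z, y, x) voxel counts
    block: cuboid dimensions; edge blocks are clipped to the volume bounds
    """
    axes = [range(0, s, b) for s, b in zip(shape, block)]
    for origin in itertools.product(*axes):
        yield tuple((o, min(o + b, s)) for o, s, b in zip(origin, shape, block))
```

    Each yielded range can be handed to a separate worker, which is where the parallel speedup in data reformatting comes from.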

  3. Personalized use of ICT--from telemonitoring to ambient assisted living.

    PubMed

    Norgall, Thomas; Wichert, Reiner

    2013-01-01

    Individual availability of information and communications technology (ICT) has enabled "Personal Health" applications such as continuous, ubiquitous telemonitoring of vital signs. The concept of Ambient Assisted Living (AAL) goes beyond health and care applications, utilizing home automation technology to support individuals with specific needs, particularly enabling the elderly to live in their accustomed homes as long as possible. These users usually suffer from more than one disease and need compensation for several impairments. Most current AAL projects and products, however, provide isolated solutions addressing only a small selection of these user needs. For comprehensive, dynamic system adaptation to changing user needs, an open platform supporting interoperable components is required. While the industry-driven Continua Health Alliance developed a corresponding Personal Health ecosystem, the ongoing European project universAAL aims at a universal platform for both AAL and Personal Health applications.

  4. PERSONAL COMPUTERS AND ENVIRONMENTAL ENGINEERING

    EPA Science Inventory

    This article discusses how personal computers can be applied to environmental engineering. After explaining some of the differences between mainframe and personal computers, we will review the development of personal computers and describe the areas of data management, interactive...

  5. Interfacing HTCondor-CE with OpenStack

    NASA Astrophysics Data System (ADS)

    Bockelman, B.; Caballero Bejar, J.; Hover, J.

    2017-10-01

    Over the past few years, Grid Computing technologies have reached a high level of maturity. One key aspect of this success has been the development and adoption of newer Compute Elements to interface the external Grid users with local batch systems. These new Compute Elements allow for better handling of jobs requirements and a more precise management of diverse local resources. However, despite this level of maturity, the Grid Computing world is lacking diversity in local execution platforms. As Grid Computing technologies have historically been driven by the needs of the High Energy Physics community, most resource providers run the platform (operating system version and architecture) that best suits the needs of their particular users. In parallel, the development of virtualization and cloud technologies has accelerated recently, making available a variety of solutions, both commercial and academic, proprietary and open source. Virtualization facilitates performing computational tasks on platforms not available at most computing sites. This work attempts to join the technologies, allowing users to interact with computing sites through one of the standard Computing Elements, HTCondor-CE, but running their jobs within VMs on a local cloud platform, OpenStack, when needed. The system will re-route, in a transparent way, end user jobs into dynamically-launched VM worker nodes when they have requirements that cannot be satisfied by the static local batch system nodes. Also, once the automated mechanisms are in place, it becomes straightforward to allow an end user to invoke a custom Virtual Machine at the site. This will allow cloud resources to be used without requiring the user to establish a separate account. Both scenarios are described in this work.

  6. A miniaturized NQR spectrometer for a multi-channel NQR-based detection device.

    PubMed

    Beguš, Samo; Jazbinšek, Vojko; Pirnat, Janez; Trontelj, Zvonko

    2014-10-01

    A low-frequency (0.5-5 MHz) battery-operated sensitive pulsed NQR spectrometer with a transmitter power up to 5 W and a total mass of about 3 kg, aimed at detecting ¹⁴N NQR signals, predominantly of illicit materials, was designed and assembled. This spectrometer uses a standard software defined radio (SDR) platform for the data acquisition unit. Signal processing is done with the LabView Virtual Instrument on a personal computer. We successfully tested the spectrometer by measuring ¹⁴N NQR signals from aminotetrazole monohydrate (ATMH), potassium nitrate (PN), paracetamol (PCM) and trinitrotoluene (TNT). Such a spectrometer is a feasible component of a portable single- or multichannel ¹⁴N NQR based detection device. Copyright © 2014 Elsevier Inc. All rights reserved.

  7. 48 CFR 1552.239-103 - Acquisition of Energy Star Compliant Microcomputers, Including Personal Computers, Monitors and...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Compliant Microcomputers, Including Personal Computers, Monitors and Printers. 1552.239-103 Section 1552.239... Star Compliant Microcomputers, Including Personal Computers, Monitors and Printers. As prescribed in... Personal Computers, Monitors, and Printers (APR 1996) (a) The Contractor shall provide computer products...

  8. 48 CFR 1552.239-103 - Acquisition of Energy Star Compliant Microcomputers, Including Personal Computers, Monitors and...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Compliant Microcomputers, Including Personal Computers, Monitors and Printers. 1552.239-103 Section 1552.239... Star Compliant Microcomputers, Including Personal Computers, Monitors and Printers. As prescribed in... Personal Computers, Monitors, and Printers (APR 1996) (a) The Contractor shall provide computer products...

  9. 48 CFR 1552.239-103 - Acquisition of Energy Star Compliant Microcomputers, Including Personal Computers, Monitors and...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Compliant Microcomputers, Including Personal Computers, Monitors and Printers. 1552.239-103 Section 1552.239... Star Compliant Microcomputers, Including Personal Computers, Monitors and Printers. As prescribed in... Personal Computers, Monitors, and Printers (APR 1996) (a) The Contractor shall provide computer products...

  10. Cloud Computing with iPlant Atmosphere.

    PubMed

    McKay, Sheldon J; Skidmore, Edwin J; LaRose, Christopher J; Mercer, Andre W; Noutsos, Christos

    2013-10-15

    Cloud Computing refers to distributed computing platforms that use virtualization software to provide easy access to physical computing infrastructure and data storage, typically administered through a Web interface. Cloud-based computing provides access to powerful servers, with specific software and virtual hardware configurations, while eliminating the initial capital cost of expensive computers and reducing the ongoing operating costs of system administration, maintenance contracts, power consumption, and cooling. This eliminates a significant barrier to entry into bioinformatics and high-performance computing for many researchers. This is especially true of free or modestly priced cloud computing services. The iPlant Collaborative offers a free cloud computing service, Atmosphere, which allows users to easily create and use instances on virtual servers preconfigured for their analytical needs. Atmosphere is a self-service, on-demand platform for scientific computing. This unit demonstrates how to set up, access and use cloud computing in Atmosphere. Copyright © 2013 John Wiley & Sons, Inc.

  11. Optical RISC computer

    NASA Astrophysics Data System (ADS)

    Guilfoyle, Peter S.; Stone, Richard V.; Hessenbruch, John M.; Zeise, Frederick F.

    1993-07-01

    A second generation digital optical computer (DOC II) has been developed which utilizes a RISC-based operating system as its host. This 32-bit, high-performance (12.8 GByte/sec) computing platform demonstrates a number of basic principles that are inherent to parallel free-space optical interconnects, such as speed (up to 10¹² bit operations per second) and low power (1.2 fJ per bit). Although DOC II is a general purpose machine, special purpose applications have been developed and are currently being evaluated on the optical platform.

  12. SOI layout decomposition for double patterning lithography on high-performance computer platforms

    NASA Astrophysics Data System (ADS)

    Verstov, Vladimir; Zinchenko, Lyudmila; Makarchuk, Vladimir

    2014-12-01

    In this paper, silicon-on-insulator layout decomposition algorithms for double patterning lithography on high-performance computing platforms are discussed. Our approach is based on the use of a contradiction graph and a modified concurrent breadth-first search algorithm. We evaluate our technique on the 45 nm Nangate Open Cell Library, including non-Manhattan geometry. Experimental results show that our soft computing algorithms decompose layouts successfully and that the minimal distance between polygons in the layout is increased.

  13. Transitioning ISR architecture into the cloud

    NASA Astrophysics Data System (ADS)

    Lash, Thomas D.

    2012-06-01

    Emerging cloud computing platforms offer an ideal opportunity for Intelligence, Surveillance, and Reconnaissance (ISR) intelligence analysis. Cloud computing platforms help overcome challenges and limitations of traditional ISR architectures. Modern ISR architectures can benefit from examining commercial cloud applications, especially as they relate to user experience, usage profiling, and transformational business models. This paper outlines legacy ISR architectures and their limitations, presents an overview of cloud technologies and their applications to the ISR intelligence mission, and presents an idealized ISR architecture implemented with cloud computing.

  14. [Measurement of intracranial hematoma volume by personal computer].

    PubMed

    DU, Wanping; Tan, Lihua; Zhai, Ning; Zhou, Shunke; Wang, Rui; Xue, Gongshi; Xiao, An

    2011-01-01

    To explore a method for intracranial hematoma volume measurement using a personal computer, forty cases of various intracranial hematomas were measured both by computed tomography with quantitative software and by a personal computer with Photoshop CS3 software. The data from the two methods were analyzed and compared. There was no difference between the data from computed tomography and from the personal computer (P>0.05). A personal computer with Photoshop CS3 software can measure the volume of various intracranial hematomas precisely, rapidly, and simply, and should be recommended in clinical medicolegal identification.
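
    The Photoshop-based measurement reduces to counting hematoma pixels on each CT slice and converting with the pixel size and slice thickness; a sketch with hypothetical numbers (the record does not give the actual imaging parameters):

```python
def hematoma_volume_ml(pixel_counts, pixel_area_mm2, slice_thickness_mm):
    """Estimate hematoma volume from per-slice pixel counts.

    pixel_counts:       hematoma pixels selected on each CT slice
    pixel_area_mm2:     in-plane area of one pixel (mm^2)
    slice_thickness_mm: CT slice thickness (mm)
    Volume = sum(pixels * pixel area) * slice thickness, converted to mL.
    """
    volume_mm3 = sum(n * pixel_area_mm2 for n in pixel_counts) * slice_thickness_mm
    return volume_mm3 / 1000.0  # 1 mL = 1000 mm^3
```

    This is the same slice-summation principle as planimetric volume estimation on a workstation; only the pixel-selection tool differs.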

  15. High-speed multiple sequence alignment on a reconfigurable platform.

    PubMed

    Oliver, Tim; Schmidt, Bertil; Maskell, Douglas; Nathan, Darran; Clemens, Ralf

    2006-01-01

    Progressive alignment is a widely used approach to compute multiple sequence alignments (MSAs). However, aligning several hundred sequences with popular progressive alignment tools requires hours on sequential computers. Due to the rapid growth of sequence databases, biologists have to compute MSAs in a far shorter time. In this paper we present a new approach to MSA on reconfigurable hardware platforms to gain high performance at low cost. We have constructed a linear systolic array to perform pairwise sequence distance computations using dynamic programming. This results in an implementation with significant runtime savings on a standard FPGA.
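
    The pairwise distance stage of progressive alignment is a dynamic-programming recurrence; a minimal CPU sketch of edit distance (the FPGA systolic array evaluates the same recurrence along anti-diagonals, one cell per processing element):

```python
def levenshtein(s, t):
    """Edit distance by dynamic programming, keeping only two rows."""
    prev = list(range(len(t) + 1))          # distances for the empty prefix of s
    for i, cs in enumerate(s, 1):
        curr = [i]
        for j, ct in enumerate(t, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (cs != ct)))   # match/substitution
        prev = curr
    return prev[-1]
```

    The two-row formulation shows why hardware pipelining works: each cell depends only on its left, upper, and upper-left neighbors, so all cells on one anti-diagonal can be computed simultaneously.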

  16. Facilitating NASA Earth Science Data Processing Using Nebula Cloud Computing

    NASA Technical Reports Server (NTRS)

    Pham, Long; Chen, Aijun; Kempler, Steven; Lynnes, Christopher; Theobald, Michael; Asghar, Esfandiari; Campino, Jane; Vollmer, Bruce

    2011-01-01

    Cloud computing has been implemented in several commercial arenas. The NASA Nebula Cloud Computing platform is an Infrastructure as a Service (IaaS) built in 2008 at NASA Ames Research Center and in 2010 at GSFC. Nebula is an open-source cloud platform intended to: (a) realize significant cost savings for NASA through efficient resource utilization, reduced energy consumption, and reduced labor costs; (b) provide an easier way for NASA scientists and researchers to efficiently explore and share large and complex data sets; and (c) allow customers to provision, manage, and decommission computing capabilities on an as-needed basis.

  17. Computer-based personality judgments are more accurate than those made by humans

    PubMed Central

    Youyou, Wu; Kosinski, Michal; Stillwell, David

    2015-01-01

    Judging others’ personalities is an essential skill in successful social living, as personality is a key driver behind people’s interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants’ Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy. PMID:25583507

  18. Computer-based personality judgments are more accurate than those made by humans.

    PubMed

    Youyou, Wu; Kosinski, Michal; Stillwell, David

    2015-01-27

    Judging others' personalities is an essential skill in successful social living, as personality is a key driver behind people's interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants' Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy.
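
    The accuracies reported in the two records above are Pearson correlations between judged and self-reported trait scores; for reference, the coefficient itself:

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences:
    r = cov(x, y) / (std(x) * std(y)), ranging from -1 to 1."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5
```

    In the study's terms, x would be a participant's self-rated trait score and y the computer's (or a friend's) judgment, pooled across participants; higher r means more accurate judgment.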

  19. Integration of lyoplate based flow cytometry and computational analysis for standardized immunological biomarker discovery.

    PubMed

    Villanova, Federica; Di Meglio, Paola; Inokuma, Margaret; Aghaeepour, Nima; Perucha, Esperanza; Mollon, Jennifer; Nomura, Laurel; Hernandez-Fuentes, Maria; Cope, Andrew; Prevost, A Toby; Heck, Susanne; Maino, Vernon; Lord, Graham; Brinkman, Ryan R; Nestle, Frank O

    2013-01-01

    Discovery of novel immune biomarkers for monitoring of disease prognosis and response to therapy in immune-mediated inflammatory diseases is an important unmet clinical need. Here, we establish a novel framework for immunological biomarker discovery, comparing a conventional (liquid) flow cytometry platform (CFP) and a unique lyoplate-based flow cytometry platform (LFP) in combination with advanced computational data analysis. We demonstrate that LFP had higher sensitivity compared to CFP, with increased detection of cytokines (IFN-γ and IL-10) and activation markers (Foxp3 and CD25). Fluorescent intensity of cells stained with lyophilized antibodies was increased compared to cells stained with liquid antibodies. LFP, using a plate loader, allowed medium-throughput processing of samples with comparable intra- and inter-assay variability between platforms. Automated computational analysis identified novel immunophenotypes that were not detected with manual analysis. Our results establish a new flow cytometry platform for standardized and rapid immunological biomarker discovery with wide application to immune-mediated diseases.

  20. Integration of Lyoplate Based Flow Cytometry and Computational Analysis for Standardized Immunological Biomarker Discovery

    PubMed Central

    Villanova, Federica; Di Meglio, Paola; Inokuma, Margaret; Aghaeepour, Nima; Perucha, Esperanza; Mollon, Jennifer; Nomura, Laurel; Hernandez-Fuentes, Maria; Cope, Andrew; Prevost, A. Toby; Heck, Susanne; Maino, Vernon; Lord, Graham; Brinkman, Ryan R.; Nestle, Frank O.

    2013-01-01

    Discovery of novel immune biomarkers for monitoring of disease prognosis and response to therapy in immune-mediated inflammatory diseases is an important unmet clinical need. Here, we establish a novel framework for immunological biomarker discovery, comparing a conventional (liquid) flow cytometry platform (CFP) and a unique lyoplate-based flow cytometry platform (LFP) in combination with advanced computational data analysis. We demonstrate that LFP had higher sensitivity compared to CFP, with increased detection of cytokines (IFN-γ and IL-10) and activation markers (Foxp3 and CD25). Fluorescent intensity of cells stained with lyophilized antibodies was increased compared to cells stained with liquid antibodies. LFP, using a plate loader, allowed medium-throughput processing of samples with comparable intra- and inter-assay variability between platforms. Automated computational analysis identified novel immunophenotypes that were not detected with manual analysis. Our results establish a new flow cytometry platform for standardized and rapid immunological biomarker discovery with wide application to immune-mediated diseases. PMID:23843942

  1. Making Spatial Statistics Service Accessible On Cloud Platform

    NASA Astrophysics Data System (ADS)

    Mu, X.; Wu, J.; Li, T.; Zhong, Y.; Gao, X.

    2014-04-01

    Web services can bring together applications running on diverse platforms; users can access and share various data, information, and models more effectively and conveniently from a web service platform. Cloud computing emerges as a paradigm of Internet computing in which dynamic, scalable, and often virtualized resources are provided as services. With the rapid growth of massive data and the constraints of networks, traditional web service platforms face prominent problems such as computational efficiency, maintenance cost, and data security. In this paper, we offer a spatial statistics service based on the Microsoft cloud. An experiment was carried out to evaluate the availability and efficiency of this service. The results show that this spatial statistics service is conveniently accessible to the public, with high processing efficiency.
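
    As a concrete example of the kind of statistic such a service might expose (the abstract does not name specific methods), global Moran's I measures spatial autocorrelation of values x over a spatial weights matrix w:

```python
def morans_i(x, w):
    """Global Moran's I for values x and spatial weights matrix w:
    I = (n / S0) * sum_ij w[i][j]*(x_i - mean)*(x_j - mean) / sum_i (x_i - mean)^2
    where S0 is the sum of all weights. Positive I: similar values cluster."""
    n = len(x)
    mean = sum(x) / n
    d = [v - mean for v in x]
    num = sum(w[i][j] * d[i] * d[j] for i in range(n) for j in range(n))
    s0 = sum(sum(row) for row in w)
    return (n / s0) * num / sum(v * v for v in d)

# Rook-adjacency weights for four sites on a line: 1-2-3-4.
w_line = [[0, 1, 0, 0],
          [1, 0, 1, 0],
          [0, 1, 0, 1],
          [0, 0, 1, 0]]
```

    On this line graph, clustered values such as [0, 0, 1, 1] give a positive I, while perfectly alternating values such as [0, 1, 0, 1] give a strongly negative I.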

  2. High performance GPU processing for inversion using uniform grid searches

    NASA Astrophysics Data System (ADS)

    Venetis, Ioannis E.; Saltogianni, Vasso; Stiros, Stathis; Gallopoulos, Efstratios

    2017-04-01

    Many geophysical problems are described by redundant systems of highly non-linear ordinary equations with constant terms deriving from measurements and hence representing stochastic variables. Solution (inversion) of such problems is based on numerical optimization methods, using Monte Carlo sampling or exhaustive searches in cases of two or even three "free" unknown variables. Recently the TOPological INVersion (TOPINV) algorithm, a grid-search-based technique in the Rⁿ space, has been proposed. TOPINV is not based on the minimization of a certain cost function and involves only forward computations, hence avoiding computational errors. The basic concept is to transform observation equations into inequalities on the basis of an optimization parameter k and of their standard errors, and through repeated "scans" of n-dimensional search grids for decreasing values of k to identify the optimal clusters of gridpoints which satisfy the observation inequalities and by definition contain the "true" solution. Stochastic optimal solutions and their variance-covariance matrices are then computed as first and second statistical moments. Such exhaustive uniform searches produce an excessive computational load and are extremely time-consuming on common CPU-based computers. An alternative is a GPU-based computing platform, nowadays affordable to the research community, which provides much higher computing performance. Using the CUDA programming language to implement TOPINV allows investigation of the attained speedup in execution time on such a high-performance platform. Based on synthetic data we compared the execution time required for two typical geophysical problems, modeling magma sources and seismic faults, described with up to 18 unknown variables, on both CPU/FORTRAN and GPU/CUDA platforms. The same problems were solved on both platforms for several sizes of search grids (up to 10¹² gridpoints) and numbers of unknown variables, and execution time as a function of grid dimension was recorded for each problem. Results indicate an average speedup in calculations by a factor of 100 on the GPU platform; for example, problems with 10¹² gridpoints require less than two hours instead of several days on conventional desktop computers. Such a speedup encourages the application of TOPINV on high-performance platforms such as GPUs in cases where near-real-time decisions are necessary, for example finite-fault modeling to identify possible tsunami sources.
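
    The scan-for-decreasing-k procedure can be sketched on a toy linear problem; the models, grid, and k schedule below are illustrative, not those of the paper:

```python
import itertools, statistics

def topinv(models, obs, sigmas, grids, ks):
    """TOPINV sketch: for decreasing k, keep gridpoints x satisfying
    |f_i(x) - obs_i| <= k * sigma_i for every observation i; the last
    non-empty cluster brackets the solution, whose first moment is returned."""
    cluster = []
    for k in ks:                              # ks sorted in decreasing order
        hits = [x for x in itertools.product(*grids)
                if all(abs(f(x) - o) <= k * s
                       for f, o, s in zip(models, obs, sigmas))]
        if not hits:
            break                             # grid emptied: keep last cluster
        cluster = hits
    return [statistics.mean(axis) for axis in zip(*cluster)]

# Toy redundant system: a + b = 3, a - b = 1, 2a + b = 5 (true a=2, b=1).
models = [lambda x: x[0] + x[1],
          lambda x: x[0] - x[1],
          lambda x: 2 * x[0] + x[1]]
grid = [i * 0.5 for i in range(9)]            # 0.0 .. 4.0 in steps of 0.5
solution = topinv(models, [3.0, 1.0, 5.0], [0.1, 0.1, 0.1],
                  [grid, grid], [3, 2, 1, 0.5])
```

    Because every gridpoint is tested independently against the inequalities, the inner list comprehension is trivially parallel, which is exactly what the GPU/CUDA implementation exploits.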

  3. Applications of personal computers in geophysics

    NASA Astrophysics Data System (ADS)

    Lee, W. H. K.; Lahr, J. C.; Habermann, R. E.

    Since 1981, the use of personal computers (PCs) to increase productivity has become widespread. At present, more than 5 million personal computers are in operation for business, education, engineering, and scientific purposes. Activities within AGU reflect this trend: KOSMOS, the AGU electronic network, was introduced this year, and the AGU Committee on Personal Computers, chaired by W. H. K. Lee (U.S. Geological Survey, Menlo Park, Calif.), was recently formed. In addition, in conjunction with the 1986 AGU Fall Meeting, this committee is organizing a personal computer session and hands-on demonstrations to promote applications of personal computers in geophysics.

  4. Hardware Support for Malware Defense and End-to-End Trust

    DTIC Science & Technology

    2017-02-01

    IoT) sensors and actuators, mobile devices and servers; cloud based, stand alone, and traditional mainframes. The prototype developed demonstrated...virtual machines. For mobile platforms we developed and prototyped an architecture supporting separation of personalities on the same platform...4 3.1. MOBILE

  5. A Set of Free Cross-Platform Authoring Programs for Flexible Web-Based CALL Exercises

    ERIC Educational Resources Information Center

    O'Brien, Myles

    2012-01-01

    The Mango Suite is a set of three freely downloadable cross-platform authoring programs for flexible network-based CALL exercises. They are Adobe Air applications, so they can be used on Windows, Macintosh, or Linux computers, provided the freely-available Adobe Air has been installed on the computer. The exercises which the programs generate are…

  6. Development of a Cloud Computing-Based Pier Type Port Structure Stability Evaluation Platform Using Fiber Bragg Grating Sensors.

    PubMed

    Jo, Byung Wan; Jo, Jun Ho; Khan, Rana Muhammad Asad; Kim, Jung Hoon; Lee, Yun Sung

    2018-05-23

    Structural Health Monitoring is a topic of great interest for port structures due to the ageing of these structures and the limitations of existing evaluation methods. This paper presents a cloud computing-based stability evaluation platform for a pier-type port structure using Fiber Bragg Grating (FBG) sensors, in a system consisting of an FBG strain sensor, an FBG displacement gauge, an FBG angle meter, a gateway, and a cloud computing-based web server. The sensors were installed on core components of the structure, and the measurement values were transmitted to the web server via the gateway, where all data were analyzed and visualized to evaluate the structure against a safety evaluation index (SEI). By converging new technologies such as cloud computing and FBG sensors, the platform enables efficient monitoring of pier-type port structures that can be carried out easily, anytime and anywhere. In addition, the platform has been successfully implemented at “Maryang Harbor”, situated in Maryang-Meyon of Korea, to test its durability.

  7. Computer Simulation of Developmental ...

    EPA Pesticide Factsheets

    Rationale: Recent progress in systems toxicology and synthetic biology has paved the way to new thinking about in vitro/in silico modeling of developmental processes and toxicities, both for embryological and reproductive impacts. Novel in vitro platforms such as 3D organotypic culture models, engineered microscale tissues and complex microphysiological systems (MPS), together with computational models and computer simulation of tissue dynamics, lend themselves to integrated testing strategies for predictive toxicology. As these emergent methodologies continue to evolve, they must be integrally tied to maternal/fetal physiology and toxicity of the developing individual across early lifestage transitions, from fertilization to birth, through puberty and beyond. Scope: This symposium will focus on how novel technology platforms can help, now and in the future, with in vitro/in silico modeling of complex biological systems for developmental and reproductive toxicity issues, and with translating systems models into integrative testing strategies. The symposium is based on three main organizing principles: (1) that novel in vitro platforms with human cells configured in nascent tissue architectures with native microphysiological environments yield mechanistic understanding of developmental and reproductive impacts of drug/chemical exposures; (2) that novel in silico platforms with high-throughput screening (HTS) data, biologically-inspired computational models of

  9. Software-codec-based full motion video conferencing on the PC using visual pattern image sequence coding

    NASA Astrophysics Data System (ADS)

    Barnett, Barry S.; Bovik, Alan C.

    1995-04-01

    This paper presents a real-time, full-motion video conferencing system based on the Visual Pattern Image Sequence Coding (VPISC) software codec. The prototype system hardware comprises two personal computers, two camcorders, two frame grabbers, and an Ethernet connection. The prototype system software has a simple structure. It runs under the Disk Operating System, and includes a user interface, a video I/O interface, an event-driven network interface, and a free-running or frame-synchronous video codec that also acts as the controller for the video and network interfaces. Two video coders have been tested in this system. Simple implementations of Visual Pattern Image Coding and VPISC have both proven to support full-motion video conferencing with good visual quality. Future work will concentrate on expanding this prototype to support the motion-compensated version of VPISC, as well as encompassing point-to-point modem I/O and multiple network protocols. The application will be ported to multiple hardware platforms and operating systems. The motivation for developing this prototype system is to demonstrate the practicality of software-based real-time video codecs. Furthermore, software video codecs are not only cheaper, but also more flexible system solutions, because they enable different computer platforms to exchange encoded video information without requiring on-board protocol-compatible video codec hardware. Software-based solutions enable true low-cost video conferencing that fits the 'open systems' model of interoperability that is so important for building portable hardware and software applications.

  10. Waggle: A Framework for Intelligent Attentive Sensing and Actuation

    NASA Astrophysics Data System (ADS)

    Sankaran, R.; Jacob, R. L.; Beckman, P. H.; Catlett, C. E.; Keahey, K.

    2014-12-01

    Advances in sensor-driven computation and computationally steered sensing will greatly enable future research in fields including environmental and atmospheric sciences. We will present "Waggle," an open-source hardware and software infrastructure developed with two goals: (1) reducing the separation and latency between sensing and computing and (2) improving the reliability and longevity of sensing-actuation platforms in challenging and costly deployments. Inspired by "deep-space probe" systems, the Waggle platform design includes features that can support longitudinal studies, deployments with varying communication links, and remote management capabilities. Waggle lowers the barrier for scientists to incorporate real-time data from their sensors into their computations and to manipulate the sensors or provide feedback through actuators. A standardized software and hardware design allows quick addition of new sensors/actuators and associated software in the nodes and enables them to be coupled with computational codes both in situ and on external compute infrastructure. The Waggle framework currently drives the deployment of two observational systems - a portable and self-sufficient weather platform for study of small-scale effects in Chicago's urban core and an open-ended distributed instrument in Chicago that aims to support several research pursuits across a broad range of disciplines including urban planning, microbiology and computer science. Built around open-source software, hardware, and Linux OS, the Waggle system comprises two components - the Waggle field-node and Waggle cloud-computing infrastructure. The Waggle field-node affords a modular, scalable, fault-tolerant, secure, and extensible platform for hosting sensors and actuators in the field. It supports in situ computation and data storage, and integration with cloud-computing infrastructure.
The Waggle cloud infrastructure is designed with the goal of scaling to several hundreds of thousands of Waggle nodes. It supports aggregating data from sensors hosted by the nodes, staging computation, relaying feedback to the nodes and serving data to end-users. We will discuss the Waggle design principles and their applicability to various observational research pursuits, and demonstrate its capabilities.

  11. Megatux

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-09-25

    The Megatux platform enables the emulation of large scale (multi-million node) distributed systems. In particular, it allows for the emulation of large-scale networks interconnecting a very large number of emulated computer systems. It does this by leveraging virtualization and associated technologies to allow hundreds of virtual computers to be hosted on a single moderately sized server or workstation. Virtualization technology provided by modern processors allows for multiple guest OSs to run at the same time, sharing the hardware resources. The Megatux platform can be deployed on a single PC, a small cluster of a few boxes, or a large cluster of computers. With a modest cluster, the Megatux platform can emulate complex organizational networks. By using virtualization, we emulate the hardware but run actual software, enabling large scale without sacrificing fidelity.

  12. Jungle Computing: Distributed Supercomputing Beyond Clusters, Grids, and Clouds

    NASA Astrophysics Data System (ADS)

    Seinstra, Frank J.; Maassen, Jason; van Nieuwpoort, Rob V.; Drost, Niels; van Kessel, Timo; van Werkhoven, Ben; Urbani, Jacopo; Jacobs, Ceriel; Kielmann, Thilo; Bal, Henri E.

    In recent years, the application of high-performance and distributed computing in scientific practice has become increasingly widespread. Among the most widely available platforms to scientists are clusters, grids, and cloud systems. Such infrastructures currently are undergoing revolutionary change due to the integration of many-core technologies, providing orders-of-magnitude speed improvements for selected compute kernels. With high-performance and distributed computing systems thus becoming more heterogeneous and hierarchical, programming complexity is vastly increased. Further complexities arise because the urgent desire for scalability and issues including data distribution, software heterogeneity, and ad hoc hardware availability commonly force scientists into the simultaneous use of multiple platforms (e.g., clusters, grids, and clouds used concurrently): a true computing jungle.

  13. Implementation and performance test of cloud platform based on Hadoop

    NASA Astrophysics Data System (ADS)

    Xu, Jingxian; Guo, Jianhong; Ren, Chunlan

    2018-01-01

    Hadoop, an open-source project of the Apache Foundation, is a distributed computing framework for processing large amounts of data and has been widely used in the Internet industry. It is therefore worthwhile to study both how a Hadoop platform is built and how its performance can be tested. This paper presents a method for implementing a Hadoop platform together with a method for testing platform performance. Experimental results show that the proposed method is effective at measuring the performance of a Hadoop platform.
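
    The MapReduce abstraction that such a Hadoop deployment exercises can be illustrated with a minimal single-process word-count sketch. Plain Python stands in here for the distributed map, shuffle, and reduce phases; no Hadoop API is used:

```python
from collections import defaultdict

def map_phase(records):
    """Map: emit (word, 1) pairs, as a Hadoop mapper would per input split."""
    for line in records:
        for word in line.split():
            yield word.lower(), 1

def shuffle(pairs):
    """Shuffle: group values by key, mimicking Hadoop's sort/merge step."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: aggregate each key's values, as a Hadoop reducer would."""
    return {key: sum(values) for key, values in groups.items()}

lines = ["big data big platform", "platform test"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts)  # {'big': 2, 'data': 1, 'platform': 2, 'test': 1}
```

    A platform performance test then times jobs of this shape at scale, varying input size and cluster configuration.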

  14. Smartphone as a personal, pervasive health informatics services platform: literature review.

    PubMed

    Wac, K

    2012-01-01

    The article provides an overview of current trends in personal sensor, signal, and imaging informatics that are based on emerging mobile computing and communications technologies enclosed in a smartphone, enabling the provision of personal, pervasive health informatics services. The article reviews examples of these trends from the PubMed and Google Scholar literature search engines, which by no means claim to be complete, as the field is evolving and some recent advances may not be documented yet. There are critical technological advances in the surveyed smartphone technologies, employed in the provision and improvement of diagnosis, acute and chronic treatment, and rehabilitation health services, as well as in the education and training of healthcare practitioners. However, the most notable emerging trend relates to the routine application of these technologies in the prevention/wellness sector, helping users in self-care to stay healthy. Smartphone-based personal health informatics services exist, but still have a long way to go to become an everyday, personalized healthcare-provisioning tool in the medical field and in clinical practice. The key challenge for their widespread adoption is a lack of user acceptance, stemming from the variable credibility and reliability of applications and solutions that a) lack an evidence-based approach; b) have low levels of medical-professional involvement in their design and content; c) are provided in an unreliable way, negatively influencing their usability; and, in some cases, d) are industry-driven, and hence biased in the information provided, for example towards particular types of treatment or intervention procedures.

  15. Using Social Media Activity to Identify Personality Characteristics of Navy Personnel

    DTIC Science & Technology

    2016-03-01

    The list of social media platforms is constantly growing. The most commonly used platforms are YouTube, Facebook, Google+, and Twitter; others...such as anxiety, anger, depression, self-consciousness, impulsiveness, and vulnerability • Openness to Experience, described with terms such as fantasy

  16. 29 CFR 1926.1431 - Hoisting personnel.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... structural design or worksite conditions. This paragraph does not apply to work covered by subpart R (Steel... platform criteria. (1) A qualified person familiar with structural design must design the personnel... of boom angle. (3) The suspension system must be designed to minimize tipping of the platform due to...

  17. 29 CFR 1926.1431 - Hoisting personnel.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... structural design or worksite conditions. This paragraph does not apply to work covered by subpart R (Steel... platform criteria. (1) A qualified person familiar with structural design must design the personnel... of boom angle. (3) The suspension system must be designed to minimize tipping of the platform due to...

  18. TERRA REF: Advancing phenomics with high resolution, open access sensor and genomics data

    NASA Astrophysics Data System (ADS)

    LeBauer, D.; Kooper, R.; Burnette, M.; Willis, C.

    2017-12-01

    Automated plant measurement has the potential to improve understanding of genetic and environmental controls on plant traits (phenotypes). The application of sensors and software in the automation of high throughput phenotyping reflects a fundamental shift from labor-intensive hand measurements to drone, tractor, and robot mounted sensing platforms. These tools are expected to speed the rate of crop improvement by enabling plant breeders to more accurately select plants with improved yields, resource use efficiency, and stress tolerance. However, there are many challenges facing high throughput phenomics: sensors and platforms are expensive, currently there are few standard methods of data collection and storage, and the analysis of large data sets requires high performance computers and automated, reproducible computing pipelines. To overcome these obstacles and advance the science of high throughput phenomics, the TERRA Phenotyping Reference Platform (TERRA-REF) team is developing an open-access database of high-resolution sensor data. TERRA-REF is an integrated field and greenhouse phenotyping system that includes: a reference field scanner with fifteen sensors that can generate terabytes of data each day at mm resolution; UAV, tractor, and fixed field sensing platforms; and an automated controlled-environment scanner. These platforms will enable the investigation of diverse sensing modalities and of traits under both controlled and field environments. It is the goal of TERRA-REF to lower the barrier to entry for academic and industry researchers by providing high-resolution data, open source software, and online computing resources. Our project is unique in that all data will be made fully public in November 2018, and is already available to early adopters through the beta-user program.
We will describe the datasets and how to use them, as well as the databases and computing pipeline and how these can be reused and remixed in other phenomics pipelines. Finally, we will describe the National Data Service workbench, a cloud computing platform that can access the petabyte-scale data while supporting reproducible research.

  19. Implementation methodology for interoperable personal health devices with low-voltage low-power constraints.

    PubMed

    Martinez-Espronceda, Miguel; Martinez, Ignacio; Serrano, Luis; Led, Santiago; Trigo, Jesús Daniel; Marzo, Asier; Escayola, Javier; Garcia, José

    2011-05-01

    Traditionally, e-Health solutions were located at the point of care (PoC), while the new ubiquitous user-centered paradigm draws on standard-based personal health devices (PHDs). Such devices place strict constraints on computation and battery efficiency that encouraged the International Organization for Standardization/IEEE11073 (X73) standard for medical devices to evolve from X73PoC to X73PHD. In this context, low-voltage low-power (LV-LP) technologies meet the restrictions of X73PHD-compliant devices. Since X73PHD does not address the software architecture, the accomplishment of an efficient design falls directly on the software developer. Therefore, the computational and battery performance of such LV-LP-constrained devices can be further improved through an efficient X73PHD implementation design. In this context, this paper proposes a new methodology to implement X73PHD on microcontroller-based platforms with LV-LP constraints. Such implementation methodology has been developed through a patterns-based approach and applied to a number of X73PHD-compliant agents (including weighing scale, blood pressure monitor, and thermometer specializations) and microprocessor architectures (8, 16, and 32 bits) as a proof of concept. As a reference, the results obtained in the weighing scale guarantee all features of X73PHD running over a microcontroller architecture based on ARM7TDMI requiring only 168 B of RAM and 2546 B of flash memory.

  20. CPU/GPU Computing for an Implicit Multi-Block Compressible Navier-Stokes Solver on a Heterogeneous Platform

    NASA Astrophysics Data System (ADS)

    Deng, Liang; Bai, Hanli; Wang, Fang; Xu, Qingxin

    2016-06-01

    CPU/GPU computing allows scientists to accelerate their numerical codes tremendously. In this paper, we port and optimize a double-precision alternating direction implicit (ADI) solver for the three-dimensional compressible Navier-Stokes equations from our in-house Computational Fluid Dynamics (CFD) software onto a heterogeneous platform. First, we implement a full GPU version of the ADI solver to remove redundant data transfers between CPU and GPU, and then design two fine-grain schemes, namely “one-thread-one-point” and “one-thread-one-line”, to maximize performance. Second, we present a dual-level parallelization scheme using the CPU/GPU collaborative model to exploit the computational resources of both the multi-core CPUs and the many-core GPUs within the heterogeneous platform. Finally, considering that memory on a single node becomes inadequate as the simulation size grows, we present a tri-level hybrid programming pattern, MPI-OpenMP-CUDA, that merges fine-grain parallelism using OpenMP and CUDA threads with coarse-grain parallelism using MPI for inter-node communication. We also propose a strategy to overlap computation with communication using advanced features of CUDA and MPI programming. We obtain a speedup of 6.0 for the ADI solver on one Tesla M2050 GPU compared with two Xeon X5670 CPUs. Scalability tests show that our implementation can offer significant performance improvement on heterogeneous platforms.
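
    An ADI step sweeps each coordinate direction in turn, reducing the 3-D implicit update to batches of independent tridiagonal solves along grid lines; the per-line kernel behind a "one-thread-one-line" GPU scheme is essentially the Thomas algorithm. A minimal Python sketch (an illustration of the kernel, not the authors' CUDA code):

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system (a: sub-, b: main, c: super-diagonal, d: RHS)
    by O(n) forward elimination and back substitution."""
    n = len(d)
    cp = np.empty(n)
    dp = np.empty(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Implicit 1-D diffusion step (1 + 2r on the diagonal, -r off-diagonal):
# the kind of system one ADI sweep solves along every grid line.
n, r = 64, 0.5
a = np.full(n, -r); b = np.full(n, 1 + 2 * r); c = np.full(n, -r)
a[0] = c[-1] = 0.0
d = np.random.default_rng(0).random(n)
x = thomas(a, b, c, d)
A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
residual = np.max(np.abs(A @ x - d))
```

    In the GPU version described in the paper, one thread runs a recurrence like this for each grid line, so thousands of lines are solved concurrently.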

  1. Computational Modeling of Inflammation and Wound Healing

    PubMed Central

    Ziraldo, Cordelia; Mi, Qi; An, Gary; Vodovotz, Yoram

    2013-01-01

    Objective Inflammation is both central to proper wound healing and a key driver of chronic tissue injury via a positive-feedback loop incited by incidental cell damage. We seek to derive actionable insights into the role of inflammation in wound healing in order to improve outcomes for individual patients. Approach To date, dynamic computational models have been used to study the time evolution of inflammation in wound healing. Emerging clinical data on histopathological and macroscopic images of evolving wounds, as well as noninvasive measures of blood flow, suggested the need for tissue-realistic, agent-based, and hybrid mechanistic computational simulations of inflammation and wound healing. Innovation We developed a computational modeling system, the Simple Platform for Agent-based Representation of Knowledge, to facilitate the construction of tissue-realistic models. Results A hybrid equation–agent-based model (ABM) of pressure ulcer formation in both spinal cord-injured and -uninjured patients was used to identify control points that reduce stress caused by tissue ischemia/reperfusion. An ABM of arterial restenosis revealed new dynamics of cell migration during neointimal hyperplasia that match histological features but contradict the currently prevailing mechanistic hypothesis. ABMs of vocal fold inflammation were used to predict inflammatory trajectories in individuals, possibly allowing for personalized treatment. Conclusions The intertwined inflammatory and wound healing responses can be modeled computationally to make predictions in individuals, simulate therapies, and gain mechanistic insights. PMID:24527362
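
    The damage–inflammation positive-feedback loop described above can be caricatured in a few lines of grid-based simulation. The grid size, rates, and update rule below are illustrative assumptions, not the published models:

```python
import numpy as np

def step(damage, recruit=0.3, heal=0.12, spill=0.05):
    """One tick: inflammatory activity is recruited in proportion to local
    damage, causes collateral damage at the four neighboring sites (the
    positive-feedback loop), and healing decays damage everywhere."""
    inflam = recruit * damage
    collateral = spill * (np.roll(inflam, 1, 0) + np.roll(inflam, -1, 0) +
                          np.roll(inflam, 1, 1) + np.roll(inflam, -1, 1))
    return np.clip(damage + collateral - heal * damage, 0.0, 1.0)

damage = np.zeros((20, 20))
damage[10, 10] = 1.0                 # incidental local injury seeds the loop
history = []
for _ in range(50):
    damage = step(damage)
    history.append(damage.sum())
```

    With these rates the lesion resolves (total damage in history decays); raising spill above heal/(4·recruit) = 0.1 tips the loop into runaway chronic inflammation, the kind of bistability such models are used to probe.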

  2. Cloud Computing for Geosciences--GeoCloud for standardized geospatial service platforms (Invited)

    NASA Astrophysics Data System (ADS)

    Nebert, D. D.; Huang, Q.; Yang, C.

    2013-12-01

    Twenty-first-century geoscience faces the challenges of big data, spikes in computing requirements (e.g., when natural disasters happen), and sharing resources through cyberinfrastructure across different organizations (Yang et al., 2011). With the flexibility and cost-efficiency of computing resources a primary concern, cloud computing emerges as a promising solution to provide core capabilities for addressing these challenges. Many governmental and federal agencies are adopting cloud technologies to cut costs and to make federal IT operations more efficient (Huang et al., 2010). However, it is still difficult for geoscientists to take advantage of the benefits of cloud computing to facilitate scientific research and discovery. This presentation uses GeoCloud to illustrate the process and strategies used in building a common platform for geoscience communities to enable the sharing and integration of geospatial data, information, and knowledge across different domains. GeoCloud is an annual incubator project coordinated by the Federal Geographic Data Committee (FGDC) in collaboration with the U.S. General Services Administration (GSA) and the Department of Health and Human Services. It is designed as a staging environment to test and document the deployment of a common GeoCloud community platform that can be implemented by multiple agencies. With these standardized virtual geospatial servers, a variety of government geospatial applications can be quickly migrated to the cloud. To achieve this objective, multiple projects are nominated each year by federal agencies as existing public-facing geospatial data services. From the initial candidate projects, a set of common operating-system and software requirements was identified as the baseline for platform-as-a-service (PaaS) packages. Based on these common platform packages, each project deploys and monitors its web application, develops best practices, and documents cost and performance information.
This paper presents the background, architectural design, and activities of GeoCloud in support of the Geospatial Platform Initiative. System security strategies, approval processes for migrating federal geospatial data, information, and applications into the cloud, and cost estimation for cloud operations are covered. Finally, some lessons learned from the GeoCloud project are discussed as a reference for geoscientists considering the adoption of cloud computing.

  3. [Reliability of static posturography in elderly persons].

    PubMed

    Bauer, C M; Gröger, I; Rupprecht, R; Tibesku, C O; Gassmann, K G

    2010-08-01

    Static posturography is used to quantify body sway and to assess the balance of elderly persons who are prone to falls. There is still no general consensus concerning the reliability of force platform measurements. The aim of this study was to test the reliability of force platform parameters when measuring elderly persons. The reliability of 11 force platform parameters was tested by measuring 30 elderly persons. The following parameters were calculated: mean speed of center-of-pressure displacement in mm/s, length of sway in mm, sway area in mm², amplitudes of center-of-pressure movement, the axis of oscillation in degrees, and the person's angles of inclination in degrees. Three measurements were taken on the same day, with a resting period of 2 min. Four different test conditions were used: normal and narrow stance, each with eyes open and eyes closed. Reliability was determined using intraclass correlation coefficients. Six parameters had excellent reliability, with a correlation coefficient of >0.9: mean speed of center-of-pressure movement during narrow stance, area of sway during narrow stance, length of sway during normal and narrow stance, and the angle of inclination in the sagittal plane during normal and narrow stance. The condition "narrow stance, eyes closed" proved to be the most reliable test position. These six parameters are recommended for use in further investigations, with narrow stance and eyes closed as the test position. The tested protocol proved to be reliable. Whether these parameters can be used to predict falls in elderly persons remains to be investigated.
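
    The intraclass correlation coefficient used here can be computed from a subjects-by-trials matrix; the sketch below implements the two-way random-effects, absolute-agreement, single-measure form (ICC(2,1) in the Shrout and Fleiss scheme) with made-up sway values, since the study's data are not reproduced here:

```python
import numpy as np

def icc_2_1(Y):
    """ICC(2,1): two-way random effects, absolute agreement, single measurement.
    Y is an (n subjects) x (k trials) matrix of repeated measurements."""
    n, k = Y.shape
    grand = Y.mean()
    ss_rows = k * np.sum((Y.mean(axis=1) - grand) ** 2)
    ss_cols = n * np.sum((Y.mean(axis=0) - grand) ** 2)
    ss_err = np.sum((Y - grand) ** 2) - ss_rows - ss_cols
    msr = ss_rows / (n - 1)              # between-subjects mean square
    msc = ss_cols / (k - 1)              # between-trials mean square
    mse = ss_err / ((n - 1) * (k - 1))   # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical mean sway speeds (mm/s): 5 subjects x 3 same-day trials.
Y = np.array([[12.1, 12.4, 12.0],
              [18.9, 19.2, 18.7],
              [ 9.5,  9.9,  9.6],
              [15.0, 14.6, 15.3],
              [22.4, 22.0, 22.6]])
icc = icc_2_1(Y)
```

    With trial-to-trial scatter small relative to between-subject differences, as in this toy matrix, the coefficient lands above the 0.9 "excellent reliability" threshold used in the study.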

  4. Circuit design advances for ultra-low power sensing platforms

    NASA Astrophysics Data System (ADS)

    Wieckowski, Michael; Dreslinski, Ronald G.; Mudge, Trevor; Blaauw, David; Sylvester, Dennis

    2010-04-01

    This paper explores the recent advances in circuit structures and design methodologies that have enabled ultra-low power sensing platforms and opened up a host of new applications. Central to this theme is the development of Near Threshold Computing (NTC) as a viable design space for low power sensing platforms. In this paradigm, the system's supply voltage is approximately equal to the threshold voltage of its transistors. Operating in this "near-threshold" region provides much of the energy savings previously demonstrated for subthreshold operation while offering more favorable performance and variability characteristics. This makes NTC applicable to a broad range of power-constrained computing segments including energy constrained sensing platforms. This paper explores the barriers to the adoption of NTC and describes current work aimed at overcoming these obstacles in the circuit design space.
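
    The energy trade-off behind NTC can be illustrated with the standard first-order CMOS model: dynamic energy per operation scales as C·Vdd², while lowering Vdd toward the threshold voltage stretches the cycle time, so leakage current integrates over a longer period. The constants below are generic illustrative values, not measurements from the paper:

```python
def energy_per_op(vdd, vth=0.35, c_eff=1e-9, i_leak0=1e-2, alpha=1.5):
    """First-order CMOS energy model with illustrative constants.
    Dynamic energy scales as C*Vdd^2, while gate delay grows as
    Vdd/(Vdd - Vth)^alpha (alpha-power law), so leakage energy
    I_leak * Vdd * delay balloons as the supply nears threshold."""
    e_dyn = c_eff * vdd ** 2
    delay = 1e-9 * vdd / (vdd - vth) ** alpha   # normalized delay model
    e_leak = i_leak0 * vdd * delay
    return e_dyn + e_leak

nominal = energy_per_op(1.1)          # full-voltage operation
near_threshold = energy_per_op(0.5)   # near-threshold operating point
print(near_threshold / nominal)
```

    With these constants the near-threshold point spends several times less energy per operation than nominal, while pushing the supply still closer to vth lets leakage dominate again: the minimum-energy operating point between the two regimes is exactly what NTC targets.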

  5. Balloon platform for extended-life astronomy research

    NASA Technical Reports Server (NTRS)

    Ostwald, L. T.

    1974-01-01

    A configuration has been developed for a long-life balloon platform that carries pointing telescopes weighing as much as 80 pounds (36 kg) and aims them at selected celestial targets. A platform of this configuration weighs about 375 pounds (170 kg) gross and can be suspended from a high-altitude superpressure balloon for a lifetime of several months. The balloon platform contains a solar array and storage batteries for electrical power, up- and down-link communications equipment, and navigational and attitude control systems for orienting the scientific instrument. A biaxial controller maintains the telescope attitude in response to look-angle data stored in an on-board computer memory, which is updated periodically by ground command. Gimbal angles are computed using location data derived from an on-board navigational receiver.
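
    The look-angle computation described above, turning a celestial target's coordinates plus the platform's position into gimbal elevation and azimuth, reduces to textbook spherical astronomy. A sketch of that coordinate conversion (standard formulas, not the flight code):

```python
import math

def look_angles(dec_deg, hour_angle_deg, lat_deg):
    """Convert a target's declination and local hour angle, together with the
    platform latitude, into the elevation and azimuth a biaxial gimbal tracks."""
    dec, ha, lat = (math.radians(v) for v in (dec_deg, hour_angle_deg, lat_deg))
    sin_el = math.sin(dec) * math.sin(lat) + math.cos(dec) * math.cos(lat) * math.cos(ha)
    sin_el = max(-1.0, min(1.0, sin_el))       # guard against rounding
    el = math.asin(sin_el)
    # azimuth measured clockwise from true north
    az = math.atan2(-math.cos(dec) * math.sin(ha),
                    math.sin(dec) - math.sin(lat) * sin_el)
    return math.degrees(el), math.degrees(az) % 360.0

# Target crossing the zenith: declination equals latitude, hour angle zero.
el, az = look_angles(40.0, 0.0, 40.0)
# Equatorial target at hour angle +90 degrees, seen from the equator: setting due west.
el_set, az_set = look_angles(0.0, 90.0, 0.0)
```

    The hour angle itself comes from the time and the platform's longitude, which is where the on-board navigational receiver enters the computation.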

  6. 33 CFR 144.01-20 - Life preservers.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 2 2011-07-01 2011-07-01 false Life preservers. 144.01-20...) OUTER CONTINENTAL SHELF ACTIVITIES LIFESAVING APPLIANCES Manned Platforms § 144.01-20 Life preservers. (a) An approved life preserver shall be provided for each person on a manned platform. The life...

  7. 33 CFR 144.01-20 - Life preservers.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Life preservers. 144.01-20...) OUTER CONTINENTAL SHELF ACTIVITIES LIFESAVING APPLIANCES Manned Platforms § 144.01-20 Life preservers. (a) An approved life preserver shall be provided for each person on a manned platform. The life...

  8. Leadership Platforms: Perspectives and Prospects.

    ERIC Educational Resources Information Center

    Norris, Cynthia J.; And Others

    What leaders encourage others to do must be congruent with the values they espouse and demonstrate through action. The educational leadership platform has recently been used as a tool to assist aspiring leaders in clarifying their personal values. This paper presents findings of a study that explored the effect of the educational leadership…

  9. Discovering Student Web Usage Profiles Using Markov Chains

    ERIC Educational Resources Information Center

    Marques, Alice; Belo, Orlando

    2011-01-01

    Nowadays, Web-based platforms are quite common in any university, supporting a very diversified set of applications and services. Ranging from personal management to student evaluation processes, Web-based platforms provide a very flexible way of working, promote student enrolment, and make access to academic information…
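
    The Markov-chain profiling the title refers to amounts to estimating a page-to-page transition matrix from logged navigation sequences. A minimal sketch with hypothetical session data (page names and sessions are invented for illustration):

```python
from collections import defaultdict

def transition_matrix(sessions):
    """Estimate first-order Markov transition probabilities
    P(next page | current page) by relative frequency over click sequences."""
    counts = defaultdict(lambda: defaultdict(int))
    for session in sessions:
        for cur, nxt in zip(session, session[1:]):
            counts[cur][nxt] += 1
    return {page: {nxt: c / sum(outs.values()) for nxt, c in outs.items()}
            for page, outs in counts.items()}

# Hypothetical student sessions on a university Web platform.
sessions = [
    ["login", "grades", "logout"],
    ["login", "courses", "grades", "logout"],
    ["login", "grades", "courses", "logout"],
]
P = transition_matrix(sessions)
```

    Here "login" is followed by "grades" with probability 2/3 and by "courses" with probability 1/3; each row of P sums to 1, and comparing per-student matrices (for example by the likelihood they assign to held-out sessions) is one way such usage profiles can be grouped.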

  10. 46 CFR 199.145 - Marine evacuation system launching arrangements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 7 2012-10-01 2012-10-01 false Marine evacuation system launching arrangements. 199.145....145 Marine evacuation system launching arrangements. (a) Arrangements. Each marine evacuation system... from the marine evacuation system platform by a person either in the liferaft or on the platform; (4...

  11. 46 CFR 199.145 - Marine evacuation system launching arrangements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 7 2013-10-01 2013-10-01 false Marine evacuation system launching arrangements. 199.145....145 Marine evacuation system launching arrangements. (a) Arrangements. Each marine evacuation system... from the marine evacuation system platform by a person either in the liferaft or on the platform; (4...

  12. 46 CFR 199.145 - Marine evacuation system launching arrangements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 7 2014-10-01 2014-10-01 false Marine evacuation system launching arrangements. 199.145....145 Marine evacuation system launching arrangements. (a) Arrangements. Each marine evacuation system... from the marine evacuation system platform by a person either in the liferaft or on the platform; (4...

  13. 46 CFR 199.145 - Marine evacuation system launching arrangements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 7 2010-10-01 2010-10-01 false Marine evacuation system launching arrangements. 199.145....145 Marine evacuation system launching arrangements. (a) Arrangements. Each marine evacuation system... from the marine evacuation system platform by a person either in the liferaft or on the platform; (4...

  14. 46 CFR 199.145 - Marine evacuation system launching arrangements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 7 2011-10-01 2011-10-01 false Marine evacuation system launching arrangements. 199.145....145 Marine evacuation system launching arrangements. (a) Arrangements. Each marine evacuation system... from the marine evacuation system platform by a person either in the liferaft or on the platform; (4...

  15. A multilevel control approach for a modular structured space platform

    NASA Technical Reports Server (NTRS)

    Chichester, F. D.; Borelli, M. T.

    1981-01-01

    A three-axis mathematical representation of a modular assembled space platform consisting of interconnected discrete masses, including a deployable truss module, was derived for digital computer simulation. The platform attitude control system was developed to provide multilevel control utilizing the Gauss-Seidel second-level formulation along with an extended form of linear quadratic regulator techniques. The objectives of the multilevel control are to decouple the space platform's spatial axes and to accommodate the modification of the platform's configuration for each of the decoupled axes.
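    The linear quadratic regulator techniques mentioned above can be illustrated with a minimal scalar sketch (illustrative only, not the paper's multilevel Gauss-Seidel formulation): iterating the discrete-time Riccati recursion for a hypothetical single-axis model until the feedback gain converges.

```python
def scalar_dlqr(a, b, q, r, iters=500):
    """Scalar discrete-time LQR for x' = a*x + b*u with cost
    sum(q*x^2 + r*u^2). Iterates the Riccati recursion and returns
    the feedback gain k, so that the control law is u = -k*x."""
    p = q
    for _ in range(iters):
        k = (b * p * a) / (r + b * p * b)
        p = q + a * p * (a - b * k)
    return k

# Hypothetical single-axis attitude model: mild open-loop divergence
# (a = 1.01) with actuator gain b = 0.1.
k = scalar_dlqr(1.01, 0.1, q=1.0, r=0.1)
```

    The resulting closed-loop pole a - b*k lies inside the unit circle, i.e. the regulated axis is stable, which is the property the multilevel controller seeks per decoupled axis.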

  16. Formal design and verification of a reliable computing platform for real-time control. Phase 2: Results

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Divito, Ben L.

    1992-01-01

    The design and formal verification of the Reliable Computing Platform (RCP), a fault tolerant computing system for digital flight control applications is presented. The RCP uses N-Multiply Redundant (NMR) style redundancy to mask faults and internal majority voting to flush the effects of transient faults. The system is formally specified and verified using the Ehdm verification system. A major goal of this work is to provide the system with significant capability to withstand the effects of High Intensity Radiated Fields (HIRF).

  17. Architecture and Initial Development of a Digital Library Platform for Computable Knowledge Objects for Health.

    PubMed

    Flynn, Allen J; Bahulekar, Namita; Boisvert, Peter; Lagoze, Carl; Meng, George; Rampton, James; Friedman, Charles P

    2017-01-01

    Throughout the world, biomedical knowledge is routinely generated and shared through primary and secondary scientific publications. However, there is too much latency between publication of knowledge and its routine use in practice. To address this latency, what is actionable in scientific publications can be encoded to make it computable. We have created a purpose-built digital library platform to hold, manage, and share actionable, computable knowledge for health called the Knowledge Grid Library. Here we present it with its system architecture.

  18. A Geospatial Information Grid Framework for Geological Survey.

    PubMed

    Wu, Liang; Xue, Lei; Li, Chaoling; Lv, Xia; Chen, Zhanlong; Guo, Mingqiang; Xie, Zhong

    2015-01-01

    The use of digital information in geological fields is becoming very important. Thus, informatization in geological surveys should not stagnate at the level of data accumulation. The integration and sharing of distributed, multi-source, heterogeneous geological information is an open problem in geological domains. Applications and services use geological spatial data with many features, including being cross-region and cross-domain and requiring real-time updating. As a result of these features, desktop and web-based geographic information systems (GISs) experience difficulties in meeting the demand for geological spatial information. To facilitate the real-time sharing of data and services in distributed environments, a GIS platform that is open, integrative, reconfigurable, reusable and elastic would represent an indispensable tool. The purpose of this paper is to develop a geological cloud-computing platform for integrating and sharing geological information based on a cloud architecture. Thus, the geological cloud-computing platform defines geological ontology semantics; designs a standard geological information framework and a standard resource integration model; builds a peer-to-peer node management mechanism; achieves the description, organization, discovery, computing and integration of the distributed resources; and provides the distributed spatial meta service, the spatial information catalog service, the multi-mode geological data service and the spatial data interoperation service. The geological survey information cloud-computing platform has been implemented, and based on the platform, some geological data services and geological processing services were developed. Furthermore, an iron mine resource forecast and evaluation service is introduced in this paper.

  19. A Geospatial Information Grid Framework for Geological Survey

    PubMed Central

    Wu, Liang; Xue, Lei; Li, Chaoling; Lv, Xia; Chen, Zhanlong; Guo, Mingqiang; Xie, Zhong

    2015-01-01

    The use of digital information in geological fields is becoming very important. Thus, informatization in geological surveys should not stagnate at the level of data accumulation. The integration and sharing of distributed, multi-source, heterogeneous geological information is an open problem in geological domains. Applications and services use geological spatial data with many features, including being cross-region and cross-domain and requiring real-time updating. As a result of these features, desktop and web-based geographic information systems (GISs) experience difficulties in meeting the demand for geological spatial information. To facilitate the real-time sharing of data and services in distributed environments, a GIS platform that is open, integrative, reconfigurable, reusable and elastic would represent an indispensable tool. The purpose of this paper is to develop a geological cloud-computing platform for integrating and sharing geological information based on a cloud architecture. Thus, the geological cloud-computing platform defines geological ontology semantics; designs a standard geological information framework and a standard resource integration model; builds a peer-to-peer node management mechanism; achieves the description, organization, discovery, computing and integration of the distributed resources; and provides the distributed spatial meta service, the spatial information catalog service, the multi-mode geological data service and the spatial data interoperation service. The geological survey information cloud-computing platform has been implemented, and based on the platform, some geological data services and geological processing services were developed. Furthermore, an iron mine resource forecast and evaluation service is introduced in this paper. PMID:26710255

  20. Investigation of a computer virus outbreak in the pharmacy of a tertiary care teaching hospital.

    PubMed

    Bailey, T C; Reichley, R M

    1992-10-01

    A computer virus outbreak was recognized, verified, defined, investigated, and controlled using an infection control approach. The pathogenesis and epidemiology of computer virus infection are reviewed. Case-control study. Pharmacy of a tertiary care teaching institution. On October 28, 1991, 2 personal computers in the drug information center manifested symptoms consistent with the "Jerusalem" virus infection. The same day, a departmental personal computer began playing "Yankee Doodle," a sign of "Doodle" virus infection. An investigation of all departmental personal computers identified the "Stoned" virus in an additional personal computer. Controls were functioning virus-free personal computers within the department. Cases were associated with users who brought diskettes from outside the department (5/5 cases versus 5/13 controls, p = .04) and with College of Pharmacy student users (3/5 cases versus 0/13 controls, p = .012). The detection of a virus-infected diskette or personal computer was associated with the number of 5 1/4-inch diskettes in the files of personal computers, a surrogate for rate of media exchange (mean = 17.4 versus 152.5, p = .018, Wilcoxon rank sum test). After education of departmental personal computer users regarding appropriate computer hygiene and installation of virus protection software, no further spread of personal computer viruses occurred, although 2 additional Stoned-infected and 1 Jerusalem-infected diskettes were detected. We recommend that virus detection software be installed on personal computers where the interchange of diskettes among computers is necessary, that write-protect tabs be placed on all program master diskettes and data diskettes where data are being read and not written, that in the event of a computer virus outbreak, all available diskettes be quarantined and scanned by virus detection software, and to facilitate quarantine and scanning in an outbreak, that diskettes be stored in organized files.
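    For illustration only (the abstract does not state which test produced its p = .04), the 5/5 versus 5/13 exposure comparison above can be checked with a one-sided Fisher exact test, computed directly from the hypergeometric distribution using only the standard library:

```python
from math import comb

def fisher_one_sided(a, b, c, d):
    """One-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
    P(exposed cases >= a) under the null hypothesis of no association."""
    row1 = a + b          # number of cases
    col1 = a + c          # total exposed
    n = a + b + c + d     # total subjects
    p = 0.0
    for k in range(a, min(row1, col1) + 1):
        p += comb(col1, k) * comb(n - col1, row1 - k) / comb(n, row1)
    return p

# Cases: 5/5 used diskettes from outside the department;
# controls: 5/13 did. Table: [[5, 0], [5, 8]].
p = fisher_one_sided(5, 0, 5, 8)
```

    The one-sided value here is 252/8568 ≈ 0.029; the paper's p = .04 was presumably computed two-sided, so this sketch illustrates the mechanics rather than reproducing the published figure.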

  1. Energy efficiency analysis and optimization for mobile platforms

    NASA Astrophysics Data System (ADS)

    Metri, Grace Camille

    The introduction of mobile devices changed the landscape of computing. Gradually, these devices are replacing traditional personal computers (PCs) to become the devices of choice for entertainment, connectivity, and productivity. There are currently at least 45.5 million people in the United States who own a mobile device, and that number is expected to increase to 1.5 billion by 2015. Users of mobile devices expect and demand that their mobile devices maximize performance while consuming the minimum possible power. However, due to battery size constraints, the amount of energy stored in these devices is limited and is growing by only 5% annually. As a result, this dissertation focuses on energy efficiency analysis and optimization for mobile platforms. We developed SoftPowerMon, a tool that power-profiles Android platforms in order to expose the power consumption behavior of the CPU. We also performed an extensive set of case studies to determine the energy inefficiencies of mobile applications. Through these case studies, we were able to propose optimization techniques to increase the energy efficiency of mobile devices, and we propose guidelines for energy-efficient application development. In addition, we developed BatteryExtender, an adaptive user-guided tool for power management of mobile devices. The tool enables users to extend battery life on demand for a specific duration until a particular task is completed. Moreover, we examined the power consumption of Systems-on-Chip (SoCs) and observed the impact on energy efficiency of offloading tasks from the CPU to specialized custom engines. Based on our case studies, we were able to demonstrate that current software-based power profiling techniques for SoCs can have an error rate close to 12%, which needs to be addressed in order to optimize the energy consumption of the SoC. Finally, we summarize our contributions and outline possible directions for future research in this field.
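    A power profiler of this kind ultimately reduces to integrating sampled voltage-current products over time; a minimal sketch of that energy computation (illustrative only, not the dissertation's SoftPowerMon) is:

```python
# Sketch: estimate energy drawn from a battery trace by trapezoidal
# integration of instantaneous power P = V * I over the sample times.
def energy_joules(samples):
    """samples: list of (t_seconds, volts, amps), sorted by time."""
    total = 0.0
    for (t0, v0, i0), (t1, v1, i1) in zip(samples, samples[1:]):
        total += 0.5 * (v0 * i0 + v1 * i1) * (t1 - t0)  # trapezoid rule
    return total

# One second of a constant 3.7 V, 0.5 A draw corresponds to 1.85 J.
trace = [(0.0, 3.7, 0.5), (0.5, 3.7, 0.5), (1.0, 3.7, 0.5)]
```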

  2. E-Science technologies in a workflow for personalized medicine using cancer screening as a case study.

    PubMed

    Spjuth, Ola; Karlsson, Andreas; Clements, Mark; Humphreys, Keith; Ivansson, Emma; Dowling, Jim; Eklund, Martin; Jauhiainen, Alexandra; Czene, Kamila; Grönberg, Henrik; Sparén, Pär; Wiklund, Fredrik; Cheddad, Abbas; Pálsdóttir, Þorgerður; Rantalainen, Mattias; Abrahamsson, Linda; Laure, Erwin; Litton, Jan-Eric; Palmgren, Juni

    2017-09-01

    We provide an e-Science perspective on the workflow from risk factor discovery and classification of disease to evaluation of personalized intervention programs. As case studies, we use personalized prostate and breast cancer screenings. We describe an e-Science initiative in Sweden, e-Science for Cancer Prevention and Control (eCPC), which supports biomarker discovery and offers decision support for personalized intervention strategies. The generic eCPC contribution is a workflow with 4 nodes applied iteratively, and the concept of e-Science signifies systematic use of tools from the mathematical, statistical, data, and computer sciences. The eCPC workflow is illustrated through 2 case studies. For prostate cancer, an in-house personalized screening tool, the Stockholm-3 model (S3M), is presented as an alternative to prostate-specific antigen testing alone. S3M is evaluated in a trial setting and plans for rollout in the population are discussed. For breast cancer, new biomarkers based on breast density and molecular profiles are developed and the US multicenter Women Informed to Screen Depending on Measures (WISDOM) trial is referred to for evaluation. While current eCPC data management uses a traditional data warehouse model, we discuss eCPC-developed features of a coherent data integration platform. E-Science tools are a key part of an evidence-based process for personalized medicine. This paper provides a structured workflow from data and models to evaluation of new personalized intervention strategies. The importance of multidisciplinary collaboration is emphasized. Importantly, the generic concepts of the suggested eCPC workflow are transferrable to other disease domains, although each disease will require tailored solutions. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  3. 48 CFR 52.223-16 - IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.705(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a) Definitions...

  4. 48 CFR 52.223-16 - IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.705(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a) Definitions...

  5. 48 CFR 52.223-16 - IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.705(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a) Definitions...

  6. 48 CFR 52.223-16 - IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.706(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a) Definitions...

  7. CBESW: sequence alignment on the Playstation 3.

    PubMed

    Wirawan, Adrianto; Kwoh, Chee Keong; Hieu, Nim Tri; Schmidt, Bertil

    2008-09-17

    The exponential growth of available biological data has caused bioinformatics to be rapidly moving towards a data-intensive, computational science. As a result, the computational power needed by bioinformatics applications is growing exponentially as well. The recent emergence of accelerator technologies has made it possible to achieve an excellent improvement in execution time for many bioinformatics applications, compared to current general-purpose platforms. In this paper, we demonstrate how the PlayStation 3, powered by the Cell Broadband Engine, can be used as a computational platform to accelerate the Smith-Waterman algorithm. For large datasets, our implementation on the PlayStation 3 provides a significant improvement in running time compared to other implementations such as SSEARCH, Striped Smith-Waterman and CUDA. Our implementation achieves a peak performance of up to 3,646 MCUPS. The results from our experiments demonstrate that the PlayStation 3 console can be used as an efficient low cost computational platform for high performance sequence alignment applications.
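    The Smith-Waterman recurrence that such accelerated kernels implement can be written as a short, unoptimized reference (a sketch for clarity, not the paper's SIMD Cell implementation):

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
    """Return the best local-alignment score via the Smith-Waterman
    dynamic-programming recurrence with linear gap penalties."""
    prev = [0] * (len(b) + 1)   # previous DP row
    best = 0
    for ca in a:
        curr = [0]
        for j, cb in enumerate(b, start=1):
            score = max(
                0,                                                 # restart
                prev[j - 1] + (match if ca == cb else mismatch),   # diagonal
                prev[j] + gap,        # gap in sequence b
                curr[j - 1] + gap,    # gap in sequence a
            )
            curr.append(score)
            best = max(best, score)
        prev = curr
    return best
```

    The accelerated implementations compute exactly this recurrence; the MCUPS figure quoted above counts how many of these cell updates are performed per second.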

  8. CBESW: Sequence Alignment on the Playstation 3

    PubMed Central

    Wirawan, Adrianto; Kwoh, Chee Keong; Hieu, Nim Tri; Schmidt, Bertil

    2008-01-01

    Background The exponential growth of available biological data has caused bioinformatics to be rapidly moving towards a data-intensive, computational science. As a result, the computational power needed by bioinformatics applications is growing exponentially as well. The recent emergence of accelerator technologies has made it possible to achieve an excellent improvement in execution time for many bioinformatics applications, compared to current general-purpose platforms. In this paper, we demonstrate how the PlayStation® 3, powered by the Cell Broadband Engine, can be used as a computational platform to accelerate the Smith-Waterman algorithm. Results For large datasets, our implementation on the PlayStation® 3 provides a significant improvement in running time compared to other implementations such as SSEARCH, Striped Smith-Waterman and CUDA. Our implementation achieves a peak performance of up to 3,646 MCUPS. Conclusion The results from our experiments demonstrate that the PlayStation® 3 console can be used as an efficient low cost computational platform for high performance sequence alignment applications. PMID:18798993

  9. Computer-operated analytical platform for the determination of nutrients in hydroponic systems.

    PubMed

    Rius-Ruiz, F Xavier; Andrade, Francisco J; Riu, Jordi; Rius, F Xavier

    2014-03-15

    Hydroponics is a water-, energy-, space-, and cost-efficient system for growing plants in constrained spaces or land-exhausted areas. Precise control of hydroponic nutrients is essential for growing healthy plants and producing high yields. In this article we report for the first time on a new computer-operated analytical platform which can be readily used for the determination of essential nutrients in hydroponic growing systems. The liquid-handling system uses inexpensive components (i.e., a peristaltic pump and solenoid valves), which are discretely computer-operated to automatically condition, calibrate and clean a multi-probe of solid-contact ion-selective electrodes (ISEs). These ISEs, which are based on carbon nanotubes, offer high portability, robustness and easy maintenance and storage. With this new computer-operated analytical platform we performed automatic measurements of K(+), Ca(2+), NO3(-) and Cl(-) during tomato plant growth in order to ensure optimal nutritional uptake and tomato production. Copyright © 2013 Elsevier Ltd. All rights reserved.
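    Calibration of such ISEs conventionally rests on the Nernstian response E = E0 + S·log10(c); a minimal two-point calibration sketch (the potentials and the K+ standards below are hypothetical, not the article's data) is:

```python
from math import log10

def calibrate_ise(c1, e1, c2, e2):
    """Two-point Nernstian calibration E = E0 + S*log10(c).
    Returns (E0, S); S is near +59.2 mV/decade for a monovalent
    cation at 25 degrees C."""
    s = (e2 - e1) / (log10(c2) - log10(c1))
    e0 = e1 - s * log10(c1)
    return e0, s

def concentration(e, e0, s):
    """Invert the calibration line for an unknown sample potential."""
    return 10 ** ((e - e0) / s)

# Hypothetical K+ standards: 1 mM reads 120.0 mV, 10 mM reads 179.2 mV.
e0, s = calibrate_ise(1e-3, 120.0, 1e-2, 179.2)
```

    Automatic recalibration then amounts to re-running `calibrate_ise` on fresh standards before each measurement cycle.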

  10. CFD and Neutron codes coupling on a computational platform

    NASA Astrophysics Data System (ADS)

    Cerroni, D.; Da Vià, R.; Manservisi, S.; Menghini, F.; Scardovelli, R.

    2017-01-01

    In this work we investigate the thermal-hydraulic behavior of a PWR nuclear reactor core, evaluating the power generation distribution while taking into account the local temperature field. The temperature field, evaluated using a self-developed CFD module, is exchanged with a neutron code, DONJON-DRAGON, which updates the macroscopic cross sections and evaluates the new neutron flux. From the updated neutron flux the new peak factor is evaluated and the new temperature field is computed. The exchange of data between the two codes is made possible by their inclusion in the computational platform SALOME, an open-source tool developed by the collaborative project NURESAFE. The numerical libraries MEDmem, included in the SALOME platform, are used in this work for the projection of computational fields from one problem to another. The two problems are driven by a common supervisor that can access the computational fields of both systems. In every time step the temperature field is extracted from the CFD problem and set into the neutron problem. After this iteration the new power peak factor is projected back into the CFD problem and the new time step can be computed. Several computational examples, where both neutron and thermal-hydraulic quantities are parametrized, are finally reported in this work.
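    The supervisor-driven exchange described above is a fixed-point (Picard) coupling between two solvers; a toy scalar sketch (the two "solvers" below are hypothetical stand-ins, not the SALOME/MEDmem implementation) shows the iteration structure:

```python
# Toy Picard coupling: a "thermal" solver maps power to temperature,
# a "neutronics" solver maps temperature back to power; the supervisor
# iterates the exchange until the fields stop changing.
def thermal_solver(power):
    return 300.0 + 0.1 * power           # temperature rises with power

def neutronics_solver(temperature):
    return 1000.0 - 0.5 * temperature    # feedback lowers power

def couple(power=500.0, tol=1e-10, max_iter=200):
    for _ in range(max_iter):
        temperature = thermal_solver(power)
        new_power = neutronics_solver(temperature)
        if abs(new_power - power) < tol:
            return new_power, temperature
        power = new_power
    raise RuntimeError("coupling did not converge")
```

    Because the composed map here is a contraction, the loop converges to the unique consistent (power, temperature) pair, which is the role the common supervisor plays for the full CFD and neutronics fields.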

  11. Establishing a communications link between two different, incompatible, personal computers: with practical examples and illustrations and program code.

    PubMed

    Davidson, R W

    1985-01-01

    The increasing need to communicate and exchange data can be handled by personal microcomputers. The need to transfer information stored in one type of personal computer to another, incompatible type is often encountered when integrating multiple sources of information in medical research and practice. A practical example is demonstrated with two relatively inexpensive, commonly used computers, the IBM PCjr and the Apple IIe. The basic input/output (I/O) serial-communication interface chips of the two computers are joined with a null-modem connector and cable to form a communications link. Using BASIC (Beginner's All-purpose Symbolic Instruction Code) and the Disk Operating System (DOS), the communications handshaking protocol and file transfer are established between the two computers. The BASIC dialects used are Applesoft (Apple personal computer) and PC BASIC (IBM personal computer).
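    Transfers of this kind rest on a block-and-acknowledge handshaking protocol; a minimal XMODEM-style framing sketch in Python (illustrative only — the article's code is in BASIC, and this is not that code) shows how each block would be validated before acknowledgment:

```python
# Minimal XMODEM-style framing: each 128-byte block carries a sequence
# number, its one's complement, and an additive checksum that the
# receiver verifies before sending ACK (otherwise it sends NAK).
SOH, ACK, NAK = 0x01, 0x06, 0x15

def make_frame(seq, payload):
    payload = payload.ljust(128, b"\x1a")        # pad with EOF marker
    checksum = sum(payload) & 0xFF
    return bytes([SOH, seq & 0xFF, (~seq) & 0xFF]) + payload + bytes([checksum])

def receive_frame(frame):
    """Return (seq, payload) if the frame checks out, else None."""
    if len(frame) != 132 or frame[0] != SOH:
        return None
    seq, seq_inv = frame[1], frame[2]
    if seq ^ seq_inv != 0xFF:                    # sequence-number check
        return None
    payload, checksum = frame[3:131], frame[131]
    if sum(payload) & 0xFF != checksum:          # checksum check
        return None
    return seq, payload
```

    A corrupted byte anywhere in the payload fails the checksum test, so the receiver would NAK and the sender would retransmit the block.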

  12. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME.

    PubMed

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2016-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.
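    The spatial stochastic simulations underlying such workflows build on Gillespie's direct method; a minimal well-mixed sketch (a single degradation reaction, not a PyURDME reaction-diffusion model) illustrates the Monte Carlo kernel whose replication drives the computational cost:

```python
import random

def gillespie_decay(n0, k, t_end, rng):
    """Direct-method SSA for the reaction A -> 0 with rate k per
    molecule. Returns the molecule count remaining at time t_end."""
    t, n = 0.0, n0
    while n > 0:
        a0 = k * n                    # total propensity
        t += rng.expovariate(a0)      # exponential time to next event
        if t > t_end:
            break
        n -= 1                        # one A molecule degrades
    return n

rng = random.Random(42)
final = gillespie_decay(1000, 1.0, 1.0, rng)
```

    Averaged over many replicates, the count at t = 1 approaches the analytical mean 1000·e^(-1) ≈ 368; the spread across replicates is why such workflows need many runs, and hence distributed resources.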

  13. Altered Timing of Postural Reflexes Contributes to Falling in Persons with Chronic Stroke

    PubMed Central

    Marigold, Daniel S.; Eng, Janice J.

    2011-01-01

    The purpose of this study was to determine differences in the timing of postural reflexes and changes in kinematics between those who fell (Fallers) in response to standing platform translations and those who did not (Non-fallers). Forty-four persons with stroke were exposed to unexpected forward and backward platform translations while standing. Surface electromyography from bilateral tibialis anterior, gastrocnemius, rectus femoris, and biceps femoris were recorded along with kinematic data. Those that fell in response to the translations were compared to those who did not fall in terms of (1) postural reflex onset latency, (2) the time interval between the activation of distal and proximal muscles (i.e. intralimb coupling), and (3) changes in joint angles and trunk motion. Approximately 85% of falls occurred in response to the forward translations. Postural reflex onset latencies were delayed and intralimb coupling durations were longer in the Faller versus Non-faller group. At the time that the platform completed the translating motion (300 ms), the Faller group demonstrated higher trunk velocity, greater change in paretic ankle angle, and the trunk was further behind the ankle compared to the Non-faller group. This study suggests that following platform translations, delays in the timing of postural reflexes and disturbed intralimb coupling result in changes in kinematics, which contribute to falls in persons with stroke. PMID:16418855
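    Reflex onset latency in such studies is commonly estimated as the first post-perturbation sample at which the rectified EMG exceeds the baseline mean plus a multiple of its standard deviation; a sketch of that detection (the window lengths and threshold factor below are illustrative assumptions, not the paper's exact method):

```python
from statistics import mean, stdev

def onset_latency_ms(signal, fs_hz, baseline_ms=100, k=3, perturb_ms=100):
    """Estimate reflex onset: first post-perturbation sample of the
    rectified signal exceeding baseline mean + k*SD. Returns latency
    in ms relative to perturbation onset, or None if never exceeded."""
    rect = [abs(x) for x in signal]
    nb = int(baseline_ms * fs_hz / 1000)     # baseline window, samples
    np_ = int(perturb_ms * fs_hz / 1000)     # perturbation onset index
    thresh = mean(rect[:nb]) + k * stdev(rect[:nb])
    for i in range(np_, len(rect)):
        if rect[i] > thresh:
            return (i - np_) * 1000.0 / fs_hz
    return None
```

    Comparing latencies computed this way between groups is how delays like those reported for the Faller group would be quantified.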

  14. iES - An Intelligent Electronic Sales Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stanton, V L; Korbe III, W; Gao, J G

    Current e-commerce systems support online shopping based on electronic product catalogs. The major issues associated with catalog-based commerce systems are: difficulty in distinguishing one retailer from another, complex navigation with confusing links, and a lack of personalized service. This paper reports an intelligent solution to address these issues. Our solution will provide a more personalized sales experience through the use of a transaction-based knowledge model that includes both the rules used for reasoning as well as the corresponding actions. Based on this solution, we have developed an intelligent electronic sales platform that is supported by a framework which provides the desired personalization as well as extensibility and customization capabilities. This paper reports our design and development of this system and application examples.

  15. Final Report. Center for Scalable Application Development Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mellor-Crummey, John

    2014-10-26

    The Center for Scalable Application Development Software (CScADS) was established as a partnership between Rice University, Argonne National Laboratory, University of California Berkeley, University of Tennessee – Knoxville, and University of Wisconsin – Madison. CScADS pursued an integrated set of activities with the aim of increasing the productivity of DOE computational scientists by catalyzing the development of systems software, libraries, compilers, and tools for leadership computing platforms. Principal Center activities were workshops to engage the research community in the challenges of leadership computing, research and development of open-source software, and work with computational scientists to help them develop codes for leadership computing platforms. This final report summarizes CScADS activities at Rice University in these areas.

  16. Role of Teacher Educational Institutions in Developing Personality of Student Teachers

    ERIC Educational Resources Information Center

    Prakash, Srinivasan; Xavier S. J., S. Amaladoss

    2014-01-01

    Teacher Education is an integral part of any educational system. It should provide a platform in developing the holistic personality of a student teacher. This paper reports on personality of student teachers and the role of Teacher Educational institutions in developing it. The sample consists of 1,080 student teachers of Madurai revenue…

  17. Role of Teacher Educational Institutions in Developing Personality of Student Teachers

    ERIC Educational Resources Information Center

    Prakash, S.; Xavier, S. Amaladoss

    2014-01-01

    Teacher Education is an integral part of any educational system. It should provide a platform in developing the holistic personality of a student teacher. This paper reports on personality of student teachers and the role of Teacher Educational institutions in developing it. The sample consists of 1080 student teachers of Madurai revenue district.…

  18. A Multi-Level Parallelization Concept for High-Fidelity Multi-Block Solvers

    NASA Technical Reports Server (NTRS)

    Hatay, Ferhat F.; Jespersen, Dennis C.; Guruswamy, Guru P.; Rizk, Yehia M.; Byun, Chansup; Gee, Ken; VanDalsem, William R. (Technical Monitor)

    1997-01-01

    The integration of high-fidelity Computational Fluid Dynamics (CFD) analysis tools with the industrial design process benefits greatly from the robust implementations that are transportable across a wide range of computer architectures. In the present work, a hybrid domain-decomposition and parallelization concept was developed and implemented into the widely-used NASA multi-block Computational Fluid Dynamics (CFD) packages implemented in ENSAERO and OVERFLOW. The new parallel solver concept, PENS (Parallel Euler Navier-Stokes Solver), employs both fine and coarse granularity in data partitioning as well as data coalescing to obtain the desired load-balance characteristics on the available computer platforms. This multi-level parallelism implementation itself introduces no changes to the numerical results, hence the original fidelity of the packages are identically preserved. The present implementation uses the Message Passing Interface (MPI) library for interprocessor message passing and memory accessing. By choosing an appropriate combination of the available partitioning and coalescing capabilities only during the execution stage, the PENS solver becomes adaptable to different computer architectures from shared-memory to distributed-memory platforms with varying degrees of parallelism. The PENS implementation on the IBM SP2 distributed memory environment at the NASA Ames Research Center obtains 85 percent scalable parallel performance using fine-grain partitioning of single-block CFD domains using up to 128 wide computational nodes. Multi-block CFD simulations of complete aircraft simulations achieve 75 percent perfect load-balanced executions using data coalescing and the two levels of parallelism. SGI PowerChallenge, SGI Origin 2000, and a cluster of workstations are the other platforms where the robustness of the implementation is tested. 
Performance behavior on the other computer platforms with a variety of realistic problems will be reported as this ongoing study progresses.
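The coarse-grain partitioning and coalescing step can be illustrated with a toy load balancer. This is a hypothetical sketch, not the PENS code: block sizes and rank count are invented, and the greedy largest-first assignment stands in for whatever heuristic the solver actually uses.

```python
# Hypothetical sketch of coarse-grain load balancing for a multi-block CFD
# domain: assign each block (largest first) to the least-loaded rank, then
# report how close the result is to a perfectly balanced execution.

def assign_blocks(block_sizes, n_ranks):
    """Greedy largest-first assignment of blocks to ranks."""
    loads = [0.0] * n_ranks
    assignment = {}
    for block, size in sorted(enumerate(block_sizes), key=lambda kv: -kv[1]):
        rank = loads.index(min(loads))   # least-loaded rank so far
        assignment[block] = rank
        loads[rank] += size
    return assignment, loads

def load_balance_efficiency(loads):
    """1.0 means perfect balance; the slowest rank sets the pace."""
    return sum(loads) / (len(loads) * max(loads))

blocks = [120, 80, 60, 45, 30, 25, 20, 10]   # grid points per block (made up)
assignment, loads = assign_blocks(blocks, 4)
print(loads, round(load_balance_efficiency(loads), 3))
```

Coalescing small blocks onto the same rank is exactly what keeps the `max(loads)` term, and hence the idle time of the fastest ranks, down.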

  19. Cellphone-based devices for bioanalytical sciences

    PubMed Central

    Vashist, Sandeep Kumar; Mudanyali, Onur; Schneider, E.Marion; Zengerle, Roland; Ozcan, Aydogan

    2014-01-01

During the last decade, there has been a rapidly growing trend toward the use of cellphone-based devices (CBDs) in bioanalytical sciences. For example, they have been used for digital microscopy, cytometry, read-out of immunoassays and lateral flow tests, electrochemical and surface plasmon resonance based bio-sensing, colorimetric detection and healthcare monitoring, among others. The cellphone can be considered one of the most promising devices for the development of next-generation point-of-care (POC) diagnostics platforms, enabling mobile healthcare delivery and personalized medicine. With more than 6.5 billion cellphone subscribers worldwide and approximately 1.6 billion new devices being sold each year, cellphone technology is also creating new business and research opportunities. Many cellphone-based devices, such as those targeted at diabetes management, weight management, and monitoring of blood pressure and pulse rate, have become commercially available in recent years. In addition to such monitoring platforms, several other CBDs are being introduced, targeting, e.g., microscopic imaging and sensing applications for medical diagnostics using novel computational algorithms and components already embedded in cellphones. This manuscript reviews these recent developments in CBDs for bioanalytical sciences, along with some of the challenges involved and the future opportunities. PMID:24287630

  20. Implementation of a Multiplex and Quantitative Proteomics Platform for Assessing Protein Lysates Using DNA-Barcoded Antibodies.

    PubMed

    Lee, Jinho; Geiss, Gary K; Demirkan, Gokhan; Vellano, Christopher P; Filanoski, Brian; Lu, Yiling; Ju, Zhenlin; Yu, Shuangxing; Guo, Huifang; Bogatzki, Lisa Y; Carter, Warren; Meredith, Rhonda K; Krishnamurthy, Savitri; Ding, Zhiyong; Beechem, Joseph M; Mills, Gordon B

    2018-06-01

    Molecular analysis of tumors forms the basis for personalized cancer medicine and increasingly guides patient selection for targeted therapy. Future opportunities for personalized medicine are highlighted by the measurement of protein expression levels via immunohistochemistry, protein arrays, and other approaches; however, sample type, sample quantity, batch effects, and "time to result" are limiting factors for clinical application. Here, we present a development pipeline for a novel multiplexed DNA-labeled antibody platform which digitally quantifies protein expression from lysate samples. We implemented a rigorous validation process for each antibody and show that the platform is amenable to multiple protocols covering nitrocellulose and plate-based methods. Results are highly reproducible across technical and biological replicates, and there are no observed "batch effects" which are common for most multiplex molecular assays. Tests from basal and perturbed cancer cell lines indicate that this platform is comparable to orthogonal proteomic assays such as Reverse-Phase Protein Array, and applicable to measuring the pharmacodynamic effects of clinically-relevant cancer therapeutics. Furthermore, we demonstrate the potential clinical utility of the platform with protein profiling from breast cancer patient samples to identify molecular subtypes. Together, these findings highlight the potential of this platform for enhancing our understanding of cancer biology in a clinical translation setting. © 2018 by The American Society for Biochemistry and Molecular Biology, Inc.

  1. Stroke patients’ utilisation of extrinsic feedback from computer-based technology in the home: a multiple case study realistic evaluation

    PubMed Central

    2014-01-01

Background Evidence indicates that post-stroke rehabilitation improves function, independence and quality of life. A key aspect of rehabilitation is the provision of appropriate information and feedback to the learner. Advances in information and communications technology (ICT) have allowed for the development of various systems to complement stroke rehabilitation that could be used in the home setting. These systems may increase the provision of rehabilitation a stroke survivor receives and carries out, as well as providing a learning platform that facilitates long-term self-managed rehabilitation and behaviour change. This paper describes the application of an innovative evaluative methodology to explore the utilisation of feedback for post-stroke upper-limb rehabilitation in the home. Methods Using the principles of realistic evaluation, this study aimed to test and refine intervention theories by exploring the complex interactions of contexts, mechanisms and outcomes that arise from technology deployment in the home. Methods included focus groups followed by multi-method case studies (n = 5) before, during and after the use of computer-based equipment. Data were analysed in relation to the context-mechanism-outcome hypotheses case by case. This was followed by a synthesis of the findings to answer the question, ‘what works for whom and in what circumstances and respects?’ Results Data analysis reveals that to achieve desired outcomes through the use of ICT, key elements of computer feedback, such as accuracy, measurability, rewarding feedback, adaptability, and knowledge of results feedback, are required to trigger the theory-driven mechanisms underpinning the intervention. In addition, the pre-existing context and the personal and environmental contexts, such as previous experience of service delivery, personal goals, trust in the technology, and social circumstances may also enable or constrain the underpinning theory-driven mechanisms. 
Conclusions Findings suggest that the theory-driven mechanisms underpinning the utilisation of feedback from computer-based technology for home-based upper-limb post-stroke rehabilitation are dependent on key elements of computer feedback and the personal and environmental context. The identification of these elements may therefore inform the development of technology; therapy education and the subsequent adoption of technology and a self-management paradigm; long-term self-managed rehabilitation; and importantly, improvements in the physical and psychosocial aspects of recovery. PMID:24903401

  2. Stroke patients' utilisation of extrinsic feedback from computer-based technology in the home: a multiple case study realistic evaluation.

    PubMed

    Parker, Jack; Mawson, Susan; Mountain, Gail; Nasr, Nasrin; Zheng, Huiru

    2014-06-05

Evidence indicates that post-stroke rehabilitation improves function, independence and quality of life. A key aspect of rehabilitation is the provision of appropriate information and feedback to the learner. Advances in information and communications technology (ICT) have allowed for the development of various systems to complement stroke rehabilitation that could be used in the home setting. These systems may increase the provision of rehabilitation a stroke survivor receives and carries out, as well as providing a learning platform that facilitates long-term self-managed rehabilitation and behaviour change. This paper describes the application of an innovative evaluative methodology to explore the utilisation of feedback for post-stroke upper-limb rehabilitation in the home. Using the principles of realistic evaluation, this study aimed to test and refine intervention theories by exploring the complex interactions of contexts, mechanisms and outcomes that arise from technology deployment in the home. Methods included focus groups followed by multi-method case studies (n = 5) before, during and after the use of computer-based equipment. Data were analysed in relation to the context-mechanism-outcome hypotheses case by case. This was followed by a synthesis of the findings to answer the question, 'what works for whom and in what circumstances and respects?' Data analysis reveals that to achieve desired outcomes through the use of ICT, key elements of computer feedback, such as accuracy, measurability, rewarding feedback, adaptability, and knowledge of results feedback, are required to trigger the theory-driven mechanisms underpinning the intervention. In addition, the pre-existing context and the personal and environmental contexts, such as previous experience of service delivery, personal goals, trust in the technology, and social circumstances may also enable or constrain the underpinning theory-driven mechanisms. 
Findings suggest that the theory-driven mechanisms underpinning the utilisation of feedback from computer-based technology for home-based upper-limb post-stroke rehabilitation are dependent on key elements of computer feedback and the personal and environmental context. The identification of these elements may therefore inform the development of technology; therapy education and the subsequent adoption of technology and a self-management paradigm; long-term self-managed rehabilitation; and importantly, improvements in the physical and psychosocial aspects of recovery.

  3. A System Engineering Study and Concept Development for a Humanitarian Aid and Disaster Relief Operations Management Platform

    DTIC Science & Technology

    2016-09-01

and network. The computing and network hardware are identified and include routers, servers, firewalls, laptops, backup hard drives, smart phones ... deployable hardware units will be necessary. This includes the use of ruggedized laptops and desktop computers, a projector system, communications system ... ENGINEERING STUDY AND CONCEPT DEVELOPMENT FOR A HUMANITARIAN AID AND DISASTER RELIEF OPERATIONS MANAGEMENT PLATFORM by Julie A. Reed September

  4. ROS-IGTL-Bridge: an open network interface for image-guided therapy using the ROS environment.

    PubMed

    Frank, Tobias; Krieger, Axel; Leonard, Simon; Patel, Niravkumar A; Tokuda, Junichi

    2017-08-01

With the growing interest in advanced image-guidance for surgical robot systems, rapid integration and testing of robotic devices and medical image computing software are becoming essential in research and development. Maximizing the use of existing engineering resources built on widely accepted platforms in different fields, such as the Robot Operating System (ROS) in robotics and 3D Slicer in medical image computing, could simplify these tasks. We propose a new open network bridge interface integrated in ROS to ensure seamless cross-platform data sharing. A ROS node named ROS-IGTL-Bridge was implemented. It establishes a TCP/IP network connection between the ROS environment and external medical image computing software using the OpenIGTLink protocol. The node exports ROS messages to the external software over the network and vice versa simultaneously, allowing seamless and transparent data sharing between the ROS-based devices and the medical image computing platforms. Performance tests demonstrated that the bridge could stream transforms, strings, points, and images at 30 fps in both directions successfully. The data transfer latency was <1.2 ms for transforms, strings and points, and 25.2 ms for color VGA images. A separate test also demonstrated that the bridge could achieve 900 fps for transforms. Additionally, the bridge was demonstrated in two representative systems: a mock image-guided surgical robot setup consisting of 3D Slicer and Lego Mindstorms with ROS, as a prototyping and educational platform for IGT research; and the smart tissue autonomous robot surgical setup with 3D Slicer. The study demonstrated that the bridge enabled cross-platform data sharing between ROS and medical image computing software. This will allow rapid and seamless integration of advanced image-based planning/navigation offered by medical image computing software such as 3D Slicer into ROS-based surgical robot systems.
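The bridge pattern described above, a length-prefixed message stream carrying transforms between two processes, can be sketched with plain sockets. This is emphatically not the OpenIGTLink wire format or the ROS-IGTL-Bridge code; the message layout, the helper names, and the "needle_tip" transform are all invented for illustration.

```python
# Simplified sketch of cross-process transform streaming: one side serializes
# a named 4x4 transform (16 doubles) with a length-prefixed header, the other
# side deserializes it. Not the OpenIGTLink protocol, just the general idea.
import socket
import struct

def send_transform(sock, name, matrix16):
    payload = name.encode() + b"\x00" + struct.pack("16d", *matrix16)
    sock.sendall(struct.pack("!I", len(payload)) + payload)  # length prefix

def recv_transform(sock):
    (length,) = struct.unpack("!I", sock.recv(4))
    data = b""
    while len(data) < length:                 # read until the full payload arrives
        data += sock.recv(length - len(data))
    name, raw = data.split(b"\x00", 1)
    return name.decode(), struct.unpack("16d", raw)

a, b = socket.socketpair()                    # stands in for a real TCP/IP link
identity = [1.0 if i % 5 == 0 else 0.0 for i in range(16)]  # 4x4 identity
send_transform(a, "needle_tip", identity)
name, mat = recv_transform(b)
print(name, mat[0], mat[5])
```

Real IGT messages also carry timestamps and device metadata; the length-prefixed framing shown here is what makes streaming many messages per second over one connection unambiguous.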

  5. On the Impact of Execution Models: A Case Study in Computational Chemistry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chavarría-Miranda, Daniel; Halappanavar, Mahantesh; Krishnamoorthy, Sriram

    2015-05-25

Efficient utilization of high-performance computing (HPC) platforms is an important and complex problem. Execution models, abstract descriptions of the dynamic runtime behavior of the execution stack, have significant impact on the utilization of HPC systems. Using a computational chemistry kernel as a case study and a wide variety of execution models combined with load balancing techniques, we explore the impact of execution models on the utilization of an HPC system. We demonstrate a 50 percent improvement in performance by using work stealing relative to a more traditional static scheduling approach. We also use a novel semi-matching technique for load balancing that has performance comparable to a traditional hypergraph-based partitioning implementation, which is computationally expensive. Through this study, we found that execution model design choices and assumptions can limit critical optimizations such as global, dynamic load balancing and finding the correct balance between available work units and different system and runtime overheads. With the emergence of multi- and many-core architectures and the consequent growth in the complexity of HPC platforms, we believe these lessons will benefit researchers tuning diverse applications on modern HPC platforms, especially on emerging dynamic platforms with energy-induced performance variability.
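The static-scheduling versus work-stealing comparison can be made concrete with a small, deterministic simulation. This is not the paper's computational-chemistry kernel: the task costs and worker count are invented, and the simulation only illustrates why stealing helps when a static partition is skewed.

```python
# Toy comparison of static scheduling vs. work stealing for tasks of uneven
# cost. Makespan = finish time of the slowest worker.
from collections import deque

def static_makespan(costs, n_workers):
    # Round-robin static assignment; no rebalancing once work is handed out.
    loads = [0] * n_workers
    for i, c in enumerate(costs):
        loads[i % n_workers] += c
    return max(loads)

def stealing_makespan(costs, n_workers):
    queues = [deque() for _ in range(n_workers)]
    for i, c in enumerate(costs):
        queues[i % n_workers].append(c)      # same skewed initial partition
    clock = [0] * n_workers
    while any(queues):
        w = min(range(n_workers), key=lambda i: clock[i])  # next free worker
        if not queues[w]:                    # idle: steal from the busiest queue
            victim = max(range(n_workers), key=lambda i: len(queues[i]))
            queues[w].append(queues[victim].pop())
        clock[w] += queues[w].popleft()
    return max(clock)

costs = [9, 1, 1, 9, 1, 1, 9, 1, 1, 9, 1, 1]  # made-up, deliberately skewed
print(static_makespan(costs, 3), stealing_makespan(costs, 3))
```

With the round-robin split, one worker receives all four expensive tasks (makespan 36), while stealing lets the idle workers take some of them over, cutting the makespan roughly in half.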

  6. Scalable, High-performance 3D Imaging Software Platform: System Architecture and Application to Virtual Colonoscopy

    PubMed Central

    Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli; Brett, Bevin

    2013-01-01

One of the key challenges in three-dimensional (3D) medical imaging is enabling fast turnaround times, which are often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth, due to the massive amount of data that must be processed. In this work, we have developed a software platform designed to support high-performance 3D medical image processing for a wide range of applications using increasingly available and affordable commodity computing systems: multi-core, cluster, and cloud computing systems. To achieve scalable, high-performance computing, our platform (1) employs size-adaptive, distributable block volumes as a core data structure for efficient parallelization of a wide range of 3D image processing algorithms; (2) supports task scheduling for efficient load distribution and balancing; and (3) consists of layered parallel software libraries that allow a wide range of medical applications to share the same functionalities. We evaluated the performance of our platform by applying it to an electronic cleansing system in virtual colonoscopy, with initial experimental results showing a 10-fold performance improvement on an 8-core workstation over the original sequential implementation of the system. PMID:23366803
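A minimal sketch of the block-volume idea, under assumptions of my own (the block size and the per-block thresholding operation are invented): the volume is cut into fixed-size blocks that can each be processed as an independent task and written back into place, which is what makes the parallelization in item (1) straightforward.

```python
# Sketch: split a 3D volume into independent blocks, process each block,
# and reassemble the result. Each block is a self-contained work unit that
# a scheduler could hand to any core or node.
import numpy as np

def iter_blocks(shape, block):
    for z in range(0, shape[0], block):
        for y in range(0, shape[1], block):
            for x in range(0, shape[2], block):
                yield (slice(z, z + block), slice(y, y + block), slice(x, x + block))

def process_volume(vol, block, op):
    out = np.empty(vol.shape, dtype=bool)
    for sl in iter_blocks(vol.shape, block):   # each iteration = one parallel task
        out[sl] = op(vol[sl])
    return out

vol = np.arange(4 * 4 * 4, dtype=float).reshape(4, 4, 4)
mask = process_volume(vol, block=2, op=lambda b: b > b.mean())  # toy per-block filter
print(mask.shape, int(mask.sum()))
```

Because each block carries its own bounds, the loop body can be dispatched to a thread pool or a cluster node without changing the algorithm, only the scheduler.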

  7. Interactions Between Genetics, Lifestyle, and Environmental Factors for Healthcare.

    PubMed

    Lin, Yuxin; Chen, Jiajia; Shen, Bairong

    2017-01-01

The occurrence and progression of diseases are strongly associated with a combination of genetic, lifestyle, and environmental factors. Understanding the interplay between genetic and nongenetic components provides deep insights into disease pathogenesis and promotes personalized healthcare strategies. Recently, the paradigm of systems medicine, which integrates biomedical data and knowledge at multiple dimensional levels, has been considered an optimal way for disease management and clinical decision-making in the era of precision medicine. In this chapter, epigenetically mediated genetics-lifestyle-environment interactions within specific diseases and different ethnic groups are systematically discussed, and data sources, computational models, and translational platforms for systems medicine research are presented in turn. Moreover, suggestions on precision healthcare and healthy longevity are proposed based on a comprehensive review of current studies.

  8. Virtual Reality: You Are There

    NASA Technical Reports Server (NTRS)

    1993-01-01

Telepresence, or "virtual reality," allows a person, with assistance from advanced technology devices, to project himself figuratively into another environment. This technology is marketed by several companies, among them Fakespace, Inc., a former Ames Research Center contractor. Fakespace developed a teleoperational motion platform for transmitting sounds and images from remote locations. The "Molly" matches the user's head motion and, when coupled with a stereo viewing device and appropriate software, creates the telepresence experience. Its companion piece is the BOOM, the user's viewing device that provides the sense of involvement in the virtual environment. Either system may be used alone. Because suits, gloves, headphones, etc. are not needed, a whole range of commercial applications is possible, including computer-aided design techniques and virtual reality visualizations. Customers include Sandia National Laboratories, Stanford Research Institute and Mattel Toys.

  9. Association Rule Analysis for Tour Route Recommendation and Application to WCTSNOP

    NASA Astrophysics Data System (ADS)

    Fang, H.; Chen, C.; Lin, J.; Liu, X.; Fang, D.

    2017-09-01

E-tourism systems increasingly provide intelligent tour recommendations for tourists: a recommender system can make personalized suggestions and deliver relevant information throughout the tour cycle. Data mining is a suitable tool for extracting latent information from large databases to support strategic decisions. In this study, association rule analysis based on the FP-growth algorithm is applied to find association relationships among scenic spots in different cities for tour route recommendation. To identify valuable rules, the Kulczynski interestingness measure is adopted and the imbalance ratio is computed. The proposed scheme was evaluated on the Wangluzhe cultural tourism service network operation platform (WCTSNOP), verifying that it can quickly recommend tour routes and rapidly improve recommendation quality.
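The two rule-quality measures named above can be computed directly from support counts. The Kulczynski measure averages the two conditional probabilities P(B|A) and P(A|B); the imbalance ratio (IR) captures how skewed the two itemset supports are. The visit counts below are hypothetical, not the paper's data.

```python
# Kulczynski interestingness measure and imbalance ratio from support counts.
def kulczynski(sup_a, sup_b, sup_ab):
    # 0.5 * (P(B|A) + P(A|B)); null-invariant, ranges over (0, 1]
    return 0.5 * (sup_ab / sup_a + sup_ab / sup_b)

def imbalance_ratio(sup_a, sup_b, sup_ab):
    # 0 for perfectly balanced supports, approaches 1 for highly skewed ones
    return abs(sup_a - sup_b) / (sup_a + sup_b - sup_ab)

# Hypothetical counts: 1000 tourists visited spot A, 100 visited spot B,
# 90 visited both.
sup_a, sup_b, sup_ab = 1000, 100, 90
print(round(kulczynski(sup_a, sup_b, sup_ab), 3),
      round(imbalance_ratio(sup_a, sup_b, sup_ab), 3))
```

Here Kulczynski is a middling 0.495 while IR is high (about 0.89), flagging a rule whose interestingness comes almost entirely from one direction (most B-visitors also visit A, but not vice versa), which is exactly the situation these two measures are used together to detect.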

  10. Practical End-to-End Performance Testing Tool for High Speed 3G-Based Networks

    NASA Astrophysics Data System (ADS)

    Shinbo, Hiroyuki; Tagami, Atsushi; Ano, Shigehiro; Hasegawa, Toru; Suzuki, Kenji

High speed IP communication is a killer application for 3rd generation (3G) mobile systems. Thus, 3G network operators should perform extensive tests to check whether the expected end-to-end performance is delivered to customers under various environments. An important objective of such tests is to check whether network nodes fulfill requirements on packet-processing durations, because long processing delays cause performance degradation. This requires testers (the persons who run the tests) to know precisely how long a packet is held by various network nodes. Without a tool's help, this task is time-consuming and error-prone. We therefore propose a multi-point packet header analysis tool which extracts and records packet headers with synchronized timestamps at multiple observation points. Such recorded packet headers enable testers to calculate these holding durations. The notable feature of this tool is that it is implemented on off-the-shelf hardware platforms, i.e., laptop personal computers. The key challenges of the implementation are precise clock synchronization without any special hardware and a sophisticated header extraction algorithm without any packet drops.
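Given the synchronized, multi-point header records the tool produces, computing a node's packet-holding duration reduces to a per-packet timestamp difference between the observation points on either side of the node. The log layout and values below are invented for illustration.

```python
# Sketch: derive per-packet holding durations at a network node from two
# synchronized capture logs, one upstream and one downstream of the node.
def holding_durations(upstream, downstream):
    """Each log maps packet-id -> capture timestamp (seconds, synchronized clocks)."""
    return {pid: downstream[pid] - t
            for pid, t in upstream.items() if pid in downstream}

# Hypothetical capture records (packet ids and timestamps are made up).
before_node = {"pkt1": 0.0010, "pkt2": 0.0025, "pkt3": 0.0040}
after_node  = {"pkt1": 0.0120, "pkt2": 0.0031, "pkt3": 0.0300}

for pid, d in holding_durations(before_node, after_node).items():
    print(pid, round(d * 1000, 1), "ms")
```

The whole method stands or falls on the timestamps being on a common clock, which is why the paper treats software-only clock synchronization as a key challenge.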

  11. Google Earth Engine: a new cloud-computing platform for global-scale earth observation data and analysis

    NASA Astrophysics Data System (ADS)

    Moore, R. T.; Hansen, M. C.

    2011-12-01

    Google Earth Engine is a new technology platform that enables monitoring and measurement of changes in the earth's environment, at planetary scale, on a large catalog of earth observation data. The platform offers intrinsically-parallel computational access to thousands of computers in Google's data centers. Initial efforts have focused primarily on global forest monitoring and measurement, in support of REDD+ activities in the developing world. The intent is to put this platform into the hands of scientists and developing world nations, in order to advance the broader operational deployment of existing scientific methods, and strengthen the ability for public institutions and civil society to better understand, manage and report on the state of their natural resources. Earth Engine currently hosts online nearly the complete historical Landsat archive of L5 and L7 data collected over more than twenty-five years. Newly-collected Landsat imagery is downloaded from USGS EROS Center into Earth Engine on a daily basis. Earth Engine also includes a set of historical and current MODIS data products. The platform supports generation, on-demand, of spatial and temporal mosaics, "best-pixel" composites (for example to remove clouds and gaps in satellite imagery), as well as a variety of spectral indices. Supervised learning methods are available over the Landsat data catalog. The platform also includes a new application programming framework, or "API", that allows scientists access to these computational and data resources, to scale their current algorithms or develop new ones. Under the covers of the Google Earth Engine API is an intrinsically-parallel image-processing system. Several forest monitoring applications powered by this API are currently in development and expected to be operational in 2011. 
Combining science with massive data and technology resources in a cloud-computing framework can offer advantages of computational speed, ease-of-use and collaboration, as well as transparency in data and methods. Methods developed for global processing of MODIS data to map land cover are being adopted for use with Landsat data. Specifically, the MODIS Vegetation Continuous Field product methodology has been applied for mapping forest extent and change at national scales using Landsat time-series data sets. Scaling this method to continental and global scales is enabled by Google Earth Engine computing capabilities. By combining the supervised learning VCF approach with the Landsat archive and cloud computing, unprecedented monitoring of land cover dynamics is enabled.
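The "best-pixel" compositing described above can be sketched as a per-pixel median over a cloud-masked image stack. The tiny arrays and the 0.5 reflectance threshold are synthetic stand-ins for illustration, not Earth Engine code or a real cloud-screening algorithm.

```python
# Conceptual sketch of a best-pixel composite: per-pixel median across a
# stack of co-registered scenes, ignoring cloud-masked samples.
import numpy as np

scenes = np.array([
    [[0.30, 0.31], [0.29, 0.90]],   # 0.90: cloud-contaminated sample
    [[0.32, 0.30], [0.28, 0.33]],
    [[0.31, 0.95], [0.30, 0.34]],   # 0.95: cloud-contaminated sample
])
cloud_mask = scenes > 0.5           # crude screen (assumed threshold)

stack = np.where(cloud_mask, np.nan, scenes)   # drop masked samples
composite = np.nanmedian(stack, axis=0)        # per-pixel "best" value
print(composite)
```

Because every pixel is computed independently, this is an embarrassingly parallel operation, which is why a platform with thousands of machines can generate such mosaics on demand over a whole Landsat archive.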

  12. Design and fabrication of an angle-scanning based platform for the construction of surface plasmon resonance biosensor

    NASA Astrophysics Data System (ADS)

    Hu, Jiandong; Cao, Baiqiong; Wang, Shun; Li, Jianwei; Wei, Wensong; Zhao, Yuanyuan; Hu, Xinran; Zhu, Juanhua; Jiang, Min; Sun, Xiaohui; Chen, Ruipeng; Ma, Liuzheng

    2016-03-01

A sensing system for an angle-scanning optical surface-plasmon-resonance (SPR) biosensor has been designed around a laser line generator with an embedded P polarizer, which serves as the excitation source for producing the surface plasmon wave. In this system, the beam emitted from the laser line generator is swept through the scanning angle by a variable-speed direct-current (DC) motor. The light reflected from the prism, which is coated with a 50 nm Au film, is captured by an area CCD array controlled by a personal computer (PC) via a universal serial bus (USB) interface. The photoelectric signals from the high-speed digital camera (the area CCD array) are converted by a 16-bit A/D converter before being transferred to the PC. A key advantage of this SPR biosensing platform is label-free, real-time bio-molecular analysis without having to move the area CCD array to follow the laser line generator. It could also provide a low-cost surface plasmon resonance platform with an improved detection range for the measurement of bioanalytes. The SPR curve promptly displayed on the PC screen is formed from the effective data in the image on the area CCD array, and the sensing responses of the platform to bulk refractive indices were calibrated using various concentrations of ethanol solution. Ethanol solutions with volume fractions of 5%, 10%, 15%, 20%, and 25% were tested to validate the performance of the angle-scanning optical SPR biosensing platform. As a result, the SPR sensor was capable of detecting changes in the refractive index of the ethanol solution with high linearity (correlation coefficient 0.9842). The enhanced detection range follows from the geometric relationship between the laser line generator and the right-angle prism, allowing direct quantification of samples over a wide range of concentrations.
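The calibration step can be sketched as an ordinary linear fit of sensor response against bulk refractive index, with the correlation coefficient quantifying linearity. The refractive-index and resonance-angle values below are invented for illustration; they are not the paper's measurements.

```python
# Sketch of a bulk refractive-index calibration: fit response vs. refractive
# index and report slope (sensitivity) and correlation coefficient.
import numpy as np

ri       = np.array([1.3340, 1.3387, 1.3435, 1.3483, 1.3530])  # assumed RI of the 5-25% ethanol series
response = np.array([61.10, 61.42, 61.70, 62.05, 62.31])       # resonance angle, degrees (made up)

slope, intercept = np.polyfit(ri, response, 1)   # sensitivity in degrees per RIU
r = np.corrcoef(ri, response)[0, 1]              # linearity of the calibration
print(round(float(slope), 1), round(float(r), 4))
```

In a real calibration the slope (degrees per refractive-index unit) is the figure of merit for sensitivity, and an r close to 1, like the paper's 0.9842, confirms the response is usably linear over the tested range.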

  13. SU-E-T-628: A Cloud Computing Based Multi-Objective Optimization Method for Inverse Treatment Planning.

    PubMed

    Na, Y; Suh, T; Xing, L

    2012-06-01

Multi-objective (MO) plan optimization entails generating an enormous number of IMRT or VMAT plans constituting the Pareto surface, which presents a computationally challenging task. The purpose of this work is to overcome this hurdle by developing an efficient MO method using an emerging cloud computing platform. As the backbone of cloud computing for optimizing inverse treatment planning, Amazon Elastic Compute Cloud with a master node (17.1 GB memory, 2 virtual cores, 420 GB instance storage, 64-bit platform) is used. The master node can seamlessly scale a number of worker instances, called workers, based on user-defined settings accounting for MO functions in the clinical setting. Each worker solves the objective function with an efficient sparse decomposition method. Workers are automatically terminated when their tasks are finished. The optimized plans are archived on the master node to generate the Pareto solution set. Three clinical cases have been planned using the developed MO IMRT and VMAT planning tools to demonstrate the advantages of the proposed method. The target dose coverage and critical-structure sparing of plans obtained using the cloud computing platform are identical to those obtained using a desktop PC (Intel Xeon® CPU 2.33 GHz, 8 GB memory). The MO planning substantially speeds up generation of the Pareto set for both types of plans. The speedup scales approximately linearly with the number of nodes used for computing. With the use of N nodes, the computational time is reduced according to the fitted model 0.2 + 2.3/N (r² > 0.99) on average across the cases, making real-time MO planning possible. A cloud computing infrastructure is developed for MO optimization. The algorithm substantially improves the speed of inverse plan optimization. The platform is valuable for both MO planning and future off-line or on-line adaptive re-planning. © 2012 American Association of Physicists in Medicine.
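The reported scaling model, T(N) = 0.2 + 2.3/N, is a fixed serial overhead plus a perfectly parallel part, and its coefficients can be recovered with an ordinary least-squares fit. The timing samples below are synthetic points generated from those very coefficients, purely to show the fitting step; they are not the paper's measurements.

```python
# Fit T(N) = a + b/N to (node count, time) samples by linear least squares
# on the basis [1, 1/N].
import numpy as np

nodes = np.array([1, 2, 4, 8, 16])
times = 0.2 + 2.3 / nodes            # synthetic timings following the model

A = np.column_stack([np.ones(len(nodes)), 1.0 / nodes])
(a, b), *_ = np.linalg.lstsq(A, times, rcond=None)
print(round(float(a), 3), round(float(b), 3))   # recovers a=0.2, b=2.3
```

The fitted constant a is the part of the runtime no amount of extra nodes removes, so T(N) flattens toward 0.2 as N grows, which matches the "approximately linear" speedup saturating at large node counts.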

  14. A Platform for Scalable Satellite and Geospatial Data Analysis

    NASA Astrophysics Data System (ADS)

    Beneke, C. M.; Skillman, S.; Warren, M. S.; Kelton, T.; Brumby, S. P.; Chartrand, R.; Mathis, M.

    2017-12-01

    At Descartes Labs, we use the commercial cloud to run global-scale machine learning applications over satellite imagery. We have processed over 5 Petabytes of public and commercial satellite imagery, including the full Landsat and Sentinel archives. By combining open-source tools with a FUSE-based filesystem for cloud storage, we have enabled a scalable compute platform that has demonstrated reading over 200 GB/s of satellite imagery into cloud compute nodes. In one application, we generated global 15m Landsat-8, 20m Sentinel-1, and 10m Sentinel-2 composites from 15 trillion pixels, using over 10,000 CPUs. We recently created a public open-source Python client library that can be used to query and access preprocessed public satellite imagery from within our platform, and made this platform available to researchers for non-commercial projects. In this session, we will describe how you can use the Descartes Labs Platform for rapid prototyping and scaling of geospatial analyses and demonstrate examples in land cover classification.

  15. FPGA platform for prototyping and evaluation of neural network automotive applications

    NASA Technical Reports Server (NTRS)

    Aranki, N.; Tawel, R.

    2002-01-01

In this paper we present an FPGA-based reconfigurable computing platform for prototyping and evaluation of advanced neural-network-based applications for control and diagnostics in automotive sub-systems.

  16. Collegial Activity Learning between Heterogeneous Sensors.

    PubMed

    Feuz, Kyle D; Cook, Diane J

    2017-11-01

Activity recognition algorithms have matured and become more ubiquitous in recent years. However, these algorithms are typically customized for a particular sensor platform. In this paper we introduce PECO, a Personalized activity ECOsystem, which transfers learned activity information seamlessly between sensor platforms in real time, so that any available sensor can continue to track activities without requiring its own extensive labeled training data. We introduce a multi-view transfer learning algorithm that facilitates this information handoff between sensor platforms and provide theoretical performance bounds for the algorithm. In addition, we empirically evaluate PECO using datasets that utilize heterogeneous sensor platforms to perform activity recognition. These results indicate that not only can activity recognition algorithms transfer important information to new sensor platforms, but any number of platforms can work together as colleagues to boost performance.
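A hypothetical sketch of the multi-view transfer idea (not the actual PECO algorithm): a model trained on the labeled source sensor pseudo-labels the time-aligned stream, and those labels bootstrap a classifier for the new target sensor that has no hand-labeled data of its own. The features, classes, and nearest-centroid learner are all invented stand-ins.

```python
# Toy two-view label handoff: source sensor is labeled; target sensor sees
# the same activities at the same times but in a different feature space.
import numpy as np

rng = np.random.default_rng(0)

def make_view(centers, n=40):
    X = np.vstack([c + rng.normal(0, 0.3, (n, 2)) for c in centers])
    y = np.repeat(np.arange(len(centers)), n)
    return X, y

def nearest_centroid_fit(X, y):
    return np.array([X[y == k].mean(axis=0) for k in np.unique(y)])

def predict(centroids, X):
    return np.argmin(((X[:, None, :] - centroids) ** 2).sum(-1), axis=1)

Xs, ys = make_view([(0, 0), (3, 3)])        # source view, labeled
Xt, yt_true = make_view([(5, -1), (-2, 4)]) # target view, time-aligned, unlabeled

src_model = nearest_centroid_fit(Xs, ys)
pseudo_labels = predict(src_model, Xs)              # source model labels the shared stream
tgt_model = nearest_centroid_fit(Xt, pseudo_labels) # target learns from transferred labels
acc = float((predict(tgt_model, Xt) == yt_true).mean())
print(round(acc, 2))
```

The key assumption, as in the paper, is temporal alignment: because both sensors observe the same activity instances, labels predicted in one view are valid training targets in the other.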

  17. 46 CFR 163.002-5 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... load means the sum of the weights of— (1) The rigid ladder or lift platform, the suspension cables (if... persons capacity of the hoist; (c) Lift height means the distance from the lowest step of the pilot ladder... (2) If the hoist does not have suspension cables, the ladder or lift platform is in its lowest...

  18. 46 CFR 163.002-5 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... load means the sum of the weights of— (1) The rigid ladder or lift platform, the suspension cables (if... persons capacity of the hoist; (c) Lift height means the distance from the lowest step of the pilot ladder... (2) If the hoist does not have suspension cables, the ladder or lift platform is in its lowest...

  19. Desktop Computing Integration Project

    NASA Technical Reports Server (NTRS)

    Tureman, Robert L., Jr.

    1992-01-01

The Desktop Computing Integration Project for the Human Resources Management Division (HRMD) of LaRC was designed to help division personnel use personal computing resources to perform job tasks. The three goals of the project were to involve HRMD personnel in desktop computing, to link mainframe data to desktop capabilities, and to estimate training needs for the division. The project resulted in increased usage of personal computers by Awards specialists, an increased awareness of LaRC resources that help perform tasks, and personal computer output that was used in presenting information to center personnel. In addition, the necessary skills for HRMD personal computer users were identified. The Awards Office was chosen for the project because of the consistency of their data requests and the desire of employees in that area to use the personal computer.

  20. An electromagnetic signals monitoring and analysis wireless platform employing personal digital assistants and pattern analysis techniques

    NASA Astrophysics Data System (ADS)

    Ninos, K.; Georgiadis, P.; Cavouras, D.; Nomicos, C.

    2010-05-01

    This study presents the design and development of a mobile wireless platform, employing Personal Digital Assistants (PDAs), for monitoring and analysis of seismic events and related electromagnetic (EM) signals. A prototype custom-developed application was deployed on a 3G-enabled PDA that could connect to the FTP server of the Institute of Geodynamics of the National Observatory of Athens and receive and display EM signals at four receiver frequencies (3 kHz (E-W, N-S), 10 kHz (E-W, N-S), 41 MHz and 46 MHz). Signals may originate from any one of the 16 field stations located around the Greek territory. Employing continuous recordings of EM signals gathered from January 2003 till December 2007, a Support Vector Machines (SVM)-based classification system was designed to distinguish EM precursor signals within a noisy background. EM signals corresponding to recordings preceding major seismic events (Ms≥5R) were segmented by an experienced scientist, and five features (mean, variance, skewness, kurtosis, and a wavelet-based feature) derived from the EM signals were calculated. These features were used to train the SVM-based classification scheme. The performance of the system was evaluated by the exhaustive search and leave-one-out methods, giving 87.2% overall classification accuracy in correctly identifying EM precursor signals within a noisy background employing all calculated features. Due to the insufficient processing power of the PDAs, this task was performed on a typical desktop computer. The trained SVM classifier was then integrated into the PDA-based application, rendering the platform capable of discriminating between EM precursor signals and noise. The system's efficiency was evaluated by an expert who reviewed (1) multiple EM signals, up to 18 days prior to corresponding past seismic events, and (2) the possible EM activity of a specific region employing the trained SVM classifier.
Additionally, the proposed architecture can form a base platform for a future integrated system that will incorporate services such as notifications for field station power failures, disruption of data flow, occurring SEs, and even other types of measurement and analysis processes such as the integration of a special analysis algorithm based on the ratio of short term to long term signal average.
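The feature-extraction and SVM stage described above can be sketched as follows. The data here are synthetic, and the wavelet-based feature is replaced by a first-difference energy stand-in (a crude high-frequency proxy), so this is an illustration of the scheme rather than the authors' implementation:

```python
# Five-feature EM-signal classification sketch (toy data).
import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.svm import SVC

def em_features(segment: np.ndarray) -> np.ndarray:
    """Five statistical features of one EM-signal segment."""
    return np.array([
        segment.mean(),
        segment.var(),
        skew(segment),
        kurtosis(segment),
        np.sum(np.diff(segment) ** 2),  # stand-in for the wavelet feature
    ])

rng = np.random.default_rng(1)
# Toy data: class 0 = background noise, class 1 = "precursor-like"
# segments with higher variance (illustrative only).
noise = [rng.normal(0, 1, 512) for _ in range(40)]
burst = [rng.normal(0, 3, 512) for _ in range(40)]
X = np.array([em_features(s) for s in noise + burst])
y = np.array([0] * 40 + [1] * 40)

clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
print(clf.score(X, y))  # training accuracy on the toy data
```

In the paper's setting, the same trained decision function is exported to the PDA application, so the heavy training step stays on the desktop while classification runs on the handheld.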

  1. Typology of person-environment fit constellations: a platform addressing accessibility problems in the built environment for people with functional limitations.

    PubMed

    Slaug, Björn; Schilling, Oliver; Iwarsson, Susanne; Carlsson, Gunilla

    2015-09-02

    Making the built environment accessible for all regardless of functional capacity is an important goal for public health efforts. Considerable impediments to achieving this goal suggest the need for valid measurements of accessibility and for greater attention to the complexity of person-environment fit issues. To address these needs, this study aimed to provide a methodological platform useful for further research and instrument development within accessibility research. This was accomplished by constructing a typology of problematic person-environment fit constellations, utilizing an existing methodology developed to assess and analyze accessibility problems in the built environment. By means of qualitative review and statistical methods, we classified the person-environment fit components covered by an existing application that targets housing accessibility: the Housing Enabler (HE) instrument. The International Classification of Functioning, Disability and Health (ICF) was used as a conceptual framework. Qualitative classification principles were based on conceptual similarities, and for quantitative analysis of similarities, Principal Component Analysis was carried out. We present a typology of problematic person-environment fit constellations classified along three dimensions: (1) accessibility problem range and severity, (2) aspects of functioning, and (3) environmental context. As a result of the classification of the HE components, 48 typical person-environment fit constellations were recognised. The main contribution of this study is the proposed typology of person-environment fit constellations. The typology provides a methodological platform for the identification and quantification of problematic person-environment fit constellations. Its link to the globally accepted ICF classification system facilitates communication within the scientific and health care practice communities.
The typology also highlights how relations between aspects of functioning and physical environmental barriers generate typical accessibility problems, and thereby furnishes a reference point for research oriented to how the built environment may be designed to be supportive for activity, participation and health.

  2. An Approach to Integrate a Space-Time GIS Data Model with High Performance Computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Dali; Zhao, Ziliang; Shaw, Shih-Lung

    2011-01-01

    In this paper, we describe an approach to integrating a Space-Time GIS data model with a high performance computing platform. The Space-Time GIS data model was developed in a desktop computing environment. We use the Space-Time GIS data model to generate a GIS module, which organizes a series of remote sensing data. We are in the process of porting the GIS module into an HPC environment, in which the GIS modules handle large datasets directly via a parallel file system. Although this is an ongoing project, the authors hope this effort can inspire further discussions on the integration of GIS with high performance computing platforms.

  3. A demonstrative model of a lunar base simulation on a personal computer

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The initial demonstration model of a lunar base simulation is described. This initial model was developed on the personal computer level to demonstrate feasibility and technique before proceeding to a larger computer-based model. Lotus Symphony Version 1.1 software was used to base the demonstration model on a personal computer with an MS-DOS operating system. The personal computer-based model determined the applicability of lunar base modeling techniques developed at an LSPI/NASA workshop. In addition, the personal computer-based demonstration model defined a modeling structure that could be employed on a larger, more comprehensive VAX-based lunar base simulation. Refinement of this personal computer model and the development of a VAX-based model are planned for the near future.

  4. Reprocessing Multiyear GPS Data from Continuously Operating Reference Stations on Cloud Computing Platform

    NASA Astrophysics Data System (ADS)

    Yoon, S.

    2016-12-01

    To define the geodetic reference frame using GPS data collected by the Continuously Operating Reference Stations (CORS) network, historical GPS data need to be reprocessed regularly. Reprocessing GPS data collected by up to 2,000 CORS sites over the last two decades requires substantial computational resources. At the National Geodetic Survey (NGS), one reprocessing was completed in 2011, and the second reprocessing is currently under way. For the first reprocessing effort, in-house computing resources were utilized. In the current second reprocessing effort, an outsourced cloud computing platform is being utilized. In this presentation, the data processing strategy at NGS is outlined, as well as the effort to parallelize the data processing procedure in order to maximize the benefit of the cloud computing. The time and cost savings realized by utilizing the cloud computing approach will also be discussed.
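A parallelization strategy of the kind described works because each day's network solution is independent of the others, so the two decades of daily jobs can be fanned out across workers. A minimal sketch, with `process_day` as a purely hypothetical stand-in for the real per-day solution (threads are used here for brevity; the real CPU-bound workload would use separate processes or cloud instances):

```python
# Fan independent per-day jobs out across workers (illustrative).
from concurrent.futures import ThreadPoolExecutor
from datetime import date, timedelta

def process_day(day: date) -> tuple:
    # Placeholder for one day's full-network GPS solution.
    return day, f"solution-{day.isoformat()}"

# A small slice of the multi-decade daily archive.
days = [date(2016, 1, 1) + timedelta(days=i) for i in range(8)]

with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(process_day, days))

print(len(results))  # one solution per day
```

Because the jobs share no state, the speedup scales with the number of workers until storage or network bandwidth becomes the bottleneck, which is what makes pay-per-use cloud instances attractive for a periodic reprocessing campaign.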

  5. Validation of MCNP6 Version 1.0 with the ENDF/B-VII.1 Cross Section Library for Plutonium Metals, Oxides, and Solutions on the High Performance Computing Platform Moonlight

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chapman, Bryan Scott; Gough, Sean T.

    This report documents a validation of the MCNP6 Version 1.0 computer code on the high performance computing platform Moonlight, for operations at Los Alamos National Laboratory (LANL) that involve plutonium metals, oxides, and solutions. The validation is conducted using the ENDF/B-VII.1 continuous energy group cross section library at room temperature. The results are for use by nuclear criticality safety personnel in performing analysis and evaluation of various facility activities involving plutonium materials.

  6. Development of a Very Dense Liquid Cooled Compute Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, Phillip N.; Lipp, Robert J.

    2013-12-10

    The objective of this project was to design and develop a prototype very energy efficient high density compute platform with 100% pumped refrigerant liquid cooling using commodity components and high volume manufacturing techniques. Testing at SLAC has indicated that we achieved a DCIE of 0.93 against our original goal of 0.85. This number includes both cooling and power supply and was achieved employing some of the highest wattage processors available.

  7. Technical Note: scuda: A software platform for cumulative dose assessment.

    PubMed

    Park, Seyoun; McNutt, Todd; Plishker, William; Quon, Harry; Wong, John; Shekhar, Raj; Lee, Junghoon

    2016-10-01

    Accurate tracking of anatomical changes and computation of actually delivered dose to the patient are critical for successful adaptive radiation therapy (ART). Additionally, efficient data management and fast processing are practically important for the adoption in clinic as ART involves a large amount of image and treatment data. The purpose of this study was to develop an accurate and efficient Software platform for CUmulative Dose Assessment (scuda) that can be seamlessly integrated into the clinical workflow. scuda consists of deformable image registration (DIR), segmentation, dose computation modules, and a graphical user interface. It is connected to our image PACS and radiotherapy informatics databases from which it automatically queries/retrieves patient images, radiotherapy plan, beam data, and daily treatment information, thus providing an efficient and unified workflow. For accurate registration of the planning CT and daily CBCTs, the authors iteratively correct CBCT intensities by matching local intensity histograms during the DIR process. Contours of the target tumor and critical structures are then propagated from the planning CT to daily CBCTs using the computed deformations. The actual delivered daily dose is computed using the registered CT and patient setup information by a superposition/convolution algorithm, and accumulated using the computed deformation fields. Both DIR and dose computation modules are accelerated by a graphics processing unit. The cumulative dose computation process has been validated on 30 head and neck (HN) cancer cases, showing 3.5 ± 5.0 Gy (mean±STD) absolute mean dose differences between the planned and the actually delivered doses in the parotid glands. On average, DIR, dose computation, and segmentation take 20 s/fraction and 17 min for a 35-fraction treatment including additional computation for dose accumulation. 
The authors developed a unified software platform that provides accurate and efficient monitoring of anatomical changes and computation of actually delivered dose to the patient, thus realizing an efficient cumulative dose computation workflow. Evaluation on HN cases demonstrated the utility of our platform for monitoring the treatment quality and detecting significant dosimetric variations that are keys to successful ART.
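The dose accumulation step described above, warping each daily dose back to the planning CT frame through its deformation field and summing, can be sketched in simplified 2-D form. All arrays here are synthetic and the function names are illustrative; the clinical system operates on 3-D grids with GPU-accelerated DIR and superposition/convolution dose computation:

```python
# Deformable dose accumulation sketch (2-D, synthetic data).
import numpy as np
from scipy.ndimage import map_coordinates

def accumulate_dose(daily_doses, deformation_fields):
    """Sum daily doses after pulling each back through its DVF.

    deformation_fields[i] has shape (2, H, W): per-voxel (row, col)
    displacement mapping planning-CT voxels to daily-image voxels.
    """
    total = np.zeros_like(daily_doses[0])
    grid = np.indices(total.shape).astype(float)
    for dose, dvf in zip(daily_doses, deformation_fields):
        coords = grid + dvf  # where each planning voxel lands that day
        total += map_coordinates(dose, coords, order=1, mode="nearest")
    return total

H = W = 32
doses = [np.full((H, W), 2.0) for _ in range(35)]  # 2 Gy/fraction
dvfs = [np.zeros((2, H, W)) for _ in range(35)]    # identity motion
cumulative = accumulate_dose(doses, dvfs)
print(cumulative[0, 0])  # -> 70.0 for 35 fractions at 2 Gy each
```

With identity deformation fields the accumulation reduces to a plain sum; in practice each DVF comes from the CT-to-CBCT registration, so anatomical changes shift where each fraction's dose is deposited in the planning frame.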

  8. Technical Note: SCUDA: A software platform for cumulative dose assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Seyoun; McNutt, Todd; Quon, Harry

    Purpose: Accurate tracking of anatomical changes and computation of actually delivered dose to the patient are critical for successful adaptive radiation therapy (ART). Additionally, efficient data management and fast processing are practically important for the adoption in clinic as ART involves a large amount of image and treatment data. The purpose of this study was to develop an accurate and efficient Software platform for CUmulative Dose Assessment (SCUDA) that can be seamlessly integrated into the clinical workflow. Methods: SCUDA consists of deformable image registration (DIR), segmentation, dose computation modules, and a graphical user interface. It is connected to our image PACS and radiotherapy informatics databases from which it automatically queries/retrieves patient images, radiotherapy plan, beam data, and daily treatment information, thus providing an efficient and unified workflow. For accurate registration of the planning CT and daily CBCTs, the authors iteratively correct CBCT intensities by matching local intensity histograms during the DIR process. Contours of the target tumor and critical structures are then propagated from the planning CT to daily CBCTs using the computed deformations. The actual delivered daily dose is computed using the registered CT and patient setup information by a superposition/convolution algorithm, and accumulated using the computed deformation fields. Both DIR and dose computation modules are accelerated by a graphics processing unit. Results: The cumulative dose computation process has been validated on 30 head and neck (HN) cancer cases, showing 3.5 ± 5.0 Gy (mean±STD) absolute mean dose differences between the planned and the actually delivered doses in the parotid glands. On average, DIR, dose computation, and segmentation take 20 s/fraction and 17 min for a 35-fraction treatment including additional computation for dose accumulation.
Conclusions: The authors developed a unified software platform that provides accurate and efficient monitoring of anatomical changes and computation of actually delivered dose to the patient, thus realizing an efficient cumulative dose computation workflow. Evaluation on HN cases demonstrated the utility of our platform for monitoring the treatment quality and detecting significant dosimetric variations that are keys to successful ART.

  9. GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data

    NASA Astrophysics Data System (ADS)

    Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.

    2016-12-01

    Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for storing, computing on, and analyzing data. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark, a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is constructed on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build a multi-user cloud computing infrastructure for GISpark. Virtual storage systems such as HDFS, Ceph, and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools that apply to other domains with spatial properties.
We tested the performance of the platform on a taxi trajectory analysis. The results suggest that GISpark achieves excellent runtime performance in spatiotemporal big data applications.

  10. A Multi-Peer Assessment Platform for Programming Language Learning: Considering Group Non-Consensus and Personal Radicalness

    ERIC Educational Resources Information Center

    Wang, Yanqing; Liang, Yaowen; Liu, Luning; Liu, Ying

    2016-01-01

    Multi-peer assessment has often been used by teachers to reduce personal bias and make the assessment more reliable. This study reviews the design and development of multi-peer assessment systems that detect and solve two common issues in such systems: non-consensus among group members and personal radicalness in some assessments. A multi-peer…

  11. Comparing Facebook Users and Facebook Non-Users: Relationship between Personality Traits and Mental Health Variables - An Exploratory Study.

    PubMed

    Brailovskaia, Julia; Margraf, Jürgen

    2016-01-01

    Over one billion people use Facebook as a platform for social interaction and self-presentation, making it one of the most popular online sites. The aim of the present study was to investigate differences in various personality traits and mental health variables between Facebook users and people who do not use this platform. The data of 945 participants (790 Facebook users, 155 Facebook non-users) were collected. Results indicate that Facebook users score significantly higher on narcissism, self-esteem and extraversion than Facebook non-users. Furthermore, they have significantly higher values of social support, life satisfaction and subjective happiness. Facebook non-users have (marginally) significantly higher values of depression symptoms than Facebook users. In both groups, extraversion, self-esteem, happiness, life satisfaction, resilience and social support, on the one hand, and depression, anxiety and stress symptoms, on the other hand, are negatively correlated. Neuroticism is positively associated with depression, anxiety and stress symptoms. However, significant differences exist between Facebook users and Facebook non-users regarding some associations of personality traits and mental health variables. Compared to Facebook non-users, the present results indicate that Facebook users have higher values of certain personality traits and positive variables protecting mental health. These findings are of particular interest considering the high importance of online social platforms in the daily life of many people.

  12. A personalized food allergen testing platform on a cellphone

    PubMed Central

    Coskun, Ahmet F.; Wong, Justin; Khodadadi, Delaram; Nagi, Richie; Tey, Andrew; Ozcan, Aydogan

    2013-01-01

    We demonstrate a personalized food allergen testing platform, termed iTube, running on a cellphone that images and automatically analyses colorimetric assays performed in test tubes toward sensitive and specific detection of allergens in food samples. This cost-effective and compact iTube attachment, weighing approximately 40 grams, is mechanically installed on the existing camera unit of a cellphone where the test and control tubes are inserted from the side and are vertically illuminated by two separate light-emitting-diodes. The illumination light is absorbed by the allergen assay that is activated within the tubes, causing an intensity change in the acquired images by the cellphone camera. These transmission images of the sample and control tubes are digitally processed within1 sec using a smart application running on the same cellphone for detection and quantification of allergen contamination in food products. We evaluated the performance of this cellphone based iTube platform using different types of commercially available cookies, where the existence of peanuts was accurately quantified after a sample preparation and incubation time of ~20 min per test. This automated and cost-effective personalized food allergen testing tool running on cellphones can also permit uploading of test results to secure servers to create personal and/or public spatio-temporal allergen maps, which can be useful for public health in various settings. PMID:23254910
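The image-analysis step, comparing transmitted light through the test tube against the control tube, can be sketched as below. The intensity values, threshold, and function names are all invented for illustration; the actual iTube application uses its own calibration and quantification procedure:

```python
# Transmission-intensity comparison sketch for a colorimetric assay.
import numpy as np

def relative_absorbance(test_img, control_img):
    # Beer-Lambert-style measure: log ratio of mean transmitted light.
    return float(np.log10(control_img.mean() / test_img.mean()))

def classify(abs_value, threshold=0.1):
    # Threshold is a made-up calibration constant.
    return "allergen detected" if abs_value > threshold else "clear"

rng = np.random.default_rng(2)
control = rng.uniform(200, 210, size=(64, 64))  # bright: little absorption
test = rng.uniform(120, 130, size=(64, 64))     # darker: assay activated
a = relative_absorbance(test, control)
print(classify(a))
```

The control tube normalizes away illumination and camera variation, which is why both tubes are imaged under the same LEDs in the attachment.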

  13. Comparing Facebook Users and Facebook Non-Users: Relationship between Personality Traits and Mental Health Variables – An Exploratory Study

    PubMed Central

    2016-01-01

    Over one billion people use Facebook as a platform for social interaction and self-presentation, making it one of the most popular online sites. The aim of the present study was to investigate differences in various personality traits and mental health variables between Facebook users and people who do not use this platform. The data of 945 participants (790 Facebook users, 155 Facebook non-users) were collected. Results indicate that Facebook users score significantly higher on narcissism, self-esteem and extraversion than Facebook non-users. Furthermore, they have significantly higher values of social support, life satisfaction and subjective happiness. Facebook non-users have (marginally) significantly higher values of depression symptoms than Facebook users. In both groups, extraversion, self-esteem, happiness, life satisfaction, resilience and social support, on the one hand, and depression, anxiety and stress symptoms, on the other hand, are negatively correlated. Neuroticism is positively associated with depression, anxiety and stress symptoms. However, significant differences exist between Facebook users and Facebook non-users regarding some associations of personality traits and mental health variables. Compared to Facebook non-users, the present results indicate that Facebook users have higher values of certain personality traits and positive variables protecting mental health. These findings are of particular interest considering the high importance of online social platforms in the daily life of many people. PMID:27907020

  14. The SCEC Broadband Platform: A Collaborative Open-Source Software Package for Strong Ground Motion Simulation and Validation

    NASA Astrophysics Data System (ADS)

    Silva, F.; Maechling, P. J.; Goulet, C. A.; Somerville, P.; Jordan, T. H.

    2014-12-01

    The Southern California Earthquake Center (SCEC) Broadband Platform is a collaborative software development project involving geoscientists, earthquake engineers, graduate students, and the SCEC Community Modeling Environment. The SCEC Broadband Platform (BBP) is open-source scientific software that can generate broadband (0–100 Hz) ground motions for earthquakes, integrating complex scientific modules that implement rupture generation, low- and high-frequency seismogram synthesis, non-linear site effects calculation, and visualization into a software system that supports easy on-demand computation of seismograms. The Broadband Platform operates in two primary modes: validation simulations and scenario simulations. In validation mode, the Platform runs earthquake rupture and wave propagation modeling software to calculate seismograms for a well-observed historical earthquake. The BBP then calculates a number of goodness-of-fit measurements that quantify how well the model-based broadband seismograms match the observed seismograms for that event. Based on these results, the Platform can be used to tune and validate different numerical modeling techniques. In scenario mode, the Broadband Platform can run simulations for hypothetical (scenario) earthquakes. In this mode, users input an earthquake description, a list of station names and locations, and a 1D velocity model for their region of interest, and the Broadband Platform software then calculates ground motions for the specified stations. Working in close collaboration with scientists and research engineers, the SCEC software development group continues to add new capabilities to the Broadband Platform and to release new versions as open-source scientific software distributions that can be compiled and run on many Linux computer systems.
Our latest release includes 5 simulation methods, 7 simulation regions covering California, Japan, and Eastern North America, the ability to compare simulation results against GMPEs, and several new data products, such as map and distance-based goodness of fit plots. As the number and complexity of scenarios simulated using the Broadband Platform increases, we have added batching utilities to substantially improve support for running large-scale simulations on computing clusters.
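A goodness-of-fit measurement of the kind computed in validation mode can be illustrated with a simple per-period bias: the mean natural-log residual between observed and simulated response-spectral values. The numbers below are synthetic, and the BBP itself computes several richer GOF metrics; this is only a sketch of the idea:

```python
# Simple response-spectrum bias as a goodness-of-fit sketch.
import numpy as np

periods = np.array([0.1, 0.2, 0.5, 1.0, 2.0, 5.0])    # seconds
observed = np.array([0.8, 0.9, 0.6, 0.4, 0.2, 0.05])  # Sa (g), synthetic
simulated = np.array([0.7, 1.0, 0.5, 0.5, 0.2, 0.06])

bias = np.log(observed / simulated)  # per-period model bias
print(np.round(bias.mean(), 3))      # near 0 => good overall fit
```

A bias near zero across periods indicates the simulation neither systematically over- nor under-predicts ground motion, which is the property the validation mode uses to tune modeling techniques.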

  15. Global Software Development with Cloud Platforms

    NASA Astrophysics Data System (ADS)

    Yara, Pavan; Ramachandran, Ramaseshan; Balasubramanian, Gayathri; Muthuswamy, Karthik; Chandrasekar, Divya

    Offshore and outsourced distributed software development models and processes are facing previously unknown challenges with respect to computing capacity, bandwidth, storage, security, complexity, reliability, and business uncertainty. Clouds promise to address these challenges by adopting recent advances in virtualization, parallel and distributed systems, utility computing, and software services. In this paper, we envision a cloud-based platform that addresses some of these core problems. We outline a generic cloud architecture, its design, and our first implementation results for three cloud forms (a compute cloud, a storage cloud, and a cloud-based software service) in the context of global distributed software development (GSD). Our "compute cloud" provides computational services such as continuous code integration and a compile server farm, our "storage cloud" offers block- or file-based storage services with an on-line virtual storage service, and the on-line virtual labs represent a useful cloud-based software service. We note some of the use cases for clouds in GSD and the lessons learned with our prototypes, and identify challenges that must be conquered before realizing the full business benefits. We believe that in the future, software practitioners will focus more on these cloud computing platforms and see clouds as a means of supporting an ecosystem of clients, developers, and other key stakeholders.

  16. Design and performance of the virtualization platform for offline computing on the ATLAS TDAQ Farm

    NASA Astrophysics Data System (ADS)

    Ballestrero, S.; Batraneanu, S. M.; Brasolin, F.; Contescu, C.; Di Girolamo, A.; Lee, C. J.; Pozo Astigarraga, M. E.; Scannicchio, D. A.; Twomey, M. S.; Zaytsev, A.

    2014-06-01

    With the LHC collider at CERN currently in the period of Long Shutdown 1, there is an opportunity to use the computing resources of the experiments' large trigger farms for other data processing activities. In the case of the ATLAS experiment, the TDAQ farm, consisting of more than 1500 compute nodes, is suitable for running Monte Carlo (MC) production jobs that are mostly CPU- rather than I/O-bound. This contribution gives a thorough review of the design and deployment of a virtualized platform running on this computing resource and of its use to run large groups of CernVM-based virtual machines operating as a single CERN-P1 WLCG site. The platform has been designed to guarantee the security and usability of the ATLAS private network and to minimize interference with TDAQ's usage of the farm. OpenStack has been chosen to provide a cloud management layer. The experience gained in the last 3.5 months shows that the use of the TDAQ farm for MC simulation contributes to ATLAS data processing at the level of a large Tier-1 WLCG site, despite the opportunistic nature of the underlying computing resources.

  17. The Trip Itinerary Optimization Platform: A Framework for Personalized Travel Information

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kwasnik, Ted; Carmichael, Scott P.; Arent, Douglas J

    The New Concepts Incubator team at the National Renewable Energy Laboratory (NREL) developed a three-stage online platform for travel diary collection, personal travel plan optimization, and travel itinerary visualization. In the first stage, users provide a travel diary for the previous day through an interactive map and calendar interface and complete a survey on travel attitudes and behaviors. One or more days later, users are invited via email to engage in a second stage, where they view a personal mobility dashboard displaying recommended travel itineraries generated from a novel framework that optimizes travel outcomes over a sequence of interrelated trips. A week or more after viewing these recommended travel itineraries on the dashboard, users are emailed again to engage in a third stage, where they complete a final survey about travel attitudes and behaviors. A usability study of the platform conducted online showed that, in general, users found the system valuable for informing their travel decisions. A total of 274 individuals were recruited through Amazon Mechanical Turk, an online survey platform, to participate in a transportation study using this platform. On average, the platform distilled 65 feasible travel plans per individual into two recommended itineraries, each optimal according to one or more outcomes and dependent on the fixed times and locations from the travel diary. For 45 percent of users, the trip recommendation algorithm returned only a single, typically automobile-centric, itinerary because there were no other viable alternative transportation modes available. Platform users generally agreed that the dashboard was enjoyable and easy to use, and that it would be a helpful tool in adopting new travel behaviors. Users agreed most that the time, cost, and user-preference recommendations 'made sense' to them, and were most willing to implement these itineraries.
Platform users typically expressed low willingness to try the carbon- and calorie-optimized itineraries. Of the platform users who viewed the dashboard, 13 percent reported changing their travel behavior, most adopting the time-, calorie- or carbon-optimized itineraries. While the algorithm incorporates a wealth of travel data obtained from online APIs pertaining to a traveler's route, such as historic traffic condition data, public transit timetables, and bike path routes, open-ended responses from users expressed an interest in the integration of even more fine-grained traffic data and the ability to dynamically model the effect of changes in travel times. Users also commonly expressed concerns over the safety of walking and biking recommendations. Responses indicate that more information about the amenities available to cyclists and pedestrians (sidewalks, shade from trees, access to food) and routes that avoid areas of perceived elevated danger would reduce barriers to implementing these recommendations. More accurate representations of personal vehicle trips (based on vehicle make and model, and implications of parking) and the identification of routes that optimize caloric intensity (seeking out elevation changes or longer walks to public transit) are promising avenues for future research.
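The per-objective recommendation idea, distilling many feasible plans into one itinerary per optimization target (time, cost, carbon, calories), can be sketched as below. The plan data and field names are invented for illustration and are not the platform's actual schema:

```python
# Pick one recommended itinerary per optimization target (toy data).
plans = [
    {"mode": "drive",   "time": 25, "cost": 6.0, "carbon": 4.1, "calories": 5},
    {"mode": "transit", "time": 40, "cost": 2.5, "carbon": 1.2, "calories": 90},
    {"mode": "bike",    "time": 55, "cost": 0.0, "carbon": 0.0, "calories": 420},
]

# Minimize time, cost, and carbon; maximize calories burned.
recommendations = {
    target: min(plans, key=lambda p: p[target])["mode"]
    for target in ("time", "cost", "carbon")
}
recommendations["calories"] = max(plans, key=lambda p: p["calories"])["mode"]
print(recommendations)
```

When several targets select the same plan (as with cost, carbon, and calories here), the dashboard would collapse them into fewer distinct itineraries, which mirrors the study's finding that many users saw only one or two recommendations.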

  18. Implementing an Affordable High-Performance Computing for Teaching-Oriented Computer Science Curriculum

    ERIC Educational Resources Information Center

    Abuzaghleh, Omar; Goldschmidt, Kathleen; Elleithy, Yasser; Lee, Jeongkyu

    2013-01-01

    With the advances in computing power, high-performance computing (HPC) platforms have had an impact on not only scientific research in advanced organizations but also computer science curriculum in the educational community. For example, multicore programming and parallel systems are highly desired courses in the computer science major. However,…

  19. Cloud Computing in Support of Applied Learning: A Baseline Study of Infrastructure Design at Southern Polytechnic State University

    ERIC Educational Resources Information Center

    Conn, Samuel S.; Reichgelt, Han

    2013-01-01

    Cloud computing represents an architecture and paradigm of computing designed to deliver infrastructure, platforms, and software as constructible computing resources on demand to networked users. As campuses are challenged to better accommodate academic needs for applications and computing environments, cloud computing can provide an accommodating…

  20. FORECAST - A cloud-based personalized intelligent virtual coaching platform for the well-being of cancer patients.

    PubMed

    Kyriazakos, Sofoklis; Valentini, Vincenzo; Cesario, Alfredo; Zachariae, Robert

    2018-01-01

    Well-being of cancer patients and survivors is a challenge worldwide, considering the often chronic nature of the disease. Today, a large number of initiatives, products and services are available that aim to provide strategies to face the challenge of well-being in cancer patients; nevertheless, the proposed solutions are often non-sustainable, costly, unavailable to those in need, and not well received by patients. These challenges were considered in designing FORECAST, a cloud-based personalized intelligent virtual coaching platform for improving the well-being of cancer patients. Personalized coaching for cancer patients focuses on physical, mental, and emotional concerns, which FORECAST is able to identify. Cancer patients can benefit from coaching that addresses their emotional problems, helps them focus on their goals, and supports them in coping with their disease-related stressors. Personalized coaching in FORECAST offers support, encouragement, motivation, confidence, and hope, and is a valuable tool for the well-being of a patient.

  1. Virtual Versus In-Person Focus Groups: Comparison of Costs, Recruitment, and Participant Logistics

    PubMed Central

    Poehlman, Jon A; Hayes, Jennifer J; Ray, Sarah E; Moultrie, Rebecca R

    2017-01-01

    Background Virtual focus groups—such as online chat and video groups—are increasingly promoted as qualitative research tools. Theoretically, virtual groups offer several advantages, including lower cost, faster recruitment, greater geographic diversity, enrollment of hard-to-reach populations, and reduced participant burden. However, no study has compared virtual and in-person focus groups on these metrics. Objective To rigorously compare virtual and in-person focus groups on cost, recruitment, and participant logistics. We examined 3 focus group modes and instituted experimental controls to ensure a fair comparison. Methods We conducted 6 1-hour focus groups in August 2014 using in-person (n=2), live chat (n=2), and video (n=2) modes with individuals who had type 2 diabetes (n=48 enrolled, n=39 completed). In planning groups, we solicited bids from 6 virtual platform vendors and 4 recruitment firms. We then selected 1 platform or facility per mode and a single recruitment firm across all modes. To minimize bias, the recruitment firm employed different recruiters by mode who were blinded to recruitment efforts for other modes. We tracked enrollment during a 2-week period. A single moderator conducted all groups using the same guide, which addressed the use of technology to communicate with health care providers. We conducted the groups at the same times of day on Monday to Wednesday during a single week. At the end of each group, participants completed a short survey. Results Virtual focus groups offered minimal cost savings compared with in-person groups (US $2000 per chat group vs US $2576 per in-person group vs US $2,750 per video group). Although virtual groups did not incur travel costs, they often had higher management fees and miscellaneous expenses (eg, participant webcams). Recruitment timing did not differ by mode, but show rates were higher for in-person groups (94% [15/16] in-person vs 81% [13/16] video vs 69% [11/16] chat). 
Virtual group participants were more geographically diverse (but with significant clustering around major metropolitan areas) and more likely to be non-white, less educated, and less healthy. Internet usage was higher among virtual group participants, yet virtual groups still reached light Internet users. In terms of burden, chat groups were easiest to join and required the least preparation (chat = 13 minutes, video = 40 minutes, in-person = 78 minutes). Virtual group participants joined using laptop or desktop computers, and most virtual participants (82% [9/11] chat vs 62% [8/13] video) reported having no other people in their immediate vicinity. Conclusions Virtual focus groups offer potential advantages for participant diversity and reaching less healthy populations. However, virtual groups do not appear to cost less or recruit participants faster than in-person groups. Further research on virtual group data quality and group dynamics is needed to fully understand their advantages and limitations. PMID:28330832

  2. Integration of multisensor hybrid reasoners to support personal autonomy in the smart home.

    PubMed

    Valero, Miguel Ángel; Bravo, José; Chamizo, Juan Manuel García; López-de-Ipiña, Diego

    2014-09-17

    The deployment of the Ambient Intelligence (AmI) paradigm requires designing and integrating user-centered smart environments to assist people in their daily life activities. This research paper details an integration and validation of multiple heterogeneous sensors with hybrid reasoners that support decision making in order to monitor personal and environmental data at a smart home in a private way. The results innovate on knowledge-based platforms, distributed sensors, connected objects, accessibility and authentication methods to promote independent living for elderly people. TALISMAN+, the AmI framework deployed, integrates four subsystems in the smart home: (i) a mobile biomedical telemonitoring platform to provide elderly patients with continuous disease management; (ii) an integration middleware that allows context capture from heterogeneous sensors to program environment's reaction; (iii) a vision system for intelligent monitoring of daily activities in the home; and (iv) an ontologies-based integrated reasoning platform to trigger local actions and manage private information in the smart home. The framework was integrated in two real running environments, the UPM Accessible Digital Home and MetalTIC house, and successfully validated by five experts in home care, elderly people and personal autonomy.

  3. Integration of Multisensor Hybrid Reasoners to Support Personal Autonomy in the Smart Home

    PubMed Central

    Valero, Miguel Ángel; Bravo, José; Chamizo, Juan Manuel García; López-de-Ipiña, Diego

    2014-01-01

    Duplicate record of entry 2 above. PMID:25232910

  4. To What Degree Are Undergraduate Students Using Their Personal Computers to Support Their Daily Study Practices?

    ERIC Educational Resources Information Center

    Sim, KwongNui; Butson, Russell

    2014-01-01

    This scoping study examines the degree to which twenty two undergraduate students used their personal computers to support their academic study. The students were selected based on their responses to a questionnaire aimed at gauging their degree of computer skill. Computer activity data was harvested from the personal computers of eighteen…

  5. Initial experience with a handheld device digital imaging and communications in medicine viewer: OsiriX mobile on the iPhone.

    PubMed

    Choudhri, Asim F; Radvany, Martin G

    2011-04-01

    Medical imaging is commonly used to diagnose many emergent conditions, as well as to plan treatment. Digital images can be reviewed on almost any computing platform. Modern mobile phones and handheld devices are portable computing platforms with robust software programming interfaces, powerful processors, and high-resolution displays. OsiriX mobile, a new Digital Imaging and Communications in Medicine (DICOM) viewing program, is available for the iPhone/iPod touch platform. This raises the possibility of mobile review of diagnostic medical images to expedite diagnosis and treatment planning using a commercial off-the-shelf solution, facilitating communication among radiologists and referring clinicians.

  6. Atomdroid: a computational chemistry tool for mobile platforms.

    PubMed

    Feldt, Jonas; Mata, Ricardo A; Dieterich, Johannes M

    2012-04-23

    We present the implementation of a new molecular mechanics program designed for use in mobile platforms, the first specifically built for these devices. The software is designed to run on Android operating systems and is compatible with several modern tablet-PCs and smartphones available in the market. It includes molecular viewer/builder capabilities with integrated routines for geometry optimizations and Monte Carlo simulations. These functionalities allow it to work as a stand-alone tool. We discuss some particular development aspects, as well as the overall feasibility of using computational chemistry software packages in mobile platforms. Benchmark calculations show that through efficient implementation techniques even hand-held devices can be used to simulate midsized systems using force fields.
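The geometry-optimization and Monte Carlo routines described above all operate on a molecular mechanics energy function. As an illustration of the kind of computation involved (not Atomdroid's actual force field or code), here is a minimal Metropolis Monte Carlo step over a pairwise Lennard-Jones energy:

```python
import math
import random

def lennard_jones(coords, eps=0.2, sigma=3.4):
    """Total pairwise Lennard-Jones energy; eps/sigma are illustrative values."""
    e = 0.0
    n = len(coords)
    for i in range(n):
        for j in range(i + 1, n):
            r = math.dist(coords[i], coords[j])
            e += 4 * eps * ((sigma / r) ** 12 - (sigma / r) ** 6)
    return e

def metropolis_step(coords, energy, beta=1.0, step=0.3):
    """Displace one random atom; accept or reject by the Metropolis criterion."""
    i = random.randrange(len(coords))
    old = coords[i]
    coords[i] = tuple(c + random.uniform(-step, step) for c in old)
    new_energy = lennard_jones(coords)
    if new_energy <= energy or random.random() < math.exp(-beta * (new_energy - energy)):
        return new_energy  # move accepted
    coords[i] = old        # move rejected: restore old position
    return energy

random.seed(7)
cluster = [(0.0, 0.0, 0.0), (3.8, 0.0, 0.0), (0.0, 3.8, 0.0), (3.8, 3.8, 0.0)]
energy = lennard_jones(cluster)
for _ in range(500):
    energy = metropolis_step(cluster, energy)
print(energy)  # sampled energy after 500 Metropolis steps
```

The O(n²) pairwise energy evaluation per step is why the paper's benchmark finding, that efficient implementation lets hand-held devices simulate mid-sized systems, is notable.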

  7. A Wearable Mobile Sensor Platform to Assist Fruit Grading

    PubMed Central

    Aroca, Rafael V.; Gomes, Rafael B.; Dantas, Rummennigue R.; Calbo, Adonai G.; Gonçalves, Luiz M. G.

    2013-01-01

    Wearable computing is a form of ubiquitous computing that offers flexible and useful tools for users. Specifically, glove-based systems have been used in the last 30 years in a variety of applications, but mostly focusing on sensing people's attributes, such as finger bending and heart rate. In contrast, we propose in this work a novel flexible and reconfigurable instrumentation platform in the form of a glove, which can be used to analyze and measure attributes of fruits by just pointing or touching them with the proposed glove. An architecture for such a platform is designed and its application for intuitive fruit grading is also presented, including experimental results for several fruits. PMID:23666134

  8. Design of Control Plane Architecture Based on Cloud Platform and Experimental Network Demonstration for Multi-domain SDON

    NASA Astrophysics Data System (ADS)

    Li, Ming; Yin, Hongxi; Xing, Fangyuan; Wang, Jingchao; Wang, Honghuan

    2016-02-01

    With the features of network virtualization and resource programmability, Software Defined Optical Network (SDON) is considered the future development trend of optical networks, providing more flexible, efficient and open network functions and supporting intraconnection and interconnection of data centers. Meanwhile, cloud platforms can provide powerful computing, storage and management capabilities. In this paper, with the coordination of SDON and the cloud platform, a multi-domain SDON architecture based on a cloud control plane is proposed, which is composed of data centers with database (DB), path computation element (PCE), SDON controller and orchestrator. In addition, the structure of the multi-domain SDON orchestrator and an OpenFlow-enabled optical node are proposed to realize a management and control platform that combines centralized and distributed approaches. Finally, functional verification and demonstration are performed through our optical experiment network.

  9. CMOS Image Sensor with a Built-in Lane Detector.

    PubMed

    Hsiao, Pei-Yung; Cheng, Hsien-Chein; Huang, Shih-Shinh; Fu, Li-Chen

    2009-01-01

    This work develops a new current-mode mixed-signal Complementary Metal-Oxide-Semiconductor (CMOS) imager, which can capture images and simultaneously produce vehicle lane maps. The adopted lane detection algorithm, which was modified to be compatible with hardware requirements, can achieve a high recognition rate of up to approximately 96% under various weather conditions. Instead of a Personal Computer (PC) based system or an embedded platform equipped with an expensive high-performance Reduced Instruction Set Computer (RISC) or Digital Signal Processor (DSP) chip, the proposed imager, without extra Analog to Digital Converter (ADC) circuits to transform signals, is a compact, lower-cost key-component chip. It is also an innovative component device that can be integrated into intelligent automotive lane departure systems. The chip size is 2,191.4 × 2,389.8 μm, and the package is a 40-pin Dual In-line Package (DIP). The pixel cell size is 18.45 × 21.8 μm and the core size of the photodiode is 12.45 × 9.6 μm; the resulting fill factor is 29.7%.

  10. An independent software system for the analysis of dynamic MR images.

    PubMed

    Torheim, G; Lombardi, M; Rinck, P A

    1997-01-01

    A computer system for the manual, semi-automatic, and automatic analysis of dynamic MR images was to be developed on UNIX and personal computer platforms. The system was to offer an integrated and standardized way of performing both image processing and analysis that was independent of the MR unit used. The system consists of modules that are easily adaptable to special needs. Data from MR units or other diagnostic imaging equipment in techniques such as CT, ultrasonography, or nuclear medicine can be processed through the ACR-NEMA/DICOM standard file formats. A full set of functions is available, among them cine-loop visual analysis, and generation of time-intensity curves. Parameters such as cross-correlation coefficients, area under the curve, peak/maximum intensity, wash-in and wash-out slopes, time to peak, and relative signal intensity/contrast enhancement can be calculated. Other parameters can be extracted by fitting functions like the gamma-variate function. Region-of-interest data and parametric values can easily be exported. The system has been successfully tested in animal and patient examinations.
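Several of the parameters listed above (peak intensity, time to peak, area under the curve, wash-in slope, relative enhancement) can be computed directly from a sampled time-intensity curve. This is a minimal, system-independent sketch with made-up sample values; gamma-variate fitting is omitted:

```python
def curve_parameters(times, intensities):
    """Basic time-intensity curve metrics: peak, time to peak, trapezoidal AUC,
    wash-in slope (baseline to peak), and relative enhancement over baseline."""
    baseline = intensities[0]
    peak = max(intensities)
    t_peak = times[intensities.index(peak)]
    # Trapezoidal rule over consecutive samples
    auc = sum((t2 - t1) * (i1 + i2) / 2
              for t1, t2, i1, i2 in zip(times, times[1:], intensities, intensities[1:]))
    wash_in = (peak - baseline) / (t_peak - times[0]) if t_peak > times[0] else 0.0
    return {"peak": peak, "time_to_peak": t_peak, "auc": auc,
            "wash_in_slope": wash_in,
            "relative_enhancement": (peak - baseline) / baseline}

# Hypothetical region-of-interest signal sampled every 10 s after contrast arrival:
params = curve_parameters([0, 10, 20, 30, 40], [100, 180, 260, 220, 200])
print(params)  # peak 260 at t=20, wash-in slope 8.0 intensity units per second
```

Wash-out slope and curve fitting would follow the same pattern, operating on the samples after the peak.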

  11. An efficient and scalable deformable model for virtual reality-based medical applications.

    PubMed

    Choi, Kup-Sze; Sun, Hanqiu; Heng, Pheng-Ann

    2004-09-01

    Modeling of tissue deformation is of great importance to virtual reality (VR)-based medical simulations. Considerable effort has been dedicated to the development of interactively deformable virtual tissues. In this paper, an efficient and scalable deformable model is presented for virtual-reality-based medical applications. It considers deformation as a localized force transmittal process which is governed by algorithms based on breadth-first search (BFS). The computational speed is scalable to facilitate real-time interaction by adjusting the penetration depth. Simulated annealing (SA) algorithms are developed to optimize the model parameters by using the reference data generated with the linear static finite element method (FEM). The mechanical behavior and timing performance of the model have been evaluated. The model has been applied to simulate the typical behavior of living tissues and anisotropic materials. Integration with a haptic device has also been achieved on a generic personal computer (PC) platform. The proposed technique provides a feasible solution for VR-based medical simulations and has the potential for multi-user collaborative work in virtual environment.
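The idea of treating deformation as a localized force transmittal governed by breadth-first search, with cost scaled by an adjustable penetration depth, can be sketched as follows. The per-hop attenuation model and mesh are hypothetical; the paper's actual update rules differ in detail:

```python
from collections import deque

def propagate(adjacency, contact, depth_limit, attenuation=0.5):
    """BFS force transmittal from a contact node, truncated at depth_limit.
    Each hop away from the contact attenuates the transmitted force, so the
    work done scales with the chosen penetration depth, not the mesh size."""
    force = {contact: 1.0}
    queue = deque([(contact, 0)])
    while queue:
        node, depth = queue.popleft()
        if depth == depth_limit:
            continue  # penetration depth reached: stop expanding here
        for nbr in adjacency[node]:
            if nbr not in force:
                force[nbr] = force[node] * attenuation
                queue.append((nbr, depth + 1))
    return force

# Toy mesh: a chain of five vertices, contact at vertex 0, depth limit 2
mesh = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(propagate(mesh, 0, 2))  # → {0: 1.0, 1: 0.5, 2: 0.25}
```

Shrinking `depth_limit` trades deformation extent for speed, which is the scalability knob the abstract describes.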

  12. Two complementary personal medication management applications developed on a common platform: case report.

    PubMed

    Ross, Stephen E; Johnson, Kevin B; Siek, Katie A; Gordon, Jeffry S; Khan, Danish U; Haverhals, Leah M

    2011-07-12

    Adverse drug events are a major safety issue in ambulatory care. Improving medication self-management could reduce these adverse events. Researchers have developed medication applications for tethered personal health records (PHRs), but little has been reported about medication applications for interoperable PHRs. Our objective was to develop two complementary personal health applications on a common PHR platform: one to assist children with complex health needs (MyMediHealth), and one to assist older adults in care transitions (Colorado Care Tablet). The applications were developed using a user-centered design approach. The two applications shared a common PHR platform based on a service-oriented architecture. MyMediHealth employed Web and mobile phone user interfaces. Colorado Care Tablet employed a Web interface customized for a tablet PC. We created complementary medication management applications tailored to the needs of distinctly different user groups using common components. Challenges were addressed in multiple areas, including how to encode medication identities, how to incorporate knowledge bases for medication images and consumer health information, how to include supplementary dosing information, how to simplify user interfaces for older adults, and how to support mobile devices for children. These prototypes demonstrate the utility of abstracting PHR data and services (the PHR platform) from applications that can be tailored to meet the needs of diverse patients. Based on the challenges we faced, we provide recommendations on the structure of publicly available knowledge resources and the use of mobile messaging systems for PHR applications.

  13. Investigation of Item-Pair Presentation and Construct Validity of the Navy Computer Adaptive Personality Scales (NCAPS)

    DTIC Science & Technology

    2006-10-01

    This technical note documents one of the steps in the development of the Navy Computer Adaptive Personality Scales (NCAPS), a computer adaptive personality measure. (Christina M. Underhill, Ph.D.; NPRST-TN-06-9, October 2006. Approved for public release; distribution is unlimited.)

  14. Cooperative high-performance storage in the accelerated strategic computing initiative

    NASA Technical Reports Server (NTRS)

    Gary, Mark; Howard, Barry; Louis, Steve; Minuzzo, Kim; Seager, Mark

    1996-01-01

    The use and acceptance of new high-performance, parallel computing platforms will be impeded by the absence of an infrastructure capable of supporting orders-of-magnitude improvement in hierarchical storage and high-speed I/O (Input/Output). The distribution of these high-performance platforms and supporting infrastructures across a wide-area network further compounds this problem. We describe an architectural design and phased implementation plan for a distributed, Cooperative Storage Environment (CSE) to achieve the necessary performance, user transparency, site autonomy, communication, and security features needed to support the Accelerated Strategic Computing Initiative (ASCI). ASCI is a Department of Energy (DOE) program attempting to apply terascale platforms and Problem-Solving Environments (PSEs) toward real-world computational modeling and simulation problems. The ASCI mission must be carried out through a unified, multilaboratory effort, and will require highly secure, efficient access to vast amounts of data. The CSE provides a logically simple, geographically distributed storage infrastructure of semi-autonomous cooperating sites to meet the strategic ASCI PSE goal of high-performance data storage and access at the user desktop.

  15. Auto-Generated Semantic Processing Services

    NASA Technical Reports Server (NTRS)

    Davis, Rodney; Hupf, Greg

    2009-01-01

    Auto-Generated Semantic Processing (AGSP) Services is a suite of software tools for automated generation of other computer programs, denoted cross-platform semantic adapters, that support interoperability of computer-based communication systems that utilize a variety of both new and legacy communication software running in a variety of operating- system/computer-hardware combinations. AGSP has numerous potential uses in military, space-exploration, and other government applications as well as in commercial telecommunications. The cross-platform semantic adapters take advantage of common features of computer- based communication systems to enforce semantics, messaging protocols, and standards of processing of streams of binary data to ensure integrity of data and consistency of meaning among interoperating systems. The auto-generation aspect of AGSP Services reduces development time and effort by emphasizing specification and minimizing implementation: In effect, the design, building, and debugging of software for effecting conversions among complex communication protocols, custom device mappings, and unique data-manipulation algorithms is replaced with metadata specifications that map to an abstract platform-independent communications model. AGSP Services is modular and has been shown to be easily integrable into new and legacy NASA flight and ground communication systems.

  16. 47 CFR 15.102 - CPU boards and power supplies used in personal computers.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... computers. 15.102 Section 15.102 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL RADIO FREQUENCY DEVICES Unintentional Radiators § 15.102 CPU boards and power supplies used in personal computers. (a... modifications that must be made to a personal computer, peripheral device, CPU board or power supply during...

  17. 47 CFR 15.102 - CPU boards and power supplies used in personal computers.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... computers. 15.102 Section 15.102 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL RADIO FREQUENCY DEVICES Unintentional Radiators § 15.102 CPU boards and power supplies used in personal computers. (a... modifications that must be made to a personal computer, peripheral device, CPU board or power supply during...

  18. 47 CFR 15.102 - CPU boards and power supplies used in personal computers.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... computers. 15.102 Section 15.102 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL RADIO FREQUENCY DEVICES Unintentional Radiators § 15.102 CPU boards and power supplies used in personal computers. (a... modifications that must be made to a personal computer, peripheral device, CPU board or power supply during...

  19. 47 CFR 15.102 - CPU boards and power supplies used in personal computers.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... computers. 15.102 Section 15.102 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL RADIO FREQUENCY DEVICES Unintentional Radiators § 15.102 CPU boards and power supplies used in personal computers. (a... modifications that must be made to a personal computer, peripheral device, CPU board or power supply during...

  20. 47 CFR 15.102 - CPU boards and power supplies used in personal computers.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... computers. 15.102 Section 15.102 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL RADIO FREQUENCY DEVICES Unintentional Radiators § 15.102 CPU boards and power supplies used in personal computers. (a... modifications that must be made to a personal computer, peripheral device, CPU board or power supply during...

  1. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME

    PubMed Central

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2017-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments. PMID:28190948
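The Monte Carlo cost that motivates MOLNs' distributed design is visible even in the simplest stochastic simulation: each reaction event costs one random draw, and whole ensembles of trajectories are needed for statistics. This sketch shows a well-mixed Gillespie SSA for irreversible decay; spatial reaction-diffusion models of the kind PyURDME targets add per-subvolume diffusion events, which are omitted here:

```python
import random

def ssa_decay(n0, k, t_end):
    """Gillespie SSA for A -> 0 with rate constant k: exact sample paths,
    one exponentially distributed waiting time per reaction event."""
    t, n = 0.0, n0
    while n > 0:
        t += random.expovariate(k * n)  # time to the next reaction firing
        if t > t_end:
            break
        n -= 1
    return n

# Ensemble statistics require many independent trajectories, hence the
# appeal of distributing them across cloud or HPC workers.
random.seed(0)
runs = [ssa_decay(100, 0.1, 5.0) for _ in range(2000)]
mean = sum(runs) / len(runs)
print(round(mean, 1))  # ensemble mean ≈ 100 * exp(-0.5) ≈ 60.7
```

Since each trajectory is independent, this workload parallelizes trivially, which is exactly the structure MOLNs exploits with its IPython-based distributed execution.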

  2. Research and the Personal Computer.

    ERIC Educational Resources Information Center

    Blackburn, D. A.

    1989-01-01

    Discussed is the history and elements of the personal computer. Its uses as a laboratory assistant and generic toolkit for mathematical analysis and modeling are included. The future of the personal computer in research is addressed. (KR)

  3. Breastfeeding and Social Media among First-Time African American Mothers

    PubMed Central

    Asiodu, Ifeyinwa V.; Waters, Catherine M.; Dailey, Dawn E.; Lee, Kathryn A.; Lyndon, Audrey

    2015-01-01

    Objective To describe the use of social media during the antepartum and postpartum periods among first-time African American mothers and their support persons. Design A qualitative critical ethnographic research design within the contexts of Family Life Course Development Theory and Black Feminist Theory. Setting Participants were recruited from community-based, public health, and home visiting programs. Participants A purposive sample was recruited, consisting of 14 pregnant African American women and eight support persons. Methods Pregnant and postpartum African American women and their support persons were interviewed separately during the antepartum and postpartum periods. Data were analyzed thematically. Results Participants frequently used social media for educational and social support and searched the internet for perinatal and parenting information. Most participants reported using at least one mobile application during their pregnancies and after giving birth. Social media were typically accessed through smartphones and/or computers using different websites and applications. While participants gleaned considerable information about infant development from these applications, they had difficulty finding and recalling information about infant feeding. Conclusion Social media are an important vehicle to disseminate infant feeding information; however, they are not currently being used to full potential. Our findings suggest that future interventions geared towards African American mothers and their support persons should include social media approaches. The way individuals gather, receive, and interpret information is dynamic. The increasing popularity and use of social media platforms offers the opportunity to create more innovative, targeted mobile health interventions for infant feeding and breastfeeding promotion. PMID:25712127

  4. Soft Computing Techniques for the Protein Folding Problem on High Performance Computing Architectures.

    PubMed

    Llanes, Antonio; Muñoz, Andrés; Bueno-Crespo, Andrés; García-Valverde, Teresa; Sánchez, Antonia; Arcas-Túnez, Francisco; Pérez-Sánchez, Horacio; Cecilia, José M

    2016-01-01

    The protein-folding problem has been extensively studied during the last fifty years. Understanding the dynamics of the global shape of a protein and its influence on biological function can help us discover new and more effective drugs for diseases of pharmacological relevance. Different computational approaches have been developed in order to predict the three-dimensional arrangement of the atoms of proteins from their sequences. However, the computational complexity of this problem makes the search for new models, novel algorithmic strategies and hardware platforms that provide solutions in a reasonable time frame mandatory. In this review we present past and current trends regarding protein folding simulations from both perspectives: hardware and software. Of particular interest to us are both the use of inexact solutions to this computationally hard problem and the hardware platforms that have been used for running this kind of Soft Computing technique.

  5. Applications integration in a hybrid cloud computing environment: modelling and platform

    NASA Astrophysics Data System (ADS)

    Li, Qing; Wang, Ze-yuan; Li, Wei-hua; Li, Jun; Wang, Cheng; Du, Rui-yang

    2013-08-01

    With the development of application service providers and cloud computing, more and more small- and medium-sized business enterprises use software services and even infrastructure services provided by professional information service companies to replace all or part of their information systems (ISs). These information service companies provide applications, such as data storage, computing processes, document sharing and even management information system services, as public resources to support the business process management of their customers. However, no cloud computing service vendor can satisfy the full functional IS requirements of an enterprise. As a result, enterprises often have to simultaneously use systems distributed in different clouds along with their intra-enterprise ISs. Thus, this article presents a framework to integrate applications deployed in public clouds with intra-enterprise ISs. A run-time platform is developed, and a cross-computing-environment process modelling technique is also developed, to improve the feasibility of ISs under hybrid cloud computing environments.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lingerfelt, Eric J; Endeve, Eirik; Hui, Yawei

    Improvements in scientific instrumentation allow imaging at mesoscopic to atomic length scales in many spectroscopic modes, and now, with the rise of multimodal acquisition systems and the associated processing capability, the era of multidimensional, informationally dense data sets has arrived. Technical issues in these combinatorial scientific fields are exacerbated by computational challenges best summarized as a necessity for drastic improvement in the capability to transfer, store, and analyze large volumes of data. The Bellerophon Environment for Analysis of Materials (BEAM) platform provides materials scientists the capability to directly leverage the integrated computational and analytical power of High Performance Computing (HPC) to perform scalable data analysis and simulation and to manage uploaded data files via an intuitive, cross-platform client user interface. This framework delivers authenticated, "push-button" execution of complex user workflows that deploy data analysis algorithms and computational simulations utilizing compute-and-data cloud infrastructures and HPC environments like Titan at the Oak Ridge Leadership Computing Facility (OLCF).

  7. Validation of tablet-based evaluation of color fundus images

    PubMed Central

    Christopher, Mark; Moga, Daniela C.; Russell, Stephen R.; Folk, James C.; Scheetz, Todd; Abràmoff, Michael D.

    2012-01-01

    Purpose To compare diabetic retinopathy (DR) referral recommendations made by viewing fundus images using a tablet computer to recommendations made using a standard desktop display. Methods A tablet computer (iPad) and a desktop PC with a high-definition color display were compared. For each platform, two retinal specialists independently rated 1200 color fundus images from patients at risk for DR using an annotation program, Truthseeker. The specialists determined whether each image had referable DR, and also how urgently each patient should be referred for medical examination. Graders viewed and rated the randomly presented images independently and were masked to their ratings on the alternative platform. Tablet- and desktop display-based referral ratings were compared using cross-platform, intra-observer kappa as the primary outcome measure. Additionally, inter-observer kappa, sensitivity, specificity, and area under ROC (AUC) were determined. Results A high level of cross-platform, intra-observer agreement was found for the DR referral ratings between the platforms (κ=0.778), and for the two graders, (κ=0.812). Inter-observer agreement was similar for the two platforms (κ=0.544 and κ=0.625 for tablet and desktop, respectively). The tablet-based ratings achieved a sensitivity of 0.848, a specificity of 0.987, and an AUC of 0.950 compared to desktop display-based ratings. Conclusions In this pilot study, tablet-based rating of color fundus images for subjects at risk for DR was consistent with desktop display-based rating. These results indicate that tablet computers can be reliably used for clinical evaluation of fundus images for DR. PMID:22495326
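
    The cross-platform, intra-observer kappa used as the primary outcome is Cohen's kappa, which corrects raw agreement for agreement expected by chance. A minimal sketch of the statistic follows (this is not the study's code, and the toy referral ratings are invented for illustration):

```python
import numpy as np

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: observed agreement between two sets of categorical
    ratings, corrected for the agreement expected by chance."""
    a, b = np.asarray(ratings_a), np.asarray(ratings_b)
    p_o = np.mean(a == b)  # observed agreement
    # Chance agreement: product of the two raters' marginal frequencies,
    # summed over every category that appears in either rating set.
    p_e = sum(np.mean(a == c) * np.mean(b == c) for c in np.union1d(a, b))
    return (p_o - p_e) / (1.0 - p_e)

# Invented referral ratings (1 = referable DR, 0 = not) for ten images,
# as if rated by one grader on a tablet and on a desktop display.
tablet = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
desktop = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]
print(cohens_kappa(tablet, desktop))  # raw agreement 0.9, chance-corrected
```

    Raw agreement here is 9/10, but kappa discounts the 0.5 agreement expected by chance, which is why values like the study's 0.778 indicate substantial rather than merely nominal consistency.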

  8. The role of atomic lines in radiation heating of the experimental space vehicle Fire-II

    NASA Astrophysics Data System (ADS)

    Surzhikov, S. T.

    2015-10-01

    The results of calculating the convective and radiative heating of the Fire-II experimental space vehicle, allowing for the atomic lines of atoms and ions, obtained using the NERAT-ASTEROID computer platform are presented. This computer platform is intended to solve the complete set of equations of radiation gas dynamics for viscous, heat-conducting, and physically and chemically nonequilibrium gas, as well as radiation transfer. The spectral optical properties of high-temperature gases are calculated using ab initio quasi-classical and quantum-mechanical methods. The transfer of selective thermal radiation is computed with a line-by-line method on specially generated computational grids over the radiation wavelengths, which makes it possible to achieve noticeable savings in computational resources.

  9. A novel medical image data-based multi-physics simulation platform for computational life sciences.

    PubMed

    Neufeld, Esra; Szczerba, Dominik; Chavannes, Nicolas; Kuster, Niels

    2013-04-06

    Simulating and modelling complex biological systems in computational life sciences requires specialized software tools that can perform medical image data-based modelling, jointly visualize the data and computational results, and handle large, complex, realistic and often noisy anatomical models. The required novel solvers must provide the power to model the physics, biology and physiology of living tissue within the full complexity of the human anatomy (e.g. neuronal activity, perfusion and ultrasound propagation). A multi-physics simulation platform satisfying these requirements has been developed for applications including device development and optimization, safety assessment, basic research, and treatment planning. This simulation platform consists of detailed, parametrized anatomical models, a segmentation and meshing tool, a wide range of solvers and optimizers, a framework for the rapid development of specialized and parallelized finite element method solvers, a visualization toolkit-based visualization engine, a Python scripting interface for customized applications, a coupling framework, and more. Core components are cross-platform compatible and use open formats. Several examples of applications are presented: hyperthermia cancer treatment planning, tumour growth modelling, evaluating the magneto-haemodynamic effect as a biomarker and physics-based morphing of anatomical models.

  10. Launching a virtual decision lab: development and field-testing of a web-based patient decision support research platform.

    PubMed

    Hoffman, Aubri S; Llewellyn-Thomas, Hilary A; Tosteson, Anna N A; O'Connor, Annette M; Volk, Robert J; Tomek, Ivan M; Andrews, Steven B; Bartels, Stephen J

    2014-12-12

    Over 100 trials show that patient decision aids effectively improve patients' information comprehension and values-based decision making. However, gaps remain in our understanding of several fundamental and applied questions, particularly related to the design of interactive, personalized decision aids. This paper describes an interdisciplinary development process for, and early field testing of, a web-based patient decision support research platform, or virtual decision lab, to address these questions. An interdisciplinary stakeholder panel designed the web-based research platform with three components: a) an introduction to shared decision making, b) a web-based patient decision aid, and c) interactive data collection items. Iterative focus groups provided feedback on paper drafts and online prototypes. A field test assessed a) feasibility of using the research platform, in terms of recruitment, usage, and acceptability; and b) feasibility of using the web-based decision aid component, compared to the performance of a videobooklet decision aid in clinical care. This interdisciplinary, theory-based, patient-centered design approach produced a prototype for field-testing in six months. Participants (n = 126) reported that the decision aid component was easy to use (98%), information was clear (90%), the length was appropriate (100%), it was appropriately detailed (90%), and it held their interest (97%). They spent a mean of 36 minutes using the decision aid and 100% preferred using their home/library computer. Participants scored a mean of 75% correct on the Decision Quality, Knowledge Subscale, and 74 out of 100 on the Preparation for Decision Making Scale. Completing the web-based decision aid reduced mean Decisional Conflict scores from 31.1 to 19.5 (p < 0.01). 
Combining decision science and health informatics approaches facilitated rapid development of a web-based patient decision support research platform that was feasible for use in research studies in terms of recruitment, acceptability, and usage. Within this platform, the web-based decision aid component performed comparably with the videobooklet decision aid used in clinical practice. Future studies may use this interactive research platform to study patients' decision making processes in real-time, explore interdisciplinary approaches to designing web-based decision aids, and test strategies for tailoring decision support to meet patients' needs and preferences.
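
    The Decisional Conflict Scale change reported above can be illustrated with a small scoring sketch. The scoring rule assumed here (sum the 0-4 item responses, divide by the item count, multiply by 25 to reach a 0-100 range) follows the conventional Decisional Conflict Scale scoring, and the response sets are invented, not the study's data:

```python
def decisional_conflict_score(item_responses):
    """Scale a list of 0-4 item responses to a 0-100 score
    (0 = no decisional conflict, 100 = extreme conflict).
    Assumed rule: mean item response multiplied by 25."""
    if not item_responses:
        raise ValueError("need at least one item response")
    return sum(item_responses) / len(item_responses) * 25

# Hypothetical 16-item response sets before and after using a decision aid.
before = [2, 1, 2, 2, 1, 1, 2, 1, 1, 2, 1, 1, 1, 1, 1, 0]
after  = [1, 1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 1, 1, 1, 0, 0]
print(decisional_conflict_score(before))  # 31.25
print(decisional_conflict_score(after))   # 18.75
```

    A drop of roughly 10-12 points on this 0-100 scale, as in the study's 31.1 to 19.5 change, reflects lower average endorsement of conflict items across the scale.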

  11. BrainFrame: a node-level heterogeneous accelerator platform for neuron simulations

    NASA Astrophysics Data System (ADS)

    Smaragdos, Georgios; Chatzikonstantis, Georgios; Kukreja, Rahul; Sidiropoulos, Harry; Rodopoulos, Dimitrios; Sourdis, Ioannis; Al-Ars, Zaid; Kachris, Christoforos; Soudris, Dimitrios; De Zeeuw, Chris I.; Strydis, Christos

    2017-12-01

    Objective. The advent of high-performance computing (HPC) in recent years has led to its increasing use in brain studies through computational models. The scale and complexity of such models are constantly increasing, leading to challenging computational requirements. Even though modern HPC platforms can often deal with such challenges, the vast diversity of the modeling field does not permit a homogeneous acceleration platform to effectively address the complete array of modeling requirements. Approach. In this paper we propose and build BrainFrame, a heterogeneous acceleration platform that incorporates three distinct acceleration technologies, an Intel Xeon-Phi CPU, an NVidia GP-GPU and a Maxeler Dataflow Engine. The PyNN software framework is also integrated into the platform. As a challenging proof of concept, we analyze the performance of BrainFrame on different experiment instances of a state-of-the-art neuron model, representing the inferior-olivary nucleus using a biophysically-meaningful, extended Hodgkin-Huxley representation. The model instances take into account not only the neuronal-network dimensions but also different network-connectivity densities, which can drastically affect the workload’s performance characteristics. Main results. The combined use of different HPC technologies demonstrates that BrainFrame is better able to cope with the modeling diversity encountered in realistic experiments while at the same time running on significantly lower energy budgets. Our performance analysis clearly shows that the model directly affects performance and all three technologies are required to cope with all the model use cases. Significance. The BrainFrame framework is designed to transparently configure and select the appropriate back-end accelerator technology for use per simulation run. The PyNN integration provides a familiar bridge to the vast number of models already available. 
Additionally, it gives a clear roadmap for extending the platform support beyond the proof of concept, with improved usability and directly useful features to the computational-neuroscience community, paving the way for wider adoption.

  12. BrainFrame: a node-level heterogeneous accelerator platform for neuron simulations.

    PubMed

    Smaragdos, Georgios; Chatzikonstantis, Georgios; Kukreja, Rahul; Sidiropoulos, Harry; Rodopoulos, Dimitrios; Sourdis, Ioannis; Al-Ars, Zaid; Kachris, Christoforos; Soudris, Dimitrios; De Zeeuw, Chris I; Strydis, Christos

    2017-12-01

    The advent of high-performance computing (HPC) in recent years has led to its increasing use in brain studies through computational models. The scale and complexity of such models are constantly increasing, leading to challenging computational requirements. Even though modern HPC platforms can often deal with such challenges, the vast diversity of the modeling field does not permit a homogeneous acceleration platform to effectively address the complete array of modeling requirements. In this paper we propose and build BrainFrame, a heterogeneous acceleration platform that incorporates three distinct acceleration technologies, an Intel Xeon-Phi CPU, an NVidia GP-GPU and a Maxeler Dataflow Engine. The PyNN software framework is also integrated into the platform. As a challenging proof of concept, we analyze the performance of BrainFrame on different experiment instances of a state-of-the-art neuron model, representing the inferior-olivary nucleus using a biophysically-meaningful, extended Hodgkin-Huxley representation. The model instances take into account not only the neuronal-network dimensions but also different network-connectivity densities, which can drastically affect the workload's performance characteristics. The combined use of different HPC technologies demonstrates that BrainFrame is better able to cope with the modeling diversity encountered in realistic experiments while at the same time running on significantly lower energy budgets. Our performance analysis clearly shows that the model directly affects performance and all three technologies are required to cope with all the model use cases. The BrainFrame framework is designed to transparently configure and select the appropriate back-end accelerator technology for use per simulation run. The PyNN integration provides a familiar bridge to the vast number of models already available. 
Additionally, it gives a clear roadmap for extending the platform support beyond the proof of concept, with improved usability and directly useful features to the computational-neuroscience community, paving the way for wider adoption.

  13. Accessible high performance computing solutions for near real-time image processing for time critical applications

    NASA Astrophysics Data System (ADS)

    Bielski, Conrad; Lemoine, Guido; Syryczynski, Jacek

    2009-09-01

    High Performance Computing (HPC) hardware solutions such as grid computing and General-Purpose computing on a Graphics Processing Unit (GPGPU) are now accessible to users with general computing needs. Grid computing infrastructures in the form of computing clusters or blades are becoming commonplace, and GPGPU solutions that leverage the processing power of the video card are quickly being integrated into personal workstations. Our interest in these HPC technologies stems from the need to produce near real-time maps from a combination of pre- and post-event satellite imagery in support of post-disaster management. Faster processing provides a twofold gain in this situation: (1) critical information can be provided faster, and (2) more elaborate automated processing can be performed prior to providing the critical information. In our particular case, we test the use of the PANTEX index, which is based on analysis of image textural measures extracted using anisotropic, rotation-invariant GLCM statistics. The use of this index, applied in a moving window, has been shown to successfully identify built-up areas in remotely sensed imagery. Built-up index image masks are important input to the structuring of damage assessment interpretation because they help optimise the workload. The performance of the PANTEX workflow is compared on two different HPC hardware architectures: (1) a blade server with 4 blades, each having dual quad-core CPUs, and (2) a CUDA-enabled GPU workstation. The reference platform is a dual-CPU, quad-core workstation, and the total computing time of the PANTEX workflow is measured. Furthermore, as part of a qualitative evaluation, the differences in setting up and configuring the various hardware solutions and the related software coding effort are presented.
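
    The GLCM texture measures underlying PANTEX can be illustrated with a minimal, unoptimised sketch of a symmetric grey-level co-occurrence matrix and its contrast statistic. This is not the PANTEX implementation; the four offsets (approximating rotation invariance) and the toy patches are illustrative assumptions:

```python
import numpy as np

def glcm(img, levels, offsets):
    """Grey-level co-occurrence matrix, accumulated over several pixel
    offsets and made symmetric, normalised to joint probabilities."""
    m = np.zeros((levels, levels))
    rows, cols = img.shape
    for dr, dc in offsets:
        for r in range(rows):
            for c in range(cols):
                r2, c2 = r + dr, c + dc
                if 0 <= r2 < rows and 0 <= c2 < cols:
                    m[img[r, c], img[r2, c2]] += 1
    m = m + m.T              # symmetric co-occurrences
    return m / m.sum()       # normalise to probabilities

def contrast(p):
    """GLCM contrast: sum over (i,j) of (i-j)^2 * p(i,j).
    Low for homogeneous patches, high for strongly textured ones."""
    i, j = np.indices(p.shape)
    return float(np.sum((i - j) ** 2 * p))

# Four offsets covering the 0/45/90/135 degree directions.
offs = [(0, 1), (1, 1), (1, 0), (1, -1)]
flat = np.zeros((8, 8), dtype=int)           # homogeneous patch
noisy = np.indices((8, 8)).sum(0) % 2        # checkerboard patch
print(contrast(glcm(flat, 2, offs)), contrast(glcm(noisy, 2, offs)))
```

    The homogeneous patch yields zero contrast while the checkerboard yields a high value, which is the separation a moving-window built-up detector exploits.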

  14. Using Computer Simulations for Investigating a Sex Education Intervention: An Exploratory Study

    PubMed Central

    Bullock, Seth; Graham, Cynthia A; Ingham, Roger

    2017-01-01

    Background Sexually transmitted infections (STIs) are ongoing concerns. The best method for preventing the transmission of these infections is the correct and consistent use of condoms. Few studies have explored the use of games in interventions for increasing condom use by challenging the false sense of security associated with judging the presence of an STI based on attractiveness. Objectives The primary purpose of this study was to explore the potential use of computer simulation as a serious game for sex education. Specific aims were to (1) study the influence of a newly designed serious game on self-rated confidence for assessing STI risk and (2) examine whether this varied by gender, age, and scores on sexuality-related personality trait measures. Methods This paper undertook a Web-based questionnaire study employing between- and within-subjects analyses. A Web-based platform hosted in the United Kingdom was used to deliver male and female stimuli (facial photographs) and collect data. A convenience sample of 66 participants (64% [42/66] male; mean age 22.5 years) completed the Term on the Tides, a computer simulation developed for this study. Participants also completed questionnaires on demographics, sexual preferences, sexual risk evaluations, the Sexual Sensation Seeking Scale (SSS), and the Sexual Inhibition Subscale 2 (SIS2) of the Sexual Inhibition/Sexual Excitation Scales-Short Form (SIS/SES - SF). Results The overall confidence of participants to evaluate sexual risks decreased after playing the game (P<.005). Age and personality trait measures did not predict the change in confidence of evaluating risk. Women demonstrated larger shifts in confidence than did men (P=.03). Conclusions This study extends the literature by investigating the potential of computer simulations as a serious game for sex education. Engaging in the Term on the Tides game had an impact on participants’ confidence in evaluating sexual risks. PMID:28468747

  15. Web Platform for Sharing Modeling Software in the Field of Nonlinear Optics

    NASA Astrophysics Data System (ADS)

    Dubenskaya, Julia; Kryukov, Alexander; Demichev, Andrey

    2018-02-01

    We describe the prototype of a Web platform intended for sharing software programs for computer modeling in the rapidly developing field of nonlinear optics phenomena. The suggested platform is built on top of the HUBZero open-source middleware. In addition to the basic HUBZero installation, we added to our platform the capability to run Docker containers via an external application server and to send calculation programs to those containers for execution. The presented web platform provides a wide range of features and might be of benefit to nonlinear optics researchers.

  16. Scrap computer recycling in Taiwan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, C.H.; Chang, S.L.; Wang, K.M.

    1999-07-01

    It is estimated that approximately 700,000 scrap personal computers will be generated each year in Taiwan. The disposal of such a huge amount of scrap computers presents a difficult task for the island due to the scarcity of landfills and incineration facilities available locally. Also, the hazardous materials contained in the scrap computers may cause serious pollution to the environment if they are not properly disposed of. Thus, the EPA of Taiwan declared scrap personal computers a producer-responsibility recycling product in July 1997, mandating that the manufacturers, importers and sellers of personal computers recover and recycle their scrap computers properly. Beginning on June 1, 1998, a scrap computer recycling plan was officially implemented on the island. Under this plan, consumers can deliver their unwanted personal computers to designated collection points to receive reward money. Currently, only six items are mandated for recycling under this plan: notebook computers, monitors, and the hard disk, power supply, printed circuit board, and case of the personal computer's main unit. This paper presents the current scrap computer recycling system in Taiwan.

  17. Parallel Multivariate Spatio-Temporal Clustering of Large Ecological Datasets on Hybrid Supercomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sreepathi, Sarat; Kumar, Jitendra; Mills, Richard T.

    A proliferation of data from vast networks of remote sensing platforms (satellites, unmanned aircraft systems (UAS), airborne platforms, etc.), observational facilities (meteorological, eddy covariance, etc.), state-of-the-art sensors, and simulation models offers unprecedented opportunities for scientific discovery. Unsupervised classification is a widely applied data mining approach to derive insights from such data. However, classification of very large data sets is a complex computational problem that requires efficient numerical algorithms and implementations on high performance computing (HPC) platforms. Additionally, increasing power, space, cooling and efficiency requirements have led to the deployment of hybrid supercomputing platforms with complex architectures and memory hierarchies like the Titan system at Oak Ridge National Laboratory. The advent of such accelerated computing architectures offers new challenges and opportunities for big data analytics in general and specifically, large scale cluster analysis in our case. Although there is an existing body of work on parallel cluster analysis, those approaches do not fully meet the needs imposed by the nature and size of our large data sets. Moreover, they had scaling limitations and were mostly limited to traditional distributed memory computing platforms. We present a parallel Multivariate Spatio-Temporal Clustering (MSTC) technique based on k-means cluster analysis that can target hybrid supercomputers like Titan. We developed a hybrid MPI, CUDA and OpenACC implementation that can utilize both CPU and GPU resources on computational nodes. We describe performance results on Titan that demonstrate the scalability and efficacy of our approach in processing large ecological data sets.
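
    The serial core that MSTC builds on is Lloyd's k-means algorithm; the paper's contribution is its hybrid MPI/CUDA/OpenACC parallelisation, which is not reproduced here. A minimal sketch of the serial algorithm on invented multivariate samples (the distance step is the part such implementations typically offload to GPUs):

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain Lloyd's k-means: alternate nearest-centre assignment
    and centre update until the centres stop moving."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assignment step: distance of every sample to every centre.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Update step: each centre moves to the mean of its members;
        # an empty cluster keeps its previous centre.
        new = np.array([X[labels == c].mean(axis=0) if np.any(labels == c)
                        else centers[c] for c in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# Invented "ecoregion" samples: 3 variables per grid cell (e.g.
# temperature, precipitation, elevation), drawn from two regimes.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (100, 3)), rng.normal(3, 0.3, (100, 3))])
labels, centers = kmeans(X, k=2)
```

    The assignment step is embarrassingly parallel over samples, which is why it maps naturally onto the CPU+GPU node-level decomposition the paper describes.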

  18. e-Collaboration for Earth observation (E-CEO): the Cloud4SAR interferometry data challenge

    NASA Astrophysics Data System (ADS)

    Casu, Francesco; Manunta, Michele; Boissier, Enguerran; Brito, Fabrice; Aas, Christina; Lavender, Samantha; Ribeiro, Rita; Farres, Jordi

    2014-05-01

    The e-Collaboration for Earth Observation (E-CEO) project addresses the technologies and architectures needed to provide a collaborative research platform for automating data mining, processing, and information extraction experiments. The platform serves for the implementation of Data Challenge Contests focusing on information extraction for Earth Observation (EO) applications. The ability to implement multiple processors within a common software environment facilitates the validation, evaluation, and transparent peer comparison of different methodologies, which is one of the main requirements raised by scientists who develop algorithms in the EO field. In this scenario, we set up a Data Challenge, referred to as Cloud4SAR (http://wiki.services.eoportal.org/tiki-index.php?page=ECEO), to foster the deployment of Interferometric SAR (InSAR) processing chains within a Cloud Computing platform. While a large variety of InSAR processing software tools are available, they require a high level of expertise and complex user interaction to be run effectively. Computing a co-seismic interferogram or a 20-year deformation time series over a volcanic area is not an easy task to perform in a fully unsupervised way and/or in a very short time (hours or less). Benefiting from ESA's E-CEO platform, participants can optimise algorithms in a Virtual Sandbox environment without being expert programmers, and compute results on high-performing Cloud platforms. Cloud4SAR requires solving a relatively easy InSAR problem while trying to maximize the exploitation of the processing capabilities provided by a Cloud Computing infrastructure. The proposed challenge offers two different frameworks, each dedicated to participants with different skills, identified as Beginners and Experts. For both, the contest mainly concerns the degree of automation of the deployed algorithms, whichever is used, as well as the capability to benefit effectively from a parallel computing environment.

  19. Investigation into Cloud Computing for More Robust Automated Bulk Image Geoprocessing

    NASA Technical Reports Server (NTRS)

    Brown, Richard B.; Smoot, James C.; Underwood, Lauren; Armstrong, C. Duane

    2012-01-01

    Geospatial resource assessments frequently require timely geospatial data processing that involves large multivariate remote sensing data sets. In particular, disaster response requires rapid access to large data volumes, substantial storage space and high performance processing capability. The processing and distribution of these data into usable information products requires a processing pipeline that can efficiently manage the required storage, computing utilities, and data handling requirements. In recent years, with the availability of cloud computing technology, cloud processing platforms have made available a powerful new computing infrastructure resource that can meet this need. To assess the utility of this resource, this project investigates cloud computing platforms for bulk, automated geoprocessing capabilities with respect to data handling and application development requirements. This presentation covers work being conducted by the Applied Sciences Program Office at NASA-Stennis Space Center. A prototypical set of image manipulation and transformation processes that incorporate sample Unmanned Airborne System data was developed to create value-added products and tested for implementation on the "cloud". This project outlines the steps involved in developing and testing open-source process code on a local prototype platform, and then transitioning this code, with its associated environment requirements, into an analogous but memory- and processor-enhanced cloud platform. A data processing cloud was used to store both standard digital camera panchromatic and multi-band image data, which were subsequently subjected to standard image processing functions such as NDVI (Normalized Difference Vegetation Index), NDMI (Normalized Difference Moisture Index), band stacking, reprojection, and other similar data processes. Cloud infrastructure service providers were evaluated by taking these locally tested processing functions and applying them to a given cloud-enabled infrastructure to assess and compare environment setup options and enabled technologies. This project reviews findings observed when cloud platforms were evaluated for bulk geoprocessing capabilities based on data handling and application development requirements.
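
    NDVI, one of the standard band-math products mentioned above, is the normalized difference of the near-infrared and red bands; NDMI follows the same pattern with NIR and SWIR. A minimal sketch (the reflectance values below are invented, not the project's data):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Values near +1 suggest dense vegetation; near 0, bare soil or urban
    surfaces. eps guards against division by zero on dark pixels."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Hypothetical 2x2 reflectance scene: top row vegetated (high NIR,
# low red), bottom row bare ground (NIR and red nearly equal).
nir = np.array([[0.50, 0.45], [0.25, 0.22]])
red = np.array([[0.08, 0.10], [0.20, 0.21]])
print(np.round(ndvi(nir, red), 2))
```

    Applied per pixel over an entire multi-band image, this is exactly the kind of embarrassingly parallel band arithmetic that scales well on cloud processing platforms.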

  20. The Technology Refresh Program: Affording State-of-the Art Personal Computing.

    ERIC Educational Resources Information Center

    Spiwak, Rand

    2000-01-01

    Describes the Florida Community College Technology Refresh Program in which 28 Florida community colleges refresh their personal computer technology on a three-year cyclical basis through negotiation of a contract with Dell Computer Corporation. Discusses the contract highlights (such as a 22.5 percent discount on personal computers and on-site…

  1. Using Personal Computers To Acquire Special Education Information. Revised. ERIC Digest #429.

    ERIC Educational Resources Information Center

    ERIC Clearinghouse on Handicapped and Gifted Children, Reston, VA.

    This digest offers basic information about resources, available to users of personal computers, in the area of professional development in special education. Two types of resources are described: those that can be purchased on computer diskettes and those made available by linking personal computers through electronic telephone networks. Resources…

  2. Open-Phylo: a customizable crowd-computing platform for multiple sequence alignment

    PubMed Central

    2013-01-01

    Citizen science games such as Galaxy Zoo, Foldit, and Phylo aim to harness the intelligence and processing power generated by crowds of online gamers to solve scientific problems. However, the selection of the data to be analyzed through these games is under the exclusive control of the game designers, and so are the results produced by gamers. Here, we introduce Open-Phylo, a freely accessible crowd-computing platform that enables any scientist to enter our system and use crowds of gamers to assist computer programs in solving one of the most fundamental problems in genomics: the multiple sequence alignment problem. PMID:24148814

  3. SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications

    PubMed Central

    Kalinin, Alexandr A.; Palanimalai, Selvam; Dinov, Ivo D.

    2018-01-01

    The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis. PMID:29630069

  4. SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications.

    PubMed

    Kalinin, Alexandr A; Palanimalai, Selvam; Dinov, Ivo D

    2017-04-01

    The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis.

  5. A Wireless Biomedical Signal Interface System-on-Chip for Body Sensor Networks.

    PubMed

    Lei Wang; Guang-Zhong Yang; Jin Huang; Jinyong Zhang; Li Yu; Zedong Nie; Cumming, D R S

    2010-04-01

    Recent years have seen the rapid development of biosensor technology, system-on-chip design, wireless technology, and ubiquitous computing. When assembled into an autonomous body sensor network (BSN), these technologies become powerful tools in well-being monitoring, medical diagnostics, and personal connectivity. In this paper, we describe the first demonstration of a fully customized mixed-signal silicon chip that has most of the attributes required for use in a wearable or implantable BSN. Our intellectual-property blocks include a low-power analog sensor interface for temperature and pH, a data multiplexing and conversion module, a digital platform based around an 8-b microcontroller, data encoding for spread-spectrum wireless transmission, and an RF section requiring very few off-chip components. The chip has been fully evaluated and tested by connection to external sensors, and it satisfied typical system requirements.

  6. Why Mobile Is a Must

    ERIC Educational Resources Information Center

    McCaffrey, Mary

    2011-01-01

    There is a need for a new educational model that makes learning personal and motivating, and helps secure students' future in the knowledge economy. Mobile technology opens the door to it. Mobile devices provide the platform and, as importantly, the incentive for students to take personal ownership of the learning experience. The lessons absorbed…

  7. Semantic Web-Driven LMS Architecture towards a Holistic Learning Process Model Focused on Personalization

    ERIC Educational Resources Information Center

    Kerkiri, Tania

    2010-01-01

    A comprehensive presentation is made here of the modular architecture of an e-learning platform with a distinctive emphasis on content personalization, combining advantages from semantic web technology, collaborative filtering, and recommendation systems. Modules of this architecture handle information about both the domain-specific didactic…

  8. Converged photonic data storage and switch platform for exascale disaggregated data centers

    NASA Astrophysics Data System (ADS)

    Pitwon, R.; Wang, K.; Worrall, A.

    2017-02-01

    We report on a converged optically enabled Ethernet storage, switch and compute platform, which could support future disaggregated data center architectures. The platform includes optically enabled Ethernet switch controllers, an advanced electro-optical midplane and optically interchangeable generic end node devices. We demonstrate system level performance using optically enabled Ethernet disk drives and micro-servers across optical links of varied lengths.

  9. Use of a personal computer for dynamical engineering illustrations in a classroom and over an instructional TV network

    NASA Technical Reports Server (NTRS)

    Watson, V. R.

    1983-01-01

    A personal computer has been used to illustrate physical phenomena and problem solution techniques in engineering classes. According to student evaluations, instruction of concepts was greatly improved through the use of these illustrations. This paper describes the class of phenomena that can be effectively illustrated, the techniques used to create these illustrations, and the techniques used to display the illustrations in regular classrooms and over an instructional TV network. The features of a personal computer required to apply these techniques are listed. The capabilities of some present personal computers are discussed and a forecast of the capabilities of future personal computers is presented.

  10. Transformation of personal computers and mobile phones into genetic diagnostic systems.

    PubMed

    Walker, Faye M; Ahmad, Kareem M; Eisenstein, Michael; Soh, H Tom

    2014-09-16

    Molecular diagnostics based on the polymerase chain reaction (PCR) offer rapid and sensitive means for detecting infectious disease, but prohibitive costs have impeded their use in resource-limited settings where such diseases are endemic. In this work, we report an innovative method for transforming a desktop computer and a mobile camera phone--devices that have become readily accessible in developing countries--into a highly sensitive DNA detection system. This transformation was achieved by converting a desktop computer into a de facto thermal cycler with software that controls the temperature of the central processing unit (CPU), allowing for highly efficient PCR. Next, we reconfigured the mobile phone into a fluorescence imager by adding a low-cost filter, which enabled us to quantitatively measure the resulting PCR amplicons. Our system is highly sensitive, achieving quantitative detection of as little as 9.6 attograms of target DNA, and we show that its performance is comparable to advanced laboratory instruments at approximately 1/500th of the cost. Finally, in order to demonstrate clinical utility, we have used our platform for the successful detection of genomic DNA from the parasite that causes Chagas disease, Trypanosoma cruzi, directly in whole, unprocessed human blood at concentrations 4-fold below the clinical titer of the parasite.
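The paper's key trick is using software-controlled CPU load to hold PCR temperature stages; the abstract does not detail that control software, so the following is only a toy sketch: a bang-bang controller driving a simulated first-order CPU thermal model through the three classic PCR stages. All rates and constants are invented for illustration.

```python
# Toy sketch (not the authors' software): bang-bang control of a simulated
# first-order CPU thermal model stepping through the three classic PCR
# stages. All rates and constants below are invented for illustration.

def run_stage(temp, setpoint, ambient=25.0, heat_rate=4.0,
              cool_rate=0.05, dt=0.1, max_steps=2000):
    """Drive the simulated CPU temperature (degC) toward a setpoint."""
    for _ in range(max_steps):
        load_on = temp < setpoint                 # bang-bang decision
        heating = heat_rate if load_on else 0.0   # CPU load as heat source
        # Newtonian cooling toward ambient, plus heating from CPU load
        temp += dt * (heating - cool_rate * (temp - ambient))
        if abs(temp - setpoint) < 0.5:
            break
    return temp

temp = 25.0
for stage in (95.0, 55.0, 72.0):   # denature, anneal, extend
    temp = run_stage(temp, stage)
    print(f"setpoint {stage:.0f} degC -> reached {temp:.1f} degC")
```

A real implementation would read the CPU's temperature sensor rather than a model, but the control loop would have the same shape: measure, compare to the stage setpoint, and modulate computational load accordingly.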

  11. Transformation of Personal Computers and Mobile Phones into Genetic Diagnostic Systems

    PubMed Central

    2014-01-01

    Molecular diagnostics based on the polymerase chain reaction (PCR) offer rapid and sensitive means for detecting infectious disease, but prohibitive costs have impeded their use in resource-limited settings where such diseases are endemic. In this work, we report an innovative method for transforming a desktop computer and a mobile camera phone—devices that have become readily accessible in developing countries—into a highly sensitive DNA detection system. This transformation was achieved by converting a desktop computer into a de facto thermal cycler with software that controls the temperature of the central processing unit (CPU), allowing for highly efficient PCR. Next, we reconfigured the mobile phone into a fluorescence imager by adding a low-cost filter, which enabled us to quantitatively measure the resulting PCR amplicons. Our system is highly sensitive, achieving quantitative detection of as little as 9.6 attograms of target DNA, and we show that its performance is comparable to advanced laboratory instruments at approximately 1/500th of the cost. Finally, in order to demonstrate clinical utility, we have used our platform for the successful detection of genomic DNA from the parasite that causes Chagas disease, Trypanosoma cruzi, directly in whole, unprocessed human blood at concentrations 4-fold below the clinical titer of the parasite. PMID:25223929

  12. An Open Architecture to Support Social and Health Services in a Smart TV Environment.

    PubMed

    Costa, Carlos Rivas; Anido-Rifon, Luis E; Fernandez-Iglesias, Manuel J

    2017-03-01

    Our objective was to design, implement, and test a solution based on smart TV technologies that provides social and health services for the elderly at home, with access to all services. The architecture proposed is based on an open software platform and standard personal computing hardware. This provides great flexibility to develop new applications over the underlying infrastructure or to integrate new devices, for instance to monitor a broad range of vital signs in those cases where home monitoring is required. A proof-of-concept system was designed, implemented, and deployed. Applications range from social network clients to vital signs monitoring, and from interactive TV contests to conventional online care applications such as medication reminders or telemedicine. In both cases, the results have been very positive, confirming the initial perception of the TV as a convenient, easy-to-use technology for providing social and health care. The TV set is a much more familiar computing interface for most senior users, and as a consequence, smart TVs become a most convenient solution for the design and implementation of applications and services targeted at this user group. This proposal has been tested in a real setting with 62 senior people at their homes. Users included both individuals with experience using computers and others reluctant to use them.

  13. Comparing Neuromorphic Solutions in Action: Implementing a Bio-Inspired Solution to a Benchmark Classification Task on Three Parallel-Computing Platforms

    PubMed Central

    Diamond, Alan; Nowotny, Thomas; Schmuker, Michael

    2016-01-01

    Neuromorphic computing employs models of neuronal circuits to solve computing problems. Neuromorphic hardware systems are now becoming more widely available and “neuromorphic algorithms” are being developed. As they are maturing toward deployment in general research environments, it becomes important to assess and compare them in the context of the applications they are meant to solve. This should encompass not just task performance, but also ease of implementation, speed of processing, scalability, and power efficiency. Here, we report our practical experience of implementing a bio-inspired, spiking network for multivariate classification on three different platforms: the hybrid digital/analog Spikey system, the digital spike-based SpiNNaker system, and GeNN, a meta-compiler for parallel GPU hardware. We assess performance using a standard hand-written digit classification task. We found that whilst a different implementation approach was required for each platform, classification performances remained in line. This suggests that all three implementations were able to exercise the model's ability to solve the task rather than exposing inherent platform limits, although differences emerged when capacity was approached. With respect to execution speed and power consumption, we found that for each platform a large fraction of the computing time was spent outside of the neuromorphic device, on the host machine. Time was spent in a range of combinations of preparing the model, encoding suitable input spiking data, shifting data, and decoding spike-encoded results. This is also where a large proportion of the total power was consumed, most markedly for the SpiNNaker and Spikey systems. We conclude that the simulation efficiency advantage of the assessed specialized hardware systems is easily lost in excessive host-device communication, or non-neuronal parts of the computation. 
These results emphasize the need to optimize the host-device communication architecture for scalability, maximum throughput, and minimum latency. Moreover, our results indicate that special attention should be paid to minimize host-device communication when designing and implementing networks for efficient neuromorphic computing. PMID:26778950

  14. FLAME: A platform for high performance computing of complex systems, applied for three case studies

    DOE PAGES

    Kiran, Mariam; Bicak, Mesude; Maleki-Dizaji, Saeedeh; ...

    2011-01-01

    FLAME allows complex models to be automatically parallelised on High Performance Computing (HPC) grids, enabling large numbers of agents to be simulated over short periods of time. Modellers are typically hindered by the complexity of porting models to parallel platforms and by the time taken to run large simulations on a single machine; FLAME overcomes both. Three case studies from different disciplines were modelled using FLAME and are presented along with their performance results on a grid.

  15. Generating Researcher Networks with Identified Persons on a Semantic Service Platform

    NASA Astrophysics Data System (ADS)

    Jung, Hanmin; Lee, Mikyoung; Kim, Pyung; Lee, Seungwoo

    This paper describes a Semantic Web-based method to acquire researcher networks by means of an identification scheme, an ontology, and reasoning. Three steps are required to realize it: resolving co-references, finding experts, and generating researcher networks. We adopt OntoFrame as the underlying semantic service platform and apply reasoning to establish direct relations between far-off classes in the ontology schema. 453,124 Elsevier journal articles with metadata and full-text documents in the information technology and biomedical domains have been loaded and served on the platform as a test set.
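OntoFrame's ontology and reasoning machinery are not shown in the abstract; as a hypothetical illustration of just the final step, once author identifiers have been resolved, researcher-network edges can be derived from article metadata by a simple pairwise scan (article and author IDs below are invented):

```python
# Toy derivation of a researcher network from article metadata: after
# co-reference resolution yields stable author IDs, co-authorship edges
# fall out of a pairwise scan. The article/author IDs are hypothetical.

from itertools import combinations
from collections import Counter

articles = [
    {"id": "a1", "authors": ["kim-01", "lee-02", "jung-03"]},
    {"id": "a2", "authors": ["kim-01", "lee-02"]},
    {"id": "a3", "authors": ["lee-02", "park-04"]},
]

edges = Counter()
for art in articles:
    for pair in combinations(sorted(art["authors"]), 2):
        edges[pair] += 1  # edge weight = number of co-authored articles

print(edges[("kim-01", "lee-02")])  # 2
```

The semantic-platform version replaces this in-memory scan with ontology-backed queries and reasoning, but the resulting researcher network has the same weighted-graph shape.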

  16. Revision and Expansion of Navy Computer Adaptive Personality Scales (NCAPS)

    DTIC Science & Technology

    2007-08-01

    Navy Personnel Research, Studies, and Technology Division. Revision and Expansion of Navy Computer Adaptive Personality Scales (NCAPS). Robert J. Schneider, Ph.D.; Kerri L. Ferstl, Ph.D. Technical Note TN-07-12, August 2007.

  17. Investigation of Item-Pair Presentation and Construct Validity of the Navy Computer Adaptive Personality Scales (NCAPS)

    DTIC Science & Technology

    2006-10-01

    Investigation of Item-Pair Presentation and Construct Validity of the Navy Computer Adaptive Personality Scales (NCAPS). Christina M. Underhill, Ph.D. Reviewed and approved by Jacqueline A. Mottern.

  18. Cloud Computing Fundamentals

    NASA Astrophysics Data System (ADS)

    Furht, Borko

    In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.

  19. Hardware platforms for MEMS gyroscope tuning based on evolutionary computation using open-loop and closed -loop frequency response

    NASA Technical Reports Server (NTRS)

    Keymeulen, Didier; Ferguson, Michael I.; Fink, Wolfgang; Oks, Boris; Peay, Chris; Terrile, Richard; Cheng, Yen; Kim, Dennis; MacDonald, Eric; Foor, David

    2005-01-01

    We propose a tuning method for MEMS gyroscopes, based on evolutionary computation, that efficiently increases their sensitivity. The tuning method was tested on the second-generation JPL/Boeing Post-resonator MEMS gyroscope using measurements of the frequency response of the MEMS device in open-loop operation. We also report on the development of a hardware platform for integrated tuning and closed-loop operation of MEMS gyroscopes. The control of this device is implemented through a digital design on a Field Programmable Gate Array (FPGA). The hardware platform easily transitions to an embedded solution that allows for the miniaturization of the system to a single chip.
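The abstract does not spell out the evolutionary algorithm used, so the following is only a hedged sketch of the general idea: a simple (1+1) evolution strategy searching for bias voltages that minimize the split between two resonant-mode frequencies of a mock gyroscope model. The objective function and all parameters are invented; the JPL work evaluated candidates against the device's measured frequency response.

```python
# Sketch of evolutionary tuning: a (1+1) evolution strategy searching
# for bias voltages that minimize a mock resonant-frequency split.
# The objective and all constants are invented for illustration.

import random

def frequency_split(v):
    """Mock objective: |f1 - f2| as a function of two bias voltages."""
    v1, v2 = v
    return abs((v1 - 3.2) * 0.8) + abs((v2 - 1.7) * 1.1)

random.seed(0)
best = [random.uniform(0, 5), random.uniform(0, 5)]
best_fit = frequency_split(best)
for _ in range(2000):
    cand = [x + random.gauss(0, 0.5) for x in best]  # mutate
    fit = frequency_split(cand)
    if fit < best_fit:                # keep the mutant only if it improves
        best, best_fit = cand, fit

print(f"best split found: {best_fit:.3f}")
```

In hardware, each fitness evaluation is a frequency-response measurement rather than a function call, which is why the FPGA-based platform that automates measurement and closed-loop operation matters.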

  20. [Construction of platform on the three-dimensional finite element model of the dentulous mandibular body of a normal person].

    PubMed

    Gong, Lu-Lu; Zhu, Jing; Ding, Zu-Quan; Li, Guo-Qiang; Wang, Li-Ming; Yan, Bo-Yong

    2008-04-01

    Our aim was to develop a method to construct a three-dimensional finite element model of the dentulous mandibular body of a normal person. A series of pictures at intervals of 0.1 mm was taken by CT scanning. After extracting the coordinates of key points from the pictures, we used a C program to process the data and constructed a platform for the three-dimensional finite element model of the dentulous mandibular body with the Ansys finite element analysis software. The experimental results showed that the resulting model platform was accurate and practical. The exact three-dimensional shape of the model was well constructed, and each part of the model, such as a single tooth, can be deleted, allowing various tooth-loss clinical cases to be emulated. The three-dimensional finite element model reproduces life-like shapes of dental cusps, and each part of the model can easily be removed. In conclusion, this experiment provides a good platform for biomechanical analysis of various tooth-loss clinical cases.

  1. Development of qualification guidelines for personal computer-based aviation training devices.

    DOT National Transportation Integrated Search

    1995-02-01

    Recent advances in the capabilities of personal computers have resulted in an increase in the number of flight simulation programs made available as Personal Computer-Based Aviation Training Devices (PCATDs). The potential benefits of PCATDs have been...

  2. Evaluation of an Application for Making Palmtop Computers Accessible to Individuals with Intellectual Disabilities

    ERIC Educational Resources Information Center

    Stock, Steven E.; Davies, Daniel K.; Davies, Katelyn R.; Wehmeyer, Michael L.

    2006-01-01

    Background: Palmtop computers provide a promising mobile platform to address barriers to computer-based supports for people with intellectual disabilities. This study evaluated a specially designed interface to make navigation and features of palmtop computers more accessible to users with intellectual disabilities. Method: The specialised…

  3. Control mechanism of double-rotator-structure ternary optical computer

    NASA Astrophysics Data System (ADS)

    Kai, SONG; Liping, YAN

    2017-03-01

    The double-rotator-structure ternary optical processor (DRSTOP) has two distinguishing characteristics, namely giant data-bits parallel computing and a reconfigurable processor, which allow it to handle thousands of data bits in parallel and to run much faster than electronic computers and the other optical computing systems demonstrated so far. In order to put DRSTOP into practical application, this paper established a series of methods, namely a task classification method, a data-bits allocation method, a control information generation method, a control information formatting and sending method, and a method for obtaining decoded results. These methods form the control mechanism of DRSTOP, which makes it an automated computing platform. Compared with traditional calculation tools, the DRSTOP computing platform can ease the contradiction between high energy consumption and big-data computing by greatly reducing the cost of communications and I/O. Finally, the paper designed a set of experiments for the DRSTOP control mechanism to verify its feasibility and correctness. Experimental results showed that the control mechanism is correct, feasible, and efficient.

  4. Microfluidic chip-based technologies: emerging platforms for cancer diagnosis

    PubMed Central

    2013-01-01

    The development of early and personalized diagnostic protocols is considered the most promising avenue to decrease mortality from cancer and improve outcome. The emerging microfluidic-based analyzing platforms hold high promises to fulfill high-throughput and high-precision screening with reduced equipment cost and low analysis time, as compared to traditional bulky counterparts in bench-top laboratories. This article overviewed the potential applications of microfluidic technologies for detection and monitoring of cancer through nucleic acid and protein biomarker analysis. The implications of the technologies in cancer cytology that can provide functional personalized diagnosis were highlighted. Finally, the future niches for using microfluidic-based systems in tumor screening were briefly discussed. PMID:24070124

  5. Co-occurrence of addictive behaviours: personality factors related to substance use, gambling and computer gaming.

    PubMed

    Walther, Birte; Morgenstern, Matthis; Hanewinkel, Reiner

    2012-01-01

    To investigate the co-occurrence and shared personality characteristics of problematic computer gaming, problematic gambling, and substance use. Cross-sectional survey data were collected from 2,553 German students aged 12-25 years. Self-report measures of substance use (alcohol, tobacco and cannabis), problematic gambling (South Oaks Gambling Screen - Revised for Adolescents, SOGS-RA), problematic computer gaming (Video Game Dependency Scale, KFN-CSAS-II), and twelve different personality characteristics were obtained. Analyses revealed positive correlations between tobacco, alcohol and cannabis use, and a smaller positive correlation between problematic gambling and problematic computer gaming. Problematic computer gaming co-occurred only with cannabis use, whereas problematic gambling was associated with all three types of substance use. Multivariate multilevel analyses showed differential patterns of personality characteristics. High impulsivity was the only personality characteristic associated with all five addictive behaviours. Depression and extraversion were specific to substance users. Four personality characteristics were specifically associated with problematic computer gaming: irritability/aggression, social anxiety, ADHD, and low self-esteem. Problematic gamblers thus seem to be more similar to substance users than problematic computer gamers are. From a personality perspective, the results correspond to the inclusion of gambling in the same DSM-V category as substance use and question an analogous treatment of computer gaming. Copyright © 2012 S. Karger AG, Basel.

  6. OMPC: an Open-Source MATLAB®-to-Python Compiler

    PubMed Central

    Jurica, Peter; van Leeuwen, Cees

    2008-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com. PMID:19225577
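OMPC's actual compiler is far more complete than anything shown here; purely as a hypothetical hint at what "syntax adaptation" means, a couple of regex rewrite rules can already map simple MATLAB® statements onto numpy-backed Python (the rules and examples below are invented, not OMPC's implementation):

```python
# Toy illustration of syntax adaptation: rewriting a few MATLAB constructs
# into numpy-backed Python. These regexes only hint at the approach;
# OMPC's real compiler handles the full language.

import re

RULES = [
    (re.compile(r"%"), "#"),                          # comments
    (re.compile(r"\bzeros\((.+?)\)"), r"np.zeros((\1))"),
    (re.compile(r";\s*$"), ""),                       # trailing semicolons
]

def adapt(matlab_line):
    out = matlab_line
    for pattern, repl in RULES:
        out = pattern.sub(repl, out)
    return out

print(adapt("A = zeros(3, 4);"))       # A = np.zeros((3, 4))
print(adapt("% allocate workspace"))   # # allocate workspace
```

A real translator must of course parse rather than pattern-match (1-based indexing, `end`, matrix literals, and operator semantics all need genuine analysis), which is why OMPC combines syntax adaptation with runtime emulation of MATLAB® semantics.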

  7. Precollege Computer Literacy: A Personal Computing Approach. Second Edition.

    ERIC Educational Resources Information Center

    Moursund, David

    Intended for elementary and secondary teachers and curriculum specialists, this booklet discusses and defines computer literacy as a functional knowledge of computers and their effects on students and the rest of society. It analyzes personal computing and the aspects of computers that have direct impact on students. Outlining computer-assisted…

  8. Learners' Field Dependence and the Effects of Personalized Narration on Learners' Computer Perceptions and Task-Related Attitudes in Multimedia Learning

    ERIC Educational Resources Information Center

    Liew, Tze Wei; Tan, Su-Mae; Seydali, Rouzbeh

    2014-01-01

    In this article, the effects of personalized narration in multimedia learning on learners' computer perceptions and task-related attitudes were examined. Twenty-six field independent and 22 field dependent participants studied the computer-based multimedia lessons on C-Programming, either with personalized narration or non-personalized narration.…

  9. Lensfree Computational Microscopy Tools and their Biomedical Applications

    NASA Astrophysics Data System (ADS)

    Sencan, Ikbal

    Conventional microscopy has been a revolutionary tool for biomedical applications since its invention several centuries ago. The ability to non-destructively observe very fine details of biological objects in real time has made it possible to answer many important questions about their structures and functions. Unfortunately, most of these advanced microscopes are complex, bulky, expensive, and/or hard to operate, so they have not reached beyond the walls of well-equipped laboratories. Recent improvements in optoelectronic components and computational methods allow the creation of imaging systems that better fulfill the specific needs of clinics and research-related biomedical applications. In this respect, lensfree computational microscopy aims to replace bulky and expensive optical components with compact and cost-effective alternatives through the use of computation, which can be particularly useful for lab-on-a-chip platforms as well as imaging applications in low-resource settings. Several high-throughput on-chip platforms have been built with this approach for applications including, but not limited to, cytometry, micro-array imaging, rare cell analysis, telemedicine, and water quality screening. The lack of optical complexity in these lensfree on-chip imaging platforms is compensated by computational techniques, which serve various purposes in coherent, incoherent and fluorescent on-chip imaging platforms, e.g. improving the spatial resolution, undoing light diffraction without using lenses, localizing objects in a large volume, and retrieving the phase or the color/spectral content of the objects. For instance, pixel super-resolution approaches based on source shifting are used in lensfree imaging platforms to prevent undersampling, Bayer-pattern, and aliasing artifacts.
Another method, iterative phase retrieval, is utilized to compensate for the lack of lenses by undoing the diffraction and removing the twin-image noise of in-line holograms. This technique enables recovering the complex optical field from its intensity measurement(s) by using additional constraints in the iterations, such as spatial boundaries and other known properties of the objects. Another computational tool employed in lensfree imaging is compressive sensing (or decoding), a novel method that takes advantage of the fact that natural signals/objects are mostly sparse or compressible in known bases. This inherent property of objects enables better signal recovery when the number of measurements is low, even below the Nyquist rate, and increases the additive-noise immunity of the system.
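The iterative phase retrieval described above can be sketched in its simplest error-reduction (Gerchberg-Saxton-style) form: alternate between enforcing the measured Fourier magnitude and enforcing object-domain constraints (support and nonnegativity). This is a toy stand-in for the in-line holographic reconstruction, not the actual pipeline; the object, support, and iteration count are invented.

```python
# Minimal error-reduction phase retrieval: recover an object from the
# magnitude of its Fourier transform plus a known support constraint.
# A toy illustration of the iterative scheme described above.

import numpy as np

rng = np.random.default_rng(1)
n = 64
obj = np.zeros((n, n))
obj[24:40, 24:40] = rng.random((16, 16))    # object confined to a support
support = obj > 0
measured_mag = np.abs(np.fft.fft2(obj))     # intensity-only "measurement"

est = rng.random((n, n))                    # random initial guess
for _ in range(200):
    F = np.fft.fft2(est)
    F = measured_mag * np.exp(1j * np.angle(F))  # enforce measured magnitude
    est = np.real(np.fft.ifft2(F))
    est = np.where(support, np.clip(est, 0, None), 0.0)  # support + positivity

err = np.linalg.norm(np.abs(np.fft.fft2(est)) - measured_mag)
err /= np.linalg.norm(measured_mag)
print(f"relative Fourier-magnitude error: {err:.3f}")
```

The "additional constraints" the abstract mentions are exactly the object-domain projection in the last line of the loop; tighter priors (known boundaries, sparsity) speed convergence and suppress the twin image.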

  10. Use of Parallel Micro-Platform for the Simulation the Space Exploration

    NASA Astrophysics Data System (ADS)

    Velasco Herrera, Victor Manuel; Velasco Herrera, Graciela; Rosano, Felipe Lara; Rodriguez Lozano, Salvador; Lucero Roldan Serrato, Karen

    The purpose of this work is to create a parallel micro-platform that simulates the virtual movements of space exploration in 3D. One of the innovations presented in this design consists of the application of a lever mechanism for the transmission of movement. The development of such a robot is a challenging task, very different from that of industrial manipulators, due to a totally different set of target requirements. This work presents the computer-aided study and simulation of the movement of this parallel manipulator. The model was developed using the computer-aided design platform Unigraphics, in which the geometric modeling of each component and of the final assembly was carried out (CAD), files were generated for the computer-aided manufacture (CAM) of each piece, and the kinematics of the system were simulated under different driving schemes. We used the MATLAB aerospace toolbox and created an adaptive control module to simulate the system.

  11. Real-Time Compressive Sensing MRI Reconstruction Using GPU Computing and Split Bregman Methods

    PubMed Central

    Smith, David S.; Gore, John C.; Yankeelov, Thomas E.; Welch, E. Brian

    2012-01-01

    Compressive sensing (CS) has been shown to enable dramatic acceleration of MRI acquisition in some applications. Being an iterative reconstruction technique, CS MRI reconstructions can be more time-consuming than traditional inverse Fourier reconstruction. We have accelerated our CS MRI reconstruction by factors of up to 27 by using a split Bregman solver combined with a graphics processing unit (GPU) computing platform. The increases in speed we find are similar to those we measure for matrix multiplication on this platform, suggesting that the split Bregman methods parallelize efficiently. We demonstrate that the combination of the rapid convergence of the split Bregman algorithm and the massively parallel strategy of GPU computing can enable real-time CS reconstruction of even acquisition data matrices of dimension 4096² or more, depending on available GPU VRAM. Reconstruction of two-dimensional data matrices of dimension 1024² and smaller took ~0.3 s or less, showing that this platform also provides very fast iterative reconstruction for small-to-moderate size images. PMID:22481908
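The split Bregman method the authors accelerate can be illustrated on a much smaller problem: 1D total-variation denoising, min_u (mu/2)||u - f||² + ||Du||₁, solved with the same splitting (d = Du) and closed-form shrinkage step. This is a dense CPU toy with invented parameters, not the authors' GPU MRI solver.

```python
# Sketch of the split Bregman iteration for 1D total-variation denoising,
# using the same variable splitting and shrinkage step that underlie the
# CS MRI solver described above. Parameters are illustrative only.

import numpy as np

def shrink(x, t):
    """Soft-thresholding: the closed-form l1 proximal step."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def split_bregman_tv(f, mu=10.0, lam=5.0, iters=200):
    n = len(f)
    D = np.diff(np.eye(n), axis=0)          # forward-difference operator
    A = mu * np.eye(n) + lam * D.T @ D      # u-subproblem normal matrix
    d = np.zeros(n - 1)
    b = np.zeros(n - 1)
    u = f.copy()
    for _ in range(iters):
        u = np.linalg.solve(A, mu * f + lam * D.T @ (d - b))  # u-update
        d = shrink(D @ u + b, 1.0 / lam)                      # d-update
        b = b + D @ u - d                                     # Bregman update
    return u

rng = np.random.default_rng(0)
clean = np.repeat([0.0, 1.0], 50)           # piecewise-constant signal
noisy = clean + 0.2 * rng.standard_normal(100)
denoised = split_bregman_tv(noisy)
print(f"TV before: {np.abs(np.diff(noisy)).sum():.1f}, "
      f"after: {np.abs(np.diff(denoised)).sum():.1f}")
```

Each iteration costs one linear solve plus elementwise shrinkage; in the MRI setting the solve becomes FFT-diagonalizable and the elementwise steps map naturally onto GPU threads, which is why the method parallelizes so well.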

  12. Computer optimization techniques for NASA Langley's CSI evolutionary model's real-time control system

    NASA Technical Reports Server (NTRS)

    Elliott, Kenny B.; Ugoletti, Roberto; Sulla, Jeff

    1992-01-01

    The evolution and optimization of a real-time digital control system is presented. The control system is part of a testbed used to perform focused technology research on the interactions of spacecraft platform and instrument controllers with the flexible-body dynamics of the platform and platform appendages. The control system consists of Computer Automated Measurement and Control (CAMAC) standard data acquisition equipment interfaced to a workstation computer. The goal of this work is to optimize the control system's performance to support controls research using controllers with up to 50 states and frame rates above 200 Hz. The original system could support a 16-state controller operating at a rate of 150 Hz. By using simple yet effective software improvements, Input/Output (I/O) latencies and contention problems are reduced or eliminated in the control system. The final configuration can support a 16-state controller operating at 475 Hz. Effectively the control system's performance was increased by a factor of 3.

  13. Real-Time Compressive Sensing MRI Reconstruction Using GPU Computing and Split Bregman Methods.

    PubMed

    Smith, David S; Gore, John C; Yankeelov, Thomas E; Welch, E Brian

    2012-01-01

    Compressive sensing (CS) has been shown to enable dramatic acceleration of MRI acquisition in some applications. Being an iterative reconstruction technique, CS MRI reconstructions can be more time-consuming than traditional inverse Fourier reconstruction. We have accelerated our CS MRI reconstruction by factors of up to 27 by using a split Bregman solver combined with a graphics processing unit (GPU) computing platform. The increases in speed we find are similar to those we measure for matrix multiplication on this platform, suggesting that the split Bregman methods parallelize efficiently. We demonstrate that the combination of the rapid convergence of the split Bregman algorithm and the massively parallel strategy of GPU computing can enable real-time CS reconstruction of even acquisition data matrices of dimension 4096² or more, depending on available GPU VRAM. Reconstruction of two-dimensional data matrices of dimension 1024² and smaller took ~0.3 s or less, showing that this platform also provides very fast iterative reconstruction for small-to-moderate size images.

  14. Using Raspberry Pi to Teach Computing "Inside Out"

    ERIC Educational Resources Information Center

    Jaokar, Ajit

    2013-01-01

    This article discusses the evolution of computing education in preparing for the next wave of computing. With the proliferation of mobile devices, most agree that we are living in a "post-PC" world. Using the Raspberry Pi computer platform, based in the UK, as an example, the author discusses computing education in a world where the…

  15. Multiuser Collaboration with Networked Mobile Devices

    NASA Technical Reports Server (NTRS)

    Tso, Kam S.; Tai, Ann T.; Deng, Yong M.; Becks, Paul G.

    2006-01-01

    In this paper we describe a multiuser collaboration infrastructure that enables multiple mission scientists to remotely and collaboratively interact with visualization and planning software, using wireless networked personal digital assistants (PDAs) and other mobile devices. During ground operations of planetary rover and lander missions, scientists need to meet daily to review downlinked data and plan science activities. For example, scientists use the Science Activity Planner (SAP) in the Mars Exploration Rover (MER) mission to visualize downlinked data and plan rover activities during the science meetings [1]. Computer displays are projected onto large screens in the meeting room to enable the scientists to view and discuss downlinked images and data displayed by SAP and other software applications. However, only one person can interact with the software applications because input to the computer is limited to a single mouse and keyboard. As a result, the scientists have to verbally express their intentions, such as selecting a target at a particular location on the Mars terrain image, to that person in order to interact with the applications. This constrains communication and limits the returns of science planning. Furthermore, ground operations for Mars missions are fundamentally constrained by the short turnaround time for science and engineering teams to process and analyze data, plan the next uplink, generate command sequences, and transmit the uplink to the vehicle [2]. Therefore, improving ground operations is crucial to the success of Mars missions. The multiuser collaboration infrastructure enables users to control software applications remotely and collaboratively using mobile devices. 
The infrastructure includes (1) human-computer interaction techniques to provide natural, fast, and accurate input, (2) a communications protocol to ensure reliable and efficient coordination of the input devices and host computers, (3) application-independent middleware that maintains the states, sessions, and interactions of individual users of the software applications, and (4) an application programming interface to enable tight integration of applications with the middleware. The infrastructure can support any software application running on Windows or Unix platforms. The resulting technologies are applicable not only to NASA mission operations but also to other settings, such as design reviews, brainstorming sessions, and business meetings, all of which can benefit from having participants concurrently interact with software applications (e.g., presentation applications and CAD design tools) to illustrate their ideas and provide input.
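The middleware described in component (3) can be pictured as a small session-and-event bookkeeper. The sketch below is illustrative only; the class and method names are invented and do not come from the paper.

```python
# Minimal sketch of an application-independent collaboration middleware:
# it tracks per-user sessions and serializes concurrent input events so
# that many mobile devices can drive one shared application.
from collections import deque

class CollaborationMiddleware:
    def __init__(self):
        self.sessions = {}          # user id -> session state
        self.event_queue = deque()  # pending input events, in arrival order

    def join(self, user_id):
        self.sessions[user_id] = {"connected": True, "last_event": None}

    def submit(self, user_id, event):
        # Reject input from users who never joined the session.
        if user_id not in self.sessions:
            raise KeyError(f"unknown user {user_id}")
        self.sessions[user_id]["last_event"] = event
        self.event_queue.append((user_id, event))

    def dispatch(self, application):
        # Forward queued events to the host application one at a time,
        # preserving arrival order across all users.
        while self.event_queue:
            user_id, event = self.event_queue.popleft()
            application(user_id, event)
```

Serializing all input through one ordered queue is one simple way to resolve the single-mouse bottleneck the paper describes: every joined device gets a turn, and the host application sees a single ordered stream of inputs.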

  16. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    NASA Astrophysics Data System (ADS)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

    Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes for ever-larger problems. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component of the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform, which provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP's parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM’s BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.
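At the heart of a finite-difference wave propagation code such as the TS-AWP is a stencil update swept over a gridded domain at each time step. The one-dimensional, purely elastic sketch below illustrates only the general pattern; the actual TS-AWP is three-dimensional, anelastic, and parallelized.

```python
# Second-order finite-difference update for the 1D scalar wave equation
# u_tt = c^2 * u_xx; an illustrative stand-in for the 3D anelastic solver.
def step_wave(u_prev, u_curr, c, dx, dt):
    r2 = (c * dt / dx) ** 2  # squared Courant number; stability needs r2 <= 1
    u_next = u_curr[:]       # endpoints held fixed (simple boundary condition)
    for i in range(1, len(u_curr) - 1):
        u_next[i] = (2.0 * u_curr[i] - u_prev[i]
                     + r2 * (u_curr[i + 1] - 2.0 * u_curr[i] + u_curr[i - 1]))
    return u_next
```

Parallelizing such a code amounts to partitioning the grid across processors and exchanging halo values at each step, which is where the scaling work described in the abstract comes in.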

  17. MicroScope in 2017: an expanding and evolving integrated resource for community expertise of microbial genomes

    PubMed Central

    Vallenet, David; Calteau, Alexandra; Cruveiller, Stéphane; Gachet, Mathieu; Lajus, Aurélie; Josso, Adrien; Mercier, Jonathan; Renaux, Alexandre; Rollin, Johan; Rouy, Zoe; Roche, David; Scarpelli, Claude; Médigue, Claudine

    2017-01-01

    The annotation of genomes from NGS platforms needs to be automated and fully integrated. However, maintaining consistency and accuracy in genome annotation is a challenging problem because millions of protein database entries are not assigned reliable functions. This shortcoming limits the knowledge that can be extracted from genomes and metabolic models. Launched in 2005, the MicroScope platform (http://www.genoscope.cns.fr/agc/microscope) is an integrative resource that supports systematic and efficient revision of microbial genome annotation, data management and comparative analysis. Effective comparative analysis requires a consistent and complete view of biological data, and therefore, support for reviewing the quality of functional annotation is critical. MicroScope allows users to analyze microbial (meta)genomes together with post-genomic experiment results, if any (e.g., transcriptomics, re-sequencing of evolved strains, mutant collections, phenotype data). It combines tools and graphical interfaces to analyze genomes and to perform the expert curation of gene functions in a comparative context. Starting with a short overview of the MicroScope system, this paper focuses on some major improvements of the Web interface, mainly for the submission of genomic data, and on original tools and pipelines that have been developed and integrated in the platform: computation of pan-genomes and prediction of biosynthetic gene clusters. Today, the resource contains data for more than 6000 microbial genomes, and among the 2700 personal accounts (65% of which are now from foreign countries), 14% of the users perform expert annotations on at least a weekly basis, helping to improve the quality of microbial genome annotations. PMID:27899624
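Pan-genome computation of the kind mentioned above reduces, at its core, to set operations over gene-family memberships across genomes. A toy sketch (the function name and gene-family identifiers are invented for illustration):

```python
def pan_core_genomes(genomes):
    """genomes: dict mapping genome name -> set of gene-family identifiers.

    Returns (pan, core): the union of all gene families across genomes,
    and the families present in every genome.
    """
    families = list(genomes.values())
    pan = set().union(*families)        # pan-genome: any family, anywhere
    core = set.intersection(*families)  # core genome: shared by all genomes
    return pan, core
```

Real pipelines first have to cluster genes into families by sequence similarity; once that clustering exists, the pan/core split is exactly this union/intersection.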

  18. Personalized Medicine-Based Approach to Model Patterns of Chemoresistance and Tumor Recurrence Using Ovarian Cancer Stem Cell Spheroids.

    PubMed

    Raghavan, Shreya; Mehta, Pooja; Ward, Maria R; Bregenzer, Michael E; Fleck, Elyse M A; Tan, Lijun; McLean, Karen; Buckanovich, Ronald J; Mehta, Geeta

    2017-11-15

    Purpose: Chemoresistant ovarian cancers grow in suspension within the ascites fluid. To screen the effect of chemotherapeutics and biologics on resistant ovarian cancers on a personalized basis, we developed a 3D hanging drop spheroid platform. Experimental Design: We initiated spheroids with primary aldehyde dehydrogenase-positive (ALDH+) CD133+ ovarian cancer stem cells (OvCSC) from different patient samples and demonstrated that stem cell progeny from harvested spheroids was similar to the primary tumor. OvCSC spheroids were utilized to initiate tumors in immunodeficient mice. Drug responses to cisplatin and an ALDH-targeting compound or JAK2 inhibitor determined whether the OvCSC population within the spheroids could be targeted. Cells that escaped therapy were isolated and used to initiate new spheroids and model tumor reemergence in a personalized manner. Results: OvCSC spheroids from different patients exhibited varying and personalized responses to chemotherapeutics. Xenografts were established from OvCSC spheroids, even with a single spheroid. Distinct responses to therapy were observed in distinct primary tumor xenografts, similar to those observed in spheroids. Spheroids resistant to cisplatin/ALDH inhibitor therapy had persistent, albeit lower, ALDH expression and complete loss of CD133 expression, whereas those resistant to cisplatin/JAK2 inhibitor therapy were enriched for ALDH+ cells. Conclusions: Our 3D hanging drop suspension platform can be used to propagate primary OvCSCs that represent individual patient tumors effectively by differentiating in vitro and initiating tumors in mice. Therefore, our platform can be used to study cancer stem cell biology and model tumor reemergence to identify new targeted therapeutics from an effective personalized medicine standpoint. Clin Cancer Res; 23(22); 6934-45. ©2017 American Association for Cancer Research.

  19. Parameters that affect parallel processing for computational electromagnetic simulation codes on high performance computing clusters

    NASA Astrophysics Data System (ADS)

    Moon, Hongsik

    What is the impact of multicore and associated advanced technologies on computational software for science? Most researchers and students have multicore laptops or desktops for their research, and they need computing power to run computational software packages. Computing power was initially derived from Central Processing Unit (CPU) clock speed. That changed when increases in clock speed became constrained by power requirements. Chip manufacturers turned to multicore CPU architectures and associated technological advancements to create the CPUs for the future. Most software applications benefited from the increased computing power the same way that increases in clock speed helped applications run faster. However, for Computational ElectroMagnetics (CEM) software developers, this change was not an obvious benefit - it appeared to be a detriment. Developers were challenged to find a way to correctly utilize the advancements in hardware so that their codes could benefit. The solution was parallelization, and this dissertation details the investigation to address these challenges. Prior to multicore CPUs, the performance of advanced computer technologies was compared using benchmark software, and the metric was FLoating-point Operations Per Second (FLOPS), which indicates system performance for scientific applications that make heavy use of floating-point calculations. Is FLOPS an effective metric for parallelized CEM simulation tools on new multicore systems? Parallel CEM software needs to be benchmarked not only by FLOPS but also by the performance of other parameters related to the type and utilization of the hardware, such as CPU, Random Access Memory (RAM), hard disk, network, etc. The codes need to be optimized for more than just FLOPS, and new parameters must be included in benchmarking. In this dissertation, the parallel CEM software named High Order Basis Based Integral Equation Solver (HOBBIES) is introduced.
This code was developed to address the needs of the changing computer hardware platforms in order to provide fast, accurate and efficient solutions to large, complex electromagnetic problems. The research in this dissertation proves that the performance of parallel code is intimately related to the configuration of the computer hardware and can be maximized for different hardware platforms. To benchmark and optimize the performance of parallel CEM software, a variety of large, complex projects are created and executed on a variety of computer platforms. The computer platforms used in this research are detailed in this dissertation. The projects run as benchmarks are also described in detail and results are presented. The parameters that affect parallel CEM software on High Performance Computing Clusters (HPCC) are investigated. This research demonstrates methods to maximize the performance of parallel CEM software code.
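FLOPS, the metric questioned above, is estimated by dividing a kernel's known floating-point operation count by its measured wall-clock time. A minimal sketch using a naive matrix multiply (the function name is illustrative, not from the dissertation):

```python
import time

def matmul_flops(n):
    # A naive n x n matrix multiply performs 2*n^3 floating-point
    # operations (one multiply plus one add per inner-loop step).
    a = [[1.0] * n for _ in range(n)]
    b = [[2.0] * n for _ in range(n)]
    c = [[0.0] * n for _ in range(n)]
    start = time.perf_counter()
    for i in range(n):
        for j in range(n):
            s = 0.0
            for k in range(n):
                s += a[i][k] * b[k][j]
            c[i][j] = s
    elapsed = time.perf_counter() - start
    return (2 * n ** 3) / elapsed, c
```

As the dissertation argues, a single FLOPS number hides memory, disk, and network behavior, which is why broader benchmarking parameters are needed for parallel CEM codes.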

  20. Personalized Energy Reduction Cyber-Physical System (PERCS): A gamified end-user platform for energy efficiency and demand response.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sintov, Nicole; Orosz, Michael; Schultz, P. Wesley

    2015-01-01

    The mission of the Personalized Energy Reduction Cyber-physical System (PERCS) is to create new possibilities for improving building operating efficiency, enhancing grid reliability, avoiding costly power interruptions, and mitigating greenhouse gas emissions. PERCS proposes to achieve these outcomes by engaging building occupants as partners in a user-centered smart service platform. Using a non-intrusive load monitoring approach, PERCS uses a single sensing point in each home to capture smart electric meter data in real time. The household energy signal is disaggregated into individual load signatures of common appliances (e.g., air conditioners), yielding near real-time appliance-level energy information. Users interact with PERCS via a mobile phone platform that provides household- and appliance-level energy feedback, tailored recommendations, and a competitive game tied to energy use and behavioral changes. PERCS challenges traditional energy management approaches by directly engaging occupants as key elements in a technological system.
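Non-intrusive load monitoring of the kind PERCS relies on typically segments the whole-house power signal at step changes and matches each step against known appliance signatures. A toy sketch (the signature values, tolerance, and function name are invented for illustration):

```python
def detect_appliance_events(power, signatures, tolerance=25.0):
    """Match step changes in a watts time series to appliance signatures.

    power: list of total-household power readings (W), one per sample
    signatures: dict mapping appliance name -> expected step size (W)
    """
    events = []
    for t in range(1, len(power)):
        step = power[t] - power[t - 1]
        for name, delta in signatures.items():
            # A step near +delta is a turn-on; a step near -delta a turn-off.
            if abs(step - delta) <= tolerance:
                events.append((t, name, "on"))
            elif abs(step + delta) <= tolerance:
                events.append((t, name, "off"))
    return events
```

Production disaggregation algorithms are far more sophisticated (handling overlapping loads and multi-state appliances), but the single-sensing-point idea is the same: all inference happens on one aggregate meter signal.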

  1. MSIX - A general and user-friendly platform for RAM analysis

    NASA Astrophysics Data System (ADS)

    Pan, Z. J.; Blemel, Peter

    The authors present a CAD (computer-aided design) platform supporting RAM (reliability, availability, and maintainability) analysis with efficient system description and alternative evaluation. The design concepts, implementation techniques, and application results are described. This platform is user-friendly because of its graphic environment, drawing facilities, object orientation, self-tutoring, and access to the operating system. The programs' independence and portability make them generally applicable to various analysis tasks.

  2. On Curating Multimodal Sensory Data for Health and Wellness Platforms

    PubMed Central

    Amin, Muhammad Bilal; Banos, Oresti; Khan, Wajahat Ali; Muhammad Bilal, Hafiz Syed; Gong, Jinhyuk; Bui, Dinh-Mao; Cho, Soung Ho; Hussain, Shujaat; Ali, Taqdir; Akhtar, Usman; Chung, Tae Choong; Lee, Sungyoung

    2016-01-01

    In recent years, the focus of healthcare and wellness technologies has shown a significant shift towards personal vital signs devices. The technology has evolved from smartphone-based wellness applications to fitness bands and smartwatches. The novelty of these devices is the accumulation of activity data as their users go about their daily life routine. However, these implementations are device specific and lack the ability to incorporate multimodal data sources. Data accumulated in their usage does not offer rich contextual information that is adequate for providing a holistic view of a user’s lifelog. As a result, making decisions and generating recommendations based on this data are single dimensional. In this paper, we present our Data Curation Framework (DCF) which is device independent and accumulates a user’s sensory data from multimodal data sources in real time. DCF curates the context of this accumulated data over the user’s lifelog. DCF provides rule-based anomaly detection over this context-rich lifelog in real time. To provide computation and persistence over the large volume of sensory data, DCF utilizes the distributed and ubiquitous environment of the cloud platform. DCF has been evaluated for its performance, correctness, ability to detect complex anomalies, and management support for a large volume of sensory data. PMID:27355955
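The rule-based anomaly detection DCF performs over the lifelog can be reduced to evaluating predicates over each incoming sensory record. A minimal sketch (the rule names, record fields, and thresholds are invented, not taken from the paper):

```python
# Each rule is a (name, predicate) pair evaluated against a sensory record;
# a record that fires any rule is flagged as anomalous.
RULES = [
    ("resting_hr_high",
     lambda r: r["activity"] == "sleeping" and r["heart_rate"] > 100),
    ("no_motion_daytime",
     lambda r: r["activity"] == "idle" and r["idle_minutes"] > 180),
]

def detect_anomalies(records, rules=RULES):
    flagged = []
    for record in records:
        fired = [name for name, pred in rules if pred(record)]
        if fired:
            flagged.append((record["timestamp"], fired))
    return flagged
```

In a deployment like DCF's, such rules would run continuously over the curated, context-enriched stream in the cloud rather than over a static list, but the evaluation logic is the same.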

  3. Environmental Monitoring and Characterization of Radiation Sources on UF Campus Using a Large Volume NaI Detector

    NASA Astrophysics Data System (ADS)

    Bruner, Jesse A.; Gardiner, Hannah E.; Jordan, Kelly A.; Baciak, James E.

    2016-09-01

    Environmental radiation surveys are important for applications such as safety and regulations. This is especially true for areas exposed to emissions from nuclear reactors, such as the University of Florida Training Reactor (UFTR). At the University of Florida, surveys are performed using the RSX-1 NaI detector, developed by Radiation Solutions Inc. The detector uses incoming gamma rays and an Advanced Digital Spectrometer module to produce a linear energy spectrum. These spectra can then be analyzed in real time with a personal computer using the built-in software, RadAssist. We report on radiation levels around the University of Florida campus using two mobile detection platforms, car-borne and cart-borne. The car-borne surveys provide a larger, broader map of campus radiation levels. On the other hand, cart-borne surveys provide a more detailed radiation map because the cart can reach places on campus that cars cannot. Throughout the survey data, there are consistent radon decay product energy peaks, in addition to other sources such as medical I-131 detected in a large crowd of people. Finally, we investigate further applications of this mobile detection platform, such as tracking the Ar-41 plume emitted from the UFTR and detection of potential environmental hazards.

  4. Comparison of fMRI data analysis by SPM99 on different operating systems.

    PubMed

    Shinagawa, Hideo; Honda, Ei-ichi; Ono, Takashi; Kurabayashi, Tohru; Ohyama, Kimie

    2004-09-01

    The hardware chosen for fMRI data analysis may depend on the platform already present in the laboratory or on the supporting software. In this study, we ran SPM99 software on multiple platforms to examine whether we could analyze fMRI data with SPM99, and to compare the platforms' differences and limitations in processing fMRI data, which can be attributed to hardware capabilities. Six normal right-handed volunteers participated in a study of hand-grasping to obtain fMRI data. Each subject performed a run that consisted of 98 images. The run was measured using a gradient echo-type echo planar imaging sequence on a 1.5T apparatus with a head coil. We used several personal computer (PC), Unix, and Linux machines to analyze the fMRI data. There were no differences in the results obtained on the several PC, Unix, and Linux machines. The only limitations in processing large amounts of fMRI data were found using the PC machines. This suggests that the results obtained with different machines were not affected by differences in hardware components, such as the CPU, memory, and hard drive. Rather, it is likely that the limitations in analyzing large amounts of fMRI data were due to differences in the operating system (OS).

  5. A Wireless Monitoring System Using a Tunneling Sensor Array in a Smart Oral Appliance for Sleep Apnea Treatment

    PubMed Central

    Yeh, Kun-Ying; Yeh, Chao-Chi; Tang, Kuan; Wu, Jyun-Yi; Chen, Yun-Ting; Xu, Ming-Xin; Chen, Yunn-Jy; Yang, Yao-Joe; Lu, Shey-Shi

    2017-01-01

    Sleep apnea is a serious sleep disorder, and the most common type is obstructive sleep apnea (OSA). Untreated OSA can cause numerous health problems. Oral appliance therapy is an effective and popular approach to OSA treatment, but making a perfect fit for each patient is time-consuming and considerably decreases its efficiency. This paper proposes a System-on-a-Chip (SoC)-enabled sleep monitoring system in a smart oral appliance, which is capable of intelligently collecting physiological data about tongue movement throughout the therapy. A tunneling sensor array with ultra-high sensitivity is incorporated to accurately detect the subtle pressure from the tongue. When the device is placed on the wireless platform, the temporarily stored data are retrieved and wirelessly transmitted to personal computers and cloud storage. The battery is recharged by harvesting external RF power from the platform. A compact prototype module, measuring 4.5 × 2.5 × 0.9 cm, is implemented and embedded inside the oral appliance to demonstrate tongue movement detection in continuous time frames. The functions of this design are verified by the presented measurement results. This design aims to increase efficiency and provide a total solution for OSA treatment. PMID:29035296

  6. Mining the Quantified Self: Personal Knowledge Discovery as a Challenge for Data Science.

    PubMed

    Fawcett, Tom

    2015-12-01

    The last several years have seen an explosion of interest in wearable computing, personal tracking devices, and the so-called quantified self (QS) movement. Quantified self involves ordinary people recording and analyzing numerous aspects of their lives to understand and improve themselves. This is now a mainstream phenomenon, attracting a great deal of attention, participation, and funding. As more people are attracted to the movement, companies are offering various new platforms (hardware and software) that allow ever more aspects of daily life to be tracked. Nearly every aspect of the QS ecosystem is advancing rapidly, except for analytic capabilities, which remain surprisingly primitive. With increasing numbers of quantified self participants collecting ever greater amounts and types of data, many people literally have more data than they know what to do with. This article reviews the opportunities and challenges posed by the QS movement. Data science provides well-tested techniques for knowledge discovery. But making these useful for the QS domain poses unique challenges that derive from the characteristics of the data collected as well as the specific types of actionable insights that people want from the data. Using a small sample of QS time series data containing information about personal health, we provide a formulation of the QS problem that connects data to the decisions of interest to the user.

  7. Many-core computing for space-based stereoscopic imaging

    NASA Astrophysics Data System (ADS)

    McCall, Paul; Torres, Gildo; LeGrand, Keith; Adjouadi, Malek; Liu, Chen; Darling, Jacob; Pernicka, Henry

    The potential benefits of using parallel computing in real-time visual-based satellite proximity operations missions are investigated. Improvements in performance and relative navigation solutions over single thread systems can be achieved through multi- and many-core computing. Stochastic relative orbit determination methods benefit from the higher measurement frequencies, allowing them to more accurately determine the associated statistical properties of the relative orbital elements. More accurate orbit determination can lead to reduced fuel consumption and extended mission capabilities and duration. Inherent to the process of stereoscopic image processing is the difficulty of loading, managing, parsing, and evaluating large amounts of data efficiently, which may result in delays or highly time consuming processes for single (or few) processor systems or platforms. In this research we utilize the Single-Chip Cloud Computer (SCC), a fully programmable 48-core experimental processor, created by Intel Labs as a platform for many-core software research, provided with a high-speed on-chip network for sharing information along with advanced power management technologies and support for message-passing. The results from utilizing the SCC platform for the stereoscopic image processing application are presented in the form of Performance, Power, Energy, and Energy-Delay-Product (EDP) metrics. Also, a comparison between the SCC results and those obtained from executing the same application on a commercial PC are presented, showing the potential benefits of utilizing the SCC in particular, and any many-core platforms in general for real-time processing of visual-based satellite proximity operations missions.
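The energy-delay-product (EDP) metric used above to compare the SCC against a commercial PC folds runtime and energy into one figure of merit, penalizing slow-but-frugal and fast-but-hungry platforms alike; lower is better. A minimal sketch (the numbers below are invented, not taken from the paper):

```python
def edp_metrics(runtime_s, avg_power_w):
    """Return (energy in joules, energy-delay product in joule-seconds)."""
    energy = avg_power_w * runtime_s   # E = P * t
    return energy, energy * runtime_s  # EDP = E * t

# Hypothetical comparison: a low-power many-core chip vs. a faster PC.
many_core = edp_metrics(runtime_s=10.0, avg_power_w=25.0)  # (250.0, 2500.0)
pc = edp_metrics(runtime_s=4.0, avg_power_w=80.0)          # (320.0, 1280.0)
```

Under these made-up numbers, the PC draws more power yet wins on EDP because delay enters the product twice, which is exactly the trade-off the metric is designed to expose.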

  8. Personality Types and Affinity for Computers

    DTIC Science & Technology

    1991-03-01

    differences on personality dimensions between the respondents, and to explore the relationship between these differences and computer affinity. The results revealed no significant differences...type to this measure of computer affinity. II. LITERATURE REVIEW: The interest of this study was the relationship between a person’s psychological…

  9. Personalizing public health: your health avatar.

    PubMed

    Pereira, Chrystian; McNamara, Anusha; Sorge, Lindsay; Arya, Vibhuti

    2013-01-01

    To describe the creation of a health avatar, with the goals of providing patients with complete health information from various sources, establishing an interactive and customizable platform, empowering users to determine how the health information best fits or speaks to their personal needs, and providing perspective by comparing the health status of the individual with that of the individual's community. The Internet is rapidly becoming integrated into Americans' daily lives. According to the 2007 Health Information National Trends Survey, 69% of U.S. adults had access to the Internet and 23% reported using a social networking site. The impact of social media has grown further, and an estimated 50% of adults in America have a profile on social media. The potential for using cyber communities to improve health messaging is great. Several health care organizations have implemented the use of social media in a variety of ways, with varying degrees of success. We propose a platform that automatically gathers information and reflects the health status of an individual back to the user. An avatar, which is a representation of a user, could be created and assigned characteristics that allow users to appreciate their health status. The health avatar platform also would allow users to compare their personal status with that of their community. The overall goal is to engage and then motivate users to improve their overall health status. Medicine must acknowledge the evolving relationships that the next generation of patients will have with technology. The health avatar is a platform that incorporates a connection with the health system through electronic medical records and connects individuals to the greater community.

  10. Training for vigilance on the move: a video game-based paradigm for sustained attention.

    PubMed

    Szalma, J L; Daly, T N; Teo, G W L; Hancock, G M; Hancock, P A

    2018-04-01

    The capacity for superior vigilance can be trained by using knowledge of results (KR). Our present experiments demonstrate the efficacy of such training using a first-person perspective, movement-based video game platform in samples of students and Soldiers. Effectiveness was assessed by manipulating KR during a training phase and withdrawing it in a subsequent transfer phase. Relative to a no-KR control condition, KR systematically improved performance for both Soldiers and students. These results build upon our previous findings demonstrating that a video game-based platform can be used to create a movement-centred sustained attention task with important elements of traditional vigilance. The results indicate that KR effects in sustained attention extend to a first-person perspective, movement-based paradigm, and that these effects occur in professional military personnel as well as in a more general population. Such sustained attention training can save lives, and the present findings demonstrate one particular avenue to achieve this goal. Practitioner Summary: Sustained attention can be trained by means of knowledge of results using a video game-based platform with samples of students and Soldiers. Four experiments demonstrate that a dynamic, first-person perspective video game environment can serve to support effective sustained attention training in professional military personnel as well as in a more general population.

  11. RAPPORT: running scientific high-performance computing applications on the cloud.

    PubMed

    Cohen, Jeremy; Filippis, Ioannis; Woodbridge, Mark; Bauer, Daniela; Hong, Neil Chue; Jackson, Mike; Butcher, Sarah; Colling, David; Darlington, John; Fuchs, Brian; Harvey, Matt

    2013-01-28

    Cloud computing infrastructure is now widely used in many domains, but one area where there has been more limited adoption is research computing, in particular for running scientific high-performance computing (HPC) software. The Robust Application Porting for HPC in the Cloud (RAPPORT) project took advantage of existing links between computing researchers and application scientists in the fields of bioinformatics, high-energy physics (HEP) and digital humanities, to investigate running a set of scientific HPC applications from these domains on cloud infrastructure. In this paper, we focus on the bioinformatics and HEP domains, describing the applications and target cloud platforms. We conclude that, while there are many factors that need consideration, there is no fundamental impediment to the use of cloud infrastructure for running many types of HPC applications and, in some cases, there is potential for researchers to benefit significantly from the flexibility offered by cloud platforms.

  12. ENFIN--A European network for integrative systems biology.

    PubMed

    Kahlem, Pascal; Clegg, Andrew; Reisinger, Florian; Xenarios, Ioannis; Hermjakob, Henning; Orengo, Christine; Birney, Ewan

    2009-11-01

    Integration of biological data of various types and the development of adapted bioinformatics tools represent critical objectives to enable research at the systems level. The European Network of Excellence ENFIN is engaged in developing an adapted infrastructure to connect databases and platforms, enabling both the generation of new bioinformatics tools and the experimental validation of computational predictions. With the aim of bridging the gap between standard wet laboratories and bioinformatics, the ENFIN Network runs integrative research projects to bring the latest computational techniques to bear directly on questions dedicated to systems biology in the wet laboratory environment. The Network maintains close internal collaboration between experimental and computational research, enabling a permanent cycle of experimental validation and improvement of computational prediction methods. The computational work includes the development of a database infrastructure (EnCORE), bioinformatics analysis methods, and a novel platform for protein function analysis, FuncNet.

  13. A contact-lens-on-a-chip companion diagnostic tool for personalized medicine.

    PubMed

    Guan, Allan; Wang, Yi; Phillips, K Scott; Li, Zhenyu

    2016-04-07

    We present a novel, microfluidic platform that integrates human tears (1 μL) with commercial contact lens materials to provide personalized assessment of lens care solution performance. This device enabled the detection of significant differences in cleaning and disinfection outcomes between subjects and between biofilms vs. planktonic bacteria.

  14. New Data, Old Tensions: Big Data, Personalized Learning, and the Challenges of Progressive Education

    ERIC Educational Resources Information Center

    Dishon, Gideon

    2017-01-01

    Personalized learning has become the most notable application of big data in primary and secondary schools in the United States. The combination of big data and adaptive technological platforms is heralded as a revolution that could transform education, overcoming the outdated classroom model, and realizing the progressive vision of…

  15. Integrating Mailed Personalized Feedback and Alcohol Screening Events: A Feasibility Study

    ERIC Educational Resources Information Center

    Benson, Trisha A.; Ambrose, Carrie; Mulfinger, Amanda M. M.; Correia, Christopher J.

    2004-01-01

    This study characterized a sample of college students attending National Alcohol Screening Day (NASD), and tested the feasibility of using NASD as a platform for initiating the delivery of mailed personalized feedback forms. Participants (N = 153, 65% female) attended NASD and completed the Alcohol Use Disorders Identification Test (AUDIT [1]). A…

  16. An Ecological Framework for Cancer Communication: Implications for Research

    PubMed Central

    Intille, Stephen S; Zabinski, Marion F

    2005-01-01

    The field of cancer communication has undergone a major revolution as a result of the Internet. As recently as the early 1990s, face-to-face, print, and the telephone were the dominant methods of communication between health professionals and individuals in support of the prevention and treatment of cancer. Computer-supported interactive media existed, but this usually required sophisticated computer and video platforms that limited availability. The introduction of point-and-click interfaces for the Internet dramatically improved the ability of non-expert computer users to obtain and publish information electronically on the Web. Demand for Web access has driven computer sales for the home setting and improved the availability, capability, and affordability of desktop computers. New advances in information and computing technologies will lead to similarly dramatic changes in the affordability and accessibility of computers. Computers will move from the desktop into the environment and onto the body. Computers are becoming smaller, faster, more sophisticated, more responsive, less expensive, and—essentially—ubiquitous. Computers are evolving into much more than desktop communication devices. New computers include sensing, monitoring, geospatial tracking, just-in-time knowledge presentation, and a host of other information processes. The challenge for cancer communication researchers is to acknowledge the expanded capability of the Web and to move beyond the approaches to health promotion, behavior change, and communication that emerged during an era when language- and image-based interpersonal and mass communication strategies predominated. Ecological theory has been advanced since the early 1900s to explain the highly complex relationships among individuals, society, organizations, the built and natural environments, and personal and population health and well-being. 
This paper provides background on ecological theory, advances an Ecological Model of Internet-Based Cancer Communication intended to broaden the vision of potential uses of the Internet for cancer communication, and provides some examples of how such a model might inform future research and development in cancer communication. PMID:15998614

  17. Production experience with the ATLAS Event Service

    NASA Astrophysics Data System (ADS)

    Benjamin, D.; Calafiura, P.; Childers, T.; De, K.; Guan, W.; Maeno, T.; Nilsson, P.; Tsulaia, V.; Van Gemmeren, P.; Wenaus, T.; ATLAS Collaboration

    2017-10-01

    The ATLAS Event Service (AES) has been designed and implemented for efficient running of ATLAS production workflows on a variety of computing platforms, ranging from conventional Grid sites to opportunistic, often short-lived resources such as spot-market commercial clouds, supercomputers, and volunteer computing. The Event Service architecture allows real-time delivery of fine-grained workloads to running payload applications, which process dispatched events or event ranges and immediately stream the outputs to highly scalable object stores. Thanks to its agile and flexible architecture, the AES is currently being used by Grid sites for assigning low-priority workloads to otherwise idle computing resources; for harvesting HPC resources in an efficient back-fill mode; and for massively scaling out to the 50-100k concurrent-core level on the Amazon spot market to efficiently utilize those transient resources for peak production needs. Platform ports in development include ATLAS@Home (BOINC), the Google Compute Engine, and a growing number of HPC platforms. After briefly reviewing the concept and architecture of the Event Service, we report the status and experience gained in AES commissioning and production operations on supercomputers, and our plans for extending Event Service applications beyond Geant4 simulation to other workflows, such as reconstruction and data analysis.
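    The fine-grained dispatch model the abstract describes can be sketched as follows. This is an illustrative toy, not the actual AES/PanDA API: all names (`make_ranges`, `worker`, the list standing in for an object store) are invented, and the point is only that work is handed out in small event ranges whose outputs are uploaded immediately, so a preempted worker loses at most the range in flight.

    ```python
    # Hypothetical sketch of event-range dispatch (names invented, not the AES API).
    from queue import Queue, Empty

    def make_ranges(n_events, range_size):
        """Split [0, n_events) into contiguous event ranges."""
        return [(start, min(start + range_size, n_events))
                for start in range(0, n_events, range_size)]

    def worker(ranges, object_store, process):
        """Pull ranges until the queue is drained; upload each result at once."""
        while True:
            try:
                lo, hi = ranges.get_nowait()
            except Empty:
                return
            result = [process(e) for e in range(lo, hi)]
            object_store.append((lo, hi, result))  # immediate, per-range output

    ranges = Queue()
    for r in make_ranges(n_events=10, range_size=4):
        ranges.put(r)
    store = []
    worker(ranges, store, process=lambda e: e * e)
    ```

    In the real system many workers pull from the dispatcher concurrently and the "store" is a remote object store; the per-range upload is what makes short-lived spot or back-fill resources usable.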

  18. An Application-Based Performance Evaluation of NASA's Nebula Cloud Computing Platform

    NASA Technical Reports Server (NTRS)

    Saini, Subhash; Heistand, Steve; Jin, Haoqiang; Chang, Johnny; Hood, Robert T.; Mehrotra, Piyush; Biswas, Rupak

    2012-01-01

    The high performance computing (HPC) community has shown tremendous interest in exploring cloud computing because of its potential benefits. In this paper, we examine the feasibility, performance, and scalability of production-quality scientific and engineering applications of interest to NASA on NASA's cloud computing platform, called Nebula, hosted at Ames Research Center. This work presents a comprehensive evaluation of Nebula using NUTTCP, HPCC, NPB, I/O, and MPI function benchmarks as well as four applications representative of the NASA HPC workload. Specifically, we compare Nebula's performance on some of these benchmarks and applications to that of NASA's Pleiades supercomputer, a traditional HPC system. We also investigate the impact of virtIO and jumbo frames on interconnect performance. Overall results indicate that on Nebula (i) virtIO and jumbo frames improve network bandwidth by a factor of 5, (ii) there is a significant virtualization layer overhead of about 10% to 25%, (iii) write performance is lower by a factor of 25, (iv) latency for short MPI messages is very high, and (v) overall performance is 15% to 48% lower than on Pleiades for NASA HPC applications. We also comment on the usability of the cloud platform.
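    To make the kind of bandwidth measurement mentioned above concrete, here is a minimal in-process throughput probe in the spirit of a NUTTCP-style test. It is illustrative only: real NUTTCP measures TCP/UDP bandwidth between separate hosts with many tunable options, whereas this sketch (all names invented) just streams bytes through a local socket pair and reports MB/s.

    ```python
    # Toy throughput probe (illustrative; not NUTTCP itself).
    import socket
    import threading
    import time

    def measure_throughput(payload_mb=4, chunk=64 * 1024):
        """Stream payload_mb megabytes through a socket pair; return MB/s."""
        sender_sock, receiver_sock = socket.socketpair()
        total = payload_mb * 1024 * 1024

        def sender():
            blob = b"x" * chunk
            sent = 0
            while sent < total:
                sender_sock.sendall(blob)
                sent += chunk
            sender_sock.close()  # signals end-of-stream to the receiver

        thread = threading.Thread(target=sender)
        start = time.perf_counter()
        thread.start()
        received = 0
        while True:
            data = receiver_sock.recv(chunk)
            if not data:
                break
            received += len(data)
        elapsed = time.perf_counter() - start
        thread.join()
        receiver_sock.close()
        return received / elapsed / 1e6  # MB/s

    rate = measure_throughput(payload_mb=1)
    ```

    Running the same probe inside and outside a virtualized guest is one crude way to expose the interconnect and virtualization overheads the paper quantifies.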

  19. Scalable Algorithms for Clustering Large Geospatiotemporal Data Sets on Manycore Architectures

    NASA Astrophysics Data System (ADS)

    Mills, R. T.; Hoffman, F. M.; Kumar, J.; Sreepathi, S.; Sripathi, V.

    2016-12-01

    The increasing availability of high-resolution geospatiotemporal data sets from sources such as observatory networks, remote sensing platforms, and computational Earth system models has opened new possibilities for knowledge discovery using data sets fused from disparate sources. Traditional algorithms and computing platforms are impractical for the analysis and synthesis of data sets of this size; however, new algorithmic approaches that can effectively utilize the complex memory hierarchies and the extremely high levels of available parallelism in state-of-the-art high-performance computing platforms can enable such analysis. We describe a massively parallel implementation of accelerated k-means clustering and some optimizations to boost computational intensity and utilization of wide SIMD lanes on state-of-the-art multi- and manycore processors, including the second-generation Intel Xeon Phi ("Knights Landing") processor based on the Intel Many Integrated Core (MIC) architecture, which offers several new features, including an on-package high-bandwidth memory. We also analyze the code in the context of a few practical applications to the analysis of climatic and remotely-sensed vegetation phenology data sets, and speculate on some of the new applications that such scalable analysis methods may enable.
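    For readers unfamiliar with the baseline the paper accelerates, here is the textbook Lloyd's iteration for k-means in pure Python. The paper's implementation is massively parallel, SIMD-optimized, and algorithmically accelerated; this serial sketch omits all of that and shows only the alternating assignment and centroid-update steps.

    ```python
    # Plain (unaccelerated) Lloyd's k-means on points given as coordinate tuples.
    def kmeans(points, centroids, iters=10):
        """Alternate assignment and update steps for a fixed number of passes."""
        for _ in range(iters):
            # Assignment step: each point joins its nearest centroid's cluster.
            clusters = [[] for _ in centroids]
            for p in points:
                nearest = min(
                    range(len(centroids)),
                    key=lambda j: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[j])),
                )
                clusters[nearest].append(p)
            # Update step: each centroid moves to its cluster's coordinate mean.
            centroids = [
                tuple(sum(coord) / len(cl) for coord in zip(*cl)) if cl
                else centroids[j]
                for j, cl in enumerate(clusters)
            ]
        return centroids

    pts = [(0.0, 0.0), (0.0, 1.0), (10.0, 10.0), (10.0, 11.0)]
    centers = kmeans(pts, centroids=[(0.0, 0.0), (10.0, 10.0)], iters=5)
    ```

    The assignment step is where the distance computations dominate, which is why vectorizing it across wide SIMD lanes and many cores, as the paper describes, pays off at scale.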

  20. A CLIPS based personal computer hardware diagnostic system

    NASA Technical Reports Server (NTRS)

    Whitson, George M.

    1991-01-01

    Often the person designated to repair personal computers has little or no knowledge of how to do so. Described here is a simple expert system to aid these inexperienced repair people. The first component of the system leads the repair person through a number of simple system checks, such as making sure that all cables are tight and that the DIP switches are set correctly. The second component of the system assists the repair person in evaluating error codes generated by the computer. The final component of the system applies a large knowledge base to attempt to identify the component of the personal computer that is malfunctioning. We have implemented and tested our design with a full system to diagnose problems for an IBM-compatible system based on the 8088 chip. In our tests, the inexperienced repair people found the system very useful in diagnosing hardware problems.
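    The knowledge-base component described above is a forward-chaining rule system, which CLIPS provides natively. As a rough illustration of the idea, here is a toy rule matcher in Python; the rules and fact names are invented for this sketch and do not come from the paper's knowledge base.

    ```python
    # Toy forward-matching diagnostic rules (invented examples, not the paper's
    # actual CLIPS knowledge base). Each rule fires when all of its condition
    # facts are present in the observed-symptom set.
    RULES = [
        ({"power_led_off", "cables_checked"}, "suspect power supply"),
        ({"beep_code_3"}, "suspect memory (RAM)"),
        ({"boots", "no_video"}, "suspect video adapter"),
    ]

    def diagnose(facts):
        """Return every diagnosis whose conditions are all satisfied."""
        return [verdict for conditions, verdict in RULES if conditions <= facts]
    ```

    In CLIPS itself each rule would be a `defrule` whose left-hand side patterns match asserted facts; the engine's Rete network does this matching far more efficiently than the linear scan above.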
