Sample records for software coherent shared

  1. Compiler-directed cache management in multiprocessors

    NASA Technical Reports Server (NTRS)

    Cheong, Hoichi; Veidenbaum, Alexander V.

    1990-01-01

    The necessity of finding alternatives to hardware-based cache coherence strategies for large-scale multiprocessor systems is discussed. Three different software-based strategies sharing the same goals and general approach are presented. They consist of a simple invalidation approach, a fast selective invalidation scheme, and a version control scheme. The strategies are suitable for shared-memory multiprocessor systems with interconnection networks and a large number of processors. Results of trace-driven simulations conducted on numerical benchmark routines to compare the performance of the three schemes are presented.
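
    The version control scheme invites a concrete illustration. The Python sketch below is our own minimal rendering of the idea, not the paper's algorithm: writes advance a compiler-managed version number, and a cached copy stays usable only while the version recorded at fill time still matches, so stale copies self-invalidate without any invalidation traffic.

        # Minimal sketch (assumptions ours): a global version table stands in
        # for the compiler-managed bookkeeping at synchronization points.
        memory = {"x": 1}
        current_version = {"x": 0}

        class Cache:
            def __init__(self):
                self.lines = {}             # var -> (value, version when cached)

            def read(self, var):
                line = self.lines.get(var)
                if line and line[1] == current_version[var]:
                    return line[0]          # version still valid: hit
                value = memory[var]         # mismatch: self-invalidate, refetch
                self.lines[var] = (value, current_version[var])
                return value

            def write(self, var, value):
                memory[var] = value         # write-through to shared memory
                current_version[var] += 1   # new version makes other copies stale
                self.lines[var] = (value, current_version[var])

        p0, p1 = Cache(), Cache()
        print(p1.read("x"))   # 1; p1 caches version 0
        p0.write("x", 2)      # bumps x to version 1
        print(p1.read("x"))   # stale version detected -> refetch -> 2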

  2. Community-driven computational biology with Debian Linux.

    PubMed

    Möller, Steffen; Krabbenhöft, Hajo Nils; Tille, Andreas; Paleino, David; Williams, Alan; Wolstencroft, Katy; Goble, Carole; Holland, Richard; Belhachemi, Dominique; Plessy, Charles

    2010-12-21

    The Open Source movement and its technologies are popular in the bioinformatics community because they provide freely available tools and resources for research. In order to feed the steady demand for updates on software and associated data, a service infrastructure is required for sharing and providing these tools to heterogeneous computing environments. The Debian Med initiative provides ready and coherent software packages for medical informatics and bioinformatics. These packages can be used together in Taverna workflows via the UseCase plugin to manage execution on local or remote machines. If such packages are available in cloud computing environments, the underlying hardware and the analysis pipelines can be shared along with the software. Debian Med closes the gap between developers and users. It provides a simple method for offering new releases of software and data resources, thus provisioning a local infrastructure for computational biology. For geographically distributed teams it can ensure they are working on the same versions of tools, in the same conditions. This contributes to the world-wide networking of researchers.

  3. Community-driven computational biology with Debian Linux

    PubMed Central

    2010-01-01

    Background The Open Source movement and its technologies are popular in the bioinformatics community because they provide freely available tools and resources for research. In order to feed the steady demand for updates on software and associated data, a service infrastructure is required for sharing and providing these tools to heterogeneous computing environments. Results The Debian Med initiative provides ready and coherent software packages for medical informatics and bioinformatics. These packages can be used together in Taverna workflows via the UseCase plugin to manage execution on local or remote machines. If such packages are available in cloud computing environments, the underlying hardware and the analysis pipelines can be shared along with the software. Conclusions Debian Med closes the gap between developers and users. It provides a simple method for offering new releases of software and data resources, thus provisioning a local infrastructure for computational biology. For geographically distributed teams it can ensure they are working on the same versions of tools, in the same conditions. This contributes to the world-wide networking of researchers. PMID:21210984

  4. Software Coherence in Multiprocessor Memory Systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Bolosky, William Joseph

    1993-01-01

    Processors are becoming faster and multiprocessor memory interconnection systems are not keeping up. Therefore, it is necessary to have threads and the memory they access as near one another as possible. Typically, this involves putting memory or caches with the processors, which gives rise to the problem of coherence: if one processor writes an address, any other processor reading that address must see the new value. This coherence can be maintained by the hardware or with software intervention. Systems of both types have been built in the past; the hardware-based systems tended to outperform the software ones. However, the ratio of processor to interconnect speed is now so high that the extra overhead of the software systems may no longer be significant. This issue is explored both by implementing a software-maintained system and by introducing and using the technique of offline optimal analysis of memory reference traces. The thesis finds that in properly built systems, software-maintained coherence can perform comparably to, or even better than, hardware-maintained coherence. The architectural features necessary for efficient software coherence to be profitable include a small page size, a fast trap mechanism, and the ability to execute instructions while remote memory references are outstanding.

  5. Method and apparatus for single-stepping coherence events in a multiprocessor system under software control

    DOEpatents

    Blumrich, Matthias A.; Salapura, Valentina

    2010-11-02

    An apparatus and method are disclosed for single-stepping coherence events in a multiprocessor system under software control in order to monitor the behavior of a memory coherence mechanism. Single-stepping coherence events in a multiprocessor system is made possible by adding one or more step registers. By accessing these step registers, one or more coherence requests are processed by the multiprocessor system. The step registers determine whether the snoop unit proceeds in a normal execution mode or operates in a single-step mode.
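
    A small model makes the step-register mechanism concrete. All names below (SnoopUnit, single_step, step) are invented for illustration; the sketch assumes a queue of incoming coherence requests that drains freely in normal mode and advances one request per software-triggered step otherwise.

        from collections import deque

        class SnoopUnit:
            def __init__(self):
                self.single_step = False    # mode bit of the hypothetical step register
                self.pending = deque()      # coherence requests awaiting processing

            def receive(self, request):
                self.pending.append(request)
                if not self.single_step:
                    self.drain()            # normal execution: process immediately

            def step(self):
                # Software write to the step register: release one request.
                if self.pending:
                    self.process(self.pending.popleft())

            def drain(self):
                while self.pending:
                    self.process(self.pending.popleft())

            def process(self, request):
                print("processed coherence event:", request)

        snoop = SnoopUnit()
        snoop.single_step = True            # enter single-step mode under software control
        snoop.receive("invalidate 0x1000")
        snoop.receive("invalidate 0x2000")
        snoop.step()                        # software inspects state, then steps once
        snoop.step()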

  6. Forth system for coherent-scatter radar data acquisition and processing

    NASA Technical Reports Server (NTRS)

    Rennier, A. D.; Bowhill, S. A.

    1985-01-01

    A real-time collection system was developed for the Urbana coherent-scatter radar system. The new system, designed for use with a microcomputer, has several advantages over the old system implemented with a minicomputer. The software used to collect the data is described, as well as the processing software used to analyze the data. In addition, a magnetic tape format for coherent-scatter data exchange is given.

  7. Managing coherence via put/get windows

    DOEpatents

    Blumrich, Matthias A [Ridgefield, CT; Chen, Dong [Croton on Hudson, NY; Coteus, Paul W [Yorktown Heights, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Heidelberger, Philip [Cortlandt Manor, NY; Hoenicke, Dirk [Ossining, NY; Ohmacht, Martin [Yorktown Heights, NY

    2011-01-11

    A method and apparatus for managing coherence between two processors of a two processor node of a multi-processor computer system. Generally the present invention relates to a software algorithm that simplifies and significantly speeds the management of cache coherence in a message passing parallel computer, and to hardware apparatus that assists this cache coherence algorithm. The software algorithm uses the opening and closing of put/get windows to coordinate the activities required to achieve cache coherence. The hardware apparatus may be an extension to the hardware address decode that creates, in the physical memory address space of the node, an area of virtual memory that (a) does not actually exist, and (b) is therefore able to respond instantly to read and write requests from the processing elements.
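
    One plausible reading of the put/get window discipline, sketched under our own assumptions rather than the patent's text: opening a window over a memory region flushes and invalidates the region's cached lines so that message-passing puts and gets act on coherent memory, and closing it resumes normal cached access.

        class PutGetWindow:
            def __init__(self, cache, base, size):
                self.cache, self.base, self.size = cache, base, size

            def __enter__(self):                    # "open the window"
                for addr in list(self.cache):
                    if self.base <= addr < self.base + self.size:
                        self.cache.pop(addr)        # writeback+invalidate, so remote
                return self                         # puts/gets act on memory only

            def __exit__(self, *exc):               # "close the window"
                pass                                # cached access to region resumes

        cache = {0x100: 7, 0x104: 9, 0x200: 3}
        with PutGetWindow(cache, base=0x100, size=0x100):
            pass  # puts/gets into [0x100, 0x200) would happen here
        print(cache)   # {512: 3} (0x200): lines inside the window were flushed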

  8. Managing coherence via put/get windows

    DOEpatents

    Blumrich, Matthias A [Ridgefield, CT; Chen, Dong [Croton on Hudson, NY; Coteus, Paul W [Yorktown Heights, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Heidelberger, Philip [Cortlandt Manor, NY; Hoenicke, Dirk [Ossining, NY; Ohmacht, Martin [Yorktown Heights, NY

    2012-02-21

    A method and apparatus for managing coherence between two processors of a two processor node of a multi-processor computer system. Generally the present invention relates to a software algorithm that simplifies and significantly speeds the management of cache coherence in a message passing parallel computer, and to hardware apparatus that assists this cache coherence algorithm. The software algorithm uses the opening and closing of put/get windows to coordinate the activities required to achieve cache coherence. The hardware apparatus may be an extension to the hardware address decode that creates, in the physical memory address space of the node, an area of virtual memory that (a) does not actually exist, and (b) is therefore able to respond instantly to read and write requests from the processing elements.

  9. Optical threshold secret sharing scheme based on basic vector operations and coherence superposition

    NASA Astrophysics Data System (ADS)

    Deng, Xiaopeng; Wen, Wei; Mi, Xianwu; Long, Xuewen

    2015-04-01

    We propose, to our knowledge for the first time, a simple optical algorithm for secret image sharing with the (2,n) threshold scheme based on basic vector operations and coherence superposition. The secret image to be shared is first divided into n shadow images by use of basic vector operations. In the reconstruction stage, the secret image can be retrieved by recording the intensity of the coherence superposition of any two shadow images. Compared with published encryption techniques, which focus narrowly on information encryption, the proposed method can realize information encryption as well as secret sharing, which further ensures the safety and integrity of the secret information and prevents power from being kept centralized and abused. The feasibility and effectiveness of the proposed method are demonstrated by numerical results.
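
    The optical construction does not reduce to a few lines, but the (2,2) case of hiding a pixel in a phase difference can be checked numerically. The sketch below is illustrative only and does not reproduce the paper's (2,n) vector construction: each shadow is a unit-amplitude field, the secret sets the phase difference, and the recorded intensity of the coherent superposition recovers it, while a single shadow's phase is uniformly random and reveals nothing.

        import numpy as np

        rng = np.random.default_rng(0)
        secret = rng.random((4, 4))                     # toy "image" in [0, 1]

        phi1 = rng.uniform(0, 2 * np.pi, secret.shape)  # shadow 1: random phase
        phi2 = phi1 + np.arccos(2 * secret - 1)         # shadow 2 encodes the secret

        # Reconstruction: intensity of the coherent superposition of the two
        # unit-amplitude fields is 2 + 2*cos(phi1 - phi2) = 4*secret.
        intensity = np.abs(np.exp(1j * phi1) + np.exp(1j * phi2)) ** 2
        print(np.allclose(intensity / 4, secret))       # True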

  10. Simplifying and speeding the management of intra-node cache coherence

    DOEpatents

    Blumrich, Matthias A [Ridgefield, CT; Chen, Dong [Croton on Hudson, NY; Coteus, Paul W [Yorktown Heights, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Heidelberger, Phillip [Cortlandt Manor, NY; Hoenicke, Dirk [Ossining, NY; Ohmacht, Martin [Yorktown Heights, NY

    2012-04-17

    A method and apparatus for managing coherence between two processors of a two processor node of a multi-processor computer system. Generally the present invention relates to a software algorithm that simplifies and significantly speeds the management of cache coherence in a message passing parallel computer, and to hardware apparatus that assists this cache coherence algorithm. The software algorithm uses the opening and closing of put/get windows to coordinate the activities required to achieve cache coherence. The hardware apparatus may be an extension to the hardware address decode that creates, in the physical memory address space of the node, an area of virtual memory that (a) does not actually exist, and (b) is therefore able to respond instantly to read and write requests from the processing elements.

  11. Managing coherence via put/get windows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blumrich, Matthias A; Chen, Dong; Coteus, Paul W

    A method and apparatus for managing coherence between two processors of a two processor node of a multi-processor computer system. Generally the present invention relates to a software algorithm that simplifies and significantly speeds the management of cache coherence in a message passing parallel computer, and to hardware apparatus that assists this cache coherence algorithm. The software algorithm uses the opening and closing of put/get windows to coordinate the activities required to achieve cache coherence. The hardware apparatus may be an extension to the hardware address decode that creates, in the physical memory address space of the node, an area of virtual memory that (a) does not actually exist, and (b) is therefore able to respond instantly to read and write requests from the processing elements.

  12. Experimenting with an Evolving Ground/Space-based Software Architecture to Enable Sensor Webs

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Frye, Stuart

    2005-01-01

    A series of ongoing experiments are being conducted at the NASA Goddard Space Flight Center to explore integrated ground and space-based software architectures enabling sensor webs. A sensor web, as defined by Steve Talabac at NASA Goddard Space Flight Center (GSFC), is a coherent set of distributed nodes interconnected by a communications fabric that collectively behave as a single, dynamically adaptive, observing system. The nodes can be comprised of satellites, ground instruments, computing nodes, etc. Sensor web capability requires autonomous management of constellation resources. This becomes progressively more important as more and more satellites share resources, such as communication channels and ground stations, while automatically coordinating their activities. There have been five ongoing activities, which include an effort to standardize a set of middleware. This paper will describe one set of activities using the Earth Observing 1 satellite, which used a variety of ground and flight software along with other satellites and ground sensors to prototype a sensor web. This activity allowed us to explore the difficulties that occur in the assembly of sensor webs given today's technology. We will present an overview of the software system architecture, some key experiments, and lessons learned to facilitate better sensor webs in the future.

  13. The Speckle Toolbox: A Powerful Data Reduction Tool for CCD Astrometry

    NASA Astrophysics Data System (ADS)

    Harshaw, Richard; Rowe, David; Genet, Russell

    2017-01-01

    Recent advances in high-speed, low-noise CCD and CMOS cameras, coupled with breakthroughs in data reduction software that runs on desktop PCs, have opened the domain of speckle interferometry and high-accuracy CCD measurements of double stars to amateurs, allowing them to do useful science of high quality. This paper describes how to use a speckle interferometry reduction program, the Speckle Tool Box (STB), to achieve this level of result. For over a year the author (Harshaw) has been using STB (and its predecessor, Plate Solve 3) to obtain measurements of double stars based on CCD camera technology for pairs that are either too wide (the stars not sharing the same isoplanatic patch, roughly 5 arc-seconds in diameter) or too faint to image in the coherence time required for speckle (usually under 40 ms). This same approach - using speckle reduction software to measure CCD pairs with greater accuracy than possible with lucky imaging - has been used, it turns out, for several years by the U.S. Naval Observatory.

  14. Patchy 'coherence': using normalization process theory to evaluate a multi-faceted shared decision making implementation program (MAGIC).

    PubMed

    Lloyd, Amy; Joseph-Williams, Natalie; Edwards, Adrian; Rix, Andrew; Elwyn, Glyn

    2013-09-05

    Implementing shared decision making into routine practice is proving difficult, despite considerable interest from policy-makers, and is far more complex than merely making decision support interventions available to patients. Few have reported successful implementation beyond research studies. MAking Good Decisions In Collaboration (MAGIC) is a multi-faceted implementation program, commissioned by The Health Foundation (UK), to examine how best to put shared decision making into routine practice. In this paper, we investigate healthcare professionals' perspectives on implementing shared decision making during the MAGIC program, to examine the work required to implement shared decision making and to inform future efforts. The MAGIC program approached implementation of shared decision making by initiating a range of interventions including: providing workshops; facilitating development of brief decision support tools (Option Grids); initiating a patient activation campaign ('Ask 3 Questions'); gathering feedback using Decision Quality Measures; providing clinical leads meetings, learning events, and feedback sessions; and obtaining executive board level support. At 9 and 15 months (May and November 2011), two rounds of semi-structured interviews were conducted with healthcare professionals in three secondary care teams to explore views on the impact of these interventions. Interview data were coded by two reviewers using a framework derived from the Normalization Process Theory. A total of 54 interviews were completed with 31 healthcare professionals. Partial implementation of shared decision making could be explained using the four components of the Normalization Process Theory: 'coherence,' 'cognitive participation,' 'collective action,' and 'reflexive monitoring.' Shared decision making was integrated into routine practice when clinical teams shared coherent views of role and purpose ('coherence'). Shared decision making was facilitated when teams engaged in developing and delivering interventions ('cognitive participation'), and when those interventions fit with existing skill sets and organizational priorities ('collective action') resulting in demonstrable improvements to practice ('reflexive monitoring'). The implementation process uncovered diverse and conflicting attitudes toward shared decision making; 'coherence' was often missing. The study showed that implementation of shared decision making is more complex than the delivery of patient decision support interventions to patients, a portrayal that often goes unquestioned. Normalizing shared decision making requires intensive work to ensure teams have a shared understanding of the purpose of involving patients in decisions, and undergo the attitudinal shifts that many health professionals feel are required when comprehension goes beyond initial interpretations. Divergent views on the value of engaging patients in decisions remain a significant barrier to implementation.

  15. Panoptes: web-based exploration of large scale genome variation data.

    PubMed

    Vauterin, Paul; Jeffery, Ben; Miles, Alistair; Amato, Roberto; Hart, Lee; Wright, Ian; Kwiatkowski, Dominic

    2017-10-15

    The size and complexity of modern large-scale genome variation studies demand novel approaches for exploring and sharing the data. In order to unlock the potential of these data for a broad audience of scientists with various areas of expertise, a unified exploration framework is required that is accessible, coherent and user-friendly. Panoptes is an open-source software framework for collaborative visual exploration of large-scale genome variation data and associated metadata in a web browser. It relies on technology choices that allow it to operate in near real-time on very large datasets. It can be used to browse rich, hybrid content in a coherent way, and offers interactive visual analytics approaches to assist the exploration. We illustrate its application using genome variation data of Anopheles gambiae, Plasmodium falciparum and Plasmodium vivax. Freely available at https://github.com/cggh/panoptes, under the GNU Affero General Public License. paul.vauterin@gmail.com. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  16. Cerebral coherence between communicators marks the emergence of meaning

    PubMed Central

    Stolk, Arjen; Noordzij, Matthijs L.; Verhagen, Lennart; Volman, Inge; Schoffelen, Jan-Mathijs; Oostenveld, Robert; Hagoort, Peter; Toni, Ivan

    2014-01-01

    How can we understand each other during communicative interactions? An influential suggestion holds that communicators are primed by each other’s behaviors, with associative mechanisms automatically coordinating the production of communicative signals and the comprehension of their meanings. An alternative suggestion posits that mutual understanding requires shared conceptualizations of a signal’s use, i.e., “conceptual pacts” that are abstracted away from specific experiences. Both accounts predict coherent neural dynamics across communicators, aligned either to the occurrence of a signal or to the dynamics of conceptual pacts. Using coherence spectral-density analysis of cerebral activity simultaneously measured in pairs of communicators, this study shows that establishing mutual understanding of novel signals synchronizes cerebral dynamics across communicators’ right temporal lobes. This interpersonal cerebral coherence occurred only within pairs with a shared communicative history, and at temporal scales independent from signals’ occurrences. These findings favor the notion that meaning emerges from shared conceptualizations of a signal’s use. PMID:25489093

  17. Coherence of Pre-Service Physics Teachers' Views of the Relatedness of Physics Concepts

    ERIC Educational Resources Information Center

    Nousiainen, Maija

    2013-01-01

    In physics teacher education, one of the recurrent themes is the importance of fostering the formation of organised and coherent knowledge structures, but a simple shared understanding of what coherence actually means and how it can be recognised, is not easily found. This study suggests an approach in which the coherence of students' views about…

  18. Characterization of network structure in stereoEEG data using consensus-based partial coherence.

    PubMed

    Ter Wal, Marije; Cardellicchio, Pasquale; LoRusso, Giorgio; Pelliccia, Veronica; Avanzini, Pietro; Orban, Guy A; Tiesinga, Paul He

    2018-06-06

    Coherence is a widely used measure to determine the frequency-resolved functional connectivity between pairs of recording sites, but this measure is confounded by shared inputs to the pair. To remove shared inputs, the 'partial coherence' can be computed by conditioning the spectral matrices of the pair on all other recorded channels, which involves the calculation of a matrix (pseudo-) inverse. It has so far remained a challenge to use the time-resolved partial coherence to analyze intracranial recordings with a large number of recording sites. For instance, calculating the partial coherence using a pseudoinverse method produces a high number of false positives when it is applied to a large number of channels. To address this challenge, we developed a new method that randomly aggregated channels into a smaller number of effective channels on which the calculation of partial coherence was based. We obtained a 'consensus' partial coherence (cPCOH) by repeating this approach for several random aggregations of channels (permutations) and only accepting those activations in time and frequency with a high enough consensus. Using model data we show that the cPCOH method effectively filters out the effect of shared inputs and performs substantially better than the pseudo-inverse. We successfully applied the cPCOH procedure to human stereotactic EEG data and demonstrated three key advantages of this method relative to alternative procedures. First, it reduces the number of false positives relative to the pseudo-inverse method. Second, it allows for titration of the amount of false positives relative to the false negatives by adjusting the consensus threshold, thus allowing the data-analyst to prioritize one over the other to meet specific analysis demands. Third, it substantially reduced the number of identified interactions compared to coherence, providing a sparser network of connections from which clear spatial patterns emerged. These patterns can serve as a starting point of further analyses that provide insight into network dynamics during cognitive processes. These advantages likely generalize to other modalities in which shared inputs introduce confounds, such as electroencephalography (EEG) and magneto-encephalography (MEG). Copyright © 2018. Published by Elsevier Inc.
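
    A compressed sketch of the cPCOH recipe as the abstract describes it, not the authors' code: estimate the cross-spectral matrix of the channel pair plus a few random aggregates of the remaining channels, read partial coherence off the inverse spectral matrix, and keep only frequency points that win a consensus across permutations. All parameter values below are arbitrary illustrative choices.

        import numpy as np
        from scipy.signal import csd

        def spectral_matrix(signals, fs, nperseg):
            n = len(signals)
            f, _ = csd(signals[0], signals[0], fs=fs, nperseg=nperseg)
            S = np.empty((n, n, f.size), dtype=complex)
            for i in range(n):
                for j in range(n):
                    _, S[i, j] = csd(signals[i], signals[j], fs=fs, nperseg=nperseg)
            return f, S

        def partial_coherence(S, i, j):
            out = np.empty(S.shape[2])
            for k in range(S.shape[2]):
                G = np.linalg.inv(S[:, :, k])   # conditioning on the other channels
                out[k] = abs(G[i, j]) ** 2 / (G[i, i].real * G[j, j].real)
            return out

        def cpcoh(signals, i, j, fs, nperseg=128, n_perm=20, n_groups=3, thresh=0.2):
            rng = np.random.default_rng(1)
            rest = [k for k in range(len(signals)) if k not in (i, j)]
            votes = 0
            for _ in range(n_perm):             # random channel aggregations
                groups = [g for g in np.array_split(rng.permutation(rest), n_groups)
                          if len(g)]
                chans = [signals[i], signals[j]] + [sum(signals[k] for k in g)
                                                    for g in groups]
                f, S = spectral_matrix(chans, fs, nperseg)
                votes = votes + (partial_coherence(S, 0, 1) > thresh)
            return f, votes / n_perm            # consensus fraction per frequency

        rng = np.random.default_rng(0)
        shared_input = rng.standard_normal(4000)    # common driver confounds coherence
        sigs = [shared_input + rng.standard_normal(4000) for _ in range(8)]
        f, consensus = cpcoh(sigs, 0, 1, fs=200)
        print(consensus.mean().round(2))            # low: shared input conditioned away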

  19. A Standardized Approach to Topographic Data Processing and Workflow Management

    NASA Astrophysics Data System (ADS)

    Wheaton, J. M.; Bailey, P.; Glenn, N. F.; Hensleigh, J.; Hudak, A. T.; Shrestha, R.; Spaete, L.

    2013-12-01

    An ever-increasing list of options exists for collecting high resolution topographic data, including airborne LIDAR, terrestrial laser scanners, bathymetric SONAR and structure-from-motion. An equally rich, arguably overwhelming, variety of tools exists with which to organize, quality control, filter, analyze and summarize these data. However, scientists are often left to cobble together their analysis as a series of ad hoc steps, often using custom scripts and one-time processes that are poorly documented and rarely shared with the community. Even when literature-cited software tools are used, the input and output parameters differ from tool to tool. These parameters are rarely archived and the steps performed are lost, making the analysis virtually impossible to replicate precisely. What is missing is a coherent, robust framework for combining reliable, well-documented topographic data-processing steps into a workflow that can be repeated and even shared with others. We have taken several popular topographic data processing tools - including point cloud filtering and decimation as well as DEM differencing - and defined a common protocol for passing inputs and outputs between them. This presentation describes a free, public online portal that enables scientists to create custom workflows for processing topographic data using a number of popular topographic processing tools. Users provide the inputs required for each tool and the sequence in which they want to combine them. This information is then stored for future reuse (and optionally sharing with others) before the user downloads a single package that contains all the input and output specifications together with the software tools themselves. The user then launches the included batch file that executes the workflow on their local computer against their topographic data. This ZCloudTools architecture helps standardize, automate and archive topographic data processing. It also represents a forum for discovering and sharing effective topographic processing workflows.

  20. Design of Control Software for a High-Speed Coherent Doppler Lidar System for CO2 Measurement

    NASA Technical Reports Server (NTRS)

    Vanvalkenburg, Randal L.; Beyon, Jeffrey Y.; Koch, Grady J.; Yu, Jirong; Singh, Upendra N.; Kavaya, Michael J.

    2010-01-01

    The design of the software for a 2-micron coherent high-speed Doppler lidar system for CO2 measurement at NASA Langley Research Center is discussed in this paper. The specific strategy and design topology to meet the requirements of the system are reviewed. In order to attain the high-speed digitization of the different types of signals to be sampled on multiple channels, a carefully planned design of the control software is imperative. Samples of digitized data from each channel and their roles in data analysis post processing are also presented. Several challenges of extremely-fast, high volume data acquisition are discussed. The software must check the validity of each lidar return as well as other monitoring channel data in real-time. For such high-speed data acquisition systems, the software is a key component that enables the entire scope of CO2 measurement studies using commercially available system components.

  1. Development of a 9.3 micrometer CW LIDAR for the study of atmospheric aerosol

    NASA Technical Reports Server (NTRS)

    Whiteside, B. N.; Schotland, R. M.

    1993-01-01

    This report provides a brief summary of the basic requirements to obtain coherent or heterodyne mixing of the optical radiation backscattered by atmospheric aerosols with that from a fixed frequency source. The continuous wave (CW) mode of operation for a coherent lidar is reviewed along with the associated lidar transfer equation. A complete optical design of the three major subsystems of a CW, coherent lidar is given. Lens design software is implemented to model and optimize receiver performance. Techniques for the opto-mechanical assembly and some of the critical tolerances of the coherent lidar are provided along with preliminary tests of the subsystems. Included in these tests is a comparison of the experimental and the theoretical average power signal-to-noise ratio. The analog to digital software used to evaluate the power spectrum of the backscattered signal is presented in the Appendix of this report.

  2. A comparison of time-shared vs. batch development of space software

    NASA Technical Reports Server (NTRS)

    Forthofer, M.

    1977-01-01

    In connection with a study regarding the ground support software development for the Space Shuttle, an investigation was conducted concerning the most suitable software development techniques to be employed. A time-sharing 'trial period' was used to determine whether or not time-sharing would be a cost-effective software development technique for the Ground Based Shuttle system. It was found that time-sharing substantially improved job turnaround and programmer access to the computer for the representative group of ground support programmers. Moreover, this improvement resulted in an estimated saving of over fifty programmer days during the trial period.

  3. Virtual memory support for distributed computing environments using a shared data object model

    NASA Astrophysics Data System (ADS)

    Huang, F.; Bacon, J.; Mapp, G.

    1995-12-01

    Conventional storage management systems provide one interface for accessing memory segments and another for accessing secondary storage objects. This hinders application programming and affects overall system performance due to mandatory data copying and user/kernel boundary crossings, which in the microkernel case may involve context switches. Memory-mapping techniques may be used to provide programmers with a unified view of the storage system. This paper extends such techniques to support a shared data object model for distributed computing environments in which good support for coherence and synchronization is essential. The approach is based on a microkernel, typed memory objects, and integrated coherence control. A microkernel architecture is used to support multiple coherence protocols and the addition of new protocols. Memory objects are typed, and applications can choose the most suitable protocols for different types of object to avoid protocol mismatch. Low-level coherence control is integrated with high-level concurrency control so that the number of messages required to maintain memory coherence is reduced and system-wide synchronization is realized without severely impacting system performance. Together, these features constitute a novel approach to supporting flexible coherence under application control.
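
    The typed-object idea can be shown as a small registry. The protocol names and type labels below are illustrative assumptions, not the paper's API: each object type binds to the coherence protocol that suits its sharing pattern, which is how protocol mismatch is avoided.

        class WriteInvalidate:
            name = "write-invalidate"       # suits read-mostly objects

        class WriteUpdate:
            name = "write-update"           # suits tightly shared, hot objects

        class ReleaseConsistent:
            name = "release-consistent"     # suits lock-protected objects

        PROTOCOLS = {"read_mostly": WriteInvalidate,
                     "producer_consumer": WriteUpdate,
                     "migratory": ReleaseConsistent}

        class MemoryObject:
            def __init__(self, name, obj_type):
                self.name = name
                self.protocol = PROTOCOLS[obj_type]()   # type selects the protocol

        directory = MemoryObject("directory", "read_mostly")
        frame_buf = MemoryObject("frames", "producer_consumer")
        print(directory.protocol.name, "/", frame_buf.protocol.name)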

  4. Perspectives of the optical coherence tomography community on code and data sharing

    NASA Astrophysics Data System (ADS)

    Lurie, Kristen L.; Mistree, Behram F. T.; Ellerbee, Audrey K.

    2015-03-01

    As optical coherence tomography (OCT) grows to be a mature and successful field, it is important for the research community to develop a stronger practice of sharing code and data. A prolific culture of sharing can enable new and emerging laboratories to enter the field, allow research groups to gain new exposure and notoriety, and enable benchmarking of new algorithms and methods. Our long-term vision is to build tools to facilitate a stronger practice of sharing within this community. In line with this goal, our first aim was to understand the perceptions and practices of the community with respect to sharing research contributions (i.e., as code and data). We surveyed 52 members of the OCT community using an online polling system. Our main findings indicate that while researchers infrequently share their code and data, they are willing to contribute their research resources to a shared repository, and they believe that such a repository would benefit both their research and the OCT community at large. We plan to use the results of this survey to design a platform targeted to the OCT research community - an effort that ultimately aims to facilitate a more prolific culture of sharing.

  5. Gaussian private quantum channel with squeezed coherent states

    PubMed Central

    Jeong, Kabgyun; Kim, Jaewan; Lee, Su-Yong

    2015-01-01

    While the objective of conventional quantum key distribution (QKD) is to secretly generate and share classical bits concealed in the form of maximally mixed quantum states, that of a private quantum channel (PQC) is to secretly transmit individual quantum states concealed in the form of maximally mixed states using a shared one-time pad; the scheme is called a Gaussian private quantum channel (GPQC) when it operates in the regime of continuous variables. We propose a GPQC enhanced with squeezed coherent states (GPQCwSC), which is a generalization of GPQC with coherent states only (GPQCo) [Phys. Rev. A 72, 042313 (2005)]. We show that GPQCwSC beats the GPQCo for the upper bound on accessible information. As a subsidiary example, it is shown that squeezed states have an advantage over coherent states against a beam-splitting attack in a continuous variable QKD. It is also shown that a squeezing operation can be approximated as a superposition of two different displacement operations in the small squeezing regime. PMID:26364893

  6. Gaussian private quantum channel with squeezed coherent states.

    PubMed

    Jeong, Kabgyun; Kim, Jaewan; Lee, Su-Yong

    2015-09-14

    While the objective of conventional quantum key distribution (QKD) is to secretly generate and share classical bits concealed in the form of maximally mixed quantum states, that of a private quantum channel (PQC) is to secretly transmit individual quantum states concealed in the form of maximally mixed states using a shared one-time pad; the scheme is called a Gaussian private quantum channel (GPQC) when it operates in the regime of continuous variables. We propose a GPQC enhanced with squeezed coherent states (GPQCwSC), which is a generalization of GPQC with coherent states only (GPQCo) [Phys. Rev. A 72, 042313 (2005)]. We show that GPQCwSC beats the GPQCo for the upper bound on accessible information. As a subsidiary example, it is shown that squeezed states have an advantage over coherent states against a beam-splitting attack in a continuous variable QKD. It is also shown that a squeezing operation can be approximated as a superposition of two different displacement operations in the small squeezing regime.

  7. Implicit Attitude Toward Caregiving: The Moderating Role of Adult Attachment Styles

    PubMed Central

    De Carli, Pietro; Tagini, Angela; Sarracino, Diego; Santona, Alessandra; Parolin, Laura

    2016-01-01

    Attachment and caregiving are separate motivational systems that share the common evolutionary purpose of favoring child security. With the goal of studying the processes underlying the transmission of attachment styles, this study focused on the role of adult attachment styles in shaping preferences toward particular styles of caregiving. We hypothesized a correspondence between attachment and caregiving styles: we expect an individual to show a preference for a caregiving behavior coherent with his/her own attachment style, in order to increase the chance of passing it on to offspring. We activated different representations of specific caregiving modalities in females, by using three videos in which mothers with different Adult Attachment states of mind played with their infants. Participants' facial expressions while watching were recorded and analyzed with FaceReader software. After each video, participants' attitudes toward the category “mother” were measured, both explicitly (semantic differential) and implicitly (single-target implicit association test, ST-IAT). Participants' adult attachment styles (experiences in close relationships revised) predicted attitudes scores, but only when measured implicitly. Participants scored higher on the ST-IAT after watching a video coherent with their attachment style. No effect was found on the facial expressions of disgust. These findings suggest a role of adult attachment styles in shaping implicit attitudes related to the caregiving system. PMID:26779060

  8. Extracting quantum coherence via steering

    PubMed Central

    Hu, Xueyuan; Fan, Heng

    2016-01-01

    As a precious resource for quantum information processing, quantum coherence can be created remotely if the two sites involved are quantum correlated. It can be expected that the amount of coherence created should depend on the quantity of the shared quantum correlation, which is also a resource. Here, we establish an operational connection between coherence induced by steering and the quantum correlation. We find that the steering-induced coherence, quantified by measures such as the relative entropy of coherence and the trace norm of coherence, is bounded from above by a known quantum correlation measure defined as the one-side measurement-induced disturbance. The condition under which the induced coherence saturates the upper bound varies for different measures of coherence. The tripartite scenario is also studied and a similar conclusion can be obtained. Our results provide operational connections between local and non-local resources in quantum information processing. PMID:27682450
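
    One of the coherence measures named here, the relative entropy of coherence, is simple enough to compute directly: C(rho) = S(diag(rho)) - S(rho), with S the von Neumann entropy and diag taken in the fixed incoherent basis. A quick numeric check:

        import numpy as np

        def von_neumann_entropy(rho):
            evals = np.linalg.eigvalsh(rho)
            evals = evals[evals > 1e-12]            # convention: 0 log 0 = 0
            return -np.sum(evals * np.log2(evals))

        def rel_entropy_of_coherence(rho):
            dephased = np.diag(np.diag(rho))        # fully dephased state
            return von_neumann_entropy(dephased) - von_neumann_entropy(rho)

        plus = np.array([[0.5, 0.5], [0.5, 0.5]])   # |+><+|, maximally coherent qubit
        mixed = np.eye(2) / 2                       # maximally mixed, zero coherence
        print(rel_entropy_of_coherence(plus))       # 1.0
        print(rel_entropy_of_coherence(mixed))      # 0.0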

  9. Application of Novel Software Algorithms to Spectral-Domain Optical Coherence Tomography for Automated Detection of Diabetic Retinopathy.

    PubMed

    Adhi, Mehreen; Semy, Salim K; Stein, David W; Potter, Daniel M; Kuklinski, Walter S; Sleeper, Harry A; Duker, Jay S; Waheed, Nadia K

    2016-05-01

    To present novel software algorithms applied to spectral-domain optical coherence tomography (SD-OCT) for automated detection of diabetic retinopathy (DR). Thirty-one diabetic patients (44 eyes) and 18 healthy, nondiabetic controls (20 eyes) who underwent volumetric SD-OCT imaging and fundus photography were retrospectively identified. A retina specialist independently graded DR stage. Trained automated software generated a retinal thickness score signifying macular edema and a cluster score signifying microaneurysms and/or hard exudates for each volumetric SD-OCT. Of 44 diabetic eyes, 38 had DR and six eyes did not have DR. Leave-one-out cross-validation using a linear discriminant at a missed detection/false alarm ratio of 3.00 computed software sensitivity and specificity of 92% and 69%, respectively, for DR detection when compared to clinical assessment. Novel software algorithms applied to commercially available SD-OCT can successfully detect DR and may have potential as a viable screening tool for DR in the future. [Ophthalmic Surg Lasers Imaging Retina. 2016;47:410-417.]. Copyright 2016, SLACK Incorporated.
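
    The validation scheme is standard and easy to reproduce in outline. The sketch below uses synthetic stand-in scores, not the study's data: two features per eye (a thickness score and a cluster score), a linear discriminant, and leave-one-out cross-validation, as described in the abstract.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import LeaveOneOut, cross_val_predict

        rng = np.random.default_rng(0)
        # Hypothetical (thickness score, cluster score) pairs per eye.
        dr = np.column_stack([rng.normal(320, 40, 38), rng.normal(12, 4, 38)])
        healthy = np.column_stack([rng.normal(270, 20, 26), rng.normal(2, 1.5, 26)])
        X = np.vstack([dr, healthy])
        y = np.array([1] * 38 + [0] * 26)           # 1 = diabetic retinopathy

        pred = cross_val_predict(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut())
        sensitivity = (pred[y == 1] == 1).mean()
        specificity = (pred[y == 0] == 0).mean()
        print(f"sensitivity {sensitivity:.2f}, specificity {specificity:.2f}")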

  10. 32 CFR 37.550 - May I accept intellectual property as cost sharing?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... software) as cost sharing, because: (1) It is difficult to assign values to these intangible contributions... offer the use of commercially available software for which there is an established license fee for use of the product. The costs of the development of the software would not be a reasonable basis for...

  11. 32 CFR 37.550 - May I accept intellectual property as cost sharing?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... offer the use of commercially available software for which there is an established license fee for use of the product. The costs of the development of the software would not be a reasonable basis for... software) as cost sharing, because: (1) It is difficult to assign values to these intangible contributions...

  12. AAS Publishing News: Astronomical Software Citation Workshop

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2015-07-01

    Do you write code for your research? Use astronomical software? Do you wish there were a better way of citing, sharing, archiving, or discovering software for astronomy research? You're not alone! In April 2015, AAS's publishing team joined other leaders in the astronomical software community in a meeting funded by the Sloan Foundation, with the purpose of discussing these issues and potential solutions. In attendance were representatives from academic astronomy, publishing, libraries, for-profit software sharing platforms, telescope facilities, and grantmaking institutions. The goal of the group was to establish “protocols, policies, and platforms for astronomical software citation, sharing, and archiving,” in the hopes of encouraging a set of normalized standards across the field. The AAS is now collaborating with leaders at GitHub to write grant proposals for a project to develop strategies for software discoverability and citation, in astronomy and beyond. If this topic interests you, you can find more details in this document released by the group after the meeting: http://astronomy-software-index.github.io/2015-workshop/ The group hopes to move this project forward with input and support from the broader community. Please share the above document, discuss it on social media using the hashtag #astroware (so that your conversations can be found!), or send private comments to julie.steffen@aas.org.

  13. Challenging the coherence of social justice as a shared nursing value.

    PubMed

    Lipscomb, Martin

    2011-01-01

    Normative and prescriptive claims regarding social justice are often inadequately developed in the nursing literature and, in consequence, they must be rejected in their current form. Thus, claims regarding social justice are frequently presented as mere assertion (without clarification or supporting argument) or, alternatively, when assertions are supported that support may be weak (e.g. social justice is repeatedly juxtaposed against contentious assumptions regarding market disutility). This paper challenges the coherence of social justice as a shared nursing value and it is suggested that claims regarding the concept should be tempered. © 2010 Blackwell Publishing Ltd.

  14. Optimization of the coherence function estimation for multi-core central processing unit

    NASA Astrophysics Data System (ADS)

    Cheremnov, A. G.; Faerman, V. A.; Avramchuk, V. S.

    2017-02-01

    The paper considers the use of parallel processing on a multi-core central processing unit to optimize the coherence function evaluation arising in digital signal processing. The coherence function, along with other methods of spectral analysis, is commonly used for vibration diagnosis of rotating machinery and its particular nodes. An algorithm is given for evaluating the function for signals represented by digital samples. The algorithm is analyzed for its software implementation and computational problems. Optimization measures are described, including algorithmic, architecture, and compiler optimization, and their results are assessed for multi-core processors from different manufacturers. Thus, the speed-up of parallel execution with respect to sequential execution was studied and results are presented for Intel Core i7-4720HQ and AMD FX-9590 processors. The results show comparatively high efficiency of the optimization measures taken. In particular, acceleration indicators and average CPU utilization have been significantly improved, showing a high degree of parallelism of the constructed calculation functions. The developed software underwent state registration and will be used as part of a software and hardware solution for rotating machinery fault diagnosis and pipeline leak location with the acoustic correlation method.
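
    The quantity being optimized is the magnitude-squared coherence, C_xy(f) = |P_xy(f)|^2 / (P_xx(f) P_yy(f)). The sketch below computes it with SciPy and spreads channel pairs across CPU cores with multiprocessing; this stands in for, and does not reproduce, the paper's architecture- and compiler-level optimizations.

        import numpy as np
        from itertools import combinations
        from multiprocessing import Pool
        from scipy.signal import coherence

        FS = 10_000                                   # sample rate, Hz

        def pair_coherence(args):
            x, y = args
            return coherence(x, y, fs=FS, nperseg=1024)   # Welch-based estimate

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            t = np.arange(FS) / FS
            common = np.sin(2 * np.pi * 480 * t)      # shared vibration component
            chans = [common + 0.8 * rng.standard_normal(FS) for _ in range(6)]

            with Pool() as pool:                      # one channel pair per worker
                results = pool.map(pair_coherence, combinations(chans, 2))
            f, cxy = results[0]
            print("coherence near 480 Hz:", cxy[np.argmin(abs(f - 480))].round(2))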

  15. Interface Specifications for the A-7E Shared Services Module.

    DTIC Science & Technology

    1982-09-08

    To illustrate the principles, the onboard software for the Navy’s A-7E aircraft will be redesigned and rewritten. The Shared Services module provides...purpose of the Shared Services module is to allow the remainder of the software to remain unchanged when the requirements-based rules for these values and...services change. This report describes the modular structure of the Shared Services module, and contains the abstract interface specifications for all

  16. Low Cost Coherent Doppler Lidar Data Acquisition and Processing

    NASA Technical Reports Server (NTRS)

    Barnes, Bruce W.; Koch, Grady J.

    2003-01-01

    The work described in this paper details the development of a low-cost, short-development time data acquisition and processing system for a coherent Doppler lidar. This was done using common laboratory equipment and a small software investment. This system provides near real-time wind profile measurements. Coding flexibility created a very useful test bed for new techniques.

  17. Physical-layer network coding in coherent optical OFDM systems.

    PubMed

    Guan, Xun; Chan, Chun-Kit

    2015-04-20

    We present the first experimental demonstration and characterization of the application of optical physical-layer network coding in coherent optical OFDM systems. It combines two optical OFDM frames to share the same link so as to enhance system throughput, while individual OFDM frames can be recovered with digital signal processing at the destination node.
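
    A toy baseband illustration of the idea, ours rather than the authors' experiment: two OFDM frames superpose on the shared link, and a node that already knows its own frame subtracts it after the FFT to recover the other node's symbols, so one link slot carries both frames.

        import numpy as np

        rng = np.random.default_rng(0)
        N = 64                                        # OFDM subcarriers
        def qpsk(bits):                               # Gray-free toy QPSK mapper
            return ((2 * bits[0::2] - 1) + 1j * (2 * bits[1::2] - 1)) / np.sqrt(2)

        sym_a = qpsk(rng.integers(0, 2, 2 * N))       # node A's frame
        sym_b = qpsk(rng.integers(0, 2, 2 * N))       # node B's frame

        tx_a, tx_b = np.fft.ifft(sym_a), np.fft.ifft(sym_b)
        rx = tx_a + tx_b + 0.01 * rng.standard_normal(N)  # frames superpose on link

        rx_sym = np.fft.fft(rx)
        recovered_b = rx_sym - sym_a                  # A's DSP cancels its own frame
        print(np.allclose(np.sign(recovered_b.real), np.sign(sym_b.real)))  # True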

  18. Developing community based rehabilitation for cancer survivors: organizing for coordination and coherence in practice

    PubMed Central

    2013-01-01

    Background Increasing incidences of cancer combined with prolonged survival have raised the need for developing community based rehabilitation. The objectives of the analysis were to describe and interpret the key issues related to coordination and coherence of community-based cancer rehabilitation in Denmark and to provide insights relevant for other contexts. Methods Twenty-seven rehabilitation managers across 15 municipalities in Denmark comprised the sample. The study was designed with a combination of data collection methods including questionnaires, individual interviews, and focus groups. A Grounded Theory approach was used to analyze the data. Results A lack of shared cultures among health care providers and systems of delivery was a primary barrier to collaboration which was essential for establishing coordination of care. Formal multidisciplinary steering committees, team-based organization, and informal relationships were fundamental for developing coordination and coherence. Conclusions Coordination and coherence in community-based rehabilitation relies on increased collaboration, which may best be optimized by use of shared frameworks within and across systems. Results highlight the challenges faced in practical implementation of community rehabilitation and point to possible strategies for its enhancement. PMID:24004881

  19. Hardware for dynamic quantum computing.

    PubMed

    Ryan, Colm A; Johnson, Blake R; Ristè, Diego; Donovan, Brian; Ohki, Thomas A

    2017-10-01

    We describe the hardware, gateware, and software developed at Raytheon BBN Technologies for dynamic quantum information processing experiments on superconducting qubits. In dynamic experiments, real-time qubit state information is fed back or fed forward within a fraction of the qubits' coherence time to dynamically change the implemented sequence. The hardware presented here covers both control and readout of superconducting qubits. For readout, we created a custom signal processing gateware and software stack on commercial hardware to convert pulses in a heterodyne receiver into qubit state assignments with minimal latency, alongside data taking capability. For control, we developed custom hardware with gateware and software for pulse sequencing and steering information distribution that is capable of arbitrary control flow in a fraction of superconducting qubit coherence times. Both readout and control platforms make extensive use of field programmable gate arrays to enable tailored qubit control systems in a reconfigurable fabric suitable for iterative development.

  20. Simultaneous multimodal ophthalmic imaging using swept-source spectrally encoded scanning laser ophthalmoscopy and optical coherence tomography

    PubMed Central

    Malone, Joseph D.; El-Haddad, Mohamed T.; Bozic, Ivan; Tye, Logan A.; Majeau, Lucas; Godbout, Nicolas; Rollins, Andrew M.; Boudoux, Caroline; Joos, Karen M.; Patel, Shriji N.; Tao, Yuankai K.

    2016-01-01

    Scanning laser ophthalmoscopy (SLO) benefits diagnostic imaging and therapeutic guidance by allowing for high-speed en face imaging of retinal structures. When combined with optical coherence tomography (OCT), SLO enables real-time aiming and retinal tracking and provides complementary information for post-acquisition volumetric co-registration, bulk motion compensation, and averaging. However, multimodality SLO-OCT systems generally require dedicated light sources, scanners, relay optics, detectors, and additional digitization and synchronization electronics, which increase system complexity. Here, we present a multimodal ophthalmic imaging system using swept-source spectrally encoded scanning laser ophthalmoscopy and optical coherence tomography (SS-SESLO-OCT) for in vivo human retinal imaging. SESLO reduces the complexity of en face imaging systems by multiplexing spatial positions as a function of wavelength. SESLO image quality benefited from single-mode illumination and multimode collection through a prototype double-clad fiber coupler, which optimized scattered light throughput and reduce speckle contrast while maintaining lateral resolution. Using a shared 1060 nm swept-source, shared scanner and imaging optics, and a shared dual-channel high-speed digitizer, we acquired inherently co-registered en face retinal images and OCT cross-sections simultaneously at 200 frames-per-second. PMID:28101411

  1. Caltrans WeatherShare Phase II System: An Application of Systems and Software Engineering Process to Project Development

    DOT National Transportation Integrated Search

    2009-08-25

    In cooperation with the California Department of Transportation, Montana State University's Western Transportation Institute has developed the WeatherShare Phase II system by applying Systems Engineering and Software Engineering processes. The system...

  2. An effective write policy for software coherence schemes

    NASA Technical Reports Server (NTRS)

    Chen, Yung-Chin; Veidenbaum, Alexander V.

    1992-01-01

    The authors study the write behavior and evaluate the performance of various write strategies and buffering techniques for a MIN-based multiprocessor system using the simple software coherence scheme. Hit ratios, memory latencies, total execution time, and total write traffic are used as the performance indices. The write-through write-allocate no-fetch cache using a write-back write buffer is shown to have better performance than both write-through and write-back caches. This type of write buffer is effective in reducing the volume as well as the bursts of write traffic. On average, the use of a write-back cache reduces by 60 percent the total write traffic generated by a write-through cache.
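
    The headline result lends itself to a toy traffic counter. The sketch below simplifies aggressively (no cache model, a tiny buffer keyed by cache line with FIFO eviction): plain write-through issues one memory write per store, while a write-back write buffer coalesces repeated stores to the same line before they reach memory.

        class WriteBackBuffer:
            def __init__(self, depth=4):
                self.depth, self.lines, self.traffic = depth, {}, 0

            def write(self, line, value):
                if line not in self.lines and len(self.lines) == self.depth:
                    self.lines.pop(next(iter(self.lines)))  # evict: one memory write
                    self.traffic += 1
                self.lines[line] = value                    # coalesce in the buffer

            def flush(self):
                self.traffic += len(self.lines)
                self.lines.clear()

        plain_write_through = 0
        buffered = WriteBackBuffer()
        stores = [0x40, 0x40, 0x40, 0x80, 0x40, 0x80]       # repeated stores, 2 lines
        for line in stores:
            plain_write_through += 1                        # every store hits memory
            buffered.write(line, 0)
        buffered.flush()
        print(plain_write_through, "vs", buffered.traffic)  # 6 vs 2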

  3. Detecting and Analyzing Multiple Moving Objects in Crowded Environments with Coherent Motion Regions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheriyadat, Anil M.

    Understanding the world around us from large-scale video data requires vision systems that can perform automatic interpretation. While human eyes can unconsciously perceive independent objects in crowded scenes and other challenging operating environments, automated systems have difficulty detecting, counting, and understanding their behavior in similar scenes. Computer scientists at ORNL have developed a technology termed "Coherent Motion Region Detection" that involves identifying multiple independent moving objects in crowded scenes by aggregating low-level motion cues extracted from moving objects. Humans and other species exploit such low-level motion cues seamlessly to perform perceptual grouping for visual understanding. The algorithm detects and tracks feature points on moving objects, resulting in partial trajectories that span coherent 3D regions in the space-time volume defined by the video. In the case of multi-object motion, many possible coherent motion regions can be constructed around the set of trajectories. The unique approach in the algorithm is to identify all possible coherent motion regions, then extract a subset of motion regions based on an innovative measure to automatically locate moving objects in crowded environments. The software reports a snapshot of each object, a count, and derived statistics (count over time) from input video streams. The software can directly process videos streamed over the internet or directly from a hardware device (camera).

  4. Controlling Distributed Planning

    NASA Technical Reports Server (NTRS)

    Clement, Bradley; Barrett, Anthony

    2004-01-01

    A system of software implements an extended version of an approach, denoted shared activity coordination (SHAC), to the interleaving of planning and the exchange of plan information among organizations devoted to different missions that normally communicate infrequently except that they need to collaborate on joint activities and/or the use of shared resources. SHAC enables the planning and scheduling systems of the organizations to coordinate by resolving conflicts while optimizing local planning solutions. The present software provides a framework for modeling and executing communication protocols for SHAC. Shared activities are represented in each interacting planning system to establish consensus on joint activities or to inform the other systems of consumption of a common resource or a change in a shared state. The representations of shared activities are extended to include information on (1) the role(s) of each participant, (2) permissions (defined as specifications of which participant controls what aspects of shared activities and scheduling thereof), and (3) constraints on the parameters of shared activities. Also defined in the software are protocols for changing roles, permissions, and constraints during the course of coordination and execution.
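
    The extended shared-activity representation maps naturally onto a record type. Field names and the consensus rule below are our own illustrative assumptions, not the software's actual schema: each shared activity carries per-participant roles, permissions naming who controls which aspect, and constraints on parameter values.

        from dataclasses import dataclass, field

        @dataclass
        class SharedActivity:
            name: str
            roles: dict = field(default_factory=dict)        # participant -> role
            permissions: dict = field(default_factory=dict)  # aspect -> controller
            constraints: dict = field(default_factory=dict)  # parameter -> (lo, hi)
            parameters: dict = field(default_factory=dict)

            def propose(self, who, parameter, value):
                if self.permissions.get(parameter) != who:
                    raise PermissionError(f"{who} may not change {parameter}")
                lo, hi = self.constraints[parameter]
                if not lo <= value <= hi:
                    raise ValueError(f"{parameter}={value} outside {lo}..{hi}")
                self.parameters[parameter] = value           # accepted update

        downlink = SharedActivity(
            "joint_downlink",
            roles={"orbiter": "relay", "rover": "producer"},
            permissions={"start_time": "orbiter", "data_volume": "rover"},
            constraints={"start_time": (0, 86400), "data_volume": (0, 512)})
        downlink.propose("rover", "data_volume", 256)        # allowed
        try:
            downlink.propose("rover", "start_time", 100)     # not rover's to change
        except PermissionError as err:
            print(err)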

  5. Software architecture of the Magdalena Ridge Observatory Interferometer

    NASA Astrophysics Data System (ADS)

    Farris, Allen; Klinglesmith, Dan; Seamons, John; Torres, Nicolas; Buscher, David; Young, John

    2010-07-01

    Merging software from 36 independent work packages into a coherent, unified software system with a lifespan of twenty years is the challenge faced by the Magdalena Ridge Observatory Interferometer (MROI). We solve this problem by using standardized interface software automatically generated from simple high-level descriptions of these systems, relying only on Linux, GNU, and POSIX without complex software such as CORBA. This approach, based on gigabit Ethernet with a TCP/IP protocol, provides the flexibility to integrate and manage diverse, independent systems using a centralized supervisory system that provides a database manager, data collectors, fault handling, and an operator interface.

  6. Plug into a Network.

    ERIC Educational Resources Information Center

    Vander Linden, Doug; Clark, Larry

    1994-01-01

    Stand-alone, single-user software programs for classroom use can be prohibitively expensive, compared to information-sharing network systems. Based on a Kansas district's experience, this article explains three types of networks (device-sharing, operating-system-based, and client-server-based) and discusses network protocols, software choices,…

  7. Distributed Visualization Project

    NASA Technical Reports Server (NTRS)

    Craig, Douglas; Conroy, Michael; Kickbusch, Tracey; Mazone, Rebecca

    2016-01-01

    Distributed Visualization allows anyone, anywhere to see any simulation at any time. Development focuses on algorithms, software, data formats, data systems and processes to enable sharing simulation-based information across temporal and spatial boundaries without requiring stakeholders to possess highly-specialized and very expensive display systems. It also introduces abstraction between the native and shared data, which allows teams to share results without giving away proprietary or sensitive data. The initial implementation of this capability is the Distributed Observer Network (DON) version 3.1. DON 3.1 is available for public release in the NASA Software Store (https://software.nasa.gov/software/KSC-13775) and works with version 3.0 of the Model Process Control specification (an XML Simulation Data Representation and Communication Language) to display complex graphical information and associated Meta-Data.

  8. Coherence and other autistic spectrum traits and eating disorders: building from mechanism to treatment. The Birgit Olsson lecture.

    PubMed

    Treasure, Janet

    2013-02-01

    To revisit Gillberg's hypothesis proposed in 1992, which was that anorexia nervosa should be considered within the spectrum of autistic disorders. A search was made of the literature relating to the behavioural traits, and cognitive, emotional and neuroanatomical intermediate phenotypes that are shared between autistic spectrum disorders (ASD) and anorexia nervosa. People with eating disorders in the acute phase (less so after recovery) share some behavioural traits (social impairment and restricted and repetitive behaviours) and intermediate phenotypes (weak central coherence, and impaired set shifting and theory of mind) with people in the autistic spectrum. Behavioural and intermediate neuropsychological traits are shared between eating disorders and ASD. In part, these are familial but also they are accentuated by the illness state and may be secondary to starvation. These traits have implications for prognosis and treatment.

  9. Semi-automated software to measure luminal and stromal areas of choroid in optical coherence tomographic images.

    PubMed

    Sonoda, Shozo; Sakamoto, Taiji; Kakiuchi, Naoko; Shiihara, Hideki; Sakoguchi, Tomonori; Tomita, Masatoshi; Yamashita, Takehiro; Uchino, Eisuke

    2018-03-01

    To determine the capabilities of "EyeGround" software in measuring the choroidal cross sectional areas in optical coherence tomographic (OCT) images. Cross sectional, prospective study. The cross-sectional area of the subfoveal choroid within a 1500 µm diameter circle centered on the fovea was measured both with and without using the EyeGround software in the OCT images. The differences between the evaluation times and the results of the measurements were compared. The inter-rater, intra-rater, and inter-method agreements were determined. Fifty-one eyes of 51 healthy subjects were studied: 24 men and 27 women with an average age of 35.0 ± 8.8 years. The time for analyzing a single image was significantly shorter with the software at 3.2±1.1 min than without the software at 12.1±5.1 min (P <0.001). The inter-method correlation coefficient for the measurements of the whole choroid was high [0.989, 95% CI (0.981-0.994)]. With the software, the inter-rater correlation coefficient was significantly high [0.997, 95% CI (0.995-0.999)], and the intra-rater correlation coefficient was also significantly high [0.999, 95% CI (0.999-1.0)]. The EyeGround software can measure the choroidal area in OCT cross sectional images with good reproducibility and in a significantly shorter time. It can be a valuable tool for analyzing the choroid.

  10. A STUDY OF SOME SOFTWARE PARAMETERS IN TIME-SHARING SYSTEMS.

    DTIC Science & Technology

    A review is made of some existing time-sharing computer systems and an exploration of various software characteristics is conducted. This...of the various parameters upon the average response cycle time, the average number in the queue awaiting service, the average length of time a user is

  11. Enhanced Visualization of Subtle Outer Retinal Pathology by En Face Optical Coherence Tomography and Correlation with Multi-Modal Imaging

    PubMed Central

    Chew, Avenell L.; Lamey, Tina; McLaren, Terri; De Roach, John

    2016-01-01

    Purpose To present en face optical coherence tomography (OCT) images generated by graph-search theory algorithm-based custom software and examine correlation with other imaging modalities. Methods En face OCT images derived from high density OCT volumetric scans of 3 healthy subjects and 4 patients using a custom algorithm (graph-search theory) and commercial software (Heidelberg Eye Explorer; Heidelberg Engineering) were compared and correlated with near infrared reflectance, fundus autofluorescence, adaptive optics flood-illumination ophthalmoscopy (AO-FIO) and microperimetry. Results Commercial software was unable to generate accurate en face OCT images in eyes with retinal pigment epithelium (RPE) pathology due to segmentation error at the level of Bruch’s membrane (BM). Accurate segmentation of the basal RPE and BM was achieved using custom software. The en face OCT images from eyes with isolated interdigitation or ellipsoid zone pathology were of similar quality between the custom software and Heidelberg Eye Explorer in the absence of any other significant outer retinal pathology. En face OCT images demonstrated angioid streaks, lesions of acute macular neuroretinopathy, hydroxychloroquine toxicity and Bietti crystalline deposits that correlated with other imaging modalities. Conclusions Graph-search theory algorithm helps to overcome the limitations of outer retinal segmentation inaccuracies in commercial software. En face OCT images can provide detailed topography of the reflectivity within a specific layer of the retina which correlates with other forms of fundus imaging. Our results highlight the need for standardization of image reflectivity to facilitate quantification of en face OCT images and longitudinal analysis. PMID:27959968
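    For readers unfamiliar with this class of method, the sketch below is a simplified dynamic-programming stand-in for graph-search boundary segmentation, assuming a toy B-scan and a gradient-based cost: each layer boundary is recovered as the minimum-cost left-to-right path through the cost image. It is not the study's actual algorithm.

```python
import numpy as np

def segment_boundary(bscan: np.ndarray, max_jump: int = 1) -> np.ndarray:
    """Return, per column, the row index of the minimum-cost boundary path."""
    grad = -np.diff(bscan.astype(float), axis=0)  # dark-above/bright-below edges get low cost
    cost = grad - grad.min()                      # shift to non-negative node costs
    rows, cols = cost.shape
    acc = np.full((rows, cols), np.inf)
    acc[:, 0] = cost[:, 0]
    for c in range(1, cols):                      # accumulate cheapest arrivals
        for r in range(rows):
            lo, hi = max(0, r - max_jump), min(rows, r + max_jump + 1)
            acc[r, c] = cost[r, c] + acc[lo:hi, c - 1].min()
    path = np.empty(cols, dtype=int)              # backtrack from cheapest end
    path[-1] = int(np.argmin(acc[:, -1]))
    for c in range(cols - 2, -1, -1):
        r = path[c + 1]
        lo, hi = max(0, r - max_jump), min(rows, r + max_jump + 1)
        path[c] = lo + int(np.argmin(acc[lo:hi, c]))
    return path

# Toy B-scan: dark above, bright below row 30; the path locks onto the edge.
img = np.zeros((50, 40))
img[30:, :] = 1.0
print(segment_boundary(img)[:5])   # -> [29 29 29 29 29]
```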

  12. Quantitative Evaluation of Adult Subglottic Stenosis Using Intraoperative Long-range Optical Coherence Tomography

    PubMed Central

    Sharma, Giriraj K.; Loy, Anthony Chin; Su, Erica; Jing, Joe; Chen, Zhongping; Wong, Brian J-F.; Verma, Sunil

    2016-01-01

    Objectives To determine the feasibility of long-range optical coherence tomography (LR-OCT) as a tool to intraoperatively image and measure the subglottis and trachea during suspension microlaryngoscopy before and after endoscopic treatment of subglottic stenosis (SGS). Methods Long-range optical coherence tomography of the adult subglottis and trachea was performed during suspension microlaryngoscopy before and after endoscopic treatment for SGS. The anteroposterior and transverse diameters, cross-sectional area (CSA), distance from the vocal cords, and length of the SGS were measured using MATLAB software. Pre-intervention and post-intervention airway dimensions were compared. Three-dimensional volumetric airway reconstructions were generated using medical image processing software (MIMICS). Results Intraoperative LR-OCT imaging was performed in 3 patients undergoing endoscopic management of SGS. Statistically significant differences in mean anteroposterior diameter (P < .01), transverse diameter (P < .001), and CSA (P < .001) were noted between pre-intervention and post-intervention data. Three-dimensional airway models were viewed in cross-sectional format and via virtual “fly through” bronchoscopy. Conclusions This is the first report of intraoperative LR-OCT of the subglottic and tracheal airway before and after surgical management of SGS in humans. Long-range optical coherence tomography offers a practical means to measure the dimensions of SGS and acquire objective data on the response to endoscopic treatment of SGS. PMID:27354215

  13. A-7E Software Module Guide.

    DTIC Science & Technology

    1981-12-08

    [Garbled table-of-contents fragment; recoverable entries: B:2.1 Function Driver Module; B:2.2 Shared Services Module; B:3 Software Decision Module; C:2.1.13 Weapon Release Functions; C:2.1.14 Ground Test Functions; C:2.2 Shared Services Module Decomposition.] The software comprises a Function Driver (FD) Module supported by a Shared Services (SS) Module. B:2.1 Function Driver Module: The Function Driver Module consists of a set of individual

  14. NSF Policies on Software and Data Sharing and their Implementation

    NASA Astrophysics Data System (ADS)

    Katz, Daniel

    2014-01-01

    Since January 2011, the National Science Foundation has required a Data Management plan to be submitted with all proposals. This plan should include a description of how the proposers will share the products of the research (http://www.nsf.gov/bfa/dias/policy/dmp.jsp). What constitutes such data will be determined by the community of interest through the process of peer review and program management. This may include, but is not limited to: data, publications, samples, physical collections, software and models. In particular, “investigators and grantees are encouraged to share software and inventions created under an award or otherwise make them or their products widely available and usable.”

  15. Evaluation of a New Software Version of the RTVue Optical Coherence Tomograph for Image Segmentation and Detection of Glaucoma in High Myopia.

    PubMed

    Holló, Gábor; Shu-Wei, Hsu; Naghizadeh, Farzaneh

    2016-06-01

    To compare the current (6.3) and a novel software version (6.12) of the RTVue-100 optical coherence tomograph (RTVue-OCT) for ganglion cell complex (GCC) and retinal nerve fiber layer thickness (RNFLT) image segmentation and detection of glaucoma in high myopia. RNFLT and GCC scans were acquired with software version 6.3 of the RTVue-OCT on 51 highly myopic eyes (spherical refractive error ≤-6.0 D) of 51 patients, and were analyzed with both software versions. Twenty-two eyes were nonglaucomatous, 13 were ocular hypertensive and 16 eyes had glaucoma. No difference was seen for any RNFLT or average GCC parameter between the software versions (paired t test, P≥0.084). Global loss volume was significantly lower (more normal) with version 6.12 than with version 6.3 (Wilcoxon signed-rank test, P<0.001). The agreement (κ) between the clinical (normal and ocular hypertensive vs. glaucoma) and the software-provided classifications (normal and borderline vs. outside normal limits) was 0.3219 and 0.4442 for average RNFLT, and 0.2926 and 0.4977 for average GCC, with versions 6.3 and 6.12, respectively (McNemar symmetry test, P≥0.289). No difference in average RNFLT and GCC classification (McNemar symmetry test, P≥0.727) or in the number of eyes with at least 1 segmentation error (P≥0.109) was found between the software versions. Although GCC segmentation was improved with software version 6.12 compared with the current version in highly myopic eyes, this did not result in a significant change of the average RNFLT and GCC values, and did not significantly improve the software-provided classification for glaucoma.

  16. Seqcrawler: biological data indexing and browsing platform.

    PubMed

    Sallou, Olivier; Bretaudeau, Anthony; Roult, Aurelien

    2012-07-24

    Seqcrawler has its roots in software such as SRS and Lucegene. It provides an indexing platform to ease the search of data and meta-data in biological banks, and it can scale to meet the current flow of data. While many biological bank search tools are available on the Internet, mainly provided by large organizations to search their data, there is a lack of free and open source solutions to browse one's own set of data with a flexible query system that can scale from a single computer to a cloud system. A personal index platform will help labs and bioinformaticians to search their meta-data but also to build a larger information system with custom subsets of data. The software is scalable from a single computer to a cloud-based infrastructure. It has been successfully tested in a private cloud with 3 index shards (pieces of index) hosting ~400 million sequence records (whole GenBank, UniProt, PDB and others) for a total size of 600 GB in a fault tolerant architecture (high-availability). It has also been successfully integrated with software to add extra meta-data from blast results to enhance users' result analysis. Seqcrawler provides a complete open source search and store solution for labs or platforms needing to manage large amounts of data/meta-data with a flexible and customizable web interface. All components (search engine, visualization and data storage), though independent, share a common and coherent data system that can be queried with a simple HTTP interface. The solution scales easily and can also provide a high availability infrastructure.

  17. It's Time to Consider Open Source Software

    ERIC Educational Resources Information Center

    Pfaffman, Jay

    2007-01-01

    In 1985 Richard Stallman, a computer programmer, released "The GNU Manifesto" in which he proclaimed a golden rule: One must share computer programs. Software vendors required him to agree to license agreements that forbade sharing programs with others, but he refused to "break solidarity" with other computer users whom he assumed also wanted to…

  18. A Team Building Model for Software Engineering Courses Term Projects

    ERIC Educational Resources Information Center

    Sahin, Yasar Guneri

    2011-01-01

    This paper proposes a new model for team building, which enables teachers to build coherent teams rapidly and fairly for the term projects of software engineering courses. Moreover, the model can also be used to build teams for any type of project, if the team member candidates are students, or if they are inexperienced on a certain subject. The…

  19. Software system safety

    NASA Technical Reports Server (NTRS)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  20. Effects of cacheing on multitasking efficiency and programming strategy on an ELXSI 6400

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Montry, G.R.; Benner, R.E.

    1985-12-01

    The impact of a cache/shared memory architecture, and, in particular, the cache coherency problem, upon concurrent algorithm and program development is discussed. In this context, a simple set of programming strategies is proposed which streamlines code development and improves code performance when multitasking in a cache/shared memory or distributed memory environment.

  1. Dataworks for GNSS: Software for Supporting Data Sharing and Federation of Geodetic Networks

    NASA Astrophysics Data System (ADS)

    Boler, F. M.; Meertens, C. M.; Miller, M. M.; Wier, S.; Rost, M.; Matykiewicz, J.

    2015-12-01

    Continuously-operating Global Navigation Satellite System (GNSS) networks are increasingly being installed globally for a wide variety of science and societal applications. GNSS enables Earth science research in areas including tectonic plate interactions, crustal deformation in response to loading by tectonics, magmatism, water and ice, and the dynamics of water - and thereby energy transfer - in the atmosphere at regional scale. The many individual scientists and organizations that set up GNSS stations globally are often open to sharing data, but lack the resources or expertise to deploy systems and software to manage and curate data and metadata and provide user tools that would support data sharing. UNAVCO previously gained experience in facilitating data sharing through the NASA-supported development of the Geodesy Seamless Archive Centers (GSAC) open source software. GSAC provides web interfaces and simple web services for data and metadata discovery and access, supports federation of multiple data centers, and simplifies transfer of data and metadata to long-term archives. The NSF supported the dissemination of GSAC to multiple European data centers forming the European Plate Observing System. To expand upon GSAC to provide end-to-end, instrument-to-distribution capability, UNAVCO developed Dataworks for GNSS with NSF funding to the COCONet project, and deployed this software on systems that are now operating as Regional GNSS Data Centers as part of the NSF-funded TLALOCNet and COCONet projects. Dataworks consists of software modules written in Python and Java for data acquisition, management and sharing. There are modules for GNSS receiver control and data download, a database schema for metadata, tools for metadata handling, ingest software to manage file metadata, data file management scripts, GSAC, scripts for mirroring station data and metadata from partner GSACs, and extensive software and operator documentation. UNAVCO plans to provide a cloud VM image of Dataworks that would allow standing up a Dataworks-enabled GNSS data center without requiring upfront investment in server hardware. By enabling data creators to organize their data and metadata for sharing, Dataworks helps scientists expand their data curation awareness and responsibility, and enhances data access for all.
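    GSAC's web services are simple HTTP query interfaces, so a data request is just a parameterized URL. The sketch below shows the general shape of such a query using only the Python standard library; the host name, endpoint path, and parameter names are placeholders (assumptions for illustration), not the address or API of a real data center.

```python
from urllib.parse import urlencode
from urllib.request import urlopen

# Hypothetical GSAC-style site search; host, path, and parameters are
# illustrative placeholders only.
BASE = "https://example-datacenter.org/gsacws/gsacapi/site/search"
params = {"output": "site.csv", "site.code": "P123", "limit": "10"}

with urlopen(f"{BASE}?{urlencode(params)}") as resp:
    print(resp.read().decode("utf-8"))
```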

  2. Sharing programming resources between Bio* projects through remote procedure call and native call stack strategies.

    PubMed

    Prins, Pjotr; Goto, Naohisa; Yates, Andrew; Gautier, Laurent; Willis, Scooter; Fields, Christopher; Katayama, Toshiaki

    2012-01-01

    Open-source software (OSS) encourages computer programmers to reuse software components written by others. In evolutionary bioinformatics, OSS comes in a broad range of programming languages, including C/C++, Perl, Python, Ruby, Java, and R. To avoid writing the same functionality multiple times for different languages, it is possible to share components by bridging computer languages and Bio* projects, such as BioPerl, Biopython, BioRuby, BioJava, and R/Bioconductor. In this chapter, we compare the two principal approaches for sharing software between different programming languages: either by remote procedure call (RPC) or by sharing a local call stack. RPC provides a language-independent protocol over a network interface; examples are RSOAP and Rserve. The local call stack provides a between-language mapping not over the network interface, but directly in computer memory; examples are R bindings, RPy, and languages sharing the Java Virtual Machine stack. This functionality provides strategies for sharing of software between Bio* projects, which can be exploited more often. Here, we present cross-language examples for sequence translation, and measure throughput of the different options. We compare calling into R through native R, RSOAP, Rserve, and RPy interfaces, with the performance of native BioPerl, Biopython, BioJava, and BioRuby implementations, and with call stack bindings to BioJava and the European Molecular Biology Open Software Suite. In general, call stack approaches outperform native Bio* implementations and these, in turn, outperform RPC-based approaches. To test and compare strategies, we provide a downloadable BioNode image with all examples, tools, and libraries included. The BioNode image can be run on VirtualBox-supported operating systems, including Windows, OSX, and Linux.
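    The performance ordering reported here (call stack faster than native, native faster than RPC) can be felt even in a toy benchmark. The sketch below is not the chapter's benchmark: it times a toy sequence-translation function called in-process versus over XML-RPC on localhost, using only the Python standard library in place of RSOAP/Rserve; the codon table and workload sizes are arbitrary assumptions.

```python
import threading
import time
from xmlrpc.client import ServerProxy
from xmlrpc.server import SimpleXMLRPCServer

CODON = {"ATG": "M", "TGG": "W", "TAA": "*"}  # toy table, not a full genetic code

def translate(seq: str) -> str:
    """Toy codon translation standing in for a shared Bio* routine."""
    return "".join(CODON.get(seq[i:i + 3], "X") for i in range(0, len(seq), 3))

# Expose the same function over XML-RPC on an ephemeral localhost port.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(translate)
threading.Thread(target=server.serve_forever, daemon=True).start()
proxy = ServerProxy(f"http://127.0.0.1:{server.server_address[1]}/")

seq = "ATGTGGTAA" * 100
for label, fn in [("local call stack", translate), ("XML-RPC", proxy.translate)]:
    t0 = time.perf_counter()
    for _ in range(200):
        fn(seq)
    print(f"{label}: {time.perf_counter() - t0:.3f} s for 200 calls")
```

    The per-call network round trip dominates the RPC timing, which is the same effect that makes shared-call-stack bindings attractive for fine-grained cross-language reuse.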

  3. Quantifying heterogeneity attributable to polythetic diagnostic criteria: theoretical framework and empirical application.

    PubMed

    Olbert, Charles M; Gala, Gary J; Tupler, Larry A

    2014-05-01

    Heterogeneity within psychiatric disorders is both theoretically and practically problematic: For many disorders, it is possible for 2 individuals to share very few or even no symptoms in common yet share the same diagnosis. Polythetic diagnostic criteria have long been recognized to contribute to this heterogeneity, yet no unified theoretical understanding of the coherence of symptom criteria sets currently exists. A general framework for analyzing the logical and mathematical structure, coherence, and diversity of Diagnostic and Statistical Manual diagnostic categories (DSM-5 and DSM-IV-TR) is proposed, drawing from combinatorial mathematics, set theory, and information theory. Theoretical application of this framework to 18 diagnostic categories indicates that in most categories, 2 individuals with the same diagnosis may share no symptoms in common, and that any 2 theoretically possible symptom combinations will share on average less than half their symptoms. Application of this framework to 2 large empirical datasets indicates that patients who meet symptom criteria for major depressive disorder and posttraumatic stress disorder tend to share approximately three-fifths of symptoms in common. For both disorders in each of the datasets, pairs of individuals who shared no common symptoms were observed. Any 2 individuals with either diagnosis were unlikely to exhibit identical symptomatology. The theoretical and empirical results stemming from this approach have substantive implications for etiological research into, and measurement of, psychiatric disorders.
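    The combinatorial point is easy to reproduce. The sketch below is an illustration in the spirit of the paper's framework rather than its actual analysis: it simplifies the DSM major depressive disorder criteria to "any 5 of 9 symptoms" (ignoring the criteria's internal structure), counts the qualifying symptom combinations, and finds the minimum overlap between two diagnosed individuals.

```python
from itertools import combinations

# Simplified polythetic rule: at least THRESHOLD of N_SYMPTOMS criteria.
N_SYMPTOMS, THRESHOLD = 9, 5

combos = [frozenset(c)
          for k in range(THRESHOLD, N_SYMPTOMS + 1)
          for c in combinations(range(N_SYMPTOMS), k)]
print(f"qualifying symptom combinations: {len(combos)}")   # 256

min_overlap = min(len(a & b) for a in combos for b in combos)
print(f"fewest symptoms two diagnosed individuals can share: {min_overlap}")  # 1
```

    Under this simplified rule two diagnosed individuals must share at least one symptom; disorders with lower thresholds relative to their criterion counts can drive that minimum to zero, which is the paper's central observation.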

  4. The Emergence of Open-Source Software in China

    ERIC Educational Resources Information Center

    Pan, Guohua; Bonk, Curtis J.

    2007-01-01

    The open-source software movement is gaining increasing momentum in China. Of the limited numbers of open-source software in China, "Red Flag Linux" stands out most strikingly, commanding 30 percent share of Chinese software market. Unlike the spontaneity of open-source movement in North America, open-source software development in…

  5. Software Simplifies the Sharing of Numerical Models

    NASA Technical Reports Server (NTRS)

    2014-01-01

    To ease the sharing of climate models with university students, Goddard Space Flight Center awarded SBIR funding to Reston, Virginia-based Parabon Computation Inc., a company that specializes in cloud computing. The firm developed a software program capable of running climate models over the Internet, and also created an online environment for people to collaborate on developing such models.

  6. Exploring Persona-Scenarios - Using Storytelling to Create Design Ideas

    NASA Astrophysics Data System (ADS)

    Madsen, Sabine; Nielsen, Lene

    This paper explores the persona-scenario method by investigating how the method can support project participants in generating shared understandings and design ideas. As persona-scenarios are stories we draw on narrative theory to define what a persona-scenario is and which narrative elements it should consist of. Based on an empirical study a key finding is that despite our inherent human ability to construct, tell, and interpret stories it is not easy to write and present a good, coherent, and design-oriented story without methodical support. The paper therefore contributes with guidelines that delineate a) what a design-oriented persona-scenario should consist of (product) and b) how to write it (procedure) in order to generate and validate as many, new, and shared understandings and design ideas as possible (purpose). The purpose of the guidelines is to facilitate the construction of persona-scenarios as good, coherent stories, which make sense to the storytellers and to the audience - and which therefore generate many, new, and shared understandings and design ideas.

  7. Sharing Research Models: Using Software Engineering Practices for Facilitation

    PubMed Central

    Bryant, Stephanie P.; Solano, Eric; Cantor, Susanna; Cooley, Philip C.; Wagener, Diane K.

    2011-01-01

    Increasingly, researchers are turning to computational models to understand the interplay of important variables on systems’ behaviors. Although researchers may develop models that meet the needs of their investigation, application limitations—such as nonintuitive user interface features and data input specifications—may limit the sharing of these tools with other research groups. By removing these barriers, other research groups that perform related work can leverage these work products to expedite their own investigations. The use of software engineering practices can enable managed application production and shared research artifacts among multiple research groups by promoting consistent models, reducing redundant effort, encouraging rigorous peer review, and facilitating research collaborations that are supported by a common toolset. This report discusses three established software engineering practices— the iterative software development process, object-oriented methodology, and Unified Modeling Language—and the applicability of these practices to computational model development. Our efforts to modify the MIDAS TranStat application to make it more user-friendly are presented as an example of how computational models that are based on research and developed using software engineering practices can benefit a broader audience of researchers. PMID:21687780

  8. OntoSoft: A Software Registry for Geosciences

    NASA Astrophysics Data System (ADS)

    Garijo, D.; Gil, Y.

    2017-12-01

    The goal of the EarthCube OntoSoft project is to enable the creation of an ecosystem for software stewardship in geosciences that will empower scientists to manage their software as valuable scientific assets. By sharing software metadata in OntoSoft, scientists enable broader access to that software by other scientists, software professionals, students, and decision makers. Our work to date includes: 1) an ontology for describing scientific software metadata, 2) a distributed scientific software repository that contains more than 750 entries that can be searched and compared across metadata fields, 3) an intelligent user interface that guides scientists to publish software and allows them to crowdsource its corresponding metadata. We have also developed a training program where scientists learn to describe and cite software in their papers in addition to data and provenance, and we are using OntoSoft to show them the benefits of publishing their software metadata. This training program is part of a Geoscience Papers of the Future Initiative, where scientists are reflecting on their current practices, benefits and effort for sharing software and data. This journal paper can be submitted to a Special Section of the AGU Earth and Space Science Journal.

  9. Integration services to enable regional shared electronic health records.

    PubMed

    Oliveira, Ilídio C; Cunha, João P S

    2011-01-01

    eHealth is expected to integrate a comprehensive set of patient data sources into a coherent continuum, but implementations vary and Portugal is still lacking in electronic patient data sharing. In this work, we present a clinical information hub to aggregate multi-institution patient data and bridge the information silos. This integration platform enables a coherent object model, services-oriented applications development and a trust framework. It has been instantiated in the Rede Telemática de Saúde (www.RTSaude.org) to support a regional Electronic Health Record approach, fed dynamically from production systems at eight partner institutions, providing access to more than 11,000,000 care episodes, relating to over 350,000 citizens. The network has obtained the necessary clearance from the Portuguese data protection agency.

  10. Shared-resource computing for small research labs.

    PubMed

    Ackerman, M J

    1982-04-01

    A real-time laboratory computer network is described. This network is composed of four real-time laboratory minicomputers located in each of four division laboratories and a larger minicomputer in a centrally located computer room. Off-the-shelf hardware and software were used with no customization. The network is configured for resource sharing using DECnet communications software and the RSX-11-M multi-user real-time operating system. The cost effectiveness of the shared resource network and multiple real-time processing using priority scheduling is discussed. Examples of utilization within a medical research department are given.

  11. DSPSR: Digital Signal Processing Software for Pulsar Astronomy

    NASA Astrophysics Data System (ADS)

    van Straten, W.; Bailes, M.

    2010-10-01

    DSPSR, written primarily in C++, is an open-source, object-oriented, digital signal processing software library and application suite for use in radio pulsar astronomy. The library implements an extensive range of modular algorithms for use in coherent dedispersion, filterbank formation, pulse folding, and other tasks. The software is installed and compiled using the standard GNU configure and make system, and is able to read astronomical data in 18 different file formats, including FITS, S2, CPSR, CPSR2, PuMa, PuMa2, WAPP, ASP, and Mark5.
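    To give a flavor of the central algorithm, the sketch below implements textbook coherent dedispersion in NumPy: the baseband spectrum is multiplied by an inverse cold-plasma chirp. This is a minimal sketch of the standard technique, not DSPSR's C++ implementation; the sign of the chirp phase depends on the baseband convention, and the example parameters are arbitrary.

```python
import numpy as np

D = 4.148808e9  # dispersion constant scaled so the phase is in radians (f in MHz)

def dedisperse(voltage: np.ndarray, fs_mhz: float, f0_mhz: float, dm: float):
    """Coherently dedisperse complex baseband samples centred at f0_mhz."""
    n = voltage.size
    f = np.fft.fftfreq(n, d=1.0 / fs_mhz)          # offset from centre, MHz
    # Residual (quadratic and higher) dispersion phase after removing the
    # constant and linear terms, i.e. the standard dedispersion chirp.
    phase = 2.0 * np.pi * D * dm * f**2 / (f0_mhz**2 * (f0_mhz + f))
    return np.fft.ifft(np.fft.fft(voltage) * np.exp(1j * phase))

# Example: dedisperse noise-like data for DM = 56.8 pc cm^-3 at 1400 MHz.
rng = np.random.default_rng(0)
x = rng.normal(size=4096) + 1j * rng.normal(size=4096)
y = dedisperse(x, fs_mhz=16.0, f0_mhz=1400.0, dm=56.8)
print(np.allclose(np.abs(np.fft.fft(y)), np.abs(np.fft.fft(x))))  # unitary: True
```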

  12. Computing in high-energy physics

    DOE PAGES

    Mount, Richard P.

    2016-05-31

    I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Lastly, I describe recent developments aimed at improving the overall coherence of high-energy physics software.

  13. Computing in high-energy physics

    NASA Astrophysics Data System (ADS)

    Mount, Richard P.

    2016-04-01

    I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Finally, I describe recent developments aimed at improving the overall coherence of high-energy physics software.

  14. Computing in high-energy physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mount, Richard P.

    I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Lastly, I describe recent developments aimed at improving the overall coherence of high-energy physics software.

  15. The role of universities in preparing graduates to use software in the financial services workplace

    NASA Astrophysics Data System (ADS)

    Tickle, Leonie; Kyng, Tim; Wood, Leigh N.

    2014-02-01

    The role of universities in preparing students to use spreadsheet and other technical software in the financial services workplace has been investigated through surveys of university graduates, university academics, and employers. It is found that graduates are less skilled users of software than employers would like, due at least in part to a lack of structured formal training opportunities in the workplace, and a lack of targeted, coherent learning opportunities at university. The widespread and heavy use of software in the workplace means that there is significant potential for productivity gains if universities and employers address these issues.

  16. Low-noise correlation measurements based on software-defined-radio receivers and cooled microwave amplifiers.

    PubMed

    Nieminen, Teemu; Lähteenmäki, Pasi; Tan, Zhenbing; Cox, Daniel; Hakonen, Pertti J

    2016-11-01

    We present a microwave correlation measurement system based on two low-cost USB-connected software defined radio dongles modified to operate as coherent receivers by using a common local oscillator. Existing software is used to obtain I/Q samples from both dongles simultaneously at a software tunable frequency. To achieve low noise, we introduce a simple low-noise solution for cryogenic amplification at 600-900 MHz based on a single discrete HEMT with 21 dB gain and a 7 K noise temperature. In addition, we discuss the quantization effects in a digital correlation measurement and determination of optimal integration time by applying Allan deviation analysis.
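    The Allan deviation analysis mentioned at the end is straightforward to sketch. The toy example below is an assumption-laden illustration, not the authors' code: it computes the non-overlapping Allan deviation of simulated white correlator noise; in practice one increases the averaging time until the deviation stops falling, which marks the optimal integration time.

```python
import numpy as np

def allan_deviation(x: np.ndarray, m: int) -> float:
    """Non-overlapping Allan deviation at an averaging time of m samples."""
    n_blocks = x.size // m
    means = x[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
    return float(np.sqrt(0.5 * np.mean(np.diff(means) ** 2)))

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)   # white-noise-dominated correlator output
for m in (1, 10, 100, 1000):
    # For white noise the deviation falls as 1/sqrt(tau); it flattens once
    # drifts dominate, which marks the optimal integration time.
    print(f"tau = {m:5d} samples: sigma = {allan_deviation(x, m):.5f}")
```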

  17. Intercepted signals for ionospheric science

    NASA Astrophysics Data System (ADS)

    Lind, F. D.; Erickson, P. J.; Coster, A. J.; Foster, J. C.; Marchese, J. R.; Berkowitz, Z.; Sahr, J. D.

    2013-05-01

    The ISIS array (Intercepted Signals for Ionospheric Science) is a distributed, coherent software radio array designed for the study of geospace phenomena by observing the scatter of ambient radio frequency (RF) signals. ISIS data acquisition and analysis is performed using the MIDAS-M platform (Millstone Data Acquisition System - Mobile). Observations of RF signals can be performed between HF and L-band using the Array nodes and appropriate antennas. The deployment of the Array focuses on observations of the plasmasphere boundary layer. We discuss the concept of the coherent software radio array, describe the ISIS hardware, and give examples of data from the system for selected applications. In particular, we include the first observations of E region irregularities using the Array. We also present single-site passive radar observations of both meteor trails and E region irregularities using adaptive filtering techniques.

  18. Self-biased broadband magnet-free linear isolator based on one-way space-time coherency

    NASA Astrophysics Data System (ADS)

    Taravati, Sajjad

    2017-12-01

    This paper introduces a self-biased broadband magnet-free and linear isolator based on one-way space-time coherency. The incident wave and the space-time-modulated medium share the same temporal frequency and are hence temporally coherent. However, thanks to the unidirectionality of the space-time modulation, the space-time-modulated medium and the incident wave are spatially coherent only in the forward direction and not in the opposite direction. As a consequence, the energy of the medium strongly couples to the propagating wave in the forward direction, while it conflicts with the propagating wave in the opposite direction, yielding strong isolation. We first derive a closed-form solution for the wave scattering from a spatiotemporally coherent medium and then show that a perfectly coherent space-time-modulated medium provides a moderate isolation level which is also subject to one-way transmission gain. To overcome this issue, we next investigate the effect of space-coherency imperfection between the medium and the wave, while they are still perfectly temporally coherent. Leveraging the spatial-coherency imperfection, the medium exhibits a quasiarbitrary and strong nonreciprocal transmission. Finally, we present the experimental demonstration of the self-biased version of the proposed broadband isolator, exhibiting more than 122% fractional operation bandwidth.

  19. IDATEN and G-SITENNO: GUI-assisted software for coherent X-ray diffraction imaging experiments and data analyses at SACLA.

    PubMed

    Sekiguchi, Yuki; Yamamoto, Masaki; Oroguchi, Tomotaka; Takayama, Yuki; Suzuki, Shigeyuki; Nakasako, Masayoshi

    2014-11-01

    Using our custom-made diffraction apparatus KOTOBUKI-1 and two multiport CCD detectors, cryogenic coherent X-ray diffraction imaging experiments have been undertaken at the SPring-8 Angstrom Compact free electron LAser (SACLA) facility. To efficiently perform experiments and data processing, two software suites with user-friendly graphical user interfaces have been developed. The first is a program suite named IDATEN, which was developed to easily conduct four procedures during experiments: aligning KOTOBUKI-1, loading a flash-cooled sample into the cryogenic goniometer stage inside the vacuum chamber of KOTOBUKI-1, adjusting the sample position with respect to the X-ray beam using a pair of telescopes, and collecting diffraction data by raster scanning the sample with X-ray pulses. Named G-SITENNO, the other suite is an automated version of the original SITENNO suite, which was designed for processing diffraction data. These user-friendly software suites are now indispensable for collecting a large number of diffraction patterns and for processing the diffraction patterns immediately after collecting data within a limited beam time.

  20. An alternative model to distribute VO software to WLCG sites based on CernVM-FS: a prototype at PIC Tier1

    NASA Astrophysics Data System (ADS)

    Lanciotti, E.; Merino, G.; Bria, A.; Blomer, J.

    2011-12-01

    In a distributed computing model such as WLCG, experiment-specific application software has to be efficiently distributed to every site of the Grid. Application software is currently installed in a shared area of the site visible to all Worker Nodes (WNs) of the site through some protocol (NFS, AFS or other). The software is installed at the site by jobs which run on a privileged node of the computing farm where the shared area is mounted in write mode. This model presents several drawbacks which cause a non-negligible rate of job failure. An alternative model for software distribution based on the CERN Virtual Machine File System (CernVM-FS) has been tried at PIC, the Spanish Tier1 site of WLCG. The test bed used and the results are presented in this paper.

  1. A Role-Playing Game for a Software Engineering Lab: Developing a Product Line

    ERIC Educational Resources Information Center

    Zuppiroli, Sara; Ciancarini, Paolo; Gabbrielli, Maurizio

    2012-01-01

    Software product line development refers to software engineering practices and techniques for creating families of similar software systems from a basic set of reusable components, called shared assets. Teaching how to deal with software product lines in a university lab course is a challenging task, because there are several practical issues that…

  2. Bioboxes: standardised containers for interchangeable bioinformatics software.

    PubMed

    Belmann, Peter; Dröge, Johannes; Bremges, Andreas; McHardy, Alice C; Sczyrba, Alexander; Barton, Michael D

    2015-01-01

    Software is now both central and essential to modern biology, yet lack of availability, difficult installations, and complex user interfaces make software hard to obtain and use. Containerisation, as exemplified by the Docker platform, has the potential to solve the problems associated with sharing software. We propose bioboxes: containers with standardised interfaces to make bioinformatics software interchangeable.

  3. Monte Carlo simulation for coherent backscattering with diverging illumination (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Wu, Wenli; Radosevich, Andrew J.; Eshein, Adam; Nguyen, The-Quyen; Backman, Vadim

    2016-03-01

    Diverging beam illumination is widely used in many optical techniques, especially in fiber optic applications, and coherence is one of the most important properties to consider for these applications. Until now, Monte Carlo simulations have been used to study the backscattering coherence phenomenon under collimated beam illumination only. We are the first to study the coherence phenomenon under an exact diverging beam geometry, taking into account the impossibility of exact time-reversed path pairs of photons, which are the main contribution to the backscattering coherence pattern under a collimated beam. In this work, we present a Monte Carlo simulation that considers the influence of the illumination numerical aperture. The simulation tracks the electric field for the unique portions of the forward and reverse paths in time-reversed pairs of photons as well as the path segment shared by them. With this approach, we can model the coherence pattern formed between the pairs by considering their phase difference at the collection plane directly. To validate this model, we use Low-coherence Enhanced Backscattering Spectroscopy, one of the instruments that observes the coherence pattern using diverging beam illumination, as the benchmark for comparison. In the end, we show how this diverging configuration significantly changes the coherent pattern under coherent and incoherent light sources. The Monte Carlo model we developed can be used to study the backscattering phenomenon in both coherent and incoherent situations with both collimated beam and diverging beam setups.
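    For orientation, the sketch below is the conventional collimated-beam calculation that this work generalizes: photons perform an isotropic 2D random walk in a semi-infinite medium, and the enhanced-backscattering peak is the cosine transform of the entry-to-exit transverse separations of time-reversed path pairs. Wavelength, mean free path, photon count, and the step cutoff are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
K = 2.0 * np.pi / 0.5e-3   # wavenumber for a 0.5 um wavelength, in 1/mm
MFP = 0.1                  # scattering mean free path, mm

def exit_separation(max_steps: int = 1000):
    """Transverse entry-to-exit distance of one photon, or None if absorbed."""
    x, z = 0.0, rng.exponential(MFP)   # first step straight into the medium
    for _ in range(max_steps):
        angle = rng.uniform(0.0, 2.0 * np.pi)
        step = rng.exponential(MFP)
        x, z = x + step * np.cos(angle), z + step * np.sin(angle)
        if z < 0.0:                    # photon re-crossed the surface
            return x
    return None                        # cut off very long paths

dx = np.array([s for s in (exit_separation() for _ in range(20_000))
               if s is not None])

# Coherent peak: interference of each path with its time-reversed partner.
for theta_mrad in (0.0, 0.5, 1.0, 2.0):
    theta = theta_mrad * 1e-3          # radians from exact backscattering
    enhancement = 1.0 + np.mean(np.cos(K * dx * theta))
    print(f"theta = {theta_mrad:3.1f} mrad: relative intensity {enhancement:.3f}")
```

    Under diverging illumination the time-reversed partners are no longer exactly phase-matched, which is precisely the effect the paper's field-tracking simulation captures and this collimated-beam toy does not.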

  4. Automated sensor networks to advance ocean science

    NASA Astrophysics Data System (ADS)

    Schofield, O.; Orcutt, J. A.; Arrott, M.; Vernon, F. L.; Peach, C. L.; Meisinger, M.; Krueger, I.; Kleinert, J.; Chao, Y.; Chien, S.; Thompson, D. R.; Chave, A. D.; Balasuriya, A.

    2010-12-01

    The National Science Foundation has funded the Ocean Observatories Initiative (OOI), which over the next five years will deploy infrastructure to expand scientists' ability to remotely study the ocean. The deployed infrastructure will be linked by a robust cyberinfrastructure (CI) that will integrate marine observatories into a coherent system-of-systems. OOI is committed to engaging the ocean sciences community during the construction phase. For the CI, this is being enabled by using a “spiral design strategy” allowing for input throughout the construction phase. In Fall 2009, the OOI CI development team used an existing ocean observing network in the Mid-Atlantic Bight (MAB) to test OOI CI software. The objective of this CI test was to aggregate data from ships, autonomous underwater vehicles (AUVs), shore-based radars, and satellites and make it available to five different data-assimilating ocean forecast models. Scientists used these multi-model forecasts to automate future glider missions in order to demonstrate the feasibility of two-way interactivity between the sensor web and predictive models. The CI software coordinated and prioritized the shared resources that allowed for the semi-automated reconfiguration of asset tasking, and thus enabled an autonomous execution of observation plans for the fixed and mobile observation platforms. Efforts were coordinated through a web portal that provided an access point for the observational data and model forecasts. Researchers could use the CI software in tandem with the web data portal to assess the performance of individual numerical model results, or multi-model ensembles, through real-time comparisons with satellite, shore-based radar, and in situ robotic measurements. The resulting sensor net will enable a new means to explore and study the world’s oceans by providing scientists a responsive network in the world’s oceans that can be accessed via any wireless network.

  5. Developing a Coherent Research Agenda: Lessons from the REL Northeast & Islands Research Agenda Workshops. REL 2014-014

    ERIC Educational Resources Information Center

    Kochanek, Julie Reed; Lacireno-Paquet, Natalie; Carey, Rebecca

    2014-01-01

    This report describes the approach that REL Northeast and Islands (REL-NEI) used to guide its eight research alliances toward collaboratively identifying a shared research agenda. A key feature of their approach was a two-workshop series, during which alliance members created a set of research questions on a shared topic of education policy and/or…

  6. Optical Coherence Tomography Angiography in Optic Disc Swelling.

    PubMed

    Fard, Masoud Aghsaei; Jalili, Jalil; Sahraiyan, Alireza; Khojasteh, Hassan; Hejazi, Marjane; Ritch, Robert; Subramanian, Prem S

    2018-05-04

    To compare optical coherence tomography angiography (OCT-A) of peripapillary total vasculature and capillaries in patients with optic disc swelling. Cross-sectional study. Twenty-nine eyes with acute nonarteritic anterior ischemic optic neuropathy (NAION), 44 eyes with papilledema, 8 eyes with acute optic neuritis, and 48 eyes of normal subjects were imaged using OCT-A. Peripapillary total vasculature information was recorded using a commercial vessel density map. Customized image analysis with major vessel removal was also used to measure whole-image capillary density and peripapillary capillary density (PCD). Mixed models showed that the peripapillary total vasculature density values were significantly lower in NAION eyes, followed by papilledema eyes and control eyes, using commercial software (P < .0001 for all comparisons). The customized software also showed significantly lower PCD of NAION eyes compared with papilledema eyes (all P < .001), but did not show significant differences between papilledema and control subjects. Our software showed significantly lower whole image and PCD in eyes with optic neuritis than papilledema. There was no significant difference between NAION and optic neuritis using our software. The area under the receiver operating curves for discriminating NAION from papilledema eyes and optic neuritis from papilledema eyes was highest for whole-image capillary density (0.94 and 0.80, respectively) with our software, followed by peripapillary total vasculature (0.9 and 0.74, respectively) with commercial software. OCT-A is helpful to distinguish NAION and papillitis from papilledema. Whole-image capillary density had the greatest diagnostic accuracy for differentiating disc swelling. Copyright © 2018 Elsevier Inc. All rights reserved.
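    To make the density metrics concrete, the sketch below computes a capillary-density-style measure on a synthetic en face angiogram: binarize, mask out the major vessels, and report the perfused fraction of the remaining pixels. The threshold, mask, and function name are illustrative assumptions, not the study's customized software.

```python
import numpy as np

def capillary_density(angio: np.ndarray, vessel_mask: np.ndarray,
                      threshold: float) -> float:
    perfused = angio > threshold           # flow signal above a noise floor
    capillaries = perfused & ~vessel_mask  # drop pixels inside major vessels
    return capillaries.sum() / (~vessel_mask).sum()

rng = np.random.default_rng(3)
angio = rng.random((64, 64))               # synthetic decorrelation image
mask = np.zeros((64, 64), dtype=bool)
mask[:, 30:34] = True                      # pretend a major vessel runs here
print(f"capillary density: {capillary_density(angio, mask, 0.6):.3f}")
```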

  7. R-CMap-An open-source software for concept mapping.

    PubMed

    Bar, Haim; Mentch, Lucas

    2017-02-01

    Planning and evaluating projects often involves input from many stakeholders. Fusing and organizing many different ideas, opinions, and interpretations into a coherent and acceptable plan or project evaluation is challenging. This is especially true when seeking contributions from a large number of participants, especially when not all can participate in group discussions, or when some prefer to contribute their perspectives anonymously. One of the major breakthroughs in the area of evaluation and program planning has been the use of graphical tools to represent the brainstorming process. This provides a quantitative framework for organizing ideas and general concepts into simple-to-interpret graphs. We developed a new, open-source concept mapping software called R-CMap, which is implemented in R. This software provides a graphical user interface to guide users through the analytical process of concept mapping. The R-CMap software allows users to generate a variety of plots, including cluster maps, point rating and cluster rating maps, as well as pattern matching and go-zone plots. Additionally, R-CMap is capable of generating detailed reports that contain useful statistical summaries of the data. The plots and reports can be embedded in Microsoft Office tools such as Word and PowerPoint, where users may manually adjust various plot and table features to achieve the best visual results in their presentations and official reports. The graphical user interface of R-CMap allows users to define cluster names, change the number of clusters, select rating variables for relevant plots, and importantly, select subsets of respondents by demographic criteria. The latter is particularly useful to project managers in order to identify different patterns of preferences by subpopulations. R-CMap is user-friendly, and does not require any programming experience. However, proficient R users can add to its functionality by directly accessing built-in functions in R and sharing new features with the concept mapping community. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Bandwidth scalable, coherent transmitter based on the parallel synthesis of multiple spectral slices using optical arbitrary waveform generation.

    PubMed

    Geisler, David J; Fontaine, Nicolas K; Scott, Ryan P; He, Tingting; Paraschis, Loukas; Gerstel, Ori; Heritage, Jonathan P; Yoo, S J B

    2011-04-25

    We demonstrate an optical transmitter based on dynamic optical arbitrary waveform generation (OAWG) which is capable of creating high-bandwidth (THz) data waveforms in any modulation format using the parallel synthesis of multiple coherent spectral slices. As an initial demonstration, the transmitter uses only 5.5 GHz of electrical bandwidth and two 10-GHz-wide spectral slices to create 100-ns duration, 20-GHz optical waveforms in various modulation formats including differential phase-shift keying (DPSK), quaternary phase-shift keying (QPSK), and eight phase-shift keying (8PSK) with only changes in software. The experimentally generated waveforms showed clear eye openings and separated constellation points when measured using a real-time digital coherent receiver. Bit-error-rate (BER) performance analysis resulted in a BER < 9.8 × 10⁻⁶ for DPSK and QPSK waveforms. Additionally, we experimentally demonstrate three-slice, 4-ns long waveforms that highlight the bandwidth scalable nature of the optical transmitter. The various generated waveforms show that the key transmitter properties (i.e., packet length, modulation format, data rate, and modulation filter shape) are software definable, and that the optical transmitter is capable of acting as a flexible bandwidth transmitter.
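    The parallel-synthesis idea rests on linearity: slices generated separately sum coherently to the full-bandwidth waveform. The NumPy sketch below illustrates this with two slices of a random-phase comb; the sample rate, slice boundaries, and waveform contents are arbitrary assumptions, and real OAWG must additionally manage slice stitching and modulator imperfections that this toy ignores.

```python
import numpy as np

fs = 40e9                                   # sample rate, Hz
n = 4096
rng = np.random.default_rng(2)

freqs = np.fft.fftfreq(n, 1.0 / fs)         # Hz
band = np.abs(freqs) < 10e9                 # 20 GHz-wide target band
spectrum = np.zeros(n, dtype=complex)
spectrum[band] = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, band.sum()))
target = np.fft.ifft(spectrum)              # full-bandwidth waveform

slice_lo = np.where(np.abs(freqs) < 5e9, spectrum, 0)            # inner slice
slice_hi = np.where((np.abs(freqs) >= 5e9) & band, spectrum, 0)  # outer slice
synthesised = np.fft.ifft(slice_lo) + np.fft.ifft(slice_hi)

print("max reconstruction error:", float(np.max(np.abs(synthesised - target))))
```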

  9. Implementing the concurrent operation of sub-arrays in the ALMA correlator

    NASA Astrophysics Data System (ADS)

    Amestica, Rodrigo; Perez, Jesus; Lacasse, Richard; Saez, Alejandro

    2016-07-01

    The ALMA correlator processes the digitized signals from 64 individual antennas to produce a grand total of 2016 correlated base-lines, with runtime selectable lags resolution and integration time. The on-line software system can process a maximum of 125M visibilities per second, producing an archiving data rate close to one sixteenth of the former (7.8M visibilities per second with a network transfer limit of 60 MB/sec). Mechanisms in the correlator hardware design make it possible to split the total number of antennas in the array into smaller subsets, or sub-arrays, such that they can share correlator resources while executing independent observations. The software part of the sub-system is responsible for configuring and scheduling correlator resources in such a way that observations among independent subarrays occur simultaneously while internally sharing correlator resources under a cooperative arrangement. Configuration of correlator modes through its CAN-bus interface and periodic geometric delay updates are the most relevant activities to schedule concurrently while observations happen at the same time among a number of sub-arrays. For that to work correctly, the software interface to sub-arrays schedules shared correlator resources sequentially before observations actually start on each sub-array. Start times for specific observations are optimized and reported back to the higher level observing software. After that initial sequential phase has taken place then simultaneous executions and recording of correlated data across different sub-arrays move forward concurrently, sharing the local network to broadcast results to other software sub-systems. The present paper presents an overview of the different hardware and software actors within the correlator sub-system that implement some degree of concurrency and synchronization needed for seamless and simultaneous operation of multiple sub-arrays, limitations stemming from the resource-sharing nature of the correlator, limitations intrinsic to the digital technology available in the correlator hardware, and milestones so far reached by this new ALMA feature.

  10. Distributed metadata servers for cluster file systems using shared low latency persistent key-value metadata store

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bent, John M.; Faibish, Sorin; Pedone, Jr., James M.

    A cluster file system is provided having a plurality of distributed metadata servers with shared access to one or more shared low latency persistent key-value metadata stores. A metadata server comprises an abstract storage interface comprising a software interface module that communicates with at least one shared persistent key-value metadata store providing a key-value interface for persistent storage of key-value metadata. The software interface module provides the key-value metadata to the at least one shared persistent key-value metadata store in a key-value format. The shared persistent key-value metadata store is accessed by a plurality of metadata servers. A metadata request can be processed by a given metadata server independently of other metadata servers in the cluster file system. A distributed metadata storage environment is also disclosed that comprises a plurality of metadata servers having an abstract storage interface to at least one shared persistent key-value metadata store.
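    The sketch below captures the shape of this architecture, assuming hypothetical class and method names (none from the actual product): every metadata server programs against one narrow put/get contract, so any backend honoring that contract, and any server sharing it, can serve a request.

```python
from abc import ABC, abstractmethod

class KeyValueMetadataStore(ABC):
    """Abstract storage interface the metadata servers program against."""
    @abstractmethod
    def put(self, key: bytes, value: bytes) -> None: ...
    @abstractmethod
    def get(self, key: bytes) -> bytes: ...

class InMemoryStore(KeyValueMetadataStore):
    """Toy backend; a real deployment would use a shared low-latency store."""
    def __init__(self):
        self._data = {}
    def put(self, key: bytes, value: bytes) -> None:
        self._data[key] = value
    def get(self, key: bytes) -> bytes:
        return self._data[key]

class MetadataServer:
    def __init__(self, store: KeyValueMetadataStore):
        self.store = store   # the same store is shared by every server

    def set_attr(self, path: str, attrs: bytes) -> None:
        self.store.put(path.encode(), attrs)

    def get_attr(self, path: str) -> bytes:
        return self.store.get(path.encode())

shared = InMemoryStore()
s1, s2 = MetadataServer(shared), MetadataServer(shared)
s1.set_attr("/dir/file", b"mode=0644")
assert s2.get_attr("/dir/file") == b"mode=0644"  # any server can answer
```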

  11. Software Sharing Enables Smarter Content Management

    NASA Technical Reports Server (NTRS)

    2007-01-01

    In 2004, NASA established a technology partnership with Xerox Corporation to develop high-tech knowledge management systems while providing new tools and applications that support the Vision for Space Exploration. In return, NASA provides research and development assistance to Xerox to progress its product line. The first result of the technology partnership was a new system called the NX Knowledge Network (based on Xerox DocuShare CPX). Created specifically for NASA's purposes, this system combines Netmark (practical database content management software created by the Intelligent Systems Division of NASA's Ames Research Center) with complementary software from Xerox's global research centers and DocuShare. NX Knowledge Network was tested at the NASA Astrobiology Institute, and is widely used for document management at Ames, Langley Research Center, within the Mission Operations Directorate at Johnson Space Center, and at the Jet Propulsion Laboratory, for mission-related tasks.

  12. Free Software and Free Textbooks

    ERIC Educational Resources Information Center

    Takhteyev, Yuri

    2012-01-01

    Some of the world's best and most sophisticated software is distributed today under "free" or "open source" licenses, which allow the recipients of such software to use, modify, and share it without paying royalties or asking for permissions. If this works for software, could it also work for educational resources, such as books? The economics of…

  13. bioboxes v510

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barton, Michael; Droge, Johannes; Belmann, Peter

    2017-06-22

    Software is now both central and essential to modern biology, yet lack of availability, difficult installations, and complex user interfaces make software hard to obtain and use. Containerisation, as exemplified by the Docker platform, has the potential to solve the problems associated with sharing software. The developers propose bioboxes: containers with standardised interfaces to make bioinformatics software interchangeable.

  14. Specification for Visual Requirements of Work-Centered Software Systems

    DTIC Science & Technology

    2006-10-01

    …work-aiding systems. Based on the design concept for a work-centered support system (WCSS), these software systems support user tasks and goals through both direct and indirect aiding methods within the interface client. In order to ensure the coherent development and delivery of work-centered

  15. Challenges to Software/Computing for Experimentation at the LHC

    NASA Astrophysics Data System (ADS)

    Banerjee, Sunanda

    The demands of future high-energy physics experiments on software and computing have led the experiments to plan the related activities as full-fledged projects and to investigate new methodologies and languages to meet the challenges. The paths taken by the four LHC experiments ALICE, ATLAS, CMS and LHCb are coherently put together in an LHC-wide framework based on Grid technology. The current status and understanding are broadly outlined.

  16. Mechanical design of thin-film diamond crystal mounting apparatus for coherence preservation hard x-ray optics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shu, Deming, E-mail: shu@aps.anl.gov; Shvyd’ko, Yuri V.; Stoupin, Stanislav

    2016-07-27

    A new thin-film diamond crystal mounting apparatus has been designed at the Advanced Photon Source (APS) for coherence-preserving hard x-ray optics with optimized thermal contact and minimized crystal strain. This novel mechanical design can be applied to new developments in the field of x-ray optics cavities for hard x-ray free-electron laser oscillators (XFELOs); self-seeding monochromators for hard x-ray free-electron lasers (XFELs) with high average thermal loading; high-heat-load diamond crystal monochromators; and beam-sharing/beam-split-and-delay devices for XFEL facilities and the future upgraded high-brightness coherent x-ray source in the MBA lattice configuration at the APS.

  17. Using Technology to Facilitate Collaboration in Community-Based Participatory Research (CBPR)

    PubMed Central

    Jessell, Lauren; Smith, Vivian; Jemal, Alexis; Windsor, Liliane

    2017-01-01

    This study explores the use of Computer-Supported Collaborative Work (CSCW) technologies by way of a computer-based system called iCohere. This system was used to facilitate collaboration in conducting Community-Based Participatory Research (CBPR). Data were gathered from 13 members of a Community Collaborative Board (CCB). Analysis revealed that iCohere served the following functions: facilitating communication, providing a repository for information and resource sharing, and allowing for remote meeting attendance. Results indicated that while iCohere was useful in performing these functions, less expensive technologies had the potential to achieve similar goals if properly implemented. Implications for future research on CSCW systems and CBPR are discussed. PMID:29056871

  18. Optically buffered Jones-matrix-based multifunctional optical coherence tomography with polarization mode dispersion correction

    PubMed Central

    Hong, Young-Joo; Makita, Shuichi; Sugiyama, Satoshi; Yasuno, Yoshiaki

    2014-01-01

    Polarization mode dispersion (PMD) degrades the performance of Jones-matrix-based polarization-sensitive multifunctional optical coherence tomography (JM-OCT). The problem is especially acute for optically buffered JM-OCT, because the long fiber in the optical buffering module induces a large amount of PMD. This paper presents a method to correct the effect of PMD in JM-OCT. We first mathematically model the PMD in JM-OCT and then derive a method to correct it. The method is a combination of a simple hardware modification and subsequent software correction. The hardware modification is the introduction of two polarizers, which transform the PMD into a global complex modulation of the Jones matrix; the software correction then demodulates this global modulation. The method is validated with an experimentally obtained point spread function from a mirror sample, as well as by in vivo measurement of a human retina. PMID:25657888

  19. SSBD: a database of quantitative data of spatiotemporal dynamics of biological phenomena

    PubMed Central

    Tohsato, Yukako; Ho, Kenneth H. L.; Kyoda, Koji; Onami, Shuichi

    2016-01-01

    Motivation: Rapid advances in live-cell imaging analysis and mathematical modeling have produced a large amount of quantitative data on spatiotemporal dynamics of biological objects ranging from molecules to organisms. There is now a crucial need to bring these large amounts of quantitative biological dynamics data together centrally in a coherent and systematic manner. This will facilitate the reuse of this data for further analysis. Results: We have developed the Systems Science of Biological Dynamics database (SSBD) to store and share quantitative biological dynamics data. SSBD currently provides 311 sets of quantitative data for single molecules, nuclei and whole organisms in a wide variety of model organisms from Escherichia coli to Mus musculus. The data are provided in Biological Dynamics Markup Language format and also through a REST API. In addition, SSBD provides 188 sets of time-lapse microscopy images from which the quantitative data were obtained and software tools for data visualization and analysis. Availability and Implementation: SSBD is accessible at http://ssbd.qbic.riken.jp. Contact: sonami@riken.jp PMID:27412095

  20. SSBD: a database of quantitative data of spatiotemporal dynamics of biological phenomena.

    PubMed

    Tohsato, Yukako; Ho, Kenneth H L; Kyoda, Koji; Onami, Shuichi

    2016-11-15

    Rapid advances in live-cell imaging analysis and mathematical modeling have produced a large amount of quantitative data on spatiotemporal dynamics of biological objects ranging from molecules to organisms. There is now a crucial need to bring these large amounts of quantitative biological dynamics data together centrally in a coherent and systematic manner. This will facilitate the reuse of this data for further analysis. We have developed the Systems Science of Biological Dynamics database (SSBD) to store and share quantitative biological dynamics data. SSBD currently provides 311 sets of quantitative data for single molecules, nuclei and whole organisms in a wide variety of model organisms from Escherichia coli to Mus musculus. The data are provided in Biological Dynamics Markup Language format and also through a REST API. In addition, SSBD provides 188 sets of time-lapse microscopy images from which the quantitative data were obtained and software tools for data visualization and analysis. SSBD is accessible at http://ssbd.qbic.riken.jp. Contact: sonami@riken.jp. © The Author 2016. Published by Oxford University Press.

  1. Comment on "High resolution coherence analysis between planetary and climate oscillations"

    NASA Astrophysics Data System (ADS)

    Holm, Sverre

    2018-07-01

    The paper by Scafetta entitled "High resolution coherence analysis between planetary and climate oscillations" (May 2016) claims coherence between planetary movements and the global temperature anomaly. The claim is based on data analysis using the canonical covariance analysis (CCA) estimator for the magnitude squared coherence (MSC). It assumes a model with a predetermined number of sinusoids for the climate data. The results are highly dependent on this prior assumption, and may therefore be criticized for being based on the opposite of a null hypothesis. More importantly, since the values of key parameters in the CCA method are not given, experiments have been performed using the software of the original authors of the CCA estimator. The purpose was to replicate the results of Scafetta using what were perceived to be the most probable parameter values. Despite best efforts, this was not possible.

  2. Automated Change Detection for Synthetic Aperture Sonar

    DTIC Science & Technology

    2014-01-01

    channels, respectively. The canonical coordinates of x and y are defined as $u = F^H R_{xx}^{-1/2} x$ and $v = G^H R_{yy}^{-1/2} y$, where F and G are the mapping matrices…containing the left and right singular vectors of the coherence matrix C, respectively. The canonical coordinate vectors u and v share the diagonal cross…feature set. The coherent change information between canonical coordinates v and u can be calculated using the residual, $v - Ku$, owing to the fact that…
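
    To make the construction above concrete, the following minimal sketch computes canonical coordinates from two zero-mean data channels via whitening and an SVD of the coherence matrix. It is an illustration under assumed NumPy conventions (array shapes and names), not code from the DTIC report.

        import numpy as np

        def canonical_coordinates(x, y):
            """Canonical coordinates of two zero-mean channels.

            x, y: complex arrays of shape (dim, n_samples).
            Returns (u, v, k), where k holds the canonical correlations
            (singular values of the coherence matrix C).
            """
            n = x.shape[1]
            # Sample covariance and cross-covariance matrices.
            Rxx = x @ x.conj().T / n
            Ryy = y @ y.conj().T / n
            Rxy = x @ y.conj().T / n

            def inv_sqrt(R):
                # Inverse matrix square root (assumes full rank).
                w, V = np.linalg.eigh(R)
                return V @ np.diag(1.0 / np.sqrt(w)) @ V.conj().T

            Rxx_is, Ryy_is = inv_sqrt(Rxx), inv_sqrt(Ryy)
            # Coherence matrix; its SVD yields the mapping matrices F and G.
            C = Rxx_is @ Rxy @ Ryy_is
            F, k, Gh = np.linalg.svd(C)
            # u = F^H Rxx^{-1/2} x and v = G^H Ryy^{-1/2} y, as above.
            u = F.conj().T @ Rxx_is @ x
            v = Gh @ Ryy_is @ y
            return u, v, k

    With K = diag(k), the residual v - Ku then isolates the change information that is not coherent between the two channels.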

  3. The [Un]Spoken Challenges of Administrator Collaboration: An Exploration of One District Leadership Team's Use of Protocols to Promote Reflection and Shared Theories of Action

    ERIC Educational Resources Information Center

    Szczesiul, Stacy Agee

    2014-01-01

    This article explores the use of protocol-structured dialogue in promoting reflective practices and shared theories of action within a district leadership team. Protocols have been used to make individuals' theories of action visible and subject to evaluation. This is important for leaders trying to establish coherence across a system; in…

  4. A cache-aided multiprocessor rollback recovery scheme

    NASA Technical Reports Server (NTRS)

    Wu, Kun-Lung; Fuchs, W. Kent

    1989-01-01

    This paper demonstrates how previous uniprocessor cache-aided recovery schemes can be applied to multiprocessor architectures, for recovering from transient processor failures, utilizing private caches and a global shared memory. As with cache-aided uniprocessor recovery, the multiprocessor cache-aided recovery scheme of this paper can be easily integrated into standard bus-based snoopy cache coherence protocols. A consistent shared memory state is maintained without the necessity of global check-pointing.

  5. Development of wide band digital receiver for atmospheric radars using COTS board based SDR

    NASA Astrophysics Data System (ADS)

    Yasodha, Polisetti; Jayaraman, Achuthan; Thriveni, A.

    2016-07-01

    The digital receiver extracts the received echo signal information and is a key subsystem for atmospheric radars, also referred to as wind profiling radars (WPR), which provide vertical profiles of the 3-dimensional wind vector in the atmosphere. This paper presents the development of a digital receiver using a COTS-board-based Software Defined Radio technique, which can be used for atmospheric radars. The developmental work is being carried out at the National Atmospheric Research Laboratory (NARL), Gadanki. The digital receiver consists of a commercially available software defined radio (SDR) board called the universal software radio peripheral B210 (USRP B210) and a personal computer. The USRP B210 operates over a wide frequency range from 70 MHz to 6 GHz and hence can be used for a variety of radars, such as Doppler weather radars operating in S/C bands, in addition to wind profiling radars operating in VHF, UHF and L bands. Owing to the flexibility and re-configurability of SDR, where the component functionalities are implemented in software, it is easy to modify the software to receive the echoes and process them as required for the intended type of radar. Hence, the USRP B210 board along with the computer forms a versatile digital receiver from 70 MHz to 6 GHz. It has an inbuilt direct-conversion transceiver with two transmit and two receive channels, which can be operated in a fully coherent 2x2 MIMO fashion, so it can be used as a two-channel receiver. Multiple USRP B210 boards can be synchronized using the pulse per second (PPS) input provided on the board to configure a multi-channel digital receiver system. The RF gain of the transceiver can be varied from 0 to 70 dB. The board is controlled from the computer via a USB 3.0 interface through the USRP hardware driver (UHD), an open source cross-platform driver. A reference (10 MHz) clock signal from the radar master oscillator is used to lock the board, which is essential for deriving Doppler information. Input from the radar analog receiver is given to one channel of the USRP B210, which down-converts it to baseband. A 12-bit ADC on the board digitizes the signal and produces I (in-phase) and Q (quadrature-phase) data; the maximum possible sampling rate is about 61 MSPS. The I and Q (time series) data are sent to the PC via USB 3.0, where the signal processing is carried out. The online processing steps include decimation, range gating, decoding, coherent integration and FFT computation (optional). The processed data are then stored on the hard disk. The C++ programming language is used for developing the real-time signal processing, with shared memory and multithreading used to collect and process data simultaneously. Before implementing the real-time operation, a stand-alone test of the board was carried out with the GNU Radio software, and the baseband output data obtained were found satisfactory. The board was then integrated with the existing Lower Atmospheric Wind Profiling radar at NARL: the radar receive IF output at 70 MHz was given to the board and real-time radar data were collected. The data were processed off-line and the range-Doppler spectrum obtained. Development of the online processing software is in progress.
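
    As an illustration of the online processing chain described above, here is a minimal sketch of coherent integration followed by an FFT-based Doppler spectrum on I/Q pulse data. The array shapes and NumPy usage are assumptions for illustration; this is not NARL's actual C++ implementation.

        import numpy as np

        def doppler_spectrum(iq, n_coh, nfft=256):
            """Coherently integrate I/Q pulses, then FFT along slow time.

            iq: complex array of shape (n_pulses, n_range_gates).
            n_coh: pulses summed coherently per group (raises SNR for
                   echoes within the Nyquist band while cutting data rate).
            """
            n_pulses, n_gates = iq.shape
            n_groups = n_pulses // n_coh
            # Coherent integration: complex sum over each group of pulses.
            grouped = iq[:n_groups * n_coh].reshape(n_groups, n_coh, n_gates)
            integrated = grouped.sum(axis=1)
            # Doppler spectrum: FFT across pulses for every range gate.
            spec = np.fft.fftshift(np.fft.fft(integrated, n=nfft, axis=0), axes=0)
            return np.abs(spec) ** 2  # power per (Doppler bin, range gate)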

  6. What Librarians Still Don't Know about Free Software

    ERIC Educational Resources Information Center

    Chudnov, Daniel

    2009-01-01

    Free software isn't about cost, it isn't about hype, and it isn't about taking business away from vendors. It's about four kinds of freedom: the freedom to use the software for any purpose, the freedom to study how the software works, the freedom to modify the software to adapt it to one's needs, and the freedom to copy and share copies of the…

  7. The Value of Open Source Software Tools in Qualitative Research

    ERIC Educational Resources Information Center

    Greenberg, Gary

    2011-01-01

    In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…

  8. IHE cross-enterprise document sharing for imaging: interoperability testing software

    PubMed Central

    2010-01-01

    Background With the deployment of Electronic Health Records (EHR), interoperability testing in healthcare is becoming crucial. EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems. Interoperability between peers is therefore essential. Achieving interoperability requires various types of testing. Implementations need to be tested using software that simulates communication partners and that provides test data and test plans. Results In this paper we describe software that is used to test systems involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross Enterprise Document Sharing for imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the chosen design solutions. Conclusions EHR is being deployed in several countries. The EHR infrastructure will continuously evolve to embrace advances in the information technology domain. Our software is built on a web framework to allow for easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations. It can also be used by site integrators to verify and test the interoperability of systems, by developers to understand specification ambiguities, or to resolve implementation difficulties. PMID:20858241

  9. IHE cross-enterprise document sharing for imaging: interoperability testing software.

    PubMed

    Noumeir, Rita; Renaud, Bérubé

    2010-09-21

    With the deployment of Electronic Health Records (EHR), interoperability testing in healthcare is becoming crucial. EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems. Interoperability between peers is therefore essential. Achieving interoperability requires various types of testing. Implementations need to be tested using software that simulates communication partners and that provides test data and test plans. In this paper we describe software that is used to test systems involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross Enterprise Document Sharing for imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the chosen design solutions. EHR is being deployed in several countries. The EHR infrastructure will continuously evolve to embrace advances in the information technology domain. Our software is built on a web framework to allow for easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations. It can also be used by site integrators to verify and test the interoperability of systems, by developers to understand specification ambiguities, or to resolve implementation difficulties.

  10. 10 CFR 603.550 - Acceptability of intellectual property.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... property (e.g., copyrighted material, including software) as cost sharing because: (1) It is difficult to... the contribution. For example, a for-profit firm may offer the use of commercially available software... the software would not be a reasonable basis for valuing its use. ...

  11. 10 CFR 603.550 - Acceptability of intellectual property.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... property (e.g., copyrighted material, including software) as cost sharing because: (1) It is difficult to... the contribution. For example, a for-profit firm may offer the use of commercially available software... the software would not be a reasonable basis for valuing its use. ...

  12. Collaborative Data Publication Utilizing the Open Data Repository's (ODR) Data Publisher

    NASA Technical Reports Server (NTRS)

    Stone, N.; Lafuente, B.; Bristow, T.; Keller, R. M.; Downs, R. T.; Blake, D.; Fonda, M.; Dateo, C.; Pires, A.

    2017-01-01

    Introduction: For small communities in diverse fields such as astrobiology, publishing and sharing data can be a difficult challenge. While large, homogeneous fields often have repositories and existing data standards, small groups of independent researchers have few options for publishing standards and data that can be utilized within their community. In conjunction with teams at NASA Ames and the University of Arizona, the Open Data Repository (ODR) team has been conducting ongoing pilots with its Data Publisher to assess the needs of diverse research groups and to develop software that allows them to publish and share their data collaboratively. Objectives: The ODR's Data Publisher aims to provide an easy-to-use and easy-to-implement software tool that allows researchers to create and publish database templates and related data. The end product will facilitate both human-readable interfaces (web-based with embedded images, files, and charts) and machine-readable interfaces utilizing semantic standards. Characteristics: The Data Publisher software runs on the standard LAMP (Linux, Apache, MySQL, PHP) stack to provide the widest server base available. The software is based on Symfony (www.symfony.com), which provides a robust framework for creating extensible, object-oriented software in PHP. The software interface consists of a template designer where individual or master database templates can be created. A master database template can be shared by many researchers to provide a common metadata standard that sets a compatibility standard for all derivative databases. Individual researchers can then extend their instance of the template with custom fields, file storage, or visualizations that may be unique to their studies. This allows groups to create compatible databases for data discovery and sharing purposes while still providing the flexibility needed to meet the needs of scientists in rapidly evolving areas of research. Research: As part of this effort, a number of pilot and test projects are currently in progress. The Astrobiology Habitable Environments Database Working Group is developing a shared database standard using the ODR's Data Publisher and has a number of example databases where astrobiology data are shared. Soon these databases will be integrated via the template-based standard. Work with this group helps determine what data researchers in these diverse fields need to share and archive. Additionally, this pilot helps determine what standards are viable for sharing these types of data, ranging from internally developed standards to existing open standards such as the Dublin Core (http://dublincore.org) and Darwin Core (http://rs.tdwg.org) metadata standards. Further studies are ongoing with the University of Arizona Department of Geosciences, where a number of mineralogy databases are being constructed within the ODR Data Publisher system. Conclusions: Through the ongoing pilots and discussions with individual researchers and small research teams, a definition of the tools desired by these groups is coming into focus. As the software development moves forward, the goal is to meet the publication and collaboration needs of these scientists in an unobtrusive and functional way.

  13. Specific electrophysiological components disentangle affective sharing and empathic concern in psychopathy.

    PubMed

    Decety, Jean; Lewis, Kimberly L; Cowell, Jason M

    2015-07-01

    Empathic impairment is one of the hallmarks of psychopathy, a personality dimension associated with poverty in affective reactions, lack of attachment to others, and a callous disregard for the feelings, rights, and welfare of others. Neuroscience research on the relation between empathy and psychopathy has predominately focused on the affective sharing and cognitive components of empathy in forensic populations, and much less on empathic concern. The current study used high-density electroencephalography in a community sample to examine the spatiotemporal neurodynamic responses when viewing people in physical distress under two subjective contexts: one evoking affective sharing, the other, empathic concern. Results indicate that early automatic (175-275 ms) and later controlled responses (LPP 400-1,000 ms) were differentially modulated by engagement in affective sharing or empathic concern. Importantly, the late event-related potentials (ERP) component was significantly impacted by dispositional empathy and psychopathy, but the early component was not. Individual differences in dispositional empathic concern directly predicted gamma coherence (25-40 Hz), whereas psychopathy was inversely modulatory. Interestingly, significant suppression in the mu/alpha band (8-13 Hz) when perceiving others in distress was positively associated with higher trait psychopathy, which argues against the assumption that sensorimotor resonance underpins empathy. Greater scores on trait psychopathy were inversely related to subjective ratings of both empathic concern and affective sharing. Overall, the study demonstrates that neural markers of affective sharing and empathic concern to the same cues of another's distress can be distinguished at an electrophysiological level, and that psychopathy alters later time-locked differentiations and spectral coherence associated with empathic concern. Copyright © 2015 the American Physiological Society.

  14. Specific electrophysiological components disentangle affective sharing and empathic concern in psychopathy

    PubMed Central

    Lewis, Kimberly L.; Cowell, Jason M.

    2015-01-01

    Empathic impairment is one of the hallmarks of psychopathy, a personality dimension associated with poverty in affective reactions, lack of attachment to others, and a callous disregard for the feelings, rights, and welfare of others. Neuroscience research on the relation between empathy and psychopathy has predominately focused on the affective sharing and cognitive components of empathy in forensic populations, and much less on empathic concern. The current study used high-density electroencephalography in a community sample to examine the spatiotemporal neurodynamic responses when viewing people in physical distress under two subjective contexts: one evoking affective sharing, the other, empathic concern. Results indicate that early automatic (175–275 ms) and later controlled responses (LPP 400–1,000 ms) were differentially modulated by engagement in affective sharing or empathic concern. Importantly, the late event-related potentials (ERP) component was significantly impacted by dispositional empathy and psychopathy, but the early component was not. Individual differences in dispositional empathic concern directly predicted gamma coherence (25–40 Hz), whereas psychopathy was inversely modulatory. Interestingly, significant suppression in the mu/alpha band (8–13 Hz) when perceiving others in distress was positively associated with higher trait psychopathy, which argues against the assumption that sensorimotor resonance underpins empathy. Greater scores on trait psychopathy were inversely related to subjective ratings of both empathic concern and affective sharing. Overall, the study demonstrates that neural markers of affective sharing and empathic concern to the same cues of another's distress can be distinguished at an electrophysiological level, and that psychopathy alters later time-locked differentiations and spectral coherence associated with empathic concern. PMID:25948868

  15. GUIdock-VNC: using a graphical desktop sharing system to provide a browser-based interface for containerized software

    PubMed Central

    Mittal, Varun; Hung, Ling-Hong; Keswani, Jayant; Kristiyanto, Daniel; Lee, Sung Bong

    2017-01-01

    Abstract Background: Software container technology such as Docker can be used to package and distribute bioinformatics workflows consisting of multiple software implementations and dependencies. However, Docker is a command line–based tool, and many bioinformatics pipelines consist of components that require a graphical user interface. Results: We present a container tool called GUIdock-VNC that uses a graphical desktop sharing system to provide a browser-based interface for containerized software. GUIdock-VNC uses the Virtual Network Computing protocol to render the graphics within most commonly used browsers. We also present a minimal image builder that can add our proposed graphical desktop sharing system to any Docker packages, with the end result that any Docker packages can be run using a graphical desktop within a browser. In addition, GUIdock-VNC uses the Oauth2 authentication protocols when deployed on the cloud. Conclusions: As a proof-of-concept, we demonstrated the utility of GUIdock-noVNC in gene network inference. We benchmarked our container implementation on various operating systems and showed that our solution creates minimal overhead. PMID:28327936

  16. GUIdock-VNC: using a graphical desktop sharing system to provide a browser-based interface for containerized software.

    PubMed

    Mittal, Varun; Hung, Ling-Hong; Keswani, Jayant; Kristiyanto, Daniel; Lee, Sung Bong; Yeung, Ka Yee

    2017-04-01

    Software container technology such as Docker can be used to package and distribute bioinformatics workflows consisting of multiple software implementations and dependencies. However, Docker is a command line-based tool, and many bioinformatics pipelines consist of components that require a graphical user interface. We present a container tool called GUIdock-VNC that uses a graphical desktop sharing system to provide a browser-based interface for containerized software. GUIdock-VNC uses the Virtual Network Computing protocol to render the graphics within most commonly used browsers. We also present a minimal image builder that can add our proposed graphical desktop sharing system to any Docker packages, with the end result that any Docker packages can be run using a graphical desktop within a browser. In addition, GUIdock-VNC uses the Oauth2 authentication protocols when deployed on the cloud. As a proof-of-concept, we demonstrated the utility of GUIdock-noVNC in gene network inference. We benchmarked our container implementation on various operating systems and showed that our solution creates minimal overhead. © The Authors 2017. Published by Oxford University Press.

  17. Coherent fiber supercontinuum for biophotonics

    PubMed Central

    Tu, Haohua; Boppart, Stephen A.

    2013-01-01

    Biophotonics and nonlinear fiber optics have traditionally been two independent fields. Since the discovery of fiber-based supercontinuum generation in 1999, biophotonics applications employing incoherent light have experienced a large impact from nonlinear fiber optics, primarily because of the access to a wide range of wavelengths and a uniform spatial profile afforded by fiber supercontinuum. However, biophotonics applications employing coherent light have not benefited from the most well-known techniques of supercontinuum generation for reasons such as poor coherence (or high noise), insufficient controllability, and inadequate portability. Fortunately, a few key techniques involving nonlinear fiber optics and femtosecond laser development have emerged to overcome these critical limitations. Despite their relative independence, these techniques are the focus of this review, because they can be integrated into a low-cost portable biophotonics source platform. This platform can be shared across many different areas of research in biophotonics, enabling new applications such as point-of-care coherent optical biomedical imaging. PMID:24358056

  18. Spatial smoothing coherence factor for ultrasound computed tomography

    NASA Astrophysics Data System (ADS)

    Lou, Cuijuan; Xu, Mengling; Ding, Mingyue; Yuchi, Ming

    2016-04-01

    In recent years, many research studies have been carried out on ultrasound computed tomography (USCT) because of its application prospects in the early diagnosis of breast cancer. This paper applies four kinds of coherence-factor-like beamforming methods to improve the image quality of the synthetic aperture focusing method for USCT: the coherence factor (CF), the phase coherence factor (PCF), the sign coherence factor (SCF) and the spatial smoothing coherence factor (SSCF) (proposed in our previous work). The performance of these methods was tested with simulated raw data generated by the ultrasound simulation software PZFlex 2014. The simulated phantom was water of 4 cm diameter with three nylon objects of different diameters inside. The ring-type transducer had 72 elements with a center frequency of 1 MHz. The results show that all the methods can reveal the largest nylon circle, with a radius of 2.5 mm. SSCF achieves the highest SNR among the tested methods and provides a more homogeneous background. None of the methods can reveal the two smaller nylon circles, with radii of 0.75 mm and 0.25 mm; this may be due to the small number of elements.
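
    For background, the baseline coherence factor used by such methods is the ratio of coherent to incoherent energy across the receive aperture (the CF of Mallart and Fink); PCF, SCF and SSCF are refinements of this idea. A minimal sketch of the baseline CF, assuming NumPy and delay-compensated per-element data (not the paper's PZFlex pipeline), follows.

        import numpy as np

        def coherence_factor(channel_data):
            """Baseline coherence factor per image sample.

            channel_data: array of shape (n_elements, n_samples) holding
            delay-compensated per-element signals. Returns CF in [0, 1];
            1 means the channels sum fully coherently.
            """
            n = channel_data.shape[0]
            coherent = np.abs(channel_data.sum(axis=0)) ** 2
            incoherent = n * (np.abs(channel_data) ** 2).sum(axis=0)
            return np.divide(coherent, incoherent,
                             out=np.zeros_like(coherent),
                             where=incoherent > 0)

    The CF-weighted image is the delay-and-sum output multiplied sample-by-sample by this factor, which suppresses regions where the aperture signals disagree.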

  19. Proposal for constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.

    1990-01-01

    Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.

  20. Platform-Independent Cirrus and Spectralis Thickness Measurements in Eyes with Diabetic Macular Edema Using Fully Automated Software

    PubMed Central

    Willoughby, Alex S.; Chiu, Stephanie J.; Silverman, Rachel K.; Farsiu, Sina; Bailey, Clare; Wiley, Henry E.; Ferris, Frederick L.; Jaffe, Glenn J.

    2017-01-01

    Purpose We determine whether the automated segmentation software, Duke Optical Coherence Tomography Retinal Analysis Program (DOCTRAP), can measure, in a platform-independent manner, retinal thickness on Cirrus and Spectralis spectral domain optical coherence tomography (SD-OCT) images in eyes with diabetic macular edema (DME) under treatment in a clinical trial. Methods Automatic segmentation software was used to segment the internal limiting membrane (ILM), inner retinal pigment epithelium (RPE), and Bruch's membrane (BM) in SD-OCT images acquired by Cirrus and Spectralis commercial systems, from the same eye, on the same day during a clinical interventional DME trial. Mean retinal thickness differences were compared across commercial and DOCTRAP platforms using intraclass correlation (ICC) and Bland-Altman plots. Results The mean 1 mm central subfield thickness difference (standard error [SE]) comparing segmentation of Spectralis images with DOCTRAP versus HEYEX was 0.7 (0.3) μm (0.2 pixels). The corresponding value comparing segmentation of Cirrus images with DOCTRAP versus Cirrus software was 2.2 (0.7) μm. The mean 1 mm central subfield thickness difference (SE) comparing segmentation of Cirrus and Spectralis scan pairs with DOCTRAP using BM as the outer retinal boundary was −2.3 (0.9) μm, compared to 2.8 (0.9) μm with inner RPE as the outer boundary. Conclusions DOCTRAP segmentation of Cirrus and Spectralis images produces validated thickness measurements that are very similar to each other and to the values generated by the corresponding commercial software in eyes with treated DME. Translational Relevance This software enables automatic total retinal thickness measurements across two OCT platforms, a process that is impractical to perform manually. PMID:28180033
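
    For readers unfamiliar with the agreement statistics used above, the sketch below computes the Bland-Altman bias and 95% limits of agreement for paired thickness measurements; the numeric values are invented purely for illustration and are not data from the study.

        import numpy as np

        def bland_altman(a, b):
            """Bland-Altman agreement statistics for paired measurements.

            Returns the mean difference (bias) and the 95% limits of
            agreement, bias +/- 1.96 standard deviations of the differences.
            """
            diff = np.asarray(a, float) - np.asarray(b, float)
            bias = diff.mean()
            sd = diff.std(ddof=1)
            return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

        # Hypothetical central subfield thicknesses (micrometers) from two
        # OCT platforms for the same five eyes.
        cirrus = [250.0, 310.5, 298.0, 275.2, 330.1]
        spectralis = [252.1, 308.9, 301.4, 273.8, 333.0]
        print(bland_altman(cirrus, spectralis))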

  1. Development of a Dynamic Time Sharing Scheduled Environment Final Report CRADA No. TC-824-94E

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jette, M.; Caliga, D.

    Massively parallel computers, such as the Cray T3D, have historically supported resource sharing solely through space sharing, in which multiple problems are solved by executing them on distinct processors. This project developed a dynamic time- and space-sharing scheduler to achieve greater interactivity and throughput than could be achieved with space sharing alone. CRI and LLNL worked together on the design, testing, and review aspects of this project, with separate software deliverables: CRI implemented a general-purpose scheduling system per the design specifications, and LLNL ported its local gang scheduler software to the LLNL Cray T3D. In this approach, processors are allocated simultaneously to all components of a parallel program (in a "gang"). Program execution is preempted as needed to provide for interactivity, and programs are relocated to different processors as needed to efficiently pack the computer's torus of processors. In phase one, CRI developed, after discussions with LLNL, an interface specification for system-level software supporting a time- and space-sharing environment on the LLNL T3D. The two parties also discussed interface specifications for external control tools (such as scheduling policy tools and system administration tools) and applications programs. CRI assumed responsibility for writing and implementing all the necessary system software in this phase. In phase two, CRI implemented job-rolling on the Cray T3D, a mechanism for preempting a program, saving its state to disk, and later restoring its state to memory for continued execution. LLNL ported its gang scheduler to the LLNL T3D utilizing the CRI interface implemented in phases one and two. During phase three, the functionality and effectiveness of the LLNL gang scheduler were assessed to provide input to CRI's time- and space-sharing efforts. CRI will utilize this information in the development of general schedulers suitable for other sites and future architectures.
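
    As a conceptual illustration of gang scheduling (all-or-nothing processor allocation per time slice, with preemption between slices), here is a toy Python sketch. It is a deliberately simplified model for illustration, not the LLNL or CRI scheduler, and it ignores job-rolling and torus packing.

        from dataclasses import dataclass

        @dataclass
        class Job:
            name: str
            procs: int  # processors the whole gang needs at once

        def gang_schedule(jobs, total_procs, n_slices=4):
            """Round-robin gang time slicing: each slice runs whole gangs
            only; jobs that do not fit are preempted to a later slice.
            (A job larger than total_procs would never fit in this toy.)"""
            pending = list(jobs)
            slices = []
            for _ in range(n_slices):
                free, running, deferred = total_procs, [], []
                for job in pending:
                    if job.procs <= free:
                        running.append(job)      # allocate the full gang
                        free -= job.procs
                    else:
                        deferred.append(job)     # wait for a later slice
                slices.append([j.name for j in running])
                pending = deferred + running     # deferred jobs go first
            return slices

        print(gang_schedule([Job("a", 6), Job("b", 4), Job("c", 3)], 8))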

  2. Superconducting Microwave Multivibrator Produced by Coherent Feedback

    NASA Astrophysics Data System (ADS)

    Kerckhoff, Joseph; Lehnert, K. W.

    2012-10-01

    We investigate a nonlinear coherent feedback circuit constructed from preexisting superconducting microwave devices. The network exhibits emergent bistable and astable states, and we demonstrate its operation as a latch and the frequency locking of its oscillations. While the network is tedious to model by hand, our observations agree quite well with the semiclassical dynamical model produced by a new software package (N. Tezak et al., arXiv:1111.3081v1 [Phil. Trans. R. Soc. A (to be published)]) that systematically interpreted an idealized schematic of the system as a quantum optic feedback network.

  3. Shared-hole graph search with adaptive constraints for 3D optic nerve head optical coherence tomography image segmentation

    PubMed Central

    Yu, Kai; Shi, Fei; Gao, Enting; Zhu, Weifang; Chen, Haoyu; Chen, Xinjian

    2018-01-01

    The optic nerve head (ONH) is a crucial region for glaucoma detection and tracking based on spectral domain optical coherence tomography (SD-OCT) images. In this region, the existence of a "hole" structure makes retinal layer segmentation and analysis very challenging. To improve retinal layer segmentation, we propose a 3D method for ONH-centered SD-OCT image segmentation, which is based on a modified graph search algorithm with shared-hole and locally adaptive constraints. With the proposed method, both the optic disc boundary and nine retinal surfaces can be accurately segmented in SD-OCT images. An overall mean unsigned border positioning error of 7.27 ± 5.40 µm was achieved for layer segmentation, and a mean Dice coefficient of 0.925 ± 0.03 was achieved for optic disc region detection. PMID:29541497
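
    The paper's full 3D shared-hole graph search is beyond a short excerpt, but the core idea of extracting a layer boundary as a minimum-cost path through a cost image can be sketched in a few lines. The 2D dynamic-programming version below is a simplified stand-in under assumed conventions (NumPy, rows = depth, columns = A-scans), not the authors' algorithm.

        import numpy as np

        def trace_boundary(cost, max_jump=1):
            """Minimum-total-cost left-to-right path through a cost image.

            cost: 2D array; low values where a layer boundary is likely
            (e.g., a gradient-based cost). max_jump limits the row change
            between neighboring columns (a smoothness constraint).
            """
            n_rows, n_cols = cost.shape
            acc = cost.astype(float).copy()
            back = np.zeros((n_rows, n_cols), dtype=int)
            for c in range(1, n_cols):
                for r in range(n_rows):
                    lo = max(0, r - max_jump)
                    hi = min(n_rows, r + max_jump + 1)
                    k = int(np.argmin(acc[lo:hi, c - 1]))
                    acc[r, c] = cost[r, c] + acc[lo + k, c - 1]
                    back[r, c] = lo + k
            # Backtrack from the cheapest endpoint in the last column.
            rows = [int(np.argmin(acc[:, -1]))]
            for c in range(n_cols - 1, 0, -1):
                rows.append(back[rows[-1], c])
            return rows[::-1]  # boundary row index for each column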

  4. Quantum-noise randomized data encryption for wavelength-division-multiplexed fiber-optic networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corndorf, Eric; Liang, Chuang; Kanter, Gregory S.

    2005-06-15

    We demonstrate high-rate randomized data-encryption through optical fibers using the inherent quantum-measurement noise of coherent states of light. Specifically, we demonstrate 650 Mbit/s data encryption through a 10 Gbit/s data-bearing, in-line amplified 200-km-long line. In our protocol, legitimate users (who share a short secret key) communicate using an M-ry signal set while an attacker (who does not share the secret key) is forced to contend with the fundamental and irreducible quantum-measurement noise of coherent states. Implementations of our protocol using both polarization-encoded signal sets as well as polarization-insensitive phase-keyed signal sets are experimentally and theoretically evaluated. Different from the performance criteria for the cryptographic objective of key generation (quantum key-generation), one possible set of performance criteria for the cryptographic objective of data encryption is established and carefully considered.
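
    The security argument above can be illustrated numerically: a legitimate receiver who knows the key removes the basis offset and faces a simple binary decision, while an eavesdropper without the key faces closely spaced states masked by measurement noise. The toy simulation below (arbitrary parameters, classical Gaussian noise standing in for quantum noise) is a sketch of the principle only, not the optical experiment.

        import numpy as np

        rng = np.random.default_rng(0)
        M, n = 64, 100_000            # bases in the M-ry set, symbols sent
        sigma = 2 * np.pi / M         # noise comparable to basis spacing
        key = rng.integers(0, M, n)   # shared secret basis choices
        bits = rng.integers(0, 2, n)  # data bits
        # Transmitted phase: key-dependent offset plus 0 or pi for the bit.
        phase = 2 * np.pi * key / M + np.pi * bits
        rx = phase + rng.normal(0, sigma, n)   # measurement noise

        # Legitimate receiver: subtract the known offset, then threshold.
        demod = np.mod(rx - 2 * np.pi * key / M, 2 * np.pi)
        bits_rx = (np.abs(demod - np.pi) < np.pi / 2).astype(int)
        print("legitimate BER:", np.mean(bits_rx != bits))   # ~ 0

        # Eavesdropper: no key; raw-phase thresholding is near random
        # because neighboring states of either bit value overlap.
        eve = (np.mod(rx, 2 * np.pi) > np.pi).astype(int)
        print("eavesdropper BER:", np.mean(eve != bits))     # ~ 0.5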

  5. A fast bottom-up algorithm for computing the cut sets of noncoherent fault trees

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corynen, G.C.

    1987-11-01

    An efficient procedure for finding the cut sets of large fault trees has been developed. Designed to address coherent or noncoherent systems, dependent events, and shared or common-cause events, the method, called SHORTCUT, is based on a fast algorithm for transforming a noncoherent tree into a quasi-coherent tree (COHERE), and on a new algorithm for reducing cut sets (SUBSET). To assure sufficient clarity and precision, the procedure is discussed in the language of simple sets, which is also developed in this report. Although the new method has not yet been fully implemented on the computer, we report theoretical worst-case estimates of its computational complexity. 12 refs., 10 figs.
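
    The report's SUBSET algorithm is not reproduced here, but the basic reduction step it addresses, discarding any cut set that contains another so that only minimal cut sets remain, can be sketched directly; the Python set representation is an assumption for illustration.

        def reduce_cut_sets(cut_sets):
            """Keep only minimal cut sets: drop any set that is a
            superset of (or equal to) another kept set."""
            candidates = sorted((frozenset(c) for c in cut_sets), key=len)
            minimal = []
            for c in candidates:
                # c is redundant if some kept set is contained in it.
                if not any(m <= c for m in minimal):
                    minimal.append(c)
            return minimal

        # {A} makes {A, B} redundant; the duplicate {B, C} collapses.
        print(reduce_cut_sets([{"A"}, {"A", "B"}, {"B", "C"}, {"C", "B"}]))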

  6. Knowledge Sharing through Pair Programming in Learning Environments: An Empirical Study

    ERIC Educational Resources Information Center

    Kavitha, R. K.; Ahmed, M. S.

    2015-01-01

    Agile software development is an iterative and incremental methodology, where solutions evolve from self-organizing, cross-functional teams. Pair programming is a type of agile software development technique where two programmers work together with one computer for developing software. This paper reports the results of the pair programming…

  7. The relationship of global form and motion detection to reading fluency.

    PubMed

    Englund, Julia A; Palomares, Melanie

    2012-08-15

    Visual motion processing in typical and atypical readers has suggested that aspects of reading and motion processing share a common cortical network rooted in dorsal visual areas. Few studies have examined the relationship between reading performance and visual form processing, which is mediated by ventral cortical areas. We investigated whether reading fluency correlates with coherent motion detection thresholds in typically developing children using random dot kinematograms. As a comparison, we also evaluated the correlation between reading fluency and static form detection thresholds. Results show that both dorsal and ventral visual functions correlated with components of reading fluency, but that they have different developmental characteristics. Motion coherence thresholds correlated with reading rate and accuracy, which both improved with chronological age. Interestingly, when controlling for non-verbal abilities and age, reading accuracy significantly correlated with thresholds for coherent form detection but not coherent motion detection in typically developing children. Dorsal visual functions that mediate motion coherence seem to be related to the maturation of broad cognitive functions, including non-verbal abilities and reading fluency. However, ventral visual functions that mediate form coherence seem to be specifically related to accurate reading in typically developing children. Copyright © 2012 Elsevier Ltd. All rights reserved.

  8. Interocular suppression in amblyopia for global orientation processing.

    PubMed

    Zhou, Jiawei; Huang, Pi-Chun; Hess, Robert F

    2013-04-22

    We developed a dichoptic global orientation coherence paradigm to quantify interocular suppression in amblyopia. This task is biased towards ventral processing and allows comparison with two other techniques: global motion processing, which is more dorsally biased, and binocular phase combination, which most likely reflects striate function. We found a similar pattern for the relationship between coherence threshold and interocular contrast curves (thresholds vs. interocular contrast ratios, or TvRs) in our new paradigm compared with those of the previous dichoptic global motion coherence paradigm. The effective contrast ratios at the balance point (where the signals from the two eyes have equal weighting) in our new paradigm were larger than those of the dichoptic global motion coherence paradigm but less than those of the binocular phase combination paradigm. The measured effective contrast ratios in the three paradigms were also positively correlated with each other, with the two global coherence paradigms having the highest correlation. We concluded that: (a) the dichoptic global orientation coherence paradigm is effective in quantifying interocular suppression in amblyopia; and (b) interocular suppression, while sharing a common suppression mechanism at an early stage in the pathway (e.g., striate cortex), may have additional extra-striate contributions that affect dorsal and ventral streams differentially.

  9. Expanding Software Productivity and Power while Reducing Costs.

    ERIC Educational Resources Information Center

    Winer, Ellen N.

    1988-01-01

    Microcomputer efficiency and software economy can be achieved through file transfer and data sharing. Costs can be reduced by purchasing computer systems that allow for expansion and portability of data. (MLF)

  10. Multimedia Shared Stories: Teaching Literacy Skills to Diverse Learners

    ERIC Educational Resources Information Center

    Rivera, Christopher J.

    2013-01-01

    Through research, shared stories have demonstrated their effectiveness in teaching literacy skills to students with disabilities, including students who are culturally and linguistically diverse. In an effort to keep pace with ever-changing technology, shared stories can be transformed into a multimedia experience using software that is commonly…

  11. The effects of a shared, Intranet science learning environment on the academic behaviors of problem-solving and metacognitive reflection

    NASA Astrophysics Data System (ADS)

    Parker, Mary Jo

    This study investigated the effects of a shared, Intranet science environment on the academic behaviors of problem-solving and metacognitive reflection. The 78 subjects included 9th and 10th grade male and female biology students. A quasi-experimental design was used, with pre- and post-test data collection and randomization occurring through assignment of biology classes to traditional or shared, Intranet learning groups. Pilot, web-based distance education software (CourseInfo) created the Intranet learning environment. A modified ecology curriculum provided contextualization and content for the traditional and shared learning environments. The effect of this environment on problem-solving was measured using the standardized Watson-Glaser Critical Thinking Appraisal test. Metacognitive reflection was measured in three ways: (a) number of concepts used, (b) number of concept links noted, and (c) number of concept nodes noted. Visual learning software, Inspiration, generated the concept maps. Secondary research questions evaluated the pilot CourseInfo software for (a) tracked user movement, (b) discussion forum findings, and (c) difficulties experienced using CourseInfo software. Analysis of group means for problem-solving showed no significant effect of the shared, Intranet environment, although a paired t-test of individual differences in problem-solving did reach significance. Analysis of metacognitive reflection by number of concepts reached significance, as did metacognitive reflection by number of concept links; no significance was found for metacognitive reflection by number of concept nodes. No gender differences in problem-solving ability or metacognitive reflection emerged. The lack of gender differences in the shared, Intranet environment strongly suggests an equalizing effect due to the cooperative, collaborative nature of Intranet environments; such environments appeal to, and rank high with, female students. Tracking learner movements in web-based science environments has implications for learners' metacognition and problem-solving. CourseInfo software offers one method of informing instruction within web-based learning environments focused on academic behaviors. A shared, technology-supported learning environment may pose one model that science classrooms can use to create equitable scientific study across gender, and the pattern of results presents one model for improving individual problem-solving ability and metacognitive reflection across gender.

  12. Infusing Reliability Techniques into Software Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  13. The UNIX Operating System: A Model for Software Design.

    ERIC Educational Resources Information Center

    Kernighan, Brian W.; Morgan, Samuel P.

    1982-01-01

    Describes UNIX time-sharing operating system, including the program environment, software development tools, flexibility and ease of change, portability and other advantages, and five applications and three nonapplications of the system. (JN)

  14. The CECAM Electronic Structure Library: community-driven development of software libraries for electronic structure simulations

    NASA Astrophysics Data System (ADS)

    Oliveira, Micael

    The CECAM Electronic Structure Library (ESL) is a community-driven effort to segregate shared pieces of software as libraries that can be contributed to and used by the community. Besides allowing the community to share the burden of developing and maintaining complex pieces of software, these libraries can also become targets for re-coding by software engineers as hardware evolves, ensuring that electronic structure codes remain at the forefront of HPC trends. In a series of workshops hosted at the CECAM HQ in Lausanne, the tools and infrastructure for the project were prepared, and the first contributions were included and made available online (http://esl.cecam.org). In this talk I will present the different aspects and aims of the ESL and how they can be useful for the electronic structure community.

  15. Building the Scientific Modeling Assistant: An interactive environment for specialized software design

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    The construction of scientific software models is an integral part of doing science, both within NASA and within the scientific community at large. Typically, model-building is a time-intensive and painstaking process, involving the design of very large, complex computer programs. Despite the considerable expenditure of resources involved, completed scientific models cannot easily be distributed and shared with the larger scientific community due to the low-level, idiosyncratic nature of the implemented code. To address this problem, we have initiated a research project aimed at constructing a software tool called the Scientific Modeling Assistant. This tool provides automated assistance to the scientist in developing, using, and sharing software models. We describe the Scientific Modeling Assistant, and also touch on some human-machine interaction issues relevant to building a successful tool of this type.

  16. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1993-01-01

    Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a testbed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  17. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1992-01-01

    Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a test bed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  18. Resource Sharing of Micro-Software, or, What Ever Happened to All That CP/M Compatibility?

    ERIC Educational Resources Information Center

    DeYoung, Barbara

    1984-01-01

    Explores incompatible operating systems as the basic reason why software packages will not work on different microcomputers; defines operating system; explores compatibility issues surrounding the IBM MS-DOS; and presents two future trends in hardware and software developments which indicate a return to true compatibility. (Author/MBR)

  19. PLUME and research software

    NASA Astrophysics Data System (ADS)

    Baudin, Veronique; Gomez-Diaz, Teresa

    2013-04-01

    The PLUME open platform (https://www.projet-plume.org) has as its first goal to share competences and promote the knowledge of software experts within the French higher education and research communities. The platform provides access to more than 380 index cards describing useful and economical software for this community, with open access to everybody. The second goal of PLUME is to improve the visibility of software produced by research laboratories within the higher education and research communities. The "development-ESR" index cards briefly describe the main features of such software, including references to associated research publications. The platform counts more than 300 cards describing research software, of which 89 have an English version. In this talk we describe the theme classification and taxonomy of the index cards and their evolution as new themes are added to the project. We also discuss the organisation of PLUME as an open project and its interest in promoting free/open-source software from and for research, contributing to the creation of a community of shared knowledge.

  20. CmapTools: A Software Environment for Knowledge Modeling and Sharing

    NASA Technical Reports Server (NTRS)

    Canas, Alberto J.

    2004-01-01

    In an ongoing collaborative effort between a group of NASA Ames scientists and researchers at the Institute for Human and Machine Cognition (IHMC) of the University of West Florida, a new version of CmapTools has been developed that enables scientists to construct knowledge models of their domain of expertise, share them with other scientists, make them available to anybody on the Internet with access to a Web browser, and peer-review other scientists' models. These software tools have been successfully used at NASA to build a large-scale multimedia knowledge model on Mars and a knowledge model on Habitability Assessment. The new version of the software places emphasis on greater usability for experts constructing their own knowledge models, and on support for the creation of large knowledge models with a large number of supporting resources in the form of images, videos, web pages, and other media. Additionally, the software allows scientists to cooperate with each other in the construction, sharing and criticizing of knowledge models. Scientists collaborating over remote distances, for example researchers at the Astrobiology Institute, can concurrently manipulate the knowledge models they are viewing without having to do so at a special videoconferencing facility.

  1. An Open Source Software and Web-GIS Based Platform for Airborne SAR Remote Sensing Data Management, Distribution and Sharing

    NASA Astrophysics Data System (ADS)

    Changyong, Dou; Huadong, Guo; Chunming, Han; Ming, Liu

    2014-03-01

    With more and more Earth observation data available to the community, how to manage and share these valuable remote sensing datasets is becoming an urgent issue. Web-based Geographical Information System (GIS) technology provides a convenient way for users in different locations to share and make use of the same dataset. In order to efficiently use the airborne Synthetic Aperture Radar (SAR) remote sensing data acquired by the Airborne Remote Sensing Center of the Institute of Remote Sensing and Digital Earth (RADI), Chinese Academy of Sciences (CAS), a Web-GIS based platform for airborne SAR data management, distribution and sharing was designed and developed. The major features of the system include a map-based navigation search interface, full-resolution imagery shown overlaid on the map, and exclusive use of Open Source Software (OSS). The functions of the platform include browsing imagery on the map-based navigation interface, ordering and downloading data online, and image dataset and user management. At present, the system is under testing at RADI and will enter regular operation soon.

  2. Exploring the Cognitive Foundations of the Shared Attention Mechanism: Evidence for a Relationship Between Self-Categorization and Shared Attention Across the Autism Spectrum.

    PubMed

    Skorich, Daniel P; Gash, Tahlia B; Stalker, Katie L; Zheng, Lidan; Haslam, S Alexander

    2017-05-01

    The social difficulties of autism spectrum disorder (ASD) are typically explained as a disruption in the Shared Attention Mechanism (SAM) sub-component of the theory of mind (ToM) system. In the current paper, we explore the hypothesis that SAM's capacity to construct the self-other-object relations necessary for shared attention arises from a self-categorization process, which is weaker among those with more autistic-like traits. We present participants with self-categorization and shared-attention tasks, and measure their autism-spectrum quotient (AQ). Results reveal a negative relationship between AQ and shared attention, via self-categorization, suggesting a role for self-categorization in the disruption of SAM seen in ASD. Implications for intervention, and for a ToM model in which weak central coherence plays a role, are discussed.

  3. Flexible lock-in detection system based on synchronized computer plug-in boards applied in sensitive gas spectroscopy

    NASA Astrophysics Data System (ADS)

    Andersson, Mats; Persson, Linda; Svensson, Tomas; Svanberg, Sune

    2007-11-01

    We present a flexible and compact digital lock-in detection system and its use in high-resolution tunable diode laser spectroscopy. The system involves coherent sampling and is based on the synchronization of two data acquisition cards running on a single standard computer. A software-controlled arbitrary waveform generator is used for laser modulation, and a four-channel analog/digital board records detector signals. Gas spectroscopy is performed in the wavelength modulation regime. The coherently detected signal is averaged a selected number of times before it is stored or analyzed by software-based lock-in techniques. Multiple harmonics of the modulation signal (1f, 2f, 3f, 4f, etc.) are available in each single data set. The sensitivity is of the order of 10⁻⁵, limited by interference fringes in the measurement setup. The capabilities of the system are demonstrated by measurements of molecular oxygen in ambient air, as well as of dispersed gas in scattering materials, such as plants and human tissue.
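
    The software lock-in stage described above can be illustrated with a short numpy sketch. The function below is our own minimal reconstruction, not the authors' code: it uses a boxcar average over whole modulation periods as the low-pass stage, whereas the real system synchronizes two DAQ boards in hardware and averages many coherently sampled records first.

```python
import numpy as np

def lockin(signal, fs, f_mod, harmonic=2):
    """Demodulate `signal` (sampled at `fs`) at `harmonic` * f_mod.

    Averaging the in-phase/quadrature products over a whole number of
    modulation periods serves as the low-pass stage. Returns the
    magnitude and phase of the selected harmonic (e.g. the 2f signal).
    """
    samples_per_period = fs / f_mod
    n_keep = int(signal.size // samples_per_period * samples_per_period)
    t = np.arange(n_keep) / fs
    ref = 2.0 * np.pi * harmonic * f_mod * t
    i_comp = np.mean(signal[:n_keep] * np.cos(ref))   # in-phase
    q_comp = np.mean(signal[:n_keep] * np.sin(ref))   # quadrature
    return 2.0 * np.hypot(i_comp, q_comp), np.arctan2(q_comp, i_comp)
```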

  4. Fiber optic interferometry for industrial process monitoring and control applications

    NASA Astrophysics Data System (ADS)

    Marcus, Michael A.

    2002-02-01

    Over the past few years we have been developing applications for a high-resolution (sub-micron accuracy) fiber-optic-coupled dual Michelson interferometer-based instrument. It is being utilized in a variety of applications, including monitoring liquid layer thickness uniformity on coating hoppers, film base thickness uniformity measurement, digital camera focus assessment, optical cell path length assessment, and imager and wafer surface profile mapping. The instrument includes both coherent and non-coherent light sources, custom application-dependent optical probes and sample interfaces, a Michelson interferometer, custom electronics, and a Pentium-based PC with data acquisition cards and application-specific software built on LabWindows/CVI or LabVIEW. This paper describes the evolution of this instrument platform and its applications, highlighting robust instrument design and the development of hardware, software, and user interfaces. The talk concludes with a discussion of a new high-speed instrument configuration, which can be utilized for high-speed surface profiling and as an on-line web thickness gauge.
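
    As a rough illustration of how a low-coherence interferometer turns a scan into a thickness reading, the sketch below locates the fringe-envelope peaks produced by the two surfaces of a layer and divides their optical separation by the group index. It is a generic reconstruction under stated assumptions, not the instrument's actual algorithm:

```python
import numpy as np
from scipy.signal import hilbert, find_peaks

def layer_thickness_um(interferogram, positions_um, group_index, min_sep=50):
    """Physical thickness of a layer from a low-coherence A-scan.

    The layer's two surfaces produce peaks in the fringe envelope;
    their optical separation divided by the group index gives the
    physical thickness.
    """
    envelope = np.abs(hilbert(interferogram))
    peaks, _ = find_peaks(envelope, height=0.2 * envelope.max(),
                          distance=min_sep)
    if len(peaks) < 2:
        raise ValueError("need two interface peaks")
    optical = positions_um[peaks[1]] - positions_um[peaks[0]]
    return optical / group_index
```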

  5. A method of measuring anterior chamber volume using the anterior segment optical coherence tomographer and specialized software.

    PubMed

    Wang, Ningli; Wang, Bingsong; Zhai, Gaoshou; Lei, Kun; Wang, Lan; Congdon, Nathan

    2007-05-01

    To describe and evaluate a new method for measuring anterior chamber volume (ACV). Observational case series. The authors measured ACV using the anterior segment optical coherence tomographer (OCT) and image-processing software that they developed. Repeatability was evaluated, and ACV was measured in patient groups with normal, shallow, and deep anterior chambers (ACs). The volume difference before and after laser peripheral iridotomy (LPI) was analyzed for the shallow and deep groups. Coefficients of repeatability for intraoperator, interoperator, and interimage measurements were 0.406%, 0.958%, and 0.851%, respectively. The limits of agreement for intraoperator and interoperator measurement were -0.911 µl to 1.343 µl and -7.875 µl to -2.463 µl, respectively. There were significant ACV differences among normal, shallow, and deep AC eyes (P < .001), and before and after LPI in shallow AC (P < .001) and deep AC (P = .008) eyes. The ACV values obtained by this method were repeatable and in accord with clinical observation.
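
    The abstract does not give the volume computation itself. A plausible discretization, assuming the software segments the chamber in each OCT cross-section, is to sum the segmented areas over slices; note 1 mm³ equals 1 µl, matching the units reported above:

```python
import numpy as np

def anterior_chamber_volume_ul(masks, pixel_area_mm2, slice_spacing_mm):
    """Integrate segmented AC cross-sections into a chamber volume.

    masks: (n_slices, H, W) binary array marking anterior-chamber pixels.
    Returns microliters (1 mm^3 == 1 microliter).
    """
    slice_areas = masks.reshape(masks.shape[0], -1).sum(axis=1) * pixel_area_mm2
    return float(slice_areas.sum() * slice_spacing_mm)
```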

  6. Understanding Social OER Environments--A Quantitative Study on Factors Influencing the Motivation to Share and Collaborate

    ERIC Educational Resources Information Center

    Pirkkalainen, Henri; Jokinen, Jussi P. P.; Pawlowski, Jan M.

    2014-01-01

    Social software environments are increasingly used for open education: teachers and learners share and collaborate in these environments. While there are various possibilities for the inclusion of such social functionalities for OER, many organizational, individual and technological challenges can hinder the motivation of teachers to share and…

  7. Library Information System Time-Sharing (LISTS) Project. Final Report.

    ERIC Educational Resources Information Center

    Black, Donald V.

    The Library Information System Time-Sharing (LISTS) experiment was based on three innovations in data processing technology: (1) the advent of computer time-sharing on third-generation machines, (2) the development of general-purpose file-management software and (3) the introduction of large, library-oriented data bases. The main body of the…

  8. Summary of the ACAT Round Table Discussion: Open-source, knowledge sharing and scientific collaboration

    NASA Astrophysics Data System (ADS)

    Carminati, Federico; Perret-Gallix, Denis; Riemann, Tord

    2014-06-01

    Round table discussions are in the tradition of ACAT. This year's plenary round table discussion was devoted to questions related to the use of scientific software in High Energy Physics and beyond. The 90 minutes of discussion were lively, and quite a lot of diverse opinions were spelled out. Although the discussion was, in part, controversial, the participants agreed unanimously on several basic issues in software sharing: • The importance of having various licensing models in academic research; • The basic value of proper recognition and attribution of intellectual property, including scientific software; • Users' respect for the conditions of use, including licence statements, as formulated by the author. The need for a similar discussion on the issues of data sharing was emphasized, and it was recommended that this subject be covered at the round table discussion of the next ACAT. In this contribution, we summarise selected topics that were covered in the introductory talks and in the following discussion.

  9. ITOS to EDGE "Bridge" Software for Morpheus Lunar/Martian Vehicle

    NASA Technical Reports Server (NTRS)

    Hirsh, Robert; Fuchs, Jordan

    2012-01-01

    My project involved improving existing software and writing new software for the Project Morpheus Team. Specifically, I created and updated Integrated Test and Operations System (ITOS) user interfaces for on-board interaction with the vehicle during archive playback as well as live data streaming. These interfaces are an integral part of testing and operations for the Morpheus vehicle, providing all information from the vehicle needed to evaluate instruments and to ensure coherence and control of the vehicle during Morpheus missions. I also created a "bridge" program for interfacing live telemetry data with the Engineering DOUG Graphics Engine (EDGE) software, which provides a graphical (standalone or VR dome) view of live Morpheus flights or archive replays, giving a graphical representation of vehicle flight and movement during tests and in real missions.
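
    The abstract does not document the bridge's actual interfaces, so the sketch below only illustrates the general pattern of such a program: receive telemetry packets on one socket, decode a few state fields, and forward them to the graphics engine in a friendlier format. All addresses and the packet layout here are hypothetical:

```python
import json
import socket

# Hypothetical addresses and packet layout; the real ITOS and EDGE
# interfaces are not described in the abstract.
ITOS_ADDR = ("127.0.0.1", 5005)   # telemetry source
EDGE_ADDR = ("127.0.0.1", 5006)   # graphics engine input

def run_bridge():
    """Forward live telemetry to the graphics engine as JSON state updates."""
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(ITOS_ADDR)
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        packet, _ = rx.recvfrom(4096)
        # assume comma-separated fields: t, x, y, z, roll, pitch, yaw
        f = [float(v) for v in packet.decode().split(",")]
        state = {"t": f[0], "pos": f[1:4], "att": f[4:7]}
        tx.sendto(json.dumps(state).encode(), EDGE_ADDR)

if __name__ == "__main__":
    run_bridge()
```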

  10. The National Capital Region closed circuit television video interoperability project.

    PubMed

    Contestabile, John; Patrone, David; Babin, Steven

    2016-01-01

    The National Capital Region (NCR) includes many government jurisdictions and agencies using different closed circuit TV (CCTV) cameras and video management software. Because these agencies often must work together to respond to emergencies and events, a means of providing interoperability for CCTV video is critically needed. Video data from the different, not inherently interoperable CCTV systems reside in the "data layer." An "integration layer" ingests the data-layer source video and normalizes the different video formats; it then aggregates and distributes this video to a "presentation layer," where it can be viewed by almost any application used by other agencies, without any proprietary software. A native mobile video viewing application was also developed that uses the presentation layer to provide video to different kinds of smartphones. The NCR includes Washington, DC, and surrounding counties in Maryland and Virginia. The video sharing architecture allows one agency to see another agency's video in its native viewing application without the need to purchase new CCTV software or systems, and the smartphone application enables agencies to share video via mobile devices even when they use different video management systems. Overall, the architecture creates an interoperable environment for sharing CCTV video in an efficient and cost-effective manner while providing the desired capability of sharing video via a native mobile application.

  11. In silico tools for sharing data and knowledge on toxicity and metabolism: Derek for Windows, Meteor, and Vitic.

    PubMed

    Marchant, Carol A; Briggs, Katharine A; Long, Anthony

    2008-01-01

    Lhasa Limited is a not-for-profit organization that exists to promote the sharing of data and knowledge in chemistry and the life sciences. It has developed the software tools Derek for Windows, Meteor, and Vitic to facilitate such sharing. Derek for Windows and Meteor are knowledge-based expert systems that predict the toxicity and metabolism of a chemical, respectively. Vitic is a chemically intelligent toxicity database. An overview of each software system is provided along with examples of the sharing of data and knowledge in the context of their development. These examples include illustrations of (1) the use of data entry and editing tools for the sharing of data and knowledge within organizations; (2) the use of proprietary data to develop nonconfidential knowledge that can be shared between organizations; (3) the use of shared expert knowledge to refine predictions; (4) the sharing of proprietary data between organizations through the formation of data-sharing groups; and (5) the use of proprietary data to validate predictions. Sharing of chemical toxicity and metabolism data and knowledge in this way offers a number of benefits including the possibilities of faster scientific progress and reductions in the use of animals in testing. Maximizing the accessibility of data also becomes increasingly crucial as in silico systems move toward the prediction of more complex phenomena for which limited data are available.
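
    As a toy illustration of what a knowledge-based structural alert looks like in code (the real Derek knowledge base encodes many expert-curated rules with supporting evidence and reasoning, not these two SMARTS patterns), consider the following sketch using the open-source RDKit toolkit:

```python
from rdkit import Chem

# Toy structural alerts as SMARTS patterns (illustrative only).
ALERTS = {
    "aromatic nitro": "[c][N+](=O)[O-]",
    "epoxide": "C1OC1",
}

def screen(smiles):
    """Return the names of the toy alerts matched by a molecule."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        raise ValueError(f"could not parse {smiles!r}")
    return [name for name, smarts in ALERTS.items()
            if mol.HasSubstructMatch(Chem.MolFromSmarts(smarts))]

print(screen("c1ccccc1[N+](=O)[O-]"))   # nitrobenzene -> ['aromatic nitro']
```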

  12. The systematic evolution of a NASA software technology, Appendix C

    NASA Technical Reports Server (NTRS)

    Deregt, M. P.; Dulfer, J. E.

    1972-01-01

    A long-range program is described whose ultimate purpose is to make possible the production of software in NASA within predictable schedule and budget constraints, and with major characteristics such as size, run-time, and correctness predictable within reasonable tolerances. As part of the program a pilot NASA computer center will be chosen to apply software development and management techniques systematically and determine a set which is effective. The techniques will be developed by a Technology Group, which will guide the pilot project and be responsible for its success. The application of the technology will involve a sequence of NASA programming tasks, graduated from simpler ones at first to complex systems in late phases of the project. The evaluation of the technology will be made by monitoring the operation of the software at the users' installations. In this way a coherent discipline for software design, production, maintenance, and management will be evolved.

  13. Team Collaboration Software

    NASA Technical Reports Server (NTRS)

    Wang, Yeou-Fang; Schrock, Mitchell; Baldwin, John R.; Borden, Charles S.

    2010-01-01

    The Ground Resource Allocation and Planning Environment (GRAPE 1.0) is a Web-based, collaborative team environment based on the Microsoft SharePoint platform, which provides Deep Space Network (DSN) resource planners with tools and services for sharing information and performing analysis.

  14. Spindle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2013-04-04

    Spindle is software infrastructure that solves file system scalability problems associated with starting dynamically linked applications in HPC environments. When an HPC application starts up thousands of processes at once, and those processes simultaneously access a shared file system to look for shared libraries, it can cause significant performance problems for both the application and other users. Spindle scalably coordinates the distribution of shared libraries to an application to avoid hammering the shared file system.
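
    Spindle itself works by intercepting the dynamic loader, but the underlying idea of replacing thousands of simultaneous file-system reads with one read plus a scalable broadcast can be sketched with mpi4py (illustrative of the coordination pattern only, not Spindle's implementation):

```python
from mpi4py import MPI

def read_once_and_broadcast(path):
    """Fetch a shared library's bytes with a single file-system read.

    Rank 0 reads the file; every other rank receives the bytes over the
    interconnect, so the shared file system sees one read instead of
    thousands. (Spindle does this transparently inside the loader,
    with caching and an overlay network.)
    """
    comm = MPI.COMM_WORLD
    data = None
    if comm.Get_rank() == 0:
        with open(path, "rb") as f:
            data = f.read()
    return comm.bcast(data, root=0)
```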

  15. Information and organization in public health institutes: an ontology-based modeling of the entities in the reception-analysis-report phases.

    PubMed

    Pozza, Giandomenico; Borgo, Stefano; Oltramari, Alessandro; Contalbrigo, Laura; Marangon, Stefano

    2016-09-08

    Ontologies are widely used both in the life sciences and in the management of public and private companies. Typically, the different offices in an organization develop their own models and related ontologies to capture specific tasks and goals. Although there might be an overall coordination, the use of distinct ontologies can jeopardize the integration of data across the organization, since data sharing and reusability are sensitive to modeling choices. The paper provides a study of the entities that are typically found at the reception, analysis and report phases in public institutes in the life science domain. Ontological considerations and techniques are introduced and their implementation exemplified by studying the Istituto Zooprofilattico Sperimentale delle Venezie (IZSVe), a public veterinary institute with different geographical locations and several laboratories. Different modeling issues are discussed, such as the identification and characterization of the main entities in these phases, the classification of the (types of) data, and the clarification of the contexts and the roles of the involved entities. The study is based on a foundational ontology and shows how it can be extended to a comprehensive and coherent framework comprising the institute's different roles, processes and data. In particular, it shows how to use notions lying at the borderline between ontology and applications, like that of the knowledge object. The paper aims to help the modeler understand the core viewpoint of the organization and to improve data transparency. The study shows that the entities at play can be analyzed within a single ontological perspective, allowing us to isolate a single ontological framework for the whole organization. This facilitates the development of coherent representations of the entities and related data, and fosters the use of integrated software for data management and reasoning across the organization.

  16. Involuntary eye motion correction in retinal optical coherence tomography: Hardware or software solution?

    PubMed

    Baghaie, Ahmadreza; Yu, Zeyun; D'Souza, Roshan M

    2017-04-01

    In this paper, we review state-of-the-art techniques to correct eye motion artifacts in Optical Coherence Tomography (OCT) imaging. The methods for eye motion artifact reduction can be categorized into two major classes: (1) hardware-based techniques and (2) software-based techniques. In the first class, additional hardware is mounted onto the OCT scanner to gather information about the eye motion patterns during OCT data acquisition. This information is later processed and applied to the OCT data to create an anatomically correct representation of the retina, either offline or online. In software-based techniques, the motion patterns are approximated either by comparing the acquired data to a reference image, or by making prior assumptions about the nature of the eye motion. Careful investigation of the most common methods in the field provides invaluable insight regarding future directions of research in this area. The challenge in hardware-based techniques lies in the implementation aspects of particular devices. However, the results of these techniques are superior to those obtained from software-based techniques because they are capable of capturing secondary data related to eye motion during OCT acquisition. Software-based techniques, on the other hand, achieve moderate success, and their performance depends highly on the quality of the OCT data in terms of the amount of motion artifacts contained in it. However, they remain relevant to the field, since they are the sole class of techniques that can be applied to legacy data acquired using systems without extra hardware to track eye motion. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. The Next Generation of Personal Computers.

    ERIC Educational Resources Information Center

    Crecine, John P.

    1986-01-01

    Discusses factors converging to create high-capacity, low-cost nature of next generation of microcomputers: a coherent vision of what graphics workstation and future computing environment should be like; hardware developments leading to greater storage capacity at lower costs; and development of software and expertise to exploit computing power…

  18. CONTIN XPCS: software for inverse transform analysis of X-ray photon correlation spectroscopy dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrews, Ross N.; Narayanan, Suresh; Zhang, Fan

    X-ray photon correlation spectroscopy (XPCS) and dynamic light scattering (DLS) reveal materials dynamics using coherent scattering, with XPCS permitting the investigation of dynamics in a more diverse array of materials than DLS. Heterogeneous dynamics occur in many material systems. The authors' recent work has shown how classic tools employed in the DLS analysis of heterogeneous dynamics can be extended to XPCS, revealing additional information that conventional Kohlrausch exponential fitting obscures. The present work describes the software implementation of inverse transform analysis of XPCS data. This software, called CONTIN XPCS, is an extension of traditional CONTIN analysis and accommodates the various dynamics encountered in equilibrium XPCS measurements.
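
    Independent of the authors' implementation, the core of a CONTIN-style analysis is a regularized inverse Laplace transform: expand the correlation decay on a grid of relaxation rates and solve a smoothness-regularized non-negative least-squares problem. A minimal sketch, with the regularization weight `lam` chosen by hand rather than by CONTIN's statistical criteria:

```python
import numpy as np
from scipy.optimize import nnls

def inverse_transform(tau, g1, gammas, lam=1e-2):
    """Recover a relaxation-rate distribution G from g1(tau).

    Model: g1(tau) = sum_j G_j * exp(-gamma_j * tau), with G_j >= 0.
    Solved as non-negative least squares with a second-derivative
    (smoothness) penalty of weight `lam` stacked onto the kernel.
    """
    A = np.exp(-np.outer(tau, gammas))             # decay kernel
    L = np.diff(np.eye(len(gammas)), n=2, axis=0)  # discrete 2nd derivative
    A_aug = np.vstack([A, lam * L])
    b_aug = np.concatenate([g1, np.zeros(L.shape[0])])
    G, _ = nnls(A_aug, b_aug)
    return G
```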

  1. Coherent receiver design based on digital signal processing in optical high-speed intersatellite links with M-phase-shift keying

    NASA Astrophysics Data System (ADS)

    Schaefer, Semjon; Gregory, Mark; Rosenkranz, Werner

    2016-11-01

    We present simulation-based and experimental investigations of different coherent receiver designs for high-speed optical intersatellite links. We focus on frequency offset (FO) compensation in homodyne and intradyne detection systems. The considered laser communication terminal uses an optical phase-locked loop (OPLL), which ensures stable homodyne detection; however, the hardware complexity increases with the modulation order. We therefore show that software-based intradyne detection is an attractive alternative to OPLL-based homodyne systems. Our approach is based on digital FO and phase noise compensation, in order to achieve a more flexible coherent detection scheme. Analytic results further show the theoretical impact of the different detection schemes on receiver sensitivity. Finally, we compare the schemes in terms of bit error ratio measurements and optimal receiver design.
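
    For M-PSK, a standard software intradyne recipe (given here only as a generic sketch, not the authors' algorithm) is to estimate the frequency offset from the strongest tone in the spectrum of the M-th power of the symbols, derotate, and then remove residual phase noise with Viterbi-Viterbi block phase estimation:

```python
import numpy as np

def estimate_freq_offset(symbols, baud_rate, m=4):
    """Spectral FO estimate for M-PSK (m=4 for QPSK).

    Raising the symbols to the m-th power strips the data modulation,
    leaving a tone at m * f_offset (which must lie within +/- baud/2).
    """
    spec = np.abs(np.fft.fft(symbols ** m))
    freqs = np.fft.fftfreq(symbols.size, d=1.0 / baud_rate)
    return freqs[np.argmax(spec)] / m

def recover_carrier(symbols, f_offset, baud_rate, m=4, block=64):
    """Derotate the FO, then Viterbi-Viterbi phase estimation per block.

    A fixed m-fold constellation rotation remains; in practice it is
    resolved by differential encoding or known pilot symbols.
    """
    n = np.arange(symbols.size)
    derot = symbols * np.exp(-2j * np.pi * f_offset * n / baud_rate)
    out = np.empty_like(derot)
    for i in range(0, derot.size, block):
        blk = derot[i:i + block]
        phi = np.angle(np.sum(blk ** m)) / m   # common phase of the block
        out[i:i + block] = blk * np.exp(-1j * phi)
    return out
```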

  2. Closing the loop on improvement: Packaging experience in the Software Engineering Laboratory

    NASA Technical Reports Server (NTRS)

    Waligora, Sharon R.; Landis, Linda C.; Doland, Jerry T.

    1994-01-01

    As part of its award-winning software process improvement program, the Software Engineering Laboratory (SEL) has developed an effective method for packaging organizational best practices based on real project experience into useful handbooks and training courses. This paper shares the SEL's experience over the past 12 years creating and updating software process handbooks and training courses. It provides cost models and guidelines for successful experience packaging derived from SEL experience.

  3. A development framework for semantically interoperable health information systems.

    PubMed

    Lopez, Diego M; Blobel, Bernd G M E

    2009-02-01

    Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and their intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system, to evaluate and harmonize state-of-the-art architecture development approaches and standards for health information systems, and to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured to integrate other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and the HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting from the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported by formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.

  4. A real-time coherent dedispersion pipeline for the giant metrewave radio telescope

    NASA Astrophysics Data System (ADS)

    De, Kishalay; Gupta, Yashwant

    2016-02-01

    A fully real-time coherent dedispersion system has been developed for the pulsar back-end at the Giant Metrewave Radio Telescope (GMRT). The dedispersion pipeline uses the single phased array voltage beam produced by the existing GMRT software back-end (GSB) to produce coherently dedispersed intensity output in real time, for the currently operational bandwidths of 16 MHz and 32 MHz. Provision has also been made to coherently dedisperse voltage beam data from observations recorded on disk. We discuss the design and implementation of the real-time coherent dedispersion system, describing the steps carried out to optimise the performance of the pipeline. Presently functioning on an Intel Xeon X5550 CPU equipped with a NVIDIA Tesla C2075 GPU, the pipeline allows dispersion free, high time resolution data to be obtained in real-time. We illustrate the significant improvements over the existing incoherent dedispersion system at the GMRT, and present some preliminary results obtained from studies of pulsars using this system, demonstrating its potential as a useful tool for low frequency pulsar observations. We describe the salient features of our implementation, comparing it with other recently developed real-time coherent dedispersion systems. This implementation of a real-time coherent dedispersion pipeline for a large, low frequency array instrument like the GMRT, will enable long-term observing programs using coherent dedispersion to be carried out routinely at the observatory. We also outline the possible improvements for such a pipeline, including prospects for the upgraded GMRT which will have bandwidths about ten times larger than at present.
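
    The heart of coherent dedispersion is a single frequency-domain phase rotation that inverts the interstellar dispersion transfer function. A minimal numpy sketch for one complex baseband channel follows; sign conventions differ between implementations, and this is not the GMRT pipeline code:

```python
import numpy as np

K_DM = 4.148808e15  # dispersion constant in Hz^2 s per (pc cm^-3)

def coherent_dedisperse(voltages, f_center_hz, bandwidth_hz, dm):
    """Coherently dedisperse one channel of complex baseband voltages.

    Applies the inverse of the interstellar dispersion transfer function
    as a frequency-dependent phase rotation, removing intra-channel
    smearing exactly. Flip the sign of the exponent if your convention
    broadens the pulse instead of sharpening it.
    """
    f = np.fft.fftfreq(voltages.size, d=1.0 / bandwidth_hz)  # offset from centre
    phase = 2.0 * np.pi * K_DM * dm * f ** 2 / (f_center_hz ** 2 * (f_center_hz + f))
    return np.fft.ifft(np.fft.fft(voltages) * np.exp(1j * phase))
```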

  5. Academic Research Integration System

    ERIC Educational Resources Information Center

    Surugiu, Iula; Velicano, Manole

    2008-01-01

    This paper comprises results concluding the research activity done so far regarding enhanced web services and system integration. The objective of the paper is to define the software architecture for a coherent framework and methodology for enhancing existing web services into an integrated system. This document presents the research work that has…

  6. Optimization of Passive Coherent Receiver System Placement

    DTIC Science & Technology

    2013-09-01

    …spheroid object with a constant radar cross section (RCS). Additionally, the receiver and transmitters are assumed to be notional isotropic antennae…

  7. Patchy ‘coherence’: using normalization process theory to evaluate a multi-faceted shared decision making implementation program (MAGIC)

    PubMed Central

    2013-01-01

    Background Implementing shared decision making into routine practice is proving difficult, despite considerable interest from policy-makers, and is far more complex than merely making decision support interventions available to patients. Few have reported successful implementation beyond research studies. MAking Good Decisions In Collaboration (MAGIC) is a multi-faceted implementation program, commissioned by The Health Foundation (UK), to examine how best to put shared decision making into routine practice. In this paper, we investigate healthcare professionals’ perspectives on implementing shared decision making during the MAGIC program, to examine the work required to implement shared decision making and to inform future efforts. Methods The MAGIC program approached implementation of shared decision making by initiating a range of interventions including: providing workshops; facilitating development of brief decision support tools (Option Grids); initiating a patient activation campaign (‘Ask 3 Questions’); gathering feedback using Decision Quality Measures; providing clinical leads meetings, learning events, and feedback sessions; and obtaining executive board level support. At 9 and 15 months (May and November 2011), two rounds of semi-structured interviews were conducted with healthcare professionals in three secondary care teams to explore views on the impact of these interventions. Interview data were coded by two reviewers using a framework derived from the Normalization Process Theory. Results A total of 54 interviews were completed with 31 healthcare professionals. Partial implementation of shared decision making could be explained using the four components of the Normalization Process Theory: ‘coherence,’ ‘cognitive participation,’ ‘collective action,’ and ‘reflexive monitoring.’ Shared decision making was integrated into routine practice when clinical teams shared coherent views of role and purpose (‘coherence’). Shared decision making was facilitated when teams engaged in developing and delivering interventions (‘cognitive participation’), and when those interventions fit with existing skill sets and organizational priorities (‘collective action’) resulting in demonstrable improvements to practice (‘reflexive monitoring’). The implementation process uncovered diverse and conflicting attitudes toward shared decision making; ‘coherence’ was often missing. Conclusions The study showed that implementation of shared decision making is more complex than the delivery of patient decision support interventions to patients, a portrayal that often goes unquestioned. Normalizing shared decision making requires intensive work to ensure teams have a shared understanding of the purpose of involving patients in decisions, and undergo the attitudinal shifts that many health professionals feel are required when comprehension goes beyond initial interpretations. Divergent views on the value of engaging patients in decisions remain a significant barrier to implementation. PMID:24006959

  8. [Sense of coherence and subjective overload, anxiety and depression in caregivers of elderly relatives].

    PubMed

    López-Martínez, Catalina; Frías-Osuna, Antonio; Del-Pino-Casado, Rafael

    2017-11-23

    To analyze the relationship between sense of coherence and subjective burden, anxiety and depression in caregivers of dependent elderly relatives. Cross-sectional study in an area of the province of Jaén (Andalusia, Spain) with a probabilistic sample of 132 caregivers of dependent elderly people. Measures: sense of coherence (Life Orientation Questionnaire), subjective burden (Caregiver Strain Index), anxiety and depression (Goldberg Scale), objective burden (Dedication to Care Scale), sex and kinship. Main analyses: bivariate analysis using the Pearson correlation coefficient and multivariate analysis using multiple linear regression. Most of the caregivers studied were women (86.4%), daughters or sons of the care recipient (74.2%), and shared a home with the latter (69.7%). When controlling for objective burden, sex and kinship, we found that sense of coherence was inversely related to subjective burden (β = -0.46; p < 0.001), anxiety (β = -0.57; p = 0.001) and depression (β = -0.66; p < 0.001). Sense of coherence may be an important protective factor against subjective burden, anxiety and depression in caregivers of dependent elderly relatives. Copyright © 2017 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.

  9. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    PubMed

    Rosenberg, David M; Horn, Charles C

    2016-08-01

    Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to be displayed locally or remotely in an internet browser and saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus, a model system used to investigate gut-brain communication, for example in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software, an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research. Copyright © 2016 the American Physiological Society.
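
    As a flavor of the kind of analysis cell such a notebook might contain, here is a minimal threshold-based spike detector in Python (the authors' notebooks go considerably further, including spike sorting with statistical validation):

```python
import numpy as np

def detect_spikes(trace, fs, thresh_sd=4.0, refractory_ms=1.0):
    """Detect negative-going spikes on a band-passed voltage trace.

    The threshold is a multiple of a robust noise estimate (MAD-based),
    and a refractory window suppresses double-counted crossings.
    Returns spike times in seconds.
    """
    noise_sd = np.median(np.abs(trace)) / 0.6745   # MAD -> SD for Gaussian noise
    crossings = np.where(trace < -thresh_sd * noise_sd)[0]
    refractory = int(refractory_ms * 1e-3 * fs)
    spikes, last = [], -refractory
    for idx in crossings:
        if idx - last >= refractory:
            spikes.append(idx)
            last = idx
    return np.asarray(spikes) / fs
```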

  10. Merlin - Massively parallel heterogeneous computing

    NASA Technical Reports Server (NTRS)

    Wittie, Larry; Maples, Creve

    1989-01-01

    Hardware and software for Merlin, a new kind of massively parallel computing system, are described. Eight computers are linked as a 300-MIPS prototype to develop system software for a larger Merlin network with 16 to 64 nodes, totaling 600 to 3000 MIPS. These working prototypes help refine a mapped reflective memory technique that offers a new, very general way of linking many types of computer to form supercomputers. Processors share data selectively and rapidly on a word-by-word basis. Fast firmware virtual circuits are reconfigured to match topological needs of individual application programs. Merlin's low-latency memory-sharing interfaces solve many problems in the design of high-performance computing systems. The Merlin prototypes are intended to run parallel programs for scientific applications and to determine hardware and software needs for a future Teraflops Merlin network.

  11. Development of the Data Acquisition and Processing System for a Pulsed 2-Micron Coherent Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, Jeffrey Y.; Koch, Grady J.; Kavaya, Michael J.

    2010-01-01

    A general overview is presented of the development of a data acquisition and processing system for a pulsed, 2-micron coherent Doppler lidar system located at NASA Langley Research Center in Hampton, Virginia, USA. It is a comprehensive system that performs high-speed data acquisition, analysis, and data display both in real time and offline. The first flight missions are scheduled for the summer of 2010 as part of the NASA Genesis and Rapid Intensification Processes (GRIP) campaign for the study of hurricanes. The system as well as its control software is reviewed, and its requirements and unique features are discussed.

  12. Sharing the Code.

    ERIC Educational Resources Information Center

    Olsen, Florence

    2003-01-01

    Colleges and universities are beginning to consider collaborating on open-source-code projects as a way to meet critical software and computing needs. Points out the attractive features of noncommercial open-source software and describes some examples in use now, especially for the creation of Web infrastructure. (SLD)

  13. Neuroinformatics Software Applications Supporting Electronic Data Capture, Management, and Sharing for the Neuroimaging Community

    PubMed Central

    Nichols, B. Nolan; Pohl, Kilian M.

    2017-01-01

    Accelerating insight into the relation between brain and behavior entails conducting small and large-scale research endeavors that lead to reproducible results. Consensus is emerging between funding agencies, publishers, and the research community that data sharing is a fundamental requirement to ensure all such endeavors foster data reuse and fuel reproducible discoveries. Funding agency and publisher mandates to share data are bolstered by a growing number of data sharing efforts that demonstrate how information technologies can enable meaningful data reuse. Neuroinformatics evaluates scientific needs and develops solutions to facilitate the use of data across the cognitive and neurosciences. For example, electronic data capture and management tools designed to facilitate human neurocognitive research can decrease the setup time of studies, improve quality control, and streamline the process of harmonizing, curating, and sharing data across data repositories. In this article we outline the advantages and disadvantages of adopting software applications that support these features by reviewing the tools available and then presenting two contrasting neuroimaging study scenarios in the context of conducting a cross-sectional and a multisite longitudinal study. PMID:26267019

  14. Motion coherence affects human perception and pursuit similarly.

    PubMed

    Beutter, B R; Stone, L S

    2000-01-01

    Pursuit and perception both require accurate information about the motion of objects. Recovering the motion of objects by integrating the motion of their components is a difficult visual task. Successful integration produces coherent global object motion, while a failure to integrate leaves the incoherent local motions of the components unlinked. We compared the ability of perception and pursuit to perform motion integration by measuring direction judgments and the concomitant eye-movement responses to line-figure parallelograms moving behind stationary rectangular apertures. The apertures were constructed such that only the line segments corresponding to the parallelogram's sides were visible; thus, recovering global motion required the integration of the local segment motion. We investigated several potential motion-integration rules by using stimuli with different object, vector-average, and line-segment terminator-motion directions. We used an oculometric decision rule to directly compare direction discrimination for pursuit and perception. For visible apertures, the percept was a coherent object, and both the pursuit and perceptual performance were close to the object-motion prediction. For invisible apertures, the percept was incoherently moving segments, and both the pursuit and perceptual performance were close to the terminator-motion prediction. Furthermore, both psychometric and oculometric direction thresholds were much higher for invisible apertures than for visible apertures. We constructed a model in which both perception and pursuit are driven by a shared motion-processing stage, with perception having an additional input from an independent static-processing stage. Model simulations were consistent with our perceptual and oculomotor data. Based on these results, we propose the use of pursuit as an objective and continuous measure of perceptual coherence. Our results support the view that pursuit and perception share a common motion-integration stage, perhaps within areas MT or MST.

  15. Optical design and simulation of a new coherence beamline at NSLS-II

    NASA Astrophysics Data System (ADS)

    Williams, Garth J.; Chubar, Oleg; Berman, Lonny; Chu, Yong S.; Robinson, Ian K.

    2017-08-01

    We will discuss the optical design for a proposed beamline at NSLS-II, a late-third-generation storage ring source, designed to exploit the spatial coherence of the X-rays to extract high-resolution spatial information from ordered and disordered materials through Coherent Diffractive Imaging (CDI), executed in the Bragg- and forward-scattering geometries. This technique offers a powerful tool to image sub-10 nm spatial features and, within ordered materials, sub-Angstrom mapping of deformation fields. Driven by the opportunity to apply CDI to a wide range of samples, with sizes ranging from sub-micron to tens of microns, two optical designs have been proposed and simulated under a wide variety of optical configurations using the software package Synchrotron Radiation Workshop. We describe the designs, their goals, and the simulated beamline performance as a function of the variable optical components, with the simulations including the NSLS-II ring and undulator source parameters.

  16. High-speed spectral domain polarization-sensitive optical coherence tomography using a single camera and an optical switch at 1.3 microm.

    PubMed

    Lee, Sang-Won; Jeong, Hyun-Woo; Kim, Beop-Min

    2010-01-01

    We propose a high-speed spectral domain polarization-sensitive optical coherence tomography (SD-PS-OCT) system using a single camera and a 1x2 optical switch in the 1.3-microm region. The PS low-coherence interferometer used in the system is constructed with free-space optics. The reflected horizontal and vertical polarization light rays are delivered via the optical switch to a single spectrometer in turn. Therefore, our system costs less to build than those that use dual spectrometers, and the timing and triggering processes are simpler from the viewpoints of both hardware and software. Our SD-PS-OCT has a sensitivity of 101.5 dB, an axial resolution of 8.2 microm, and an acquisition speed of 23,496 A-scans per second. We obtain intensity, phase retardation, and fast-axis orientation images of a rat tail tendon ex vivo.
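
    The three image types mentioned above follow from the standard PS-OCT relations between the complex depth profiles of the two polarization channels. The sketch below uses one common convention (fast-axis sign conventions vary between systems), and is a generic illustration rather than this system's processing chain:

```python
import numpy as np

def ps_oct_images(a_h, a_v):
    """Standard PS-OCT image formation from complex depth profiles.

    a_h, a_v: complex A-scans of the horizontal and vertical channels.
    Returns intensity (dB), phase retardation (0..pi/2 rad) and a
    fast-axis orientation estimate.
    """
    intensity_db = 10.0 * np.log10(np.abs(a_h) ** 2 + np.abs(a_v) ** 2 + 1e-12)
    retardation = np.arctan2(np.abs(a_v), np.abs(a_h))
    fast_axis = 0.5 * (np.angle(a_v) - np.angle(a_h))
    return intensity_db, retardation, fast_axis
```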

  17. Real-time dual-polarization transmission based on hybrid optical wireless communications

    NASA Astrophysics Data System (ADS)

    Sousa, Artur N.; Alimi, Isiaka A.; Ferreira, Ricardo M.; Shahpari, Ali; Lima, Mário; Monteiro, Paulo P.; Teixeira, António L.

    2018-01-01

    We present experimental work on a gigabit-capable, long-reach hybrid coherent UDWDM-PON plus free-space optics (FSO) system for supporting different applications over the same fiber infrastructure in mobile backhaul (MBH) networks. Also, for the first time, we demonstrate reconfigurable real-time DSP transmission/reception of DP-QPSK signals over standard single-mode fiber (SSMF) and FSO links. The receiver presented is based on a commercial field-programmable gate array (FPGA). The considered communication links are based on 20 UDWDM channels with 625 Mbaud and 2.5 GHz channel spacing. We demonstrate the lowest sampling rate required for digital coherent PON by employing four 1.25 GSa/s ADCs with an electrical front-end receiver that offers only 1 GHz analog bandwidth. We achieved this by implementing a phase- and polarization-diversity coherent receiver combined with the DP-QPSK modulation format. The system performance is estimated in terms of receiver sensitivity. The results show the viability of coherent PON and flexible dual-polarization transmission supported by software-defined transceivers for the MBH.

  18. Strength and coherence of binocular rivalry depends on shared stimulus complexity.

    PubMed

    Alais, David; Melcher, David

    2007-01-01

    Presenting incompatible images to the eyes results in alternations of conscious perception, a phenomenon known as binocular rivalry. We examined rivalry using either simple stimuli (oriented gratings) or coherent visual objects (faces, houses etc). Two rivalry characteristics were measured: Depth of rivalry suppression and coherence of alternations. Rivalry between coherent visual objects exhibits deep suppression and coherent rivalry, whereas rivalry between gratings exhibits shallow suppression and piecemeal rivalry. Interestingly, rivalry between a simple and a complex stimulus displays the same characteristics (shallow and piecemeal) as rivalry between two simple stimuli. Thus, complex stimuli fail to rival globally unless the fellow stimulus is also global. We also conducted a face adaptation experiment. Adaptation to rivaling faces improved subsequent face discrimination (as expected), but adaptation to a rivaling face/grating pair did not. To explain this, we suggest rivalry must be an early and local process (at least initially), instigated by the failure of binocular fusion, which can then become globally organized by feedback from higher-level areas when both rivalry stimuli are global, so that rivalry tends to oscillate coherently. These globally assembled images then flow through object processing areas, with the dominant image gaining in relative strength in a form of 'biased competition', therefore accounting for the deeper suppression of global images. In contrast, when only one eye receives a global image, local piecemeal suppression from the fellow eye overrides the organizing effects of global feedback to prevent coherent image formation. This indicates the primacy of local over global processes in rivalry.

  19. ESPC Common Model Architecture Earth System Modeling Framework (ESMF) Software and Application Development

    DTIC Science & Technology

    2015-09-30

    …originate from NASA, NOAA, and community modeling efforts, and support for creation of the suite was shared by sponsors from other agencies. ESPS… The National Unified Operational Prediction Capability (NUOPC) was established between NOAA and Navy to develop a common software architecture for easy and efficient interoperability…

  1. Cache-based error recovery for shared memory multiprocessor systems

    NASA Technical Reports Server (NTRS)

    Wu, Kun-Lung; Fuchs, W. Kent; Patel, Janak H.

    1989-01-01

    A multiprocessor cache-based checkpointing and recovery scheme for recovering from transient processor errors in a shared-memory multiprocessor with private caches is presented. New implementation techniques that use checkpoint identifiers and recovery stacks to reduce performance degradation in processor utilization during normal execution are examined. This cache-based checkpointing technique prevents rollback propagation, provides for rapid recovery, and can be integrated into standard cache coherence protocols. An analytical model is used to estimate the relative performance of the scheme during normal execution. Extensions that take error latency into account are presented.

  2. Collective Contexts in Conversation: Grounding by Proxy

    ERIC Educational Resources Information Center

    Eshghi, Arash; Healey, Patrick G. T.

    2016-01-01

    Anecdotal evidence suggests that participants in conversation can sometimes act as a coalition. This implies a level of conversational organization in which groups of individuals form a coherent unit. This paper investigates the implications of this phenomenon for psycholinguistic and semantic models of shared context in dialog. We present a…

  3. Engaged Voices--Dialogic Interaction and the Construction of Shared Social Meanings

    ERIC Educational Resources Information Center

    Cruddas, Leora

    2007-01-01

    The notion of "pupil voice" reproduces the binary distinction between adult and child, pupil and teacher and therefore serves to reinforce "conventional" constructions of childhood. The concept of "voice" invokes an essentialist construction of self that is singular, coherent, consistent and rational. It is arguably…

  4. Web Platform for Sharing Modeling Software in the Field of Nonlinear Optics

    NASA Astrophysics Data System (ADS)

    Dubenskaya, Julia; Kryukov, Alexander; Demichev, Andrey

    2018-02-01

    We describe the prototype of a Web platform intended for sharing software programs for computer modeling in the rapidly developing field of nonlinear optics phenomena. The suggested platform is built on top of the HUBZero open-source middleware. In addition to the basic HUBZero installation, we added to our platform the capability to run Docker containers via an external application server and to send calculation programs to those containers for execution. The presented web platform provides a wide range of features and might be of benefit to nonlinear optics researchers.
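
    Using the Docker SDK for Python, the container-execution step can be sketched as follows; the image name, mount point, and resource limits are illustrative assumptions, not the platform's actual configuration:

```python
import docker  # Docker SDK for Python

def run_calculation(image, host_dir, script):
    """Run a user-submitted calculation script inside a container."""
    client = docker.from_env()
    logs = client.containers.run(
        image,                                      # e.g. "python:3.11-slim"
        command=["python", f"/work/{script}"],
        volumes={host_dir: {"bind": "/work", "mode": "ro"}},
        mem_limit="1g",                             # cap the job's memory
        network_disabled=True,                      # sandbox the calculation
        remove=True,                                # clean up the container
    )
    return logs.decode()
```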

  5. Sustaining an Online, Shared Community Resource for Models, Robust Open source Software Tools and Data for Volcanology - the Vhub Experience

    NASA Astrophysics Data System (ADS)

    Patra, A. K.; Valentine, G. A.; Bursik, M. I.; Connor, C.; Connor, L.; Jones, M.; Simakov, N.; Aghakhani, H.; Jones-Ivey, R.; Kosar, T.; Zhang, B.

    2015-12-01

    Over the last 5 years we have created a community collaboratory, Vhub.org [Palma et al., J. App. Volc. 3:2, doi:10.1186/2191-5040-3-2], as a place to find volcanology-related resources, a venue for users to disseminate tools, teaching resources, and data, and an online platform to support collaborative efforts. As the community (currently more than 6000 active users, from an estimated community of comparable size) embeds the collaboratory's tools into educational and research workflows, it became imperative to: a) redesign tools into robust, open source, reusable software for online and offline usage and enhancement; b) share large datasets with remote collaborators and other users seamlessly and securely; c) support complex workflows for uncertainty analysis, validation and verification, and data assimilation with large data. The focus on tool development and redevelopment has been twofold: first, to use best practices in software engineering and new hardware such as multi-core processors and graphics processing units; second, to enhance capabilities to support inverse modeling, uncertainty quantification using large ensembles and design of experiments, calibration, and validation. Among the software engineering practices we follow are open source development facilitating community contributions, modularity, and reusability. Our initial targets are four popular tools on Vhub: TITAN2D, TEPHRA2, PUFF and LAVA. Use of tools like these requires many observation-driven data sets, e.g. digital elevation models of topography, satellite imagery, and field observations on deposits. These data are often maintained in private repositories and shared privately by "sneaker-net". As a partial solution, we tested mechanisms based on iRODS software for online sharing of private data with public metadata and access limits. Finally, we adapted workflow engines (e.g. Pegasus) to support the complex data and computing workflows needed for tasks like uncertainty quantification for hazard analysis using physical models.

  6. Beyond the Quantitative and Qualitative Divide: Research in Art Education as Border Skirmish.

    ERIC Educational Resources Information Center

    Sullivan, Graeme

    1996-01-01

    Analyzes a research project that utilizes a coherent conceptual model of art education research incorporating the demand for empirical rigor and providing for diverse interpretive frameworks. Briefly profiles the NUD*IST (Non-numerical Unstructured Data Indexing Searching and Theorizing) software system that can organize and retrieve complex…

  7. Spectral and Spatial Coherent Emission of Thermal Radiation from Metal-Semiconductor Nanostructures

    DTIC Science & Technology

    2012-03-01

    …Coupled Wave Analysis (RCWA) numerical technique and Computer Simulation Technology (CST) electromagnetic modeling software, two structures were…

  8. New Frontiers in Heart Rate Variability and Social Coherence Research: Techniques, Technologies, and Implications for Improving Group Dynamics and Outcomes

    PubMed Central

    McCraty, Rollin

    2017-01-01

    Concepts embraced by the term coherence have been identified as central to fields such as quantum physics, physiology, and social science. There are different types of coherence, although the term always implies a harmonious relationship, correlations and connections between the various parts of a system. A specific measure derived from heart rate variability (HRV) provides a measure of physiological coherence. Another type of coherence, social coherence, relates to the harmonious alignment between couples or pairs, family units, small groups, or larger organizations in which a network of relationships exists among individuals who share common interests and objectives. A high degree of social coherence is reflected by stable and harmonious relationships, which allows for the efficient flow and utilization of energy and communication required for optimal collective cohesion and action. Social coherence requires that group members are attuned and emotionally connected with each other, and that the group’s emotional energy is organized and regulated by the group as a whole. A number of studies are reviewed which have explored various types of synchronization in infants, pairs and groups, indicating that feelings of cooperation, trust, compassion and increased prosocial behaviors depend largely on the establishment of a spontaneous synchronization of various physiological rhythms between individuals. This article discusses a new application using HRV monitoring in social coherence research and the importance of physiological synchronization in group developmental processes and dynamics. Building on the extensive body of research showing that providing feedback of HRV coherence level at the individual level can improve self-regulation, we suggest the following hypotheses: (1) providing feedback of individual and collective HRV coherence and the degree of heart rhythm synchronization will increase group coherence and heart rhythm synchronization among group members; (2) training in techniques to increase group coherence and heart rhythm synchronization will correlate with increased prosocial behaviors, such as kindness and cooperation among individuals, improved communication, and decreases in social discord and adversarial interactions; (3) biomagnetic fields produced by the heart may be a primary mechanism in mediating HRV synchronization among group members. Data supporting each of the hypotheses are discussed. PMID:29075623
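
    Hypothesis (1) presupposes a way to quantify heart-rhythm synchronization between people. One generic signal-processing option, not the specific HeartMath coherence score, is the magnitude-squared coherence between evenly resampled RR-interval series:

```python
import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import coherence

def heart_rhythm_coherence(rr_a, rr_b, fs=4.0, nperseg=256):
    """Magnitude-squared coherence between two RR-interval series (seconds).

    Each tachogram is resampled onto an even time grid before Welch-based
    coherence estimation; the band average over roughly 0.04-0.26 Hz covers
    the low-frequency region where coherent heart rhythms concentrate.
    """
    def resample(rr):
        beat_times = np.cumsum(rr)
        grid = np.arange(beat_times[0], beat_times[-1], 1.0 / fs)
        return interp1d(beat_times, rr)(grid)

    x, y = resample(rr_a), resample(rr_b)
    n = min(x.size, y.size)
    f, cxy = coherence(x[:n], y[:n], fs=fs, nperseg=nperseg)
    band = (f >= 0.04) & (f <= 0.26)
    return f, cxy, float(cxy[band].mean())
```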

  9. New Frontiers in Heart Rate Variability and Social Coherence Research: Techniques, Technologies, and Implications for Improving Group Dynamics and Outcomes.

    PubMed

    McCraty, Rollin

    2017-01-01

    Concepts embraced by the term coherence have been identified as central to fields such as quantum physics, physiology, and social science. There are different types of coherence, although the term always implies a harmonious relationship, correlations and connections between the various parts of a system. A specific measure derived from heart rate variability (HRV) provides a measure of physiological coherence. Another type of coherence, social coherence, relates to the harmonious alignment between couples or pairs, family units, small groups, or larger organizations in which a network of relationships exists among individuals who share common interests and objectives. A high degree of social coherence is reflected by stable and harmonious relationships, which allows for the efficient flow and utilization of energy and communication required for optimal collective cohesion and action. Social coherence requires that group members are attuned and emotionally connected with each other, and that the group's emotional energy is organized and regulated by the group as a whole. A number of studies are reviewed which have explored various types of synchronization in infants, pairs, and groups, indicating that feelings of cooperation, trust, compassion, and increased prosocial behaviors depend largely on the establishment of spontaneous synchronization of various physiological rhythms between individuals. This article discusses a new application using HRV monitoring in social coherence research and the importance of physiological synchronization in group developmental processes and dynamics. Building on the extensive body of research showing that providing feedback of HRV coherence level at the individual level can improve self-regulation, we suggest the following hypotheses: (1) providing feedback of individual and collective HRV coherence and the degree of heart rhythm synchronization will increase group coherence and heart rhythm synchronization among group members; (2) training in techniques to increase group coherence and heart rhythm synchronization will correlate with increased prosocial behaviors, such as kindness and cooperation among individuals, improved communication, and decreases in social discord and adversarial interactions; (3) biomagnetic fields produced by the heart may be a primary mechanism in mediating HRV synchronization among group members. Data supporting each of the hypotheses are discussed.
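
    To make the physiological-coherence metric concrete, here is a minimal sketch of one plausible HRV coherence-ratio computation, assuming a HeartMath-style definition (power near the dominant low-frequency peak relative to the remaining spectral power); the band edges, the 4 Hz resampling rate, and the function name are illustrative assumptions, not the article's published algorithm.

    import numpy as np
    from scipy.signal import welch

    def coherence_ratio(rr_ms, fs=4.0):
        """Estimate an HRV coherence ratio from RR intervals (milliseconds).

        Resamples the RR tachogram onto a uniform grid, estimates its power
        spectrum, and compares power in a narrow window around the dominant
        0.04-0.26 Hz peak to the rest of the 0.0033-0.4 Hz HRV band.
        """
        t = np.cumsum(rr_ms) / 1000.0                  # beat times, seconds
        grid = np.arange(t[0], t[-1], 1.0 / fs)        # uniform resampling grid
        tach = np.interp(grid, t, rr_ms)
        f, pxx = welch(tach - tach.mean(), fs=fs, nperseg=min(256, grid.size))

        df = f[1] - f[0]
        band = (f >= 0.0033) & (f <= 0.4)              # total HRV band (assumed)
        zone = (f >= 0.04) & (f <= 0.26)               # where coherence peaks sit
        f_peak = f[zone][np.argmax(pxx[zone])]
        win = (f >= f_peak - 0.015) & (f <= f_peak + 0.015)

        peak_power = pxx[win].sum() * df
        total_power = pxx[band].sum() * df
        return peak_power / max(total_power - peak_power, 1e-12)

    # A strongly rhythmic (coherent) tachogram scores high:
    rr = 1000 + 100 * np.sin(2 * np.pi * 0.1 * np.arange(300))  # ~0.1 Hz rhythm
    print(coherence_ratio(rr))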

  10. Practice-Relevant Pedagogy for Mining Software Engineering Curricula Assets

    DTIC Science & Technology

    2007-06-20

    permits the application of the Lean methods by virtually grouping shared services into eWorkcenters to which only non-routine requests are routed...engineering can be applied to IT shared services improvement and provide precise system improvement methods to complement the ITIL best practice. This..."Vertical" or internal service-chain of primary business functions and enabling shared services Framework results - Mined patterns that relate

  11. The HydroServer Platform for Sharing Hydrologic Data

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Horsburgh, J. S.; Schreuders, K.; Maidment, D. R.; Zaslavsky, I.; Valentine, D. W.

    2010-12-01

    The CUAHSI Hydrologic Information System (HIS) is an Internet-based system that supports sharing of hydrologic data. HIS consists of databases connected using the Internet through Web services, as well as software for data discovery, access, and publication. The HIS system architecture is comprised of servers for publishing and sharing data, a centralized catalog to support cross-server data discovery, and a desktop client to access and analyze data. This paper focuses on HydroServer, the component developed for sharing and publishing space-time hydrologic datasets. A HydroServer is a computer server that contains a collection of databases, web services, tools, and software applications that allow data producers to store, publish, and manage the data from an experimental watershed or project site. HydroServer is designed to permit publication of data as part of a distributed national/international system, while still locally managing access to the data. We describe the HydroServer architecture and software stack, including tools for managing and publishing time series data for fixed-point monitoring sites as well as spatially distributed GIS datasets that describe a particular study area, watershed, or region. HydroServer adopts a standards-based approach to data publication, relying on accepted and emerging standards for data storage and transfer. CUAHSI-developed HydroServer code is free, with community code development managed through the CodePlex open source code repository and development system. There is some reliance on widely used commercial software for general-purpose and standard data publication capability. The sharing of data in a common format is one way to stimulate interdisciplinary research and collaboration. It is anticipated that the growing, distributed network of HydroServers will facilitate cross-site comparisons and large-scale studies that synthesize information from diverse settings, making the network as a whole greater than the sum of its parts in advancing hydrologic research. Details of the CUAHSI HIS can be found at http://his.cuahsi.org and at the HydroServer CodePlex site, http://hydroserver.codeplex.com.

  12. Quantifying Scheduling Challenges for Exascale System Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mondragon, Oscar; Bridges, Patrick G.; Jones, Terry R

    2015-01-01

    The move towards high-performance computing (HPC) applications comprised of coupled codes and the need to dramatically reduce data movement is leading to a reexamination of time-sharing vs. space-sharing in HPC systems. In this paper, we discuss and begin to quantify the performance impact of a move away from strict space-sharing of nodes for HPC applications. Specifically, we examine the potential performance cost of time-sharing nodes between application components, we determine whether a simple coordinated scheduling mechanism can address these problems, and we research how suitable simple constraint-based optimization techniques are for solving scheduling challenges in this regime. Our results demonstrate that current general-purpose HPC system software scheduling and resource allocation systems are subject to significant performance deficiencies, which we quantify for six representative applications. Based on these results, we discuss areas in which additional research is needed to meet the scheduling challenges of next-generation HPC systems.
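
    To illustrate the flavor of the coordinated-scheduling question (a toy sketch, not the paper's actual formulation): co-scheduling two coupled application components on time-shared nodes can be treated as a constraint problem in which the components must never contend for the same time slice. The slot count and component names below are invented for illustration.

    from itertools import combinations

    SLOTS = set(range(4))                      # time slices in one frame
    NEED = {"solver": 2, "analytics": 1}       # slices each component needs

    def feasible_schedules():
        """Yield slot assignments in which the coupled components never
        contend for the same time slice (the coordination constraint)."""
        for solver in combinations(sorted(SLOTS), NEED["solver"]):
            rest = SLOTS - set(solver)
            for analytics in combinations(sorted(rest), NEED["analytics"]):
                yield {"solver": solver, "analytics": analytics}

    for schedule in feasible_schedules():
        print(schedule)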

  13. Using a Foundational Ontology for Reengineering a Software Enterprise Ontology

    NASA Astrophysics Data System (ADS)

    Perini Barcellos, Monalessa; de Almeida Falbo, Ricardo

    The knowledge about software organizations is considerably relevant to software engineers. The use of a common vocabulary for representing the useful knowledge about software organizations involved in software projects is important for several reasons, such as to support knowledge reuse and to allow communication and interoperability between tools. Domain ontologies can be used to define a common vocabulary for sharing and reuse of knowledge about some domain. Foundational ontologies can be used for evaluating and re-designing domain ontologies, giving them real-world semantics. This paper presents an evaluation of a Software Enterprise Ontology that was reengineered using the Unified Foundational Ontology (UFO) as a basis.

  14. Low Power, Low Mass, Modular, Multi-band Software-defined Radios

    NASA Technical Reports Server (NTRS)

    Haskins, Christopher B. (Inventor); Millard, Wesley P. (Inventor)

    2013-01-01

    Methods and systems to implement and operate software-defined radios (SDRs). An SDR may be configured to perform a combination of fractional and integer frequency synthesis and direct digital synthesis under control of a digital signal processor, which may provide a set of relatively agile, flexible, low-noise, and low-spurious timing and frequency conversion signals, and which may be used to maintain a transmit path coherent with a receive path. Frequency synthesis may include dithering to provide additional precision. The SDR may include task-specific software-configurable systems to perform tasks in accordance with software-defined parameters or personalities. The SDR may include a hardware interface system to control hardware components, and a host interface system to provide an interface to the SDR with respect to a host system. The SDR may be configured for one or more of communications, navigation, radio science, and sensors.
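
    As a rough illustration of how dithering buys frequency precision in direct digital synthesis (a sketch of the general technique, not the patented design): the phase-accumulator tuning word is toggled between adjacent integers so that its average equals the desired fractional value. The clock rate and accumulator width below are assumed values.

    import numpy as np

    ACC_BITS = 32          # phase-accumulator width (assumed)
    F_CLK = 100e6          # reference clock in Hz (assumed)

    def dds_samples(f_target, n):
        """Generate n sinusoid samples whose average frequency hits f_target
        by dithering the integer tuning word with a first-order accumulator."""
        word = f_target * 2**ACC_BITS / F_CLK
        base, frac = int(word), word - int(word)
        acc, err, out = 0, 0.0, []
        for _ in range(n):
            err += frac                       # accumulate fractional residue
            bump = err >= 1.0                 # dither: occasionally step +1
            err -= 1.0 if bump else 0.0
            acc = (acc + base + bump) % 2**ACC_BITS
            out.append(np.sin(2 * np.pi * acc / 2**ACC_BITS))
        return np.array(out)

    samples = dds_samples(1.2345678e6, 4096)  # frequency between integer steps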

  15. The component-based architecture of the HELIOS medical software engineering environment.

    PubMed

    Degoulet, P; Jean, F C; Engelmann, U; Meinzer, H P; Baud, R; Sandblad, B; Wigertz, O; Le Meur, R; Jagermann, C

    1994-12-01

    The constitution of highly integrated health information networks and the growth of multimedia technologies raise new challenges for the development of medical applications. We describe in this paper the general architecture of the HELIOS medical software engineering environment devoted to the development and maintenance of multimedia distributed medical applications. HELIOS is made of a set of software components, federated by a communication channel called the HELIOS Unification Bus. The HELIOS kernel includes three main components, the Analysis-Design and Environment, the Object Information System and the Interface Manager. HELIOS services consist in a collection of toolkits providing the necessary facilities to medical application developers. They include Image Related services, a Natural Language Processor, a Decision Support System and Connection services. The project gives special attention to both object-oriented approaches and software re-usability that are considered crucial steps towards the development of more reliable, coherent and integrated applications.

  16. Localization to delocalization crossover in a driven nonlinear cavity array

    NASA Astrophysics Data System (ADS)

    Brown, Oliver T.; Hartmann, Michael J.

    2018-05-01

    We study nonlinear cavity arrays where the particle relaxation rate in each cavity increases with the excitation number. We show that coherent parametric inputs can drive such arrays into states with commensurate filling that form non-equilibrium analogs of Mott insulating states. We explore the boundaries of the Mott insulating phase and the crossover to a delocalized phase with spontaneous first-order coherence. While sharing many similarities with the Mott insulator to superfluid transition in equilibrium, the phase diagrams we find also show marked differences. In particular, the off-diagonal order does not become long-range, since the influence of dephasing processes increases with increasing tunneling rates.
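
    Schematically, the kind of driven-dissipative lattice model the abstract describes can be written as below, in standard notation; this is an assumed generic form (the paper's precise Hamiltonian and dissipators may differ), with Kerr nonlinearity U, tunneling J, a two-photon parametric drive of amplitude \Omega at frequency \omega_p, and relaxation rates \gamma_n that grow with the excitation number n.

    H = \sum_j \Big[ \omega_c\, a_j^\dagger a_j + \tfrac{U}{2}\, a_j^{\dagger 2} a_j^{2}
        + \big( \Omega\, e^{-i \omega_p t}\, a_j^{\dagger 2} + \text{h.c.} \big) \Big]
        - J \sum_{\langle j,k \rangle} a_j^\dagger a_k

    \dot{\rho} = -i\,[H, \rho]
        + \sum_j \sum_{n \ge 0} \gamma_{n+1}\, \mathcal{D}\big[\, |n\rangle_j\langle n{+}1| \,\big]\rho,
    \qquad \mathcal{D}[L]\rho = L \rho L^\dagger - \tfrac{1}{2}\{ L^\dagger L, \rho \}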

  17. High power CO2 coherent ladar haven't quit the stage of military affairs

    NASA Astrophysics Data System (ADS)

    Zhang, Heyong

    2015-05-01

    The invention of the laser in 1960 created the possibility of using a source of coherent light as a transmitter for a laser radar (ladar). Coherent ladar shares many of the basic features of more common microwave radars. However, it is the extremely short operating wavelength of lasers that introduces new military applications, especially in the areas of missile identification, space target tracking, remote range finding, camouflage discrimination, and toxic agent detection. Therefore, the most popular applications, such as laser imaging and ranging, were focused on the CO2 laser in the last few decades. During the development of solid-state and fiber lasers, however, some have argued that the CO2 laser will disappear from the military and industrial fields, replaced by solid-state and fiber lasers, and that coherent CO2 laser radar will share the same destiny in military affairs. In my opinion, however, the high power CO2 laser will remain the most important laser source for laser radar and countermeasures in the future.

  18. Coherent control of plasmonic nanoantennas using optical eigenmodes

    NASA Astrophysics Data System (ADS)

    Kosmeier, Sebastian; de Luca, Anna Chiara; Zolotovskaya, Svetlana; di Falco, Andrea; Dholakia, Kishan; Mazilu, Michael

    2013-05-01

    The last decade has seen subwavelength focusing of the electromagnetic field in the proximity of nanoplasmonic structures with various designs. However, a shared issue is the spatial confinement of the field, which is largely inflexible and limited to fixed locations determined by the geometry of the nanostructures, and this hampers many applications. Here, we coherently address, numerically and experimentally, single and multiple plasmonic nanostructures chosen from a given array, resorting to the principle of optical eigenmodes. By decomposing the light field into optical eigenmodes specifically tailored to the nanostructure, we create subwavelength, selective, and dynamic control of the incident light. The coherent control of plasmonic nanoantennas using this approach shows almost zero crosstalk. This approach is applicable even in the presence of large transmission aberrations, such as those present in holographic diffusers and multimode fibres. The method presents a paradigm shift for the addressing of plasmonic nanostructures by light.
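
    The optical-eigenmode principle admits a compact statement (standard in the optical eigenmodes literature; the specific operator below is an assumed example): any quadratic measure of the field, such as the intensity integrated over a chosen nanoantenna region A, is a Hermitian form in the superposition coefficients a_j of the probe fields E_j, and the eigenvectors of that form give orthogonal addressing modes.

    m(\mathbf{a}) = \mathbf{a}^\dagger M\, \mathbf{a},
    \qquad M_{jk} = \int_{A} E_j^{*}(\mathbf{r})\, E_k(\mathbf{r})\, d^2 r,
    \qquad M \mathbf{v}_i = \lambda_i \mathbf{v}_i

    Illuminating with the eigenvector of the largest eigenvalue maximizes the measure at region A; because distinct eigenmodes are orthogonal with respect to the measure, addressing different antennas with different eigenmodes yields the near-zero crosstalk reported in the abstract.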

  19. SIF 3.0

    ERIC Educational Resources Information Center

    Waters, John K.

    2009-01-01

    This article introduces Schools Interoperability Framework (SIF), a specification for data sharing among educational software applications that has grown to 10 disparate software applications. This new version (code name Columbus) is likely to give districts more vendors to choose from--maybe a lot more--because it will be arriving with a profound…

  20. Does Your Graphing Software Real-ly Work?

    ERIC Educational Resources Information Center

    Marchand, R. J.; McDevitt, T. J.; Bosse, Michael J.; Nandakumar, N. R.

    2007-01-01

    Many popular mathematical software products including Maple, Mathematica, Derive, Mathcad, Matlab, and some of the TI calculators produce incorrect graphs because they use complex arithmetic instead of "real" arithmetic. This article expounds on this issue, provides possible remedies for instructors to share with their students, and demonstrates…

  1. Global Situational Awareness with Free Tools

    DTIC Science & Technology

    2015-01-15

    Client Technical Solutions • Software Engineering Measurement and Analysis • Architecture Practices • Product Line Practice • Team Software Process...multiple data sources • Snort (Snorby on Security Onion ) • Nagios • SharePoint RSS • Flow • Others • Leverage standard data formats • Keyhole Markup Language

  2. 47 CFR 59.3 - Information concerning deployment of new services and equipment.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... services and equipment, including any software or upgrades of software integral to the use or operation of... services and equipment. 59.3 Section 59.3 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES (CONTINUED) INFRASTRUCTURE SHARING § 59.3 Information concerning deployment of...

  3. CONTIN XPCS: Software for Inverse Transform Analysis of X-Ray Photon Correlation Spectroscopy Dynamics

    PubMed Central

    Narayanan, Suresh; Zhang, Fan; Kuzmenko, Ivan; Ilavsky, Jan

    2018-01-01

    X-ray photon correlation spectroscopy (XPCS) and dynamic light scattering (DLS) both reveal dynamics using coherent scattering, but X-rays permit investigation of dynamics in a much more diverse array of materials. Heterogeneous dynamics occur in many such materials, and we showed how classic tools employed in the analysis of heterogeneous DLS dynamics extend to XPCS, revealing additional information that conventional Kohlrausch exponential fitting obscures. This work presents the software implementation of inverse transform analysis of XPCS data, called CONTIN XPCS, an extension of traditional CONTIN that accommodates dynamics encountered in equilibrium XPCS measurements. PMID:29875507

  4. CONTIN XPCS: Software for Inverse Transform Analysis of X-Ray Photon Correlation Spectroscopy Dynamics.

    PubMed

    Andrews, Ross N; Narayanan, Suresh; Zhang, Fan; Kuzmenko, Ivan; Ilavsky, Jan

    2018-02-01

    X-ray photon correlation spectroscopy (XPCS) and dynamic light scattering (DLS) both reveal dynamics using coherent scattering, but X-rays permit investigation of dynamics in a much more diverse array of materials. Heterogeneous dynamics occur in many such materials, and we showed how classic tools employed in the analysis of heterogeneous DLS dynamics extend to XPCS, revealing additional information that conventional Kohlrausch exponential fitting obscures. This work presents the software implementation of inverse transform analysis of XPCS data, called CONTIN XPCS, an extension of traditional CONTIN that accommodates dynamics encountered in equilibrium XPCS measurements.
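
    For readers unfamiliar with CONTIN-style inversion, here is a minimal sketch of the underlying idea: recovering a non-negative distribution of relaxation rates G(Gamma) from a correlation decay by Tikhonov-regularized non-negative least squares. The grids, the synthetic two-mode data, and the regularization weight are illustrative assumptions; this is not the CONTIN XPCS code itself.

    import numpy as np
    from scipy.optimize import nnls

    tau = np.logspace(-3, 1, 60)                # delay times in s (assumed grid)
    gamma = np.logspace(-1, 3, 40)              # relaxation-rate grid in 1/s
    K = np.exp(-np.outer(tau, gamma))           # Laplace kernel exp(-Gamma*tau)

    # Synthetic "measurement": two relaxation modes plus noise.
    g1 = 0.7 * np.exp(-3.0 * tau) + 0.3 * np.exp(-80.0 * tau)
    g1 += np.random.default_rng(0).normal(0.0, 0.005, tau.size)

    alpha = 0.1                                   # regularization weight (tunable)
    L = np.diff(np.eye(gamma.size), n=2, axis=0)  # second-difference smoother
    A = np.vstack([K, alpha * L])
    b = np.concatenate([g1, np.zeros(L.shape[0])])
    G, residual = nnls(A, b)                      # non-negative G(Gamma) estimate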

  5. Toward the Geoscience Paper of the Future: Best practices for documenting and sharing research from data to software to provenance

    NASA Astrophysics Data System (ADS)

    Gil, Yolanda; David, Cédric H.; Demir, Ibrahim; Essawy, Bakinam T.; Fulweiler, Robinson W.; Goodall, Jonathan L.; Karlstrom, Leif; Lee, Huikyo; Mills, Heath J.; Oh, Ji-Hyun; Pierce, Suzanne A.; Pope, Allen; Tzeng, Mimi W.; Villamizar, Sandra R.; Yu, Xuan

    2016-10-01

    Geoscientists now live in a world rich with digital data and methods, and their computational research cannot be fully captured in traditional publications. The Geoscience Paper of the Future (GPF) presents an approach to fully document, share, and cite all their research products including data, software, and computational provenance. This article proposes best practices for GPF authors to make data, software, and methods openly accessible, citable, and well documented. The publication of digital objects empowers scientists to manage their research products as valuable scientific assets in an open and transparent way that enables broader access by other scientists, students, decision makers, and the public. Improving documentation and dissemination of research will accelerate the pace of scientific discovery by improving the ability of others to build upon published work.

  6. Comparison of retinal thickness by Fourier-domain optical coherence tomography and OCT retinal image analysis software segmentation analysis derived from Stratus optical coherence tomography images

    NASA Astrophysics Data System (ADS)

    Tátrai, Erika; Ranganathan, Sudarshan; Ferencz, Mária; Debuc, Delia Cabrera; Somfai, Gábor Márk

    2011-05-01

    Purpose: To compare thickness measurements between Fourier-domain optical coherence tomography (FD-OCT) and time-domain OCT images analyzed with a custom-built OCT retinal image analysis software (OCTRIMA). Methods: Macular mapping (MM) by StratusOCT and MM5 and MM6 scanning protocols by an RTVue-100 FD-OCT device are performed on 11 subjects with no retinal pathology. Retinal thickness (RT) and the thickness of the ganglion cell complex (GCC) obtained with the MM6 protocol are compared for each early treatment diabetic retinopathy study (ETDRS)-like region with corresponding results obtained with OCTRIMA. RT results are compared by analysis of variance with Dunnett post hoc test, while GCC results are compared by paired t-test. Results: A high correlation is obtained for RT between OCTRIMA and the MM5 and MM6 protocols. In all regions, StratusOCT provides the lowest RT values (mean difference 43 +/- 8 μm compared to OCTRIMA, and 42 +/- 14 μm compared to RTVue MM6). All RTVue GCC measurements are significantly thicker (mean difference between 6 and 12 μm) than the GCC measurements of OCTRIMA. Conclusion: High correspondence is obtained not only for RT but also for the segmentation of intraretinal layers between FD-OCT and StratusOCT-derived OCTRIMA analysis. However, a correction factor is required to compensate for OCT-specific differences to make measurements comparable across OCT devices.
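
    The "correction factor" in the conclusion amounts to an additive device offset estimated from paired measurements. A toy sketch with invented numbers (of the same order as the reported ~43 μm mean difference):

    import numpy as np

    # Invented paired measurements (um); magnitudes echo the reported offset.
    stratus = np.array([250.0, 255.0, 248.0, 260.0])   # device A thickness
    octrima = np.array([293.0, 298.0, 290.0, 303.0])   # device B thickness

    offset = np.mean(octrima - stratus)    # estimated device-specific bias
    stratus_corrected = stratus + offset   # comparable across devices
    print(f"correction factor: {offset:.1f} um")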

  7. Perspectives on Sharing Models and Related Resources in Computational Biomechanics Research.

    PubMed

    Erdemir, Ahmet; Hunter, Peter J; Holzapfel, Gerhard A; Loew, Leslie M; Middleton, John; Jacobs, Christopher R; Nithiarasu, Perumal; Löhner, Rainald; Wei, Guowei; Winkelstein, Beth A; Barocas, Victor H; Guilak, Farshid; Ku, Joy P; Hicks, Jennifer L; Delp, Scott L; Sacks, Michael; Weiss, Jeffrey A; Ateshian, Gerard A; Maas, Steve A; McCulloch, Andrew D; Peng, Grace C Y

    2018-02-01

    The role of computational modeling for biomechanics research and related clinical care will be increasingly prominent. The biomechanics community has been developing computational models routinely for exploration of the mechanics and mechanobiology of diverse biological structures. As a result, a large array of models, data, and discipline-specific simulation software has emerged to support endeavors in computational biomechanics. Sharing computational models and related data and simulation software was first a utilitarian interest; now, it is a necessity. Exchange of models, in support of knowledge exchange provided by scholarly publishing, has important implications. Specifically, model sharing can facilitate assessment of reproducibility in computational biomechanics and can provide an opportunity for repurposing and reuse, and a venue for medical training. The community's desire to investigate biological and biomechanical phenomena crossing multiple systems, scales, and physical domains also motivates sharing of modeling resources, as blending of models developed by domain experts will be a required step for comprehensive simulation studies as well as for the enhancement of their rigor and reproducibility. The goal of this paper is to understand current perspectives in the biomechanics community on the sharing of computational models and related resources. Opinions on opportunities, challenges, and pathways to model sharing, particularly as part of the scholarly publishing workflow, were sought. A group of journal editors and a handful of investigators active in computational biomechanics were approached to collect short opinion pieces as part of a larger effort of the IEEE EMBS Computational Biology and the Physiome Technical Committee to address model reproducibility through publications. A synthesis of these opinion pieces indicates that the community recognizes the necessity and usefulness of model sharing. There is a strong will to facilitate model sharing, and there are corresponding initiatives by the scientific journals. Outside the publishing enterprise, infrastructure to facilitate model sharing in biomechanics exists, and simulation software developers are interested in accommodating the community's needs for sharing of modeling resources. Encouragement for the use of standardized markups, concerns related to quality assurance, acknowledgement of increased burden, and the importance of stewardship of resources are noted. In the short term, it is advisable that the community build upon recent strategies and experiment with new pathways for continued demonstration of model sharing, its promotion, and its utility. Nonetheless, the need for a long-term strategy to unify approaches to sharing computational models and related resources is acknowledged. Development of a sustainable platform supported by a culture of open model sharing will likely evolve through continued and inclusive discussions bringing all stakeholders to the table, e.g., by possibly establishing a consortium.

  8. The role of open-source software in innovation and standardization in radiology.

    PubMed

    Erickson, Bradley J; Langer, Steve; Nagy, Paul

    2005-11-01

    The use of open-source software (OSS), in which developers release the source code to applications they have developed, is popular in the software industry. This is done to allow others to modify and improve software (which may or may not be shared back to the community) and to allow others to learn from the software. Radiology was an early participant in this model, supporting OSS that implemented the ACR-National Electrical Manufacturers Association (now Digital Imaging and Communications in Medicine) standard for medical image communications. In radiology and in other fields, OSS has promoted innovation and the adoption of standards. Popular OSS is of high quality because access to source code allows many people to identify and resolve errors. Open-source software is analogous to the peer-review scientific process: one must be able to see and reproduce results to understand and promote what is shared. The authors emphasize that support for OSS need not threaten vendors; most vendors embrace and benefit from standards. Open-source development does not replace vendors but more clearly defines their roles, typically focusing on areas in which proprietary differentiators benefit customers and on professional services such as implementation planning and service. Continued support for OSS is essential for the success of our field.

  9. Architecture independent environment for developing engineering software on MIMD computers

    NASA Technical Reports Server (NTRS)

    Valimohamed, Karim A.; Lopez, L. A.

    1990-01-01

    Engineers are constantly faced with solving problems of increasing complexity and detail. Multiple Instruction stream Multiple Data stream (MIMD) computers have been developed to overcome the performance limitations of serial computers. The hardware architectures of MIMD computers vary considerably and are much more sophisticated than serial computers. Developing large scale software for a variety of MIMD computers is difficult and expensive. There is a need to provide tools that facilitate programming these machines. First, the issues that must be considered to develop those tools are examined. The two main areas of concern were architecture independence and data management. Architecture independent software facilitates software portability and improves the longevity and utility of the software product. It provides some form of insurance for the investment of time and effort that goes into developing the software. The management of data is a crucial aspect of solving large engineering problems. It must be considered in light of the new hardware organizations that are available. Second, the functional design and implementation of a software environment that facilitates developing architecture independent software for large engineering applications are described. The topics of discussion include: a description of the model that supports the development of architecture independent software; identifying and exploiting concurrency within the application program; data coherence; engineering data base and memory management.

  10. A hybrid integrated services digital network-internet protocol solution for resident education.

    PubMed

    Erickson, Delnora; Greer, Lester; Belard, Arnaud; Tinnel, Brent; O'Connell, John

    2010-05-01

    The purpose of this study was to explore the effectiveness of incorporating Web-based application sharing of virtual medical simulation software within a multipoint video teleconference (VTC) as a training tool in graduate medical education. National Capital Consortium Radiation Oncology Residency Program resident and attending physicians participated in dosimetry teaching sessions held via VTC using Acrobat Connect application sharing. Residents at remote locations could take turns designing radiation treatments using standard three-dimensional planning software, whereas instructors gave immediate feedback and demonstrated proper techniques. Immediately after each dosimetry lesson, residents were asked to complete a survey that evaluated the effectiveness of the session. At the end of a 3-month trial of using Adobe Connect, residents completed a final survey that compared this teaching technology to the prior VTC-alone method. The mean difference from equality across all quality measures from the weekly survey was 0.8, where 0 indicated neither enhanced nor detracted from the learning experience and 1 indicated a minor enhancement in the learning experience. The mean difference from equality across all measures from the final survey comparing use of application sharing with VTC to VTC alone was 1.5, where 1 indicated slightly better and 2 indicated a somewhat better experience. The teaching efficacy of multipoint VTC is perceived by medical residents to be more effective when complemented by application-sharing software such as Adobe Acrobat Connect.

  11. Code Sharing and Collaboration: Experiences from the Scientist's Expert Assistant Project and their Relevance to the Virtual Observatory

    NASA Technical Reports Server (NTRS)

    Jones, Jeremy; Grosvenor, Sandy; Wolf, Karl; Li, Connie; Koratkar, Anuradha; Powers, Edward I. (Technical Monitor)

    2001-01-01

    In the Virtual Observatory (VO), software tools will perform the functions that have traditionally been performed by physical observatories and their instruments. These tools will not be adjuncts to VO functionality but will make up the very core of the VO. Consequently, the tradition of observatory and system independent tools serving a small user base is not valid for the VO. For the VO to succeed, we must improve software collaboration and code sharing between projects and groups. A significant goal of the Scientist's Expert Assistant (SEA) project has been promoting effective collaboration and code sharing between groups. During the past three years, the SEA project has been developing prototypes for new observation planning software tools and strategies. Initially funded by the Next Generation Space Telescope, parts of the SEA code have since been adopted by the Space Telescope Science Institute. SEA has also supplied code for SOFIA, the SIRTF planning tools, and the JSky Open Source Java library. The potential benefits of sharing code are clear. The recipient gains functionality for considerably less cost. The provider gains additional developers working with their code. If enough user groups adopt a set of common code and tools, de facto standards can emerge (as demonstrated by the success of the FITS standard). Code sharing also raises a number of challenges related to the management of the code. In this talk, we will review our experiences with SEA - both successes and failures - and offer some lessons learned that may promote further successes in collaboration and re-use.

  12. Code Sharing and Collaboration: Experiences From the Scientist's Expert Assistant Project and Their Relevance to the Virtual Observatory

    NASA Technical Reports Server (NTRS)

    Korathkar, Anuradha; Grosvenor, Sandy; Jones, Jeremy; Li, Connie; Mackey, Jennifer; Neher, Ken; Obenschain, Arthur F. (Technical Monitor)

    2001-01-01

    In the Virtual Observatory (VO), software tools will perform the functions that have traditionally been performed by physical observatories and their instruments. These tools will not be adjuncts to VO functionality but will make up the very core of the VO. Consequently, the tradition of observatory and system independent tools serving a small user base is not valid for the VO. For the VO to succeed, we must improve software collaboration and code sharing between projects and groups. A significant goal of the Scientist's Expert Assistant (SEA) project has been promoting effective collaboration and code sharing among groups. During the past three years, the SEA project has been developing prototypes for new observation planning software tools and strategies. Initially funded by the Next Generation Space Telescope, parts of the SEA code have since been adopted by the Space Telescope Science Institute. SEA has also supplied code for the SIRTF (Space Infrared Telescope Facility) planning tools, and the JSky Open Source Java library. The potential benefits of sharing code are clear. The recipient gains functionality for considerably less cost. The provider gains additional developers working with their code. If enough user groups adopt a set of common code and tools, de facto standards can emerge (as demonstrated by the success of the FITS standard). Code sharing also raises a number of challenges related to the management of the code. In this talk, we will review our experiences with SEA--both successes and failures, and offer some lessons learned that might promote further successes in collaboration and re-use.

  13. Multimodal ophthalmic imaging using swept source spectrally encoded scanning laser ophthalmoscopy and optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Malone, Joseph D.; El-Haddad, Mohamed T.; Tye, Logan A.; Majeau, Lucas; Godbout, Nicolas; Rollins, Andrew M.; Boudoux, Caroline; Tao, Yuankai K.

    2016-03-01

    Scanning laser ophthalmoscopy (SLO) and optical coherence tomography (OCT) benefit clinical diagnostic imaging in ophthalmology by enabling in vivo noninvasive en face and volumetric visualization of retinal structures, respectively. Spectral encoding methods enable confocal imaging through fiber optics and reduce system complexity. Previous applications in ophthalmic imaging include spectrally encoded confocal scanning laser ophthalmoscopy (SECSLO) and a combined SECSLO-OCT system for image guidance, tracking, and registration. However, spectrally encoded imaging suffers from speckle noise because each spectrally encoded channel is effectively monochromatic. Here, we demonstrate in vivo human retinal imaging using a swept source spectrally encoded scanning laser ophthalmoscope and OCT (SS-SESLO-OCT) at 1060 nm. SS-SESLO-OCT uses a shared 100 kHz Axsun swept source and shared scanner and imaging optics, and both channels are detected simultaneously on a shared, dual-channel high-speed digitizer. SESLO illumination and detection were performed using the single mode core and multimode inner cladding of a double clad fiber coupler, respectively, to preserve lateral resolution while improving collection efficiency and reducing speckle contrast at the expense of confocality. Concurrent en face SESLO and cross-sectional OCT images were acquired with 1376 x 500 pixels at 200 frames-per-second. Our system design is compact and uses a shared light source, imaging optics, and digitizer, which reduces overall system complexity and ensures inherent co-registration between the SESLO and OCT FOVs. En face SESLO images acquired concurrently with OCT cross-sections enable lateral motion tracking and three-dimensional volume registration, with broad applications in multivolume OCT averaging, image mosaicking, and intraoperative instrument tracking.

  14. Information Technology Wants to Be Free

    ERIC Educational Resources Information Center

    Poritz, Jonathan A.

    2012-01-01

    It makes sense for college and university faculty to ally with the free and open-source software community. They share common values. A marvelous additional benefit is that free software on campuses would significantly advance pedagogy and scholarship, increase efficiency, and save money. Only unquestioning obedience to market fundamentalism--or…

  15. Security Code Red or Ready? Leaders Sharing--For Tech Leaders

    ERIC Educational Resources Information Center

    Hall, Don; Kelly, Pat

    2005-01-01

    Increasingly, teachers rely on computer software and networks to both enhance curriculum management and provide engaging learning opportunities in instruction. New software is enabling more frequent formative assessments to better focus day-to-day lessons on the unique needs of individual learners. Administrators use increasingly complex data…

  16. Data and Models as Social Objects in the HydroShare System for Collaboration in the Hydrology Community and Beyond

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Band, L. E.; Merwade, V.; Couch, A.; Hooper, R. P.; Maidment, D. R.; Dash, P. K.; Stealey, M.; Yi, H.; Gan, T.; Castronova, A. M.; Miles, B.; Li, Z.; Morsy, M. M.; Crawley, S.; Ramirez, M.; Sadler, J.; Xue, Z.; Bandaragoda, C.

    2016-12-01

    How do you share and publish hydrologic data and models for a large collaborative project? HydroShare is a new, web-based system for sharing hydrologic data and models with specific functionality aimed at making collaboration easier. HydroShare has been developed with U.S. National Science Foundation support under the auspices of the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) to support the collaboration and community cyberinfrastructure needs of the hydrology research community. Within HydroShare, we have developed new functionality for creating datasets, describing them with metadata, and sharing them with collaborators. We cast hydrologic datasets and models as "social objects" that can be shared, collaborated around, annotated, published and discovered. In addition to data and model sharing, HydroShare supports web application programs (apps) that can act on data stored in HydroShare, just as software programs on your PC act on your data locally. This can free you from some of the limitations of local computing capacity and challenges in installing and maintaining software on your own PC. HydroShare's web-based cyberinfrastructure can take work off your desk or laptop computer and onto infrastructure or "cloud" based data and processing servers. This presentation will describe HydroShare's collaboration functionality that enables both public and private sharing with individual users and collaborative user groups, and makes it easier for collaborators to iterate on shared datasets and models, creating multiple versions along the way, and publishing them with a permanent landing page, metadata description, and citable Digital Object Identifier (DOI) when the work is complete. This presentation will also describe the web app architecture that supports interoperability with third party servers functioning as application engines for analysis and processing of big hydrologic datasets. While developed to support the cyberinfrastructure needs of the hydrology community, the informatics infrastructure for programmatic interoperability of web resources has a generality beyond the solution of hydrology problems that will be discussed.
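
    The web-app interoperability described above rests on programmatic access to resources over HTTP. A hedged sketch of what a client-side call might look like — the endpoint path, query parameter, and response fields here are assumptions for illustration, not the documented HydroShare API contract:

    import requests

    BASE = "https://www.hydroshare.org/hsapi"   # assumed base URL

    def find_resources(keyword):
        """Query a resource listing endpoint and return matching titles."""
        resp = requests.get(f"{BASE}/resource/", params={"subject": keyword})
        resp.raise_for_status()
        return [r.get("resource_title") for r in resp.json().get("results", [])]

    print(find_resources("streamflow"))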

  17. Specifying Software Behavior for Requirements and Design

    DTIC Science & Technology

    2013-01-01

    e.g., Behavior Hiding is comprised of the Function Driver and Shared Services modules. Blacked-out modules, which are concerned with mechanisms for...and Shared Services modules. "The Function Driver Module consists of a set of modules called Function Drivers; each Function Driver is the sole...system environment. Functions that capture the rules determining these output values specify that behavior. The Shared Services module concerns aspects of

  18. Airborne Wind Profiling With the Data Acquisition and Processing System for a Pulsed 2-Micron Coherent Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, Jeffrey Y.; Koch, Grady J.; Kavaya, Michael J.

    2012-01-01

    A pulsed 2-micron coherent Doppler lidar system at NASA Langley Research Center in Virginia flew on NASA's DC-8 aircraft during the NASA Genesis and Rapid Intensification Processes (GRIP) campaign in the summer of 2010. The participation was part of the project Doppler Aerosol Wind Lidar (DAWN) Air. Selected results of airborne wind profiling are presented and compared with dropsonde data for verification purposes. Panoramic presentations of different wind parameters over a nominal observation time span are also presented for selected GRIP data sets. The real-time data acquisition and analysis software that was employed during the GRIP campaign is introduced with its unique features.

  19. NA-42 TI Shared Software Component Library FY2011 Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knudson, Christa K.; Rutz, Frederick C.; Dorow, Kevin E.

    The NA-42 TI program initiated an effort in FY2010 to standardize its software development efforts with the long term goal of migrating toward a software management approach that will allow for the sharing and reuse of code developed within the TI program, improve integration, ensure a level of software documentation, and reduce development costs. The Pacific Northwest National Laboratory (PNNL) has been tasked with two activities that support this mission. PNNL has been tasked with the identification, selection, and implementation of a Shared Software Component Library. The intent of the library is to provide a common repository that is accessible by all authorized NA-42 software development teams. The repository facilitates software reuse through a searchable and easy to use web based interface. As software is submitted to the repository, the component registration process captures meta-data and provides version control for compiled libraries, documentation, and source code. This meta-data is then available for retrieval and review as part of library search results. In FY2010, PNNL and staff from the Remote Sensing Laboratory (RSL) teamed up to develop a software application with the goal of replacing the aging Aerial Measuring System (AMS). The application under development includes an Advanced Visualization and Integration of Data (AVID) framework and associated AMS modules. Throughout development, PNNL and RSL have utilized a common AMS code repository for collaborative code development. The AMS repository is hosted by PNNL, is restricted to the project development team, is accessed via two different geographic locations and continues to be used. The knowledge gained from the collaboration and hosting of this repository in conjunction with PNNL software development and systems engineering capabilities were used in the selection of a package to be used in the implementation of the software component library on behalf of NA-42 TI. The second task managed by PNNL is the development and continued maintenance of the NA-42 TI Software Development Questionnaire. This questionnaire is intended to help software development teams working under NA-42 TI in documenting their development activities. When sufficiently completed, the questionnaire illustrates that the software development activities recorded incorporate significant aspects of the software engineering lifecycle. The questionnaire template is updated as comments are received from NA-42 and/or its development teams and revised versions distributed to those using the questionnaire. PNNL also maintains a list of questionnaire recipients. The blank questionnaire template, the AVID and AMS software being developed, and the completed AVID AMS specific questionnaire are being used as the initial content to be established in the TI Component Library. This report summarizes the approach taken to identify requirements, search for and evaluate technologies, and the approach taken for installation of the software needed to host the component library. Additionally, it defines the process by which users request access for the contribution and retrieval of library content.
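
    A hypothetical sketch of the kind of meta-data such a component-registration step might capture for search and version control; the field names and the example entry are invented for illustration, not the NA-42 TI schema:

    from dataclasses import dataclass, field

    @dataclass
    class ComponentRecord:
        name: str
        version: str
        description: str
        source_url: str                                # version-controlled source
        binaries: list = field(default_factory=list)   # compiled libraries
        docs: list = field(default_factory=list)       # documentation artifacts
        keywords: list = field(default_factory=list)   # searchable meta-data

    catalog = [ComponentRecord(
        name="avid-ams", version="0.3.1",
        description="AMS modules for the AVID framework",
        source_url="https://repo.example/avid-ams",
        keywords=["aerial", "measuring", "visualization"])]

    hits = [c.name for c in catalog if "aerial" in c.keywords]  # simple search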

  20. From Content to Practice: Sharing Educational Practice in Edu-Sharing

    ERIC Educational Resources Information Center

    Klebl, Michael; Kramer, Bernd J.; Zobel, Annett

    2010-01-01

    For technology-enhanced learning, the idea of "learning objects" transfers the technologies of content management, methods of software engineering and principles of open access to educational resources. This paper reports on CampusContent, a research project and competence centre for e-learning at FernUniversität in Hagen that designed…

  1. Enrolments, Funding and Student Staff Ratios by Sector. Policy Note. Number 2

    ERIC Educational Resources Information Center

    Group of Eight (NJ1), 2011

    2011-01-01

    This briefing examines government and private funding across educational sectors. Key findings include: (1) Differences in funding for public and private education across the sectors: (a) do not reflect policy coherence; and (b) entrench inequities; (2) All sectors receive funding from both public and private sources, though the shares vary.…

  2. School Community Collaboration: Comparing Three Initiatives. Brief to Policymakers, No. 6.

    ERIC Educational Resources Information Center

    Stone, Calvin R.

    As part of a decentralization effort, San Diego schools are developing various community partnerships and joining with other agencies to share the resources of school, community, social service, and health providers. Interagency collaboration may result in a more integrated, coherent service delivery to an increasingly diverse student population.…

  3. Knowledge-Based Inferences across the Hemispheres: Domain Makes a Difference

    ERIC Educational Resources Information Center

    Shears, Connie; Hawkins, Amanda; Varner, Andria; Lewis, Lindsey; Heatley, Jennifer; Twachtmann, Lisa

    2008-01-01

    Language comprehension occurs when the left-hemisphere (LH) and the right-hemisphere (RH) share information derived from discourse [Beeman, M. J., Bowden, E. M., & Gernsbacher, M. A. (2000). Right and left hemisphere cooperation for drawing predictive and coherence inferences during normal story comprehension. "Brain and Language, 71", 310-336].…

  4. OIPAV: an integrated software system for ophthalmic image processing, analysis and visualization

    NASA Astrophysics Data System (ADS)

    Zhang, Lichun; Xiang, Dehui; Jin, Chao; Shi, Fei; Yu, Kai; Chen, Xinjian

    2018-03-01

    OIPAV (Ophthalmic Images Processing, Analysis and Visualization) is a cross-platform software package specially oriented to ophthalmic images. It provides a wide range of functionalities including data I/O, image processing, interaction, ophthalmic disease detection, data analysis, and visualization to help researchers and clinicians deal with various ophthalmic images such as optical coherence tomography (OCT) images and color fundus photographs. It enables users to easily access image data produced by different imaging devices, facilitates the workflow of processing ophthalmic images, and improves quantitative evaluations. In this paper, we present the system design and functional modules of the platform and demonstrate various applications. With satisfying function scalability and expandability, we believe that the software can be widely applied in the ophthalmology field.

  5. Models for Deploying Open Source and Commercial Software to Support Earth Science Data Processing and Distribution

    NASA Astrophysics Data System (ADS)

    Yetman, G.; Downs, R. R.

    2011-12-01

    Software deployment is needed to process and distribute scientific data throughout the data lifecycle. Developing software in-house can take software development teams away from other software development projects and can require efforts to maintain the software over time. Adopting and reusing software and system modules that have been previously developed by others can reduce in-house software development and maintenance costs and can contribute to the quality of the system being developed. A variety of models are available for reusing and deploying software and systems that have been developed by others. These deployment models include open source software, vendor-supported open source software, commercial software, and combinations of these approaches. Deployment in Earth science data processing and distribution has demonstrated the advantages and drawbacks of each model. Deploying open source software offers advantages for developing and maintaining scientific data processing systems and applications. By joining an open source community that is developing a particular system module or application, a scientific data processing team can contribute to aspects of the software development without having to commit to developing the software alone. Communities of interested developers can share the work while focusing on activities that utilize in-house expertise and addresses internal requirements. Maintenance is also shared by members of the community. Deploying vendor-supported open source software offers similar advantages to open source software. However, by procuring the services of a vendor, the in-house team can rely on the vendor to provide, install, and maintain the software over time. Vendor-supported open source software may be ideal for teams that recognize the value of an open source software component or application and would like to contribute to the effort, but do not have the time or expertise to contribute extensively. Vendor-supported software may also have the additional benefits of guaranteed up-time, bug fixes, and vendor-added enhancements. Deploying commercial software can be advantageous for obtaining system or software components offered by a vendor that meet in-house requirements. The vendor can be contracted to provide installation, support and maintenance services as needed. Combining these options offers a menu of choices, enabling selection of system components or software modules that meet the evolving requirements encountered throughout the scientific data lifecycle.

  6. Information Technology. DOD Needs to Strengthen Management of Its Statutorily Mandated Software and System Process Improvement Efforts

    DTIC Science & Technology

    2009-09-01

    NII)/CIO Assistant Secretary of Defense for Networks and Information Integration/Chief Information Officer CMMI Capability Maturity Model...a Web-based portal to share knowledge about software process-related methodologies, such as the SEI's Capability Maturity Model Integration (CMMI), SEI's IDEAL model, and Lean Six Sigma. For example, the portal features content areas such as software acquisition management, the SEI CMMI

  7. Software for Sharing and Management of Information

    NASA Technical Reports Server (NTRS)

    Chen, James R.; Wolfe, Shawn R.; Wragg, Stephen D.

    2003-01-01

    DIAMS is a set of computer programs that implements a system of collaborative agents that serve multiple, geographically distributed users communicating via the Internet. DIAMS provides a user interface as a Java applet that runs on each user's computer and that works within the context of the user's Internet-browser software. DIAMS helps all its users to manage, gain access to, share, and exchange information in databases that they maintain on their computers. One of the DIAMS agents is a personal agent that helps its owner find information most relevant to current needs. It provides software tools and utilities for users to manage their information repositories with dynamic organization and virtual views. Capabilities for generating flexible hierarchical displays are integrated with capabilities for indexed-query searching to support effective access to information. Automatic indexing methods are employed to support users' queries and communication between agents. The catalog of a repository is kept in object-oriented storage to facilitate sharing of information. Collaboration between users is aided by matchmaker agents and by automated exchange of information. The matchmaker agents are designed to establish connections between users who have similar interests and expertise.
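
    A minimal sketch of the indexed-query idea (illustrative only, not DIAMS code): an inverted index maps terms to document identifiers, and set intersection implements conjunctive queries over the repository.

    from collections import defaultdict

    def build_index(docs):
        """Map each term to the set of document ids containing it."""
        index = defaultdict(set)
        for doc_id, text in docs.items():
            for term in text.lower().split():
                index[term].add(doc_id)
        return index

    docs = {1: "shared hydrology data", 2: "shared software agents"}
    index = build_index(docs)
    print(index["shared"] & index["agents"])   # conjunctive query -> {2}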

  8. EDGE3: A web-based solution for management and analysis of Agilent two color microarray experiments

    PubMed Central

    Vollrath, Aaron L; Smith, Adam A; Craven, Mark; Bradfield, Christopher A

    2009-01-01

    Background The ability to generate transcriptional data on the scale of entire genomes has been a boon both in the improvement of biological understanding and in the amount of data generated. The latter, the amount of data generated, has implications when it comes to effective storage, analysis and sharing of these data. A number of software tools have been developed to store, analyze, and share microarray data. However, a majority of these tools do not offer all of these features nor do they specifically target the commonly used two color Agilent DNA microarray platform. Thus, the motivating factor for the development of EDGE3 was to incorporate the storage, analysis and sharing of microarray data in a manner that would provide a means for research groups to collaborate on Agilent-based microarray experiments without a large investment in software-related expenditures or extensive training of end-users. Results EDGE3 has been developed with two major functions in mind. The first function is to provide a workflow process for the generation of microarray data by a research laboratory or a microarray facility. The second is to store, analyze, and share microarray data in a manner that doesn't require complicated software. To satisfy the first function, EDGE3 has been developed as a means to establish a well defined experimental workflow and information system for microarray generation. To satisfy the second function, the software application utilized as the user interface of EDGE3 is a web browser. Within the web browser, a user is able to access the entire functionality, including, but not limited to, the ability to perform a number of bioinformatics-based analyses, collaborate with other research groups through a user-based security model, and access the raw data files and quality control files generated by the software used to extract the signals from an array image. Conclusion Here, we present EDGE3, an open-source, web-based application that allows for the storage, analysis, and controlled sharing of transcription-based microarray data generated on the Agilent DNA platform. In addition, EDGE3 provides a means for managing RNA samples and arrays during the hybridization process. EDGE3 is freely available for download at http://edge.oncology.wisc.edu/. PMID:19732451

  9. EDGE(3): a web-based solution for management and analysis of Agilent two color microarray experiments.

    PubMed

    Vollrath, Aaron L; Smith, Adam A; Craven, Mark; Bradfield, Christopher A

    2009-09-04

    The ability to generate transcriptional data on the scale of entire genomes has been a boon both in the improvement of biological understanding and in the amount of data generated. The latter, the amount of data generated, has implications when it comes to effective storage, analysis and sharing of these data. A number of software tools have been developed to store, analyze, and share microarray data. However, a majority of these tools do not offer all of these features nor do they specifically target the commonly used two color Agilent DNA microarray platform. Thus, the motivating factor for the development of EDGE(3) was to incorporate the storage, analysis and sharing of microarray data in a manner that would provide a means for research groups to collaborate on Agilent-based microarray experiments without a large investment in software-related expenditures or extensive training of end-users. EDGE(3) has been developed with two major functions in mind. The first function is to provide a workflow process for the generation of microarray data by a research laboratory or a microarray facility. The second is to store, analyze, and share microarray data in a manner that doesn't require complicated software. To satisfy the first function, EDGE(3) has been developed as a means to establish a well defined experimental workflow and information system for microarray generation. To satisfy the second function, the software application utilized as the user interface of EDGE(3) is a web browser. Within the web browser, a user is able to access the entire functionality, including, but not limited to, the ability to perform a number of bioinformatics-based analyses, collaborate with other research groups through a user-based security model, and access the raw data files and quality control files generated by the software used to extract the signals from an array image. Here, we present EDGE(3), an open-source, web-based application that allows for the storage, analysis, and controlled sharing of transcription-based microarray data generated on the Agilent DNA platform. In addition, EDGE(3) provides a means for managing RNA samples and arrays during the hybridization process. EDGE(3) is freely available for download at http://edge.oncology.wisc.edu/.

  10. Overview of the LINCS architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fletcher, J.G.; Watson, R.W.

    1982-01-13

    Computing at the Lawrence Livermore National Laboratory (LLNL) has evolved over the past 15 years into a computer-network-based resource-sharing environment. The increasing use of low-cost and high-performance micro, mini and midi computers and commercially available local networking systems will accelerate this trend. Further, even the large-scale computer systems, on which much of the LLNL scientific computing depends, are evolving into multiprocessor systems. It is our belief that the most cost-effective use of this environment will depend on the development of application systems structured into cooperating concurrent program modules (processes) distributed appropriately over different nodes of the environment. A node is defined as one or more processors with a local (shared) high-speed memory. Given the latter view, the environment can be characterized as consisting of: multiple nodes communicating over noisy channels with arbitrary delays and throughput, heterogeneous base resources and information encodings, no single administration controlling all resources, distributed system state, and no uniform time base. The system design problem is: how to turn the heterogeneous base hardware/firmware/software resources of this environment into a coherent set of resources that facilitate development of cost-effective, reliable, and human-engineered applications. We believe the answer lies in developing a layered, communication-oriented distributed system architecture; layered and modular to support ease of understanding, reconfiguration, extensibility, and hiding of implementation or nonessential local details; communication-oriented because that is a central feature of the environment. The Livermore Interactive Network Communication System (LINCS) is a hierarchical architecture designed to meet the above needs. While having characteristics in common with other architectures, it differs in several respects.

  11. Lifelong personal health data and application software via virtual machines in the cloud.

    PubMed

    Van Gorp, Pieter; Comuzzi, Marco

    2014-01-01

    Personal Health Records (PHRs) should remain the lifelong property of patients, who should be able to show them conveniently and securely to selected caregivers and institutions. In this paper, we present MyPHRMachines, a cloud-based PHR system that takes a radically new architectural approach to health record portability. In MyPHRMachines, health-related data and the application software to view and/or analyze them are deployed separately in the PHR system. After uploading their medical data to MyPHRMachines, patients can access them again from remote virtual machines that contain the right software to visualize and analyze them without any need for conversion. Patients can share their remote virtual machine session with selected caregivers, who will need only a Web browser to access the pre-loaded fragments of their lifelong PHR. We discuss a prototype of MyPHRMachines applied to two use cases, i.e., radiology image sharing and personalized medicine.

  12. Do open access data policies inhibit innovation?

    USGS Publications Warehouse

    Katzner, Todd E.

    2015-01-01

    There has been a great deal of attention paid recently to the idea of data sharing (Van Noorden 2014, Beardsley 2015, Nature Publishing Group 2015, www.copdess.com). However, the vast majority of these arguments are in agreement and present as fait accompli the idea that data are a public good and that, therefore, once published, they should become open access. In fact, although there are many good reasons for data sharing, there also are a number of cogent and coherent cases to be made against open-access policies (e.g., Fenichel and Skelly 2015). The goal of this piece is not to debate the relevance or accuracy of the points made in favor of data sharing but to elevate the discussion by pointing out key problems with open-access policies and to identify central issues that, if solved, will enhance the utility of data sharing to science and society.

  13. Estimating population ecology models for the WWW market: evidence of competitive oligopolies.

    PubMed

    de Cabo, Ruth Mateos; Gimeno, Ricardo

    2013-01-01

    This paper proposes adapting a particle filtering algorithm to model online Spanish real estate and job search market segments based on the Lotka-Volterra competition equations. For this purpose the authors use data on Internet information searches from Google Trends to proxy for market share. The estimated market-share evolution is consistent with that observed in Google Trends. The results show evidence of low website incompatibility in the markets analyzed. Competitive oligopolies are most common in such low-competition markets, instead of the monopolies predicted by theoretical ecology models under strong competition conditions.
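
    As a rough illustration of the competition dynamics referred to here (not the authors' particle-filtering estimation procedure), the following Python sketch integrates the two-species Lotka-Volterra competition equations; all parameter values are hypothetical:

        # Minimal sketch: two-site Lotka-Volterra competition dynamics.
        # Parameter values are illustrative only, not the paper's estimates.
        import numpy as np

        def lotka_volterra_step(x, r, K, alpha, dt):
            """One Euler step of dx_i/dt = r_i x_i (K_i - x_i - sum_j alpha_ij x_j) / K_i."""
            interaction = alpha @ x      # competitive pressure exerted on each site
            dx = r * x * (K - x - interaction) / K
            return x + dt * dx

        r = np.array([0.8, 0.6])         # intrinsic growth rates (hypothetical)
        K = np.array([100.0, 80.0])      # carrying capacities (hypothetical)
        alpha = np.array([[0.0, 0.3],    # alpha[i, j]: pressure of site j on site i
                          [0.4, 0.0]])
        x = np.array([5.0, 5.0])         # initial shares (search-volume proxy)

        for _ in range(2000):
            x = lotka_volterra_step(x, r, K, alpha, dt=0.01)
        print(x)                         # both sites persist at positive levels

    With competition coefficients well below one, both sites settle at positive equilibria and coexist, which is the oligopoly outcome the paper reports, rather than one site monopolizing the market.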

  14. Competition.

    PubMed

    Chambers, D W

    1997-01-01

    Our ambivalence toward competition can be traced to an unspoken preference for certain types of competition which give us an advantage over the types we value less. Four types are defined: (a) pure (same rules, same objectives), (b) collaborative (same rules, shared objective), (c) market share (different rules, same objectives), and (d) market growth (different rules, value-added orientation). The defining characteristics of the four types of competition are, respectively: needing a referee, arguing over the spoils, differentiation and substitutability, and customer focus. Dentistry has features of all four types of competition, thus making it difficult to have a meaningful discussion or frame a coherent policy on this topic.

  15. The NIH BD2K center for big data in translational genomics

    PubMed Central

    Paten, Benedict; Diekhans, Mark; Druker, Brian J; Friend, Stephen; Guinney, Justin; Gassner, Nadine; Guttman, Mitchell; James Kent, W; Mantey, Patrick; Margolin, Adam A; Massie, Matt; Novak, Adam M; Nothaft, Frank; Pachter, Lior; Patterson, David; Smuga-Otto, Maciej; Stuart, Joshua M; Van’t Veer, Laura; Haussler, David

    2015-01-01

    The world’s genomics data will never be stored in a single repository – rather, it will be distributed among many sites in many countries. No one site will have enough data to explain genotype to phenotype relationships in rare diseases; therefore, sites must share data. To accomplish this, the genetics community must forge common standards and protocols to make sharing and computing data among many sites a seamless activity. Through the Global Alliance for Genomics and Health, we are pioneering the development of shared application programming interfaces (APIs) to connect the world’s genome repositories. In parallel, we are developing an open source software stack (ADAM) that uses these APIs. This combination will create a cohesive genome informatics ecosystem. Using containers, we are facilitating the deployment of this software in a diverse array of environments. Through benchmarking efforts and big data driver projects, we are ensuring ADAM’s performance and utility. PMID:26174866

  16. Open-source, community-driven microfluidics with Metafluidics.

    PubMed

    Kong, David S; Thorsen, Todd A; Babb, Jonathan; Wick, Scott T; Gam, Jeremy J; Weiss, Ron; Carr, Peter A

    2017-06-07

    Microfluidic devices have the potential to automate and miniaturize biological experiments, but open-source sharing of device designs has lagged behind sharing of other resources such as software. Synthetic biologists have used microfluidics for DNA assembly, cell-free expression, and cell culture, but a combination of expense, device complexity, and reliance on custom set-ups hampers their widespread adoption. We present Metafluidics, an open-source, community-driven repository that hosts digital design files, assembly specifications, and open-source software to enable users to build, configure, and operate a microfluidic device. We use Metafluidics to share designs and fabrication instructions for both a microfluidic ring-mixer device and a 32-channel tabletop microfluidic controller. This device and controller are applied to build genetic circuits using standard DNA assembly methods including ligation, Gateway, Gibson, and Golden Gate. Metafluidics is intended to enable a broad community of engineers, DIY enthusiasts, and other nontraditional participants with limited fabrication skills to contribute to microfluidic research.

  17. Ideas for Advancing Code Sharing: A Different Kind of Hack Day

    NASA Astrophysics Data System (ADS)

    Teuben, P.; Allen, A.; Berriman, B.; DuPrie, K.; Hanisch, R. J.; Mink, J.; Nemiroff, R. J.; Shamir, L.; Shortridge, K.; Taylor, M. B.; Wallin, J. F.

    2014-05-01

    How do we as a community encourage the reuse of software for telescope operations, data processing, and ? How can we support making codes used in research available for others to examine? Continuing the discussion from last year's "Bring out your codes!" BoF session, participants separated into groups to brainstorm ideas to mitigate factors which inhibit code sharing and to nurture those which encourage it. The BoF concluded with the sharing of ideas that arose from the brainstorming sessions and a brief summary by the moderator.

  18. Educational Software Employing Group Competition Using an Interactive Electronic Whiteboard

    ERIC Educational Resources Information Center

    Otsuki, Yoko; Bandoh, Hirokazu; Kato, Naoki; Indurkhya, Bipin; Nakagawa, Masaki

    2004-01-01

    This article presents a design of educational software employing group competition using a large interactive electronic whiteboard, and a report on its experimental use. Group competition and collaboration are useful methods to cultivate originality and communication skills. To share the same space, the same large screen, and face-to-face…

  19. Assessing the Utility of an Event-Step ASMD Model by Analysis of Surface Combatant Shared Self-Defense

    DTIC Science & Technology

    2001-09-01

  20. Voice-Recognition Augmented Performance Tools in Performance Poetry Pedagogy

    ERIC Educational Resources Information Center

    Devanny, David; McGowan, Jack

    2016-01-01

    This provocation shares findings from the use of bespoke voice-recognition performance software in a number of seminars (which took place in the 2014-2016 academic years at Glasgow School of Art, University of Warwick, and Falmouth University). The software, made available through this publication, is a web-app which uses Google Chrome's native…

  1. "LearningPad" Conundrum: The Perils of Using Third-Party Software and Student Privacy

    ERIC Educational Resources Information Center

    O'Brien, Jason; Roller, Sarah; Lampley, Sandra

    2017-01-01

    This case focuses on the potential problems associated with sharing personally identifiable information (PII) when students are required to use third-party software. Specifically, third-grade students were required to complete "LearningPad" activities as a component of their homework grade in math, spelling, and language arts. As…

  2. Supporting geoscience with graphical-user-interface Internet tools for the Macintosh

    NASA Astrophysics Data System (ADS)

    Robin, Bernard

    1995-07-01

    This paper describes a suite of Macintosh graphical-user-interface (GUI) software programs that can be used in conjunction with the Internet to support geoscience education. These software programs allow science educators to access and retrieve a large body of resources from an increasing number of network sites, taking advantage of the intuitive, simple-to-use Macintosh operating system. With these tools, educators easily can locate, download, and exchange not only text files but also sound resources, video movie clips, and software application files from their desktop computers. Another major advantage of these software tools is that they are available at no cost and may be distributed freely. The following GUI software tools are described, including examples of how they can be used in an educational setting:
    ∗ Eudora—an e-mail program
    ∗ NewsWatcher—a newsreader
    ∗ TurboGopher—a Gopher program
    ∗ Fetch—a software application for easy File Transfer Protocol (FTP)
    ∗ NCSA Mosaic—a worldwide hypertext browsing program.
    An explosive growth of online archives currently is underway as new electronic sites are being added continuously to the Internet. Many of these resources may be of interest to science educators who learn they can share not only ASCII text files, but also graphic image files, sound resources, QuickTime movie clips, and hypermedia projects with colleagues from locations around the world. These powerful, yet simple-to-learn GUI software tools are providing a revolution in how knowledge can be accessed, retrieved, and shared.

  3. Online Concept Maps: Enhancing Collaborative Learning by Using Technology with Concept Maps.

    ERIC Educational Resources Information Center

    Canas, Alberto J.; Ford, Kenneth M.; Novak, Joseph D.; Hayes, Patrick; Reichherzer, Thomas R.; Suri, Niranjan

    2001-01-01

    Describes a collaborative software system that allows students from distant schools to share claims derived from their concept maps. Sharing takes place by accessing The Knowledge Soup, a repository of propositions submitted by students and stored on a computer server. Students can use propositions from other students to enhance their concept…

  4. Investigating Why Teachers Reported Continued Use and Sharing of an Educational Innovation after the Research Has Ended

    ERIC Educational Resources Information Center

    Hegedus, Stephen J.; Dalton, Sara; Roschelle, Jeremy; Penuel, William; Dickey-Kurdziolek, Margaret; Tatar, Deborah

    2014-01-01

    We investigated prospects for reported sustainable adoption and sharing of an educational innovation through survey research including online questionnaires and telephone interviews. This investigation is part of the Scaling-Up SimCalc experimental program, which combines dynamic representational algebra software (SimCalc MathWorlds) with…

  5. Analysis of Raman lasing without inversion

    NASA Astrophysics Data System (ADS)

    Sheldon, Paul Martin

    1999-12-01

    Properties of lasing without inversion were studied analytically and numerically using Maple computer-assisted algebra software. Gain for a probe electromagnetic field without population inversion in detuned three-level atomic schemes has been found. Matter density matrix dynamics and coherence are explored using Pauli matrices in 2-level systems and Gell-Mann matrices in 3-level systems. It is shown that extreme inversion produces no coherence and hence no lasing. A unitary transformation from the strict field-matter Hamiltonian to an effective two-photon Raman Hamiltonian for multilevel systems has been derived. Feynman diagrams inherent in the derivation show interesting physics. An additional picture change was achieved and showed cw gain possible. Properties of a Raman-like laser based on injection of 3-level coherently driven Λ-type atoms, whose Hamiltonian contains the Raman Hamiltonian and a microwave coupling of the two bottom states, have been studied in the limits of small and large photon numbers in the drive field. Another picture change removed the microwave coupler to all orders and simplified analysis. New possibilities of inversionless generation were found.

  6. FPGA-Based Optical Cavity Phase Stabilization for Coherent Pulse Stacking

    DOE PAGES

    Xu, Yilun; Wilcox, Russell; Byrd, John; ...

    2017-11-20

    Coherent pulse stacking (CPS) is a new time-domain coherent addition technique that stacks several optical pulses into a single output pulse, enabling high pulse energy from fiber lasers. We develop a robust, scalable, and distributed digital control system with firmware and software integration for algorithms, to support the CPS application. We model CPS as a digital filter in the Z domain and implement a pulse-pattern-based cavity phase detection algorithm on a field-programmable gate array (FPGA). A two-stage (2+1 cavities) 15-pulse stacking system achieves an 11.0 peak-power enhancement factor. Each optical cavity is fed back at 1.5 kHz and stabilized at an individually prescribed round-trip phase, with 0.7° and 2.1° rms phase errors for Stages 1 and 2, respectively. Optical cavity phase control with nanometer accuracy ensures 1.2% intensity stability of the stacked pulse over 12 h. The FPGA-based feedback control system can be scaled to large numbers of optical cavities.
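
    The record models CPS as a digital filter in the Z domain. As a toy illustration only (the actual cavity parameters and the FPGA pulse-pattern phase-detection algorithm are not given here; the coupling and phases below are hypothetical), a single stacking cavity can be viewed as a first-order recursive filter whose round-trip phase must be held at a prescribed value:

        # Toy Z-domain model of one stacking cavity as a first-order recursive
        # filter: y[n] = t*x[n] + r*exp(1j*phi)*y[n-1]. Real CPS cavities and
        # the FPGA control loop are more involved; parameters are illustrative.
        import numpy as np

        def cavity_output(pulses, r, phi):
            t = np.sqrt(1.0 - r**2)      # lossless coupler amplitude transmission
            y = 0.0 + 0.0j
            for x in pulses:
                y = t * x + r * np.exp(1j * phi) * y   # each round trip adds phase phi
            return y

        pulses = np.ones(15, dtype=complex)            # 15 equal input pulses
        for phi in (0.0, 0.1, np.pi / 2):
            peak = abs(cavity_output(pulses, r=0.8, phi=phi))**2
            print(f"phi = {phi:.2f} rad -> stacked peak intensity {peak:.2f}")

    The stacked peak intensity is maximized near zero round-trip phase and degrades rapidly as the phase drifts, which is why the sub-degree rms phase stabilization reported above matters.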

  7. FPGA-Based Optical Cavity Phase Stabilization for Coherent Pulse Stacking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Yilun; Wilcox, Russell; Byrd, John

    Coherent pulse stacking (CPS) is a new time-domain coherent addition technique that stacks several optical pulses into a single output pulse, enabling high pulse energy from fiber lasers. We develop a robust, scalable, and distributed digital control system with firmware and software integration for algorithms, to support the CPS application. We model CPS as a digital filter in the Z domain and implement a pulse-pattern-based cavity phase detection algorithm on a field-programmable gate array (FPGA). A two-stage (2+1 cavities) 15-pulse stacking system achieves an 11.0 peak-power enhancement factor. Each optical cavity is fed back at 1.5 kHz and stabilized at an individually prescribed round-trip phase, with 0.7° and 2.1° rms phase errors for Stages 1 and 2, respectively. Optical cavity phase control with nanometer accuracy ensures 1.2% intensity stability of the stacked pulse over 12 h. The FPGA-based feedback control system can be scaled to large numbers of optical cavities.

  8. Software-Controlled Caches in the VMP Multiprocessor

    DTIC Science & Technology

    1986-03-01

    In the VMP multiprocessor, cache misses are handled in software, analogously to the handling of virtual memory page faults, with hardware support to ensure good behavior. The software support is tuned for the VMP design at the programming system level, and we are interested in exploring how far such software management of the shared program state can go. Each cache miss results in bus traffic, and the report quantifies the bus cost of the "average" cache miss.

  9. Data Management Applications for the Service Preparation Subsystem

    NASA Technical Reports Server (NTRS)

    Luong, Ivy P.; Chang, George W.; Bui, Tung; Allen, Christopher; Malhotra, Shantanu; Chen, Fannie C.; Bui, Bach X.; Gutheinz, Sandy C.; Kim, Rachel Y.; Zendejas, Silvino C.

    2009-01-01

    These software applications provide intuitive User Interfaces (UIs) with a consistent look and feel for interaction with, and control of, the Service Preparation Subsystem (SPS). The elements of the UIs described here are the File Manager, Mission Manager, and Log Monitor applications. All UIs provide access to add/delete/update data entities in a complex database schema without requiring technical expertise on the part of the end users. These applications allow for safe, validated, catalogued input of data. Also, the software has been designed in multiple, coherent layers to promote ease of code maintenance and reuse in addition to reducing testing and accelerating maturity.

  10. e!DAL - a framework to store, share and publish research data

    PubMed Central

    2014-01-01

    Background The life-science community faces a major challenge in handling “big data”, highlighting the need for high quality infrastructures capable of sharing and publishing research data. Data preservation, analysis, and publication are the three pillars in the “big data life cycle”. The infrastructures currently available for managing and publishing data are often designed to meet domain-specific or project-specific requirements, resulting in the repeated development of proprietary solutions and lower quality data publication and preservation overall. Results e!DAL is a lightweight software framework for publishing and sharing research data. Its main features are version tracking, metadata management, information retrieval, registration of persistent identifiers (DOI), an embedded HTTP(S) server for public data access, access as a network file system, and a scalable storage backend. e!DAL is available as an API for local non-shared storage and as a remote API featuring distributed applications. It can be deployed “out-of-the-box” as an on-site repository. Conclusions e!DAL was developed based on experiences coming from decades of research data management at the Leibniz Institute of Plant Genetics and Crop Plant Research (IPK). Initially developed as a data publication and documentation infrastructure for the IPK’s role as a data center in the DataCite consortium, e!DAL has grown towards being a general data archiving and publication infrastructure. The e!DAL software has been deployed into the Maven Central Repository. Documentation and Software are also available at: http://edal.ipk-gatersleben.de. PMID:24958009

  11. e!DAL--a framework to store, share and publish research data.

    PubMed

    Arend, Daniel; Lange, Matthias; Chen, Jinbo; Colmsee, Christian; Flemming, Steffen; Hecht, Denny; Scholz, Uwe

    2014-06-24

    The life-science community faces a major challenge in handling "big data", highlighting the need for high quality infrastructures capable of sharing and publishing research data. Data preservation, analysis, and publication are the three pillars in the "big data life cycle". The infrastructures currently available for managing and publishing data are often designed to meet domain-specific or project-specific requirements, resulting in the repeated development of proprietary solutions and lower quality data publication and preservation overall. e!DAL is a lightweight software framework for publishing and sharing research data. Its main features are version tracking, metadata management, information retrieval, registration of persistent identifiers (DOI), an embedded HTTP(S) server for public data access, access as a network file system, and a scalable storage backend. e!DAL is available as an API for local non-shared storage and as a remote API featuring distributed applications. It can be deployed "out-of-the-box" as an on-site repository. e!DAL was developed based on experiences coming from decades of research data management at the Leibniz Institute of Plant Genetics and Crop Plant Research (IPK). Initially developed as a data publication and documentation infrastructure for the IPK's role as a data center in the DataCite consortium, e!DAL has grown towards being a general data archiving and publication infrastructure. The e!DAL software has been deployed into the Maven Central Repository. Documentation and Software are also available at: http://edal.ipk-gatersleben.de.

  12. American Voice Types: Towards a Vocal Typology for American English

    ERIC Educational Resources Information Center

    McPeek, Tyler

    2013-01-01

    Individual voices are not uniformly similar to others, even when factoring out speaker characteristics such as sex, age, dialect, and so on. Some speakers share common features and can cohere into groups based on gross vocal similarity but, to date, no attempt has been made to describe these features systematically or to generate a taxonomy based…

  13. Collaborative Storytelling Experiences in Social Media: Influence of Peer-Assistance Mechanisms

    ERIC Educational Resources Information Center

    Liu, Chen-Chung; Liu, Kuo-Ping; Chen, Wei-Hong; Lin, Chiu-Pin; Chen, Gwo-Dong

    2011-01-01

    Collaborative storytelling activities in social media environments are generally developed in a linear way in which all participants collaborate on a shared story as it is passed from one to another in a relay form. Difficulties with this linear approach arise when collecting the contributions of participants in to a coherent story. This study…

  14. Collaborative Beamfocusing Radio (COBRA)

    NASA Astrophysics Data System (ADS)

    Rode, Jeremy P.; Hsu, Mark J.; Smith, David; Husain, Anis

    2013-05-01

    A Ziva team has recently demonstrated a novel technique called Collaborative Beamfocusing Radios (COBRA) which enables an ad-hoc collection of distributed commercial off-the-shelf software-defined radios to coherently align and beamform to a remote radio. COBRA promises to operate even in high-multipath and non-line-of-sight environments as well as mobile applications, without resorting to computationally expensive closed-loop techniques that are currently unable to operate with significant movement. COBRA exploits two key technologies to achieve coherent beamforming. The first is Time Reversal (TR), which compensates for multipath and automatically discovers the optimal spatio-temporal matched filter to enable peak signal gains (up to 20 dB) and diffraction-limited focusing at the intended receiver in NLOS and severe multipath environments. The second is time-aligned buffering, which enables TR to synchronize distributed transmitters into a collaborative array. This time-alignment algorithm avoids causality violations through the use of reciprocal buffering. Preserving spatio-temporal reciprocity through the TR capture and retransmission process achieves coherent alignment across multiple radios at ~GHz carriers using only standard quartz oscillators. COBRA has been demonstrated in the lab, aligning two off-the-shelf software-defined radios over the air to an accuracy of better than 2 degrees of carrier alignment at 450 MHz. The COBRA algorithms are lightweight, with computation in 5 ms on a smartphone-class microprocessor. COBRA also has low start-up latency, achieving high accuracy from a cold start in 30 ms. The COBRA technique opens up a large number of new capabilities in communications and electronic warfare, including selective spatial jamming, geolocation and anti-geolocation.
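
    A minimal NumPy sketch of the time-reversal idea, with a random set of channel taps standing in for a measured multipath response (illustrative only; COBRA's actual waveforms and buffering scheme are not described in this record):

        # Time reversal: retransmitting the time-reversed, conjugated channel
        # impulse response makes the channel act as its own matched filter,
        # refocusing multipath energy into a single peak at the receiver.
        import numpy as np

        rng = np.random.default_rng(0)
        h = rng.normal(size=8) + 1j * rng.normal(size=8)  # multipath impulse response
        probe = np.conj(h[::-1])                          # time-reversed conjugate

        focused = np.convolve(probe, h)                   # signal seen by the far radio
        peak = np.max(np.abs(focused))
        sidelobe = np.sort(np.abs(focused))[-2]           # largest non-peak sample
        print(f"focusing gain (peak/sidelobe): {peak / sidelobe:.1f}")

    The autocorrelation peak at the intended receiver is what allows distributed transmitters to align coherently without computationally expensive closed-loop channel tracking.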

  15. Software Quality Control at Belle II

    NASA Astrophysics Data System (ADS)

    Ritter, M.; Kuhr, T.; Hauth, T.; Gebard, T.; Kristof, M.; Pulvermacher, C.; Belle II Software Group

    2017-10-01

    Over the last seven years the software stack of the next-generation B factory experiment Belle II has grown to over one million lines of C++ and Python code, counting only the part included in offline software releases. There are several thousand commits to the central repository by about 100 individual developers per year. Keeping the software stack coherent and of sufficiently high quality that it can be sustained and used efficiently for data acquisition, simulation, reconstruction, and analysis over the lifetime of the Belle II experiment is a challenge. A set of tools is employed to monitor the quality of the software and provide fast feedback to the developers. They are integrated in a machinery that is controlled by a buildbot master and automates the quality checks. The tools include different compilers, cppcheck, the clang static analyzer, valgrind memcheck, doxygen, a geometry overlap checker, a check for missing or extra library links, unit tests, steering file level tests, a sophisticated high-level validation suite, and an issue tracker. The technological development infrastructure is complemented by organizational means to coordinate the development.

  16. A World I Don't Inhabit: Disquiet and Identity in Second Life and Facebook

    ERIC Educational Resources Information Center

    Boon, Stuart; Sinclair, Christine

    2009-01-01

    The authors use their own experiences with social software to argue for the need for caution in its uses in education. They particularly draw attention to difficulties in engagement, the effects on identity, an emphasis on superficial issues, lack of coherence, and problems with authenticity and trust. While Facebook and Second Life appear to have…

  17. Resurrecting Legacy Code Using OntoSoft Knowledge-Sharing and Digital Object Management to Revitalize and Reproduce Software for Groundwater Management Research

    NASA Astrophysics Data System (ADS)

    Kwon, N.; Gentle, J.; Pierce, S. A.

    2015-12-01

    Software code developed for research is often used for a relatively short period of time before it is abandoned, lost, or becomes outdated. This unintentional abandonment of code is a valid problem in the 21st century scientific process, hindering widespread reusability and increasing the effort needed to develop research software. Potentially important assets, these legacy codes may be resurrected and documented digitally for long-term reuse, often with modest effort. Furthermore, the revived code may be openly accessible in a public repository for researchers to reuse or improve. For this study, the research team has begun to revive the codebase for Groundwater Decision Support System (GWDSS), originally developed for participatory decision making to aid urban planning and groundwater management, though it may serve multiple use cases beyond those originally envisioned. GWDSS was designed as a Java-based wrapper with loosely federated commercial and open source components. If successfully revitalized, GWDSS will be useful for both practical applications as a teaching tool and case study for groundwater management, as well as informing theoretical research. Using the knowledge-sharing approaches documented by the NSF-funded OntoSoft project, digital documentation of GWDSS is underway, from conception to development, deployment, characterization, integration, composition, and dissemination through open source communities and geosciences modeling frameworks. Information assets, documentation, and examples are shared using open platforms for data sharing and assigned digital object identifiers. Two instances of GWDSS version 3.0 are being created: 1) a virtual machine instance for the original case study to serve as a live demonstration of the decision support tool, assuring the original version is usable, and 2) an open version of the codebase, executable installation files, and developer guide available via an open repository, assuring the source for the application is accessible with version control and potential for new branch developments. Finally, metadata about the software has been completed within the OntoSoft portal to provide descriptive curation, make GWDSS searchable, and complete documentation of the scientific software lifecycle.

  18. Scheme for the generation of freely traveling optical trio coherent states

    NASA Astrophysics Data System (ADS)

    Duc, Truong Minh; Dat, Tran Quang; An, Nguyen Ba; Kim, Jaewan

    2013-08-01

    Trio coherent states (TCSs) are non-Gaussian three-mode entangled states which can serve as a useful resource for continuous-variable quantum tasks, so their generation is of primary importance. Schemes exist to generate stable TCSs in terms of vibrational motion of a trapped ion inside a crystal. However, to perform quantum communication and distributed quantum computation the states should be shared beforehand among distant parties. That is, their modes should be able to be directed to different desired locations in space. In this work, we propose an experimental setup to generate such free-traveling TCSs in terms of optical fields. Our scheme uses standard physical resources, such as coherent states, balanced beam splitters, phase shifters, nonideal on-off photodetectors, and realistic weak cross-Kerr nonlinearities, without the need of single photons or homodyne or heterodyne measurements. We study the dependence of the fidelity of the state generated by our scheme with respect to the target TCS, and of the corresponding generation probability, on the parameters involved. In theory, the fidelity could be nearly perfect for arbitrarily weak nonlinearities τ and low photodetector efficiencies η, provided that the amplitude |α| of an input coherent state is large enough, namely |α| ≥ 5/(ητ).

  19. Developmental approach towards high resolution optical coherence tomography for glaucoma diagnostics

    NASA Astrophysics Data System (ADS)

    Kemper, Björn; Ketelhut, Steffi; Heiduschka, Peter; Thorn, Marie; Larsen, Michael; Schnekenburger, Jürgen

    2018-02-01

    Glaucoma is caused by a pathological rise in intraocular pressure, which results in a progressive loss of vision through damage to retinal cells and the optic nerve head. Early detection of pressure-induced damage is thus essential to allow the reduction of eye pressure and to prevent severe incapacity or blindness. Within the new European project GALAHAD (Glaucoma Advanced, Label-free High Resolution Automated OCT Diagnostics), we will develop a new low-cost, high-resolution OCT system for the early detection of glaucoma. The device is designed to improve diagnosis based on a new system of optical coherence tomography. Although OCT systems are at present available in ophthalmology centres, high-resolution devices are extremely expensive. The novelty of the new GALAHAD system is its super-wideband light source, which achieves high image resolution at a reasonable cost. Proof-of-concept experiments with cell and tissue glaucoma test standards and animal models are planned to test the new optical components and the performance of new algorithms for the identification of glaucoma-associated cell and tissue structures. Intense training of the software with various samples should result in increased sensitivity and specificity of the OCT software system.

  20. Digital coherent receiver based transmitter penalty characterization.

    PubMed

    Geisler, David J; Kaufmann, John E

    2016-12-26

    For optical communications links where receivers are signal-power-starved, such as through free-space, it is important to design transmitters and receivers that can operate as close as practically possible to theoretical limits. A total system penalty is typically assessed in terms of how far the end-to-end bit-error rate (BER) is from these limits. It is desirable, but usually difficult, to determine the division of this penalty between the transmitter and receiver. This paper describes a new rigorous and computationally based method that isolates which portion of the penalty can be assessed against the transmitter. There are two basic parts to this approach: (1) use of a coherent optical receiver to perform frequency down-conversion of a transmitter's optical signal waveform to the electrical domain, preserving both optical field amplitude and phase information, and (2) software-based analysis of the digitized electrical waveform. The result is a single numerical metric that quantifies how close a transmitter's signal waveform is to the ideal, based on its BER performance with a perfect software-defined matched-filter receiver demodulator. A detailed description of applying the proposed methodology to the waveform characterization of an optical burst-mode differential phase-shifted keying (DPSK) transmitter is experimentally demonstrated.
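
    As a sketch of the software side of such an analysis, under simplified assumptions not specified in the record (rectangular pulses, additive white Gaussian noise, perfect timing), a matched-filter DPSK demodulator with a BER count might look like:

        # Matched-filter detection of a synthetic DPSK waveform followed by a
        # bit-error-rate count; the waveform stands in for digitized coherent-
        # receiver data. All signal parameters are illustrative.
        import numpy as np

        rng = np.random.default_rng(1)
        sps = 8                                    # samples per symbol
        bits = rng.integers(0, 2, 4000)
        phase = np.cumsum(np.pi * bits)            # DPSK: a 1-bit flips the phase
        wave = np.repeat(np.exp(1j * phase), sps)  # rectangular pulse shaping
        wave += 0.5 * (rng.normal(size=wave.size) + 1j * rng.normal(size=wave.size))

        mf = np.ones(sps) / sps                    # matched filter for the rect pulse
        filt = np.convolve(wave, mf)[sps - 1::sps] # filter, sample once per symbol
        diff = filt[1:] * np.conj(filt[:-1])       # differential detection
        decided = (diff.real < 0).astype(int)      # phase flip decodes as bit 1
        print(f"BER = {np.mean(decided != bits[1:]):.4f}")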

  1. Modeling and simulation of magnetic resonance imaging based on intermolecular multiple quantum coherences

    NASA Astrophysics Data System (ADS)

    Cai, Congbo; Dong, Jiyang; Cai, Shuhui; Cheng, En; Chen, Zhong

    2006-11-01

    Intermolecular multiple quantum coherences (iMQCs) have many potential applications since they can provide interaction information between different molecules within the range of the dipolar correlation distance, and can provide new contrast in magnetic resonance imaging (MRI). Because of the non-localized property of the dipolar field and the non-linear property of the Bloch equations incorporating the dipolar field term, the evolution behavior of iMQCs is difficult to deduce strictly in many cases. In such cases, simulation studies are very important. Simulation results can not only give a guide to optimizing experimental conditions, but also help analyze unexpected experimental results. Based on our product operator matrix and the K-space method for dipolar field calculation, the MRI simulation software was constructed, running on the Windows operating system. The non-linear Bloch equations are calculated by a fifth-order Cash-Karp Runge-Kutta formalism. Computational time can be efficiently reduced by separating the effects of chemical shifts and strong gradient field. Using this software, simulation of different kinds of complex MRI sequences can be done conveniently and quickly on general personal computers. Some examples are given and the results discussed.
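
    For orientation, the core numerical task can be shown in a much-simplified form: integrating Bloch equations with an adaptive embedded Runge-Kutta method (SciPy's RK45, a close relative of the Cash-Karp pair). The nonlinear, non-local dipolar-field term that the authors' simulator handles is omitted, and all parameters are illustrative:

        # Free precession with relaxation, integrated by an adaptive RK method.
        # The dipolar-field term used in the iMQC simulator is omitted here.
        import numpy as np
        from scipy.integrate import solve_ivp

        def bloch(t, M, omega, T1, T2, M0=1.0):
            Mx, My, Mz = M
            return [ omega * My - Mx / T2,     # precession at offset omega, T2 decay
                    -omega * Mx - My / T2,
                    (M0 - Mz) / T1]            # T1 recovery toward equilibrium M0

        sol = solve_ivp(bloch, (0.0, 1.0), [1.0, 0.0, 0.0], method="RK45",
                        args=(2 * np.pi * 10.0, 0.5, 0.1), dense_output=True)
        t = np.linspace(0.0, 1.0, 5)
        print(sol.sol(t)[0])                   # Mx: a decaying 10 Hz oscillation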

  2. Machine learning research 1989-90

    NASA Technical Reports Server (NTRS)

    Porter, Bruce W.; Souther, Arthur

    1990-01-01

    Multifunctional knowledge bases offer a significant advance in artificial intelligence because they can support numerous expert tasks within a domain. As a result they amortize the costs of building a knowledge base over multiple expert systems and they reduce the brittleness of each system. Due to the inevitable size and complexity of multifunctional knowledge bases, their construction and maintenance require knowledge engineering and acquisition tools that can automatically identify interactions between new and existing knowledge. Furthermore, their use requires software for accessing those portions of the knowledge base that coherently answer questions. Considerable progress was made in developing software for building and accessing multifunctional knowledge bases. A language was developed for representing knowledge, along with software tools for editing and displaying knowledge, a machine learning program for integrating new information into existing knowledge, and a question answering system for accessing the knowledge base.

  3. A new method for real-time co-registration of 3D coronary angiography and intravascular ultrasound or optical coherence tomography.

    PubMed

    Carlier, Stéphane; Didday, Rich; Slots, Tristan; Kayaert, Peter; Sonck, Jeroen; El-Mourad, Mike; Preumont, Nicolas; Schoors, Dany; Van Camp, Guy

    2014-06-01

    We present a new clinically practical method for online co-registration of 3D quantitative coronary angiography (QCA) and intravascular ultrasound (IVUS) or optical coherence tomography (OCT). The workflow is based on two modified commercially available software packages. Reconstruction steps are explained and compared to previously available methods. The feasibility for different clinical scenarios is illustrated. The co-registration appears accurate and robust, and it induced minimal delay in normal cath lab activities. This new method is based on the 3D angiographic reconstruction of the catheter path and does not require the operator's identification of landmarks to establish the image synchronization. Copyright © 2014 Elsevier Inc. All rights reserved.

  4. A multiarchitecture parallel-processing development environment

    NASA Technical Reports Server (NTRS)

    Townsend, Scott; Blech, Richard; Cole, Gary

    1993-01-01

    A description is given of the hardware and software of a multiprocessor test bed - the second generation Hypercluster system. The Hypercluster architecture consists of a standard hypercube distributed-memory topology, with multiprocessor shared-memory nodes. By using standard, off-the-shelf hardware, the system can be upgraded to use rapidly improving computer technology. The Hypercluster's multiarchitecture nature makes it suitable for researching parallel algorithms in computational field simulation applications (e.g., computational fluid dynamics). The dedicated test-bed environment of the Hypercluster and its custom-built software allows experiments with various parallel-processing concepts such as message passing algorithms, debugging tools, and computational 'steering'. Such research would be difficult, if not impossible, to achieve on shared, commercial systems.

  5. Similarities and Differences in the Academic Education of Software Engineering and Architectural Design Professionals

    ERIC Educational Resources Information Center

    Hazzan, Orit; Karni, Eyal

    2006-01-01

    This article focuses on the similarities and differences in the academic education of software engineers and architects. The rationale for this work stems from our observation, each from the perspective of her or his own discipline, that these two professional design and development processes share some similarities. A pilot study was performed,…

  6. Using VirtualGL/TurboVNC Software on the Peregrine System

    Science.gov Websites

    Using VirtualGL/TurboVNC software on the Peregrine system allows users to access and share large-memory visualization nodes with high-end graphics processing units. This approach may be better than just using X11 forwarding when connecting from a remote site with low bandwidth.

  7. Captivating Open University Students with Online Literature Search Tutorials Created Using Screen Capture Software

    ERIC Educational Resources Information Center

    Wales, Tim; Robertson, Penny

    2008-01-01

    Purpose: The aim of this paper is to share the experiences and challenges faced by the Open University Library (OUL) in using screen capture software to develop online literature search tutorials. Design/methodology/approach: A summary of information literacy support at the OUL is provided as background information to explain the decision to…

  8. Quick and Easy: Use Screen Capture Software to Train and Communicate

    ERIC Educational Resources Information Center

    Schuster, Ellen

    2011-01-01

    Screen capture (screen cast) software can be used to develop short videos for training purposes. Developing videos is quick and easy. This article describes how these videos are used as tools to reinforce face-to-face and interactive TV curriculum training in a nutrition education program. Advantages of developing these videos are shared.…

  9. Generalized Nuclear Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conlin, Jeremy

    2017-03-15

    This software reads, writes, and manipulates nuclear data in the Generalized Nuclear Data (GND) format, a new format for sharing nuclear data among institutions. In addition to the software and its documentation, notes and documentation from WPEC Subgroup 43 will be included. WPEC Subgroup 43 is an international committee charged with creating the API for the GND format.

  10. Conducting a Trial of Web Conferencing Software: Why, How, and Perceptions from the Coalface

    ERIC Educational Resources Information Center

    Reushle, Shirley; Loch, Birgit

    2008-01-01

    This paper reports on the trial of web conferencing software conducted at a regional Australian university with a significant distance population. The paper shares preliminary findings, the views of participants and recommendations for future activity. To design and conduct the trial, an action research method was chosen because it is…

  11. Fiji: an open-source platform for biological-image analysis.

    PubMed

    Schindelin, Johannes; Arganda-Carreras, Ignacio; Frise, Erwin; Kaynig, Verena; Longair, Mark; Pietzsch, Tobias; Preibisch, Stephan; Rueden, Curtis; Saalfeld, Stephan; Schmid, Benjamin; Tinevez, Jean-Yves; White, Daniel James; Hartenstein, Volker; Eliceiri, Kevin; Tomancak, Pavel; Cardona, Albert

    2012-06-28

    Fiji is a distribution of the popular open-source software ImageJ focused on biological-image analysis. Fiji uses modern software engineering practices to combine powerful software libraries with a broad range of scripting languages to enable rapid prototyping of image-processing algorithms. Fiji facilitates the transformation of new algorithms into ImageJ plugins that can be shared with end users through an integrated update system. We propose Fiji as a platform for productive collaboration between computer science and biology research communities.

  12. Lagrangian based methods for coherent structure detection

    NASA Astrophysics Data System (ADS)

    Allshouse, Michael R.; Peacock, Thomas

    2015-09-01

    There has been a proliferation in the development of Lagrangian analytical methods for detecting coherent structures in fluid flow transport, yielding a variety of qualitatively different approaches. We present a review of four approaches and demonstrate the utility of these methods via their application to the same sample analytic model, the canonical double-gyre flow, highlighting the pros and cons of each approach. Two of the methods, the geometric and probabilistic approaches, are well established and require velocity field data over the time interval of interest to identify particularly important material lines and surfaces, and influential regions, respectively. The other two approaches, implementing tools from cluster and braid theory, seek coherent structures based on limited trajectory data, attempting to partition the flow transport into distinct regions. All four of these approaches share the common trait that they are objective methods, meaning that their results do not depend on the frame of reference used. For each method, we also present a number of example applications ranging from blood flow and chemical reactions to ocean and atmospheric flows.
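
    The double-gyre flow mentioned here is fully specified in the coherent-structures literature; a minimal Python sketch of the velocity field with the commonly used parameters, advecting a grid of tracers (forward Euler for brevity, where RK4 is typical in practice), is:

        # Canonical time-dependent double gyre on [0,2]x[0,1], the standard
        # test flow for Lagrangian coherent-structure methods.
        import numpy as np

        A, eps, omega = 0.1, 0.25, 2 * np.pi / 10   # common literature values

        def velocity(t, x, y):
            a = eps * np.sin(omega * t)
            b = 1.0 - 2.0 * a
            f = a * x**2 + b * x
            dfdx = 2.0 * a * x + b
            u = -np.pi * A * np.sin(np.pi * f) * np.cos(np.pi * y)
            v = np.pi * A * np.cos(np.pi * f) * np.sin(np.pi * y) * dfdx
            return u, v

        x, y = np.meshgrid(np.linspace(0, 2, 40), np.linspace(0, 1, 20))
        t, dt = 0.0, 0.01
        for _ in range(1000):
            u, v = velocity(t, x, y)
            x, y = x + dt * u, y + dt * v
            t += dt
        print(float(x.max()))   # tracers stay confined while stretching locally

    The relative stretching of initially adjacent tracers under this map is the raw material for FTLE fields and the other coherent-structure diagnostics reviewed above.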

  13. Research into display sharing techniques for distributed computing environments

    NASA Technical Reports Server (NTRS)

    Hugg, Steven B.; Fitzgerald, Paul F., Jr.; Rosson, Nina Y.; Johns, Stephen R.

    1990-01-01

    The X-based Display Sharing solution for distributed computing environments is described. The Display Sharing prototype includes the base functionality for telecast and display copy requirements. Since the prototype implementation is modular and the system design provided flexibility for Mission Control Center Upgrade (MCCU) operational considerations, the prototype implementation can be the baseline for a production Display Sharing implementation. To facilitate the process the following discussions are presented: theory of operation; system architecture; using the prototype; software description; research tools; prototype evaluation; and outstanding issues. The prototype is based on the concept of a dedicated central host performing the majority of the Display Sharing processing, allowing minimal impact on each individual workstation. Each workstation participating in Display Sharing hosts programs that facilitate the user's access to the Display Sharing host machine.

  14. Facilitating Collaboration in Lecture-Based Learning through Shared Notes Using Wireless Technologies

    ERIC Educational Resources Information Center

    Valtonen, T.; Havu-Nuutinen, S.; Dillon, P.; Vesisenaho, M.

    2011-01-01

    This paper reports a case study for developing lecture teaching in higher education by connecting simultaneously the benefits of face-to-face teaching and social software for capturing and sharing students' lecture notes. The study was conducted with 12 university students taking a degree course on pre-primary education. Data were collected on (1)…

  15. Discerning the Shared Beliefs of Teachers in a Secondary School Mathematics Department

    ERIC Educational Resources Information Center

    Beswick, Kim

    2016-01-01

    This study examined the shared beliefs among mathematics teachers in one secondary school in the United Kingdom across the first term of a school year and almost 4 years subsequently. Leximancer software was used to analyse the language used as teachers responded to questions concerning their beliefs about mathematics, mathematics teaching,…

  16. Implementation of a parallel unstructured Euler solver on shared and distributed memory architectures

    NASA Technical Reports Server (NTRS)

    Mavriplis, D. J.; Das, Raja; Saltz, Joel; Vermeland, R. E.

    1992-01-01

    An efficient three dimensional unstructured Euler solver is parallelized on a Cray Y-MP C90 shared memory computer and on an Intel Touchstone Delta distributed memory computer. This paper relates the experiences gained and describes the software tools and hardware used in this study. Performance comparisons between two differing architectures are made.

  17. The eGenVar data management system—cataloguing and sharing sensitive data and metadata for the life sciences

    PubMed Central

    Razick, Sabry; Močnik, Rok; Thomas, Laurent F.; Ryeng, Einar; Drabløs, Finn; Sætrom, Pål

    2014-01-01

    Systematic data management and controlled data sharing aim at increasing reproducibility, reducing redundancy in work, and providing a way to efficiently locate complementing or contradicting information. One method of achieving this is collecting data in a central repository or in a location that is part of a federated system and providing interfaces to the data. However, certain data, such as data from biobanks or clinical studies, may, for legal and privacy reasons, often not be stored in public repositories. Instead, we describe a metadata cataloguing system and a software suite for reporting the presence of data from the life sciences domain. The system stores three types of metadata: file information, file provenance and data lineage, and content descriptions. Our software suite includes both graphical and command line interfaces that allow users to report and tag files with these different metadata types. Importantly, the files remain in their original locations with their existing access-control mechanisms in place, while our system provides descriptions of their contents and relationships. Our system and software suite thereby provide a common framework for cataloguing and sharing both public and private data. Database URL: http://bigr.medisin.ntnu.no/data/eGenVar/ PMID:24682735
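
    A sketch of how a catalogue entry combining the three metadata types might be structured; the field names below are hypothetical and do not reflect eGenVar's actual schema:

        # Hypothetical catalogue entry holding the three metadata types named in
        # the abstract: file information, provenance/lineage, and content tags.
        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class FileInfo:
            path: str            # the file stays in place, under its own access controls
            checksum: str
            size_bytes: int

        @dataclass
        class CatalogueEntry:
            info: FileInfo
            derived_from: List[str] = field(default_factory=list)  # data lineage
            tags: List[str] = field(default_factory=list)          # content description

        entry = CatalogueEntry(
            info=FileInfo("/biobank/cohort1/variants.vcf.gz", "sha256:...", 10_485_760),
            derived_from=["/biobank/cohort1/reads.bam"],
            tags=["germline variants", "restricted access"],
        )
        print(entry.info.path, entry.tags)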

  18. Managing Sustainable Data Infrastructures: The Gestalt of EOSDIS

    NASA Astrophysics Data System (ADS)

    Behnke, J.; Lindsay, F. E.; Lowe, D. R.; Mitchell, A. E.; Lynnes, C.

    2016-12-01

    NASA's Earth Observing System Data and Information System (EOSDIS) has been a central component of the NASA Earth observation program since the 1990s. The data collected by NASA's remote sensing instruments represent a significant public investment in research. EOSDIS provides free and open access to this data to a worldwide public research community. From the very beginning, EOSDIS was conceived as a system built on partnerships between NASA Centers, US agencies and academia. EOSDIS manages a wide range of Earth science discipline data that include cryosphere, land cover change, polar processes, field campaigns, ocean surface, digital elevation, atmosphere dynamics and composition, and inter-disciplinary research, among many others. Over the years, EOSDIS has evolved to support increasingly complex and diverse NASA Earth Science data collections. EOSDIS epitomizes a System of Systems, whose many varied and distributed parts are integrated into a single, highly functional organized science data system. A distributed architecture was adopted to ensure discipline-specific support for the science data, while also leveraging standards and establishing policies and tools to enable interdisciplinary research, and analysis across multiple scientific instruments. The EOSDIS is composed of system elements such as geographically distributed archive centers used to manage the stewardship of data. The infrastructure consists of underlying capabilities/connections that enable the primary system elements to function together. For example, one key infrastructure component is the common metadata repository, which enables discovery of all data within the EOSDIS system. EOSDIS employs processes and standards to ensure partners can work together effectively, and provide coherent services to users. While the separation into domain-specific science archives helps to manage the wide variety of missions and datasets, the common services and practices serve to knit the overall system together into a coherent whole, with sharing of data, metadata, information and software making EOSDIS more than the simple sum of its parts. This paper will describe those parts and how the whole system works together to deliver Earth science data to millions of users.

  19. Microcomputer software development facilities

    NASA Technical Reports Server (NTRS)

    Gorman, J. S.; Mathiasen, C.

    1980-01-01

    A more efficient and cost-effective method for developing microcomputer software is to utilize a host computer with high-speed peripheral support. Application programs such as cross assemblers, loaders, and simulators are implemented in the host computer for each of the microcomputers for which software development is a requirement. The host computer is configured to operate in a time-share mode for multiple users. The remote terminals, printers, and downloading capabilities provided are based on user requirements. With this configuration a user, either local or remote, can use the host computer for microcomputer software development. Once the software is developed (through the code and modular debug stage), it can be downloaded to the development system or emulator in a test area where hardware/software integration functions can proceed. The microcomputer software program sources reside in the host computer and can be edited, assembled, loaded, and then downloaded as required until the software development project has been completed.

  20. The Particle-in-Cell and Kinetic Simulation Software Center

    NASA Astrophysics Data System (ADS)

    Mori, W. B.; Decyk, V. K.; Tableman, A.; Fonseca, R. A.; Tsung, F. S.; Hu, Q.; Winjum, B. J.; An, W.; Dalichaouch, T. N.; Davidson, A.; Hildebrand, L.; Joglekar, A.; May, J.; Miller, K.; Touati, M.; Xu, X. L.

    2017-10-01

    The UCLA Particle-in-Cell and Kinetic Simulation Software Center (PICKSC) aims to support an international community of PIC and plasma kinetic software developers, users, and educators; to increase the use of this software for accelerating the rate of scientific discovery; and to be a repository of knowledge and history for PIC. We discuss progress towards making available and documenting illustrative open-source software programs and distinct production programs; developing and comparing different PIC algorithms; coordinating the development of resources for the educational use of kinetic software; and the outcomes of our first sponsored OSIRIS users workshop. We also welcome input and discussion from anyone interested in using or developing kinetic software, in obtaining access to our codes, in collaborating, in sharing their own software, or in commenting on how PICKSC can better serve the DPP community. Supported by NSF under Grant ACI-1339893 and by the UCLA Institute for Digital Research and Education.
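
    For readers unfamiliar with the particle-in-cell method itself, a minimal one-dimensional electrostatic sketch (normalized units, nearest-grid-point weighting, spectral field solve) conveys the core loop; production codes such as OSIRIS are vastly more capable:

        # Minimal 1D electrostatic PIC loop: deposit charge on a grid, solve
        # Poisson's equation spectrally, gather the field, push particles.
        import numpy as np

        ng, npart, L, dt = 64, 10000, 2 * np.pi, 0.1
        dx = L / ng
        rng = np.random.default_rng(0)
        xp = rng.uniform(0, L, npart)                         # electron positions
        vp = rng.normal(0.0, 0.1, npart) + 0.05 * np.sin(xp)  # seeded perturbation

        for _ in range(100):
            cells = (xp / dx).astype(int) % ng
            ne = np.bincount(cells, minlength=ng) / (npart / ng)
            rho = 1.0 - ne                            # fixed ion background + electrons
            k = 2 * np.pi * np.fft.fftfreq(ng, d=dx)  # angular wavenumbers
            k[0] = 1.0                                # avoid dividing by zero
            Ek = -1j * np.fft.fft(rho) / k            # from ik E_k = rho_k
            Ek[0] = 0.0                               # no mean field
            E = np.real(np.fft.ifft(Ek))
            vp -= dt * E[cells]                       # electron charge is -1
            xp = (xp + dt * vp) % L

        print("field energy:", 0.5 * np.sum(E**2) * dx)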

  1. Nature does not rely on long-lived electronic quantum coherence for photosynthetic energy transfer.

    PubMed

    Duan, Hong-Guang; Prokhorenko, Valentyn I; Cogdell, Richard J; Ashraf, Khuram; Stevens, Amy L; Thorwart, Michael; Miller, R J Dwayne

    2017-08-08

    During the first steps of photosynthesis, the energy of impinging solar photons is transformed into electronic excitation energy of the light-harvesting biomolecular complexes. The subsequent energy transfer to the reaction center is commonly rationalized in terms of excitons moving on a grid of biomolecular chromophores on typical timescales ≪100 fs. Today's understanding of the energy transfer includes the fact that the excitons are delocalized over a few neighboring sites, but the role of quantum coherence is considered as irrelevant for the transfer dynamics because it typically decays within a few tens of femtoseconds. This orthodox picture of incoherent energy transfer between clusters of a few pigments sharing delocalized excitons has been challenged by ultrafast optical spectroscopy experiments with the Fenna-Matthews-Olson protein, in which interference oscillatory signals up to 1.5 ps were reported and interpreted as direct evidence of exceptionally long-lived electronic quantum coherence. Here, we show that the optical 2D photon echo spectra of this complex at ambient temperature in aqueous solution do not provide evidence of any long-lived electronic quantum coherence, but confirm the orthodox view of rapidly decaying electronic quantum coherence on a timescale of 60 fs. Our results can be considered as generic and give no hint that electronic quantum coherence plays any biofunctional role in real photoactive biomolecular complexes. Because in this structurally well-defined protein the distances between bacteriochlorophylls are comparable to those of other light-harvesting complexes, we anticipate that this finding is general and directly applies to even larger photoactive biomolecular complexes.

  2. Nature does not rely on long-lived electronic quantum coherence for photosynthetic energy transfer

    NASA Astrophysics Data System (ADS)

    Duan, Hong-Guang; Prokhorenko, Valentyn I.; Cogdell, Richard J.; Ashraf, Khuram; Stevens, Amy L.; Thorwart, Michael; Miller, R. J. Dwayne

    2017-08-01

    During the first steps of photosynthesis, the energy of impinging solar photons is transformed into electronic excitation energy of the light-harvesting biomolecular complexes. The subsequent energy transfer to the reaction center is commonly rationalized in terms of excitons moving on a grid of biomolecular chromophores on typical timescales <<100 fs. Today’s understanding of the energy transfer includes the fact that the excitons are delocalized over a few neighboring sites, but the role of quantum coherence is considered as irrelevant for the transfer dynamics because it typically decays within a few tens of femtoseconds. This orthodox picture of incoherent energy transfer between clusters of a few pigments sharing delocalized excitons has been challenged by ultrafast optical spectroscopy experiments with the Fenna-Matthews-Olson protein, in which interference oscillatory signals up to 1.5 ps were reported and interpreted as direct evidence of exceptionally long-lived electronic quantum coherence. Here, we show that the optical 2D photon echo spectra of this complex at ambient temperature in aqueous solution do not provide evidence of any long-lived electronic quantum coherence, but confirm the orthodox view of rapidly decaying electronic quantum coherence on a timescale of 60 fs. Our results can be considered as generic and give no hint that electronic quantum coherence plays any biofunctional role in real photoactive biomolecular complexes. Because in this structurally well-defined protein the distances between bacteriochlorophylls are comparable to those of other light-harvesting complexes, we anticipate that this finding is general and directly applies to even larger photoactive biomolecular complexes.

  3. Experimental evaluation of multiprocessor cache-based error recovery

    NASA Technical Reports Server (NTRS)

    Janssens, Bob; Fuchs, W. K.

    1991-01-01

    Several variations of cache-based checkpointing for rollback error recovery in shared-memory multiprocessors have recently been developed. By modifying the cache replacement policy, these techniques use the inherent redundancy in the memory hierarchy to periodically checkpoint the computation state. Three schemes, different in the manner in which they avoid rollback propagation, are evaluated. By simulation with address traces from parallel applications running on an Encore Multimax shared-memory multiprocessor, the performance effect of integrating the recovery schemes in the cache coherence protocol is evaluated. The results indicate that the cache-based schemes can provide checkpointing capability with low performance overhead but with uncontrollably high variability in the checkpoint interval.
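
    As a rough illustration of the mechanism (not of any of the three schemes evaluated), the toy model below takes a checkpoint whenever a dirty line would otherwise leave the cache, so that main memory always holds a consistent recovery point; all structures are simplified assumptions.

        # Toy direct-mapped cache with checkpointing on dirty-line eviction:
        # flushing every dirty line at once establishes a consistent state in
        # memory, which rollback can later restore. Purely illustrative.
        class CheckpointingCache:
            def __init__(self, n_lines: int):
                self.n_lines = n_lines
                self.tags = [None] * n_lines     # address held by each slot
                self.dirty = [False] * n_lines
                self.values = [None] * n_lines
                self.memory = {}                 # checkpointed backing store
                self.checkpoints = 0

            def write(self, addr: int, value) -> None:
                slot = addr % self.n_lines
                if self.dirty[slot] and self.tags[slot] != addr:
                    self.checkpoint()            # eviction of dirty data forces a checkpoint
                self.tags[slot], self.values[slot] = addr, value
                self.dirty[slot] = True

            def checkpoint(self) -> None:
                for s in range(self.n_lines):    # flush all dirty lines together
                    if self.dirty[s]:
                        self.memory[self.tags[s]] = self.values[s]
                        self.dirty[s] = False
                self.checkpoints += 1

            def rollback(self) -> dict:
                # Discard un-checkpointed cache state; memory is the recovery point.
                self.tags = [None] * self.n_lines
                self.dirty = [False] * self.n_lines
                self.values = [None] * self.n_lines
                return self.memory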

  4. Real-time polarization-sensitive optical coherence tomography data processing with parallel computing

    PubMed Central

    Liu, Gangjun; Zhang, Jun; Yu, Lingfeng; Xie, Tuqiang; Chen, Zhongping

    2010-01-01

    With the increase of the A-line speed of optical coherence tomography (OCT) systems, real-time processing of acquired data has become a bottleneck. The shared-memory parallel computing technique is used to process OCT data in real time. The real-time processing power of a quad-core personal computer (PC) is analyzed. It is shown that the quad-core PC could provide real-time OCT data processing ability of more than 80K A-lines per second. A real-time, fiber-based, swept source polarization-sensitive OCT system with 20K A-line speed is demonstrated with this technique. The real-time 2D and 3D polarization-sensitive imaging of chicken muscle and pig tendon is also demonstrated. PMID:19904337
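
    A minimal sketch of the kind of per-A-line processing that parallelizes this way (DC-background removal followed by an FFT into a depth profile), distributed over four worker processes; the array sizes and processing steps are generic assumptions, not the paper's actual pipeline.

        # Generic shared-memory parallel A-line processing: each worker
        # removes the DC background from one spectrum and Fourier-transforms
        # it into a depth profile. Sizes are arbitrary placeholders.
        import numpy as np
        from multiprocessing import Pool

        def process_aline(spectrum: np.ndarray) -> np.ndarray:
            spectrum = spectrum - spectrum.mean()          # background removal
            return np.abs(np.fft.fft(spectrum))[: spectrum.size // 2]

        if __name__ == "__main__":
            frame = np.random.rand(1000, 2048)             # 1000 A-lines x 2048 samples
            with Pool(processes=4) as pool:                # quad-core, as in the paper
                bscan = np.array(pool.map(process_aline, frame))
            print(bscan.shape)                             # (1000, 1024) depth profiles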

  5. SU(1,1) and SU(2) approaches to the radial oscillator: Generalized coherent states and squeezing of variances

    NASA Astrophysics Data System (ADS)

    Rosas-Ortiz, Oscar; Cruz y Cruz, Sara; Enríquez, Marco

    2016-10-01

    It is shown that each one of the Lie algebras su(1,1) and su(2) determines the spectrum of the radial oscillator. States that share the same orbital angular momentum are used to construct the representation spaces of the non-compact Lie group SU(1,1). In addition, three different forms of obtaining the representation spaces of the compact Lie group SU(2) are introduced; they are based on the accidental degeneracies associated with the spherical symmetry of the system as well as on the selection rules that govern the transitions between different energy levels. In all cases the corresponding generalized coherent states are constructed and the conditions to squeeze the involved quadratures are analyzed.
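
    For orientation, the generalized (Perelomov-type) coherent states associated with su(1,1) are conventionally written as follows; this is the standard textbook form, supplied here for context rather than quoted from the paper itself.

        % Perelomov coherent states for su(1,1) with Bargmann index k (textbook form):
        \[
          \lvert z;k\rangle = (1-\lvert z\rvert^{2})^{k}
          \sum_{n=0}^{\infty}\sqrt{\frac{\Gamma(n+2k)}{n!\,\Gamma(2k)}}\;
          z^{n}\,\lvert n;k\rangle , \qquad \lvert z\rvert < 1 .
        \]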

  6. Influence of Smartphones and Software on Acoustic Voice Measures

    PubMed Central

    GRILLO, ELIZABETH U.; BROSIOUS, JENNA N.; SORRELL, STACI L.; ANAND, SUPRAJA

    2016-01-01

    This study assessed the within-subject variability of voice measures captured using different recording devices (i.e., smartphones and a head-mounted microphone) and software programs (i.e., Analysis of Dysphonia in Speech and Voice (ADSV), Multi-dimensional Voice Program (MDVP), and Praat). Correlations between the software programs that calculated the voice measures were also analyzed. Results demonstrated no significant within-subject variability across devices and software, and that some of the measures were highly correlated across software programs. The study suggests that certain smartphones may be appropriate to record daily voice measures representing the effects of vocal loading within individuals. In addition, even though different algorithms are used to compute voice measures across software programs, some of the programs and measures share a similar relationship.

  7. The Architecture Based Design Method

    DTIC Science & Technology

    2000-01-01

    implementation of components of different types. The software templates include a description of how components interact with shared services and also include citizenship responsibilities for components.

  8. BicPAMS: software for biological data analysis with pattern-based biclustering.

    PubMed

    Henriques, Rui; Ferreira, Francisco L; Madeira, Sara C

    2017-02-02

    Biclustering has been largely applied for the unsupervised analysis of biological data, being recognised today as a key technique to discover putative modules in both expression data (subsets of genes correlated in subsets of conditions) and network data (groups of coherently interconnected biological entities). However, given its computational complexity, only recent breakthroughs on pattern-based biclustering enabled efficient searches without the restrictions that state-of-the-art biclustering algorithms place on the structure and homogeneity of biclusters. As a result, pattern-based biclustering provides the unprecedented opportunity to discover non-trivial yet meaningful biological modules with putative functions, whose coherency and tolerance to noise can be tuned and made problem-specific. To enable the effective use of pattern-based biclustering by the scientific community, we developed BicPAMS (Biclustering based on PAttern Mining Software), a software package that: 1) makes available state-of-the-art pattern-based biclustering algorithms (BicPAM (Henriques and Madeira, Alg Mol Biol 9:27, 2014), BicNET (Henriques and Madeira, Alg Mol Biol 11:23, 2016), BicSPAM (Henriques and Madeira, BMC Bioinforma 15:130, 2014), BiC2PAM (Henriques and Madeira, Alg Mol Biol 11:1-30, 2016), BiP (Henriques and Madeira, IEEE/ACM Trans Comput Biol Bioinforma, 2015), DeBi (Serin and Vingron, AMB 6:1-12, 2011) and BiModule (Okada et al., IPSJ Trans Bioinf 48(SIG5):39-48, 2007)); 2) consistently integrates their dispersed contributions; 3) further explores additional accuracy and efficiency gains; and 4) makes available graphical and application programming interfaces. Results on both synthetic and real data confirm the relevance of BicPAMS for biological data analysis, highlighting its essential role for the discovery of putative modules with non-trivial yet biologically significant functions from expression and network data. BicPAMS is the first biclustering tool offering the possibility to: 1) parametrically customize the structure, coherency and quality of biclusters; 2) analyze large-scale biological networks; and 3) tackle the restrictive assumptions placed by state-of-the-art biclustering algorithms. These contributions are shown to be key for an adequate, complete and user-assisted unsupervised analysis of biological data. BicPAMS and its tutorial are available at http://www.bicpams.com .
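
    The core idea of pattern-based biclustering can be conveyed with a toy sketch: discretize the matrix into symbols, then group rows that share an identical symbolic pattern on some column subset. The sketch below is only that toy idea in Python; it is emphatically not the BicPAM/BicNET family of algorithms.

        # Toy constant-pattern biclustering: rows agreeing symbol-for-symbol
        # on a set of columns form a bicluster. Not the BicPAMS algorithms.
        from itertools import combinations
        import numpy as np

        def toy_biclusters(X: np.ndarray, n_bins: int = 3, n_cols: int = 2, min_rows: int = 2):
            cuts = np.quantile(X, np.linspace(0, 1, n_bins + 1)[1:-1])
            symbols = np.digitize(X, cuts)                 # discretized matrix
            biclusters = []
            for cols in combinations(range(X.shape[1]), n_cols):
                groups = {}
                for r in range(X.shape[0]):
                    groups.setdefault(tuple(symbols[r, list(cols)]), []).append(r)
                biclusters += [(rows, list(cols))
                               for rows in groups.values() if len(rows) >= min_rows]
            return biclusters

        X = np.random.rand(20, 6)
        print(len(toy_biclusters(X)))                      # number of toy modules found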

  9. Orthographic Learning and the Role of Text-to-Speech Software in Dutch Disabled Readers

    ERIC Educational Resources Information Center

    Staels, Eva; Van den Broeck, Wim

    2015-01-01

    In this study, we examined whether orthographic learning can be demonstrated in disabled readers learning to read in a transparent orthography (Dutch). In addition, we tested the effect of the use of text-to-speech software, a new form of direct instruction, on orthographic learning. Both research goals were investigated by replicating Share's…

  10. Potential of the Cogex Software Platform to Replace Logbooks in Capstone Design Projects

    ERIC Educational Resources Information Center

    Foley, David; Charron, François; Plante, Jean-Sébastien

    2018-01-01

    Recent technologies are offering the power to share and grow knowledge and ideas in unprecedented ways. The CogEx software platform was developed to take advantage of the digital world with innovative ideas to support designers work in both industrial and academic contexts. This paper presents a qualitative study on the usage of CogEx during…

  11. A Dozen Years after Open Source's 1998 Birth, It's Time for "OpenTechComm"

    ERIC Educational Resources Information Center

    Still, Brian

    2010-01-01

    2008 marked the 10-year Anniversary of the Open Source movement, which has had a substantial impact on not only software production and adoption, but also on the sharing and distribution of information. Technical communication as a discipline has taken some advantage of the movement or its derivative software, but this article argues not as much…

  12. Communication and Organization in Software Development: An Empirical Study

    NASA Technical Reports Server (NTRS)

    Seaman, Carolyn B.; Basili, Victor R.

    1996-01-01

    The empirical study described in this paper addresses the issue of communication among members of a software development organization. The independent variables are various attributes of organizational structure. The dependent variable is the effort spent on sharing information which is required by the software development process in use. The research questions upon which the study is based ask whether or not these attributes of organizational structure have an effect on the amount of communication effort expended. In addition, there are a number of blocking variables which have been identified. These are used to account for factors other than organizational structure which may have an effect on communication effort. The study uses both quantitative and qualitative methods for data collection and analysis. These methods include participant observation, structured interviews, and graphical data presentation. The results of this study indicate that several attributes of organizational structure do affect communication effort, but not in a simple, straightforward way. In particular, the distances between communicators in the reporting structure of the organization, as well as in the physical layout of offices, affects how quickly they can share needed information, especially during meetings. These results provide a better understanding of how organizational structure helps or hinders communication in software development.

  13. Sptrace

    NASA Technical Reports Server (NTRS)

    Burleigh, Scott C.

    2011-01-01

    Sptrace is a general-purpose space utilization tracing system that is conceptually similar to the commercial Purify product used to detect leaks and other memory usage errors. It is designed to monitor space utilization in any sort of heap, i.e., a region of data storage on some device (nominally memory; possibly shared and possibly persistent) with a flat address space. This software can trace usage of shared and/or non-volatile storage in addition to private RAM (random access memory). Sptrace is implemented as a set of C function calls that are invoked from within the software that is being examined. The function calls fall into two broad classes: (1) functions that are embedded within the heap management software [e.g., JPL's SDR (Simple Data Recorder) and PSM (Personal Space Management) systems] to enable heap usage analysis by populating a virtual time-sequenced log of usage activity, and (2) reporting functions that are embedded within the application program whose behavior is suspect. For ease of use, these functions may be wrapped privately inside public functions offered by the heap management software. Sptrace can be used for VxWorks or RTEMS realtime systems as easily as for Linux or OS/X systems.
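
    The two classes of calls can be pictured with a small Python analogue (names invented here, not the Sptrace API): hooks embedded in a heap manager append to a time-sequenced log, while a reporting call, invoked from the application under suspicion, summarizes blocks that were never freed.

        # Conceptual analogue of the two call classes, with invented names:
        # trace_alloc/trace_free would live inside the heap manager, while
        # report_leaks is called from the application being examined.
        import time

        trace_log = []          # virtual time-sequenced log of usage activity
        live_blocks = {}        # address -> size of blocks not yet freed

        def trace_alloc(addr: int, size: int) -> None:
            trace_log.append((time.time(), "alloc", addr, size))
            live_blocks[addr] = size

        def trace_free(addr: int) -> None:
            trace_log.append((time.time(), "free", addr, live_blocks.pop(addr, 0)))

        def report_leaks() -> None:
            for addr, size in live_blocks.items():
                print(f"leak: {size} bytes still allocated at {addr:#x}")

        trace_alloc(0x1000, 64)
        trace_alloc(0x2000, 128)
        trace_free(0x1000)
        report_leaks()          # -> leak: 128 bytes still allocated at 0x2000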

  14. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks

    PubMed Central

    2016-01-01

    Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to display locally or remotely in an internet browser, which can be saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus—a model system to investigate gut-brain communication, for example, in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software—an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research. PMID:27098025
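
    As a flavor of the kind of analysis cell such a notebook might contain, the sketch below does threshold-based spike detection with numpy; the 4-sigma threshold and the MAD-based noise estimate are common conventions assumed here for illustration, not the specific methods of the report.

        # Notebook-style spike detection: estimate the noise level robustly
        # with the median absolute deviation, then take upward threshold
        # crossings as spike times. Threshold choice is a common convention.
        import numpy as np

        def detect_spikes(trace: np.ndarray, fs: float, k: float = 4.0) -> np.ndarray:
            sigma = np.median(np.abs(trace)) / 0.6745      # robust noise SD
            above = trace > k * sigma
            onsets = np.flatnonzero(above[1:] & ~above[:-1]) + 1
            return onsets / fs                             # spike times in seconds

        fs = 20000.0                                       # assumed sampling rate (Hz)
        trace = np.random.randn(int(2 * fs))               # placeholder recording
        print(detect_spikes(trace, fs)[:5])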

  15. An exchange format for use-cases of hospital information systems.

    PubMed

    Masuda, G; Sakamoto, N; Sakai, R; Yamamoto, R

    2001-01-01

    Object-oriented software development is a powerful methodology for the development of large hospital information systems. We think the use-case driven approach is particularly useful for such development. In the use-case driven approach, use-cases are documented at the first stage of the software development process and are then used throughout all subsequent steps in a variety of ways. Therefore, it is important to exchange and share the use-cases and make effective use of them throughout the overall lifecycle of a development process. In this paper, we propose a method of sharing and exchanging use-case models between applications, developers, and projects. We design an XML-based exchange format for use-cases. We then discuss an application of the exchange format to support several software development activities. We preliminarily implemented a support system for object-oriented analysis based on the exchange format. The result shows that using the structural and semantic information in the exchange format enables the support system to assist object-oriented analysis successfully.
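
    The flavor of such an exchange can be sketched in a few lines; the element names below (usecase, actor, step) are invented for illustration and are not the schema proposed in the paper.

        # Writing a use-case to an XML exchange string and reading it back;
        # element names are invented placeholders, not the paper's schema.
        import xml.etree.ElementTree as ET

        uc = ET.Element("usecase", name="RegisterPatient")
        ET.SubElement(uc, "actor").text = "ReceptionClerk"
        steps = ET.SubElement(uc, "steps")
        for text in ["Enter patient demographics", "Assign patient ID", "Print wristband"]:
            ET.SubElement(steps, "step").text = text

        document = ET.tostring(uc, encoding="unicode")     # string shared between tools

        parsed = ET.fromstring(document)                   # receiving tool re-reads it
        print(parsed.get("name"), [s.text for s in parsed.iter("step")])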

  16. The sensitivity of hearing-impaired adults to acoustic attributes in simulated rooms

    PubMed Central

    Whitmer, William M.; McShefferty, David; Akeroyd, Michael A.

    2016-01-01

    In previous studies we have shown that older hearing-impaired individuals are relatively insensitive to changes in the apparent width of broadband noises when those width changes were based on differences in interaural coherence [W. Whitmer, B. Seeber and M. Akeroyd, J. Acoust. Soc. Am. 132, 369-379 (2012)]. This insensitivity has been linked to senescent difficulties in resolving binaural fine-structure differences. It is therefore possible that interaural coherence, despite its widespread use, may not be the best acoustic surrogate of spatial perception for the aged and impaired. To test this, we simulated the room impulse responses for various acoustic scenarios with differing coherence and lateral (energy) fraction attributes using room modelling software (ODEON). Bilaterally impaired adult participants were asked to sketch the perceived size of speech tokens and musical excerpts that were convolved with these impulse responses and presented to them in a sound-dampened enclosure through a 24-loudspeaker array. Participants’ binaural acuity was also measured using an interaural phase discrimination task. Corroborating our previous findings, the results showed less sensitivity to interaural coherence in the auditory source width judgments of older hearing-impaired individuals, indicating that alternate acoustic measurements in the design of spaces for the elderly may be necessary. PMID:27213028

  17. The sensitivity of hearing-impaired adults to acoustic attributes in simulated rooms.

    PubMed

    Whitmer, William M; McShefferty, David; Akeroyd, Michael A

    2013-06-02

    In previous studies we have shown that older hearing-impaired individuals are relatively insensitive to changes in the apparent width of broadband noises when those width changes were based on differences in interaural coherence [W. Whitmer, B. Seeber and M. Akeroyd, J. Acoust. Soc. Am. 132, 369-379 (2012)]. This insensitivity has been linked to senescent difficulties in resolving binaural fine-structure differences. It is therefore possible that interaural coherence, despite its widespread use, may not be the best acoustic surrogate of spatial perception for the aged and impaired. To test this, we simulated the room impulse responses for various acoustic scenarios with differing coherence and lateral (energy) fraction attributes using room modelling software (ODEON). Bilaterally impaired adult participants were asked to sketch the perceived size of speech tokens and musical excerpts that were convolved with these impulse responses and presented to them in a sound-dampened enclosure through a 24-loudspeaker array. Participants' binaural acuity was also measured using an interaural phase discrimination task. Corroborating our previous findings, the results showed less sensitivity to interaural coherence in the auditory source width judgments of older hearing-impaired individuals, indicating that alternate acoustic measurements in the design of spaces for the elderly may be necessary.

  18. Performance Modeling and Measurement of Parallelized Code for Distributed Shared Memory Multiprocessors

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry

    1998-01-01

    This paper presents a model to evaluate the performance and overhead of parallelizing sequential code using compiler directives for multiprocessing on distributed shared memory (DSM) systems. With the increasing popularity of shared address space architectures, it is essential to understand their performance impact on programs that benefit from shared memory multiprocessing. We present a simple model to characterize the performance of programs that are parallelized using compiler directives for shared memory multiprocessing. We parallelized the sequential implementation of the NAS benchmarks using native Fortran77 compiler directives for an Origin2000, which is a DSM system based on a cache-coherent Non-Uniform Memory Access (ccNUMA) architecture. We report measurement-based performance of these parallelized benchmarks from four perspectives: efficacy of the parallelization process; scalability; parallelization overhead; and comparison with hand-parallelized and -optimized versions of the same benchmarks. Our results indicate that sequential programs can conveniently be parallelized for DSM systems using compiler directives, but realizing performance gains as predicted by the performance model depends primarily on minimizing architecture-specific data locality overhead.
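
    The paper's model is not reproduced here, but a generic Amdahl-style form with an explicit overhead term conveys the shape of such models, under the assumption that f is the parallelized fraction of the code and T_ovh(p) collects parallelization and data-locality overheads:

        % Generic Amdahl-style model with an overhead term (illustrative form):
        \[
          T(p) \approx T_{\mathrm{seq}}\Bigl((1-f) + \tfrac{f}{p}\Bigr) + T_{\mathrm{ovh}}(p),
          \qquad
          S(p) = \frac{T_{\mathrm{seq}}}{T(p)} .
        \]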

  19. Evaluation of actual vs expected photodynamic therapy spot size.

    PubMed

    Ranchod, Tushar M; Brucker, Alexander J; Liu, Chengcheng; Cukras, Catherine A; Hopkins, Tim B; Ying, Gui-Shuang

    2009-05-01

    To determine the accuracy of the photodynamic therapy (PDT) laser spot size on the retina as generated by 2 Food and Drug Administration (FDA)-approved lasers. Prospective observational case series. Fundus photographs were taken of 1 eye of each of 10 subjects with the WinStation 4000 fundus photography system (OIS; Ophthalmic Imaging Systems, Sacramento, California, USA); disc size was calculated using OIS software. Slit-lamp photographs were taken of the PDT laser spot focused on the retina adjacent to the optic disc, using various spot sizes in combination with 3 different contact lenses and 2 different lasers. Spot size at the retina was determined by measuring the ratio of disc diameter to spot diameter in Adobe Photoshop (San Jose, California, USA) and applying this ratio to the OIS disc measurements. Spot size at the retina averaged 87% of expected spot size for the Coherent Opal laser (Coherent Inc, Santa Clara, California, USA) and 104% of expected spot size for the Zeiss Visulas laser (Carl Zeiss Meditec Inc, Dublin, California, USA) (P = .002). Multivariate analysis demonstrated that the percentage of expected spot size decreased with larger spot diameter (P = .01 for Coherent laser; P = .02 for Zeiss laser). PDT spot size at the retina appears to be consistently smaller than expected for the Coherent laser, while the spot size was consistently within 10% of expected size for the Zeiss laser. The deviation from expected size increased with larger spot size using the Coherent laser.
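
    The measurement arithmetic described is simple enough to state as a worked example; all of the numbers below are invented for illustration.

        # Apply the photograph-measured spot/disc ratio to the OIS-calibrated
        # disc size to get the spot size at the retina. Numbers are invented.
        disc_um = 1800.0                       # disc diameter from OIS (micrometers)
        spot_px, disc_px = 420.0, 300.0        # diameters measured in Photoshop (pixels)

        spot_um = disc_um * (spot_px / disc_px)
        expected_um = 3000.0                   # nominal laser spot setting
        print(f"{spot_um:.0f} um = {100 * spot_um / expected_um:.0f}% of expected")
        # -> 2520 um = 84% of expected, i.e., smaller than the nominal spot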

  20. EPOS Multi-Scale Laboratory platform: a long-term reference tool for experimental Earth Sciences

    NASA Astrophysics Data System (ADS)

    Trippanera, Daniele; Tesei, Telemaco; Funiciello, Francesca; Sagnotti, Leonardo; Scarlato, Piergiorgio; Rosenau, Matthias; Elger, Kirsten; Ulbricht, Damian; Lange, Otto; Calignano, Elisa; Spiers, Chris; Drury, Martin; Willingshofer, Ernst; Winkler, Aldo

    2017-04-01

    With continuous progress in scientific research, a large amount of data has been and will be produced. Accessing and sharing these data, along with storing and homogenizing them within a unique and coherent framework, is a new challenge for the whole scientific community. This is particularly emphasized for geo-scientific laboratories, which encompass the most diverse Earth Science disciplines and typologies of data. To this aim the "Multiscale Laboratories" Work Package (WP16), operating in the framework of the European Plate Observing System (EPOS), is developing a virtual platform of geo-scientific data and services for the worldwide community of laboratories. This long-term project aims to merge top-class multidisciplinary laboratories in Geoscience into a coherent and collaborative network, facilitating the standardization of virtual access to data, data products and software. This will help our community to evolve beyond the stage in which most of the data produced by the different laboratories are available only within the related scholarly publications (often as print-version only) or remain unpublished and inaccessible on local devices. The EPOS multi-scale laboratory platform will provide the possibility to easily share and discover data by means of open-access, DOI-referenced, online data publication, including long-term storage, management and curation services, and to set up a cohesive community of laboratories. WP16 is starting with three pilot-case laboratory types: (1) rock physics, (2) palaeomagnetism, and (3) analogue modelling. As a proof of concept, the first analogue modelling datasets have been published via GFZ Data Services (http://doidb.wdc-terra.org/search/public/ui?&sort=updated+desc&q=epos). The datasets include rock-analogue material properties (e.g. friction data, rheology data, SEM imagery), as well as supplementary figures, images and movies from experiments on tectonic processes. A metadata catalogue tailored to the specific communities will link the growing number of datasets to a centralized EPOS hub. Acknowledging that the data infrastructures available in the different labs vary in maturity, we have set up an architecture that provides different scenarios for participation. Thus, research groups which do not have access to localized repositories and catalogues for sustainable storage of data and metadata can rely on shared services within the Multi-scale Laboratories community. As an example of the usage of data retrieved through the community, an experimentalist deciding which material is suitable for an experimental setup can get "virtual lab access" to retrieve material parameters with minimal effort, and may then decide to work in a specific laboratory equipped with the needed instruments. The currently participating and collaborating laboratories (Utrecht University, GFZ, Roma Tre University, INGV, NERC, CSIC-ICTJA, CNRS, LMU, UBI, ETH, CNR) warmly welcome everyone who is interested in contributing to the development of this project.

  1. ImTK: an open source multi-center information management toolkit

    NASA Astrophysics Data System (ADS)

    Alaoui, Adil; Ingeholm, Mary Lou; Padh, Shilpa; Dorobantu, Mihai; Desai, Mihir; Cleary, Kevin; Mun, Seong K.

    2008-03-01

    The Information Management Toolkit (ImTK) Consortium is an open source initiative to develop robust, freely available tools related to the information management needs of basic, clinical, and translational research. An open source framework and agile programming methodology can enable distributed software development while an open architecture will encourage interoperability across different environments. The ISIS Center has conceptualized a prototype data sharing network that simulates a multi-center environment based on a federated data access model. This model includes the development of software tools to enable efficient exchange, sharing, management, and analysis of multimedia medical information such as clinical information, images, and bioinformatics data from multiple data sources. The envisioned ImTK data environment will include an open architecture and data model implementation that complies with existing standards such as Digital Imaging and Communications in Medicine (DICOM), Health Level 7 (HL7), and the technical framework and workflow defined by the Integrating the Healthcare Enterprise (IHE) Information Technology Infrastructure initiative, mainly the Cross Enterprise Document Sharing (XDS) specifications.

  2. Performance Characterization of LCLS-II Superconducting Radiofrequency Cryomodules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gregory, RuthAnn

    This paper will describe the LCLS (Linac Coherent Light Source)-II, Fermilab’s role in the development of LCLS-II, and my contributions as a Lee Teng intern. LCLS-II is a second generation x-ray free electron laser being constructed at SLAC National Accelerator Laboratory. Fermilab is responsible for the design, construction, and testing of several 1.3 GHz cryomodules to be used in LCLS-II. These cryomodules are currently being tested at Fermilab. Some software was written to analyze the data from the cryomodule tests. This software assesses the performance of the cryomodules by looking at data on the cavity voltage, cavity gradient, dark current, and radiation.

  3. Shared genetic determinants of axial length and height in children: the Guangzhou twin eye study.

    PubMed

    Zhang, Jian; Hur, Yoon-Mi; Huang, Wenyong; Ding, Xiaohu; Feng, Ke; He, Mingguang

    2011-01-01

    To describe the association between axial length (AL) and height and to estimate the extent to which shared genetic or environmental factors influence this covariance. Study participants were recruited from the Guangzhou Twin Registry. Axial length was measured using partial coherence laser interferometry. Height was measured with the participants standing without shoes. We computed twin pairwise correlations and cross-twin cross-trait correlations between AL and height for monozygotic and dizygotic twins and performed model-fitting analyses using a multivariate Cholesky model. The right eye was arbitrarily selected to represent AL of participants. Five hundred sixty-five twin pairs (359 monozygotic and 206 dizygotic) aged 7 to 15 years were available for analysis. Phenotypic correlation between AL and height was 0.46 but decreased to 0.19 after adjusting for age, sex, and age × sex interaction. Bivariate Cholesky model-fitting analyses revealed that 89% of phenotypic correlation was due to shared genetic factors and 11% was due to shared random environmental factors, which includes measurement error. Covariance of AL and height is largely attributable to shared genes. Given that AL is a key determinant of myopia, further work is needed to confirm gene sharing between myopia and stature.
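
    In the standard bivariate decomposition underlying such analyses (textbook form, not quoted from the paper), the phenotypic correlation splits into genetic and environmental paths:

        % Bivariate decomposition of the phenotypic correlation (textbook form):
        \[
          r_{\mathrm{ph}} \;=\; \underbrace{h_{1}h_{2}\,r_{A}}_{\text{shared genes}}
          \;+\; \underbrace{e_{1}e_{2}\,r_{E}}_{\text{shared environment}},
        \]
        % where h_i and e_i are the square roots of the genetic and environmental
        % variance fractions of each trait, and r_A, r_E are the genetic and
        % environmental correlations; here the genetic path accounts for 89%.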

  4. Creating a YouTube-Like Collaborative Environment in Mathematics: Integrating Animated Geogebra Constructions and Student-Generated Screencast Videos

    ERIC Educational Resources Information Center

    Lazarus, Jill; Roulet, Geoffrey

    2013-01-01

    This article discusses the integration of student-generated GeoGebra applets and Jing screencast videos to create a YouTube-like medium for sharing in mathematics. The value of combining dynamic mathematics software and screencast videos for facilitating communication and representations in a digital era is demonstrated herein. We share our…

  5. CONNJUR Workflow Builder: A software integration environment for spectral reconstruction

    PubMed Central

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O.; Ellis, Heidi J.C.; Gryk, Michael R.

    2015-01-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses. PMID:26066803

  6. The Commercial Open Source Business Model

    NASA Astrophysics Data System (ADS)

    Riehle, Dirk

    Commercial open source software projects are open source software projects that are owned by a single firm that derives a direct and significant revenue stream from the software. Commercial open source at first glance represents an economic paradox: How can a firm earn money if it is making its product available for free as open source? This paper presents the core properties of commercial open source business models and discusses how they work. Using a commercial open source approach, firms can get to market faster with a superior product at lower cost than possible for traditional competitors. The paper shows how these benefits accrue from an engaged and self-supporting user community. Lacking any prior comprehensive reference, this paper is based on an analysis of public statements by practitioners of commercial open source. It forges the various anecdotes into a coherent description of revenue generation strategies and relevant business functions.

  7. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction.

    PubMed

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O; Ellis, Heidi J C; Gryk, Michael R

    2015-07-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  8. Low cost and compact quantum key distribution

    NASA Astrophysics Data System (ADS)

    Duligall, J. L.; Godfrey, M. S.; Harrison, K. A.; Munro, W. J.; Rarity, J. G.

    2006-10-01

    We present the design of a novel free-space quantum cryptography system, complete with purpose-built software, that can operate in daylight conditions. The transmitter and receiver modules are built using inexpensive off-the-shelf components. Both modules are compact allowing the generation of renewed shared secrets on demand over a short range of a few metres. An analysis of the software is shown as well as results of error rates and therefore shared secret yields at varying background light levels. As the system is designed to eventually work in short-range consumer applications, we also present a use scenario where the consumer can regularly 'top up' a store of secrets for use in a variety of one-time-pad (OTP) and authentication protocols.
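
    The "top up a store of secrets" usage reduces to consuming shared key bytes exactly once; a minimal sketch of that consumption step (with a random stand-in for the QKD output) follows.

        # One-time-pad consumption of a stored secret: each pad byte is XORed
        # with one message byte and then discarded so it can never be reused.
        import secrets

        key_store = bytearray(secrets.token_bytes(1024))   # stand-in for QKD output

        def otp(message: bytes) -> bytes:
            pad = bytes(key_store[: len(message)])
            del key_store[: len(message)]                  # never reuse pad bytes
            return bytes(m ^ p for m, p in zip(message, pad))

        cipher = otp(b"meter reading: 42")
        # the other party decrypts by XORing with its own copy of the same pad bytes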

  9. Laboratory demonstration of Stellar Intensity Interferometry using a software correlator

    NASA Astrophysics Data System (ADS)

    Matthews, Nolan; Kieda, David

    2017-06-01

    In this talk I will present measurements of the spatial coherence function of laboratory thermal (black-body) sources using Hanbury-Brown and Twiss interferometry with a digital off-line correlator. Correlations in the intensity fluctuations of a thermal source, such as a star, allow retrieval of the second order coherence function which can be used to perform high resolution imaging and source geometry characterization. We also demonstrate that intensity fluctuations between orthogonal polarization states are uncorrelated but can be used to reduce systematic noise. The work performed here can readily be applied to existing and future Imaging Air-Cherenkov telescopes to measure spatial properties of stellar sources. Some possible candidates for astronomy applications include close binary star systems, fast rotators, Cepheid variables, and potentially even exoplanet characterization.
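
    The off-line correlator's central computation can be sketched directly: the normalized intensity correlation g2(tau) = <I1(t)I2(t+tau)> / (<I1><I2>) over two digitized photocurrent streams. The implementation below is a generic sketch, not the actual analysis code.

        # Software correlator sketch: normalized intensity cross-correlation
        # g2 of two digitized photocurrents as a function of lag.
        import numpy as np

        def g2(i1: np.ndarray, i2: np.ndarray, max_lag: int) -> np.ndarray:
            norm = i1.mean() * i2.mean()
            return np.array([np.mean(i1[: i1.size - lag] * i2[lag:]) / norm
                             for lag in range(max_lag)])

        # For thermal light within the coherence time, g2 near zero lag rises
        # above the uncorrelated baseline of 1, which is the HBT signature.
        i1, i2 = np.random.rand(100000), np.random.rand(100000)
        print(g2(i1, i2, 5))               # ~1.0 everywhere for independent noise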

  10. EMMA: a new paradigm in configurable software

    DOE PAGES

    Nogiec, J. M.; Trombly-Freytag, K.

    2017-11-23

    EMMA is a framework designed to create a family of configurable software systems, with emphasis on extensibility and flexibility. It is based on a loosely coupled, event driven architecture. The EMMA framework has been built upon the premise of composing software systems from independent components. It opens up opportunities for reuse of components and their functionality and composing them together in many different ways. As a result, it provides the developer of test and measurement applications with a lightweight alternative to microservices, while sharing their various advantages, including composability, loose coupling, encapsulation, and reuse.

  11. EMMA: A New Paradigm in Configurable Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nogiec, J. M.; Trombly-Freytag, K.

    EMMA is a framework designed to create a family of configurable software systems, with emphasis on extensibility and flexibility. It is based on a loosely coupled, event driven architecture. The EMMA framework has been built upon the premise of composing software systems from independent components. It opens up opportunities for reuse of components and their functionality and composing them together in many different ways. It provides the developer of test and measurement applications with a lightweight alternative to microservices, while sharing their various advantages, including composability, loose coupling, encapsulation, and reuse.

  12. EMMA: a new paradigm in configurable software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nogiec, J. M.; Trombly-Freytag, K.

    EMMA is a framework designed to create a family of configurable software systems, with emphasis on extensibility and flexibility. It is based on a loosely coupled, event driven architecture. The EMMA framework has been built upon the premise of composing software systems from independent components. It opens up opportunities for reuse of components and their functionality and composing them together in many different ways. As a result, it provides the developer of test and measurement applications with a lightweight alternative to microservices, while sharing their various advantages, including composability, loose coupling, encapsulation, and reuse.

  13. EMMA: a new paradigm in configurable software

    NASA Astrophysics Data System (ADS)

    Nogiec, J. M.; Trombly-Freytag, K.

    2017-10-01

    EMMA is a framework designed to create a family of configurable software systems, with emphasis on extensibility and flexibility. It is based on a loosely coupled, event driven architecture. The EMMA framework has been built upon the premise of composing software systems from independent components. It opens up opportunities for reuse of components and their functionality and composing them together in many different ways. It provides the developer of test and measurement applications with a lightweight alternative to microservices, while sharing their various advantages, including composability, loose coupling, encapsulation, and reuse.

  14. Reconstruction 3-dimensional image from 2-dimensional image of status optical coherence tomography (OCT) for analysis of changes in retinal thickness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arinilhaq; Widita, Rena

    2014-09-30

    Optical coherence tomography (OCT) is often used in medical image acquisition for diagnosing retinal changes because it is easy to use and inexpensive. Unfortunately, this type of examination produces only two-dimensional retinal images at the points of acquisition. Therefore, this study developed a method that combines and reconstructs the 2-dimensional retinal images into a three-dimensional image to display the macular volume accurately. The system is built in three main stages: data acquisition, data extraction, and 3-dimensional reconstruction. At the data acquisition step, Optical Coherence Tomography produced six *.jpg images for each patient, which were further extracted with MATLAB 2010a software into six one-dimensional arrays. The six arrays are combined into a 3-dimensional matrix using a kriging interpolation method with SURFER9, resulting in 3-dimensional graphics of the macula. Finally, the system provides three-dimensional color graphs based on the distribution of normal macula data. The reconstruction system produces three-dimensional images with a size of 481 × 481 × h (retinal thickness) pixels.
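
    The reconstruction step can be illustrated with ordinary gridding: thickness samples taken along a handful of scan lines are interpolated onto a regular 481 × 481 grid. The paper used kriging via SURFER9; the scipy-based sketch below stands in for that step purely as an illustration.

        # Interpolate thickness values sampled along six radial scan lines
        # onto a regular grid; a stand-in for the paper's kriging step.
        import numpy as np
        from scipy.interpolate import griddata

        angles = np.repeat(np.linspace(0.0, np.pi, 6, endpoint=False), 50)
        radii = np.tile(np.linspace(-1.0, 1.0, 50), 6)
        x, y = radii * np.cos(angles), radii * np.sin(angles)
        thickness = 0.30 + 0.05 * np.exp(-(x**2 + y**2) / 0.1)   # placeholder data

        gx, gy = np.mgrid[-1:1:481j, -1:1:481j]       # 481 x 481 grid, as in the paper
        surface = griddata((x, y), thickness, (gx, gy), method="cubic")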

  15. Pulse stuttering as a remedy for aliased ground backscatter

    NASA Astrophysics Data System (ADS)

    Bowhill, S. A.

    1983-12-01

    An algorithm that aids in the removal of ground scatter from low frequency Mesosphere, Stratosphere, Troposphere (MST) radar signals is examined. The unwanted ground scatter is shown as a sequence of velocity plots which are almost typical at the various altitudes. The interpulse period is changed in a cyclic way, thereby destroying the coherence of the unwanted signal. The interpulse period must be changed by an amount at least equal to the transmitted pulse width, and optimum performance is obtained when the set of different interpulse periods occupies a time span greater than the coherence time of the unwanted signal. Since a 20-msec pulse width is used, it was found convenient to cycle through 50 pulses, the interpulse period changing from 2 msec to 3 msec during the 1/8-second time. This particular pattern of interpulse periods was provided by a software radar controller. With application of this algorithm, the unwanted scatter signal becomes incoherent from one pulse to the next, and therefore is perceived as noise by the coherent integrator and correlator.

  16. Pulse stuttering as a remedy for aliased ground backscatter

    NASA Technical Reports Server (NTRS)

    Bowhill, S. A.

    1983-01-01

    An algorithm that aids in the removal of ground scatter from low frequency Mesosphere, Stratosphere, Troposphere (MST) radar signals is examined. The unwanted ground scatter is shown as a sequence of velocity plots which are almost typical at the various altitudes. The interpulse period is changed in a cyclic way, thereby destroying the coherence of the unwanted signal. The interpulse period must be changed by an amount at least equal to the transmitted pulse width, and optimum performance is obtained when the set of different interpulse periods occupies a time span greater than the coherence time of the unwanted signal. Since a 20-msec pulse width is used, it was found convenient to cycle through 50 pulses, the interpulse period changing from 2 msec to 3 msec during the 1/8-second time. This particular pattern of interpulse periods was provided by a software radar controller. With application of this algorithm, the unwanted scatter signal becomes incoherent from one pulse to the next, and therefore is perceived as noise by the coherent integrator and correlator.
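
    The stated numbers check out directly, and the schedule is simple to generate: fifty interpulse periods ramping from 2 msec to 3 msec average 2.5 msec, so one cycle spans 50 × 2.5 msec = 1/8 second. A sketch:

        # Cyclic interpulse-period (IPP) schedule for pulse stuttering:
        # 50 periods ramp from 2 ms to 3 ms, spanning the 1/8-second cycle.
        import numpy as np

        ipp = np.linspace(2e-3, 3e-3, 50)          # 50 interpulse periods (s)
        print(ipp.sum())                           # ~0.125 s per cycle
        pulse_times = np.cumsum(np.tile(ipp, 4))   # transmit times over 4 cycles
        # Echoes from a fixed aliased range now arrive with varying delay from
        # pulse to pulse, so the coherent integrator treats them as noise.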

  17. BioSPICE: access to the most current computational tools for biologists.

    PubMed

    Garvey, Thomas D; Lincoln, Patrick; Pedersen, Charles John; Martin, David; Johnson, Mark

    2003-01-01

    The goal of the BioSPICE program is to create a framework that provides biologists access to the most current computational tools. At the program midpoint, the BioSPICE member community has produced a software system that comprises contributions from approximately 20 participating laboratories integrated under the BioSPICE Dashboard and a methodology for continued software integration. Central among these contributed software modules is the BioSPICE Dashboard, a graphical environment that combines Open Agent Architecture and NetBeans software technologies in a coherent, biologist-friendly user interface. The current Dashboard permits data sources, models, simulation engines, and output displays provided by different investigators and running on different machines to work together across a distributed, heterogeneous network. Among several other features, the Dashboard enables users to create graphical workflows by configuring and connecting available BioSPICE components. Anticipated future enhancements to BioSPICE include a notebook capability that will permit researchers to browse and compile data to support model building, a biological model repository, and tools to support the development, control, and data reduction of wet-lab experiments. In addition to the BioSPICE software products, a project website supports information exchange and community building.

  18. The Role of Standards in Cloud-Computing Interoperability

    DTIC Science & Technology

    2012-10-01

    services are not shared outside the organization. CloudStack, Eucalyptus, HP, Microsoft, OpenStack, Ubuntu, and VMWare provide tools for building... center requirements • Developing usage models for cloud vendors • Independent IT consortium OpenStack http://www.openstack.org • Open-source... software for running private clouds • Currently consists of three core software projects: OpenStack Compute (Nova), OpenStack Object Storage (Swift

  19. A Case Study in Software Adaptation

    DTIC Science & Technology

    2002-01-01

    ...configuration of the service; monitoring of database connectivity from within the service; monitoring of crashes and shutdowns of IM servers; monitoring of... of the IM server all share a relational database and a common runtime state repository, which make up the backend tier, and allow replicas to

  20. Learning from hackers: open-source clinical trials.

    PubMed

    Dunn, Adam G; Day, Richard O; Mandl, Kenneth D; Coiera, Enrico

    2012-05-02

    Open sharing of clinical trial data has been proposed as a way to address the gap between the production of clinical evidence and the decision-making of physicians. A similar gap was addressed in the software industry by their open-source software movement. Here, we examine how the social and technical principles of the movement can guide the growth of an open-source clinical trial community.

  1. OpenEIS. Users Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Woohyun; Lutes, Robert G.; Katipamula, Srinivas

    This document is a user's guide for OpenEIS, a software package designed to provide standard methods for authoring, sharing, testing, using and improving algorithms for operational building energy efficiency.

  2. ROS-IGTL-Bridge: an open network interface for image-guided therapy using the ROS environment.

    PubMed

    Frank, Tobias; Krieger, Axel; Leonard, Simon; Patel, Niravkumar A; Tokuda, Junichi

    2017-08-01

    With the growing interest in advanced image-guidance for surgical robot systems, rapid integration and testing of robotic devices and medical image computing software are becoming essential in research and development. Maximizing the use of existing engineering resources built on widely accepted platforms in different fields, such as the Robot Operating System (ROS) in robotics and 3D Slicer in medical image computing, could simplify these tasks. We propose a new open network bridge interface integrated in ROS to ensure seamless cross-platform data sharing. A ROS node named ROS-IGTL-Bridge was implemented. It establishes a TCP/IP network connection between the ROS environment and external medical image computing software using the OpenIGTLink protocol. The node exports ROS messages to the external software over the network and vice versa simultaneously, allowing seamless and transparent data sharing between the ROS-based devices and the medical image computing platforms. Performance tests demonstrated that the bridge could stream transforms, strings, points, and images at 30 fps in both directions successfully. The data transfer latency was <1.2 ms for transforms, strings and points, and 25.2 ms for color VGA images. A separate test also demonstrated that the bridge could achieve 900 fps for transforms. Additionally, the bridge was demonstrated in two representative systems: a mock image-guided surgical robot setup consisting of 3D Slicer and Lego Mindstorms with ROS, as a prototyping and educational platform for IGT research; and the Smart Tissue Autonomous Robot surgical setup with 3D Slicer. The study demonstrated that the bridge enabled cross-platform data sharing between ROS and medical image computing software. This will allow rapid and seamless integration of advanced image-based planning/navigation offered by the medical image computing software such as 3D Slicer into ROS-based surgical robot systems.
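
    The bridging pattern, stripped of the actual OpenIGTLink framing, can be sketched as a small rospy node that republishes text received over TCP; 18944 is the customary OpenIGTLink port, and the topic name is an invented placeholder.

        # Toy cross-platform bridge: relay newline-delimited text from a TCP
        # socket onto a ROS topic. Illustrative only; the real ROS-IGTL-Bridge
        # speaks the binary OpenIGTLink protocol, not plain text.
        import socket
        import rospy
        from std_msgs.msg import String

        def bridge(host: str = "0.0.0.0", port: int = 18944) -> None:
            rospy.init_node("toy_igtl_bridge")
            pub = rospy.Publisher("/igtl/strings", String, queue_size=10)
            with socket.create_server((host, port)) as srv:
                conn, _ = srv.accept()
                with conn, conn.makefile() as stream:
                    for line in stream:
                        if rospy.is_shutdown():
                            break
                        pub.publish(String(data=line.strip()))

        if __name__ == "__main__":
            bridge()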

  3. Occupational Pursuits: The Army and World War II Occupation Planning

    DTIC Science & Technology

    2010-03-01

    Army became the dominant U.S. government agency in the interagency process concerning post-World War II occupation planning. Despite President ...the Army’s ability to create coherent internal doctrine, the relative weakness of civilian agencies, and the agenda and postwar goals of President ...Despite President Roosevelt’s own misgivings, shared by several influential members of his Cabinet, the Army nonetheless prevailed in shaping

  4. A Non-Heroic Strategy for the Management of Decline: An Examination of the American Approach to Educational Retrenchment.

    ERIC Educational Resources Information Center

    Peruniak, W. S.

    Since Canada and the U.S. share the common problem of declining enrollment, this paper studies the current situation in the States and assesses the merits and shortcomings of the nonheroic strategy being pursued there. The author credits the failure to develop a comprehensive, coherent plan in even one state to several factors, including the…

  5. Coherence Study of Geomagnetic Fluctuations in Frequency Range .04 - 0.6 HZ between Remote Land Sites.

    DTIC Science & Technology

    1983-12-01

    ...data collection system at two separated land sites, to modify and adapt previously developed software for data analysis and to obtain spectral... the sources that produce these fluctuations. The data collection sites were separated by a distance of 40 km (see Appendix A). One site was at La Mesa

  6. Assessing repository technology. Where do we go from here?

    NASA Technical Reports Server (NTRS)

    Eichmann, David

    1992-01-01

    Three sample information retrieval systems, archie, autoLib, and Wide Area Information Service (WAIS), are compared with regard to their expressiveness and usefulness, first in the general context of information retrieval, and then as prospective software reuse repositories. While the representational capabilities of these systems are limited, they provide a useful foundation for future repository efforts, particularly from the perspective of repository distribution and coherent user interface design.

  7. Assessing repository technology: Where do we go from here?

    NASA Technical Reports Server (NTRS)

    Eichmann, David A.

    1992-01-01

    Three sample information retrieval systems, archie, autoLib, and Wide Area Information Service (WAIS), are compared with regard to their expressiveness and usefulness, first in the general context of information retrieval, and then as prospective software reuse repositories. While the representational capabilities of these systems are limited, they provide a useful foundation for future repository efforts, particularly from the perspective of repository distribution and coherent user interface design.

  8. Optical coherence tomography – current and future applications

    PubMed Central

    Adhi, Mehreen; Duker, Jay S.

    2013-01-01

    Purpose of review Optical coherence tomography (OCT) has revolutionized the clinical practice of ophthalmology. It is a noninvasive imaging technique that provides high-resolution, cross-sectional images of the retina, retinal nerve fiber layer and the optic nerve head. This review discusses the present applications of the commercially available spectral-domain OCT (SD-OCT) systems in the diagnosis and management of retinal diseases, with particular emphasis on choroidal imaging. Future directions of OCT technology and their potential clinical uses are discussed. Recent findings Analysis of the choroidal thickness in healthy eyes and disease states such as age-related macular degeneration, central serous chorioretinopathy, diabetic retinopathy and inherited retinal dystrophies has been successfully achieved using SD-OCT devices with software improvements. Future OCT innovations such as longer-wavelength OCT systems including the swept-source technology, along with Doppler OCT and en-face imaging, may improve the detection of subtle microstructural changes in chorioretinal diseases by improving imaging of the choroid. Summary Advances in OCT technology provide for better understanding of pathogenesis, improved monitoring of progression and assistance in quantifying response to treatment modalities in diseases of the posterior segment of the eye. Further improvements in both hardware and software technologies should further advance the clinician’s ability to assess and manage chorioretinal diseases. PMID:23429598

  9. Data processing software suite SITENNO for coherent X-ray diffraction imaging using the X-ray free-electron laser SACLA.

    PubMed

    Sekiguchi, Yuki; Oroguchi, Tomotaka; Takayama, Yuki; Nakasako, Masayoshi

    2014-05-01

    Coherent X-ray diffraction imaging is a promising technique for visualizing the structures of non-crystalline particles with dimensions of micrometers to sub-micrometers. Recently, X-ray free-electron laser sources have enabled efficient experiments in the `diffraction before destruction' scheme. Diffraction experiments have been conducted at SPring-8 Angstrom Compact free-electron LAser (SACLA) using the custom-made diffraction apparatus KOTOBUKI-1 and two multiport CCD detectors. In the experiments, tens of thousands of single-shot diffraction patterns can be collected within several hours. Then, diffraction patterns with significant levels of intensity suitable for structural analysis must be found, direct-beam positions in diffraction patterns determined, diffraction patterns from the two CCD detectors merged, and phase-retrieval calculations for structural analyses performed. A software suite named SITENNO has been developed to semi-automatically apply the four-step processing to a huge number of diffraction data. Here, details of the algorithm used in the suite are described and the performance for approximately 9000 diffraction patterns collected from cuboid-shaped copper oxide particles reported. Using the SITENNO suite, it is possible to conduct experiments with data processing immediately after the data collection, and to characterize the size distribution and internal structures of the non-crystalline particles.

  10. Data processing software suite SITENNO for coherent X-ray diffraction imaging using the X-ray free-electron laser SACLA

    PubMed Central

    Sekiguchi, Yuki; Oroguchi, Tomotaka; Takayama, Yuki; Nakasako, Masayoshi

    2014-01-01

    Coherent X-ray diffraction imaging is a promising technique for visualizing the structures of non-crystalline particles with dimensions of micrometers to sub-micrometers. Recently, X-ray free-electron laser sources have enabled efficient experiments in the ‘diffraction before destruction’ scheme. Diffraction experiments have been conducted at SPring-8 Angstrom Compact free-electron LAser (SACLA) using the custom-made diffraction apparatus KOTOBUKI-1 and two multiport CCD detectors. In the experiments, tens of thousands of single-shot diffraction patterns can be collected within several hours. Then, diffraction patterns with significant levels of intensity suitable for structural analysis must be found, direct-beam positions in diffraction patterns determined, diffraction patterns from the two CCD detectors merged, and phase-retrieval calculations for structural analyses performed. A software suite named SITENNO has been developed to semi-automatically apply the four-step processing to a huge number of diffraction data. Here, details of the algorithm used in the suite are described and the performance for approximately 9000 diffraction patterns collected from cuboid-shaped copper oxide particles reported. Using the SITENNO suite, it is possible to conduct experiments with data processing immediately after the data collection, and to characterize the size distribution and internal structures of the non-crystalline particles. PMID:24763651
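
    The four-step flow reads naturally as a pipeline; the skeleton below conveys its shape with toy stand-ins (every function body here is a placeholder, not the SITENNO implementation).

        # Skeleton of the four-step processing described above; all bodies are
        # toy stand-ins, not the SITENNO algorithms.
        import numpy as np

        def select_hits(shots, threshold=1e4):       # 1: keep significant shots
            return [s for s in shots if s.sum() > threshold]

        def find_beam_center(shot):                  # 2: direct-beam position
            return np.unravel_index(np.argmax(shot), shot.shape)

        def merge_detectors(upper, lower):           # 3: merge the two CCDs
            return np.vstack([upper, lower])

        def retrieve_phase(pattern):                 # 4: stand-in for phase retrieval
            return np.abs(np.fft.ifft2(np.sqrt(pattern)))

        shots = [np.random.rand(64, 128) * 300 for _ in range(10)]
        for shot in select_hits(shots):
            center = find_beam_center(shot)
            image = retrieve_phase(merge_detectors(shot[:32], shot[32:]))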

  11. Quality control, analysis and secure sharing of Luminex® immunoassay data using the open source LabKey Server platform

    PubMed Central

    2013-01-01

    Background Immunoassays that employ multiplexed bead arrays produce high information content per sample. Such assays are now frequently used to evaluate humoral responses in clinical trials. Integrated software is needed for the analysis, quality control, and secure sharing of the high volume of data produced by such multiplexed assays. Software that facilitates data exchange and provides flexibility to perform customized analyses (including multiple curve fits and visualizations of assay performance over time) could increase scientists’ capacity to use these immunoassays to evaluate human clinical trials. Results The HIV Vaccine Trials Network and the Statistical Center for HIV/AIDS Research and Prevention collaborated with LabKey Software to enhance the open source LabKey Server platform to facilitate workflows for multiplexed bead assays. This system now supports the management, analysis, quality control, and secure sharing of data from multiplexed immunoassays that leverage Luminex xMAP® technology. These assays may be custom or kit-based. Newly added features enable labs to: (i) import run data from spreadsheets output by Bio-Plex Manager™ software; (ii) customize data processing, curve fits, and algorithms through scripts written in common languages, such as R; (iii) select script-defined calculation options through a graphical user interface; (iv) collect custom metadata for each titration, analyte, run and batch of runs; (v) calculate dose–response curves for titrations; (vi) interpolate unknown concentrations from curves for titrated standards; (vii) flag run data for exclusion from analysis; (viii) track quality control metrics across runs using Levey-Jennings plots; and (ix) automatically flag outliers based on expected values. Existing system features allow researchers to analyze, integrate, visualize, export and securely share their data, as well as to construct custom user interfaces and workflows. Conclusions Unlike other tools tailored for Luminex immunoassays, LabKey Server allows labs to customize their Luminex analyses using scripting while still presenting users with a single, graphical interface for processing and analyzing data. The LabKey Server system also stands out among Luminex tools for enabling smooth, secure transfer of data, quality control information, and analyses between collaborators. LabKey Server and its Luminex features are freely available as open source software at http://www.labkey.com under the Apache 2.0 license. PMID:23631706
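
    The curve-fitting features (v) and (vi) above amount to fitting a four-parameter logistic (4PL) model to each titrated standard and inverting it for unknowns. A minimal sketch in Python with invented numbers follows; LabKey Server itself performs these steps through user-supplied R scripts, so treat everything below as illustrative only.

      import numpy as np
      from scipy.optimize import curve_fit

      def four_pl(conc, a, d, ec50, slope):
          """Standard 4PL model: a = response at zero dose, d = response at
          saturating dose, ec50 = inflection concentration."""
          return d + (a - d) / (1.0 + (conc / ec50) ** slope)

      # Median fluorescence intensity (MFI) of a titrated standard (made-up data).
      conc = np.array([0.1, 0.4, 1.6, 6.4, 25.6, 102.4])
      mfi = np.array([18.0, 55.0, 210.0, 880.0, 2400.0, 3100.0])
      params, _ = curve_fit(four_pl, conc, mfi,
                            p0=[10.0, 3500.0, 5.0, 1.0], bounds=(0, np.inf))

      def interpolate_conc(mfi_value, a, d, ec50, slope):
          """Invert the fitted 4PL to recover a concentration from an MFI reading."""
          return ec50 * ((a - d) / (mfi_value - d) - 1.0) ** (1.0 / slope)

      print(interpolate_conc(1200.0, *params))   # unknown sample concentration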

  12. Quality control, analysis and secure sharing of Luminex® immunoassay data using the open source LabKey Server platform.

    PubMed

    Eckels, Josh; Nathe, Cory; Nelson, Elizabeth K; Shoemaker, Sara G; Nostrand, Elizabeth Van; Yates, Nicole L; Ashley, Vicki C; Harris, Linda J; Bollenbeck, Mark; Fong, Youyi; Tomaras, Georgia D; Piehler, Britt

    2013-04-30

    Immunoassays that employ multiplexed bead arrays produce high information content per sample. Such assays are now frequently used to evaluate humoral responses in clinical trials. Integrated software is needed for the analysis, quality control, and secure sharing of the high volume of data produced by such multiplexed assays. Software that facilitates data exchange and provides flexibility to perform customized analyses (including multiple curve fits and visualizations of assay performance over time) could increase scientists' capacity to use these immunoassays to evaluate human clinical trials. The HIV Vaccine Trials Network and the Statistical Center for HIV/AIDS Research and Prevention collaborated with LabKey Software to enhance the open source LabKey Server platform to facilitate workflows for multiplexed bead assays. This system now supports the management, analysis, quality control, and secure sharing of data from multiplexed immunoassays that leverage Luminex xMAP® technology. These assays may be custom or kit-based. Newly added features enable labs to: (i) import run data from spreadsheets output by Bio-Plex Manager™ software; (ii) customize data processing, curve fits, and algorithms through scripts written in common languages, such as R; (iii) select script-defined calculation options through a graphical user interface; (iv) collect custom metadata for each titration, analyte, run and batch of runs; (v) calculate dose-response curves for titrations; (vi) interpolate unknown concentrations from curves for titrated standards; (vii) flag run data for exclusion from analysis; (viii) track quality control metrics across runs using Levey-Jennings plots; and (ix) automatically flag outliers based on expected values. Existing system features allow researchers to analyze, integrate, visualize, export and securely share their data, as well as to construct custom user interfaces and workflows. Unlike other tools tailored for Luminex immunoassays, LabKey Server allows labs to customize their Luminex analyses using scripting while still presenting users with a single, graphical interface for processing and analyzing data. The LabKey Server system also stands out among Luminex tools for enabling smooth, secure transfer of data, quality control information, and analyses between collaborators. LabKey Server and its Luminex features are freely available as open source software at http://www.labkey.com under the Apache 2.0 license.

  13. Signatures of correlated excitonic dynamics in two-dimensional spectroscopy of the Fenna-Matthews-Olson photosynthetic complex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caram, Justin R.; Lewis, Nicholas H. C.; Fidler, Andrew F.

    2012-03-14

    Long-lived excitonic coherence in photosynthetic proteins has become an exciting area of research because it may provide design principles for enhancing the efficiency of energy transfer in a broad range of materials. In this publication, we provide new evidence that long-lived excitonic coherence in the Fenna-Matthews-Olson (FMO) pigment-protein complex is consistent with the assumption of cross correlation in the site basis, indicating that each site shares bath fluctuations. We analyze the structure and character of the beating crosspeak between the two lowest energy excitons in two-dimensional (2D) electronic spectra of the FMO complex. To isolate this dynamic signature, we use the two-dimensional linear prediction Z-transform as a platform for filtering coherent beating signatures within 2D spectra. By separating signals into components in frequency and decay rate representations, we are able to improve resolution and isolate specific coherences. This strategy permits analysis of the shape, position, character, and phase of these features. Simulations of the crosspeak between excitons 1 and 2 in FMO under different regimes of cross correlation verify that statistically independent site fluctuations do not account for the elongation and persistence of the dynamic crosspeak. To reproduce the experimental results, we invoke near complete correlation in the fluctuations experienced by the sites associated with excitons 1 and 2. This model contradicts ab initio quantum mechanics/molecular mechanics simulations that observe no correlation between the energies of individual sites. This contradiction suggests that a new physical model for long-lived coherence may be necessary. The data presented here detail experimental results that must be reproduced for a physical model of quantum coherence in photosynthetic energy transfer.

  14. Computer-Based Patient Records: Better Planning and Oversight by VA, DOD, and IHS Would Enhance Health Data Sharing

    DTIC Science & Technology

    2001-04-01

    ...IHS), could share information technology (IT) and patient medical information to provide greater continuity of care, accelerate VA eligibility determinations, and save software development costs. ... system, which primarily includes information on patient hospital admission and discharge, patient medications, laboratory results, and radiology...

  15. An open-source software platform for data management, visualisation, model building and model sharing in water, energy and other resource modelling domains.

    NASA Astrophysics Data System (ADS)

    Knox, S.; Meier, P.; Mohammed, K.; Korteling, B.; Matrosov, E. S.; Hurford, A.; Huskova, I.; Harou, J. J.; Rosenberg, D. E.; Thilmant, A.; Medellin-Azuara, J.; Wicks, J.

    2015-12-01

    Capacity expansion on resource networks is essential to adapting to economic and population growth and pressures such as climate change. Engineered infrastructure systems such as water, energy, or transport networks require sophisticated and bespoke models to refine management and investment strategies. Successful modeling of such complex systems relies on good data management and advanced methods to visualize and share data. Engineered infrastructure systems are often represented as networks of nodes and links with operating rules describing their interactions. Infrastructure system management and planning can be abstracted to simulating or optimizing new operations and extensions of the network. By separating the data storage of abstract networks from manipulation and modeling, we have created a system where infrastructure modeling across various domains is facilitated. We introduce Hydra Platform, free open source software designed for analysts and modelers to store, manage and share network topology and data. Hydra Platform is a Python library with a web service layer for remote applications, called Apps, to connect to. Apps serve various functions including network or results visualization, data export (e.g. into a proprietary format) or model execution. This client-server architecture allows users to manipulate and share centrally stored data. XML templates allow a standardised description of the data structure required for storing network data such that it is compatible with specific models. Hydra Platform represents networks in an abstract way and is therefore not bound to a single modeling domain; it is the Apps that create domain-specific functionality. Using Apps, researchers from different domains can incorporate different models within the same network, enabling cross-disciplinary modeling while minimizing errors and streamlining data sharing. Separating the Python library from the web layer allows developers to natively expand the software or build web-based Apps in other languages for remote functionality. Partner CH2M is developing a commercial user interface for Hydra Platform; however, custom interfaces and visualization tools can also be built. Hydra Platform is available on GitHub, while Apps will be shared on a central repository.
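
    For readers curious what an App looks like, a minimal sketch of a remote client talking to Hydra Platform's web service layer follows. The endpoint URL, function names and payload shapes here are hypothetical placeholders, not the real API; consult the Hydra Platform documentation on GitHub before relying on any of them.

      import json
      import urllib.request

      SERVER = "http://localhost:8080/json"   # hypothetical server endpoint

      def call(func, **kwargs):
          """POST a JSON request of the assumed form {function: {args...}}."""
          body = json.dumps({func: kwargs}).encode()
          req = urllib.request.Request(
              SERVER, data=body, headers={"Content-Type": "application/json"})
          with urllib.request.urlopen(req) as resp:
              return json.load(resp)

      # Log in, fetch a stored network (nodes plus links), and inspect it.
      session = call("login", username="analyst", password="secret")
      network = call("get_network", network_id=42, session_id=session["id"])
      for link in network["links"]:
          print(link["node_1_id"], "->", link["node_2_id"])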

  16. A Probabilistic Software System Attribute Acceptance Paradigm for COTS Software Evaluation

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry

    2005-01-01

    Standard software requirement formats are written from top-down perspectives only, that is, from an ideal notion of a client's needs. Despite the exactness of the standard format, software and system errors in designed systems have abounded. Bad and inadequate requirements have resulted in cost overruns, schedule slips and lost profitability. Commercial off-the-shelf (COTS) software components are even more troublesome than designed systems because they are often provided as is and subsequently delivered with unsubstantiated validation of described capabilities. For COTS software, there needs to be a way to express the client's software needs in a consistent and formal manner using software system attributes derived from software quality standards. Additionally, the format needs to be amenable to software evaluation processes that integrate observable evidence garnered from historical data. This paper presents a paradigm that effectively bridges the gap between what a client desires (top-down) and what has been demonstrated (bottom-up) for COTS software evaluation. The paradigm addresses the specification of needs before the software evaluation is performed and can be used to increase the shared understanding between clients and software evaluators about what is required and what is technically possible.

  17. Assessment of a spectral domain OCT segmentation software in a retrospective cohort study of exudative AMD patients.

    PubMed

    Tilleul, Julien; Querques, Giuseppe; Canoui-Poitrine, Florence; Leveziel, Nicolas; Souied, Eric H

    2013-01-01

    To assess the ability of the Spectralis optical coherence tomography (OCT) segmentation software to identify the inner limiting membrane and Bruch's membrane in exudative age-related macular degeneration (AMD) patients. Thirty-eight eyes of 38 naive exudative AMD patients were retrospectively included. They all had a complete ophthalmologic examination including Spectralis OCT at baseline and at months 1 and 2. Reliability of the segmentation software was assessed by 2 ophthalmologists and was defined as good if both the inner limiting membrane and Bruch's membrane were correctly drawn. A total of 38 patient charts were reviewed (114 scans). The inner limiting membrane was correctly drawn by the segmentation software in 114/114 spectral domain OCT scans (100%). Conversely, Bruch's membrane was correctly drawn in 59/114 scans (51.8%). The software was less reliable in locating Bruch's membrane in cases with pigment epithelium detachment (PED) than in cases without PED (42.5 vs. 73.5%, respectively; p = 0.049), but its reliability was not associated with SRF or CME (p = 0.55 and p = 0.10, respectively). Segmentation of the inner limiting membrane was consistently trustworthy, but Bruch's membrane segmentation was poorly reliable using the automatic Spectralis segmentation software. Based on this software, evaluation of retinal thickness may be incorrect, particularly in cases of PED; PED is indeed an important parameter that is not included when measuring retinal thickness. Copyright © 2012 S. Karger AG, Basel.

  18. Mean-deviation analysis in the theory of choice.

    PubMed

    Grechuk, Bogdan; Molyboha, Anton; Zabarankin, Michael

    2012-08-01

    Mean-deviation analysis, along with the existing theories of coherent risk measures and dual utility, is examined in the context of the theory of choice under uncertainty, which studies rational preference relations for random outcomes based on different sets of axioms such as transitivity, monotonicity, continuity, etc. An axiomatic foundation of the theory of coherent risk measures is obtained as a relaxation of the axioms of the dual utility theory, and a further relaxation of the axioms is shown to lead to the mean-deviation analysis. Paradoxes arising from the sets of axioms corresponding to these theories and their possible resolutions are discussed, and application of the mean-deviation analysis to optimal risk sharing and portfolio selection in the context of rational choice is considered. © 2012 Society for Risk Analysis.
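
    For orientation, mean-deviation functionals of the kind discussed above are commonly written in the following form (the notation is assumed here, not taken from the paper):

      \[
        \rho(X) \;=\; -\,\mathbb{E}[X] \;+\; \lambda\,\mathcal{D}(X),
        \qquad 0 < \lambda \le 1,
      \]

    where \(\mathcal{D}\) is a deviation measure such as the standard deviation or mean absolute deviation. Coherence in the sense of Artzner et al. requires monotonicity, subadditivity, positive homogeneity and translation invariance:

      \[
        X \ge Y \;\Rightarrow\; \rho(X) \le \rho(Y), \qquad
        \rho(X+Y) \le \rho(X)+\rho(Y), \qquad
        \rho(tX) = t\,\rho(X)\ (t \ge 0), \qquad
        \rho(X+c) = \rho(X) - c .
      \]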

  19. Dephasing-covariant operations enable asymptotic reversibility of quantum resources

    NASA Astrophysics Data System (ADS)

    Chitambar, Eric

    2018-05-01

    We study the power of dephasing-covariant operations in the resource theories of coherence and entanglement. These are quantum operations whose actions commute with a projective measurement. In the resource theory of coherence, we find that any two states are asymptotically interconvertible under dephasing-covariant operations. This provides a rare example of a resource theory in which asymptotic reversibility can be attained without needing the maximal set of resource nongenerating operations. When extended to the resource theory of entanglement, the resultant operations share similarities with local operations and classical communication, such as prohibiting the increase of all Rényi α -entropies of entanglement under pure-state transformations. However, we show these operations are still strong enough to enable asymptotic reversibility between any two maximally correlated mixed states, even in the multipartite setting.
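
    As background, the dephasing-covariance condition the abstract refers to is standard and can be stated compactly (notation assumed): with complete dephasing \(\Delta(\rho) = \sum_i P_i \rho P_i\) for the projectors \(\{P_i\}\) of the fixed projective measurement, a channel \(\Lambda\) is dephasing-covariant when

      \[
        \Lambda\bigl(\Delta(\rho)\bigr) \;=\; \Delta\bigl(\Lambda(\rho)\bigr)
        \qquad \text{for all states } \rho .
      \]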

  20. Flight and abduction in witchcraft and UFO lore.

    PubMed

    Musgrave, J B; Houran, J

    2000-04-01

    The lore surrounding the mythical Witches' Sabbat and contemporary reports of UFO abductions share three main characteristics: the use of masks, the appearance of "Men in Black," and references to flight and abduction. We review these three commonalities with particular focus on the aspect of flight and abduction. We argue that narratives of the Witches' Sabbat and UFO abductions share the same basic structure, common symbolism, and serve the same psychological needs of providing a coherent explanation for anomalous (ambiguous) experiences while simultaneously giving the experient a sense of freedom, release, and escape from the self. This pattern of similarities suggests the possibility that UFO abductions are a modern version of tales of flight to the Sabbat.

  1. Error recovery in shared memory multiprocessors using private caches

    NASA Technical Reports Server (NTRS)

    Wu, Kun-Lung; Fuchs, W. Kent; Patel, Janak H.

    1990-01-01

    The problem of recovering from processor transient faults in shared memory multiprocessor systems is examined. A user-transparent checkpointing and recovery scheme using private caches is presented. Processes can recover from errors due to faulty processors by restarting from the checkpointed computation state. Implementation techniques using checkpoint identifiers and recovery stacks are examined as a means of reducing performance degradation in processor utilization during normal execution. This cache-based checkpointing technique prevents rollback propagation, provides rapid recovery, and can be integrated into standard cache coherence protocols. An analytical model is used to estimate the relative performance of the scheme during normal execution. Extensions to take error latency into account are presented.
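
    A toy illustration of the checkpoint and recovery-stack idea, reduced to a single address space (the data structures below are illustrative, not the paper's hardware design):

      class CheckpointedMemory:
          """Log old values between checkpoints so a transient fault can be
          undone by rolling back to the last committed state."""

          def __init__(self):
              self.memory = {}
              self.recovery_stack = []   # (addr, old_value) since last checkpoint

          def write(self, addr, value):
              self.recovery_stack.append((addr, self.memory.get(addr)))
              self.memory[addr] = value

          def checkpoint(self):
              self.recovery_stack.clear()   # commit: nothing left to undo

          def rollback(self):
              while self.recovery_stack:    # fault detected: undo in LIFO order
                  addr, old = self.recovery_stack.pop()
                  if old is None:
                      self.memory.pop(addr, None)
                  else:
                      self.memory[addr] = old

      mem = CheckpointedMemory()
      mem.write(0x10, 1)
      mem.checkpoint()          # state {0x10: 1} is now recoverable
      mem.write(0x10, 2)
      mem.rollback()            # transient fault: back to the checkpoint
      assert mem.memory == {0x10: 1}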

  2. STARS 2.0: 2nd-generation open-source archiving and query software

    NASA Astrophysics Data System (ADS)

    Winegar, Tom

    2008-07-01

    The Subaru Telescope is in the process of developing an open-source alternative to the 1st-generation software and databases (STARS 1) used for archiving and query. For STARS 2, we have chosen PHP and Python for scripting and MySQL as the database software. We have collected feedback from staff and observers, and used this feedback to significantly improve the design and functionality of our future archiving and query software. Archiving - We identified two weaknesses in 1st-generation STARS archiving software: a complex and inflexible table structure, and uncoordinated system administration for our business model of taking pictures from the summit and archiving them in both Hawaii and Japan. We adopted a simplified and normalized table structure with passive keyword collection, and we are designing an archive-to-archive file transfer system that automatically reports real-time status and error conditions and permits error recovery. Query - We identified several weaknesses in 1st-generation STARS query software: inflexible query tools, poor sharing of calibration data, and no automatic file transfer mechanisms to observers. We are developing improved query tools, better sharing of calibration data, and multi-protocol unassisted file transfer mechanisms for observers. In the process, we have redefined a 'query': from an invisible search result that can be transferred only once, in-house and immediately, with little status and error reporting and no error recovery - to a stored search result that can be monitored, transferred to different locations with multiple protocols, reporting status and error conditions and permitting recovery from errors.

  3. Sharing data for public health research by members of an international online diabetes social network.

    PubMed

    Weitzman, Elissa R; Adida, Ben; Kelemen, Skyler; Mandl, Kenneth D

    2011-04-27

    Surveillance and response to diabetes may be accelerated through engaging online diabetes social networks (SNs) in consented research. We tested the willingness of an online diabetes community to share data for public health research by providing members with a privacy-preserving social networking software application for rapid temporal-geographic surveillance of glycemic control. SN-mediated collection of cross-sectional, member-reported data from an international online diabetes SN entered into a software application we made available in a "Facebook-like" environment enabled reporting, charting and optional sharing of recent hemoglobin A1c values through a geographic display. Self-enrollment was achieved by 17% (n = 1,136) of the 6,500 active members, representing 32 countries and 50 US states. Data were current, with 83.1% of the most recent reported A1c values obtained within the past 90 days. Sharing was high, with 81.4% of users permitting data donation to the community display. 34.1% of users also displayed their A1cs on their SN profile page. Users selecting the most permissive sharing options had a lower average A1c (6.8%) than users not sharing with the community (7.1%, p = .038). 95% of users permitted re-contact. Unadjusted aggregate A1c reported by US users closely resembled aggregate 2007-2008 NHANES estimates (respectively, 6.9% and 6.9%, p = 0.85). Success within an early adopter community demonstrates that online SNs may comprise efficient platforms for bidirectional communication with and data acquisition from disease populations. Advancing this model for cohort and translational science and for use as a complementary surveillance approach will require understanding of inherent selection and publication (sharing) biases in the data and a technology model that supports autonomy, anonymity and privacy.

  4. Knowledge representation of rock plastic deformation

    NASA Astrophysics Data System (ADS)

    Davarpanah, Armita; Babaie, Hassan

    2017-04-01

    The first iteration of the Rock Plastic Deformation (RPD) ontology models the semantics of the dynamic physical and chemical processes and mechanisms that occur during the deformation of the generally inhomogeneous polycrystalline rocks. The ontology represents the knowledge about the production, reconfiguration, displacement, and consumption of the structural components that participate in these processes. It also formalizes the properties that are known by the structural geology and metamorphic petrology communities to hold between the instances of the spatial components and the dynamic processes, the state and system variables, the empirical flow laws that relate the variables, and the laboratory testing conditions and procedures. The modeling of some of the complex physico-chemical, mathematical, and informational concepts and relations of the RPD ontology is based on the class and property structure of some well-established top-level ontologies. The flexible and extensible design of the initial version of the RPD ontology allows it to develop into a model that more fully represents the knowledge of plastic deformation of rocks at different spatial and temporal scales in the laboratory and in the solid Earth. The ontology will be used to annotate the datasets related to the microstructures and physical-chemical processes that involve them. This will help the autonomous and globally distributed communities of experimental structural geologists and metamorphic petrologists to coherently and uniformly distribute, discover, access, share, and use their data through automated reasoning and enhanced data integration and software interoperability.

  5. Seeking Shared Practice: A Juxtaposition of the Attributes and Activities of Organized Fossil Groups with Those of Professional Paleontology

    NASA Astrophysics Data System (ADS)

    Crippen, Kent J.; Ellis, Shari; Dunckel, Betty A.; Hendy, Austin J. W.; MacFadden, Bruce J.

    2016-10-01

    This study sought to define the attributes and practices of organized fossil groups (e.g., clubs, paleontological societies) as amateur paleontologists, as well as those of professional paleontologists, and explore the potential for these two groups to work collaboratively as a formalized community. Such an investigation is necessary to develop design principles for an online environment that supports this community and encourages communication and shared practice among individuals with different backgrounds in paleontology and who are geographically isolated. A national survey of fossil group representatives and professional paleontologists was used to address the research questions. The results provide a rich description of the attributes and activities of both groups and are discussed in terms of three design principles for supporting the two groups in a form of collaboration and fellowship via a coherent shared practice within an online learning community.

  6. GIFT-Cloud: A data sharing and collaboration platform for medical imaging research.

    PubMed

    Doel, Tom; Shakir, Dzhoshkun I; Pratt, Rosalind; Aertsen, Michael; Moggridge, James; Bellon, Erwin; David, Anna L; Deprest, Jan; Vercauteren, Tom; Ourselin, Sébastien

    2017-02-01

    Clinical imaging data are essential for developing research software for computer-aided diagnosis, treatment planning and image-guided surgery, yet existing systems are poorly suited for data sharing between healthcare and academia: research systems rarely provide an integrated approach for data exchange with clinicians; hospital systems are focused towards clinical patient care with limited access for external researchers; and safe haven environments are not well suited to algorithm development. We have established GIFT-Cloud, a data and medical image sharing platform, to meet the needs of GIFT-Surg, an international research collaboration that is developing novel imaging methods for fetal surgery. GIFT-Cloud also has general applicability to other areas of imaging research. GIFT-Cloud builds upon well-established cross-platform technologies. The Server provides secure anonymised data storage, direct web-based data access and a REST API for integrating external software. The Uploader provides automated on-site anonymisation, encryption and data upload. Gateways provide a seamless process for uploading medical data from clinical systems to the research server. GIFT-Cloud has been implemented in a multi-centre study for fetal medicine research. We present a case study of placental segmentation for pre-operative surgical planning, showing how GIFT-Cloud underpins the research and integrates with the clinical workflow. GIFT-Cloud simplifies the transfer of imaging data from clinical to research institutions, facilitating the development and validation of medical research software and the sharing of results back to the clinical partners. GIFT-Cloud supports collaboration between multiple healthcare and research institutions while satisfying the demands of patient confidentiality, data security and data ownership. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  7. Gencrypt: one-way cryptographic hashes to detect overlapping individuals across samples

    PubMed Central

    Turchin, Michael C.; Hirschhorn, Joel N.

    2012-01-01

    Summary: Meta-analysis across genome-wide association studies is a common approach for discovering genetic associations. However, in some meta-analysis efforts, individual-level data cannot be broadly shared by study investigators due to privacy and Institutional Review Board concerns. In such cases, researchers cannot confirm that each study represents a unique group of people, leading to potentially inflated test statistics and false positives. To resolve this problem, we created a software tool, Gencrypt, which utilizes a security protocol known as one-way cryptographic hashes to allow overlapping participants to be identified without sharing individual-level data. Availability: Gencrypt is freely available under the GNU general public license v3 at http://www.broadinstitute.org/software/gencrypt/ Contact: joelh@broadinstitute.org Supplementary information: Supplementary data are available at Bioinformatics online. PMID:22302573
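
    The core idea, hashing an agreed panel of genotype calls per individual so that only digests are shared and intersected, can be sketched in a few lines. This is the general technique, not Gencrypt's actual implementation; the salt and SNP panel below are invented for the example.

      import hashlib

      def participant_digest(genotypes, salt):
          """genotypes: ordered calls at an agreed SNP panel, e.g. ['AA', 'AG', ...].
          A salt agreed between studies hinders dictionary attacks."""
          payload = salt + "|" + ",".join(genotypes)
          return hashlib.sha256(payload.encode()).hexdigest()

      SALT = "agreed-between-studies"   # hypothetical shared secret
      study_a = {participant_digest(g, SALT)
                 for g in (["AA", "AG", "GG"], ["AG", "AG", "TT"])}
      study_b = {participant_digest(g, SALT)
                 for g in (["AA", "AG", "GG"], ["CC", "AG", "TT"])}
      print(len(study_a & study_b), "overlapping participant(s)")   # -> 1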

  8. Neuronal bases of structural coherence in contemporary dance observation.

    PubMed

    Bachrach, Asaf; Jola, Corinne; Pallier, Christophe

    2016-01-01

    The neuronal processes underlying dance observation have been the focus of an increasing number of brain imaging studies over the past decade. However, the existing literature mainly dealt with effects of motor and visual expertise, whereas the neural and cognitive mechanisms that underlie the interpretation of dance choreographies remained unexplored. Hence, much attention has been given to the action observation network (AON) whereas the role of other potentially relevant neuro-cognitive mechanisms such as mentalizing (theory of mind) or language (narrative comprehension) in dance understanding is yet to be elucidated. We report the results of an fMRI study where the structural coherence of short contemporary dance choreographies was manipulated parametrically using the same taped movement material. Our participants were all trained dancers. The whole-brain analysis argues that the interpretation of structurally coherent dance phrases involves a subpart (superior parietal) of the AON as well as mentalizing regions in the dorsomedial prefrontal cortex. An ROI analysis based on a similar study using linguistic materials (Pallier et al., 2011) suggests that structural processing in language and dance might share certain neural mechanisms. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. Lagrangian based methods for coherent structure detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allshouse, Michael R., E-mail: mallshouse@chaos.utexas.edu; Peacock, Thomas, E-mail: tomp@mit.edu

    There has been a proliferation in the development of Lagrangian analytical methods for detecting coherent structures in fluid flow transport, yielding a variety of qualitatively different approaches. We present a review of four approaches and demonstrate the utility of these methods via their application to the same sample analytic model, the canonical double-gyre flow, highlighting the pros and cons of each approach. Two of the methods, the geometric and probabilistic approaches, are well established and require velocity field data over the time interval of interest to identify particularly important material lines and surfaces, and influential regions, respectively. The other two approaches, implementing tools from cluster and braid theory, seek coherent structures based on limited trajectory data, attempting to partition the flow transport into distinct regions. All four of these approaches share the common trait that they are objective methods, meaning that their results do not depend on the frame of reference used. For each method, we also present a number of example applications ranging from blood flow and chemical reactions to ocean and atmospheric flows.
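
    The double-gyre test case mentioned above is easy to reproduce; a sketch with the usual textbook parameter choices (assumed here, not taken from the review) is:

      import numpy as np
      from scipy.integrate import solve_ivp

      A, eps, omega = 0.1, 0.25, 2 * np.pi / 10   # common textbook values

      def velocity(t, xy):
          """Time-periodic double-gyre velocity field on [0, 2] x [0, 1]."""
          x, y = xy
          a = eps * np.sin(omega * t)
          f = a * x**2 + (1 - 2 * a) * x
          dfdx = 2 * a * x + (1 - 2 * a)
          u = -np.pi * A * np.sin(np.pi * f) * np.cos(np.pi * y)
          v = np.pi * A * np.cos(np.pi * f) * np.sin(np.pi * y) * dfdx
          return [u, v]

      # Advect one tracer over a forcing period; the reviewed methods differ
      # mainly in what they compute from ensembles of such trajectories.
      sol = solve_ivp(velocity, (0.0, 10.0), [1.2, 0.4], rtol=1e-8)
      print(sol.y[:, -1])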

  10. Counterfactual quantum cryptography based on weak coherent states

    NASA Astrophysics Data System (ADS)

    Yin, Zhen-Qiang; Li, Hong-Wei; Yao, Yao; Zhang, Chun-Mei; Wang, Shuang; Chen, Wei; Guo, Guang-Can; Han, Zheng-Fu

    2012-08-01

    In the “counterfactual quantum cryptography” scheme [T.-G. Noh, Phys. Rev. Lett. 103, 230501 (2009)], two legitimate distant peers may share secret-key bits even when the information carriers do not travel in the quantum channel. The security of this protocol with an ideal single-photon source has been proved by Yin et al. [Z.-Q. Yin, H. W. Li, W. Chen, Z. F. Han, and G. C. Guo, Phys. Rev. A 82, 042335 (2010)]. In this paper, we prove the security of the counterfactual-quantum-cryptography scheme based on a commonly used weak-coherent-laser source by considering a general collective attack. The basic assumption of this proof is that the efficiency and dark-count rate of a single-photon detector are consistent for any n-photon Fock states. Then, through randomizing the phases of the encoding weak coherent states, Eve's ancilla will be transformed into a classical mixture. Finally, the lower bound of the secret-key-bit rate and a performance analysis for the practical implementation are both given.
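
    The phase-randomization step relies on a textbook identity (stated here in assumed notation, not copied from the paper): averaging a weak coherent state over a uniformly random phase yields a Poissonian mixture of Fock states,

      \[
        \frac{1}{2\pi}\int_0^{2\pi}
          \bigl|\sqrt{\mu}\,e^{i\theta}\bigr\rangle\!\bigl\langle\sqrt{\mu}\,e^{i\theta}\bigr|\,d\theta
        \;=\; \sum_{n=0}^{\infty} e^{-\mu}\,\frac{\mu^{n}}{n!}\;|n\rangle\langle n| ,
      \]

    which is why Eve's ancilla, correlated only with the photon number, collapses to the classical mixture mentioned in the abstract.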

  11. Research on the parallel load sharing principle of a novel self-decoupled piezoelectric six-dimensional force sensor.

    PubMed

    Li, Ying-Jun; Yang, Cong; Wang, Gui-Cong; Zhang, Hui; Cui, Huan-Yong; Zhang, Yong-Liang

    2017-09-01

    This paper presents a novel integrated piezoelectric six-dimensional force sensor which can realize dynamic measurement of multi-dimensional space loads. Firstly, the composition of the sensor, the spatial layout of the force-sensitive components, and the measurement principle are analyzed and designed. In theoretical analysis, the piezoelectric six-dimensional force sensor exhibits no interference between dimensions. Based on the actual working principle and deformation compatibility, this paper deduces the parallel load sharing principle of the piezoelectric six-dimensional force sensor and identifies the main factors that affect the load sharing ratio. The finite element model of the piezoelectric six-dimensional force sensor is established. In order to verify the load sharing principle of the sensor, a load sharing test device for the piezoelectric force sensor was designed and fabricated, and a load sharing experimental platform was set up. The experimental results are in accordance with the theoretical analysis and simulation results. The experiments show that multi-dimensional and heavy force measurement can be realized by the parallel arrangement of the load sharing ring and the force-sensitive element in the novel integrated piezoelectric six-dimensional force sensor, and that the ideal load sharing effect can be achieved with appropriate size parameters. This paper provides an important guide for the design of force-measuring devices based on the load sharing mode. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
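
    The deformation-compatibility argument behind the load sharing ratio reduces to the textbook stiffness form (notation assumed here): elements constrained to the same deflection carry load in proportion to their stiffnesses,

      \[
        \delta_1 = \delta_2 = \dots = \delta_n
        \quad\Longrightarrow\quad
        F_i \;=\; \frac{k_i}{\sum_{j=1}^{n} k_j}\,F_{\mathrm{total}} ,
      \]

    so the split between the load sharing ring and the force-sensitive element is set by their stiffness ratio, which is what the abstract's "appropriate size parameters" tune.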

  12. Deciding together? Best interests and shared decision-making in paediatric intensive care.

    PubMed

    Birchley, Giles

    2014-09-01

    In western healthcare, shared decision making has become the orthodox approach to making healthcare choices as a way of promoting patient autonomy. Despite the fact that the autonomy paradigm is poorly suited to paediatric decision making, such an approach is enshrined in English common law. When reaching moral decisions, for instance when it is unclear whether treatment or non-treatment will serve a child's best interests, shared decision making is particularly questionable because agreement does not ensure moral validity. With reference to current common law and focusing on intensive care practice, this paper investigates what claims shared decision making may have to legitimacy in a paediatric intensive care setting. Drawing on key texts, I suggest these identify advantages to parents and clinicians but not to the child who is the subject of the decision. Without evidence that shared decision making increases the quality of the decision that is being made, it appears that a focus on the shared nature of a decision does not cohere with the principle that the best interests of the child should remain paramount. In the face of significant pressures toward the displacement of the child's interests in a shared decision, advantages of a shared decision to decisional quality require elucidation. Although a number of arguments of this nature may have potential, should no such advantages be demonstrable we have cause to revise our commitment to either shared decision making or the paramountcy of the child in these circumstances.

  13. Cyberinfrastructure to Support Collaborative and Reproducible Computational Hydrologic Modeling

    NASA Astrophysics Data System (ADS)

    Goodall, J. L.; Castronova, A. M.; Bandaragoda, C.; Morsy, M. M.; Sadler, J. M.; Essawy, B.; Tarboton, D. G.; Malik, T.; Nijssen, B.; Clark, M. P.; Liu, Y.; Wang, S. W.

    2017-12-01

    Creating cyberinfrastructure to support reproducibility of computational hydrologic models is an important research challenge. Addressing this challenge requires open and reusable code and data with machine and human readable metadata, organized in ways that allow others to replicate results and verify published findings. Specific digital objects that must be tracked for reproducible computational hydrologic modeling include (1) raw initial datasets, (2) data processing scripts used to clean and organize the data, (3) processed model inputs, (4) model results, and (5) the model code with an itemization of all software dependencies and computational requirements. HydroShare is a cyberinfrastructure under active development designed to help users store, share, and publish digital research products in order to improve reproducibility in computational hydrology, with an architecture supporting hydrologic-specific resource metadata. Researchers can upload data required for modeling, add hydrology-specific metadata to these resources, and use the data directly within HydroShare.org for collaborative modeling using tools like CyberGIS, Sciunit-CLI, and JupyterHub that have been integrated with HydroShare to run models using notebooks, Docker containers, and cloud resources. Current research aims to implement the Structure For Unifying Multiple Modeling Alternatives (SUMMA) hydrologic model within HydroShare to support hypothesis-driven hydrologic modeling while also taking advantage of the HydroShare cyberinfrastructure. The goal of this integration is to create the cyberinfrastructure that supports hypothesis-driven model experimentation, education, and training efforts by lowering barriers to entry, reducing the time spent on informatics technology and software development, and supporting collaborative research within and across research groups.

  14. Software Hardware Asset Reuse Enterprise (SHARE) Repository Framework: Related Work and Development Plan

    DTIC Science & Technology

    2009-08-19

    ...designed to collect the data and assist the analyst in drawing relationships between the data. Palantir Technologies has created one such software...application to support the DoD intelligence community by providing robust capabilities for managing data from various sources. The Palantir tool...www.palantirtech.com/ [Figure 17: Palantir Graphical Interface (Gordon-Schlosberg, 2008)] Similar examples of the use of ontologies to support data...

  15. An International Survey of Industrial Applications of Formal Methods. Volume 1: Purpose, Approach, Analysis, and Conclusions

    DTIC Science & Technology

    1993-09-30

    ...months of effort. The product was important for demonstrating to IBM management the potential of the Cleanroom methodology. 3.2.4 Software Architecture for Oscilloscopes Using Z (Tektronix): Tektronix in Beaverton, Oregon, used Z to develop a reusable software architecture to be shared among a number...

  16. CrossTalk: The Journal of Defense Software Engineering. Volume 22, Number 6, September/October 2009

    DTIC Science & Technology

    2009-10-01

    ...software to improve the reliability, sustainability, and responsiveness of our warfighting capability. ... Authors will use this section to share evidence of a demonstrative return on investment, process improvement, quality improvement, and reductions to schedule...

  17. Evaluation of a deidentification (De-Id) software engine to share pathology reports and clinical documents for research.

    PubMed

    Gupta, Dilip; Saul, Melissa; Gilbertson, John

    2004-02-01

    We evaluated a comprehensive deidentification engine at the University of Pittsburgh Medical Center (UPMC), Pittsburgh, PA, that uses a complex set of rules, dictionaries, pattern-matching algorithms, and the Unified Medical Language System to identify and replace identifying text in clinical reports while preserving medical information for sharing in research. In our initial data set of 967 surgical pathology reports, the software did not suppress outside (103), UPMC (47), and non-UPMC (56) accession numbers; dates (7); names (9) or initials (25) of case pathologists; or hospital or laboratory names (46). In 150 reports, some clinical information was suppressed inadvertently (overmarking). The engine retained eponymic patient names, eg, Barrett and Gleason. In the second evaluation (1,000 reports), the software did not suppress outside (90) or UPMC (6) accession numbers or names (4) or initials (2) of case pathologists. In the third evaluation, the software removed names of patients, hospitals (297/300), pathologists (297/300), transcriptionists, residents and physicians, dates of procedures, and accession numbers (298/300). By the end of the evaluation, the system was reliably and specifically removing safe-harbor identifiers and producing highly readable deidentified text without removing important clinical information. Collaboration between pathology domain experts and system developers and continuous quality assurance are needed to optimize ongoing deidentification processes.
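
    The rule-and-dictionary approach described above can be miniaturized as follows. The patterns here are illustrative toys, nothing like the production rule set evaluated in the study.

      import re

      RULES = [
          (re.compile(r"\b[A-Z]{1,3}\d{2}-\d{4,6}\b"), "**ACCESSION**"),
          (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "**DATE**"),
          (re.compile(r"\b(?:Dr|Mr|Ms|Mrs)\.\s+[A-Z][a-z]+\b"), "**NAME**"),
      ]
      INSTITUTIONS = {"UPMC"}   # dictionary-lookup component

      def deidentify(text):
          """Apply pattern rules first, then dictionary substitutions."""
          for pattern, tag in RULES:
              text = pattern.sub(tag, text)
          for word in INSTITUTIONS:
              text = text.replace(word, "**INSTITUTION**")
          return text

      print(deidentify("Case S04-12345 reviewed by Dr. Smith at UPMC on 2/14/2004."))
      # -> Case **ACCESSION** reviewed by **NAME** at **INSTITUTION** on **DATE**.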

  18. General software design for multisensor data fusion

    NASA Astrophysics Data System (ADS)

    Zhang, Junliang; Zhao, Yuming

    1999-03-01

    In this paper, a general method of software design for multisensor data fusion is discussed in detail, which adopts object-oriented technology under the UNIX operating system. The software for multisensor data fusion is divided into six functional modules: data collection, database management, GIS, target display and alarming, data simulation, etc. Furthermore, the primary function, the components and some realization methods of each module are given. The interfaces among these functional modules are discussed. The data exchange among the functional modules is performed by interprocess communication (IPC), including message queues, semaphores and shared memory. Thus, each functional module is executed independently, which reduces the dependence among functional modules and helps software programming and testing. The software for multisensor data fusion is designed as a hierarchical structure using the inheritance character of classes. Each functional module is abstracted and encapsulated through a class structure, which avoids software redundancy and enhances readability.
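
    The IPC pattern the abstract describes maps directly onto Python's multiprocessing primitives, with a queue, shared memory and a lock standing in for the SysV message queue, shared memory segment and semaphore. A minimal sketch of the idea, not the paper's design:

      from multiprocessing import Process, Queue, Array, Lock

      def collector(q):
          """Data-collection module: push (sensor_id, value) readings."""
          for reading in [(1, 0.52), (2, 0.48)]:
              q.put(reading)
          q.put(None)   # end-of-stream marker

      def fusion(q, fused, lock):
          """Fusion module: average readings into shared memory."""
          total, n = 0.0, 0
          while True:
              item = q.get()
              if item is None:
                  break
              total += item[1]
              n += 1
          with lock:                     # semaphore-style protection
              fused[0] = total / max(n, 1)

      if __name__ == "__main__":
          q, fused, lock = Queue(), Array("d", 1), Lock()
          procs = [Process(target=collector, args=(q,)),
                   Process(target=fusion, args=(q, fused, lock))]
          for p in procs:
              p.start()
          for p in procs:
              p.join()
          print("fused estimate:", fused[0])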

  19. PhysioNet: physiologic signals, time series and related open source software for basic, clinical, and applied research.

    PubMed

    Moody, George B; Mark, Roger G; Goldberger, Ary L

    2011-01-01

    PhysioNet provides free web access to over 50 collections of recorded physiologic signals and time series, and related open-source software, in support of basic, clinical, and applied research in medicine, physiology, public health, biomedical engineering and computing, and medical instrument design and evaluation. Its three components (PhysioBank, the archive of signals; PhysioToolkit, the software library; and PhysioNetWorks, the virtual laboratory for collaborative development of future PhysioBank data collections and PhysioToolkit software components) connect researchers and students who need physiologic signals and relevant software with researchers who have data and software to share. PhysioNet's annual open engineering challenges stimulate rapid progress on unsolved or poorly solved questions of basic or clinical interest, by focusing attention on achievable solutions that can be evaluated and compared objectively using freely available reference data.
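
    The signal collections are scriptable; for instance, the open-source wfdb Python package can pull a PhysioBank record directly. The package and database are real, but treat the exact call below as an assumption to check against current wfdb documentation.

      import wfdb

      # Read the first ten seconds of record 100 from the MIT-BIH
      # Arrhythmia Database, fetched from PhysioNet's servers.
      record = wfdb.rdrecord("100", pn_dir="mitdb", sampto=3600)
      print(record.sig_name, record.fs, record.p_signal.shape)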

  20. Design and implementation of a robot control system with traded and shared control capability

    NASA Technical Reports Server (NTRS)

    Hayati, S.; Venkataraman, S. T.

    1989-01-01

    Preliminary results are reported from efforts to design and develop a robotic system that will accept and execute commands from either a six-axis teleoperator device or an autonomous planner, or combine the two. Such a system should have both traded as well as shared control capability. A sharing strategy is presented whereby the overall system, while retaining positive features of teleoperated and autonomous operation, loses its individual negative features. A two-tiered shared control architecture is considered here, consisting of a task level and a servo level. Also presented is a computer architecture for the implementation of this system, including a description of the hardware and software.
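
    At the task level, the shared mode can be pictured as a per-axis blend of the two command sources. This is a generic illustration of the idea, not the paper's control law:

      import numpy as np

      def shared_command(u_teleop, u_auto, alpha):
          """alpha = 1 -> pure teleoperation, alpha = 0 -> fully autonomous;
          intermediate values share control, and switching alpha between
          0 and 1 trades control."""
          return alpha * np.asarray(u_teleop) + (1 - alpha) * np.asarray(u_auto)

      u_tele = [0.10, 0.00, -0.05, 0.0, 0.0, 0.02]   # hand-controller twist
      u_plan = [0.08, 0.01, -0.04, 0.0, 0.0, 0.00]   # planner-generated twist
      print(shared_command(u_tele, u_plan, alpha=0.7))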

  1. Computational high-resolution optical imaging of the living human retina

    NASA Astrophysics Data System (ADS)

    Shemonski, Nathan D.; South, Fredrick A.; Liu, Yuan-Zhi; Adie, Steven G.; Scott Carney, P.; Boppart, Stephen A.

    2015-07-01

    High-resolution in vivo imaging is of great importance for the fields of biology and medicine. The introduction of hardware-based adaptive optics (HAO) has pushed the limits of optical imaging, enabling high-resolution near diffraction-limited imaging of previously unresolvable structures. In ophthalmology, when combined with optical coherence tomography, HAO has enabled a detailed three-dimensional visualization of photoreceptor distributions and individual nerve fibre bundles in the living human retina. However, the introduction of HAO hardware and supporting software adds considerable complexity and cost to an imaging system, limiting the number of researchers and medical professionals who could benefit from the technology. Here we demonstrate a fully automated computational approach that enables high-resolution in vivo ophthalmic imaging without the need for HAO. The results demonstrate that computational methods in coherent microscopy are applicable in highly dynamic living systems.

  2. Simultaneous wavelength and format conversion in SDN/NFV for flexible optical network based on FWM in SOA

    NASA Astrophysics Data System (ADS)

    Zhan, Yueying; Wang, Danshi; Zhang, Min

    2018-04-01

    We propose an all-optical wavelength and format conversion model (CM) for a dynamic data center interconnect node and a coherent passive optical network (PON) optical network unit (ONU) in a software-defined networking and network function virtualization system, based on four-wave mixing in a semiconductor optical amplifier (SOA). Five wavelength-converted DQPSK signals and two format-converted DPSK signals are generated; the performance of the generated signals for the two strategies of placing the CM in the data center interconnect node and in the coherent PON ONU has been verified over 10 km of fiber transmission. All of the converted signals show a power penalty of less than 2.2 dB at the FEC threshold of 3.8 × 10⁻³, and the optimum bias current of the SOA is 300 mA.

  3. Three-dimensional reconstruction for coherent diffraction patterns obtained by XFEL.

    PubMed

    Nakano, Miki; Miyashita, Osamu; Jonic, Slavica; Song, Changyong; Nam, Daewoong; Joti, Yasumasa; Tama, Florence

    2017-07-01

    The three-dimensional (3D) structural analysis of single particles using an X-ray free-electron laser (XFEL) is a new structural biology technique that enables observations of molecules that are difficult to crystallize, such as flexible biomolecular complexes and living tissue in the state close to physiological conditions. In order to restore the 3D structure from the diffraction patterns obtained by the XFEL, computational algorithms are necessary as the orientation of the incident beam with respect to the sample needs to be estimated. A program package for XFEL single-particle analysis based on the Xmipp software package, that is commonly used for image processing in 3D cryo-electron microscopy, has been developed. The reconstruction program has been tested using diffraction patterns of an aerosol nanoparticle obtained by tomographic coherent X-ray diffraction microscopy.

  4. CHOROIDAL THICKNESS IN DIABETIC RETINOPATHY ASSESSED WITH SWEPT-SOURCE OPTICAL COHERENCE TOMOGRAPHY.

    PubMed

    Laíns, Inês; Talcott, Katherine E; Santos, Ana R; Marques, João H; Gil, Pedro; Gil, João; Figueira, João; Husain, Deeba; Kim, Ivana K; Miller, Joan W; Silva, Rufino; Miller, John B

    2018-01-01

    To compare the choroidal thickness (CT) of diabetic eyes (different stages of disease) with controls, using swept-source optical coherence tomography. A multicenter, prospective, cross-sectional study of diabetic and nondiabetic subjects using swept-source optical coherence tomography imaging. Choroidal thickness maps, according to the nine Early Treatment Diabetic Retinopathy Study (ETDRS) subfields, were obtained using automated software. Mean CT was calculated as the mean value within the ETDRS grid, and central CT as the mean in the central 1 mm. Diabetic eyes were divided into four groups: no diabetic retinopathy (No DR), nonproliferative DR (NPDR), NPDR with diabetic macular edema (NPDR + DME), and proliferative DR (PDR). Multilevel mixed linear models were performed for analyses. The authors included 50 control and 160 diabetic eyes (n = 27 No DR, n = 51 NPDR, n = 61 NPDR + DME, and n = 21 PDR). Mean CT (β = -42.9, P = 0.022) and central CT (β = -50.2, P = 0.013) were statistically significantly thinner in PDR eyes compared with controls, even after adjusting for confounding factors. Controlling for age, DR eyes presented a significantly decreased central CT than diabetic eyes without retinopathy (β = -36.2, P = 0.009). Swept-source optical coherence tomography demonstrates a significant reduction of CT in PDR compared with controls. In the foveal region, the choroid appears to be thinner in DR eyes than in diabetic eyes without retinopathy.

  5. Scheduling for Locality in Shared-Memory Multiprocessors

    DTIC Science & Technology

    1993-05-01

    Submitted in partial fulfillment of the requirements for the degree Doctor of Philosophy. ...architecture on parallel program performance, explain the implications of this trend on popular parallel programming models, and propose system software to...decomposition and scheduling algorithms. Subject terms: shared-memory multiprocessors; architecture trends; loop scheduling.

  6. Unclassified Information Sharing and Coordination in Security, Stabilization, Transition and Reconstruction Efforts

    DTIC Science & Technology

    2008-03-01

    ...is implemented using the Drupal (2007) content management system (CMS), and many of the baseline information sharing and collaboration tools have been contributed through the Drupal open source community. Drupal is a very modular open source software package written in the PHP hypertext preprocessor...needed to suit the particular problem domain. While other frameworks have the potential to provide similar advantages ("Ruby," 2007), Drupal was...

  7. The A-7E Software Requirements Document: Three Years of Change Data.

    DTIC Science & Technology

    1982-11-08

    Washington, DC: Naval Research Laboratory, 1982. Interface Specifications for the A-7E Shared Services Module, NRL Memorandum Report, forthcoming. ...function driver module (Clements 1981), specifications for the extended computer module (Britton et al. 1982), and specifications for the shared services module (Clements 1982). The projected completion date for the SCR project is September 1985. As of the end of 1981, approximately 10 man-years of...

  8. Apeiron: engaging students in ocean science

    NASA Astrophysics Data System (ADS)

    Manzella, Alessandro; Manzella, Giuseppe M. R.

    2017-04-01

    Anaxagoras believed that all things existed in a boundless form. Ápeiron began to rotate under the control of Nous (Mind), and the rotation caused the universe to break up into fragments, each containing parts of all other things. However, since all individual things had originated from the same ápeiron, all things must contain parts of all other things. In some sense, the title contains the main concept: the interdependence of humans and the natural environment, which makes it necessary to have a general understanding of how anthropogenic activities have changed the earth system and how they are impacting the climate cycles. The ability to solve a problem, to write a coherent paragraph, and to utter a cogent statement are soft skills supporting sustainable development. Soft skills must be tempered with the ability to integrate knowledge from various sources into a coherent whole. Interaction among students, professors and researchers improves personal comprehension. Students must be encouraged to debate ideas and the way to present them, and are asked to look for and develop bases for shared understanding. In this way they participate in the definition of a knowledge building process as a social epistemology: from personal beliefs to a socially shared vision.

  9. Managing Written Directives: A Software Solution to Streamline Workflow.

    PubMed

    Wagner, Robert H; Savir-Baruch, Bital; Gabriel, Medhat S; Halama, James R; Bova, Davide

    2017-06-01

    A written directive is required by the U.S. Nuclear Regulatory Commission for any use of 131I above 1.11 MBq (30 μCi) and for patients receiving radiopharmaceutical therapy. This requirement has also been adopted and must be enforced by the agreement states. As the introduction of new radiopharmaceuticals increases therapeutic options in nuclear medicine, time spent on regulatory paperwork also increases. The pressure of managing these time-consuming regulatory requirements may heighten the potential for inaccurate or incomplete directive data and subsequent regulatory violations. To improve on the paper-trail method of directive management, we created a software tool using a Health Insurance Portability and Accountability Act (HIPAA)-compliant database. This software allows for secure data-sharing among physicians, technologists, and managers while saving time, reducing errors, and eliminating the possibility of loss and duplication. Methods: The software tool was developed using Visual Basic, which is part of the Visual Studio development environment for the Windows platform. Patient data are deposited in an Access database on a local HIPAA-compliant secure server or hard disk. Once a working version had been developed, it was installed at our institution and used to manage directives. Updates and modifications of the software were released regularly until no more significant problems were found with its operation. Results: The software has been used at our institution for over 2 y and has reliably kept track of all directives. All physicians and technologists use the software daily and find it superior to paper directives. They can retrieve active directives at any stage of completion, as well as completed directives. Conclusion: We have developed a software solution for the management of written directives that streamlines and structures the departmental workflow. This solution saves time, centralizes the information for all staff to share, and decreases confusion about the creation, completion, filing, and retrieval of directives. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.

  10. Open Technology Approaches to Geospatial Interface Design

    NASA Astrophysics Data System (ADS)

    Crevensten, B.; Simmons, D.; Alaska Satellite Facility

    2011-12-01

    What problems do you not want your software developers to be solving? Choosing open technologies across the entire stack of software development, from low-level shared libraries to high-level user interaction implementations, is a way to help ensure that customized software yields innovative and valuable tools for Earth scientists. This demonstration will review developments in web application technologies and the recurring patterns of interaction design regarding exploration and discovery of geospatial data through Vertex, ASF's Dataportal interface, a project utilizing current open web application standards and technologies including HTML5, jQueryUI, Backbone.js and the Jasmine unit testing framework.

  11. Behavior driven testing in ALMA telescope calibration software

    NASA Astrophysics Data System (ADS)

    Gil, Juan P.; Garces, Mario; Broguiere, Dominique; Shen, Tzu-Chiang

    2016-07-01

    The ALMA software development cycle includes well-defined testing stages that involve developers, testers, and scientists. We adapted Behavior Driven Development (BDD) to testing activities applied to the Telescope Calibration (TELCAL) software. BDD is an agile technique that encourages communication between roles by defining test cases in natural language to specify features and scenarios, which allows participants to share a common language and provides a high-level set of automated tests. This work describes how we implemented and maintain BDD testing for TELCAL, the infrastructure needed to support it, and proposals to expand this technique to other subsystems.
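    The abstract does not name the BDD toolkit used; as a sketch of the style it describes, here is a hypothetical scenario expressed as Python step definitions for the behave library, with the natural-language feature text shown in the comment. All step names are invented.

```python
# steps/telcal_steps.py -- hypothetical step definitions for a scenario like:
#
#   Scenario: Compute an atmosphere calibration
#     Given a calibration request of type "ATMOSPHERE"
#     When the TELCAL engine processes the request
#     Then a calibration result is produced
#
from behave import given, when, then

@given('a calibration request of type "{cal_type}"')
def step_given_request(context, cal_type):
    context.request = {"type": cal_type}

@when("the TELCAL engine processes the request")
def step_when_process(context):
    # Stand-in for the real calibration engine call.
    context.result = {"type": context.request["type"], "status": "OK"}

@then("a calibration result is produced")
def step_then_result(context):
    assert context.result["status"] == "OK"
```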

  12. Concierge: Personal Database Software for Managing Digital Research Resources

    PubMed Central

    Sakai, Hiroyuki; Aoyama, Toshihiro; Yamaji, Kazutsuna; Usui, Shiro

    2007-01-01

    This article introduces a desktop application, named Concierge, for managing personal digital research resources. Using simple operations, it enables storage of various types of files and indexes them based on content descriptions. A key feature of the software is a high level of extensibility. By installing optional plug-ins, users can customize and extend the usability of the software based on their needs. In this paper, we also introduce a few optional plug-ins: literature management, electronic laboratory notebook, and XooNIps client plug-ins. XooNIps is a content management system developed to share digital research resources among neuroscience communities. It has been adopted as the standard database system in Japanese neuroinformatics projects. Concierge, therefore, offers comprehensive support from management of personal digital research resources to their sharing in open-access neuroinformatics databases such as XooNIps. This interaction between personal and open-access neuroinformatics databases is expected to enhance the dissemination of digital research resources. Concierge is developed as an open source project; Mac OS X and Windows XP versions have been released at the official site (http://concierge.sourceforge.jp). PMID:18974800
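    The plug-in mechanism itself is not specified in the abstract; the following is a minimal sketch of one common way to structure such extensibility, a registry that optional plug-ins add themselves to. All names are hypothetical.

```python
# Minimal plug-in registry sketch: optional modules register a class that
# the host application can discover and invoke.
PLUGINS = {}

def register_plugin(name):
    """Class decorator that adds a plug-in to the registry under a name."""
    def decorator(cls):
        PLUGINS[name] = cls
        return cls
    return decorator

@register_plugin("literature")
class LiteraturePlugin:
    def index(self, resource):
        # A real plug-in would extract bibliographic metadata here.
        return {"title": resource.get("title", ""), "kind": "literature"}

def index_resource(resource, plugin_name):
    plugin = PLUGINS[plugin_name]()  # instantiate the registered plug-in
    return plugin.index(resource)

print(index_resource({"title": "Example paper"}, "literature"))
```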

  13. Technology collaboration by means of an open source government

    NASA Astrophysics Data System (ADS)

    Berardi, Steven M.

    2009-05-01

    The idea of open source software originally began in the early 1980s, but it never gained widespread support until recently, largely due to the explosive growth of the Internet. Only the Internet has made this kind of concept possible, bringing together millions of software developers from around the world to pool their knowledge. The tremendous success of open source software has prompted many corporations to adopt the culture of open source and thus share information they previously held secret. The government, and specifically the Department of Defense (DoD), could also benefit from adopting an open source culture. In acquiring satellite systems, the DoD often builds walls between program offices, but installing doors between programs can promote collaboration and information sharing. This paper addresses the challenges and consequences of adopting an open source culture to facilitate technology collaboration for DoD space acquisitions. DISCLAIMER: The views presented here are the views of the author, and do not represent the views of the United States Government, United States Air Force, or the Missile Defense Agency.

  14. What should I do next? Using shared representations to solve interaction problems.

    PubMed

    Pezzulo, Giovanni; Dindo, Haris

    2011-06-01

    Studies on how "the social mind" works reveal that cognitive agents engaged in joint actions actively estimate and influence one another's cognitive variables and form shared representations with them. (How) do shared representations enhance coordination? In this paper, we provide a probabilistic model of joint action that emphasizes how shared representations help solve interaction problems. We focus on two aspects of the model. First, we discuss how shared representations permit coordination at the level of cognitive variables (beliefs, intentions, and actions) and determine a coherent unfolding of action execution and predictive processes in the brains of two agents. Second, we discuss the importance of signaling actions as part of a strategy for sharing representations and the active guidance of another's actions toward the achievement of a joint goal. Furthermore, we present data from a human-computer experiment (the Tower Game) in which two agents (human and computer) have to build a tower of colored blocks together, but only the human knows the constellation of the tower to be built (e.g., red-blue-red-blue-…). We report evidence that humans use signaling strategies that take another's uncertainty into consideration, and that in turn our model is able to use humans' actions as cues to "align" its representations and to select complementary actions.
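    The paper's probabilistic model is not reproduced in the abstract; as a minimal sketch of the underlying idea, the following Bayesian update lets an artificial agent refine a belief over which tower constellation its human partner intends, from observed block placements. The two hypotheses and likelihood values are invented.

```python
import numpy as np

# Two hypothetical goal constellations the partner might intend.
hypotheses = ["red-blue-red-blue", "red-red-blue-blue"]
prior = np.array([0.5, 0.5])

def likelihood(observed_block, position, hypothesis):
    """P(observed block at this position | hypothesis), with observation noise."""
    expected = hypothesis.split("-")[position]
    return 0.9 if observed_block == expected else 0.1

# Observe the partner place 'red' then 'blue': update the belief twice.
belief = prior.copy()
for position, block in enumerate(["red", "blue"]):
    like = np.array([likelihood(block, position, h) for h in hypotheses])
    belief = like * belief
    belief /= belief.sum()  # normalize to a proper posterior

# The second (disambiguating) placement concentrates the belief on hypothesis 1.
print(dict(zip(hypotheses, belief.round(3))))
```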

  15. Center-to-center : local self-evaluation report

    DOT National Transportation Integrated Search

    2003-04-01

    Texas Department of Transportation implemented a software system to facilitate sharing of traffic management related information and control of Intelligent Transportation System field devices between Traffic Management Centers with heterogeneous Adva...

  16. Reconfigurable firmware-defined radios synthesized from standard digital logic cells

    NASA Astrophysics Data System (ADS)

    Faisal, Muhammad; Park, Youngmin; Wentzloff, David D.

    2011-06-01

    This paper presents recent work on reconfigurable all-digital radio architectures. We leverage the flexibility and scalability of synthesized digital cells to construct reconfigurable radio architectures that consume significantly less power than a software-defined radio implementing similar architectures. We present two prototypes of such architectures that can receive and demodulate FM and FRS-band signals. Moreover, a radio architecture based on a reconfigurable all-digital phase-locked loop for coherent demodulation is presented.

  17. Coherent Path Beamformer Front End for High Performance Acoustic Modems

    DTIC Science & Technology

    1999-09-30

    transmission underwater. This knowledge will be used to develop a test model for evaluating underwater acoustic modems and other shallow-water sonar... rates can be achieved, as shown in the following two sections. WORK COMPLETED: Two systems have been developed in the Sonar Laboratory of Ocean... 2) More performant variable-gain preamplifiers have been installed and the software updated for better control of the dynamic range. 3) An

  18. Are Earth System model software engineering practices fit for purpose? A case study.

    NASA Astrophysics Data System (ADS)

    Easterbrook, S. M.; Johns, T. C.

    2009-04-01

    We present some analysis and conclusions from a case study of the culture and practices of scientists at the Met Office and Hadley Centre working on the development of software for climate and Earth System models using the MetUM infrastructure. The study examined how scientists think about software correctness, prioritize their requirements in making changes, and develop a shared understanding of the resulting models. We conclude that highly customized techniques, driven strongly by scientific research goals, have evolved for verification and validation of such models. In a formal software engineering context these represent costly, but invaluable, software integration tests with considerable benefits. The software engineering practices seen also exhibit recognisable features of both agile and open source software development projects: self-organisation of teams consistent with a meritocracy rather than top-down organisation, extensive use of informal communication channels, and software developers who are generally also users and science domain experts. We draw some general conclusions on whether these practices work well, and what new software engineering challenges may lie ahead as Earth System models become ever more complex and petascale computing becomes the norm.

  19. ProXL (Protein Cross-Linking Database): A Platform for Analysis, Visualization, and Sharing of Protein Cross-Linking Mass Spectrometry Data

    PubMed Central

    2016-01-01

    ProXL is a Web application and accompanying database designed for sharing, visualizing, and analyzing bottom-up protein cross-linking mass spectrometry data with an emphasis on structural analysis and quality control. ProXL is designed to be independent of any particular software pipeline. The import process is simplified by the use of the ProXL XML data format, which shields developers of data importers from the relative complexity of the relational database schema. The database and Web interfaces function equally well for any software pipeline and allow data from disparate pipelines to be merged and contrasted. ProXL includes robust public and private data sharing capabilities, including a project-based interface designed to ensure security and facilitate collaboration among multiple researchers. ProXL provides multiple interactive and highly dynamic data visualizations that facilitate structural-based analysis of the observed cross-links as well as quality control. ProXL is open-source, well-documented, and freely available at https://github.com/yeastrc/proxl-web-app. PMID:27302480
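    The ProXL XML schema is not given in the abstract, so the element names below are hypothetical; the sketch only illustrates the general approach of emitting a pipeline-neutral XML document from a search pipeline's native results using the Python standard library.

```python
import xml.etree.ElementTree as ET

# Hypothetical cross-link results from some search pipeline's native format.
crosslinks = [
    {"protein_1": "P12345", "pos_1": 101, "protein_2": "P67890", "pos_2": 57},
]

# Build a pipeline-neutral XML document (element names are invented, not
# the actual ProXL XML schema).
root = ET.Element("proxl_input", attrib={"search_program": "example-pipeline"})
for xl in crosslinks:
    link = ET.SubElement(root, "crosslink")
    ET.SubElement(link, "site", protein=xl["protein_1"], position=str(xl["pos_1"]))
    ET.SubElement(link, "site", protein=xl["protein_2"], position=str(xl["pos_2"]))

ET.ElementTree(root).write("results.proxl.xml", xml_declaration=True, encoding="utf-8")
```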

  20. The NIH BD2K center for big data in translational genomics.

    PubMed

    Paten, Benedict; Diekhans, Mark; Druker, Brian J; Friend, Stephen; Guinney, Justin; Gassner, Nadine; Guttman, Mitchell; Kent, W James; Mantey, Patrick; Margolin, Adam A; Massie, Matt; Novak, Adam M; Nothaft, Frank; Pachter, Lior; Patterson, David; Smuga-Otto, Maciej; Stuart, Joshua M; Van't Veer, Laura; Wold, Barbara; Haussler, David

    2015-11-01

    The world's genomics data will never be stored in a single repository; rather, it will be distributed among many sites in many countries. No one site will have enough data to explain genotype-to-phenotype relationships in rare diseases; therefore, sites must share data. To accomplish this, the genetics community must forge common standards and protocols to make sharing and computing data among many sites a seamless activity. Through the Global Alliance for Genomics and Health, we are pioneering the development of shared application programming interfaces (APIs) to connect the world's genome repositories. In parallel, we are developing an open source software stack (ADAM) that uses these APIs. This combination will create a cohesive genome informatics ecosystem. Using containers, we are facilitating the deployment of this software in a diverse array of environments. Through benchmarking efforts and big data driver projects, we are ensuring ADAM's performance and utility. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  1. ProXL (Protein Cross-Linking Database): A Platform for Analysis, Visualization, and Sharing of Protein Cross-Linking Mass Spectrometry Data.

    PubMed

    Riffle, Michael; Jaschob, Daniel; Zelter, Alex; Davis, Trisha N

    2016-08-05

    ProXL is a Web application and accompanying database designed for sharing, visualizing, and analyzing bottom-up protein cross-linking mass spectrometry data with an emphasis on structural analysis and quality control. ProXL is designed to be independent of any particular software pipeline. The import process is simplified by the use of the ProXL XML data format, which shields developers of data importers from the relative complexity of the relational database schema. The database and Web interfaces function equally well for any software pipeline and allow data from disparate pipelines to be merged and contrasted. ProXL includes robust public and private data sharing capabilities, including a project-based interface designed to ensure security and facilitate collaboration among multiple researchers. ProXL provides multiple interactive and highly dynamic data visualizations that facilitate structural-based analysis of the observed cross-links as well as quality control. ProXL is open-source, well-documented, and freely available at https://github.com/yeastrc/proxl-web-app .

  2. A knowledge based software engineering environment testbed

    NASA Technical Reports Server (NTRS)

    Gill, C.; Reedy, A.; Baker, L.

    1985-01-01

    The Carnegie Group Incorporated and Boeing Computer Services Company are developing a testbed which will provide a framework for integrating conventional software engineering tools with Artificial Intelligence (AI) tools to promote automation and productivity. The emphasis is on the transfer of AI technology to the software development process. Experiments relate to AI issues such as scaling up, inference, and knowledge representation. In its first year, the project has created a model of software development by representing software activities; developed a module representation formalism to specify the behavior and structure of software objects; integrated the model with the formalism to identify shared representation and inheritance mechanisms; demonstrated object programming by writing procedures and applying them to software objects; used data-directed and goal-directed reasoning to, respectively, infer the cause of bugs and evaluate the appropriateness of a configuration; and demonstrated knowledge-based graphics. Future plans include introduction of knowledge-based systems for rapid prototyping or rescheduling; natural language interfaces; blackboard architecture; and distributed processing.

  3. Design Activity in the Software Cost Reduction Project.

    DTIC Science & Technology

    1986-08-18

    Abbreviations: PM, Physical Model; SG, System Generation; SS, Shared Services; SU, System Utilities. [OCR chart residue omitted; recoverable captions: Fig. 7, Shared services activities (Jan 78 - Jan 85); Fig. 13, Shared services design activities (Jan 78 - Jan 85). NRL Report 8974.]

  4. Data Telemetry and Acquisition System for Acoustic Signal Processing Investigations.

    DTIC Science & Technology

    1996-02-20

    were VME-based computer systems operating under the VxWorks real-time operating system. Each system shared a common hardware and software... real-time operating system. It interfaces to the Berg PCM Decommutator board, which searches for the embedded synchronization word in the data and re... software were built on top of this architecture. The multi-tasking, message queue, and memory management facilities of the VxWorks real-time operating system are

  5. NDEx - the Network Data Exchange, A Network Commons for Biologists | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    Network models of biology, whether curated or derived from large-scale data analysis, are critical tools in the understanding of cancer mechanisms and in the design and personalization of therapies. The NDEx Project (Network Data Exchange) will create, deploy, and maintain an open-source, web-based software platform and public website to enable scientists, organizations, and software applications to share, store, manipulate, and publish biological networks.

  6. A Formal Approach to the Provably Correct Synthesis of Mission Critical Embedded Software for Multi Core Embedded Platforms

    DTIC Science & Technology

    2014-04-01

    synchronization primitives based on preset templates can result in over-synchronization if unchecked, possibly creating deadlock situations. Further... inputs rather than enforcing synchronization with a global clock. MRICDF models software as a network of communicating actors. Four primitive actors... control wants to send an interrupt or not. Since this is a shared buffer, a semaphore mechanism is assumed to synchronize the read/write of this buffer. The

  7. JBEI Registry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ham, Timothy

    2008-12-01

    The JBEI Registry is software for storing and managing a database of biological parts. It is intended to be used as a web service accessed via a web browser, and it is also capable of running as a desktop program for a single user. The registry software stores, indexes, and categorizes parts, and allows users to enter, search, retrieve, and construct biological constructs in silico. It is also able to communicate with other registries for data sharing and exchange.

  8. Commercial Digital/ADP Equipment in the Ocean Environment. Volume 2. User Appendices

    DTIC Science & Technology

    1978-12-15

    is that the LINDA system uses a minicomputer with time-sharing system software, which allows several terminals to be operated at the same time... Acquisition System (ODAS) consists of sensors, computer hardware, and computer software. Certain sensors are interfaced to the computers for real-time... on USNS KANE, USNS BENT, and USNS WILKES. Commercial automatic data processing equipment used in ODAS includes: Item Model Computer PDP-9 Tape

  9. Transportable Payload Operations Control Center reusable software: Building blocks for quality ground data systems

    NASA Technical Reports Server (NTRS)

    Mahmot, Ron; Koslosky, John T.; Beach, Edward; Schwarz, Barbara

    1994-01-01

    The Mission Operations Division (MOD) at Goddard Space Flight Center builds Mission Operations Centers which are used by Flight Operations Teams to monitor and control satellites. Reducing system life-cycle costs through software reuse has always been a priority of the MOD. The MOD's Transportable Payload Operations Control Center (TPOCC) development team established an extensive library of 14 subsystems with over 100,000 delivered source instructions of reusable, generic software components. Nine TPOCC-based control centers to date support 11 satellites, achieving an average software reuse level of more than 75 percent. This paper shares experiences of how the TPOCC building blocks were developed and how building block developers, mission development teams, and users are all part of the process.

  10. Building Successful GitHub Communities

    NASA Astrophysics Data System (ADS)

    Smith, A.

    2014-12-01

    Building successful online communities is hard, whether it's in open source software or web-based citizen science. In this presentation I'll share some lessons learned and outline some techniques employed by successful open source projects.

  11. A Perron-Frobenius type of theorem for quantum operations

    NASA Astrophysics Data System (ADS)

    Lagro, Matthew

    Quantum random walks are a generalization of classical Markovian random walks to a quantum mechanical or quantum computing setting. Quantum walks have promising applications but are complicated by quantum decoherence. We prove that the long-time limiting behavior of the class of quantum operations which are convex combinations of norm-one operators is governed by the eigenvectors with norm-one eigenvalues that are shared by the operators. This class includes all operations formed by a coherent operation with positive probability of orthogonal measurement at each step. We also prove that any operation whose range is contained in a sufficiently low-dimensional subspace of the space of density operators has limiting behavior isomorphic to an associated Markov chain. A particular class of such operations are coherent operations followed by an orthogonal measurement. Applications of the convergence theorems to quantum walks are given.
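    Restated in symbols (a paraphrase of the claim above, not the paper's own notation): for operations of unit norm combined convexly, the limit of repeated application is controlled by their shared unit-modulus eigenvectors.

```latex
% Convex combination of norm-one operators (paraphrase of the abstract):
\Phi \;=\; \sum_i p_i\, A_i ,
\qquad p_i > 0, \quad \sum_i p_i = 1, \quad \lVert A_i \rVert = 1 .
% The long-time behavior of \Phi^{n} as n \to \infty is governed by the
% eigenvectors, common to all A_i, whose eigenvalues have modulus one.
```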

  12. Statistical Mechanics of Coherent Ising Machine — The Case of Ferromagnetic and Finite-Loading Hopfield Models —

    NASA Astrophysics Data System (ADS)

    Aonishi, Toru; Mimura, Kazushi; Utsunomiya, Shoko; Okada, Masato; Yamamoto, Yoshihisa

    2017-10-01

    The coherent Ising machine (CIM) has attracted attention as one of the most effective Ising computing architectures for solving large scale optimization problems because of its scalability and high-speed computational ability. However, it is difficult to implement the Ising computation in the CIM because the theories and techniques of classical thermodynamic equilibrium Ising spin systems cannot be directly applied to the CIM. This means we have to adapt these theories and techniques to the CIM. Here we focus on a ferromagnetic model and a finite loading Hopfield model, which are canonical models sharing a common mathematical structure with almost all other Ising models. We derive macroscopic equations to capture nonequilibrium phase transitions in these models. The statistical mechanical methods developed here constitute a basis for constructing evaluation methods for other Ising computation models.

  13. Semantic Entity-Component State Management Techniques to Enhance Software Quality for Multimodal VR-Systems.

    PubMed

    Fischbach, Martin; Wiebusch, Dennis; Latoschik, Marc Erich

    2017-04-01

    Modularity, modifiability, reusability, and API usability are important software qualities that determine the maintainability of software architectures. Virtual, Augmented, and Mixed Reality (VR, AR, MR) systems, modern computer games, as well as interactive human-robot systems often include various dedicated input-, output-, and processing subsystems. These subsystems collectively maintain a real-time simulation of a coherent application state. The resulting interdependencies between individual state representations, mutual state access, overall synchronization, and flow of control imply a conceptual close coupling, whereas software quality asks for a decoupling to develop maintainable solutions. This article presents five semantics-based software techniques that address this contradiction: semantic grounding, code from semantics, grounded actions, semantic queries, and decoupling by semantics. These techniques are applied to extend the well-established entity-component-system (ECS) pattern to overcome some of this pattern's deficits with respect to the implied state access. A walk-through of central implementation aspects of a multimodal (speech and gesture) VR-interface is used to highlight the techniques' benefits. This use-case is chosen as a prototypical example of complex architectures with multiple interacting subsystems found in many VR, AR and MR architectures. Finally, implementation hints are given, lessons learned regarding maintainability are pointed out, and performance implications are discussed.
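    For readers unfamiliar with the entity-component-system pattern that the article extends, here is a minimal generic ECS sketch in Python (not the authors' semantics-based extension); all names are illustrative.

```python
# Minimal entity-component-system (ECS) sketch: entities are plain IDs,
# components are data attached to IDs, and systems iterate over entities
# that carry the components they need.
from dataclasses import dataclass

@dataclass
class Position:
    x: float
    y: float

@dataclass
class Velocity:
    dx: float
    dy: float

components = {Position: {}, Velocity: {}}
next_entity = 0

def create_entity(*comps):
    global next_entity
    eid = next_entity
    next_entity += 1
    for comp in comps:
        components[type(comp)][eid] = comp
    return eid

def movement_system(dt):
    """Update every entity that has both a Position and a Velocity."""
    for eid, vel in components[Velocity].items():
        pos = components[Position].get(eid)
        if pos is not None:
            pos.x += vel.dx * dt
            pos.y += vel.dy * dt

create_entity(Position(0.0, 0.0), Velocity(1.0, 0.5))
movement_system(dt=0.1)
print(components[Position])
```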

  14. Real-Time Spatio-Temporal Twice Whitening for MIMO Energy Detector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humble, Travis S; Mitra, Pramita; Barhen, Jacob

    2010-01-01

    While many techniques exist for local spectrum sensing of a primary user, each represents a computationally demanding task to secondary user receivers. In software-defined radio, computational complexity lengthens the time for a cognitive radio to recognize changes in the transmission environment. This complexity is even more significant for spatially multiplexed receivers, e.g., in SIMO and MIMO, where the spatio-temporal data sets grow in size with the number of antennae. Limits on power and space for the processor hardware further constrain SDR performance. In this report, we discuss improvements in spatio-temporal twice whitening (STTW) for real-time local spectrum sensing by demonstrating a form of STTW well suited for MIMO environments. We implement STTW on the Coherent Logix hx3100 processor, a multicore processor intended for low-power, high-throughput software-defined signal processing. These results demonstrate how coupling the novel capabilities of emerging multicore processors with algorithmic advances can enable real-time, software-defined processing of large spatio-temporal data sets.
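    Neither the STTW algorithm nor its hx3100 implementation is detailed in the abstract; as a sketch of the generic whitening operation underlying it, the numpy code below decorrelates multichannel samples using the inverse square root of their sample covariance. The twice-whitening structure and real-time aspects are not represented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic correlated data: rows are antenna channels, columns are samples.
mixing = rng.normal(size=(4, 4))
x = mixing @ rng.normal(size=(4, 10000))

# Whitening matrix W = C^{-1/2} from the eigendecomposition of the covariance.
c = np.cov(x)
eigvals, eigvecs = np.linalg.eigh(c)
w = eigvecs @ np.diag(eigvals ** -0.5) @ eigvecs.T

x_white = w @ x
print(np.round(np.cov(x_white), 2))  # approximately the identity matrix
```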

  15. Evaluation of interaction dynamics of concurrent processes

    NASA Astrophysics Data System (ADS)

    Sobecki, Piotr; Białasiewicz, Jan T.; Gross, Nicholas

    2017-03-01

    The purpose of this paper is to present wavelet tools that enable the detection of temporal interactions of concurrent processes. In particular, the determination of interaction coherence of time-varying signals is achieved using a complex continuous wavelet transform. This paper uses an electrocardiogram (ECG) and seismocardiogram (SCG) data set to demonstrate multiple continuous wavelet analysis techniques based on the Morlet wavelet transform. A MATLAB Graphical User Interface (GUI), developed in the reported research to assist in quick and simple data analysis, is presented. These software tools can discover the interaction dynamics of time-varying signals, hence they can reveal their correlation in phase and amplitude, as well as their non-linear interconnections. The user-friendly MATLAB GUI enables effective use of the developed software, allowing the user to load the two processes under investigation, choose the required processing parameters, and then perform the analysis. The software developed is a useful tool for researchers who need to investigate the interaction dynamics of concurrent processes.
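    The MATLAB GUI itself is not available from the abstract; as a rough Python analogue of the core computation, the sketch below uses the PyWavelets complex Morlet CWT to form a cross-wavelet product whose phase indicates the relative timing of two signals. A full coherence estimate would additionally smooth in time and scale; that step is omitted, and all signal parameters are invented.

```python
import numpy as np
import pywt

fs = 250.0                                      # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
x = np.sin(2 * np.pi * 1.2 * t)                 # e.g. a cardiac-band rhythm
y = np.sin(2 * np.pi * 1.2 * t + 0.8)           # same rhythm, phase-shifted

# Complex Morlet CWT of both signals over a range of scales.
scales = np.arange(8, 400)
wx, freqs = pywt.cwt(x, scales, "cmor1.5-1.0", sampling_period=1 / fs)
wy, _ = pywt.cwt(y, scales, "cmor1.5-1.0", sampling_period=1 / fs)

# Cross-wavelet transform: magnitude = common power, angle = phase difference.
xwt = wx * np.conj(wy)
k = np.argmin(np.abs(freqs - 1.2))              # scale closest to 1.2 Hz
print("mean phase offset at ~1.2 Hz (rad):", np.angle(xwt[k]).mean().round(2))
```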

  16. Representing Hydrologic Models as HydroShare Resources to Facilitate Model Sharing and Collaboration

    NASA Astrophysics Data System (ADS)

    Castronova, A. M.; Goodall, J. L.; Mbewe, P.

    2013-12-01

    The CUAHSI HydroShare project is a collaborative effort that aims to provide software for sharing data and models within the hydrologic science community. One of the early focuses of this work has been establishing metadata standards for describing models and model-related data as HydroShare resources. By leveraging this metadata definition, a prototype extension has been developed to create model resources that can be shared within the community using the HydroShare system. The extension uses a general model metadata definition to create resource objects, and was designed so that model-specific parsing routines can extract and populate metadata fields from model input and output files. The long-term goal is to establish a library of supported models where, for each model, the system can extract key metadata fields automatically, thereby establishing standardized model metadata that will serve as the foundation for model sharing and collaboration within HydroShare. The Soil and Water Assessment Tool (SWAT) is used to demonstrate this concept through a case study application.
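    As a sketch of what a model-specific parsing routine of the kind described might look like, the following scans "key: value" header lines from a model file and populates general metadata fields. The file format and field names are invented; a real SWAT parser would handle SWAT's fixed-format files.

```python
# Illustrative metadata extraction for a hypothetical model input format.
from io import StringIO

GENERAL_FIELDS = {"model_name", "model_version", "simulation_start", "simulation_end"}

def extract_metadata(lines):
    metadata = {}
    for line in lines:
        key, sep, value = line.partition(":")
        if not sep:
            continue  # not a "key: value" line
        key = key.strip().lower().replace(" ", "_")
        if key in GENERAL_FIELDS:
            metadata[key] = value.strip()
    return metadata

sample = StringIO("Model Name: SWAT\nModel Version: 2012\nSimulation Start: 1990-01-01\n")
print(extract_metadata(sample))
```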

  17. Performance analysis of stationary Hadamard matrix diffusers in free-space optical communication links

    NASA Astrophysics Data System (ADS)

    Burrell, Derek J.; Middlebrook, Christopher T.

    2017-08-01

    Wireless communication systems that employ free-space optical links in place of radio/microwave technologies carry substantial benefits in terms of data throughput, network security and design efficiency. Along with these advantages comes the challenge of counteracting signal degradation caused by atmospheric turbulence in free-space environments. A fully coherent laser source experiences random phase delays along its traversing path in turbulent conditions forming a speckle pattern and lowering the received signal-to-noise ratio upon detection. Preliminary research has shown that receiver-side speckle contrast may be significantly reduced and signal-to-noise ratio increased accordingly through the use of a partially coherent light source. While dynamic diffusers and adaptive optics solutions have been proven effective, they also add expense and complexity to a system that relies on accessibility and robustness for successful implementation. A custom Hadamard diffractive matrix design is used to statically induce partial coherence in a transmitted beam to increase signal-to-noise ratio for experimental turbulence scenarios. Atmospheric phase screens are generated using an open-source software package and subsequently loaded into a spatial light modulator using nematic liquid crystals to modulate the phase.
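    The custom diffuser design is the subject of the paper; as background, the sketch below constructs the standard ingredient, a Hadamard matrix (here via scipy), and maps its +1/-1 entries to a binary 0/π phase mask of the kind such diffusers are built from. The order and mapping are illustrative.

```python
import numpy as np
from scipy.linalg import hadamard

n = 16                # order must be a power of two for this routine
h = hadamard(n)       # entries are +1 / -1, rows mutually orthogonal

# Map +1 -> 0 phase, -1 -> pi phase: a binary phase mask.
phase_mask = np.where(h > 0, 0.0, np.pi)

# The transmitted field through the mask picks up exp(i * phase).
field = np.exp(1j * phase_mask)
print(phase_mask[:4, :4])
```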

  18. Bringing your tools to CyVerse Discovery Environment using Docker

    PubMed Central

    Devisetty, Upendra Kumar; Kennedy, Kathleen; Sarando, Paul; Merchant, Nirav; Lyons, Eric

    2016-01-01

    Docker has become a very popular container-based virtualization platform for software distribution that has revolutionized the way in which scientific software and software dependencies (software stacks) can be packaged, distributed, and deployed. Docker makes the complex and time-consuming installation procedures needed for scientific software a one-time process. Because it enables platform-independent installation, versioning of software environments, and easy redeployment and reproducibility, Docker is an ideal candidate for the deployment of identical software stacks on different compute environments such as XSEDE and Amazon AWS. CyVerse’s Discovery Environment also uses Docker for integrating its powerful, community-recommended software tools into CyVerse’s production environment for public use. This paper will help users bring their tools into the CyVerse Discovery Environment (DE), which not only allows users to integrate their tools with relative ease compared with the earlier method of tool deployment in DE, but also helps users share their apps with collaborators and release them for public use. PMID:27803802
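    The DE integration workflow itself is described in the paper; as a generic illustration of the Docker usage it builds on, here is a sketch using the Docker SDK for Python (docker-py) to build and run a containerized tool. The image name, path, and command are placeholders, and a running Docker daemon is assumed.

```python
import docker

client = docker.from_env()  # talks to the local Docker daemon

# Build an image from a directory containing a Dockerfile (path is a placeholder).
image, build_logs = client.images.build(path="./my-tool", tag="my-tool:latest")

# Run the containerized tool; the same stack runs identically wherever deployed.
output = client.containers.run("my-tool:latest", command="--version", remove=True)
print(output.decode())
```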

  19. Bringing your tools to CyVerse Discovery Environment using Docker.

    PubMed

    Devisetty, Upendra Kumar; Kennedy, Kathleen; Sarando, Paul; Merchant, Nirav; Lyons, Eric

    2016-01-01

    Docker has become a very popular container-based virtualization platform for software distribution that has revolutionized the way in which scientific software and software dependencies (software stacks) can be packaged, distributed, and deployed. Docker makes the complex and time-consuming installation procedures needed for scientific software a one-time process. Because it enables platform-independent installation, versioning of software environments, and easy redeployment and reproducibility, Docker is an ideal candidate for the deployment of identical software stacks on different compute environments such as XSEDE and Amazon AWS. CyVerse's Discovery Environment also uses Docker for integrating its powerful, community-recommended software tools into CyVerse's production environment for public use. This paper will help users bring their tools into the CyVerse Discovery Environment (DE), which not only allows users to integrate their tools with relative ease compared with the earlier method of tool deployment in DE, but also helps users share their apps with collaborators and release them for public use.

  20. Distributed and Collaborative Software Analysis

    NASA Astrophysics Data System (ADS)

    Ghezzi, Giacomo; Gall, Harald C.

    Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of software analysis, such as source code analysis, co-change analysis or bug prediction. However, easy and straightforward synergies between these analyses and tools rarely exist because of their stand-alone nature, their platform dependence, their different input and output formats and the variety of data to analyze. As a consequence, distributed and collaborative software analysis scenarios and in particular interoperability are severely limited. We describe a distributed and collaborative software analysis platform that allows for a seamless interoperability of software analysis tools across platform, geographical and organizational boundaries. We realize software analysis tools as services that can be accessed and composed over the Internet. These distributed analysis services shall be widely accessible in our incrementally augmented Software Analysis Broker, where organizations and tool providers can register and share their tools. To allow (semi-)automatic use and composition of these tools, they are classified and mapped into a software analysis taxonomy and adhere to specific meta-models and ontologies for their category of analysis.

  1. Continuous Energy Photon Transport Implementation in MCATK

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Terry R.; Trahan, Travis John; Sweezy, Jeremy Ed

    2016-10-31

    The Monte Carlo Application ToolKit (MCATK) code development team has implemented Monte Carlo photon transport into the MCATK software suite. The current particle transport capabilities in MCATK, which process the tracking and collision physics, have been extended to enable tracking of photons using the same continuous energy approximation. We describe the four photoatomic processes implemented, which are coherent scattering, incoherent scattering, pair-production, and photoelectric absorption. The accompanying background, implementation, and verification of these processes will be presented.
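    MCATK's implementation details are in the report itself; the sketch below illustrates only the standard Monte Carlo ingredient the abstract enumerates: selecting which of the four photoatomic processes occurs at a collision, with probability proportional to each process cross section. The cross-section values are made up.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical photoatomic cross sections (barns) at some photon energy.
processes = ["coherent", "incoherent", "pair_production", "photoelectric"]
sigma = np.array([0.8, 2.5, 0.0, 1.1])  # pair production closed below 1.022 MeV

def sample_process():
    """Pick the interaction channel with probability sigma_i / sigma_total."""
    p = sigma / sigma.sum()
    return rng.choice(processes, p=p)

counts = {name: 0 for name in processes}
for _ in range(100_000):
    counts[sample_process()] += 1
print(counts)  # observed frequencies track the cross-section ratios
```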

  2. Three Dimensional Optical Coherence Tomography Imaging: Advantages and Advances

    PubMed Central

    Gabriele, Michelle L; Wollstein, Gadi; Ishikawa, Hiroshi; Xu, Juan; Kim, Jongsick; Kagemann, Larry; Folio, Lindsey S; Schuman, Joel S.

    2010-01-01

    Three dimensional (3D) ophthalmic imaging using optical coherence tomography (OCT) has revolutionized assessment of the eye, the retina in particular. Recent technological improvements have made the acquisition of 3D-OCT datasets feasible. However, while volumetric data can improve disease diagnosis and follow-up, novel image analysis techniques are now necessary in order to process the dense 3D-OCT dataset. Fundamental software improvements include methods for correcting subject eye motion, segmenting structures or volumes of interest, extracting relevant data post hoc and signal averaging to improve delineation of retinal layers. In addition, innovative methods for image display, such as C-mode sectioning, provide a unique viewing perspective and may improve interpretation of OCT images of pathologic structures. While all of these methods are being developed, most remain in an immature state. This review describes the current status of 3D-OCT scanning and interpretation, and discusses the need for standardization of clinical protocols as well as the potential benefits of 3D-OCT scanning that could come when software methods for fully exploiting these rich data sets are available clinically. The implications of new image analysis approaches include improved reproducibility of measurements garnered from 3D-OCT, which may then help improve disease discrimination and progression detection. In addition, 3D-OCT offers the potential for preoperative surgical planning and intraoperative surgical guidance. PMID:20542136

  3. ibex: An open infrastructure software platform to facilitate collaborative work in radiomics

    PubMed Central

    Zhang, Lifei; Fried, David V.; Fave, Xenia J.; Hunter, Luke A.; Court, Laurence E.

    2015-01-01

    Purpose: Radiomics, which is the high-throughput extraction and analysis of quantitative image features, has been shown to have considerable potential to quantify the tumor phenotype. However, at present, a lack of software infrastructure has impeded the development of radiomics and its applications. Therefore, the authors developed the imaging biomarker explorer (IBEX), an open infrastructure software platform that flexibly supports common radiomics workflow tasks such as multimodality image data import and review, development of feature extraction algorithms, model validation, and consistent data sharing among multiple institutions. Methods: The IBEX software package was developed using the MATLAB and C/C++ programming languages. The software architecture deploys the modern model-view-controller, unit testing, and function handle programming concepts to isolate each quantitative imaging analysis task, to validate if their relevant data and algorithms are fit for use, and to plug in new modules. On one hand, IBEX is self-contained and ready to use: it has implemented common data importers, common image filters, and common feature extraction algorithms. On the other hand, IBEX provides an integrated development environment on top of MATLAB and C/C++, so users are not limited to its built-in functions. In the IBEX developer studio, users can plug in, debug, and test new algorithms, extending IBEX’s functionality. IBEX also supports quality assurance for data and feature algorithms: image data, regions of interest, and feature algorithm-related data can be reviewed, validated, and/or modified. More importantly, two key elements in collaborative workflows, the consistency of data sharing and the reproducibility of calculation results, are embedded in the IBEX workflow: image data, feature algorithms, and model validation including newly developed ones from different users can be easily and consistently shared so that results can be more easily reproduced between institutions. Results: Researchers with a variety of technical skill levels, including radiation oncologists, physicists, and computer scientists, have found the IBEX software to be intuitive, powerful, and easy to use. IBEX can be run on any computer with the Windows operating system and 1 GB of RAM. The authors fully validated the implementation of all importers, preprocessing algorithms, and feature extraction algorithms. Windows version 1.0 beta of stand-alone IBEX and IBEX’s source code can be downloaded. Conclusions: The authors successfully implemented IBEX, an open infrastructure software platform that streamlines common radiomics workflow tasks. Its transparency, flexibility, and portability can greatly accelerate the pace of radiomics research and pave the way toward successful clinical translation. PMID:25735289

  4. IBEX: an open infrastructure software platform to facilitate collaborative work in radiomics.

    PubMed

    Zhang, Lifei; Fried, David V; Fave, Xenia J; Hunter, Luke A; Yang, Jinzhong; Court, Laurence E

    2015-03-01

    Radiomics, which is the high-throughput extraction and analysis of quantitative image features, has been shown to have considerable potential to quantify the tumor phenotype. However, at present, a lack of software infrastructure has impeded the development of radiomics and its applications. Therefore, the authors developed the imaging biomarker explorer (IBEX), an open infrastructure software platform that flexibly supports common radiomics workflow tasks such as multimodality image data import and review, development of feature extraction algorithms, model validation, and consistent data sharing among multiple institutions. The IBEX software package was developed using the MATLAB and C/C++ programming languages. The software architecture deploys the modern model-view-controller, unit testing, and function handle programming concepts to isolate each quantitative imaging analysis task, to validate if their relevant data and algorithms are fit for use, and to plug in new modules. On one hand, IBEX is self-contained and ready to use: it has implemented common data importers, common image filters, and common feature extraction algorithms. On the other hand, IBEX provides an integrated development environment on top of MATLAB and C/C++, so users are not limited to its built-in functions. In the IBEX developer studio, users can plug in, debug, and test new algorithms, extending IBEX's functionality. IBEX also supports quality assurance for data and feature algorithms: image data, regions of interest, and feature algorithm-related data can be reviewed, validated, and/or modified. More importantly, two key elements in collaborative workflows, the consistency of data sharing and the reproducibility of calculation results, are embedded in the IBEX workflow: image data, feature algorithms, and model validation including newly developed ones from different users can be easily and consistently shared so that results can be more easily reproduced between institutions. Researchers with a variety of technical skill levels, including radiation oncologists, physicists, and computer scientists, have found the IBEX software to be intuitive, powerful, and easy to use. IBEX can be run on any computer with the Windows operating system and 1 GB of RAM. The authors fully validated the implementation of all importers, preprocessing algorithms, and feature extraction algorithms. Windows version 1.0 beta of stand-alone IBEX and IBEX's source code can be downloaded. The authors successfully implemented IBEX, an open infrastructure software platform that streamlines common radiomics workflow tasks. Its transparency, flexibility, and portability can greatly accelerate the pace of radiomics research and pave the way toward successful clinical translation.

  5. Multidimensional Recurrence Quantification Analysis (MdRQA) for the Analysis of Multidimensional Time-Series: A Software Implementation in MATLAB and Its Application to Group-Level Data in Joint Action

    PubMed Central

    Wallot, Sebastian; Roepstorff, Andreas; Mønster, Dan

    2016-01-01

    We introduce Multidimensional Recurrence Quantification Analysis (MdRQA) as a tool to analyze multidimensional time-series data. We show how MdRQA can be used to capture the dynamics of high-dimensional signals, and how MdRQA can be used to assess coupling between two or more variables. In particular, we describe applications of the method in research on joint and collective action, as it provides a coherent analysis framework to systematically investigate dynamics at different group levels—from individual dynamics, to dyadic dynamics, up to global group-level of arbitrary size. The Appendix in Supplementary Material contains a software implementation in MATLAB to calculate MdRQA measures. PMID:27920748
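    The implementation accompanying the paper is in MATLAB; the following is a minimal numpy sketch of the core step of multidimensional RQA under the usual definition: threshold the pairwise distance matrix of the multidimensional series into a recurrence matrix and read off the recurrence rate. Parameter choices are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

# A multidimensional time series: N time points, d dimensions
# (e.g. one column per group member, as in the group-level application).
n, d = 500, 3
series = np.cumsum(rng.normal(size=(n, d)), axis=0)

# Pairwise Euclidean distances between time points in the d-dimensional space.
diffs = series[:, None, :] - series[None, :, :]
dist = np.sqrt((diffs ** 2).sum(axis=-1))

# Recurrence matrix: 1 where states are closer than a radius threshold.
radius = 0.1 * dist.max()
recurrence = (dist <= radius).astype(int)

# Recurrence rate: fraction of recurrent pairs, excluding the diagonal.
rr = (recurrence.sum() - n) / (n * n - n)
print(f"recurrence rate: {rr:.3f}")
```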

  6. Multidimensional Recurrence Quantification Analysis (MdRQA) for the Analysis of Multidimensional Time-Series: A Software Implementation in MATLAB and Its Application to Group-Level Data in Joint Action.

    PubMed

    Wallot, Sebastian; Roepstorff, Andreas; Mønster, Dan

    2016-01-01

    We introduce Multidimensional Recurrence Quantification Analysis (MdRQA) as a tool to analyze multidimensional time-series data. We show how MdRQA can be used to capture the dynamics of high-dimensional signals, and how MdRQA can be used to assess coupling between two or more variables. In particular, we describe applications of the method in research on joint and collective action, as it provides a coherent analysis framework to systematically investigate dynamics at different group levels, from individual dynamics, to dyadic dynamics, up to global group-level of arbitrary size. The Appendix in Supplementary Material contains a software implementation in MATLAB to calculate MdRQA measures.

  7. Dynamic provisioning of a HEP computing infrastructure on a shared hybrid HPC system

    NASA Astrophysics Data System (ADS)

    Meier, Konrad; Fleig, Georg; Hauth, Thomas; Janczyk, Michael; Quast, Günter; von Suchodoletz, Dirk; Wiebelt, Bernd

    2016-10-01

    Experiments in high-energy physics (HEP) rely on elaborate hardware, software and computing systems to sustain the high data rates necessary to study rare physics processes. The Institut für Experimentelle Kernphysik (EKP) at KIT is a member of the CMS and Belle II experiments, located at the LHC and the SuperKEKB accelerators, respectively. These detectors share the requirement that enormous amounts of measurement data must be processed and analyzed, and a comparable amount of simulated events is required to compare experimental results with theoretical predictions. Classical HEP computing centers are dedicated sites which support multiple experiments and have the required software pre-installed. Nowadays, funding agencies encourage research groups to participate in shared HPC cluster models, where scientists from different domains use the same hardware to increase synergies. This shared usage proves to be challenging for HEP groups, due to their specialized software setup which includes a custom OS (often Scientific Linux), libraries and applications. To overcome this hurdle, the EKP and data center team of the University of Freiburg have developed a system to enable the HEP use case on a shared HPC cluster. To achieve this, an OpenStack-based virtualization layer is installed on top of a bare-metal cluster. While other user groups can run their batch jobs via the Moab workload manager directly on bare-metal, HEP users can request virtual machines with a specialized machine image which contains a dedicated operating system and software stack. In contrast to similar installations, in this hybrid setup, no static partitioning of the cluster into a physical and virtualized segment is required. As a unique feature, the placement of the virtual machine on the cluster nodes is scheduled by Moab and the job lifetime is coupled to the lifetime of the virtual machine. This allows for a seamless integration with the jobs sent by other user groups and honors the fairshare policies of the cluster. The developed thin integration layer between OpenStack and Moab can be adapted to other batch servers and virtualization systems, making the concept also applicable for other cluster operators. This contribution will report on the concept and implementation of an OpenStack-virtualized cluster used for HEP workflows. While the full cluster will be installed in spring 2016, a test-bed setup with 800 cores has been used to study the overall system performance, and dedicated HEP jobs were run in a virtualized environment over many weeks. Furthermore, the dynamic integration of the virtualized worker nodes, depending on the workload at the institute's computing system, will be described.

  8. Ketamine-Induced Oscillations in the Motor Circuit of the Rat Basal Ganglia

    PubMed Central

    Alegre, Manuel; Pérez-Alcázar, Marta; Iriarte, Jorge; Artieda, Julio

    2011-01-01

    Oscillatory activity can be widely recorded in the cortex and basal ganglia. This activity may play a role not only in the physiology of movement, perception and cognition, but also in the pathophysiology of psychiatric and neurological diseases like schizophrenia or Parkinson's disease. Ketamine administration has been shown to cause an increase in gamma activity in cortical and subcortical structures, and an increase in 150 Hz oscillations in the nucleus accumbens in healthy rats, together with hyperlocomotion. We recorded local field potentials from motor cortex, caudate-putamen (CPU), substantia nigra pars reticulata (SNr) and subthalamic nucleus (STN) in 20 awake rats before and after the administration of ketamine at three different subanesthetic doses (10, 25 and 50 mg/Kg), and saline as control condition. Motor behavior was semiautomatically quantified by custom-made software specifically developed for this setting. Ketamine induced coherent oscillations in low gamma (50 Hz), high gamma (80 Hz) and high frequency (HFO, 150 Hz) bands, with different behavior in the four structures studied. While oscillatory activity at these three peaks was widespread across all structures, interactions showed a different pattern for each frequency band. Imaginary coherence at 150 Hz was maximum between motor cortex and the different basal ganglia nuclei, while low gamma coherence connected motor cortex with CPU and high gamma coherence was more constrained to the basal ganglia nuclei. Power at three bands correlated with the motor activity of the animal, but only coherence values in the HFO and high gamma range correlated with movement. Interactions in the low gamma band did not show a direct relationship to movement. These results suggest that the motor effects of ketamine administration may be primarily mediated by the induction of coherent widespread high-frequency activity in the motor circuit of the basal ganglia, together with a frequency-specific pattern of connectivity among the structures analyzed. PMID:21829443
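    Assuming the standard definition of imaginary coherence, the imaginary part of the normalized cross-spectrum, a minimal estimate for two synthetic signals can be sketched with scipy as follows (all signal parameters are invented):

```python
import numpy as np
from scipy import signal

fs = 1000.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)

# Two signals sharing a 50 Hz rhythm with a phase lag, plus noise.
x = np.sin(2 * np.pi * 50 * t) + rng.normal(size=t.size)
y = np.sin(2 * np.pi * 50 * t - 0.6) + rng.normal(size=t.size)

f, pxx = signal.welch(x, fs=fs, nperseg=1024)
_, pyy = signal.welch(y, fs=fs, nperseg=1024)
_, pxy = signal.csd(x, y, fs=fs, nperseg=1024)

# Coherency = normalized cross-spectrum; its imaginary part is insensitive
# to zero-lag (e.g. volume-conducted) coupling.
icoh = np.imag(pxy / np.sqrt(pxx * pyy))
print("imaginary coherence at ~50 Hz:", icoh[np.argmin(np.abs(f - 50))].round(3))
```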

  9. Ultrahigh-resolution high-speed retinal imaging using spectral-domain optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Cense, Barry; Nassif, Nader A.; Chen, Teresa C.; Pierce, Mark C.; Yun, Seok-Hyun; Hyle Park, B.; Bouma, Brett E.; Tearney, Guillermo J.; de Boer, Johannes F.

    2004-05-01

    We present the first ultrahigh-resolution optical coherence tomography (OCT) structural intensity images and movies of the human retina in vivo at 29.3 frames per second with 500 A-lines per frame. Data was acquired at a continuous rate of 29,300 spectra per second with a 98% duty cycle. Two consecutive spectra were coherently summed to improve sensitivity, resulting in an effective rate of 14,600 A-lines per second at an effective integration time of 68 μs. The turn-key source was a combination of two superluminescent diodes with a combined spectral width of more than 150 nm providing 4.5 mW of power. The spectrometer of the spectral-domain OCT (SD-OCT) setup was centered around 885 nm with a bandwidth of 145 nm. The effective bandwidth in the eye was limited to approximately 100 nm due to increased absorption of wavelengths above 920 nm in the vitreous. Comparing the performance of our ultrahigh-resolution SD-OCT system with a conventional high-resolution time domain OCT system, the A-line rate of the spectral-domain OCT system was 59 times higher at a 5.4 dB lower sensitivity. With use of a software based dispersion compensation scheme, coherence length broadening due to dispersion mismatch between sample and reference arms was minimized. The coherence length measured from a mirror in air was equal to 4.0 μm (n = 1). The coherence length determined from the specular reflection of the foveal umbo in vivo in a healthy human eye was equal to 3.5 μm (n = 1.38). With this new system, two layers at the location of the retinal pigmented epithelium seem to be present, as well as small features in the inner and outer plexiform layers, which are believed to be small blood vessels.

  10. Abstract ID: 176 Geant4 implementation of inter-atomic interference effect in small-angle coherent X-ray scattering for materials of medical interest.

    PubMed

    Paternò, Gianfranco; Cardarelli, Paolo; Contillo, Adriano; Gambaccini, Mauro; Taibi, Angelo

    2018-01-01

    Advanced applications of digital mammography such as dual-energy and tomosynthesis require multiple exposures and thus deliver higher dose compared to standard mammograms. A straightforward manner to reduce patient dose without affecting image quality would be removal of the anti-scatter grid, provided that the involved reconstruction algorithms are able to take the scatter figure into account [1]. Monte Carlo simulations are very well suited for the calculation of X-ray scatter distribution and can be used to integrate such information within the reconstruction software. Geant4 is an open source C++ particle tracking code widely used in several physical fields, including medical physics [2,3]. However, the coherent scattering cross section used by the standard Geant4 code does not take into account the influence of molecular interference. According to the independent atomic scattering approximation (the so-called free-atom model), coherent radiation is indistinguishable from primary radiation because its angular distribution is peaked in the forward direction. Since interference effects occur between x-rays scattered by neighbouring atoms in matter, it was shown experimentally that the scatter distribution is affected by the molecular structure of the target, even in amorphous materials. The most important consequence is that the coherent scatter distribution is not peaked in the forward direction, and the position of the maximum is strongly material-dependent [4]. In this contribution, we present the implementation of a method to take into account inter-atomic interference in small-angle coherent scattering in Geant4, including a dedicated data set of suitable molecular form factor values for several materials of clinical interest. Furthermore, we present scatter images of simple geometric phantoms in which the Rayleigh contribution is rigorously evaluated. Copyright © 2017.
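    The measured molecular form factors are the contribution of the work and are not reproduced here; purely to illustrate why interference moves the coherent-scatter maximum away from the forward direction, the sketch below weights the per-electron Thomson angular term by a made-up |F(q)|^2 with a small-angle peak. All numerical values are illustrative.

```python
import numpy as np

# Coherent (Rayleigh) angular distribution ~ Thomson term x |F(q)|^2,
# with momentum transfer q = (4*pi/lambda) * sin(theta/2).
wavelength = 0.062  # nm, roughly a 20 keV mammography photon
theta = np.linspace(1e-3, np.pi / 2, 2000)
q = (4 * np.pi / wavelength) * np.sin(theta / 2)

thomson = (1 + np.cos(theta) ** 2) / 2

# Made-up form factor with an interference peak (illustrative only; real
# molecular form factors are material-dependent, tabulated data).
f2 = np.exp(-((q - 12.0) ** 2) / 8.0) + 0.05

free_atom_like = thomson               # forward-peaked
with_interference = thomson * f2       # maximum moves off the forward direction

print("free-atom max near", np.degrees(theta[np.argmax(free_atom_like)]).round(2), "deg")
print("interference max near", np.degrees(theta[np.argmax(with_interference)]).round(2), "deg")
```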

  11. Towards combined optical coherence tomography and hyper-spectral imaging for gastrointestinal endoscopy

    NASA Astrophysics Data System (ADS)

    Attendu, Xavier; Crunelle, Camille; de Sivry-Houle, Martin Poinsinet; Maubois, Billie; Urbain, Joanie; Turrell, Chloe; Strupler, Mathias; Godbout, Nicolas; Boudoux, Caroline

    2018-04-01

    Previous work has demonstrated the feasibility of combining optical coherence tomography (OCT) and hyper-spectral imaging (HSI) through a single double-clad fiber (DCF). In this proceeding we present the continued development of a system combining both modalities and capable of rapid imaging. We discuss the development of a rapidly scanning, dual-band, polygonal swept-source system which combines NIR (1260-1340 nm) and visible (450-800 nm) wavelengths. The NIR band is used for OCT imaging while visible light allows HSI. Scanning rates up to 24 kHz are reported. Furthermore, we present and discuss the fiber system used for light transport, delivery and collection, and the custom signal acquisition software. Key points include the use of a double-clad fiber coupler as well as important alignments and back-reflection management. Simultaneous and co-registered imaging with both modalities is presented in a bench-top system.

  12. Eye-motion-corrected optical coherence tomography angiography using Lissajous scanning.

    PubMed

    Chen, Yiwei; Hong, Young-Joo; Makita, Shuichi; Yasuno, Yoshiaki

    2018-03-01

    To correct eye motion artifacts in en face optical coherence tomography angiography (OCT-A) images, a Lissajous scanning method with subsequent software-based motion correction is proposed. The standard Lissajous scanning pattern is modified to be compatible with OCT-A and a corresponding motion correction algorithm is designed. The effectiveness of our method was demonstrated by comparing en face OCT-A images with and without motion correction. The method was further validated by comparing motion-corrected images with scanning laser ophthalmoscopy images, and the repeatability of the method was evaluated using a checkerboard image. A motion-corrected en face OCT-A image from a blinking case is presented to demonstrate the ability of the method to deal with eye blinking. Results show that the method can produce accurate motion-free en face OCT-A images of the posterior segment of the eye in vivo .
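    The OCT-A-compatible modification is described in the paper itself; the sketch below generates only the standard Lissajous trajectory that underlies it, two sinusoids at nearby frequencies whose beat makes the scan revisit the same locations, which is what software motion correction exploits. The drive frequencies are illustrative.

```python
import numpy as np

# Standard Lissajous scan: x and y are sinusoids at nearby frequencies,
# so the trajectory densely and repeatedly covers the field of view.
fs = 100_000            # sample rate of the scanner drive signal, Hz
fx, fy = 500.0, 480.0   # drive frequencies (values are illustrative)
t = np.arange(0, 1.0, 1 / fs)

x = np.sin(2 * np.pi * fx * t)
y = np.sin(2 * np.pi * fy * t)

# The pattern repeats with the beat period; repeated passes over the same
# retinal locations are what allow software-based motion correction.
beat_period = 1.0 / abs(fx - fy)
print(f"trajectory repeats every {beat_period * 1e3:.1f} ms")
```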

  13. GPU-Powered Coherent Beamforming

    NASA Astrophysics Data System (ADS)

    Magro, A.; Adami, K. Zarb; Hickish, J.

    2015-03-01

    Graphics processing unit (GPU)-based beamforming is a relatively unexplored area in radio astronomy, possibly due to the assumption that any such system will be severely limited by the PCIe bandwidth required to transfer data to the GPU. We have developed a CUDA-based GPU implementation of a coherent beamformer, specifically designed and optimized for deployment at the BEST-2 array, which can generate an arbitrary number of synthesized beams for a wide range of parameters. It achieves ~1.3 TFLOPs on an NVIDIA Tesla K20, approximately 10x faster than an optimized, multithreaded CPU implementation. This kernel has been integrated into two real-time, GPU-based time-domain software pipelines deployed at the BEST-2 array in Medicina: a standalone beamforming pipeline and a transient detection pipeline. We present performance benchmarks for the beamforming kernel as well as for the transient detection pipeline with beamforming capabilities, together with results of a test observation.
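    The CUDA kernel itself is not reproduced in the abstract; as a sketch of the computation such a kernel parallelizes, the following numpy code performs narrowband coherent (phase-shift-and-sum) beamforming on a simulated uniform linear array. The array geometry, frequency, and signal parameters are invented for illustration.

```python
import numpy as np

c = 3e8                      # propagation speed, m/s
f0 = 408e6                   # narrowband center frequency, Hz (illustrative)
lam = c / f0
n_ant = 32
positions = np.arange(n_ant) * lam / 2   # half-wavelength spacing

def steering_vector(theta_rad):
    """Per-antenna phase factors that align a plane wave from angle theta."""
    delays = positions * np.sin(theta_rad) / c
    return np.exp(-2j * np.pi * f0 * delays)

# Simulate a plane wave arriving from 20 degrees, plus receiver noise.
rng = np.random.default_rng(0)
arrival = steering_vector(np.deg2rad(20.0))
snapshots = arrival[:, None] * np.exp(2j * np.pi * rng.random(2000)) \
            + 0.1 * (rng.normal(size=(n_ant, 2000)) + 1j * rng.normal(size=(n_ant, 2000)))

# Coherent beamforming: conjugate-weight each antenna and sum, per look angle.
angles = np.deg2rad(np.linspace(-60, 60, 241))
power = [np.mean(np.abs(np.conj(steering_vector(a)) @ snapshots) ** 2) for a in angles]
print("beam power peaks at", np.rad2deg(angles[int(np.argmax(power))]), "degrees")
```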

  14. Key Technologies of Phone Storage Forensics Based on ARM Architecture

    NASA Astrophysics Data System (ADS)

    Zhang, Jianghan; Che, Shengbing

    2018-03-01

    Smartphones mainly run three mobile operating systems: Android, iOS, and Windows Phone. Android smartphones hold the largest market share, and their processor chips almost all use the ARM architecture. The memory address mapping mechanism of the ARM architecture differs from that of the x86 architecture. To perform forensics on an Android smartphone, we need to understand three key technologies: memory data acquisition, the mechanism for converting virtual addresses to physical addresses, and locating the system's key data. This article presents a viable solution that does not rely on the operating system API to provide a complete answer to these three issues.
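    As a simplified illustration of the second of these technologies, virtual-to-physical address conversion, the sketch below mimics the classic ARMv7 short-descriptor two-level walk for 4 KB small pages: bits 31:20 index the level-1 table, bits 19:12 the level-2 table, and bits 11:0 are the page offset. The table contents are fabricated, and real descriptors carry type and permission bits that are ignored here.

```python
# Simplified ARMv7 short-descriptor translation: VA -> L1 index -> L2 index
# -> physical page base + offset. Descriptor type/permission bits are ignored,
# and both tables below are fake data for illustration.
L1_TABLE = {0x123: 0x8000_0000}                  # L1 index -> L2 table "address"
L2_TABLES = {0x8000_0000: {0x45: 0x2F00_0000}}   # page index -> physical page base

def translate(va: int) -> int:
    l1_index = (va >> 20) & 0xFFF     # bits 31:20 select the L1 entry
    l2_index = (va >> 12) & 0xFF      # bits 19:12 select the L2 entry
    offset = va & 0xFFF               # bits 11:0 are the in-page offset
    l2_table = L1_TABLE[l1_index]               # walk level 1
    page_base = L2_TABLES[l2_table][l2_index]   # walk level 2
    return page_base | offset

va = (0x123 << 20) | (0x45 << 12) | 0xABC
print(hex(translate(va)))   # -> 0x2f000abc
```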

  15. Biology Needs Evolutionary Software Tools: Let’s Build Them Right

    PubMed Central

    Team, Galaxy; Goecks, Jeremy; Taylor, James

    2018-01-01

    Abstract Research in population genetics and evolutionary biology has always provided a computational backbone for the life sciences as a whole. Today evolutionary and population biology reasoning is essential for the interpretation of the large complex datasets that are characteristic of all domains of today’s life sciences, ranging from cancer biology to microbial ecology. This situation makes algorithms and software tools developed by our community more important than ever before. This means that we, developers of software tools for molecular evolutionary analyses, now have a shared responsibility to make these tools accessible using modern technological developments as well as provide adequate documentation and training. PMID:29688462

  16. Software modifications to the Demonstration Advanced Avionics Systems (DAAS)

    NASA Technical Reports Server (NTRS)

    Nedell, B. F.; Hardy, G. H.

    1984-01-01

    Critical information required for the design of integrated avionics suitable for general aviation is applied towards software modifications for the Demonstration Advanced Avionics System (DAAS). The program emphasizes the use of data busing, distributed microprocessors, shared electronic displays and data entry devices, and improved functional capability. A demonstration advanced avionics system (DAAS) is designed, built, and flight tested in a Cessna 402, twin engine, general aviation aircraft. Software modifications are made to DAAS at Ames concurrent with the flight test program. The changes are the result of the experience obtained with the system at Ames, and the comments of the pilots who evaluated the system.

  17. An Inquiry into the Cost of Post Deployment Software Support (PDSS)

    DTIC Science & Technology

    1989-09-01

    The increasing cost of software maintenance is taking a larger share of the military budget each year... increments as needed (3:59). The second page of the Form 75 starts with a section stating how the hours, and consequently the funds, will be allocated to... length of time required, the timeline can be in hourly, weekly, monthly, or quarterly increments. Some milestones included are formal approval, test

  18. Hazardous Environment Robotics

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Jet Propulsion Laboratory (JPL) developed video overlay calibration and demonstration techniques for ground-based telerobotics. Through a technology sharing agreement with JPL, Deneb Robotics added this as an option to its robotics software, TELEGRIP. The software is used for remotely operating robots in nuclear and hazardous environments in industries including automotive and medical. The option allows the operator to utilize video to calibrate 3-D computer models with the actual environment, and thus plan and optimize robot trajectories before the program is automatically generated.

  19. Experiences using OpenMP based on Compiler Directed Software DSM on a PC Cluster

    NASA Technical Reports Server (NTRS)

    Hess, Matthias; Jost, Gabriele; Mueller, Matthias; Ruehle, Roland

    2003-01-01

    In this work we report on our experiences running OpenMP programs on a commodity cluster of PCs running a software distributed shared memory (DSM) system. We describe our test environment and report on the performance of a subset of the NAS Parallel Benchmarks that have been automatically parallelized for OpenMP. We compare the performance of the OpenMP implementations with that of their message passing counterparts and discuss performance differences.

  20. Toward Deriving Software Architectures from Quality Attributes

    DTIC Science & Technology

    1994-08-01

    ...environments rely on the notion of a "tool bus" or an explicit shared repository [Wasserman 89] to allow easy integration of tools. 4.7 Unit...attributed parse tree and symbol table that the compiler creates and annotates during its various phases. This results in a very different software

  1. What an open source clinical trial community can learn from hackers

    PubMed Central

    Dunn, Adam G.; Day, Richard O.; Mandl, Kenneth D.; Coiera, Enrico

    2014-01-01

    Summary Open sharing of clinical trial data has been proposed as a way to address the gap between the production of clinical evidence and the decision-making of physicians. Since a similar gap has already been addressed in the software industry by the open source software movement, we examine how the social and technical principles of the movement can be used to guide the growth of an open source clinical trial community. PMID:22553248

  2. Exploring the Role of Value Networks for Software Innovation

    NASA Astrophysics Data System (ADS)

    Morgan, Lorraine; Conboy, Kieran

    This paper describes research in progress that aims to explore the applicability and implications of open innovation practices in two firms - one that employs agile development methods and another that utilizes open source software. The open innovation paradigm has a lot in common with open source and agile development methodologies. A particular strength of agile approaches is that they move away from 'introverted' development, involving only the development personnel, and intimately involve the customer in all areas of software creation, supposedly leading to the development of a more innovative and hence more valuable information system. Open source software (OSS) development also shares two key elements of the open innovation model, namely the collaborative development of the technology and shared rights to the use of the technology. However, one shortfall with agile development in particular is the narrow focus on a single customer representative. In response to this, we argue that current thinking regarding innovation needs to be extended to include multiple stakeholders both across and outside the organization. Additionally, for firms utilizing open source, it has been found that their position in a network of potential complementors determines the amount of superior value they create for their customers. Thus, this paper aims to gain a better understanding of the applicability and implications of open innovation practices in firms that employ open source and agile development methodologies. In particular, a conceptual framework is derived for further testing.

  3. Award ER25750: Coordinated Infrastructure for Fault Tolerance Systems Indiana University Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lumsdaine, Andrew

    2013-03-08

    The main purpose of the Coordinated Infrastructure for Fault Tolerance in Systems initiative has been to conduct research with a goal of providing end-to-end fault tolerance on a systemwide basis for applications and other system software. While fault tolerance has been an integral part of most high-performance computing (HPC) system software developed over the past decade, it has been treated mostly as a collection of isolated stovepipes. Visibility and response to faults have typically been limited to the particular hardware and software subsystems in which they are initially observed. Little fault information is shared across subsystems, allowing little flexibility or control on a system-wide basis, making it practically impossible to provide cohesive end-to-end fault tolerance in support of scientific applications. As an example, consider faults such as communication link failures that can be seen by a network library but are not directly visible to the job scheduler, or consider faults related to node failures that can be detected by system monitoring software but are not inherently visible to the resource manager. If information about such faults could be shared by the network libraries or monitoring software, then other system software, such as a resource manager or job scheduler, could ensure that failed nodes or failed network links were excluded from further job allocations and that further diagnosis could be performed. As a founding member and one of the lead developers of the Open MPI project, our efforts over the course of this project have been focused on making Open MPI more robust to failures by supporting various fault tolerance techniques, and using fault information exchange and coordination between MPI and the HPC system software stack from the application, numeric libraries, and programming language runtime to other common system components such as job schedulers, resource managers, and monitoring tools.

  4. Software for a GPS-Reflection Remote-Sensing System

    NASA Technical Reports Server (NTRS)

    Lowe, Stephen

    2003-01-01

    A special-purpose software Global Positioning System (GPS) receiver designed for remote sensing with reflected GPS signals is described in Delay/Doppler-Mapping GPS-Reflection Remote-Sensing System (NPO-30385), which appears elsewhere in this issue of NASA Tech Briefs. The input accepted by this program comprises raw (open-loop) digitized GPS signals sampled at a rate of about 20 MHz. The program processes the data samples to perform the following functions: detection of signals; tracking of phases and delays; mapping of delay, Doppler, and delay/Doppler waveforms; dual-frequency processing; coherent integrations as short as 125 μs; decoding of navigation messages; and precise time tagging of observable quantities. The software can perform these functions on all detectable satellite signals without dead time. Open-loop data collected over water, land, or ice and processed by this software can be further processed to extract geophysical information. Possible examples include mean sea height, wind speed and direction, and significant wave height (for observations over the ocean); bistatic-radar terrain images and measures of soil moisture and biomass (for observations over land); and estimates of ice age, thickness, and surface density (for observations over ice).

  5. Batching System for Superior Service

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Veridian's Portable Batch System (PBS) was the recipient of the 1997 NASA Space Act Award for outstanding software. A batch system is a set of processes for managing queues and jobs. Without a batch system, it is difficult to manage the workload of a computer system. By bundling the enterprise's computing resources, the PBS technology offers users a single coherent interface, resulting in efficient management of the batch services. Users choose which information to package into "containers" for system-wide use. PBS also provides detailed system usage data, a procedure not easily executed without this software. PBS operates on networked, multi-platform UNIX environments. Veridian's new version, PBS Pro™, has additional features and enhancements, including support for additional operating systems. Veridian distributes the original version of PBS as Open Source software via the PBS website. Customers can register and download the software at no cost. PBS Pro is also available via the web and offers additional features such as increased stability, reliability, and fault tolerance. A company using PBS can expect a significant increase in the effective management of its computing resources. Tangible benefits include increased utilization of costly resources and enhanced understanding of computational requirements and user needs.

  6. Raising Virtual Laboratories in Australia onto global platforms

    NASA Astrophysics Data System (ADS)

    Wyborn, L. A.; Barker, M.; Fraser, R.; Evans, B. J. K.; Moloney, G.; Proctor, R.; Moise, A. F.; Hamish, H.

    2016-12-01

    Across the globe, Virtual Laboratories (VLs), Science Gateways (SGs), and Virtual Research Environments (VREs) are being developed that enable users who are not co-located to actively work together at various scales to share data, models, tools, software, workflows, best practices, etc. Outcomes range from enabling `long tail' researchers to more easily access specific data collections, to facilitating complex workflows on powerful supercomputers. In Australia, government funding has facilitated the development of a range of VLs through the National eResearch Collaborative Tools and Resources (NeCTAR) program. The VLs provide highly collaborative, research-domain oriented, integrated software infrastructures that meet user community needs. Twelve VLs have been funded since 2012, including the Virtual Geophysics Laboratory (VGL); Virtual Hazards, Impact and Risk Laboratory (VHIRL); Climate and Weather Science Laboratory (CWSLab); Marine Virtual Laboratory (MarVL); and Biodiversity and Climate Change Virtual Laboratory (BCCVL). These VLs share similar technical challenges, with common issues emerging on integration of tools and applications and access to data collections via both cloud-based environments and other distributed resources. While each VL began with a focus on a specific research domain, communities of practice have now formed across the VLs around common issues, and facilitate identification of best practice case studies and new standards. As a result, tools are now being shared where the VLs access data via data services using international standards such as ISO, OGC, W3C. The sharing of these approaches is starting to facilitate re-usability of infrastructure and is a step towards supporting interdisciplinary research. Whilst the focus of the VLs is Australia-centric, by using standards, these environments are able to be extended to analysis on other international datasets. Many VL datasets are subsets of global datasets and so extension to global is a small (and often requested) step. Similarly, most of the tools, software, and other technologies could be shared across infrastructures globally. Therefore, it is now time to better connect the Australian VLs with similar initiatives elsewhere to create international platforms that can contribute to global research challenges.

  7. Enabling Web-Based Analysis of CUAHSI HIS Hydrologic Data Using R and Web Processing Services

    NASA Astrophysics Data System (ADS)

    Ames, D. P.; Kadlec, J.; Bayles, M.; Seul, M.; Hooper, R. P.; Cummings, B.

    2015-12-01

    The CUAHSI Hydrologic Information System (CUAHSI HIS) provides open access to a large number of observed and modeled hydrological time series datasets from many parts of the world. Several software tools have been designed to simplify searching and access to the CUAHSI HIS datasets. These software tools include: Desktop client software (HydroDesktop, HydroExcel), developer libraries (WaterML R Package, OWSLib, ulmo), and the new interactive search website, http://data.cuahsi.org. An issue with using the time series data from CUAHSI HIS for further analysis by hydrologists (for example for verification of hydrological and snowpack models) is the large heterogeneity of the time series data. The time series may be regular or irregular, contain missing data, have different time support, and be recorded in different units. R is a widely used computational environment for statistical analysis of time series and spatio-temporal data that can be used to assess fitness and perform scientific analyses on observation data. R includes the ability to record a data analysis in the form of a reusable script. The R script together with the input time series dataset can be shared with other users, making the analysis more reproducible. The major goal of this study is to examine the use of R as a Web Processing Service for transforming time series data from the CUAHSI HIS and sharing the results on the Internet within HydroShare. HydroShare is an online data repository and social network for sharing large hydrological data sets such as time series, raster datasets, and multi-dimensional data. It can be used as a permanent cloud storage space for saving the time series analysis results. We examine the issues associated with running R scripts online, including code validation, saving of outputs, reporting progress, and provenance management. An explicit goal is that a script run locally should produce exactly the same results as the script run on the Internet. Our design can be used as a model for other studies that need to run R scripts on the web.
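
    As a small illustration of the heterogeneity problem, a typical first step in such a script is harmonizing an irregular series onto a regular grid with unit conversion. The sketch below uses Python/pandas purely for illustration (the project itself works in R), and the observation values, timestamps, and conversion factor are invented.

      import pandas as pd

      # Hypothetical irregular discharge observations (unevenly spaced)
      obs = pd.Series(
          [1.2, 1.5, 1.4, 2.0],
          index=pd.to_datetime(
              ["2015-01-01 00:00", "2015-01-01 00:40",
               "2015-01-01 02:10", "2015-01-01 03:00"]
          ),
      )

      # Resample onto a regular hourly grid, interpolate short gaps,
      # and convert units (here cubic feet/s to cubic metres/s).
      hourly = obs.resample("1h").mean().interpolate(limit=2) * 0.0283168
      print(hourly)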

  8. Biotechnology software in the digital age: are you winning?

    PubMed

    Scheitz, Cornelia Johanna Franziska; Peck, Lawrence J; Groban, Eli S

    2018-01-16

    There is a digital revolution taking place and biotechnology companies are slow to adapt. Many pharmaceutical, biotechnology, and industrial bio-production companies believe that software must be developed and maintained in-house and that data are more secure on internal servers than on the cloud. In fact, most companies in this space continue to employ large IT and software teams and acquire computational infrastructure in the form of in-house servers. This is due to a fear of the cloud not sufficiently protecting in-house resources and the belief that their software is valuable IP. Over the next decade, the ability to adapt quickly to changing market conditions, with agile software teams, will become a compelling competitive advantage. Biotechnology companies that do not adopt the new regime may lose on key business metrics such as return on invested capital, revenue, profitability, and eventually market share.

  9. Beyond formalism

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1991-01-01

    The ongoing debate over the role of formalism and formal specifications in software features many speakers with diverse positions. Yet, in the end, they share the conviction that the requirements of a software system can be unambiguously specified, that acceptable software is a product demonstrably meeting the specifications, and that the design process can be carried out with little interaction between designers and users once the specification has been agreed to. This conviction is part of a larger paradigm prevalent in American management thinking, which holds that organizations are systems that can be precisely specified and optimized. This paradigm, which traces historically to the works of Frederick Taylor in the early 1900s, is no longer sufficient for organizations and software systems today. In the domain of software, a new paradigm, called user-centered design, overcomes the limitations of pure formalism. Pioneered in Scandinavia, user-centered design is spreading through Europe and is beginning to make its way into the U.S.

  10. TreeRipper web application: towards a fully automated optical tree recognition software.

    PubMed

    Hughes, Joseph

    2011-05-20

    Relationships between species, genes and genomes have been printed as trees for over a century. Whilst this may have been the best format for exchanging and sharing phylogenetic hypotheses during the 20th century, the worldwide web now provides faster and automated ways of transferring and sharing phylogenetic knowledge. However, novel software is needed to defrost these published phylogenies for the 21st century. TreeRipper is a simple website for the fully-automated recognition of multifurcating phylogenetic trees (http://linnaeus.zoology.gla.ac.uk/~jhughes/treeripper/). The program accepts a range of input image formats (PNG, JPG/JPEG or GIF). The underlying command-line C++ program follows a number of cleaning steps to detect lines, remove node labels, patch up broken lines and corners, and detect line edges. The edge contour is then determined to detect the branch lengths, tip label positions and the topology of the tree. Optical Character Recognition (OCR) is used to convert the tip labels into text with the freely available tesseract-ocr software. 32% of images meeting the prerequisites for TreeRipper were successfully recognised; the largest tree had 115 leaves. Despite the diversity of ways phylogenies have been illustrated making the design of fully automated tree recognition software difficult, TreeRipper is a step towards automating the digitization of past phylogenies. We also provide a dataset of 100 tree images and associated tree files for training and/or benchmarking future software. TreeRipper is an open source project licensed under the GNU General Public Licence v3.

  11. Software and hardware infrastructure for research in electrophysiology

    PubMed Central

    Mouček, Roman; Ježek, Petr; Vařeka, Lukáš; Řondík, Tomáš; Brůha, Petr; Papež, Václav; Mautner, Pavel; Novotný, Jiří; Prokop, Tomáš; Štěbeták, Jan

    2014-01-01

    As in other areas of experimental science, the operation of an electrophysiological laboratory, the design and performance of electrophysiological experiments, the collection, storage and sharing of experimental data and metadata, the analysis and interpretation of these data, and the publication of results are time consuming activities. If these activities are well organized and supported by a suitable infrastructure, the work efficiency of researchers increases significantly. This article deals with the main concepts, design, and development of software and hardware infrastructure for research in electrophysiology. The described infrastructure has been primarily developed for the needs of the neuroinformatics laboratory at the University of West Bohemia, the Czech Republic. However, from the beginning it has also been designed and developed to be open and applicable in laboratories that do similar research. After introducing the laboratory and the whole architectural concept, the individual parts of the infrastructure are described. The central element of the software infrastructure is a web-based portal that enables community researchers to store, share, download and search data and metadata from electrophysiological experiments. The data model, domain ontology and usage of semantic web languages and technologies are described. The current data publication policy used in the portal is briefly introduced. The registration of the portal within the Neuroscience Information Framework is described. Then the methods used for processing of electrophysiological signals are presented. The specific modifications of these methods introduced by laboratory researchers are summarized; the methods are organized into a laboratory workflow. Other parts of the software infrastructure include mobile and offline solutions for data/metadata storing and a hardware stimulator communicating with an EEG amplifier and recording software. PMID:24639646

  12. Software and hardware infrastructure for research in electrophysiology.

    PubMed

    Mouček, Roman; Ježek, Petr; Vařeka, Lukáš; Rondík, Tomáš; Brůha, Petr; Papež, Václav; Mautner, Pavel; Novotný, Jiří; Prokop, Tomáš; Stěbeták, Jan

    2014-01-01

    As in other areas of experimental science, the operation of an electrophysiological laboratory, the design and performance of electrophysiological experiments, the collection, storage and sharing of experimental data and metadata, the analysis and interpretation of these data, and the publication of results are time consuming activities. If these activities are well organized and supported by a suitable infrastructure, the work efficiency of researchers increases significantly. This article deals with the main concepts, design, and development of software and hardware infrastructure for research in electrophysiology. The described infrastructure has been primarily developed for the needs of the neuroinformatics laboratory at the University of West Bohemia, the Czech Republic. However, from the beginning it has also been designed and developed to be open and applicable in laboratories that do similar research. After introducing the laboratory and the whole architectural concept, the individual parts of the infrastructure are described. The central element of the software infrastructure is a web-based portal that enables community researchers to store, share, download and search data and metadata from electrophysiological experiments. The data model, domain ontology and usage of semantic web languages and technologies are described. The current data publication policy used in the portal is briefly introduced. The registration of the portal within the Neuroscience Information Framework is described. Then the methods used for processing of electrophysiological signals are presented. The specific modifications of these methods introduced by laboratory researchers are summarized; the methods are organized into a laboratory workflow. Other parts of the software infrastructure include mobile and offline solutions for data/metadata storing and a hardware stimulator communicating with an EEG amplifier and recording software.

  13. Personalized, Shareable Geoscience Dataspaces For Simplifying Data Management and Improving Reproducibility

    NASA Astrophysics Data System (ADS)

    Malik, T.; Foster, I.; Goodall, J. L.; Peckham, S. D.; Baker, J. B. H.; Gurnis, M.

    2015-12-01

    Research activities are iterative, collaborative, and now data- and compute-intensive. Such research activities mean that even the many researchers who work in small laboratories must often create, acquire, manage, and manipulate much diverse data and keep track of complex software. They face difficult data and software management challenges, and data sharing and reproducibility are neglected. There is significant federal investment in powerful cyberinfrastructure, in part to lessen the burden associated with modern data- and compute-intensive research. Similarly, geoscience communities are establishing research repositories to facilitate data preservation. Yet we observe that a large fraction of the geoscience community continues to struggle with data and software management. The reason, studies suggest, is not lack of awareness but rather that tools do not adequately support time-consuming data life cycle activities. Through the NSF/EarthCube-funded GeoDataspace project, we are building personalized, shareable dataspaces that help scientists connect their individual or research group efforts with the community at large. The dataspaces provide a light-weight multiplatform research data management system with tools for recording research activities in what we call geounits, so that a geoscientist can at any time snapshot and preserve, both for their own use and to share with the community, all data and code required to understand and reproduce a study. A software-as-a-service (SaaS) deployment model enhances usability of core components, and integration with widely used software systems. In this talk we will present the open-source GeoDataspace project and demonstrate how it is enabling reproducibility across the geoscience domains of hydrology, space science, and modeling toolkits.

  14. DigR: a generic model and its open source simulation software to mimic three-dimensional root-system architecture diversity.

    PubMed

    Barczi, Jean-François; Rey, Hervé; Griffon, Sébastien; Jourdan, Christophe

    2018-04-18

    Many studies exist in the literature dealing with mathematical representations of root systems, categorized, for example, as pure structure descriptions, partial differential equations or functional-structural plant models. However, in these studies, root architecture modelling has seldom been carried out at the organ level with the inclusion of environmental influences that can be integrated into a whole plant characterization. We have conducted a multidisciplinary study on root systems including field observations, architectural analysis, and formal and mathematical modelling. This integrative and coherent approach leads to a generic model (DigR) and its software simulator. Architecture analysis applied to root systems helps with root type classification and architectural unit design for each species. Roots belonging to a particular type share dynamic and morphological characteristics which consist of topological and geometric features. The DigR simulator is integrated into the Xplo environment, with a user interface to input parameter values and make output ready for dynamic 3-D visualization, statistical analysis and saving to standard formats. DigR is simulated with a quasi-parallel computing algorithm and may be used either as a standalone tool or integrated into other simulation platforms. The software is open-source and free to download at http://amapstudio.cirad.fr/soft/xplo/download. DigR is based on three key points: (1) a root-system architectural analysis, (2) root type classification and modelling and (3) a restricted set of 23 root type parameters with flexible values indexed in terms of root position. Genericity and botanical accuracy of the model are demonstrated for growth, branching, mortality and reiteration processes, and for different root architectures. Plugin examples demonstrate the model's versatility at simulating plastic responses to environmental constraints. Outputs of the model include diverse root system structures such as tap-root, fasciculate, tuberous, nodulated and clustered root systems. DigR is based on plant architecture analysis which leads to specific root type classification and organization that are directly linked to field measurements. The open source simulator of the model has been included within a friendly user environment. DigR accuracy and versatility are demonstrated for growth simulations of complex root systems for both annual and perennial plants.

  15. OASIS: a data and software distribution service for Open Science Grid

    NASA Astrophysics Data System (ADS)

    Bockelman, B.; Caballero Bejar, J.; De Stefano, J.; Hover, J.; Quick, R.; Teige, S.

    2014-06-01

    The Open Science Grid encourages the concept of software portability: a user's scientific application should be able to run at as many sites as possible. It is necessary to provide a mechanism for OSG Virtual Organizations to install software at sites. Since its initial release, the OSG Compute Element has provided an application software installation directory to Virtual Organizations, where they can create their own sub-directory, install software into that sub-directory, and have the directory shared on the worker nodes at that site. The current model has shortcomings with regard to permissions, policies, versioning, and the lack of a unified, collective procedure or toolset for deploying software across all sites. Therefore, a new mechanism for data and software distribution is desirable. The architecture for the OSG Application Software Installation Service (OASIS) is a server-client model: the software and data are installed only once in a single place, and are automatically distributed to all client sites simultaneously. Central file distribution offers other advantages, including server-side authentication and authorization, activity records, quota management, data validation and inspection, and well-defined versioning and deletion policies. The architecture, as well as a complete analysis of the current implementation, will be described in this paper.

  16. Neurophysiological correlates of eye movement desensitization and reprocessing sessions: preliminary evidence for traumatic memories integration.

    PubMed

    Farina, Benedetto; Imperatori, Claudio; Quintiliani, Maria I; Castelli Gattinara, Paola; Onofri, Antonio; Lepore, Marta; Brunetti, Riccardo; Losurdo, Anna; Testani, Elisa; Della Marca, Giacomo

    2015-11-01

    We have investigated the potential role of eye movement desensitization and reprocessing (EMDR) in enhancing the integration of traumatic memories by measuring EEG coherence, power spectra and autonomic variables before (pre-EMDR) and after (post-EMDR) EMDR sessions during the recall of the patient's traumatic memory. Thirteen EMDR sessions of six patients with post-traumatic stress disorder were recorded. EEG analyses were conducted by means of the standardized Low Resolution Electric Tomography (sLORETA) software. Power spectra, EEG coherence and heart rate variability (HRV) were compared between pre- and post-EMDR sessions. After EMDR, we observed a significant increase of alpha power in the left inferior temporal gyrus (T = 3.879; P = 0.041) and an increased EEG coherence in the beta band between the C3 and T5 electrodes (T = 6.358; P < 0.001). Furthermore, a significant increase of HRV in the post-EMDR sessions was also observed (pre-EMDR: 6.38 ± 6.83; post-EMDR: 2.46 ± 2.95; U-Test = 45, P = 0.043). Finally, the values of lagged coherence were negatively associated with subjective units of disturbance (r(24) = -0.44, P < 0.05) and positively associated with parasympathetic activity (r(24) = 0.40, P < 0.05). Our results suggest that EMDR leads to an integration of dissociated aspects of traumatic memories and, consequently, a decrease of hyperarousal symptoms. © 2014 Scandinavian Society of Clinical Physiology and Nuclear Medicine. Published by John Wiley & Sons Ltd.

  17. REPRODUCIBILITY OF VESSEL DENSITY MEASUREMENT WITH OPTICAL COHERENCE TOMOGRAPHY ANGIOGRAPHY IN EYES WITH AND WITHOUT RETINOPATHY.

    PubMed

    You, Qisheng; Freeman, William R; Weinreb, Robert N; Zangwill, Linda; Manalastas, Patricia I C; Saunders, Luke J; Nudleman, Eric

    2017-08-01

    To determine the intravisit and intervisit reproducibility of optical coherence tomography angiography measurements of macular vessel density in eyes with and without retinal diseases. Fifteen healthy volunteers and 22 patients with retinal diseases underwent repeated optical coherence tomography angiography (Angiovue Imaging System, Optovue Inc) scans after pupil dilation on 2 separate visit days. For each visit day, the eyes were scanned twice. Vessel density, defined as the proportion of vessel area with flowing blood over the total measurement area, was calculated using Angiovue software. Intravisit and intervisit reproducibility were summarized as coefficients of variation, and intraclass correlation coefficients were calculated from variance component models. The coefficients of variation representing the intravisit reproducibility of the superficial macular vessel density measurements for different quadrants on 3 mm × 3-mm scans varied from 2.1% to 4.9% and 3.4% to 6.8% for healthy and diseased eyes, respectively, and for the intervisit reproducibility they were 2.9% to 5.1% and 4.0% to 6.8%, respectively. The coefficients of variation were lower in healthy eyes than in diseased eyes, lower for intravisit than for intervisit, lower on 3 mm × 3-mm scans than on 6 mm × 6-mm scans, and lower for paracentral subfields than for the central subfield. The evidence presented here demonstrates good reproducibility of optical coherence tomography angiography for measurement of superficial macular vessel density in both healthy eyes and eyes with diabetic retinopathy without diabetic macular edema.
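
    For reference, the coefficient of variation for repeated scans of a single eye is simply the standard deviation of the repeated measurements expressed as a percentage of their mean; the study's estimates come from variance component models, so the sketch below shows only the naive per-eye calculation, with invented values.

      import statistics

      def coefficient_of_variation(measurements: list[float]) -> float:
          """CV (%) of repeated vessel-density measurements for one eye."""
          mean = statistics.fmean(measurements)
          return 100.0 * statistics.stdev(measurements) / mean

      # Hypothetical same-day (intravisit) vessel-density scans, in percent
      print(coefficient_of_variation([52.1, 50.4]))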

  18. Architected Agile Solutions for Software-Reliant Systems

    NASA Astrophysics Data System (ADS)

    Boehm, Barry; Lane, Jo Ann; Koolmanojwong, Supannika; Turner, Richard

    Systems are becoming increasingly reliant on software due to needs for rapid fielding of “70% capabilities,” interoperability, net-centricity, and rapid adaptation to change. The latter need has led to increased interest in agile methods of software development, in which teams rely on shared tacit interpersonal knowledge rather than explicit documented knowledge. However, such systems often need to be scaled up to higher levels of performance and assurance, requiring stronger architectural support. Several organizations have recently transformed themselves by developing successful combinations of agility and architecture that can scale to projects of up to 100 personnel. This chapter identifies a set of key principles for such architected agile solutions for software-reliant systems, provides guidance for how much architecting is enough, and illustrates the key principles with several case studies.

  19. Standardizing Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Nathan A.; Klemm, Juli; Harper, Stacey

    2013-02-05

    To enable the rational design of nanomaterials for improved efficacy and safety, it is critical to understand and exploit the physicochemical properties that drive biological behavior (Morris, 2010). Data mining and computer simulation are essential for deriving information about nanomaterial behavior; however, the datasets needed to support such studies are sparse and stored across a variety of repositories and resources. Schrurs and Lison (2012) have expressed the need for more coherence and structure in the conduct of nanotechnology research. Additionally, the lack of common reporting standards and non-uniformity of information reported have proven to be significant barriers to such data sharing and re-use. The Nanotechnology Working Group (Nano WG), of the US National Institutes of Health National Cancer Informatics Program, has been focused on addressing these barriers. The Nano WG - which includes representatives from over 20 organizations including government agencies, academia, industry, standards organizations, and alliances - has developed ISA-TAB-Nano (Thomas et al, 2013), a general framework for representing and integrating diverse types of data related to the description and characterization of nanomaterials. Recognizing that nanoparticle characterization studies have many of the same challenges as ‘omics-based assays, the Nano WG joined the ISA Commons (Sansone et al., 2012) to leverage lessons learned in ‘omics data sharing. The ISA Commons community brings together 50 collaborators at over 30 scientific organizations around the globe, including regulatory and industrial participants in an increasingly diverse set of life science domains. At the core of the ISA Commons is the ISA metadata tracking framework, which forms the basis for the ISA-TAB-Nano extension. The extension of the ISA framework to the nanotechnology domain illustrates the power of a synergistic approach that seeks the interoperability of data across multiple research disciplines. To increase adoption, especially in the commercial arena from vendors and manufacturers, the ISA-TAB-Nano data-sharing specification has also been submitted for consideration as a standard to the American Society for Testing and Materials (ASTM). Delivering a community-driven specification is only the first phase of the process. To be useful and used, ISA-TAB-Nano must be implemented by tools and databases to assist researchers in reporting their data accordingly, shielding them from unnecessary complexity. Our next step is to extend the open source ISA Software Suite to provide user-oriented tools for the collection, curation, and storage of data compliant with the ISA-TAB-Nano specification. Future work will also focus on the application of the ISA-TAB-Nano format to support emerging standards on minimal information about nanomaterials in biological research (Ostraat et al, 2012; MinChar). ISA-TAB-Nano development is a community-driven effort and we welcome new contributions, collaborations and domain expertise. We invite researchers, software developers, vendors, and other stakeholders to work with us to implement the ISA-TAB-Nano format in their existing systems and research. Likewise, we welcome engagement of regulators, funding agencies, editors, and other policy makers to discuss how this standard can be used to facilitate the sharing and reuse of nanotechnology data across a wide range of disciplines. More information about the ISA-TAB-Nano project can be found online at https://wiki.nci.nih.gov/display/ICR/ISA-TAB-Nano.

  20. The social disutility of software ownership.

    PubMed

    Douglas, David M

    2011-09-01

    Software ownership allows the owner to restrict the distribution of software and to prevent others from reading the software's source code and building upon it. However, free software is released to users under software licenses that give them the right to read the source code, modify it, reuse it, and distribute the software to others. Proponents of free software such as Richard M. Stallman and Eben Moglen argue that the social disutility of software ownership is a sufficient justification for prohibiting it. This social disutility includes the social instability of disregarding laws and agreements covering software use and distribution, inequality of software access, and the inability to help others by sharing software with them. Here I consider these and other social disutility claims against withholding specific software rights from users, in particular, the rights to read the source code, duplicate, distribute, modify, imitate, and reuse portions of the software within new programs. I find that generally while withholding these rights from software users does cause some degree of social disutility, only the rights to duplicate, modify and imitate cannot legitimately be denied to users on this basis. The social disutility of withholding the rights to distribute the software, read its source code and reuse portions of it in new programs is insufficient to prohibit software owners from denying them to users. A compromise between the software owner and user can minimise the social disutility of withholding these particular rights from users. However, the social disutility caused by software patents is sufficient for rejecting such patents as they restrict the methods of reducing social disutility possible with other forms of software ownership.

  1. Doppler optical coherence tomography of retinal circulation.

    PubMed

    Tan, Ou; Wang, Yimin; Konduru, Ranjith K; Zhang, Xinbo; Sadda, SriniVas R; Huang, David

    2012-09-18

    Noncontact retinal blood flow measurements are performed with a Fourier domain optical coherence tomography (OCT) system using a circumpapillary double circular scan (CDCS) that scans around the optic nerve head at 3.40 mm and 3.75 mm diameters. The double concentric circles are performed 6 times consecutively over 2 sec. The CDCS scan is saved with Doppler shift information from which flow can be calculated. The standard clinical protocol calls for 3 CDCS scans made with the OCT beam passing through the superonasal edge of the pupil and 3 CDCS scans through the inferonasal edge. This double-angle protocol ensures that an acceptable Doppler angle is obtained on each retinal branch vessel in at least 1 scan. The CDCS scan data, a 3-dimensional volumetric OCT scan of the optic disc, and a color photograph of the optic disc are used together to obtain a retinal blood flow measurement for an eye. We have developed blood flow measurement software called "Doppler optical coherence tomography of retinal circulation" (DOCTORC). This semi-automated software is used to measure total retinal blood flow, vessel cross-sectional area, and average blood velocity. The flow of each vessel is calculated from the Doppler shift in the vessel cross-sectional area and the Doppler angle between the vessel and the OCT beam. Total retinal blood flow is summed from the veins around the optic disc. The results obtained at our Doppler OCT reading center showed good reproducibility between graders and methods (<10%). Total retinal blood flow could be useful in the management of glaucoma and other retinal diseases. In glaucoma patients, OCT retinal blood flow measurement was highly correlated with visual field loss (R(2)>0.57 with visual field pattern deviation). Doppler OCT is a new method to perform rapid, noncontact, and repeatable measurement of total retinal blood flow using widely available Fourier-domain OCT instrumentation. This new technology may improve the practicality of making these measurements in clinical studies and routine clinical practice.
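
    In Doppler OCT generally, the axial velocity component follows from the measured Doppler frequency shift, the center wavelength, and the refractive index, and total flow is the angle-corrected velocity integrated over the vessel cross-section. The Python sketch below encodes that textbook relationship; it is not the DOCTORC code, and all parameter values are illustrative.

      import math

      def axial_velocity(doppler_shift_hz: float, wavelength_m: float,
                         refractive_index: float = 1.38) -> float:
          """Velocity component along the beam, v_z = f_D * lambda / (2 n)."""
          return doppler_shift_hz * wavelength_m / (2.0 * refractive_index)

      def vessel_flow(doppler_shift_hz: float, wavelength_m: float,
                      doppler_angle_rad: float, lumen_area_m2: float) -> float:
          """Angle-corrected velocity times the vessel cross-sectional area."""
          v = axial_velocity(doppler_shift_hz, wavelength_m) / math.cos(doppler_angle_rad)
          return v * lumen_area_m2

      # Illustrative: 2 kHz mean shift at 840 nm, 80 degree Doppler angle,
      # 70 micrometre vessel diameter
      area = math.pi * (35e-6) ** 2
      print(vessel_flow(2000.0, 840e-9, math.radians(80), area))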

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, J.P.; Bangs, A.L.; Butler, P.L.

    Hetero Helix is a programming environment which simulates shared memory on a heterogeneous network of distributed-memory computers. The machines in the network may vary with respect to their native operating systems and internal representation of numbers. Hetero Helix presents a simple programming model to developers, and also considers the needs of designers, system integrators, and maintainers. The key software technology underlying Hetero Helix is the use of a "compiler" which analyzes the data structures in shared memory and automatically generates code which translates data representations from the format native to each machine into a common format, and vice versa. The design of Hetero Helix was motivated in particular by the requirements of robotics applications. Hetero Helix has been used successfully in an integration effort involving 27 CPUs in a heterogeneous network and a body of software totaling roughly 100,000 lines of code. 25 refs., 6 figs.
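
    The representation-translation idea at the heart of such a system can be pictured with Python's struct module: every host packs shared records into one agreed-on wire format with explicit byte order and field sizes, and unpacks into its native layout on receipt. This is a toy stand-in for the translation code Hetero Helix generates, with an invented record layout.

      import struct

      # Agreed-on common format: big-endian ('>'), explicit field sizes,
      # independent of any machine's native representation.
      COMMON = ">id8s"  # int32, float64, 8-byte fixed string

      def to_common(seq: int, value: float, tag: str) -> bytes:
          """Encode a shared-memory record into the common network format."""
          return struct.pack(COMMON, seq, value, tag.encode().ljust(8))

      def from_common(blob: bytes):
          """Decode the common format back into native values on any host."""
          seq, value, tag = struct.unpack(COMMON, blob)
          return seq, value, tag.rstrip(b"\x00 ").decode()

      print(from_common(to_common(7, 3.14, "arm")))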

  3. New directions in virtual environments and gaming to address obesity and diabetes: industry perspective.

    PubMed

    Ruppert, Barb

    2011-03-01

    Virtual reality is increasingly used for education and treatment in the fields of health and medicine. What is the health potential of virtual reality technology from the software development industry perspective? This article presents interviews with Ben Sawyer of Games for Health, Dr. Walter Greenleaf of InWorld Solutions, and Dr. Ernie Medina of MedPlay Technologies. Games for Health brings together researchers, medical professionals, and game developers to share information on the impact that game technologies can have on health, health care, and policy. InWorld is an Internet-based virtual environment designed specifically for behavioral health care. MedPlay Technologies develops wellness training programs that include exergaming technology. The interviewees share their views on software development and other issues that must be addressed to advance the field of virtual reality for health applications. © 2011 Diabetes Technology Society.

  4. Experimental Internet Environment Software Development

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.

    1998-01-01

    Geographically distributed project teams need an Internet based collaborative work environment or "Intranet." The Virtual Research Center (VRC) is an experimental Intranet server that combines several services such as desktop conferencing, file archives, on-line publishing, and security. Using the World Wide Web (WWW) as a shared space paradigm, the Graphical User Interface (GUI) presents users with images of a lunar colony. Each project has a wing of the colony and each wing has a conference room, library, laboratory, and mail station. In FY95, the VRC development team proved the feasibility of this shared space concept by building a prototype using a Netscape commerce server and several public domain programs. Successful demonstrations of the prototype resulted in approval for a second phase. Phase 2, documented by this report, will produce a seamlessly integrated environment by introducing new technologies such as Java and Adobe Web Links to replace less efficient interface software.

  5. Cross-layer shared protection strategy towards data plane in software defined optical networks

    NASA Astrophysics Data System (ADS)

    Xiong, Yu; Li, Zhiqiang; Zhou, Bin; Dong, Xiancun

    2018-04-01

    In order to ensure reliable data transmission on the data plane and minimize resource consumption, a novel protection strategy for the data plane is proposed for software defined optical networks (SDON). Firstly, we establish an SDON architecture with a hierarchical data plane, which divides the data plane into four layers to obtain fine-grained bandwidth resources. Then, we design cross-layer routing and resource allocation based on this network architecture. By jointly considering the bandwidth resources on all layers, the SDN controller can allocate bandwidth to the working path and backup path in an economical manner. Next, we construct auxiliary graphs and transform the shared protection problem into the graph vertex coloring problem, so that the resource consumption on backup paths can be reduced further. The simulation results demonstrate that the proposed protection strategy achieves lower protection overhead and higher resource utilization ratio.
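
    The vertex-coloring reduction lets backup paths whose working paths cannot fail together share spare capacity: each backup path becomes a vertex, an edge joins two backups that conflict, and the number of colors used bounds the spare resources that must be reserved. A greedy coloring sketch in Python follows; the conflict graph is a hypothetical example, not the paper's auxiliary-graph construction.

      def greedy_coloring(conflicts: dict[str, set[str]]) -> dict[str, int]:
          """Give each backup path the smallest color unused by its neighbors.

          Vertices are backup paths; an edge means the two backups may be
          activated simultaneously and so cannot share spare capacity.
          """
          colors: dict[str, int] = {}
          for v in sorted(conflicts, key=lambda u: -len(conflicts[u])):
              taken = {colors[n] for n in conflicts[v] if n in colors}
              colors[v] = next(c for c in range(len(conflicts)) if c not in taken)
          return colors

      # Hypothetical conflicts among backup paths b1..b4
      g = {"b1": {"b2", "b3"}, "b2": {"b1", "b4"}, "b3": {"b1"}, "b4": {"b2"}}
      print(greedy_coloring(g))  # {'b1': 0, 'b2': 1, 'b3': 1, 'b4': 0}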

  6. Medical high-resolution image sharing and electronic whiteboard system: A pure-web-based system for accessing and discussing lossless original images in telemedicine.

    PubMed

    Qiao, Liang; Li, Ying; Chen, Xin; Yang, Sheng; Gao, Peng; Liu, Hongjun; Feng, Zhengquan; Nian, Yongjian; Qiu, Mingguo

    2015-09-01

    There are various medical image sharing and electronic whiteboard systems available for diagnosis and discussion purposes. However, most of these systems ask clients to install special software tools or web plug-ins to support whiteboard discussion, special medical image formats, and customized decoding algorithms for the transmission of HRIs (high-resolution images). This limits the accessibility of the software running on different devices and operating systems. In this paper, we propose a solution based on pure web pages for medical HRI lossless sharing and e-whiteboard discussion, and have set up a medical HRI sharing and e-whiteboard system with a four-layered design: (1) HRI access layer: we improved a tile-pyramid model, named the unbalanced ratio pyramid structure (URPS), to rapidly share lossless HRIs and to adapt to the reading habits of users; (2) format conversion layer: we designed a format conversion engine (FCE) on the server side to convert and cache, in real time, the DICOM tiles that clients request with window-level parameters, maintaining browser compatibility and server-client response efficiency; (3) business logic layer: we built an XML behavior relationship storage structure to store and share users' behavior, supporting real-time co-browsing and discussion between clients; (4) web-user-interface layer: AJAX technology and the Raphael toolkit were used to combine HTML and JavaScript to build a client RIA (rich Internet application), giving clients desktop-like interaction on any pure web page. This system can be used to quickly browse lossless HRIs, and supports discussing and co-browsing smoothly in any web browser in a diversified network environment. The proposed methods provide a way to share HRIs safely, and may be used in the fields of regional health, telemedicine and remote education at a low cost. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
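
    The server-side window-level conversion performed by the FCE maps raw DICOM intensities through a (center, width) window into 8-bit grayscale that any browser can display. A minimal NumPy rendition of the standard linear windowing operation follows; it is a sketch of the general technique, not the paper's engine, and the example values are invented.

      import numpy as np

      def apply_window(pixels: np.ndarray, center: float, width: float) -> np.ndarray:
          """Map raw DICOM pixel values to 8-bit grayscale with a linear window."""
          lo = center - width / 2.0
          scaled = (pixels.astype(np.float64) - lo) / width  # 0..1 inside window
          return (np.clip(scaled, 0.0, 1.0) * 255.0).astype(np.uint8)

      # Invented CT-style values with a soft-tissue-like window
      raw = np.array([[-200, 40, 300], [80, 1200, -1000]], dtype=np.int16)
      print(apply_window(raw, center=40, width=400))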

  7. Semi-automated software service integration in virtual organisations

    NASA Astrophysics Data System (ADS)

    Afsarmanesh, Hamideh; Sargolzaei, Mahdi; Shadi, Mahdieh

    2015-08-01

    To enhance their business opportunities, organisations involved in many service industries are increasingly active in pursuit of both online provision of their business services (BSs) and collaboration with others. Collaborative Networks (CNs) in the service industry sector, however, face many challenges related to sharing and integration of their collections of provided BSs and their corresponding software services. Therefore, the topic of service interoperability, for which this article introduces a framework, is gaining momentum in research on supporting CNs. It contributes to the generation of formal machine-readable specifications for business processes, aimed at providing their unambiguous definitions, as needed for developing their equivalent software services. The framework provides a model and implementation architecture for discovery and composition of shared services, to support the semi-automated development of integrated value-added services. In support of service discovery, a main contribution of this research is the formal representation of services' behaviour and the application of user-specified desired service behaviour for automated matchmaking with other existing services. Furthermore, to support service integration, mechanisms are developed for automated selection of the most suitable service(s) according to a number of service quality aspects. Two scenario cases are presented, which exemplify several specific features related to service discovery and service integration aspects.

  8. Software Defined Cyberinfrastructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foster, Ian; Blaiszik, Ben; Chard, Kyle

    Within and across thousands of science labs, researchers and students struggle to manage data produced in experiments, simulations, and analyses. Largely manual research data lifecycle management processes mean that much time is wasted, research results are often irreproducible, and data sharing and reuse remain rare. In response, we propose a new approach to data lifecycle management in which researchers are empowered to define the actions to be performed at individual storage systems when data are created or modified: actions such as analysis, transformation, copying, and publication. We term this approach software-defined cyberinfrastructure because users can implement powerful data management policies by deploying rules to local storage systems, much as software-defined networking allows users to configure networks by deploying rules to switches. We argue that this approach can enable a new class of responsive distributed storage infrastructure that will accelerate research innovation by allowing any researcher to associate data workflows with data sources, whether local or remote, for such purposes as data ingest, characterization, indexing, and sharing. We report on early experiments with this approach in the context of experimental science, in which a simple if-trigger-then-action (IFTA) notation is used to define rules.
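
    The if-trigger-then-action idea can be pictured as a list of rules, each binding a predicate over storage events to an action. The toy Python rule engine below sketches only the flavor of such a notation; the event fields, file suffix, and actions are all invented, not the paper's actual syntax.

      from dataclasses import dataclass
      from typing import Callable

      @dataclass
      class Rule:
          """If trigger(event) holds, then run action(event)."""
          trigger: Callable[[dict], bool]
          action: Callable[[dict], None]

      rules = [
          # Hypothetical policy: newly created raw files are indexed, then archived.
          Rule(lambda e: e["type"] == "created" and e["path"].endswith(".raw"),
               lambda e: print(f"extract metadata and index {e['path']}")),
          Rule(lambda e: e["type"] == "created" and e["path"].endswith(".raw"),
               lambda e: print(f"copy {e['path']} to archive store")),
      ]

      def dispatch(event: dict) -> None:
          for rule in rules:
              if rule.trigger(event):
                  rule.action(event)

      dispatch({"type": "created", "path": "/beamline/scan042.raw"})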

  9. Migration of the Gaudi and LHCb software repositories from CVS to Subversion

    NASA Astrophysics Data System (ADS)

    Clemencic, M.; Degaudenzi, H.; LHCb Collaboration

    2011-12-01

    A common code repository is of primary importance in a distributed development environment such as large HEP experiments. CVS (Concurrent Versions System) has been used in the past years at CERN for the hosting of shared software repositories, among which were the repositories for the Gaudi Framework and the LHCb software projects. Many developers around the world produced alternative systems to share code and revisions among several developers, mainly to overcome the limitations in CVS, and CERN has recently started a new service for code hosting based on the version control system Subversion. The differences between CVS and Subversion and the way the code was organized in Gaudi and LHCb CVS repositories required careful study and planning of the migration. Special care was used to define the organization of the new Subversion repository. To avoid as much as possible disruption in the development cycle, the migration has been gradual with the help of tools developed explicitly to hide the differences between the two systems. The principles guiding the migration steps, the organization of the Subversion repository and the tools developed will be presented, as well as the problems encountered both from the librarian and the user points of view.

  10. Mass spectrometer output file format mzML.

    PubMed

    Deutsch, Eric W

    2010-01-01

    Mass spectrometry is an important technique for analyzing proteins and other biomolecular compounds in biological samples. Each of the vendors of these mass spectrometers uses a different proprietary binary output file format, which has hindered data sharing and the development of open source software for downstream analysis. The solution has been to develop, with the full participation of academic researchers as well as software and hardware vendors, an open XML-based format for encoding mass spectrometer output files, and then to write software to use this format for archiving, sharing, and processing. This chapter presents the various components and information available for this format, mzML. In addition to the XML schema that defines the file structure, a controlled vocabulary provides clear terms and definitions for the spectral metadata, and a semantic validation rules mapping file allows the mzML semantic validator to ensure that an mzML document complies with one of several levels of requirements. Complete documentation and example files ensure that the format may be uniformly implemented. At the time of release, there already existed several implementations of the format and vendors have committed to supporting the format in their products.
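
    To give a flavor of the format: an mzML document is plain XML in which most metadata is carried by cvParam elements drawn from the controlled vocabulary, so standard-library code can already walk it. The sketch below lists each spectrum's id and its cvParam annotations, assuming a local file named example.mzML and the mzML 1.1 namespace.

      import xml.etree.ElementTree as ET

      NS = {"mz": "http://psi.hupo.org/ms/mzml"}  # mzML 1.1 XML namespace

      # Stream through the document, printing each spectrum id together
      # with its controlled-vocabulary ("cvParam") annotations.
      for _, elem in ET.iterparse("example.mzML"):
          if elem.tag.endswith("}spectrum"):
              params = {p.get("name"): p.get("value")
                        for p in elem.findall("mz:cvParam", NS)}
              print(elem.get("id"), params)
              elem.clear()  # release memory for large files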

  11. A resilient and secure software platform and architecture for distributed spacecraft

    NASA Astrophysics Data System (ADS)

    Otte, William R.; Dubey, Abhishek; Karsai, Gabor

    2014-06-01

    A distributed spacecraft is a cluster of independent satellite modules flying in formation that communicate via ad-hoc wireless networks. This system in space is a cloud platform that facilitates sharing sensors and other computing and communication resources across multiple applications, potentially developed and maintained by different organizations. Effectively, such an architecture can realize the functions of monolithic satellites at a reduced cost and with improved adaptivity and robustness. The openness of these architectures poses special challenges because the distributed software platform has to support applications from different security domains and organizations, where information flows have to be carefully managed and compartmentalized. If the platform is used as a robust shared resource, its management, configuration, and resilience become a challenge in themselves. We have designed and prototyped a distributed software platform for such architectures. The core element of the platform is a new operating system whose services were designed to restrict access to the network and the file system, and to enforce resource management constraints for all non-privileged processes. Mixed-criticality applications operating at different security labels are deployed and controlled by a privileged management process that also pre-configures all information flows. This paper describes the design and objective of this layer.

  12. An Adaptive Insertion and Promotion Policy for Partitioned Shared Caches

    NASA Astrophysics Data System (ADS)

    Mahrom, Norfadila; Liebelt, Michael; Raof, Rafikha Aliana A.; Daud, Shuhaizar; Hafizah Ghazali, Nur

    2018-03-01

    Cache replacement policies in chip multiprocessors (CMP) have been investigated extensively and proven able to enhance shared cache management. However, competition among multiple processors executing different threads that require simultaneous access to a shared memory may cause cache contention and memory coherence problems on the chip. These issues also arise from some drawbacks of the commonly used Least Recently Used (LRU) policy employed in multiprocessor systems, namely cache lines residing in the cache longer than required. In image processing analysis, for example of extrapulmonary tuberculosis (TB), an accurate diagnosis of tissue specimens is required. Therefore, a fast and reliable shared memory management system is needed to execute algorithms for processing vast amounts of specimen images. In this paper, the effects of the cache replacement policy in a partitioned shared cache are investigated. The goal is to quantify whether better performance can be achieved by using less complex replacement strategies. This paper proposes a Middle Insertion 2 Positions Promotion (MI2PP) policy to eliminate cache misses that could adversely affect the access patterns and the throughput of the processors in the system. The policy employs a static predefined insertion point, near distance promotion, and the concept of ownership in the eviction policy to effectively reduce cache thrashing and to avoid resource stealing among the processors.
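
    The abstract gives no pseudocode for MI2PP; as a hedged toy sketch of the two core ideas named above (insert new lines at a predefined middle position rather than at the MRU head, and promote hits by a short distance), one cache set might be modelled as follows (the way count and promotion distance are illustrative):

        # Toy model of middle-insertion plus near-distance promotion for one
        # cache set, ordered MRU-first; parameters are illustrative only.
        class MiddleInsertPromoteSet:
            def __init__(self, ways=8, promote_by=2):
                self.ways = ways
                self.promote_by = promote_by
                self.lines = []                     # cached tags, MRU first

            def access(self, tag):
                if tag in self.lines:               # hit: promote a few positions
                    i = self.lines.index(tag)
                    self.lines.insert(max(0, i - self.promote_by),
                                      self.lines.pop(i))
                    return True
                if len(self.lines) >= self.ways:    # miss in a full set: evict LRU
                    self.lines.pop()
                self.lines.insert(len(self.lines) // 2, tag)   # middle insertion
                return False

        s = MiddleInsertPromoteSet()
        print(sum(s.access(t) for t in [1, 2, 3, 1, 4, 1, 5, 2]), "hits")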

  13. Research Data in Core Journals in Biology, Chemistry, Mathematics, and Physics.

    PubMed

    Womack, Ryan P

    2015-01-01

    This study takes a stratified random sample of articles published in 2014 from the top 10 journals in the disciplines of biology, chemistry, mathematics, and physics, as ranked by impact factor. Sampled articles were examined for their reporting of original data or reuse of prior data, and were coded for whether the data was publicly shared or otherwise made available to readers. Other characteristics such as the sharing of software code used for analysis and use of data citation and DOIs for data were examined. The study finds that data sharing practices are still relatively rare in these disciplines' top journals, but that the disciplines have markedly different practices. Biology top journals share original data at the highest rate, and physics top journals share at the lowest rate. Overall, the study finds that within the top journals, only 13% of articles with original data published in 2014 make the data available to others.

  15. Enhancing the many-to-many relations across IHE document sharing communities.

    PubMed

    Ribeiro, Luís S; Costa, Carlos; Oliveira, José Luís

    2012-01-01

    The Integrating Healthcare Enterprise (IHE) initiative is an ongoing project aiming to enable true inter-site interoperability in the health IT field. IHE is a work in progress, and many challenges need to be overcome before healthcare institutions can share patient clinical records transparently and effortlessly. Configuring, deploying and testing an IHE document sharing community requires a significant effort to plan and maintain the supporting IT infrastructure. With the new paradigm of cloud computing, it is now possible to launch software devices on demand, paying according to usage. This paper presents a framework designed with the purpose of expediting the creation of IHE document sharing communities. It provides semi-ready templates of sharing communities that can be customized according to a community's needs. The framework is a meeting point for healthcare institutions, creating a favourable environment that might give rise to new inter-institutional professional relationships and, eventually, the creation of new Affinity Domains.

  16. Exploring the attributes of critical thinking: a conceptual basis.

    PubMed

    Forneris, Susan G

    2004-01-01

    Many teaching methods used in nursing education to enhance critical thinking focus on teaching students how to directly apply knowledge, a technically rational approach. While seemingly effective at enhancing students' critical thinking abilities in structured learning situations, these methods don't prepare students to operationalize critical thinking to manage the complexities that actually exist in practice. Contemporary educational theorists Paulo Freire, Donald Schon, Chris Argyris, Jack Mezirow, Stephen Brookfield, and Robert Tennyson all share similar perspectives on thinking in practice and the use of reflection to achieve a coherence of understanding. Their perspectives provide insight on how educators can shift from a means-end approach to operationalizing thinking in practice. The author identifies four attributes of critical thinking in practice evidenced in these views, followed by a discussion of specific educational strategies that reflect these attributes and operationalize a critical thinking process in nursing practice to achieve a coherence of understanding.

  17. Optimal continuous variable quantum teleportation protocol for realistic settings

    NASA Astrophysics Data System (ADS)

    Luiz, F. S.; Rigolin, Gustavo

    2015-03-01

    We show the optimal setup that allows Alice to teleport coherent states |α⟩ to Bob with the greatest fidelity (efficiency) when one takes into account two realistic assumptions. The first is the fact that in any actual implementation of the continuous variable teleportation protocol (CVTP) Alice and Bob necessarily share non-maximally entangled states (two-mode finitely squeezed states). The second assumes that Alice's pool of possible coherent states to be teleported to Bob does not cover the whole complex plane (|α| < ∞). The optimal strategy is achieved by tuning three parameters in the original CVTP, namely, Alice's beam splitter transmittance and Bob's displacements in position and momentum implemented on the teleported state. These slight changes to the protocol are easy to implement with current technology and, as we show, give a considerable gain in performance for a variety of possible pools of input states available to Alice.
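
    For orientation only (standard background, not a result quoted from this paper): at unity gain, teleporting coherent states through a two-mode squeezed vacuum with squeezing parameter r yields the textbook fidelity

        F = \frac{1}{1 + e^{-2r}},

    which equals the classical benchmark 1/2 at r = 0 and approaches 1 only for infinite squeezing. The optimization described above moves away from this unity-gain setting, tuning the beam splitter transmittance and Bob's displacement gains to suit the finite squeezing and bounded input alphabet actually available.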

  18. The taming of the shrew: batterers' constructions of their wives' narratives.

    PubMed

    Borochowitz, Dalit Yassour

    2008-10-01

    Constructing a life story is a need shared by all humans to give their lives meaning and coherence. This article explores some of the narrative devices that batterers use to achieve a sense of coherence when telling their stories and justifying their violent behavior. A central theme that emerged from these stories centered on the men's perception of their wives as the embodiment of their own emotions and inner world. Two narrative strategies were identified in this context: (a) The construction of a "couple narrative" that focused on an idealized marital relationship rather than "allowing" the wife her story and (b) constructing a story around the theme of "she's not the same woman I married," which portrays the wife as "a shrew" and the violence as an attempt to discipline her. The stories of 18 batterers were used for this analysis, and two narratives were used to illustrate these strategies.

  19. Continuous variable quantum key distribution with modulated entangled states.

    PubMed

    Madsen, Lars S; Usenko, Vladyslav C; Lassen, Mikael; Filip, Radim; Andersen, Ulrik L

    2012-01-01

    Quantum key distribution enables two remote parties to grow a shared key, which they can use for unconditionally secure communication over a certain distance. The maximal distance depends on the loss and the excess noise of the connecting quantum channel. Several quantum key distribution schemes based on coherent states and continuous variable measurements are resilient to high loss in the channel, but are strongly affected by small amounts of channel excess noise. Here we propose and experimentally address a continuous variable quantum key distribution protocol that uses modulated fragile entangled states of light to greatly enhance the robustness to channel noise. We experimentally demonstrate that the resulting quantum key distribution protocol can tolerate more noise than the benchmark set by the ideal continuous variable coherent state protocol. Our scheme represents a very promising avenue for extending the distance for which secure communication is possible.

  20. The ALMA Common Software as a Basis for a Distributed Software Development

    NASA Astrophysics Data System (ADS)

    Raffi, Gianni; Chiozzi, Gianluca; Glendenning, Brian

    The Atacama Large Millimeter Array (ALMA) is a joint project involving astronomical organizations in Europe, North America and Japan. ALMA will consist of 64 12-m antennas operating in the millimetre and sub-millimetre wavelength range, with baselines of more than 10 km. It will be located at an altitude above 5000 m in the Chilean Atacama desert. The ALMA Computing group is a joint group with staff scattered across 3 continents and is responsible for all the control and data flow software related to ALMA, including tools ranging from support of proposal preparation to archive access of automatically created images. Early in the project it was decided that an ALMA Common Software (ACS) would be developed as a way to provide all partners involved in the development with a common software platform. The original assumption was that some key middleware, like communication via CORBA and the use of XML and Java, would be part of the project. It was intended from the beginning to develop this software in an incremental way based on releases, so that it would then evolve into an essential embedded part of all ALMA software applications. In this way we would build a basic unity and coherence into a system developed in a distributed fashion. This paper evaluates our progress after 1.5 years of work, following a few tests and preliminary releases. It analyzes the advantages and difficulties of such an ambitious approach, which creates an interface across all the various control and data flow applications.

  1. Software Agents as Facilitators of Coherent Coalition Operations

    DTIC Science & Technology

    2001-06-01

    In this regard, an important output from DARPA’s CoABS programme is the CoABS Grid - a middleware layer based on Java / Jini technology that provides...2 The C3I Group, Technical Panel 9. 6 developed between two countries who are fighting for control of Binni. To the north is Gao - which has...well developed and fundamentalist country. Gao has managed to annex an area of land, called it Binni and has put in its own puppet government. This

  2. Proactive Response to Potential Material Shortages Arising from Environmental Restrictions Using Automatic Discovery and Extraction of Information from Technical Documents

    DTIC Science & Technology

    2012-12-21

    material data and other key information in a UIMA environment. In the course of this project, the tools and methods developed were used to extract and...Architecture ( UIMA ) library from the Apache Software Foundation. Using this architecture, a given document is run through several “annotators” to...material taxonomy developed for the XSB, Inc. Coherent View™ database. In order to integrate this technology into the Java-based UIMA annotation

  3. The Design and Application of Data Storage System in Miyun Satellite Ground Station

    NASA Astrophysics Data System (ADS)

    Xue, Xiping; Su, Yan; Zhang, Hongbo; Liu, Bin; Yao, Meijuan; Zhao, Shu

    2015-04-01

    China launched the Chang'E-3 satellite in 2013, achieving the first soft landing on the Moon by a Chinese lunar probe. The Miyun satellite ground station first used a SAN storage network system based on the Stornext sharing software in the Chang'E-3 mission. System performance fully meets the data storage requirements of the Miyun ground station. The Stornext file system is a high-performance shared file system; it supports multiple servers running different operating systems accessing the file system at the same time, and supports access to data over a variety of topologies, such as SAN and LAN. Stornext focuses on data protection and big data management. Quantum has announced that more than 70,000 licenses of the Stornext file system have been sold worldwide, and its customer base is growing, which marks its leading position in big data management. The responsibilities of the Miyun satellite ground station are the reception of Chang'E-3 satellite downlink data and the management of local data storage. The station mainly completes exploration mission management and the receiving and management of observation data, and provides comprehensive, centralized monitoring and control functions for the data receiving equipment. The ground station applied the SAN storage network system based on the Stornext shared software to receive and manage data reliably. The computer system at the Miyun ground station is composed of business servers, application workstations and storage equipment, so the storage system needs a shared file system that supports heterogeneous operating systems. In practical applications, 10 nodes simultaneously write data to the file system through 16 channels, and the maximum data transfer rate of each channel is up to 15 MB/s; thus the required network throughput of the file system is not less than 240 MB/s. At the same time, the maximum size of each data file is up to 810 GB. As integrated, the sharing system can provide a simultaneous write speed of 1020 MB/s. When the master storage server fails, the backup storage server takes over normal service; client reads and writes are not affected, and the switching time is less than 5 s. The designed and integrated storage system meets user requirements. However, an all-fibre SAN is expensive, and the SCSI hard disk transfer rate may still be the bottleneck of the entire storage system. Stornext can provide users with efficient sharing, management and automatic archiving of large numbers of files together with hardware solutions, and it is widely used sharing software; nevertheless, it has drawbacks. First, the Stornext software is expensive, as it is licensed per site, so when the network scale is large the purchase cost is very high. Second, configuring the parameters of the Stornext software places high demands on the skills of technical staff, and when a problem occurs it is difficult to troubleshoot.
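
    The quoted throughput requirement follows directly from the channel figures; a trivial sanity check in Python, using only the numbers given above:

        # Sanity check of the throughput figures quoted in this record.
        channels = 16
        per_channel_mb_s = 15                # peak rate per channel, MB/s
        required = channels * per_channel_mb_s
        provided = 1020                      # integrated write speed, MB/s
        print(required, "MB/s required")     # 240 MB/s minimum
        print(provided >= required)          # True: more than 4x headroom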

  4. Clearing your Desk! Software and Data Services for Collaborative Web Based GIS Analysis

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Band, L. E.; Merwade, V.; Couch, A.; Hooper, R. P.; Maidment, D. R.; Dash, P. K.; Stealey, M.; Yi, H.; Gan, T.; Gichamo, T.; Yildirim, A. A.; Liu, Y.

    2015-12-01

    Can your desktop computer crunch the large GIS datasets that are becoming increasingly common across the geosciences? Do you have access to or the know-how to take advantage of advanced high performance computing (HPC) capability? Web based cyberinfrastructure takes work off your desk or laptop computer and onto infrastructure or "cloud" based data and processing servers. This talk will describe the HydroShare collaborative environment and web based services being developed to support the sharing and processing of hydrologic data and models. HydroShare supports the upload, storage, and sharing of a broad class of hydrologic data including time series, geographic features and raster datasets, multidimensional space-time data, and other structured collections of data. Web service tools and a Python client library provide researchers with access to HPC resources without requiring them to become HPC experts. This reduces the time and effort spent in finding and organizing the data required to prepare the inputs for hydrologic models and facilitates the management of online data and execution of models on HPC systems. This presentation will illustrate the use of web based data and computation services from both the browser and desktop client software. These web-based services implement the Terrain Analysis Using Digital Elevation Model (TauDEM) tools for watershed delineation, generation of hydrology-based terrain information, and preparation of hydrologic model inputs. They allow users to develop scripts on their desktop computer that call analytical functions that are executed completely in the cloud, on HPC resources using input datasets stored in the cloud, without installing specialized software, learning how to use HPC, or transferring large datasets back to the user's desktop. These cases serve as examples for how this approach can be extended to other models to enhance the use of web and data services in the geosciences.

  5. A research on the security of wisdom campus based on geospatial big data

    NASA Astrophysics Data System (ADS)

    Wang, Haiying

    2018-05-01

    There are some difficulties in the wisdom campus, such as geospatial big data sharing, function expansion, data management, and the analysis and mining of geospatial big data for distinctive characteristics; in particular, the problem that data security cannot be guaranteed has attracted increasingly prominent attention. In this article we put forward a data-oriented software architecture, designed around the ideology of orienting to data with data as the kernel, to solve the problems of the traditional software architecture, broaden campus spatial data research, and develop applications for the wisdom campus.

  6. AKM in Open Source Communities

    NASA Astrophysics Data System (ADS)

    Stamelos, Ioannis; Kakarontzas, George

    Previous chapters in this book have dealt with Architecture Knowledge Management in traditional Closed Source Software (CSS) projects. This chapter will attempt to examine the ways that knowledge is shared among participants in Free Libre Open Source Software (FLOSS) projects and how architectural knowledge is managed w.r.t. CSS. FLOSS projects are organized and developed in a fundamentally different way than CSS projects. FLOSS projects simply do not develop code as CSS projects do. As a consequence, their knowledge management mechanisms are also based on different concepts and tools.

  7. Integrating Software Modules For Robot Control

    NASA Technical Reports Server (NTRS)

    Volpe, Richard A.; Khosla, Pradeep; Stewart, David B.

    1993-01-01

    Reconfigurable, sensor-based control system uses state variables in systematic integration of reusable control modules. Designed for open-architecture hardware including many general-purpose microprocessors, each having own local memory plus access to global shared memory. Implemented in software as extension of Chimera II real-time operating system. Provides transparent computing mechanism for intertask communication between control modules and generic process-module architecture for multiprocessor realtime computation. Used to control robot arm. Proves useful in variety of other control and robotic applications.

  8. Experiences Using OpenMP Based on Compiler Directed Software DSM on a PC Cluster

    NASA Technical Reports Server (NTRS)

    Hess, Matthias; Jost, Gabriele; Mueller, Matthias; Ruehle, Roland; Biegel, Bryan (Technical Monitor)

    2002-01-01

    In this work we report on our experiences running OpenMP (shared-memory) programs on a commodity cluster of PCs (personal computers) running a software distributed shared memory (DSM) system. We describe our test environment and report on the performance of a subset of the NAS (NASA Advanced Supercomputing) Parallel Benchmarks that have been automatically parallelized for OpenMP. We compare the performance of the OpenMP implementations with that of their message passing counterparts and discuss performance differences.

  9. Hypercluster Parallel Processor

    NASA Technical Reports Server (NTRS)

    Blech, Richard A.; Cole, Gary L.; Milner, Edward J.; Quealy, Angela

    1992-01-01

    Hypercluster computer system includes multiple digital processors, operation of which coordinated through specialized software. Configurable according to various parallel-computing architectures of shared-memory or distributed-memory class, including scalar computer, vector computer, reduced-instruction-set computer, and complex-instruction-set computer. Designed as flexible, relatively inexpensive system that provides single programming and operating environment within which one can investigate effects of various parallel-computing architectures and combinations on performance in solution of complicated problems like those of three-dimensional flows in turbomachines. Hypercluster software and architectural concepts are in public domain.

  10. Multiprocessor shared-memory information exchange

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santoline, L.L.; Bowers, M.D.; Crew, A.W.

    1989-02-01

    In distributed microprocessor-based instrumentation and control systems, the inter- and intra-subsystem communication requirements ultimately form the basis for the overall system architecture. This paper describes a software protocol which addresses the intra-subsystem communications problem. Specifically, the protocol allows for multiple processors to exchange information via a shared-memory interface. The authors' primary goal is to provide a reliable means for information to be exchanged between central application processor boards (masters) and dedicated function processor boards (slaves) in a single computer chassis. The resultant Multiprocessor Shared-Memory Information Exchange (MSMIE) protocol, a standard master-slave shared-memory interface suitable for use in nuclear safety systems, is designed to pass unidirectional buffers of information between the processors while providing a minimum, deterministic cycle time for this data exchange.
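
    The MSMIE buffer-management details are not reproduced in this record. As a loose illustration of the general pattern (a master publishing unidirectional data that slaves can read without ever observing a half-written record), consider this Python sketch of a double buffer with an atomic index flip; the structure is hypothetical, not the MSMIE specification:

        # Loose sketch of unidirectional master-to-slave data exchange through
        # shared buffers, in the spirit of (not identical to) MSMIE.
        import threading

        buffers = [{"seq": 0, "data": None}, {"seq": 0, "data": None}]
        current = 0                        # index of the buffer readers may use
        flip = threading.Lock()            # protects only the index hand-over

        def master_write(seq, data):
            global current
            idle = 1 - current             # fill the buffer readers are NOT using
            buffers[idle]["seq"] = seq
            buffers[idle]["data"] = data
            with flip:                     # brief, deterministic hand-over
                current = idle

        def slave_read():
            with flip:
                return dict(buffers[current])   # snapshot of the visible buffer

        master_write(1, b"status frame")
        print(slave_read())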

  11. Towards a mature measurement environment: Creating a software engineering research environment

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.

    1990-01-01

    Software engineering researchers are building tools and defining methods and models; however, there are problems with the nature and style of the research. The research is typically bottom-up, done in isolation, so the pieces cannot be easily logically or physically integrated. A great deal of the research is essentially the packaging of a particular piece of technology with little indication of how the work would be integrated with other pieces of research. The research is not aimed at solving the real problems of software engineering, i.e., the development and maintenance of quality systems in a productive manner. The research results are not evaluated or analyzed via experimentation or refined and tailored to the application environment. Thus, they cannot be easily transferred into practice. Because of these limitations we have not been able to understand the components of the discipline as a coherent whole and the relationships between various models of the process and product. What is needed is a top-down experimental, evolutionary framework in which research can be focused, logically and physically integrated to produce quality software productively, and evaluated and tailored to the application environment. This implies the need for experimentation, which in turn implies the need for a laboratory that is associated with the artifact we are studying. This laboratory can only exist in an environment where software is being built, i.e., as part of a real software development and maintenance organization. Thus, we propose that Software Engineering Laboratory (SEL) type activities exist in all organizations to support software engineering research. We describe the SEL from a researcher's point of view, and discuss the corporate and government benefits of the SEL. The discussion focuses on the benefits to the research community.

  12. An Open Software Platform for Sharing Water Resource Models, Code and Data

    NASA Astrophysics Data System (ADS)

    Knox, Stephen; Meier, Philipp; Mohamed, Khaled; Korteling, Brett; Matrosov, Evgenii; Huskova, Ivana; Harou, Julien; Rosenberg, David; Tilmant, Amaury; Medellin-Azuara, Josue; Wicks, Jon

    2016-04-01

    The modelling of managed water resource systems requires new approaches in the face of increasing future uncertainty. Water resources management models, even if applied to diverse problem areas, use common approaches such as representing the problem as a network of nodes and links. We propose a data management software platform, called Hydra, that uses this commonality to allow multiple models using a node-link structure to be managed and run using a single software system. Hydra's user interface allows users to manage network topology and associated data. Hydra feeds this data directly into a model, importing from and exporting to different file formats using Apps. An App connects Hydra to a custom model, a modelling system such as GAMS or MATLAB or to different file formats such as MS Excel, CSV and ESRI Shapefiles. Hydra allows users to manage their data in a single, consistent place. Apps can be used to run domain-specific models and allow users to work with their own required file formats. The Hydra App Store offers a collaborative space where model developers can publish, review and comment on Apps, models and data. Example Apps and open-source libraries are available in a variety of languages (Python, Java and .NET). The App Store can act as a hub for water resource modellers to view and share Apps, models and data easily. This encourages an ecosystem of development using a shared platform, resulting in more model integration and potentially greater unity within resource modelling communities. www.hydraplatform.org www.hydraappstore.com
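
    The node-link commonality that Hydra exploits is easy to picture; the sketch below shows a minimal network data structure of that kind (field names are illustrative, not Hydra's actual schema):

        # Minimal node-link network model of the kind Hydra generalizes over;
        # field names are illustrative, not Hydra's schema.
        from dataclasses import dataclass, field

        @dataclass
        class Node:
            name: str
            attributes: dict = field(default_factory=dict)  # e.g. storage capacity

        @dataclass
        class Link:
            source: str
            target: str
            attributes: dict = field(default_factory=dict)  # e.g. maximum flow

        @dataclass
        class Network:
            nodes: list
            links: list

        net = Network(
            nodes=[Node("reservoir", {"capacity": 5e6}), Node("city")],
            links=[Link("reservoir", "city", {"max_flow": 120.0})],
        )
        print(len(net.nodes), "nodes,", len(net.links), "link")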

  13. Shared virtual environments for aerospace training

    NASA Technical Reports Server (NTRS)

    Loftin, R. Bowen; Voss, Mark

    1994-01-01

    Virtual environments have the potential to significantly enhance the training of NASA astronauts and ground-based personnel for a variety of activities. A critical requirement is the need to share virtual environments, in real or near real time, between remote sites. It has been hypothesized that the training of international astronaut crews could be done more cheaply and effectively by utilizing such shared virtual environments in the early stages of mission preparation. The Software Technology Branch at NASA's Johnson Space Center has developed the capability for multiple users to simultaneously share the same virtual environment. Each user generates the graphics needed to create the virtual environment. All changes of object position and state are communicated to all users so that each virtual environment maintains its 'currency.' Examples of these shared environments will be discussed and plans for the utilization of the Department of Defense's Distributed Interactive Simulation (DIS) protocols for shared virtual environments will be presented. Finally, the impact of this technology on training and education in general will be explored.

  14. The Data Management and Communications (DMAC) Strategy for the U.S. Integrated Ocean Observing System (IOOS)

    NASA Astrophysics Data System (ADS)

    Hankin, S.

    2004-12-01

    Data management and communications within the marine environment present great challenges due in equal parts to the variety and complexity of the observations that are involved; the rapidly evolving information technology; and the complex history and relationships among community participants. At present there is no coherent Cyberinfrastructure that effectively integrates these data streams across organizations, disciplines and spatial and temporal scales. The resulting lack of integration of data denies US society important benefits, such as improved climate forecasts and more effective protection of coastal marine ecosystems. Therefore, Congress has directed the US marine science communities to come together to plan, design, and implement a sustained Integrated Ocean Observing System (IOOS). Central to the vision of the IOOS is a Data Management and Communications (DMAC) Subsystem that joins Federal, regional, state, municipal, academic and commercial partners in a seamless data sharing framework. The design of the DMAC Subsystem is made particularly challenging by three competing factors: 1) The data types to be integrated are heterogeneous and have complex structure; 2) The holdings are physically distributed and widely ranging in size and complexity; and 3) IOOS is a loose federation of many organizations, large and small, lacking a management hierarchy. Designing the DMAC Subsystem goes beyond solving problems of software engineering; the most demanding aspects of the solution lie in community behavior. An overview of the plan for the DMAC Subsystem and an outline of the next steps forward will be described.

  15. Enhanced coherence within the theta band between pairs of brains engaging in experienced versus naïve Reiki procedures.

    PubMed

    Ventura, Anabela Carraca; Persinger, Michael A

    2014-08-01

    The study objective was to discern whether the coherence between the brain activities of the "patient" and practitioner differs between Reiki experts and novices. If the physical process associated with Reiki involves "convergence" between the practitioner and subject, then this congruence should be evident in time-dependent shared power within specific and meaningful electroencephalographic frequency bands. Simultaneous quantitative electroencephalogram measures (19 channels) were recorded from 9 pairs of subjects in which 1 member of the pair was either an experienced Reiki practitioner or had just been shown the procedure. Pairs recorded their experiences and images. The "practitioner" and "patient" pairs were measured within a quiet, comfortable acoustic chamber. Real-time correlations and coherence between pairs of brains for power (μV(2)·Hz(-1)) within the various frequency bands over the 10-min sessions were recorded and analyzed for each pair. Descriptors of experiences were analyzed for word meanings. Only the coherence within the theta range increased over time between the brains of the Reiki pairs relative to the Sham pairs, particularly over the left hemisphere. The pleasantness-unpleasantness ratings of the words used to describe experiences written after the experiment were more congruent for the Reiki pairs compared to the reference pairs. The increased synchronization of the cerebral activity of the participant and the practitioner during proximal therapies involving touch, such as Reiki, may be an important component of any subsequent beneficial effects.

  16. RAPTOR: An Enterprise Knowledge Discovery Engine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2010-11-11

    SharePoint search capability is commonly criticized by users due to the limited functionality provided. This software takes a world class search capability (Piranha) and integrates it with an Enterprise Level Collaboration Application installed on most major government and commercial sites.

  17. Balancing entrepreneurship and business practices for e-collaboration: responsible information sharing in academic research.

    PubMed

    Porter, Mark W; Porter, Mark William; Milley, David; Oliveti, Kristyn; Ladd, Allen; O'Hara, Ryan J; Desai, Bimal R; White, Peter S

    2008-11-06

    Flexible, highly accessible collaboration tools can inherently conflict with controls placed on information sharing by offices charged with privacy protection, compliance, and maintenance of the general business environment. Our implementation of a commercial enterprise wiki within the academic research environment addresses concerns of all involved through the development of a robust user training program, a suite of software customizations that enhance security elements, a robust auditing program, allowance for inter-institutional wiki collaboration, and wiki-specific governance.

  18. MIDAS - A microcomputer-based image display and analysis system with full Landsat frame processing capabilities

    NASA Technical Reports Server (NTRS)

    Hofman, L. B.; Erickson, W. K.; Donovan, W. E.

    1984-01-01

    The Microcomputer-based Image Display and Analysis System (MIDAS), developed at NASA/Ames for the analysis of Landsat MSS images, is described. The MIDAS computer power and memory, graphics, resource-sharing, expansion and upgrade, environment and maintenance, and software/user-interface requirements are outlined; the implementation hardware (including a 32-bit microprocessor, 512K error-correcting RAM, 70 or 140-Mbyte formatted disk drive, 512 x 512 x 24 color frame buffer, and local-area-network transceiver) and applications software (ELAS, CIE, and P-EDITOR) are characterized; and implementation problems, performance data, and costs are examined. Planned improvements in MIDAS hardware and design goals and areas of exploration for MIDAS software are discussed.

  19. Virtual laboratories: new opportunities for collaborative water science

    NASA Astrophysics Data System (ADS)

    Ceola, Serena; Arheimer, Berit; Bloeschl, Guenter; Baratti, Emanuele; Capell, Rene; Castellarin, Attilio; Freer, Jim; Han, Dawei; Hrachowitz, Markus; Hundecha, Yeshewatesfa; Hutton, Christopher; Lindström, Goran; Montanari, Alberto; Nijzink, Remko; Parajka, Juraj; Toth, Elena; Viglione, Alberto; Wagener, Thorsten

    2015-04-01

    Reproducibility and repeatability of experiments are the fundamental prerequisites that allow researchers to validate results and share hydrological knowledge, experience and expertise in the light of global water management problems. Virtual laboratories offer new opportunities to enable these prerequisites since they allow experimenters to share data, tools and pre-defined experimental procedures (i.e. protocols). Here we present the outcomes of a first collaborative numerical experiment undertaken by five different international research groups in a virtual laboratory to address the key issues of reproducibility and repeatability. Moving from the definition of accurate and detailed experimental protocols, a rainfall-runoff model was independently applied to 15 European catchments by the research groups and model results were collectively examined through a web-based discussion. We found that a detailed modelling protocol was crucial to ensure the comparability and reproducibility of the proposed experiment across groups. Our results suggest that sharing comprehensive and precise protocols and running the experiments within a controlled environment (e.g. virtual laboratory) is as fundamental as sharing data and tools for ensuring experiment repeatability and reproducibility across the broad scientific community and thus advancing hydrology in a more coherent way.

  20. A qualitative study of Swedes' opinions about shared electronic health records.

    PubMed

    Lehnbom, Elin C; McLachlan, Andrew J; Brien, Jo-anne E

    2013-01-01

    European countries are world-leading in the development and implementation of e-Health. In Sweden, all primary healthcare centres and most hospitals use digital records. Some regions use the same software, which allows clinical information to be shared (regionally shared EHRs), but there is a movement towards making all EHRs inter-operable to allow for a National Patient Summary (NPS). The aim of this study was to explore the opinions of Swedish consumers and health professionals about shared EHRs and the NPS. Semi-structured phone interviews were conducted with consumers and health professionals. The majority of interviewed health professionals were currently using regionally shared EHRs. In their experience, having access to regionally shared EHRs facilitated a holistic patient approach, assisted in patient follow-up, and reduced inappropriate (over)prescribing. Consumers had a poor level of knowledge about shared EHRs and the NPS. Unlike health professionals, consumers perceived an NPS to be of great value. The findings indicate that there was a discrepancy between health professionals' and consumers' knowledge of, and the perceived need for, an NPS.

  1. DOCLIB: a software library for document processing

    NASA Astrophysics Data System (ADS)

    Jaeger, Stefan; Zhu, Guangyu; Doermann, David; Chen, Kevin; Sampat, Summit

    2006-01-01

    Most researchers would agree that research in the field of document processing can benefit tremendously from a common software library through which institutions are able to develop and share research-related software and applications across academic, business, and government domains. However, despite several attempts in the past, the research community still lacks a widely-accepted standard software library for document processing. This paper describes a new library called DOCLIB, which tries to overcome the drawbacks of earlier approaches. Many of DOCLIB's features are unique either in themselves or in their combination with others, e.g. the factory concept for support of different image types, the juxtaposition of image data and metadata, or the add-on mechanism. We cherish the hope that DOCLIB serves the needs of researchers better than previous approaches and will readily be accepted by a larger group of scientists.

  2. The Open Source DataTurbine Initiative: Streaming Data Middleware for Environmental Observing Systems

    NASA Technical Reports Server (NTRS)

    Fountain T.; Tilak, S.; Shin, P.; Hubbard, P.; Freudinger, L.

    2009-01-01

    The Open Source DataTurbine Initiative is an international community of scientists and engineers sharing a common interest in real-time streaming data middleware and applications. The technology base of the OSDT Initiative is the DataTurbine open source middleware. Key applications of DataTurbine include coral reef monitoring, lake monitoring and limnology, biodiversity and animal tracking, structural health monitoring and earthquake engineering, airborne environmental monitoring, and environmental sustainability. DataTurbine software emerged as a commercial product in the 1990s from collaborations between NASA and private industry. In October 2007, a grant from the USA National Science Foundation (NSF) Office of Cyberinfrastructure allowed us to transition DataTurbine from a proprietary software product into an open source software initiative. This paper describes the DataTurbine software and highlights key applications in environmental monitoring.

  3. Developing a Cyberinfrastructure for integrated assessments of environmental contaminants.

    PubMed

    Kaur, Taranjit; Singh, Jatinder; Goodale, Wing M; Kramar, David; Nelson, Peter

    2005-03-01

    The objective of this study was to design and implement prototype software for capturing field data and automating the process for reporting and analyzing the distribution of mercury. The four phase process used to design, develop, deploy and evaluate the prototype software is described. Two different development strategies were used: (1) design of a mobile data collection application intended to capture field data in a meaningful format and automate transfer into user databases, followed by (2) a re-engineering of the original software to develop an integrated database environment with improved methods for aggregating and sharing data. Results demonstrated that innovative use of commercially available hardware and software components can lead to the development of an end-to-end digital cyberinfrastructure that captures, records, stores, transmits, compiles and integrates multi-source data as it relates to mercury.

  4. Unobtrusive Software and System Health Management with R2U2 on a Parallel MIMD Coprocessor

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Moosbrugger, Patrick

    2017-01-01

    Dynamic monitoring of software and system health of a complex cyber-physical system requires observers that continuously monitor variables of the embedded software in order to detect anomalies and reason about root causes. There exists a variety of techniques for code instrumentation, but instrumentation might change runtime behavior and could require costly software re-certification. In this paper, we present R2U2E, a novel realization of our real-time, Realizable, Responsive, and Unobtrusive Unit (R2U2). The R2U2E observers are executed in parallel on a dedicated 16-core EPIPHANY co-processor, thereby avoiding additional computational overhead to the system under observation. A DMA-based shared memory access architecture allows R2U2E to operate without any code instrumentation or program interference.
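
    R2U2's observer encoding is not described in this record. Purely as a generic flavour of what such a runtime observer computes, here is a hedged sketch of a bounded response monitor over a stream of Boolean samples (not R2U2's actual algorithm):

        # Generic runtime-observer sketch: flag any 'request' that is not
        # followed by an 'ack' within `bound` samples. Not R2U2's encoding.
        def response_monitor(trace, bound=3):
            """Yield (index, ok) per sample; ok=False marks a deadline miss."""
            pending = []                          # indices of un-acked requests
            for i, (request, ack) in enumerate(trace):
                if ack:
                    pending = []                  # all open requests satisfied
                if request and not ack:
                    pending.append(i)
                yield i, not any(i - t >= bound for t in pending)

        trace = [(True, False), (False, False), (False, True),
                 (True, False), (False, False), (False, False), (False, False)]
        print([ok for _, ok in response_monitor(trace)])   # last sample: False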

  5. Spinoff 2013

    NASA Technical Reports Server (NTRS)

    2014-01-01

    Topics covered include: Innovative Software Tools Measure Behavioral Alertness; Miniaturized, Portable Sensors Monitor Metabolic Health; Patient Simulators Train Emergency Caregivers; Solar Refrigerators Store Life-Saving Vaccines; Monitors Enable Medication Management in Patients' Homes; Handheld Diagnostic Device Delivers Quick Medical Readings; Experiments Result in Safer, Spin-Resistant Aircraft; Interfaces Visualize Data for Airline Safety, Efficiency; Data Mining Tools Make Flights Safer, More Efficient; NASA Standards Inform Comfortable Car Seats; Heat Shield Paves the Way for Commercial Space; Air Systems Provide Life Support to Miners; Coatings Preserve Metal, Stone, Tile, and Concrete; Robots Spur Software That Lends a Hand; Cloud-Based Data Sharing Connects Emergency Managers; Catalytic Converters Maintain Air Quality in Mines; NASA-Enhanced Water Bottles Filter Water on the Go; Brainwave Monitoring Software Improves Distracted Minds; Thermal Materials Protect Priceless, Personal Keepsakes; Home Air Purifiers Eradicate Harmful Pathogens; Thermal Materials Drive Professional Apparel Line; Radiant Barriers Save Energy in Buildings; Open Source Initiative Powers Real-Time Data Streams; Shuttle Engine Designs Revolutionize Solar Power; Procedure-Authoring Tool Improves Safety on Oil Rigs; Satellite Data Aid Monitoring of Nation's Forests; Mars Technologies Spawn Durable Wind Turbines; Programs Visualize Earth and Space for Interactive Education; Processor Units Reduce Satellite Construction Costs; Software Accelerates Computing Time for Complex Math; Simulation Tools Prevent Signal Interference on Spacecraft; Software Simplifies the Sharing of Numerical Models; Virtual Machine Language Controls Remote Devices; Micro-Accelerometers Monitor Equipment Health; Reactors Save Energy, Costs for Hydrogen Production; Cameras Monitor Spacecraft Integrity to Prevent Failures; Testing Devices Garner Data on Insulation Performance; Smart Sensors Gather Information for Machine Diagnostics; Oxygen Sensors Monitor Bioreactors and Ensure Health and Safety; Vision Algorithms Catch Defects in Screen Displays; and Deformable Mirrors Capture Exoplanet Data, Reflect Lasers.

  6. Share Repository Framework: Component Specification and Ontology

    DTIC Science & Technology

    2008-04-23

    Palantir Technologies has created one such software application to support the DoD intelligence community by providing robust capabilities for...managing data from various sources. The Palantir tool is based on user-defined ontologies and supports multiple representation and analysis tools

  7. Should Secondary Schools Buy Local Area Networks?

    ERIC Educational Resources Information Center

    Hyde, Hartley

    1986-01-01

    The advantages of microcomputer networks include resource sharing, multiple user communications, and integrating data processing and office automation. This article nonetheless favors stand-alone computers for Australian secondary school classrooms because of unreliable hardware, software design, and copyright problems, and individual progress…

  8. 7 CFR 4280.3 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Regulations of the Department of Agriculture (Continued) RURAL BUSINESS-COOPERATIVE SERVICE AND RURAL..., fish, or birds, either for fiber, food for human consumption, or livestock feed. Business Incubator. A facility in which small businesses can share premises, support staff, computers, software or hardware...

  9. 15 CFR 295.25 - Special rule for the valuation of transfers between separately-owned joint venture members.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... goods, including computer software, and services provided by the transferor related to the maintenance... non-Federal share of the total cost of the joint research and development program. (c) Definition. The...

  10. 15 CFR 295.25 - Special rule for the valuation of transfers between separately-owned joint venture members.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... goods, including computer software, and services provided by the transferor related to the maintenance... non-Federal share of the total cost of the joint research and development program. (c) Definition. The...

  11. 15 CFR 295.25 - Special rule for the valuation of transfers between separately-owned joint venture members.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... goods, including computer software, and services provided by the transferor related to the maintenance... non-Federal share of the total cost of the joint research and development program. (c) Definition. The...

  12. 15 CFR 295.25 - Special rule for the valuation of transfers between separately-owned joint venture members.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... goods, including computer software, and services provided by the transferor related to the maintenance... non-Federal share of the total cost of the joint research and development program. (c) Definition. The...

  13. In-vivo Fourier domain optical coherence tomography as a new tool for investigation of vasodynamics in the mouse model.

    PubMed

    Meissner, Sven; Müller, Gregor; Walther, Julia; Morawietz, Henning; Koch, Edmund

    2009-01-01

    In-vivo imaging of the vascular system can provide novel insight into the dynamics of vasoconstriction and vasodilation. Fourier domain optical coherence tomography (FD-OCT) is an optical, noncontact imaging technique based on interferometry of short-coherence near-infrared light with axial resolution of less than 10 microm. In this study, we apply FD-OCT as an in-vivo imaging technique to investigate blood vessels in their anatomical context using temporally resolved image stacks. Our chosen model system is the murine saphenous artery and vein, due to their small inner vessel diameters, sensitive response to vasoactive stimuli, and advantageous anatomical position. The vascular function of male wild-type mice (C57BL/6) is determined at the ages of 6 and 20 weeks. Vasoconstriction is analyzed in response to dermal application of potassium (K(+)), and vasodilation in response to sodium nitroprusside (SNP). Vasodynamics are quantified from time series (75 sec, 4 frames per sec, 330 x 512 pixels per frame) of cross sectional images that are analyzed by semiautomated image processing software. The morphology of the saphenous artery and vein is determined by 3-D image stacks of 512 x 512 x 512 pixels. Using the FD-OCT technique, we are able to demonstrate age-dependent differences in vascular function and vasodynamics.

  14. MULTILEVEL ISCHEMIA IN DISORGANIZATION OF THE RETINAL INNER LAYERS ON PROJECTION-RESOLVED OPTICAL COHERENCE TOMOGRAPHY ANGIOGRAPHY.

    PubMed

    Onishi, Alex C; Ashraf, Mohammed; Soetikno, Brian T; Fawzi, Amani A

    2018-04-10

    To examine the relationship between ischemia and disorganization of the retinal inner layers (DRIL). Cross-sectional retrospective study of 20 patients (22 eyes) with diabetic retinopathy presenting to a tertiary academic referral center, who had DRIL on structural optical coherence tomography (OCT) using Spectralis HRA + OCT (Heidelberg Engineering, Heidelberg, Germany) and OCT angiography with XR Avanti (Optovue Inc, Fremont, CA) on the same day. Optical coherence tomography angiography images were further processed to remove flow signal projection artifacts using a software algorithm adapted from recent studies. Retinal capillary perfusion in the superficial capillary plexuses, middle capillary plexuses, and deep capillary plexuses, as well as integrity of the photoreceptor lines on OCT was compared in areas with DRIL to control areas without DRIL in the same eye. Qualitative assessment of projection-resolved OCT angiography of eyes with DRIL on structural OCT demonstrated significant perfusion deficits compared with adjacent control areas (P < 0.001). Most lesions (85.7%) showed superimposed superficial capillary plexus and/or middle capillary plexus nonperfusion in addition to deep capillary plexus nonflow. Areas of DRIL were significantly associated with photoreceptor disruption (P = 0.035) compared with adjacent DRIL-free areas. We found that DRIL is associated with multilevel retinal capillary nonperfusion, suggesting an important role for ischemia in this OCT phenotype.

  15. The Effects of Block Size on the Performance of Coherent Caches in Shared-Memory Multiprocessors

    DTIC Science & Technology

    1993-05-01

    increase with the bandwidth and latency. For those applications with poor spatial locality, the best choice of cache line size is determined by the...observation was used in the design of two schemes: LimitLESS di- rectories and Tag caches. LimitLESS directories [15] were designed for the ALEWIFE...small packets may be used to avoid network congestion. The most important factor influencing the choice of cache line size for a multipro- cessor is the

  16. Continuous variable quantum cryptography: beating the 3 dB loss limit.

    PubMed

    Silberhorn, Ch; Ralph, T C; Lütkenhaus, N; Leuchs, G

    2002-10-14

    We demonstrate that secure quantum key distribution systems based on continuous variable implementations can operate beyond the apparent 3 dB loss limit that is implied by the beam splitting attack. The loss limit was established for standard minimum uncertainty states such as coherent states. We show that, by an appropriate postselection mechanism, we can enter a region where Eve's knowledge on Alice's key falls behind the information shared between Alice and Bob, even in the presence of substantial losses.
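
    For orientation (standard background rather than a quotation from the paper): with direct reconciliation over a channel of transmission T, a beam-splitting eavesdropper holds the lost fraction 1 - T, so a positive key in the Gaussian setting requires

        I_{AB} > I_{AE} \quad\Longleftrightarrow\quad T > \tfrac{1}{2} \quad (\text{i.e. channel loss} < 3\,\mathrm{dB}),

    and the postselection described above restores the advantage I_AB > I_AE at higher loss by discarding the channel uses on which Eve gains the most.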

  17. Control code for laboratory adaptive optics teaching system

    NASA Astrophysics Data System (ADS)

    Jin, Moonseob; Luder, Ryan; Sanchez, Lucas; Hart, Michael

    2017-09-01

    By sensing and compensating wavefront aberration, adaptive optics (AO) systems have proven themselves crucial in large astronomical telescopes, retinal imaging, and holographic coherent imaging. Commercial AO systems for laboratory use are now available in the market. One such is the ThorLabs AO kit built around a Boston Micromachines deformable mirror. However, there are limitations in applying these systems to research and pedagogical projects since the software is written with limited flexibility. In this paper, we describe a MATLAB-based software suite to interface with the ThorLabs AO kit by using the MATLAB Engine API and Visual Studio. The software is designed to offer complete access to the wavefront sensor data, through the various levels of processing, to the command signals to the deformable mirror and fast steering mirror. In this way, through a MATLAB GUI, an operator can experiment with every aspect of the AO system's functioning. This is particularly valuable for tests of new control algorithms as well as to support student engagement in an academic environment. We plan to make the code freely available to the community.

  18. Design and performance evaluation of an OpenFlow-based control plane for software-defined elastic optical networks with direct-detection optical OFDM (DDO-OFDM) transmission.

    PubMed

    Liu, Lei; Peng, Wei-Ren; Casellas, Ramon; Tsuritani, Takehiro; Morita, Itsuro; Martínez, Ricardo; Muñoz, Raül; Yoo, S J B

    2014-01-13

    Optical Orthogonal Frequency Division Multiplexing (O-OFDM), which transmits high speed optical signals using multiple spectrally overlapped lower-speed subcarriers, is a promising candidate for supporting future elastic optical networks. In contrast to previous works which focus on Coherent Optical OFDM (CO-OFDM), in this paper, we consider the direct-detection optical OFDM (DDO-OFDM) as the transport technique, which leads to simpler hardware and software realizations, potentially offering a low-cost solution for elastic optical networks, especially in metro networks, and short or medium distance core networks. Based on this network scenario, we design and deploy a software-defined networking (SDN) control plane enabled by extending OpenFlow, detailing the network architecture, the routing and spectrum assignment algorithm, OpenFlow protocol extensions and the experimental validation. To the best of our knowledge, it is the first time that an OpenFlow-based control plane is reported and its performance is quantitatively measured in an elastic optical network with DDO-OFDM transmission.
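
    The paper's routing and spectrum assignment (RSA) algorithm is not detailed in this abstract; as a hedged sketch of the common first-fit spectrum assignment step on a candidate path (the slot bitmaps and demand size are illustrative), one might write:

        # Sketch of first-fit spectrum assignment: find the lowest block of
        # contiguous slots that is free on every link of the path (illustrative).
        def first_fit(link_bitmaps, demand_slots):
            """link_bitmaps: one list per link; True = slot free on that link."""
            n_slots = len(link_bitmaps[0])
            for start in range(n_slots - demand_slots + 1):
                window = range(start, start + demand_slots)
                if all(all(bm[s] for s in window) for bm in link_bitmaps):
                    return start                  # lowest feasible start slot
            return None                           # blocked: no spectrum found

        links = [[True]*4 + [False] + [True]*3,
                 [False] + [True]*7]
        print(first_fit(links, 3))                # -> 1 (slots 1-3 free on both)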

  19. Diagnosis of glaucoma and detection of glaucoma progression using spectral domain optical coherence tomography.

    PubMed

    Grewal, Dilraj S; Tanna, Angelo P

    2013-03-01

    With the rapid adoption of spectral domain optical coherence tomography (SDOCT) in clinical practice and the recent advances in software technology, there is a need for a review of the literature on glaucoma detection and progression analysis algorithms designed for the commercially available instruments. Peripapillary retinal nerve fiber layer (RNFL) thickness and macular thickness, including segmental macular thickness calculation algorithms, have been demonstrated to be repeatable and reproducible, and have a high degree of diagnostic sensitivity and specificity in discriminating between healthy and glaucomatous eyes across the glaucoma continuum. Newer software capabilities such as glaucoma progression detection algorithms provide an objective analysis of longitudinally obtained structural data that enhances our ability to detect glaucomatous progression. RNFL measurements obtained with SDOCT appear more sensitive than time domain OCT (TDOCT) for glaucoma progression detection; however, agreement with the assessments of visual field progression is poor. Over the last few years, several studies have been performed to assess the diagnostic performance of SDOCT structural imaging and its validity in assessing glaucoma progression. Most evidence suggests that SDOCT performs similarly to TDOCT for glaucoma diagnosis; however, SDOCT may be superior for the detection of early stage disease. With respect to progression detection, SDOCT represents an important technological advance because of its improved resolution and repeatability. Advancements in RNFL thickness quantification, segmental macular thickness calculation and progression detection algorithms, when used correctly, may help to improve our ability to diagnose and manage glaucoma.

  20. En-face imaging of the ellipsoid zone in the retina from optical coherence tomography B-scans

    NASA Astrophysics Data System (ADS)

    Holmes, T.; Larkin, S.; Downing, M.; Csaky, K.

    2015-03-01

    It is generally believed that photoreceptor integrity is related to the ellipsoid zone appearance in optical coherence tomography (OCT) B-scans. Algorithms and software were developed for viewing and analyzing the ellipsoid zone. The software performs the following: (a) automated ellipsoid zone isolation in the B-scans; (b) en-face viewing of the ellipsoid-zone reflectance; (c) alignment and overlay of (b) onto reflectance images of the retina; and (d) alignment and overlay of (c) with microperimetry sensitivity points. Dataset groups were compared from normal and dry age-related macular degeneration (DAMD) subjects. Scalar measurements for correlation against condition included the mean and standard deviation of the ellipsoid zone's reflectance. The image-processing techniques for automatically finding the ellipsoid zone are based upon a calculation of optical flow which tracks the edges of laminated structures across an image. Statistical significance was shown in T-tests of these measurements with the population pools separated as normal and DAMD subjects. A display of en-face ellipsoid-zone reflectance shows a clear and recognizable difference between any of the normal and DAMD subjects, in that they show generally uniform and nonuniform reflectance, respectively, over the region near the macula. Regions surrounding points of low microperimetry (μP) sensitivity have nonregular and lower levels of ellipsoid-zone reflectance nearby. These findings support the idea that the photoreceptor integrity could be affecting both the ellipsoid-zone reflectance and the sensitivity measurements.
