Sample records for technology we-net task

  1. Using O*NET Based Higher Education Job Descriptions for Resume Development

    ERIC Educational Resources Information Center

    Manzi, P. A.; Roe, J.; Pierre-Louis, D.

    2011-01-01

    The purpose of this three-part article is to illustrate to career development professionals, and to students who are graduates of higher education MS and Ed.D/Ph.D programs, how to use the O*NET to develop an effective resume. The O*NET provides detailed information about work tasks, knowledge and skills, and, in some titles, tools and technology where…

  2. Assessing task-technology fit in a PACS upgrade: do users' and developers' appraisals converge?

    PubMed

    Lepanto, Luigi; Sicotte, Claude; Lehoux, Pascale

    2011-12-01

    The purpose of this study was to measure users' perceived benefits of a picture archiving and communication system (PACS) upgrade, and compare their responses to those predicted by developers. The Task-Technology Fit (TTF) model served as the theoretical framework to study the relation between TTF, utilization, and perceived benefits. A self-administered survey was distributed to radiologists working in a university hospital undergoing a PACS upgrade. Four variables were measured: impact, utilization, TTF, and perceived net benefits. The radiologists were divided into subgroups according to their utilization profiles. Analysis of variance was performed and the hypotheses were tested with regression analysis. Interviews were conducted with developers involved in the PACS upgrade who were asked to predict impact and TTF. Users identified only a moderate fit between the PACS enhancements and their tasks, while developers predicted a high level of TTF. The combination of a moderate fit and an underestimation of the potential impact of changes in the PACS led to a low score for perceived net benefits. Results varied significantly among user subgroups. Globally, the data support the hypotheses that TTF predicts utilization and perceived net benefits, but not that utilization predicts perceived net benefits. TTF is a valid tool to assess perceived benefits, but it is important to take into account the characteristics of users. In the context of a technology that is rapidly evolving, there needs to be an alignment of what users perceive as a good fit and the functionality developers incorporate into their products.
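
    The hypotheses tested here (TTF predicts utilization and perceived net benefits) amount to simple regressions of survey scores. A minimal Python sketch on synthetic data; the sample size, scales, and effect sizes below are illustrative assumptions, not the study's data:

        # Regression tests of the Task-Technology Fit (TTF) hypotheses on
        # synthetic survey data; all numbers are illustrative only.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 120                                    # hypothetical radiologists
        ttf = rng.normal(3.2, 0.6, n)              # Likert-style TTF scores
        utilization = 0.8 * ttf + rng.normal(0, 0.5, n)
        benefits = 0.6 * ttf + 0.1 * utilization + rng.normal(0, 0.5, n)

        for name, y in [("utilization", utilization), ("net benefits", benefits)]:
            slope, intercept = np.polyfit(ttf, y, 1)   # least-squares fit
            r = np.corrcoef(ttf, y)[0, 1]
            print(f"TTF -> {name}: slope={slope:.2f}, r={r:.2f}")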

  3. Facilitating Autonomy and Creativity in Second Language Learning through Cyber-Tasks, Hyperlinks and Net-Surfing

    ERIC Educational Resources Information Center

    Akinwamide, T. K.; Adedara, O. G.

    2012-01-01

    The digitization of academic interactions and collaborations in today's technologically conscious world is making the collaboration between technology and pedagogy in the teaching and learning process display logical, systematic reasoning rather than the usual stereotyped informed decisions. This simply means, pedagogically, learning…

  4. Buying Power

    ERIC Educational Resources Information Center

    Villano, Matt

    2006-01-01

    Technology product procurement can be a daunting task for a college or university--especially a smaller institution--to accomplish alone. Perhaps this is why schools are tackling it by banding together. When it comes to purchasing technology, a little help from friends is the key to economies of scale, which frequently net schools the best…

  5. Heat pump concepts for nZEB - Technology developments, design tools and testing of heat pump systems for nZEB in the USA: Country report IEA HPT Annex 40 Task 2, Task 3 and Task 4 of the USA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baxter, Van D.; Payne, W. Vance; Ling, Jiazhen

    The IEA HPT Annex 40 "Heat pump concepts for Nearly Zero Energy Buildings" deals with the application of heat pumps as a core component of the HVAC system for nearly or net zero energy buildings (nZEB). This report covers Task 2, on system comparison and optimisation, and Task 3, dedicated to the development of adapted technologies for nZEB and to field monitoring results of heat pump systems in nZEB. Three US institutions are involved and have worked on the following projects. The Oak Ridge National Laboratory (ORNL) summarizes development activities, through the field demonstration stage, for several integrated heat pump (IHP) systems: electric ground-source (GS-IHP) and air-source (AS-IHP) versions and an engine-driven AS-IHP version. The first commercial GS-IHP product was introduced to the market in December 2012. This work is a contribution to Task 3 of the Annex. The University of Maryland contributes a software development project to Task 2 of the Annex. The software ThermCom evaluates occupied-space thermal comfort conditions, accounting for all radiative and convective heat transfer effects as well as local air properties. The National Institute of Standards and Technology (NIST) is working on a field study at the NIST Net Zero Energy Residential Test Facility (NZERTF). This residential building was constructed on the NIST campus and officially opened in summer 2013. During the first year, between July 2013 and June 2014, baseline performance of the NZERTF was monitored under a simulated occupancy protocol. The house was equipped with an air-to-air heat pump which included a dedicated dehumidification operating mode. Outdoor conditions, internal loads and modes of heat pump operation were monitored. Field study results with respect to heat pump operation are reported and recommendations on heat pump optimization for a net zero energy building are provided. This work is a contribution to Task 3 of the Annex.

  6. Using Virtual Reality for Task-Based Exercises in Teaching Non-Traditional Students of German

    ERIC Educational Resources Information Center

    Libbon, Stephanie

    2004-01-01

    Using task-based exercises that required web searches and online activities, this course introduced non-traditional students to the sights and sounds of the German culture and language and simultaneously to computer technology. Through partner work that required negotiation of the net as well as of the language, these adult beginning German…

  7. Medical Situational Awareness in Theater Advanced Concept Technology Demonstration Project Proposal

    DTIC Science & Technology

    2004-06-01

    …making it an impossible task to sort, understand, and generate actionable knowledge within operational timeframes. There is a need for greater medical situational awareness in theater and for greater integration of theater medical information into the net-centric… ForceNet…

  8. College and the Digital Generation: Assessing and Training Students for the Technological Demands of College by Exploring Relationships between Computer Self-Efficacy and Computer Proficiency

    ERIC Educational Resources Information Center

    Morris, Kathleen M.

    2010-01-01

    Today's college students are often labeled the "Net Generation" and assumed to be computer savvy and technological minded. Exposure to and use of technologies can increase self-efficacy regarding ability to complete desired computer tasks, but students arrive on campuses unable to pass computer proficiency exams. This is concerning because some…

  9. Mobile robot sense net

    NASA Astrophysics Data System (ADS)

    Konolige, Kurt G.; Gutmann, Steffen; Guzzoni, Didier; Ficklin, Robert W.; Nicewarner, Keith E.

    1999-08-01

    Mobile robot hardware and software are developing to the point where interesting applications for groups of such robots can be contemplated. We envision a set of mobots acting to map and perform surveillance or other tasks within an indoor environment (the Sense Net). A typical application of the Sense Net would be to detect survivors in buildings damaged by earthquake or other disaster, where human searchers would be put at risk. As a team, the Sense Net could reconnoiter a set of buildings faster, more reliably, and more comprehensively than an individual mobot. The team, for example, could dynamically form subteams to perform tasks that cannot be done by individual robots, such as measuring the range to a distant object by forming a long-baseline stereo sensor from a pair of mobots. In addition, the team could automatically reconfigure itself to handle contingencies such as disabled mobots. This paper is a report of our current progress in developing the Sense Net, after the first year of a two-year project. In our approach, each mobot has sufficient autonomy to perform several tasks, such as mapping unknown areas, navigating to specific positions, and detecting, tracking, characterizing, and classifying human and vehicular activity. We detail how some of these tasks are accomplished, and how the mobot group is tasked.
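
    The long-baseline stereo idea is plain triangulation: two mobots a known distance apart each measure a bearing to the target, and the law of sines gives the range. A small sketch with made-up numbers:

        # Range to a distant object from two bearing measurements taken by a
        # pair of robots separated by a known baseline (triangulation).
        import math

        baseline = 10.0             # meters between the two mobots (assumed)
        alpha = math.radians(80.0)  # bearing at mobot A, w.r.t. the baseline
        beta = math.radians(75.0)   # bearing at mobot B, w.r.t. the baseline

        gamma = math.pi - alpha - beta          # angle at the target
        range_from_a = baseline * math.sin(beta) / math.sin(gamma)
        print(f"estimated range from mobot A: {range_from_a:.1f} m")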

  10. Playing the Metadata Game: Technologies and Strategies Used by Climate Diagnostics Center for Cataloging and Distributing Climate Data.

    NASA Astrophysics Data System (ADS)

    Schweitzer, R. H.

    2001-05-01

    The Climate Diagnostics Center maintains a collection of gridded climate data primarily for use by local researchers. Because this data is available on fast digital storage and because it has been converted to netCDF using a standard metadata convention (called COARDS), we recognize that this data collection is also useful to the community at large. At CDC we try to use technology and metadata standards to reduce our costs associated with making these data available to the public. The World Wide Web has been an excellent technology platform for meeting that goal. Specifically, we have developed Web-based user interfaces that allow users to search, plot and download subsets from the data collection. We have also been exploring use of the Pacific Marine Environmental Laboratory's Live Access Server (LAS) as an engine for this task. This would result in further savings by allowing us to concentrate on customizing the LAS where needed, rather than developing and maintaining our own system. One such customization currently under development is the use of Java Servlets and JavaServer Pages in conjunction with a metadata database to produce a hierarchical user interface to LAS. In addition to these Web-based user interfaces, all of our data are available via the Distributed Oceanographic Data System (DODS). This allows other sites using LAS and individuals using DODS-enabled clients to use our data as if it were a local file. All of these technology systems are driven by metadata. When we began to create netCDF files, we collaborated with several other agencies to develop a netCDF convention (COARDS) for metadata. At CDC we have extended that convention to incorporate additional metadata elements to make the netCDF files as self-describing as possible. Part of the local metadata is a set of controlled names for the variable, level in the atmosphere and ocean, statistic, and data set for each netCDF file. To allow searching and easy reorganization of these metadata, we loaded the metadata from the netCDF files into a MySQL database. The combination of the MySQL database and the controlled names makes it possible to automate the construction of user interfaces and standard-format metadata descriptions, like Federal Geographic Data Committee (FGDC) and Directory Interchange Format (DIF). These standard descriptions also include an association between our controlled names and standard keywords such as those developed by the Global Change Master Directory (GCMD). This talk will give an overview of each of these technology and metadata standards as it applies to work at the Climate Diagnostics Center. The talk will also discuss the pros and cons of each approach and areas for future development.
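
    The harvesting step described above is straightforward to sketch: read COARDS-style attributes out of each netCDF file and load them into a SQL table for searching. In this illustrative Python sketch, sqlite3 stands in for CDC's MySQL database, and the file paths and attribute names are assumptions:

        import glob
        import sqlite3
        import netCDF4  # third-party: pip install netCDF4

        conn = sqlite3.connect("catalog.db")
        conn.execute("""CREATE TABLE IF NOT EXISTS catalog
                        (path TEXT, variable TEXT, level TEXT, dataset TEXT)""")

        for path in glob.glob("data/*.nc"):        # hypothetical file layout
            with netCDF4.Dataset(path) as nc:
                dataset = getattr(nc, "title", "unknown")   # global attribute
                for name, var in nc.variables.items():
                    level = getattr(var, "level_desc", "")  # assumed convention
                    conn.execute("INSERT INTO catalog VALUES (?, ?, ?, ?)",
                                 (path, name, level, dataset))
        conn.commit()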

  11. Relation between SM-covers and SM-decompositions of Petri nets

    NASA Astrophysics Data System (ADS)

    Karatkevich, Andrei; Wiśniewski, Remigiusz

    2015-12-01

    The task of finding, for a given Petri net, a set of sequential components able to represent together the behavior of the net arises often in the formal analysis of Petri nets and in applications of Petri nets to logical control. Such a task comes in two different variants: obtaining a Petri net cover or a decomposition. A Petri net cover supposes that a set of subnets of the given net is selected, while the sequential nets forming a decomposition may have additional places, which do not belong to the decomposed net. The paper discusses the difference and the relations between the two tasks and their results.

  12. Development of cost-effective surfactant flooding technology, Quarterly report, October 1995--December 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pope, G.A.; Sepehrnoori, K.

    1995-12-31

    The objective of this research is to develop cost-effective surfactant flooding technology by using simulation studies to evaluate and optimize alternative design strategies, taking into account reservoir characteristics, process chemistry, and process design options such as horizontal wells. Task 1 is the development of an improved numerical method for our simulator that will enable us to solve a wider class of these difficult simulation problems accurately and affordably. Task 2 is the application of this simulator to the optimization of surfactant flooding to reduce its risk and cost. In this quarter, we have continued working on Task 2 to optimize surfactant flooding design and have added economic analysis to the optimization process. An economic model was developed using a spreadsheet and the discounted cash flow (DCF) method of economic analysis. The model was designed specifically for a domestic onshore surfactant flood and has been used to economically evaluate previous work that used a technical approach to optimization. The DCF model outputs common economic decision-making criteria, such as net present value (NPV), internal rate of return (IRR), and payback period.
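
    The DCF criteria named in the abstract are easy to make concrete. A sketch on a hypothetical cash-flow stream (year-0 outlay, then net revenues); the figures and the 10% discount rate are illustrative assumptions, not the report's numbers:

        # Net present value and payback period for a hypothetical onshore
        # surfactant flood; cash flows are purely illustrative.
        cash_flows = [-5_000_000, 1_200_000, 1_500_000, 1_800_000,
                      1_800_000, 1_500_000]
        rate = 0.10  # assumed annual discount rate

        npv = sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

        def payback_period(flows):
            """First year in which cumulative cash flow turns non-negative."""
            total = 0.0
            for year, cf in enumerate(flows):
                total += cf
                if total >= 0:
                    return year
            return None

        print(f"NPV at {rate:.0%}: ${npv:,.0f}")
        print(f"payback period: {payback_period(cash_flows)} years")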

  13. Precision Casting via Advanced Simulation and Manufacturing

    NASA Technical Reports Server (NTRS)

    1997-01-01

    A two-year program was conducted to develop and commercially implement selected casting manufacturing technologies to enable significant reductions in the costs of castings, increase the complexity and dimensional accuracy of castings, and reduce the development times for delivery of high quality castings. The industry-led R&D project was cost shared with NASA's Aerospace Industry Technology Program (AITP). The Rocketdyne Division of Boeing North American, Inc. served as the team lead with participation from Lockheed Martin, Ford Motor Company, Howmet Corporation, PCC Airfoils, General Electric, UES, Inc., University of Alabama, Auburn University, Robinson, Inc., Aracor, and NASA-LeRC. The technical effort was organized into four distinct tasks, whose accomplishments are reported herein. Task 1.0 developed advanced simulation technology for core molding. Ford headed up this task. On this program, a specialized core machine was designed and built. Task 2.0 focused on intelligent process control for precision core molding. Howmet led this effort. The primary focus of these experimental efforts was to characterize the process parameters that have a strong impact on dimensional control issues of injection molded cores during their fabrication. Task 3.0 developed and applied rapid prototyping to produce near-net-shape castings. Rocketdyne was responsible for this task. CAD files were generated using reverse engineering, rapid prototype patterns were fabricated using SLS and SLA, and castings were produced and evaluated. Task 4.0 was aimed at technology transfer. Rocketdyne coordinated this task. Casting-related technology, explored and evaluated in the first three tasks of this program, was implemented into manufacturing processes.

  14. Squeeze-SegNet: a new fast deep convolutional neural network for semantic segmentation

    NASA Astrophysics Data System (ADS)

    Nanfack, Geraldin; Elhassouny, Azeddine; Oulad Haj Thami, Rachid

    2018-04-01

    Recent research on deep convolutional neural networks has focused on improving accuracy, providing significant advances. Although initially limited to classification tasks, with contributions from the scientific communities embarking on this field they have become very useful in higher-level tasks such as object detection and pixel-wise semantic segmentation. Brilliant ideas in the field of semantic segmentation with deep learning have advanced the state of the art in accuracy; however, these architectures are difficult to apply in embedded systems, as is the case for autonomous driving. We present a new deep fully convolutional neural network for pixel-wise semantic segmentation, which we call Squeeze-SegNet. The architecture is based on the encoder-decoder style. We use a SqueezeNet-like encoder and a decoder formed by our proposed squeeze-decoder module and an upsample layer using downsample indices as in SegNet, and we add a deconvolution layer to provide the final multi-channel feature map. On datasets like CamVid or Cityscapes, our net achieves SegNet-level accuracy with about 10 times fewer parameters than SegNet.
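
    A paraphrase of the building blocks in PyTorch: a SqueezeNet-style "fire" block plus SegNet-style pooling indices for upsampling. This is a sketch of the general design, not the authors' exact Squeeze-SegNet layers:

        import torch
        import torch.nn as nn

        class Fire(nn.Module):
            """SqueezeNet-style block: 1x1 squeeze, then parallel 1x1/3x3 expand."""
            def __init__(self, cin, squeeze, expand):
                super().__init__()
                self.squeeze = nn.Conv2d(cin, squeeze, 1)
                self.e1 = nn.Conv2d(squeeze, expand, 1)
                self.e3 = nn.Conv2d(squeeze, expand, 3, padding=1)
                self.relu = nn.ReLU(inplace=True)

            def forward(self, x):
                x = self.relu(self.squeeze(x))
                return torch.cat([self.relu(self.e1(x)),
                                  self.relu(self.e3(x))], dim=1)

        x = torch.randn(1, 64, 90, 120)
        feats = Fire(64, squeeze=16, expand=64)(x)     # 128 output channels
        pooled, idx = nn.MaxPool2d(2, return_indices=True)(feats)
        unpooled = nn.MaxUnpool2d(2)(pooled, idx)      # decoder reuses indices
        print(unpooled.shape)                          # [1, 128, 90, 120]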

  15. Using NetCloak to develop server-side Web-based experiments without writing CGI programs.

    PubMed

    Wolfe, Christopher R; Reyna, Valerie F

    2002-05-01

    Server-side experiments use the Web server, rather than the participant's browser, to handle tasks such as random assignment, eliminating inconsistencies with Java and other client-side applications. Heretofore, experimenters wishing to create server-side experiments have had to write programs to create common gateway interface (CGI) scripts in programming languages such as Perl and C++. NetCloak uses simple, HTML-like commands to create CGIs. We used NetCloak to implement an experiment on probability estimation. Measurements of time on task and participants' IP addresses assisted quality control. Without prior training, in less than 1 month, we were able to use NetCloak to design and create a Web-based experiment and to help graduate students create three Web-based experiments of their own.

  16. Design and Validation of an Open-Source, Partial Task Trainer for Endonasal Neuro-Endoscopic Skills Development: Indian Experience.

    PubMed

    Singh, Ramandeep; Baby, Britty; Damodaran, Natesan; Srivastav, Vinkle; Suri, Ashish; Banerjee, Subhashis; Kumar, Subodh; Kalra, Prem; Prasad, Sanjiva; Paul, Kolin; Anand, Sneh; Kumar, Sanjeev; Dhiman, Varun; Ben-Israel, David; Kapoor, Kulwant Singh

    2016-02-01

    Box trainers are ideal simulators, given that they are inexpensive, accessible, and of appropriate fidelity. We report the development and validation of an open-source, partial-task simulator that teaches the fundamental skills necessary for endonasal skull-base neuro-endoscopic surgery. We defined the Neuro-Endo-Trainer (NET) SkullBase-Task-GraspPickPlace with an activity area by analyzing the computed tomography scans of 15 adult patients with sellar, suprasellar, and parasellar tumors. Four groups of participants (Group E, n = 4: expert neuroendoscopists; Group N, n = 19: novice neurosurgeons; Group R, n = 11: neurosurgery residents with multiple iterations; and Group T, n = 27: neurosurgery residents with a single iteration) performed grasp, pick, and place tasks using the NET and were graded on task completion time and skills assessment scale score. Group E had lower task completion times and greater skills assessment scale scores than both Groups N and R (P ≤ 0.03, 0.001). The performance of Groups N and R was found to be equivalent; in self-assessing neuro-endoscopic skill, the participants in these groups were found to have equally low pretraining scores (4/10) with significant improvement shown after NET simulation (6 and 7, respectively). Angled scopes resulted in decreased scores with tilted plates compared with straight plates (30° P ≤ 0.04, 45° P ≤ 0.001). With tilted plates, decreased scores were observed when we compared the 0° with the 45° endoscope (right, P ≤ 0.008; left, P ≤ 0.002). The NET, a face- and construct-valid open-source partial-task neuroendoscopic trainer, was designed. Presimulation novice neurosurgeons and neurosurgical residents were found to have insufficient skills and preparation to practice neuro-endoscopy. Plate tilt and endoscope angle were shown to be important factors in participant performance. The NET was found to be a useful partial-task trainer for skill building in neuro-endoscopy. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Universal Signal Conditioning Amplifier

    NASA Technical Reports Server (NTRS)

    Kinney, Frank

    1997-01-01

    The Technological Research and Development Authority (TRDA) and NASA-KSC entered into a cooperative agreement in March of 1994 to achieve the utilization and commercialization of a technology development benefiting both the Space Program and U.S. industry on a "dual-use" basis. The technology involved in this transfer is a new, unique Universal Signal Conditioning Amplifier (USCA) used in connection with various types of transducers. The project was initiated in partnership with I-Net Corporation, Lockheed Martin Telemetry & Instrumentation (formerly Loral Test and Information Systems) and Brevard Community College. The project consists of designing, miniaturizing, manufacturing, and testing an existing prototype of the USCA that was developed for NASA-KSC by the I-Net Corporation. The USCA is a rugged, field-installable, self- (or remotely) programmable amplifier that works in combination with a tag random access memory (RAM) attached to various types of transducers. This summary report comprises performance evaluations, TRDA partnership tasks, a project summary, project milestones, and results.

  18. The importance of using open source technologies and common standards for interoperability within eHealth: Perspectives from the Millennium Villages Project

    PubMed Central

    Borland, Rob; Barasa, Mourice; Iiams-Hauser, Casey; Velez, Olivia; Kaonga, Nadi Nina; Berg, Matt

    2013-01-01

    The purpose of this paper is to illustrate the importance of using open source technologies and common standards for interoperability when implementing eHealth systems and illustrate this through case studies, where possible. The sources used to inform this paper draw from the implementation and evaluation of the eHealth Program in the context of the Millennium Villages Project (MVP). As the eHealth Team was tasked to deploy an eHealth architecture, the Millennium Villages Global-Network (MVG-Net), across all fourteen of the MVP sites in Sub-Saharan Africa, the team recognized the need for standards and uniformity but also realized that context would be an important factor. Therefore, the team decided to utilize open source solutions. The MVP implementation of MVG-Net provides a model for those looking to implement informatics solutions across disciplines and countries. Furthermore, there are valuable lessons learned that the eHealth community can benefit from. By sharing lessons learned and developing an accessible, open-source eHealth platform, we believe that we can more efficiently and rapidly achieve the health-related and collaborative Millennium Development Goals (MDGs). PMID:22894051

  19. Three case studies of the GasNet model in discrete domains.

    PubMed

    Santos, C L; de Oliveira, P P; Husbands, P; Souza, C R

    2001-06-01

    A new neural network model, the GasNet, has recently been reported in the literature which, in addition to the traditional electrical, point-to-point communication between units, also uses communication through a diffusible chemical modulator. Here we assess the applicability of this model in three different scenarios: the XOR problem, a food-gathering task for a simulated robot, and a docking task for a virtual spaceship. All of them represent discrete domains, in contrast with the one where the GasNet was originally introduced, which had an essentially continuous nature. These scenarios are well-known benchmark problems from the literature and, since they exhibit varying degrees of complexity, they impose distinct performance demands on the GasNet. The experiments were primarily intended to better understand the model, by extending the original problem domain where the GasNet was introduced. The results reported point to some difficulties with the current GasNet model.

  20. Addressing the NETS*S in K-12 Classrooms: Implications for Teacher Education

    ERIC Educational Resources Information Center

    Niederhauser, Dale S.; Lindstrom, Denise L.; Strobel, Johannes

    2007-01-01

    The National Educational Technology Standards for Students (NETS*S) were developed to provide guidelines for effective and meaningful technology use with K-12 students. In the present study we used the NETS*S as a framework to analyze ways that teachers integrated instructional technology use and provided opportunities for their students to…

  21. Patient-centered technological assessment and monitoring of depression for low-income patients.

    PubMed

    Wu, Shinyi; Vidyanti, Irene; Liu, Pai; Hawkins, Caitlin; Ramirez, Magaly; Guterman, Jeffrey; Gross-Schulman, Sandra; Sklaroff, Laura Myerchin; Ell, Kathleen

    2014-01-01

    Depression is a significant challenge for ambulatory care because it worsens health status and outcomes, increases health care utilization and costs, and elevates suicide risk. An automatic telephonic assessment (ATA) system that links with tasks and alerts to providers may improve the quality of depression care and increase provider productivity. We used the ATA system in a trial to assess and monitor depressive symptoms of 444 safety-net primary care patients with diabetes. We assessed system properties, evaluated preliminary clinical outcomes, and estimated cost savings. The ATA system is feasible, reliable, valid, safe, and likely cost-effective for depression screening and monitoring in a low-income primary care population.

  22. Partitioning the Metabolic Cost of Human Running: A Task-by-Task Approach

    PubMed Central

    Arellano, Christopher J.; Kram, Rodger

    2014-01-01

    Compared with other species, humans can be very tractable and thus an ideal “model system” for investigating the metabolic cost of locomotion. Here, we review the biomechanical basis for the metabolic cost of running. Running has been historically modeled as a simple spring-mass system whereby the leg acts as a linear spring, storing, and returning elastic potential energy during stance. However, if running can be modeled as a simple spring-mass system with the underlying assumption of perfect elastic energy storage and return, why does running incur a metabolic cost at all? In 1980, Taylor et al. proposed the “cost of generating force” hypothesis, which was based on the idea that elastic structures allow the muscles to transform metabolic energy into force, and not necessarily mechanical work. In 1990, Kram and Taylor then provided a more explicit and quantitative explanation by demonstrating that the rate of metabolic energy consumption is proportional to body weight and inversely proportional to the time of foot-ground contact for a variety of animals ranging in size and running speed. With a focus on humans, Kram and his colleagues then adopted a task-by-task approach and initially found that the metabolic cost of running could be “individually” partitioned into body weight support (74%), propulsion (37%), and leg-swing (20%). Summing all these biomechanical tasks leads to a paradoxical overestimation of 131%. To further elucidate the possible interactions between these tasks, later studies quantified the reductions in metabolic cost in response to synergistic combinations of body weight support, aiding horizontal forces, and leg-swing-assist forces. This synergistic approach revealed that the interactive nature of body weight support and forward propulsion comprises ∼80% of the net metabolic cost of running. The task of leg-swing at most comprises ∼7% of the net metabolic cost of running and is independent of body weight support and forward propulsion. In our recent experiments, we have continued to refine this task-by-task approach, demonstrating that maintaining lateral balance comprises only 2% of the net metabolic cost of running. In contrast, arm-swing reduces the cost by ∼3%, indicating a net metabolic benefit. Thus, by considering the synergistic nature of body weight support and forward propulsion, as well as the tasks of leg-swing and lateral balance, we can account for 89% of the net metabolic cost of human running. PMID:24838747

  23. Partitioning the metabolic cost of human running: a task-by-task approach.

    PubMed

    Arellano, Christopher J; Kram, Rodger

    2014-12-01

    Compared with other species, humans can be very tractable and thus an ideal "model system" for investigating the metabolic cost of locomotion. Here, we review the biomechanical basis for the metabolic cost of running. Running has been historically modeled as a simple spring-mass system whereby the leg acts as a linear spring, storing, and returning elastic potential energy during stance. However, if running can be modeled as a simple spring-mass system with the underlying assumption of perfect elastic energy storage and return, why does running incur a metabolic cost at all? In 1980, Taylor et al. proposed the "cost of generating force" hypothesis, which was based on the idea that elastic structures allow the muscles to transform metabolic energy into force, and not necessarily mechanical work. In 1990, Kram and Taylor then provided a more explicit and quantitative explanation by demonstrating that the rate of metabolic energy consumption is proportional to body weight and inversely proportional to the time of foot-ground contact for a variety of animals ranging in size and running speed. With a focus on humans, Kram and his colleagues then adopted a task-by-task approach and initially found that the metabolic cost of running could be "individually" partitioned into body weight support (74%), propulsion (37%), and leg-swing (20%). Summing all these biomechanical tasks leads to a paradoxical overestimation of 131%. To further elucidate the possible interactions between these tasks, later studies quantified the reductions in metabolic cost in response to synergistic combinations of body weight support, aiding horizontal forces, and leg-swing-assist forces. This synergistic approach revealed that the interactive nature of body weight support and forward propulsion comprises ∼80% of the net metabolic cost of running. The task of leg-swing at most comprises ∼7% of the net metabolic cost of running and is independent of body weight support and forward propulsion. In our recent experiments, we have continued to refine this task-by-task approach, demonstrating that maintaining lateral balance comprises only 2% of the net metabolic cost of running. In contrast, arm-swing reduces the cost by ∼3%, indicating a net metabolic benefit. Thus, by considering the synergistic nature of body weight support and forward propulsion, as well as the tasks of leg-swing and lateral balance, we can account for 89% of the net metabolic cost of human running. © The Author 2014. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology. All rights reserved. For permissions please email: journals.permissions@oup.com.
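
    The percentages in these two reviews reduce to simple arithmetic, contrasting the naive task-by-task sum with the synergistic partition:

        # Naive, individually measured task costs overestimate the whole,
        # while the synergistic partition accounts for ~89% of the net cost.
        individual = {"body weight support": 74, "propulsion": 37,
                      "leg swing": 20}
        synergistic = {"support + propulsion": 80, "leg swing": 7,
                       "lateral balance": 2}

        print(sum(individual.values()), "% (paradoxical overestimate)")  # 131
        print(sum(synergistic.values()), "% accounted for")              # 89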

  24. Prioritizing health system and disease burden factors: an evaluation of the net benefit of transferring health technology interventions to different districts in Zimbabwe.

    PubMed

    Shamu, Shepherd; Rusakaniko, Simbarashe; Hongoro, Charles

    2016-01-01

    Health-care technologies (HCTs) play an important role in any country's health-care system. Zimbabwe's health-care system uses many HCTs developed in other countries. However, a number of local factors have affected the absorption and use of these technologies. We therefore set out to test the hypothesis that the net benefit regression framework (NBRF) could be a helpful benefit-testing model that enables assessment of intra-national variables in HCT transfer. We used an NBRF model to assess the benefits of transferring cost-effective technologies to different jurisdictions, using the country's 57 administrative districts to proxy different jurisdictions. For the dependent variable, we combined the cost and effectiveness ratios with the districts' per capita health expenditure. The cost and effectiveness ratios were obtained from HIV/AIDS and malaria randomized controlled trials that did either a prospective or retrospective cost-effectiveness analysis. The independent variables were district demographic and socioeconomic determinants of health. The study showed that intra-national variation resulted in different net benefits of the same health technology intervention if implemented in different districts in Zimbabwe. Population data, health data, infrastructure, demographics, and health-seeking behavior had significant effects on the net marginal benefit for the different districts. The net benefits also differed in magnitude as a result of the local factors. Net benefit testing using local data is a very useful tool for assessing the transferability and further adoption of HCTs developed elsewhere. However, adopting interventions with a positive net benefit should not be an end in itself. Information on positive or negative net benefit could also be used to ascertain either the level of future savings that a technology can realize or the level of investment needed for the particular technology to become beneficial.
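
    The NBRF idea can be sketched in a few lines: each district's net monetary benefit, nb_i = lambda * effect_i - cost_i, is regressed on local covariates. All numbers below are illustrative assumptions, not the study's data:

        import numpy as np

        rng = np.random.default_rng(1)
        n = 57                              # Zimbabwe's administrative districts
        effect = rng.normal(0.05, 0.01, n)  # per capita health gain (illustrative)
        cost = rng.normal(40.0, 8.0, n)     # per capita cost (illustrative)
        density = rng.normal(60.0, 20.0, n) # covariate: population density

        lam = 1000.0                        # willingness to pay per unit effect
        nb = lam * effect - cost            # district net monetary benefit

        X = np.column_stack([np.ones(n), density])
        beta, *_ = np.linalg.lstsq(X, nb, rcond=None)
        print(f"intercept={beta[0]:.2f}, density coefficient={beta[1]:.3f}")
        print(f"districts with positive net benefit: {(nb > 0).sum()} of {n}")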

  25. Task planning with uncertainty for robotic systems. Thesis

    NASA Technical Reports Server (NTRS)

    Cao, Tiehua

    1993-01-01

    In a practical robotic system, it is important to represent and plan sequences of operations and to be able to choose an efficient sequence from them for a specific task. During the generation and execution of task plans, different kinds of uncertainty may occur, and erroneous states need to be handled to ensure the efficiency and reliability of the system. An approach to task representation, planning, and error recovery for robotic systems is demonstrated. Our approach to task planning is based on an AND/OR net representation, which is then mapped to a Petri net representation of all feasible geometric states and associated feasibility criteria for net transitions. Task decomposition of robotic assembly plans based on this representation is performed on the Petri net, and the inheritance of the properties of liveness, safeness, and reversibility at all levels of decomposition is explored. This approach provides a framework for robust execution of tasks through the properties of traceability and viability. Uncertainty in robotic systems is modeled by local fuzzy variables, fuzzy marking variables, and global fuzzy variables, which are incorporated in fuzzy Petri nets. Analysis of properties and reasoning about uncertainty are investigated using fuzzy reasoning structures built into the net. Two applications of fuzzy Petri nets, robot task sequence planning and sensor-based error recovery, are explored. In the first application, the search space for feasible and complete task sequences with correct precedence relationships is reduced via the use of global fuzzy variables in reasoning about subgoals. In the second application, sensory verification operations are modeled by mutually exclusive transitions to reason about local and global fuzzy variables on-line and to automatically select a retry or an alternative error recovery sequence when errors occur. Task sequencing and task execution with error recovery capability for one and multiple soft components in robotic systems are investigated.
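
    The marking semantics underlying such task-plan nets is compact to state: places hold tokens, and a transition fires when every input place is sufficiently marked. A minimal sketch; the assembly-flavored names are illustrative, and the thesis's fuzzy extensions are omitted:

        marking = {"part_ready": 1, "gripper_free": 1, "part_grasped": 0}

        transitions = {
            "grasp": ({"part_ready": 1, "gripper_free": 1},  # input arcs
                      {"part_grasped": 1}),                  # output arcs
        }

        def enabled(name):
            pre, _ = transitions[name]
            return all(marking[p] >= w for p, w in pre.items())

        def fire(name):
            pre, post = transitions[name]
            for p, w in pre.items():
                marking[p] -= w
            for p, w in post.items():
                marking[p] += w

        if enabled("grasp"):
            fire("grasp")
        print(marking)  # {'part_ready': 0, 'gripper_free': 0, 'part_grasped': 1}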

  26. Web Survey Design in ASP.Net 2.0: A Simple Task with One Line of Code

    ERIC Educational Resources Information Center

    Liu, Chang

    2007-01-01

    Over the past few years, more and more companies have been investing in electronic commerce (EC) by designing and implementing Web-based applications. In the world of practice, the importance of using Web technology to reach individual customers has been presented by many researchers. This paper presents an easy way of conducting marketing…

  27. Factors shaping effective utilization of health information technology in urban safety-net clinics.

    PubMed

    George, Sheba; Garth, Belinda; Fish, Allison; Baker, Richard

    2013-09-01

    Urban safety-net clinics are considered prime targets for the adoption of health information technology innovations; however, little is known about their utilization in such safety-net settings. Current scholarship provides limited guidance on the implementation of health information technology into safety-net settings, as it typically assumes that adopting institutions have sufficient basic resources. This study addresses this gap by exploring the unique challenges urban resource-poor safety-net clinics must consider when adopting and utilizing health information technology. In-depth interviews (N = 15) were conducted with key stakeholders (clinic chief executive officers, medical directors, nursing directors, chief financial officers, and information technology directors) at four clinics to explore (a) nonhealth information technology-related clinic needs, (b) how health information technology may provide solutions, and (c) perceptions of and experiences with health information technology. Participants identified several challenges, some of which appear amenable to health information technology solutions. Also identified were requirements for effective utilization of health information technology, including physical infrastructural improvements, funding for equipment/training, creation of user groups to share health information technology knowledge/experiences, and specially tailored electronic billing guidelines. We found that despite the potential benefit that can be derived from health information technologies, the unplanned and uninformed introduction of these tools into such settings might actually create more problems than it solves. From these data, we identified a set of factors that should be considered when integrating health information technology into the existing workflows of low-resourced urban safety-net clinics in order to maximize its utilization and enhance the quality of health care in such settings.

  28. Patient-Centered Technological Assessment and Monitoring of Depression for Low-Income Patients

    PubMed Central

    Wu, Shinyi; Vidyanti, Irene; Liu, Pai; Hawkins, Caitlin; Ramirez, Magaly; Guterman, Jeffrey; Gross-Schulman, Sandra; Sklaroff, Laura Myerchin; Ell, Kathleen

    2014-01-01

    Depression is a significant challenge for ambulatory care because it worsens health status and outcomes, increases health care utilization and costs, and elevates suicide risk. An automatic telephonic assessment (ATA) system that links with tasks and alerts to providers may improve the quality of depression care and increase provider productivity. We used the ATA system in a trial to assess and monitor depressive symptoms of 444 safety-net primary care patients with diabetes. We assessed system properties, evaluated preliminary clinical outcomes, and estimated cost savings. The ATA system is feasible, reliable, valid, safe, and likely cost-effective for depression screening and monitoring in a low-income primary care population. PMID:24525531

  29. Evaluation of Alternative Field Buses for Lighting Control Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koch, Ed; Rubinstein, Francis

    2005-03-21

    The Subcontract Statement of Work consists of two major tasks. This report is the Final Report in fulfillment of the contract deliverable for Task 1. The purpose of Task 1 was to evaluate existing and emerging protocols and standards for interfacing sensors and controllers for communicating with integrated lighting control systems in commercial buildings. The detailed task description follows: Task 1. Evaluate alternative sensor/field buses. The objective of this task is to evaluate existing and emerging standards for interfacing sensors and controllers for communicating with integrated lighting control systems in commercial buildings. The protocols to be evaluated will include at least: (1) 1-Wire Net, (2) DALI, (3) MODBUS (or an appropriate substitute such as EIB), and (4) ZigBee. The evaluation will include a comparative matrix for comparing the technical performance features of the different alternative systems. The performance features to be considered include: (1) directionality and network speed, (2) error control, (3) latency times, (4) allowable cable voltage drop, (5) topology, and (6) polarization. Specifically, Subcontractor will: (1) Analyze the proposed network architecture and identify potential problems that may require further research and specification. (2) Help identify and specify additional software and hardware components that may be required for the communications network to operate properly. (3) Identify areas of the architecture that can benefit from existing standards and technology and enumerate those standards and technologies. (4) Identify existing companies that may have relevant technology that can be applied to this research. (5) Help determine if new standards or technologies need to be developed.

  30. NetView technical research

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This is the Final Technical Report for the NetView Technical Research task. This report is prepared in accordance with Contract Data Requirements List (CDRL) item A002. NetView assistance was provided and details are presented under the following headings: NetView Management Systems (NMS) project tasks; WPAFB IBM 3090; WPAFB AMDAHL; WPAFB IBM 3084; Hill AFB; McClellan AFB AMDAHL; McClellan AFB IBM 3090; and Warner-Robins AFB.

  31. Applications of neural networks to landmark detection in 3-D surface data

    NASA Astrophysics Data System (ADS)

    Arndt, Craig M.

    1992-09-01

    The problem of identifying key landmarks in 3-dimensional surface data is of considerable interest for solving a number of difficult real-world tasks, including object recognition and image processing. The specific problem that we address in this research is to identify specific anatomical landmarks in human surface data. This is a complex task, currently performed visually by an expert human operator. In order to replace these human operators and increase the reliability of the data acquisition, we need to develop a computer algorithm which will utilize the interrelations within the 3-dimensional data to identify the landmarks of interest. The current presentation describes a method for designing, implementing, training, and testing a custom-architecture neural network which will perform the landmark identification task. We discuss the performance of the net in relation to human performance on the same task, and how this net has been integrated with other AI and traditional programming methods to produce a powerful analysis tool for computer anthropometry.

  32. ClimateNet: A Machine Learning dataset for Climate Science Research

    NASA Astrophysics Data System (ADS)

    Prabhat, M.; Biard, J.; Ganguly, S.; Ames, S.; Kashinath, K.; Kim, S. K.; Kahou, S.; Maharaj, T.; Beckham, C.; O'Brien, T. A.; Wehner, M. F.; Williams, D. N.; Kunkel, K.; Collins, W. D.

    2017-12-01

    Deep Learning techniques have revolutionized commercial applications in computer vision, speech recognition, and control systems. The key to all of these developments was the creation of a curated, labeled dataset, ImageNet, enabling multiple research groups around the world to develop methods, benchmark performance, and compete with each other. The success of Deep Learning can be largely attributed to the broad availability of this dataset. Our empirical investigations have revealed that Deep Learning is similarly poised to benefit the task of pattern detection in climate science. Unfortunately, labeled datasets, a key prerequisite for training, are hard to find. Individual research groups are typically interested in specialized weather patterns, making it hard to unify and share datasets across groups and institutions. In this work, we propose ClimateNet: a labeled dataset that provides labeled instances of extreme weather patterns, as well as associated raw fields, in model and observational output. We develop a schema in NetCDF to enumerate weather pattern classes/types, store bounding boxes, and pixel masks. We are also working on a TensorFlow implementation to natively import such NetCDF datasets, and are providing a reference convolutional architecture for binary classification tasks. Our hope is that researchers in climate science, as well as ML/DL, will be able to use (and extend) ClimateNet to make rapid progress in the application of Deep Learning for climate science research.
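
    The flavor of such a labeling schema is easy to sketch with the netCDF4 library: store event bounding boxes and class flags alongside the raw fields. The variable and attribute names here are illustrative assumptions, not the published ClimateNet schema:

        import numpy as np
        import netCDF4  # third-party: pip install netCDF4

        with netCDF4.Dataset("labels.nc", "w") as nc:
            nc.createDimension("event", None)    # unlimited number of events
            nc.createDimension("corner", 4)      # ymin, xmin, ymax, xmax
            boxes = nc.createVariable("bbox", "i4", ("event", "corner"))
            kind = nc.createVariable("event_class", "i4", ("event",))
            kind.flag_values = np.array([0, 1], "i4")
            kind.flag_meanings = "tropical_cyclone atmospheric_river"
            boxes[0, :] = [120, 300, 180, 420]   # one labeled storm (made up)
            kind[0] = 0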

  33. The equivalency between logic Petri workflow nets and workflow nets.

    PubMed

    Wang, Jing; Yu, ShuXia; Du, YuYue

    2015-01-01

    Logic Petri nets (LPNs) can describe and analyze batch processing functions and passing-value indeterminacy in cooperative systems. Logic Petri workflow nets (LPWNs) are proposed based on LPNs in this paper. Process mining is regarded as an important bridge between the modeling and analysis of data mining and business processes. Workflow nets (WF-nets) are an extension of Petri nets (PNs), and have successfully been used in process mining. Some shortcomings cannot be avoided in process mining, such as duplicate tasks, invisible tasks, and the noise of logs. An online shop in electronic commerce is modeled in this paper to prove the equivalence between LPWNs and WF-nets, and the advantages of LPWNs are presented.

  34. The Equivalency between Logic Petri Workflow Nets and Workflow Nets

    PubMed Central

    Wang, Jing; Yu, ShuXia; Du, YuYue

    2015-01-01

    Logic Petri nets (LPNs) can describe and analyze batch processing functions and passing-value indeterminacy in cooperative systems. Logic Petri workflow nets (LPWNs) are proposed based on LPNs in this paper. Process mining is regarded as an important bridge between the modeling and analysis of data mining and business processes. Workflow nets (WF-nets) are an extension of Petri nets (PNs), and have successfully been used in process mining. Some shortcomings cannot be avoided in process mining, such as duplicate tasks, invisible tasks, and the noise of logs. An online shop in electronic commerce is modeled in this paper to prove the equivalence between LPWNs and WF-nets, and the advantages of LPWNs are presented. PMID:25821845

  35. Network issues for large mass storage requirements

    NASA Technical Reports Server (NTRS)

    Perdue, James

    1992-01-01

    File servers and supercomputing environments need high-performance networks to balance the I/O requirements seen in today's demanding computing scenarios. UltraNet is one solution, permitting both high aggregate transfer rates and high task-to-task transfer rates, as demonstrated in actual tests. UltraNet provides this capability as both a server-to-server and server-to-client access network, giving the supercomputing center the following advantages: highest-performance transport-level connections (up to 40 MBytes/sec effective rates); throughput that matches the emerging high-performance disk technologies, such as RAID, parallel head transfer devices, and software striping; support for standard network and file system applications using a sockets-based application program interface, such as FTP, rcp, rdump, etc.; access to the Network File System (NFS) and large aggregate bandwidth for heavy NFS usage; access to a distributed, hierarchical data server capability using the DISCOS UniTree product; and support for file server solutions available from multiple vendors, including Cray, Convex, Alliant, FPS, IBM, and others.

  36. Biophysical and economic limits to negative CO2 emissions

    NASA Astrophysics Data System (ADS)

    Smith, Pete; Davis, Steven J.; Creutzig, Felix; Fuss, Sabine; Minx, Jan; Gabrielle, Benoit; Kato, Etsushi; Jackson, Robert B.; Cowie, Annette; Kriegler, Elmar; van Vuuren, Detlef P.; Rogelj, Joeri; Ciais, Philippe; Milne, Jennifer; Canadell, Josep G.; McCollum, David; Peters, Glen; Andrew, Robbie; Krey, Volker; Shrestha, Gyami; Friedlingstein, Pierre; Gasser, Thomas; Grübler, Arnulf; Heidug, Wolfgang K.; Jonas, Matthias; Jones, Chris D.; Kraxner, Florian; Littleton, Emma; Lowe, Jason; Moreira, José Roberto; Nakicenovic, Nebojsa; Obersteiner, Michael; Patwardhan, Anand; Rogner, Mathis; Rubin, Ed; Sharifi, Ayyoob; Torvanger, Asbjørn; Yamagata, Yoshiki; Edmonds, Jae; Yongsung, Cho

    2016-01-01

    To have a >50% chance of limiting warming below 2 °C, most recent scenarios from integrated assessment models (IAMs) require large-scale deployment of negative emissions technologies (NETs). These are technologies that result in the net removal of greenhouse gases from the atmosphere. We quantify potential global impacts of the different NETs on various factors (such as land, greenhouse gas emissions, water, albedo, nutrients and energy) to determine the biophysical limits to, and economic costs of, their widespread application. Resource implications vary between technologies and need to be satisfactorily addressed if NETs are to have a significant role in achieving climate goals.

  37. Petri nets as a modeling tool for discrete concurrent tasks of the human operator. [describing sequential and parallel demands on human operators]

    NASA Technical Reports Server (NTRS)

    Schumacher, W.; Geiser, G.

    1978-01-01

    The basic concepts of Petri nets are reviewed, as well as their application as the fundamental model of technical systems with concurrent discrete events, such as hardware systems and software models of computers. The use of Petri nets is proposed for modeling the human operator dealing with concurrent discrete tasks. Their properties useful in modeling the human operator are discussed, and practical examples are given. By means of an experimental investigation of binary concurrent tasks presented in a serial manner, the representation of human behavior by Petri nets is demonstrated.

  38. Renewable Energy Development in Hermosa Beach, California

    NASA Astrophysics Data System (ADS)

    Morris, K.

    2016-12-01

    The City of Hermosa Beach, California, with the support of the AGU's TEX program, is exploring the potential for renewable energy generation inside the City, as part of the implementation of the City's 2015 Municipal Carbon Neutral Plan. Task 1: Estimate the technical potential of existing and future technologies. Given the City's characteristics, this task will identify feasible technologies: wind, solar, tidal/wave, wastewater biogas, landfill biogas, microscale anaerobic digestion (AD), and complementary energy storage. Some options may be open to the City acting alone, but others will require working with municipal partners and private entities that provide services to Hermosa Beach (e.g., wastewater treatment). Energy storage is a means to integrate intermittent renewable energy output. Task 2: Review transaction types and pathways. In this task, feasible technologies will be further examined in terms of municipal ordinances and contractual paths: (a) power purchase agreements (PPAs) with developers, under which the City would purchase energy or storage services directly; (b) leases with developers, under which the City would rent sites (e.g., municipal rooftops) to developers; (c) ordinances related to permitting, under which the City would reduce regulatory barriers to entry for developers; (d) pilot projects, under which the City would engage with developers to test new technologies such as wind/wave/microscale AD (pursuant to PPAs and/or leases); and (e) existing projects, under which the City would work with current wastewater and landfill contractors to understand (i) current plans to develop renewable energy, and (ii) opportunities for the City to work with such contractors to promote renewable energy. Task 3: Estimate costs by technology. Finally, the last task will gather existing information about the costs, both current and projected, of the feasible technologies, including (i) overnight construction cost (capital); (ii) integration costs (e.g., charges from Edison and energy storage); (iii) costs that may be avoided due to promotion of renewable energy; and (iv) comparisons of projected annual nominal costs (in $/MWh and net present values).

  39. Negative emissions—Part 3: Innovation and upscaling

    NASA Astrophysics Data System (ADS)

    Nemet, Gregory F.; Callaghan, Max W.; Creutzig, Felix; Fuss, Sabine; Hartmann, Jens; Hilaire, Jérôme; Lamb, William F.; Minx, Jan C.; Rogers, Sophia; Smith, Pete

    2018-06-01

    We assess the literature on innovation and upscaling for negative emissions technologies (NETs) using a systematic and reproducible literature coding procedure. To structure our review, we employ the framework of sequential stages in the innovation process, with which we code each NETs article in innovation space. We find that while there is a growing body of innovation literature on NETs, 59% of the articles are focused on the earliest stages of the innovation process, ‘research and development’ (R&D). The subsequent stages of innovation are also represented in the literature, but at much lower levels of activity than R&D. Distinguishing between innovation stages that are related to the supply of the technology (R&D, demonstrations, scale up) and demand for the technology (demand pull, niche markets, public acceptance), we find an overwhelming emphasis (83%) on the supply side. BECCS articles have an above average share of demand-side articles while direct air carbon capture and storage has a very low share. Innovation in NETs has much to learn from successfully diffused technologies; appealing to heterogeneous users, managing policy risk, as well as understanding and addressing public concerns are all crucial yet not well represented in the extant literature. Results from integrated assessment models show that while NETs play a key role in the second half of the 21st century for 1.5 °C and 2 °C scenarios, the major period of new NETs deployment is between 2030 and 2050. Given that the broader innovation literature consistently finds long time periods involved in scaling up and deploying novel technologies, there is an urgency to developing NETs that is largely unappreciated. This challenge is exacerbated by the thousands to millions of actors that potentially need to adopt these technologies for them to achieve planetary scale. This urgency is reflected neither in the Paris Agreement nor in most of the literature we review here. If NETs are to be deployed at the levels required to meet 1.5 °C and 2 °C targets, then important post-R&D issues will need to be addressed in the literature, including incentives for early deployment, niche markets, scale-up, demand, and—particularly if deployment is to be hastened—public acceptance.

  40. Using deep learning to segment breast and fibroglandular tissue in MRI volumes.

    PubMed

    Dalmış, Mehmet Ufuk; Litjens, Geert; Holland, Katharina; Setio, Arnaud; Mann, Ritse; Karssemeijer, Nico; Gubern-Mérida, Albert

    2017-02-01

    Automated segmentation of breast and fibroglandular tissue (FGT) is required for various computer-aided applications of breast MRI. Traditional image analysis and computer vision techniques, such as atlas-based methods, template matching, or edge and surface detection, have been applied to solve this task. However, the applicability of these methods is usually limited by the characteristics of the images used in the study datasets, while breast MRI varies with respect to the different MRI protocols used, in addition to the variability in breast shapes. All this variability, in addition to various MRI artifacts, makes it a challenging task to develop a robust breast and FGT segmentation method using traditional approaches. Therefore, in this study, we investigated the use of a deep-learning approach known as "U-net." We used a dataset of 66 breast MRIs randomly selected from our scientific archive, which includes five different MRI acquisition protocols and breasts from four breast density categories in a balanced distribution. To prepare reference segmentations, we manually segmented breast and FGT for all images using an in-house developed workstation. We experimented with the application of U-net in two different ways for breast and FGT segmentation. In the first method, following the same pipeline used in traditional approaches, we trained two consecutive (2C) U-nets: the first for segmenting the breast in the whole MRI volume and the second for segmenting FGT inside the segmented breast. In the second method, we used a single 3-class (3C) U-net, which performs both tasks simultaneously by segmenting the volume into three regions: nonbreast, fat inside the breast, and FGT inside the breast. For comparison, we applied two existing and published methods to our dataset: an atlas-based method and a sheetness-based method. We used the Dice Similarity Coefficient (DSC) to measure the performance of the automated methods with respect to the manual segmentations. Additionally, we computed Pearson's correlation between the breast density values computed from manual and automated segmentations. The average DSC values for breast segmentation were 0.933, 0.944, 0.863, and 0.848, obtained from the 3C U-net, 2C U-nets, atlas-based method, and sheetness-based method, respectively. The average DSC values for FGT segmentation obtained from the 3C U-net, 2C U-nets, and atlas-based method were 0.850, 0.811, and 0.671, respectively. The correlation between breast density values based on 3C U-net and manual segmentations was 0.974. This value was significantly higher than 0.957 as obtained from the 2C U-nets (P < 0.0001, Steiger's Z-test with Bonferroni correction) and 0.938 as obtained from the atlas-based method (P = 0.0016). In conclusion, we applied a deep-learning method, U-net, for segmenting breast and FGT in MRI in a dataset that includes a variety of MRI protocols and breast densities. Our results showed that U-net-based methods significantly outperformed the existing algorithms and resulted in significantly more accurate breast density computation. © 2016 American Association of Physicists in Medicine.
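
    The evaluation metric used throughout, the Dice Similarity Coefficient DSC = 2|A∩B|/(|A|+|B|), is a few lines of Python on binary masks; the toy arrays are illustrative:

        import numpy as np

        def dice(a, b):
            """DSC = 2|A∩B| / (|A| + |B|) for boolean masks."""
            a, b = a.astype(bool), b.astype(bool)
            denom = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

        auto = np.zeros((8, 8), int); auto[2:6, 2:6] = 1      # toy automated mask
        manual = np.zeros((8, 8), int); manual[3:7, 3:7] = 1  # toy manual mask
        print(f"DSC = {dice(auto, manual):.3f}")              # 0.562 here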

  1. Classification of foods by transferring knowledge from ImageNet dataset

    NASA Astrophysics Data System (ADS)

    Heravi, Elnaz J.; Aghdam, Hamed H.; Puig, Domenec

    2017-03-01

    Automatic classification of foods is a way to control food intake and tackle obesity. However, it is a challenging problem since foods are highly deformable and complex objects. Results on the ImageNet dataset have revealed that Convolutional Neural Networks have great expressive power for modeling natural objects. Nonetheless, it is not trivial to train a ConvNet from scratch for classification of foods. This is due to the fact that ConvNets require large datasets and, to our knowledge, there is no large public dataset of food for this purpose. An alternative solution is to transfer knowledge from trained ConvNets to the domain of foods. In this work, we study how transferable state-of-the-art ConvNets are to the task of food classification. We also propose a method for transferring knowledge from a bigger ConvNet to a smaller ConvNet while keeping its accuracy similar to the bigger ConvNet. Our experiments on the UECFood256 dataset show that GoogLeNet, VGG and residual networks produce comparable results if we start transferring knowledge from the appropriate layer. In addition, we show that our method is able to effectively transfer knowledge to the smaller ConvNet using unlabeled samples.
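
    The generic fine-tuning recipe such work builds on can be sketched in a few lines; this is an illustrative PyTorch/torchvision snippet (our choice of backbone and API, not the authors' exact setup), with a 256-way head matching UECFood256.

        import torch.nn as nn
        from torchvision import models

        # Start from an ImageNet-pretrained backbone and freeze its weights.
        model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
        for p in model.parameters():
            p.requires_grad = False

        # Replace the classifier head for the 256 food categories; only it is trained.
        model.fc = nn.Linear(model.fc.in_features, 256)

    Which layer to transfer from matters, as the abstract notes; unfreezing deeper blocks trades more adaptation against a greater risk of overfitting on a small food dataset.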

  2. Translating self-persuasion into an adolescent HPV vaccine promotion intervention for parents attending safety-net clinics.

    PubMed

    Baldwin, Austin S; Denman, Deanna C; Sala, Margarita; Marks, Emily G; Shay, L Aubree; Fuller, Sobha; Persaud, Donna; Lee, Simon Craddock; Skinner, Celette Sugg; Wiebe, Deborah J; Tiro, Jasmin A

    2017-04-01

    Self-persuasion is an effective behavior change strategy, but has not been translated for low-income, less educated, uninsured populations attending safety-net clinics or to promote human papillomavirus (HPV) vaccination. We developed a tablet-based application (in English and Spanish) to elicit parental self-persuasion for adolescent HPV vaccination and evaluated its feasibility in a safety-net population. Parents (N=45) of age-eligible adolescents used the self-persuasion application. Then, during cognitive interviews, staff gathered quantitative and qualitative feedback on the self-persuasion tasks including parental decision stage. The self-persuasion tasks were rated as easy to complete and helpful. We identified six question prompts rated as uniformly helpful, not difficult to answer, and generated non-redundant responses from participants. Among the 33 parents with unvaccinated adolescents, 27 (81.8%) reported deciding to get their adolescent vaccinated after completing the self-persuasion tasks. The self-persuasion application was feasible and resulted in a change in parents' decision stage. Future studies can now test the efficacy of the tablet-based application on HPV vaccination. The self-persuasion application facilitates verbalization of reasons for HPV vaccination in low literacy, safety-net settings. This self-administered application has the potential to be more easily incorporated into clinical practice than other patient education approaches. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  3. Reachability analysis of real-time systems using time Petri nets.

    PubMed

    Wang, J; Deng, Y; Xu, G

    2000-01-01

    Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and most widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using the technique. In this paper, we present a new reachability based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.
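
    For intuition, untimed reachability analysis, which the CS-class technique extends with clock information, is a search over markings; the Python sketch below (a toy net of our own, not the authors' TPN algorithm) enumerates reachable markings breadth-first.

        from collections import deque

        # Each transition: (input places with weights, output places with weights).
        transitions = {
            "t1": ({"p1": 1}, {"p2": 1}),
            "t2": ({"p2": 1}, {"p1": 1, "p3": 1}),
        }

        def enabled(m, pre):
            return all(m.get(p, 0) >= n for p, n in pre.items())

        def fire(m, pre, post):
            m = dict(m)
            for p, n in pre.items():
                m[p] -= n
            for p, n in post.items():
                m[p] = m.get(p, 0) + n
            return {p: k for p, k in m.items() if k > 0}  # drop empty places

        def reachable(m0, bound=1000):
            """Breadth-first enumeration of markings, truncated at `bound` states."""
            seen, queue = set(), deque([m0])
            while queue and len(seen) < bound:
                m = queue.popleft()
                key = frozenset(m.items())
                if key in seen:
                    continue
                seen.add(key)
                for pre, post in transitions.values():
                    if enabled(m, pre):
                        queue.append(fire(m, pre, post))
            return seen

        print(len(reachable({"p1": 1})))  # this toy net is unbounded in p3, hence the cap

    A time Petri net additionally attaches firing intervals to transitions, so states become (marking, clock information) pairs; the CS-classes stamp each class with clock values so end-to-end delays can be read directly off the resulting tree.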

  4. Low Carbon Technology Options for the Natural Gas ...

    EPA Pesticide Factsheets

    The ultimate goal of this task is to perform environmental and economic analysis of natural gas-based power production technologies (different routes) to investigate and evaluate strategies for reducing emissions from the power sector. It is a broad research area. Initially, the research will be focused on preliminary analyses of hydrogen-fueled power production technologies utilizing hydrogen in large, heavy-duty gas turbines in integrated reformer combined cycle (IRCC) and integrated gasification combined cycle (IGCC) configurations for electric power generation. The research will be expanded step-by-step, based on findings, to include other advanced pre-combustion and post-combustion technologies (e.g., Net Power, a potentially transformative technology utilizing a high-efficiency CO2 conversion cycle (the Allam cycle), and chemical looping) applied to natural gas, other fossil fuels (coal and heavy oil) and biomass/biofuel. Screening analysis is already under development and data for the analysis is being processed. Immediate actions on this task include preliminary economic and environmental analysis of power production technologies applied to natural gas. Data for catalytic reforming technology to produce hydrogen from natural gas is being collected and compiled in Microsoft Excel. The model will be expanded for exploring and comparing various technology scenarios to meet our goal. The primary focus of this study is to: 1) understand the chemic

  5. Hierarchical Goal Network Planning: Initial Results

    DTIC Science & Technology

    2011-05-31

    Fragmentary author and reference information only: svikas@cs.umd.edu; Ugur Kuter, Smart Information Flow Technologies, 211 North 1st Street, Minneapolis, MN 55401, USA (ukuter@sift.net); Dana S. Nau, Dept. of [...]. References cited include: Ron Alford, Ugur Kuter, and Dana S. Nau, "Translating HTNs to PDDL: A small amount of domain knowledge can go a long way"; and Ugur Kuter, Dana S. Nau, Marco Pistore, and Paolo Traverso, "Task decomposition on abstract states, for planning under nondeterminism," Artif. [...]

  6. Advanced Technology Direction and Control Communications Systems

    DTIC Science & Technology

    1979-07-16

    Fragmentary, OCR-garbled front matter (author names, a contract number, and a form's organization/program-element fields) followed by recoverable report text: a JTIDS net is organized on the principle of time division. Modulation techniques listed include amplitude modulation, double sideband (DSB-AM); amplitude modulation, vestigial sideband (VSB-AM); frequency shift modulation (FSK); and phase shift modulation (PSK).

  7. Task sequence planning in a robot workcell using AND/OR nets

    NASA Technical Reports Server (NTRS)

    Cao, Tiehua; Sanderson, Arthur C.

    1991-01-01

    An approach to task sequence planning for a generalized robotic manufacturing or material handling workcell is described. Given the descriptions of the objects in this system and all feasible geometric relationships among these objects, an AND/OR net which describes the relationships of all feasible geometric states and associated feasibility criteria for net transitions is generated. This AND/OR net is mapped into a Petri net which incorporates all feasible sequences of operations. The resulting Petri net is shown to be bounded and have guaranteed properties of liveness, safeness, and reversibility. Sequences are found from the reachability tree of the Petri net. Feasibility criteria for net transitions may be used to generate an extended Petri net representation of lower level command sequences. The resulting Petri net representation may be used for on-line scheduling and control of the system of feasible sequences. A simulation example of the sequences is described.

  8. Enterprise systems security management: a framework for breakthrough protection

    NASA Astrophysics Data System (ADS)

    Farroha, Bassam S.; Farroha, Deborah L.

    2010-04-01

    Securing the DoD information network is a tremendous task due to its size, access locations and the number of network intrusion attempts on a daily basis. This analysis investigates methods and architecture options to deliver capabilities for a secure information sharing environment. Crypto-binding and intelligent access controls are basic requirements for secure information sharing in a net-centric environment. We introduce many of the new technology components to secure the enterprise. Cooperative mission requirements lead to developing automatic data discovery, with data stewards granting access to Cross Domain (CD) data repositories or live streaming data. Multiple architecture models are investigated to determine best-of-breed approaches, including SOA and private/public clouds.

  9. Efficiency achievements from a user-developed real-time modifiable clinical information system.

    PubMed

    Bishop, Roderick O; Patrick, Jon; Besiso, Ali

    2015-02-01

    This investigation was initiated after the introduction of a new information system into the Nepean Hospital Emergency Department. A retrospective study determined that the problems introduced by the new system led to reduced efficiency of the clinical staff, demonstrated by deterioration in the emergency department's (ED's) performance. This article is an investigation of methods to improve the design and implementation of clinical information systems for an ED by using a process of clinical team-led design and a technology built on a radically new philosophy denoted as emergent clinical information systems. The specific objectives were to construct a system, the Nepean Emergency Department Information Management System (NEDIMS), using a combination of new design methods; determine whether it provided any reduction in time and click burden on the user in comparison to an enterprise proprietary system, Cerner FirstNet; and design and evaluate a model of the effect that any reduction had on patient throughput in the department. The methodology for conducting a direct comparison between the 2 systems used the 6 activity centers in the ED of clerking, triage, nursing assessments, fast track, acute care, and nurse unit manager. A quantitative study involved the 2 systems being measured for their efficiency on 17 tasks taken from the activity centers. A total of 332 task instances were measured for duration and number of mouse clicks in live usage on Cerner FirstNet and in reproduction of the same Cerner FirstNet work on NEDIMS as an off-line system. The results showed that NEDIMS is at least 41% more efficient than Cerner FirstNet (95% confidence interval 21.6% to 59.8%). In some cases, the NEDIMS tasks were remodeled to demonstrate the value of feedback to create improvements and the speed and economy of design revision in the emergent clinical information systems approach. The cost of the effort in remodeling the designs showed that the time spent on remodeling is recovered within a few days in time savings to clinicians. An analysis of the differences between Cerner FirstNet and NEDIMS for sequences of patient journeys showed an average difference of 127 seconds and 15.2 clicks. A simulation model of workflows for typical patient journeys for a normal daily attendance of 165 patients showed that NEDIMS saved 23.9 hours of staff time per day compared with Cerner FirstNet. The results of this investigation show that information systems that are designed by a clinical team using a technology that enables real-time adaptation provide much greater efficiency for the ED. Staff consider that a point-and-click user interface constantly interrupts their train of thought in a way that does not happen when writing on paper. This is partially overcome by the reduction of cognitive load that arises from minimizing the number of clicks to complete a task in the context of global versus local workflow optimization. Copyright © 2014 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.

  10. Meta-Analytic Evidence for a Reversal Learning Effect on the Iowa Gambling Task in Older Adults.

    PubMed

    Pasion, Rita; Gonçalves, Ana R; Fernandes, Carina; Ferreira-Santos, Fernando; Barbosa, Fernando; Marques-Teixeira, João

    2017-01-01

    The Iowa Gambling Task (IGT) is one of the most widely used tools for assessing economic decision-making. However, the research tradition on aging and the IGT has focused mainly on the overall performance of older adults in relation to younger or clinical groups, leaving it unclear whether older adults are capable of learning along the task. We conducted a meta-analysis to examine older adults' decision-making on the IGT, to test the effects of aging on reversal learning (45 studies) and to provide normative data on total and block net scores (55 studies). From the accumulated empirical evidence, we found an average total net score of 7.55 (±25.9). We also observed a significant reversal learning effect along the blocks of the IGT, indicating that older adults inhibit the prepotent response toward immediately attractive options associated with high losses, in favor of initially less attractive options associated with long-run profit. During block 1, decisions of older adults led to a negative gambling net score, reflecting the expected initial pattern of risk-taking. However, the shift toward safer options occurred between block 2 (small-to-medium effect size) and blocks 3, 4, and 5 (medium-to-large effect size). These main findings highlight that older adults are able to move from initial uncertainty, when the possible outcomes are unknown, to decisions based on risk, when the outcomes are learned and may be used to guide future adaptive decision-making.
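
    The block net scores meta-analyzed here follow the standard IGT convention of advantageous minus disadvantageous selections; a minimal Python sketch (deck labels and block size follow the common 100-trial, 20-trial-block administration; the function name is ours).

        def igt_net_scores(choices, block_size=20):
            """Per-block net score: (C + D selections) - (A + B selections)."""
            blocks = [choices[i:i + block_size]
                      for i in range(0, len(choices), block_size)]
            return [sum(1 if c in "CD" else -1 for c in block) for block in blocks]

    Summing the five block scores gives the total net score, whose cross-study average is the 7.55 reported above; negative block 1 scores followed by rising later blocks are the reversal learning signature.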

  11. Multi-tasking arbitration and behaviour design for human-interactive robots

    NASA Astrophysics Data System (ADS)

    Kobayashi, Yuichi; Onishi, Masaki; Hosoe, Shigeyuki; Luo, Zhiwei

    2013-05-01

    Robots that interact with humans in household environments are required to handle multiple real-time tasks simultaneously, such as carrying objects, collision avoidance and conversation with humans. This article presents a design framework for the control and recognition processes to meet these requirements, taking into account stochastic human behaviour. The proposed design method first introduces a Petri net for synchronisation of multiple tasks. The Petri net formulation is converted to Markov decision processes and processed in an optimal control framework. Three tasks (safety confirmation, object conveyance and conversation) interact and are expressed by the Petri net. Using the proposed framework, tasks that normally tend to be designed by integrating many if-then rules can be designed in a systematic manner in a state estimation and optimisation framework from the viewpoint of shortest-time optimal control. The proposed arbitration method was verified by simulations and experiments using RI-MAN, which was developed for interactive tasks with humans.

  12. TopologyNet: Topology based deep convolutional and multi-task neural networks for biomolecular property predictions

    PubMed Central

    2017-01-01

    Although deep learning approaches have had tremendous success in image, video and audio processing, computer vision, and speech recognition, their applications to three-dimensional (3D) biomolecular structural data sets have been hindered by the geometric and biological complexity. To address this problem we introduce the element-specific persistent homology (ESPH) method. ESPH represents 3D complex geometry by one-dimensional (1D) topological invariants and retains important biological information via a multichannel image-like representation. This representation reveals hidden structure-function relationships in biomolecules. We further integrate ESPH and deep convolutional neural networks to construct a multichannel topological neural network (TopologyNet) for the predictions of protein-ligand binding affinities and protein stability changes upon mutation. To overcome the deep learning limitations from small and noisy training sets, we propose a multi-task multichannel topological convolutional neural network (MM-TCNN). We demonstrate that TopologyNet outperforms the latest methods in the prediction of protein-ligand binding affinities, mutation induced globular protein folding free energy changes, and mutation induced membrane protein folding free energy changes. Availability: weilab.math.msu.edu/TDL/ PMID:28749969

  13. NetVLAD: CNN Architecture for Weakly Supervised Place Recognition.

    PubMed

    Arandjelovic, Relja; Gronat, Petr; Torii, Akihiko; Pajdla, Tomas; Sivic, Josef

    2018-06-01

    We tackle the problem of large scale visual place recognition, where the task is to quickly and accurately recognize the location of a given query photograph. We present the following four principal contributions. First, we develop a convolutional neural network (CNN) architecture that is trainable in an end-to-end manner directly for the place recognition task. The main component of this architecture, NetVLAD, is a new generalized VLAD layer, inspired by the "Vector of Locally Aggregated Descriptors" image representation commonly used in image retrieval. The layer is readily pluggable into any CNN architecture and amenable to training via backpropagation. Second, we create a new weakly supervised ranking loss, which enables end-to-end learning of the architecture's parameters from images depicting the same places over time downloaded from Google Street View Time Machine. Third, we develop an efficient training procedure which can be applied on very large-scale weakly labelled tasks. Finally, we show that the proposed architecture and training procedure significantly outperform non-learnt image representations and off-the-shelf CNN descriptors on challenging place recognition and image retrieval benchmarks.
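
    The VLAD aggregation the layer generalizes can be written compactly; a simplified PyTorch sketch of the forward pass (initialization from clustered descriptors and the paper's training details are omitted).

        import torch
        import torch.nn as nn
        import torch.nn.functional as F

        class NetVLAD(nn.Module):
            """Soft-assignment VLAD pooling over a CNN feature map."""
            def __init__(self, num_clusters=64, dim=512):
                super().__init__()
                self.assign = nn.Conv2d(dim, num_clusters, kernel_size=1)
                self.centroids = nn.Parameter(torch.randn(num_clusters, dim))

            def forward(self, x):                       # x: (N, D, H, W)
                n, d, h, w = x.shape
                soft = F.softmax(self.assign(x).view(n, -1, h * w), dim=1)   # (N, K, HW)
                feats = x.view(n, d, h * w)                                   # (N, D, HW)
                resid = feats.unsqueeze(1) - self.centroids.view(1, -1, d, 1)
                vlad = (soft.unsqueeze(2) * resid).sum(-1)   # (N, K, D)
                vlad = F.normalize(vlad, dim=2)              # intra-normalization
                return F.normalize(vlad.reshape(n, -1), dim=1)

    Because every step is differentiable, the whole pipeline (backbone plus pooling) can be trained end-to-end under the weakly supervised ranking loss described above.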

  14. Utilizing Free and Open Source Software to access, view and compare in situ observations, EO products and model output data

    NASA Astrophysics Data System (ADS)

    Vines, Aleksander; Hamre, Torill; Lygre, Kjetil

    2014-05-01

    The GreenSeas project (Development of global plankton data base and model system for eco-climate early warning) aims to advance the knowledge and predictive capacities of how marine ecosystems will respond to global change. A main task has been to set up a data delivery and monitoring core service following the open and free data access policy implemented in the Global Monitoring for the Environment and Security (GMES) programme. The aim is to ensure open and free access to historical plankton data, new data (EO products and in situ measurements), model data (including estimates of simulation error) and biological, environmental and climatic indicators to a range of stakeholders, such as scientists, policy makers and environmental managers. To this end, we have developed a geo-spatial database of both historical and new in situ physical, biological and chemical parameters for the Southern Ocean, Atlantic, Nordic Seas and the Arctic, and organized related satellite-derived quantities and model forecasts in a joint geo-spatial repository. For easy access to these data, we have implemented a web-based GIS (Geographical Information System) where observed, derived and forecasted parameters can be searched, displayed, compared and exported. Model forecasts can also be uploaded dynamically to the system, to allow modelers to quickly compare their results with available in situ and satellite observations. We implemented the web-based GIS using free and open source technologies: Thredds Data Server, ncWMS, GeoServer, OpenLayers, PostGIS, Liferay, Apache Tomcat, PRTree, NetCDF-Java, json-simple, Geotoolkit, Highcharts, GeoExt, MapFish, FileSaver, jQuery, jstree and qUnit. We also wanted to use open standards for communication between the different services: WMS, WFS, netCDF, GML, OPeNDAP, JSON, and SLD. The main advantage we gained from using FOSS was that we did not have to reinvent the wheel, but could use already existing code and functionality in our software for free. Of course, most of the software did not have to be open source for this, but in some cases we had to make minor modifications to make the different technologies work together. We could extract the parts of the code that we needed for a specific task. One example was using parts of the code from ncWMS and Thredds to help our main application both read netCDF files and present them in the browser. This presentation will focus on both the difficulties we encountered and the advantages we gained in developing this tool with FOSS.
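
    As a small illustration of the open-standards access path (netCDF served over OPeNDAP by a Thredds Data Server), the snippet below fetches one time step of a variable; the URL and variable name are placeholders, not the project's actual endpoints.

        from netCDF4 import Dataset

        # OPeNDAP transfers only the slices actually requested.
        url = "http://example.org/thredds/dodsC/greenseas/plankton.nc"  # placeholder
        with Dataset(url) as ds:
            chl = ds.variables["chlorophyll"][0, :, :]  # first time step
            print(sorted(ds.variables), chl.shape)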

  15. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    PubMed

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need for culturing them. These technologies usually return short (100-300 base-pairs long) DNA reads, and these reads are processed by metagenomic analysis software that assigns phylogenetic composition-information to the dataset. Here we evaluate three metagenomic analysis software tools (AmphoraNet, a webserver implementation of AMPHORA2; MG-RAST; and MEGAN5) for their capability of assigning quantitative phylogenetic information to the data, describing the frequency of appearance of microorganisms of the same taxa in the sample. The difficulty of the task arises from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some tools assign higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of the complexity of those benchmarks, it is usually difficult to judge the resistance of a metagenomic tool to this genome length bias. Therefore, we have made a simple benchmark for the evaluation of "taxon-counting" in a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads with an average length of 150 bp, and mixed the reads to create our simple benchmark. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a tool fails on this simple task, it will surely fail on most real metagenomes. We applied the three tools to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We found that AMPHORA2/AmphoraNet gave the most accurate results, while the other two tools underperformed: they assigned each short read quite reliably to its respective taxon, but produced the typical genome length bias. The benchmark dataset is available at http://pitgroup.org/static/3RandomGenome-100kavg150bps.fna.
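
    The benchmark construction is easy to reproduce in outline; the sketch below draws fixed-length reads at random offsets as a stand-in for random fragmentation (names and parameters are ours).

        import random

        def random_reads(genome, n_reads, read_len=150):
            """Draw n_reads substrings of length read_len at random offsets."""
            return [genome[i:i + read_len]
                    for i in (random.randrange(len(genome) - read_len + 1)
                              for _ in range(n_reads))]

        # With equal copy numbers, each genome contributes reads in proportion
        # to its length; an unbiased quantitative tool should still report the
        # three taxa in equal proportions after genome-length correction.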

  16. A colored petri nets based workload evaluation model and its validation through Multi-Attribute Task Battery-II.

    PubMed

    Wang, Peng; Fang, Weining; Guo, Beiyuan

    2017-04-01

    This paper proposes a colored Petri net-based workload evaluation model. A formal interpretation of workload is first introduced, based on the mapping of Petri net components to task elements. A Petri net-based description of Multiple Resources theory is given, approaching the theory from a new angle. A new application of the VACP rating scales, named the V/A-C-P unit, and a definition of colored transitions are proposed to model the task process. The calculation of workload has four main steps: determine the tokens' initial positions and values; calculate the weights of directed arcs on the basis of the proposed rules; calculate workload from the different transitions; and correct for the influence of repetitive behaviors. Verification experiments were carried out using the Multi-Attribute Task Battery-II software. Our results show that there is a strong correlation between the model values and NASA Task Load Index scores (r=0.9513). In addition, this method can also distinguish behavioral characteristics between different people. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Deep learning with convolutional neural networks for EEG decoding and visualization.

    PubMed

    Schirrmeister, Robin Tibor; Springenberg, Jost Tobias; Fiederer, Lukas Dominique Josef; Glasstetter, Martin; Eggensperger, Katharina; Tangermann, Michael; Hutter, Frank; Burgard, Wolfram; Ball, Tonio

    2017-11-01

    Deep learning with convolutional neural networks (deep ConvNets) has revolutionized computer vision through end-to-end learning, that is, learning from the raw data. There is increasing interest in using deep ConvNets for end-to-end EEG analysis, but a better understanding of how to design and train ConvNets for end-to-end EEG decoding and how to visualize the informative EEG features the ConvNets learn is still needed. Here, we studied deep ConvNets with a range of different architectures, designed for decoding imagined or executed tasks from raw EEG. Our results show that recent advances from the machine learning field, including batch normalization and exponential linear units, together with a cropped training strategy, boosted the deep ConvNets decoding performance, reaching at least as good performance as the widely used filter bank common spatial patterns (FBCSP) algorithm (mean decoding accuracies 82.1% FBCSP, 84.0% deep ConvNets). While FBCSP is designed to use spectral power modulations, the features used by ConvNets are not fixed a priori. Our novel methods for visualizing the learned features demonstrated that ConvNets indeed learned to use spectral power modulations in the alpha, beta, and high gamma frequencies, and proved useful for spatially mapping the learned features by revealing the topography of the causal contributions of features in different frequency bands to the decoding decision. Our study thus shows how to design and train ConvNets to decode task-related information from the raw EEG without handcrafted features and highlights the potential of deep ConvNets combined with advanced visualization techniques for EEG-based brain mapping. Hum Brain Mapp 38:5391-5420, 2017. © 2017 Wiley Periodicals, Inc. © 2017 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.
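
    The cropped training strategy mentioned above amounts to sliding a fixed-length window across each trial so that many labeled crops are produced per trial; a minimal sketch (window and stride values are illustrative only).

        import numpy as np

        def crops(trial, win=500, stride=8):
            """Overlapping time crops from one EEG trial of shape (channels, samples)."""
            _, n_samples = trial.shape
            return [trial[:, s:s + win] for s in range(0, n_samples - win + 1, stride)]

        # Each crop inherits its trial's label; at test time the per-crop
        # predictions are averaged to give the trial-level decision.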

  18. Correlational Neural Networks.

    PubMed

    Chandar, Sarath; Khapra, Mitesh M; Larochelle, Hugo; Ravindran, Balaraman

    2016-02-01

    Common representation learning (CRL), wherein different descriptions (or views) of the data are embedded in a common subspace, has been receiving a lot of attention recently. Two popular paradigms here are canonical correlation analysis (CCA)-based approaches and autoencoder (AE)-based approaches. CCA-based approaches learn a joint representation by maximizing correlation of the views when projected to the common subspace. AE-based methods learn a common representation by minimizing the error of reconstructing the two views. Each of these approaches has its own advantages and disadvantages. For example, while CCA-based approaches outperform AE-based approaches for the task of transfer learning, they are not as scalable as the latter. In this work, we propose an AE-based approach, correlational neural network (CorrNet), that explicitly maximizes correlation among the views when projected to the common subspace. Through a series of experiments, we demonstrate that the proposed CorrNet is better than AE and CCA with respect to its ability to learn correlated common representations. We employ CorrNet for several cross-language tasks and show that the representations learned using it perform better than the ones learned using other state-of-the-art approaches.
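
    The CorrNet objective can be sketched as reconstruction error minus a correlation bonus on the shared codes; a schematic PyTorch version (λ and the surrounding architecture are ours, and the full model also cross-reconstructs each view from the other).

        import torch
        import torch.nn.functional as F

        def corr(h1, h2, eps=1e-8):
            """Mean per-dimension Pearson correlation across the batch."""
            h1, h2 = h1 - h1.mean(0), h2 - h2.mean(0)
            num = (h1 * h2).sum(0)
            den = torch.sqrt((h1 ** 2).sum(0) * (h2 ** 2).sum(0)) + eps
            return (num / den).mean()

        def corrnet_loss(x1, x2, rec1, rec2, h1, h2, lam=1.0):
            # Minimize reconstruction error while maximizing view correlation.
            return F.mse_loss(rec1, x1) + F.mse_loss(rec2, x2) - lam * corr(h1, h2)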

  19. X3D-Earth: Full Globe Coverage Utilizing Multiple Dataset

    DTIC Science & Technology

    2010-09-01

    Fragmentary front matter and report text: a figure listing (DtedNvtProcessor class; Figure 63, Subversion checkout in NetBeans) and an excerpt showing an scp task added to the Ant build.xml file within a NetBeans project (<target name="moveToHamming"> containing <scp todir="user@hamming.uc.nps.edu:/work/user/DTED...">). This task was generated using the NetBeans IDE (downloadable at www.netbeans.org) and then executed within NetBeans.

  20. Meta-Analytic Evidence for a Reversal Learning Effect on the Iowa Gambling Task in Older Adults

    PubMed Central

    Pasion, Rita; Gonçalves, Ana R.; Fernandes, Carina; Ferreira-Santos, Fernando; Barbosa, Fernando; Marques-Teixeira, João

    2017-01-01

    The Iowa Gambling Task (IGT) is one of the most widely used tools for assessing economic decision-making. However, the research tradition on aging and the IGT has focused mainly on the overall performance of older adults in relation to younger or clinical groups, leaving it unclear whether older adults are capable of learning along the task. We conducted a meta-analysis to examine older adults' decision-making on the IGT, to test the effects of aging on reversal learning (45 studies) and to provide normative data on total and block net scores (55 studies). From the accumulated empirical evidence, we found an average total net score of 7.55 (±25.9). We also observed a significant reversal learning effect along the blocks of the IGT, indicating that older adults inhibit the prepotent response toward immediately attractive options associated with high losses, in favor of initially less attractive options associated with long-run profit. During block 1, decisions of older adults led to a negative gambling net score, reflecting the expected initial pattern of risk-taking. However, the shift toward safer options occurred between block 2 (small-to-medium effect size) and blocks 3, 4, and 5 (medium-to-large effect size). These main findings highlight that older adults are able to move from initial uncertainty, when the possible outcomes are unknown, to decisions based on risk, when the outcomes are learned and may be used to guide future adaptive decision-making. PMID:29075222

  1. ChemNet: A Transferable and Generalizable Deep Neural Network for Small-Molecule Property Prediction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goh, Garrett B.; Siegel, Charles M.; Vishnu, Abhinav

    With access to large datasets, deep neural networks through representation learning have been able to identify patterns from raw data, achieving human-level accuracy in image and speech recognition tasks. However, in chemistry, availability of large standardized and labelled datasets is scarce, and with a multitude of chemical properties of interest, chemical data is inherently small and fragmented. In this work, we explore transfer learning techniques in conjunction with the existing Chemception CNN model, to create a transferable and generalizable deep neural network for small-molecule property prediction. Our latest model, ChemNet, learns in a semi-supervised manner from inexpensive labels computed from the ChEMBL database. When fine-tuned to the Tox21, HIV and FreeSolv datasets, which are 3 separate chemical tasks that ChemNet was not originally trained on, we demonstrate that ChemNet exceeds the performance of existing Chemception models and contemporary MLP models that train on molecular fingerprints, and it matches the performance of the ConvGraph algorithm, the current state-of-the-art. Furthermore, as ChemNet has been pre-trained on a large diverse chemical database, it can be used as a universal “plug-and-play” deep neural network, which accelerates the deployment of deep neural networks for the prediction of novel small-molecule chemical properties.

  2. Evaluating the Paper-to-Screen Translation of Participant-Aided Sociograms with High-Risk Participants

    PubMed Central

    Hogan, Bernie; Melville, Joshua R.; Philips, Gregory Lee; Janulis, Patrick; Contractor, Noshir; Mustanski, Brian S.; Birkett, Michelle

    2016-01-01

    While much social network data exists online, key network metrics for high-risk populations must still be captured through self-report. This practice has suffered from numerous limitations in workflow and response burden. However, advances in technology, network drawing libraries and databases are making interactive network drawing increasingly feasible. We describe the translation of an analog-based technique for capturing personal networks into a digital framework termed netCanvas that addresses many existing shortcomings such as: 1) complex data entry; 2) extensive interviewer intervention and field setup; 3) difficulties in data reuse; and 4) a lack of dynamic visualizations. We test this implementation within a health behavior study of a high-risk and difficult-to-reach population. We provide a within-subjects comparison between paper and touchscreens. We assert that touchscreen-based social network capture is now a viable alternative for highly sensitive data and social network data entry tasks. PMID:28018995

  3. Evaluating the Paper-to-Screen Translation of Participant-Aided Sociograms with High-Risk Participants.

    PubMed

    Hogan, Bernie; Melville, Joshua R; Philips, Gregory Lee; Janulis, Patrick; Contractor, Noshir; Mustanski, Brian S; Birkett, Michelle

    2016-05-01

    While much social network data exists online, key network metrics for high-risk populations must still be captured through self-report. This practice has suffered from numerous limitations in workflow and response burden. However, advances in technology, network drawing libraries and databases are making interactive network drawing increasingly feasible. We describe the translation of an analog-based technique for capturing personal networks into a digital framework termed netCanvas that addresses many existing shortcomings such as: 1) complex data entry; 2) extensive interviewer intervention and field setup; 3) difficulties in data reuse; and 4) a lack of dynamic visualizations. We test this implementation within a health behavior study of a high-risk and difficult-to-reach population. We provide a within-subjects comparison between paper and touchscreens. We assert that touchscreen-based social network capture is now a viable alternative for highly sensitive data and social network data entry tasks.

  4. Making working memory work: a meta-analysis of executive-control and working memory training in older adults.

    PubMed

    Karbach, Julia; Verhaeghen, Paul

    2014-11-01

    This meta-analysis examined the effects of process-based executive-function and working memory training (49 articles, 61 independent samples) in older adults (> 60 years). The interventions resulted in significant effects on performance on the trained task and near-transfer tasks; significant results were obtained for the net pretest-to-posttest gain relative to active and passive control groups and for the net effect at posttest relative to active and passive control groups. Far-transfer effects were smaller than near-transfer effects but were significant for the net pretest-to-posttest gain relative to passive control groups and for the net gain at posttest relative to both active and passive control groups. We detected marginally significant differences in training-induced improvements between working memory and executive-function training, but no differences between the training-induced improvements observed in older adults and younger adults, between the benefits associated with adaptive and nonadaptive training, or between the effects in active and passive control conditions. Gains did not vary with total training time. © The Author(s) 2014.
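
    The "net gain" contrasts used throughout this meta-analysis are difference-in-differences at the group level; a toy Python sketch (the group means are invented for illustration).

        def net_gain(pre_t, post_t, pre_c, post_c):
            """Training-group pre-to-post gain net of the control group's gain."""
            return (post_t - pre_t) - (post_c - pre_c)

        print(net_gain(10.0, 14.0, 10.2, 11.0))  # -> 3.2

    In a meta-analysis such contrasts are pooled as standardized effect sizes rather than raw scores.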

  5. RATT: Rapid Annotation Transfer Tool

    PubMed Central

    Otto, Thomas D.; Dillon, Gary P.; Degrave, Wim S.; Berriman, Matthew

    2011-01-01

    Second-generation sequencing technologies have made large-scale sequencing projects commonplace. However, making use of these datasets often requires gene function to be ascribed genome wide. Although tool development has kept pace with the changes in sequence production, for tasks such as mapping, de novo assembly or visualization, genome annotation remains a challenge. We have developed a method to rapidly provide accurate annotation for new genomes using previously annotated genomes as a reference. The method, implemented in a tool called RATT (Rapid Annotation Transfer Tool), transfers annotations from a high-quality reference to a new genome on the basis of conserved synteny. We demonstrate that a Mycobacterium tuberculosis genome or a single 2.5 Mb chromosome from a malaria parasite can be annotated in less than five minutes with only modest computational resources. RATT is available at http://ratt.sourceforge.net. PMID:21306991

  6. WE-G-BRA-02: SafetyNet: Automating Radiotherapy QA with An Event Driven Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hadley, S; Kessler, M; Litzenberg, D

    2015-06-15

    Purpose: Quality assurance is an essential task in radiotherapy that often requires many manual tasks. We investigate the use of an event driven framework in conjunction with software agents to automate QA and eliminate wait times. Methods: An in-house developed subscription-publication service, EventNet, was added to the Aria OIS to be a message broker for critical events occurring in the OIS and software agents. Software agents operate without user intervention and perform critical QA steps. The results of the QA are documented and the resulting event is generated and passed back to EventNet. Users can subscribe to those events and receive messages based on custom filters designed to send passing or failing results to physicists or dosimetrists. Agents were developed to expedite the following QA tasks: Plan Revision, Plan 2nd Check, SRS Winston-Lutz isocenter, Treatment History Audit, Treatment Machine Configuration. Results: Plan approval in the Aria OIS was used as the event trigger for the plan revision QA and Plan 2nd Check agents. The agents pulled the plan data, executed the prescribed QA, stored the results and updated EventNet for publication. The Winston-Lutz agent reduced QA time from 20 minutes to 4 minutes and provided a more accurate quantitative estimate of radiation isocenter. The Treatment Machine Configuration agent automatically reports any changes to the treatment machine or HDR unit configuration. The agents are reliable, act immediately, and execute each task identically every time. Conclusion: An event driven framework has inverted the data chase in our radiotherapy QA process. Rather than have dosimetrists and physicists push data to QA software and pull results back into the OIS, the software agents perform these steps immediately upon receiving the sentinel events from EventNet. Mr. Keranen is an employee of Varian Medical Systems. Dr. Moran’s institution receives research support for her effort for a linear accelerator QA project from Varian Medical Systems. Other quality projects involving her effort are funded by Blue Cross Blue Shield of Michigan, Breast Cancer Research Foundation, and the NIH.
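
    The EventNet pattern (a broker relaying sentinel events to software agents that run QA immediately) is an instance of publish-subscribe; the sketch below is a generic Python illustration of the idea, not Varian's or the authors' code.

        from collections import defaultdict

        class Broker:
            """Minimal publish-subscribe message broker."""
            def __init__(self):
                self.subs = defaultdict(list)

            def subscribe(self, topic, handler):
                self.subs[topic].append(handler)

            def publish(self, topic, payload):
                for handler in self.subs[topic]:
                    handler(payload)

        broker = Broker()

        def second_check_agent(plan):
            # The agent fires on the sentinel event, performs its QA step without
            # user intervention, then publishes the result for subscribers.
            broker.publish("qa.result", {"plan": plan, "passed": True})  # placeholder QA

        broker.subscribe("plan.approved", second_check_agent)
        broker.subscribe("qa.result", lambda r: print("QA result:", r))
        broker.publish("plan.approved", "Plan-001")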

  7. Advancing medical device innovation through collaboration and coordination of structured data capture pilots: Report from the Medical Device Epidemiology Network (MDEpiNet) Specific, Measurable, Achievable, Results-Oriented, Time Bound (SMART) Think Tank.

    PubMed

    Reed, Terrie L; Drozda, Joseph P; Baskin, Kevin M; Tcheng, James; Conway, Karen; Wilson, Natalia; Marinac-Dabic, Danica; Heise, Theodore; Krucoff, Mitchell W

    2017-12-01

    The Medical Device Epidemiology Network (MDEpiNet) is a public private partnership (PPP) that provides a platform for collaboration on medical device evaluation and depth of expertise for supporting pilots to capture, exchange and use device information for improving device safety and protecting public health. The MDEpiNet SMART Think Tank, held in February 2013, sought to engage expert stakeholders who were committed to improving the capture of device data, including Unique Device Identification (UDI), in key electronic health information. Prior to the Think Tank there was limited collaboration among stakeholders beyond a few single health care organizations engaged in electronic capture and exchange of device data. The Think Tank resulted in what have become two sustainable multi-stakeholder device data capture initiatives, BUILD and VANGUARD. These initiatives continue to mature within the MDEpiNet PPP structure and are well aligned with the goals outlined in recent FDA-initiated National Medical Device Planning Board and Medical Device Registry Task Force white papers as well as the vision for the National Evaluation System for health Technology. Published by Elsevier Inc.

  8. Automated Pathogenesis-Based Diagnosis of Lumbar Neural Foraminal Stenosis via Deep Multiscale Multitask Learning.

    PubMed

    Han, Zhongyi; Wei, Benzheng; Leung, Stephanie; Nachum, Ilanit Ben; Laidley, David; Li, Shuo

    2018-02-15

    Pathogenesis-based diagnosis is a key step to prevent and control lumbar neural foraminal stenosis (LNFS). It conducts both early diagnosis and comprehensive assessment by drawing crucial pathological links between pathogenic factors and LNFS. Automated pathogenesis-based diagnosis would simultaneously localize and grade multiple spinal organs (neural foramina, vertebrae, intervertebral discs) to diagnose LNFS and discover pathogenic factors. The automated approach facilitates planning optimal therapeutic schedules and relieves clinicians of laborious workloads. However, no successful work has been achieved yet due to its extreme challenges: 1) multiple targets: each lumbar spine has at least 17 target organs; 2) multiple scales: each type of target organ has structural complexity and varying scales across subjects; and 3) multiple tasks: simultaneous localization and diagnosis of all lumbar organs is far more difficult than the individual tasks. To address these huge challenges, we propose a deep multiscale multitask learning network (DMML-Net) integrating multiscale multi-output learning and multitask regression learning into a fully convolutional network. 1) DMML-Net merges semantic representations to reinforce the salience of numerous target organs. 2) DMML-Net extends multiscale convolutional layers as multiple output layers to boost the scale-invariance for various organs. 3) DMML-Net joins a multitask regression module and a multitask loss module to prompt the mutual benefit between tasks. Extensive experimental results demonstrate that DMML-Net achieves high performance (0.845 mean average precision) on T1/T2-weighted MRI scans from 200 subjects. This makes our method an efficient tool for clinical LNFS diagnosis.

  9. Muscles do more positive than negative work in human locomotion

    PubMed Central

    DeVita, Paul; Helseth, Joseph; Hortobagyi, Tibor

    2008-01-01

    Summary: Muscle work during level walking and ascent and descent ramp and stairway walking was assessed in order to explore the proposition that muscles perform more positive than negative work during these locomotion tasks. Thirty four healthy human adults were tested while maintaining a constant average walking velocity in the five gait conditions. Ground reaction force and sagittal plane kinematic data were obtained during the stance phases of these gaits and used in inverse dynamic analyses to calculate joint torques and powers at the hip, knee and ankle. Muscle work was derived as the area under the joint power vs time curves and was partitioned into positive, negative and net components. Dependent t-tests were used to compare positive and negative work in level walking and net joint work between ascent and descent gaits on the ramp and stairs (P<0.010). Total negative and positive work in level walking was −34 J and 50 J, respectively, with the difference in magnitude being statistically significant (P<0.001). Level walking was therefore performed with 16 J of net positive muscle work per step. The magnitude of the net work in ramp ascent was 25% greater than the magnitude of net work in ramp descent (89 vs −71 J m⁻¹, P<0.010). Similarly, the magnitude of the net work in stair ascent was 43% greater than the magnitude of net work in stair descent (107 vs −75 J step⁻¹, P<0.000). We identified three potential causes for the reduced negative vs positive work in these locomotion tasks: (1) the larger magnitude of the accelerations induced by the larger ground reaction forces in descending compared to ascending gaits elicited greater energy dissipation in non-muscular tissues, (2) the ground reaction force vector was directed closer to the joint centers in ramp and stair descent compared to ascent, which reduced the load on the muscular tissues and their energy dissipating response, and (3) despite the need to produce negative muscle work in descending gaits, both ramp and stair descent also had positive muscle work to propel the lower extremity upward and forward into the swing phase movement trajectory. We used these data to formulate two novel hypotheses about human locomotion. First, level walking requires muscles to generate a net positive amount of work per gait cycle to overcome energy losses by other tissues. Second, skeletal muscles generate more mechanical energy in gait tasks that raise the center of mass compared to the mechanical energy they dissipate in gait tasks that lower the center of mass, despite equivalent changes in total mechanical energy. PMID:17872990

  10. Muscles do more positive than negative work in human locomotion.

    PubMed

    DeVita, Paul; Helseth, Joseph; Hortobagyi, Tibor

    2007-10-01

    Muscle work during level walking and ascent and descent ramp and stairway walking was assessed in order to explore the proposition that muscles perform more positive than negative work during these locomotion tasks. Thirty four healthy human adults were tested while maintaining a constant average walking velocity in the five gait conditions. Ground reaction force and sagittal plane kinematic data were obtained during the stance phases of these gaits and used in inverse dynamic analyses to calculate joint torques and powers at the hip, knee and ankle. Muscle work was derived as the area under the joint power vs time curves and was partitioned into positive, negative and net components. Dependent t-tests were used to compare positive and negative work in level walking and net joint work between ascent and descent gaits on the ramp and stairs (P<0.010). Total negative and positive work in level walking was -34 J and 50 J, respectively, with the difference in magnitude being statistically significant (P<0.001). Level walking was therefore performed with 16 J of net positive muscle work per step. The magnitude of the net work in ramp ascent was 25% greater than the magnitude of net work in ramp descent (89 vs -71 J m(-1), P<0.010). Similarly, the magnitude of the net work in stair ascent was 43% greater than the magnitude of net work in stair descent (107 vs -75 J step(-1), P<0.000). We identified three potential causes for the reduced negative vs positive work in these locomotion tasks: (1) the larger magnitude of the accelerations induced by the larger ground reaction forces in descending compared to ascending gaits elicited greater energy dissipation in non-muscular tissues, (2) the ground reaction force vector was directed closer to the joint centers in ramp and stair descent compared to ascent, which reduced the load on the muscular tissues and their energy dissipating response, and (3) despite the need to produce negative muscle work in descending gaits, both ramp and stair descent also had positive muscle work to propel the lower extremity upward and forward into the swing phase movement trajectory. We used these data to formulate two novel hypotheses about human locomotion. First, level walking requires muscles to generate a net positive amount of work per gait cycle to overcome energy losses by other tissues. Second, skeletal muscles generate more mechanical energy in gait tasks that raise the center of mass compared to the mechanical energy they dissipate in gait tasks that lower the center of mass, despite equivalent changes in total mechanical energy.
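
    The work partitioning used in both versions of this study reduces to integrating the joint power curve and splitting the integral by sign; a numpy sketch (function and variable names are ours).

        import numpy as np

        def partition_work(power, dt):
            """Split a joint power curve (W) into positive, negative, and net work (J)."""
            pos = np.trapz(np.clip(power, 0, None), dx=dt)
            neg = np.trapz(np.clip(power, None, 0), dx=dt)
            return pos, neg, pos + neg

    Summed across hip, knee, and ankle, positive work of 50 J and negative work of -34 J give the 16 J of net positive muscle work per step reported for level walking.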

  11. A comparison of usability factors of four mobile devices for accessing healthcare information by adolescents.

    PubMed

    Sheehan, B; Lee, Y; Rodriguez, M; Tiase, V; Schnall, R

    2012-01-01

    Mobile health (mHealth) is a growing field aimed at developing mobile information and communication technologies for healthcare. Adolescents are known for their ubiquitous use of mobile technologies in everyday life. However, the use of mHealth tools among adolescents is not well described. We examined the usability of four commonly used mobile devices (an iPhone, an Android with touchscreen keyboard, an Android with built-in keyboard, and an iPad) for accessing healthcare information among a group of urban-dwelling adolescents. Guided by the FITT (Fit between Individuals, Task, and Technology) framework, a think-aloud protocol was combined with a questionnaire to describe usability on three dimensions: 1) task-technology fit; 2) individual-technology fit; and 3) individual-task fit. For task-technology fit, we compared the efficiency and effectiveness of each of the devices tested and found that the iPhone was the most usable: it had the fewest errors and prompts and the lowest mean overall task time. For individual-task fit, we compared efficiency and learnability measures by website tasks and found no statistically significant effect on task steps, task time and number of errors. Following our comparison of success rates by website tasks, we compared the difference between two mobile applications used for diet tracking and found a statistically significant effect on task steps, task time and number of errors. For individual-technology fit, interface quality was significantly different across devices, indicating that this is an important factor to be considered in developing future mobile devices. All of our users were able to complete all of the tasks; however, the time needed to complete the tasks differed significantly by mobile device and mHealth application. Future design of mobile technology and mHealth applications should place particular importance on interface quality.

  12. Djeen (Database for Joomla!'s Extensible Engine): a research information management system for flexible multi-technology project administration.

    PubMed

    Stahl, Olivier; Duvergey, Hugo; Guille, Arnaud; Blondin, Fanny; Vecchio, Alexandre Del; Finetti, Pascal; Granjeaud, Samuel; Vigy, Oana; Bidaut, Ghislain

    2013-06-06

    With the advance of post-genomic technologies, the need for tools to manage large scale data in biology becomes more pressing. This involves annotating and storing data securely, as well as granting permissions flexibly with several technologies (all array types, flow cytometry, proteomics) for collaborative work and data sharing. This task is not easily achieved with most systems available today. We developed Djeen (Database for Joomla!'s Extensible Engine), a new Research Information Management System (RIMS) for collaborative projects. Djeen is a user-friendly application, designed to streamline data storage and annotation collaboratively. Its database model, kept simple, is compliant with most technologies and allows storing and managing of heterogeneous data with the same system. Advanced permissions are managed through different roles. Templates allow Minimum Information (MI) compliance. Djeen allows managing projects associated with heterogeneous data types while enforcing annotation integrity and minimum information. Projects are managed within a hierarchy and user permissions are finely-grained for each project, user and group. Djeen Component source code (version 1.5.1) and installation documentation are available under CeCILL license from http://sourceforge.net/projects/djeen/files and supplementary material.

  13. Djeen (Database for Joomla!’s Extensible Engine): a research information management system for flexible multi-technology project administration

    PubMed Central

    2013-01-01

    Background: With the advance of post-genomic technologies, the need for tools to manage large scale data in biology becomes more pressing. This involves annotating and storing data securely, as well as granting permissions flexibly with several technologies (all array types, flow cytometry, proteomics) for collaborative work and data sharing. This task is not easily achieved with most systems available today. Findings: We developed Djeen (Database for Joomla!’s Extensible Engine), a new Research Information Management System (RIMS) for collaborative projects. Djeen is a user-friendly application, designed to streamline data storage and annotation collaboratively. Its database model, kept simple, is compliant with most technologies and allows storing and managing of heterogeneous data with the same system. Advanced permissions are managed through different roles. Templates allow Minimum Information (MI) compliance. Conclusion: Djeen allows managing projects associated with heterogeneous data types while enforcing annotation integrity and minimum information. Projects are managed within a hierarchy and user permissions are finely-grained for each project, user and group. Djeen Component source code (version 1.5.1) and installation documentation are available under CeCILL license from http://sourceforge.net/projects/djeen/files and supplementary material. PMID:23742665

  14. Cost and benefit estimates of partially-automated vehicle collision avoidance technologies.

    PubMed

    Harper, Corey D; Hendrickson, Chris T; Samaras, Constantine

    2016-10-01

    Many light-duty vehicle crashes occur due to human error and distracted driving. Partially-automated crash avoidance features offer the potential to reduce the frequency and severity of vehicle crashes that occur due to distracted driving and/or human error by assisting in maintaining control of the vehicle or issuing alerts if a potentially dangerous situation is detected. This paper evaluates the benefits and costs of fleet-wide deployment of blind spot monitoring, lane departure warning, and forward collision warning crash avoidance systems within the US light-duty vehicle fleet. The three crash avoidance technologies could collectively prevent or reduce the severity of as many as 1.3 million U.S. crashes a year, including 133,000 injury crashes and 10,100 fatal crashes. For this paper we made two estimates of potential benefits in the United States: (1) the upper bound fleet-wide technology diffusion benefits, assuming all relevant crashes are avoided, and (2) the lower bound fleet-wide benefits of the three technologies based on observed insurance data. The latter represents a lower bound, as the technology improves over time and costs fall with scale economies. All three technologies could collectively provide a lower bound annual benefit of about $18 billion if equipped on all light-duty vehicles. With 2015 pricing of safety options, the total annual cost to equip all light-duty vehicles with the three technologies would be about $13 billion, resulting in an annual net benefit of about $4 billion, or a $20 per vehicle net benefit. By assuming all relevant crashes are avoided, the total upper bound annual net benefit from all three technologies combined is about $202 billion, or an $861 per vehicle net benefit, at current technology costs. The technologies we explore in this paper represent an early form of vehicle automation, and a positive net benefit suggests the fleet-wide adoption of these technologies would be beneficial from an economic and social perspective. Copyright © 2016 Elsevier Ltd. All rights reserved.
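
    The per-vehicle figure follows from simple division; a toy check (the fleet size is our back-calculation from the paper's rounded numbers, not a figure the paper states).

        net_total = 4e9           # rounded lower-bound annual net benefit, USD
        fleet = 200e6             # implied US light-duty fleet size (our assumption)
        print(net_total / fleet)  # -> 20.0 USD per vehicle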

  15. Using Cloud-based Storage Technologies for Earth Science Data

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Readey, J.; Votava, P.

    2016-12-01

    Cloud-based infrastructure may offer several key benefits, including scalability, built-in redundancy, and reduced total cost of ownership as compared with a traditional data center approach. However, most of the tools and software systems developed for NASA data repositories were not developed with a cloud-based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Object storage services are provided by all the leading public clouds (Amazon Web Services, Microsoft Azure, Google Cloud, etc.) and private clouds (OpenStack), and may provide a more cost-effective means of storing large data collections online. We describe a system that utilizes object storage rather than traditional file-system-based storage to serve earth science data. The system described is not only cost effective, but shows superior performance for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.
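
    To illustrate the API-compatibility idea, here is a minimal sketch assuming the open-source h5pyd package, an h5py-compatible client for HDF data served from object storage over REST; the endpoint and file paths below are invented placeholders, not a real NASA service:

        import h5pyd  # drop-in replacement for h5py when data sits behind a REST service

        f = h5pyd.File("/shared/example/temperature.h5", "r",
                       endpoint="http://hsds.example.org")  # hypothetical endpoint
        dset = f["surface_temp"]            # behaves like an h5py Dataset
        tile = dset[0, 100:200, 100:200]    # server reads only the selected chunks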

  16. Neural network technologies

    NASA Technical Reports Server (NTRS)

    Villarreal, James A.

    1991-01-01

    A whole new arena of computer technologies is now beginning to form. Still in its infancy, neural network technology is a biologically inspired methodology which draws on nature's own cognitive processes. The Software Technology Branch has provided a software tool, Neural Execution and Training System (NETS), to industry, government, and academia to facilitate and expedite the use of this technology. NETS is written in the C programming language and can be executed on a variety of machines. Once a network has been debugged, NETS can produce a C source code which implements the network. This code can then be incorporated into other software systems. Described here are various software projects currently under development with NETS and the anticipated future enhancements to NETS and the technology.

  17. A Petri-net coordination model for an intelligent mobile robot

    NASA Technical Reports Server (NTRS)

    Wang, F.-Y.; Kyriakopoulos, K. J.; Tsolkas, A.; Saridis, G. N.

    1990-01-01

    The authors present a Petri net model of the coordination level of an intelligent mobile robot system (IMRS). The purpose of this model is to specify the integration of the individual efforts on path planning, supervisory motion control, and vision systems that are necessary for the autonomous operation of the mobile robot in a structured dynamic environment. This is achieved by analytically modeling the various units of the system as Petri net transducers and explicitly representing the task precedence and information dependence among them. The model can also be used to simulate the task processing and to evaluate the efficiency of operations and the responsibility of decisions in the coordination level of the IMRS. Some simulation results on the task processing and learning are presented.
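
    To make the token-game concrete, here is a minimal place/transition Petri net in Python with the usual firing semantics; this is a generic sketch, not the authors' Petri net transducer formalism, and the place and transition names are invented for illustration:

        class PetriNet:
            """Places hold token counts; a transition fires by consuming one
            token from each input place and producing one in each output."""
            def __init__(self, marking):
                self.marking = dict(marking)   # place name -> token count
                self.transitions = {}          # name -> (inputs, outputs)

            def add_transition(self, name, inputs, outputs):
                self.transitions[name] = (list(inputs), list(outputs))

            def enabled(self, name):
                inputs, _ = self.transitions[name]
                return all(self.marking.get(p, 0) >= 1 for p in inputs)

            def fire(self, name):
                assert self.enabled(name), f"{name} is not enabled"
                inputs, outputs = self.transitions[name]
                for p in inputs:
                    self.marking[p] -= 1
                for p in outputs:
                    self.marking[p] = self.marking.get(p, 0) + 1

        # toy coordination constraint: motion may start only after both the
        # vision system and the path planner have reported
        net = PetriNet({"vision_ready": 1, "path_planned": 1})
        net.add_transition("start_motion", ["vision_ready", "path_planned"], ["moving"])
        net.fire("start_motion")   # marking now holds one token in "moving"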

  18. The Customers' Perspective: The EdNET 98 Survey of Buyers and Managers of Educational Technology. Constructive Input for the Educational Technology Industry from the EdNET 98 Education Executives Advisory Board.

    ERIC Educational Resources Information Center

    Craighead, Donna; Bigham, Vicki Smith; Heller, Nelson B.

    The EdNET 98 Education Executives Advisory Board, also known as the Partners in Education Program (PEP), is a featured activity of the EdNET 98 Conference. Its focus is to bring educators and vendors together to share their perspectives on technology in education and to discuss technology-related concerns and issues. This report presents results…

  19. Experimental investigations into cryosorption pumping of plasma exhaust

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perinic, D.; Mack, A.

    1988-09-01

    Within the framework of the European Fusion Technology Programme, the Karlsruhe Nuclear Research Centre has been awarded a contract for the development of cryosorption panels for compound cryopumps of the NET plasma exhaust pumping system. This task includes the development of a bonding technique for porous sorbent materials with metal substrates and a test programme for the development and optimization of cryopanels. A variety of material combinations for sorbent, bonding and substrate were evaluated and listed in a test matrix. Bonding tests involving soldering, cementing and plasma spraying techniques have been carried out.

  20. NetNorM: Capturing cancer-relevant information in somatic exome mutation data with gene networks for cancer stratification and prognosis.

    PubMed

    Le Morvan, Marine; Zinovyev, Andrei; Vert, Jean-Philippe

    2017-06-01

    Genome-wide somatic mutation profiles of tumours can now be assessed efficiently and promise to move precision medicine forward. Statistical analysis of mutation profiles is, however, challenging due to the low frequency of most mutations, the varying mutation rates across tumours, and the presence of a majority of passenger events that hide the contribution of driver events. Here we propose a method, NetNorM, to represent whole-exome somatic mutation data in a form that enhances cancer-relevant information, using a gene network as background knowledge. We evaluate its relevance for two tasks: survival prediction and unsupervised patient stratification. Using data from 8 cancer types from The Cancer Genome Atlas (TCGA), we show that it improves over the raw binary mutation data and network diffusion for these two tasks. In doing so, we also provide a thorough assessment of the prognostic power of somatic mutations, which has been overlooked by previous studies because of the sparse and binary nature of mutations.

  1. NetNorM: Capturing cancer-relevant information in somatic exome mutation data with gene networks for cancer stratification and prognosis

    PubMed Central

    2017-01-01

    Genome-wide somatic mutation profiles of tumours can now be assessed efficiently and promise to move precision medicine forward. Statistical analysis of mutation profiles is, however, challenging due to the low frequency of most mutations, the varying mutation rates across tumours, and the presence of a majority of passenger events that hide the contribution of driver events. Here we propose a method, NetNorM, to represent whole-exome somatic mutation data in a form that enhances cancer-relevant information, using a gene network as background knowledge. We evaluate its relevance for two tasks: survival prediction and unsupervised patient stratification. Using data from 8 cancer types from The Cancer Genome Atlas (TCGA), we show that it improves over the raw binary mutation data and network diffusion for these two tasks. In doing so, we also provide a thorough assessment of the prognostic power of somatic mutations, which has been overlooked by previous studies because of the sparse and binary nature of mutations. PMID:28650955

  2. SEMANTIC3D.NET: a New Large-Scale Point Cloud Classification Benchmark

    NASA Astrophysics Data System (ADS)

    Hackel, T.; Savinov, N.; Ladicky, L.; Wegner, J. D.; Schindler, K.; Pollefeys, M.

    2017-05-01

    This paper presents a new 3D point cloud classification benchmark data set with over four billion manually labelled points, meant as input for data-hungry (deep) learning methods. We also discuss first submissions to the benchmark that use deep convolutional neural networks (CNNs) as a workhorse, which already show remarkable performance improvements over the state of the art. CNNs have become the de-facto standard for many tasks in computer vision and machine learning, like semantic segmentation or object detection in images, but have not yet led to a true breakthrough for 3D point cloud labelling tasks due to a lack of training data. With the massive data set presented in this paper, we aim at closing this data gap to help unleash the full potential of deep learning methods for 3D labelling tasks. Our semantic3D.net data set consists of dense point clouds acquired with static terrestrial laser scanners. It contains 8 semantic classes and covers a wide range of urban outdoor scenes: churches, streets, railroad tracks, squares, villages, soccer fields and castles. We describe our labelling interface and show that our data set provides denser and more complete point clouds, with a much higher overall number of labelled points, than those already available to the research community. We further provide baseline method descriptions and a comparison between methods submitted to our online system. We hope semantic3D.net will pave the way for deep learning methods in 3D point cloud labelling to learn richer, more general 3D representations, and first submissions after only a few months indicate that this might indeed be the case.

  3. Internet-enabled solutions for health care business problems.

    PubMed

    Kennedy, R; Geisler, M

    1997-01-01

    Many health care delivery organizations have built, installed, or made use of Nets. As single entities merge with others, and independent institutions become part of much larger delivery networks, the need for collaboration is critical. With the formation of such partnerships, existing platforms will become increasingly available from which it will be possible to build disparate technologies that must somehow be part of a single working "system." Nets can enable this leveraging, allowing access from multiple technological platforms. The collaboration, distribution, application integration, and messaging possibilities with the Nets are unprecedented. We believe that meeting a health care delivery organization's needs without these benefits will soon be unthinkable. While Nets are not the answer to the challenges facing health care delivery today, they certainly are a large contributor to the solution.

  4. U.S. Army Human Capital Enterprise (HCE) ARFORGEN Data Management, Correlation, Integration and Synchronization Analysis

    DTIC Science & Technology

    2011-08-15

    The system must, at a minimum, include a design and configuration framework supporting: Part 1. Net Ready. The system must support net-centric operations... Analyze, evaluate and incorporate the relevant DoD Architecture Framework. 5) Document standards for each task/condition combination. 6) Prepare final FAA... task: Analyze, evaluate and incorporate the relevant Army Architecture Framework; document standards for each task/condition combination forming...

  5. Forecasting the impact of virtual environment technology on maintenance training

    NASA Technical Reports Server (NTRS)

    Schlager, Mark S.; Boman, Duane; Piantanida, Tom; Stephenson, Robert

    1993-01-01

    To assist NASA and the Air Force in determining how and when to invest in virtual environment (VE) technology for maintenance training, we identified possible roles for VE technology in such training, assessed its cost-effectiveness relative to existing technologies, and formulated recommendations for a research agenda that would address instructional and system development issues involved in fielding a VE training system. In the first phase of the study, we surveyed VE developers to forecast capabilities, maturity, and estimated costs for VE component technologies. We then identified maintenance tasks and their training costs through interviews with maintenance technicians, instructors, and training developers. Ten candidate tasks were selected from two classes of maintenance tasks (seven aircraft maintenance and three space maintenance) using five criteria developed to identify types of tasks most likely to benefit from VE training. Three tasks were used as specific cases for cost-benefit analysis. In formulating research recommendations, we considered three aspects of feasibility: technological considerations, cost-effectiveness, and anticipated R&D efforts. In this paper, we describe the major findings in each of these areas and suggest research efforts that we believe will help achieve the goal of a cost-effective VE maintenance training system by the next decade.

  6. OR.NET: multi-perspective qualitative evaluation of an integrated operating room based on IEEE 11073 SDC.

    PubMed

    Rockstroh, M; Franke, S; Hofer, M; Will, A; Kasparick, M; Andersen, B; Neumuth, T

    2017-08-01

    Clinical working environments have become very complex, imposing many different tasks in diagnosis, medical treatment, and care procedures. During the German flagship project OR.NET, more than 50 partners developed technologies for the open integration of medical devices and IT systems in the operating room. The aim of the present work was to evaluate a large set of the proposed concepts from the perspectives of various stakeholders. The demonstration OR focuses on head and neck surgery interventions and was developed in close cooperation with surgeons and numerous colleagues of the project partners. The demonstration OR was qualitatively evaluated with respect to technical as well as clinical aspects. In the evaluation, a questionnaire was used to obtain feedback from hospital operators. The clinical implications were covered by structured interviews with surgeons, anesthesiologists and OR staff. The feedback of the clinicians indicates that there is a need for flexible data and control integration. The hospital operators stress the need for tools to simplify risk management in openly integrated operating rooms. The implementation of openly integrated operating rooms will positively affect surgeons, anesthesiologists, the surgical nursing staff, as well as the technical personnel and hospital operators. The evaluation demonstrated the need for OR integration technologies and identified the missing tools to support risk management and approval as the main barriers to future installations.

  7. Values in the Net Neutrality Debate: Applying Content Analysis to Testimonies from Public Hearings

    ERIC Educational Resources Information Center

    Cheng, An-Shou

    2012-01-01

    The Net neutrality debate is an important telecommunications policy issue that is closely tied to technological innovation, economic development, and information access. Existing studies on Net neutrality have focused primarily on technological requirements, economic analysis, and regulatory justifications. Since values, technology, and policy are…

  8. Multi-focus and multi-level techniques for visualization and analysis of networks with thematic data

    NASA Astrophysics Data System (ADS)

    Cossalter, Michele; Mengshoel, Ole J.; Selker, Ted

    2013-01-01

    Information-rich data sets bring several challenges in the areas of visualization and analysis, even when associated with node-link network visualizations. This paper presents an integration of multi-focus and multi-level techniques that enable interactive, multi-step comparisons in node-link networks. We describe NetEx, a visualization tool that enables users to simultaneously explore different parts of a network and its thematic data, such as time series or conditional probability tables. NetEx, implemented as a Cytoscape plug-in, has been applied to the analysis of electrical power networks, Bayesian networks, and the Enron e-mail repository. In this paper we briefly discuss visualization and analysis of the Enron social network, but focus on data from an electrical power network. Specifically, we demonstrate how NetEx supports the analytical task of electrical power system fault diagnosis. Results from a user study with 25 subjects suggest that NetEx enables more accurate isolation of complex faults compared to an especially designed software tool.

  9. BrainNetCNN: Convolutional neural networks for brain networks; towards predicting neurodevelopment.

    PubMed

    Kawahara, Jeremy; Brown, Colin J; Miller, Steven P; Booth, Brian G; Chau, Vann; Grunau, Ruth E; Zwicker, Jill G; Hamarneh, Ghassan

    2017-02-01

    We propose BrainNetCNN, a convolutional neural network (CNN) framework to predict clinical neurodevelopmental outcomes from brain networks. In contrast to the spatially local convolutions done in traditional image-based CNNs, our BrainNetCNN is composed of novel edge-to-edge, edge-to-node and node-to-graph convolutional filters that leverage the topological locality of structural brain networks. We apply the BrainNetCNN framework to predict cognitive and motor developmental outcome scores from structural brain networks of infants born preterm. Diffusion tensor images (DTI) of preterm infants, acquired between 27 and 46 weeks gestational age, were used to construct a dataset of structural brain connectivity networks. We first demonstrate the predictive capabilities of BrainNetCNN on synthetic phantom networks with simulated injury patterns and added noise. BrainNetCNN outperforms a fully connected neural-network with the same number of model parameters on both phantoms with focal and diffuse injury patterns. We then apply our method to the task of joint prediction of Bayley-III cognitive and motor scores, assessed at 18 months of age, adjusted for prematurity. We show that our BrainNetCNN framework outperforms a variety of other methods on the same data. Furthermore, BrainNetCNN is able to identify an infant's postmenstrual age to within about 2 weeks. Finally, we explore the high-level features learned by BrainNetCNN by visualizing the importance of each connection in the brain with respect to predicting the outcome scores. These findings are then discussed in the context of the anatomy and function of the developing preterm infant brain. Copyright © 2016 Elsevier Inc. All rights reserved.
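
    The edge-to-edge idea admits a compact sketch: a cross-shaped filter over an N x N connectivity matrix, computed as the sum of a row convolution and a column convolution. This is our minimal PyTorch reading of the layer described above, not the authors' released code:

        import torch
        import torch.nn as nn

        class EdgeToEdge(nn.Module):
            """Cross-shaped (row + column) filter over N x N connectomes."""
            def __init__(self, in_ch, out_ch, n):
                super().__init__()
                self.row = nn.Conv2d(in_ch, out_ch, kernel_size=(1, n))
                self.col = nn.Conv2d(in_ch, out_ch, kernel_size=(n, 1))

            def forward(self, x):        # x: (batch, in_ch, N, N)
                r = self.row(x)          # (batch, out_ch, N, 1)
                c = self.col(x)          # (batch, out_ch, 1, N)
                return r + c             # broadcasts to (batch, out_ch, N, N)

        x = torch.randn(8, 1, 90, 90)     # a batch of 90-node connectivity matrices
        out = EdgeToEdge(1, 32, 90)(x)    # -> shape (8, 32, 90, 90)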

  10. Understanding the digital divide in the clinical setting: the technology knowledge gap experienced by US safety net patients during teleretinal screening.

    PubMed

    George, Sheba; Moran, Erin; Fish, Allison; Ogunyemi, Lola

    2013-01-01

    Differential access to everyday technology and healthcare amongst safety net patients is associated with low technological and health literacies, respectively. These low rates of literacy produce a complex patient "knowledge gap" that influences the effectiveness of telehealth technologies. To understand this "knowledge gap", six focus groups (2 African-American and 4 Latino) were conducted with patients who received teleretinal screenings in U.S. urban safety-net settings. Findings indicate that patients' "knowledge gap" is primarily produced at three points: (1) when patients' preexisting personal barriers to care became exacerbated in the clinical setting; (2) through encounters with technology during screening; and (3) in doctor-patient follow-up. This "knowledge gap" can produce confusion and fear, potentially affecting patients' confidence in quality of care and limiting their disease management ability. In rethinking the digital divide to include the consequences of this knowledge gap faced by patients in the clinical setting, we suggest that patient education focus on both their disease and specific telehealth technologies deployed in care delivery.

  11. A Binary Array Asynchronous Sorting Algorithm with Using Petri Nets

    NASA Astrophysics Data System (ADS)

    Voevoda, A. A.; Romannikov, D. O.

    2017-01-01

    The tasks of speeding up and/or optimizing computations remain highly relevant. Among the approaches to these tasks, this paper considers applying parallelization and asynchronization to a sorting algorithm. Sorting methods are elementary building blocks used in a huge number of applications. In the paper, we offer an array-sorting method based on dividing the array into a set of independent adjacent pairs of numbers and comparing them in parallel and asynchronously. This distinguishes the offered method from the traditional sorting algorithms (such as quicksort, merge sort, and insertion sort). The algorithm is implemented with the use of Petri nets, as the most suitable tool for describing asynchronous systems.
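
    The scheme of disjoint adjacent-pair comparisons closely resembles odd-even transposition sort; under that reading, a sequential Python sketch of the comparison schedule (within each pass the pairs are disjoint, so the comparisons could run in parallel or asynchronously):

        def pairwise_sort(a):
            """Odd-even transposition sort: alternate passes over even- and
            odd-aligned adjacent pairs; n passes suffice for n elements."""
            a = list(a)
            n = len(a)
            for phase in range(n):
                for i in range(phase % 2, n - 1, 2):   # disjoint pairs
                    if a[i] > a[i + 1]:
                        a[i], a[i + 1] = a[i + 1], a[i]
            return a

        assert pairwise_sort([4, 1, 3, 2]) == [1, 2, 3, 4]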

  12. Technology-facilitated depression care management among predominantly Latino diabetes patients within a public safety net care system: comparative effectiveness trial design.

    PubMed

    Wu, Shinyi; Ell, Kathleen; Gross-Schulman, Sandra G; Sklaroff, Laura Myerchin; Katon, Wayne J; Nezu, Art M; Lee, Pey-Jiuan; Vidyanti, Irene; Chou, Chih-Ping; Guterman, Jeffrey J

    2014-03-01

    Health disparities in minority populations are well recognized. Hispanics and Latinos constitute the largest ethnic minority group in the United States; a significant proportion receives their care via a safety net. The prevalence of diabetes mellitus and comorbid depression is high among this group, but the uptake of evidence-based collaborative depression care management has been suboptimal. This paper presents the study design and baseline characteristics of the enrolled sample in the Diabetes-Depression Care-management Adoption Trial (DCAT), a quasi-experimental comparative effectiveness research clinical trial aimed at accelerating the adoption of collaborative depression care in safety net clinics. The study was conducted in collaboration with the Los Angeles County Department of Health Services at eight county-operated clinics. DCAT has enrolled 1406 low-income, predominantly Hispanic/Latino patients with diabetes to test a translational model of depression care management. This three-group study compares usual care with a collaborative care team support model and a technology-facilitated depression care model that provides automated telephonic depression screening and monitoring tailored to patient conditions and preferences. Call results are integrated into a diabetes disease management registry that delivers provider notifications, generates tasks, and issues critical alerts. All subjects receive comprehensive assessments at baseline and at 6, 12, and 18 months by independent English-Spanish bilingual interviewers. Study outcomes include depression outcomes, treatment adherence, satisfaction, acceptance of assessment and monitoring technology, social and economic stress reduction, diabetes self-care management, health care utilization, and care management model cost and cost-effectiveness comparisons. DCAT's goal is to optimize depression screening, treatment, follow-up, outcomes, and cost savings to reduce health disparities. Copyright © 2013 Elsevier Inc. All rights reserved.

  13. Function Allocation between Automation and Human Pilot for Airborne Separation Assurance

    NASA Technical Reports Server (NTRS)

    Idris, Husni; Enea, Gabriele; Lewis, Timothy A.

    2016-01-01

    Maintaining safe separation between aircraft is a key determinant of the airspace capacity to handle air transportation. With the advent of satellite-based surveillance, aircraft equipped with the needed technologies are now capable of maintaining awareness of their location in the airspace and sharing it with the surrounding traffic. As a result, concepts and cockpit automation are emerging to enable delegating the responsibility of maintaining safe separation from traffic to the pilot, thus increasing airspace capacity by alleviating the limitations of the current non-scalable, centralized, ground-based system. In this paper, an analysis of allocating separation assurance functions to the human pilot and cockpit automation is presented to support the design of these concepts and technologies. A task analysis was conducted with the help of Petri nets to identify the main separation assurance functions and their interactions. Each function was characterized by the three behavior levels that may be needed to perform the task: skill-, rule- and knowledge-based levels. Recommendations are then made for allocating each function to an automation scale, based on its behavior-level characterization and with the help of subject matter experts.

  14. Elastic SCAD as a novel penalization method for SVM classification tasks in high-dimensional data.

    PubMed

    Becker, Natalia; Toedt, Grischa; Lichter, Peter; Benner, Axel

    2011-05-09

    Classification and variable selection play an important role in knowledge discovery in high-dimensional data. Although Support Vector Machine (SVM) algorithms are among the most powerful classification and prediction methods, with a wide range of scientific applications, the SVM does not include automatic feature selection, and therefore a number of feature selection procedures have been developed. Regularisation approaches extend the SVM to a feature selection method in a flexible way using penalty functions like LASSO, SCAD and Elastic Net. We propose a novel penalty function for SVM classification tasks, Elastic SCAD, a combination of SCAD and ridge penalties which overcomes the limitations of each penalty alone. Since SVM models are extremely sensitive to the choice of tuning parameters, we adopted an interval search algorithm which, in comparison to a fixed grid search, finds a global optimal solution more rapidly and more precisely. Feature selection methods with combined penalties (Elastic Net and Elastic SCAD SVMs) are more robust to a change of the model complexity than methods using single penalties. Our simulation study showed that the Elastic SCAD SVM outperformed the LASSO (L1) and SCAD SVMs. Moreover, the Elastic SCAD SVM provided sparser classifiers in terms of the median number of features selected than the Elastic Net SVM, and often predicted better than the Elastic Net in terms of misclassification error. Finally, we applied the penalization methods described above to four publicly available breast cancer data sets. The Elastic SCAD SVM was the only method providing robust classifiers in both sparse and non-sparse situations. The proposed Elastic SCAD SVM algorithm provides the advantages of the SCAD penalty and at the same time avoids sparsity limitations for non-sparse data. We were the first to demonstrate that the integration of the interval search algorithm and penalized SVM classification techniques provides fast solutions for the optimization of tuning parameters. The penalized SVM classification algorithms, as well as fixed grid and interval search for finding appropriate tuning parameters, were implemented in our freely available R package 'penalizedSVM'. We conclude that the Elastic SCAD SVM is a flexible and robust tool for classification and feature selection tasks in high-dimensional data such as microarray data sets.
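
    The Elastic SCAD SVM itself is distributed in the authors' R package 'penalizedSVM'; as a rough Python stand-in for the elastic-net-penalized linear SVM it is compared against, scikit-learn's SGDClassifier can be used (a sketch on toy data, not the paper's method or settings):

        from sklearn.datasets import make_classification
        from sklearn.linear_model import SGDClassifier

        # high-dimensional toy problem: 500 features, only a few informative
        X, y = make_classification(n_samples=200, n_features=500,
                                   n_informative=10, random_state=0)
        clf = SGDClassifier(loss="hinge", penalty="elasticnet",
                            alpha=0.01, l1_ratio=0.5, random_state=0)
        clf.fit(X, y)
        n_selected = (clf.coef_ != 0).sum()   # sparsity comes from the L1 component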

  15. Elastic SCAD as a novel penalization method for SVM classification tasks in high-dimensional data

    PubMed Central

    2011-01-01

    Background Classification and variable selection play an important role in knowledge discovery in high-dimensional data. Although Support Vector Machine (SVM) algorithms are among the most powerful classification and prediction methods, with a wide range of scientific applications, the SVM does not include automatic feature selection, and therefore a number of feature selection procedures have been developed. Regularisation approaches extend the SVM to a feature selection method in a flexible way using penalty functions like LASSO, SCAD and Elastic Net. We propose a novel penalty function for SVM classification tasks, Elastic SCAD, a combination of SCAD and ridge penalties which overcomes the limitations of each penalty alone. Since SVM models are extremely sensitive to the choice of tuning parameters, we adopted an interval search algorithm which, in comparison to a fixed grid search, finds a global optimal solution more rapidly and more precisely. Results Feature selection methods with combined penalties (Elastic Net and Elastic SCAD SVMs) are more robust to a change of the model complexity than methods using single penalties. Our simulation study showed that the Elastic SCAD SVM outperformed the LASSO (L1) and SCAD SVMs. Moreover, the Elastic SCAD SVM provided sparser classifiers in terms of the median number of features selected than the Elastic Net SVM, and often predicted better than the Elastic Net in terms of misclassification error. Finally, we applied the penalization methods described above to four publicly available breast cancer data sets. The Elastic SCAD SVM was the only method providing robust classifiers in both sparse and non-sparse situations. Conclusions The proposed Elastic SCAD SVM algorithm provides the advantages of the SCAD penalty and at the same time avoids sparsity limitations for non-sparse data. We were the first to demonstrate that the integration of the interval search algorithm and penalized SVM classification techniques provides fast solutions for the optimization of tuning parameters. The penalized SVM classification algorithms, as well as fixed grid and interval search for finding appropriate tuning parameters, were implemented in our freely available R package 'penalizedSVM'. We conclude that the Elastic SCAD SVM is a flexible and robust tool for classification and feature selection tasks in high-dimensional data such as microarray data sets. PMID:21554689

  16. MMPM - Mission implementation of Mars MetNet Precursor

    NASA Astrophysics Data System (ADS)

    Harri, A.-M.

    2009-04-01

    We are developing a new kind of planetary exploration mission for Mars - MetNet, an in situ observation network based on a new semi-hard landing vehicle called the MetNet Lander (MNL). The key technical aspects and solutions of the mission will be discussed. The eventual scope of the MetNet Mission is to deploy some 20 MNLs on the Martian surface using inflatable descent system structures, which will be supported by observations from the orbit around Mars. Currently we are working on the MetNet Mars Precursor Mission (MMPM) to deploy one MetNet Lander to Mars in the 2009/2011 launch window as a technology and science demonstration mission. The MNL will have a versatile science payload focused on the atmospheric science of Mars. Detailed characterization of the Martian atmospheric circulation patterns, boundary layer phenomena, and climatology cycles requires simultaneous in-situ measurements by a network of observation posts on the Martian surface. The scientific payload of the MetNet Mission encompasses separate instrument packages for the atmospheric entry and descent phase and for the surface operation phase. The MetNet mission concept and key probe technologies have been developed and the critical subsystems have been qualified to meet the Martian environmental and functional conditions. This development effort has been carried out in collaboration between the Finnish Meteorological Institute (FMI), the Russian Lavochkin Association (LA) and the Russian Space Research Institute (IKI) since August 2001. Currently INTA (Instituto Nacional de Técnica Aeroespacial) from Spain is also participating in the MetNet payload development.

  17. Virtual Control Policy for Binary Ordered Resources Petri Net Class.

    PubMed

    Rovetto, Carlos A; Concepción, Tomás J; Cano, Elia Esther

    2016-08-18

    Prevention and avoidance of deadlocks in sensor networks that use the wormhole routing algorithm is an active research domain. Diverse control policies address this problem; our approach is a new method. In this paper we present a virtual control policy for the new specialized Petri net subclass called the Binary Ordered Resources Petri Net (BORPN). Essentially, it is an ordinary class constructed from various state machines that share unitary resources in a complex form, which allows branching and joining of processes. The reduced structure of this new class gives advantages that allow analysis of the entire system's behavior, a task that is otherwise prohibitive for large systems because of their complexity and routing algorithms.

  18. Additive manufacturing of near-net-shape bonded magnets: Prospects and challenges

    DOE PAGES

    Li, Ling; Post, Brian; Kunc, Vlastimil; ...

    2017-01-03

    Additive manufacturing (AM), or 3D printing, is well known for producing arbitrarily shaped parts without any tooling required, offering a promising alternative to the conventional injection molding method for fabricating near-net-shape magnets. In order to determine their applicability to the fabrication of Nd-Fe-B bonded magnets, we compare two 3D printing technologies, namely binder jetting and material extrusion. Prospects and challenges of these state-of-the-art technologies for large-scale industrial applications are discussed.

  19. Software reuse issues affecting AdaNET

    NASA Technical Reports Server (NTRS)

    Mcbride, John G.

    1989-01-01

    The AdaNet program is reviewing its long-term goals and strategies. A significant concern is whether current AdaNet plans adequately address the major strategic issues of software reuse technology. The major reuse issues of providing AdaNet services that should be addressed as part of future AdaNet development are identified and reviewed. Before significant development proceeds, a plan should be developed to resolve the aforementioned issues. This plan should also specify a detailed approach to developing AdaNet. A three-phase strategy is recommended. The first phase would consist of requirements analysis and produce an AdaNet system requirements specification. It would consider the requirements of AdaNet in terms of mission needs, commercial realities, administrative policies affecting development, and the experience of AdaNet and other projects promoting the transfer of software engineering technology. Specifically, requirements analysis would be performed to better understand the requirements for AdaNet functions. The second phase would provide a detailed design of the system. AdaNet should be designed with emphasis on the use of existing technology readily available to the AdaNet program. A number of reuse products are available upon which AdaNet could be based. This would significantly reduce the risk and cost of providing an AdaNet system. Once a design is developed, implementation would proceed in the third phase.

  20. Applying Technology Ranking and Systems Engineering in Advanced Life Support

    NASA Technical Reports Server (NTRS)

    Jones, Harry; Luna, Bernadette (Technical Monitor)

    2000-01-01

    According to the Advanced Life Support (ALS) Program Plan, the Systems Modeling and Analysis Project (SMAP) has two important tasks: 1) prioritizing investments in ALS Research and Technology Development (R&TD), and 2) guiding the evolution of ALS systems. Investments could be prioritized simply by independently ranking different technologies, but we should also consider a technology's impact on system design. Guiding future ALS systems will require SMAP to consider many aspects of systems engineering. R&TD investments can be prioritized using familiar methods for ranking technology. The first step is gathering data on technology performance, safety, readiness level, and cost. Then the technologies are ranked using metrics or by decision analysis using net present economic value. The R&TD portfolio can be optimized to provide the maximum expected payoff in the face of uncertain future events. But more is needed. The optimum ALS system cannot be designed simply by selecting the best technology for each predefined subsystem. Incorporating a new technology, such as food plants, can change the specifications of other subsystems, such as air regeneration. Systems must be designed top-down, starting from system objectives, not bottom-up from selected technologies. The familiar top-down systems engineering process includes defining mission objectives, mission design, system specification, technology analysis, preliminary design, and detail design. Technology selection is only one part of systems analysis and engineering, and it is strongly related to the subsystem definitions. ALS systems should be designed using top-down systems engineering. R&TD technology selection should consider how the technology affects ALS system design. Technology ranking is useful, but it is only a small part of systems engineering.
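
    As a toy illustration of the "decision analysis using net present economic value" step, candidate technologies can be ranked by discounting their yearly cash flows; the discount rate and cash flows below are invented numbers, not ALS figures:

        def npv(cash_flows, rate):
            # net present value of yearly cash flows, year 0 first
            return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

        # hypothetical candidates: upfront R&TD cost, then yearly payoffs
        candidates = {"tech_A": [-10, 4, 4, 4], "tech_B": [-6, 2, 3, 3]}
        ranked = sorted(candidates, key=lambda k: npv(candidates[k], 0.07),
                        reverse=True)   # best NPV first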

  1. The use of narrative sampling in the assessment of social cognition: the Narrative of Emotions Task (NET).

    PubMed

    Buck, Benjamin; Ludwig, Kelsey; Meyer, Piper S; Penn, David L

    2014-07-30

    Social cognitive deficits in schizophrenia are well documented and related to functional outcome. Current social cognition measures are often not psychometrically validated, too heterogeneous for standardization, and focus principally on one domain of social cognition rather than the simultaneous activation of multiple domains. Also, few if any allow for personalization of stimuli and interpretation of personally evocative events. An alternative methodology that addresses these limitations is the analysis of samples of personal narratives. The present study evaluates the psychometric properties of a measure called the Narrative of Emotions Task (NET). The NET was used to assess the performance of participants with a diagnosis of schizophrenia or schizoaffective disorder and nonclinical controls. Use of the NET revealed significant impairments in the emotional narratives of participants with schizophrenia. Various NET indices were significantly related to current measures of theory of mind and emotion perception, as well as a social skills role-play, but were not related to measures of attributional style or clinician-rated functioning scales. Overall, the NET's psychometric properties justify further use of the narrative sampling method of social cognition assessment in this population. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  2. Negative emissions—Part 2: Costs, potentials and side effects

    NASA Astrophysics Data System (ADS)

    Fuss, Sabine; Lamb, William F.; Callaghan, Max W.; Hilaire, Jérôme; Creutzig, Felix; Amann, Thorben; Beringer, Tim; de Oliveira Garcia, Wagner; Hartmann, Jens; Khanna, Tarun; Luderer, Gunnar; Nemet, Gregory F.; Rogelj, Joeri; Smith, Pete; Vicente, José Luis Vicente; Wilcox, Jennifer; del Mar Zamora Dominguez, Maria; Minx, Jan C.

    2018-06-01

    The most recent IPCC assessment has shown an important role for negative emissions technologies (NETs) in limiting global warming to 2 °C cost-effectively. However, a bottom-up, systematic, reproducible, and transparent literature assessment of the different options to remove CO2 from the atmosphere is currently missing. In part 1 of this three-part review on NETs, we assemble a comprehensive set of the relevant literature so far published, focusing on seven technologies: bioenergy with carbon capture and storage (BECCS), afforestation and reforestation, direct air carbon capture and storage (DACCS), enhanced weathering, ocean fertilisation, biochar, and soil carbon sequestration. In this part, part 2 of the review, we present estimates of costs, potentials, and side-effects for these technologies, and qualify them with the authors’ assessment. Part 3 reviews the innovation and scaling challenges that must be addressed to realise NETs deployment as a viable climate mitigation strategy. Based on a systematic review of the literature, our best estimates for sustainable global NET potentials in 2050 are 0.5–3.6 GtCO2/yr for afforestation and reforestation, 0.5–5 GtCO2/yr for BECCS, 0.5–2 GtCO2/yr for biochar, 2–4 GtCO2/yr for enhanced weathering, 0.5–5 GtCO2/yr for DACCS, and up to 5 GtCO2/yr for soil carbon sequestration. Costs vary widely across the technologies, as do their permanency and cumulative potentials beyond 2050. It is unlikely that a single NET will be able to sustainably meet the rates of carbon uptake described in integrated assessment pathways consistent with 1.5 °C of global warming.

  3. Automated Detection of Diabetic Retinopathy using Deep Learning.

    PubMed

    Lam, Carson; Yi, Darvin; Guo, Margaret; Lindsey, Tony

    2018-01-01

    Diabetic retinopathy is a leading cause of blindness among working-age adults. Early detection of this condition is critical for a good prognosis. In this paper, we demonstrate the use of convolutional neural networks (CNNs) on color fundus images for the recognition task of diabetic retinopathy staging. Our network models achieved test metric performance comparable to baseline literature results, with a validation sensitivity of 95%. We additionally explored multinomial classification models, and demonstrate that errors primarily occur in the misclassification of mild disease as normal, due to the CNN's inability to detect subtle disease features. We discovered that preprocessing with contrast limited adaptive histogram equalization and ensuring dataset fidelity by expert verification of class labels improves recognition of subtle features. Transfer learning on pretrained GoogLeNet and AlexNet models from ImageNet improved peak test set accuracies to 74.5%, 68.8%, and 57.2% on 2-ary, 3-ary, and 4-ary classification models, respectively.
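
    The transfer-learning setup described above can be sketched with torchvision: load an ImageNet-pretrained AlexNet and replace the final classification layer for n-ary retinopathy grading. The class count and training details below are placeholders, not figures from the paper:

        import torch.nn as nn
        from torchvision import models

        num_classes = 4   # e.g. the 4-ary staging task
        model = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)
        model.classifier[6] = nn.Linear(4096, num_classes)  # swap the ImageNet head
        # ...then fine-tune on CLAHE-preprocessed, expert-verified fundus images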

  4. MetNet - Martian Network Mission

    NASA Astrophysics Data System (ADS)

    Harri, A.-M.

    2009-04-01

    We are developing a new kind of planetary exploration mission for Mars - MetNet, an in situ observation network based on a new semi-hard landing vehicle called the MetNet Lander (MNL). The actual practical mission development work started in January 2009 with participation from various countries and space agencies. The scientific rationale and goals as well as key mission solutions will be discussed. The eventual scope of the MetNet Mission is to deploy some 20 MNLs on the Martian surface using inflatable descent system structures, which will be supported by observations from the orbit around Mars. Currently we are working on the MetNet Mars Precursor Mission (MMPM) to deploy one MetNet Lander to Mars in the 2009/2011 launch window as a technology and science demonstration mission. The MNL will have a versatile science payload focused on the atmospheric science of Mars. Detailed characterization of the Martian atmospheric circulation patterns, boundary layer phenomena, and climatology cycles requires simultaneous in-situ measurements by a network of observation posts on the Martian surface. The scientific payload of the MetNet Mission encompasses separate instrument packages for the atmospheric entry and descent phase and for the surface operation phase. The MetNet mission concept and key probe technologies have been developed and the critical subsystems have been qualified to meet the Martian environmental and functional conditions. This development effort has been carried out in collaboration between the Finnish Meteorological Institute (FMI), the Russian Lavochkin Association (LA) and the Russian Space Research Institute (IKI) since August 2001. Currently INTA (Instituto Nacional de Técnica Aeroespacial) from Spain is also participating in the MetNet payload development.

  5. PCANet: A Simple Deep Learning Baseline for Image Classification?

    PubMed

    Chan, Tsung-Han; Jia, Kui; Gao, Shenghua; Lu, Jiwen; Zeng, Zinan; Ma, Yi

    2015-12-01

    In this paper, we propose a very simple deep learning network for image classification that is based on very basic data processing components: 1) cascaded principal component analysis (PCA); 2) binary hashing; and 3) blockwise histograms. In the proposed architecture, the PCA is employed to learn multistage filter banks. This is followed by simple binary hashing and block histograms for indexing and pooling. This architecture is thus called the PCA network (PCANet) and can be extremely easily and efficiently designed and learned. For comparison and to provide a better understanding, we also introduce and study two simple variations of PCANet: 1) RandNet and 2) LDANet. They share the same topology as PCANet, but their cascaded filters are either randomly selected or learned from linear discriminant analysis. We have extensively tested these basic networks on many benchmark visual data sets for different tasks, including Labeled Faces in the Wild (LFW) for face verification; the MultiPIE, Extended Yale B, AR, Facial Recognition Technology (FERET) data sets for face recognition; and MNIST for hand-written digit recognition. Surprisingly, for all tasks, such a seemingly naive PCANet model is on par with the state-of-the-art features either prefixed, highly hand-crafted, or carefully learned [by deep neural networks (DNNs)]. Even more surprisingly, the model sets new records for many classification tasks on the Extended Yale B, AR, and FERET data sets and on MNIST variations. Additional experiments on other public data sets also demonstrate the potential of PCANet to serve as a simple but highly competitive baseline for texture classification and object recognition.
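
    One stage of the cascaded-PCA filter bank can be condensed into a few NumPy lines: run PCA over vectorized image patches and keep the top eigenvectors as convolution filters. This is our compact reading of the component described above, not the authors' code:

        import numpy as np

        def pca_filters(patches, k):
            """patches: (n, p) array of vectorized patches -> (k, p) filters."""
            patches = patches - patches.mean(axis=1, keepdims=True)  # de-mean each patch
            cov = patches.T @ patches / len(patches)
            vals, vecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
            return vecs[:, -k:].T                 # top-k principal directions as filters

        rng = np.random.default_rng(0)
        filters = pca_filters(rng.normal(size=(1000, 49)), k=8)  # eight 7x7 filters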

  6. Deep Networks Can Resemble Human Feed-forward Vision in Invariant Object Recognition

    PubMed Central

    Kheradpisheh, Saeed Reza; Ghodrati, Masoud; Ganjtabesh, Mohammad; Masquelier, Timothée

    2016-01-01

    Deep convolutional neural networks (DCNNs) have attracted much attention recently, and have been shown to be able to recognize thousands of object categories in natural image databases. Their architecture is somewhat similar to that of the human visual system: both use restricted receptive fields and a hierarchy of layers which progressively extract more and more abstract features. Yet it is unknown whether DCNNs match human performance at the task of view-invariant object recognition, whether they make similar errors and use similar representations for this task, and whether the answers depend on the magnitude of the viewpoint variations. To investigate these issues, we benchmarked eight state-of-the-art DCNNs, the HMAX model, and a baseline shallow model, and compared their results to those of humans with backward masking. Unlike all previous DCNN studies, we carefully controlled the magnitude of the viewpoint variations to demonstrate that shallow nets can outperform deep nets and humans when variations are weak. When facing larger variations, however, more layers were needed to match human performance and error distributions, and to have representations that are consistent with human behavior. A very deep net with 18 layers even outperformed humans at the highest variation level, using the most human-like representations. PMID:27601096

  7. Automated comprehensive Adolescent Idiopathic Scoliosis assessment using MVC-Net.

    PubMed

    Wu, Hongbo; Bailey, Chris; Rasoulinejad, Parham; Li, Shuo

    2018-05-18

    Automated quantitative estimation of spinal curvature is an important task for the ongoing evaluation and treatment planning of Adolescent Idiopathic Scoliosis (AIS). It addresses the widely acknowledged shortcomings of manual Cobb angle measurement (time-consuming and unreliable), which is currently the gold standard for AIS assessment. Attempts have been made to improve the reliability of automated Cobb angle estimation. However, it is very challenging to achieve accurate and robust estimation of Cobb angles due to the need to correctly identify all the required vertebrae in both anterior-posterior (AP) and lateral (LAT) view x-rays. The challenge is especially evident in LAT x-rays, where occlusion of vertebrae by the ribcage occurs. We therefore propose a novel Multi-View Correlation Network (MVC-Net) architecture that provides a fully automated end-to-end framework for spinal curvature estimation in multi-view (both AP and LAT) x-rays. The proposed MVC-Net uses our newly designed multi-view convolution layers to incorporate joint features of multi-view x-rays, which allows the network to mitigate the occlusion problem by utilizing the structural dependencies of the two views. The MVC-Net consists of three closely-linked components: (1) a series of X-modules for joint representation of spinal structure, (2) a Spinal Landmark Estimator network for robust spinal landmark estimation, and (3) a Cobb Angle Estimator network for accurate Cobb angle estimation. By utilizing an iterative multi-task training algorithm to train the Spinal Landmark Estimator and Cobb Angle Estimator in tandem, the MVC-Net leverages the multi-task relationship between landmark and angle estimation to reliably detect all the required vertebrae for accurate Cobb angle estimation. Experimental results on 526 x-ray images from 154 patients show an impressive 4.04° Circular Mean Absolute Error (CMAE) in AP Cobb angle and 4.07° CMAE in LAT Cobb angle estimation, which demonstrates the MVC-Net's capability of robust and accurate estimation of Cobb angles in multi-view x-rays. Our method therefore provides clinicians with a framework for efficient, accurate, and reliable estimation of spinal curvature for comprehensive AIS assessment. Copyright © 2018. Published by Elsevier B.V.

  8. Not Just Playing Around: The MoLeNET Experience of Using Games Technologies to Support Teaching and Learning

    ERIC Educational Resources Information Center

    Petley, Rebecca; Attewell, Jill; Savill-Smith, Carol

    2011-01-01

    MoLeNET is a unique collaborative initiative, currently in its third year, which encourages and enables the introduction of mobile learning in English post-14 education via supported shared-cost projects. Mobile learning in MoLeNET is defined as "The exploitation of ubiquitous handheld technologies, together with wireless and…

  9. Defining Elements of Value in Health Care-A Health Economics Approach: An ISPOR Special Task Force Report [3].

    PubMed

    Lakdawalla, Darius N; Doshi, Jalpa A; Garrison, Louis P; Phelps, Charles E; Basu, Anirban; Danzon, Patricia M

    2018-02-01

    The third section of our Special Task Force report identifies and defines a series of elements that warrant consideration in value assessments of medical technologies. We aim to broaden the view of what constitutes value in health care and to spur new research on incorporating additional elements of value into cost-effectiveness analysis (CEA). Twelve potential elements of value are considered. Four of them (quality-adjusted life-years, net costs, productivity, and adherence-improving factors) are conventionally included or considered in value assessments. Eight others, which would be more novel in economic assessments, are defined and discussed: reduction in uncertainty, fear of contagion, insurance value, severity of disease, value of hope, real option value, equity, and scientific spillovers. Most of these are theoretically well understood and available for inclusion in value assessments. The two exceptions are equity and scientific spillover effects, which require more theoretical development and consensus. A number of regulatory authorities around the globe have shown interest in some of these novel elements. Augmenting CEA to consider these additional elements would result in a more comprehensive CEA in line with the "impact inventory" of the Second Panel on Cost-Effectiveness in Health and Medicine. Possible approaches for valuation and inclusion of these elements include integrating them as part of a net monetary benefit calculation, including elements as attributes in health state descriptions, or using them as criteria in a multicriteria decision analysis. Further research is needed on how best to measure and include them in decision making. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
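
    For the conventional core of such an assessment, the net monetary benefit calculation mentioned above is a one-line formula; the novel elements would enter as additional terms. A sketch with an arbitrary willingness-to-pay threshold, not a recommended value:

        def net_monetary_benefit(qalys_gained, incremental_cost, wtp_per_qaly):
            # conventional NMB; novel value elements would add further terms
            return wtp_per_qaly * qalys_gained - incremental_cost

        nmb = net_monetary_benefit(qalys_gained=0.8, incremental_cost=40_000,
                                   wtp_per_qaly=100_000)   # -> 40000.0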

  10. Improved silicon nitride for advanced heat engines

    NASA Technical Reports Server (NTRS)

    Yeh, Hun C.; Fang, Ho T.

    1987-01-01

    The technology base required to fabricate silicon nitride components with the strength, reliability, and reproducibility necessary for actual heat engine applications is presented. Task 2 was set up to develop test bars with high Weibull slope and greater high temperature strength, and to conduct an initial net shape component fabrication evaluation. Screening experiments were performed in Task 7 on advanced materials and processing for input to Task 2. The technical efforts performed in the second year of a 5-yr program are covered. The first iteration of Task 2 was completed as planned. Two half-replicated, fractional factorial (2^5) statistically designed matrix experiments were conducted. These experiments have identified Denka 9FW Si3N4 as an alternate raw material to GTE SN502 Si3N4 for subsequent process evaluation. A detailed statistical analysis was conducted to correlate processing conditions with as-processed test bar properties. One processing condition produced a material with a 97 ksi average room temperature MOR (100 percent of goal) with 13.2 Weibull slope (83 percent of goal); another condition produced 86 ksi (6 percent over baseline) room temperature strength with a Weibull slope of 20 (125 percent of goal).

  11. Usage of the back-propagation method for alphabet recognition

    NASA Astrophysics Data System (ADS)

    Shaila Sree, R. N.; Eswaran, Kumar; Sundararajan, N.

    1999-03-01

    Artificial neural networks play a pivotal role in the branch of artificial intelligence. They can be trained efficiently for a variety of tasks using different methods, of which back-propagation is one. This paper studies the choice of the various design parameters of a neural network for the back-propagation method. The study shows that when these parameters are properly assigned, the task of training the net is greatly simplified. The character recognition problem has been chosen as a test case for this study. A sample space of different handwritten characters of the English alphabet was gathered. A neural net was finally designed taking the many design aspects into consideration and trained on different styles of writing. Experimental results are reported and discussed. It has been found that an appropriate choice of the design parameters of the neural net for the back-propagation method reduces the training time and improves the performance of the net.
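
    For reference, the back-propagation update the record refers to fits in a short, self-contained NumPy sketch on toy data; the layer sizes, learning rate and iteration count are arbitrary choices, not the paper's settings:

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.normal(size=(64, 8))                    # toy inputs
        y = (X.sum(axis=1, keepdims=True) > 0) * 1.0    # toy binary labels

        W1 = rng.normal(size=(8, 16)) * 0.1             # input -> hidden weights
        W2 = rng.normal(size=(16, 1)) * 0.1             # hidden -> output weights
        lr = 0.5                                        # learning rate
        for _ in range(200):
            h = np.tanh(X @ W1)                         # forward pass, hidden layer
            p = 1.0 / (1.0 + np.exp(-(h @ W2)))         # sigmoid output
            dz2 = (p - y) / len(X)                      # grad of cross-entropy wrt logit
            dW2 = h.T @ dz2
            dz1 = (dz2 @ W2.T) * (1.0 - h ** 2)         # back-propagate through tanh
            dW1 = X.T @ dz1
            W1 -= lr * dW1                              # gradient-descent updates
            W2 -= lr * dW2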

  12. Virtual Control Policy for Binary Ordered Resources Petri Net Class

    PubMed Central

    Rovetto, Carlos A.; Concepción, Tomás J.; Cano, Elia Esther

    2016-01-01

    Prevention and avoidance of deadlocks in sensor networks that use the wormhole routing algorithm is an active research domain. Diverse control policies address this problem; our approach is a new method. In this paper we present a virtual control policy for the new specialized Petri net subclass called the Binary Ordered Resources Petri Net (BORPN). Essentially, it is an ordinary class constructed from various state machines that share unitary resources in a complex form, which allows branching and joining of processes. The reduced structure of this new class gives advantages that allow analysis of the entire system's behavior, a task that is otherwise prohibitive for large systems because of their complexity and routing algorithms. PMID:27548170

  13. Mars MetNet Precursor Mission Status

    NASA Astrophysics Data System (ADS)

    Harri, A.-M.; Aleksashkin, S.; Guerrero, H.; Schmidt, W.; Genzer, M.; Vazquez, L.; Haukka, H.

    2013-09-01

    We are developing a new kind of planetary exploration mission for Mars in collaboration between the Finnish Meteorological Institute (FMI), the Lavochkin Association (LA), the Space Research Institute (IKI) and the Instituto Nacional de Técnica Aeroespacial (INTA). The Mars MetNet mission is based on a new semi-hard landing vehicle called the MetNet Lander (MNL). The scientific payload of the Mars MetNet Precursor [1] mission is divided into three categories: atmospheric instruments, optical devices, and composition and structure devices. Each of the payload instruments will provide significant insights into Martian atmospheric behavior. The key technologies of the MetNet Lander have been qualified, and the electrical qualification model (EQM) of the payload bay has been built and successfully tested.

  14. Integration of net zero energy building with smart grid to improve regional electrification ratio towards sustainable development

    NASA Astrophysics Data System (ADS)

    Latief, Yusuf; Berawi, Mohammed Ali; Supriadi, Leni; Bintang Koesalamwardi, Ario; Petroceany, Jade; Herzanita, Ayu

    2017-12-01

    Indonesia is currently encouraging its physical, social and economic development. Physical development for economic growth has to be supported by energy availability. For Indonesia, reaching a 90% electrification ratio remains an important task for the Government. However, the effort to increase electrification can become an environmental problem if it is pursued under a business-as-usual (BAU) scenario. The by-product of electricity generation is greenhouse gas (GHG) emissions, which have increased every year since 2006 across various sectors, i.e. industry, housing, commercial, transportation, and energy. A Net Zero Energy Building (NZEB) is an energy-efficient building which can produce energy independently from clean and renewable sources. The energy generated by an NZEB can be used by the building itself, and any surplus can be exported to the central grid. The integration of NZEB and the smart grid can address today's issues with the electrification ratio. A literature study identifies benchmarks that can be applied in Indonesia, along with possible obstacles to applying this technology.

  15. Smartphones as Experimental Tools: Different Methods to Determine the Gravitational Acceleration in Classroom Physics by Using Everyday Devices

    ERIC Educational Resources Information Center

    Kuhn, Jochen; Vogt, Patrik

    2013-01-01

    New media technology becomes more and more important for our daily life as well as for teaching physics. Within the scope of our N.E.T. research project we develop experiments using New Media Experimental Tools (N.E.T.) in physics education and study their influence on students' learning abilities. We want to present the possibilities e.g. of…

  16. The use of narrative sampling in the assessment of social cognition: The Narrative of Emotions Task (NET)

    PubMed Central

    Buck, Benjamin; Ludwig, Kelsey; Penn, David L.

    2014-01-01

    Social cognitive deficits in schizophrenia are well documented and related to functional outcome. Current social cognition measures are often not psychometrically validated, too heterogeneous for standardization, and focus principally on one domain of social cognition rather than the simultaneous activation of multiple domains. Also, few if any allow for personalization of stimuli and interpretation of personally evocative events. An alternative methodology that addresses these limitations is the analysis of samples of personal narratives. The present study evaluates the psychometric properties of a measure called the Narrative of Emotions Task (NET). The NET was used to assess the performance of participants with a diagnosis of schizophrenia or schizoaffective disorder and nonclinical controls. Use of the NET revealed significant impairments in the emotional narratives of participants with schizophrenia. Various NET indices were significantly related to current measures of theory of mind and emotion perception, as well as a social skills role-play, but were not related to measures of attributional style or clinician-rated functioning scales. Overall, the NET's psychometric properties justify further use of the narrative sampling method of social cognition assessment in this population. PMID:24726270

  17. Marketing netcoatings for aquaculture.

    PubMed

    Martin, Robert J

    2014-10-17

    Unsustainable harvesting of natural fish stocks is driving an ever growing marine aquaculture industry. Part of the aquaculture support industry is net suppliers who provide producers with nets used in confining fish while they are grown to market size. Biofouling must be addressed in marine environments to ensure maximum product growth by maintaining water flow and waste removal through the nets. Biofouling is managed with copper and organic biocide based net coatings. The aquaculture industry provides a case study for business issues related to entry of improved fouling management technology into the marketplace. Several major hurdles hinder entry of improved novel technologies into the market. The first hurdle is due to the structure of business relationships. Net suppliers can actually cut their business profits dramatically by introducing improved technologies. A second major hurdle is financial costs of registration and demonstration of efficacy and quality product with a new technology. Costs of registration are prohibitive if only the net coatings market is involved. Demonstration of quality product requires collaboration and a team approach between formulators, net suppliers and farmers. An alternative solution is a vertically integrated business model in which the support business and product production business are part of the same company.

  18. Marketing Netcoatings for Aquaculture

    PubMed Central

    Martin, Robert J.

    2014-01-01

    Unsustainable harvesting of natural fish stocks is driving an ever growing marine aquaculture industry. Part of the aquaculture support industry is net suppliers who provide producers with nets used in confining fish while they are grown to market size. Biofouling must be addressed in marine environments to ensure maximum product growth by maintaining water flow and waste removal through the nets. Biofouling is managed with copper and organic biocide based net coatings. The aquaculture industry provides a case study for business issues related to entry of improved fouling management technology into the marketplace. Several major hurdles hinder entry of improved novel technologies into the market. The first hurdle is due to the structure of business relationships. Net suppliers can actually cut their business profits dramatically by introducing improved technologies. A second major hurdle is financial costs of registration and demonstration of efficacy and quality product with a new technology. Costs of registration are prohibitive if only the net coatings market is involved. Demonstration of quality product requires collaboration and a team approach between formulators, net suppliers and farmers. An alternative solution is a vertically integrated business model in which the support business and product production business are part of the same company. PMID:25329615

  19. Terrorism and Cybercrime

    DTIC Science & Technology

    2008-05-01

    www.fbi.gov/congress/congress03/farnan051503.htm Federal Bureau of Investigation. "Netting Cyber Criminals: Inside the Connecticut Computer Crimes Task..." http://www.fbi.gov/cyberinvest/cyberhome.htm

  20. Greening the Net Generation: Outdoor Adult Learning in the Digital Age

    ERIC Educational Resources Information Center

    Walter, Pierre

    2013-01-01

    Adult learning today takes place primarily within walled classrooms or in other indoor settings, and often in front of various types of digital screens. As adults have adopted the digital technologies and indoor lifestyle attributed to the so-called "Net Generation," we have become detached from contact with the natural world outdoors.…

  1. Net Warrior D10 Technology Report: Airborne Early Warning and Control (AEW&C) and Data Link Nodes

    DTIC Science & Technology

    2012-04-01

    ADO) approach to implementing Network Centric Warfare (NCW) through 'learning by doing'. Net Warrior was conceived to address, through... frameworks are able to satisfy design needs of applications to produce stable mission and net-centric systems. NW-D10 employed a SOA approach to... Net Warrior D10 Technology Report: Airborne Early Warning and Control (AEW&C) and Data Link Nodes, Derek Dominish

  2. Visualisation and Analytic Strategies for Anticipating the Folding of Nets

    ERIC Educational Resources Information Center

    Wright, Vince

    2016-01-01

    Visual and analytic strategies are features of students' schemes for spatial tasks. The strategies used by six students to anticipate the folding of nets were investigated. Evidence suggested that visual and analytic strategies were strongly connected in competent performance.

  3. Prospects for detecting a net photon circular polarization produced by decaying dark matter

    NASA Astrophysics Data System (ADS)

    Elagin, Andrey; Kumar, Jason; Sandick, Pearl; Teng, Fei

    2017-11-01

    If dark matter interactions with Standard Model particles are CP violating, then dark matter annihilation/decay can produce photons with a net circular polarization. We consider the prospects for experimentally detecting evidence for such a circular polarization. We identify optimal models for dark matter interactions with the Standard Model, from the point of view of detectability of the net polarization, for the case of either symmetric or asymmetric dark matter. We find that, for symmetric dark matter, evidence for net polarization could be found by a search of the Galactic center by an instrument sensitive to circular polarization with an efficiency-weighted exposure of at least 50,000 cm² yr, provided the systematic detector uncertainties are constrained at the 1% level. Better sensitivity can be obtained in the case of asymmetric dark matter. We discuss the prospects for achieving the needed level of performance using possible detector technologies.

  4. Connecting the Force from Space: The IRIS Joint Capability Technology Demonstration

    DTIC Science & Technology

    2010-01-01

    the Joint in Joint Capability Technology Demonstration, we have two sponsors, both U.S. Strategic Command and the Defense Information Systems...Capability Technology Demonstration will provide an excellent source of data on space-based Internet Protocol networking. Operational... Internet Routing in Space Joint Capability Technology Demonstration Operational Manager, Space and Missile Defense Battle Lab, Colorado Springs

  5. Optimal SSN Tasking to Enhance Real-time Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Ferreira, J., III; Hussein, I.; Gerber, J.; Sivilli, R.

    2016-09-01

    Space Situational Awareness (SSA) is currently constrained by an overwhelming number of resident space objects (RSOs) that need to be tracked and the amount of data these observations produce. The Joint Centralized Autonomous Tasking System (JCATS) is an autonomous, net-centric tool that approaches these SSA concerns from an agile, information-based stance. Finite set statistics and stochastic optimization are used to maintain an RSO catalog and develop sensor tasking schedules based on operator-configured, state information-gain metrics to determine observation priorities. This improves the efficiency with which sensors target objects as awareness changes and new information is needed, rather than solely at predefined frequencies. A net-centric, service-oriented architecture (SOA) allows for JCATS integration into existing SSA systems. Testing has shown operationally relevant performance improvements and scalability across multiple types of scenarios and against current sensor tasking tools.
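
    JCATS internals are not published in this record; as a hedged sketch of the information-based tasking idea it describes, the function below greedily assigns each observation slot to the catalog object with the largest state-covariance trace, a simple stand-in for an information-gain metric. The halving update and the example objects are invented for illustration.

      import numpy as np

      def greedy_tasking(covariances, slots):
          """Fill observation slots greedily by current state uncertainty.

          covariances: dict of object id -> state covariance matrix.
          slots: number of observation opportunities to schedule.
          """
          cov = {k: v.copy() for k, v in covariances.items()}
          schedule = []
          for _ in range(slots):
              # Priority proxy: trace of covariance (total state uncertainty).
              target = max(cov, key=lambda k: np.trace(cov[k]))
              schedule.append(target)
              cov[target] = cov[target] * 0.5   # crude proxy: observing halves uncertainty
          return schedule

      rsos = {"sat-A": np.eye(6) * 4.0, "sat-B": np.eye(6) * 1.0, "deb-C": np.eye(6) * 9.0}
      print(greedy_tasking(rsos, slots=4))   # ['deb-C', 'deb-C', 'sat-A', 'deb-C']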

  6. Cryogenic Tank Technology Program (CTTP)

    NASA Technical Reports Server (NTRS)

    Vaughn, T. P.

    2001-01-01

    The objectives of the Cryogenic Tank Technology Program were to: (1) determine the feasibility and cost effectiveness of near net shape hardware; (2) demonstrate near net shape processes by fabricating large-scale, flight-quality hardware; and (3) advance the state of current weld processing technologies for aluminum-lithium alloys.

  7. Saliency U-Net: A regional saliency map-driven hybrid deep learning network for anomaly segmentation

    NASA Astrophysics Data System (ADS)

    Karargyros, Alex; Syeda-Mahmood, Tanveer

    2018-02-01

    Deep learning networks are gaining popularity in many medical image analysis tasks due to their generalized ability to automatically extract relevant features from raw images. However, this can make the learning problem unnecessarily harder, requiring network architectures of high complexity. In the case of anomaly detection, in particular, there is often sufficient regional difference between the anomaly and the surrounding parenchyma that could easily be highlighted through bottom-up saliency operators. In this paper we propose a new hybrid deep learning network using a combination of the raw image and such regional maps to more accurately learn the anomalies using simpler network architectures. Specifically, we modify a deep learning network called U-Net, using both the raw and pre-segmented images as input to produce joint encoding (contraction) and expansion (decoding) paths in the U-Net. We present results of successfully delineating subdural and epidural hematomas in brain CT imaging and liver hemangioma in abdominal CT images using such a network.
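
    A minimal sketch of the dual-input idea, assuming PyTorch: the raw image and its bottom-up saliency map are concatenated as channels at the head of a small U-Net-style encoder/decoder. The layer widths and depth are illustrative, not the authors' architecture.

      import torch
      import torch.nn as nn

      class SaliencyUNet(nn.Module):
          """Tiny U-Net-style network taking a raw image plus a saliency map."""
          def __init__(self):
              super().__init__()
              def block(cin, cout):
                  return nn.Sequential(nn.Conv2d(cin, cout, 3, padding=1), nn.ReLU(),
                                       nn.Conv2d(cout, cout, 3, padding=1), nn.ReLU())
              self.enc1 = block(2, 16)           # 2 input channels: raw + saliency
              self.enc2 = block(16, 32)
              self.pool = nn.MaxPool2d(2)
              self.up = nn.ConvTranspose2d(32, 16, 2, stride=2)
              self.dec1 = block(32, 16)          # 32 = 16 upsampled + 16 from skip
              self.head = nn.Conv2d(16, 1, 1)    # per-pixel anomaly logit

          def forward(self, raw, saliency):
              x = torch.cat([raw, saliency], dim=1)   # joint encoding of both inputs
              e1 = self.enc1(x)
              e2 = self.enc2(self.pool(e1))
              d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))  # skip connection
              return self.head(d1)

    The design point mirrors the abstract: handing the network a pre-segmented regional map lets a much shallower encoder/decoder suffice than if it had to learn the saliency cue from raw pixels alone.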

  8. A Vision for the Net Generation Media Center. Media Matters

    ERIC Educational Resources Information Center

    Johnson, Doug

    2005-01-01

    Many children today have never lived in a home without a computer. They are the "Net Generation," constantly "connected" by iPod, cell phone, keyboard, digital video camera, or game controller to various technologies. Recent studies have found that Net Genners see technology as "embedded in society," a primary means of connection with friends, and…

  9. Tropospheric O3 compromises net primary production in young stands of trembling aspen, paper birch and sugar maple in response to elevated atmospheric CO2

    Treesearch

    John S. King; Mark E. Kubiske; Kurt S. Pregitzer; George R. Hendrey; Evan P. McDonald; Christian P. Giardina; Vanessa S. Quinn; David F. Karnosky

    2005-01-01

    Concentrations of atmospheric CO2 and tropospheric ozone (O3) are rising concurrently in the atmosphere, with potentially antagonistic effects on forest net primary production (NPP) and implications for terrestrial carbon sequestration. Using free-air CO2 enrichment (FACE) technology, we exposed north...

  10. A New Network Modeling Tool for the Ground-based Nuclear Explosion Monitoring Community

    NASA Astrophysics Data System (ADS)

    Merchant, B. J.; Chael, E. P.; Young, C. J.

    2013-12-01

    Network simulations have long been used to assess the performance of monitoring networks in detecting events, for such purposes as planning station deployments and gauging network resilience to outages. The standard tool has been the SAIC-developed NetSim package. With correct parameters, NetSim can produce useful simulations; however, the package has several shortcomings: an older language (FORTRAN), an emphasis on seismic monitoring with limited support for other technologies, limited documentation, and a limited parameter set. Thus, we are developing NetMOD (Network Monitoring for Optimal Detection), a Java-based tool designed to assess the performance of ground-based networks. NetMOD's advantages include: it is coded in a modern, multi-platform language; it exploits modern computing performance (e.g. multi-core processors); it incorporates monitoring technologies other than seismic; and it includes a well-validated default parameter set for the IMS stations. NetMOD is designed to be extendable through a plugin infrastructure, so new phenomenological models can be added. Development of the Seismic Detection Plugin is being pursued first; seismic location and infrasound and hydroacoustic detection plugins will follow. By making NetMOD an open-release package, we hope to provide a common tool that the monitoring community can use to produce assessments of monitoring networks and to verify assessments made by others.
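
    NetMOD itself is Java-based and its plugin API is not given in this record, so the following Python sketch is purely illustrative of the pattern described: phenomenological models register under a technology name, and the simulation core looks them up without knowing their internals. All names here are hypothetical.

      class DetectionPlugin:
          """Interface a phenomenological model must implement (hypothetical)."""
          technology = "unspecified"
          def detection_probability(self, event, station):
              raise NotImplementedError

      PLUGINS = {}

      def register(plugin_cls):
          """Register a plugin under its technology name so simulations can find it."""
          PLUGINS[plugin_cls.technology] = plugin_cls()
          return plugin_cls

      @register
      class SeismicDetection(DetectionPlugin):
          technology = "seismic"
          def detection_probability(self, event, station):
              # Placeholder: a real model would use magnitude, distance, noise, etc.
              return max(0.0, min(1.0, event["magnitude"] / 10.0))

      def network_detection_probability(event, stations, technology="seismic"):
          """P(at least one station detects), assuming independent stations."""
          p_miss = 1.0
          for s in stations:
              p_miss *= 1.0 - PLUGINS[technology].detection_probability(event, s)
          return 1.0 - p_miss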

  11. A Highly Scalable Data Service (HSDS) using Cloud-based Storage Technologies for Earth Science Data

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Readey, J.; Votava, P.; Henderson, J.; Willmore, F.

    2017-12-01

    Cloud based infrastructure may offer several key benefits of scalability, built-in redundancy, security mechanisms and reduced total cost of ownership as compared with a traditional data center approach. However, most of the tools and legacy software systems developed for online data repositories within the federal government were not developed with a cloud based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Moreover, services based on object storage are well established and provided by all the leading cloud service providers (Amazon Web Services, Microsoft Azure, Google Cloud, etc.), and can often provide unmatched "scale-out" capabilities and data availability to a large and growing consumer base at a price point unachievable by in-house solutions. We describe a system that utilizes object storage rather than traditional file-system-based storage to vend earth science data. The system described is not only cost effective, but shows a performance advantage for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.
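
    The record's compatibility claim means code written against the existing HDF5 Python API should keep working when the backing store is object storage. The sketch below uses standard h5py calls as that reference API (the file name and dataset path are invented); an API-compatible client library, as described above, would preserve these calls while translating slice reads into object-store range requests.

      import h5py   # reference API; a compatible client library keeps these calls

      # Open a (hypothetical) gridded temperature archive and read a small slice.
      with h5py.File("tavg_2d_example.h5", "r") as f:
          t2m = f["/T2M"]                    # dataset handle; data is read lazily
          tile = t2m[0, 100:200, 100:200]    # only this subset is fetched
          print(tile.mean())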

  12. Disruptive Effects of Net-Centricity on Command and Control

    DTIC Science & Technology

    2008-06-01

    expectations too quickly are vulnerable to disruptive technologies. When the disruptive innovation gains market share, and old customers adopt new...it is important to remember that disruptive technologies are not merely those that have introduced steep performance improvements, but which, at the...technologies. Disruptive technologies are thereby distinguished from discontinuous sustaining innovations. Net-centric information environments are proving

  13. Effect of a dual-task net-step exercise on cognitive and gait function in older adults.

    PubMed

    Kitazawa, Kazutoshi; Showa, Satoko; Hiraoka, Akira; Fushiki, Yasuhiro; Sakauchi, Humio; Mori, Mitsuru

    2015-01-01

    Participation in generally recommended aerobics or strength exercises may be challenging for older adults. Therefore, it is necessary to consider the types and levels of physical activities suited for them to improve their cognitive and gait function and adherence to exercise programs. This has prompted efforts to identify exercises that require less physical strength and frequency of performance, while still offering cognitive and health benefits. Here, we aimed to assess the effect of a novel dual-task net-step exercise (NSE) performed once a week for 8 consecutive weeks on improvements in cognitive performance and gait function in an older population. In this pretest/posttest experimental case control study, 60 healthy older adults (mean age 76.4 years) were recruited from community-dwelling people and separated randomly into 2 groups: a dual-task NSE group and a control group. The NSE group was asked to walk across a net without stepping on the ropes or being caught in the net. Two computer panel-type cognitive functional assessments, the Touch-M and Touch Panel-Type Dementia Assessment Scale, were administered at baseline and after 8 weeks of intervention to determine the effects of NSE. Improvements in gait function were also evaluated using Timed Up and Go test scores. Mixed-effect models with repeated measures (group × time) (analysis of variance, F test) were used to test the effects of NSE. Adjustments were made for covariates including age and sex (analysis of covariance). The NSE group showed significant improvement in cognitive performance (6.8% change; total Touch-M score 5.4 points; P = .04) and gait performance (11.5% change; Timed Up and Go time -0.98 second; P < .001) over the 8-week period. In the control group, there was no significant improvement. This study shows that dual-task NSE is capable of improving cognitive and gait performance in healthy older adults. Our results indicate that NSE offers an option for a large segment of the older population who need an easier way to maintain their cognitive health and gait function.

  14. Robustness analysis of non-ordinary Petri nets for flexible assembly/disassembly processes based on structural decomposition

    NASA Astrophysics Data System (ADS)

    Hsieh, Fu-Shiung

    2011-03-01

    Design of robust supervisory controllers for manufacturing systems with unreliable resources has received significant attention recently. Robustness analysis provides an alternative way to analyse a perturbed system to quickly respond to resource failures. Although we have analysed the robustness properties of several subclasses of ordinary Petri nets (PNs), analysis for non-ordinary PNs has not been done. Non-ordinary PNs have weighted arcs and have the advantage of compactly modelling operations requiring multiple parts or resources. In this article, we consider a class of flexible assembly/disassembly manufacturing systems and propose a non-ordinary flexible assembly/disassembly Petri net (NFADPN) model for this class of systems. As this class of systems can be regarded as the integration and interaction of a set of assembly/disassembly subprocesses, a bottom-up approach is adopted in this article to construct the NFADPN models. Due to the routing flexibility in NFADPN, there may exist different ways to accomplish the tasks. To characterise these, we propose the concept of completely connected subprocesses. As long as there exists a set of completely connected subprocesses for a certain type of product, production of that type can still be maintained without requiring the whole NFADPN to be live. To take advantage of the alternative routes without enforcing liveness for the whole system, we generalise the previously proposed concept of persistent production to NFADPN, and propose a condition for persistent production based on completely connected subprocesses. We extend robustness analysis to NFADPN by exploiting its structure, identify several patterns of resource failures, and characterise the conditions to maintain operation in the presence of resource failures.
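
    As a concrete illustration of the weighted-arc (non-ordinary) firing rule underlying the NFADPN model: a transition is enabled only when every input place holds at least the arc weight in tokens, and firing moves tokens in weighted amounts. The small assembly net below is hypothetical, not one from the article.

      def enabled(marking, pre):
          """A transition is enabled if each input place holds >= the arc weight."""
          return all(marking.get(p, 0) >= w for p, w in pre.items())

      def fire(marking, pre, post):
          """Consume weighted tokens from input places, produce them in outputs."""
          if not enabled(marking, pre):
              raise ValueError("transition not enabled")
          m = dict(marking)
          for p, w in pre.items():
              m[p] -= w
          for p, w in post.items():
              m[p] = m.get(p, 0) + w
          return m

      # Assembly step needing 2 parts and 1 robot; the weight-2 arc is what makes
      # the net non-ordinary.
      marking = {"parts": 3, "robot": 1, "assembled": 0}
      pre  = {"parts": 2, "robot": 1}
      post = {"assembled": 1, "robot": 1}   # robot is released after the operation
      print(fire(marking, pre, post))       # {'parts': 1, 'robot': 1, 'assembled': 1}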

  15. Trapped between two tails: trading off scientific uncertainties via climate targets

    NASA Astrophysics Data System (ADS)

    Lemoine, Derek; McJeon, Haewon C.

    2013-09-01

    Climate change policies must trade off uncertainties about future warming, about the social and ecological impacts of warming, and about the cost of reducing greenhouse gas emissions. We show that laxer carbon targets produce broader distributions for climate damages, skewed towards severe outcomes. However, if potential low-carbon technologies fill overlapping niches, then more stringent carbon targets produce broader distributions for the cost of reducing emissions, skewed towards high-cost outcomes. We use the technology-rich GCAM integrated assessment model to assess the robustness of 450 and 500 ppm carbon targets to each uncertain factor. The 500 ppm target provides net benefits across a broad range of futures. The 450 ppm target provides net benefits only when impacts are greater than conventionally assumed, when multiple technological breakthroughs lower the cost of abatement, or when evaluated with a low discount rate. Policy evaluations are more sensitive to uncertainty about abatement technology and impacts than to uncertainty about warming.

  16. Research and Analysis of Image Processing Technologies Based on DotNet Framework

    NASA Astrophysics Data System (ADS)

    Ya-Lin, Song; Chen-Xi, Bai

    Microsoft .NET is one of the most popular program development tools. This paper analyses the advantages and disadvantages of several image processing technologies available in .NET, applying the same algorithm in each programming experiment. The results show that the two most efficient methods are unsafe pointers and Direct3D, with Direct3D also suited to 3D simulation development; the other technologies are useful in some fields but are too inefficient for real-time processing. The experimental results will help projects involving image processing and simulation based on .NET and have strong practical value.

  17. Training Analyses Supporting the Land Warrior and Ground Soldier Systems

    DTIC Science & Technology

    2009-07-01

    unit with LW and MW expressed in terms of unit force effectiveness, impacts to the DOTMLPF domains, life cycle cost, and ability to mitigate Joint...other individual tasks, Soldier and/or leader, be added to NET; should any be eliminated? What methods of instruction/resources should remain the...presentation of the training observation results from the nine-day NET. Terminal Learning Objectives The NET POI (Omega Training Group, 2006

  18. Net-Zero Building Technologies Create Substantial Energy Savings -

    Science.gov Websites

    Step-by-step information for decision making around net-zero energy building technologies, including improved insulation, windows, and heating and cooling systems, and net-zero energy building methodologies and technologies demonstrated during a tour of the RSF's rooftop PV system.

  19. National Educational Technology. Standards for Students.

    ERIC Educational Resources Information Center

    International Society for Technology in Education, Eugene, OR.

    The primary goal of the National Educational Technology Standards (NETS) project is to enable stakeholders in PreK-12 education to develop national standards for the educational uses of technology that will facilitate school improvement in the United States. The NETS Project will develop standards to guide educational leaders in recognizing and…

  20. Second Language Teaching and Learning in the Net Generation

    ERIC Educational Resources Information Center

    Oxford, Raquel, Ed.; Oxford, Jeffrey, Ed.

    2009-01-01

    Today's young people--the Net Generation--have grown up with technology all around them. However, teachers cannot assume that students' familiarity with technology in general transfers successfully to pedagogical settings. This volume examines various technologies and offers concrete advice on how each can be successfully implemented in the second…

  1. Pathways for balancing CO2 emissions and sinks.

    PubMed

    Walsh, Brian; Ciais, Philippe; Janssens, Ivan A; Peñuelas, Josep; Riahi, Keywan; Rydzak, Felicjan; van Vuuren, Detlef P; Obersteiner, Michael

    2017-04-13

    In December 2015 in Paris, leaders committed to achieve global, net decarbonization of human activities before 2100. This achievement would halt and even reverse anthropogenic climate change through the net removal of carbon from the atmosphere. However, the Paris documents contain few specific prescriptions for emissions mitigation, leaving various countries to pursue their own agendas. In this analysis, we project energy and land-use emissions mitigation pathways through 2100, subject to best-available parameterization of carbon-climate feedbacks and interdependencies. We find that, barring unforeseen and transformative technological advancement, anthropogenic emissions need to peak within the next 10 years, to maintain realistic pathways to meeting the COP21 emissions and warming targets. Fossil fuel consumption will probably need to be reduced below a quarter of primary energy supply by 2100 and the allowable consumption rate drops even further if negative emissions technologies remain technologically or economically unfeasible at the global scale.

  2. Pathways for balancing CO2 emissions and sinks

    PubMed Central

    Walsh, Brian; Ciais, Philippe; Janssens, Ivan A.; Peñuelas, Josep; Riahi, Keywan; Rydzak, Felicjan; van Vuuren, Detlef P.; Obersteiner, Michael

    2017-01-01

    In December 2015 in Paris, leaders committed to achieve global, net decarbonization of human activities before 2100. This achievement would halt and even reverse anthropogenic climate change through the net removal of carbon from the atmosphere. However, the Paris documents contain few specific prescriptions for emissions mitigation, leaving various countries to pursue their own agendas. In this analysis, we project energy and land-use emissions mitigation pathways through 2100, subject to best-available parameterization of carbon-climate feedbacks and interdependencies. We find that, barring unforeseen and transformative technological advancement, anthropogenic emissions need to peak within the next 10 years, to maintain realistic pathways to meeting the COP21 emissions and warming targets. Fossil fuel consumption will probably need to be reduced below a quarter of primary energy supply by 2100 and the allowable consumption rate drops even further if negative emissions technologies remain technologically or economically unfeasible at the global scale. PMID:28406154

  3. Development of miniaturized instrumentation for Planetary Exploration and its application to the Mars MetNet Precursor Mission

    NASA Astrophysics Data System (ADS)

    Guerrero, Hector

    2010-05-01

    This communication presents the current development of miniaturized instruments for landers and rovers for planetary exploration. In particular, we present a magnetometer with resolution below 10 nT and a mass of about 45 g, and a sun spectral irradiance sensor with 10 bands (UV-VIS-near IR) and a mass of about 75 g. These are being developed for the Finnish, Russian and Spanish MetNet Mars Precursor Mission, to be launched in 2011 within Phobos Grunt (Sample Return). The magnetometer (at present at EQM level) has two triaxial magnetometers (based on commercial AMR technologies) that operate in a gradiometer configuration. The unit also contains a triaxial accelerometer to determine the gravitational orientation of the magnetometer after deployment. This unit is being designed to operate under the severe conditions of the Martian night without any thermal conditioning. The sun spectral irradiance sensor is composed of individual silicon photodiodes, each with an interference filter, and collimators to prevent wavelength shifts due to oblique incidence. To allow discrimination between direct and diffuse ambient light, the photodiodes are deployed on the top and lateral sides of the unit. The instrument is being optimized for deep-UV detection, dust optical depth and Phobos transits. Its accuracy for detecting trace atmospheric gases is under study. In addition, INTA is developing optical wireless link modules for operating on Mars at distances over 1 m, to minimize harness, reduce weight and simplify Assembly, Integration and Test (AIT) tasks. Current emitter/receiver modules are below 10 g, allowing data transmission rates over 1 Mbps.

  4. MetNet Precursor - Network Mission to Mars

    NASA Astrophysics Data System (ADS)

    Harri, Ari-Matti

    2010-05-01

    We are developing a new kind of planetary exploration mission for Mars: MetNet, an in situ observation network based on a new semi-hard landing vehicle called the MetNet Lander (MNL). The first MetNet vehicle, the MetNet Precursor, is slated for launch in 2011. MetNet development work started in 2001, and practical Precursor Mission development began in January 2009 with participation from various space research institutes and agencies. The scientific rationale and goals as well as key mission solutions will be discussed. The eventual scope of the MetNet Mission is to deploy some 20 MNLs on the Martian surface using inflatable descent system structures, supported by observations from orbit around Mars. Currently we are working on the MetNet Mars Precursor Mission (MMPM) to deploy one MetNet Lander to Mars in the 2011 launch window as a technology and science demonstration mission. The MNL will have a versatile science payload focused on the atmospheric science of Mars. Time-resolved in situ Martian meteorological measurements acquired by the Viking, Mars Pathfinder and Phoenix landers, and remote sensing observations by the Mariner 9, Viking, Mars Global Surveyor, Mars Odyssey and Mars Express orbiters, have provided the basis for our current understanding of the behavior of weather and climate on Mars. However, the available amount of data is still scarce, and a wealth of additional in situ observations is needed across varying Martian orography, terrain and altitude, spanning all latitudes and longitudes, to address microscale and mesoscale atmospheric phenomena. Detailed characterization of the Martian atmospheric circulation patterns and climatological cycles requires simultaneous in situ atmospheric observations. The scientific payload of the MetNet Mission encompasses separate instrument packages for the atmospheric entry and descent phase and for the surface operation phase. The MetNet mission concept and key probe technologies have been developed and the critical subsystems have been qualified to meet the Martian environmental and functional conditions. The flight unit of the landing vehicle has been manufactured and tested. This development effort has been carried out in collaboration between the Finnish Meteorological Institute (FMI), the Russian Lavochkin Association (LA) and the Russian Space Research Institute (IKI) since August 2001. INTA (Instituto Nacional de Técnica Aeroespacial) from Spain joined the MetNet Mission team in 2008 and is participating significantly in the MetNet payload development.

  5. Evaluating a federated medical search engine: tailoring the methodology and reporting the evaluation outcomes.

    PubMed

    Saparova, D; Belden, J; Williams, J; Richardson, B; Schuster, K

    2014-01-01

    Federated medical search engines are health information systems that provide a single access point to different types of information. Their efficiency as clinical decision support tools has been demonstrated through numerous evaluations. Despite their rigor, very few of these studies report holistic evaluations of medical search engines and even fewer base their evaluations on existing evaluation frameworks. To evaluate a federated medical search engine, MedSocket, for its potential net benefits in an established clinical setting. This study applied the Human, Organization, and Technology (HOT-fit) evaluation framework in order to evaluate MedSocket. The hierarchical structure of the HOT-factors allowed for identification of a combination of efficiency metrics. Human fit was evaluated through user satisfaction and patterns of system use; technology fit was evaluated through the measurements of time-on-task and the accuracy of the found answers; and organization fit was evaluated from the perspective of system fit to the existing organizational structure. Evaluations produced mixed results and suggested several opportunities for system improvement. On average, participants were satisfied with MedSocket searches and confident in the accuracy of retrieved answers. However, MedSocket did not meet participants' expectations in terms of download speed, access to information, and relevance of the search results. These mixed results made it necessary to conclude that in the case of MedSocket, technology fit had a significant influence on the human and organization fit. Hence, improving technological capabilities of the system is critical before its net benefits can become noticeable. The HOT-fit evaluation framework was instrumental in tailoring the methodology for conducting a comprehensive evaluation of the search engine. Such multidimensional evaluation of the search engine resulted in recommendations for system improvement.

  6. Evaluating a Federated Medical Search Engine

    PubMed Central

    Belden, J.; Williams, J.; Richardson, B.; Schuster, K.

    2014-01-01

    Background: Federated medical search engines are health information systems that provide a single access point to different types of information. Their efficiency as clinical decision support tools has been demonstrated through numerous evaluations. Despite their rigor, very few of these studies report holistic evaluations of medical search engines and even fewer base their evaluations on existing evaluation frameworks. Objectives: To evaluate a federated medical search engine, MedSocket, for its potential net benefits in an established clinical setting. Methods: This study applied the Human, Organization, and Technology (HOT-fit) evaluation framework in order to evaluate MedSocket. The hierarchical structure of the HOT-factors allowed for identification of a combination of efficiency metrics. Human fit was evaluated through user satisfaction and patterns of system use; technology fit was evaluated through the measurements of time-on-task and the accuracy of the found answers; and organization fit was evaluated from the perspective of system fit to the existing organizational structure. Results: Evaluations produced mixed results and suggested several opportunities for system improvement. On average, participants were satisfied with MedSocket searches and confident in the accuracy of retrieved answers. However, MedSocket did not meet participants' expectations in terms of download speed, access to information, and relevance of the search results. These mixed results made it necessary to conclude that in the case of MedSocket, technology fit had a significant influence on the human and organization fit. Hence, improving technological capabilities of the system is critical before its net benefits can become noticeable. Conclusions: The HOT-fit evaluation framework was instrumental in tailoring the methodology for conducting a comprehensive evaluation of the search engine. Such multidimensional evaluation of the search engine resulted in recommendations for system improvement. PMID:25298813

  7. Decision theory and the evaluation of risks and benefits of clinical trials.

    PubMed

    Bernabe, Rosemarie D C; van Thiel, Ghislaine J M W; Raaijmakers, Jan A M; van Delden, Johannes J M

    2012-12-01

    Research ethics committees (RECs) are tasked with assessing the risks and benefits of a clinical trial. Previous studies have shown that RECs find this task difficult, if not impossible, to do. The current approaches to benefit-risk assessment (i.e. Component Analysis and the Net Risk Test) confound the various risk-benefit tasks and, as such, make balancing impossible. In this article, we show that decision theory, specifically through expected utility theory and multiattribute utility theory, enables an explicit and ethically weighted risk-benefit evaluation. This makes a balanced ethical justification possible, and thus more rationally defensible decision making. Copyright © 2012 Elsevier Ltd. All rights reserved.
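
    A minimal numeric sketch of the decision-theoretic machinery the authors invoke, with invented attributes, weights and probabilities: multiattribute utility theory scores each outcome as a weighted sum of per-attribute utilities, and expected utility weights those scores by their probabilities.

      def mau(outcome, weights):
          """Multiattribute utility: weighted sum of per-attribute utilities in [0, 1]."""
          return sum(weights[a] * u for a, u in outcome.items())

      def expected_utility(prospects, weights):
          """Expected utility of a trial arm: sum over outcomes of p * U(outcome)."""
          return sum(p * mau(o, weights) for p, o in prospects)

      # Hypothetical trial arm with benefit and safety attribute utilities.
      weights = {"clinical_benefit": 0.6, "safety": 0.4}
      arm = [
          (0.7, {"clinical_benefit": 0.8, "safety": 0.9}),  # likely: responds, mild AEs
          (0.3, {"clinical_benefit": 0.1, "safety": 0.5}),  # less likely: no response
      ]
      print(expected_utility(arm, weights))  # 0.7*0.84 + 0.3*0.26 = 0.666

    Making the weights explicit is what turns the REC's balancing task into a defensible calculation: the ethical judgment lives in the chosen weights, not in an opaque overall impression.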

  8. Efficient droplet router for digital microfluidic biochip using particle swarm optimizer

    NASA Astrophysics Data System (ADS)

    Pan, Indrajit; Samanta, Tuhina

    2013-01-01

    The digital microfluidic biochip has emerged as a revolutionary finding in the field of micro-electromechanical research. Complex bioassays and pathological analyses are being efficiently performed on this miniaturized chip with negligible amounts of sample. Biochips were initially based on a continuous-fluid-flow mechanism but later evolved to the more efficient digital-fluid-flow concept; these second-generation biochips can serve more complex bioassays. This operational change in biochip technology created a need for high-end computer-aided design for physical design automation, and paved new avenues of research to support it. Droplet routing is one major aspect, requiring minimization of both routing completion time and total electrode usage, a task that involves optimization of multiple associated parameters. In this paper we propose a particle swarm optimization based approach for droplet routing. The process operates in two phases: initially we perform clustering of the state space and classification of nets into designated clusters, which reduces the solution space by redefining local suboptimal targets in the interleaved space between the source and global target of a net; in the next phase we resolve the concurrent routing issues of every suboptimal situation to generate the final routing schedule. The method was applied to some standard test benches and hard test sets. Comparative analysis of experimental results shows good improvement in unit cell usage, routing completion time and execution time over several existing methods.
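
    The two-phase routing formulation above is problem-specific, but the optimizer at its core follows the standard particle swarm update. A hedged generic sketch (the inertia, cognitive and social coefficients are conventional illustrative values; a real droplet router would replace the toy cost with routing completion time plus electrode usage):

      import numpy as np

      def pso(cost, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
          """Minimize cost(x) for x in [0, 1]^dim with a standard PSO update."""
          rng = np.random.default_rng(seed)
          x = rng.random((n_particles, dim))          # particle positions
          v = np.zeros((n_particles, dim))            # particle velocities
          pbest, pbest_f = x.copy(), np.array([cost(p) for p in x])
          g = pbest[pbest_f.argmin()].copy()          # global best position
          for _ in range(iters):
              r1, r2 = rng.random((2, n_particles, dim))
              v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
              x = np.clip(x + v, 0.0, 1.0)
              f = np.array([cost(p) for p in x])
              improved = f < pbest_f                  # update personal bests
              pbest[improved], pbest_f[improved] = x[improved], f[improved]
              g = pbest[pbest_f.argmin()].copy()      # update global best
          return g, pbest_f.min()

      # Toy stand-in for a routing cost function.
      best, best_f = pso(lambda p: ((p - 0.3) ** 2).sum(), dim=4)
      print(best, best_f)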

  9. Negative emissions: Part 1—research landscape and synthesis

    NASA Astrophysics Data System (ADS)

    Minx, Jan C.; Lamb, William F.; Callaghan, Max W.; Fuss, Sabine; Hilaire, Jérôme; Creutzig, Felix; Amann, Thorben; Beringer, Tim; de Oliveira Garcia, Wagner; Hartmann, Jens; Khanna, Tarun; Lenzi, Dominic; Luderer, Gunnar; Nemet, Gregory F.; Rogelj, Joeri; Smith, Pete; Vicente, Jose Luis Vicente; Wilcox, Jennifer; del Mar Zamora Dominguez, Maria

    2018-06-01

    With the Paris Agreement’s ambition of limiting climate change to well below 2 °C, negative emission technologies (NETs) have moved into the limelight of discussions in climate science and policy. Despite several assessments, the current knowledge on NETs is still diffuse and incomplete, but also growing fast. Here, we synthesize a comprehensive body of NETs literature, using scientometric tools and performing an in-depth assessment of the quantitative and qualitative evidence therein. We clarify the role of NETs in climate change mitigation scenarios, their ethical implications, as well as the challenges involved in bringing the various NETs to the market and scaling them up in time. There are six major findings arising from our assessment: first, keeping warming below 1.5 °C requires the large-scale deployment of NETs, but this dependency can still be kept to a minimum for the 2 °C warming limit. Second, accounting for economic and biophysical limits, we identify relevant potentials for all NETs except ocean fertilization. Third, any single NET is unlikely to sustainably achieve the large NETs deployment observed in many 1.5 °C and 2 °C mitigation scenarios. Yet, portfolios of multiple NETs, each deployed at modest scales, could be invaluable for reaching the climate goals. Fourth, a substantial gap exists between the upscaling and rapid diffusion of NETs implied in scenarios and progress in actual innovation and deployment. If NETs are required at the scales currently discussed, the resulting urgency of implementation is currently neither reflected in science nor policy. Fifth, NETs face severe barriers to implementation and are only weakly incentivized so far. Finally, we identify distinct ethical discourses relevant for NETs, but highlight the need to root them firmly in the available evidence in order to render such discussions relevant in practice.

  10. Wearable computer technology for dismounted applications

    NASA Astrophysics Data System (ADS)

    Daniels, Reginald

    2010-04-01

    Small computing devices which rival the compact size of traditional personal digital assistants (PDA) have recently established a market niche. These computing devices are small enough to be considered unobtrusive for humans to wear. The computing devices are also powerful enough to run full multi-tasking general purpose operating systems. This paper will explore the wearable computer information system for dismounted applications recently fielded for ground-based US Air Force use. The environments that the information systems are used in will be reviewed, as well as a description of the net-centric, ground-based warrior. The paper will conclude with a discussion regarding the importance of intuitive, usable, and unobtrusive operator interfaces for dismounted operators.

  11. Modified Petri net model sensitivity to workload manipulations

    NASA Technical Reports Server (NTRS)

    White, S. A.; Mackinnon, D. P.; Lyman, J.

    1986-01-01

    Modified Petri Nets (MPNs) are investigated as a workload modeling tool. The results of an exploratory study of the sensitivity of MPNs to workload manipulations in a dual task are described. Petri nets have been used to represent systems with asynchronous, concurrent and parallel activities (Peterson, 1981). These characteristics led some researchers to suggest the use of Petri nets in workload modeling, where concurrent and parallel activities are common. Petri nets are represented by places and transitions. In the workload application, places represent operator activities and transitions represent events. MPNs have been used to formally represent task events and activities of a human operator in a man-machine system. Some descriptive applications demonstrate the usefulness of MPNs in the formal representation of systems. It is the general hypothesis herein that in addition to descriptive applications, MPNs may be useful for workload estimation and prediction. We report the results of the first of a series of experiments designed to develop and test an MPN system of workload estimation and prediction. This first experiment is a screening test of the MPN model's general sensitivity to changes in workload. Positive results from this experiment will justify the more complicated analyses and techniques necessary for developing a workload prediction system.

  12. Changes in Teachers' Attitudes toward Instructional Technology Attributed to Completing the ISTE NETS*T Certificate of Proficiency Capstone Program

    ERIC Educational Resources Information Center

    Overbaugh, Richard C.; Lu, Ruiling; Diacopoulos, Mark

    2015-01-01

    An evaluation was conducted of teachers' attitudinal perceptions of their confidence for implementation, stages of innovation adoption, and satisfaction, as a result of participating in the International Society for Technology in Education's National Educational Technology Standards-Teachers (ISTE NETS*T) Certificate of Proficiency Capstone…

  13. Dietary caffeine, performance and mood: enhancing and restorative effects after controlling for withdrawal reversal.

    PubMed

    James, Jack E; Gregg, M Elizabeth; Kane, Marian; Harte, Frances

    2005-01-01

    This study aimed to determine whether sustained (i.e. dietary) use of caffeine has net effects on performance and mood compared with sustained abstinence, and whether dietary caffeine restores performance and mood adversely affected by sleep restriction. Participants (n = 96) alternated weekly between ingesting placebo and caffeine (1.75 mg/kg) three times daily for 4 consecutive weeks, while either rested or sleep restricted. Performance involved either a single task requiring sustained vigilance or a varied battery of brief psychomotor and cognitive tasks, and mood was assessed using the Profile of Mood States. Caffeine had no significant net enhancing effects for either performance or mood when participants were rested, and produced no net restorative effects when performance and mood were degraded by sleep restriction. Copyright 2005 S. Karger AG, Basel

  14. A New Informatics Geography.

    PubMed

    Coiera, E

    2016-11-10

    Anyone with knowledge of information systems has experienced frustration when it comes to system implementation or use; unanticipated challenges arise frequently and unanticipated consequences may follow. Working from first principles, we seek to understand why information technology (IT) is often challenging, to identify which IT endeavors are more likely to succeed, and to predict the best role that technology can play in different tasks and settings. The fundamental purpose of IT is to enhance our ability to undertake tasks, supplying new information that changes what we decide and ultimately what occurs in the world. The value of this information (VOI) can be calculated at different stages of the decision-making process and will vary depending on how technology is used. We can imagine a task space that describes the relative benefits of task completion by humans or computers and that contains specific areas where humans or computers are superior. There is a third area where neither is strong, and a final joint workspace where humans and computers working in partnership produce the best results. By understanding that information has value and that VOI can be quantified, we can make decisions about how best to support the work we do. Evaluation of the expected utility of task completion by humans or computers should allow us to decide whether solutions should depend on technology, humans, or a partnership between the two.
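
    The VOI idea lends itself to a small worked example: the value of information is the gain in expected utility from deciding with the information versus without it. The probabilities and utilities below are invented for illustration.

      # Decision: treat or wait. A (perfect) test reveals the true state first.
      p_disease = 0.3
      utility = {("treat", "disease"): 0.9, ("treat", "healthy"): 0.6,
                 ("wait",  "disease"): 0.2, ("wait",  "healthy"): 1.0}

      def eu(action, p):
          return p * utility[(action, "disease")] + (1 - p) * utility[(action, "healthy")]

      # Without information: commit to the single best action under uncertainty.
      eu_without = max(eu("treat", p_disease), eu("wait", p_disease))   # 0.76 (wait)

      # With perfect information: the best action is chosen per revealed state.
      eu_with = (p_disease * max(utility[("treat", "disease")], utility[("wait", "disease")])
                 + (1 - p_disease) * max(utility[("treat", "healthy")], utility[("wait", "healthy")]))

      print(eu_with - eu_without)   # 0.21: the expected value of perfect information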

  15. Pre-shaping of the Fingertip of Robot Hand Covered with Net Structure Proximity Sensor

    NASA Astrophysics Data System (ADS)

    Suzuki, Kenji; Suzuki, Yosuke; Hasegawa, Hiroaki; Ming, Aiguo; Ishikawa, Masatoshi; Shimojo, Makoto

    To achieve skillful tasks with multi-fingered robot hands, many researchers have been working on sensor-based control of such hands. Vision sensors and tactile sensors are indispensable for these tasks; however, the reliability of information from vision sensors decreases as a robot hand approaches an object to be grasped, because of occlusion. This research aims to achieve seamless detection for reliable grasping by use of proximity sensors: correcting the positional error of the hand in the vision-based approach, and bringing the fingertip into contact in a posture suited to effective tactile sensing. In this paper, we propose a method for adjusting the posture of the fingertip to the surface of the object. The method applies a "Net-Structure Proximity Sensor" on the fingertip, which can detect the postural error in the roll and pitch axes between the fingertip and the object surface. The experimental result shows that the postural error is corrected in both axes even if the object rotates dynamically.
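
    A hedged sketch of the adjustment loop such a sensor enables: proportional correction of the measured roll and pitch errors until the fingertip lies parallel to the object surface. The gain, tolerance and callback names are illustrative; the paper's actual control law is not reproduced here.

      def preshape(read_error, command_rates, kp=2.0, tol=0.01, max_steps=200):
          """Drive the roll/pitch error reported by the proximity sensor toward zero.

          read_error():  returns (roll_err, pitch_err) in radians.
          command_rates(roll_rate, pitch_rate): commands fingertip angular rates.
          """
          for _ in range(max_steps):
              roll_err, pitch_err = read_error()
              if abs(roll_err) < tol and abs(pitch_err) < tol:
                  return True                   # fingertip parallel to the surface
              command_rates(-kp * roll_err, -kp * pitch_err)   # proportional correction
          return False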

  16. Coordinating complex decision support activities across distributed applications

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.

    1994-01-01

    Knowledge-based technologies have been applied successfully to automate planning and scheduling in many problem domains. Automation of decision support can be increased further by integrating task-specific applications with supporting database systems, and by coordinating interactions between such tools to facilitate collaborative activities. Unfortunately, the technical obstacles that must be overcome to achieve this vision of transparent, cooperative problem-solving are daunting. Intelligent decision support tools are typically developed for standalone use, rely on incompatible, task-specific representational models and application programming interfaces (APIs), and run on heterogeneous computing platforms. Getting such applications to interact freely calls for platform-independent capabilities for distributed communication, as well as tools for mapping information across disparate representations. Symbiotics is developing a layered set of software tools (called NetWorks!) for integrating and coordinating heterogeneous distributed applications. The top layer of tools consists of an extensible set of generic, programmable coordination services. Developers access these services via high-level APIs to implement the desired interactions between distributed applications.
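
    The coordination services are described only generically, so as a hedged sketch of the simplest such service: a topic-based publish/subscribe broker that lets distributed planning and database applications interact without direct coupling. The class and topic names are hypothetical, not the NetWorks! API.

      from collections import defaultdict

      class Broker:
          """Minimal topic-based publish/subscribe coordination service."""
          def __init__(self):
              self.subscribers = defaultdict(list)

          def subscribe(self, topic, callback):
              self.subscribers[topic].append(callback)

          def publish(self, topic, message):
              # Each subscribing application reacts in its own representation.
              for callback in self.subscribers[topic]:
                  callback(message)

      broker = Broker()
      broker.subscribe("schedule.updated", lambda m: print("planner sees:", m))
      broker.subscribe("schedule.updated", lambda m: print("database logs:", m))
      broker.publish("schedule.updated", {"task": "downlink", "start": "02:00"})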

  17. Bulk energy storage increases United States electricity system emissions.

    PubMed

    Hittinger, Eric S; Azevedo, Inês M L

    2015-03-03

    Bulk energy storage is generally considered an important contributor to the transition toward a more flexible and sustainable electricity system. Although economically valuable, storage is not fundamentally a "green" technology that leads to reductions in emissions. We model the economic and emissions effects of bulk energy storage providing an energy arbitrage service. We calculate the profits under two scenarios (perfect and imperfect information about future electricity prices), and estimate the effect of bulk storage on net emissions of CO2, SO2, and NOx for 20 eGRID subregions in the United States. We find that net system CO2 emissions resulting from storage operation are nontrivial when compared to the emissions from electricity generation, ranging from 104 to 407 kg/MWh of delivered energy depending on location, storage operation mode, and assumptions regarding carbon intensity. Net NOx emissions range from -0.16 (i.e., producing net savings) to 0.49 kg/MWh, and are generally small when compared to average generation-related emissions. Net SO2 emissions from storage operation range from -0.01 to 1.7 kg/MWh, depending on location and storage operation mode.
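
    The sign of the net effect follows from simple arithmetic over marginal emission factors and round-trip efficiency; a hedged numeric sketch with invented factors (not the paper's eGRID values):

      def net_emissions_per_mwh(ef_charge, ef_discharge, efficiency):
          """Net CO2 per MWh delivered by arbitrage storage.

          ef_charge:    kg CO2/MWh of the marginal generation used for charging.
          ef_discharge: kg CO2/MWh of the marginal generation displaced at discharge.
          efficiency:   round-trip efficiency (MWh out per MWh in).
          """
          charged_per_delivered = 1.0 / efficiency   # MWh drawn per MWh delivered
          return ef_charge * charged_per_delivered - ef_discharge

      # Charging on coal-heavy night power, displacing gas-fired peak power:
      print(net_emissions_per_mwh(ef_charge=900, ef_discharge=450, efficiency=0.75))
      # -> 750.0 kg CO2/MWh delivered: arbitrage storage can raise net emissions.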

  18. SeqDepot: streamlined database of biological sequences and precomputed features.

    PubMed

    Ulrich, Luke E; Zhulin, Igor B

    2014-01-15

    Assembling and/or producing integrated knowledge of sequence features continues to be an onerous and redundant task despite a large number of existing resources. We have developed SeqDepot, a novel database that focuses solely on two primary goals: (i) assimilating known primary sequences with predicted feature data and (ii) providing the most simple and straightforward means to procure and readily use this information. Access to >28.5 million sequences and 300 million features is provided through a well-documented and flexible RESTful interface that supports fetching specific data subsets, bulk queries, visualization and searching by MD5 digests or external database identifiers. We have also developed an HTML5/JavaScript web application exemplifying how to interact with SeqDepot, and Perl/Python scripts for use with local processing pipelines. Freely available on the web at http://seqdepot.net/. REST access via http://seqdepot.net/api/v1. Database files and scripts may be downloaded from http://seqdepot.net/download.
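
    A hedged usage sketch against the documented base URL http://seqdepot.net/api/v1: the record states that lookups by MD5 digest are supported, but the exact resource path and response fields below are assumptions for illustration, so consult the API documentation for the actual routes.

      import hashlib
      import requests

      sequence = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"
      digest = hashlib.md5(sequence.encode()).hexdigest()   # MD5 lookups are supported

      # Hypothetical sequence-record lookup by MD5 digest.
      resp = requests.get(f"http://seqdepot.net/api/v1/aseqs/{digest}")
      if resp.ok:
          record = resp.json()
          print(record)   # expected: precomputed feature data for this sequence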

  19. The importance of health co-benefits in macroeconomic assessments of UK Greenhouse Gas emission reduction strategies.

    PubMed

    Jensen, Henning Tarp; Keogh-Brown, Marcus R; Smith, Richard D; Chalabi, Zaid; Dangour, Alan D; Davies, Mike; Edwards, Phil; Garnett, Tara; Givoni, Moshe; Griffiths, Ulla; Hamilton, Ian; Jarrett, James; Roberts, Ian; Wilkinson, Paul; Woodcock, James; Haines, Andy

    We employ a single-country dynamically-recursive Computable General Equilibrium model to make health-focussed macroeconomic assessments of three contingent UK Greenhouse Gas (GHG) mitigation strategies, designed to achieve 2030 emission targets as suggested by the UK Committee on Climate Change. In contrast to previous assessment studies, our main focus is on health co-benefits additional to those from reduced local air pollution. We employ a conservative cost-effectiveness methodology with a zero net cost threshold. Our urban transport strategy (with cleaner vehicles and increased active travel) brings important health co-benefits and is likely to be strongly cost-effective; our food and agriculture strategy (based on abatement technologies and reduction in livestock production) brings worthwhile health co-benefits, but is unlikely to eliminate net costs unless new technological measures are included; our household energy efficiency strategy is likely to breakeven only over the long term after the investment programme has ceased (beyond our 20 year time horizon). We conclude that UK policy makers will, most likely, have to adopt elements which involve initial net societal costs in order to achieve future emission targets and longer-term benefits from GHG reduction. Cost-effectiveness of GHG strategies is likely to require technological mitigation interventions and/or demand-constraining interventions with important health co-benefits and other efficiency-enhancing policies that promote internalization of externalities. Health co-benefits can play a crucial role in bringing down net costs, but our results also suggest the need for adopting holistic assessment methodologies which give proper consideration to welfare-improving health co-benefits with potentially negative economic repercussions (such as increased longevity).

  20. Design and Implementation of Surrounding Transaction Plotting and Management System Based on Google Map API

    NASA Astrophysics Data System (ADS)

    Cao, Y. B.; Hua, Y. X.; Zhao, J. X.; Guo, S. M.

    2013-11-01

    With China's rapid economic development and growing comprehensive national strength, border work has become a long-term and important task in China's diplomatic work. Implementing rapid plotting, real-time sharing and mapping of surrounding affairs is of great significance for government policy makers and diplomatic staff. At present, however, existing boundary information systems suffer from a heavy geospatial data update workload, a serious lack of plotting tools, and difficulty in sharing geographic events; these problems have seriously hampered the smooth conduct of border tasks. The development and progress of geographic information system technology, especially Web GIS, offers the possibility of solving these problems. This paper adopts a four-layer B/S architecture and, with the support of the Google Maps service and its free API (with its openness, ease of use, sharing characteristics and high-resolution imagery), designs and implements a surrounding transaction plotting and management system based on the web development technologies ASP.NET, C# and Ajax. The system can provide decision support for government policy makers as well as real-time plotting and sharing of surrounding information for diplomatic staff. Practice has proved that the system has good usability and strong real-time performance.

  1. Technological Constraints and Implementation Barriers of Using Videoconferencing for Virtual Teaching in New Zealand Secondary Schools

    ERIC Educational Resources Information Center

    Lai, Kwok-Wing; Pratt, Keryn

    2009-01-01

    Nine New Zealand secondary schools participated in the OtagoNet project, using videoconferencing technologies to deliver courses to multiple sites. This paper reports findings from a study conducted between 2001 and 2004 to evaluate the effectiveness of OtagoNet. It was found that videoconferencing technology had a significant impact on pedagogy…

  2. Toward a Psychological Science of Advanced Technology Design for Older Adults

    PubMed Central

    Fisk, Arthur D.

    2010-01-01

    Objectives. Technology represents advances in knowledge that change the way humans perform tasks. Ideally, technology will make the task easier, more efficient, safer, or perhaps more pleasurable. Unfortunately, new technologies can sometimes make a task more difficult, slower, dangerous, or perhaps more frustrating. Older adults interact with a variety of technologies in the course of their daily activities and thus products should be designed to be used by people of varying ages. Methods. In this article, we provide an overview of what psychology has to offer to the design of technology—from understanding what people need, to identifying their preferences for design characteristics, and to defining their capabilities and limitations that will influence technology interactions. Results. We identify how research in the field of psychology and aging has advanced understanding of technology interactions and how research on technology interactions can inform theories of aging. Discussion. Design for aging involves understanding the unique capabilities and limitations of older adults; identifying their needs, preferences, and desires for technology in their lives; and involving them in the design process. PMID:20833690

  3. Rich Representations with Exposed Semantics for Deep Visual Reasoning

    DTIC Science & Technology

    2016-06-01

    Gupta, Abhinav; Hebert, Martial; Aminoff, Elissa; Park, HyunSoo; Forsyth, David; Shi, Jianbo; Tarr, Michael

    …approach performs equally well or better when compared to the state-of-the-art. We also expanded our research on perceiving the interactions between… cross-instance correspondences. We demonstrated that our end-to-end trained ConvNet supervised by cycle-consistency outperforms state-of-the-art…

  4. Functional network in posttranslational modifications: Glyco-Net in Glycoconjugate Data Bank.

    PubMed

    Miura, Nobuaki; Okada, Takuya; Murayama, Daisuke; Hirose, Kazuko; Sato, Taku; Hashimoto, Ryo; Fukushima, Nobuhiro

    2015-01-01

    Elucidating pathways related to posttranslational modifications (PTMs) such as glycosylation is of growing importance in post-genome science and technology. Graphical networks describing the relationships among glycan-related molecules, including genes, proteins, lipids, and various biological events, are considered extremely valuable and convenient tools for the systematic investigation of PTMs. Glyco-Net (http://bibi.sci.hokudai.ac.jp/functions/) can dynamically generate network figures relating various biological molecules and biological events. Each molecule or event is represented by a node, and the relationship between a molecule and an event is indicated by arrows in the network figures. In this chapter, we describe the features and current status of Glyco-Net and give a simple example of a search with Glyco-Net.

  5. The value of personal health record (PHR) systems.

    PubMed

    Kaelber, David; Pan, Eric C

    2008-11-06

    Personal health records (PHRs) are a rapidly growing area of health information technology despite a lack of significant value-based assessment. Here we present an assessment of the potential value of PHR systems, looking at both costs and benefits. We examine provider-tethered, payer-tethered, and third-party PHRs, as well as idealized interoperable PHRs. An analytical model was developed that considered eight PHR application and infrastructure functions. Our analysis projects the initial and annual costs and annual benefits of PHRs to the entire US over the next 10 years. This PHR analysis shows that all forms of PHRs have initial net negative value. However, at the end of 10 years, steady-state annual net value ranges from $13 billion to -$29 billion. Interoperable PHRs provide the most value, with third-party PHRs and payer-tethered PHRs also showing positive net value. Provider-tethered PHRs consistently demonstrate negative net value.

  6. SurvNet: a web server for identifying network-based biomarkers that most correlate with patient survival data.

    PubMed

    Li, Jun; Roebuck, Paul; Grünewald, Stefan; Liang, Han

    2012-07-01

    An important task in biomedical research is identifying biomarkers that correlate with patient clinical data, and these biomarkers then provide a critical foundation for the diagnosis and treatment of disease. Conventionally, such an analysis is based on individual genes, but the results are often noisy and difficult to interpret. Using a biological network as the searching platform, network-based biomarkers are expected to be more robust and provide deep insights into the molecular mechanisms of disease. We have developed a novel bioinformatics web server for identifying network-based biomarkers that most correlate with patient survival data, SurvNet. The web server takes three input files: one biological network file, representing a gene regulatory or protein interaction network; one molecular profiling file, containing any type of gene- or protein-centred high-throughput biological data (e.g. microarray expression data or DNA methylation data); and one patient survival data file (e.g. patients' progression-free survival data). Given user-defined parameters, SurvNet will automatically search for subnetworks that most correlate with the observed patient survival data. As the output, SurvNet will generate a list of network biomarkers and display them through a user-friendly interface. SurvNet can be accessed at http://bioinformatics.mdanderson.org/main/SurvNet.
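
    SurvNet's actual search algorithm is not reproduced here; as a minimal sketch of the underlying statistic (correlating a candidate subnetwork with survival), the snippet below, which assumes the lifelines library and uses hypothetical data frames and gene names, scores one gene set by splitting patients on its mean expression and applying a log-rank test.

        # Minimal sketch of scoring a candidate subnetwork against survival:
        # median-split patients on the subnetwork's mean expression and compare
        # the two survival curves. Not SurvNet's actual search algorithm.
        import pandas as pd
        from lifelines.statistics import logrank_test

        def score_subnetwork(expr: pd.DataFrame, surv: pd.DataFrame, genes: list) -> float:
            """expr: patients x genes; surv: columns 'time' and 'event' (1=event)."""
            signature = expr[genes].mean(axis=1)    # per-patient subnetwork score
            high = signature >= signature.median()  # median split into two groups
            res = logrank_test(
                surv.loc[high, "time"], surv.loc[~high, "time"],
                event_observed_A=surv.loc[high, "event"],
                event_observed_B=surv.loc[~high, "event"])
            return res.p_value  # smaller = stronger association with survival

        # Hypothetical usage: score_subnetwork(expr_df, surv_df, ["TP53", "MDM2"])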

  7. A hybrid architecture for the implementation of the Athena neural net model

    NASA Technical Reports Server (NTRS)

    Koutsougeras, C.; Papachristou, C.

    1989-01-01

    The implementation of an earlier introduced neural net model for pattern classification is considered. Data flow principles are employed in the development of a machine that efficiently implements the model and can be useful for real time classification tasks. Further enhancement with optical computing structures is also considered.

  8. Research on air and missile defense task allocation based on extended contract net protocol

    NASA Astrophysics Data System (ADS)

    Zhang, Yunzhi; Wang, Gang

    2017-10-01

    Against the background of air and missile defense distributed element cooperative engagement, the problem of allocating interception tasks among multiple weapon units engaging multiple targets under networked conditions is analyzed. Firstly, a mathematical model of task allocation is established by decomposing the combat task. Secondly, an initial assignment based on auction contracts and an adjustment scheme based on swap contracts are introduced into the task allocation. Finally, simulation of a typical scenario shows that the model can solve the task allocation problem in a complex combat environment.
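
    As a hedged illustration of the two-stage scheme just described (auction-based initial assignment, then swap-based adjustment), the toy Python sketch below allocates targets to weapon units; the utility matrix and the greedy auction are simplifications, not the paper's exact protocol.

        # Toy two-stage allocation: auction-style initial assignment followed by
        # pairwise swap improvements. Utilities are hypothetical toy numbers.
        import itertools
        import numpy as np

        def auction_assign(value):
            """value[i, j] = utility of weapon unit i intercepting target j."""
            return [int(np.argmax(value[:, j])) for j in range(value.shape[1])]

        def swap_improve(value, assign):
            improved = True
            while improved:
                improved = False
                for a, b in itertools.combinations(range(len(assign)), 2):
                    i, k = assign[a], assign[b]
                    # swap contract: exchange two tasks if total utility rises
                    if value[k, a] + value[i, b] > value[i, a] + value[k, b]:
                        assign[a], assign[b] = k, i
                        improved = True
            return assign

        value = np.array([[0.9, 0.2, 0.5],
                          [0.4, 0.8, 0.3]])  # 2 units x 3 targets
        print(swap_improve(value, auction_assign(value)))  # unit index per target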

  9. Deep learning with convolutional neural networks for EEG decoding and visualization

    PubMed Central

    Springenberg, Jost Tobias; Fiederer, Lukas Dominique Josef; Glasstetter, Martin; Eggensperger, Katharina; Tangermann, Michael; Hutter, Frank; Burgard, Wolfram; Ball, Tonio

    2017-01-01

    Abstract Deep learning with convolutional neural networks (deep ConvNets) has revolutionized computer vision through end‐to‐end learning, that is, learning from the raw data. There is increasing interest in using deep ConvNets for end‐to‐end EEG analysis, but a better understanding of how to design and train ConvNets for end‐to‐end EEG decoding and how to visualize the informative EEG features the ConvNets learn is still needed. Here, we studied deep ConvNets with a range of different architectures, designed for decoding imagined or executed tasks from raw EEG. Our results show that recent advances from the machine learning field, including batch normalization and exponential linear units, together with a cropped training strategy, boosted the deep ConvNets decoding performance, reaching at least as good performance as the widely used filter bank common spatial patterns (FBCSP) algorithm (mean decoding accuracies 82.1% FBCSP, 84.0% deep ConvNets). While FBCSP is designed to use spectral power modulations, the features used by ConvNets are not fixed a priori. Our novel methods for visualizing the learned features demonstrated that ConvNets indeed learned to use spectral power modulations in the alpha, beta, and high gamma frequencies, and proved useful for spatially mapping the learned features by revealing the topography of the causal contributions of features in different frequency bands to the decoding decision. Our study thus shows how to design and train ConvNets to decode task‐related information from the raw EEG without handcrafted features and highlights the potential of deep ConvNets combined with advanced visualization techniques for EEG‐based brain mapping. Hum Brain Mapp 38:5391–5420, 2017. © 2017 Wiley Periodicals, Inc. PMID:28782865
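
    The paper's exact architectures are not reproduced here; the loose PyTorch sketch below (layer sizes illustrative) shows the ingredients the abstract highlights: a temporal then spatial convolution over raw EEG, batch normalization, ELU units, and training on "crops", i.e. sliding time windows cut from each trial.

        # Loose, illustrative PyTorch approximation of a shallow EEG ConvNet;
        # not the authors' exact architecture.
        import torch
        import torch.nn as nn

        class EEGConvNet(nn.Module):
            def __init__(self, n_channels=22, n_classes=4):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(1, 40, kernel_size=(1, 25)),           # temporal conv
                    nn.Conv2d(40, 40, kernel_size=(n_channels, 1)),  # spatial conv
                    nn.BatchNorm2d(40),
                    nn.ELU(),
                    nn.AvgPool2d(kernel_size=(1, 75), stride=(1, 15)),
                    nn.Dropout(0.5),
                )
                self.classifier = nn.LazyLinear(n_classes)  # infers input size

            def forward(self, x):  # x: (batch, 1, channels, time samples)
                return self.classifier(self.features(x).flatten(start_dim=1))

        net = EEGConvNet()
        crops = torch.randn(8, 1, 22, 500)  # 8 cropped windows of raw EEG
        print(net(crops).shape)             # torch.Size([8, 4])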

  10. Measurement and Evidence of Computer-Based Task Switching and Multitasking by "Net Generation" Students

    ERIC Educational Resources Information Center

    Judd, Terry; Kennedy, Gregor

    2011-01-01

    Logs of on-campus computer and Internet usage were used to conduct a study of computer-based task switching and multitasking by undergraduate medical students. A detailed analysis of over 6000 individual sessions revealed that while a majority of students engaged in both task switching and multitasking behaviours, they did so less frequently than…

  11. Predictive Cache Modeling and Analysis

    DTIC Science & Technology

    2011-11-01

    metaheuristic/bin-packing algorithm to optimize task placement based on task communication characterization. Our previous work on task allocation showed… Cache Miss Minimization Technology: To efficiently explore combinations and discover nearly-optimal task-assignment algorithms, we extended our… it was possible to use our algorithmic techniques to decrease network bandwidth consumption by ~25%. In this effort, we adapted these existing…

  12. Development of Web GIS for complex processing and visualization of climate geospatial datasets as an integral part of dedicated Virtual Research Environment

    NASA Astrophysics Data System (ADS)

    Gordov, Evgeny; Okladnikov, Igor; Titov, Alexander

    2017-04-01

    For comprehensive usage of large geospatial meteorological and climate datasets it is necessary to create a distributed software infrastructure based on the spatial data infrastructure (SDI) approach. Currently, it is generally accepted that the development of client applications as integrated elements of such infrastructure should be based on modern web and GIS technologies. The paper describes the Web GIS for complex processing and visualization of geospatial (mainly NetCDF and PostGIS format) datasets as an integral part of the dedicated Virtual Research Environment for comprehensive study of ongoing and possible future climate change and analysis of its implications, providing full information and computing support for the study of economic, political and social consequences of global climate change at the global and regional levels. The Web GIS consists of two basic software parts: 1. A server-side part comprising PHP applications of the SDI geoportal, realizing the interaction with the computational core backend and the WMS/WFS/WPS cartographical services, as well as implementing an open API for browser-based client software; this part provides a limited set of procedures accessible via a standard HTTP interface. 2. A front-end part comprising the Web GIS client, developed as a "single page application" based on the JavaScript libraries OpenLayers (http://openlayers.org/), ExtJS (https://www.sencha.com/products/extjs) and GeoExt (http://geoext.org/); it implements the application business logic and provides an intuitive user interface similar to that of such popular desktop GIS applications as uDIG, QuantumGIS, etc. The Boundless/OpenGeo architecture was used as a basis for the Web GIS client development. In line with general INSPIRE requirements on data visualization, the Web GIS provides such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, and displays map legends and corresponding metadata information. The specialized Web GIS client contains three basic tiers: a tier of NetCDF metadata in JSON format; a middleware tier of JavaScript objects implementing methods to work with NetCDF metadata, the XML file of the selected calculation configuration (XML task), and the WMS/WFS/WPS cartographical services; and a graphical user interface tier of JavaScript objects realizing the general application business logic. The Web GIS provides launching of computational processing services to support tasks in the area of environmental monitoring, and presents calculation results in the form of WMS/WFS cartographical layers in raster (PNG, JPG, GeoTIFF), vector (KML, GML, Shape) and binary (NetCDF) formats. It has shown its effectiveness in solving real climate change research problems and disseminating investigation results in cartographical formats. The work is supported by the Russian Science Foundation grant No 16-19-10257.
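
    As a hedged sketch of the "NetCDF metadata in JSON" tier mentioned above, the snippet below (file name hypothetical) extracts dimensions, variables and attributes from a NetCDF file into JSON of the kind a browser-based client could consume.

        # Illustrative extraction of NetCDF metadata into JSON for a web client;
        # the geoportal's real metadata tier may differ.
        import json
        from netCDF4 import Dataset

        def netcdf_metadata(path):
            with Dataset(path) as ds:
                return json.dumps({
                    "dimensions": {n: len(d) for n, d in ds.dimensions.items()},
                    "variables": {
                        n: {"dims": list(v.dimensions),
                            "attrs": {a: str(v.getncattr(a)) for a in v.ncattrs()}}
                        for n, v in ds.variables.items()},
                }, indent=2)

        print(netcdf_metadata("air_temperature.nc"))  # hypothetical dataset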

  13. The Key to Competitiveness: Understanding the Next Generation Learner--A Guide for College and University Leaders

    ERIC Educational Resources Information Center

    American Association of State Colleges and Universities, 2004

    2004-01-01

    It is unclear whether the Net generation is pushing innovations in communications technologies (because of their fascination with it) or if advances in communications technologies are influencing the Net generation. That discussion is not nearly as relevant as their perceptions of the role technology plays in their lives. Whereas the older peers…

  14. Center of Pressure Displacement of Standing Posture during Rapid Movements Is Reorganised Due to Experimental Lower Extremity Muscle Pain.

    PubMed

    Shiozawa, Shinichiro; Hirata, Rogerio Pessoto; Graven-Nielsen, Thomas

    2015-01-01

    Postural control during rapid movements may be impaired due to musculoskeletal pain. The purpose of this study was to investigate the effect of experimental knee-related muscle pain on the center of pressure (CoP) displacement in a reaction time task condition. Nine healthy males performed two reaction time tasks (dominant side shoulder flexion and bilateral heel lift) before, during, and after experimental pain induced in the dominant side vastus medialis or the tibialis anterior muscles by hypertonic saline injections. The CoP displacement of the ipsilateral and contralateral sides was extracted from two force plates and the net CoP displacement was calculated. Compared with non-painful sessions, tibialis anterior muscle pain reduced the peak and peak-to-peak displacement of the net CoP in the medial-lateral direction during the anticipatory postural adjustments (APAs) of the shoulder task (P<0.05). Tibialis anterior and vastus medialis muscle pain during the shoulder flexion task reduced the anterior-posterior peak-to-peak displacement on the ipsilateral side (P<0.05). The central nervous system in healthy individuals was sufficiently robust to maintain the APA characteristics during pain, although the displacement of the net and ipsilateral CoP in the medial-lateral and anterior-posterior directions during unilateral fast shoulder movement was altered.
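
    A net CoP trajectory from two force plates is conventionally obtained as the vertical-force-weighted average of the two individual CoP trajectories; the short sketch below states that standard formula (variable names illustrative, not the authors' exact pipeline).

        # Standard two-plate combination: net CoP = each plate's CoP weighted
        # by its vertical force. Illustrative only, not the study's pipeline.
        import numpy as np

        def net_cop(cop_ipsi, fz_ipsi, cop_contra, fz_contra):
            """cop_*: (n, 2) ML/AP positions; fz_*: (n,) vertical forces."""
            w = (fz_ipsi / (fz_ipsi + fz_contra))[:, None]
            return w * cop_ipsi + (1.0 - w) * cop_contra

        def peak_to_peak(cop):
            return cop.max(axis=0) - cop.min(axis=0)  # [ML, AP] displacement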

  15. Waste-to-Energy Thermal Destruction Identification for Forward Operating Bases

    DTIC Science & Technology

    2016-07-01

    waste disposal strategy is to simplify the technology development goals. Specifically, we recommend a goal of reducing total net energy consumption… to net zero. The minimum objective should be the lowest possible fuel consumption per unit of waste disposed. By shifting the focus from W2E to waste… over long distances increases the risks to military personnel and contractors. Because fuel is a limited resource at FOBs, diesel fuel consumption…

  16. Usability inspection to improve an electronic provincial medication repository.

    PubMed

    Kitson, Nicole A; Price, Morgan; Bowen, Michael; Lau, Francis

    2013-01-01

    Medication errors are a significant source of actual and potential harm for patients. Community medication records have the potential to reduce medication errors, but they can also introduce unintended consequences when there is low fit to task (low cognitive fit). PharmaNet is a provincially managed electronic repository that contains the records for community-based pharmacy-dispensed medications in British Columbia. This research explores the usability of PharmaNet as a representative community-based medication repository. We completed usability inspections of PharmaNet through vendor applications. Vendor participants were asked to complete activity-driven scenarios which highlighted aspects of medication management workflow. Screen recordings were later reviewed, and heuristics were applied to explore usability issues and improvement opportunities. Usability inspection was conducted with four PharmaNet applications. Ninety-six usability issues were identified; half of these had potential implications for patient safety. These were primarily related to login and logout procedures; display of patient name; display of medications; update and display of alert information; and the changing or discontinuation of medications. PharmaNet was designed primarily to support medication dispensing and billing activities by community pharmacies, but is also used to support care providers with monitoring and prescribing activities. As such, some of its features do not fit other clinical activities well. To improve fit, we recommend adding a Current Medications List and displaying Medication Utilization Charts.

  17. IT-adoption and the interaction of task, technology and individuals: a fit framework and a case study

    PubMed Central

    Ammenwerth, Elske; Iller, Carola; Mahler, Cornelia

    2006-01-01

    Background Factors of IT adoption have largely been discussed in the literature. However, existing frameworks (such as TAM or TTF) fail to include one important aspect, the interaction between user and task. Method Based on a literature study and a case study, we developed the FITT framework to help analyse the socio-organisational-technical factors that influence IT adoption in a health care setting. Results Our FITT framework ("Fit between Individuals, Task and Technology") is based on the idea that IT adoption in a clinical environment depends on the fit between the attributes of the individual users (e.g. computer anxiety, motivation), attributes of the technology (e.g. usability, functionality, performance), and attributes of the clinical tasks and processes (e.g. organisation, task complexity). We used this framework in the retrospective analysis of a three-year case study describing the adoption of a nursing documentation system in various departments of a German University Hospital. We show how the FITT framework helped in analysing the process of IT adoption during an IT implementation: every IT adoption problem we found could be described in terms of the three fit dimensions, and any intervention on the fit can be described with regard to the three objects of the FITT framework (individual, task, technology). We also derive facilitators of and barriers to IT adoption of clinical information systems. Conclusion This work should support a better understanding of the reasons for IT adoption failures and therefore enable better prepared and more successful IT introduction projects. We will discuss, however, that from a more epistemological point of view, it may be difficult or even impossible to analyse the complex and interacting factors that predict success or failure of IT projects in a socio-technical environment. PMID:16401336

  18. Devices and circuits for nanoelectronic implementation of artificial neural networks

    NASA Astrophysics Data System (ADS)

    Turel, Ozgur

    Biological neural networks perform complicated information processing tasks at speeds better than conventional computers based on conventional algorithms. This has inspired researchers to look into the way these networks function, and to propose artificial networks that mimic their behavior. Unfortunately, most artificial neural networks, whether software or hardware, provide neither the speed nor the complexity of a human brain. Nanoelectronics, with the high density and low power dissipation that it provides, may be used in developing more efficient artificial neural networks. This work consists of two major contributions in this direction. First is the proposal of the CMOL concept, hybrid CMOS-molecular hardware [1-8]. CMOL may circumvent most of the problems posed by molecular devices, such as low yield, yet provide high active device density, ~10^12/cm^2. The second contribution is CrossNets, artificial neural networks that are based on CMOL. We showed that CrossNets, with their fault tolerance and exceptional speed (~4 to 6 orders of magnitude faster than biological neural networks), can perform any task any artificial neural network can perform. Moreover, there is hope that if their integration scale is increased to that of the human cerebral cortex (~10^10 neurons and ~10^14 synapses), they may be capable of performing more advanced tasks.

  19. Tritium technology development in EEC laboratories contributions to design goals for NET

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dinner, P.; Chazalon, M.; Leger, D.

    1988-09-01

    An overview is given of the tritium technology activities carried out in the European national laboratories associated with the European Fusion Programme and in the European Joint Research Center. The relationship of these activities to the Next European Torus (NET) design priorities is discussed, and the current status of the research is summarised. Future developments, required for NET, which will be addressed in the definition of the next 5-year programme are also presented.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, A.; Repac, B.; Gonder, J.

    This poster presents initial estimates of the net energy impacts of automated vehicles (AVs). Automated vehicle technologies are increasingly recognized as having potential to decrease carbon dioxide emissions and petroleum consumption through mechanisms such as improved efficiency, better routing, lower traffic congestion, and by enabling advanced technologies. However, some effects of AVs could conceivably increase fuel consumption, such as longer distances traveled, increased use of transportation by underserved groups, and increased travel speeds. The net effect on petroleum use and climate change is still uncertain. To make an aggregate system estimate, we first collect best estimates for the energy impacts of approximately ten effects of AVs. We then use a modified Kaya Identity approach to estimate the range of aggregate effects and avoid double counting. We find that depending on numerous factors, there is a wide range of potential energy impacts. Adoption of automated personal or shared vehicles can lead to significant fuel savings but has potential for backfire.
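
    The poster's exact decomposition is not given here, but a transport-focused modification of the Kaya identity might take a form such as

        \text{Fuel} = P \times \frac{\text{VMT}}{P} \times \frac{E}{\text{VMT}} \times \frac{\text{Fuel}}{E},

    where P is population, VMT is vehicle-miles traveled and E is vehicle energy demand; each AV effect then enters as a multiplicative adjustment to a single factor (e.g. smoother traffic lowering E/VMT, easier travel raising VMT/P), which is what keeps the aggregate estimate free of double counting.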

  1. Body size and predatory performance in wolves: is bigger better?

    PubMed

    MacNulty, Daniel R; Smith, Douglas W; Mech, L David; Eberly, Lynn E

    2009-05-01

    1. Large body size hinders locomotor performance in ways that may lead to trade-offs in predator foraging ability that limit the net predatory benefit of larger size. For example, size-related improvements in handling prey may come at the expense of pursuing prey and thus negate any enhancement in overall predatory performance due to increasing size. 2. This hypothesis was tested with longitudinal data from repeated observations of 94 individually known wolves (Canis lupus) hunting elk (Cervus elaphus) in Yellowstone National Park, USA. Wolf size was estimated from an individually based sex-specific growth model derived from body mass measurements of 304 wolves. 3. Larger size granted individual wolves a net predatory advantage despite substantial variation in its effect on the performance of different predatory tasks; larger size improved performance of a strength-related task (grappling and subduing elk) but failed to improve performance of a locomotor-related task (selecting an elk from a group) for wolves > 39 kg. 4. Sexual dimorphism in wolf size also explained why males outperformed females in each of the three tasks considered (attacking, selecting, and killing). 5. These findings support the generalization that bigger predators are overall better hunters, but they also indicate that increasing size ultimately limits elements of predatory behaviour that require superior locomotor performance. We argue that this could potentially narrow the dietary niche of larger carnivores as well as limit the evolution of larger size if prey are substantially more difficult to pursue than to handle.

  2. Body size and predatory performance in wolves: Is bigger better?

    USGS Publications Warehouse

    MacNulty, D.R.; Smith, D.W.; Mech, L.D.; Eberly, L.E.

    2009-01-01

    Large body size hinders locomotor performance in ways that may lead to trade-offs in predator foraging ability that limit the net predatory benefit of larger size. For example, size-related improvements in handling prey may come at the expense of pursuing prey and thus negate any enhancement in overall predatory performance due to increasing size. 2. This hypothesis was tested with longitudinal data from repeated observations of 94 individually known wolves (Canis lupus) hunting elk (Cervus elaphus) in Yellowstone National Park, USA. Wolf size was estimated from an individually based sex-specific growth model derived from body mass measurements of 304 wolves. 3. Larger size granted individual wolves a net predatory advantage despite substantial variation in its effect on the performance of different predatory tasks; larger size improved performance of a strength-related task (grappling and subduing elk) but failed to improve performance of a locomotor-related task (selecting an elk from a group) for wolves > 39 kg. 4. Sexual dimorphism in wolf size also explained why males outperformed females in each of the three tasks considered (attacking, selecting, and killing). 5. These findings support the generalization that bigger predators are overall better hunters, but they also indicate that increasing size ultimately limits elements of predatory behaviour that require superior locomotor performance. We argue that this could potentially narrow the dietary niche of larger carnivores as well as limit the evolution of larger size if prey are substantially more difficult to pursue than to handle. © 2009 British Ecological Society.

  3. Designing an IMAC system using TeraNet

    NASA Astrophysics Data System (ADS)

    Mun, In K.; Hilal, S. K.; Andrews, M. C.; Gidron, Rafael

    1992-07-01

    Even though considerable progress has been made in communication technology, one of the more difficult problems faced in installing a comprehensive, clinically effective Image Management and Communication (IMAC) system for a hospital is the communication problem. Most existing systems are based on Ethernet or Token Ring networks. Some of the newer systems are being installed using FDDI. All these systems have inherent problems, such as limited communication speed, lack of control over bandwidth usage, and/or poor performance under heavy traffic. In order to overcome these difficulties, we are designing a complete IMAC system based on a novel network known as TeraNet, being developed at the Center for Telecommunication Research, Columbia University.

  4. Mars base technology program overview

    NASA Technical Reports Server (NTRS)

    Chu, Cheng-Chih; Hayati, Samad A.; Udomkesmalee, Suraphol

    2005-01-01

    In this paper, we present an overview of the current technology portfolio of the Mars Base Technology Program (MTP). Brief descriptions of the awarded technologies and the high-priority areas in both NRAs are provided to show the current focus of MTP. We also present the approach that MTP uses to evaluate technology maturity for each of the technology tasks.

  5. Investigating the genetic basis of attention to facial expressions: the role of the norepinephrine transporter gene.

    PubMed

    Yang, Xing; Ru, Wenzhao; Wang, Bei; Gao, Xiaocai; Yang, Lu; Li, She; Xi, Shoumin; Gong, Pingyuan

    2016-12-01

    Levels of norepinephrine (NE) in the brain are related to attention ability in animals and risk of attention-deficit hyperactivity disorder in humans. Given the modulation of the norepinephrine transporter (NET) on NE levels in the brain and the link between NE and attention impairment of attention-deficit hyperactivity disorder, it was possible that the NET gene underpinned individual differences in attention processes in healthy populations. To investigate to what extent NET could modulate one's attention orientation to facial expressions, we categorized individuals according to the genotypes of the -182 T/C (rs2242446) polymorphism and measured individuals' attention orientation with the spatial cueing task. Our results indicated that the -182 T/C polymorphism significantly modulated attention orientation to facial expressions, of which the CC genotype facilitated attention reorientation to the locations where cued faces were previously presented. However, this polymorphism showed no significant effects on the regulations of emotional cues on attention orientation. Our findings suggest that the NET gene modulates the individual difference in attention to facial expressions, which provides new insights into the roles of NE in social interactions.

  6. Evaluation of Advanced Stirling Convertor Net Heat Input Correlation Methods Using a Thermal Standard

    NASA Technical Reports Server (NTRS)

    Briggs, Maxwell; Schifer, Nicholas

    2011-01-01

    Test hardware was used to validate net heat input prediction models. The problem: net heat input cannot be measured directly during operation, yet it is a key parameter needed to predict convertor efficiency, where efficiency = electrical power output (measured) / net heat input (calculated). Efficiency is used to compare convertor designs and trade technology advantages for mission planning.

  7. AggNet: Deep Learning From Crowds for Mitosis Detection in Breast Cancer Histology Images.

    PubMed

    Albarqouni, Shadi; Baur, Christoph; Achilles, Felix; Belagiannis, Vasileios; Demirci, Stefanie; Navab, Nassir

    2016-05-01

    The lack of publicly available ground-truth data has been identified as the major challenge for transferring recent developments in deep learning to the biomedical imaging domain. Though crowdsourcing has enabled annotation of large-scale databases for real-world images, its application for biomedical purposes requires a deeper understanding and hence a more precise definition of the actual annotation task. The fact that expert tasks are being outsourced to non-expert users may lead to noisy annotations introducing disagreement between users. Despite being a valuable resource for learning annotation models from crowdsourcing, conventional machine-learning methods may have difficulties dealing with noisy annotations during training. In this manuscript, we present a new concept for learning from crowds that handles data aggregation directly as part of the learning process of the convolutional neural network (CNN) via an additional crowdsourcing layer (AggNet). In addition, we present an experimental study on learning from crowds designed to answer the following questions. 1) Can a deep CNN be trained with data collected from crowdsourcing? 2) How can the CNN be adapted to train on multiple types of annotation datasets (ground truth and crowd-based)? 3) How does the choice of annotation and aggregation affect the accuracy? Our experimental setup involved Annot8, a self-implemented web platform based on the Crowdflower API realizing image annotation tasks for a publicly available biomedical image database. Our results give valuable insights into the functionality of deep CNN learning from crowd annotations and demonstrate the necessity of integrating data aggregation.
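
    AggNet couples aggregation with CNN training; as a detached, minimal illustration of crowd-label aggregation itself, the sketch below performs one reliability-weighted voting step in the spirit of Dawid-Skene (toy data, far simpler than the paper's method).

        # One reliability-weighted voting step over toy crowd annotations;
        # AggNet's in-network aggregation layer is not reproduced here.
        import numpy as np

        votes = np.array([[1, 1, 0],   # rows: image patches (1 = "mitosis")
                          [0, 1, 0],   # columns: three crowd annotators
                          [1, 1, 1],
                          [0, 0, 1]])

        labels = (votes.mean(axis=1) >= 0.5).astype(int)       # majority vote
        reliability = (votes == labels[:, None]).mean(axis=0)  # annotator accuracy
        scores = votes @ reliability / reliability.sum()       # weighted re-vote
        print((scores >= 0.5).astype(int))                     # refined labels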

  8. Assessing NETS.T Performance in Teacher Candidates: Exploring the Wayfind Teacher Assessment

    ERIC Educational Resources Information Center

    Banister, Savilla; Vannatta Reinhart, Rachel

    2013-01-01

    To effectively integrate digital technologies in K-12 schools, teachers must be provided with undergraduate experiences that strongly support these integration resources and strategies. The National Educational Technology Standards for Teachers (NETS.T) provide a framework for teacher candidates and inservice teachers to identify their…

  9. Space Technology 5 (ST-5) Observations of the Imbalance of Region 1 and 2 Field-Aligned Currents

    NASA Technical Reports Server (NTRS)

    Le, Guan

    2010-01-01

    Space Technology 5 (ST-5) is a three micro-satellite constellation deployed into a 300 x 4500 km, dawn-dusk, sun-synchronous polar orbit from March 22 to June 21, 2006, for technology validations. In this study, we use the in-situ magnetic field observations from the Space Technology 5 mission to quantify the imbalance of Region 1 (R1) and Region 2 (R2) currents. During the three-month duration of the ST5 mission, geomagnetic conditions ranged from quiet to moderately active. We find that the R1 current intensity is consistently stronger than the R2 current intensity both for the dawnside and the duskside large-scale field-aligned current system. The net currents flowing into (out of) the ionosphere on the dawnside (duskside) are on the order of 5% of the total R1 currents. We also find that the net currents flowing into or out of the ionosphere are controlled by the solar wind-magnetosphere interaction in the same way as the field-aligned currents themselves are. Since the net currents due to the imbalance of the R1 and R2 currents require that their closure currents flow across the polar cap from dawn to dusk as Pedersen currents, our results indicate that the total amount of the cross-polar cap Pedersen currents is on the order of approx. 0.1 MA. This study, although with a very limited dataset, is one of the first attempts to quantify the cross-polar cap Pedersen currents. Given the importance of the Joule heating due to Pedersen currents to the high-latitude ionospheric electrodynamics, quantifying the cross-polar cap Pedersen currents and associated Joule heating is needed for developing models of the magnetosphere-ionosphere coupling.

  10. Route Advising in a Dynamic Environment - A High-Tech Approach

    NASA Astrophysics Data System (ADS)

    Firdhous, M. F. M.; Basnayake, D. L.; Kodithuwakku, K. H. L.; Hatthalla, N. K.; Charlin, N. W.; Bandara, P. M. R. I. K.

    Finding the optimal path between two locations in the city of Colombo is not a straightforward task, because of the complex road system, the huge traffic jams, etc. This paper presents a system to find the optimal driving direction between two locations within the Colombo city, considering road rules (one way, two way, or fully closed in both directions). The system contains three main modules (core, web, and mobile); additionally, there are two user interfaces, one for normal users and the other for administrative users. Both interfaces can be accessed using a web browser or a GPRS-enabled mobile phone. The system is developed based on Geographic Information System (GIS) technology. GIS is considered the best option to integrate hardware, software, and data for capturing, managing, analyzing, and displaying all forms of geographically referenced information. The core of the system is MapServer (MS4W), used along with other supporting technologies such as PostGIS, PostgreSQL, pgRouting, ASP.NET and C#.
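
    The routing core of such a system can be illustrated with pgRouting's pgr_dijkstra over a PostGIS road table; in the sketch below (connection string, table name and vertex ids hypothetical) one-way rules are expressed through the reverse_cost column, while the deployed system wraps similar queries behind its ASP.NET/C# services.

        # Hedged sketch of a pgRouting shortest-path query; names hypothetical.
        import psycopg2

        conn = psycopg2.connect("dbname=colombo_gis user=gis")
        with conn, conn.cursor() as cur:
            cur.execute(
                """
                SELECT seq, node, edge, cost
                FROM pgr_dijkstra(
                    'SELECT gid AS id, source, target, cost, reverse_cost FROM ways',
                    %s, %s, directed := true)
                """,
                (1001, 2042),  # start / end vertex ids
            )
            for seq, node, edge, cost in cur.fetchall():
                print(seq, node, edge, cost)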

  11. Development and realization of the open fault diagnosis system based on XPE

    NASA Astrophysics Data System (ADS)

    Deng, Hui; Wang, TaiYong; He, HuiLong; Xu, YongGang; Zeng, JuXiang

    2005-12-01

    To keep complex mechanical equipment in good service, the technology for realizing an embedded open system is introduced systematically, covering open hardware configuration, a customized embedded operating system and an open software structure. ETX technology is adopted in this system, integrating the CPU main-board functions and achieving quick, real-time signal acquisition and intelligent data analysis by applying a DSP and CPLD data acquisition card. Under the open configuration, signal bus modes such as PCI, ISA and PC/104 can be selected, and the types of signals can be chosen as well. In addition, by customizing the XPE system and adopting the EWF (Enhanced Write Filter), an authentically open system is realized and the stability of the system is enhanced. Multi-thread and multi-task programming techniques are adopted in the software development process. Interconnecting with the remote fault diagnosis center via the network interface, cooperative diagnosis is conducted and the degree of intelligence of the fault diagnosis is improved.

  12. Advanced Mirror Technology Development (AMTD) Project Status

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip

    2014-01-01

    To date, AMTD Phase 1 has accomplished all of its technical tasks on-schedule and on-budget. AMTD was awarded a Phase 2 contract. We are now performing Phase 2 tasks along with those tasks continued from Phase 1.

  13. Holmes: a graphical tool for development, simulation and analysis of Petri net based models of complex biological systems.

    PubMed

    Radom, Marcin; Rybarczyk, Agnieszka; Szawulak, Bartlomiej; Andrzejewski, Hubert; Chabelski, Piotr; Kozak, Adam; Formanowicz, Piotr

    2017-12-01

    Model development and analysis is a fundamental step in systems biology, and the theory of Petri nets offers a tool for this task. With the rapid development of computer science, a variety of tools for Petri nets have emerged, offering various analytical algorithms. From this follows the problem of using different programs to analyse a single model: many file formats and different representations of results make the analysis much harder. Especially for larger nets, the ability to visualize the results in a proper form is a huge help in understanding their significance. We present a new tool for Petri net development and analysis called Holmes. Our program contains algorithms for model analysis based on different types of Petri nets, e.g. an invariant generator, Maximum Common Transitions (MCT) sets and cluster modules, simulation algorithms, and knockout analysis tools. A very important feature is the ability to visualize the results of almost all analytical modules. The integration of such modules into one graphical environment allows a researcher to devote his or her time fully to model building and analysis. Available at http://www.cs.put.poznan.pl/mradom/Holmes/holmes.html. Contact: piotr@cs.put.poznan.pl. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
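
    Holmes' own invariant generator is not reproduced here; purely to illustrate what a T-invariant is, the sketch below finds vectors x with C·x = 0 (firing-count vectors that return the net to its initial marking) for a toy incidence matrix via sympy's nullspace, whereas production tools use dedicated algorithms.

        # Toy T-invariant illustration: solve C*x = 0 for a 3-place/3-transition
        # cycle; dedicated Petri net tools use specialised algorithms instead.
        from sympy import Matrix

        C = Matrix([[-1,  0,  1],   # rows = places, columns = transitions
                    [ 1, -1,  0],
                    [ 0,  1, -1]])

        for vec in C.nullspace():
            scale = max(term.q for term in vec)  # clear denominators
            print((vec * scale).T)               # [1 1 1]: fire each t once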

  14. Deep Convolutional Neural Networks for Computer-Aided Detection: CNN Architectures, Dataset Characteristics and Transfer Learning

    PubMed Central

    Hoo-Chang, Shin; Roth, Holger R.; Gao, Mingchen; Lu, Le; Xu, Ziyue; Nogues, Isabella; Yao, Jianhua; Mollura, Daniel

    2016-01-01

    Remarkable progress has been made in image recognition, primarily due to the availability of large-scale annotated datasets (i.e. ImageNet) and the revival of deep convolutional neural networks (CNN). CNNs enable learning data-driven, highly representative, layered hierarchical image features from sufficient training data. However, obtaining datasets as comprehensively annotated as ImageNet in the medical imaging domain remains a challenge. There are currently three major techniques that successfully apply CNNs to medical image classification: training the CNN from scratch, using off-the-shelf pre-trained CNN features, and conducting unsupervised CNN pre-training with supervised fine-tuning. Another effective method is transfer learning, i.e., fine-tuning CNN models (supervised) pre-trained on a natural image dataset to medical image tasks (although domain transfer between two medical image datasets is also possible). In this paper, we exploit three important, but previously understudied factors of employing deep convolutional neural networks for computer-aided detection problems. We first explore and evaluate different CNN architectures. The studied models contain 5 thousand to 160 million parameters, and vary in numbers of layers. We then evaluate the influence of dataset scale and spatial image context on performance. Finally, we examine when and why transfer learning from pre-trained ImageNet (via fine-tuning) can be useful. We study two specific computer-aided detection (CADe) problems, namely thoraco-abdominal lymph node (LN) detection and interstitial lung disease (ILD) classification. We achieve the state-of-the-art performance on mediastinal LN detection, with 85% sensitivity at 3 false positives per patient, and report the first five-fold cross-validation classification results on predicting axial CT slices with ILD categories. Our extensive empirical evaluation, CNN model analysis and valuable insights can be extended to the design of high-performance CAD systems for other medical imaging tasks. PMID:26886976
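
    As a hedged sketch of the fine-tuning recipe discussed above, the snippet below loads an ImageNet-pre-trained AlexNet from torchvision, freezes its convolutional features, and swaps in a new two-class head (class count and weight choice illustrative, e.g. lymph node present/absent).

        # Transfer learning sketch: freeze pre-trained features, fine-tune a new
        # head on a (hypothetical) two-class medical detection task.
        import torch
        import torch.nn as nn
        from torchvision import models

        model = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)
        for p in model.features.parameters():
            p.requires_grad = False               # keep natural-image features
        model.classifier[6] = nn.Linear(4096, 2)  # new task-specific head

        optimizer = torch.optim.SGD(
            (p for p in model.parameters() if p.requires_grad),
            lr=1e-3, momentum=0.9)  # unfreeze deeper layers later if data allows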

  15. AdaNET executive summary

    NASA Technical Reports Server (NTRS)

    Digman, R. Michael

    1988-01-01

    The goal of AdaNET is to transfer existing and emerging software engineering technology from the Federal government to the private sector. The views and perspectives of the current project participants on long and short term goals for AdaNET; organizational structure; resources and returns; summary of identified AdaNET services; and the summary of the organizational model currently under discussion are presented.

  16. Ertl and Non-Ertl amputees exhibit functional biomechanical differences during the sit-to-stand task.

    PubMed

    Ferris, Abbie E; Christiansen, Cory L; Heise, Gary D; Hahn, David; Smith, Jeremy D

    2017-05-01

    People with transtibial amputation stand ~50 times/day. There are two general approaches to transtibial amputation: 1) distal tibia and fibula union using a "bone-bridge" (Ertl), and 2) non-union of the tibia and fibula (Non-Ertl). The Ertl technique may improve functional outcomes by increasing the end-bearing ability of the residual limb. We hypothesized that individuals with an Ertl amputation would perform a five-time sit-to-stand task faster through greater involvement/end-bearing of the affected limb. Ertl (n=11) and Non-Ertl (n=7) participants sat on a chair with each foot on a separate force plate and performed the five-time sit-to-stand task. A symmetry index (intact vs affected limb) was calculated using peak ground reaction forces. The Ertl group performed the task significantly faster (9.33 (2.66) s vs 13.27 (2.83) s). The symmetry index (23.33 (23.83)% Ertl, 36.53 (13.51)% Non-Ertl) indicated that the intact limb of both groups produced more force than the affected limb. Ertl affected-limb peak ground reaction forces were significantly larger than Non-Ertl affected-limb forces. Peak knee power and net work of the affected limb were smaller than those of the respective intact limb for both groups. The Ertl intact limb produced significantly greater peak knee power and net work than the Non-Ertl intact knee. Although loading asymmetries existed between the intact and affected limbs of both groups, the Ertl group performed the task ~30% faster, driven by greater power and work production at the Ertl intact-limb knee. Our results suggest that functional differences exist between the procedures. Copyright © 2017 Elsevier Ltd. All rights reserved.
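
    Symmetry indices of this kind are commonly computed from the two limbs' peak ground reaction forces; one widely used form (the paper's exact definition is not reproduced here) is

        SI = \frac{F_{\text{intact}} - F_{\text{affected}}}{\tfrac{1}{2}\,(F_{\text{intact}} + F_{\text{affected}})} \times 100\%,

    so that SI = 0 indicates perfect symmetry and positive values indicate greater loading of the intact limb, consistent with the group values reported above.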

  17. Evolution of Self-Organized Task Specialization in Robot Swarms

    PubMed Central

    Ferrante, Eliseo; Turgut, Ali Emre; Duéñez-Guzmán, Edgar; Dorigo, Marco; Wenseleers, Tom

    2015-01-01

    Division of labor is ubiquitous in biological systems, as evidenced by various forms of complex task specialization observed in both animal societies and multicellular organisms. Although clearly adaptive, the way in which division of labor first evolved remains enigmatic, as it requires the simultaneous co-occurrence of several complex traits to achieve the required degree of coordination. Recently, evolutionary swarm robotics has emerged as an excellent test bed to study the evolution of coordinated group-level behavior. Here we use this framework for the first time to study the evolutionary origin of behavioral task specialization among groups of identical robots. The scenario we study involves an advanced form of division of labor, common in insect societies and known as “task partitioning”, whereby two sets of tasks have to be carried out in sequence by different individuals. Our results show that task partitioning is favored whenever the environment has features that, when exploited, reduce switching costs and increase the net efficiency of the group, and that an optimal mix of task specialists is achieved most readily when the behavioral repertoires aimed at carrying out the different subtasks are available as pre-adapted building blocks. Nevertheless, we also show for the first time that self-organized task specialization could be evolved entirely from scratch, starting only from basic, low-level behavioral primitives, using a nature-inspired evolutionary method known as Grammatical Evolution. Remarkably, division of labor was achieved merely by selecting on overall group performance, and without providing any prior information on how the global object retrieval task was best divided into smaller subtasks. We discuss the potential of our method for engineering adaptively behaving robot swarms and interpret our results in relation to the likely path that nature took to evolve complex sociality and task specialization. PMID:26247819

  18. Evolution of Self-Organized Task Specialization in Robot Swarms.

    PubMed

    Ferrante, Eliseo; Turgut, Ali Emre; Duéñez-Guzmán, Edgar; Dorigo, Marco; Wenseleers, Tom

    2015-08-01

    Division of labor is ubiquitous in biological systems, as evidenced by various forms of complex task specialization observed in both animal societies and multicellular organisms. Although clearly adaptive, the way in which division of labor first evolved remains enigmatic, as it requires the simultaneous co-occurrence of several complex traits to achieve the required degree of coordination. Recently, evolutionary swarm robotics has emerged as an excellent test bed to study the evolution of coordinated group-level behavior. Here we use this framework for the first time to study the evolutionary origin of behavioral task specialization among groups of identical robots. The scenario we study involves an advanced form of division of labor, common in insect societies and known as "task partitioning", whereby two sets of tasks have to be carried out in sequence by different individuals. Our results show that task partitioning is favored whenever the environment has features that, when exploited, reduce switching costs and increase the net efficiency of the group, and that an optimal mix of task specialists is achieved most readily when the behavioral repertoires aimed at carrying out the different subtasks are available as pre-adapted building blocks. Nevertheless, we also show for the first time that self-organized task specialization could be evolved entirely from scratch, starting only from basic, low-level behavioral primitives, using a nature-inspired evolutionary method known as Grammatical Evolution. Remarkably, division of labor was achieved merely by selecting on overall group performance, and without providing any prior information on how the global object retrieval task was best divided into smaller subtasks. We discuss the potential of our method for engineering adaptively behaving robot swarms and interpret our results in relation to the likely path that nature took to evolve complex sociality and task specialization.

  19. What College Instructors Can Do about Student Cyber-Slacking

    ERIC Educational Resources Information Center

    Flanigan, Abraham E.; Kiewra, Kenneth A.

    2018-01-01

    Today's traditional-aged college students are avid users of mobile technology. Commonly referred to as the Net Generation, today's college students spend several hours each day using their smart phones, iPads, and laptops. Although some scholars initially opined that the Net Generation would grow into technologically savvy digital natives who…

  20. 2D net shape weaving for cost effective manufacture of textile reinforced composites

    NASA Astrophysics Data System (ADS)

    Vo, D. M. P.; Kern, M.; Hoffmann, G.; Cherif, C.

    2017-10-01

    Despite significant weight and performance advantages over metal parts, today's demand for fibre-reinforced polymer composites (FRPC) has been limited mainly by their large manufacturing cost. The combination of dry textile preforms and low-cost consolidation processes such as resin transfer molding (RTM) has been identified as a promising approach to low-cost FRPC manufacture. At the current state of the art, tooling and impregnation technology is well understood, whereas preform fabrication technology has not been developed as effectively. This paper presents an advanced 2D net shape weaving technology developed with the aim of establishing a more cost-effective system for the manufacture of dry textile preforms for FRPC. 2D net shape weaving is based on open reed weave (ORW) technology and enables the manufacture of 2D contoured woven fabrics with firm edges, so that oversize cutting and hand trimming after molding are no longer required. The introduction of 2D net shape woven fabrics helps to reduce material waste, cycle time and preform manufacturing cost significantly. Furthermore, a higher degree of automation in preform fabrication can be achieved.

  1. Microfluidics cell sample preparation for analysis: Advances in efficient cell enrichment and precise single cell capture

    PubMed Central

    Bian, Shengtai; Cheng, Yinuo; Shi, Guanya; Liu, Peng; Ye, Xiongying

    2017-01-01

    Single cell analysis has received increasing attention recently in both academia and clinics, and there is an urgent need for effective upstream cell sample preparation. Two extremely challenging tasks in cell sample preparation—high-efficiency cell enrichment and precise single cell capture—have now entered into an era full of exciting technological advances, which are mostly enabled by microfluidics. In this review, we summarize the category of technologies that provide new solutions and creative insights into the two tasks of cell manipulation, with a focus on the latest development in the recent five years by highlighting the representative works. By doing so, we aim both to outline the framework and to showcase example applications of each task. In most cases for cell enrichment, we take circulating tumor cells (CTCs) as the target cells because of their research and clinical importance in cancer. For single cell capture, we review related technologies for many kinds of target cells because the technologies are supposed to be more universal to all cells rather than CTCs. Most of the mentioned technologies can be used for both cell enrichment and precise single cell capture. Each technology has its own advantages and specific challenges, which provide opportunities for researchers in their own area. Overall, these technologies have shown great promise and now evolve into real clinical applications. PMID:28217240

  2. Telementoring in education of laparoscopic surgeons: An emerging technology

    PubMed Central

    Bogen, Etai M; Augestad, Knut M; Patel, Hiten RH; Lindsetmo, Rolv-Ole

    2014-01-01

    Laparoscopy is minimally invasive, minimal-access surgery, and more and more surgeons are performing these advanced procedures. In this review we highlight several key emerging technologies, such as telementoring and virtual reality simulators, that provide solid ground for delivering surgical education to rural areas and allow young surgeons a safety net and confidence while operating with a newly learned technique. PMID:24944728

  3. Processing of chromatic information in a deep convolutional neural network.

    PubMed

    Flachot, Alban; Gegenfurtner, Karl R

    2018-04-01

    Deep convolutional neural networks are a class of machine-learning algorithms capable of solving non-trivial tasks, such as object recognition, with human-like performance. Little is known about the exact computations that deep neural networks learn, and to what extent these computations are similar to the ones performed by the primate brain. Here, we investigate how color information is processed in the different layers of the AlexNet deep neural network, originally trained on object classification of over 1.2M images of objects in their natural contexts. We found that the color-responsive units in the first layer of AlexNet learned linear features and were broadly tuned to two directions in color space, analogously to what is known of color responsive cells in the primate thalamus. Moreover, these directions are decorrelated and lead to statistically efficient representations, similar to the cardinal directions of the second-stage color mechanisms in primates. We also found, in analogy to the early stages of the primate visual system, that chromatic and achromatic information were segregated in the early layers of the network. Units in the higher layers of AlexNet exhibit on average a lower responsivity for color than units at earlier stages.

  4. Two-stream Convolutional Neural Network for Methane Emissions Quantification

    NASA Astrophysics Data System (ADS)

    Wang, J.; Ravikumar, A. P.; McGuire, M.; Bell, C.; Tchapmi, L. P.; Brandt, A. R.

    2017-12-01

    Methane, a key component of natural gas, has a 25-times-higher global warming potential than carbon dioxide on a 100-year basis. Accurate monitoring and mitigation of methane emissions require cost-effective detection and quantification technologies. Optical gas imaging, one of the most commonly used leak detection technologies, adopted by the Environmental Protection Agency, cannot estimate leak sizes. In this work, we harness advances in computer science to allow for rapid and automatic leak quantification. In particular, we utilize two-stream deep Convolutional Networks (ConvNets) to estimate leak size by capturing complementary spatial information from still plume frames and temporal information from plume motion between frames. We built large leak datasets for training and evaluation by collecting about 20 videos (i.e. 397,400 frames) of leaks. The videos were recorded at six distances from the source, covering 10-60 ft. Leak sources included natural gas well-heads, separators, and tanks. All frames were labeled with a true leak size, on eight levels ranging from 0 to 140 MCFH. Preliminary analysis shows that two-stream ConvNets provide a significant accuracy advantage over single-stream ConvNets. The spatial-stream ConvNet achieves an accuracy of 65.2% by extracting important features, including texture, plume area, and pattern. The temporal stream, fed by the results of optical flow analysis, results in an accuracy of 58.3%. The integration of the two-stream ConvNets gives a combined accuracy of 77.6%. For future work, we will split the training and testing datasets in distinct ways in order to test the generalization of the algorithm to different leak sources. Several analytic metrics, including confusion matrices and visualization of key features, will be used to understand accuracy rates and occurrences of false positives. The quantification algorithm can help find and fix super-emitters and improve the cost-effectiveness of leak detection and repair programs.
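
    As a hedged sketch of the two-stream design described above (architectures illustrative, not the authors' networks), the snippet below runs a spatial CNN on a still plume frame and a temporal CNN on stacked optical-flow fields, then fuses the two by averaging their scores over the eight leak-size levels.

        # Illustrative two-stream ConvNet with late fusion over 8 leak levels.
        import torch
        import torch.nn as nn

        def stream(in_ch, n_classes=8):
            return nn.Sequential(
                nn.Conv2d(in_ch, 32, 3, stride=2), nn.ReLU(),
                nn.Conv2d(32, 64, 3, stride=2), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                nn.Linear(64, n_classes))

        spatial = stream(3)    # still plume frame (3-channel image)
        temporal = stream(10)  # e.g. 5 stacked 2-channel optical-flow fields

        frame = torch.randn(4, 3, 224, 224)
        flow = torch.randn(4, 10, 224, 224)
        logits = 0.5 * spatial(frame) + 0.5 * temporal(flow)  # late fusion
        print(logits.argmax(dim=1))  # predicted leak-size level per clip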

  5. Preserving with Prisms: Producing Nets

    ERIC Educational Resources Information Center

    Prummer, Kathy E.; Amador, Julie M.; Wallin, Abraham J.

    2016-01-01

    Two mathematics teachers in a small rural school decided to create a task that would engage seventh graders. The goal of the real-world activity was to help students develop geometric and spatial reasoning and to support their understanding of volume of rectangular prisms. The impetus for the task came from the teachers' desire to engage students…

  6. Risk Reduction and Resource Pooling on a Cooperation Task

    ERIC Educational Resources Information Center

    Pietras, Cynthia J.; Cherek, Don R.; Lane, Scott D.; Tcheremissine, Oleg

    2006-01-01

    Two experiments investigated choice in adult humans on a simulated cooperation task to evaluate a risk-reduction account of sharing based on the energy-budget rule. The energy-budget rule is an optimal foraging model that predicts risk-averse choices when net energy gains exceed energy requirements (positive energy budget) and risk-prone choices…

  7. XML schemas for common bioinformatic data types and their application in workflow systems.

    PubMed

    Seibel, Philipp N; Krüger, Jan; Hartmeier, Sven; Schwarzer, Knut; Löwenthal, Kai; Mersch, Henning; Dandekar, Thomas; Giegerich, Robert

    2006-11-06

    Today, there is a growing need in bioinformatics to combine available software tools into chains, thus building complex applications from existing single-task tools. To create such workflows, the tools involved have to be able to work with each other's data--therefore, a common set of well-defined data formats is needed. Unfortunately, current bioinformatic tools use a great variety of heterogeneous formats. Acknowledging the need for common formats, the Helmholtz Open BioInformatics Technology network (HOBIT) identified several basic data types used in bioinformatics and developed appropriate format descriptions, formally defined by XML schemas, and incorporated them in a Java library (BioDOM). These schemas currently cover sequence, sequence alignment, RNA secondary structure and RNA secondary structure alignment formats in a form that is independent of any specific program, thus enabling seamless interoperation of different tools. All XML formats are available at http://bioschemas.sourceforge.net, the BioDOM library can be obtained at http://biodom.sourceforge.net. The HOBIT XML schemas and the BioDOM library simplify adding XML support to newly created and existing bioinformatic tools, enabling these tools to interoperate seamlessly in workflow scenarios.
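
    A small sketch of the interoperability idea the abstract describes: validating one tool's XML output against a schema before handing it to the next tool in a workflow. The lxml library and the file names are assumptions; the HOBIT schemas themselves live at the sourceforge URL above.

    # Sketch: validate a tool's XML output against an XML schema before
    # passing it downstream. Uses lxml; 'sequence.xsd' and 'output.xml'
    # are placeholder file names.
    from lxml import etree

    schema = etree.XMLSchema(etree.parse("sequence.xsd"))
    doc = etree.parse("output.xml")

    if schema.validate(doc):
        print("document conforms to the schema; safe to feed the next tool")
    else:
        for err in schema.error_log:
            print(err.line, err.message)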

  8. Scientific Data Storage for Cloud Computing

    NASA Astrophysics Data System (ADS)

    Readey, J.

    2014-12-01

    Traditionally, data storage for geophysical software systems has centered on file-based systems and libraries such as NetCDF and HDF5. In contrast, cloud-based infrastructure providers such as Amazon AWS, Microsoft Azure, and the Google Cloud Platform generally provide storage technologies based on an object storage service (for large binary objects) complemented by a database service (for small objects that can be represented as key-value pairs). These systems have been shown to be highly scalable, reliable, and cost effective. We will discuss a proposed system that leverages these cloud-based storage technologies to provide an API-compatible library for traditional NetCDF and HDF5 applications. This system will enable cloud storage suitable for geophysical applications that can scale up to petabytes of data and thousands of users. We'll also cover other advantages of this system such as enhanced metadata search.
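
    A sketch of the storage split the abstract describes: large binary array chunks go to an object store, small dataset metadata goes to a key-value record. boto3 and the bucket name are assumptions (credentials and the bucket are assumed to exist); any object store with put/get semantics would do.

    # Sketch: array chunks in an object store, metadata as a small JSON record.
    # Bucket name is a placeholder; this is not the system's actual API.
    import json
    import numpy as np
    import boto3

    s3 = boto3.client("s3")
    BUCKET = "geodata-example"          # assumed, pre-existing bucket

    def put_chunk(dataset: str, index: int, chunk: np.ndarray) -> None:
        s3.put_object(Bucket=BUCKET,
                      Key=f"{dataset}/chunk-{index:06d}",
                      Body=chunk.tobytes())

    def put_metadata(dataset: str, meta: dict) -> None:
        s3.put_object(Bucket=BUCKET,
                      Key=f"{dataset}/meta.json",
                      Body=json.dumps(meta).encode())

    temps = np.random.rand(4, 1000).astype("float32")
    for i, row in enumerate(temps):
        put_chunk("sst-demo", i, row)
    put_metadata("sst-demo", {"dtype": "float32", "shape": [4, 1000],
                              "units": "degC", "chunks": 4})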

  9. Risk assessment for invasive species produces net bioeconomic benefits

    PubMed Central

    Keller, Reuben P.; Lodge, David M.; Finnoff, David C.

    2007-01-01

    International commerce in live organisms presents a policy challenge for trade globalization; sales of live organisms create wealth, but some nonindigenous species cause harm. To reduce damage, some countries have implemented species screening to limit the introduction of damaging species. Adoption of new risk assessment (RA) technologies has been slowed, however, by concerns that RA accuracy remains insufficient to produce positive net economic benefits. This concern arises because only a small proportion of all introduced species escape, spread, and cause harm (i.e., become invasive), so a RA will exclude many noninvasive species (which provide a net economic benefit) for every invasive species correctly identified. Here, we develop a simple cost:benefit bioeconomic framework to quantify the net benefits from applying species prescreening. Because invasive species are rarely eradicated, and their damages must therefore be borne for long periods, we have projected the value of RA over a suitable range of policy time horizons (10–500 years). We apply the model to the Australian plant quarantine program and show that this RA program produces positive net economic benefits over the range of reasonable assumptions. Because we use low estimates of the financial damage caused by invasive species and high estimates of the value of species in the ornamental trade, our results underestimate the net benefit of the Australian plant quarantine program. In addition, because plants have relatively low rates of invasion, applying screening protocols to animals would likely demonstrate even greater benefits. PMID:17190819
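
    A stylized sketch of the cost:benefit logic described above: the net benefit of screening is the net present value of avoided invasion damages minus the net present value of trade income lost by falsely excluding benign species. All numbers are illustrative placeholders, not the paper's Australian calibration.

    # Stylized cost:benefit sketch; every number here is a made-up placeholder.
    def npv(annual_amount: float, years: int, rate: float = 0.05) -> float:
        """Net present value of a constant annual amount over a horizon."""
        return sum(annual_amount / (1 + rate) ** t for t in range(1, years + 1))

    def net_benefit_of_screening(n_species=100, p_invasive=0.05, accuracy=0.85,
                                 value_per_species=10_000,
                                 damage_per_invader=500_000, horizon=100):
        invaders = n_species * p_invasive
        benign = n_species - invaders
        # Screening: lose the trade value of falsely rejected benign species,
        # but avoid the long-lived damages of correctly rejected invaders.
        lost_trade = npv(benign * (1 - accuracy) * value_per_species, horizon)
        avoided_damage = npv(invaders * accuracy * damage_per_invader, horizon)
        return avoided_damage - lost_trade

    print(f"net benefit over 100 years: ${net_benefit_of_screening():,.0f}")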

  11. Ohio SchoolNet. Schools on the Move.

    ERIC Educational Resources Information Center

    Ohio State Dept. of Education, Columbus.

    SchoolNet is a state-funded partnership that will facilitate the installation of computer and communications networking technology in public schools and classrooms across Ohio and coordinate its use. SchoolNet seeks to provide Ohio students with expanded course offerings; more individualized educational opportunities; interactive learning…

  12. Manufacturing Cost Levelization Model – A User’s Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morrow, William R.; Shehabi, Arman; Smith, Sarah Josephine

    The Manufacturing Cost Levelization Model is a cost-performance techno-economic model that estimates the total large-scale manufacturing costs necessary to produce a given product. It is designed to provide production cost estimates for technology researchers to help guide research and development towards an eventual cost-effective product. The model presented in this user’s guide is generic and can be tailored to the manufacturing of any product, including the generation of electricity (as a product). This flexibility, however, requires the user to develop the processes and process efficiencies that represent a full-scale manufacturing facility. The generic model comprises several modules that estimate variable costs (material, labor, and operating), fixed costs (capital and maintenance), financing structures (debt and equity financing), and tax implications (taxable income after equipment and building depreciation, debt interest payments, and expenses) of a notional manufacturing plant. A cash-flow method is used to estimate a selling price necessary for the manufacturing plant to recover its total cost of production. A levelized unit sales price ($ per unit of product) is determined by dividing the net present value of the manufacturing plant’s expenses ($) by the net present value of its product output. A user-defined production schedule drives the cash-flow method that determines the levelized unit price. In addition, an analyst can increase the levelized unit price to include a gross profit margin to estimate a product sales price. This model allows an analyst to understand the effect that any input variable could have on the cost of manufacturing a product. In addition, the tool is able to perform sensitivity analysis, which can be used to identify the key variables and assumptions that have the greatest influence on the levelized costs. This component is intended to help technology researchers focus their research attention on tasks that offer the greatest opportunities for cost reduction early in the research and development stages of technology invention.
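
    The levelization rule described above fits in a few lines: the levelized unit price is the net present value of annual expenses divided by the net present value of annual product output, optionally marked up by a gross margin. The discount rate, cash flows, and margin below are placeholders.

    # Sketch of the levelization rule: levelized unit price =
    # NPV(annual expenses) / NPV(annual product output). Inputs are placeholders.
    def npv(flows, rate=0.08):
        return sum(f / (1 + rate) ** t for t, f in enumerate(flows, start=1))

    years = 10
    expenses = [5_000_000] + [1_200_000] * (years - 1)   # capital, then O&M ($/yr)
    output = [0] + [400_000] * (years - 1)               # units/yr (no output in year 1)

    levelized_price = npv(expenses) / npv(output)        # $ per unit of product
    sales_price = levelized_price * 1.15                 # with a 15% gross margin
    print(f"levelized: ${levelized_price:.2f}/unit, with margin: ${sales_price:.2f}/unit")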

  13. Selective Convolutional Descriptor Aggregation for Fine-Grained Image Retrieval.

    PubMed

    Wei, Xiu-Shen; Luo, Jian-Hao; Wu, Jianxin; Zhou, Zhi-Hua

    2017-06-01

    Deep convolutional neural network models pre-trained for the ImageNet classification task have been successfully adopted to tasks in other domains, such as texture description and object proposal generation, but these tasks require annotations for images in the new domain. In this paper, we focus on a novel and challenging task in the pure unsupervised setting: fine-grained image retrieval. Even with image labels, fine-grained images are difficult to classify, let alone the unsupervised retrieval task. We propose the selective convolutional descriptor aggregation (SCDA) method. The SCDA first localizes the main object in fine-grained images, a step that discards the noisy background and keeps useful deep descriptors. The selected descriptors are then aggregated and the dimensionality is reduced into a short feature vector using the best practices we found. The SCDA is unsupervised, using no image label or bounding box annotation. Experiments on six fine-grained data sets confirm the effectiveness of the SCDA for fine-grained image retrieval. In addition, visualization of the SCDA features shows that they correspond to visual attributes (even subtle ones), which might explain SCDA's high mean average precision in fine-grained retrieval. Moreover, on general image retrieval data sets, the SCDA achieves retrieval results comparable with the state-of-the-art general image retrieval approaches.
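
    A sketch of the core select-then-aggregate step, under the assumption (consistent with the abstract) that positions whose channel-summed activation exceeds the mean are kept as "object" and the surviving descriptors are average- and max-pooled; the random array stands in for a real conv feature map.

    # Sketch of selective descriptor aggregation on a stand-in feature map.
    import numpy as np

    feat = np.random.rand(512, 14, 14)        # (channels, height, width) descriptors

    # Selection: sum over channels; positions above the mean are treated as
    # the main object, the rest as background and discarded.
    activation = feat.sum(axis=0)             # (14, 14) aggregation map
    mask = activation > activation.mean()

    selected = feat[:, mask]                  # (512, n_kept) surviving descriptors

    # Aggregation: concatenate average- and max-pooling of the kept descriptors.
    descriptor = np.concatenate([selected.mean(axis=1), selected.max(axis=1)])
    descriptor /= np.linalg.norm(descriptor)  # L2-normalize for retrieval
    print(descriptor.shape)                   # (1024,) image-level feature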

  14. BP fusion model for the detection of oil spills on the sea by remote sensing

    NASA Astrophysics Data System (ADS)

    Chen, Weiwei; An, Jubai; Zhang, Hande; Lin, Bin

    2003-06-01

    Oil spills are a serious form of marine pollution in many countries. To detect and identify oil spilled on the sea by remote sensor, scientists must analyze remote sensing images. For the detection of oil spills on the sea, edge detection is an important technology in image processing. Many edge detection algorithms have been developed for image processing, each with its own advantages and disadvantages. Based on the primary requirements of edge detection in oil spill images on the sea, computation time and detection accuracy, we developed a fusion model. The model employs a BP neural net to fuse the detection results of simple operators. The reason we selected a BP neural net as the fusion technology is that the relation between the edge gray levels produced by simple operators and the image's true edge gray level is nonlinear, and BP neural nets are good at solving nonlinear identification problems. We therefore trained a BP neural net on a set of oil spill images, then applied the BP fusion model to the edge detection of other oil spill images and obtained good results. In this paper the detection results of several gradient operators and the Laplacian operator are also compared with the results of the BP fusion model to analyze the fusion effect. Finally, the paper points out that the fusion model has higher accuracy and higher speed in oil spill image edge detection.
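
    A minimal sketch of the fusion idea: per-pixel responses of simple operators (two Sobel directions and a Laplacian) are fed to a small backpropagation-trained net that learns the nonlinear mapping to the true edge map. scikit-learn, scipy, and the synthetic step-edge image are stand-ins for the paper's imagery and network.

    # Sketch: fuse simple edge operators with a small backprop-trained net.
    import numpy as np
    from scipy import ndimage
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    img = np.zeros((64, 64)); img[:, 32:] = 1.0          # a vertical step edge
    img += rng.normal(scale=0.05, size=img.shape)        # sensor-like noise

    features = np.stack([ndimage.sobel(img, axis=0),
                         ndimage.sobel(img, axis=1),
                         ndimage.laplace(img)], axis=-1).reshape(-1, 3)
    target = np.zeros((64, 64)); target[:, 31:33] = 1.0  # "true" edge map
    target = target.ravel()

    fusion = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
    fusion.fit(features, target)                         # backpropagation fusion
    edge_map = fusion.predict(features).reshape(64, 64)  # fused edge strength
    print(edge_map[:, 30:34].mean(axis=0).round(2))      # strongest near the edge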

  15. Integrated technology wing design study

    NASA Technical Reports Server (NTRS)

    Hays, A. P.; Beck, W. E.; Morita, W. H.; Penrose, B. J.; Skarshaug, R. E.; Wainfan, B. S.

    1984-01-01

    The technology development costs and associated benefits in applying advanced technology associated with the design of a new wing for a new or derivative trijet with a capacity for 350 passengers and maximum range of 8519 km, entering service in 1990 were studied. The areas of technology are: (1) airfoil technology; (2) planform parameters; (3) high lift; (4) pitch active control system; (5) all electric systems; (6) E to 3rd power propulsion; (7) airframe/propulsion integration; (8) graphite/epoxy composites; (9) advanced aluminum alloys; (10) titanium alloys; and (11) silicon carbide/aluminum composites. These technologies were applied to the reference aircraft configuration. Payoffs were determined for block fuel reductions and net value of technology. These technologies are ranked for the ratio of net value of technology (NVT) to technology development costs.

  16. The Evolution of Technology: A Decade of Surfing the Net

    ERIC Educational Resources Information Center

    Berger, Sandra

    2005-01-01

    The world was a different place when "Understanding Our Gifted" introduced "Surfing the Net" in 1994 as a regular feature. Since then, technology and the Internet have become part of people's culture, permeating almost every aspect of their lives. The Internet has greatly changed the way they conduct business and communicate with friends, it helps…

  17. Standards for the 21st-Century Learner: Comparisons with NETS and State Standards

    ERIC Educational Resources Information Center

    Pappas, Marjorie

    2008-01-01

    The American Association of School Librarians (AASL) and the International Society for Technology in Education (ISTE) have both recently launched new standards. These are known as the "AASL Standards for the 21st-Century Learner" and "The National Educational Technology Standards for Students: The Next Generation" (NETS). The standards from each…

  18. Qualitative analysis of programmatic initiatives to text patients with mobile devices in resource-limited health systems.

    PubMed

    Garg, Sachin K; Lyles, Courtney R; Ackerman, Sara; Handley, Margaret A; Schillinger, Dean; Gourley, Gato; Aulakh, Veenu; Sarkar, Urmimala

    2016-02-06

    Text messaging is an affordable, ubiquitous, and expanding mobile communication technology. However, safety net health systems in the United States that provide more care to uninsured and low-income patients may face additional financial and infrastructural challenges in utilizing this technology. Formative evaluations of texting implementation experiences are limited. We interviewed safety net health systems piloting texting initiatives to study facilitators and barriers to real-world implementation. We conducted telephone interviews with various stakeholders who volunteered from each of the eight California-based safety net systems that received external funding to pilot a texting-based program of their choosing to serve a primary care need. We developed a semi-structured interview guide based partly on the Consolidated Framework for Implementation Research (CFIR), which encompasses several domains: the intervention, individuals involved, contextual factors, and implementation process. We inductively and deductively (using CFIR) coded transcripts, and categorized themes into facilitators and barriers. We performed eight interviews (one interview per pilot site). Five sites had no prior texting experience. Sites applied texting for programs related to medication adherence and monitoring, appointment reminders, care coordination, and health education and promotion. No site texted patient-identifying health information, and most sites manually obtained informed consent from each participating patient. Facilitators of implementation included perceived enthusiasm from patients, staff and management belief that texting is patient-centered, and the early identification of potential barriers through peer collaboration among grantees. Navigating government regulations that protect patient privacy and guide the handling of protected health information emerged as a crucial barrier. A related technical challenge in five sites was the labor-intensive tracking and documenting of texting communications due to an inability to integrate texting platforms with electronic health records. Despite enthusiasm for the texting programs from the involved individuals and organizations, inadequate data management capabilities and unclear privacy and security regulations for mobile health technology slowed initial implementation, limited the clinical use of texting in the safety net, and narrowed the scope of the pilots. Future implementation work and research should investigate how different texting platforms and intervention designs affect efficacy, as well as explore issues that may affect sustainability and scalability.

  19. Sparse Forward-Backward for Fast Training of Conditional Random Fields

    DTIC Science & Technology

    2006-01-01

    …task, the NetTalk text-to-speech data set [5], we can now train a conditional random field (CRF) in about 6 hours for which training previously…

  20. Simple and efficient machine learning frameworks for identifying protein-protein interaction relevant articles and experimental methods used to study the interactions.

    PubMed

    Agarwal, Shashank; Liu, Feifan; Yu, Hong

    2011-10-03

    Protein-protein interaction (PPI) is an important biomedical phenomenon. Automatically detecting PPI-relevant articles and identifying methods that are used to study PPI are important text mining tasks. In this study, we have explored domain independent features to develop two open source machine learning frameworks. One performs binary classification to determine whether the given article is PPI relevant or not, named "Simple Classifier", and the other one maps the PPI relevant articles with corresponding interaction method nodes in a standardized PSI-MI (Proteomics Standards Initiative-Molecular Interactions) ontology, named "OntoNorm". We evaluated our system in the context of BioCreative challenge competition using the standardized data set. Our systems are amongst the top systems reported by the organizers, attaining 60.8% F1-score for identifying relevant documents, and 52.3% F1-score for mapping articles to interaction method ontology. Our results show that domain-independent machine learning frameworks can perform competitively well at the tasks of detecting PPI relevant articles and identifying the methods that were used to study the interaction in such articles. Simple Classifier is available at http://sourceforge.net/p/simpleclassify/home/ and OntoNorm at http://sourceforge.net/p/ontonorm/home/.
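
    A minimal sketch of PPI-relevance classification in the spirit of the "Simple Classifier" above; the TF-IDF features, logistic regression, and toy documents are stand-ins for the paper's domain-independent features and training corpus.

    # Sketch: binary classification of PPI-relevant vs. irrelevant article text.
    from sklearn.pipeline import make_pipeline
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    train_docs = [
        "The bait protein binds RAD51 in a yeast two-hybrid assay.",
        "Co-immunoprecipitation confirmed the interaction between p53 and MDM2.",
        "We report the crystal structure of a monomeric enzyme.",
        "A survey of hospital readmission rates across regions.",
    ]
    labels = [1, 1, 0, 0]  # 1 = PPI relevant, 0 = not

    clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                        LogisticRegression(max_iter=1000))
    clf.fit(train_docs, labels)
    print(clf.predict(["Pull-down assays show that A interacts with B."]))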

  1. 75 FR 39621 - Proposed Information Collection (Income-Net Worth and Employment Statement) Activity: Comment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-09

    ... DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-0002] Proposed Information Collection (Income-Net Worth and Employment Statement) Activity: Comment Request AGENCY: Veterans Benefits Administration... techniques or the use of other forms of information technology. Title: Income-Net Worth and Employment...

  2. Turbine Inlet Air Cooling for Industrial and Aero-derivative Gas Turbine in Malaysia Climate

    NASA Astrophysics Data System (ADS)

    Nordin, A.; Salim, D. A.; Othoman, M. A.; Kamal, S. N. Omar; Tam, Danny; Yusof, M. KY

    2017-12-01

    The performance of a gas turbine is dependent on the ambient temperature. A higher temperature results in a reduction of the gas turbine’s power output and an increase in heat rate. The warm and humid climate in Malaysia with its high ambient air temperature has an adverse effect on the performance of gas turbine generators. In this paper, the expected effect of turbine inlet air cooling technology on the annual performance of an aero-derivative gas turbine (GE LM6000PD) is compared against that of an industrial gas turbine (GEFr6B.03) using GT Pro software. This study investigated the annual net energy output and the annual net electrical efficiency of a plant with and without turbine inlet air cooling technology. The results show that the aero-derivative gas turbine responds more favorably to turbine inlet air cooling technology, thereby yielding higher annual net energy output and higher net electrical efficiency when compared to the industrial gas turbine.

  3. H3ABioNet, a sustainable pan-African bioinformatics network for human heredity and health in Africa

    PubMed Central

    Mulder, Nicola J.; Adebiyi, Ezekiel; Alami, Raouf; Benkahla, Alia; Brandful, James; Doumbia, Seydou; Everett, Dean; Fadlelmola, Faisal M.; Gaboun, Fatima; Gaseitsiwe, Simani; Ghazal, Hassan; Hazelhurst, Scott; Hide, Winston; Ibrahimi, Azeddine; Jaufeerally Fakim, Yasmina; Jongeneel, C. Victor; Joubert, Fourie; Kassim, Samar; Kayondo, Jonathan; Kumuthini, Judit; Lyantagaye, Sylvester; Makani, Julie; Mansour Alzohairy, Ahmed; Masiga, Daniel; Moussa, Ahmed; Nash, Oyekanmi; Ouwe Missi Oukem-Boyer, Odile; Owusu-Dabo, Ellis; Panji, Sumir; Patterton, Hugh; Radouani, Fouzia; Sadki, Khalid; Seghrouchni, Fouad; Tastan Bishop, Özlem; Tiffin, Nicki; Ulenga, Nzovu

    2016-01-01

    The application of genomics technologies to medicine and biomedical research is increasing in popularity, made possible by new high-throughput genotyping and sequencing technologies and improved data analysis capabilities. Some of the greatest genetic diversity among humans, animals, plants, and microbiota occurs in Africa, yet genomic research outputs from the continent are limited. The Human Heredity and Health in Africa (H3Africa) initiative was established to drive the development of genomic research for human health in Africa, and through recognition of the critical role of bioinformatics in this process, spurred the establishment of H3ABioNet, a pan-African bioinformatics network for H3Africa. The limitations in bioinformatics capacity on the continent have been a major contributory factor to the lack of notable outputs in high-throughput biology research. Although pockets of high-quality bioinformatics teams have existed previously, the majority of research institutions lack experienced faculty who can train and supervise bioinformatics students. H3ABioNet aims to address this dire need, specifically in the area of human genetics and genomics, but knock-on effects are ensuring this extends to other areas of bioinformatics. Here, we describe the emergence of genomics research and the development of bioinformatics in Africa through H3ABioNet. PMID:26627985

  4. Neural networks for structural design - An integrated system implementation

    NASA Technical Reports Server (NTRS)

    Berke, Laszlo; Hafez, Wassim; Pao, Yoh-Han

    1992-01-01

    The development of powerful automated procedures to aid the creative designer is becoming increasingly critical for complex design tasks. In the work described here Artificial Neural Nets are applied to acquire structural analysis and optimization domain expertise. Based on initial instructions from the user an automated procedure generates random instances of structural analysis and/or optimization 'experiences' that cover a desired domain. It extracts training patterns from the created instances, constructs and trains an appropriate network architecture and checks the accuracy of net predictions. The final product is a trained neural net that can estimate analysis and/or optimization results instantaneously.
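
    A sketch of the train-on-generated-experiences loop described above: sample random design instances, run an "analysis" on each, fit a net, and use it for instantaneous estimates. The closed-form cantilever deflection stands in for a real structural analysis code, and scikit-learn for the network.

    # Sketch: neural-net surrogate trained on generated analysis instances.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(1)
    n = 2000
    load = rng.uniform(1e3, 1e4, n)        # tip load P [N]
    length = rng.uniform(1.0, 5.0, n)      # span L [m]
    inertia = rng.uniform(1e-6, 1e-4, n)   # second moment I [m^4]
    E = 200e9                              # Young's modulus (steel) [Pa]

    # "Analysis experience": cantilever tip deflection, delta = P L^3 / (3 E I).
    deflection = load * length**3 / (3 * E * inertia)

    X = np.column_stack([load, length, inertia])
    net = make_pipeline(StandardScaler(),
                        MLPRegressor(hidden_layer_sizes=(32, 32),
                                     max_iter=3000, random_state=1))
    net.fit(X, np.log(deflection))         # log target tames the dynamic range

    est = np.exp(net.predict([[5e3, 3.0, 1e-5]]))[0]
    true = 5e3 * 3.0**3 / (3 * E * 1e-5)
    print(f"net estimate: {est:.3e} m, exact analysis: {true:.3e} m")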

  5. The Scenario Analysis Tool Suite: A User’s Guide

    DTIC Science & Technology

    2009-01-01

    …be exported at any stage and continued manually. The free, open-source integrated development environment (IDE) NetBeans [14] was used in the creation… [14] Sun Microsystems & CollabNet (2008) NetBeans IDE 6.0, http://www.netbeans.org.

  6. A standard set of upper extremity tasks for evaluating rehabilitation interventions for individuals with complete arm paralysis

    PubMed Central

    Cornwell, Andrew S.; Liao, James Y.; Bryden, Anne M.; Kirsch, Robert F.

    2013-01-01

    We have developed a set of upper extremity functional tasks to guide the design and test the performance of rehabilitation technologies that restore arm motion in people with high tetraplegia. Our goal was to develop a short set of tasks that would be representative of a much larger set of activities of daily living while also being feasible for a unilateral user of an implanted Functional Electrical Stimulation (FES) system. To compile this list of tasks, we reviewed existing clinical outcome measures related to arm and hand function, and were further informed by surveys of patient desires. We ultimately selected a set of five tasks that captured the most common components of movement seen in these tasks, making them highly relevant for assessing FES-restored unilateral arm function in individuals with high cervical spinal cord injury (SCI). The tasks are intended to be used when setting design specifications and for evaluation and standardization of rehabilitation technologies under development. While not unique, this set of tasks will provide a common basis for comparing different interventions (e.g., FES, powered orthoses, robotic assistants) and testing different user command interfaces (e.g., sip-and-puff, head joysticks, brain-computer interfaces). PMID:22773199

  7. How five leading safety-net hospitals are preparing for the challenges and opportunities of health care reform.

    PubMed

    Coughlin, Teresa A; Long, Sharon K; Sheen, Edward; Tolbert, Jennifer

    2012-08-01

    Safety-net hospitals will continue to play a critical role in the US health care system, as they will need to care for the more than twenty-three million people who are estimated to remain uninsured after the Affordable Care Act is implemented. Yet such hospitals will probably have less federal and state support for uncompensated care. At the same time, safety-net hospitals will need to reposition themselves in the marketplace to compete effectively for newly insured people who will have a choice of providers. We examine how five leading safety-net hospitals have begun preparing for reform. Building upon strong organizational attributes such as health information technology and system integration, the study hospitals' preparations include improving the efficiency and quality of care delivery, retaining current and attracting new patients, and expanding the medical home model.

  8. A Prototype Web-based system for GOES-R Space Weather Data

    NASA Astrophysics Data System (ADS)

    Sundaravel, A.; Wilkinson, D. C.

    2010-12-01

    The Geostationary Operational Environmental Satellite-R Series (GOES-R) makes use of advanced instruments and technologies to monitor the Earth's surface and provide accurate space weather data. The first GOES-R series satellite is scheduled to be launched in 2015. The data from the satellite will be widely used by scientists for space weather modeling and prediction. This project looks into how these datasets can be made available to scientists on the Web and how to assist them in their research. We are developing a prototype Web-based system that allows users to browse, search, and download these data. The GOES-R datasets will be archived in NetCDF (Network Common Data Form) and CSV (Comma Separated Values) format. NetCDF is a self-describing data format that contains both the metadata information and the data, stored in an array-oriented fashion. The Web-based system will offer services in two ways: via a web application (portal) and via web services. Using the web application, users can download data in NetCDF or CSV format and can also plot a graph of the data. The web page displays the various categories of data and the time intervals for which the data are available. The web application (client) sends the user query to the server, which then connects to the data sources to retrieve the data and delivers it to the users. Data access will also be provided via SOAP (Simple Object Access Protocol) and REST (Representational State Transfer) web services. These provide functions which can be used by other applications to fetch data and use it for further processing. To build the prototype system, we are making use of proxy data from existing GOES and POES space weather datasets. Java is the programming language used in developing tools that format data to NetCDF and CSV. For the web technology we have chosen Grails to develop both the web application and the services. Grails is an open-source web application framework based on the Groovy language. We are also making use of the THREDDS (Thematic Realtime Environmental Distributed Data Services) server to publish and access the NetCDF files. We have completed developing software tools to generate NetCDF and CSV data files, as well as tools to translate NetCDF to CSV. The current phase of the project involves designing and developing the web interface.
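
    The project's format-translation tools are written in Java; as a compact illustration of the same NetCDF-to-CSV step, here is a sketch using the Python netCDF4 library. The file name and the variable names ('time', 'proton_flux') are placeholders.

    # Sketch: translate two NetCDF variables into a CSV file.
    import csv
    from netCDF4 import Dataset

    with Dataset("goes_proxy.nc") as nc, \
         open("goes_proxy.csv", "w", newline="") as f:
        time = nc.variables["time"][:]
        flux = nc.variables["proton_flux"][:]
        writer = csv.writer(f)
        writer.writerow(["time", "proton_flux"])
        writer.writerows(zip(time, flux))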

  9. Low-Cost, Net-Shape Ceramic Radial Turbine Program

    DTIC Science & Technology

    1985-05-01

    …processing iterations. Program management and materials characterization were conducted at Garrett Turbine Engine Company (GTEC); test bar and rotor… automotive gas turbine engine rotor development efforts at ACC. This is the final technical report of the Low-Cost, Net-Shape Ceramic Radial Turbine program.

  10. Scientific and Technical Support for the Galileo Net Flux Radiometer Experiment

    NASA Technical Reports Server (NTRS)

    Sromovsky, Lawrence A.

    1997-01-01

    This report describes work in support of the Galileo Net Flux Radiometer (NFR), an instrument mounted on the Galileo probe, a spacecraft designed for entry into and direct measurements of Jupiter's atmosphere. Tasks originally proposed for the post launch period are briefly as follows: attend and support PSG (Project Science Group) and other project science meetings; support in-flight checkouts; maintain and keep safe the spare instrument and GSE (Ground Support Equipment); organize and maintain documentation; finish NFR calibration measurements, documentation, and analysis; characterize and diagnose instrument anomalies; develop descent data analysis tools; and science data analysis and publication. Because we had the capability to satisfy a project support need we also subsequently proposed and were funded to make ground- based observations of Jupiter during the period surrounding the Galileo arrival at Jupiter, using the Swedish Solar Telescope at La Palma, Canary Islands. The following section (11) provides background information on the NFR instrument.

  11. Partially pre-calculated weights for the backpropagation learning regime and high accuracy function mapping using continuous input RAM-based sigma-pi nets.

    PubMed

    Neville, R S; Stonham, T J; Glover, R J

    2000-01-01

    In this article we present a methodology that partially pre-calculates the weight updates of the backpropagation learning regime and obtains high-accuracy function mapping. The paper shows how to implement neural units in a digital formulation which enables the weights to be quantised to 8 bits and the activations to 9 bits. A novel methodology is introduced to enable the accuracy of sigma-pi units to be increased by expanding their internal state space. We also introduce a novel means of implementing bit-streams in ring memories instead of utilising shift registers. The investigation utilises digital "Higher Order" sigma-pi nodes and studies continuous-input RAM-based sigma-pi units. The units are trained with the backpropagation learning regime to learn functions to a high accuracy. The neural model is the sigma-pi unit, which can be implemented in digital microelectronic technology. The ability to perform tasks that require the input of real-valued information is one of the central requirements of any cognitive system that utilises artificial neural network methodologies. In this article we present recent research which investigates a technique that can be used for mapping accurate real-valued functions to RAM-nets. One of our goals was to achieve accuracies of better than 1% for target output functions in the range Y ∈ [0,1]; this is equivalent to an average mean square error (MSE) over all training vectors of 0.0001, or an error modulus of 0.01. We present a development of the sigma-pi node which enables the provision of high-accuracy outputs. The sigma-pi neural model was initially developed by Gurney (Learning in nets of structured hypercubes. PhD Thesis, Department of Electrical Engineering, Brunel University, Middlesex, UK, 1989; available as Technical Memo CN/R/144). Gurney's neuron model, the Time Integration Node (TIN), utilises an activation that is derived from a bit-stream. In this article we present a new methodology for storing a sigma-pi node's activations as single values which are averages. In the course of the article we state what we define as a real number, how we represent real numbers, and how continuous values are input to our neural system. We show how to utilise the bounded quantised site-values (weights) of sigma-pi nodes to make training of these neurocomputing systems simple, using pre-calculated look-up tables to train the nets. In order to meet our accuracy goal, we introduce a means of increasing the bandwidth capability of sigma-pi units by expanding their internal state space. In our implementation we utilise bit-streams when we calculate the real-valued outputs of the net. To simplify the hardware implementation of bit-streams we present a method of mapping them to RAM-based hardware using 'ring memories'. Finally, we study the sigma-pi units' ability to generalise once they are trained to map real-valued, high-accuracy, continuous functions. We use sigma-pi units as they have been shown to have shorter training times than their analogue counterparts and can also overcome some of the drawbacks of semi-linear units (Gurney, 1992. Neural Networks, 5, 289-303).

  12. Educational Technology Network: a computer conferencing system dedicated to applications of computers in radiology practice, research, and education.

    PubMed

    D'Alessandro, M P; Ackerman, M J; Sparks, S M

    1993-11-01

    Educational Technology Network (ET Net) is a free, easy to use, on-line computer conferencing system organized and funded by the National Library of Medicine that is accessible via the SprintNet (SprintNet, Reston, VA) and Internet (Merit, Ann Arbor, MI) computer networks. It is dedicated to helping bring together, in a single continuously running electronic forum, developers and users of computer applications in the health sciences, including radiology. ET Net uses the Caucus computer conferencing software (Camber-Roth, Troy, NY) running on a microcomputer. This microcomputer is located in the National Library of Medicine's Lister Hill National Center for Biomedical Communications and is directly connected to the SprintNet and the Internet networks. The advanced computer conferencing software of ET Net allows individuals who are separated in space and time to unite electronically to participate, at any time, in interactive discussions on applications of computers in radiology. A computer conferencing system such as ET Net allows radiologists to maintain contact with colleagues on a regular basis when they are not physically together. Topics of discussion on ET Net encompass all applications of computers in radiological practice, research, and education. ET Net has been in successful operation for 3 years and has a promising future aiding radiologists in the exchange of information pertaining to applications of computers in radiology.

  13. Fast growing research on negative emissions

    NASA Astrophysics Data System (ADS)

    Minx, Jan C.; Lamb, William F.; Callaghan, Max W.; Bornmann, Lutz; Fuss, Sabine

    2017-03-01

    Generating negative emissions by removing carbon dioxide from the atmosphere is a key requirement for limiting global warming to well below 2 °C, or even 1.5 °C, and therefore for achieving the long-term climate goals of the recent Paris Agreement. Despite being a relatively young topic, negative emission technologies (NETs) have attracted growing attention in climate change research over the last decade. A sizeable body of evidence on NETs has accumulated across different fields that is by today too large and too diverse to be comprehensively tracked by individuals. Yet, understanding the size, composition and thematic structure of this literature corpus is a crucial pre-condition for effective scientific assessments of NETs, as required, for example, for the new Intergovernmental Panel on Climate Change (IPCC) special report on 1.5 °C. In this paper we use scientometric methods and topic modelling to identify and characterize the available evidence on NETs as recorded in the Web of Science. We find that the development of the literature on NETs started later than for climate change as a whole, but by now proceeds more quickly. A total of about 2900 studies accumulated between 1991 and 2016, with almost 500 new publications in 2016. The discourse on NETs takes place in distinct communities around energy systems, forests, and biochar and other soil carbon options. Integrated analyses of NET portfolios, though crucial for understanding how much negative emissions are possible and at what costs and risks, are still in their infancy and do not feature as a theme across the literature corpus. Overall, our analysis suggests that NETs research is relatively marginal in the wider climate change discourse despite its importance for global climate policy.
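
    A sketch of the topic-discovery step the paper describes, using scikit-learn's latent Dirichlet allocation on a handful of toy abstracts; the toy corpus and three-topic setting are stand-ins for the paper's Web of Science corpus and modelling choices.

    # Sketch: discover themes in a literature corpus with LDA topic modelling.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    abstracts = [
        "bioenergy with carbon capture and storage in energy systems",
        "afforestation and forest management for carbon sinks",
        "biochar amendments increase soil carbon",
        "direct air capture energy requirements and costs",
        "soil carbon sequestration through land management",
        "forest-based mitigation and land use",
    ]
    vec = CountVectorizer(stop_words="english")
    X = vec.fit_transform(abstracts)
    lda = LatentDirichletAllocation(n_components=3, random_state=0).fit(X)

    terms = vec.get_feature_names_out()
    for k, comp in enumerate(lda.components_):
        top = [terms[i] for i in comp.argsort()[-4:][::-1]]
        print(f"topic {k}: {', '.join(top)}")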

  14. Current Status and Future Prospect of K-NET and KiK-net

    NASA Astrophysics Data System (ADS)

    Aoi, S.; Kunugi, T.; Suzuki, W.; Nakamura, H.; Fujiwara, H.

    2014-12-01

    During the 18 years since the deployment of K-NET following the Kobe earthquake, our attention has mainly focused on rapid data collection and on unfailing, reliable observation. In this presentation, we review the three generations of instruments employed by K-NET and KiK-net from these two points of view. At the beginning of the 2000s, we developed the second-generation instruments (K-NET02, K-NET02A, KiK-net06) to replace the first-generation instruments (K-NET95, SMAC-MDK) employed when the networks were constructed in the 1990s. These instruments have an automatic dial-out function. It typically takes 2-5 s to establish communication and a few seconds to send the pre-trigger data. After that, data are available typically within a 1.5 s delay. Not only waveform data but also strong-motion indexes such as real-time intensity, PGA, PGV, PGD, and response spectra are continuously sent once a second. After the 2011 Tohoku earthquake, we developed the third-generation instruments (K-NET11, KiK-net11) and have replaced almost half of all stations countrywide. The main improvement of this instrument is more unfailing and reliable observation. Because we have often experienced very large ground motions (e.g., 45 records exceeding gravity), the maximum measurable range was expanded from 2000 gal to 4000 gal for the second-generation instrument, and to 8000 gal for the third. For the third-generation instrument, in case of power failure, observation (including transmission of data) works for seven days thanks to the backup battery, while for the second-generation instruments it works only for one day. By adding an oblique component to the three-component accelerometers, we can automatically distinguish shaking data from noise such as electric pulses, which may cause a false alarm in EEW. Implementation to guarantee the continuity of observation under severe conditions such as the Tohoku earthquake is very important, as is highly efficient observation. Owing to the drastic progress of information technologies, continuous observation has become technically and economically feasible, and some of the stations are experimentally equipped with a continuous communication line. Continuous observation offers very important information to help mitigate ongoing earthquake disasters.

  15. Reinforcement learning in computer vision

    NASA Astrophysics Data System (ADS)

    Bernstein, A. V.; Burnaev, E. V.

    2018-04-01

    Nowadays, machine learning has become one of the basic technologies used in solving various computer vision tasks such as feature detection, image segmentation, object recognition and tracking. In many applications, complex systems such as robots are equipped with visual sensors from which they learn the state of the surrounding environment by solving corresponding computer vision tasks. Solutions of these tasks are used for making decisions about possible future actions. It is not surprising that when solving computer vision tasks we should take into account special aspects of their subsequent application in model-based predictive control. Reinforcement learning is a modern machine learning technology in which learning is carried out through interaction with the environment. In recent years, reinforcement learning has been used both for solving applied tasks such as processing and analysis of visual information, and for solving specific computer vision problems such as filtering, extracting image features, localizing objects in scenes, and many others. The paper briefly describes reinforcement learning technology and its use for solving computer vision problems.
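
    The learn-by-interaction loop at the heart of reinforcement learning fits in a few lines. A sketch of tabular Q-learning on a toy corridor task follows; the corridor is a stand-in for states that, in the applications above, would be derived from visual sensors.

    # Sketch: tabular Q-learning, the simplest reinforcement learning loop.
    import numpy as np

    n_states, n_actions = 8, 2            # actions: 0 = left, 1 = right
    Q = np.zeros((n_states, n_actions))
    alpha, gamma, eps = 0.1, 0.95, 0.1
    rng = np.random.default_rng(0)

    for episode in range(500):
        s = 0
        while s != n_states - 1:          # goal: reach the rightmost state
            a = rng.integers(n_actions) if rng.random() < eps else int(Q[s].argmax())
            s2 = max(0, s - 1) if a == 0 else s + 1
            r = 1.0 if s2 == n_states - 1 else -0.01   # small step cost
            # Q-learning update: move Q(s,a) toward r + gamma * max_a' Q(s',a').
            Q[s, a] += alpha * (r + gamma * Q[s2].max() - Q[s, a])
            s = s2

    print(Q[:-1].argmax(axis=1))          # learned policy: go right everywhere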

  16. Decision-making deficits in patients with chronic schizophrenia: Iowa Gambling Task and Prospect Valence Learning model.

    PubMed

    Kim, Myung-Sun; Kang, Bit-Na; Lim, Jae Young

    2016-01-01

    Decision-making is the process of forming preferences for possible options, selecting and executing actions, and evaluating the outcome. This study used the Iowa Gambling Task (IGT) and the Prospect Valence Learning (PVL) model to investigate deficits in risk-reward related decision-making in patients with chronic schizophrenia, and to identify decision-making processes that contribute to poor IGT performance in these patients. Thirty-nine patients with schizophrenia and 31 healthy controls participated. Decision-making was measured by total net score, block net scores, and the total number of cards selected from each deck of the IGT. PVL parameters were estimated with the Markov chain Monte Carlo sampling scheme in OpenBugs and BRugs, its interface to R, and the estimated parameters were analyzed with the Mann-Whitney U-test. The schizophrenia group received significantly lower total net scores compared to the control group. In terms of block net scores, an interaction effect of group × block was observed. The block net scores of the schizophrenia group did not differ across the five blocks, whereas those of the control group increased as the blocks progressed. The schizophrenia group obtained significantly lower block net scores in the fourth and fifth blocks of the IGT and selected cards from deck D (advantageous) less frequently than the control group. Additionally, the schizophrenia group had significantly lower values on the utility-shape, loss-aversion, recency, and consistency parameters of the PVL model. These results indicate that patients with schizophrenia experience deficits in decision-making, possibly due to failure in learning the expected value of each deck, and incorporating outcome experiences of previous trials into expectancies about options in the present trial.
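
    The PVL components named above (utility shape, loss aversion, recency, consistency) can be written down compactly. A sketch follows; the parameter values are illustrative, and the paper estimates them with MCMC sampling, which is not shown here.

    # Sketch of the Prospect Valence Learning (PVL) components. Parameter
    # values are illustrative placeholders, not estimated from data.
    import numpy as np

    alpha, lam, A, c = 0.5, 2.0, 0.6, 1.0   # utility shape, loss aversion,
                                            # recency, consistency

    def utility(x: float) -> float:
        """Prospect utility: concave for gains, loss-averse for losses."""
        return x**alpha if x >= 0 else -lam * (-x)**alpha

    def update_expectancy(E: np.ndarray, deck: int, payoff: float) -> None:
        """Delta rule with recency: new outcomes overwrite old expectancies."""
        E[deck] += A * (utility(payoff) - E[deck])

    def choice_probs(E: np.ndarray, trial: int) -> np.ndarray:
        """Softmax over deck expectancies; choice consistency grows with trials."""
        theta = (trial / 10.0) ** c
        z = np.exp(theta * (E - E.max()))   # subtract max for numerical stability
        return z / z.sum()

    E = np.zeros(4)                         # expectancies for decks A-D
    update_expectancy(E, deck=1, payoff=-150.0)   # a large loss on deck B
    print(choice_probs(E, trial=20))              # deck B now chosen less often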

  17. Metrology for Information Technology

    DTIC Science & Technology

    1997-05-01

    NIST management requested a white paper on metrology for information technology (IT). A task group was formed to develop this white paper with representatives from the Manufacturing Engineering Laboratory (MEL), the Information Technology Laboratory (ITL), and Technology Services (TS). The task…

  18. LabNet: Toward A Community of Practice. Technology in Education Series.

    ERIC Educational Resources Information Center

    Ruopp, Richard, Ed.; And Others

    Many educators advocate the use of projects in the science classroom. This document describes an effort (LabNet) that has successfully implemented a program that allows students to learn science using projects. Chapter 1, "An Introduction to LabNet" (Richard Ruopp, Megham Pfister), provides an initial framework for understanding the…

  19. Confronting the Technological Pedagogical Knowledge of Finnish Net Generation Student Teachers

    ERIC Educational Resources Information Center

    Valtonen, Teemu; Pontinen, Susanna; Kukkonen, Jari; Dillon, Patrick; Vaisanen, Pertti; Hacklin, Stina

    2011-01-01

    The research reported here is concerned with a critical examination of some of the assumptions concerning the "Net Generation" capabilities of 74 first-year student teachers in a Finnish university. There are assumptions that: (i) Net Generation students are adept at learning through discovery and thinking in a hypertext-like manner…

  20. The Design and Realization of Net Testing System on Campus Network

    ERIC Educational Resources Information Center

    Ren, Zhanying; Liu, Shijie

    2005-01-01

    According to the requirements of modern teaching theory and technology, and based on software engineering, database theory, net information security techniques, and system integration, a net testing system on a local network was designed and realized. The system supports the division of testing and teaching and settles the problems of random…

  1. Economic Development Network (ED>Net): 1995-96 Report to the Governor and the Legislature.

    ERIC Educational Resources Information Center

    California Community Colleges, Sacramento. Office of the Chancellor.

    The Economic Development Network (ED>Net) of the California Community Colleges was designed to advance the state's economic growth and competitiveness by coordinating and facilitating workforce improvement, technology deployment, and business development initiatives. This report reviews outcomes for ED>Net for 1995-96 based on reports…

  2. Air quality and climate impacts of alternative bus technologies in Greater London.

    PubMed

    Chong, Uven; Yim, Steve H L; Barrett, Steven R H; Boies, Adam M

    2014-04-15

    The environmental impact of diesel-fueled buses can potentially be reduced by the adoption of alternative propulsion technologies such as lean-burn compressed natural gas (LB-CNG) or hybrid electric buses (HEB), and emissions control strategies such as a continuously regenerating trap (CRT), exhaust gas recirculation (EGR), or selective catalytic reduction with trap (SCRT). This study assessed the environmental costs and benefits of these bus technologies in Greater London relative to the existing fleet and characterized emissions changes due to alternative technologies. We found a >30% increase in CO2 equivalent (CO2e) emissions for CNG buses, a <5% change for exhaust treatment scenarios, and a 13% (90% confidence interval 3.8-20.9%) reduction for HEB relative to baseline CO2e emissions. A multiscale regional chemistry-transport model quantified the impact of alternative bus technologies on air quality, which was then related to premature mortality risk. We found the largest decrease in population exposure (about 83%) to particulate matter (PM2.5) occurred with LB-CNG buses. Monetized environmental and investment costs relative to the baseline gave estimated net present cost of LB-CNG or HEB conversion to be $187 million ($73 million to $301 million) or $36 million ($-25 million to $102 million), respectively, while EGR or SCRT estimated net present costs were $19 million ($7 million to $32 million) or $15 million ($8 million to $23 million), respectively.

  3. Scheduling multirobot operations in manufacturing by truncated Petri nets

    NASA Astrophysics Data System (ADS)

    Chen, Qin; Luh, J. Y.

    1995-08-01

    Scheduling of operational sequences in manufacturing processes is one of the important problems in automation. Methods of applying Petri nets to model and analyze the problem with constraints on precedence relations, multiple resource allocation, etc., are available in the literature. Searching for an optimum schedule can be implemented by combining the branch-and-bound technique with the execution of the timed Petri net. The process usually produces a large Petri net which is practically unmanageable. This disadvantage, however, can be handled by a truncation technique which divides the original large Petri net into several smaller subnets. The complexity involved in analyzing each subnet individually is greatly reduced. However, when the locally optimum schedules of the resulting subnets are combined, they may not yield an overall optimum schedule for the original Petri net. To circumvent this problem, algorithms are developed based on the concepts of Petri net execution and a modified branch-and-bound process. The developed technique is applied to a multi-robot task scheduling problem in a manufacturing work cell.
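
    A minimal sketch of executing a timed Petri net, the building block of the scheduling approach above. The two-transition robot net, durations, and greedy firing rule are toy placeholders; a branch-and-bound search would instead explore all enabled firings and keep the best schedule.

    # Sketch: firing a small timed Petri net and accumulating a makespan.
    marking = {"robot_free": 1, "part_waiting": 2, "part_done": 0, "busy": 0}

    # Each transition: (input places, output places, duration).
    transitions = {
        "pick":  ({"robot_free": 1, "part_waiting": 1}, {"busy": 1}, 2),
        "place": ({"busy": 1}, {"robot_free": 1, "part_done": 1}, 3),
    }

    def enabled(name):
        ins, _, _ = transitions[name]
        return all(marking.get(p, 0) >= n for p, n in ins.items())

    clock = 0
    while marking["part_done"] < 2:
        # Greedy execution: fire the first enabled transition (branch-and-bound
        # would branch over every enabled choice here).
        name = next(t for t in transitions if enabled(t))
        ins, outs, dur = transitions[name]
        for p, n in ins.items():
            marking[p] -= n
        for p, n in outs.items():
            marking[p] = marking.get(p, 0) + n
        clock += dur
        print(f"t={clock}: fired {name}, marking={marking}")

    print("makespan:", clock)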

  4. Modelling the protocol stack in NCS with deterministic and stochastic petri net

    NASA Astrophysics Data System (ADS)

    Hui, Chen; Chunjie, Zhou; Weifeng, Zhu

    2011-06-01

    The protocol stack is the basis of networked control systems (NCS). Full or partial reconfiguration of the protocol stack offers both optimised communication service and improved system performance. Field testing is currently unrealistic for determining the performance of a reconfigurable protocol stack, and the Petri net formal description technique offers the best combination of intuitive representation, tool support and analytical capabilities. Traditionally, separation between the different layers of the OSI model has been common practice. Nevertheless, such a layered modelling and analysis framework for the protocol stack lacks global optimisation for protocol reconfiguration. In this article, we propose a general modelling and analysis framework for NCS based on the cross-layer concept, which establishes an efficient system scheduling model by abstracting the time constraints, task interrelations, and processor and bus sub-models from the upper and lower layers (application, data link and physical layers). Cross-layer design can help to overcome the inadequacy of global optimisation through information sharing between protocol layers. To illustrate the framework, we take the controller area network (CAN) as a case study. The simulation results of the deterministic and stochastic Petri net (DSPN) model can help us adjust the message scheduling scheme and obtain better system performance.

  5. Cloud Computing and Virtual Desktop Infrastructures in Afloat Environments

    DTIC Science & Technology

    2012-06-01

    …a framework of technology that allows all interested systems, inside and outside of an organization, to expose and access well-defined services… was established to manage the Navy's three largest enterprise networks: the OCONUS Navy Enterprise Network (ONE-NET), the Navy-Marine Corps…

  6. Development of space simulation / net-laboratory system

    NASA Astrophysics Data System (ADS)

    Usui, H.; Matsumoto, H.; Ogino, T.; Fujimoto, M.; Omura, Y.; Okada, M.; Ueda, H. O.; Murata, T.; Kamide, Y.; Shinagawa, H.; Watanabe, S.; Machida, S.; Hada, T.

    A research project for the development of a space simulation / net-laboratory system was approved by the Japan Science and Technology Corporation (JST) in the category of Research and Development for Applying Advanced Computational Science and Technology (ACT-JST) in 2000. This three-year research project is a collaboration with an astrophysical simulation group as well as other space simulation groups which use MHD and hybrid models. In this project, we develop a prototype of a unique simulation system which enables us to perform simulation runs by providing or selecting plasma parameters through a Web-based interface on the internet. We are also developing an on-line database system for space simulation from which we will be able to search and extract various information such as simulation methods and programs, manuals, and typical simulation results in graphic or ASCII format. This unique system will help simulation beginners start simulation studies without much difficulty or effort, and contribute to the promotion of simulation studies in the STP field. In this presentation, we report the overview and current status of the project.

  7. Reading Guided by Automated Graphical Representations: How Model-Based Text Visualizations Facilitate Learning in Reading Comprehension Tasks

    ERIC Educational Resources Information Center

    Pirnay-Dummer, Pablo; Ifenthaler, Dirk

    2011-01-01

    Our study integrates automated natural language-oriented assessment and analysis methodologies into feasible reading comprehension tasks. With the newly developed T-MITOCAR toolset, prose text can be automatically converted into an association net which has similarities to a concept map. The "text to graph" feature of the software is based on…

  8. 2006 Net Centric Operations Conference - Facilitating Net Centric Operations and Warfare

    DTIC Science & Technology

    2006-03-16

    • White Paper, “Facilitating Shared Services in the DoD,” Feb 12, 2006 • White Paper, “Shared Services: Performance Accountability and Risk…” …who demand a culture of information sharing and improved organizational effectiveness. Facilitating Shared Services focus areas: • Governance and Control Policy • Common Information Standards and Technical…

  9. Modeling shared resources with generalized synchronization within a Petri net bottom-up approach.

    PubMed

    Ferrarini, L; Trioni, M

    1996-01-01

    This paper proposes a simple and effective way to represent shared resources in manufacturing systems within a Petri net model previously developed. Such a model relies on the bottom-up and modular approach to synthesis and analysis. The designer may define elementary tasks and then connect them with one another with three kinds of connections: self-loops, inhibitor arcs and simple synchronizations. A theoretical framework has been established for the analysis of liveness and reversibility of such models. The generalized synchronization, here formalized, represents an extension of the simple synchronization, allowing the merging of suitable subnets among elementary tasks. It is proved that under suitable, but not restrictive, hypotheses the generalized synchronization may be substituted for a simple one, thus being compatible with all the developed theoretical body.

  10. Finding Meaning in Medication Reconciliation Using Electronic Health Records: Qualitative Analysis in Safety Net Primary and Specialty Care

    PubMed Central

    Matta, George Yaccoub; Khoong, Elaine C; Lyles, Courtney R; Schillinger, Dean

    2018-01-01

    Background Safety net health systems face barriers to effective ambulatory medication reconciliation for vulnerable populations. Although some electronic health record (EHR) systems offer safety advantages, EHR use may affect the quality of patient-provider communication. Objective This mixed-methods observational study aimed to develop a conceptual framework of how clinicians balance the demands and risks of EHR and communication tasks during medication reconciliation discussions in a safety net system. Methods This study occurred 3 to 16 (median 9) months after new EHR implementation in five academic public hospital clinics. We video recorded visits between English-/Spanish-speaking patients and their primary/specialty care clinicians. We analyzed the proportion of medications addressed and coded time spent on nonverbal tasks during medication reconciliation as “multitasking EHR use,” “silent EHR use,” “non-EHR multitasking,” and “focused patient-clinician talk.” Finally, we analyzed communication patterns to develop a conceptual framework. Results We examined 35 visits (17%, 6/35 Spanish) between 25 patients (mean age 57, SD 11 years; 44%, 11/25 women; 48%, 12/25 Hispanic; and 20%, 5/25 with limited health literacy) and 25 clinicians (48%, 12/25 primary care). Patients had listed a median of 7 (interquartile range [IQR] 5-12) relevant medications, and clinicians addressed a median of 3 (IQR 1-5) medications. The median duration of medication reconciliation was 2.1 (IQR 1.0-4.2) minutes, comprising a median of 10% (IQR 3%-17%) of visit time. Multitasking EHR use occurred during 47% (IQR 26%-70%) of the medication reconciliation time. Silent EHR use and non-EHR multitasking each occupied a smaller proportion of medication reconciliation time, with a median of 0% for both. Focused clinician-patient talk occupied a median of 24% (IQR 0%-39%) of medication reconciliation time. Five communication patterns with EHR medication reconciliation were observed: (1) typical EHR multitasking for medication reconciliation, (2) dynamic EHR use to negotiate medication discrepancies, (3) focused patient-clinician talk for medication counseling and addressing patient concerns, (4) responding to patient concerns while maintaining EHR use, and (5) using EHRs to engage patients during medication reconciliation. We developed a conceptual diagram representing the dilemma of the multitasking clinician during medication reconciliation. Conclusions Safety net visits involve multitasking EHR use during almost half of medication reconciliation time. The multitasking clinician balances the cognitive and emotional demands posed by incoming information from multiple sources, attempts to synthesize and act on this information through EHR and communication tasks, and adopts strategies of silent EHR use and focused patient-clinician talk that may help mitigate the risks of multitasking. Future studies should explore diverse patient perspectives about clinician EHR multitasking, clinical outcomes related to EHR multitasking, and human factors and systems engineering interventions to improve the safety of EHR use during the complex process of medication reconciliation. PMID:29735477

  11. Semantic Segmentation of Convolutional Neural Network for Supervised Classification of Multispectral Remote Sensing

    NASA Astrophysics Data System (ADS)

    Xue, L.; Liu, C.; Wu, Y.; Li, H.

    2018-04-01

    Semantic segmentation is a fundamental problem in remote sensing image processing. Because of the complex maritime environment, classifying roads, vegetation, buildings, and water from remote sensing imagery is a challenging task. Although neural networks have achieved excellent performance in semantic segmentation in recent years, only a few works have used CNNs for ground object segmentation, and the results can be further improved. This paper used a convolutional neural network named U-Net, whose structure has a contracting path and an expansive path to produce high-resolution output. We added BN layers to the network, which is more conducive to gradient backpropagation. Moreover, after the upsampling convolutions we added dropout layers to prevent overfitting. Together these changes yield more precise segmentation results. To verify this network architecture, we used a Kaggle dataset. Experimental results show that U-Net achieved good performance compared with other architectures, especially on high-resolution remote sensing imagery.
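
    A minimal PyTorch sketch of the kind of architecture described above: a small U-Net-style contracting/expansive network with batch normalization in the convolution blocks and dropout after the upsampling convolutions. Depth, channel widths, band count, and class count are illustrative assumptions, not the authors' exact configuration.

```python
# Tiny U-Net-style network with the two additions the abstract mentions:
# BatchNorm in the conv blocks and Dropout after the upsampling convs.
import torch
import torch.nn as nn

def conv_block(c_in, c_out):
    # Two 3x3 convolutions, each followed by BatchNorm and ReLU.
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, 3, padding=1), nn.BatchNorm2d(c_out), nn.ReLU(inplace=True),
        nn.Conv2d(c_out, c_out, 3, padding=1), nn.BatchNorm2d(c_out), nn.ReLU(inplace=True),
    )

class TinyUNet(nn.Module):
    def __init__(self, in_ch=4, n_classes=5):        # e.g. 4 spectral bands (assumed)
        super().__init__()
        self.enc1, self.enc2 = conv_block(in_ch, 32), conv_block(32, 64)
        self.pool = nn.MaxPool2d(2)
        self.bottom = conv_block(64, 128)
        self.up2 = nn.ConvTranspose2d(128, 64, 2, stride=2)
        self.drop2 = nn.Dropout2d(0.5)                # dropout after upsampling conv
        self.dec2 = conv_block(128, 64)
        self.up1 = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.drop1 = nn.Dropout2d(0.5)
        self.dec1 = conv_block(64, 32)
        self.head = nn.Conv2d(32, n_classes, 1)       # per-pixel class scores

    def forward(self, x):
        e1 = self.enc1(x)                             # contracting path...
        e2 = self.enc2(self.pool(e1))
        b = self.bottom(self.pool(e2))
        d2 = self.dec2(torch.cat([self.drop2(self.up2(b)), e2], dim=1))  # ...expansive
        d1 = self.dec1(torch.cat([self.drop1(self.up1(d2)), e1], dim=1))
        return self.head(d1)

logits = TinyUNet()(torch.randn(1, 4, 128, 128))      # -> (1, 5, 128, 128)
```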

  12. NetCDF-CF-OPeNDAP: Standards for ocean data interoperability and object lessons for community data standards processes

    USGS Publications Warehouse

    Hankin, Steven C.; Blower, Jon D.; Carval, Thierry; Casey, Kenneth S.; Donlon, Craig; Lauret, Olivier; Loubrieu, Thomas; Srinivasan, Ashwanth; Trinanes, Joaquin; Godøy, Øystein; Mendelssohn, Roy; Signell, Richard P.; de La Beaujardiere, Jeff; Cornillon, Peter; Blanc, Frederique; Rew, Russ; Harlan, Jack; Hall, Julie; Harrison, D.E.; Stammer, Detlef

    2010-01-01

    It is generally recognized that meeting society's emerging environmental science and management needs will require the marine data community to provide simpler, more effective, and more interoperable access to its data. There is broad agreement, as well, that data standards are the bedrock upon which interoperability will be built. The path that would bring the marine data community to agree upon and utilize such standards, however, is often elusive. In this paper we examine the trio of standards: 1) netCDF files; 2) the Climate and Forecast (CF) metadata convention; and 3) the OPeNDAP data access protocol. Taken together, these standards have brought our community a high level of interoperability for "gridded" data such as model outputs, satellite products, and climatological analyses, and they are gaining rapid acceptance for ocean observations. We provide an overview of the scope of the contribution that has been made. We then step back from the information technology considerations to examine the community or "social" process by which the successes were achieved. We contrast this path with the one by which the World Meteorological Organization (WMO) has advanced the Global Telecommunications System (GTS): netCDF/CF/OPeNDAP exemplifies a "bottom up" standards process, whereas GTS is "top down". Both are tales of success at achieving specific purposes, yet each is hampered by technical limitations. These limitations sometimes lead to controversy over whether alternative technological directions should be pursued. Finally, we draw general conclusions regarding the factors that affect the success of a standards development effort, that is, the likelihood that an IT standard will meet its design goals and achieve community-wide acceptance. We believe that a higher level of thoughtful awareness by scientists, program managers, and technology experts of the vital role of standards and the merits of alternative standards processes can help us as a community reach our interoperability goals faster.
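
    To make the interoperability claim concrete, the sketch below shows how the trio typically works together from Python: the netCDF4 library opens a remote OPeNDAP endpoint exactly as it would a local file, CF attributes make the variable self-describing, and indexing fetches only the requested slab over the network. The URL, variable name, and attribute values are placeholders, not a real service.

```python
# The netCDF/CF/OPeNDAP trio in practice; URL and names are hypothetical.
from netCDF4 import Dataset

url = "http://example.org/thredds/dodsC/ocean/sst.nc"  # placeholder endpoint
ds = Dataset(url)                       # no bulk download: OPeNDAP serves on demand

sst = ds.variables["sst"]               # CF conventions name the metadata...
print(sst.standard_name, sst.units)     # e.g. "sea_surface_temperature", "K"

# ...and lazy indexing requests just this time/lat/lon slab from the server.
slab = sst[0, 100:200, 300:400]
print(slab.shape, slab.mean())
ds.close()
```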

  13. A Tool for the Automated Collection of Space Utilization Data: Three Dimensional Space Utilization Monitor

    NASA Technical Reports Server (NTRS)

    Vos, Gordon A.; Fink, Patrick; Ngo, Phong H.; Morency, Richard; Simon, Cory; Williams, Robert E.; Perez, Lance C.

    2015-01-01

    The Space Human Factors and Habitability (SHFH) Element within the Human Research Program (HRP), in collaboration with the Behavioral Health and Performance (BHP) Element, is conducting research regarding Net Habitable Volume (NHV), the internal volume within a spacecraft or habitat that is available to crew for required activities, as well as layout and accommodations within that volume. NASA is looking for innovative methods to unobtrusively collect NHV data without impacting crew time. Required data include metrics such as location and orientation of crew, volume used to complete tasks, internal translation paths, flow of work, and task completion times. In less constrained environments, methods for collecting such data exist, yet many are obtrusive and require significant post-processing. Example technologies used in terrestrial settings include infrared (IR) retro-reflective marker based motion capture, GPS sensor tracking, inertial tracking, and multiple-camera filmography. Due to the constraints of space operations, however, many such methods are infeasible: inertial tracking systems typically rely upon a gravity vector to normalize sensor readings, and traditional IR systems are large and require extensive calibration. Several technologies, though, have not yet been applied to space operations for these explicit purposes. Two of these are 3-Dimensional Radio Frequency Identification Real-Time Localization Systems (3D RFID-RTLS) and depth imaging systems that allow for 3D motion capture and volumetric scanning (such as those using IR-depth cameras like the Microsoft Kinect, or Light Detection and Ranging / light-radar systems, referred to as LIDAR).

  14. Father Google and Mother IM: Confessions of a Net Gen Learner

    ERIC Educational Resources Information Center

    Windham, Carie

    2005-01-01

    To bridge the technology cultural gap between many faculty and administrators and the youngest generation of college students, this author, a recent graduate, reveals what being a "Net Gener" really means and how that can translate to the classroom. She discusses what she considers the basic principles that guide the Net Generation: (1)…

  15. Evaluating the Usability of a Professional Modeling Tool Repurposed for Middle School Learning

    NASA Astrophysics Data System (ADS)

    Peters, Vanessa L.; Songer, Nancy Butler

    2013-10-01

    This paper reports the results of a three-stage usability test of a modeling tool designed to support learners' deep understanding of the impacts of climate change on ecosystems. The design process involved repurposing an existing modeling technology used by professional scientists into a learning tool specifically designed for middle school students. To evaluate usability, we analyzed students' task performance and task completion time as they worked on an activity with the repurposed modeling technology. In stage 1, we conducted remote testing of an early modeling prototype with urban middle school students (n = 84). In stages 2 and 3, we used screencasting software to record students' mouse and keyboard movements during collaborative think-alouds (n = 22) and conducted a qualitative analysis of their peer discussions. Taken together, the study findings revealed two kinds of usability issues that interfered with students' productive use of the tool: issues related to the use of data and information, and issues related to the use of the modeling technology. The study findings resulted in design improvements that led to stronger usability outcomes and higher task performance among students. In this paper, we describe our methods for usability testing, our research findings, and our design solutions for supporting students' use of the modeling technology and use of data. The paper concludes with implications for the design and study of modeling technologies for science learning.

  16. Combining deep residual neural network features with supervised machine learning algorithms to classify diverse food image datasets.

    PubMed

    McAllister, Patrick; Zheng, Huiru; Bond, Raymond; Moorhead, Anne

    2018-04-01

    Obesity is increasing worldwide and can cause many chronic conditions such as type-2 diabetes, heart disease, sleep apnea, and some cancers. Monitoring dietary intake through food logging is a key method to maintain a healthy lifestyle and to prevent and manage obesity. Computer vision methods have been applied to food logging to automate image classification for monitoring dietary intake. In this work we applied pretrained ResNet-152 and GoogleNet convolutional neural networks (CNNs), initially trained on the ImageNet Large Scale Visual Recognition Challenge (ILSVRC) dataset with the MatConvNet package, to extract features from food image datasets: Food-5K, Food-11, RawFooT-DB, and Food-101. Deep features were extracted from the CNNs and used to train machine learning classifiers, including an artificial neural network (ANN), support vector machine (SVM), random forest, and naive Bayes. Results show that ResNet-152 deep features with an RBF-kernel SVM can detect food items with 99.4% accuracy on the Food-5K validation dataset, and with 98.8% accuracy on the Food-5K evaluation dataset using ANN, SVM-RBF, and random forest classifiers. Trained with ResNet-152 features, an ANN achieves 91.34% and 99.28% accuracy on the Food-11 and RawFooT-DB datasets respectively, and an RBF-kernel SVM achieves 64.98% on the Food-101 dataset. This research makes clear that deep CNN features can be used efficiently for diverse food image classification. The work presented here shows that pretrained ResNet-152 features provide sufficient generalisation power when applied to a range of food image classification tasks. Copyright © 2018 Elsevier Ltd. All rights reserved.
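
    An analogous feature-extraction pipeline can be sketched with PyTorch (torchvision 0.13+ weights API) and scikit-learn; the study itself used MatConvNet. An ImageNet-pretrained ResNet serves as a fixed feature extractor and an RBF-kernel SVM is fit on the pooled features. The dataset tensors (X_train, y_train, and so on) are assumed to exist and are left as commented placeholders.

```python
# Pretrained-CNN features + classical classifier, sketched under the
# assumptions stated above.
import torch
import torch.nn as nn
from torchvision import models
from sklearn.svm import SVC

resnet = models.resnet152(weights=models.ResNet152_Weights.IMAGENET1K_V1)
resnet.eval()
extractor = nn.Sequential(*list(resnet.children())[:-1])  # drop the final fc layer

@torch.no_grad()
def features(batch):                      # batch: (N, 3, 224, 224) preprocessed tensor
    return extractor(batch).flatten(1)    # -> (N, 2048) deep features

# Assumed-to-exist data; shown as placeholders only:
# feats = features(X_train).numpy()
# clf = SVC(kernel="rbf").fit(feats, y_train)
# print(clf.score(features(X_test).numpy(), y_test))
```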

  17. On determining firing delay time of transitions for Petri net based signaling pathways by introducing stochastic decision rules.

    PubMed

    Miwa, Yoshimasa; Li, Chen; Ge, Qi-Wei; Matsuno, Hiroshi; Miyano, Satoru

    2010-01-01

    Parameter determination is important in modeling and simulating biological pathways, including signaling pathways. Parameters are determined according to biological facts obtained from experiments and scientific publications, but in most cases reliable data describing the underlying reactions in detail are not reported. This prompted us to develop a general methodology for determining the parameters of a model when no information on the underlying biological facts is available. In this study, we use the Petri net approach for modeling signaling pathways and propose a method to determine the firing delay times of transitions in Petri net models of signaling pathways by introducing stochastic decision rules. Petri net technology provides a powerful approach to modeling and simulating various concurrent systems, and has recently been widely accepted as a description method for biological pathways. Our method makes it possible to determine the range of firing delay times that realizes smooth token flow in the Petri net model of a signaling pathway. The applicability of this method has been confirmed by applying it to the interleukin-1 induced signaling pathway.
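
    The sketch below is a schematic illustration of the underlying idea rather than the authors' algorithm: in a timed two-transition pipeline A -> p -> B, sweeping the downstream firing delay (with random jitter standing in for the stochastic decision rule) reveals the delay range in which tokens do not accumulate in place p, that is, where token flow stays smooth. All delays and the queue criterion are invented for illustration.

```python
# Timed-pipeline token flow under stochastic firing delays (schematic).
import random

def max_queue(delay_a, delay_b, horizon=10_000, seed=0):
    rng = random.Random(seed)
    t_a = t_b = 0.0
    tokens = queue_max = 0
    events = []                                  # (time, +1 A fires, -1 B fires)
    while t_a < horizon:
        t_a += rng.uniform(0.9, 1.1) * delay_a   # jitter around the nominal delay
        events.append((t_a, +1))
    while t_b < horizon:
        t_b += rng.uniform(0.9, 1.1) * delay_b
        events.append((t_b, -1))
    for _, change in sorted(events):
        if change == -1 and tokens == 0:
            continue                             # B not enabled: firing attempt lost
        tokens += change
        queue_max = max(queue_max, tokens)
    return queue_max

for delay_b in (0.8, 1.0, 1.2):
    print(delay_b, max_queue(delay_a=1.0, delay_b=delay_b))
# delay_b <= delay_a keeps the queue bounded; a slower B lets tokens pile up.
```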

  18. Enabling technologies for millimeter-wave radio-over-fiber systems in next generation heterogeneous mobile access networks

    NASA Astrophysics Data System (ADS)

    Zhang, Junwen; Yu, Jianjun; Wang, Jing; Xu, Mu; Cheng, Lin; Lu, Feng; Shen, Shuyi; Yan, Yan; Cho, Hyunwoo; Guidotti, Daniel; Chang, Gee-kung

    2017-01-01

    Fifth-generation (5G) wireless access networks promise higher access data rates and more than 1,000 times the capacity of current long-term evolution (LTE) systems. New radio access technologies (RATs) based on carrier frequencies up to millimeter-wave (MMW) radio-over-fiber (RoF), together with carrier aggregation (CA) of multi-band resources, are being intensively studied to support high-data-rate access and effective use of frequency resources in heterogeneous mobile networks (Het-Nets). In this paper, we investigate several enabling technologies for MMW RoF systems in 5G Het-Nets. Efficient mobile fronthaul (MFH) solutions for 5G centralized radio access networks (C-RAN) and beyond are proposed, analyzed, and experimentally demonstrated based on the analog scheme. Digital predistortion based on memory polynomials for analog MFH linearization is presented, with improved EVM performance and receiver sensitivity. We also propose and experimentally demonstrate a novel inter-/intra-RAT CA scheme for 5G Het-Nets. A real-time standard 4G-LTE signal is carrier-aggregated with three broadband 60 GHz MMW signals based on the proposed optical-domain band-mapping method. RATs based on new waveforms have also been studied to achieve higher spectral efficiency (SE) in asynchronous environments. A full-duplex asynchronous quasi-gapless carrier aggregation scheme for MMW RoF inter-/intra-RAT operation based on FBMC is also presented with 4G-LTE signals. Compared with OFDM-based signals with large guard bands, FBMC achieves higher spectral efficiency with better EVM performance at lower received power and with smaller guard bands.

  19. Seismic Monitoring with NetQuakes: The First 75 in the Pacific Northwest

    NASA Astrophysics Data System (ADS)

    Bodin, P.; Vidale, J. E.; Luetgert, J. H.; Malone, S. D.; Delorey, A. A.; Steele, W. P.; Gibbons, D. A.; Walsh, L. K.

    2011-12-01

    NetQuakes accelerographs are relatively inexpensive Internet-aware appliances that we are using as part of our regional seismic monitoring program in the Pacific Northwest Seismic Network (PNSN). To date we have deployed approximately 65 units; by the end of 2011, we will have at least 75 systems sited and operating. The instruments are made by the Swiss manufacturer GeoSig, Ltd., and have been obtained by PNSN through several cooperative programs with the US Geological Survey (USGS). The NetQuakes systems have increased the number of strong-motion stations in the Pacific Northwest by ~50%. NetQuakes instruments connect to the Internet via wired or wireless telemetry, obtain accurate timing via Network Time Protocol, and are designed to be located on the ground floor of houses or small buildings. At PNSN we have concentrated on finding NetQuakes hosts by having technologically savvy homeowners self-identify in response to news reports about the NetQuakes project. Potential hosts are prioritized by their proximity to target sites provided by a regional panel of experts who studied the region's strong-ground-motion monitoring needs. Recorded waveforms, triggered by strong motion or retrieved from a buffer of continuous data, are transmitted to Menlo Park and then on to PNSN in Seattle. Data are available with a latency of a few minutes to a little over an hour, and are automatically incorporated with the rest of the PNSN network data for analysis and the generation of earthquake products. Triggered data may also be viewed by the public via the USGS website, [http://earthquake.usgs.gov/monitoring/netquakes/map/pacnw]. We present examples of ground motion recordings returned to date. Local earthquakes up to M4 (at a distance of ~60 km) reveal interesting patterns of local site effects. The 11 March M9 Tohoku, Japan, earthquake produced ground motions recorded on the PNSN accelerographs, including many NetQuakes systems, that reveal the extent and severity of basin-related shaking amplification.

  1. Technology and testing.

    PubMed

    Quellmalz, Edys S; Pellegrino, James W

    2009-01-02

    Large-scale testing of educational outcomes already benefits from technological applications that address logistics such as the development, administration, and scoring of tests, as well as the reporting of results. Innovative applications of technology also provide rich, authentic tasks that challenge the sorts of integrated knowledge, critical thinking, and problem solving seldom well addressed in paper-based tests. Such tasks can be used in both large-scale and classroom-based assessments. Balanced assessment systems can be developed that integrate curriculum-embedded, benchmark, and summative assessments across classroom, district, state, national, and international levels. We discuss here the potential of technology to launch a new era of integrated, learning-centered assessment systems.

  2. Automation of Presentation Record Production Based on Rich-Media Technology Using SNT Petri Nets Theory.

    PubMed

    Martiník, Ivo

    2015-01-01

    Rich-media describes a broad range of digital interactive media that is increasingly used on the Internet and in support of education. Last year, a special pilot audiovisual lecture room was built as part of the MERLINGO (MEdia-rich Repository of LearnING Objects) project. It contains all the elements of a modern lecture room intended for recording presentations based on rich-media technologies and publishing them online or on demand, with access to all its elements in an automated mode that includes automatic editing. Property-preserving Petri net process algebras (PPPA) were designed for the specification and verification of Petri net processes. PPPA does not need to verify the composition of Petri net processes because all of its algebraic operators preserve the specified set of properties. These original PPPA are significantly generalized for the newly introduced class of SNT Petri process and agent nets in this paper. The PLACE-SUBST and ASYNC-PROC algebraic operators are defined for this class of Petri nets and selected properties of theirs are proved. The SNT Petri process and agent net theory was applied extensively in the design, verification, and implementation of the programming system ensuring the pilot audiovisual lecture room functionality.

  3. Automation of Presentation Record Production Based on Rich-Media Technology Using SNT Petri Nets Theory

    PubMed Central

    Martiník, Ivo

    2015-01-01

    Rich-media describes a broad range of digital interactive media that is increasingly used on the Internet and in support of education. Last year, a special pilot audiovisual lecture room was built as part of the MERLINGO (MEdia-rich Repository of LearnING Objects) project. It contains all the elements of a modern lecture room intended for recording presentations based on rich-media technologies and publishing them online or on demand, with access to all its elements in an automated mode that includes automatic editing. Property-preserving Petri net process algebras (PPPA) were designed for the specification and verification of Petri net processes. PPPA does not need to verify the composition of Petri net processes because all of its algebraic operators preserve the specified set of properties. These original PPPA are significantly generalized for the newly introduced class of SNT Petri process and agent nets in this paper. The PLACE-SUBST and ASYNC-PROC algebraic operators are defined for this class of Petri nets and selected properties of theirs are proved. The SNT Petri process and agent net theory was applied extensively in the design, verification, and implementation of the programming system ensuring the pilot audiovisual lecture room functionality. PMID:26258164

  4. Cross-domain and multi-task transfer learning of deep convolutional neural network for breast cancer diagnosis in digital breast tomosynthesis

    NASA Astrophysics Data System (ADS)

    Samala, Ravi K.; Chan, Heang-Ping; Hadjiiski, Lubomir; Helvie, Mark A.; Richter, Caleb; Cha, Kenny

    2018-02-01

    We propose a cross-domain, multi-task transfer learning framework that transfers knowledge learned from non-medical images by a deep convolutional neural network (DCNN) to a medical image recognition task, while improving generalization through multi-task learning of auxiliary tasks. A first-stage cross-domain transfer learning was initiated from an ImageNet-trained DCNN to a mammography-trained DCNN. 19,632 regions of interest (ROIs) from 2,454 mass lesions were collected from two imaging modalities, digitized screen-film mammography (SFM) and full-field digital mammography (DM), and split into training and test sets. In the multi-task transfer learning, the DCNN learned the mass classification task simultaneously from the training sets of SFM and DM. The best transfer network for mammography was selected from three transfer networks with different numbers of convolutional layers frozen. The performance of single-task and multi-task transfer learning on an independent SFM test set, in terms of the area under the receiver operating characteristic curve (AUC), was 0.78+/-0.02 and 0.82+/-0.02, respectively. In the second-stage cross-domain transfer learning, a set of 12,680 ROIs from 317 mass lesions on DBT was split into validation and independent test sets. We first studied the data requirements for the first-stage mammography-trained DCNN by varying the mammography training data from 1% to 100% and evaluating its learning on the DBT validation set in inference mode. We found that the entire available mammography set provided the best generalization. The DBT validation set was then used to train only the last four fully connected layers, resulting in an AUC of 0.90+/-0.04 on the independent DBT test set.
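
    The two knobs the study varies can be sketched in PyTorch as follows; the backbone choice (AlexNet here), head size, and layer counts are illustrative assumptions, not the paper's network: stage 1 freezes a chosen number of early convolutional layers during mammography training, and stage 2 adapts to DBT by retraining only the fully connected layers.

```python
# Two-stage transfer learning sketch under the assumptions stated above.
import torch.nn as nn
from torchvision import models

def make_transfer_net(n_frozen_conv=2, n_classes=2):
    net = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)
    conv_layers = [m for m in net.features if isinstance(m, nn.Conv2d)]
    for conv in conv_layers[:n_frozen_conv]:        # stage 1: freeze early conv layers
        for p in conv.parameters():
            p.requires_grad = False
    net.classifier[-1] = nn.Linear(4096, n_classes) # new task head (mass vs. not)
    return net

def freeze_all_but_fc(net):
    # Stage 2: adapt to the new modality by training only the fc layers.
    for p in net.features.parameters():
        p.requires_grad = False
    for p in net.classifier.parameters():
        p.requires_grad = True
    return net

dcnn = make_transfer_net(n_frozen_conv=2)   # stage 1: mammography training
dcnn = freeze_all_but_fc(dcnn)              # stage 2: DBT fine-tuning
```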

  5. Modeling and control of operator functional state in a unified framework of fuzzy inference petri nets.

    PubMed

    Zhang, Jian-Hua; Xia, Jia-Jun; Garibaldi, Jonathan M; Groumpos, Petros P; Wang, Ru-Bin

    2017-06-01

    In human-machine (HM) hybrid control systems, the human operator and the machine cooperate to achieve the control objectives. To enhance overall HM system performance, the discrete manual control task-load of the operator must be dynamically allocated in accordance with the continuous-time fluctuation of the psychophysiological functional status of the operator, the so-called operator functional state (OFS). The behavior of the HM system is hybrid in nature due to the coexistence of the discrete task-load (control) variable and the continuous operator performance (system output) variable. Petri nets are an effective tool for modeling discrete event systems, but for hybrid systems involving continuous dynamics the Petri net model generally has to be extended. Instead of using different tools to represent the continuous and discrete components of a hybrid system, this paper proposes a fuzzy inference Petri net (FIPN) method to represent the HM hybrid system, comprising a Mamdani-type fuzzy model of OFS and a logical switching controller, in a unified framework in which the task-load level is dynamically reallocated between the operator and the machine based on the model-predicted OFS. Furthermore, the paper uses a multi-model approach to predict operator performance from three electroencephalographic (EEG) input variables (features) via the Wang-Mendel (WM) fuzzy modeling method. The membership function parameters of the fuzzy OFS model for each experimental participant were optimized using the artificial bee colony (ABC) evolutionary algorithm. Three performance indices, RMSE, MRE, and EPR, were computed to evaluate overall modeling accuracy. Experimental data from six participants were analyzed. The results show that the proposed method (FIPN with adaptive task allocation) yields a lower breakdown rate (from 14.8% to 3.27%) and higher human performance (from 90.30% to 91.99%). The simulation results of the FIPN-based adaptive HM (AHM) system on six experimental participants demonstrate that the FIPN framework provides an effective way to model and regulate/optimize the OFS in HM hybrid systems composed of a continuous-time OFS model and a discrete-event switching controller. Copyright © 2017 Elsevier B.V. All rights reserved.
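
    To make the Mamdani-type inference step concrete, here is a minimal fuzzy inference sketch in plain Python: a single EEG-derived feature is fuzzified with triangular membership functions, two rules fire by min/max composition, and the output is defuzzified by centroid. The membership parameters, the single feature, and the rules are invented for illustration; the paper learns its rule base via Wang-Mendel extraction and ABC optimization over three EEG features.

```python
# One Mamdani fuzzy inference step, under the assumptions stated above.
def tri(x, a, b, c):
    """Triangular membership function peaking at b on support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def predict_performance(eeg_ratio, n=201):
    # Rule 1: IF engagement ratio is LOW  THEN performance is POOR
    # Rule 2: IF engagement ratio is HIGH THEN performance is GOOD
    w_low  = tri(eeg_ratio, 0.0, 0.2, 0.6)   # rule firing strengths
    w_high = tri(eeg_ratio, 0.4, 0.8, 1.0)
    num = den = 0.0
    for i in range(n):                        # centroid over output domain [0, 1]
        y = i / (n - 1)
        mu = max(min(w_low,  tri(y, 0.0, 0.2, 0.5)),   # clipped POOR output set
                 min(w_high, tri(y, 0.5, 0.8, 1.0)))   # clipped GOOD output set
        num += y * mu
        den += mu
    return num / den if den else 0.5

print(predict_performance(0.5))   # both rules fire equally -> mid-range output
print(predict_performance(0.9))   # HIGH rule dominates -> high predicted performance
```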

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Ling; Post, Brian; Kunc, Vlastimil

    Additive manufacturing (AM), or 3D printing, is well known for producing arbitrarily shaped parts without any tooling, offering a promising alternative to the conventional injection molding method for fabricating near-net-shape magnets. In order to determine their applicability to the fabrication of Nd-Fe-B bonded magnets, we compare two 3D printing technologies, namely binder jetting and material extrusion. Prospects and challenges of these state-of-the-art technologies for large-scale industrial applications are discussed.

  7. Long-term field performance of a polyester-based long-lasting insecticidal mosquito net in rural Uganda

    PubMed Central

    Kilian, Albert; Byamukama, Wilson; Pigeon, Olivier; Atieli, Francis; Duchon, Stephan; Phan, Chi

    2008-01-01

    Background In order to evaluate whether the criteria for LLIN field performance (phase III) set by the WHO Pesticide Evaluation Scheme are met, first- and second-generation versions of one of these products, PermaNet®, a polyester net using coating technology, were tested. Methods A randomized, double-blinded study design was used, comparing LLINs to conventionally treated nets and following the LLINs for three years under regular household use in rural conditions. Primary outcome measures were deltamethrin residue and bioassay performance (60-minute knockdown and 24-hour mortality after a three-minute exposure) using a strain of Anopheles gambiae s.s. sensitive to pyrethroid insecticides. Results Baseline concentration of deltamethrin was within targets for all net types but was rapidly lost in conventionally treated nets and first-generation PermaNet®, with medians of 0.7 and 2.5 mg/m2 after six months, respectively. In contrast, second-generation PermaNet® retained insecticide well and had 41.5% of the baseline dose after 36 months (28.7 mg/m2). Similarly, vector mortality and knockdown dropped to 18% and 70%, respectively, for the first-generation LLIN after six months but remained high (88.5% and 97.8%, respectively) for second-generation PermaNet® after 36 months of follow-up, at which time 90.0% of nets had either a knockdown rate ≥ 95% or a mortality rate ≥ 80%. Conclusion Second-generation PermaNet® showed excellent results after three years of field use and fulfilled the WHOPES criteria for LLINs. Loss of insecticide on LLINs using coating technology under field conditions was influenced far more by factors associated with handling than by washing. PMID:18355408

  8. Computer-Mediated Communication in English for Specific Purposes: A Case Study with Computer Science Students at Universiti Teknologi Malaysia

    ERIC Educational Resources Information Center

    Shamsudin, Sarimah; Nesi, Hilary

    2006-01-01

    This paper will describe an ESP approach to the design and implementation of computer-mediated communication (CMC) tasks for computer science students at Universiti Teknologi Malaysia, and discuss the effectiveness of the chat feature of Windows NetMeeting as a tool for developing specified language skills. CMC tasks were set within a programme of…

  9. Nonword Repetition Priming in Lexical Decision Reverses as a Function of Study Task and Speed Stress

    ERIC Educational Resources Information Center

    Zeelenberg, Rene; Wagenmakers, Eric-Jan; Shiffrin, Richard M.

    2004-01-01

    The authors argue that nonword repetition priming in lexical decision is the net result of 2 opposing processes. First, repeating nonwords in the lexical decision task results in the storage of a memory trace containing the interpretation that the letter string is a nonword; retrieval of this trace leads to an increase in performance for repeated…

  10. ILEWG technology roadmap for Moon exploration

    NASA Astrophysics Data System (ADS)

    Foing, Bernard H.

    2008-04-01

    We discuss the charter and activities of the International Lunar Exploration Working Group (ILEWG) and give an update from the related ILEWG task groups. We discuss the rationale and technology roadmap for Moon exploration, as debated in previous ILEWG conferences. The technology rationale includes: 1) the advancement of instrumentation; 2) technologies in robotic and human exploration; 3) the potential for Moon-Mars exploration to inspire solutions for sustained global development on Earth. We finally discuss a possible roadmap for the development of technologies necessary for Moon and Mars exploration.

  11. Health and agricultural productivity: Evidence from Zambia.

    PubMed

    Fink, Günther; Masiye, Felix

    2015-07-01

    We evaluate the productivity effects of investment in preventive health technology through a randomized controlled trial in rural Zambia. In the experiment, access to subsidized bed nets was randomly assigned at the community level; 516 farmers were followed over a one-year farming period. We find large positive effects of preventive health investment on productivity: among farmers provided with access to free nets, harvest value increased by US$ 76, corresponding to about 14.7% of the average output value. While only limited information was collected on farming inputs, shifts in the extensive and intensive margins of labor supply appear to be the most likely mechanism underlying the observed productivity improvements. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Barriers and Facilitators to Online Portal Use Among Patients and Caregivers in a Safety Net Health Care System: A Qualitative Study.

    PubMed

    Tieu, Lina; Sarkar, Urmimala; Schillinger, Dean; Ralston, James D; Ratanawongsa, Neda; Pasick, Rena; Lyles, Courtney R

    2015-12-03

    Patient portals have the potential to support self-management for chronic diseases and improve health outcomes. With the rapid rise in adoption of patient portals spurred by meaningful use incentives among safety net health systems (a health system or hospital providing a significant level of care to low-income, uninsured, and vulnerable populations), it is important to understand the readiness and willingness of patients and caregivers in safety net settings to access their personal health records online. We aimed to explore patient and caregiver perspectives on online patient portal use before its implementation at San Francisco General Hospital, a safety net hospital. We conducted 16 in-depth interviews with chronic disease patients and caregivers who expressed interest in using the Internet to manage their health. Discussions focused on health care experiences, technology use, and interest in using an online portal to manage health tasks. We used open coding to categorize all the barriers and facilitators to portal use, followed by a second round of coding that compared the categories to previously published findings. In secondary analyses, we also examined specific barriers among 2 subgroups: those with limited health literacy and caregivers. We interviewed 11 patients and 5 caregivers. Patients were predominantly male (82%, 9/11) and African American (45%, 5/11). All patients had been diagnosed with diabetes and the majority had limited health literacy (73%, 8/11). The majority of caregivers were female (80%, 4/5), African American (60%, 3/5), caregivers of individuals with diabetes (60%, 3/5), and had adequate health literacy (60%, 3/5). A total of 88% (14/16) of participants reported interest in using the portal after viewing a prototype. Major perceived barriers included security concerns, lack of technical skills/interest, and preference for in-person communication. Facilitators to portal use included convenience, health monitoring, and improvements in patient-provider communication. Participants with limited health literacy discussed more fundamental barriers to portal use, including challenges with reading and typing, personal experience with online security breaches/viruses, and distrust of potential security measures. Caregivers expressed high interest in portal use to support their roles in interpreting health information, advocating for quality care, and managing health behaviors and medical care. Despite concerns about security, difficulty understanding medical information, and satisfaction with current communication processes, respondents generally expressed enthusiasm about portal use. Our findings suggest a strong need for training and support to assist vulnerable patients with portal registration and use, particularly those with limited health literacy. Efforts to encourage portal use among vulnerable patients should directly address health literacy and security/privacy issues and support access for caregivers.

  13. Barriers and Facilitators to Online Portal Use Among Patients and Caregivers in a Safety Net Health Care System: A Qualitative Study

    PubMed Central

    Sarkar, Urmimala; Schillinger, Dean; Ralston, James D; Ratanawongsa, Neda; Pasick, Rena; Lyles, Courtney R

    2015-01-01

    Background Patient portals have the potential to support self-management for chronic diseases and improve health outcomes. With the rapid rise in adoption of patient portals spurred by meaningful use incentives among safety net health systems (a health system or hospital providing a significant level of care to low-income, uninsured, and vulnerable populations), it is important to understand the readiness and willingness of patients and caregivers in safety net settings to access their personal health records online. Objective To explore patient and caregiver perspectives on online patient portal use before its implementation at San Francisco General Hospital, a safety net hospital. Methods We conducted 16 in-depth interviews with chronic disease patients and caregivers who expressed interest in using the Internet to manage their health. Discussions focused on health care experiences, technology use, and interest in using an online portal to manage health tasks. We used open coding to categorize all the barriers and facilitators to portal use, followed by a second round of coding that compared the categories to previously published findings. In secondary analyses, we also examined specific barriers among 2 subgroups: those with limited health literacy and caregivers. Results We interviewed 11 patients and 5 caregivers. Patients were predominantly male (82%, 9/11) and African American (45%, 5/11). All patients had been diagnosed with diabetes and the majority had limited health literacy (73%, 8/11). The majority of caregivers were female (80%, 4/5), African American (60%, 3/5), caregivers of individuals with diabetes (60%, 3/5), and had adequate health literacy (60%, 3/5). A total of 88% (14/16) of participants reported interest in using the portal after viewing a prototype. Major perceived barriers included security concerns, lack of technical skills/interest, and preference for in-person communication. Facilitators to portal use included convenience, health monitoring, and improvements in patient-provider communication. Participants with limited health literacy discussed more fundamental barriers to portal use, including challenges with reading and typing, personal experience with online security breaches/viruses, and distrust of potential security measures. Caregivers expressed high interest in portal use to support their roles in interpreting health information, advocating for quality care, and managing health behaviors and medical care. Conclusions Despite concerns about security, difficulty understanding medical information, and satisfaction with current communication processes, respondents generally expressed enthusiasm about portal use. Our findings suggest a strong need for training and support to assist vulnerable patients with portal registration and use, particularly those with limited health literacy. Efforts to encourage portal use among vulnerable patients should directly address health literacy and security/privacy issues and support access for caregivers. PMID:26681155

  14. Absent without leave; a neuroenergetic theory of mind wandering

    PubMed Central

    Killeen, Peter R.

    2013-01-01

    Absent-minded people are not under the control of task-relevant stimuli. According to the Neuroenergetics Theory of attention (NeT), this lack of control is often due to fatigue of the relevant processing units in the brain, caused by insufficient resupply of the neurons' preferred fuel, lactate, from nearby astrocytes. A simple drift model of information processing accounts for response-time statistics in a paradigm often used to study inattention, the Sustained Attention to Response Task (SART). It is suggested that errors and slowing in this fast-paced, response-engaging task may have little to do with inattention. Slower-paced and less response-demanding tasks give greater license for inattention, also known as absent-mindedness or mind-wandering. The basic NeT is therefore extended with an ancillary model of attentional drift and recapture. This Markov model, called NEMA, assumes a probability λ of a lapse of attention from one second to the next, and a probability α of drifting back to the attentive state. These parameters measure the strength of attraction back to the task (α) or away to competing mental states or action patterns (λ); their proportion determines the probability of the individual being inattentive at any point in time over the long run. Their values are affected by the fatigue of the brain units they traffic between. The deployment of the model is demonstrated with a data set involving paced responding. PMID:23847559
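
    The NEMA chain is easy to make concrete: with a per-second lapse probability λ and recapture probability α, the long-run probability of being inattentive is λ/(λ+α). The short simulation below (the parameter values are illustrative, not fitted) checks that closed form.

```python
# Two-state attention/inattention Markov chain; parameters are illustrative.
import random

def simulate(lam, alpha, seconds=100_000, seed=1):
    rng, attentive, off_task = random.Random(seed), True, 0
    for _ in range(seconds):
        if attentive:
            attentive = rng.random() >= lam      # lapse with probability lam
        else:
            off_task += 1                        # count this inattentive second
            attentive = rng.random() < alpha     # recapture with probability alpha
    return off_task / seconds

lam, alpha = 0.02, 0.10
print(simulate(lam, alpha))        # ~0.167 empirically
print(lam / (lam + alpha))         # 0.1667 stationary probability of inattention
```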

  15. NovaNET 2008-09 Evaluation. Impact Evaluation. E&R Report No. 09.36

    ERIC Educational Resources Information Center

    Bulgakov-Cook, Dina

    2010-01-01

    NovaNET is a technology-based teacher-facilitated educational approach used at schools to support students at risk of not meeting graduation requirements to accrue credits in a variety of subjects. NovaNET contributes to the WCPSS goal of closing achievement gaps and creating opportunities for all students to graduate on time. In 2008-09, 38…

  16. DeepID-Net: Deformable Deep Convolutional Neural Networks for Object Detection.

    PubMed

    Ouyang, Wanli; Zeng, Xingyu; Wang, Xiaogang; Qiu, Shi; Luo, Ping; Tian, Yonglong; Li, Hongsheng; Yang, Shuo; Wang, Zhe; Li, Hongyang; Loy, Chen Change; Wang, Kun; Yan, Junjie; Tang, Xiaoou

    2016-07-07

    In this paper, we propose deformable deep convolutional neural networks for generic object detection. This new deep learning object detection framework is innovative in multiple aspects. In the proposed deep architecture, a new deformation-constrained pooling (def-pooling) layer models the deformation of object parts with a geometric constraint and penalty. A new pre-training strategy is proposed to learn feature representations that are more suitable for the object detection task and have good generalization capability. By changing the net structures and training strategies and by adding and removing key components in the detection pipeline, a set of models with large diversity is obtained, which significantly improves the effectiveness of model averaging. The proposed approach improves the mean average precision obtained by RCNN [16], which was the state of the art, from 31% to 50.3% on the ILSVRC2014 detection test set. It also outperforms the winner of ILSVRC2014, GoogLeNet, by 6.1%. A detailed component-wise analysis is also provided through extensive experimental evaluation, giving a global view for understanding the deep learning object detection pipeline.
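
    The core idea of def-pooling can be illustrated in a few lines of NumPy: a part's response map is penalized by a quadratic cost for drifting from its anchor position, and pooling keeps the best penalized response. This is a schematic of the general deformation-penalty mechanism, not the paper's exact layer; the penalty weight and map sizes are invented.

```python
# Deformation-penalized pooling over a part response map (schematic).
import numpy as np

def def_pool(resp, anchor, w=0.05):
    """resp: (H, W) part response map; anchor: expected (row, col) of the part."""
    H, W = resp.shape
    rows, cols = np.mgrid[0:H, 0:W]
    penalty = w * ((rows - anchor[0]) ** 2 + (cols - anchor[1]) ** 2)
    scores = resp - penalty              # geometric constraint as a quadratic penalty
    idx = np.unravel_index(np.argmax(scores), scores.shape)
    return scores[idx], idx              # best penalized score and part location

rng = np.random.default_rng(0)
resp = rng.normal(size=(16, 16))
resp[4, 5] += 5.0                        # strong response one cell from the anchor
score, pos = def_pool(resp, anchor=(4, 4))
print(score, pos)                        # picks (4, 5): small deformation cost paid
```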

  17. Can the real opportunity cost stand up: displaced services, the straw man outside the room.

    PubMed

    Eckermann, Simon; Pekarsky, Brita

    2014-04-01

    In the current literature, displaced services have been suggested to provide a basis for determining a threshold value for the effects of a new technology as part of a reimbursement process when budgets are fixed. We critically examine the conditions under which displaced services would represent an economically meaningful threshold value. We first show that if we assume that the least cost-effective services are displaced to finance a new technology, then the incremental cost-effectiveness ratio (ICER) of the displaced services (d) only coincides with the ICER related to the opportunity cost of adopting that new technology, the ICER of the most cost-effective service in expansion (n), under highly restrictive conditions, namely complete allocative efficiency in the existing provision of health care interventions. More generally, reimbursement of a new technology with a fixed budget comprises two actions, adoption and financing through displacement, and the effect of reimbursement is the net effect of these two actions. For the reimbursement process to be a pathway to allocative efficiency within a fixed budget, the net effect of the reimbursement strategy must be compared with the most cost-effective alternative strategy for reimbursement: optimal reallocation, the health-gain-maximizing expansion of existing services financed by the health-loss-minimizing contraction. The shadow price of the health effects of a new technology, βc = (1/n + 1/d - 1/m)^(-1), accounts for both imperfect displacement (the ICER of the displaced service, d < m, the ICER of the least cost-effective of the existing services in contraction) and the allocative inefficiency (n < m) characteristic of health systems.
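
    A small worked example of the shadow price formula, with invented ICER values (cost per unit of health effect):

```python
# Shadow price of health effects under imperfect displacement; values invented.
n, d, m = 20_000, 40_000, 60_000   # expansion, displaced, and contraction ICERs
beta_c = 1 / (1/n + 1/d - 1/m)
print(round(beta_c))               # ~17143: below n because displacement is imperfect (d < m)

# Sanity check: with perfect displacement (d == m) the formula reduces to n.
print(round(1 / (1/n + 1/m - 1/m)))  # 20000
```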

  18. AdaNET research project

    NASA Technical Reports Server (NTRS)

    Digman, R. Michael

    1988-01-01

    The components necessary for the success of the commercialization of an Ada Technology Transition Network are reported in detail. The organizational plan presents the planned structure for services development and technical transition of AdaNET services to potential user communities. The Business Plan is the operational plan for the AdaNET service as a commercial venture. The Technical Plan is the plan from which the AdaNET can be designed including detailed requirements analysis. Also contained is an analysis of user fees and charges, and a proposed user fee schedule.

  19. Task analysis of information technology-mediated medication management in outpatient care.

    PubMed

    van Stiphout, F; Zwart-van Rijkom, J E F; Maggio, L A; Aarts, J E C M; Bates, D W; van Gelder, T; Jansen, P A F; Schraagen, J M C; Egberts, A C G; ter Braak, E W M T

    2015-09-01

    Educating physicians in the procedural as well as cognitive skills of information technology (IT)-mediated medication management could be one of the missing links for the improvement of patient safety. We aimed to compose a framework of tasks that need to be addressed to optimize medication management in outpatient care. Formal task analysis: decomposition of a complex task into a set of subtasks. First, we obtained a general description of the medication management process from exploratory interviews. Secondly, we interviewed experts in-depth to further define tasks and subtasks. Outpatient care in different fields of medicine in six teaching and academic medical centres in the Netherlands and the United States. 20 experts. Tasks were divided up into procedural, cognitive and macrocognitive tasks and categorized into the three components of dynamic decision making. The medication management process consists of three components: (i) reviewing the medication situation; (ii) composing a treatment plan; and (iii) accomplishing and communicating a treatment and surveillance plan. Subtasks include multiple cognitive tasks such as composing a list of current medications and evaluating the reliability of sources, and procedural tasks such as documenting current medication. The identified macrocognitive tasks were: planning, integration of IT in workflow, managing uncertainties and responsibilities, and problem detection. All identified procedural, cognitive and macrocognitive skills should be included when designing education for IT-mediated medication management. The resulting framework supports the design of educational interventions to improve IT-mediated medication management in outpatient care. © 2015 The Authors. British Journal of Clinical Pharmacology published by John Wiley & Sons Ltd on behalf of The British Pharmacological Society.

  20. A Feasibility Study for Perioperative Ventricular Tachycardia Prognosis and Detection and Noise Detection Using a Neural Network and Predictive Linear Operators

    NASA Technical Reports Server (NTRS)

    Moebes, T. A.

    1994-01-01

    To locate the accessory pathway(s) in preexcitation syndromes, epicardial and endocardial ventricular mapping is performed during anterograde ventricular activation via the accessory pathway(s), from data originally received in signal form. As the number of channels increases, more automated detection of coherent/incoherent signals is needed, as well as the prediction and prognosis of ventricular tachycardia (VT). Today's computers and computer program algorithms are not good at simple perceptual tasks such as recognizing a pattern or identifying a sound. This discrepancy, among other things, has been a major motivating factor in developing brain-based, massively parallel computing architectures. Neural net paradigms have proven to be effective at pattern recognition tasks. In signal processing, picking coherent/incoherent signals represents a pattern recognition task for computer systems, as does picking signals that represent the onset of VT. We attacked this problem by defining four signal attributes for each potential first maximal arrival peak, and one signal attribute over the entire signal, as input to a backpropagation neural network. One attribute was the predicted amplitude value after the maximum amplitude over a data window. Then, using a set of known (user-selected) coherent/incoherent signals and signals representing the onset of VT, we trained the backpropagation network to recognize coherent/incoherent signals and signals indicating the onset of VT. Since our output scheme involves a true/false decision, and since the output unit computes values between 0 and 1, we used a fuzzy arithmetic approach to classify data as coherent/incoherent signals. Furthermore, a mean-square error analysis was used to determine system stability. The neural-net-based signal picking system achieved high accuracy in picking coherent/incoherent signals across different patients. The system also achieved high accuracy in picking signals that represent the onset of VT, that is, signals immediately followed by VT. A special binary representation of the input and output data allowed the neural network to train very rapidly compared with standard decimal or normalized representations of the data.
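
    A modern analogue of this setup is easy to sketch with scikit-learn; the five attributes, synthetic labels, and decision thresholds below are placeholders, not the paper's data. A small backpropagation network outputs a value in [0, 1], which a fuzzy-style band maps to coherent, incoherent, or uncertain.

```python
# Small MLP on per-peak signal attributes with a graded decision rule (sketch).
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(7)
X = rng.normal(size=(400, 5))                  # 5 signal attributes per peak (synthetic)
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)  # synthetic coherence label

net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
net.fit(X[:300], y[:300])                      # backpropagation training

p = net.predict_proba(X[300:])[:, 1]           # graded membership in [0, 1]
decision = np.where(p > 0.8, "coherent",
            np.where(p < 0.2, "incoherent", "uncertain"))
print(net.score(X[300:], y[300:]), decision[:8])
```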

  1. AdaNET research plan

    NASA Technical Reports Server (NTRS)

    Mcbride, John G.

    1990-01-01

    The mission of the AdaNET research effort is to determine how to increase the availability of reusable Ada components and associated software engineering technology to both private and Federal sectors. The effort is structured to define the requirements for transfer of Federally developed software technology, study feasible approaches to meeting the requirements, and to gain experience in applying various technologies and practices. The overall approach to the development of the AdaNET System Specification is presented. A work breakdown structure is presented with each research activity described in detail. The deliverables for each work area are summarized. The overall organization and responsibilities for each research area are described. The schedule and necessary resources are presented for each research activity. The estimated cost is summarized for each activity. The project plan is fully described in the Super Project Expert data file contained on the floppy disk attached to the back cover of this plan.

  2. Understanding the Knowledge Gap Experienced by U.S. Safety Net Patients in Teleretinal Screening.

    PubMed

    George, Sheba M; Hayes, Erin Moran; Fish, Allison; Daskivich, Lauren Patty; Ogunyemi, Omolola I

    2016-01-01

    Safety-net patients' socioeconomic barriers interact with limited digital and health literacies to produce a "knowledge gap" that impacts the delivery of healthcare via telehealth technologies. Six focus groups (2 African-American and 4 Latino) were conducted with patients who received teleretinal screening in a U.S. urban safety-net setting. Focus groups were analyzed using a modified grounded theory methodology. Findings indicate that patients' knowledge gap is primarily produced at three points during the delivery of care: (1) exacerbation of patients' pre-existing personal barriers in the clinical setting; (2) encounters with technology during screening; and (3) lack of follow-up after the visit. This knowledge gap produces confusion, potentially limiting patients' perceptions of care and their ability to manage their own care. It may be ameliorated through delivery of patient education focused on both disease pathology and the specific role of telehealth technologies in disease management.

  3. GATECloud.net: a platform for large-scale, open-source text processing on the cloud.

    PubMed

    Tablan, Valentin; Roberts, Ian; Cunningham, Hamish; Bontcheva, Kalina

    2013-01-28

    Cloud computing is increasingly being regarded as a key enabler of the 'democratization of science', because on-demand, highly scalable cloud computing facilities enable researchers anywhere to carry out data-intensive experiments. In the context of natural language processing (NLP), algorithms tend to be complex, which makes their parallelization and deployment on cloud platforms a non-trivial task. This study presents a new, unique, cloud-based platform for large-scale NLP research--GATECloud.net. It enables researchers to carry out data-intensive NLP experiments by harnessing the vast, on-demand compute power of the Amazon cloud. Important infrastructural issues are dealt with by the platform, completely transparently for the researcher: load balancing, efficient data upload and storage, deployment on the virtual machines, security and fault tolerance. We also include a cost-benefit analysis and usage evaluation.

  4. NaNet: a configurable NIC bridging the gap between HPC and real-time HEP GPU computing

    NASA Astrophysics Data System (ADS)

    Lonardo, A.; Ameli, F.; Ammendola, R.; Biagioni, A.; Cotta Ramusino, A.; Fiorini, M.; Frezza, O.; Lamanna, G.; Lo Cicero, F.; Martinelli, M.; Neri, I.; Paolucci, P. S.; Pastorelli, E.; Pontisso, L.; Rossetti, D.; Simeone, F.; Simula, F.; Sozzi, M.; Tosoratto, L.; Vicini, P.

    2015-04-01

    NaNet is an FPGA-based PCIe Network Interface Card (NIC) design with GPUDirect and Remote Direct Memory Access (RDMA) capabilities featuring a configurable and extensible set of network channels. The design currently supports standard GbE (1000BASE-T) and 10GbE (10GBASE-R) channels as well as custom 34 Gbps APElink and 2.5 Gbps deterministic-latency KM3link channels, but its modularity allows for straightforward inclusion of other link technologies. The GPUDirect feature combined with a transport layer offload module and a data stream processing stage makes NaNet a low-latency NIC suitable for real-time GPU processing. In this paper we describe the NaNet architecture and its performance, exhibiting two of its use cases: the GPU-based low-level trigger for the RICH detector in the NA62 experiment at CERN and the on-/off-shore data transport system for the KM3NeT-IT underwater neutrino telescope.

  5. ECASTAR: Energy Conservation; an Assessment of Systems, Technologies and Requirements

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A methodology is presented for a systems-approach display and assessment of the potential for energy conservation actions and the impacts of those actions. The U.S. economy is divided into four sectors: energy industry, industry, residential/commercial and transportation. Each sector is assessed with respect to energy conservation actions and impacts. The four sectors are combined and three strategies for energy conservation actions for the combined sectors are assessed. The three strategies (national energy conservation, electrification and diversification) represent energy conservation actions for the near term (now to 1985), the mid term (1985 to 2000) and the far term (2000 and beyond). The assessment procedure includes input/output analysis to bridge the flows between the sectors, and net economics and net energetics as performance criteria for the conservation actions. Targets of opportunity for large net energy savings and the application of technology to achieve these savings are discussed.

  6. Impact of Market Behavior, Fleet Composition, and Ancillary Services on Revenue Sufficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frew, Bethany; Gallo, Giulia; Brinkman, Gregory

    Revenue insufficiency, or the missing money problem, occurs when the revenues that generators earn from the market are not sufficient to cover both fixed and variable costs to remain in the market and/or justify investments in new capacity, which may be needed for reliability. The near-zero marginal cost of variable renewable generators further exacerbates these revenue challenges. Estimating the extent of the missing money problem in current electricity markets is an important, nontrivial task that requires representing both how the power system operates and how market participants behave. This paper explores the missing money problem using a production cost model that represented a simplified version of the Electric Reliability Council of Texas (ERCOT) energy-only market for the years 2012-2014. We evaluate how various market structures -- including market behavior, ancillary services, and changing fleet compositions -- affect net revenues in this ERCOT-like system. In most production cost modeling exercises, resources are assumed to offer their marginal capabilities at marginal costs. Although this assumption is reasonable for feasibility studies and long-term planning, it does not adequately consider the market behaviors that impact revenue sufficiency. In this work, we simulate a limited set of market participant strategic bidding behaviors by means of different sets of markups; these markups are applied to the true production costs of all gas generators, which are the most prominent generators in ERCOT. Results show that markups can help generators increase their net revenues overall, although net revenues may increase or decrease depending on the technology and the year under study. Results also confirm that conventional, variable-cost-based production cost simulations do not capture prices accurately, and this particular feature calls for proxies for strategic behaviors (e.g., markups) and more accurate representations of how electricity markets work. The analysis also shows that generators face revenue sufficiency challenges in this ERCOT-like energy-only market model; net revenues provided by the market in all base markup cases and sensitivity scenarios (except when a large fraction of the existing coal fleet is retired) are not sufficient to justify investments in new capacity for thermal and nuclear power units. Overall, the work described in this paper points to the need for improved behavioral models of electricity markets to more accurately study current and potential market design issues that could arise in systems with high penetrations of renewable generation.
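
    As a toy illustration of the markup mechanism described above (this is not the authors' production cost model; the dispatch rule, prices, and costs below are hypothetical), a generator that offers its marginal cost plus a markup is dispatched only in hours when the price clears its offer, while its margin is still earned relative to its true cost:

      # Illustrative sketch: apply a bid markup to a generator's marginal cost
      # and compute its net revenue over an hourly price series.
      def net_revenue(prices, marginal_cost, capacity_mw, markup=0.0, fixed_costs=0.0):
          """Dispatch whenever the price covers the marked-up offer; earn
          (price - true marginal cost) on each dispatched MWh."""
          offer = marginal_cost * (1.0 + markup)  # strategic bid above true cost
          energy_margin = sum((p - marginal_cost) * capacity_mw
                              for p in prices if p >= offer)
          return energy_margin - fixed_costs      # negative => "missing money"

      hourly_prices = [22.0, 31.5, 48.0, 120.0, 27.0]  # $/MWh, made-up data
      print(net_revenue(hourly_prices, marginal_cost=30.0, capacity_mw=100.0,
                        markup=0.10, fixed_costs=5000.0))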

  7. Net Zero Energy Military Installations: A Guide to Assessment and Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Booth, S.; Barnett, J.; Burman, K.

    2010-08-01

    The U.S. Department of Defense (DoD) recognizes the strategic importance of energy to its mission, and is working to reduce energy consumption and enhance energy self-sufficiency by drawing on local clean energy sources. A joint initiative formed between DoD and the U.S. Department of Energy (DOE) in 2008 to address military energy use led to a task force to examine the potential for net zero energy military installations, which would produce as much energy on site as they consume in buildings, facilities, and fleet vehicles. This report presents an assessment and planning process to examine military installations for net zero energy potential. Net Zero Energy Installation Assessment (NZEIA) presents a systematic framework to analyze energy projects at installations while balancing other site priorities such as mission, cost, and security.

  8. The Future of Low-Carbon Electricity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greenblatt, Jeffery B.; Brown, Nicholas R.; Slaybaugh, Rachel

    We review future global demand for electricity and major technologies positioned to supply it with minimal greenhouse gas (GHG) emissions: renewables (wind, solar, water, geothermal, and biomass), nuclear fission, and fossil power with CO2 capture and sequestration. We discuss two breakthrough technologies (space solar power and nuclear fusion) as exciting but uncertain additional options for low-net GHG emissions (i.e., low-carbon) electricity generation. In addition, we discuss grid integration technologies (monitoring and forecasting of transmission and distribution systems, demand-side load management, energy storage, and load balancing with low-carbon fuel substitutes). For each topic, recent historical trends and future prospects are reviewed, along with technical challenges, costs, and other issues as appropriate. Although no technology represents an ideal solution, their strengths can be enhanced by deployment in combination, along with grid integration that forms a critical set of enabling technologies to assure a reliable and robust future low-carbon electricity system.

  9. Exploring Cognitive Flexibility With a Noninvasive BCI Using Simultaneous Steady-State Visual Evoked Potentials and Sensorimotor Rhythms.

    PubMed

    Edelman, Bradley J; Meng, Jianjun; Gulachek, Nicholas; Cline, Christopher C; He, Bin

    2018-05-01

    EEG-based brain-computer interface (BCI) technology creates non-biological pathways for conveying a user's mental intent solely through noninvasively measured neural signals. While optimizing the performance of a single task has long been the focus of BCI research, in order to translate this technology into everyday life, realistic situations, in which multiple tasks are performed simultaneously, must be investigated. In this paper, we explore the concept of cognitive flexibility, or multitasking, within the BCI framework by utilizing a 2-D cursor control task, using sensorimotor rhythms (SMRs), and a four-target visual attention task, using steady-state visual evoked potentials (SSVEPs), both individually and simultaneously. We found no significant difference between the accuracy of the tasks when executing them alone (SMR: 57.9% ± 15.4%; SSVEP: 59.0% ± 14.2%) and simultaneously (SMR: 54.9% ± 17.2%; SSVEP: 57.5% ± 15.4%). These modest decreases in performance were supported by similar, non-significant changes in the electrophysiology of the SSVEP and SMR signals. In this sense, we report that multiple BCI tasks can be performed simultaneously without a significant deterioration in performance; this finding will help drive these systems toward realistic daily use in which a user's cognition will need to be involved in multiple tasks at once.

  10. Evaluating the Life Cycle Environmental Benefits and Trade-Offs of Water Reuse Systems for Net-Zero Buildings.

    PubMed

    Hasik, Vaclav; Anderson, Naomi E; Collinge, William O; Thiel, Cassandra L; Khanna, Vikas; Wirick, Jason; Piacentini, Richard; Landis, Amy E; Bilec, Melissa M

    2017-02-07

    Aging water infrastructure and increased water scarcity have resulted in higher interest in water reuse and decentralization. Rating systems for high-performance buildings implicitly promote the use of building-scale, decentralized water supply and treatment technologies. It is important to recognize the potential benefits and trade-offs of decentralized and centralized water systems in the context of high-performance buildings. For this reason and to fill a gap in the current literature, we completed a life cycle assessment (LCA) of the decentralized water system of a high-performance, net-zero energy, net-zero water building (NZB) that received multiple green building certifications and compared the results with two modeled buildings (conventional and water efficient) using centralized water systems. We investigated the NZB's impacts over varying lifetimes, conducted a break-even analysis, and included Monte Carlo uncertainty analysis. The results show that, although the NZB performs better in most categories than the conventional building, the water efficient building generally outperforms the NZB. The lifetime of the NZB, septic tank aeration, and use of solar energy have been found to be important factors in the NZB's impacts. While these findings are specific to the case study building, location, and treatment technologies, the framework for comparison of water and wastewater impacts of various buildings can be applied during building design to aid decision making. As we design and operate high-performance buildings, the potential trade-offs of advanced decentralized water treatment systems should be considered.

  11. A repellent net as a new technology to protect cabbage crops.

    PubMed

    Martin, T; Palix, R; Kamal, A; Delétré, E; Bonafos, R; Simon, S; Ngouajio, M

    2013-08-01

    Floating row covers or insect-proof nets with fine mesh are effective at protecting vegetable crops against aphids but negatively impact plant health, especially under warm conditions. Furthermore, in control of cabbage insect pests, aphid parasitoids cannot enter the fine-mesh nets, leading to frequent aphid outbreaks. To surmount these difficulties, a 40-mesh-size repellent net treated with alphacypermethrin was studied in laboratory and field tests. Results showed both irritant and repellent effects of the alphacypermethrin-treated net on Myzus persicae (Sulzer) (Hemiptera: Aphididae) and its parasitoid Aphidius colemani (Haliday) (Hymenoptera: Braconidae). Under field conditions, there were no pests on cabbage protected with the repellent net. The repellent net thus combined a visual barrier with a repellent barrier against aphids. Because of this additive effect, repellent nets allowed cabbage to be covered permanently with adequate protection against all pests.

  12. Task conflict and team creativity: a question of how much and when.

    PubMed

    Farh, Jiing-Lih; Lee, Cynthia; Farh, Crystal I C

    2010-11-01

    Bridging the task conflict, team creativity, and project team development literatures, we present a contingency model in which the relationship between task conflict and team creativity depends on the level of conflict and when it occurs in the life cycle of a project team. In a study of 71 information technology project teams in the greater China region, we found that task conflict had a curvilinear effect on team creativity, such that creativity was highest at moderate levels of task conflict. Additionally, we found this relationship to be moderated by team phase, such that the curvilinear effect was strongest at an early phase. In contrast, at later phases of the team life cycle, task conflict was found to be unrelated to team creativity.
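
    A curvilinear (inverted-U) effect of this kind is typically estimated by adding a squared predictor to a regression model; the minimal sketch below uses simulated data and omits the team-phase moderator, so it illustrates only the shape of the test, not the study's actual model:

      # Quadratic regression as a test for an inverted-U relationship
      # (simulated data; a negative coefficient on conflict**2 indicates
      # that creativity peaks at moderate conflict).
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      conflict = rng.uniform(1, 7, 71)  # 71 teams, as in the study
      creativity = 2 + 1.5 * conflict - 0.2 * conflict**2 + rng.normal(0, 0.5, 71)

      X = sm.add_constant(np.column_stack([conflict, conflict**2]))
      print(sm.OLS(creativity, X).fit().params)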

  13. Assessing the impacts of changes in treatment technology on energy and greenhouse gas balances for organic waste and wastewater treatment using historical data.

    PubMed

    Poulsen, Tjalfe G; Hansen, Jens Aage

    2009-11-01

    Historical data on organic waste and wastewater treatment during the period 1970-2020 were used to assess the impact of treatment on energy and greenhouse gas (GHG) balances. The assessment included the waste fractions: sewage sludge, food waste, yard waste and other organic waste (paper, plastic, etc.). Data were collected from Aalborg, a municipality located in Northern Denmark. During the period 1970-2005, Aalborg Municipality changed its waste treatment strategy from landfilling of all wastes toward composting of yard waste and incineration with combined heat and power production from the remaining organic municipal waste. Wastewater treatment has changed from direct discharge of untreated wastewater to full organic matter and nutrient (N, P) removal combined with anaerobic digestion of the sludge for biogas production with power and heat generation. These changes in treatment technology have resulted in the waste and wastewater treatment systems in Aalborg progressing from being net consumers of energy and net emitters of GHG to becoming net producers of energy and net savers of GHG emissions (due to substitution of fossil fuels elsewhere). If it is assumed that the organic waste quantity and composition are the same in 1970 and 2005, the technology change over this time period has resulted in a progression from a net annual GHG emission of 200 kg CO(2)-eq. capita(-1) in 1970 to a net saving of 170 kg CO(2)-eq. capita(-1) in 2005 for management of urban organic wastes.

  14. Engineering Technology Of Fish Farming Floating Nets Cages On Polka Dot Grouper (Cromileptes Altivelis) Used Artificial Feed Enriched Phytase Enzyme

    NASA Astrophysics Data System (ADS)

    Samidjan, Istiyanto; Rachmawati, Diana

    2018-02-01

    One solution is to apply floating net cage culture technology to the polka dot grouper (Cromileptes altivelis, also called ducker grouper) fed artificial feed enriched with phytase enzyme. The objective of this study was to examine the use of floating net cage engineering for polka dot grouper given artificial feed enriched with different doses of phytase enzyme to accelerate growth and survival. Polka dot grouper of 15.5 ± 0.5 cm were stocked in net cage units (1 m x 1 m x 1 m) at 250 fish per cage, using 12 cages. Each net cage was made of polyethylene netting with a mesh size of 12.5 mm. A completely randomized design (CRD) with 4 treatments and 3 replications was used, with artificial feed enriched with phytase enzyme at doses of A (0 FTU·kg-1 diet), B (200 FTU·kg-1 diet), C (500 FTU·kg-1 diet), and D (800 FTU·kg-1 diet). Feed was given twice a day, in the morning and afternoon, at 5% of biomass per day. Data on absolute weight growth, feed conversion ratio (FCR), and survival rate of polka dot grouper were analyzed by analysis of variance and Tukey's test. The results showed that artificial feed enriched with phytase enzyme had a significant effect (P < 0.05) on growth, FCR, and survival rate of polka dot grouper. The best treatment, C (500 FTU·kg-1 diet), yielded absolute weight growth of 128.75 g, an FCR of 1.75, and a survival rate of 93.5%.

  15. Initial Usability and Feasibility Evaluation of a Personal Health Record-Based Self-Management System for Older Adults.

    PubMed

    Sheehan, Barbara; Lucero, Robert J

    2015-01-01

    Electronic personal health record-based (ePHR-based) self-management systems can improve patient engagement and have an impact on health outcomes. In order to realize the benefits of these systems, there is a need to develop and evaluate health information technology from the same theoretical underpinnings. Using an innovative usability approach based in human-centered distributed information design (HCDID), we tested an ePHR-based falls-prevention self-management system, Self-Assessment via a Personal Health Record (SAPHeR), designed using HCDID principles, in a laboratory. We later evaluated SAPHeR's use by community-dwelling older adults at home. The innovative approach used in this study supported the analysis of four components: tasks, users, representations, and functions. Tasks were easily learned, and features such as text-associated images facilitated task completion. Task performance times were slow; however, user satisfaction was high. Nearly seven out of every ten features desired by design participants were evaluated in our usability testing of the SAPHeR system. The in vivo evaluation suggests that older adults could improve their confidence in performing indoor and outdoor activities after using the SAPHeR system. We have applied an innovative consumer-usability evaluation. Our approach addresses the limitations of other usability testing methods that do not utilize consistent, theoretically based methods for designing and testing technology. We have successfully demonstrated the utility of testing consumer technology use across multiple components (i.e., task, user, representational, functional) to evaluate the usefulness, usability, and satisfaction of an ePHR-based self-management system.

  16. An advanced environment for hybrid modeling of biological systems based on modelica.

    PubMed

    Pross, Sabrina; Bachmann, Bernhard

    2011-01-20

    Biological systems are often very complex so that an appropriate formalism is needed for modeling their behavior. Hybrid Petri Nets, consisting of time-discrete Petri Net elements as well as continuous ones, have proven to be ideal for this task. Therefore, a new Petri Net library was implemented based on the object-oriented modeling language Modelica which allows the modeling of discrete, stochastic and continuous Petri Net elements by differential, algebraic and discrete equations. An appropriate Modelica-tool performs the hybrid simulation with discrete events and the solution of continuous differential equations. A special sub-library contains so-called wrappers for specific reactions to simplify the modeling process. The Modelica-models can be connected to Simulink-models for parameter optimization, sensitivity analysis and stochastic simulation in Matlab. The present paper illustrates the implementation of the Petri Net component models, their usage within the modeling process and the coupling between the Modelica-tool Dymola and Matlab/Simulink. The application is demonstrated by modeling the metabolism of Chinese Hamster Ovary Cells.
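
    To make the token-game semantics concrete, the sketch below simulates the purely discrete fragment of a Petri net in Python (the library described in the paper is written in Modelica and also covers stochastic and continuous places; this standalone toy is only an illustration):

      # Minimal discrete Petri net: fire the first enabled transition.
      def fire_enabled(marking, transitions):
          """marking: dict place -> tokens; transitions: list of
          (inputs, outputs) pairs, each a dict place -> arc weight."""
          for inputs, outputs in transitions:
              if all(marking.get(p, 0) >= w for p, w in inputs.items()):
                  for p, w in inputs.items():
                      marking[p] -= w
                  for p, w in outputs.items():
                      marking[p] = marking.get(p, 0) + w
                  return True
          return False  # no transition enabled

      # Toy reaction A + B -> C encoded as a single transition
      m = {"A": 2, "B": 1, "C": 0}
      while fire_enabled(m, [({"A": 1, "B": 1}, {"C": 1})]):
          pass
      print(m)  # {'A': 1, 'B': 0, 'C': 1}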

  17. cudaMap: a GPU accelerated program for gene expression connectivity mapping.

    PubMed

    McArt, Darragh G; Bankhead, Peter; Dunne, Philip D; Salto-Tellez, Manuel; Hamilton, Peter; Zhang, Shu-Dong

    2013-10-11

    Modern cancer research often involves large datasets and the use of sophisticated statistical techniques. Together these add a heavy computational load to the analysis, which is often coupled with issues surrounding data accessibility. Connectivity mapping is an advanced bioinformatic and computational technique dedicated to therapeutics discovery and drug re-purposing around differential gene expression analysis. On a normal desktop PC, it is common for the connectivity mapping task with a single gene signature to take > 2h to complete using sscMap, a popular Java application that runs on standard CPUs (Central Processing Units). Here, we describe new software, cudaMap, which has been implemented using CUDA C/C++ to harness the computational power of NVIDIA GPUs (Graphics Processing Units) to greatly reduce processing times for connectivity mapping. cudaMap can identify candidate therapeutics from the same signature in just over thirty seconds when using an NVIDIA Tesla C2050 GPU. Results from the analysis of multiple gene signatures, which would previously have taken several days, can now be obtained in as little as 10 minutes, greatly facilitating candidate therapeutics discovery with high throughput. We are able to demonstrate dramatic speed differentials between GPU assisted performance and CPU executions as the computational load increases for high accuracy evaluation of statistical significance. Emerging 'omics' technologies are constantly increasing the volume of data and information to be processed in all areas of biomedical research. Embracing the multicore functionality of GPUs represents a major avenue of local accelerated computing. cudaMap will make a strong contribution in the discovery of candidate therapeutics by enabling speedy execution of heavy duty connectivity mapping tasks, which are increasingly required in modern cancer research. cudaMap is open source and can be freely downloaded from http://purl.oclc.org/NET/cudaMap.
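
    The computational core being accelerated is a signature-versus-profile connection score evaluated over many reference profiles. The sketch below shows the general shape of such a score; the exact statistic used by sscMap/cudaMap differs, and all names here are illustrative:

      # Rank-based connection score between a query gene signature and one
      # reference expression profile, normalized to [-1, 1].
      def connection_score(signature, reference_ranks):
          """signature: dict gene -> +1 (up-regulated) or -1 (down-regulated);
          reference_ranks: dict gene -> signed rank in the reference profile."""
          raw = sum(direction * reference_ranks.get(gene, 0)
                    for gene, direction in signature.items())
          max_possible = sum(sorted((abs(r) for r in reference_ranks.values()),
                                    reverse=True)[:len(signature)])
          return raw / max_possible if max_possible else 0.0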

  18. Automatic white blood cell classification using pre-trained deep learning models: ResNet and Inception

    NASA Astrophysics Data System (ADS)

    Habibzadeh, Mehdi; Jannesari, Mahboobeh; Rezaei, Zahra; Baharvand, Hossein; Totonchi, Mehdi

    2018-04-01

    This work gives an account of the evaluation of white blood cell differential counts via a computer-aided diagnosis (CAD) system and hematology rules. Leukocytes, also called white blood cells (WBCs), play the main role in the immune system: they are responsible for phagocytosis and immunity, and therefore for defense against the infections that drive disease incidence and mortality. Microscopic examination of blood samples is a time-consuming, expensive and error-prone task. A manual diagnosis searches for specific leukocytes and count abnormalities in blood slides when a complete blood count (CBC) examination is performed. Complications arise from the large number of varying samples, including different types of leukocytes, related sub-types and their concentrations in blood, which makes the analysis prone to human error. This process can be automated by computerized techniques, which are more reliable and economical. In essence, we seek a fast, accurate mechanism for classification that gathers information about the distribution of white blood cells, which may help to diagnose the degree of any abnormality during a CBC test. In this work, we consider the problem of pre-processing and supervised classification of white blood cells into their four primary types, Neutrophils, Eosinophils, Lymphocytes and Monocytes, using a proposed deep learning framework. In the first step, this research applies three consecutive pre-processing operations: color distortion, bounding box distortion (crop) and image flipping/mirroring. In the second phase, white blood cell recognition is performed with hierarchical topological feature extraction using Inception and ResNet architectures. Finally, the results obtained from a preliminary analysis of cell classification with 11,200 training samples and a 1,244-cell evaluation set are presented in confusion matrices and interpreted using accuracy and false positive rates, with the classification framework validated in experiments on poor-quality blood images sized 320 × 240 pixels. The differential outcomes in this challenging cell detection task, shown in the results section, indicate a significant achievement in using the Inception and ResNet architectures with the proposed settings. Our framework detects on average 100% of the four main white blood cell types using ResNet V1 50, with further promising accuracy rates of 99.84% and 99.46% obtained with ResNet V1 152 and ResNet 101, respectively, after 3000 epochs of fine-tuning all layers. Further confusion matrix statistics show sensitivity values of 1, 0.9979 and 0.9989, with area under the curve (AUC) scores of 1, 0.9992 and 0.9833, for the three proposed techniques. In addition, the current work shows negligible false negatives (0, 2 and 1) and small false positives (0, 0 and 5) in leukocyte detection.
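
    The three pre-processing steps named above map naturally onto standard image-augmentation operations; the sketch below expresses them with torchvision (the parameter values are placeholders, since the paper does not specify them):

      # Color distortion, bounding-box distortion (crop), and flip/mirror
      # for 320 x 240 blood-smear images.
      from torchvision import transforms

      wbc_augment = transforms.Compose([
          transforms.ColorJitter(brightness=0.2, contrast=0.2, saturation=0.2),
          transforms.RandomResizedCrop((240, 320), scale=(0.8, 1.0)),
          transforms.RandomHorizontalFlip(p=0.5),
          transforms.ToTensor(),
      ])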

  19. Technology Developments Integrating a Space Network Communications Testbed

    NASA Technical Reports Server (NTRS)

    Kwong, Winston; Jennings, Esther; Clare, Loren; Leang, Dee

    2006-01-01

    As future manned and robotic space exploration missions involve more complex systems, it is essential to verify, validate, and optimize such systems through simulation and emulation in a low cost testbed environment. The goal of such a testbed is to perform detailed testing of advanced space and ground communications networks, technologies, and client applications that are essential for future space exploration missions. We describe the development of new technologies enhancing our Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) that enable its integration in a distributed space communications testbed. MACHETE combines orbital modeling, link analysis, and protocol and service modeling to quantify system performance based on comprehensive considerations of different aspects of space missions. It can simulate entire networks and can interface with external (testbed) systems. The key technology developments enabling the integration of MACHETE into a distributed testbed are the Monitor and Control module and the QualNet IP Network Emulator module. Specifically, the Monitor and Control module establishes a standard interface mechanism to centralize the management of each testbed component. The QualNet IP Network Emulator module allows externally generated network traffic to be passed through MACHETE to experience simulated network behaviors such as propagation delay, data loss, orbital effects and other communications characteristics, including entire network behaviors. We report a successful integration of MACHETE with a space communication testbed modeling a lunar exploration scenario. This document consists of the viewgraph slides of the presentation.

  20. Who We Are: Today's Students Speak Out

    ERIC Educational Resources Information Center

    Blandford, Ayoka

    2012-01-01

    Today's students have been nicknamed the "Digital Generation," "Millennials," "Net Generation" and "Generation Next." They are frequently identified by their technological prowess and seem to work well with multiple stimuli (for example, designing a web site while listening to iTunes and responding to texts). While many research studies have been…

  1. Designing Online Learning Communities of Practice: A Democratic Perspective

    ERIC Educational Resources Information Center

    Sorensen, Elsebeth Korsgaard; Murchu, Daithi O.

    2004-01-01

    This study addresses the problem of designing an appropriate learning space or architecture for distributed online courses using net-based communication technologies. We apply Wenger's criteria to explore, identify and discuss the design architectures of two online courses from two comparable online Master's programmes, developed and delivered in…

  2. Debunking the "Digital Native": Beyond Digital Apartheid, towards Digital Democracy

    ERIC Educational Resources Information Center

    Brown, C.; Czerniewicz, L.

    2010-01-01

    This paper interrogates the currently pervasive discourse of the "net generation" finding the concept of the "digital native" especially problematic, both empirically and conceptually. We draw on a research project of South African higher education students' access to and use of Information and Communication Technologies (ICTs)…

  3. SteinerNet: a web server for integrating ‘omic’ data to discover hidden components of response pathways

    PubMed Central

    Tuncbag, Nurcan; McCallum, Scott; Huang, Shao-shan Carol; Fraenkel, Ernest

    2012-01-01

    High-throughput technologies including transcriptional profiling, proteomics and reverse genetics screens provide detailed molecular descriptions of cellular responses to perturbations. However, it is difficult to integrate these diverse data to reconstruct biologically meaningful signaling networks. Previously, we have established a framework for integrating transcriptional, proteomic and interactome data by searching for the solution to the prize-collecting Steiner tree problem. Here, we present a web server, SteinerNet, to make this method available in a user-friendly format for a broad range of users with data from any species. At a minimum, a user only needs to provide a set of experimentally detected proteins and/or genes and the server will search for connections among these data from the provided interactomes for yeast, human, mouse, Drosophila melanogaster and Caenorhabditis elegans. More advanced users can upload their own interactome data as well. The server provides interactive visualization of the resulting optimal network and downloadable files detailing the analysis and results. We believe that SteinerNet will be useful for researchers who would like to integrate their high-throughput data for a specific condition or cellular response and to find biologically meaningful pathways. SteinerNet is accessible at http://fraenkel.mit.edu/steinernet. PMID:22638579
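
    The underlying optimization is usefully stated explicitly. In one common formulation of the prize-collecting Steiner tree problem (the exact variant and parameterization used by SteinerNet may differ), the solver seeks a tree T in the interactome G minimizing

      \min_{T \subseteq G} \; \sum_{e \in E(T)} c(e) \;+\; \lambda \sum_{v \in V(G) \setminus V(T)} p(v)

    where c(e) is an edge cost (e.g., derived from interaction confidence), p(v) is the prize on an experimentally detected protein or gene that is paid as a penalty if v is left out of the tree, and \lambda trades off network size against coverage of the data.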

  4. Quantitative phase microscopy using deep neural networks

    NASA Astrophysics Data System (ADS)

    Li, Shuai; Sinha, Ayan; Lee, Justin; Barbastathis, George

    2018-02-01

    Deep learning has been proven to achieve ground-breaking accuracy in various tasks. In this paper, we implemented a deep neural network (DNN) to achieve phase retrieval in a wide-field microscope. Our DNN utilized the residual neural network (ResNet) architecture and was trained using the data generated by a phase SLM. The results showed that our DNN was able to reconstruct the profile of the phase target qualitatively. At the same time, large errors still existed, which indicated that our approach still needs to be improved.

  5. Adapting GNU random forest program for Unix and Windows

    NASA Astrophysics Data System (ADS)

    Jirina, Marcel; Krayem, M. Said; Jirina, Marcel, Jr.

    2013-10-01

    The Random Forest is a well-known method, and also a program, for data clustering and classification. Unfortunately, the original Random Forest program is rather difficult to use. Here we describe a new version of this program, originally written in Fortran 77. The modified program, in Fortran 95, needs to be compiled only once; information for different tasks is passed with the help of arguments. The program was tested with 24 data sets from the UCI MLR, and results are available on the net.

  6. CHIME-Net, The Connecticut Health Information Network: A Pilot Study

    PubMed Central

    Reed-Fourquet, LL; Durand, D; Johnson, L; Beaudin, S; Trask, J; DiSilvestro, E; Smith, L; Courtway, P; Pappanikou, J; Bretaigne, R; Pendleton, R; Vogler, E; Lobb, J; Dalal, S; Lynch, JT

    1995-01-01

    CHIME-Net is a state-wide community health information network project which uses a frame-relay approach to interfacility and internet connectivity. This is a collaborative effort among competitive institutions, which embraces technologies new to the health care industry. The experiences of implementation of the CHIME-Net pilot project are presented as a first milestone for the state-wide effort. PMID:8563347

  7. Consolidation Process in Near Net Shape Manufacturing of Armstrong CP-Ti/Ti-6Al-4V Powders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamamoto, Yukinori; Kiggans, Jim; Clark, Michael B

    2010-01-01

    This paper summarizes our recent efforts to develop the manufacturing technologies of consolidated net-shape components by using new low-cost commercially pure titanium (CP-Ti) and Ti-6Al-4V alloy powders made by the Armstrong process. Fabrication processes of net shape/near net shape components, such as uniaxial die-pressing, cold isostatic pressing (CIP), sintering, roll compaction and stamping, have been evaluated. The press-and-sinter processing of the powders was systematically investigated in terms of theoretical density and microstructure as a function of time, pressure, and temperature. Up to 96.4% theoretical density has been achieved with the press-and-sinter technology. Tensile properties of the consolidated samples exhibit good ductility as well as yield/ultimate tensile strengths equivalent to those of fully consolidated materials, even with the presence of a certain amount of porosity. A consolidation model is also under development to interpret the powder deformation during processing. Net shape components made of the Armstrong powder can successfully be fabricated with clearer surface details by using press-and-sinter processing.

  8. HiSeasNet: Oceanographic Ships Join the Grid

    NASA Astrophysics Data System (ADS)

    Berger, Jonathan; Orcutt, John; Foley, Steven; Bohlen, Steven

    2006-05-01

    HiSeasNet, the communications network providing full-period Internet access for the U.S. academic ocean research fleet, is an enabling technology that is changing the way oceanography is done in the 21st century. With the installation in March 2006 of a system on the research vessel (R/V) Seward Johnson and the planned installation on the R/V Marcus Langseth later this year, all but two of the Universities National Oceanographic Laboratories System (UNOLS) fleet of large/global and intermediate/ocean vessels will be equipped with HiSeasNet capability. HiSeasNet is a full-service Internet Protocol (IP) satellite network utilizing Cisco technology. In addition to familiar IP services such as e-mail, telnet, ssh, rlogin, Web traffic, and ftp, HiSeasNet can move real-time audio and video traffic across the satellite links. Phone systems onboard research ships can be connected to their home institutions' phone exchanges. Video teleconferencing with the current 96 kilobits per second circuits supports compressed video frame rates of about 10 frames per second, allowing for effective conversations and demonstrations with ship-to-shore video.

  9. Assessing the engineering performance of affordable net-zero energy housing

    NASA Astrophysics Data System (ADS)

    Wallpe, Jordan P.

    The purpose of this research was to evaluate affordable technologies that are capable of providing attractive, cost-effective energy savings to the housing industry. The research did so by investigating the 2011 Solar Decathlon competition, with additional insight from the Purdue INhome. Insight from the Purdue INhome verified the importance of using a three-step design process to design a net-zero energy building. In addition, energy consumption values of the INhome were used to compare and contrast different systems used in other houses. Evaluation of unbiased competition contests gave a better understanding of how a house can realistically reach net-zero. Upon comparison, off-the-shelf engineering systems such as super-efficient HVAC units, heat pump hot water heaters, and properly designed photovoltaic arrays can affordably enable a house to become net-zero. These important and applicable technologies realized from the Solar Decathlon will help reduce the 22 percent of all U.S. energy that is consumed by the residential sector. In conclusion, affordable net-zero energy buildings can be built today with commitment from design professionals, manufacturers, and home owners.

  10. Considerations of net present value in policy making regarding diagnostic and therapeutic technologies.

    PubMed

    Califf, Robert M; Rasiel, Emma B; Schulman, Kevin A

    2008-11-01

    The pharmaceutical and medical device industries function in a business environment in which shareholders expect companies to optimize profit within legal and ethical standards. A fundamental tool used to optimize decision making is the net present value calculation, which estimates the current value of cash flows relating to an investment. We examined 3 prototypical research investment decisions that have been the source of public scrutiny to illustrate how policy decisions can be better understood when their impact on societally desirable investments by industry are viewed from the standpoint of their impact on net present value. In the case of direct, comparative clinical trials, a simple net present value calculation provides insight into why companies eschew such investments. In the case of pediatric clinical trials, the Pediatric Extension Rule changed the net present value calculation from unattractive to potentially very attractive by allowing patent extensions; thus, the dramatic increase in pediatric clinical trials can be explained by the financial return on investment. In the case of products for small markets, the fixed costs of development make this option financially unattractive. Policy decisions can be better understood when their impact on societally desirable investments by the pharmaceutical and medical device industries are viewed from the standpoint of their impact on net present value.
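
    The calculation itself is compact; a minimal sketch (with hypothetical cash flows, not figures from the article) discounts each period's net cash flow back to the present:

      # Net present value: sum of cash flows discounted at rate r.
      def npv(rate, cash_flows):
          """cash_flows[t] is the net cash flow in year t (t = 0 is today)."""
          return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

      # Hypothetical trial: $50M invested now, $20M/year returned for 4 years.
      print(npv(0.10, [-50e6, 20e6, 20e6, 20e6, 20e6]))  # > 0 => attractive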

  11. Inferring Muscle-Tendon Unit Power from Ankle Joint Power during the Push-Off Phase of Human Walking: Insights from a Multiarticular EMG-Driven Model

    PubMed Central

    2016-01-01

    Introduction: Inverse dynamics joint kinetics are often used to infer contributions from underlying groups of muscle-tendon units (MTUs). However, such interpretations are confounded by multiarticular (multi-joint) musculature, which can cause inverse dynamics to over- or under-estimate net MTU power. Misestimation of MTU power could lead to incorrect scientific conclusions, or to empirical estimates that misguide musculoskeletal simulations, assistive device designs, or clinical interventions. The objective of this study was to investigate the degree to which ankle joint power overestimates net plantarflexor MTU power during the Push-off phase of walking, due to the behavior of the flexor digitorum and hallucis longus (FDHL)–multiarticular MTUs crossing the ankle and metatarsophalangeal (toe) joints. Methods: We performed a gait analysis study on six healthy participants, recording ground reaction forces, kinematics, and electromyography (EMG). Empirical data were input into an EMG-driven musculoskeletal model to estimate ankle power. This model enabled us to parse contributions from mono- and multi-articular MTUs, and required only one scaling and one time delay factor for each subject and speed, which were solved for based on empirical data. Net plantarflexing MTU power was computed by the model and quantitatively compared to inverse dynamics ankle power. Results: The EMG-driven model was able to reproduce inverse dynamics ankle power across a range of gait speeds (R2 ≥ 0.97), while also providing MTU-specific power estimates. We found that FDHL dynamics caused ankle power to slightly overestimate net plantarflexor MTU power, but only by ~2–7%. Conclusions: During Push-off, FDHL MTU dynamics do not substantially confound the inference of net plantarflexor MTU power from inverse dynamics ankle power. However, other methodological limitations may cause inverse dynamics to overestimate net MTU power; for instance, due to rigid-body foot assumptions. Moving forward, the EMG-driven modeling approach presented could be applied to understand other tasks or larger multiarticular MTUs. PMID:27764110

  12. Inferring Muscle-Tendon Unit Power from Ankle Joint Power during the Push-Off Phase of Human Walking: Insights from a Multiarticular EMG-Driven Model.

    PubMed

    Honert, Eric C; Zelik, Karl E

    2016-01-01

    Inverse dynamics joint kinetics are often used to infer contributions from underlying groups of muscle-tendon units (MTUs). However, such interpretations are confounded by multiarticular (multi-joint) musculature, which can cause inverse dynamics to over- or under-estimate net MTU power. Misestimation of MTU power could lead to incorrect scientific conclusions, or to empirical estimates that misguide musculoskeletal simulations, assistive device designs, or clinical interventions. The objective of this study was to investigate the degree to which ankle joint power overestimates net plantarflexor MTU power during the Push-off phase of walking, due to the behavior of the flexor digitorum and hallucis longus (FDHL)-multiarticular MTUs crossing the ankle and metatarsophalangeal (toe) joints. We performed a gait analysis study on six healthy participants, recording ground reaction forces, kinematics, and electromyography (EMG). Empirical data were input into an EMG-driven musculoskeletal model to estimate ankle power. This model enabled us to parse contributions from mono- and multi-articular MTUs, and required only one scaling and one time delay factor for each subject and speed, which were solved for based on empirical data. Net plantarflexing MTU power was computed by the model and quantitatively compared to inverse dynamics ankle power. The EMG-driven model was able to reproduce inverse dynamics ankle power across a range of gait speeds (R2 ≥ 0.97), while also providing MTU-specific power estimates. We found that FDHL dynamics caused ankle power to slightly overestimate net plantarflexor MTU power, but only by ~2-7%. During Push-off, FDHL MTU dynamics do not substantially confound the inference of net plantarflexor MTU power from inverse dynamics ankle power. However, other methodological limitations may cause inverse dynamics to overestimate net MTU power; for instance, due to rigid-body foot assumptions. Moving forward, the EMG-driven modeling approach presented could be applied to understand other tasks or larger multiarticular MTUs.

  13. Qualitatively modelling and analysing genetic regulatory networks: a Petri net approach.

    PubMed

    Steggles, L Jason; Banks, Richard; Shaw, Oliver; Wipat, Anil

    2007-02-01

    New developments in post-genomic technology now provide researchers with the data necessary to study regulatory processes in a holistic fashion at multiple levels of biological organization. One of the major challenges for the biologist is to integrate and interpret these vast data resources to gain a greater understanding of the structure and function of the molecular processes that mediate adaptive and cell cycle driven changes in gene expression. In order to achieve this biologists require new tools and techniques to allow pathway related data to be modelled and analysed as network structures, providing valuable insights which can then be validated and investigated in the laboratory. We propose a new technique for constructing and analysing qualitative models of genetic regulatory networks based on the Petri net formalism. We take as our starting point the Boolean network approach of treating genes as binary switches and develop a new Petri net model which uses logic minimization to automate the construction of compact qualitative models. Our approach addresses the shortcomings of Boolean networks by providing access to the wide range of existing Petri net analysis techniques and by using non-determinism to cope with incomplete and inconsistent data. The ideas we present are illustrated by a case study in which the genetic regulatory network controlling sporulation in the bacterium Bacillus subtilis is modelled and analysed. The Petri net model construction tool and the data files for the B. subtilis sporulation case study are available at http://bioinf.ncl.ac.uk/gnapn.
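
    The Boolean-network starting point treats each gene as a binary switch updated by a logic rule over the current state; a minimal sketch (with a hypothetical two-gene circuit, not the B. subtilis model) makes the semantics concrete:

      # Synchronous Boolean network update step.
      def step(state, rules):
          """state: dict gene -> 0/1; rules: dict gene -> fn(state) -> 0/1."""
          return {gene: rule(state) for gene, rule in rules.items()}

      # Hypothetical mutual-repression circuit
      rules = {
          "geneA": lambda s: 1 - s["geneB"],  # A is on when B is off
          "geneB": lambda s: 1 - s["geneA"],
      }
      state = {"geneA": 1, "geneB": 1}
      for _ in range(3):
          state = step(state, rules)
          print(state)  # oscillates between all-off and all-on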

  14. Deep Convolutional Neural Networks for Computer-Aided Detection: CNN Architectures, Dataset Characteristics and Transfer Learning.

    PubMed

    Shin, Hoo-Chang; Roth, Holger R; Gao, Mingchen; Lu, Le; Xu, Ziyue; Nogues, Isabella; Yao, Jianhua; Mollura, Daniel; Summers, Ronald M

    2016-05-01

    Remarkable progress has been made in image recognition, primarily due to the availability of large-scale annotated datasets and deep convolutional neural networks (CNNs). CNNs enable learning data-driven, highly representative, hierarchical image features from sufficient training data. However, obtaining datasets as comprehensively annotated as ImageNet in the medical imaging domain remains a challenge. There are currently three major techniques for successfully employing CNNs in medical image classification: training the CNN from scratch, using off-the-shelf pre-trained CNN features, and conducting unsupervised CNN pre-training with supervised fine-tuning. Another effective method is transfer learning, i.e., fine-tuning CNN models pre-trained on natural image datasets for medical image tasks. In this paper, we exploit three important, but previously understudied, factors in applying deep convolutional neural networks to computer-aided detection problems. We first explore and evaluate different CNN architectures. The studied models contain 5 thousand to 160 million parameters, and vary in numbers of layers. We then evaluate the influence of dataset scale and spatial image context on performance. Finally, we examine when and why transfer learning from pre-trained ImageNet (via fine-tuning) can be useful. We study two specific computer-aided detection (CADe) problems, namely thoraco-abdominal lymph node (LN) detection and interstitial lung disease (ILD) classification. We achieve the state-of-the-art performance on the mediastinal LN detection, and report the first five-fold cross-validation classification results on predicting axial CT slices with ILD categories. Our extensive empirical evaluation, CNN model analysis and valuable insights can be extended to the design of high performance CAD systems for other medical imaging tasks.
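
    The transfer-learning recipe evaluated in the paper amounts to loading a CNN pre-trained on ImageNet, replacing its classifier head for the target CADe task, and fine-tuning; a minimal PyTorch sketch follows (the backbone, layer choices, and class count here are placeholders, not the paper's exact configurations):

      # Fine-tune an ImageNet-pre-trained CNN on a two-class detection task.
      import torch.nn as nn
      from torchvision import models

      model = models.resnet50(weights="IMAGENET1K_V1")  # ImageNet weights
      model.fc = nn.Linear(model.fc.in_features, 2)     # e.g., LN present/absent

      # Freeze early layers; fine-tune only the last block and the new head.
      for name, param in model.named_parameters():
          param.requires_grad = name.startswith(("layer4", "fc"))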

  15. Net Shape Technology in Aerospace Structures. Volume 1.

    DTIC Science & Technology

    1986-11-01

    [Abstract not available; the DTIC record text is garbled report-documentation boilerplate. Recoverable fragments note the use of nondestructive evaluation methods, such as ultrasonic inspection, in detecting otherwise hidden defects in parts made of the material, and repeat the report title, Net Shape Technology in Aerospace Structures, Vol. I (unclassified).]

  16. Metal Matrix Composite LOX Turbopump Housing Via Novel Tool-Less Net-Shape Pressure Infiltration Casting Technology

    NASA Technical Reports Server (NTRS)

    Shah, Sandeep; Lee, Jonathan; Bhat, Biliyar; Wells, Doug; Gregg, Wayne; Marsh, Matthew; Genge, Gary; Forbes, John; Salvi, Alex; Cornie, James A.; hide

    2002-01-01

    This presentation provides an overview of the effort by Metal Matrix Cast Composites, Inc. to redesign turbopump housing joints using metal matrix composite material and a tool-less net-shape pressure infiltration casting technology. Topics covered include: advantages of metal matrix composites for propulsion components, baseline pump design and analysis, the advanced tool-less pressure infiltration casting process, the subscale pump housing, preform splicing and joining for large components, and the full-scale pump housing redesign.

  17. Entropy production in mesoscopic stochastic thermodynamics: nonequilibrium kinetic cycles driven by chemical potentials, temperatures, and mechanical forces

    NASA Astrophysics Data System (ADS)

    Qian, Hong; Kjelstrup, Signe; Kolomeisky, Anatoly B.; Bedeaux, Dick

    2016-04-01

    Nonequilibrium thermodynamics (NET) investigates processes in systems out of global equilibrium. On a mesoscopic level, it provides a statistical dynamic description of various complex phenomena such as chemical reactions, ion transport, diffusion, thermochemical, thermomechanical and mechanochemical fluxes. In the present review, we introduce a mesoscopic stochastic formulation of NET by analyzing entropy production in several simple examples. The fundamental role of nonequilibrium steady-state cycle kinetics is emphasized. The statistical mechanics of Onsager’s reciprocal relations in this context is elucidated. Chemomechanical, thermomechanical, and enzyme-catalyzed thermochemical energy transduction processes are discussed. It is argued that mesoscopic stochastic NET in phase space provides a rigorous mathematical basis of fundamental concepts needed for understanding complex processes in chemistry, physics and biology. This theory is also relevant for nanoscale technological advances.
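
    As a concrete instance of the steady-state cycle kinetics emphasized here, the entropy production rate of a single kinetic cycle can be written in terms of its forward and backward cycle fluxes (a standard result of this formalism; the notation below is assumed rather than quoted from the paper):

      \sigma = k_B \, (J^{+} - J^{-}) \ln \frac{J^{+}}{J^{-}} \;\geq\; 0

    with equality exactly at equilibrium, where detailed balance gives J^{+} = J^{-}.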

  18. Learning and Skills Development in a Virtual Class of Educommunication Based on Educational Proposals and Interactions

    ERIC Educational Resources Information Center

    Bohorquez Sotelo, Maria Cristina; Rodriguez Mendoza, Brigitte Julieth; Vega, Sandra Milena; Roja Higuera, Naydu Shirley; Barbosa Gomez, Luisa Fernanda

    2016-01-01

    In the present paper we describe the analysis of qualitative and quantitative data from asynchronous learning networks: the virtual forums that take place in VirtualNet 2.0, the platform of the University Manuela Beltran (UMB), within the Educommunication course of the master's programme in Digital Technologies Applied to Education. Here, we performed a…

  19. Scientific and Technical Support for the Galileo Net Flux Radiometer Experiment

    NASA Technical Reports Server (NTRS)

    Sromovsky, Lawrence A.

    1997-01-01

    This report describes work in support of the Galileo Net Flux Radiometer (NFR), an instrument mounted on the Galileo probe, a spacecraft designed for entry into and direct measurements of Jupiter's atmosphere. Tasks originally proposed for the post-launch period covered by NCC 2-854 are briefly as follows: attend and support PSG (Project Science Group) and other project science meetings; support in-flight checkouts; maintain and keep safe the spare instrument and GSE (Ground Support Equipment); organize and maintain documentation; finish NFR calibration measurements, documentation, and analysis; characterize and diagnose instrument anomalies; develop descent data analysis tools; and science data analysis and publication. Because we had the capability to satisfy a project support need, we also subsequently proposed and were funded to make ground-based observations of Jupiter during the period surrounding the Galileo arrival at Jupiter, using the Swedish Solar Telescope at La Palma, Canary Islands. The following section provides background information on the NFR instrument. Section 3 contains the final report of work done.

  20. BioC implementations in Go, Perl, Python and Ruby

    PubMed Central

    Liu, Wanli; Islamaj Doğan, Rezarta; Kwon, Dongseop; Marques, Hernani; Rinaldi, Fabio; Wilbur, W. John; Comeau, Donald C.

    2014-01-01

    As part of a communitywide effort for evaluating text mining and information extraction systems applied to the biomedical domain, BioC is focused on the goal of interoperability, currently a major barrier to wide-scale adoption of text mining tools. BioC is a simple XML format, specified by DTD, for exchanging data for biomedical natural language processing. With initial implementations in C++ and Java, BioC provides libraries of code for reading and writing BioC text documents and annotations. We extend BioC to Perl, Python, Go and Ruby. We used SWIG to extend the C++ implementation for Perl and one Python implementation. A second Python implementation and the Ruby implementation use native data structures and libraries. BioC is also implemented in the Google language Go. BioC modules are functional in all of these languages, which can facilitate text mining tasks. BioC implementations are freely available through the BioC site: http://bioc.sourceforge.net. Database URL: http://bioc.sourceforge.net/ PMID:24961236

  1. Failures of explaining away and screening off in described versus experienced causal learning scenarios.

    PubMed

    Rehder, Bob; Waldmann, Michael R

    2017-02-01

    Causal Bayes nets capture many aspects of causal thinking that set them apart from purely associative reasoning. However, some central properties of this normative theory are routinely violated. In tasks requiring an understanding of explaining away and screening off, subjects often deviate from these principles and manifest the operation of an associative bias that we refer to as the rich-get-richer principle. This research focuses on these two failures, comparing tasks in which causal scenarios are merely described (via verbal statements of the causal relations) versus experienced (via samples of data that manifest the intervariable correlations implied by the causal relations). Our key finding is that we obtained stronger deviations from normative predictions in the described conditions, which highlight the instructed causal model, than in those that presented data. This counterintuitive finding indicates that a theory of causal reasoning and learning needs to integrate normative principles with biases people hold about causal relations.
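
    Explaining away is easy to state numerically: in a common-effect network C1 -> E <- C2, learning that the alternative cause C2 is present should lower the probability of C1 given the effect. The sketch below uses made-up probabilities and a noisy-OR likelihood purely to illustrate the normative prediction that subjects violated:

      # Explaining away in a collider network C1 -> E <- C2.
      P_c1, P_c2 = 0.1, 0.1  # independent priors on the two causes

      def p_effect(c1, c2):  # noisy-OR likelihood of E given the causes
          return 1 - (1 - 0.8 * c1) * (1 - 0.8 * c2)

      def posterior_c1(given_c2=None):
          num = den = 0.0
          for c1 in (0, 1):
              for c2 in (0, 1):
                  if given_c2 is not None and c2 != given_c2:
                      continue
                  joint = ((P_c1 if c1 else 1 - P_c1)
                           * (P_c2 if c2 else 1 - P_c2) * p_effect(c1, c2))
                  den += joint
                  num += joint * c1
          return num / den

      print(posterior_c1())            # P(C1 | E) ~ 0.53
      print(posterior_c1(given_c2=1))  # P(C1 | E, C2) ~ 0.12: explained away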

  2. NutriNet: A Deep Learning Food and Drink Image Recognition System for Dietary Assessment.

    PubMed

    Mezgec, Simon; Koroušić Seljak, Barbara

    2017-06-27

    Automatic food image recognition systems are alleviating the process of food-intake estimation and dietary assessment. However, due to the nature of food images, their recognition is a particularly challenging task, which is why traditional approaches in the field have achieved a low classification accuracy. Deep neural networks have outperformed such solutions, and we present a novel approach to the problem of food and drink image detection and recognition that uses a newly-defined deep convolutional neural network architecture, called NutriNet. This architecture was tuned on a recognition dataset containing 225,953 images (512 × 512 pixels) of 520 different food and drink items from a broad spectrum of food groups, on which we achieved a classification accuracy of 86.72%, along with an accuracy of 94.47% on a detection dataset containing 130,517 images. We also performed a real-world test on a dataset of self-acquired images, combined with images from Parkinson's disease patients, all taken using a smartphone camera, achieving a top-five accuracy of 55%, which is an encouraging result for real-world images. Additionally, we tested NutriNet on the University of Milano-Bicocca 2016 (UNIMIB2016) food image dataset, on which we improved upon the provided baseline recognition result. An online training component was implemented to continually fine-tune the food and drink recognition model on new images. The model is being used in practice as part of a mobile app for the dietary assessment of Parkinson's disease patients.

  3. Functional Contour-following via Haptic Perception and Reinforcement Learning.

    PubMed

    Hellman, Randall B; Tekin, Cem; van der Schaar, Mihaela; Santos, Veronica J

    2018-01-01

    Many tasks involve the fine manipulation of objects despite limited visual feedback. In such scenarios, tactile and proprioceptive feedback can be leveraged for task completion. We present an approach for real-time haptic perception and decision-making for a haptics-driven, functional contour-following task: the closure of a ziplock bag. This task is challenging for robots because the bag is deformable, transparent, and visually occluded by artificial fingertip sensors that are also compliant. A deep neural net classifier was trained to estimate the state of a zipper within a robot's pinch grasp. A Contextual Multi-Armed Bandit (C-MAB) reinforcement learning algorithm was implemented to maximize cumulative rewards by balancing exploration versus exploitation of the state-action space. The C-MAB learner outperformed a benchmark Q-learner by more efficiently exploring the state-action space while learning a hard-to-code task. The learned C-MAB policy was tested with novel ziplock bag scenarios and contours (wire, rope). Importantly, this work contributes to the development of reinforcement learning approaches that account for limited resources such as hardware life and researcher time. As robots are used to perform complex, physically interactive tasks in unstructured or unmodeled environments, it becomes important to develop methods that enable efficient and effective learning with physical testbeds.
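
    The explore/exploit trade-off at the heart of the C-MAB learner can be sketched with a simple epsilon-greedy contextual bandit (the paper's algorithm is more sophisticated; this toy only shows the mechanism over a discretized state-action space):

      # Epsilon-greedy contextual multi-armed bandit.
      import random
      from collections import defaultdict

      class EpsilonGreedyCMAB:
          def __init__(self, actions, epsilon=0.1):
              self.actions = actions
              self.epsilon = epsilon
              self.value = defaultdict(float)  # (state, action) -> mean reward
              self.count = defaultdict(int)

          def choose(self, state):
              if random.random() < self.epsilon:          # explore
                  return random.choice(self.actions)
              return max(self.actions,                    # exploit
                         key=lambda a: self.value[(state, a)])

          def update(self, state, action, reward):        # running mean update
              key = (state, action)
              self.count[key] += 1
              self.value[key] += (reward - self.value[key]) / self.count[key]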

  4. Parallel task processing of very large datasets

    NASA Astrophysics Data System (ADS)

    Romig, Phillip Richardson, III

    This research concerns the use of distributed computer technologies for the analysis and management of very large datasets. Improvements in sensor technology, an emphasis on global change research, and greater access to data warehouses are all increasing the number of non-traditional users of remotely sensed data. We present a framework for distributed solutions to the challenges of datasets which exceed the online storage capacity of individual workstations. This framework, called parallel task processing (PTP), incorporates both the task- and data-level parallelism exemplified by many image processing operations. An implementation based on the principles of PTP, called Tricky, is also presented. Additionally, we describe the challenges and practical issues in modeling the performance of parallel task processing with large datasets. We present a mechanism for estimating the running time of each unit of work within a system and an algorithm that uses these estimates to simulate the execution environment and produce estimated runtimes. Finally, we describe and discuss experimental results which validate the design. Specifically, the system (a) is able to perform computation on datasets which exceed the capacity of any one disk, (b) provides reduction of overall computation time as a result of the task distribution even with the additional cost of data transfer and management, and (c) in the simulation mode accurately predicts the performance of the real execution environment.
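
    The pattern PTP exemplifies, splitting an oversized dataset into independent work units and farming them out to workers, can be sketched in a few lines (the per-unit computation below is a stand-in for a real image-processing operation):

      # Task/data-parallel processing of independent work units.
      from multiprocessing import Pool

      def process_unit(unit):
          unit_id, values = unit
          return unit_id, sum(v * v for v in values)  # placeholder work

      if __name__ == "__main__":
          # Pretend each unit is one tile of a dataset too large for one disk.
          units = [(i, list(range(i, i + 1000))) for i in range(64)]
          with Pool(processes=8) as pool:
              for unit_id, result in pool.imap_unordered(process_unit, units):
                  print(unit_id, result)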

  5. Development of on line automatic separation device for apple and sleeve

    NASA Astrophysics Data System (ADS)

    Xin, Dengke; Ning, Duo; Wang, Kangle; Han, Yuhang

    2018-04-01

    An automatic separation device for fruit sleeves is designed around an STM32F407 single-chip microcomputer as the control core. The design comprises hardware and software. The hardware includes a mechanical tooth separator and a three-degree-of-freedom manipulator, as well as an industrial control computer, an image data acquisition card, an end effector, and other structures. The software system, built in the Visual C++ development environment, uses image processing and machine vision to localize and recognize the fruit sleeve, and drives the manipulator to grasp the foam net sleeve, transfer it, and place it at a designated position. Tests show that the automatic separation device responds quickly and achieves a high separation success rate; it can separate the apple from its plastic foam sleeve, and it lays the foundation for further study and application on enterprise production lines.
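
    The device's vision software is implemented in Visual C++; purely as a rough analogy (the threshold value, file name, and pipeline are assumptions, not the authors' method), a classical machine-vision localization of a bright foam sleeve against a darker apple might look like this in Python/OpenCV:

    ```python
    # Hedged sketch of sleeve localization via classical machine vision:
    # threshold the bright foam net against the darker apple, then take
    # the largest contour's bounding box as the grasp target. The
    # threshold and input image are assumptions, not the paper's code.
    import cv2

    frame = cv2.imread("apple_with_sleeve.jpg")          # assumed input image
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 190, 255, cv2.THRESH_BINARY)  # foam is near-white
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)

    if contours:
        sleeve = max(contours, key=cv2.contourArea)      # largest bright region
        x, y, w, h = cv2.boundingRect(sleeve)
        print("grasp target:", (x + w // 2, y + h // 2)) # center for the manipulator
    ```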

  6. Concentrating Solar Power Projects - Ivanpah Solar Electric Generating

    Science.gov Websites

    [Website listing fragment] Technology: power tower. Turbine capacity: 392.0 MW gross, 377.0 MW net. Turbine manufacturer: Siemens SST-900.

  7. The Net-Enhanced University.

    ERIC Educational Resources Information Center

    Sederburg, William A.

    2002-01-01

    Using the example of Ferris State University, discusses how a "net-enhanced" university functions and offers guiding principles: serve the core activity, recognize the limits to technology, create a policy structure, provide technical infrastructure, provide personnel infrastructure, build communities, digitize, and don't duplicate. (EV)

  8. Computers for Political Change: PeaceNet and Public Data Access.

    ERIC Educational Resources Information Center

    Downing, John D. H.

    1989-01-01

    Describes two computer communication projects: PeaceNet, devoted to peace issues; and Public Data Access, devoted to making U.S. government information more broadly available. Discusses the potential of new technology (computer communication) for grass-roots political movements. (SR)

  9. Drift Nets on the High Seas.

    ERIC Educational Resources Information Center

    Clearing, 1990

    1990-01-01

    Information is provided on the use and misuse of drift nets used internationally in the Pacific Ocean. An activity in which students acquire some understanding of the history of fishing and the effects of modern technologies on fish populations is included. (KR)

  10. netPICOmag: from Design to Network Implementation

    NASA Astrophysics Data System (ADS)

    Schofield, I.; Connors, M.; Russell, C.

    2009-05-01

    netPICOmag is the successful conclusion of a design effort involving networking based on Rabbit microcontrollers, PIC microcontrollers, and pulsed magnetometer sensors. GPS timing allows both timestamping of data and the precision counting of the number of pulses produced by the sensor heads in one second. Power over Ethernet, use of DHCP, and broadcast of UDP packets mean a very simple local installation, with one wire leading to a relatively small integrated sensor package which is vertically placed in the ground. Although we continue to make improvements, including through investigating new sensor types, we regard the design as mature and well tested. Here we focus on the need for yet denser magnetometer networks, technological applications which become practical using sensitive yet inexpensive magnetometers, and deployment methods for large numbers of sensors. With careful calibration, netPICOmags overlap with research grade magnetometers. Without it, they still sensitively detect magnetic variations and can be used for an education or outreach program. Due to their low cost, such an application allows many students to be directly involved in gathering data that can be very relevant to them personally when they witness auroras.

  11. NetCoffee: a fast and accurate global alignment approach to identify functionally conserved proteins in multiple networks.

    PubMed

    Hu, Jialu; Kehr, Birte; Reinert, Knut

    2014-02-15

    Owing to recent advancements in high-throughput technologies, protein-protein interaction networks of more and more species become available in public databases. The question of how to identify functionally conserved proteins across species attracts a lot of attention in computational biology. Network alignments provide a systematic way to solve this problem. However, most existing alignment tools encounter limitations in tackling this problem. Therefore, the demand for faster and more efficient alignment tools is growing. We present a fast and accurate algorithm, NetCoffee, which finds a global alignment of multiple protein-protein interaction networks. NetCoffee searches for a global alignment by maximizing a target function using simulated annealing on a set of weighted bipartite graphs that are constructed using a triplet approach similar to T-Coffee. To assess its performance, NetCoffee was applied to four real datasets. Our results suggest that NetCoffee remedies several limitations of previous algorithms, outperforms all existing alignment tools in terms of speed and nevertheless identifies biologically meaningful alignments. The source code and data are freely available for download under the GNU GPL v3 license at https://code.google.com/p/netcoffee/.
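
    To illustrate the search strategy described above, here is a generic simulated-annealing maximizer in Python; the bit-vector state and toy objective are stand-ins for NetCoffee's alignment representation and triplet-based score, which are not reproduced in this record.

    ```python
    # Generic simulated annealing: maximize a target function by accepting
    # improvements always and worse moves with Boltzmann probability.
    # State and objective are toy stand-ins for an alignment and its score.
    import math
    import random

    def score(state):
        # Toy objective: count of set bits (replace with an alignment score).
        return sum(state)

    state = [random.randint(0, 1) for _ in range(50)]
    best, best_score = state[:], score(state)
    temp = 1.0

    while temp > 1e-3:
        candidate = state[:]
        candidate[random.randrange(len(candidate))] ^= 1   # flip one bit
        delta = score(candidate) - score(state)
        if delta >= 0 or random.random() < math.exp(delta / temp):
            state = candidate                              # accept the move
            if score(state) > best_score:
                best, best_score = state[:], score(state)
        temp *= 0.999                                      # geometric cooling

    print(best_score)
    ```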

  12. The Design and Implementation of Network Teaching Platform Basing on .NET

    NASA Astrophysics Data System (ADS)

    Yanna, Ren

    This paper addresses the problem that students taught under the traditional teaching model develop poor practical operation skills, and studies in depth the network teaching platforms of domestic colleges and universities, proposing a design concept for a .NET + C# + SQL network teaching platform for an excellent course and designing the platform's overall structure, function modules, and back-end database. The paper expounds the use of MD5 encryption techniques to solve data security problems, and the assessment of student learning using ADO.NET database access technology together with a mathematical formula. The example shows that a network teaching platform developed with Web application technology has higher safety and availability, and thus improves students' practical operation skills.
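
    As a minimal illustration of the MD5-based protection the abstract mentions (the platform's own C# code is not included in this record), the Python sketch below hashes a salted password with hashlib; the salting scheme is an assumption here, and a modern system would prefer bcrypt, scrypt, or argon2.

    ```python
    # Salted MD5 password hashing, sketching the approach the abstract
    # describes. MD5 is weak by today's standards; the salt is an
    # assumption, and bcrypt/scrypt/argon2 would be preferred today.
    import hashlib
    import os

    def md5_hash(password: str, salt: bytes) -> str:
        return hashlib.md5(salt + password.encode("utf-8")).hexdigest()

    salt = os.urandom(16)
    stored = md5_hash("s3cret", salt)

    def verify(password: str) -> bool:
        return md5_hash(password, salt) == stored

    print(verify("s3cret"), verify("wrong"))   # True False
    ```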

  13. How humans drive speciation as well as extinction

    PubMed Central

    Maron, M.

    2016-01-01

    A central topic for conservation science is evaluating how human activities influence global species diversity. Humanity exacerbates extinction rates. But by what mechanisms does humanity drive the emergence of new species? We review human-mediated speciation, compare speciation and known extinctions, and discuss the challenges of using net species diversity as a conservation objective. Humans drive rapid evolution through relocation, domestication, hunting and novel ecosystem creation—and emerging technologies could eventually provide additional mechanisms. The number of species relocated, domesticated and hunted during the Holocene is of comparable magnitude to the number of observed extinctions. While instances of human-mediated speciation are known, the overall effect these mechanisms have upon speciation rates has not yet been quantified. We also explore the importance of anthropogenic influence upon divergence in microorganisms. Even if human activities resulted in no net loss of species diversity by balancing speciation and extinction rates, this would probably be deemed unacceptable. We discuss why, based upon ‘no net loss’ conservation literature—considering phylogenetic diversity and other metrics, risk aversion, taboo trade-offs and spatial heterogeneity. We conclude that evaluating speciation alongside extinction could result in more nuanced understanding of biosphere trends, clarifying what it is we actually value about biodiversity. PMID:27358365

  14. How humans drive speciation as well as extinction.

    PubMed

    Bull, J W; Maron, M

    2016-06-29

    A central topic for conservation science is evaluating how human activities influence global species diversity. Humanity exacerbates extinction rates. But by what mechanisms does humanity drive the emergence of new species? We review human-mediated speciation, compare speciation and known extinctions, and discuss the challenges of using net species diversity as a conservation objective. Humans drive rapid evolution through relocation, domestication, hunting and novel ecosystem creation-and emerging technologies could eventually provide additional mechanisms. The number of species relocated, domesticated and hunted during the Holocene is of comparable magnitude to the number of observed extinctions. While instances of human-mediated speciation are known, the overall effect these mechanisms have upon speciation rates has not yet been quantified. We also explore the importance of anthropogenic influence upon divergence in microorganisms. Even if human activities resulted in no net loss of species diversity by balancing speciation and extinction rates, this would probably be deemed unacceptable. We discuss why, based upon 'no net loss' conservation literature-considering phylogenetic diversity and other metrics, risk aversion, taboo trade-offs and spatial heterogeneity. We conclude that evaluating speciation alongside extinction could result in more nuanced understanding of biosphere trends, clarifying what it is we actually value about biodiversity. © 2016 The Author(s).

  15. Study of Turbofan Engines Designed for Low Energy Consumption

    NASA Technical Reports Server (NTRS)

    Neitzel, R. E.; Hirschkron, R.; Johnston, R. P.

    1976-01-01

    Subsonic transport turbofan engine design and technology features which have promise of improving aircraft energy consumption are described. Task I addressed the selection and evaluation of features for the CF6 family of engines in current aircraft, and growth models of these aircraft. Task II involved cycle studies and the evaluation of technology features for advanced technology turbofans, consistent with initial service in 1985. Task III pursued the refined analysis of a specific design of an advanced technology turbofan engine selected as the result of Task II studies. In all of the above, the impact upon aircraft economics, as well as energy consumption, was evaluated. Task IV summarized recommendations for technology developments which would be necessary to achieve the improvements in energy consumption identified.

  16. Desert Research and Technology Studies (RATS) 2007 Field Campaign Objectives and Results

    NASA Technical Reports Server (NTRS)

    Kosmo, Joseph; Romig, Barbara

    2008-01-01

    Desert "RATS" (Research and Technology Studies) is a combined, multi-discipline group of inter-NASA center scientists and engineers, net-working and collaborating with representatives of industry and academia, for the purpose of conducting planetary surface exploration-focused remote field exercises. These integrated testing exercises conducted under representative analog Lunar and Mars surface terrain conditions, provide NASA the capability to validate experimental prototype hardware and software systems as well as to evaluate and develop mission operational techniques in order to identify and establish technical requirements and identify potential technology "gaps" applicable for future planetary human exploration. The 2007 D-RATS field campaign test activities were initiated based on the major themes and objectives of a notional 5-year plan developed for conducting relative analog test activities in support of the engineering evaluation and assessment of various system architectural requirements, conceptual prototype support equipment and selected technologies necessary for the establishment of a lunar outpost. Specifically, the major objectives included measuring task efficiency during robot, human, and human-robot interactive tasks associated with lunar outpost site surveying and reconnaissance activities and deployment of a representative solar panel power and distribution system. In addition, technology demonstrations were conducted with a new Lithium-ion battery and autonomous software to coordinate multiple robot activities. Secondary objectives were evaluating airlock concept mockups and prototype removable space suit over-garment elements for dust mitigation, and upgrades to the prototype extravehicular activities (EVA) communication and information system. Dry run test activities, prior to testing at a designated remote field site location, were initially conducted at the Johnson Space Center (JSC) Remote Field Demonstration Test Site. This is a multi-acre external test site located at JSC and has detailed representative terrain features simulating both Lunar and Mars surface characteristics. Both the local JSC and remote field test sites have terrain conditions that are representative and characteristic of both the Moon and Mars, such as strewn rock and volcanic ash fields, craters, rolling plains, hills, gullies, slopes, and outcrops. The D-RATS 2007 field campaign, representing the completion of its tenth year of analog testing, was conducted at the large Cinder Lake volcanic ash bed area adjacent to Flagstaff, Arizona.

  17. The NetQuakes Project - Seeking a Balance Between Science and Citizens.

    NASA Astrophysics Data System (ADS)

    Luetgert, J. H.; Oppenheimer, D. H.

    2012-12-01

    The challenge for any system that uses volunteer help to do science is to dependably acquire quality data without unduly burdening the volunteer. The NetQuakes accelerograph and its data acquisition system were created to address the recognized need for more densely sampled strong ground motion recordings in urban areas to provide more accurate ShakeMaps for post-earthquake disaster assessment and to provide data for structural engineers to improve design standards. The recorder has 18 bit resolution with ±3g internal tri-axial MEMS accelerometers. Data are continuously recorded at 200 sps into a 1-2 week ringbuffer. When triggered, a miniSEED file is sent to USGS servers via the Internet. Data can also be recovered from the ringbuffer by a remote request through the NetQuakes servers. Following a power failure, the instrument can run for 36 hours using its internal battery. We rely upon cooperative citizens to host the dataloggers, provide power and Internet connectivity and perform minor servicing. Instrument and battery replacement are simple tasks that can be performed by hosts, thus reducing maintenance costs. Communication with the instrument to acquire data or deliver firmware is accomplished by file transfers using NetQuakes servers. The client instrument initiates all client-server interactions, so it safely resides behind a host's firewall. A connection to the host's LAN, and from there to the public Internet, can be made using WiFi to minimize cabling. Although timing using a cable to an external GPS antenna is possible, it is simpler to use the Network Time Protocol (NTP) to discipline the internal clock. This approach achieves timing accuracy substantially better than a sample interval. Since 2009, we have installed more than 140 NetQuakes instruments in the San Francisco Bay Area and have successfully integrated their data into the near real time data stream of the Northern California Seismic System. An additional 235 NetQuakes instruments have been installed by other regional seismic networks - all communicating via the common NetQuakes servers.

  18. Application of advanced speech technology in manned penetration bombers

    NASA Astrophysics Data System (ADS)

    North, R.; Lea, W.

    1982-03-01

    This report documents research on the potential use of speech technology in a manned penetration bomber aircraft (B-52/G and H). The objectives of the project were to analyze the pilot/copilot crewstation tasks over a three-hour-and-forty-minute mission and determine the tasks that would benefit the most from conversion to speech recognition/generation, determine the technological feasibility of each of the identified tasks, and prioritize these tasks based on these criteria. Secondary objectives of the program were to enunciate research strategies for the application of speech technologies in airborne environments, and to develop guidelines for briefing user commands on the potential of using speech technologies in the cockpit. The results of this study indicated that for the B-52 crewmember, speech recognition would be most beneficial for retrieving chart and procedural data contained in the flight manuals. The feasibility assessment indicated that the checklist and procedural retrieval tasks would be highly feasible for a speech recognition system.

  19. Structural Technology Evaluation Analysis Program (STEAP). Task Order 0029: Thermal Stability of Fatigue Life-Enhanced Structures

    DTIC Science & Technology

    2012-01-01

    [Only fragments of this DTIC record survive: a figure caption (Figure 21: Intensity and Pressure Temporal Profiles Calculated from Pressure Model; its axis ticks are lost), the reporting period (1 August 2008 - 31 January 2012), the report title (Structural Technology Evaluation Analysis Program (STEAP), Task Order 0029: Thermal Stability of Fatigue Life-Enhanced Structures), the contract number (FA8650-04-D-3446-0029), and the program element number (62201F).]

  20. A Critical Analysis of the Acquisition Review Journal: Are We in Step with the Field?

    DTIC Science & Technology

    2006-12-01

    [Fragment of the journal's article index (Art. #, Year, Author(s), Title, Theme); several titles are truncated in the source:]
    1 | 1994 | Preston, Colleen | Acquisition Reform: Making it a Reality | Acquisition Reform
    2 | 1994 | LaBerge, Walter B. | ...Going? | Technology
    33 | 1996 | Hewitt, Clyde | Getting to the On-Ramp of the Information Highway | Technology
    34 | 1996 | LaBerge, Walter B. | Cycle Time: A... | ...
    ... | ... | ... | ...Learned from Developing the ABCs 6.4 Solution | System of Systems
    204 | 2005 | Zenishek, Steven G.; Usechak, David | Net-Centric Warfare and its Impact on... | ...

  1. The research of .NET framework based on delegate of the LCE

    NASA Astrophysics Data System (ADS)

    Chen, Yi-peng

    2011-10-01

    Programmers can use the Loosely Coupled Events (LCE) enterprise services provided by the .NET Framework when developing applications in the VC# programming language with object-oriented component technology. Much of the boilerplate code that had to be written by hand in traditional program design can now be replaced simply by adding declarative attributes to classes, interfaces, methods, and assemblies. This paper expounds the mechanism for implementing LCE event services with the delegate model in C#, introduces the use of the event class, event publisher, subscriber, and client in LCE technology, and analyzes the technical points of delegate-based LCE in plain language with practical cases.
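
    The paper's examples are in C#; as a language-neutral sketch of the delegate-based publish/subscribe pattern it describes, the Python toy below keeps an invocation list of handlers that an event fires. All names are illustrative.

    ```python
    # Python analogy of the delegate/event pattern: an Event holds a list
    # of callables (the "delegate" invocation list), a publisher fires it,
    # and subscribers attach handlers. Names are illustrative only.
    class Event:
        def __init__(self):
            self._handlers = []             # the delegate invocation list

        def subscribe(self, handler):
            self._handlers.append(handler)

        def fire(self, *args):
            for handler in self._handlers:  # invoke every subscribed handler
                handler(*args)

    class Publisher:
        def __init__(self):
            self.on_message = Event()

        def publish(self, text):
            self.on_message.fire(text)

    pub = Publisher()
    pub.on_message.subscribe(lambda msg: print("subscriber A got:", msg))
    pub.on_message.subscribe(lambda msg: print("subscriber B got:", msg))
    pub.publish("hello, loosely coupled events")
    ```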

  2. An Assessment of the Status of Captive Broodstock Technology of Pacific Salmon, 1995 Final Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flagg, Thomas A.; Mahnaken, Conrad V.W.; Hard, Jeffrey J.

    1995-06-01

    This report provides guidance for the refinement and use of captive broodstock technology for Pacific salmon (Oncorhynchus spp.) by bringing together information on the husbandry techniques, genetic risks, physiology, nutrition, and pathology affecting captive broodstocks. Captive broodstock rearing of Pacific salmon is an evolving technology, as yet without well defined standards. At present, we regard captive rearing of Pacific salmon as problematic: high mortality rates and low egg viability were common in the programs we reviewed for this report. One of the most important elements in fish husbandry is the culture environment itself. Many captive broodstock programs for Pacific salmon have reared fish from smolt-to-adult in seawater net-pens, and most have shown success in providing gametes for recovery efforts. However, some programs have lost entire brood years to diseases that transmitted rapidly in this medium. Current programs for endangered species of Pacific salmon rear most fish full-term to maturity in fresh well-water, since ground water is low in pathogens and thus helps ensure survival to adulthood. Our review suggested that captive rearing of fish in either freshwater, well-water, or filtered and sterilized seawater supplied to land-based tanks should produce higher survival than culture in seawater net-pens.

  3. Exploring inattention and distraction in the SafetyNet Accident Causation Database.

    PubMed

    Talbot, Rachel; Fagerlind, Helen; Morris, Andrew

    2013-11-01

    Distraction and inattention are considered to be very important and prevalent factors in the causation of road accidents. There have been many recent research studies which have attempted to understand the circumstances under which a driver becomes distracted or inattentive and how distraction/inattention can be prevented. Both factors are thought to have become more important in recent times, partly due to the evolution of in-vehicle information and communication technology. This study describes a methodology that was developed to understand when factors such as distraction and inattention may have contributed to crashes, and it also describes some of the consequences of distraction and inattention in terms of subsequent driver actions. The study uses data relating to distraction and inattention from the SafetyNet Accident Causation Database. This database was formulated as part of the SafetyNet project to address the lack of representative in-depth accident causation data within the European Union. Data were collected in 6 European countries using 'on-scene' and 'nearly on-scene' crash investigation methodologies. 32% of crashes recorded in the database involved at least one driver, rider or pedestrian who was determined to be 'Inattentive' or 'Distracted'. 212 of the drivers were assigned 'Distraction' and 140 drivers were given the code 'Inattention'. It was found that both distraction and inattention often lead to missed observations within the driving task, and consequently 'Timing' or 'Direction' become critical events in the aetiology of crashes. In addition, the crash types and outcomes may differ according to the type and nature of the distraction and inattention as determined by the in-depth investigations. The development of the accident coding methodology is described in this study, as is its evolution into the Driver Reliability and Error Analysis Model (DREAM) version 3.0. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. SOLID STATE ENERGY CONVERSION ALLIANCE DELPHI SOLID OXIDE FUEL CELL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steven Shaffer; Sean Kelly; Subhasish Mukerjee

    2004-05-07

    The objective of this project is to develop a 5 kW Solid Oxide Fuel Cell power system for a range of fuels and applications. During Phase I, the following will be accomplished: Develop and demonstrate technology transfer efforts on a 5 kW stationary distributed power generation system that incorporates steam reforming of natural gas with the option of piped-in water (Demonstration System A). Initiate development of a 5 kW system for later mass-market automotive auxiliary power unit application, which will incorporate Catalytic Partial Oxidation (CPO) reforming of gasoline, with anode exhaust gas injected into an ultra-lean burn internal combustion engine. This technical progress report covers work performed by Delphi from July 1, 2003 to December 31, 2003, under Department of Energy Cooperative Agreement DE-FC-02NT41246. This report highlights technical results of the work performed under the following tasks: Task 1 System Design and Integration; Task 2 Solid Oxide Fuel Cell Stack Developments; Task 3 Reformer Developments; Task 4 Development of Balance of Plant (BOP) Components; Task 5 Manufacturing Development (Privately Funded); Task 6 System Fabrication; Task 7 System Testing; Task 8 Program Management; Task 9 Stack Testing with Coal-Based Reformate; and Task 10 Technology Transfer from SECA CORE Technology Program. In this reporting period, unless otherwise noted, Task 6--System Fabrication and Task 7--System Testing will be reported within Task 1 System Design and Integration. Task 8--Program Management, Task 9--Stack Testing with Coal-Based Reformate, and Task 10--Technology Transfer from SECA CORE Technology Program will be reported on in the Executive Summary section of this report.

  5. Cost benefit analysis of space communications technology: Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Holland, L. D.; Sassone, P. G.; Gallagher, J. J.; Robinette, S. L.; Vogler, F. H.; Zimmer, R. P.

    1976-01-01

    The questions of (1) whether or not NASA should support the further development of space communications technology, and, if so, (2) which technology's support should be given the highest priority are addressed. Insofar as the issues deal principally with resource allocation, an economics perspective is adopted. The resultant cost benefit methodology utilizes the net present value concept in three distinct analysis stages to evaluate and rank those technologies which pass a qualification test based upon probable (private sector) market failure. User-preference and technology state-of-the-art surveys were conducted (in 1975) to form a data base for the technology evaluation. The program encompassed near-future technologies in space communications earth stations and satellites, including the noncommunication subsystems of the satellite (station keeping, electrical power system, etc.). Results of the research program include confirmation of the applicability of the methodology as well as a list of space communications technologies ranked according to the estimated net present value of their support (development) by NASA.
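
    To make the net present value stage concrete, the following Python sketch ranks hypothetical candidate technologies by NPV; the discount rate and cash flows are invented for illustration and are not the study's data.

    ```python
    # Net-present-value ranking of candidate technologies -- the core
    # arithmetic of the report's methodology. The discount rate and cash
    # flows below are invented for illustration, not the study's figures.
    def npv(rate, cash_flows):
        """NPV = sum over t of CF_t / (1 + rate)**t, with t = 0 the present."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

    candidates = {
        # technology: year-by-year net cash flows (year 0 = development cost)
        "earth-station technology A": [-10.0, 3.0, 4.0, 5.0, 5.0],
        "satellite technology B":     [-15.0, 2.0, 5.0, 7.0, 9.0],
    }

    ranked = sorted(candidates, key=lambda k: npv(0.08, candidates[k]),
                    reverse=True)
    for name in ranked:
        print(f"{name}: NPV = {npv(0.08, candidates[name]):.2f}")
    ```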

  6. Defending the Doomed: Implicit Strategies Concerning Protection of First-Person Shooter Games

    PubMed Central

    Munko, Daniel; Glock, Sabine; Bente, Gary

    2012-01-01

    Censorship of violent digital games, especially first-person shooter (FPS) games, is broadly discussed between generations. While older people are concerned about possible negative influences of these games, not only players but also nonplayers of the younger net-generation seem to deny any association with real aggressive behavior. Our study aimed at investigating defense mechanisms players and nonplayers use to defend FPS and peers with playing habits. By using a lexical decision task, we found that aggressive concepts are activated by priming the content of FPS but suppressed afterward. Only if participants were instructed to actively suppress aggressive concepts after priming was thought suppression no longer necessary. Young people still do have negative associations with violent video games. These associations are neglected by implicitly applying defense strategies—independent of own playing habits—to protect this specific hobby, which is common for the net-generation. PMID:22515170

  7. Software support for SBGN maps: SBGN-ML and LibSBGN.

    PubMed

    van Iersel, Martijn P; Villéger, Alice C; Czauderna, Tobias; Boyd, Sarah E; Bergmann, Frank T; Luna, Augustin; Demir, Emek; Sorokin, Anatoly; Dogrusoz, Ugur; Matsuoka, Yukiko; Funahashi, Akira; Aladjem, Mirit I; Mi, Huaiyu; Moodie, Stuart L; Kitano, Hiroaki; Le Novère, Nicolas; Schreiber, Falk

    2012-08-01

    LibSBGN is a software library for reading, writing and manipulating Systems Biology Graphical Notation (SBGN) maps stored using the recently developed SBGN-ML file format. The library (available in C++ and Java) makes it easy for developers to add SBGN support to their tools, whereas the file format facilitates the exchange of maps between compatible software applications. The library also supports validation of maps, which simplifies the task of ensuring compliance with the detailed SBGN specifications. With this effort we hope to increase the adoption of SBGN in bioinformatics tools, ultimately enabling more researchers to visualize biological knowledge in a precise and unambiguous manner. Milestone 2 was released in December 2011. Source code, example files and binaries are freely available under the terms of either the LGPL v2.1+ or Apache v2.0 open source licenses from http://libsbgn.sourceforge.net. sbgn-libsbgn@lists.sourceforge.net.

  8. Emergency response nurse scheduling with medical support robot by multi-agent and fuzzy technique.

    PubMed

    Kono, Shinya; Kitamura, Akira

    2015-08-01

    In this paper, a new cooperative re-scheduling method is described for medical support tasks whose time of occurrence cannot be predicted, assuming a robot can cooperate with the nurse on medical activities. Here, a Multi-Agent System (MAS) is used for the cooperative re-scheduling, in which a Fuzzy Contract Net (FCN) is applied to the robots' assignment of emergency tasks. Simulation results confirm that re-scheduling by the proposed method can maintain patient satisfaction and decrease the workload of the nurse.
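
    As a rough illustration of the contract-net idea behind FCN task assignment, the Python toy below has a manager award an unpredicted emergency task to the lowest-cost bidder; the agents, scores, and the replacement of fuzzy membership functions with plain numbers are all assumptions for illustration.

    ```python
    # Contract-net toy: announce a task, collect bids reflecting each
    # agent's workload and suitability, award to the lowest bidder. The
    # fuzzy scoring of the paper is simplified to plain numbers here.
    agents = {
        "nurse_1": {"workload": 0.7, "skill": 1.0},
        "nurse_2": {"workload": 0.3, "skill": 1.0},
        "robot_1": {"workload": 0.1, "skill": 0.6},  # free but less capable
    }

    def bid(agent, task_difficulty):
        a = agents[agent]
        # Lower is better: busy or under-skilled agents bid high.
        return a["workload"] + task_difficulty * (1.0 - a["skill"])

    task_difficulty = 0.4
    winner = min(agents, key=lambda a: bid(a, task_difficulty))
    print("task awarded to:", winner)   # robot_1: free enough to offset skill
    ```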

  9. Individual differences in the balance of GABA to glutamate in pFC predict the ability to select among competing options.

    PubMed

    de la Vega, Alejandro; Brown, Mark S; Snyder, Hannah R; Singel, Debra; Munakata, Yuko; Banich, Marie T

    2014-11-01

    Individuals vary greatly in their ability to select one item or response when presented with a multitude of options. Here we investigate the neural underpinnings of these individual differences. Using magnetic resonance spectroscopy, we found that the balance of inhibitory versus excitatory neurotransmitters in pFC predicts the ability to select among task-relevant options in two language production tasks. The greater an individual's concentration of GABA relative to glutamate in the lateral pFC, the more quickly he or she could select a relevant word from among competing options. This outcome is consistent with our computational modeling of this task [Snyder, H. R., Hutchison, N., Nyhus, E., Curran, T., Banich, M. T., O'Reilly, R. C., et al. Neural inhibition enables selection during language processing. Proceedings of the National Academy of Sciences, U.S.A., 107, 16483-16488, 2010], which predicts that greater net inhibition in pFC increases the efficiency of resolving competition among task-relevant options. Moreover, the association with the GABA/glutamate ratio was specific to selection and was not observed for executive function ability in general. These findings are the first to link the balance of excitatory and inhibitory neural transmission in pFC to specific aspects of executive function.

  10. SonoNet: Real-Time Detection and Localisation of Fetal Standard Scan Planes in Freehand Ultrasound.

    PubMed

    Baumgartner, Christian F; Kamnitsas, Konstantinos; Matthew, Jacqueline; Fletcher, Tara P; Smith, Sandra; Koch, Lisa M; Kainz, Bernhard; Rueckert, Daniel

    2017-11-01

    Identifying and interpreting fetal standard scan planes during 2-D ultrasound mid-pregnancy examinations are highly complex tasks, which require years of training. Apart from guiding the probe to the correct location, it can be equally difficult for a non-expert to identify relevant structures within the image. Automatic image processing can provide tools to help experienced as well as inexperienced operators with these tasks. In this paper, we propose a novel method based on convolutional neural networks, which can automatically detect 13 fetal standard views in freehand 2-D ultrasound data as well as provide a localization of the fetal structures via a bounding box. An important contribution is that the network learns to localize the target anatomy using weak supervision based on image-level labels only. The network architecture is designed to operate in real-time while providing optimal output for the localization task. We present results for real-time annotation, retrospective frame retrieval from saved videos, and localization on a very large and challenging dataset consisting of images and video recordings of full clinical anomaly screenings. We found that the proposed method achieved an average F1-score of 0.798 in a realistic classification experiment modeling real-time detection, and obtained a 90.09% accuracy for retrospective frame retrieval. Moreover, an accuracy of 77.8% was achieved on the localization task.
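
    For reference, the F1-score reported above is the harmonic mean of precision and recall; this is the standard definition, not a formula specific to SonoNet.

    ```latex
    % Standard definition of the F1-score: the harmonic mean of precision
    % and recall (a general formula, not specific to this paper).
    \[
    F_1 = \frac{2 \cdot \mathrm{precision} \cdot \mathrm{recall}}
               {\mathrm{precision} + \mathrm{recall}}
    \]
    ```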

  11. Life-Cycle Inventory Analysis of Bioproducts from a Modular Advanced Biomass Pyrolysis System

    Treesearch

    Richard Bergman; Hongmei Gu

    2014-01-01

    Expanding bioenergy production has the potential to reduce net greenhouse gas (GHG) emissions and improve energy security. Science-based assessments of new bioenergy technologies are essential tools for policy makers dealing with expanding renewable energy production. Using life cycle inventory (LCI) analysis, this study evaluated a 200-kWe...

  12. Objectives, Budgets, Thresholds, and Opportunity Costs-A Health Economics Approach: An ISPOR Special Task Force Report [4].

    PubMed

    Danzon, Patricia M; Drummond, Michael F; Towse, Adrian; Pauly, Mark V

    2018-02-01

    The fourth section of our Special Task Force report focuses on a health plan or payer's technology adoption or reimbursement decision, given the array of technologies, on the basis of their different values and costs. We discuss the role of budgets, thresholds, opportunity costs, and affordability in making decisions. First, we discuss the use of budgets and thresholds in private and public health plans, their interdependence, and connection to opportunity cost. Essentially, each payer should adopt a decision rule about what is good value for money given their budget; consistent use of a cost-per-quality-adjusted life-year threshold will ensure the maximum health gain for the budget. In the United States, different public and private insurance programs could use different thresholds, reflecting the differing generosity of their budgets and implying different levels of access to technologies. In addition, different insurance plans could consider different additional elements to the quality-adjusted life-year metric discussed elsewhere in our Special Task Force report. We then define affordability and discuss approaches to deal with it, including consideration of disinvestment and related adjustment costs, the impact of delaying new technologies, and comparative cost effectiveness of technologies. Over time, the availability of new technologies may increase the amount that populations want to spend on health care. We then discuss potential modifiers to thresholds, including uncertainty about the evidence used in the decision-making process. This article concludes by discussing the application of these concepts in the context of the pluralistic US health care system, as well as the "excess burden" of tax-financed public programs versus private programs. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
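
    As context for the threshold discussion, a common formalization (a standard health-economics textbook rule, not quoted from the report) compares a new technology's incremental cost-effectiveness ratio against the plan's threshold:

    ```latex
    % Standard cost-per-QALY decision rule (textbook formulation, not
    % quoted from the task force report): adopt a new technology when its
    % incremental cost-effectiveness ratio falls below the threshold.
    \[
    \mathrm{ICER} = \frac{C_{\mathrm{new}} - C_{\mathrm{old}}}
                         {E_{\mathrm{new}} - E_{\mathrm{old}}},
    \qquad \text{adopt if } \mathrm{ICER} \le \lambda,
    \]
    % where C denotes cost, E denotes effectiveness in QALYs, and lambda
    % reflects the plan's budget and the opportunity cost of displaced care.
    ```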

  13. SpectralNET – an application for spectral graph analysis and visualization

    PubMed Central

    Forman, Joshua J; Clemons, Paul A; Schreiber, Stuart L; Haggarty, Stephen J

    2005-01-01

    Background: Graph theory provides a computational framework for modeling a variety of datasets including those emerging from genomics, proteomics, and chemical genetics. Networks of genes, proteins, small molecules, or other objects of study can be represented as graphs of nodes (vertices) and interactions (edges) that can carry different weights. SpectralNET is a flexible application for analyzing and visualizing these biological and chemical networks. Results: Available both as a standalone .NET executable and as an ASP.NET web application, SpectralNET was designed specifically with the analysis of graph-theoretic metrics in mind, a computational task not easily accessible using currently available applications. Users can choose either to upload a network for analysis using a variety of input formats, or to have SpectralNET generate an idealized random network for comparison to a real-world dataset. Whichever graph-generation method is used, SpectralNET displays detailed information about each connected component of the graph, including graphs of degree distribution, clustering coefficient by degree, and average distance by degree. In addition, extensive information about the selected vertex is shown, including degree, clustering coefficient, various distance metrics, and the corresponding components of the adjacency, Laplacian, and normalized Laplacian eigenvectors. SpectralNET also displays several graph visualizations, including a linear dimensionality reduction for uploaded datasets (Principal Components Analysis) and a non-linear dimensionality reduction that provides an elegant view of global graph structure (Laplacian eigenvectors). Conclusion: SpectralNET provides an easily accessible means of analyzing graph-theoretic metrics for data modeling and dimensionality reduction. SpectralNET is publicly available as both a .NET application and an ASP.NET web application from http://chembank.broad.harvard.edu/resources/. Source code is available upon request. PMID:16236170

  14. SpectralNET--an application for spectral graph analysis and visualization.

    PubMed

    Forman, Joshua J; Clemons, Paul A; Schreiber, Stuart L; Haggarty, Stephen J

    2005-10-19

    Graph theory provides a computational framework for modeling a variety of datasets including those emerging from genomics, proteomics, and chemical genetics. Networks of genes, proteins, small molecules, or other objects of study can be represented as graphs of nodes (vertices) and interactions (edges) that can carry different weights. SpectralNET is a flexible application for analyzing and visualizing these biological and chemical networks. Available both as a standalone .NET executable and as an ASP.NET web application, SpectralNET was designed specifically with the analysis of graph-theoretic metrics in mind, a computational task not easily accessible using currently available applications. Users can choose either to upload a network for analysis using a variety of input formats, or to have SpectralNET generate an idealized random network for comparison to a real-world dataset. Whichever graph-generation method is used, SpectralNET displays detailed information about each connected component of the graph, including graphs of degree distribution, clustering coefficient by degree, and average distance by degree. In addition, extensive information about the selected vertex is shown, including degree, clustering coefficient, various distance metrics, and the corresponding components of the adjacency, Laplacian, and normalized Laplacian eigenvectors. SpectralNET also displays several graph visualizations, including a linear dimensionality reduction for uploaded datasets (Principal Components Analysis) and a non-linear dimensionality reduction that provides an elegant view of global graph structure (Laplacian eigenvectors). SpectralNET provides an easily accessible means of analyzing graph-theoretic metrics for data modeling and dimensionality reduction. SpectralNET is publicly available as both a .NET application and an ASP.NET web application from http://chembank.broad.harvard.edu/resources/. Source code is available upon request.
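
    Both SpectralNET records above describe the same spectral quantities; as a rough illustration (using Python with numpy and networkx, which are assumptions, since SpectralNET itself is a .NET/ASP.NET application), one can compute a graph's normalized Laplacian and use its low-order eigenvectors as a structural embedding:

    ```python
    # Compute the normalized Laplacian of a small example graph and take
    # its low-order eigenvectors as a non-linear 2-D embedding, as in
    # SpectralNET's Laplacian-eigenvector view. numpy/networkx assumed.
    import networkx as nx
    import numpy as np

    g = nx.karate_club_graph()                      # small example network
    L = nx.normalized_laplacian_matrix(g).toarray()
    eigvals, eigvecs = np.linalg.eigh(L)            # ascending eigenvalues

    # Eigenvectors 1 and 2 (skipping the trivial one) give coordinates
    # that reflect global graph structure.
    coords = eigvecs[:, 1:3]
    print(coords.shape)                             # (34, 2)
    ```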

  15. Task-technology fit of video telehealth for nurses in an outpatient clinic setting.

    PubMed

    Cady, Rhonda G; Finkelstein, Stanley M

    2014-07-01

    Incorporating telehealth into outpatient care delivery supports management of consumer health between clinic visits. Task-technology fit is a framework for understanding how technology helps and/or hinders a person during work processes. Evaluating the task-technology fit of video telehealth for personnel working in a pediatric outpatient clinic and providing care between clinic visits ensures the information provided matches the information needed to support work processes. The workflow of advanced practice registered nurse (APRN) care coordination provided via telephone and video telehealth was described and measured using a mixed-methods workflow analysis protocol that incorporated cognitive ethnography and time-motion study. Qualitative and quantitative results were merged and analyzed within the task-technology fit framework to determine the workflow fit of video telehealth for APRN care coordination. Incorporating video telehealth into APRN care coordination workflow provided visual information unavailable during telephone interactions. Despite additional tasks and interactions needed to obtain the visual information, APRN workflow efficiency, as measured by time, was not significantly changed. Analyzed within the task-technology fit framework, the increased visual information afforded by video telehealth supported the assessment and diagnostic information needs of the APRN. Telehealth must provide the right information to the right clinician at the right time. Evaluating task-technology fit using a mixed-methods protocol ensured rigorous analysis of fit within work processes and identified workflows that benefit most from the technology.

  16. Sensori-Motor Learning with Movement Sonification: Perspectives from Recent Interdisciplinary Studies.

    PubMed

    Bevilacqua, Frédéric; Boyer, Eric O; Françoise, Jules; Houix, Olivier; Susini, Patrick; Roby-Brami, Agnès; Hanneton, Sylvain

    2016-01-01

    This article reports on an interdisciplinary research project on movement sonification for sensori-motor learning. First, we describe different research fields which have contributed to movement sonification, from music technology including gesture-controlled sound synthesis, sonic interaction design, to research on sensori-motor learning with auditory-feedback. In particular, we propose to distinguish between sound-oriented tasks and movement-oriented tasks in experiments involving interactive sound feedback. We describe several research questions and recently published results on movement control, learning and perception. In particular, we studied the effect of the auditory feedback on movements considering several cases: from experiments on pointing and visuo-motor tracking to more complex tasks where interactive sound feedback can guide movements, or cases of sensory substitution where the auditory feedback can inform on object shapes. We also developed specific methodologies and technologies for designing the sonic feedback and movement sonification. We conclude with a discussion on key future research challenges in sensori-motor learning with movement sonification. We also point out toward promising applications such as rehabilitation, sport training or product design.

  17. MirandaNet: A Learning Community--A Community of Learners.

    ERIC Educational Resources Information Center

    Cuthell, John

    2002-01-01

    Explains MirandaNet, a learning community of teachers and academics as agents of change who use information and communications technology to change their teaching and learning practice and to develop innovative models for continuing professional development. Discusses distributed cognition in an online community. (LRW)

  18. Quantifying Phishing Susceptibility for Detection and Behavior Decisions.

    PubMed

    Canfield, Casey Inez; Fischhoff, Baruch; Davis, Alex

    2016-12-01

    We use signal detection theory to measure vulnerability to phishing attacks, including variation in performance across task conditions. Phishing attacks are difficult to prevent with technology alone, as long as technology is operated by people. Those responsible for managing security risks must understand user decision making in order to create and evaluate potential solutions. Using a scenario-based online task, we performed two experiments comparing performance on two tasks: detection, deciding whether an e-mail is phishing, and behavior, deciding what to do with an e-mail. In Experiment 1, we manipulated the order of the tasks and notification of the phishing base rate. In Experiment 2, we varied which task participants performed. In both experiments, despite exhibiting cautious behavior, participants' limited detection ability left them vulnerable to phishing attacks. Greater sensitivity was positively correlated with confidence. Greater willingness to treat e-mails as legitimate was negatively correlated with perceived consequences from their actions and positively correlated with confidence. These patterns were robust across experimental conditions. Phishing-related decisions are sensitive to individuals' detection ability, response bias, confidence, and perception of consequences. Performance differs when people evaluate messages or respond to them but not when their task varies in other ways. Based on these results, potential interventions include providing users with feedback on their abilities and information about the consequences of phishing, perhaps targeting those with the worst performance. Signal detection methods offer system operators quantitative assessments of the impacts of interventions and their residual vulnerability. © 2016, Human Factors and Ergonomics Society.
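
    For readers unfamiliar with signal detection measures, this Python sketch computes sensitivity (d') and response bias (criterion c) from hit and false-alarm rates under the standard equal-variance model; the counts are invented for illustration, and scipy is an assumed dependency.

    ```python
    # Equal-variance signal detection measures: d' (sensitivity) and
    # c (response bias) from hit and false-alarm rates. Counts invented.
    from scipy.stats import norm

    hits, misses = 42, 8                # phishing judged phishing / legitimate
    false_alarms, correct_rej = 12, 38  # legitimate judged phishing / legitimate

    H = hits / (hits + misses)                        # hit rate
    F = false_alarms / (false_alarms + correct_rej)   # false-alarm rate

    d_prime = norm.ppf(H) - norm.ppf(F)               # sensitivity
    c = -0.5 * (norm.ppf(H) + norm.ppf(F))            # response bias
    print(f"d' = {d_prime:.2f}, c = {c:.2f}")
    ```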

  19. Strategic Adaptation to Task Characteristics, Incentives, and Individual Differences in Dual-Tasking

    PubMed Central

    Janssen, Christian P.; Brumby, Duncan P.

    2015-01-01

    We investigate how good people are at multitasking by comparing behavior to a prediction of the optimal strategy for dividing attention between two concurrent tasks. In our experiment, 24 participants had to interleave entering digits on a keyboard with controlling a randomly moving cursor with a joystick. The difficulty of the tracking task was systematically varied as a within-subjects factor. Participants were also exposed to different explicit reward functions that varied the relative importance of the tracking task relative to the typing task (between-subjects). Results demonstrate that these changes in task characteristics and monetary incentives, together with individual differences in typing ability, influenced how participants choose to interleave tasks. This change in strategy then affected their performance on each task. A computational cognitive model was used to predict performance for a wide set of alternative strategies for how participants might have possibly interleaved tasks. This allowed for predictions of optimal performance to be derived, given the constraints placed on performance by the task and cognition. A comparison of human behavior with the predicted optimal strategy shows that participants behaved near optimally. Our findings have implications for the design and evaluation of technology for multitasking situations, as consideration should be given to the characteristics of the task, but also to how different users might use technology depending on their individual characteristics and their priorities. PMID:26161851

  20. Catamaran Nets

    NASA Technical Reports Server (NTRS)

    1990-01-01

    West Coast Netting, Inc.'s net of Hyperester twine is made of three strands of fiber twisted together by a company-invented sophisticated twisting machine and process that maintain precisely the same tension on each strand. The resulting twine offers higher strength and improved abrasion resistance. The technology that created the Hyperester supertwine has found spinoff applications, first as an extra-efficient seine for tuna fishing, then as a capture net for law enforcement agencies. The newest one is as a deck for racing catamarans. Hyperester twine net has been used on most of the high performance racing catamarans of recent years, including the America's Cup Challenge boats. The nets are tough and hold up well under continual exposure to sunlight and saltwater.

  1. Mars MetNet Mission Status

    NASA Astrophysics Data System (ADS)

    Harri, A.-M.; Aleksashkin, S.; Arruego, I.; Schmidt, W.; Genzer, M.; Vazquez, L.; Haukka, H.; Palin, M.; Nikkanen, T.

    2015-10-01

    A new kind of planetary exploration mission for Mars is under development in collaboration between the Finnish Meteorological Institute (FMI), Lavochkin Association (LA), Space Research Institute (IKI) and Instituto Nacional de Técnica Aeroespacial (INTA). The Mars MetNet mission is based on a new semihard landing vehicle called the MetNet Lander (MNL). The scientific payload of the Mars MetNet Precursor [1] mission is divided into three categories: atmospheric instruments, optical devices, and composition and structure devices. Each of the payload instruments will provide significant insights into the Martian atmospheric behavior. The key technologies of the MetNet Lander have been qualified, and the electrical qualification model (EQM) of the payload bay has been built and successfully tested.

  2. Advanced Platform Systems Technology study. Volume 2: Trade study and technology selection

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Three primary tasks were identified: Task 1, trade studies; Task 2, trade study comparison and technology selection; and Task 3, technology definition. The general objectives of Task 1 were to identify candidate technology trade areas, determine which areas have the highest potential payoff, define specific trades within the high payoff areas, and perform the trade studies. In order to satisfy these objectives, a structured, organized approach was employed. Candidate technology areas and specific trades were screened using consistent selection criteria and considering possible interrelationships. A data base comprising both manned and unmanned space platform documentation was used as a source of system and subsystem requirements. When requirements were not stated in the data base documentation, assumptions were made and recorded where necessary to characterize a particular spacecraft system. The requirements and assumptions were used together with the selection criteria to establish technology advancement goals and select trade studies. While both manned and unmanned platform data were used, the study was focused on the concept of an early manned space station.

  3. Technology Reinvestment Project Manufacturing Education and Training. Volume 1

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Bond, Arthur J.

    1997-01-01

    The manufacturing education program is a joint program between the University of Alabama in Huntsville's (UAH) College of Engineering and Alabama A&M University's (AAMU) School of Engineering and Technology. The objective of the program is to provide more hands-on experiences to undergraduate engineering and engineering technology students. The scope of work consisted of: Year 1, Task 1: Review courses at Alabama Industrial Development Training (AIDT); Task 2: Review courses at UAH and AAMU; Task 3: Develop new lab manuals; Task 4: Field test manuals; Task 5: Prepare annual report. Year 2, Task 1: Incorporate feedback into lab manuals; Task 2: Introduce lab manuals into classes; Task 3: Field test manuals; Task 4: Prepare annual report. Year 3, Task 1: Incorporate feedback into lab manuals; Task 2: Introduce lab manuals into remaining classes; Task 3: Conduct evaluation with assistance of industry; Task 4: Prepare final report. This report only summarizes the activities of the University of Alabama in Huntsville. The activities of Alabama A&M University are contained in a separate report.

  4. KM3NeT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jong, M. de; Leiden Institute of Physics, Leiden University, Leiden; Collaboration: KM3NeT Collaboration

    2015-07-15

    KM3NeT is a large research infrastructure, that will consist of a network of deep-sea neutrino telescopes in the Mediterranean Sea. The main objective of KM3NeT is the discovery and subsequent observation of high-energy neutrino sources in the Universe. A further physics perspective is the measurement of the mass hierarchy of neutrinos. A corresponding study, ORCA, is ongoing within KM3NeT. A cost effective technology for (very) large water Cherenkov detectors has been developed based on a new generation of low price 3-inch photo-multiplier tubes. Following the successful deployment and operation of two prototypes, the construction of the KM3NeT research infrastructure has started. The prospects of the different phases of the implementation of KM3NeT are summarised.

  5. KM3NeT

    NASA Astrophysics Data System (ADS)

    de Jong, M.

    2015-07-01

    KM3NeT is a large research infrastructure, that will consist of a network of deep-sea neutrino telescopes in the Mediterranean Sea. The main objective of KM3NeT is the discovery and subsequent observation of high-energy neutrino sources in the Universe. A further physics perspective is the measurement of the mass hierarchy of neutrinos. A corresponding study, ORCA, is ongoing within KM3NeT. A cost effective technology for (very) large water Cherenkov detectors has been developed based on a new generation of low price 3-inch photo-multiplier tubes. Following the successful deployment and operation of two prototypes, the construction of the KM3NeT research infrastructure has started. The prospects of the different phases of the implementation of KM3NeT are summarised.

  6. Cognitive Factors Affecting Free Recall, Cued Recall, and Recognition Tasks in Alzheimer's Disease

    PubMed Central

    Yamagishi, Takashi; Sato, Takuya; Sato, Atsushi; Imamura, Toru

    2012-01-01

    Background/Aims: Our aim was to identify cognitive factors affecting free recall, cued recall, and recognition tasks in patients with Alzheimer's disease (AD). Subjects: We recruited 349 consecutive AD patients who attended a memory clinic. Methods: Each patient was assessed using the Alzheimer's Disease Assessment Scale (ADAS) and the extended 3-word recall test. In this task, each patient was asked to freely recall 3 previously presented words. If patients could not recall 1 or more of the target words, the examiner cued their recall by providing the category of the target word and then provided a forced-choice recognition of the target word with 2 distracters. The patients were divided into groups according to the results of the free recall, cued recall, and recognition tasks. Multivariate logistic regression analysis for repeated measures was carried out to evaluate the net effects of cognitive factors on the free recall, cued recall, and recognition tasks after controlling for the effects of age and recent memory deficit. Results: Performance on the ADAS Orientation task was found to be related to performance on the free and cued recall tasks, performance on the ADAS Following Commands task was found to be related to performance on the cued recall task, and performance on the ADAS Ideational Praxis task was found to be related to performance on the free recall, cued recall, and recognition tasks. Conclusion: The extended 3-word recall test reflects deficits in a wider range of memory and other cognitive processes, including memory retention after interference, divided attention, and executive functions, compared with word-list recall tasks. The characteristics of the extended 3-word recall test may be advantageous for evaluating patients’ memory impairments in daily living. PMID:22962551

  7. Cognitive factors affecting free recall, cued recall, and recognition tasks in Alzheimer's disease.

    PubMed

    Yamagishi, Takashi; Sato, Takuya; Sato, Atsushi; Imamura, Toru

    2012-01-01

    Our aim was to identify cognitive factors affecting free recall, cued recall, and recognition tasks in patients with Alzheimer's disease (AD). We recruited 349 consecutive AD patients who attended a memory clinic. Each patient was assessed using the Alzheimer's Disease Assessment Scale (ADAS) and the extended 3-word recall test. In this task, each patient was asked to freely recall 3 previously presented words. If patients could not recall 1 or more of the target words, the examiner cued their recall by providing the category of the target word and then provided a forced-choice recognition of the target word with 2 distracters. The patients were divided into groups according to the results of the free recall, cued recall, and recognition tasks. Multivariate logistic regression analysis for repeated measures was carried out to evaluate the net effects of cognitive factors on the free recall, cued recall, and recognition tasks after controlling for the effects of age and recent memory deficit. Performance on the ADAS Orientation task was found to be related to performance on the free and cued recall tasks, performance on the ADAS Following Commands task was found to be related to performance on the cued recall task, and performance on the ADAS Ideational Praxis task was found to be related to performance on the free recall, cued recall, and recognition tasks. The extended 3-word recall test reflects deficits in a wider range of memory and other cognitive processes, including memory retention after interference, divided attention, and executive functions, compared with word-list recall tasks. The characteristics of the extended 3-word recall test may be advantageous for evaluating patients' memory impairments in daily living.

  8. cMapper: gene-centric connectivity mapper for EBI-RDF platform.

    PubMed

    Shoaib, Muhammad; Ansari, Adnan Ahmad; Ahn, Sung-Min

    2017-01-15

    In this era of biological big data, data integration has become a common task and a challenge for biologists. The Resource Description Framework (RDF) was developed to enable interoperability of heterogeneous datasets. The EBI-RDF platform enables efficient data integration of six independent biological databases using RDF technologies and shared ontologies. However, to take advantage of this platform, biologists need to be familiar with RDF technologies and the SPARQL query language. To overcome this practical limitation of the EBI-RDF platform, we developed cMapper, a web-based tool that enables biologists to search the EBI-RDF databases in a gene-centric manner without a thorough knowledge of RDF and SPARQL. cMapper allows biologists to search data entities in the EBI-RDF platform that are connected to genes or small molecules of interest in multiple biological contexts. The input to cMapper consists of a set of genes or small molecules, and the output is the set of data entities in six independent EBI-RDF databases connected with the given genes or small molecules in the user's query. cMapper provides output to users in the form of a graph in which nodes represent data entities and the edges represent connections between data entities and the input set of genes or small molecules. Furthermore, users can apply filters based on database, taxonomy, organ and pathways in order to focus on a core connectivity graph of their interest. Data entities from multiple databases are differentiated based on background colors. cMapper also enables users to investigate shared connections between genes or small molecules of interest. Users can view the output graph on a web browser or download it in either GraphML or JSON formats. cMapper is available as a web application with an integrated MySQL database. The web application was developed using Java and deployed on a Tomcat server. We developed the user interface using HTML5, JQuery and the Cytoscape Graph API. cMapper can be accessed at http://cmapper.ewostech.net. Readers can download the development manual from the website http://cmapper.ewostech.net/docs/cMapperDocumentation.pdf. Source code is available at https://github.com/muhammadshoaib/cmapper. Contact: smahn@gachon.ac.kr. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  9. A study of mass data storage technology for rocket engine data

    NASA Technical Reports Server (NTRS)

    Ready, John F.; Benser, Earl T.; Fritz, Bernard S.; Nelson, Scott A.; Stauffer, Donald R.; Volna, William M.

    1990-01-01

    The results of a nine month study program on mass data storage technology for rocket engine (especially the Space Shuttle Main Engine) health monitoring and control are summarized. The program had the objective of recommending a candidate mass data storage technology development for rocket engine health monitoring and control and of formulating a project plan and specification for that technology development. The work was divided into three major technical tasks: (1) development of requirements; (2) survey of mass data storage technologies; and (3) definition of a project plan and specification for technology development. The first of these tasks reviewed current data storage technology and developed a prioritized set of requirements for the health monitoring and control applications. The second task included a survey of state-of-the-art and newly developing technologies and a matrix-based ranking of the technologies. It culminated in a recommendation of optical disk technology as the best candidate for technology development. The final task defined a proof-of-concept demonstration, including tasks required to develop, test, analyze, and demonstrate the technology advancement, plus an estimate of the level of effort required. The recommended demonstration emphasizes development of an optical disk system which incorporates an order-of-magnitude increase in writing speed above the current state of the art.

  10. Ecological assessment of divided attention: What about the current tools and the relevancy of virtual reality.

    PubMed

    Lopez Maïté, C; Gaétane, D; Axel, C

    2016-01-01

    The ability to perform two tasks simultaneously has become increasingly important as attention-demanding technologies have become more common in daily life. This type of attentional resource allocation is commonly called "divided attention". Because of the importance of divided attention in natural world settings, substantial efforts have been made recently to promote an integrated, realistic assessment of functional abilities in dual-task paradigms. In this context, virtual reality methods appear to be a good solution. However, to date, there has been little discussion of the validity of such methods. Here, we offer a comparative review of conventional tools used to assess divided attention and of the first virtual reality studies (mostly from the field of road and pedestrian safety). The ecological character of virtual environments leads to a better understanding of the influence of dual-task settings and also makes it possible to clarify issues such as the utility of hands-free phones. After discussing the theoretical and clinical contributions of these studies, we discuss the limits of virtual reality assessment, focusing in particular: (i) on the challenges associated with lack of familiarity with new technological devices; (ii) on the validity of the ecological character of virtual environments; and (iii) on the question of whether the results obtained in a specific context can be generalized to all dual-task situations typical of daily life. To overcome the limitations associated with virtual reality, we propose: (i) to include a standardized familiarization phase in assessment protocols so as to limit the interference caused by the use of new technologies; (ii) to systematically compare virtual reality performance with conventional tests or real-life tests; and (iii) to design dual-task scenarios that are independent from the patient's expertise on one of the two tasks. We conclude that virtual reality appears to constitute a useful tool when used in combination with more conventional tests. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  11. Information flow analysis and Petri-net-based modeling for welding flexible manufacturing cell

    NASA Astrophysics Data System (ADS)

    Qiu, T.; Chen, Shanben; Wang, Y. T.; Wu, Lin

    2000-10-01

    With the development of advanced manufacturing technology and the introduction of the smart-manufacturing concept in modern industrial production, welding flexible manufacturing systems (WFMS) based on robot technology have become the inevitable direction for welding automation. In a WFMS, flexibility across different welding products and the corresponding control of welding parameters are the guarantees of welding quality. Based on a new intelligent arc-welding flexible manufacturing cell (WFMC), the system structure and control policies are studied in this paper. To capture the different information flows between each subsystem and the central monitoring computer in this WFMC, Petri net theory is introduced into the welding manufacturing process. A discrete control model of the WFMC is constructed, in which system states are modeled as places and control processes as transitions. Moreover, building on automated Petri net principles, the judging and use of information obtained from welding sensors are incorporated into the net structure, which extends traditional Petri net concepts. The control model and policies studied in this paper lay a foundation for further intelligent real-time control of WFMC and WFMS.
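
    To make the place/transition mapping concrete, the following is a minimal Petri-net "token game" sketch in Python. The two-transition welding-cell net, its place names, and the firing rule shown here are invented for illustration and are not the cell model from the paper.

      # Minimal Petri-net "token game" (illustrative): places hold tokens that
      # stand for system states; a transition fires only when every input
      # place is marked, consuming input tokens and producing output tokens.

      marking = {"part_ready": 1, "robot_idle": 1, "welding": 0, "done": 0}

      transitions = {
          "start_weld": {"in": ["part_ready", "robot_idle"], "out": ["welding"]},
          "end_weld":   {"in": ["welding"], "out": ["done", "robot_idle"]},
      }

      def fire(name):
          t = transitions[name]
          if not all(marking[p] > 0 for p in t["in"]):   # transition enabled?
              return False
          for p in t["in"]:
              marking[p] -= 1
          for p in t["out"]:
              marking[p] += 1
          return True

      fire("start_weld")
      fire("end_weld")
      print(marking)  # {'part_ready': 0, 'robot_idle': 1, 'welding': 0, 'done': 1}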

  12. Eye-Tracking as a Tool to Evaluate Functional Ability in Everyday Tasks in Glaucoma.

    PubMed

    Kasneci, Enkelejda; Black, Alex A; Wood, Joanne M

    2017-01-01

    To date, few studies have investigated the eye movement patterns of individuals with glaucoma while they undertake everyday tasks in real-world settings. While some of these studies have reported possible compensatory gaze patterns in those with glaucoma who demonstrated good task performance despite their visual field loss, little is known about the complex interaction between field loss and visual scanning strategies and the impact on task performance and, consequently, on quality of life. We review existing approaches that have quantified the effect of glaucomatous visual field defects on the ability to undertake everyday activities through the use of eye movement analysis. Furthermore, we discuss current developments in eye-tracking technology and the potential for combining eye-tracking with virtual reality and advanced analytical approaches. Recent technological developments suggest that systems based on eye-tracking have the potential to assist individuals with glaucomatous loss to maintain or even improve their performance on everyday tasks and hence enhance their long-term quality of life. We discuss novel approaches for studying the visual search behavior of individuals with glaucoma that have the potential to assist individuals with glaucoma, through the use of personalized programs that take into consideration the individual characteristics of their remaining visual field and visual search behavior.

  13. Eye-Tracking as a Tool to Evaluate Functional Ability in Everyday Tasks in Glaucoma

    PubMed Central

    Black, Alex A.

    2017-01-01

    To date, few studies have investigated the eye movement patterns of individuals with glaucoma while they undertake everyday tasks in real-world settings. While some of these studies have reported possible compensatory gaze patterns in those with glaucoma who demonstrated good task performance despite their visual field loss, little is known about the complex interaction between field loss and visual scanning strategies and the impact on task performance and, consequently, on quality of life. We review existing approaches that have quantified the effect of glaucomatous visual field defects on the ability to undertake everyday activities through the use of eye movement analysis. Furthermore, we discuss current developments in eye-tracking technology and the potential for combining eye-tracking with virtual reality and advanced analytical approaches. Recent technological developments suggest that systems based on eye-tracking have the potential to assist individuals with glaucomatous loss to maintain or even improve their performance on everyday tasks and hence enhance their long-term quality of life. We discuss novel approaches for studying the visual search behavior of individuals with glaucoma that have the potential to assist individuals with glaucoma, through the use of personalized programs that take into consideration the individual characteristics of their remaining visual field and visual search behavior. PMID:28293433

  14. Advanced Near Net Shape Technology

    NASA Technical Reports Server (NTRS)

    Vickers, John

    2015-01-01

    The objective of the Advanced Near Net Shape Technology (ANNST) project is to radically improve near net shape manufacturing methods from the current Technology/Manufacturing Readiness Levels (TRL/MRL 3-4) to the point where they are viable candidates (TRL/MRL 6) for shortening the time and cost for insertion of new aluminum alloys and revolutionary manufacturing methods into the development/improvement of space structures. Conventional cryotank manufacturing processes require fabrication of multiple pieces welded together to form a complete tank. A variety of near net shape manufacturing processes have demonstrated excellent potential for enabling single-piece construction of components such as domes, barrels, and ring frames. Utilization of such processes can dramatically reduce the extent of welding and joining needed to construct cryogenic tanks and other aerospace structures. The specific focus of this project is to successfully mature the integrally stiffened cylinder (ISC) process, in which a single-piece cylinder with integral stiffeners is formed in one spin/flow forming process. Structural launch vehicle components, like cryogenic fuel tanks (e.g., the space shuttle external tank), are currently fabricated via multipiece assembly of parts produced through subtractive manufacturing techniques. Stiffened structural panels are heavily machined from thick plate, which results in excessive scrap rates. Multipiece construction requires welds to assemble the structure, which increases the risk for defects and catastrophic failures.

  15. SimulaTE: simulating complex landscapes of transposable elements of populations.

    PubMed

    Kofler, Robert

    2018-04-15

    Estimating the abundance of transposable elements (TEs) in populations (or tissues) promises to answer many open research questions. However, progress is hampered by the lack of concordance between different approaches for TE identification and thus potentially unreliable results. To address this problem, we developed SimulaTE, a tool that generates TE landscapes for populations using a newly developed domain-specific language (DSL). The simple syntax of our DSL allows for easily building even complex TE landscapes that have, for example, nested, truncated and highly diverged TE insertions. Reads may be simulated for the populations using different sequencing technologies (PacBio, Illumina paired-ends) and strategies (sequencing individuals and pooled populations). The comparison between the expected (i.e. simulated) and the observed results will guide researchers in finding the most suitable approach for a particular research question. SimulaTE is implemented in Python and available at https://sourceforge.net/projects/simulates/. Manual: https://sourceforge.net/p/simulates/wiki/Home/#manual; test data and tutorials: https://sourceforge.net/p/simulates/wiki/Home/#walkthrough; validation: https://sourceforge.net/p/simulates/wiki/Home/#validation. Contact: robert.kofler@vetmeduni.ac.at.

  16. Leadership and transformational change in healthcare organisations: a qualitative analysis of the North East Transformation System.

    PubMed

    Erskine, Jonathan; Hunter, David J; Small, Adrian; Hicks, Chris; McGovern, Tom; Lugsden, Ed; Whitty, Paula; Steen, Nick; Eccles, Martin Paul

    2013-02-01

    The research project 'An Evaluation of Transformational Change in NHS North East' examines the progress and success of National Health Service (NHS) organisations in north east England in implementing and embedding the North East Transformation System (NETS), a region-wide programme to improve healthcare quality and safety, and to reduce waste, using a combination of Vision, Compact, and Lean-based Method. This paper concentrates on findings concerning the role of leadership in enabling transformational change, based on semi-structured interviews with a mix of senior NHS managers and quality improvement staff in 14 study sites. Most interviewees felt that implementing the NETS requires committed, stable leadership, attention to team-building across disciplines and leadership development at many levels. We conclude that without senior leader commitment to continuous improvement over a long time scale and serious efforts to distribute leadership tasks to all levels, healthcare organisations are less likely to achieve positive changes in managerial-clinical relations, sustainable improvements to organisational culture and, ultimately, the region-wide step change in quality, safety and efficiency that the NETS was designed to deliver. © The Author(s) 2013 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  17. Microgravity Manufacturing Via Fused Deposition

    NASA Technical Reports Server (NTRS)

    Cooper, K. G.; Griffin, M. R.

    2003-01-01

    Manufacturing polymer hardware during space flight is currently outside the state of the art. A process called fused deposition modeling (FDM) can make this approach a reality by producing net-shaped components of polymer materials directly from a CAE model. FDM is a rapid prototyping process developed by Stratasys, Inc., which deposits a fine line of semi-molten polymer onto a substrate while moving via computer control to form the cross-sectional shape of the part it is building. The build platen is then lowered and the process is repeated, building the component layer by layer. This method enables net-shaped production of polymer components directly from a computer file. The layered manufacturing process allows for the manufacture of complex shapes and internal cavities otherwise impossible to machine. This task demonstrated the benefits of the FDM technique to quickly and inexpensively produce replacement components or repair broken hardware in a Space Shuttle or Space Station environment. The intent of the task was to develop and fabricate an FDM system that was lightweight, compact, and required minimum power consumption to fabricate ABS plastic hardware in microgravity. The final product of the shortened task turned out to be a ground-based breadboard device, demonstrating miniaturization capability of the system.

  18. China's Chemical Information Online Service: ChI2Net.

    ERIC Educational Resources Information Center

    Naiyan, Yu; And Others

    1997-01-01

    Describes the Chemical Integrated Information Service Network (ChI2Net), a comprehensive online information service system which includes chemical, technical, economic, market, news, and management information based on computer and modern communication technology that was built by the China National Chemical Information Centre. (Author/LRW)

  19. Instructor Perceptions of Web Technology Feature and Instructional Task Fit

    ERIC Educational Resources Information Center

    Strader, Troy J.; Reed, Diana; Suh, Inchul; Njoroge, Joyce W.

    2015-01-01

    In this exploratory study, university faculty (instructor) perceptions of the extent to which eight unique features of Web technology are useful for various instructional tasks are identified. Task-technology fit propositions are developed and tested using data collected from a survey of instructors in business, pharmacy, and arts/humanities. It…

  20. Stochastic model predicts evolving preferences in the Iowa gambling task

    PubMed Central

    Fuentes, Miguel A.; Lavín, Claudio; Contreras-Huerta, L. Sebastián; Miguel, Hernan; Rosales Jubal, Eduardo

    2014-01-01

    Learning under uncertainty is a common task that people face in their daily life. This process relies on the cognitive ability to adjust behavior to environmental demands. Although the biological underpinnings of those cognitive processes have been extensively studied, there has been little work in formal models seeking to capture the fundamental dynamic of learning under uncertainty. In the present work, we aimed to understand the basic cognitive mechanisms of outcome processing involved in decisions under uncertainty and to evaluate the relevance of previous experiences in enhancing learning processes within such uncertain context. We propose a formal model that emulates the behavior of people playing a well established paradigm (Iowa Gambling Task - IGT) and compare its outcome with a behavioral experiment. We further explored whether it was possible to emulate maladaptive behavior observed in clinical samples by modifying the model parameter which controls the update of expected outcomes distributions. Results showed that the performance of the model resembles the observed participant performance as well as IGT performance by healthy subjects described in the literature. Interestingly, the model converges faster than some subjects on the decks with higher net expected outcome. Furthermore, the modified version of the model replicated the trend observed in clinical samples performing the task. We argue that the basic cognitive component underlying learning under uncertainty can be represented as a differential equation that considers the outcomes of previous decisions for guiding the agent to an adaptive strategy. PMID:25566043

  1. Stochastic model predicts evolving preferences in the Iowa gambling task.

    PubMed

    Fuentes, Miguel A; Lavín, Claudio; Contreras-Huerta, L Sebastián; Miguel, Hernan; Rosales Jubal, Eduardo

    2014-01-01

    Learning under uncertainty is a common task that people face in their daily life. This process relies on the cognitive ability to adjust behavior to environmental demands. Although the biological underpinnings of those cognitive processes have been extensively studied, there has been little work in formal models seeking to capture the fundamental dynamic of learning under uncertainty. In the present work, we aimed to understand the basic cognitive mechanisms of outcome processing involved in decisions under uncertainty and to evaluate the relevance of previous experiences in enhancing learning processes within such uncertain context. We propose a formal model that emulates the behavior of people playing a well established paradigm (Iowa Gambling Task - IGT) and compare its outcome with a behavioral experiment. We further explored whether it was possible to emulate maladaptive behavior observed in clinical samples by modifying the model parameter which controls the update of expected outcomes distributions. Results showed that the performance of the model resembles the observed participant performance as well as IGT performance by healthy subjects described in the literature. Interestingly, the model converges faster than some subjects on the decks with higher net expected outcome. Furthermore, the modified version of the model replicated the trend observed in clinical samples performing the task. We argue that the basic cognitive component underlying learning under uncertainty can be represented as a differential equation that considers the outcomes of previous decisions for guiding the agent to an adaptive strategy.
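
    The paper's differential-equation model is not reproduced in the abstract, but a standard delta-rule expectancy update of the kind commonly fitted to IGT data conveys the idea: expected deck outcomes drift toward observed payoffs, so decks with a higher net expected outcome come to dominate choice. The learning rate, payoff scheme, and noisy-greedy choice rule below are illustrative assumptions, not the authors' model.

      import random

      # Delta-rule expectancy update for an IGT-like task (illustrative; not
      # the authors' model). Each deck's expectancy E[d] drifts toward the
      # outcomes actually experienced, at a rate set by phi.

      phi = 0.2                     # learning rate for new outcomes
      E = [0.0, 0.0, 0.0, 0.0]      # expected payoff per deck (A-D)

      def choose():
          # Noisy-greedy choice; fitted models typically use a softmax.
          return max(range(4), key=lambda d: E[d] + random.gauss(0, 10))

      def payoff(deck):
          # Decks 0-1: large rewards but larger losses (negative net outcome);
          # decks 2-3: small rewards and small losses (positive net outcome).
          return random.choice([100, -250]) if deck < 2 else random.choice([50, -25])

      for _ in range(200):
          d = choose()
          E[d] += phi * (payoff(d) - E[d])   # move expectancy toward outcome

      print([round(e, 1) for e in E])        # "good" decks end up preferred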

  2. Foster Wheeler's Solutions for Large Scale CFB Boiler Technology: Features and Operational Performance of Łagisza 460 MWe CFB Boiler

    NASA Astrophysics Data System (ADS)

    Hotta, Arto

    During recent years, once-through supercritical (OTSC) CFB technology has been developed, enabling CFB technology to proceed to medium-scale (500 MWe) utility projects such as the Łagisza Power Plant in Poland, owned by Poludniowy Koncern Energetyczny SA (PKE), with net efficiency of nearly 44%. The Łagisza power plant is currently being commissioned and reached full-load operation in March 2009. Initial operation shows very good performance and confirms that the CFB process scales to this size without problems. The once-through steam cycle utilizing Siemens' vertical tube Benson technology has also performed as predicted in the CFB process. Foster Wheeler has developed the CFB design further, up to 800 MWe with a net efficiency of ≥45%.

  3. Comparative Effectiveness of a Technology-Facilitated Depression Care Management Model in Safety-Net Primary Care Patients With Type 2 Diabetes: 6-Month Outcomes of a Large Clinical Trial

    PubMed Central

    Ell, Kathleen; Jin, Haomiao; Vidyanti, Irene; Chou, Chih-Ping; Lee, Pey-Jiuan; Gross-Schulman, Sandra; Sklaroff, Laura Myerchin; Belson, David; Nezu, Arthur M; Hay, Joel; Wang, Chien-Ju; Scheib, Geoffrey; Di Capua, Paul; Hawkins, Caitlin; Liu, Pai; Ramirez, Magaly; Wu, Brian W; Richman, Mark; Myers, Caitlin; Agustines, Davin; Dasher, Robert; Kopelowicz, Alex; Allevato, Joseph; Roybal, Mike; Ipp, Eli; Haider, Uzma; Graham, Sharon; Mahabadi, Vahid; Guterman, Jeffrey

    2018-01-01

    Background: Comorbid depression is a significant challenge for safety-net primary care systems. Team-based collaborative depression care is effective, but complex system factors in safety-net organizations impede adoption and result in persistent disparities in outcomes. Diabetes-Depression Care-management Adoption Trial (DCAT) evaluated whether depression care could be significantly improved by harnessing information and communication technologies to automate routine screening and monitoring of patient symptoms and treatment adherence and allow timely communication with providers. Objective: The aim of this study was to compare 6-month outcomes of a technology-facilitated care model with a usual care model and a supported care model that involved team-based collaborative depression care for safety-net primary care adult patients with type 2 diabetes. Methods: DCAT is a translational study in collaboration with Los Angeles County Department of Health Services, the second largest safety-net care system in the United States. A comparative effectiveness study with quasi-experimental design was conducted in three groups of adult patients with type 2 diabetes to compare three delivery models: usual care, supported care, and technology-facilitated care. Six-month outcomes included depression and diabetes care measures and patient-reported outcomes. Comparative treatment effects were estimated by linear or logistic regression models that used generalized propensity scores to adjust for sampling bias inherent in the nonrandomized design. Results: DCAT enrolled 1406 patients (484 in usual care, 480 in supported care, and 442 in technology-facilitated care), most of whom were Hispanic or Latino and female. Compared with usual care, both the supported care and technology-facilitated care groups were associated with significant reduction in depressive symptoms measured by scores on the 9-item Patient Health Questionnaire (least squares estimate, LSE: usual care=6.35, supported care=5.05, technology-facilitated care=5.16; P value: supported care vs usual care=.02, technology-facilitated care vs usual care=.02); decreased prevalence of major depression (odds ratio, OR: supported care vs usual care=0.45, technology-facilitated care vs usual care=0.33; P value: supported care vs usual care=.02, technology-facilitated care vs usual care=.007); and reduced functional disability as measured by Sheehan Disability Scale scores (LSE: usual care=3.21, supported care=2.61, technology-facilitated care=2.59; P value: supported care vs usual care=.04, technology-facilitated care vs usual care=.03). Technology-facilitated care was significantly associated with depression remission (technology-facilitated care vs usual care: OR=2.98, P=.04); increased satisfaction with care for emotional problems among depressed patients (LSE: usual care=3.20, technology-facilitated care=3.70; P=.05); reduced total cholesterol level (LSE: usual care=176.40, technology-facilitated care=160.46; P=.01); improved satisfaction with diabetes care (LSE: usual care=4.01, technology-facilitated care=4.20; P=.05); and increased odds of taking a glycated hemoglobin test (technology-facilitated care vs usual care: OR=3.40, P<.001). Conclusions: Both the technology-facilitated care and supported care delivery models showed potential to improve 6-month depression and functional disability outcomes. The technology-facilitated care model has a greater likelihood to improve depression remission, patient satisfaction, and diabetes care quality. PMID:29685872

  4. Guidance, Navigation and Control Digital Emulation Technology Laboratory. Volume 1. Part 2. Task 1: Digital Emulation Technology Laboratory

    DTIC Science & Technology

    1991-09-27

    AD-A241 692. Annual Report, Volume 1, Part 2, Task 1: Digital Emulation Technology Laboratory. Report No. AR-0142-91-001, September 27, 1991. Contract No. DASG60-89-C-0142, sponsored by the United States Army Strategic Defense Command. Authors: Thomas R. Collins and Stephen R. Wachtel.

  5. Virtual Reality-Based Center of Mass-Assisted Personalized Balance Training System.

    PubMed

    Kumar, Deepesh; González, Alejandro; Das, Abhijit; Dutta, Anirban; Fraisse, Philippe; Hayashibe, Mitsuhiro; Lahiri, Uttama

    2017-01-01

    Poststroke hemiplegic patients often show altered weight distribution with balance disorders, increasing their risk of falls. Conventional balance training, though powerful, suffers from a scarcity of trained therapists, frequent clinic visits, one-on-one therapy sessions, and the monotony of repetitive exercise tasks. Technology-assisted balance rehabilitation can therefore be an alternative solution. Here, we chose virtual reality as a technology-based platform to develop motivating balance tasks. This platform was augmented with off-the-shelf sensors such as the Nintendo Wii balance board and Kinect to estimate one's center of mass (CoM). The virtual reality-based CoM-assisted balance tasks (Virtual CoMBaT) system was designed to be adaptive to one's individualized weight-shifting capability, quantified through CoM displacement. Participants were asked to interact with Virtual CoMBaT, which offered tasks of varying challenge levels while requiring adherence to the ankle strategy for weight shifting. To help patients use the ankle strategy during weight shifting, we designed a heel lift detection module. A usability study was carried out with 12 hemiplegic patients. Results indicate the potential of our system to contribute to improving one's overall performance in balance-related tasks belonging to different difficulty levels.
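
    For a sense of how off-the-shelf balance hardware supports this kind of system, here is a minimal center-of-pressure computation from four corner load cells, as on a Wii balance board. The board dimensions and axis conventions are assumed for illustration; estimating center of mass from such center-of-pressure data would require additional body modeling.

      # Center-of-pressure estimate from four corner load cells, as on a Wii
      # balance board (illustrative: board dimensions and axis conventions
      # are assumed).

      BOARD_W, BOARD_L = 43.3, 23.8    # sensing surface width/length, cm

      def center_of_pressure(tl, tr, bl, br):
          """tl, tr, bl, br: vertical forces at the four corners (N)."""
          total = tl + tr + bl + br
          if total <= 0:
              raise ValueError("no load on the board")
          cop_x = (BOARD_W / 2) * ((tr + br) - (tl + bl)) / total  # + is right
          cop_y = (BOARD_L / 2) * ((tl + tr) - (bl + br)) / total  # + is forward
          return cop_x, cop_y

      # A rightward weight shift loads the right-hand sensors more:
      print(center_of_pressure(tl=10, tr=20, bl=10, br=20))   # (~7.2, 0.0)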

  6. Automatic bladder segmentation from CT images using deep CNN and 3D fully connected CRF-RNN.

    PubMed

    Xu, Xuanang; Zhou, Fugen; Liu, Bo

    2018-03-19

    An automatic approach for bladder segmentation from computed tomography (CT) images is highly desirable in clinical practice. It is a challenging task since the bladder usually suffers large variations of appearance and low soft-tissue contrast in CT images. In this study, we present a deep learning-based approach which involves a convolutional neural network (CNN) and a 3D fully connected conditional random field recurrent neural network (CRF-RNN) to perform accurate bladder segmentation. We also propose a novel preprocessing method, called dual-channel preprocessing, to further advance the segmentation performance of our approach. The presented approach works as follows: first, we apply our proposed preprocessing method on the input CT image and obtain a dual-channel image which consists of the CT image and an enhanced bladder density map. Second, we exploit a CNN to predict a coarse voxel-wise bladder score map on this dual-channel image. Finally, a 3D fully connected CRF-RNN refines the coarse bladder score map and produces the final fine-localized segmentation result. We compare our approach to the state-of-the-art V-net on a clinical dataset. Results show that our approach achieves superior segmentation accuracy, outperforming the V-net by a significant margin. The Dice Similarity Coefficient of our approach (92.24%) is 8.12% higher than that of the V-net. Moreover, the bladder probability maps produced by our approach present sharper boundaries and more accurate localizations compared with those of the V-net. Our approach achieves higher segmentation accuracy than the state-of-the-art method on clinical data. Both the dual-channel processing and the 3D fully connected CRF-RNN contribute to this improvement. The united deep network composed of the CNN and 3D CRF-RNN also outperforms a system where the CRF model acts as a post-processing method disconnected from the CNN.
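
    The dual-channel preprocessing idea can be sketched as stacking the raw CT slice with an intensity-windowed map that emphasizes the soft-tissue range where the bladder lies. The Hounsfield-unit window bounds and normalization below are illustrative assumptions, not the authors' published values.

      import numpy as np

      # Sketch of dual-channel preprocessing: stack the raw CT slice with an
      # intensity-windowed "bladder density" map emphasizing the soft-tissue
      # range. The HU window bounds are illustrative, not the paper's values.

      def dual_channel(ct_slice, lo=-50.0, hi=100.0):
          ct = ct_slice.astype(np.float32)
          enhanced = np.clip((ct - lo) / (hi - lo), 0.0, 1.0)       # window
          raw = (ct - ct.min()) / (ct.max() - ct.min() + 1e-8)      # normalize
          return np.stack([raw, enhanced], axis=0)                  # (2, H, W)

      slice_hu = np.random.uniform(-1000, 1000, size=(512, 512))    # stand-in CT
      print(dual_channel(slice_hu).shape)                           # (2, 512, 512)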

  7. Team performance in networked supervisory control of unmanned air vehicles: effects of automation, working memory, and communication content.

    PubMed

    McKendrick, Ryan; Shaw, Tyler; de Visser, Ewart; Saqer, Haneen; Kidwell, Brian; Parasuraman, Raja

    2014-05-01

    We assessed team performance within a networked supervisory control setting while manipulating automated decision aids and monitoring team communication and working memory ability. Networked systems such as multi-unmanned air vehicle (UAV) supervision have complex properties that make prediction of human-system performance difficult. Automated decision aids can provide valuable information to operators, individual abilities can limit or facilitate team performance, and team communication patterns can alter how effectively individuals work together. We hypothesized that reliable automation, higher working memory capacity, and increased communication rates of task-relevant information would offset performance decrements attributed to high task load. Two-person teams performed a simulated air defense task with two levels of task load and three levels of automated aid reliability. Teams communicated and received decision aid messages via chat window text messages. Task Load × Automation effects were significant across all performance measures. Reliable automation limited the decline in team performance with increasing task load. Average team spatial working memory was a stronger predictor than other measures of team working memory. Frequency of team rapport and enemy location communications positively related to team performance, and word count was negatively related to team performance. Reliable decision aiding mitigated team performance decline during increased task load during multi-UAV supervisory control. Team spatial working memory, communication of spatial information, and team rapport predicted team success. An automated decision aid can improve team performance under high task load. Assessment of spatial working memory and the communication of task-relevant information can help in operator and team selection in supervisory control systems.

  8. Presentation of the results of a Bayesian automatic event detection and localization program to human analysts

    NASA Astrophysics Data System (ADS)

    Kushida, N.; Kebede, F.; Feitio, P.; Le Bras, R.

    2016-12-01

    The Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has been developing and testing NET-VISA (Arora et al., 2013), a Bayesian automatic event detection and localization program, and evaluating its performance in a realistic operational mode. In our preliminary testing at the CTBTO, NET-VISA shows better performance than the currently operating automatic localization program. However, given CTBTO's role and its international context, a new technology should be introduced cautiously when it replaces a key piece of the automatic processing. We integrated the results of NET-VISA into the Analyst Review Station, extensively used by the analysts, so that they can check the accuracy and robustness of the Bayesian approach. We expect the workload of the analysts to be reduced because of the better performance of NET-VISA in finding missed events and assembling a more complete set of stations than the current system, which has been operating for nearly twenty years. The results of a series of tests indicate that the expectations raised by the automatic tests (an overall overlap improvement of 11%, meaning that the missed-event rate is cut by 42%) hold for the integrated interactive module as well. Analysts also find new events, beyond the ones analyzed through the standard procedures, that qualify for the CTBTO Reviewed Event Bulletin. Arora, N., Russell, S., and Sudderth, E., NET-VISA: Network Processing Vertically Integrated Seismic Analysis, 2013, Bull. Seismol. Soc. Am., 103, 709-729.

  9. Existing generating assets squeezed as new project starts slow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, R.B.; Tiffany, E.D.

    Most forecasting reports concentrate on political or regulatory events to predict future industry trends. Frequently overlooked are the more empirical performance trends of the principal power generation technologies. Solomon and Associates queried its many power plant performance databases and crunched some numbers to identify those trends. Areas of investigation included reliability, utilization (net output factor and net capacity factor) and cost (operating costs). An in-depth analysis for North America and Europe is presented in this article, by region and by generation technology. 4 figs., 2 tabs.

  10. NutriNet: A Deep Learning Food and Drink Image Recognition System for Dietary Assessment

    PubMed Central

    Koroušić Seljak, Barbara

    2017-01-01

    Automatic food image recognition systems are alleviating the process of food-intake estimation and dietary assessment. However, due to the nature of food images, their recognition is a particularly challenging task, which is why traditional approaches in the field have achieved a low classification accuracy. Deep neural networks have outperformed such solutions, and we present a novel approach to the problem of food and drink image detection and recognition that uses a newly-defined deep convolutional neural network architecture, called NutriNet. This architecture was tuned on a recognition dataset containing 225,953 512 × 512 pixel images of 520 different food and drink items from a broad spectrum of food groups, on which we achieved a classification accuracy of 86.72%, along with an accuracy of 94.47% on a detection dataset containing 130,517 images. We also performed a real-world test on a dataset of self-acquired images, combined with images from Parkinson’s disease patients, all taken using a smartphone camera, achieving a top-five accuracy of 55%, which is an encouraging result for real-world images. Additionally, we tested NutriNet on the University of Milano-Bicocca 2016 (UNIMIB2016) food image dataset, on which we improved upon the provided baseline recognition result. An online training component was implemented to continually fine-tune the food and drink recognition model on new images. The model is being used in practice as part of a mobile app for the dietary assessment of Parkinson’s disease patients. PMID:28653995
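
    The top-five accuracy reported for the real-world test simply asks whether the true label appears among the five highest-scoring classes. A minimal sketch follows; the class count matches the paper, but the scores and labels are random stand-ins.

      import numpy as np

      # Top-k accuracy: a sample counts as correct when the true label is
      # among the k highest-scoring classes (k=5 in the real-world test).

      def top_k_accuracy(scores, labels, k=5):
          """scores: (n_samples, n_classes); labels: (n_samples,) true ids."""
          topk = np.argsort(scores, axis=1)[:, -k:]    # k best classes per row
          return np.mean([label in row for row, label in zip(topk, labels)])

      scores = np.random.rand(1000, 520)               # 520 food/drink classes
      labels = np.random.randint(0, 520, size=1000)
      print(top_k_accuracy(scores, labels))            # ~5/520 for random scores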

  11. Army AL&T, July-September 2008

    DTIC Science & Technology

    2008-09-01

    ...Technology, and Logistics (AT&L) Workforce and will summarize best practices, specific initiatives, and relevant accomplishments of DOD and the... logistics, and technology (AL&T) community. We have a vast number of programs that range from developing transformational technologies for our...

  12. Using Browser Notebooks to Analyse Big Atmospheric Data-sets in the Cloud

    NASA Astrophysics Data System (ADS)

    Robinson, N.; Tomlinson, J.; Arribas, A.; Prudden, R.

    2016-12-01

    We present an account of our experience building an ecosystem for the analysis of big atmospheric datasets. Using modern technologies, we have developed a prototype platform which is scalable and capable of analysing very large atmospheric datasets. We tested different big-data ecosystems such as Hadoop MapReduce, Spark and Dask, in order to find the one best suited for analysis of multidimensional binary data such as NetCDF. We make extensive use of infrastructure-as-code and containerisation to provide a platform which is reusable, and which can scale to accommodate changes in demand. We make this platform readily accessible through browser-based notebooks. As a result, analysts with minimal technology experience can, in tens of lines of Python, make interactive data-visualisation web pages which analyse very large amounts of data using cutting-edge big-data technology.
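
    A typical pattern on such a platform is to open a many-file NetCDF dataset lazily with Dask-backed xarray and reduce it from a notebook in a few lines. The file paths, variable name, and chunk sizes below are illustrative, and xarray and dask are assumed to be installed.

      import xarray as xr

      # Open a many-file NetCDF dataset lazily with Dask-backed chunks, then
      # reduce it in parallel; nothing is loaded until .compute() is called.
      # Paths, variable name and chunk sizes are illustrative.

      ds = xr.open_mfdataset("data/temperature_*.nc",
                             combine="by_coords",
                             chunks={"time": 100})

      mean_temp = ds["t2m"].mean(dim="time")   # climatological mean, lazy
      print(mean_temp.compute())               # triggers the Dask computation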

  13. Systems Librarian and Automation Review.

    ERIC Educational Resources Information Center

    Schuyler, Michael

    1992-01-01

    Discusses software sharing on computer networks and the need for proper bandwidth; and describes the technology behind FidoNet, a computer network made up of electronic bulletin boards. Network features highlighted include front-end mailers, Zone Mail Hour, Nodelist, NetMail, EchoMail, computer conferences, tosser and scanner programs, and host…

  14. Status Report on Image Information Systems and Image Data Base Technology

    DTIC Science & Technology

    1989-12-01

    PowerHouse, StarGate, StarNet. Significant Recent Developments: acceptance at major teaching universities (Australia), U.S.A.F., major corporations. Future... scenario, all computers must be VAX). STARBASE: StarBase StarNet (network server), StarBase StarGate (SQL gateway). SYBASE: Sybase is an inherently...

  15. Evaluating the Technical and Economic Performance of PV Plus Storage Power Plants: Report Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Denholm, Paul L.; Margolis, Robert M.; Eichman, Joshua D.

    The decreasing costs of both PV and energy storage technologies have raised interest in the creation of combined PV plus storage systems to provide dispatchable energy and reliable capacity. In this study, we examine the tradeoffs among various PV plus storage configurations and quantify the impact of configuration on system net value.

  16. Evaluating the Technical and Economic Performance of PV Plus Storage Power Plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Denholm, Paul L.; Margolis, Robert M.; Eichman, Joshua D.

    The decreasing costs of both PV and energy storage technologies have raised interest in the creation of combined PV plus storage systems to provide dispatchable energy and reliable capacity. In this study, we examine the tradeoffs among various PV plus storage configurations and quantify the impact of configuration on system net value.

  17. Choose to Use: Scaffolding for Technology Learning Needs in a Project-Based Learning Environment

    ERIC Educational Resources Information Center

    Weimer, Peggy D.

    2017-01-01

    Project-based learning is one approach used by teachers to meet the challenge of developing more technologically proficient students. This approach, however, requires students to manage a large number of tasks including the mastery of technology. If a student's perception that their capability to perform a task falls below the task's difficulty,…

  18. Understanding Skill in EVA Mass Handling. Volume 4; An Integrated Methodology for Evaluating Space Suit Mobility and Stability

    NASA Technical Reports Server (NTRS)

    McDonald, P. Vernon; Newman, Dava

    1999-01-01

    The empirical investigation of extravehicular activity (EVA) mass handling conducted on NASA's Precision Air-Bearing Floor led to a Phase I SBIR from JSC. The purpose of the SBIR was to design an innovative system for evaluating space suit mobility and stability in conditions that simulate EVA on the surface of the Moon or Mars. The approach we used to satisfy the Phase I objectives was based on a structured methodology for the development of human-systems technology. Accordingly, the project was broken down into a number of tasks and subtasks. In sequence, the major tasks were: 1) Identify missions and tasks that will involve EVA and resulting mobility requirements in the near and long term; 2) Assess possible methods for evaluating mobility of space suits during field-based EVA tests; 3) Identify requirements for behavioral evaluation by interacting with NASA stakeholders; 4) Identify necessary and sufficient technology for implementation of a mobility evaluation system; and 5) Prioritize and select technology solutions. The work conducted in these tasks is described in this final volume of the series on EVA mass handling. While prior volumes in the series focus on novel data-analytic techniques, this volume addresses technology that is necessary for minimally intrusive data collection and near-real-time data analysis and display.

  19. Convolutional Neural Network for Histopathological Analysis of Osteosarcoma.

    PubMed

    Mishra, Rashika; Daescu, Ovidiu; Leavey, Patrick; Rakheja, Dinesh; Sengupta, Anita

    2018-03-01

    Pathologists often deal with high complexity and sometimes disagreement over osteosarcoma tumor classification due to cellular heterogeneity in the dataset. Segmentation and classification of histology tissue in H&E stained tumor image datasets is a challenging task because of intra-class variations, inter-class similarity, crowded context, and noisy data. In recent years, deep learning approaches have led to encouraging results in breast cancer and prostate cancer analysis. In this article, we propose a convolutional neural network (CNN) as a tool to improve efficiency and accuracy of osteosarcoma tumor classification into tumor classes (viable tumor, necrosis) versus nontumor. The proposed CNN architecture contains eight learned layers: three sets of stacked two convolutional layers interspersed with max pooling layers for feature extraction and two fully connected layers with data augmentation strategies to boost performance. The use of a neural network results in a high average accuracy of 92% for the classification. We compare the proposed architecture with three existing and proven CNN architectures for image classification: AlexNet, LeNet, and VGGNet. We also provide a pipeline to calculate percentage necrosis in a given whole slide image. We conclude that the use of neural networks can assure both high accuracy and efficiency in osteosarcoma classification.
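
    A minimal sketch of the described topology (three blocks of two stacked convolutional layers, each block followed by max pooling, then two fully connected layers) might look as follows in PyTorch. The channel counts, kernel sizes, and input resolution are illustrative assumptions, not the authors' exact configuration.

      import torch
      import torch.nn as nn

      # Sketch of the described topology: three blocks of two stacked conv
      # layers, each block followed by max pooling, then two fully connected
      # layers. Channel counts, kernel sizes and input size are illustrative.

      def conv_block(c_in, c_out):
          return nn.Sequential(
              nn.Conv2d(c_in, c_out, kernel_size=3, padding=1), nn.ReLU(),
              nn.Conv2d(c_out, c_out, kernel_size=3, padding=1), nn.ReLU(),
              nn.MaxPool2d(2))

      model = nn.Sequential(
          conv_block(3, 32),                       # block 1: 128 -> 64
          conv_block(32, 64),                      # block 2: 64 -> 32
          conv_block(64, 128),                     # block 3: 32 -> 16
          nn.Flatten(),
          nn.Linear(128 * 16 * 16, 256), nn.ReLU(),
          nn.Linear(256, 3))                       # viable / necrosis / nontumor

      x = torch.randn(1, 3, 128, 128)              # one H&E tile (RGB)
      print(model(x).shape)                        # torch.Size([1, 3])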

  20. Decision making in healthy participants on the Iowa Gambling Task: new insights from an operant approach.

    PubMed

    Bull, Peter N; Tippett, Lynette J; Addis, Donna Rose

    2015-01-01

    The Iowa Gambling Task (IGT) has contributed greatly to the study of affective decision making. However, researchers have observed high inter-study and inter-individual variability in IGT performance in healthy participants, and many are classified as impaired using standard criteria. Additionally, while decision-making deficits are often attributed to atypical sensitivity to reward and/or punishment, the IGT lacks an integrated sensitivity measure. Adopting an operant perspective, two experiments were conducted to explore these issues. In Experiment 1, 50 healthy participants completed a 200-trial version of the IGT which otherwise closely emulated Bechara et al.'s (1999) original computer task. Group data for Trials 1-100 closely replicated Bechara et al.'s original findings of high net scores and preferences for advantageous decks, suggesting that implementations that depart significantly from Bechara's standard IGT contribute to inter-study variability. During Trials 101-200, mean net scores improved significantly and the percentage of participants meeting the "impaired" criterion was halved. An operant-style stability criterion applied to individual data revealed this was likely related to individual differences in learning rate. Experiment 2 used a novel operant card task, the Auckland Card Task (ACT), to derive quantitative estimates of sensitivity using the generalized matching law. Relative to individuals who mastered the IGT, persistent poor performers on the IGT exhibited significantly lower sensitivity to magnitudes (but not frequencies) of rewards and punishers on the ACT. Overall, our findings demonstrate the utility of operant-style analysis of IGT data and the potential of applying operant concurrent-schedule procedures to the study of human decision making.
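
    For reference, the generalized matching law used to derive such sensitivity estimates is conventionally written in log-ratio form, where B_1 and B_2 are the responses allocated to the two alternatives, R_1 and R_2 are the obtained reinforcer magnitudes (or frequencies), s is the sensitivity parameter, and log b is a bias term; the paper's exact parameterization may differ:

      \[
        \log\frac{B_1}{B_2} \;=\; s \,\log\frac{R_1}{R_2} \;+\; \log b
      \]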

  1. Methodology For Evaluation Of Technology Impacts In Space Electric Power Systems

    NASA Technical Reports Server (NTRS)

    Holda, Julie

    2004-01-01

    The Analysis and Management branch of the Power and Propulsion Office at NASA Glenn Research Center is responsible for performing complex analyses of the space power and in-space propulsion products developed by GRC. This work quantifies the benefits of advanced technologies to support ongoing advocacy efforts. The Power and Propulsion Office is committed to understanding how advancement in space technologies could benefit future NASA missions. It supports many diverse projects and missions throughout NASA as well as industry and academia. The area of work we are concentrating on is space technology investment strategies. Our goal is to develop a Monte Carlo-based tool to investigate technology impacts in space electric power systems. The framework is being developed at this stage; it will be used to set up a computer simulation of a space electric power system (EPS). The outcome is expected to be a probabilistic assessment of critical technologies and potential development issues. We are developing methods for integrating existing spreadsheet-based tools into the simulation tool, and work is also being done on defining interface protocols to enable rapid integration of future tools. The first task was to evaluate Monte Carlo-based simulation programs for statistical modeling of the EPS model: I learned and evaluated Palisade's @Risk and Risk Optimizer software and utilized their capabilities for the EPS model, and also evaluated similar software packages (JMP, SPSS, Crystal Ball, VenSim, Analytica) available from other suppliers. The second task was to develop the framework for the tool, in which we had to define technology characteristics using weighting factors and probability distributions, define the simulation space, and add hard and soft constraints to the model. The third task is to incorporate preliminary cost factors into the model. A final task is developing a cross-platform solution for this framework.
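
    A Monte Carlo framework of the kind described can be sketched by drawing technology characteristics from probability distributions and propagating each draw through a system model. The distributions, mission requirements, and toy EPS mass model below are invented for illustration and do not reflect the actual GRC tool.

      import random

      # Monte Carlo sketch of a technology-impact assessment: sample uncertain
      # technology characteristics from distributions and propagate each draw
      # through a toy EPS mass model. All numbers here are invented.

      N = 10_000
      masses = []
      for _ in range(N):
          specific_power = random.triangular(80, 200, 120)   # array W/kg
          battery_wh_kg = random.gauss(150, 20)              # storage Wh/kg
          power_req_w, eclipse_wh = 5_000, 3_000             # mission needs
          masses.append(power_req_w / specific_power + eclipse_wh / battery_wh_kg)

      masses.sort()
      print(f"mean EPS mass: {sum(masses) / N:.1f} kg, "
            f"90th percentile: {masses[int(0.9 * N)]:.1f} kg")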

  2. How can the study of action kinematics inform our understanding of human social interaction?

    PubMed

    Krishnan-Barman, Sujatha; Forbes, Paul A G; Hamilton, Antonia F de C

    2017-10-01

    The kinematics of human actions are influenced by the social context in which they are performed. Motion-capture technology has allowed researchers to build up a detailed and complex picture of how action kinematics vary across different social contexts. Here we review three task domains (point-to-point imitation tasks, motor interference tasks and reach-to-grasp tasks) to critically evaluate how these tasks can inform our understanding of social interactions. First, we consider how actions within these task domains are performed in a non-social context, before highlighting how a plethora of social cues can perturb the baseline kinematics. We show that there is considerable overlap in the findings from these different task domains but also highlight the inconsistencies in the literature and the possible reasons for this. Specifically, we draw attention to the pitfalls of dealing with rich, kinematic data. As a way to avoid these pitfalls, we call for greater standardisation and clarity in the reporting of kinematic measures and suggest the field would benefit from a move towards more naturalistic tasks. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. Mars MetNet Mission - Martian Atmospheric Observational Post Network

    NASA Astrophysics Data System (ADS)

    Harri, A.-M.; Haukka, H.; Aleksashkin, S.; Arruego, I.; Schmidt, W.; Genzer, M.; Vazquez, L.; Siikonen, T.; Palin, M.

    2017-09-01

    A new kind of planetary exploration mission for Mars is under development in collaboration between the Finnish Meteorological Institute (FMI), Lavochkin Association (LA), Space Research Institute (IKI) and Instituto Nacional de Técnica Aeroespacial (INTA). The Mars MetNet mission is based on a new semi-hard landing vehicle called the MetNet Lander (MNL). The scientific payload of the Mars MetNet Precursor [1] mission is divided into three categories: atmospheric instruments, optical devices, and composition and structure devices. Each of the payload instruments will provide significant insights into Martian atmospheric behavior. The key technologies of the MetNet Lander have been qualified, and the electrical qualification model (EQM) of the payload bay has been built and successfully tested.

  4. Communication Strategies for Shared-Bus Embedded Multiprocessors

    DTIC Science & Technology

    2005-09-01

    target architecture [10]. We utilize the task execution model in [11], where each task v_i in the task graph G = (V, E) is associated with three possible... predictability is therefore an interesting and important direction for further study. REFERENCES: [1] T. Kogel, M. Doerper, A. Wieferink, R. Leupers, G. ... Proceedings of Real-Time Technology and Applications Symposium, 1995, pp. 164-173. [11] S. Hua, G. Qu, and S. Bhattacharyya, "Energy reduction technique...

  5. Aerial Rotation Effects on Vertical Jump Performance Among Highly Skilled Collegiate Soccer Players.

    PubMed

    Barker, Leland A; Harry, John R; Dufek, Janet S; Mercer, John A

    2017-04-01

    Barker, LA, Harry, JR, Dufek, JS, and Mercer, JA. Aerial rotation effects on vertical jump performance among highly skilled collegiate soccer players. J Strength Cond Res 31(4): 932-938, 2017. In soccer matches, jumps involving rotations occur when attempting to head the ball for a shot or pass from set pieces, such as corner kicks, goal kicks, and lob passes. However, the 3-dimensional ground reaction forces used to perform rotational jumping tasks are currently unknown. Therefore, the purpose of this study was to compare bilateral, 3-dimensional ground reaction forces of a standard countermovement jump (CMJ0) with those of a countermovement jump with a 180° rotation (CMJ180) among Division-1 soccer players. Twenty-four participants from the soccer team of the University of Nevada performed 3 trials of CMJ0 and CMJ180. Dependent variables included jump height, downward and upward phase times, vertical (Fz) peak force and net impulse relative to mass, and medial-lateral and anterior-posterior force couple values. Statistical significance was set a priori at α = 0.05. CMJ180 reduced jump height, increased the anterior-posterior force couple in the downward and upward phases, and increased upward peak Fz (p ≤ 0.05). All other variables were not significantly different between groups (p > 0.05). However, we observed that downward peak Fz trended lower in the CMJ0 condition (p = 0.059), and upward net impulse trended higher in the CMJ0 condition (p = 0.071). It was concluded that jump height was reduced during the rotational jumping task, and rotation occurred primarily via anterior-posterior ground reaction forces through the entire countermovement jump. Coaches and athletes may consider additional rotational jumping in their training programs to mediate performance decrements during rotational jump tasks.
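
    Net vertical impulse relative to mass, one of the dependent variables above, is conventionally obtained by integrating the vertical force above body weight over the movement and dividing by mass. The sampling rate and force trace in this sketch are synthetic stand-ins, not study data.

      import numpy as np

      # Net vertical impulse relative to mass: integrate (Fz - body weight)
      # over the movement and divide by mass. The force trace is synthetic.

      fs = 1000.0                                   # sampling rate, Hz
      mass, g = 75.0, 9.81                          # athlete mass, gravity
      t = np.arange(0.0, 1.0, 1 / fs)
      fz = mass * g + 800 * np.sin(np.pi * t) ** 2  # stand-in Fz trace, N

      net_impulse = np.sum(fz - mass * g) / fs      # rectangle-rule integral, N*s
      v_takeoff = net_impulse / mass                # impulse-momentum theorem
      print(f"{net_impulse:.1f} N*s -> takeoff velocity {v_takeoff:.2f} m/s")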

  6. Simplified Distributed Computing

    NASA Astrophysics Data System (ADS)

    Li, G. G.

    2006-05-01

    Distributed computing ranges from high-performance parallel computing and grid computing to environments where idle CPU cycles and storage space of numerous networked systems are harnessed to work together through the Internet. In this work we focus on building an easy and affordable solution for computationally intensive problems in scientific applications, based on existing technology and hardware resources. The system consists of a series of controllers. When a job request is detected by a monitor or initialized by an end user, the job manager launches the specific job handler for that job. The job handler pre-processes the job, partitions it into relatively independent tasks, and distributes the tasks into the processing queue. The task handler picks up the related tasks, processes them, and puts the results back into the processing queue. The job handler also monitors and examines the tasks and the results, and assembles the task results into the overall solution for the job request when all tasks are finished for each job. A resource manager configures and monitors all participating nodes. A distributed agent is deployed on all participating nodes to manage the software download and report status. The processing queue is the key to the success of this distributed system. We use BEA's Weblogic JMS queue in our implementation. It guarantees message delivery and has message priority and retry features so that tasks never get lost. The entire system is built on J2EE technology and can be deployed on heterogeneous platforms. It can handle algorithms and applications developed in any language on any platform. J2EE adapters are provided to connect existing applications to the system so that applications and algorithms running on Unix, Linux and Windows can all work together. This system is easy and fast to develop based on well-adopted industry technology. It is highly scalable and heterogeneous, and it is an open system: any number and type of machines can join to provide computational power. This asynchronous message-based system can achieve response times on the order of seconds. For efficiency, communications between distributed tasks are often done at the start and end of the tasks, but intermediate task status can also be provided.
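
    The job-handler/task-handler pattern described above maps naturally onto a shared work queue. The following single-process Python sketch stands in for the JMS-based implementation, modeling the queue semantics only (no guaranteed delivery, priorities, or cross-machine distribution).

      import queue
      import threading

      # Minimal single-process sketch of the job/task pattern: the job handler
      # partitions a job into independent tasks, task handlers consume them
      # from a shared queue, and results are assembled at the end.

      tasks, results = queue.Queue(), queue.Queue()

      def task_handler():
          while True:
              item = tasks.get()
              if item is None:              # sentinel: shut the worker down
                  break
              results.put(item ** 2)        # stand-in for real processing
              tasks.task_done()

      def run_job(data, n_workers=4):
          workers = [threading.Thread(target=task_handler) for _ in range(n_workers)]
          for w in workers:
              w.start()
          for item in data:                 # partition the job into tasks
              tasks.put(item)
          tasks.join()                      # wait for all tasks to finish
          for _ in workers:
              tasks.put(None)
          for w in workers:
              w.join()
          return sorted(results.get() for _ in range(len(data)))  # assemble

      print(run_job(range(10)))             # [0, 1, 4, 9, 16, ...]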

  7. Sleep Consolidates Motor Learning of Complex Movement Sequences in Mice.

    PubMed

    Nagai, Hirotaka; de Vivo, Luisa; Bellesi, Michele; Ghilardi, Maria Felice; Tononi, Giulio; Cirelli, Chiara

    2017-02-01

    Sleep-dependent consolidation of motor learning has been extensively studied in humans, but it remains unclear why some, but not all, learned skills benefit from sleep. Here, we compared 2 different motor tasks, both requiring the mice to run on an accelerating device. In the rotarod task, mice learn to maintain balance while running on a small rod, while in the complex wheel task, mice run on an accelerating wheel with an irregular rung pattern. In the rotarod task, performance improved to the same extent after sleep or after sleep deprivation (SD). Overall, using 7 different experimental protocols (41 sleep deprived mice, 26 sleeping controls), we found large interindividual differences in the learning and consolidation of the rotarod task, but sleep before/after training did not account for this variability. By contrast, using the complex wheel, we found that sleep after training, relative to SD, led to better performance from the beginning of the retest session, and longer sleep was correlated with greater subsequent performance. As in humans, the effects of sleep showed large interindividual variability and varied between fast and slow learners, with sleep favoring the preservation of learned skills in fast learners and leading to a net offline gain in the performance in slow learners. Using Fos expression as a proxy for neuronal activation, we also found that complex wheel training engaged motor cortex and hippocampus more than the rotarod training. Sleep specifically consolidates a motor skill that requires complex movement sequences and strongly engages both motor cortex and hippocampus. © Sleep Research Society 2016. Published by Oxford University Press on behalf of the Sleep Research Society. All rights reserved. For permissions, please e-mail journals.permissions@oup.com.

  8. Sensor Acquisition for Water Utilities: A Survey and Technology List

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alai, M; Glascoe, L; Love, A

    2005-03-07

    The early detection of deliberate biological and chemical contamination of water distribution systems is a necessary capability for securing the nation's water supply. Current and emerging early-detection technology capabilities and shortcomings need to be identified and assessed to provide government agencies and water utilities with an improved methodology for assessing the value of installing these technologies. The Department of Homeland Security (DHS) has tasked a multi-laboratory team to evaluate current and future needs to protect the nation's water distribution infrastructure by supporting an objective evaluation of current and new technologies. The primary deliverables from this Operational Technology Demonstration (OTD) are the following: (1) establishment of an advisory board for review and approval of testing protocols, technology acquisition processes, and recommendations for technology test and evaluation in laboratory and field settings; (2) development of a technology acquisition process; (3) creation of laboratory and field testing and evaluation capability; and (4) testing of candidate technologies for insertion into a water early warning system. The initial phase of this study involves the development of two separate but complementary strategies to be reviewed by the advisory board: (1) a technology acquisition strategy, and (2) a technology evaluation strategy. Lawrence Livermore National Laboratory and Sandia National Laboratories are tasked with the first strategy, while Los Alamos, Pacific Northwest, and Oak Ridge National Laboratories are tasked with the second. The first goal of the acquisition strategy is the development of a technology survey process that includes a review of previous sensor surveys and current test programs, followed by the development of a method to solicit and select existing and emerging sensor technologies for evaluation and testing. In this paper we discuss a survey of previous efforts by governmental agencies and private companies, with the aim of facilitating a water sensor technology acquisition procedure. We provide a survey of previous sensor studies with regard to the use of Early Warning Systems (EWS), including earlier surveys, testing programs, and response studies. In the project we extend this earlier work by developing a list of important sensor specifications, which are then used to help assemble sensor selection criteria. A list of sensor technologies with their specifications is appended to this document. This list will support the second goal of the project, which is a recommendation of candidate technologies for laboratory and field testing.

  9. Training for Skill in Fault Diagnosis

    ERIC Educational Resources Information Center

    Turner, J. D.

    1974-01-01

    The Knitting, Lace and Net Industry Training Board has developed a training innovation called fault diagnosis training. The entire training process concentrates on teaching based on the experiences of troubleshooters or any other employees whose main tasks involve fault diagnosis and rectification. (Author/DS)

  10. Addressing the NETS for Students through Constructivist Technology Use in K-12 Classrooms

    ERIC Educational Resources Information Center

    Niederhauser, Dale S.; Lindstrom, Denise L.

    2006-01-01

    The National Educational Technology Standards for Students promote constructivist technology use for K-12 students in U.S. schools. In this study, researchers reported on 716 cases in which teachers described technology-based activities they conducted with their students. Narrative analysis was used to examine case transcripts relative to the…

  11. Assessing Pre-Service Teacher Attitudes and Skills with the Technology Integration Confidence Scale

    ERIC Educational Resources Information Center

    Browne, Jeremy

    2009-01-01

    As technology integration continues to gain importance, preservice teachers must develop higher levels of confidence and proficiency in using technology in their classrooms (Kay, 2006). The acceptance of the National Educational Technology Standards for Teachers (NETS-T) by National Council for Accreditation of Teacher Education (NCATE) has…

  12. Segmenting the Net-Generation: Embracing the Next Level of Technology

    ERIC Educational Resources Information Center

    Smith, Russell K.

    2014-01-01

    A segmentation study is used to partition college students into groups that are more or less likely to adopt tablet technology as a learning tool. Because the college population chosen for study presently relies upon laptop computers as their primary learning device, tablet technology represents a "next step" in technology. Student…

  13. Report on New Methods for Representing and Interacting with Qualitative Geographic Information, Stage 2: Task Group 3: Social-focused Use Case

    DTIC Science & Technology

    2014-06-30

    lesson learned through exploring current data with the ForceNet tool is that the tool (as implemented thus far) is able to give analysts a big ... The report also describes work on Twitter data and on the development and implementation of tools to support this task; these include a Group Builder, a Force-directed Graph tool, and a ...

  14. UNH Data Cooperative: A Cyber Infrastructure for Earth System Studies

    NASA Astrophysics Data System (ADS)

    Braswell, B. H.; Fekete, B. M.; Prusevich, A.; Gliden, S.; Magill, A.; Vorosmarty, C. J.

    2007-12-01

    Earth system scientists and managers have a continuously growing demand for a wide array of earth observations derived from various data sources, including (a) modern satellite retrievals, (b) "in-situ" records, (c) various simulation outputs, and (d) assimilated data products combining model results with observational records. The sheer quantity of data and formatting inconsistencies make it difficult for users to take full advantage of this important information resource, so the system could benefit from a thorough retooling of current data processing procedures and infrastructure. Emerging technologies like OPeNDAP and OGC map services, open standard data formats (NetCDF, HDF), and data cataloging systems (NASA ECHO, Global Change Master Directory, etc.) are providing the basis for a new approach to data management and processing, where web services are increasingly designed to serve computer-to-computer communication without human interaction, and complex analyses can be carried out over distributed computer resources interconnected via cyber infrastructure. The UNH Earth System Data Collaborative is designed to utilize these emerging web technologies to offer new means of access to earth system data. While the UNH Data Collaborative serves a wide array of data, ranging from weather station data (Climate Portal) to ocean buoy records and ship tracks (Portsmouth Harbor Initiative) to land cover characteristics, the underlying data architecture shares common components for data mining and data dissemination via web services. Perhaps the most unique element of the UNH Data Cooperative's IT infrastructure is its prototype modeling environment for regional ecosystem surveillance over the Northeast corridor, which allows the integration of complex earth system model components with the Cooperative's data services. While the complexity of the IT infrastructure needed to perform complex computations is continuously increasing, scientists are often forced to spend a considerable amount of time solving basic data management and preprocessing tasks and dealing with low-level computational design problems such as parallelization of model codes. Our modeling infrastructure is designed to take care of the bulk of the common tasks found in complex earth system models, such as I/O handling, computational domain and time management, and parallel execution of modeling tasks. The modeling infrastructure allows scientists to focus on the numerical implementation of the physical processes in single computational objects (typically grid cells), while the framework takes care of preprocessing the input data, establishing the data exchange between computational objects, and executing the science code. In our presentation, we discuss the key concepts of our modeling infrastructure, demonstrate the integration of our modeling framework with the data services offered by the UNH Earth System Data Collaborative via web interfaces, and lay out the road map to turn our prototype modeling environment into a truly community framework for a wide range of earth system scientists and environmental managers.
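    Data services of this kind are typically consumed through OPeNDAP-aware NetCDF clients, which is worth a concrete sketch. The snippet below uses the netCDF4 Python library, which accepts OPeNDAP URLs directly; the endpoint URL and variable name are hypothetical placeholders, not actual Cooperative services.

    ```python
    from netCDF4 import Dataset  # netCDF4 can open OPeNDAP URLs directly

    # Hypothetical OPeNDAP endpoint; replace with a real catalog entry.
    url = "http://example.unh.edu/opendap/climate/station_temps.nc"
    ds = Dataset(url)

    # Subsetting happens server-side: only the requested slab is transferred.
    temps = ds.variables["air_temperature"][0:10, :]  # hypothetical variable name
    print(list(ds.variables.keys()))
    print(temps.shape)
    ds.close()
    ```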

  15. Using Trialogues to Measure English Language Skills

    ERIC Educational Resources Information Center

    So, Youngsoon; Zapata-Rivera, Diego; Cho, Yeonsuk; Luce, Christine; Battistini, Laura

    2015-01-01

    We explored the use of technology-assisted, trialogue-based tasks to measure the English language proficiency of students learning English as a second or foreign language. A presumed benefit of the system for language assessment is its suitability for use in scenario-based tasks that integrate multiple language skills. This integration allows test…

  16. Task-Based Language Teaching Online: A Guide for Teachers

    ERIC Educational Resources Information Center

    Baralt, Melissa; Gómez, José Morcillo

    2017-01-01

    Technology-mediated task-based language teaching is the merger between technology and task-based language teaching (TBLT; González-Lloret & Ortega, 2014) and is arguably now an imperative for language education. As language classrooms are being redefined, training for how to set learners up to successfully do tasks online must be part of…

  17. Mind the Gap: Task Design and Technology in Novice Language Teachers' Practice

    ERIC Educational Resources Information Center

    Smits, Tom F. H.; Oberhofer, Margret; Colpaert, Jozef

    2016-01-01

    This paper focuses on the possibilities/challenges for English as a Foreign Language (EFL) teachers designing tasks grounded in Task-Based Language Teaching (TBLT) and taking advantage of the affordances of technology--Interactive WhiteBoards (IWBs). Teachers have been shown to confuse tasks with exercises or activities. The interactive…

  18. The value from investments in health information technology at the U.S. Department of Veterans Affairs.

    PubMed

    Byrne, Colene M; Mercincavage, Lauren M; Pan, Eric C; Vincent, Adam G; Johnston, Douglas S; Middleton, Blackford

    2010-04-01

    We compare health information technology (IT) in the Department of Veterans Affairs (VA) to norms in the private sector, and we estimate the costs and benefits of selected VA health IT systems. The VA spent proportionately more on IT than the private health care sector spent, but it achieved higher levels of IT adoption and quality of care. The potential value of the VA's health IT investments is estimated at $3.09 billion in cumulative benefits net of investment costs. This study serves as a framework to inform efforts to measure and calculate the benefits of federal health IT stimulus programs.

  19. Innovation and the growth of human population.

    PubMed

    Weinberger, V P; Quiñinao, C; Marquet, P A

    2017-12-05

    Biodiversity is sustained by and is essential to the services that ecosystems provide. Different species would use these services in different ways, or adaptive strategies, which are sustained in time by continuous innovations. Using this framework, we postulate a model for a biological species (Homo sapiens) in a finite world where innovations, aimed at increasing the flux of ecosystem services (a measure of habitat quality), increase with population size, and have positive effects on the generation of new innovations (positive feedback) as well as costs in terms of negatively affecting the provision of ecosystem services. We applied this model to human populations, where technological innovations are driven by cumulative cultural evolution. Our model shows that depending on the net impact of a technology on the provision of ecosystem services (θ), and the strength of technological feedback (ξ), different regimes can result. Among them, the human population can fill the entire planet while maximizing their well-being, but not exhaust ecosystem services. However, this outcome requires positive or green technologies that increase the provision of ecosystem services with few negative externalities or environmental costs, and that have a strong positive feedback in generating new technologies of the same kind. If the feedback is small, then the technological stock can collapse together with the human population. Scenarios where technological innovations generate net negative impacts may be associated with a limited technological stock as well as a limited human population at equilibrium and the potential for collapse. The only way to fill the planet with humans under this scenario of negative technologies is by reducing the technological stock to a minimum. Otherwise, the only feasible equilibrium is associated with population collapse. Our model points out that technological innovations per se may not help humans to grow and dominate the planet. Instead, different possibilities unfold for our future depending on their impact on the environment and on further innovation. This article is part of the themed issue 'Process and pattern in innovations from cells to societies'. © 2017 The Author(s).
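    As a purely illustrative caricature (these are not the authors' equations), the toy system below shows how a net technology impact parameter theta and a feedback strength xi can steer a coupled population-technology system toward growth, a limited equilibrium, or technological collapse; all functional forms and constants are invented.

    ```python
    from scipy.integrate import solve_ivp

    def model(t, y, theta, xi, r=0.05, K0=1.0, delta=0.02, Tmax=5.0):
        """Toy coupled dynamics for population N and technology stock T.

        Habitat quality K0*(1 + theta*T) rises with T for 'green' technologies
        (theta > 0) and falls for damaging ones (theta < 0); technology grows
        with population and with itself (feedback xi), saturating at Tmax.
        """
        N, T = y
        K = max(K0 * (1.0 + theta * T), 1e-6)           # ecosystem-service capacity
        dN = r * N * (1.0 - N / K)                      # logistic population growth
        dT = xi * N * T * (1.0 - T / Tmax) - delta * T  # innovation with decay
        return [dN, dT]

    for theta, xi in [(0.5, 0.08), (-0.5, 0.08), (0.5, 0.005)]:
        sol = solve_ivp(model, (0.0, 800.0), [0.1, 0.1], args=(theta, xi))
        print(f"theta={theta:+.1f}, xi={xi:.3f} -> "
              f"N={sol.y[0, -1]:.3f}, T={sol.y[1, -1]:.3f}")
    ```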

  20. Framewise phoneme classification with bidirectional LSTM and other neural network architectures.

    PubMed

    Graves, Alex; Schmidhuber, Jürgen

    2005-01-01

    In this paper, we present bidirectional Long Short-Term Memory (LSTM) networks, and a modified, full-gradient version of the LSTM learning algorithm. We evaluate Bidirectional LSTM (BLSTM) and several other network architectures on the benchmark task of framewise phoneme classification, using the TIMIT database. Our main findings are that bidirectional networks outperform unidirectional ones, and that LSTM is much faster and also more accurate than both standard Recurrent Neural Nets (RNNs) and time-windowed Multilayer Perceptrons (MLPs). Our results support the view that contextual information is crucial to speech processing, and suggest that BLSTM is an effective architecture with which to exploit it.
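    As a concrete sketch of the architecture (in a modern framework, not the paper's original code), the snippet below builds a framewise classifier of the same general shape in PyTorch: a bidirectional LSTM whose concatenated forward and backward states are mapped to per-frame phoneme scores. The feature and hidden sizes and the dummy data are placeholders; 39 classes follow the commonly used folded TIMIT set.

    ```python
    import torch
    import torch.nn as nn

    class FramewiseBLSTM(nn.Module):
        """Bidirectional LSTM emitting one phoneme prediction per frame."""
        def __init__(self, n_features=26, n_hidden=93, n_classes=39):
            super().__init__()
            self.blstm = nn.LSTM(n_features, n_hidden,
                                 batch_first=True, bidirectional=True)
            self.out = nn.Linear(2 * n_hidden, n_classes)  # fwd + bwd states

        def forward(self, x):          # x: (batch, frames, features)
            h, _ = self.blstm(x)       # h: (batch, frames, 2 * hidden)
            return self.out(h)         # per-frame class logits

    model = FramewiseBLSTM()
    frames = torch.randn(8, 100, 26)          # dummy batch of utterances
    labels = torch.randint(0, 39, (8, 100))   # dummy framewise targets
    logits = model(frames)
    loss = nn.CrossEntropyLoss()(logits.reshape(-1, 39), labels.reshape(-1))
    print(loss.item())
    ```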

  1. Modeling and Simulation of Metallurgical Process Based on Hybrid Petri Net

    NASA Astrophysics Data System (ADS)

    Ren, Yujuan; Bao, Hong

    2016-11-01

    In order to achieve the goals of energy saving and emission reduction in iron and steel enterprises, an increasing number of modeling and simulation technologies are used to study and analyze the metallurgical production process. In this paper, the basic principles of hybrid Petri nets are used to model and analyze the metallurgical process. First, the definition of the Hybrid Petri Net System of Metallurgical Process (MPHPNS) and its modeling theory are proposed. Second, a model of MPHPNS based on material flow is constructed. The dynamic flow of materials and the real-time change of each technological state in the metallurgical process are simulated vividly using this model. The simulation can implement interaction between the continuous-event dynamic system and the discrete-event dynamic system at the same level, and plays a positive role in production decision-making.
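    The discrete side of such a model is simple enough to sketch (the continuous places of a hybrid Petri net would additionally need differential equations, omitted here). The toy simulator below fires enabled transitions for an invented two-transition metallurgical route; all place and transition names are made up for illustration.

    ```python
    # Minimal discrete Petri net simulator (toy stand-in for an MPHPNS model).
    places = {"ladle_full": 2, "converter_free": 1,
              "converter_busy": 0, "steel_cast": 0}

    # transition name -> (tokens consumed, tokens produced)
    transitions = {
        "charge_converter": ({"ladle_full": 1, "converter_free": 1},
                             {"converter_busy": 1}),
        "tap_and_cast":     ({"converter_busy": 1},
                             {"converter_free": 1, "steel_cast": 1}),
    }

    def enabled(pre):
        """A transition is enabled when every input place holds enough tokens."""
        return all(places[p] >= n for p, n in pre.items())

    def fire(name):
        """Move tokens: consume from input places, produce into output places."""
        pre, post = transitions[name]
        for p, n in pre.items():
            places[p] -= n
        for p, n in post.items():
            places[p] += n

    # Fire any enabled transition until the net deadlocks.
    while True:
        ready = [t for t, (pre, _) in transitions.items() if enabled(pre)]
        if not ready:
            break
        fire(ready[0])
        print(ready[0], "->", places)
    ```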

  2. Technological Change and HRD. Symposium.

    ERIC Educational Resources Information Center

    2002

    This document contains three papers from a symposium on technological change and human resource development. "New Technologies, Cognitive Demands, and the Implications for Learning Theory" (Richard J. Torraco) identifies four specific characteristics of the tasks involved in using new technologies (contingent versus deterministic tasks,…

  3. Recent Advances in Near-Net-Shape Fabrication of Al-Li Alloy 2195 for Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Wagner, John; Domack, Marcia; Hoffman, Eric

    2007-01-01

    Recent applications in launch vehicles use 2195 processed to Super Lightweight Tank specifications, and potential benefits exist in tailoring heat treatment and other processing parameters to the application. The goals of this work are to assess those potential benefits and advocate the application of Al-Li near-net-shape technologies for other launch vehicle structural components; to work with manufacturing and material producers to optimize Al-Li ingot shape and size for enhanced near-net-shape processing; and to examine time-dependent properties of 2195 critical for reusable applications.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    The technology necessary to build net zero energy buildings (NZEBs) is ready and available today; however, building to net zero energy performance levels can be challenging. Energy efficiency measures, onsite energy generation resources, load matching and grid interaction, climatic factors, and local policies vary from location to location and require unique methods of constructing NZEBs. It is recommended that Components start looking into how to construct and operate NZEBs now, as there is a learning curve to net zero construction and FY 2020 is just around the corner.

  5. Exploring the Connection between Age and Strategies for Learning New Technology Related Tasks

    ERIC Educational Resources Information Center

    Meiselwitz, Gabriele; Chakraborty, Suranjan

    2011-01-01

    This paper discusses the connection between age and strategies for learning new technology related tasks. Many users have to learn about new devices and applications on a frequent basis and use a variety of strategies to accomplish this learning process. Approaches to learning new technology related tasks vary and can contribute to a user's…

  6. An Examination of the Role of Technological Tools in Relation to the Cognitive Demand of Mathematical Tasks in Secondary Classrooms

    ERIC Educational Resources Information Center

    Sherman, Milan

    2011-01-01

    This study investigates the role of digital cognitive technologies in supporting students' mathematical thinking while engaging with instructional tasks. Specifically, the study sought to better understand how the use of technology is related to the cognitive demand of tasks. Data were collected in four secondary mathematics classrooms via…

  7. Improving diabetic foot care in a nurse-managed safety-net clinic.

    PubMed

    Peterson, Joann M; Virden, Mary D

    2013-05-01

    This article is a description of the development and implementation of a Comprehensive Diabetic Foot Care Program and assessment tool in an academically affiliated nurse-managed, multidisciplinary, safety-net clinic. The assessment tool parallels parameters identified in the Task Force Foot Care Interest Group of the American Diabetes Association's report published in 2008, "Comprehensive Foot Examination and Risk Assessment." Data sources included a review of the literature, Silver City Health Center's (SCHC) 2009 Annual Report, and a retrospective chart review. Since the full implementation of SCHC's Comprehensive Diabetic Foot Care Program, there have been no hospitalizations of clinic patients for foot-related complications. The development of the Comprehensive Diabetic Foot Assessment tool and the implementation of the Comprehensive Diabetic Foot Care Program have resulted in positive outcomes for the patients in a nurse-managed safety-net clinic. This article demonstrates that quality healthcare services can successfully be developed and implemented in a safety-net clinic setting. ©2012 The Author(s) Journal compilation ©2012 American Association of Nurse Practitioners.

  8. Using fuzzy logic to integrate neural networks and knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Yen, John

    1991-01-01

    Outlined here is a novel hybrid architecture that uses fuzzy logic to integrate neural networks and knowledge-based systems. The author's approach offers important synergistic benefits to neural nets, approximate reasoning, and symbolic processing. Fuzzy inference rules extend symbolic systems with approximate reasoning capabilities, which are used for integrating and interpreting the outputs of neural networks. The symbolic system captures meta-level information about neural networks and defines its interaction with neural networks through a set of control tasks. Fuzzy action rules provide a robust mechanism for recognizing the situations in which neural networks require certain control actions. The neural nets, on the other hand, offer flexible classification and adaptive learning capabilities, which are crucial for dynamic and noisy environments. By combining neural nets and symbolic systems at their system levels through the use of fuzzy logic, the author's approach alleviates current difficulties in reconciling differences between low-level data processing mechanisms of neural nets and artificial intelligence systems.
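    A minimal sketch of the integration idea is shown below, with every membership function, threshold, and rule invented for illustration: fuzzy rules interpret a neural classifier's output scores and trigger a control action (deferring to the symbolic system) when confidence is low or the decision is ambiguous.

    ```python
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function on [a, c], peaking at b."""
        return max(0.0, min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)))

    def interpret(nn_scores):
        """Fuzzy interpretation of a neural net's class scores (illustrative)."""
        top = float(np.max(nn_scores))
        margin = top - float(np.sort(nn_scores)[-2])
        low_conf  = tri(top,    0.0, 0.0, 0.6)   # "output confidence is low"
        ambiguous = tri(margin, 0.0, 0.0, 0.2)   # "top two classes are close"
        # Fuzzy action rule: defer to the knowledge-based system if confidence
        # is low OR the decision is ambiguous (max-min inference).
        defer = max(low_conf, ambiguous)
        return "defer_to_symbolic" if defer > 0.5 else "accept_nn_output"

    print(interpret(np.array([0.05, 0.90, 0.05])))  # confident -> accept
    print(interpret(np.array([0.45, 0.40, 0.15])))  # ambiguous -> defer
    ```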

  9. Public pensions, family allowances and endogenous demographic change.

    PubMed

    Peters, W

    1995-05-01

    "A tax-transfer system deals with redistribution a PAYGmong generations and corrective taxation a PAYGt the same time. Since such a policy is a government's task, we take a normative approach and pose the question: Which tax-transfer system should a government apply to maximize social welfare? The framework we consider allows for endogenous demographic aspects...: first, fertility has a great impact on a PAYG [pay-as-you-go] financed pension insurance; and second, through education human capital is accumulated.... We analyzed the optimal extent of a public pension scheme in the presence of external effects of fertility and education on the net domestic product." Pension schemes in Germany and the United States are compared. excerpt

  10. Negotiating technology-mediated interaction in health care

    PubMed Central

    Håland, Erna; Melby, Line

    2015-01-01

    The health-care sector is increasingly faced with different forms of technology that are introduced to mediate interaction, thus fully or partially replacing face-to-face meetings. In this article we address health personnel's experiences with three such technologies, namely: electronic messages, video conferences and net-based discussion forums. Drawing on Goffman's perspectives on interaction and frame, we argue that when technologies are introduced to mediate interaction, new frames for understanding and making sense of situations are created. These new frames imply new ways of organising and making sense of experience, and require work by the participants in the interaction. In this article, based on interviews from two Norwegian research projects, we investigate health personnel's work to make sense of technology-mediated interaction in health care. We discuss this work represented in four categories: how to perform in a competent manner, how to negotiate immediacy, how to enable social cues and how to establish and maintain commitment. Concluding, we argue that the introduction of mediating technologies redefines what is considered up-to-date, 'good' health-care work and challenges health personnel to change (some of) their work practices and moves, as a result, far beyond simple interventions aimed at making work more efficient. PMID:25685073

  11. Mentoring, Women in Engineering and Related Sciences, and MentorNet

    NASA Astrophysics Data System (ADS)

    Dockter, J.; Muller, C.

    2003-12-01

    Mentoring is a frequently employed strategy for retention of women in engineering and science. The power of mentoring is sometimes poorly understood, and mentoring is not always effectively practiced, however. At its strongest, mentoring is understood as a powerful learning process, which assures the intergenerational transfer of knowledge and "know-how" on an ongoing basis throughout one's life. Mentoring helps make explicit the tacit knowledge of a discipline and its professional culture, which is especially important for underrepresented groups. MentorNet (www.MentorNet.net), the E-Mentoring Network for Women in Engineering and Science, is a nonprofit organization focused on furthering women's progress in scientific and technical fields through the use of a dynamic, technology-supported mentoring program. Since 1998, nearly 10,000 undergraduate and graduate women studying engineering and related sciences at more than 100 colleges and universities across the U.S., and in several other nations, have been matched in structured, one-on-one, email-based mentoring relationships with male and female scientific and technical professionals working in industry and government. This poster will describe the MentorNet program, and provide findings of annual program evaluations related to outcomes for participants with particular focus on women in the planetary and earth sciences. We also address the development of the partnership of approximately 100 organizations currently involved in MentorNet and the value each gains from its affiliation. MentorNet is an ongoing effort which supports the interests of all organizations and individuals working to advance women in engineering and related sciences.

  12. Altered steering strategies for goal-directed locomotion in stroke

    PubMed Central

    2013-01-01

    Background: Individuals who have sustained a stroke can manifest altered locomotor steering behaviors when exposed to optic flows expanding from different locations. Whether these alterations persist in the presence of a visible goal and whether they can be explained by the presence of a perceptuo-motor disorder remain unknown. The purpose of this study was to compare stroke participants and healthy participants on their ability to control heading while exposed to changing optic flows and target locations. Methods: Ten participants with stroke (55.6 ± 9.3 yrs) and ten healthy controls (57.0 ± 11.5 yrs) participated in a mouse-driven steering task (perceptuo-motor task) while seated and in a walking steering task. In the seated steering task, participants were instructed to head or 'walk' toward a target in the virtual environment by using a mouse while wearing a helmet-mounted display (HMD). In the walking task, participants performed a similar steering task in the same virtual environment while walking overground at their comfortable speed. For both experiments, the target and/or the focus of expansion (FOE) of the optic flow shifted to the side (±20°) or remained centered. The main outcome measure was net heading errors (NHE). Secondary outcomes included mediolateral displacement, horizontal head orientation, and onsets of heading and head reorientation. Results: In the walking steering task, the presence of FOE shifts modulated the extent and timing of mediolateral displacement and head rotation changes, as well as NHE magnitudes. Participants overshot and undershot their net heading, respectively, in response to ipsilateral and contralateral FOE and target shifts. Stroke participants made larger NHEs, especially when the FOE was shifted towards the non-paretic side. In the seated steering task, similar NHEs were observed between stroke and healthy participants. Conclusions: The findings highlight the fine coordination between rotational and translational steering mechanisms in the presence of targets and FOE shifts. The altered performance of stroke participants in the walking but not the seated steering task suggests that an altered perceptuo-motor processing of optic flow is not a main contributing factor and that other stroke-related sensorimotor deficits are involved. PMID:23875969

  13. BioC implementations in Go, Perl, Python and Ruby.

    PubMed

    Liu, Wanli; Islamaj Doğan, Rezarta; Kwon, Dongseop; Marques, Hernani; Rinaldi, Fabio; Wilbur, W John; Comeau, Donald C

    2014-01-01

    As part of a communitywide effort for evaluating text mining and information extraction systems applied to the biomedical domain, BioC is focused on the goal of interoperability, currently a major barrier to wide-scale adoption of text mining tools. BioC is a simple XML format, specified by DTD, for exchanging data for biomedical natural language processing. With initial implementations in C++ and Java, BioC provides libraries of code for reading and writing BioC text documents and annotations. We extend BioC to Perl, Python, Go and Ruby. We used SWIG to extend the C++ implementation for Perl and one Python implementation. A second Python implementation and the Ruby implementation use native data structures and libraries. BioC is also implemented in the Google language Go. BioC modules are functional in all of these languages, which can facilitate text mining tasks. BioC implementations are freely available through the BioC site: http://bioc.sourceforge.net. Database URL: http://bioc.sourceforge.net/ Published by Oxford University Press 2014. This work is written by US Government employees and is in the public domain in the US.
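    Because BioC is plain XML, a reader can be sketched without any of the official libraries. The snippet below uses Python's standard xml.etree.ElementTree and assumes the standard BioC element names (document, passage, annotation, infon); the input file name is a placeholder.

    ```python
    import xml.etree.ElementTree as ET

    # Placeholder path to a BioC collection file.
    tree = ET.parse("collection.xml")

    for doc in tree.getroot().iter("document"):
        doc_id = doc.findtext("id")
        for passage in doc.iter("passage"):
            offset = passage.findtext("offset")
            for ann in passage.iter("annotation"):
                # Each annotation carries free-form infons plus its text span.
                infons = {i.get("key"): i.text for i in ann.findall("infon")}
                print(doc_id, offset, infons.get("type"), ann.findtext("text"))
    ```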

  14. Tooth labeling in cone-beam CT using deep convolutional neural network for forensic identification

    NASA Astrophysics Data System (ADS)

    Miki, Yuma; Muramatsu, Chisako; Hayashi, Tatsuro; Zhou, Xiangrong; Hara, Takeshi; Katsumata, Akitoshi; Fujita, Hiroshi

    2017-03-01

    In large disasters, dental records play an important role in forensic identification. However, filing dental charts for corpses is not an easy task for general dentists, and it is laborious and time-consuming work in cases of large-scale disasters. We have been investigating a tooth-labeling method for dental cone-beam CT images for the purpose of automatic filing of dental charts. In our method, individual teeth in CT images are detected and classified into seven tooth types using deep convolutional neural networks. We employed a fully convolutional network using the AlexNet architecture for detecting each tooth, and applied our previous method, using a regular AlexNet, for classifying the detected teeth into the 7 tooth types. From 52 CT volumes obtained by two imaging systems, five images each were randomly selected as test data, and the remaining 42 cases were used as training data. The results showed a tooth detection accuracy of 77.4% with an average of 5.8 false detections per image. These results indicate the potential utility of the proposed method for automatic recording of dental information.
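    As a rough sketch of the classification stage only (not the fully convolutional detection step), one might instantiate an AlexNet with a 7-way output head, as below in PyTorch/torchvision; the input patch is a dummy tensor, and no claim is made that this matches the authors' training setup.

    ```python
    import torch
    import torchvision.models as models

    # AlexNet with a 7-class head, one class per tooth type (as in the abstract).
    net = models.alexnet(num_classes=7)

    patch = torch.randn(1, 3, 224, 224)   # one detected-tooth patch (dummy data)
    logits = net(patch)
    print(logits.argmax(dim=1))           # predicted tooth-type index
    ```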

  15. The Perceptions of Public School Administrators from Southeast Texas on the Effects of the National Educational Technology Standards for Teachers on Teacher Evaluations

    ERIC Educational Resources Information Center

    Wharton, Kevin F.

    2014-01-01

    With the infusion of technology into the learning environment, the teacher evaluation process has been affected (Whale, 2006). Consequently, the International Society of Technology in Education developed technology standards for students, teachers, and administrators known as the National Educational Technology Standards, or NETS (Morphew, 2012).…

  16. Technology consumption and cognitive control: Contrasting action video game experience with media multitasking.

    PubMed

    Cardoso-Leite, Pedro; Kludt, Rachel; Vignola, Gianluca; Ma, Wei Ji; Green, C Shawn; Bavelier, Daphne

    2016-01-01

    Technology has the potential to impact cognition in many ways. Here we contrast two forms of technology usage: (1) media multitasking (i.e., the simultaneous consumption of multiple streams of media, such a texting while watching TV) and (2) playing action video games (a particular subtype of video games). Previous work has outlined an association between high levels of media multitasking and specific deficits in handling distracting information, whereas playing action video games has been associated with enhanced attentional control. Because these two factors are linked with reasonably opposing effects, failing to take them jointly into account may result in inappropriate conclusions as to the impacts of technology use on attention. Across four tasks (AX-continuous performance, N-back, task-switching, and filter tasks), testing different aspects of attention and cognition, we showed that heavy media multitaskers perform worse than light media multitaskers. Contrary to previous reports, though, the performance deficit was not specifically tied to distractors, but was instead more global in nature. Interestingly, participants with intermediate levels of media multitasking sometimes performed better than both light and heavy media multitaskers, suggesting that the effects of increasing media multitasking are not monotonic. Action video game players, as expected, outperformed non-video-game players on all tasks. However, surprisingly, this was true only for participants with intermediate levels of media multitasking, suggesting that playing action video games does not protect against the deleterious effect of heavy media multitasking. Taken together, these findings show that media consumption can have complex and counterintuitive effects on attentional control.

  17. Ohio SchoolNet Initiatives: The Role of the Ohio Education Computer Network.

    ERIC Educational Resources Information Center

    Ohio State Legislative Office of Education Oversight, Columbus.

    Ohio's Legislative Office of Education Oversight (LOEO) evaluates education-related activities funded wholly or in part by that state. SchoolNet initiatives seek to increase Ohio K-12 schools' access to computers, networks, and other technology, with a particular emphasis on low-wealth districts. This report addresses the gap between the…

  18. Corridors to Economic Growth and Employment: 1994-95 Final Report to the Governor and the Legislature.

    ERIC Educational Resources Information Center

    Helm, Phoebe

    The Economic Development Network (ED>Net) of the California Community Colleges was designed to advance the state's economic growth and competitiveness by coordinating and facilitating workforce improvement, technology deployment, and business development initiatives. This report reviews outcomes for ED>Net for 1994-95 based on reports…

  19. Give Your Old-School Curriculum a NETS Makeover

    ERIC Educational Resources Information Center

    LaMaster, Jen

    2012-01-01

    Integrating digital age technology into an industrial age educational system is hard enough. Imagine introducing ed tech to a 450-year-old Jesuit educational paradigm. Find out how to seamlessly combine the NETS with a centuries-old framework to create an effective ed tech strategic plan. This article describes how the author successfully…

  20. MOST PROBABLE NUMBER (MPN) CALCULATOR Version 2.0 User and System Installation and Administration Manual

    EPA Science Inventory

    The new MPN Calculator is an easy-to-use, stand-alone Windows application built by Avineon, Inc. for the EPA. The calculator was built using Microsoft .NET (dot NET) version 3.5 SP1 (C#) and Windows Presentation Foundation technologies. The new calculator not only combines the mai...
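    The statistics behind an MPN calculator are compact enough to sketch. Under the standard Poisson dilution model, the MPN is the concentration that maximizes the likelihood of the observed positive-tube counts, found here by bisection on the monotone score equation; the dilution series in the example is illustrative, not EPA data.

    ```python
    import math

    def mpn(tubes):
        """Maximum-likelihood MPN from dilution data.

        tubes: list of (n_tubes, n_positive, volume) per dilution.
        Solves sum(g*v / (1 - exp(-lam*v))) = sum(n*v) for lam by bisection.
        """
        total = sum(n * v for n, g, v in tubes)

        def score(lam):  # monotonically decreasing in lam
            return sum(g * v / (1.0 - math.exp(-lam * v))
                       for n, g, v in tubes) - total

        lo, hi = 1e-9, 1e6
        for _ in range(200):
            mid = math.sqrt(lo * hi)  # geometric midpoint: lam spans decades
            if score(mid) > 0:
                lo = mid
            else:
                hi = mid
        return mid

    # Illustrative 3-dilution series: 10, 1, 0.1 mL inocula, 5 tubes each,
    # with 5, 3, and 1 positive tubes respectively.
    print(round(mpn([(5, 5, 10.0), (5, 3, 1.0), (5, 1, 0.1)]), 3), "per mL")
    ```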

  1. Comparing and Contrasting Neural Net Solutions to Classical Statistical Solutions.

    ERIC Educational Resources Information Center

    Van Nelson, C.; Neff, Kathryn J.

    Data from two studies in which subjects were classified as successful or unsuccessful were analyzed using neural net technology after being analyzed with a linear regression function. Data were obtained from admission records of 201 students admitted to undergraduate and 285 students admitted to graduate programs. Data included grade point…

  2. Net returns from segregating dark northern spring wheat by protein concentration during harvest

    USDA-ARS?s Scientific Manuscript database

    In-line, optical sensing has been developed for on-combine measurement and mapping of grain protein concentration (GPC). The objective of this study was to estimate changes in costs and net returns from using this technology for segregation of the dark northern spring (DNS) subclass of hard red whe...

  3. Reviewing the Need for Gaming in Education to Accommodate the Net Generation

    ERIC Educational Resources Information Center

    Bekebrede, G.; Warmelink, H. J. G.; Mayer, I. S.

    2011-01-01

    There is a growing interest in the use of simulations and games in Dutch higher education. This development is based on the perception that students belong to the "gamer generation" or "net generation": a generation that has grown up with computer games and other technology affecting their preferred learning styles, social…

  4. Take the Science Fair Online!

    ERIC Educational Resources Information Center

    Tubbs, James

    2007-01-01

    The kids in today's classrooms spend lots of time playing video games, surfing the net, listening to iPods, and text messaging on cell phones. Known as Digital Kids and the Net Generation, they have grown up surrounded by digital media of all types (Tapscott 1999). Because they are already knowledgeable, why not use digital technologies to capture…

  5. Concentrating Solar Power Projects - Astexol II | Concentrating Solar Power

    Science.gov Websites

    Location: Badajoz. Owner(s): Elecnor/Aries/ABM AMRO (100%). Technology: parabolic trough. Turbine capacity (gross): 50.0 MW. Turbine capacity (net): 50.0 MW. Temperature difference: 100°C. Output type: indirect. Storage capacity: 8 hours (thermal).

  6. Overview of codes and tools for nuclear engineering education

    NASA Astrophysics Data System (ADS)

    Yakovlev, D.; Pryakhin, A.; Medvedeva, L.

    2017-01-01

    Recent world trends in nuclear education are developing in the direction of social education, networking, and virtual tools and codes. MEPhI, as a global leader in the world education market, implements advanced technologies for distance and online learning and for student research work. MEPhI has produced special codes, tools and web resources based on an internet platform to support education in the field of nuclear technology. At the same time, MEPhI actively uses codes and tools from third parties. Several types of tools are considered: calculation codes, nuclear data visualization tools, virtual labs, PC-based educational simulators for nuclear power plants (NPPs), CLP4NET, education web platforms, and distance courses (MOOCs and controlled and managed content systems). The university pays special attention to integrated products such as CLP4NET, which is not itself a learning course but serves to automate the process of learning through distance technologies. CLP4NET organizes all tools in the same information space. Up to now, MEPhI has achieved significant results in the field of distance education and online system implementation.

  7. Net-Shape HIP Powder Metallurgy Components for Rocket Engines

    NASA Technical Reports Server (NTRS)

    Bampton, Cliff; Goodin, Wes; VanDaam, Tom; Creeger, Gordon; James, Steve

    2005-01-01

    True net shape consolidation of powder metal (PM) by hot isostatic pressing (HIP) provides opportunities for many cost, performance and life benefits over conventional fabrication processes for large rocket engine structures. Various forms of selectively net-shape PM have been around for thirty years or so. However, it is only recently that major applications have been pursued for rocket engine hardware fabricated in the United States. The method employs sacrificial metallic tooling (HIP capsule and shaped inserts), which is removed from the part after HIP consolidation of the powder, by selective acid dissolution. Full exploitation of net-shape PM requires innovative approaches in both component design and materials and processing details. The benefits include: uniform and homogeneous microstructure with no porosity, irrespective of component shape and size; elimination of welds and the associated quality and life limitations; removal of traditional producibility constraints on design freedom, such as forgeability and machinability, and scale-up to very large, monolithic parts, limited only by the size of existing HIP furnaces. Net-shape PM HIP also enables fabrication of complex configurations providing additional, unique functionalities. The progress made in these areas will be described. Then critical aspects of the technology that still require significant further development and maturation will be discussed from the perspective of an engine systems builder and end-user of the technology.

  8. Energetic costs of producing muscle work and force in a cyclical human bouncing task

    PubMed Central

    Kuo, Arthur D.

    2011-01-01

    Muscles expend energy to perform active work during locomotion, but they may also expend significant energy to produce force, for example when tendons perform much of the work passively. The relative contributions of work and force to overall energy expenditure are unknown. We therefore measured the mechanics and energetics of a cyclical bouncing task, designed to control for work and force. We hypothesized that near bouncing resonance, little work would be performed actively by muscle, but the cyclical production of force would cost substantial metabolic energy. Human subjects (n = 9) bounced vertically about the ankles at inversely proportional frequencies (1–4 Hz) and amplitudes (15–4 mm), such that the overall rate of work performed on the body remained approximately constant (0.30 ± 0.06 W/kg), but the forces varied considerably. We used parameter identification to estimate series elasticity of the triceps surae tendon, as well as the work performed actively by muscle and passively by tendon. Net metabolic energy expenditure for bouncing at 1 Hz was 1.15 ± 0.31 W/kg, attributable mainly to active muscle work with an efficiency of 24 ± 3%. But at 3 Hz (near resonance), most of the work was performed passively, so that active muscle work could account for only 40% of the net metabolic rate of 0.76 ± 0.28 W/kg. Near resonance, a cost for cyclical force that increased with both amplitude and frequency of force accounted for at least as much of the total energy expenditure as a cost for work. Series elasticity reduces the need for active work, but energy must still be expended for force production. PMID:21212245

  9. Comparison of different deep learning approaches for parotid gland segmentation from CT images

    NASA Astrophysics Data System (ADS)

    Hänsch, Annika; Schwier, Michael; Gass, Tobias; Morgas, Tomasz; Haas, Benjamin; Klein, Jan; Hahn, Horst K.

    2018-02-01

    The segmentation of target structures and organs at risk is a crucial and very time-consuming step in radiotherapy planning. Good automatic methods can significantly reduce the time clinicians have to spend on this task. Due to its variability in shape and often low contrast to surrounding structures, segmentation of the parotid gland is especially challenging. Motivated by the recent success of deep learning, we study different deep learning approaches for parotid gland segmentation. Particularly, we compare 2D, 2D ensemble and 3D U-Net approaches and find that the 2D U-Net ensemble yields the best results with a mean Dice score of 0.817 on our test data. The ensemble approach reduces false positives without the need for an automatic region of interest detection. We also apply our trained 2D U-Net ensemble to segment the test data of the 2015 MICCAI head and neck auto-segmentation challenge. With a mean Dice score of 0.861, our classifier exceeds the highest mean score in the challenge. This shows that the method generalizes well onto data from independent sites. Since appropriate reference annotations are essential for training but often difficult and expensive to obtain, it is important to know how many samples are needed to properly train a neural network. We evaluate the classifier performance after training with differently sized training sets (50-450) and find that 250 cases (without using extensive data augmentation) are sufficient to obtain good results with the 2D ensemble. Adding more samples does not significantly improve the Dice score of the segmentations.
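    The Dice score used here is simple to compute, and the ensemble step can plausibly be realized by averaging member probability maps before thresholding; the snippet below sketches both under those assumptions (mask shapes, the 0.5 threshold, and the random "predictions" are placeholders).

    ```python
    import numpy as np

    def dice(pred, truth, eps=1e-8):
        """Dice coefficient between two binary masks: 2|A∩B| / (|A| + |B|)."""
        pred, truth = pred.astype(bool), truth.astype(bool)
        inter = np.logical_and(pred, truth).sum()
        return 2.0 * inter / (pred.sum() + truth.sum() + eps)

    # Toy stand-in for a 2D U-Net ensemble: average the members' probability
    # maps, then threshold (one plausible realization of the ensemble step).
    members = [np.random.rand(64, 64) for _ in range(3)]  # per-model probabilities
    ensemble_mask = np.mean(members, axis=0) > 0.5

    truth = np.zeros((64, 64), dtype=bool)
    truth[20:40, 20:40] = True  # dummy reference annotation
    print(f"Dice: {dice(ensemble_mask, truth):.3f}")
    ```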

  10. Focus on the post-DVD formats

    NASA Astrophysics Data System (ADS)

    He, Hong; Wei, Jingsong

    2005-09-01

    As digital TV (DTV) technologies develop rapidly in their standard systems, hardware, software models, and interfaces between DTV and the home network, worldwide broadcasting of High Definition TV (HDTV) programs is scheduled. Enjoying high-quality TV programs at home is no longer a far-off dream. As for the main recording media, which optical storage technology will become the mainstream for meeting HDTV requirements is a growing concern. At present, a few post-DVD formats are competing on technology, standards and market. Here we give a review of the coexisting post-DVD formats in the world, and discuss the basic parameters of the optical disks, video/audio coding strategies, and system performance for HDTV programs.

  11. The Impact of Experience and Technology Change on Task-Technology Fit of a Collaborative Technology

    ERIC Educational Resources Information Center

    Iversen, Jakob H.; Eierman, Michael A.

    2018-01-01

    This study continues a long-running effort to examine collaborative writing and editing tools and the factors that impact Task-Technology Fit and Technology Acceptance. Previous studies found that MS Word/email performed better than technologies such as Twiki, Google Docs, and Office Live. The current study seeks to examine specifically the impact…

  12. The Impact of Communication Mode and Task Complexity on Small Group Performance and Member Satisfaction.

    ERIC Educational Resources Information Center

    Carey, Jane M.; Kacmar, Charles, J.

    1997-01-01

    It is often presumed that software technology will increase group productivity, but this may not be the case. Examines the impact of technology on time-to-complete-task, member satisfaction, perceived information load, number of contributing transactions, and task complexity. Three appendices provide examples of complex and simple tasks and the…

  13. Pilot Richards on middeck wearing University of Missouri 'MIZZOU' t-shirt

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Pilot Richard N. Richards takes a moment from middeck tasks to display his University of Missouri 'MIZZOU' t-shirt. Behind Richards are the forward middeck lockers, a net stowage bag filled with clothing, and the sleep restraints fastened to the starboard wall.

  14. Neural networks as a control methodology

    NASA Technical Reports Server (NTRS)

    Mccullough, Claire L.

    1990-01-01

    While conventional computers must be programmed in a logical fashion by a person who thoroughly understands the task to be performed, the motivation behind neural networks is to develop machines which can train themselves to perform tasks, using available information about desired system behavior and learning from experience. There are three goals of this fellowship program: (1) to evaluate various neural net methods and generate computer software to implement those deemed most promising on a personal computer equipped with Matlab; (2) to evaluate methods currently in the professional literature for system control using neural nets to choose those most applicable to control of flexible structures; and (3) to apply the control strategies chosen in (2) to a computer simulation of a test article, the Control Structures Interaction Suitcase Demonstrator, which is a portable system consisting of a small flexible beam driven by a torque motor and mounted on springs tuned to the first flexible mode of the beam. Results of each are discussed.

  15. Planetary entry, descent, and landing technologies

    NASA Astrophysics Data System (ADS)

    Pichkhadze, K.; Vorontsov, V.; Polyakov, A.; Ivankov, A.; Taalas, P.; Pellinen, R.; Harri, A.-M.; Linkin, V.

    2003-04-01

    The Martian meteorological lander (MML) is intended for landing on the Martian surface in order to monitor the atmosphere at the landing point for one Martian year. MMLs shall become the basic elements of a global network of meteorological mini-landers observing the dynamics of changes in atmospheric parameters on the Red Planet. The MML's main scientific tasks are as follows: (1) study of the vertical structure of the Martian atmosphere throughout the MML descent; (2) on-surface meteorological observations for one Martian year. One of the essential factors influencing the lander's design is its entry, descent, and landing (EDL) sequence. During Phase A of the MML development, five different options for the lander's design were carefully analyzed. All of these options ensure the accomplishment of the above-mentioned scientific tasks with high effectiveness. CONCEPT A (conventional approach): Two lander options (with a parachute system + airbag and an inflatable airbrake + airbag) were analyzed. They are similar in terms of fulfilling the braking phases and completely analogous in landing by means of airbags. CONCEPT B (innovative approach): Three lander options were analyzed. The distinguishing feature is the presence of inflatable braking units (IBUs) in their configurations. SELECTED OPTION (innovative approach): Incorporating a unique design approach and modern technologies, the selected option of the lander represents a combination of the options analyzed in the framework of the Concept B study. Currently, the selected lander option is undergoing systems testing (Phase D1). Several MMLs can be delivered to Mars in the framework of various missions, as primary or piggybacking payload: (1) the USA-led "Mars Scout" (2007); (2) the France-led "NetLander" (2007/2009); (3) the Russia-led "Mars-Deimos-Phobos sample return" (2007); (4) an independent mission (currently under preliminary study); etc.

  16. Creating an Assured Joint DOD and Interagency Interoperable Net-Centric Enterprise. Report of the Defense Science Board Task Force on Achieving Interoperability in a Net-Centric Environment

    DTIC Science & Technology

    2009-03-01

    policy, elliptic curve public key cryptography using the 256-bit prime modulus elliptic curve as specified in FIPS-186-2, and SHA-256, are appropriate for ... publications/fips/fips186-2/fips186-2-change1.pdf ... Hashing via the Secure Hash Algorithm (using SHA-256 and ... lithography and processing techniques. Field programmable gate arrays (FPGAs) are a chip design of interest. These devices are extensively used in ...

  17. Text mining and expert curation to develop a database on psychiatric diseases and their genes

    PubMed Central

    Gutiérrez-Sacristán, Alba; Bravo, Àlex; Portero-Tresserra, Marta; Valverde, Olga; Armario, Antonio; Blanco-Gandía, M.C.; Farré, Adriana; Fernández-Ibarrondo, Lierni; Fonseca, Francina; Giraldo, Jesús; Leis, Angela; Mané, Anna; Mayer, M.A.; Montagud-Romero, Sandra; Nadal, Roser; Ortiz, Jordi; Pavon, Francisco Javier; Perez, Ezequiel Jesús; Rodríguez-Arias, Marta; Serrano, Antonia; Torrens, Marta; Warnault, Vincent; Sanz, Ferran

    2017-01-01

    Abstract Psychiatric disorders constitute one of the main causes of disability worldwide. During the past years, considerable research has been conducted on the genetic architecture of such diseases, although little understanding of their etiology has been achieved. The difficulty to access up-to-date, relevant genotype-phenotype information has hampered the application of this wealth of knowledge to translational research and clinical practice in order to improve diagnosis and treatment of psychiatric patients. PsyGeNET (http://www.psygenet.org/) has been developed with the aim of supporting research on the genetic architecture of psychiatric diseases, by providing integrated and structured accessibility to their genotype–phenotype association data, together with analysis and visualization tools. In this article, we describe the protocol developed for the sustainable update of this knowledge resource. It includes the recruitment of a team of domain experts in order to perform the curation of the data extracted by text mining. Annotation guidelines and a web-based annotation tool were developed to support the curators’ tasks. A curation workflow was designed including a pilot phase and two rounds of curation and analysis phases. Negative evidence from the literature on gene–disease associations (GDAs) was taken into account in the curation process. We report the results of the application of this workflow to the curation of GDAs for PsyGeNET, including the analysis of the inter-annotator agreement and suggest this model as a suitable approach for the sustainable development and update of knowledge resources. Database URL: http://www.psygenet.org PsyGeNET corpus: http://www.psygenet.org/ds/PsyGeNET/results/psygenetCorpus.tar PMID:29220439

  18. Global potential for and limits to widespread implementation of bioenergy with carbon capture and storage (BECCS)

    NASA Astrophysics Data System (ADS)

    Smith, P.

    2017-12-01

    A majority of Integrated Assessment Models (IAMs) use negative emissions technologies (NETs), often in very significant amounts (20 Gt CO2e/yr), to reach a 2°C target by 2100; among these, BECCS is often selected as the most cost-effective NET. Given that most models fail to reach a 2°C target without NETs, it seems impossible that the aspirational 1.5°C target of the Paris Agreement could be met without NETs, with BECCS suggested as a major one. It is therefore essential that the potential, feasibility and impacts of BECCS are better defined. Potential limits to widespread application of BECCS include land competition, greenhouse gas emissions, physical climate feedbacks (e.g., albedo), water requirements, nutrient use, energy and cost, all of which are explored in this presentation and compared to the impacts of other land-based NETs.

  19. Implementation of NFC technology for industrial applications: case flexible production

    NASA Astrophysics Data System (ADS)

    Sallinen, Mikko; Strömmer, Esko; Ylisaukko-oja, Arto

    2007-09-01

    Near Field Communication (NFC) technology enables flexible short-range communication and has a large number of envisaged applications in the consumer, welfare and industrial sectors. Compared with other short-range communication technologies such as Bluetooth or Wibree, it provides advantages that we introduce in this paper. We present an example of applying NFC technology to an industrial application in which simple tasks can be automated and the industrial assembly process can be improved radically by replacing manual paperwork and increasing traceability of the products during production.

  20. Situated, strategic, and AI-Enhanced technology introduction to healthcare.

    PubMed

    Bushko, Renata G

    2005-01-01

    We work hard on creating AI-wings for physicians to let them fly higher and faster in diagnosing patients--a task that physicians do not want to automate. What we do not work hard on is determining the ENVIRONMENT in which physicians' AI-wings are supposed to function. That seems to be a job for social/business analysts, who have their own separate kingdom. For the sake of all of us (potential patients!), social/business consultants and their methodologies should not be treated as a separate kingdom. The most urgent task is to achieve synergy between (1) AI/fuzzy/neural research, (2) applied medical AI, and (3) social/business research on medical institutions. We need this synergy in order to assure humanistic medical technology: technology flexible and sensitive enough to facilitate healthcare work while leaving space for human pride and creativity. In order to achieve humanistic technology, designers should consider the impact of technological breakthroughs on the organizations in which the technology will function and the nature of the work of the humans destined to use it. A Situated (different for each organization), Strategic (based on in-depth knowledge of the healthcare business), and AI-Enhanced (ending with a dynamic model) method for introducing technology to healthcare makes it possible to identify areas where technology can make medical work easier. Using this method before automating human work will get us closer to the ideal in which there is no discontinuity between the design and use of programs, and the technology matches users' needs perfectly--the world with humanistic technology and healthcare workers with AI-wings.

  1. ROOT.NET: Using ROOT from .NET languages like C# and F#

    NASA Astrophysics Data System (ADS)

    Watts, G.

    2012-12-01

    ROOT.NET provides an interface between Microsoft's Common Language Runtime (CLR) and .NET technology and the ubiquitous particle physics analysis tool, ROOT. ROOT.NET automatically generates a series of efficient wrappers around the ROOT API. Unlike pyROOT, these wrappers are statically typed and thus markedly more efficient than the Python wrappers. The connection to .NET means that one gains access to the full range of languages developed for the CLR, including functional languages like F# (based on OCaml). Many features that make ROOT objects work well in the .NET world are added (properties, IEnumerable interface, LINQ compatibility, etc.). Dynamic languages based on the CLR can be used as well, of course (Python, for example). Additionally, it is now possible to access ROOT objects that are unknown to the translation tool. This poster will describe the techniques used to effect this translation, along with performance comparisons and examples. All described source code is posted on the open source site CodePlex.
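
    For contrast with the statically typed ROOT.NET wrappers described above, the dynamically typed pyROOT interface looks roughly like the sketch below (assuming a ROOT installation that exposes the `ROOT` Python module). Every method call here is resolved at runtime, which is the overhead the statically typed wrappers avoid.

    ```python
    # Minimal pyROOT sketch: the dynamically typed wrappers that ROOT.NET is
    # contrasted with; requires a ROOT installation exposing the ROOT module.
    import ROOT

    h = ROOT.TH1F("h", "Gaussian fill;x;entries", 100, -4.0, 4.0)
    rng = ROOT.TRandom3(42)
    for _ in range(10000):
        h.Fill(rng.Gaus(0.0, 1.0))  # method lookup resolved at runtime

    print(f"mean = {h.GetMean():.3f}, rms = {h.GetRMS():.3f}")
    ```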

  2. Study on launch scheme of space-net capturing system.

    PubMed

    Gao, Qingyu; Zhang, Qingbin; Feng, Zhiwei; Tang, Qiangang

    2017-01-01

    With the continuous progress in active debris-removal technology, the concept of a space-net capturing system is attracting increasing attention. The space-net capturing system is a long-range-launch flexible capture system with great potential to capture non-cooperative targets such as inactive satellites and upper stages. In this work, the launch scheme is studied by experiment and simulation, including two-step ejection and multi-point-traction analyses. The numerical model of the tether/net is based on the finite element method and is verified by a full-scale ground experiment. The results of the ground experiment and numerical simulation show that the two-step ejection and six-point traction scheme of the space-net system is superior to the traditional one-step ejection and four-point traction launch scheme.
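
    The paper's finite element tether/net model is far more detailed than can be reproduced here, but a heavily simplified lumped-mass sketch conveys the idea: point masses joined by tension-only spring-damper elements, integrated with SciPy. All parameter values below are illustrative, not taken from the study.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Illustrative lumped-mass tether: n point masses linked by spring-damper
    # elements that resist stretch only (a slack tether carries no compression).
    n, m, k, c, L0 = 10, 0.05, 500.0, 0.5, 0.2  # masses [kg], stiffness, damping, rest length [m]

    def rhs(t, y):
        pos = y[:2 * n].reshape(n, 2)
        vel = y[2 * n:].reshape(n, 2)
        acc = np.zeros_like(pos)
        for i in range(n - 1):
            d = pos[i + 1] - pos[i]
            dist = np.linalg.norm(d)
            stretch = dist - L0
            if stretch > 0.0:                  # tension only
                u = d / dist
                rel_v = np.dot(vel[i + 1] - vel[i], u)
                f = (k * stretch + c * rel_v) * u
                acc[i] += f / m
                acc[i + 1] -= f / m
        acc[1:, 1] -= 9.81                     # gravity; node 0 held fixed
        acc[0] = 0.0
        return np.concatenate([vel.ravel(), acc.ravel()])

    pos0 = np.column_stack([np.linspace(0, n * L0 * 0.9, n), np.zeros(n)])
    y0 = np.concatenate([pos0.ravel(), np.zeros(2 * n)])
    sol = solve_ivp(rhs, (0.0, 2.0), y0, max_step=1e-3)
    print("tip position at t = 2 s:", sol.y[2 * (n - 1):2 * n, -1])
    ```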

  3. Technology transfer in software engineering

    NASA Technical Reports Server (NTRS)

    Bishop, Peter C.

    1989-01-01

    The University of Houston-Clear Lake is the prime contractor for the AdaNET Research Project under the direction of NASA Johnson Space Center. AdaNET was established to promote the principles of software engineering to the software development industry. AdaNET will contain not only environments and tools, but also concepts, principles, models, standards, guidelines and practices. Initially, AdaNET will serve clients from the U.S. government and private industry who are working in software development. It will seek new clients from those who have not yet adopted the principles and practices of software engineering. Some of the goals of AdaNET are to become known as an objective, authoritative source of new software engineering information and parts, to provide easy access to information and parts, and to keep abreast of innovations in the field.

  4. Study on launch scheme of space-net capturing system

    PubMed Central

    Zhang, Qingbin; Feng, Zhiwei; Tang, Qiangang

    2017-01-01

    With the continuous progress in active debris-removal technology, the concept of a space-net capturing system is attracting increasing attention. The space-net capturing system is a long-range-launch flexible capture system with great potential to capture non-cooperative targets such as inactive satellites and upper stages. In this work, the launch scheme is studied by experiment and simulation, including two-step ejection and multi-point-traction analyses. The numerical model of the tether/net is based on the finite element method and is verified by a full-scale ground experiment. The results of the ground experiment and numerical simulation show that the two-step ejection and six-point traction scheme of the space-net system is superior to the traditional one-step ejection and four-point traction launch scheme. PMID:28877187

  5. NASA/ESTO investments in remote sensing technologies (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Babu, Sachidananda R.

    2017-02-01

    For more than 18 years, the NASA Earth Science Technology Office (ESTO) has been investing in remote sensing technologies. During this period ESTO has invested in more than 900 tasks. These tasks are managed under multiple programs, such as the Instrument Incubator Program (IIP), Advanced Component Technology (ACT), Advanced Information Systems Technology (AIST), In-Space Validation of Earth Science Technologies (InVEST), Sustainable Land Imaging - Technology (SLI-T) and others. This covers the whole spectrum of technologies, from components to full-up satellites in space, as well as software. Over the years many of these technologies have been infused into space missions such as Aquarius, SMAP, CYGNSS, SWOT, TEMPO and others. ESTO is actively investing in infrared sensor technologies for space applications; recent investments have been under the SLI-T and InVEST programs. On these tasks, technology development ranges from simple bolometers to advanced photonic-waveguide-based spectrometers. Some of the details of these missions and technologies will be presented.

  6. ESTO Investments in Innovative Sensor Technologies for Remote Sensing

    NASA Technical Reports Server (NTRS)

    Babu, Sachidananda R.

    2017-01-01

    For more than 18 years, the NASA Earth Science Technology Office (ESTO) has been investing in remote sensing technologies. During this period ESTO has invested in more than 900 tasks. These tasks are managed under multiple programs, such as the Instrument Incubator Program (IIP), Advanced Component Technology (ACT), Advanced Information Systems Technology (AIST), In-Space Validation of Earth Science Technologies (InVEST), Sustainable Land Imaging - Technology (SLI-T) and others. This covers the whole spectrum of technologies, from components to full-up satellites in space, as well as software. Over the years many of these technologies have been infused into space missions such as Aquarius, SMAP, CYGNSS, SWOT, TEMPO and others. ESTO is actively investing in infrared sensor technologies for space applications; recent investments have been under the SLI-T and InVEST programs. On these tasks, technology development ranges from simple bolometers to advanced photonic-waveguide-based spectrometers. Some of the details of these missions and technologies will be presented.

  7. Nonword repetition priming in lexical decision reverses as a function of study task and speed stress.

    PubMed

    Zeelenberg, René; Wagenmakers, Eric-Jan; Shiffrin, Richard M

    2004-01-01

    The authors argue that nonword repetition priming in lexical decision is the net result of 2 opposing processes. First, repeating nonwords in the lexical decision task results in the storage of a memory trace containing the interpretation that the letter string is a nonword; retrieval of this trace leads to an increase in performance for repeated nonwords. Second, nonword repetition results in increased familiarity, making the nonword more "wordlike," leading to a decrease in performance. Consistent with this dual-process account, Experiment 1 showed a facilitatory effect for nonwords studied in a lexical decision task but an inhibitory effect for nonwords studied in a letter-height task. Experiment 2 showed inhibitory nonword repetition priming for participants tested under speed-stress instructions. ((c) 2004 APA, all rights reserved)

  8. Net Generation's Learning Styles in Nursing Education.

    PubMed

    Christodoulou, Eleni; Kalokairinou, Athina

    2015-01-01

    Numerous surveys have confirmed that emerging technologies and Web 2.0 tools have been a defining feature in the lives of current students, estimating that there is a fundamental shift in the way young people communicate, socialize and learn. Nursing students in higher education are characterized as digitally literate, with distinct traits that influence their learning styles. Millennials exhibit distinct learning preferences such as teamwork, experiential activities, structure, instant feedback and technology integration. Higher education institutions should be aware of the implications of the Net Generation coming to university and be prepared to meet their expectations and learning needs.

  9. Noninvasive Electroencephalogram Based Control of a Robotic Arm for Reach and Grasp Tasks

    NASA Astrophysics Data System (ADS)

    Meng, Jianjun; Zhang, Shuying; Bekyo, Angeliki; Olsoe, Jaron; Baxter, Bryan; He, Bin

    2016-12-01

    Brain-computer interface (BCI) technologies aim to provide a bridge between the human brain and external devices. Prior research using non-invasive BCI to control virtual objects, such as computer cursors and virtual helicopters, and real-world objects, such as wheelchairs and quadcopters, has demonstrated the promise of BCI technologies. However, controlling a robotic arm to complete reach-and-grasp tasks efficiently using non-invasive BCI has yet to be shown. In this study, we found that a group of 13 human subjects could willingly modulate brain activity to control a robotic arm with high accuracy for performing tasks requiring multiple degrees of freedom, by combining two sequential low-dimensional controls. Subjects were able to effectively control reaching of the robotic arm through modulation of their brain rhythms within the span of only a few training sessions and maintained the ability to control the robotic arm over multiple months. Our results demonstrate the viability of human operation of prosthetic limbs using non-invasive BCI technology.

  10. Case study: Optimizing fault model input parameters using bio-inspired algorithms

    NASA Astrophysics Data System (ADS)

    Plucar, Jan; Grunt, Onřej; Zelinka, Ivan

    2017-07-01

    We present a case study that demonstrates a bio-inspired approach to finding optimal parameters for a GSM fault model. This model is constructed using a Petri Nets approach; it represents a dynamic model of the GSM network environment in the suburban areas of the city of Ostrava (Czech Republic). We were faced with the task of finding optimal parameters for an application that requires a high volume of data transfers between the application itself and secure servers located in a datacenter. In order to find the optimal set of parameters we employ bio-inspired algorithms such as Differential Evolution (DE) and the Self Organizing Migrating Algorithm (SOMA). In this paper we present the use of these algorithms, compare their results, and assess their performance in fault-probability mitigation.
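
    The paper evaluates the optimisers against the Petri-net GSM fault model; as a stand-in, the sketch below runs SciPy's implementation of Differential Evolution on a made-up two-parameter objective. The parameter names and the objective function are hypothetical; in the study the objective would be evaluated by simulating the Petri-net model.

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution

    # Stand-in objective: a pretend fault probability as a function of two
    # tunable transfer parameters (hypothetical names and shape).
    def fault_probability(x):
        retry_interval, chunk_size = x
        return (0.3 * (retry_interval - 1.5) ** 2
                + 0.1 * np.log1p(chunk_size)
                + 0.05 * np.sin(5 * retry_interval))

    bounds = [(0.1, 5.0),     # retry interval [s]
              (1.0, 512.0)]   # chunk size [kB]
    result = differential_evolution(fault_probability, bounds, seed=1, tol=1e-8)
    print("best parameters:", result.x, "estimated fault probability:", result.fun)
    ```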

  11. Modernising Education and Training: Mobilising Technology for Learning

    ERIC Educational Resources Information Center

    Attewell, Jill; Savill-Smith, Carol; Douch, Rebecca; Parker, Guy

    2010-01-01

    In recent years there have been amazing advances in consumer technology. The Mobile Learning Network (MoLeNET) initiative has enabled colleges and schools to harness some of this technology in order to modernise aspects of teaching, learning and training. The result has been improvements in learner engagement, retention, achievement and…

  12. Quantitative evaluation of three advanced laparoscopic viewing technologies: a stereo endoscope, an image projection display, and a TFT display.

    PubMed

    Wentink, M; Jakimowicz, J J; Vos, L M; Meijer, D W; Wieringa, P A

    2002-08-01

    Compared to open surgery, minimally invasive surgery (MIS) relies heavily on advanced technology, such as endoscopic viewing systems and innovative instruments. The aim of the study was to objectively compare three technologically advanced laparoscopic viewing systems with the standard viewing system currently used in most Dutch hospitals. We evaluated the following advanced laparoscopic viewing systems: a Thin Film Transistor (TFT) display, a stereo endoscope, and an image projection display. The standard viewing system consisted of a monocular endoscope and a high-resolution monitor. Task completion time served as the measure of performance. Eight surgeons with laparoscopic experience participated in the experiment. The average task time was significantly greater (p < 0.05) with the stereo viewing system than with the standard viewing system. The average task times with the TFT display and the image projection display did not differ significantly from the standard viewing system. Although the stereo viewing system promises improved depth perception, and the TFT and image projection displays are supposed to improve hand-eye coordination, none of these systems provided better task performance than the standard viewing system in this pelvi-trainer experiment.

  13. 3min. poster presentations of B01

    NASA Astrophysics Data System (ADS)

    Foing, Bernard H.

    We give a report on recommendations from ILEWG International conferences held at Cape Canaveral in 2008 (ICEUM10) and in Beijing in May 2010 with IAF (GLUC-ICEUM11). We discuss the different rationales for Moon exploration. Priorities for scientific investigations include: clues on the formation and evolution of rocky planets, accretion and bombardment in the inner solar system, comparative planetology processes (tectonic, volcanic, impact cratering, volatile delivery), historical records, astrobiology, survival of organics; past, present and future life. The ILEWG technology task group set priorities for the advancement of instrumentation: remote sensing miniaturised instruments; surface geophysical and geochemistry package; instrument deployment and robotic arm, nano-rover, sampling, drilling; sample finder and collector; regional mobility rover; autonomy and navigation; artificially intelligent robots, complex systems. The ILEWG ExogeoLab pilot project was developed as support for instruments, landers, rovers, and preparation for a cooperative robotic village. The ILEWG lunar base task group looked at minimal design concepts and technologies in robotic and human exploration, with tele-control, telepresence, virtual reality; man-machine interface and performances. The ILEWG ExoHab pilot project has been started with support from agencies and partners. We discuss ILEWG terrestrial Moon-Mars campaigns for validation of technologies, research and human operations. We indicate how Moon-Mars exploration can inspire solutions to global Earth sustained development: in-situ utilisation of resources; establishment of permanent robotic infrastructures; environmental protection aspects; life sciences laboratories; support to human exploration. Co-authors: ILEWG Task Groups on Science, Technology, Robotic Village, Lunar Bases, Commercial and Societal Aspects, Roadmap Synergies with other Programmes, Public Engagement and Outreach, Young Lunar Explorers.

  14. Virtual Reality-Based Center of Mass-Assisted Personalized Balance Training System

    PubMed Central

    Kumar, Deepesh; González, Alejandro; Das, Abhijit; Dutta, Anirban; Fraisse, Philippe; Hayashibe, Mitsuhiro; Lahiri, Uttama

    2018-01-01

    Poststroke hemiplegic patients often show altered weight distribution with balance disorders, increasing their risk of falls. Conventional balance training, though powerful, suffers from a scarcity of trained therapists, frequent visits to clinics to get therapy, one-on-one therapy sessions, and the monotony of repetitive exercise tasks. Thus, technology-assisted balance rehabilitation can be an alternative solution. Here, we chose virtual reality as a technology-based platform to develop motivating balance tasks. This platform was augmented with off-the-shelf sensors such as the Nintendo Wii balance board and Kinect to estimate one's center of mass (CoM). The virtual reality-based CoM-assisted balance task system (Virtual CoMBaT) was designed to be adaptive to one's individualized weight-shifting capability, quantified through CoM displacement. Participants were asked to interact with Virtual CoMBaT, which offered tasks of varying challenge levels while adhering to the ankle strategy for weight shifting. To help patients use the ankle strategy during weight shifting, we designed a heel-lift detection module. A usability study was carried out with 12 hemiplegic patients. Results indicate the potential of our system to contribute to improving one's overall performance in balance-related tasks belonging to different difficulty levels. PMID:29359128
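
    As an illustration of the kind of signal such a platform works with, the sketch below estimates the centre of pressure from the four corner load cells of a Wii-style balance board. The board dimensions and readings are illustrative, and the actual Virtual CoMBaT pipeline (which also fuses Kinect data to estimate the CoM proper) is not reproduced here.

    ```python
    # Centre-of-pressure estimate from the four corner load cells of a balance
    # board (Wii Balance Board style). Dimensions and readings are illustrative.
    BOARD_W, BOARD_L = 0.433, 0.238   # sensor spacing in metres (approximate)

    def center_of_pressure(tl, tr, bl, br):
        """COP coordinates (x right, y forward) from corner forces in newtons."""
        total = tl + tr + bl + br
        x = (BOARD_W / 2.0) * ((tr + br) - (tl + bl)) / total
        y = (BOARD_L / 2.0) * ((tl + tr) - (bl + br)) / total
        return x, y

    # Hypothetical reading: weight shifted toward the right-rear corner
    print(center_of_pressure(tl=120.0, tr=180.0, bl=140.0, br=260.0))
    ```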

  15. Angular Impulse and Balance Regulation During the Golf Swing.

    PubMed

    Peterson, Travis J; Wilcox, Rand R; McNitt-Gray, Jill L

    2016-08-01

    Our aim was to determine how skilled players regulate linear and angular impulse while maintaining balance during the golf swing. Eleven highly-skilled golf players performed swings with a 6-iron and driver. Components contributing to linear and angular impulse generated by the rear and target legs (resultant horizontal reaction force [RFh], RFh-angle, and moment arm) were quantified and compared across the group and within a player (α = .05). Net angular impulse generated by both the rear and target legs was greater for the driver than the 6-iron. Mechanisms used to regulate angular impulse generation between clubs varied across players and required coordination between the legs. Increases in net angular impulse with a driver involved increases in target leg RFh. Rear leg RFh-angle was maintained between clubs whereas target leg RFh became more aligned with the target line. Net linear impulse perpendicular to the target line remained near zero, preserving balance, while net linear impulse along the target line decreased in magnitude. These results indicate that the net angular impulse was regulated between clubs by coordinating force generation of the rear and target legs while sustaining balance throughout the task.
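
    A worked example of the central quantity: net angular impulse is the time integral of the moment produced by the resultant horizontal reaction force (RFh) acting through its moment arm. The sketch below integrates force-plate-style traces numerically; all values are made up for illustration.

    ```python
    import numpy as np

    # Illustrative angular-impulse calculation: integrate the moment of the
    # horizontal ground reaction force about the body's CoM over the downswing.
    fs = 1000.0                            # force-plate sampling rate [Hz]
    t = np.arange(0, 0.3, 1 / fs)          # 300 ms of the swing

    # Hypothetical resultant horizontal force [N] and moment arm to the CoM [m]
    rf_h = 400.0 * np.sin(np.pi * t / 0.3)
    moment_arm = 0.25 + 0.05 * np.sin(2 * np.pi * t / 0.3)

    angular_impulse = np.trapz(rf_h * moment_arm, t)   # [N*m*s]
    print(f"net angular impulse: {angular_impulse:.1f} N*m*s")
    ```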

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vimmerstedt, Laura; Brown, Austin; Newes, Emily

    The transportation sector is changing, influenced by concurrent, ongoing, dynamic trends that could dramatically affect the future energy landscape, including effects on the potential for greenhouse gas emissions reductions. Battery cost reductions and improved performance coupled with a growing number of electric vehicle model offerings are enabling greater battery electric vehicle market penetration, and advances in fuel cell technology and decreases in hydrogen production costs are leading to initial fuel cell vehicle offerings. Radically more efficient vehicles based on both conventional and new drivetrain technologies reduce greenhouse gas emissions per vehicle-mile. Net impacts also depend on the energy sources used for propulsion, and these are changing with increased use of renewable energy and unconventional fossil fuel resources. Connected and automated vehicles are emerging for personal and freight transportation systems and could increase use of low- or non-emitting technologies and systems; however, the net effects of automation on greenhouse gas emissions are uncertain. The longstanding trend of an annual increase in transportation demand has reversed for personal vehicle miles traveled in recent years, demonstrating the possibility of lower-travel future scenarios. Finally, advanced biofuel pathways have continued to develop, highlighting low-carbon and in some cases carbon-negative fuel pathways. We discuss the potential for transformative reductions in petroleum use and greenhouse gas emissions through these emerging transportation-sector technologies and trends and present a Clean Transportation Sector Initiative scenario for such reductions, which are summarized in Table ES-1.

  17. Renewal of K-NET (National Strong-motion Observation Network of Japan)

    NASA Astrophysics Data System (ADS)

    Kunugi, T.; Fujiwara, H.; Aoi, S.; Adachi, S.

    2004-12-01

    The National Research Institute for Earth Science and Disaster Prevention (NIED) operates K-NET (Kyoshin Network), the national strong-motion observation network, which evenly covers the whole of Japan at intervals of 25 km on average. K-NET was constructed after the Hyogoken-Nambu (Kobe) earthquake in January 1995 and began operation in June 1996. Thus, eight years have passed since K-NET started, and a large volume of strong-motion records has been obtained. As technology has progressed and new technologies have become available, NIED has developed a new K-NET with improved functionality. New seismographs have been installed at 443 observatories, mainly in southwestern Japan where there is a risk of strong motion due to the Nankai and Tonankai earthquakes. The new system went into operation in June 2004, although seismographs in other areas have yet to be replaced. The new seismograph (K-NET02) consists of a sensor module, a measurement module and a communication module. A UPS, a GPS antenna and a dial-up router are also installed together with each K-NET02. A triaxial accelerometer, the FBA-ES-DECK (Kinemetrics Inc.), is built into the sensor module. The measurement module functions as a conventional strong-motion seismograph for high-precision observation. The communication module can perform sophisticated processes, such as calculation of the Japan Meteorological Agency (JMA) seismic intensity, continuous recording of data and near real-time data transmission. It connects to the Data Management Center (DMC) using an ISDN line. In the event of a power failure, the measurement module can control the power supply to the router and the communication module to conserve battery power. One of the main features of the K-NET02 is a function for computing JMA seismic intensity. The K-NET02 functions as a proper seismic intensity meter that complies with the official requirements of JMA, whereas the old strong-motion seismograph (K-NET95) does not calculate seismic intensity. Another feature is near real-time data transmission. When a K-NET02 detects strong motion, it can automatically connect to the DMC in 2 to 5 seconds and then transmit seismic data. Furthermore, the full scale is improved from 2000 gal to 4000 gal and the dynamic range of AD conversion is more than 132 dB. Strong-motion records of the new K-NET are available at: http://www.kyoshin.bosai.go.jp/
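
    The JMA instrumental seismic intensity that K-NET02 computes is, in outline, I = 2*log10(a0) + 0.94, where a0 (in gal) is the vector acceleration amplitude exceeded for a cumulative 0.3 s of the band-filtered record. A rough sketch follows; the official JMA band-pass filter is omitted and the input record is synthetic, so this is an approximation of the published procedure rather than a compliant implementation.

    ```python
    import numpy as np

    def jma_intensity(ax, ay, az, fs):
        """Approximate instrumental JMA seismic intensity.

        ax, ay, az: three-component acceleration [gal], assumed already passed
        through the official JMA band filter (omitted here); fs: rate [Hz].
        """
        amp = np.sqrt(ax**2 + ay**2 + az**2)
        n_03s = int(round(0.3 * fs))          # samples in 0.3 s
        a0 = np.sort(amp)[-n_03s]             # amplitude exceeded for 0.3 s total
        return 2.0 * np.log10(a0) + 0.94

    # Synthetic record: 60 s of weak noise with a short strong burst
    fs = 100.0
    rng = np.random.default_rng(0)
    acc = rng.normal(0, 2, (3, int(60 * fs)))
    acc[:, 3000:3200] += 80.0                 # burst of roughly 80 gal per axis
    print(f"JMA intensity ~ {jma_intensity(*acc, fs):.1f}")
    ```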

  18. Examining the Impact of Off-Task Multi-Tasking with Technology on Real-Time Classroom Learning

    ERIC Educational Resources Information Center

    Wood, Eileen; Zivcakova, Lucia; Gentile, Petrice; Archer, Karin; De Pasquale, Domenica; Nosko, Amanda

    2012-01-01

    The purpose of the present study was to examine the impact of multi-tasking with digital technologies while attempting to learn from real-time classroom lectures in a university setting. Four digitally-based multi-tasking activities (texting using a cell-phone, emailing, MSN messaging and Facebook[TM]) were compared to 3 control groups…

  19. NET-VISA, a Bayesian method next-generation automatic association software. Latest developments and operational assessment.

    NASA Astrophysics Data System (ADS)

    Le Bras, Ronan; Kushida, Noriyuki; Mialle, Pierrick; Tomuta, Elena; Arora, Nimar

    2017-04-01

    The Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has been developing a Bayesian method and software to perform the key step of automatic association of seismological, hydroacoustic, and infrasound (SHI) parametric data. In our preliminary testing at the CTBTO, NET-VISA shows much better performance than the currently operating automatic association module, with the rate of automatic events matching analyst-reviewed events increased by 10%, signifying that the percentage of missed events is lowered by 40%. Initial tests involving analysts also showed that the new software will complete the automatic bulletins of the CTBTO by adding previously missed events. Because products of the CTBTO are widely distributed to its member States as well as throughout the seismological community, the introduction of a new technology must be carried out carefully; the first step of operational integration is to use NET-VISA results within the interactive analysts' software so that the analysts can check the robustness of the Bayesian approach. We report on the latest results, both on the progress of automatic processing and on the initial introduction of NET-VISA results into the analyst review process.

  20. Technology for Sustained Supersonic Combustion Task Order 0006: Scramjet Research with Flight-Like Inflow Conditions

    DTIC Science & Technology

    2013-01-01

    …flight vehicle. Many facilities are not large enough to perform free-jet testing of scramjet engines which include an inlet. Rather, testing is often… (Report AFRL-RQ-WP-TR-2013-0029, Technology for Sustained Supersonic Combustion, Task Order 0006: Scramjet Research with Flight-Like Inflow Conditions)

  1. ATDRS payload technology R & D

    NASA Technical Reports Server (NTRS)

    Anzic, G.; Connolly, D. J.; Fujikawa, G.; Andro, M.; Kunath, R. R.; Sharp, G. R.

    1990-01-01

    Four technology development tasks were chosen to reduce (or at least better understand) the technology risks associated with proposed approaches to Advanced Tracking and Data Relay Satellite (ATDRS). The four tasks relate to a Tri-Band Antenna feed system, a Digital Beamforming System for the S Band Multiple-Access System (SMA), an SMA Phased Array Antenna, and a Configuration Thermal/Mechanical Analysis task. The objective, approach, and status of each are discussed.

  2. ATDRS payload technology research and development

    NASA Technical Reports Server (NTRS)

    Anzic, G.; Connolly, D. J.; Fujikawa, G.; Andro, M.; Kunath, R. R.; Sharp, G. R.

    1990-01-01

    Four technology development tasks were chosen to reduce (or at least better understand) the technology risks associated with proposed approaches to Advanced Tracking and Data Relay Satellite (ATDRS). The four tasks relate to a Tri-Band Antenna feed system, a Digital Beamforming System for the S Band Multiple Access System (SMA), an SMA Phased Array Antenna, and a Configuration Thermal/Mechanical Analysis task. The objective, approach, and status of each are discussed.

  3. ATDRS payload technology R & D

    NASA Astrophysics Data System (ADS)

    Anzic, G.; Connolly, D. J.; Fujikawa, G.; Andro, M.; Kunath, R. R.; Sharp, G. R.

    Four technology development tasks were chosen to reduce (or at least better understand) the technology risks associated with proposed approaches to Advanced Tracking and Data Relay Satellite (ATDRS). The four tasks relate to a Tri-Band Antenna feed system, a Digital Beamforming System for the S Band Multiple-Access System (SMA), an SMA Phased Array Antenna, and a Configuration Thermal/Mechanical Analysis task. The objective, approach, and status of each are discussed.

  4. Intelligent Controls for Net-Zero Energy Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Haorong; Cho, Yong; Peng, Dongming

    2011-10-30

    The goal of this project is to develop and demonstrate enabling technologies that can empower homeowners to convert their homes into net-zero energy buildings in a cost-effective manner. The project objectives and expected outcomes are as follows: • To develop rapid and scalable building information collection and modeling technologies that can obtain and process "as-built" building information in an automated or semiautomated manner. • To identify low-cost measurements and develop low-cost virtual sensors that can monitor building operations in a plug-n-play and low-cost manner. • To integrate and demonstrate low-cost building information modeling (BIM) technologies. • To develop decision support tools which can empower building owners to perform energy auditing and retrofit analysis. • To develop and demonstrate low-cost automated diagnostics and optimal control technologies which can improve building energy efficiency in a continual manner.

  5. Web 2 Technologies for Net Native Language Learners: A "Social CALL"

    ERIC Educational Resources Information Center

    Karpati, Andrea

    2009-01-01

    In order to make optimal educational use of social spaces offered by thousands of international communities in the second generation web applications termed Web 2 or Social Web, ICT competences as well as social skills are needed for both teachers and learners. The paper outlines differences in competence structures of Net Natives (who came of age…

  6. Little Experience with ICT: Are They Really the Net Generation Student-Teachers?

    ERIC Educational Resources Information Center

    So, Hyo-Jeong; Choi, Hyungshin; Lim, Wei Ying; Xiong, Yao

    2012-01-01

    The aim of this study is to investigate the complexity of past experiences with ICT, pedagogical beliefs, and attitude toward ICT in education that the Net Generation student teachers have about their intention to teach and learn with technology. This study has a particular focus on their lived experiences as school students where ICT related…

  7. E-Learning Technologies: Employing Matlab Web Server to Facilitate the Education of Mathematical Programming

    ERIC Educational Resources Information Center

    Karagiannis, P.; Markelis, I.; Paparrizos, K.; Samaras, N.; Sifaleras, A.

    2006-01-01

    This paper presents new web-based educational software (webNetPro) for "Linear Network Programming." It includes many algorithms for "Network Optimization" problems, such as shortest path problems, minimum spanning tree problems, maximum flow problems and other search algorithms. Therefore, webNetPro can assist the teaching process of courses such…
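
    The problem classes listed above are standard network-optimization tasks. For a quick illustration of what such software computes (using the networkx library rather than webNetPro itself, on a made-up five-edge graph):

    ```python
    import networkx as nx

    # Toy weighted graph covering the problem classes webNetPro addresses.
    G = nx.Graph()
    G.add_weighted_edges_from([
        ("s", "a", 4), ("s", "b", 2), ("a", "b", 1),
        ("a", "t", 5), ("b", "t", 8),
    ])

    print(nx.shortest_path(G, "s", "t", weight="weight"))            # shortest path
    print(sorted(nx.minimum_spanning_tree(G).edges(data="weight")))  # min spanning tree
    print(nx.maximum_flow_value(G.to_directed(), "s", "t",
                                capacity="weight"))                  # max flow
    ```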

  8. Back to the Future: The Practicality of Using Microsoft NetMeeting for Effective Distance Tutoring

    ERIC Educational Resources Information Center

    Legutko, Robert S.

    2007-01-01

    Background: The idea for attempting a distance tutoring project between university tutors and elementary school students using Microsoft NetMeeting was conceived: (a) to provide a new experience mentoring children for university students pursuing a teaching certificate, (b) for university students to utilize technology in pedagogy, (c) as an…

  9. Cross-Modal Retrieval With CNN Visual Features: A New Baseline.

    PubMed

    Wei, Yunchao; Zhao, Yao; Lu, Canyi; Wei, Shikui; Liu, Luoqi; Zhu, Zhenfeng; Yan, Shuicheng

    2017-02-01

    Recently, convolutional neural network (CNN) visual features have demonstrated their powerful ability as a universal representation for various recognition tasks. In this paper, cross-modal retrieval with CNN visual features is implemented with several classic methods. Specifically, off-the-shelf CNN visual features are extracted from a CNN model, pretrained on ImageNet with more than one million images from 1000 object categories, as a generic image representation to tackle cross-modal retrieval. To further enhance the representational ability of the CNN visual features, a fine-tuning step is performed on the pretrained ImageNet model for each target data set, using the open-source Caffe CNN library. In addition, we propose a deep semantic matching method to address the cross-modal retrieval problem for samples annotated with one or multiple labels. Extensive experiments on five popular publicly available data sets demonstrate the superiority of CNN visual features for cross-modal retrieval.
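
    A minimal sketch of the "off-the-shelf CNN visual features" step: the paper used a Caffe model pretrained on ImageNet, while the sketch below swaps in torchvision's pretrained ResNet-50 and takes the penultimate-layer activations as the generic image representation. The input file name is hypothetical.

    ```python
    import torch
    import torchvision.models as models
    import torchvision.transforms as T
    from PIL import Image

    # Off-the-shelf CNN visual features as a generic image representation
    # (torchvision stand-in for the Caffe model used in the paper).
    model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
    model.fc = torch.nn.Identity()      # drop the 1000-way classifier head
    model.eval()

    preprocess = T.Compose([
        T.Resize(256), T.CenterCrop(224), T.ToTensor(),
        T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
    ])

    img = Image.open("example.jpg").convert("RGB")     # hypothetical input
    with torch.no_grad():
        feature = model(preprocess(img).unsqueeze(0))  # shape: (1, 2048)
    print(feature.shape)
    ```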

  10. Negative emissions technologies and carbon capture and storage to achieve the Paris Agreement commitments.

    PubMed

    Haszeldine, R Stuart; Flude, Stephanie; Johnson, Gareth; Scott, Vivian

    2018-05-13

    How will the global atmosphere and climate be protected? Achieving net-zero CO2 emissions will require carbon capture and storage (CCS) to reduce current GHG emission rates, and negative emissions technology (NET) to recapture previously emitted greenhouse gases. Delivering NET requires radical cost and regulatory innovation to impact on climate mitigation. Present NET exemplars are few, are at small scale and not deployable within a decade, with the exception of rock weathering, or direct injection of CO2 into selected ocean water masses. To keep warming less than 2°C, bioenergy with CCS (BECCS) has been modelled but does not yet exist at industrial scale. CCS already exists in many forms and at low cost. However, CCS has no political drivers to enforce its deployment. We make a new analysis of all global CCS projects and model the build rate out to 2050, deducing this is 100 times too slow. Our projection to 2050 captures just 700 Mt CO2 yr-1, not the minimum 6000 Mt CO2 yr-1 required to meet the 2°C target. Hence new policies are needed to incentivize commercial CCS. A first urgent action for all countries is to commercially assess their CO2 storage. A second simple action is to assign a Certificate of CO2 Storage onto producers of fossil carbon, mandating a progressively increasing proportion of CO2 to be stored. No CCS means no 2°C. This article is part of the theme issue 'The Paris Agreement: understanding the physical and social challenges for a warming world of 1.5°C above pre-industrial levels'. © 2018 The Author(s).
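
    The capture gap quoted in the abstract is simple to restate as arithmetic (note the authors' "100 times too slow" figure comes from their build-rate modelling, not from this ratio alone):

    ```python
    # Illustrative arithmetic from the figures quoted in the abstract.
    projected_2050 = 700    # Mt CO2 per year captured on the projected build-out
    required_2050 = 6000    # Mt CO2 per year minimum for the 2 degree target
    shortfall = required_2050 - projected_2050
    print(f"shortfall: {shortfall} Mt CO2/yr "
          f"({required_2050 / projected_2050:.1f}x more capture needed)")
    ```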

  11. Negative emissions technologies and carbon capture and storage to achieve the Paris Agreement commitments

    NASA Astrophysics Data System (ADS)

    Haszeldine, R. Stuart; Flude, Stephanie; Johnson, Gareth; Scott, Vivian

    2018-05-01

    How will the global atmosphere and climate be protected? Achieving net-zero CO2 emissions will require carbon capture and storage (CCS) to reduce current GHG emission rates, and negative emissions technology (NET) to recapture previously emitted greenhouse gases. Delivering NET requires radical cost and regulatory innovation to impact on climate mitigation. Present NET exemplars are few, are at small-scale and not deployable within a decade, with the exception of rock weathering, or direct injection of CO2 into selected ocean water masses. To keep warming less than 2°C, bioenergy with CCS (BECCS) has been modelled but does not yet exist at industrial scale. CCS already exists in many forms and at low cost. However, CCS has no political drivers to enforce its deployment. We make a new analysis of all global CCS projects and model the build rate out to 2050, deducing this is 100 times too slow. Our projection to 2050 captures just 700 Mt CO2 yr-1, not the minimum 6000 Mt CO2 yr-1 required to meet the 2°C target. Hence new policies are needed to incentivize commercial CCS. A first urgent action for all countries is to commercially assess their CO2 storage. A second simple action is to assign a Certificate of CO2 Storage onto producers of fossil carbon, mandating a progressively increasing proportion of CO2 to be stored. No CCS means no 2°C. This article is part of the theme issue `The Paris Agreement: understanding the physical and social challenges for a warming world of 1.5°C above pre-industrial levels'.

  12. Soil carbon sequestration and biochar as negative emission technologies.

    PubMed

    Smith, Pete

    2016-03-01

    Despite 20 years of effort to curb emissions, greenhouse gas (GHG) emissions grew faster during the 2000s than in the 1990s, which presents a major challenge for meeting the international goal of limiting warming to <2 °C relative to the preindustrial era. Most recent scenarios from integrated assessment models (IAMs) require large-scale deployment of negative emissions technologies (NETs) to reach the 2 °C target. A recent analysis of NETs, including direct air capture, enhanced weathering, bioenergy with carbon capture and storage and afforestation/reforestation, showed that all NETs have significant limits to implementation, including economic cost, energy requirements, land use, and water use. In this paper, I assess the potential for negative emissions from soil carbon sequestration (SCS) and biochar addition to land, and also the potential global impacts on land use, water, nutrients, albedo, energy and cost. Results indicate that soil carbon sequestration and biochar have useful negative emission potential (each 0.7 Gt Ceq yr-1) and that they potentially have lower impact on land, water use, nutrients, albedo, energy requirement and cost, so have fewer disadvantages than many NETs. Limitations of soil carbon sequestration as a NET centre around issues of sink saturation and reversibility. Biochar could be implemented in combination with bioenergy with carbon capture and storage. Current integrated assessment models do not represent soil carbon sequestration or biochar. Given the negative emission potential of SCS and biochar and their potential advantages compared to other NETs, efforts should be made to include these options within IAMs, so that their potential can be explored further in comparison with other NETs for climate stabilization. © 2016 John Wiley & Sons Ltd.

  13. Solutions for Digital Video Transmission Technology Final Report CRADA No. TC02068.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, A. T.; Rivers, W.

    This project aimed at the development of software for seismic data processing based on the Geotool code developed by the American company Multimax, Inc. The Geotool code was written in the early 1990s for the UNIX platform. Under Project #2821, functions of the old Geotool code were transferred into a commercial version for the Microsoft XP and Vista platform, with the addition of new capabilities for visualization and data processing. The developed new version, Geotool+, was implemented using the up-to-date tool Microsoft Visual Studio 2005 and uses capabilities of the .NET platform. C++ was selected as the main programming language for Geotool+. The two-year project was extended by six months and funding levels increased from $600,000 to $670,000. All tasks were successfully completed and all deliverables were met for the project, even though both the industrial partner and the LLNL principal investigator left the project before its final report.

  14. A Roadmap to School Improvement: A Strategic Plan for Educational Technology in Missouri. The Report of the Missouri Technology Task Force.

    ERIC Educational Resources Information Center

    Missouri School Boards Association, Columbia.

    The strategic plan for educational technology was developed by the Missouri Technology Task Force to assist state and local authorities in the creative application and appropriate integration of all technologies to achieve the broad educational goals for elementary and secondary Missouri schools. The specific goals and objectives of the plan…

  15. Driver strategies for engaging in distracting tasks using in-vehicle technologies

    DOT National Transportation Integrated Search

    2008-03-01

    This project investigated the decision process involved in a driver's willingness to engage in various technology-related and non-technology tasks. Previous research focused on how well drivers are able to drive while engaged in potentially distrac...

  16. Towards a Usability and Error "Safety Net": A Multi-Phased Multi-Method Approach to Ensuring System Usability and Safety.

    PubMed

    Kushniruk, Andre; Senathirajah, Yalini; Borycki, Elizabeth

    2017-01-01

    The usability and safety of health information systems have become major issues in the design and implementation of useful healthcare IT. In this paper we describe a multi-phased multi-method approach to integrating usability engineering methods into system testing to ensure both usability and safety of healthcare IT upon widespread deployment. The approach involves usability testing followed by clinical simulation (conducted in-situ) and "near-live" recording of user interactions with systems. At key stages in this process, usability problems are identified and rectified forming a usability and technology-induced error "safety net" that catches different types of usability and safety problems prior to releasing systems widely in healthcare settings.

  17. Observations Concerning the National Task Force on Educational Technology Report: "Reducing the Risk to the Nation."

    ERIC Educational Resources Information Center

    Salser, Carl

    1987-01-01

    Presents one educator's reaction to the National Task Force on Educational Technology's report. Provides responses to certain passages from the report's eight sections. Emphasizes that educators should look for ways to implement existing information-age technology in addition to more traditional forms of education technologies. (TW)

  18. Optimization of armored fighting vehicle crew performance in a net-centric battlefield

    NASA Astrophysics Data System (ADS)

    McKeen, William P.; Espenant, Mark

    2002-08-01

    Traditional display, control and situational awareness technologies may not allow the fighting vehicle commander to take full advantage of the rich data environment made available in the net-centric battlefield of the future. Indeed, the sheer complexity and volume of available data, if not properly managed, may actually reduce crew performance by overloading or confusing the commander with irrelevant information. New techniques must be explored to understand how to present battlefield information and provide the commander with continuous high-quality situational awareness without significant cognitive overhead. Control of the vehicle's many complex systems must also be addressed; the entire Soldier Machine Interface must be optimized if we are to realize the potential performance improvements. Defence Research and Development Canada (DRDC) and General Dynamics Canada Ltd. have embarked on a joint program, called the Future Armoured Fighting Vehicle Systems Technology Demonstrator, to explore these issues. The project is based on man-in-the-loop experimentation using virtual reality technology on a six degree-of-freedom motion platform that simulates the motion, sights and sounds inside a future armoured vehicle. The vehicle commander is provided with a virtual reality vision system to view a simulated 360-degree multi-spectrum representation of the battlespace, thus providing enhanced situational awareness. Graphic overlays with decision-aid information will be added to reduce cognitive loading. Experiments will be conducted to evaluate the effectiveness of virtual control systems. The simulations are carried out in a virtual battlefield created by linking our simulation system with other simulation centers to provide a net-centric battlespace where enemy forces can be engaged in firefights. Survivability and lethality will be measured in successive test sequences using real armoured fighting vehicle crews to optimize overall system effectiveness.

  19. NASA Net Zero Energy Buildings Roadmap

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pless, S.; Scheib, J.; Torcellini, P.

    In preparation for the time-phased net zero energy requirement for new federal buildings starting in 2020, set forth in Executive Order 13514, NASA requested that the National Renewable Energy Laboratory (NREL) develop a roadmap for NASA's compliance. NASA detailed a Statement of Work that requested information on strategic, organizational, and tactical aspects of net zero energy buildings. In response, this document presents a high-level approach to net zero energy planning, design, construction, and operations, based on NREL's first-hand experience procuring net zero energy construction, and on NREL and other industry research on net zero energy feasibility. The strategic approach to net zero energy starts with an interpretation of the executive order language relating to net zero energy. Specifically, this roadmap defines a net zero energy acquisition process as one that sets an aggressive energy use intensity goal for the building in project planning, meets the reduced demand goal through energy efficiency strategies and technologies, then adds renewable energy in a prioritized manner, using building-associated, emission-free sources first, to offset the annual energy use required at the building; the net zero energy process extends through the life of the building, requiring a balance of energy use and production in each calendar year.

  20. Examining the functionality of the DeLone and McLean information system success model as a framework for synthesis in nursing information and communication technology research.

    PubMed

    Booth, Richard G

    2012-06-01

    This review examined studies of information and communication technology used by nurses in clinical practice. Overall, 39 studies spanning 1995 to 2008 were assessed. The impacts of the various health information and communication technologies evaluated by individual studies were synthesized using DeLone and McLean's six-dimensional framework for evaluating information systems success (ie, System Quality, Information Quality, Service Quality, Use, User Satisfaction, and Net Benefits). Overall, the majority of researchers reported results related to the overall Net Benefits (positive, negative, and indifferent) of the health information and communication technology used by nurses. Attitudes and user satisfaction with technology were also commonly measured attributes. The current iteration of the DeLone and McLean model is effective at synthesizing basic elements of health information and communication technology use by nurses. Regardless, the current model lacks the sociotechnical sensitivity to capture deeper nurse-technology relationalities. Limitations and recommendations are provided for researchers considering using the DeLone and McLean model for evaluating health information and communication technology used by nurses.
