Yee, Kwang Chien; Mills, Erin; Airey, Caroline
2008-01-01
The current healthcare delivery model will not meet future healthcare demands. The only sustainable healthcare future is one that best leverages advances in technology to improve productivity and efficiency. Information communication technology (ICT) has, therefore, been touted as the panacea of future healthcare challenges. Many ICT projects in healthcare, however, fail to deliver on their promises to transform the healthcare system. From a technologist's perspective, this is often due to the lack of socio-technical consideration. From a socio-cultural perspective, however, there is often strong inertia to change. While the utilisation of user-centred design principles will generate a new wave of enthusiasm among technologists, this has to be matched with socio-cultural changes within the healthcare system. Generation Y healthcare workers might be the socio-cultural factor required, in combination with new technology, to transform the healthcare system. Generation Y has generated significant technology-driven changes in many other industries. The socio-cultural understanding of generation Y healthcare workers is essential to guide the design and implementation of ICT solutions for a sustainable healthcare future. This paper presents the initial analysis of our qualitative study which aims to generate in-depth conceptual insights of generation Y healthcare workers and their view of ICT in healthcare. Our results show that generation Y healthcare workers might assist future ICT implementation in healthcare. This paper, however, argues that significant changes to the current healthcare organisation will be required in order to unleash the full potential of generation Y workers and ICT implementation. Finally, this paper presents some strategies to empower generation Y workers as change agents for a sustainable future healthcare system.
Electronic Resources in Science and Technology: Gopher and Its Future.
ERIC Educational Resources Information Center
Weiner, Suzanne T., Ed.
1996-01-01
An Associate Head of Information Services and the Internet Gopher project leader discuss the future of Gopher with the arrival of the World Wide Web. Strengths and weaknesses of both systems are addressed. One expert sees a future with new versions of both; the other predicts a next generation of information systems combining their features. (PEN)
Leading into the future: coaching and mentoring Generation X employees.
Weston, M J
2001-09-01
Managers who recognize that Generation X employees are looking for workplaces that allow them to develop their competencies as well as have a balance in their personal and professional lives are more successful in attracting and retaining employees in this age group. Savvy managers understand that adapting to meet the needs of Generation X employees also assists the manager in transitioning into the Information Age and the workplace of the future.
0-6760 : improved trip generation data for Texas using workplace and special generator surveys.
DOT National Transportation Integrated Search
2014-08-01
Trip generation rates play an important role in transportation planning, which can help in making informed decisions about future transportation investment and design. However, sometimes the rates are derived from small sample sizes or may ...
(Some) Computer Futures: Mainframes.
ERIC Educational Resources Information Center
Joseph, Earl C.
Possible futures for the world of mainframe computers can be forecast through studies identifying forces of change and their impact on current trends. Some new prospects for the future have been generated by advances in information technology; for example, recent United States successes in applied artificial intelligence (AI) have created new…
Cordonnier, Aline; Barnier, Amanda J; Sutton, John
2016-01-01
Research on future thinking has emphasized how episodic details from memories are combined to create future thoughts, but has not yet examined the role of semantic scripts. In this study, participants recalled how they planned a past camping trip in Australia (past planning task) and imagined how they would plan a future camping trip (future planning task), set either in a familiar (Australia) or an unfamiliar (Antarctica) context. Transcripts were segmented into information units that were coded according to semantic category (e.g., where, when, transport, material, actions). Results revealed a strong interaction between tasks and their presentation order. Starting with the past planning task constrained the future planning task when the context was familiar. Participants generated no new information when the future camping trip was set in Australia and completed second (after the past planning task). Conversely, starting with the future planning task facilitated the past planning task. Participants recalled more information units of their past plan when the past planning task was completed second (after the future planning task). These results shed new light on the role of scripts in past and future thinking and on how past and future thinking processes interact.
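As an illustration of the coding scheme described above, the sketch below (Python, with hypothetical category labels and unit lists) tallies information units per semantic category and counts how many units in a future-planning transcript are new relative to the past-planning transcript. It is a minimal illustration of the analysis logic, not the authors' actual coding pipeline.

```python
from collections import Counter

# Hypothetical coded transcripts: each information unit is a (category, content) pair.
past_plan = [("where", "national park campsite"), ("transport", "drive the van"),
             ("material", "tent and sleeping bags"), ("actions", "book the site online")]
future_plan = [("where", "national park campsite"), ("transport", "drive the van"),
               ("when", "during the summer holidays"), ("material", "tent and sleeping bags")]

def units_per_category(units):
    """Count information units in each semantic category (where, when, transport, ...)."""
    return Counter(category for category, _ in units)

def new_units(later_task, earlier_task):
    """Units produced in the later task that did not appear in the earlier task."""
    earlier = set(earlier_task)
    return [u for u in later_task if u not in earlier]

print(units_per_category(past_plan))      # units per semantic category in the past plan
print(new_units(future_plan, past_plan))  # units unique to the future-planning task
```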
This Is Your Future: A Case Study Approach to Foster Health Literacy
ERIC Educational Resources Information Center
Brey, Rebecca A.; Clark, Susan E.; Wantz, Molly S.
2008-01-01
Today's young people seem to live in an even more fast-paced society than previous generations. As in the past, they are involved in sports, music, school, church, work, and are exposed to many forms of mass media that add to their base of information. However, they also have instant access to computer-generated information such as the Internet,…
Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)
NASA Astrophysics Data System (ADS)
Blasch, Erik; Waltz, Ed
2016-05-01
Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics from raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed which are examined towards matching analyst opportunities with recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Upshall, I.R.; McCarthy, G.J.
A contextual framework comprises 'entities' that exhibit one or more definable relationships with a particular 'event'. People, organisations, concepts, ideas, places, natural phenomena, events themselves, cultural artefacts including records, books, works of art can all be conceptualised as entities. If these entities are registered in an information management system where the relationships between them can be defined and systematically managed then it is possible to create a contextual information framework that represents a particular view of what occurs in real life. The careful identifying and mapping of the relationships between these entities and the selected event can lead rapidly to the creation of an information network that closely reflects the human approach to knowledge acquisition and application. The 'event' referred to in this paper is the safe management of radioactive waste. It is widely accepted that society will expect that knowledge about the waste will be maintained for many decades, if not centuries. Delivering on this expectation will demand the application of management approaches that are both innovative and sustainable. Effective inter-generational transfer of information using many 'conventional' techniques will be highly dependent on societal stability - something that cannot be guaranteed over such long periods of time. Consequently, alternative approaches should be explored and, where appropriate, implemented to give reasonable assurance that future generations of waste custodians will not be unduly burdened by the need to recreate information about the waste long after its disposal. In actual fact, the contextual information framework model is not 'new technology' but simply a means for rationalising and representing the way humans naturally tend to use information in the pursuit of knowledge enhancement. By making use of multiple information entities and their relationships, it is often possible to convert otherwise impossibly complex socio-technical environments into information architectures or networks with remarkable and useful properties. The International Atomic Energy Agency, in its ongoing work to encourage the application of systems to manage radioactive waste information over the long term, has embraced the contextual information framework as a potentially viable approach to this particular challenge. To this end, it invited Member States to contribute to the production of a Safety Report that used the contextual information framework model, building on the wealth of existing IAEA guidance. The report focuses, not on the important area of records management, but on the benefits that can arise from the development of an information management approach that increases the likelihood that future generations will recognise the significance and value of the information contained in these records. Our understanding of 'inter-generational transfer' should extend beyond the simple physical transfer of records into an archival repository towards the establishment of a working culture that places sufficient contemporary information into a form that ensures it remains accessible, and ultimately enhances, the knowledge of future generations. Making information accessible is therefore the key and whilst the use of stable records media, storage environments and quality assurance are important elements, they cannot be considered solutions in themselves.
This paper articulates some of the lessons that have been learned about using the contextual information framework model when applied to the long term management of radioactive waste. The draft IAEA Safety Report entitled 'Preservation and Transfer to Future Generations of Information Important to the Safety of Waste Disposal Facilities', on which this paper is based, is expected to be published in 2007. (authors)
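A contextual information framework of this kind is essentially a typed graph of entities and relationships anchored on an event. The sketch below is a minimal, hypothetical illustration in Python (the entity names, relationship labels, and registry structure are assumptions, not the IAEA model); it shows how registering entities and relationships lets later users traverse from the event to related records.

```python
# Minimal sketch of a contextual information framework: entities linked by named
# relationships to a central event ("safe management of radioactive waste").
# Entity names and relationship labels are illustrative assumptions.
relationships = []  # list of (subject, relationship, object) triples

def register(subject, relationship, obj):
    relationships.append((subject, relationship, obj))

register("Waste package WP-001", "is documented by", "Disposal record R-17")
register("Disposal record R-17", "was created by", "Site operator")
register("Repository site", "contains", "Waste package WP-001")
register("Safe management of radioactive waste", "concerns", "Repository site")

def related_to(entity):
    """Traverse one step outward from an entity, mimicking knowledge discovery."""
    return [(r, o) for s, r, o in relationships if s == entity] + \
           [(r, s) for s, r, o in relationships if o == entity]

# A future custodian starting from the event can discover the site, packages and records.
print(related_to("Safe management of radioactive waste"))
```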
President's Information Technology Advisory Committee Interim Report to the President.
ERIC Educational Resources Information Center
National Coordination Office for Information Technology Research and Development, Arlington, VA.
This document is the Interim Report on future directions for Federal support of research and development in high performance computing, communications, information technology, and the Next Generation Internet. This report provides a more detailed explanation of the findings and recommendations summarized by the President's Information Technology…
Conservation and retrieval of information
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jensen, M.
This is a summary of the findings of a Nordic working group formed in 1990 and given the task of establishing a basis for a common Nordic view of the need for information conservation for nuclear waste repositories by investigating the following: (1) the type of information that should be conserved; (2) the form in which the information should be kept; (3) the quality of the information as regards both type and form; and (4) the problems of future retrieval of information, including retrieval after very long periods of time. High-level waste from nuclear power generation will remain radioactive for very long times even though the major part of the radioactivity will have decayed within 1000 yr. Certain information about the waste must be kept for long time periods because future generations may, intentionally or inadvertently, come into contact with the radioactive waste. Current day waste management would benefit from an early identification of documents to be part of an archive for radioactive waste repositories. The same reasoning is valid for repositories for other toxic wastes.
Critical review: Uncharted waters? The future of the electricity-water nexus.
Sanders, Kelly T
2015-01-06
Electricity generation often requires large amounts of water, most notably for cooling thermoelectric power generators and moving hydroelectric turbines. This so-called "electricity-water nexus" has received increasing attention in recent years by governments, nongovernmental organizations, industry, and academics, especially in light of increasing water stress in many regions around the world. Although many analyses have attempted to project the future water requirements of electricity generation, projections vary considerably due to differences in temporal and spatial boundaries, modeling frameworks, and scenario definitions. This manuscript is intended to provide a critical review of recent publications that address the future water requirements of electricity production and define the factors that will moderate the water requirements of the electric grid moving forward to inform future research. The five variables identified include changes in (1) fuel consumption patterns, (2) cooling technology preferences, (3) environmental regulations, (4) ambient climate conditions, and (5) electric grid characteristics. These five factors are analyzed to provide guidance for future research related to the electricity-water nexus.
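To make the dependence on fuel mix and cooling technology concrete, the sketch below estimates the water withdrawal of a hypothetical generation portfolio as generation multiplied by a per-technology withdrawal intensity. The intensity values are illustrative placeholders only (real factors vary widely by plant, cooling system, and region) and are not taken from the review.

```python
# Illustrative water-withdrawal estimate: water = sum(generation_MWh * intensity_gal_per_MWh).
# Intensity values below are rough placeholders for illustration, not measured data.
withdrawal_intensity_gal_per_mwh = {
    ("coal", "once-through"): 30_000,
    ("coal", "recirculating"): 1_000,
    ("gas_cc", "recirculating"): 250,
    ("wind", "none"): 0,
    ("solar_pv", "none"): 0,
}

generation_mwh = {  # hypothetical annual generation by (fuel, cooling technology)
    ("coal", "once-through"): 1_000_000,
    ("gas_cc", "recirculating"): 2_000_000,
    ("wind", "none"): 1_500_000,
}

total_withdrawal_gal = sum(
    mwh * withdrawal_intensity_gal_per_mwh[tech] for tech, mwh in generation_mwh.items()
)
print(f"Estimated withdrawal: {total_withdrawal_gal:,} gallons")
```

Shifting the same demand from once-through coal to gas or wind in this toy portfolio changes the total by orders of magnitude, which is exactly why the five factors listed in the review dominate projections.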
RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system
Jensen, Tue V.; Pinson, Pierre
2017-01-01
Future highly renewable energy systems will couple to complex weather and climate dynamics. This coupling is generally not captured in detail by the open models developed in the power and energy system communities, where such open models exist. To enable modeling such a future energy system, we describe a dedicated large-scale dataset for a renewable electric power system. The dataset combines a transmission network model, as well as information for generation and demand. Generation includes conventional generators with their technical and economic characteristics, as well as weather-driven forecasts and corresponding realizations for renewable energy generation for a period of 3 years. These may be scaled according to the envisioned degrees of renewable penetration in a future European energy system. The spatial coverage, completeness and resolution of this dataset open the door to the evaluation, scaling analysis and replicability check of a wealth of proposals in, e.g., market design, network actor coordination and forecasting of renewable power generation. PMID:29182600
RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system.
Jensen, Tue V; Pinson, Pierre
2017-11-28
Future highly renewable energy systems will couple to complex weather and climate dynamics. This coupling is generally not captured in detail by the open models developed in the power and energy system communities, where such open models exist. To enable modeling such a future energy system, we describe a dedicated large-scale dataset for a renewable electric power system. The dataset combines a transmission network model, as well as information for generation and demand. Generation includes conventional generators with their technical and economic characteristics, as well as weather-driven forecasts and corresponding realizations for renewable energy generation for a period of 3 years. These may be scaled according to the envisioned degrees of renewable penetration in a future European energy system. The spatial coverage, completeness and resolution of this dataset open the door to the evaluation, scaling analysis and replicability check of a wealth of proposals in, e.g., market design, network actor coordination and forecasting of renewable power generation.
RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system
NASA Astrophysics Data System (ADS)
Jensen, Tue V.; Pinson, Pierre
2017-11-01
Future highly renewable energy systems will couple to complex weather and climate dynamics. This coupling is generally not captured in detail by the open models developed in the power and energy system communities, where such open models exist. To enable modeling such a future energy system, we describe a dedicated large-scale dataset for a renewable electric power system. The dataset combines a transmission network model, as well as information for generation and demand. Generation includes conventional generators with their technical and economic characteristics, as well as weather-driven forecasts and corresponding realizations for renewable energy generation for a period of 3 years. These may be scaled according to the envisioned degrees of renewable penetration in a future European energy system. The spatial coverage, completeness and resolution of this dataset open the door to the evaluation, scaling analysis and replicability check of a wealth of proposals in, e.g., market design, network actor coordination and forecasting of renewable power generation.
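The dataset's renewable signals can be rescaled to represent different penetration levels, as the abstract notes. The sketch below (Python with pandas) illustrates one simple way to do this on synthetic stand-in data; the column layout and scaling rule are assumptions for illustration and do not reflect the actual RE-Europe file structure.

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for hourly signals (columns = network nodes); the real
# RE-Europe files and layout differ, this only illustrates the scaling step.
hours = pd.date_range("2014-01-01", periods=24, freq="h")
nodes = ["n1", "n2", "n3"]
rng = np.random.default_rng(0)
wind = pd.DataFrame(rng.uniform(0, 50, (24, 3)), index=hours, columns=nodes)
solar = pd.DataFrame(rng.uniform(0, 30, (24, 3)), index=hours, columns=nodes)
demand = pd.DataFrame(rng.uniform(80, 120, (24, 3)), index=hours, columns=nodes)

target_penetration = 0.6  # desired average share of demand met by wind + solar
current_share = (wind.values.sum() + solar.values.sum()) / demand.values.sum()
scale = target_penetration / current_share

residual_load = demand - scale * (wind + solar)  # load left for conventional generators
print(f"Scaling factor applied to renewable signals: {scale:.2f}")
```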
Information Literacy in the Workplace: A Qualitative Exploratory Study
ERIC Educational Resources Information Center
Crawford, John; Irving, Christine
2009-01-01
Although increasingly recognized as a future skills issue, the use of information in the workplace is a little studied area within library and information research. A substantial "pedagogic" literature of learning in the workplace exists, however, and this was critically reviewed to generate a repertoire of issues which could in turn be…
Brainstorming Design for Health: Helping Patients Utilize Patient-Generated Information on the Web
Huh, Jina; Hartzler, Andrea; Munson, Sean; Anderson, Nick; Edwards, Kelly; Gore, John L.; McDonald, David; O’Leary, Jim; Parker, Andrea; Streat, Derek; Yetisgen-Yildiz, Meliha; Pratt, Wanda; Ackerman, Mark S.
2013-01-01
Researchers and practitioners show increasing interest in utilizing patient-generated information on the Web. Although the HCI and CSCW communities have provided many exciting opportunities for exploring new ideas and building a broad agenda in health, few venues offer a platform for interdisciplinary and collaborative brainstorming about design challenges and opportunities in this space. The goal of this workshop is to provide participants with opportunities to interact with stakeholders from diverse backgrounds and practices—researchers, practitioners, designers, programmers, and ethnographers—and together generate tangible design outcomes that utilize patient-generated information on the Web. Through small multidisciplinary group work, we will provide participants with new collaboration opportunities, understanding of the state of the art, inspiration for future work, and ideally avenues for continuing to develop research and design ideas generated at the workshop. PMID:24499843
High Renewable Generation | Energy Analysis | NREL
Featured studies: Eastern Renewable Generation Integration Study; Renewable Electricity Futures Study; North American Renewable Integration Study. Data and tools are documented in the Transparent Cost Database / Open Energy Information. Publications: SunShot Vision Study.
The Dubious Promise of Educational Technologies: Historical Patterns and Future Challenges
ERIC Educational Resources Information Center
Cuban, Larry; Jandric, Petar
2015-01-01
In this article, Larry Cuban discusses his ideas about the topic of this Special Issue of E-learning and Digital Media "Networked Realms and Hoped-For Futures: A Trans-Generational Dialogue" with one of its co-editors, Petar Jandric. The conversation explores the historical relationships between education and information and…
NASA Astrophysics Data System (ADS)
Arbab-Zavar, B.; Chakravarthy, A.; Sabeur, Z. A.
2012-04-01
The rapid development of advanced smart communication tools with good quality and resolution video cameras, audio and GPS devices in the last few years shall lead to profound impacts on the way future environmental observations are conducted and accessed by communities. The resulting large scale interconnections of these "Future Internet Things" form a large environmental sensing network which will generate large volumes of quality environmental observations at highly localised spatial scales. This enablement in environmental sensing at local scales will be of great importance for contributing to the study of fauna and flora in the near future, particularly on the effect of climate change on biodiversity in various regions of Europe and beyond. The Future Internet could also potentially become the de facto information space to provide participative real-time sensing by communities and improve our situational awareness of the effect of climate on local environments. In the ENVIROFI (2011-2013) Usage Area project in the FP7 FI-PPP programme, a set of requirements for specific (and generic) enablers is achieved with the potential establishment of participating community observatories of the future. In particular, the specific enablement of interest concerns the building of future interoperable services for the management of environmental data intelligently with tagged contextual geo-spatial information generated by multiple operators in communities (using smart phones). The classification of observed species in the resulting images is achieved with structured data pre-processing, semantic enrichment using contextual geospatial information, and high-level fusion with controlled uncertainty estimations. The returned identification of species is further improved using future ground truth corrections and learning by the specific enablers.
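The "high-level fusion with controlled uncertainty estimations" mentioned above can be illustrated, in simplified Bayesian form, as combining an image-based classifier likelihood with a geospatial prior over species occurrence. This schematic formulation is an assumption made for illustration and is not quoted from the ENVIROFI work.

```latex
P(s \mid I, g) \;\propto\; P(I \mid s)\, P(s \mid g),
```

where s is a candidate species, I the submitted image, and g the tagged geospatial context (location, habitat, season) used as the prior.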
Preparing for Change in the Federal Information Technology Workforce
2008-01-01
Baby Boom Generation, born 1946 to 1964; Generation X (or Gen X), born 1965 to 1977; and the Net Generation (also called Generation Y or the Millennials) ... Management and others, the future workforce can be characterized as: • More diverse, as measured by ethnicity, age, race, religion, family background ... they cannot get a seat at the table to get their views heard. [Recovered table fragment, approximate ages by cohort: Greatest Generation 64–84, Baby Boomers 45–63, Gen-X 32–44, Net-Gen (Millennials) 19 ...]
Intergenerational equity and long-term stewardship plans.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hocking, E. K.
2002-02-05
For an untold number of contaminated sites throughout the world, stewardship will be inevitable. For many such sites, stewardship will be a reasonable approach because of the uncertainties associated with present and future site conditions and site contaminants, the limited performance of available technologies, the nonavailability of technologies, and the risk and cost associated with complete cleanup. Regardless of whether stewardship is a realistic approach to site situations or simply a convenient default, it could be required at most contaminated sites for multiple generations. Because the stewardship plan is required to protect against the release of hazardous contaminants to the environment, some use restrictions will be put in place to provide that protection. These use restrictions will limit access to resources for as long as the protection is required. The intergenerational quality of long-term stewardship plans and their inherent limitations on resource use require that they be designed to achieve equity among the affected generations. Intergenerational equity, defined here as the fairness of access to resources across generations, could be achieved through a well-developed stewardship plan that provides future generations with the information they need to make wise decisions about resource use. Developing and implementing such a plan would take into account the failure mechanisms of the plan's components, feature short stewardship time blocks that would allow for periodic reassessments of the site and of the stewardship program's performance, and provide present and future generations with necessary site information.
One-Step Device Converts Water, Sunlight Into Fuel of the Future
... the world's most abundant resources, water and sunlight, to directly generate hydrogen, a non... great promise that through further research the technology can bring down the cost of using water and ...
What's Next After You Say Hello: First Steps in Mentoring
ERIC Educational Resources Information Center
Hogue, William F.; Pringle, Ernest M.
2005-01-01
In most cultures wisdom, knowledge, and experience are prized assets. Those who possess them are held in high regard and are expected to share them with the next generation. So it is in the world of information technology (IT). Veteran IT professionals are often charged with identifying and developing future IT leaders, while future leaders often…
ERIC Educational Resources Information Center
Romeo, Geoff; Lloyd, Margaret; Downes, Toni
2012-01-01
The "Teaching Teachers for the Future" (TTF) project is a unique nationally significant project funded by the Australian Government through the Department of Employment, Education and Workplace Relations (DEEWR, Au$8.8 million) and the Information and Communication Technology Innovation Fund (ICTIF). This 2011-2012 project has…
Flexibility decline contributes to similarity of past and future thinking in Alzheimer's disease.
El Haj, Mohamad; Antoine, Pascal; Kapogiannis, Dimitrios
2015-11-01
A striking similarity has been suggested between past and future thinking in Alzheimer's Disease (AD), a similarity attributable to abnormalities in common modular cognitive functions and neuroanatomical substrates. This study extends this literature by identifying specific executive function deficits underlying past and future thinking in AD. Twenty-four participants with a clinical diagnosis of probable (mild) AD and 26 older controls generated past and future events and underwent tests of binding and the executive functions of flexibility, inhibition, and updating. AD patients showed similar autobiographical performances in past and future event generation, and so did control participants. In each group, the similarity of past and future thinking was predicted by flexibility. Furthermore, AD patients with low flexibility showed higher similarity of past and future thinking than those with high flexibility. These findings are interpreted in terms of involvement of the hippocampus and frontal lobes in future thinking. Deficits in these brain regions in AD are likely to compromise the ability to recombine episodic information into novel and flexible configurations as scenarios for the future. © 2015 Wiley Periodicals, Inc.
Flexibility Decline Contributes to Similarity of Past and Future Thinking in Alzheimer’s Disease
El Haj, Mohamad; Antoine, Pascal; Kapogiannis, Dimitrios
2017-01-01
A striking similarity has been suggested between past and future thinking in Alzheimer’s Disease (AD), a similarity attributable to abnormalities in common modular cognitive functions and neuroanatomical substrates. This study extends this literature by identifying specific executive function deficits underlying past and future thinking in AD. Twenty-four participants with a clinical diagnosis of probable (mild) AD and 26 older controls generated past and future events and underwent tests of binding and the executive functions of flexibility, inhibition, and updating. AD patients showed similar autobiographical performances in past and future event generation, and so did control participants. In each group, the similarity of past and future thinking was predicted by flexibility. Furthermore, AD patients with low flexibility showed higher similarity of past and future thinking than those with high flexibility. These findings are interpreted in terms of involvement of the hippocampus and frontal lobes in future thinking. Deficits in these brain regions in AD are likely to compromise the ability to recombine episodic information into novel and flexible configurations as scenarios for the future. PMID:25850800
Integrated information management and hospital libraries.
Buchanan, H S; Fazzone, N
1985-01-01
It is demonstrated that hospitals are information-dependent and that there is need for integration of information generated and gathered through their subsystems. This paper discusses recommendations of the Matheson Report for an integrated information management system which would link these subsystems. The library's statement of mission, means for self-assessment, and analysis of information needs and uses are explored. Future directions with examples of new roles for the library are outlined. PMID:3978295
Open-Source Programming for Automated Generation of Graphene Raman Spectral Maps
NASA Astrophysics Data System (ADS)
Vendola, P.; Blades, M.; Pierre, W.; Jedlicka, S.; Rotkin, S. V.
Raman microscopy is a useful tool for studying the structural characteristics of graphene deposited onto substrates. However, extracting useful information from the Raman spectra requires data processing and 2D map generation. An existing home-built confocal Raman microscope was optimized for graphene samples and programmed to automatically generate Raman spectral maps across a specified area. In particular, an open source data collection scheme was generated to allow the efficient collection and analysis of the Raman spectral data for future use. NSF ECCS-1509786.
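A Raman spectral map of the kind described is typically built by raster-scanning the sample stage, acquiring a spectrum at each grid point, and reducing each spectrum to a scalar such as the intensity of graphene's 2D band near 2700 cm^-1. The sketch below shows that control flow in Python; the stage and spectrometer functions are hypothetical placeholders, not the group's actual open-source instrument code.

```python
import numpy as np

def move_stage(x_um, y_um):
    """Placeholder for moving the microscope stage to (x, y) in micrometres."""
    ...

def acquire_spectrum():
    """Placeholder returning (raman_shift_cm1, intensity) arrays from the spectrometer."""
    shift = np.linspace(1000, 3200, 2201)
    return shift, np.random.default_rng().random(shift.size)

def band_intensity(shift, intensity, center=2700.0, width=50.0):
    """Integrate intensity in a window around a band (graphene 2D band ~2700 cm^-1)."""
    mask = np.abs(shift - center) < width
    return float(np.trapz(intensity[mask], shift[mask]))

nx, ny, step_um = 20, 20, 1.0
raman_map = np.zeros((ny, nx))
for j in range(ny):          # raster scan over the specified area
    for i in range(nx):
        move_stage(i * step_um, j * step_um)
        shift, intensity = acquire_spectrum()
        raman_map[j, i] = band_intensity(shift, intensity)
# raman_map can now be saved or plotted as a 2D image of the scanned area.
```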
GalaxyGAN: Generative Adversarial Networks for recovery of galaxy features
NASA Astrophysics Data System (ADS)
Schawinski, Kevin; Zhang, Ce; Zhang, Hantian; Fowler, Lucas; Krishnan Santhanam, Gokula
2017-02-01
GalaxyGAN uses Generative Adversarial Networks to reliably recover features in images of galaxies. The package uses machine learning to train on higher quality data and learns to recover detailed features such as galaxy morphology by effectively building priors. This method opens up the possibility of recovering more information from existing and future imaging data.
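The recovery task can be framed as conditional image-to-image translation with an adversarial objective. The schematic objective below (a common conditional-GAN formulation with an added pixel-wise term) is given for orientation only; it is not quoted from the GalaxyGAN paper, and the weighting term lambda is an assumption.

```latex
\min_G \max_D \; \mathbb{E}_{x,y}\big[\log D(x, y)\big]
 + \mathbb{E}_{x}\big[\log\big(1 - D(x, G(x))\big)\big]
 + \lambda\, \mathbb{E}_{x,y}\big[\lVert y - G(x) \rVert_1\big]
```

Here x is the degraded input image, y the higher-quality training counterpart, G the generator that recovers galaxy features, and D the discriminator that enforces realism; the learned prior comes from training on the higher-quality data.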
NASA's Radioisotope Power Systems Planning and Potential Future Systems Overview
NASA Technical Reports Server (NTRS)
Zakrajsek, June F.; Woerner, Dave F.; Cairns-Gallimore, Dirk; Johnson, Stephen G.; Qualls, Louis
2016-01-01
The goal of NASA's Radioisotope Power Systems (RPS) Program is to make RPS ready and available to support the exploration of the solar system in environments where the use of conventional solar or chemical power generation is impractical or impossible to meet the needs of the missions. To meet this goal, the RPS Program, working closely with the Department of Energy, performs mission and system studies (such as the recently released Nuclear Power Assessment Study), assesses the readiness of promising technologies to infuse in future generators, assesses the sustainment of key RPS capabilities and knowledge, forecasts and tracks the Program's budgetary needs, and disseminates current information about RPS to the community of potential users. This process has been refined and used to determine the current content of the RPS Program's portfolio. This portfolio currently includes an effort to mature advanced thermoelectric technology for possible integration into an enhanced Multi-Mission Radioisotope Generator (eMMRTG), sustainment and production of the currently deployed MMRTG, and technology investments that could lead to a future Stirling Radioisotope Generator (SRG). This paper describes the program planning processes that have been used, the currently available MMRTG, and one of the potential future systems, the eMMRTG.
NASA's Radioisotope Power Systems Planning and Potential Future Systems Overview
NASA Technical Reports Server (NTRS)
Zakrajsek, June F.; Woerner, Dave F.; Cairns-Gallimore, Dirk; Johnson, Stephen G.; Qualls, Louis
2016-01-01
The goal of NASA's Radioisotope Power Systems (RPS) Program is to make RPS ready and available to support the exploration of the solar system in environments where the use of conventional solar or chemical power generation is impractical or impossible to meet the needs of the missions. To meet this goal, the RPS Program, working closely with the Department of Energy, performs mission and system studies (such as the recently released Nuclear Power Assessment Study), assesses the readiness of promising technologies to infuse in future generators, assesses the sustainment of key RPS capabilities and knowledge, forecasts and tracks the Program's budgetary needs, and disseminates current information about RPS to the community of potential users. This process has been refined and used to determine the current content of the RPS Program's portfolio. This portfolio currently includes an effort to mature advanced thermoelectric technology for possible integration into an enhanced Multi-Mission Radioisotope Generator (eMMRTG), sustainment and production of the currently deployed MMRTG, and technology investments that could lead to a future Stirling Radioisotope Generator (SRG). This paper describes the program planning processes that have been used, the currently available MMRTG, and one of the potential future systems, the eMMRTG.
From Sensor to Observation Web with environmental enablers in the Future Internet.
Havlik, Denis; Schade, Sven; Sabeur, Zoheir A; Mazzetti, Paolo; Watson, Kym; Berre, Arne J; Mon, Jose Lorenzo
2011-01-01
This paper outlines the grand challenges in global sustainability research and the objectives of the FP7 Future Internet PPP program within the Digital Agenda for Europe. Large user communities are generating significant amounts of valuable environmental observations at local and regional scales using the devices and services of the Future Internet. These communities' environmental observations represent a wealth of information which is currently hardly used or used only in isolation and therefore in need of integration with other information sources. Indeed, this very integration will lead to a paradigm shift from a mere Sensor Web to an Observation Web with semantically enriched content emanating from sensors, environmental simulations and citizens. The paper also describes the research challenges to realize the Observation Web and the associated environmental enablers for the Future Internet. Such an environmental enabler could for instance be an electronic sensing device, a web-service application, or even a social networking group affording or facilitating the capability of the Future Internet applications to consume, produce, and use environmental observations in cross-domain applications. The term "envirofied" Future Internet is coined to describe this overall target that forms a cornerstone of work in the Environmental Usage Area within the Future Internet PPP program. Relevant trends described in the paper are the usage of ubiquitous sensors (anywhere), the provision and generation of information by citizens, and the convergence of real and virtual realities to convey understanding of environmental observations. The paper addresses the technical challenges in the Environmental Usage Area and the need for designing multi-style service oriented architecture. Key topics are the mapping of requirements to capabilities, providing scalability and robustness with implementing context aware information retrieval. Another essential research topic is handling data fusion and model based computation, and the related propagation of information uncertainty. Approaches to security, standardization and harmonization, all essential for sustainable solutions, are summarized from the perspective of the Environmental Usage Area. The paper concludes with an overview of emerging, high impact applications in the environmental areas concerning land ecosystems (biodiversity), air quality (atmospheric conditions) and water ecosystems (marine asset management).
From Sensor to Observation Web with Environmental Enablers in the Future Internet
Havlik, Denis; Schade, Sven; Sabeur, Zoheir A.; Mazzetti, Paolo; Watson, Kym; Berre, Arne J.; Mon, Jose Lorenzo
2011-01-01
This paper outlines the grand challenges in global sustainability research and the objectives of the FP7 Future Internet PPP program within the Digital Agenda for Europe. Large user communities are generating significant amounts of valuable environmental observations at local and regional scales using the devices and services of the Future Internet. These communities’ environmental observations represent a wealth of information which is currently hardly used or used only in isolation and therefore in need of integration with other information sources. Indeed, this very integration will lead to a paradigm shift from a mere Sensor Web to an Observation Web with semantically enriched content emanating from sensors, environmental simulations and citizens. The paper also describes the research challenges to realize the Observation Web and the associated environmental enablers for the Future Internet. Such an environmental enabler could for instance be an electronic sensing device, a web-service application, or even a social networking group affording or facilitating the capability of the Future Internet applications to consume, produce, and use environmental observations in cross-domain applications. The term “envirofied” Future Internet is coined to describe this overall target that forms a cornerstone of work in the Environmental Usage Area within the Future Internet PPP program. Relevant trends described in the paper are the usage of ubiquitous sensors (anywhere), the provision and generation of information by citizens, and the convergence of real and virtual realities to convey understanding of environmental observations. The paper addresses the technical challenges in the Environmental Usage Area and the need for designing multi-style service oriented architecture. Key topics are the mapping of requirements to capabilities, providing scalability and robustness with implementing context aware information retrieval. Another essential research topic is handling data fusion and model based computation, and the related propagation of information uncertainty. Approaches to security, standardization and harmonization, all essential for sustainable solutions, are summarized from the perspective of the Environmental Usage Area. The paper concludes with an overview of emerging, high impact applications in the environmental areas concerning land ecosystems (biodiversity), air quality (atmospheric conditions) and water ecosystems (marine asset management). PMID:22163827
Life's Late Digital Revolution and Why It Matters for the Study of the Origins of Life.
Baum, David A; Lehman, Niles
2017-08-25
The information contained in life exists in two forms, analog and digital. Analog information is manifest mainly in the differing concentrations of chemicals that get passed from generation to generation and can vary from cell to cell. Digital information is encoded in linear polymers such as DNA and RNA, whose side chains come in discrete chemical forms. Here, we argue that the analog form of information preceded the digital. Acceptance of this dichotomy, and this progression, can help direct future studies on how life originated and initially complexified on the primordial Earth, as well as expected trajectories for other, independent origins of complex life.
NASA Astrophysics Data System (ADS)
Kamiya, Takeshi; Miyazaki, Tetsuya; Kubota, Fumito
In this section, the current situation of traffic growth and the penetration of broadband services are first described. Then the social demands, technical issues, and research trends for future information networks in the United States, Europe, and Japan are described. Finally, the detailed structure of this book is introduced.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, R.R.
1986-01-01
This report presents information on the Integral Fast Reactor and its role in the future. Information is presented in the areas of: inherent safety; other virtues of the sodium-cooled breeder; and solving LWR fuel cycle problems with IFR technologies. (JDB)
A conceptual prototype for the next-generation national elevation dataset
Stoker, Jason M.; Heidemann, Hans Karl; Evans, Gayla A.; Greenlee, Susan K.
2013-01-01
In 2012 the U.S. Geological Survey's (USGS) National Geospatial Program (NGP) funded a study to develop a conceptual prototype for a new National Elevation Dataset (NED) design with expanded capabilities to generate and deliver a suite of bare earth and above ground feature information over the United States. This report details the research on identifying operational requirements based on prior research, evaluation of what is needed for the USGS to meet these requirements, and development of a possible conceptual framework that could potentially deliver the kinds of information that are needed to support NGP's partners and constituents. This report provides an initial proof-of-concept demonstration using an existing dataset, and recommendations for the future, to inform NGP's ongoing and future elevation program planning and management decisions. The demonstration shows that this type of functional process can robustly create derivatives from lidar point cloud data; however, more research needs to be done to see how well it extends to multiple datasets.
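As a simple illustration of deriving both bare-earth and above-ground information from a lidar point cloud, the sketch below grids points and takes the lowest return per cell as a bare-earth proxy and the highest-minus-lowest return as a feature-height proxy. This is a deliberately naive scheme for illustration, not the USGS processing chain described in the report.

```python
import numpy as np

def grid_derivatives(points_xyz, cell_size=1.0):
    """Naive gridding: lowest return per cell (bare-earth proxy) and max-minus-min
    height (above-ground feature proxy)."""
    x, y, z = points_xyz[:, 0], points_xyz[:, 1], points_xyz[:, 2]
    ix = ((x - x.min()) // cell_size).astype(int)
    iy = ((y - y.min()) // cell_size).astype(int)
    lowest = np.full((iy.max() + 1, ix.max() + 1), np.nan)
    highest = np.full_like(lowest, np.nan)
    for i, j, zi in zip(ix, iy, z):
        lowest[j, i] = zi if np.isnan(lowest[j, i]) else min(lowest[j, i], zi)
        highest[j, i] = zi if np.isnan(highest[j, i]) else max(highest[j, i], zi)
    return lowest, highest - lowest  # bare-earth elevations, above-ground feature heights

pts = np.array([[0.2, 0.3, 10.0], [0.6, 0.4, 14.5], [1.4, 0.2, 10.2]])  # synthetic (x, y, z) returns
bare_earth, feature_height = grid_derivatives(pts)
```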
A Combinatorial Geometry Computer Description of the MEP-021A Generator Set
1979-02-01
Keywords: generator computer description; gasoline generator; GIFT; MEP-021A. Recovered abstract fragments: This ... GIFT code is also stored on magnetic tape for future vulnerability analysis ... the Geometric Information for Targets (GIFT) computer code. The GIFT code traces shotlines through a COM-GEOM description from any specified attack ...
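Shotline tracing of the kind GIFT performs amounts to intersecting a ray with the solids of a combinatorial-geometry model and recording the components it passes through. The sketch below intersects a ray with axis-aligned boxes as a stand-in for that idea; it is an illustrative simplification, not the GIFT algorithm or its COM-GEOM input format.

```python
def ray_box_hits(origin, direction, boxes):
    """Return (entry_distance, name) for each axis-aligned box the ray enters, sorted
    along the ray. Each box is (name, (xmin, ymin, zmin), (xmax, ymax, zmax))."""
    hits = []
    for name, lo, hi in boxes:
        t_near, t_far, possible = 0.0, float("inf"), True
        for o, d, lo_c, hi_c in zip(origin, direction, lo, hi):
            if d == 0:                     # ray parallel to this axis pair of faces
                if not (lo_c <= o <= hi_c):
                    possible = False
                    break
                continue
            t1, t2 = (lo_c - o) / d, (hi_c - o) / d
            t_near, t_far = max(t_near, min(t1, t2)), min(t_far, max(t1, t2))
        if possible and t_near <= t_far:
            hits.append((t_near, name))
    return sorted(hits)

# A "shotline" fired along +x through two illustrative components of a generator set:
components = [("engine block", (1, -1, -1), (3, 1, 1)), ("fuel tank", (4, -1, -1), (6, 1, 1))]
print(ray_box_hits(origin=(0, 0, 0), direction=(1, 0, 0), boxes=components))
```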
Generation of an arbitrary concatenated Greenberger-Horne-Zeilinger state with single photons
NASA Astrophysics Data System (ADS)
Chen, Shan-Shan; Zhou, Lan; Sheng, Yu-Bo
2017-02-01
The concatenated Greenberger-Horne-Zeilinger (C-GHZ) state is a new kind of logic-qubit entangled state, which may have extensive applications in future quantum communication. In this letter, we propose a protocol for constructing an arbitrary C-GHZ state with single photons. We exploit the cross-Kerr nonlinearity for this purpose. This protocol has some advantages over previous protocols. First, it only requires two kinds of cross-Kerr nonlinearities to generate single phase shifts ±θ. Second, it is not necessary to use sophisticated m-photon Toffoli gates. Third, this protocol is deterministic and can be used to generate an arbitrary C-GHZ state. This protocol may be useful in future quantum information processing based on the C-GHZ state.
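For orientation, the standard definitions are sketched below: an m-photon GHZ state serves as one logical qubit, and the N-logical-qubit C-GHZ state is built from such blocks. This follows the usual convention in the logic-qubit entanglement literature and is included as background rather than quoted from the letter.

```latex
|GHZ_m^{\pm}\rangle = \tfrac{1}{\sqrt{2}}\left(|0\rangle^{\otimes m} \pm |1\rangle^{\otimes m}\right),
\qquad
|\Phi_N^{\pm}\rangle = \tfrac{1}{\sqrt{2}}\left(|GHZ_m^{+}\rangle^{\otimes N} \pm |GHZ_m^{-}\rangle^{\otimes N}\right).
```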
Daddy, What's a Nuclear Reactor?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reisenweaver, Dennis W.
2008-01-15
No matter what we think of the nuclear industry, it is part of mankind's heritage. The decommissioning process is slowly making facilities associated with this industry disappear and not enough is being done to preserve the information for future generations. This paper provides some food for thought and provides a possible way forward. Industrial archaeology is an ever expanding branch of archaeology that is dedicated to preserving, interpreting and documenting our industrial past and heritage. Normally it begins with analyzing an old building or ruins and trying to determine what was done, how it was done and what changes might have occurred during its operation. We have a unique opportunity to document all of these issues and provide them before the nuclear facility disappears. Entombment is an acceptable decommissioning strategy; however we would have to change our concept of entombment. It is proposed that a number of nuclear facilities be entombed or preserved for future generations to appreciate. This would include a number of different types of facilities such as different types of nuclear power and research reactors, a reprocessing plant, part of an enrichment plant and a fuel manufacturing plant. One of the main issues that would require resolution would be that of maintaining information of the location of the buried facility and the information about its operation and structure, and passing this information on to future generations. This can be done, but a system would have to be established prior to burial of the facility so that no information would be lost. In general, our current set of requirements and laws may need to be re-examined and modified to take into account these new situations. As an alternative, and to complement the above proposal, it is recommended that a study and documentation of the nuclear industry be considered as part of twentieth century industrial archaeology. This study should not only include the power and fuel cycle facilities, but also the nuclear weapons complex and the industrial and research sectors. This would be a large chore due to the considerable number of different types of facilities that have been used in these industries, but it would be a worthwhile endeavor. This study would gather information that would normally be lost due to the decommissioning process and allow future generations to appreciate these industries. Because of the volume and varying types of facilities, it might be more beneficial to produce a set of studies relating to different aspects of the industry. A logical division would be the separation of the commercial nuclear industry and the nuclear weapons complex. The separation of the fuel cycle facilities may also be considered. If done properly, this could result in a set of documents of interest to a wide audience. The current nuclear industry is slowly disappearing through the decommissioning process. This industry is unique and is part of mankind's heritage. It must not be forgotten and the information should be made available for future generations. The U.S. Department of Energy and the National Park Service are doing some limited preservation of information, but I do not believe it is enough. It is not being done in a manner that will preserve the true activities that were performed.
It is recommended that the American Nuclear Society, along with other organizations, evaluate this proposal and possibly provide funds for a set of studies to be prepared and ensure that this valuable part of our heritage is not lost.
Acute Mountain Sickness and Hemoconcentration in Next Generation Spacecraft
NASA Technical Reports Server (NTRS)
Conkin, Johnny
2009-01-01
This slide presentation reviews the threat astronauts face from acute mountain sickness (AMS). It includes information about the symptoms of AMS, the potential threat to astronauts, and future efforts to mitigate the AMS threat.
Multipass Target Search in Natural Environments
Otte, Michael W.; Sofge, Donald; Gupta, Satyandra K.
2017-01-01
Consider a disaster scenario where search and rescue workers must search difficult to access buildings during an earthquake or flood. Often, finding survivors a few hours sooner results in a dramatic increase in saved lives, suggesting the use of drones for expedient rescue operations. Entropy can be used to quantify the generation and resolution of uncertainty. When searching for targets, maximizing mutual information of future sensor observations will minimize expected target location uncertainty by minimizing the entropy of the future estimate. Motion planning for multi-target autonomous search requires planning over an area with an imperfect sensor and may require multiple passes, which is hindered by the submodularity property of mutual information. Further, mission duration constraints must be handled accordingly, requiring consideration of the vehicle’s dynamics to generate feasible trajectories and must plan trajectories spanning the entire mission duration, something which most information gathering algorithms are incapable of doing. If unanticipated changes occur in an uncertain environment, new plans must be generated quickly. In addition, planning multipass trajectories requires evaluating path dependent rewards, requiring planning in the space of all previously selected actions, compounding the problem. We present an anytime algorithm for autonomous multipass target search in natural environments. The algorithm is capable of generating long duration dynamically feasible multipass coverage plans that maximize mutual information using a variety of techniques such as ϵ-admissible heuristics to speed up the search. To the authors’ knowledge this is the first attempt at efficiently solving multipass target search problems of such long duration. The proposed algorithm is based on best first branch and bound and is benchmarked against state of the art algorithms adapted to the problem in natural Simplex environments, gathering the most information in the given search time. PMID:29099087
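The information-gathering objective referred to above can be written compactly: choosing future observations along a candidate trajectory to maximize mutual information with the target state is equivalent to minimizing the expected posterior entropy of that state. A standard statement of this identity is sketched below for reference.

```latex
I(X; Z) = H(X) - H(X \mid Z), \qquad
\arg\max_{\tau} I\big(X; Z_{\tau}\big) = \arg\min_{\tau} H\big(X \mid Z_{\tau}\big),
```

where X is the target location estimate, τ ranges over dynamically feasible trajectories, and Z_τ denotes the (imperfect) sensor observations collected along τ.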
FISHER INFORMATION AS A SUSTAINABILITY METRIC
The World Commission on Environment and Development defines sustainability as 'development that meets the needs of the present without compromising the ability of future generations to meet their own needs'. The concept of sustainability requires study of complex integrated systems ...
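For reference, the classical Fisher information of a parameterized density is sketched below; the sustainability application summarized above builds on a dynamic-systems variant of this quantity computed from time series of system variables, and the exact estimator used in that work is not reproduced here.

```latex
I(\theta) = \mathbb{E}\!\left[\left(\frac{\partial}{\partial \theta}\,\ln f(X;\theta)\right)^{2}\right]
          = \int \frac{1}{f(x;\theta)}\left(\frac{\partial f(x;\theta)}{\partial \theta}\right)^{2} dx .
```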
Macleod, Adrian K; Stanley, Michele S; Day, John G; Cook, Elizabeth J
2016-01-01
Knowledge of biofouling typical of marine structures is essential for engineers to define appropriate loading criteria in addition to informing other stakeholders about the ecological implications of creating novel artificial environments. There is a lack of information regarding biofouling community composition (including weight and density characteristics) on floating structures associated with future marine renewable energy generation technologies. A network of navigation buoys were identified across a range of geographical areas, environmental conditions (tidal flow speed, temperature and salinity), and deployment durations suitable for future developments. Despite the perceived importance of environmental and temporal factors, geographical location explained the greatest proportion of the observed variation in community composition, emphasising the importance of considering geography when assessing the impact of biofouling on device functioning and associated ecology. The principal taxa associated with variation in biofouling community composition were mussels (Mytilus edulis), which were also important when determining loading criteria.
SO-QT: Collaborative Tool to Project the Future Space Object Population
NASA Technical Reports Server (NTRS)
Stupl, Jan
2017-01-01
Earth orbit is getting increasingly congested, a challenge to space operators in both governments and industry. We present a web tool that provides: 1) data on today's and historic space object environments, by aggregating object-specific tracking data; and 2) future trends through a collaboration platform to collect information on planned launches. The collaborative platform enables experts to pool and compare their data in order to generate future launch scenarios. The tool is intended to support decision makers and mission designers while they investigate future missions and scholars as they develop strategies for space traffic management.
Usage of Multi-Mission Radioisotope Thermoelectric Generators (MMRTGs) for Future Potential Missions
NASA Technical Reports Server (NTRS)
Zakrajsek, June F.; Cairns-Gallimore, Dirk; Otting, Bill; Johnson, Steve; Woerner, Dave
2016-01-01
The goal of NASA's Radioisotope Power Systems (RPS) Program is to make RPS ready and available to support the exploration of the solar system in environments where the use of conventional solar or chemical power generation is impractical or impossible to meet the needs of the missions. To meet this goal, the RPS Program, working closely with the Department of Energy, performs mission and system studies (such as the recently released Nuclear Power Assessment Study), evaluates the readiness of promising technologies to infuse in future generators, assesses the sustainment of key RPS capabilities and knowledge, forecasts and tracks the Program's budgetary needs, and disseminates current information about RPS to the community of potential users. This presentation focuses on the needs of the mission community and provides users a better understanding of how to integrate the MMRTG (Multi-Mission Radioisotope Thermoelectric Generator).
Flood Risk in the Danube basin under climate change
NASA Astrophysics Data System (ADS)
Schröter, Kai; Wortmann, Michel; del Rocio Rivas Lopez, Maria; Liersch, Stefan; Viet Nguyen, Dung; Hardwick, Stephen; Hattermann, Fred
2017-04-01
The projected increase in temperature is expected to intensify the hydrological cycle, and thus more intense precipitation is likely to increase hydro-meteorological extremes and flood hazard. However to assess the future dynamics of hazard and impact induced by these changes it is necessary to consider extreme events and to take a spatially differentiated perspective. The Future Danube Model is a multi-hazard and risk model suite for the Danube region which has been developed in the OASIS project. The model comprises modules for estimating potential perils from heavy precipitation, heat-waves, floods, droughts, and damage risk considering hydro-climatic extremes under current and climate change conditions. Web-based open Geographic Information Systems (GIS) technology allows customers to graphically analyze and overlay perils and other spatial information such as population density or assets exposed. The Future Danube Model combines modules for weather generation, hydrological and hydrodynamic processes, and supports risk assessment and adaptation planning support. This contribution analyses changes in flood hazard in the Danube basin and in flood risk for the German part of the Danube basin. As climate change input, different regionalized climate ensemble runs of the newest IPCC generation are used, the so-called Representative Concentration Pathways (RCPs). They are delivered by the CORDEX initiative (Coordinated Downscaling Experiments). The CORDEX data sample is extended using the statistical weather generator (IMAGE) in order to also consider extreme events. Two time slices are considered: near future 2020-2049 and far future 2050-2079. This data provides the input for the hydrological, hydraulic and flood loss model chain. Results for RCP4.5 and RCP8.5 indicate an increase in intensity and frequency of peak discharges and thus in flood hazard for many parts of the Danube basin.
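One common way to express a change in flood hazard between the two time slices mentioned above is to fit an extreme-value distribution to annual maximum discharges from each period and compare return levels. The sketch below does this with SciPy's generalized extreme value distribution; the input arrays are synthetic placeholders, and this is an illustrative post-processing step rather than the Future Danube Model's actual method.

```python
import numpy as np
from scipy.stats import genextreme

def return_level(annual_maxima, return_period_years=100):
    """Fit a GEV distribution to annual maxima and return the T-year discharge level."""
    shape, loc, scale = genextreme.fit(annual_maxima)
    exceedance_prob = 1.0 / return_period_years
    return genextreme.isf(exceedance_prob, shape, loc=loc, scale=scale)

# Placeholder annual maximum discharges (m^3/s) for two simulated time slices.
near_future = np.random.default_rng(1).gumbel(1500, 300, size=30)  # 2020-2049, synthetic
far_future = np.random.default_rng(2).gumbel(1700, 350, size=30)   # 2050-2079, synthetic

q100_near, q100_far = return_level(near_future), return_level(far_future)
print(f"Relative change in the 100-year flood: {(q100_far / q100_near - 1) * 100:.1f} %")
```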
Information Technology for the Twenty-First Century: A Bold Investment in America's Future
NASA Astrophysics Data System (ADS)
1999-06-01
With this Information Technology for the Twenty-First Century (IT2) initiative, the Federal Government is making an important re-commitment to fundamental research in information technology. The IT2 initiative proposes $366 million in increased investments in computing, information, and communications research and development (R&D) to help expand the knowledge base in fundamental information science, advance the Nation's capabilities in cutting-edge research, and train the next generation of researchers who will sustain the Information Revolution well into the 21st Century.
U.S. Spacesuit Knowledge Capture Status and Initiatives in Fiscal Year 2014
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Oliva, Vladenka R.
2015-01-01
Since its 2008 inception, the NASA U.S. Spacesuit Knowledge Capture (KC) program has shared historical spacesuit information with engineers and other technical team members to expand their understanding of the spacesuit's evolution, known capability and limitations, and future desires and needs for its use. As part of the U.S. Spacesuit KC program, subject-matter experts have delivered presentations, held workshops, and participated in interviews to share valuable spacesuit lessons learned to ensure this vital information will survive for existing and future generations to use. These events have included spacesuit knowledge from the inception of NASA's first spacesuit to current spacesuit design. To ensure that this information is shared with the entire NASA community and other interested or invested entities, these KC events were digitally recorded and transcribed to be uploaded onto several applicable NASA Web sites. This paper discusses the various Web sites that the KC events are uploaded to and possible future sites that will channel this information.
U.S. Spacesuit Knowledge Capture Accomplishments in Fiscal Year 2014
NASA Technical Reports Server (NTRS)
Chullen, Cinda; Oliva, Vladenka R.
2015-01-01
Since its 2008 inception, the NASA U.S. Spacesuit Knowledge Capture (KC) program has shared historical spacesuit information with engineers and other technical team members to expand their understanding of the spacesuit's evolution, known capability and limitations, and future desires and needs for its use. As part of the U.S. Spacesuit KC program, subject-matter experts have delivered presentations, held workshops, and participated in interviews to share valuable spacesuit lessons learned to ensure this vital information will survive for existing and future generations to use. These events have included spacesuit knowledge from the inception of NASA's first spacesuit to current spacesuit design. To ensure that this information is shared with the entire NASA community and other interested or invested entities, these KC events were digitally recorded and transcribed to be uploaded onto several applicable NASA Web sites. This paper discusses the various Web sites that the KC events are uploaded to and possible future sites that will channel this information.
Integrated information theory of consciousness: an updated account.
Tononi, G
2012-12-01
This article presents an updated account of integrated information theory of consciousness (IIT) and some of its implications. IIT stems from thought experiments that lead to phenomenological axioms (existence, compositionality, information, integration, exclusion) and corresponding ontological postulates. The information axiom asserts that every experience is specific - it is what it is by differing in its particular way from a large repertoire of alternatives. The integration axiom asserts that each experience is unified - it cannot be reduced to independent components. The exclusion axiom asserts that every experience is definite - it is limited to particular things and not others and flows at a particular speed and resolution. IIT formalizes these intuitions with postulates. The information postulate states that only "differences that make a difference" from the intrinsic perspective of a system matter: a mechanism generates cause-effect information if its present state has selective past causes and selective future effects within a system. The integration postulate states that only information that is irreducible matters: mechanisms generate integrated information only to the extent that the information they generate cannot be partitioned into that generated within independent components. The exclusion postulate states that only maxima of integrated information matter: a mechanism specifies only one maximally irreducible set of past causes and future effects - a concept. A complex is a set of elements specifying a maximally irreducible constellation of concepts, where the maximum is evaluated over elements and at the optimal spatiotemporal scale. Its concepts specify a maximally integrated conceptual information structure or quale, which is identical with an experience. Finally, changes in information integration upon exposure to the environment reflect a system's ability to match the causal structure of the world. After introducing an updated definition of information integration and related quantities, the article presents some theoretical considerations about the relationship between information and causation and about the relational structure of concepts within a quale. It also explores the relationship between the temporal grain size of information integration and the dynamic of metastable states in the corticothalamic complex. Finally, it summarizes how IIT accounts for empirical findings about the neural substrate of consciousness, and how various aspects of phenomenology may in principle be addressed in terms of the geometry of information integration.
A preliminary estimate of future communications traffic for the electric power system
NASA Technical Reports Server (NTRS)
Barnett, R. M.
1981-01-01
Diverse new generation technologies that use renewable energy, and measures to improve operational efficiency throughout the existing electric power systems, are presented. A model utility is described, and the information transfer requirements imposed by the incorporation of dispersed storage and generation technologies and the implementation of more extensive energy management are estimated. An example of possible traffic for an assumed system is provided, along with an approach that can be applied to other systems, control configurations, or dispersed storage and generation penetrations.
2016 Annual Technology Baseline (ATB)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, Wesley; Kurup, Parthiv; Hand, Maureen
Consistent cost and performance data for various electricity generation technologies can be difficult to find and may change frequently for certain technologies. With the Annual Technology Baseline (ATB), the National Renewable Energy Laboratory provides an organized and centralized dataset that was reviewed by internal and external experts. It uses the best information from the Department of Energy laboratory's renewable energy analysts and Energy Information Administration information for conventional technologies. The ATB will be updated annually in order to provide an up-to-date repository of current and future cost and performance data. Going forward, we plan to revise and refine the values using the best available information. The ATB includes both a presentation with notes (PDF) and an associated Excel Workbook. The ATB includes the following electricity generation technologies: land-based wind; offshore wind; utility-scale solar PV; concentrating solar power; geothermal power; hydropower plants (upgrades to existing facilities, powering non-powered dams, and new stream-reach development); conventional coal; coal with carbon capture and sequestration; integrated gasification combined cycle coal; natural gas combustion turbines; natural gas combined cycle; conventional biopower; and nuclear.
Integrated information theory of consciousness: an updated account.
Tononi, G
2012-01-01
This article presents an updated account of integrated information theory of consciousness (IIT) and some of its implications. IIT stems from thought experiments that lead to phenomenological axioms and ontological postulates. The information axiom asserts that every experience is one out of many, i.e. specific - it is what it is by differing in its particular way from a large repertoire of alternatives. The integration axiom asserts that each experience is one, i.e. unified - it cannot be reduced to independent components. The exclusion axiom asserts that every experience is definite - it is limited to particular things and not others and flows at a particular speed and resolution. IIT formalizes these intuitions with three postulates. The information postulate states that only "differences that make a difference" from the intrinsic perspective of a system matter: a mechanism generates cause-effect information if its present state has specific past causes and specific future effects within a system. The integration postulate states that only information that is irreducible matters: mechanisms generate integrated information only to the extent that the information they generate cannot be partitioned into that generated within independent components. The exclusion postulate states that only maxima of integrated information matter: a mechanism specifies only one maximally irreducible set of past causes and future effects - a concept. A complex is a set of elements specifying a maximally irreducible constellation of concepts, where the maximum is evaluated at the optimal spatio-temporal scale. Its concepts specify a maximally integrated conceptual information structure or quale, which is identical with an experience. Finally, changes in information integration upon exposure to the environment reflect a system's ability to match the causal structure of the world. After introducing an updated definition of information integration and related quantities, the article presents some theoretical considerations about the relationship between information and causation and about the relational structure of concepts within a quale. It also explores the relationship between the temporal grain size of information integration and the dynamic of metastable states in the corticothalamic complex. Finally, it summarizes how IIT accounts for empirical findings about the neural substrate of consciousness, and how various aspects of phenomenology may in principle be addressed in terms of the geometry of information integration.
Moral judgment as information processing: an integrative review.
Guglielmo, Steve
2015-01-01
How do humans make moral judgments about others' behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment.
Moral judgment as information processing: an integrative review
Guglielmo, Steve
2015-01-01
How do humans make moral judgments about others’ behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment. PMID:26579022
ERIC Educational Resources Information Center
Gay, Lesbian, and Straight Education Network, New York, NY.
The Gay, Lesbian, and Straight Education Network (GLSEN) aims to unite with educators in cultivating an informed citizenry and future generations of children who respect and accept all people, regardless of their sexual orientation or gender identity. By supporting educators in their efforts to build schools where information and expression flow…
ERIC Educational Resources Information Center
Beare, Hedley
2001-01-01
Forecasts for the future are made against the backdrop of population growth, environmental change, information technology, and globalization. Schools and teachers as we know them will change radically, perhaps become obsolete, as computers and the Internet enable access to information from anywhere, any time. Learning will become a life-long,…
USDA-ARS?s Scientific Manuscript database
It is widely believed that in Germany and Europe the risk of soil erosion by water increases as a result of changes in climate. Especially, an increase of the frequency of extreme precipitation events during phenological crop phases with reduced soil cover is very likely for the near future. A monit...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1979-09-01
Appendix A, Utility Plant Characteristics, contains information describing the characteristics of seven utility plants that were considered during the final site selection process. The plants are: Valley Electric Generating Plant, downtown Milwaukee; Manitowoc Electric Generating Plant, downtown Manitowoc; Blount Street Electric Generating Plant, downtown Madison; Pulliam Electric Generating Plant, downtown Green Bay; Edgewater Electric Generating Plant, downtown Sheboygan; Rock River Electric Generating Plant, near Janesville and Beloit; and Black Hawk Electric Generating Plant, downtown Beloit. Additional appendices are: Future Loads; HVAC Inventory; Load Calculations; Factors to Induce Potential Users; Turbine Retrofit/Distribution System Data; and Detailed Economic Analysis Results/Data.
NASA Astrophysics Data System (ADS)
Faqih, A.
2017-03-01
Providing information regarding future climate scenarios is very important in climate change studies. A climate scenario can be used as basic information to support adaptation and mitigation studies. In order to deliver future climate scenarios over a specific region, baseline and projection data from the outputs of global climate models (GCMs) are needed. However, due to its coarse resolution, the data have to be downscaled and bias-corrected in order to obtain scenario data with better spatial resolution that match the characteristics of the observed data. Generating these downscaled data is difficult for scientists who do not have a specific background, experience and skill in dealing with the complex data from the GCM outputs. In this regard, it is necessary to develop a tool that simplifies the downscaling process in order to help scientists, especially in Indonesia, generate future climate scenario data that can be used for their climate change-related studies. In this paper, we introduce a tool called “Statistical Bias Correction for Climate Scenarios (SiBiaS)”. The tool is specially designed to facilitate the use of CMIP5 GCM data outputs and process their statistical bias corrections relative to reference data from observations. It is prepared to support capacity building in climate modeling in Indonesia as part of the Indonesia 3rd National Communication (TNC) project activities.
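The abstract does not specify the algorithm behind SiBiaS; empirical quantile mapping is one common choice for statistical bias correction of GCM output, and a minimal sketch of that technique is given below with hypothetical arrays. It is an illustration under those assumptions, not the SiBiaS implementation.

```python
# Sketch: empirical quantile mapping, a common statistical bias-correction method.
# The arrays are hypothetical; this is not the SiBiaS implementation itself.
import numpy as np

def quantile_map(obs_hist, model_hist, model_scen):
    """Map each scenario value through the model-historical CDF onto the observed CDF."""
    probs = np.linspace(0.01, 0.99, 99)
    obs_q = np.quantile(obs_hist, probs)
    mod_q = np.quantile(model_hist, probs)
    # Probability of each scenario value under the model-historical distribution,
    # then the corresponding value in the observed distribution.
    scen_probs = np.interp(model_scen, mod_q, probs)
    return np.interp(scen_probs, probs, obs_q)

rng = np.random.default_rng(0)
obs_hist = rng.gamma(2.0, 8.0, 3650)      # observed daily rainfall, mm (assumed)
model_hist = rng.gamma(2.0, 6.0, 3650)    # GCM historical run, biased low (assumed)
model_scen = rng.gamma(2.2, 6.0, 3650)    # GCM scenario run (assumed)
corrected = quantile_map(obs_hist, model_hist, model_scen)
print(corrected.mean(), model_scen.mean())
```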
Krogh-Jespersen, Sheila; Woodward, Amanda L
2014-01-01
Previous research has shown that young infants perceive others' actions as structured by goals. One open question is whether the recruitment of this understanding when predicting others' actions imposes a cognitive challenge for young infants. The current study explored infants' ability to utilize their knowledge of others' goals to rapidly predict future behavior in complex social environments and distinguish goal-directed actions from other kinds of movements. Fifteen-month-olds (N = 40) viewed videos of an actor engaged in either a goal-directed (grasping) or an ambiguous (brushing the back of her hand) action on a Tobii eye-tracker. At test, critical elements of the scene were changed and infants' predictive fixations were examined to determine whether they relied on goal information to anticipate the actor's future behavior. Results revealed that infants reliably generated goal-based visual predictions for the grasping action, but not for the back-of-hand behavior. Moreover, response latencies were longer for goal-based predictions than for location-based predictions, suggesting that goal-based predictions are cognitively taxing. Analyses of areas of interest indicated that heightened attention to the overall scene, as opposed to specific patterns of attention, was the critical indicator of successful judgments regarding an actor's future goal-directed behavior. These findings shed light on the processes that support "smart" social behavior in infants, as it may be a challenge for young infants to use information about others' intentions to inform rapid predictions.
Advanced electronic displays and their potential in future transport aircraft
NASA Technical Reports Server (NTRS)
Hatfield, J. J.
1981-01-01
It is pointed out that electronic displays represent one of the keys to continued integration and improvement of the effectiveness of avionic systems in future transport aircraft. The employment of modern electronic display media and display generation has become vital in connection with the increases in the modes and functions of modern aircraft. Requirements for electronic systems of future transports are examined, and a description is provided of the tools which are available for cockpit integration, taking into account trends in information processing and presentation, trends in integrated display devices, and trends concerning input/output devices. Developments related to display media, display generation, and I/O devices are considered, giving attention to a comparison of CRT and flat-panel display technology, advanced HUD technology, and multifunction controls. Integrated display formats are discussed along with integrated systems and cockpit configurations.
Romig, Barbara D; Tucker, Ann W; Hewitt, Anne M; O'Sullivan Maillet, Julie
2017-01-01
There is limited information and consensus on the future of clinical education. The Delphi technique was selected to identify agreement among Association of Schools of Allied Health Professions' (ASAHP) allied health deans on the future (2018-2023) of allied health (AH) clinical education. Sixty-one AH deans, 54.9% (61 of 111) of the ASAHP membership, expressed opinions about clinical education through a three-round Delphi study. In conjunction with a conceptual model, four futuristic scenarios were used to encourage deans' feedback on the key factors impacting the future of clinical education. The responses to the four scenarios showed ways the external environment influences which activities the deans recommend. The results presented, by individual scenario and in totality, provide relevant and timely information on the importance and transformation of AH clinical education and its future. Futuristic scenarios, in combination with the Delphi technique, generated information where little exists specific to AH deans' perspectives on AH clinical education. The results offer deans opportunities for future strategic improvements. The use of the futuristic scenarios was suitable for guiding deans' responses and reaching agreement on the future of AH clinical education. These contributions reflect the imminent conditions and healthcare environment identified in the various scenarios and provide additional insight on key factors impacting the future for AH clinical education.
Knowledge Management in Sensor Enabled Online Services
NASA Astrophysics Data System (ADS)
Smyth, Dominick; Cappellari, Paolo; Roantree, Mark
The Future Internet has as its vision the development of improved features and usability for services, applications and content. In many cases, services can be provided automatically through the use of monitors or sensors. This means that web-generated sensor data become available not only to the companies that own the sensors but also to the domain users who generate the data and to the information and knowledge workers who harvest the output. The goal is to improve the service through better use of the information it provides. Applications and services range from climate and traffic monitoring to health and sports event monitoring. In this paper, we present the WSW system, which harvests web sensor data to provide additional and, in some cases, more accurate information using an analysis of both live and warehoused information.
Raising Awareness of Pre-Symptomatic Genetic Testing
ERIC Educational Resources Information Center
Boerwinkel, Dirk Jan; Knippels, Marie-Christine; Waarlo, Arend Jan
2011-01-01
Presymptomatic genetic testing generates socioscientific issues in which decision making is complicated by several complexity factors. These factors include weighing of advantages and disadvantages, different interests of stakeholders, uncertainty of genetic information and conflicting values. Education preparing students for future decision…
ERIC Educational Resources Information Center
Croddy, Marshall; Levine, Peter
2014-01-01
As the C3 Framework for the social studies rolls out, it is hoped that its influence will grow, offering a vision and guidance for the development of a new generation of state social studies standards that promote deeper student learning and the acquisition of essentials skills for college, career, and civic life. In the interim, it can be an…
Episodic and semantic content of memory and imagination: A multilevel analysis.
Devitt, Aleea L; Addis, Donna Rose; Schacter, Daniel L
2017-10-01
Autobiographical memories of past events and imaginations of future scenarios comprise both episodic and semantic content. Correlating the amount of "internal" (episodic) and "external" (semantic) details generated when describing autobiographical events can illuminate the relationship between the processes supporting these constructs. Yet previous studies performing such correlations were limited by aggregating data across all events generated by an individual, potentially obscuring the underlying relationship within the events themselves. In the current article, we reanalyzed datasets from eight studies using a multilevel approach, allowing us to explore the relationship between internal and external details within events. We also examined whether this relationship changes with healthy aging. Our reanalyses demonstrated a largely negative relationship between the internal and external details produced when describing autobiographical memories and future imaginations. This negative relationship was stronger and more consistent for older adults and was evident both in direct and indirect measures of semantic content. Moreover, this relationship appears to be specific to episodic tasks, as no relationship was observed for a nonepisodic picture description task. This negative association suggests that people do not generate semantic information indiscriminately, but do so in a compensatory manner, to embellish episodically impoverished events. Our reanalysis further lends support for dissociable processes underpinning episodic and semantic information generation when remembering and imagining autobiographical events.
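A minimal sketch of the kind of multilevel analysis described, assuming a random-intercept mixed model with event-level detail counts nested within participants, is shown below. The data frame, column names, and the negative within-person coupling are hypothetical stand-ins, not the authors' data or code.

```python
# Sketch: event-level multilevel model of internal vs. external details,
# with events nested within participants (hypothetical data and column names).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_participants, n_events = 40, 8
rows = []
for pid in range(n_participants):
    participant_baseline = rng.normal(20, 4)
    for _ in range(n_events):
        internal = max(0.0, participant_baseline + rng.normal(0, 5))
        # Assumed negative within-person coupling, for illustration only.
        external = max(0.0, 15 - 0.3 * (internal - 20) + rng.normal(0, 4))
        rows.append({"participant": pid, "internal": internal, "external": external})
data = pd.DataFrame(rows)

# Random intercept for participant; fixed effect of internal detail count.
model = smf.mixedlm("external ~ internal", data, groups=data["participant"]).fit()
print(model.summary())
```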
Bermuda Triangle or three to tango: generation Y, e-health and knowledge management.
Yee, Kwang Chien
2007-01-01
Generation Y workers are slowly gathering critical mass in the healthcare sector. The sustainability of future healthcare is highly dependent on this group of workers. This generation of workers loves technology and thrives in stimulating environments. They have a great thirst for life experience and therefore move from one working environment to another. The healthcare system has a hierarchical operational, information and knowledge structure, which unfortunately might not be the ideal ground for integration with generation Y. The challenges ahead present a fantastic opportunity for electronic health implementation and knowledge management to flourish. Generation Y workers, however, have very different expectations of technology utilisation, technology design and knowledge presentation. This paper will argue that a clear understanding of this group of workers is essential for researchers in health informatics and knowledge management in order to provide an integrated socio-technical solution for this group of future workers. The sustainability of a quality healthcare system will depend upon the integration of generation Y, health informatics and knowledge management strategies in a re-invented healthcare system.
de Medeiros, Kate; Rubinstein, Robert; Ermoshkina, Polina
2015-08-01
This paper examines generativity, social suffering, and culture change in a sample of 16 women aged 65 years or older who emigrated from the former Soviet Union. Key concerns with generativity are identity, which can be strongly rooted in one's original cultural formation, and a stable life course, which is what ideally enables generative impulses to be cultivated in later life. To better understand how early social suffering may affect later life generativity, we conducted two 90-min interviews with each of our participants on their past experiences and current views of generativity. The trauma of World War II, poor quality of life in the Soviet Union, scarcity of shelter and supplies, and fear of arrest emerged as common components in social suffering, which affected their identity. Overall, the theme of broken links to the future--the sense that their current lives were irrelevant to future generations--was strong among informants in their interviews, pointing to the importance of life course stability in relation to certain forms of generativity. © The Author 2013. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Knowledge environments representing molecular entities for the virtual physiological human.
Hofmann-Apitius, Martin; Fluck, Juliane; Furlong, Laura; Fornes, Oriol; Kolárik, Corinna; Hanser, Susanne; Boeker, Martin; Schulz, Stefan; Sanz, Ferran; Klinger, Roman; Mevissen, Theo; Gattermayer, Tobias; Oliva, Baldo; Friedrich, Christoph M
2008-09-13
In essence, the virtual physiological human (VPH) is a multiscale representation of human physiology spanning from the molecular level via cellular processes and multicellular organization of tissues to complex organ function. The different scales of the VPH deal with different entities, relationships and processes, and in consequence the models used to describe and simulate biological functions vary significantly. Here, we describe methods and strategies to generate knowledge environments representing molecular entities that can be used for modelling the molecular scale of the VPH. Our strategy to generate knowledge environments representing molecular entities is based on the combination of information extraction from scientific text and the integration of information from biomolecular databases. We introduce @neuLink, a first prototype of an automatically generated, disease-specific knowledge environment combining biomolecular, chemical, genetic and medical information. Finally, we provide a perspective for the future implementation and use of knowledge environments representing molecular entities for the VPH.
NASA Astrophysics Data System (ADS)
Myers, B.; Beard, T. D.; Weiskopf, S. R.; Jackson, S. T.; Tittensor, D.; Harfoot, M.; Senay, G. B.; Casey, K.; Lenton, T. M.; Leidner, A. K.; Ruane, A. C.; Ferrier, S.; Serbin, S.; Matsuda, H.; Shiklomanov, A. N.; Rosa, I.
2017-12-01
Biodiversity and ecosystem services underpin political targets for the conservation of biodiversity; however, previous incarnations of these biodiversity-related targets have not relied on integrated, model-based projections of possible outcomes under climate and land use change. Although a few global biodiversity models are available, most biodiversity models lie along a continuum of geography and components of biodiversity. Model-based projections of the future of global biodiversity are critical to support policymakers in the development of informed global conservation targets, but the scientific community lacks a clear strategy for integrating diverse data streams in developing, and evaluating the performance of, such biodiversity models. Therefore, in this paper, we propose a framework for ongoing testing and refinement of model-based projections of biodiversity trends and change, by linking a broad variety of biodiversity models with data streams generated by advances in remote sensing, coupled with new and emerging in-situ observation technologies, to inform the development of essential biodiversity variables, future global biodiversity targets, and indicators. Our two main objectives are to (1) develop a framework for testing and refining projections from a broad range of biodiversity models, focusing on global models, through the integration of diverse data streams and (2) identify the realistic outputs that can be developed and determine coupled approaches using remote sensing and new and emerging in-situ observations (e.g., metagenomics) to better inform the next generation of global biodiversity targets.
Next generation information communication infrastructure and case studies for future power systems
NASA Astrophysics Data System (ADS)
Qiu, Bin
As the power industry enters the new century, powerful driving forces, uncertainties and new functions are compelling electric utilities to make dramatic changes in their information communication infrastructure. Expanding network services such as real-time measurement and monitoring are also driving the need for more bandwidth in the communication network. These needs will grow further as new remote real-time protection and control applications become more feasible and pervasive. This dissertation addresses two main issues for the future power system information infrastructure: the communication network infrastructure and the associated power system applications. Optical networks will no doubt become the predominant data transmission media for next generation power system communication. The rapid development of fiber optic network technology poses new challenges in the areas of topology design, network management and real-time applications. Based on advanced fiber optic technologies, an all-fiber network is investigated and proposed. The study covers the system architecture and data exchange protocol aspects. High-bandwidth, robust optical networks could provide great opportunities for the power system to deliver better service and more efficient operation. In the dissertation, different applications are investigated. One typical application is the SCADA information accessing system. An Internet-based application for the substation automation system is presented. VLSI (Very Large Scale Integration) technology is also used for automatic generation of one-line diagrams. A high-transmission-rate, low-latency optical network is especially suitable for power system real-time control. In the dissertation, a new local area network based Load Shedding Controller (LSC) for isolated power systems is presented. By using PMUs (Phasor Measurement Units) and a fiber optic network, an AGE (Area Generation Error) based accurate wide area load shedding scheme is also proposed. The objective is to shed load in a limited area with minimum disturbance.
NASA RPS Program Overview: A Focus on RPS Users
NASA Technical Reports Server (NTRS)
Hamley, John A.; Sutliff, Thomas J.; Sandifer, Carl E., II; Zakrajsek, June F.
2016-01-01
The goal of NASA's Radioisotope Power Systems (RPS) Program is to make RPS ready and available to support the exploration of the solar system in environments where the use of conventional solar or chemical power generation is impractical or impossible to meet the needs of the missions. To meet this goal, the RPS Program, working closely with the Department of Energy, performs mission and system studies (such as the recently released Nuclear Power Assessment Study), assesses the readiness of promising technologies for infusion into future generators, assesses the sustainment of key RPS capabilities and knowledge, forecasts and tracks the Program's budgetary needs, and disseminates current information about RPS to the community of potential users. This process has been refined and used to determine the current content of the RPS Program's portfolio. This portfolio currently includes an effort to mature advanced thermoelectric technology for possible integration into an enhanced Multi-Mission Radioisotope Thermoelectric Generator (eMMRTG), sustainment and production of the currently deployed MMRTG, and technology investments that could lead to a future Stirling Radioisotope Generator (SRG). This paper describes the program planning processes that have been used, the currently available MMRTG, and one of the potential future systems, the eMMRTG.
Patterning roadmap: 2017 prospects
NASA Astrophysics Data System (ADS)
Neisser, Mark
2017-06-01
Road mapping of semiconductor chips has been underway for over 20 years, first with the International Technology Roadmap for Semiconductors (ITRS) roadmap and now with the International Roadmap for Devices and Systems (IRDS) roadmap. The original roadmap was mostly driven bottom up and was developed to ensure that the large numbers of semiconductor producers and suppliers had good information to base their research and development on. The current roadmap is generated more top-down, where the customers of semiconductor chips anticipate what will be needed in the future and the roadmap projects what will be needed to fulfill that demand. The More Moore section of the roadmap projects that advanced logic, rather than memory chips, will drive higher-resolution patterning. Potential solutions for patterning future logic nodes can be derived as extensions of 'next-generation' patterning technologies currently under development. Advanced patterning has made great progress, and two 'next-generation' patterning technologies, EUV and nanoimprint lithography, have the potential to be in production as early as 2018. The potential adoption of two different next-generation patterning technologies suggests that patterning technology is becoming more specialized. This is good for the industry in that it lowers overall costs, but it may lead to slower progress in extending any one patterning technology in the future.
Scenario-neutral Food Security Risk Assessment: A livestock Heat Stress Case Study
NASA Astrophysics Data System (ADS)
Broman, D.; Rajagopalan, B.; Hopson, T. M.
2015-12-01
Food security risk assessments can provide decision-makers with actionable information to identify critical system limitations and alternatives to mitigate the impacts of future conditions. The majority of current risk assessments have been scenario-led, and their results are limited by the scenarios - selected future states of the world's climate system and socioeconomic factors. A generic scenario-neutral framework for food security risk assessments is presented here that uses plausible states of the world without initially assigning likelihoods. Measures of system vulnerabilities are identified and system risk is assessed for these states. This framework has benefited greatly from research in the water and natural resource fields to adapt their planning to provide better risk assessments. To illustrate the utility of this framework, we develop a case study of livestock heat stress risk within the pastoral system of West Africa. Heat stress can have a major impact not only on livestock owners but on the greater food production system, decreasing livestock growth, milk production, and reproduction, and in severe cases causing death. A heat stress index calculated from daily weather is used as a vulnerability measure and is computed from historic daily weather data at several locations in the study region. To generate plausible states, a stochastic weather generator is developed to produce synthetic weather sequences at each location, consistent with the seasonal climate. A spatial model of monthly and seasonal heat stress provides projections of current and future livestock heat stress measures across the study region, and can incorporate seasonal climate and other external covariates. These models, when linked with empirical thresholds of heat stress risk for specific breeds, offer decision-makers actionable information for use in near-term warning systems as well as in future planning. Future assessments can indicate under which states livestock are at greatest risk of heat stress; when coupled with assessments of additional measures (e.g., water and fodder availability), they can inform alternatives that provide satisfactory performance under a wide range of states (e.g., optimal cattle breed, supplemental feed, increased water access).
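The abstract does not state which heat stress index was used; one widely cited livestock temperature-humidity index (THI) formulation is sketched below as an illustration. The daily weather values and the alert threshold are assumptions for illustration only.

```python
# Sketch: a commonly cited livestock temperature-humidity index (THI),
# THI = 0.8*T + (RH/100)*(T - 14.4) + 46.4, with T in deg C and RH in percent.
# Threshold and example data are assumptions for illustration only.
def thi(temp_c, rel_humidity_pct):
    return 0.8 * temp_c + (rel_humidity_pct / 100.0) * (temp_c - 14.4) + 46.4

daily_weather = [  # (temperature deg C, relative humidity %), hypothetical station data
    (31.0, 55.0),
    (34.5, 60.0),
    (29.0, 80.0),
]
ALERT_THRESHOLD = 79.0  # assumed alert level; actual thresholds are breed-specific

for temp_c, rh in daily_weather:
    index = thi(temp_c, rh)
    status = "heat stress risk" if index >= ALERT_THRESHOLD else "ok"
    print(f"T={temp_c:.1f} C RH={rh:.0f}% THI={index:.1f} -> {status}")
```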
Biogas Potential in the United States (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2013-10-01
Biogas has received increased attention as an alternative energy source in the United States. The factsheet provides information about the biogas (methane) potential from various sources in the country (by county and state) and estimates the power generation and transportation fuels production (renewable natural gas) potential from these biogas sources. It provides valuable information to the industry, academia and policy makers in support of their future decisions.
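As a rough back-of-envelope illustration of how a methane potential figure might be translated into a generation potential, the sketch below assumes a lower heating value of roughly 36 MJ per cubic metre of methane and a 35% electrical conversion efficiency; these values and the example quantity are assumptions, not figures from the fact sheet.

```python
# Sketch: convert an annual methane potential into electricity generation potential.
# The heating value and conversion efficiency are assumptions for illustration.
METHANE_LHV_MJ_PER_M3 = 36.0     # approximate lower heating value of methane
ELECTRICAL_EFFICIENCY = 0.35     # assumed engine-generator efficiency
MJ_PER_KWH = 3.6

def annual_generation_mwh(methane_m3_per_year):
    energy_mj = methane_m3_per_year * METHANE_LHV_MJ_PER_M3 * ELECTRICAL_EFFICIENCY
    return energy_mj / MJ_PER_KWH / 1000.0  # MWh

# Hypothetical county-level methane potential of 5 million m^3 per year.
print(f"{annual_generation_mwh(5_000_000):,.0f} MWh/yr")
```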
Resurgent Russia in 2030. Challenge for the USAF
2009-09-01
Advanced technologies for NASA space programs
NASA Technical Reports Server (NTRS)
Krishen, Kumar
1991-01-01
A review of the technology requirements for future space programs is presented. The technologies are emphasized with a discussion of their mission impact. Attention is given to automation and robotics, materials, information acquisition/processing display, nano-electronics/technology, superconductivity, and energy generation and storage.
Grimm, Sabine E; Dixon, Simon; Stevens, John W
Health technology assessments (HTAs) that take account of future price changes have been examined in the literature, but the important issue of price reductions that are generated by the reimbursement decision has been ignored. To explore the impact of future price reductions caused by increasing uptake on HTAs and decision making for medical devices. We demonstrate the use of a two-stage modeling approach to derive estimates of technology price as a consequence of changes in technology uptake over future periods on the basis of existing theory and supported by empirical studies. We explore the impact on cost-effectiveness and expected value of information analysis in an illustrative example on the basis of a technology in development for preterm birth screening. The application of our approach to the case study technology generates smaller incremental cost-effectiveness ratios compared with the commonly used single cohort approach. The extent of this reduction in the incremental cost-effectiveness ratio depends on the magnitude of the modeled price reduction, the speed of diffusion, and the length of the assumed technology life horizon. Results of value of information analysis are affected through changes in the expected net benefit calculation, the addition of uncertain parameters, and the diffusion-adjusted estimate of the affected patient population. Because modeling future changes in price and uptake has the potential to affect HTA outcomes, modeling techniques that can address such changes should be considered for medical devices that may otherwise be rejected. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
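A minimal sketch of how a diffusion-driven price path might feed into an incremental cost-effectiveness ratio is given below, assuming a logistic uptake curve and a learning-style price decline; all parameters are hypothetical and do not reproduce the authors' two-stage model or the preterm birth screening case study.

```python
# Sketch: incremental cost-effectiveness ratio (ICER) when the technology price
# falls as cumulative uptake grows. Uptake curve, price decline, and clinical
# inputs are hypothetical; this is not the authors' two-stage model.
import numpy as np

years = np.arange(1, 11)
uptake = 1.0 / (1.0 + np.exp(-(years - 5)))                       # logistic diffusion (assumed)
launch_price = 500.0                                              # per test, assumed
price = launch_price * (np.cumsum(uptake) / uptake[0]) ** -0.15   # learning-style decline (assumed)

patients_per_year = 10_000 * uptake                               # assumed eligible population
incremental_cost = np.sum(patients_per_year * (price + 200.0))    # 200 = assumed downstream cost
incremental_qalys = np.sum(patients_per_year * 0.02)              # assumed QALY gain per patient

print(f"ICER: {incremental_cost / incremental_qalys:,.0f} per QALY")
```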
Parker, Andrew; Parkin, Adam; Dagnall, Neil
2017-06-01
The present research investigated the effects of personal handedness and saccadic eye movements on the specificity of past autobiographical memory and episodic future thinking. Handedness and saccadic eye movements have been hypothesised to share a common functional basis in that both influence cognition through hemispheric interaction. The technique used to elicit autobiographical memory and episodic future thought involved a cued sentence completion procedure that allowed for the production of memories spanning the highly specific to the very general. Experiment 1 found that mixed-handed (vs. right handed) individuals generated more specific past autobiographical memories, but equivalent numbers of specific future predictions. Experiment 2 demonstrated that following 30s of bilateral (horizontal) saccades, more specific cognitions about both the past and future were generated. These findings extend previous research by showing that more distinct and episodic-like information pertaining to the self can be elicited by either mixed-handedness or eye movements. The results are discussed in relation to hemispheric interaction and top-down influences in the control of memory retrieval. Copyright © 2017 Elsevier Inc. All rights reserved.
IVHM for the 3rd Generation RLV Program: Technology Development
NASA Technical Reports Server (NTRS)
Kahle, Bill
2000-01-01
The objective behind the Integrated Vehicle Health Management (IVHM) project is to develop and integrate technologies which can provide a continuous, intelligent, and adaptive health state of a vehicle and use this information to improve safety and reduce the costs of operations. Technological areas discussed include: developing, validating, and transferring next generation IVHM technologies to near-term industry and government reusable launch systems; focusing NASA on the next generation of highly advanced sensor and software technologies; and validating the IVHM systems engineering design process for future programs.
Model learning for robot control: a survey.
Nguyen-Tuong, Duy; Peters, Jan
2011-11-01
Models are among the most essential tools in robotics; examples include kinematics and dynamics models of the robot's own body and of controllable external objects. It is widely believed that intelligent mammals also rely on internal models in order to generate their actions. However, while classical robotics relies on manually generated models that are based on human insights into physics, future autonomous, cognitive robots need to be able to automatically generate models that are based on information extracted from the data streams accessible to the robot. In this paper, we survey the progress in model learning with a strong focus on robot control on a kinematic as well as dynamical level. Here, a model describes essential information about the behavior of the environment and the influence of an agent on this environment. In the context of model-based learning control, we view the model from three different perspectives. First, we study the different possible model learning architectures for robotics. Second, we discuss what kinds of problems these architectures and the domain of robotics imply for the applicable learning methods. From this discussion, we deduce future directions for real-time learning algorithms. Third, we show where these scenarios have been used successfully in several case studies.
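As a toy illustration of model learning for control in the sense surveyed here, a forward dynamics model can be fit by regression on state-action-next-state data. The 1-D point-mass system and the ridge regressor below are hypothetical stand-ins, not any specific method from the survey.

```python
# Sketch: learn a forward dynamics model x_{t+1} = f(x_t, u_t) from data,
# here with ridge regression on a toy 1-D point mass (hypothetical example).
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(3)
dt, mass = 0.05, 1.0
pos, vel = 0.0, 0.0
X, Y = [], []
for _ in range(2000):
    force = rng.uniform(-1.0, 1.0)
    X.append([pos, vel, force])
    vel += (force / mass) * dt + rng.normal(0, 1e-3)   # noisy "true" dynamics
    pos += vel * dt
    Y.append([pos, vel])

model = Ridge(alpha=1e-3).fit(np.array(X), np.array(Y))
# Predict the next state for a hypothetical state-action pair.
print(model.predict([[0.1, 0.2, 0.5]]))
```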
M.A.E.G.U.S.: Measuring alternate energy generation via unity simulation
NASA Astrophysics Data System (ADS)
Nataraja, Kavin Muhilan
This paper presents the MAEGUS serious game and a study to determine its efficacy as a pedagogical tool. The MAEGUS serious game teaches sustainable energy concepts through gameplay simulating wind turbines and solar arrays. Players take the role of an energy manager for a city and use realistic data and information visualizations to learn the physical factors of wind and solar energy generation. The MAEGUS serious game study compares game assisted learning to a more traditional teaching method such as reading material in a crossover study, the results of which can inform future serious game development for educational purposes.
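One of the physical factors of wind energy generation such a simulation presumably exposes is the cubic dependence of turbine power on wind speed; a generic sketch of that relation is shown below, with assumed rotor and efficiency parameters that are unrelated to MAEGUS itself.

```python
# Sketch: idealized wind turbine power, P = 0.5 * rho * A * Cp * v^3, capped at
# rated power. Rotor size, Cp, and rated power are assumptions, not MAEGUS values.
import math

AIR_DENSITY = 1.225        # kg/m^3 at sea level
ROTOR_DIAMETER = 90.0      # m, assumed
POWER_COEFFICIENT = 0.40   # Cp, assumed (the Betz limit is ~0.593)
RATED_POWER_KW = 2000.0    # assumed

def turbine_power_kw(wind_speed_m_s):
    swept_area = math.pi * (ROTOR_DIAMETER / 2.0) ** 2
    power_w = 0.5 * AIR_DENSITY * swept_area * POWER_COEFFICIENT * wind_speed_m_s ** 3
    return min(power_w / 1000.0, RATED_POWER_KW)

for v in (4.0, 8.0, 12.0):
    print(f"{v:>4.1f} m/s -> {turbine_power_kw(v):7.1f} kW")
```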
Liu, Kui; Guo, Jun; Cai, Chunxiao; Zhang, Junxiang; Gao, Jiangrui
2016-11-15
Multipartite entanglement is used for quantum information applications, such as building multipartite quantum communications. Generally, generation of multipartite entanglement is based on a complex beam-splitter network. Here, based on the spatial freedom of light, we experimentally demonstrated spatial quadripartite continuous-variable entanglement among first-order Hermite-Gaussian modes using a single type II optical parametric oscillator operating below threshold with an HG02 45° pump beam. The entanglement can be made scalable to larger numbers of spatial modes by changing the spatial profile of the pump beam. In addition, spatial multipartite entanglement will be useful for future spatial multichannel quantum information applications.
NASA Astrophysics Data System (ADS)
Vallam, P.; Qin, X. S.
2017-07-01
Flooding risk is increasing in many parts of the world and may worsen under climate change conditions. The accuracy of predicting flooding risk relies on reasonable projection of meteorological data (especially rainfall) at the local scale. The current statistical downscaling approaches face the difficulty of projecting multi-site climate information for future conditions while conserving spatial information. This study presents a combined Long Ashton Research Station Weather Generator (LARS-WG) stochastic weather generator and multi-site rainfall simulator RainSim (CLWRS) approach to investigate flow regimes under future conditions in the Kootenay Watershed, Canada. To understand the uncertainty effect stemming from different scenarios, the climate output is fed into a hydrologic model. The results showed different variation trends of annual peak flows (in 2080-2099) based on different climate change scenarios and demonstrated that the hydrological impact would be driven by the interaction between snowmelt and peak flows. The proposed CLWRS approach is useful where there is a need for projection of potential climate change scenarios.
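LARS-WG and RainSim each have their own algorithms; a generic daily rainfall generator of the kind such tools build on, a first-order two-state Markov chain for wet/dry occurrence with gamma-distributed wet-day amounts, is sketched below with assumed parameters that do not reproduce either tool or the Kootenay study.

```python
# Sketch: a generic daily rainfall generator (first-order two-state Markov chain
# for occurrence, gamma-distributed wet-day amounts). Parameters are assumed and
# do not reproduce LARS-WG or RainSim.
import numpy as np

P_WET_GIVEN_DRY = 0.25                 # assumed transition probabilities
P_WET_GIVEN_WET = 0.60
GAMMA_SHAPE, GAMMA_SCALE = 0.9, 9.0    # assumed wet-day rainfall distribution, mm

def generate_daily_rainfall(n_days, seed=0):
    rng = np.random.default_rng(seed)
    wet = False
    series = np.zeros(n_days)
    for day in range(n_days):
        p_wet = P_WET_GIVEN_WET if wet else P_WET_GIVEN_DRY
        wet = rng.random() < p_wet
        if wet:
            series[day] = rng.gamma(GAMMA_SHAPE, GAMMA_SCALE)
    return series

rain = generate_daily_rainfall(365)
print(f"wet days: {(rain > 0).sum()}, annual total: {rain.sum():.0f} mm")
```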
Environmental sciences information storage and retrieval system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engstrom, D.E.; White, M.G.; Dunaway, P.B.
Reynolds Electrical and Engineering Co., Inc. (REECo), has since 1970 accumulated information relating to the AEC's Nevada Applied Ecology Group (NAEG) programs at the Nevada Test Site (NTS). These programs, involving extensive soil, vegetation, and small-animal studies, have generated informational data concerning the collecting, processing, analyzing, and shipping of sample materials to various program participants and contractors. Future plans include incorporation of Lawrence Livermore Laboratory's resuspension study data, REECo's on-site air data, and EPA's large-animal, off-site air, and off-site soil data. (auth)
Urinary Sugars--A Biomarker of Total Sugars Intake.
Tasevska, Natasha
2015-07-15
Measurement error in self-reported sugars intake may explain the lack of consistency in the epidemiologic evidence on the association between sugars and disease risk. This review describes the development and applications of a biomarker of sugars intake, informs its future use and recommends directions for future research. Recently, 24 h urinary sucrose and fructose were suggested as a predictive biomarker for total sugars intake, based on findings from three highly controlled feeding studies conducted in the United Kingdom. From this work, a calibration equation for the biomarker that provides an unbiased measure of sugars intake was generated; it has since been used in two US-based studies with free-living individuals to assess measurement error in dietary self-reports and to develop regression calibration equations that could be used in future diet-disease analyses. Further applications of the biomarker include its use as a surrogate measure of intake in diet-disease association studies. Although this biomarker has great potential and exhibits favorable characteristics, available data come from a few controlled studies with limited sample sizes conducted in the UK. Larger feeding studies conducted in different populations are needed to further explore biomarker characteristics and the stability of its biases, compare its performance, and generate unique or population-specific biomarker calibration equations to be applied in future studies. A validated sugars biomarker is critical for informed interpretation of sugars-disease association studies.
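The calibration equation itself is not reproduced in the abstract; the sketch below shows the general form of a regression calibration in which biomarker-predicted intake is regressed on self-reported intake and a covariate, with simulated values standing in for real study data and coefficients that do not come from the feeding studies described.

```python
# Sketch: regression calibration of self-reported sugars intake against a
# biomarker-based intake measure. Data are simulated; coefficients do not come
# from the feeding studies described in the abstract.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 300
true_intake = rng.normal(100, 25, n)                    # g/day, hypothetical
self_report = 0.7 * true_intake + rng.normal(0, 20, n)  # attenuated, noisy self-report
biomarker_intake = true_intake + rng.normal(0, 10, n)   # biomarker-predicted intake
bmi = rng.normal(27, 4, n)                              # example covariate

# Calibration model: biomarker-based intake regressed on self-report and covariate.
X = sm.add_constant(np.column_stack([self_report, bmi]))
calibration = sm.OLS(biomarker_intake, X).fit()
print(calibration.params)  # intercept, self-report slope, BMI slope
```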
Urinary Sugars—A Biomarker of Total Sugars Intake
Tasevska, Natasha
2015-01-01
Measurement error in self-reported sugars intake may explain the lack of consistency in the epidemiologic evidence on the association between sugars and disease risk. This review describes the development and applications of a biomarker of sugars intake, informs its future use and recommends directions for future research. Recently, 24 h urinary sucrose and fructose were suggested as a predictive biomarker for total sugars intake, based on findings from three highly controlled feeding studies conducted in the United Kingdom. From this work, a calibration equation for the biomarker that provides an unbiased measure of sugars intake was generated; it has since been used in two US-based studies with free-living individuals to assess measurement error in dietary self-reports and to develop regression calibration equations that could be used in future diet-disease analyses. Further applications of the biomarker include its use as a surrogate measure of intake in diet-disease association studies. Although this biomarker has great potential and exhibits favorable characteristics, available data come from a few controlled studies with limited sample sizes conducted in the UK. Larger feeding studies conducted in different populations are needed to further explore biomarker characteristics and the stability of its biases, compare its performance, and generate unique or population-specific biomarker calibration equations to be applied in future studies. A validated sugars biomarker is critical for informed interpretation of sugars-disease association studies. PMID:26184307
2014-09-01
Informal mentoring appears to be the status quo in homeland security agencies for leadership development. However, informal mentoring is flawed due to… Formal mentoring programs can assist organizations with employee retention, succession planning, leadership development, closing generational gaps, and…
Methods, apparatus and system for notification of predictable memory failure
Cher, Chen-Yong; Andrade Costa, Carlos H.; Park, Yoonho; Rosenburg, Bryan S.; Ryu, Kyung D.
2017-01-03
A method for providing notification of a predictable memory failure includes the steps of: obtaining information regarding at least one condition associated with a memory; calculating a memory failure probability as a function of the obtained information; calculating a failure probability threshold; and generating a signal when the memory failure probability exceeds the failure probability threshold, the signal being indicative of a predicted future memory failure.
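The patent abstract describes the method only at the level of its steps; a simplified sketch of that flow, deriving a probability from observed conditions, comparing it to a threshold, and emitting a signal on exceedance, is given below. The probability model, threshold rule, and field names are invented for illustration and are not from the patent.

```python
# Sketch of the claimed flow: derive a failure probability from observed memory
# conditions, compare it to a threshold, and emit a notification when exceeded.
# The probability model and threshold rule are invented for illustration.
from dataclasses import dataclass

@dataclass
class MemoryConditions:
    correctable_errors_per_hour: float
    temperature_c: float
    age_hours: float

def failure_probability(c: MemoryConditions) -> float:
    # Invented heuristic: more correctable errors, heat, and age raise the risk.
    score = (0.02 * c.correctable_errors_per_hour
             + 0.01 * max(0.0, c.temperature_c - 70.0)
             + 1e-6 * c.age_hours)
    return min(1.0, score)

def failure_threshold(workload_criticality: float) -> float:
    # Invented rule: more critical workloads tolerate less predicted risk.
    return 0.5 * (1.0 - workload_criticality)

conditions = MemoryConditions(correctable_errors_per_hour=12.0, temperature_c=78.0, age_hours=30_000)
prob = failure_probability(conditions)
threshold = failure_threshold(workload_criticality=0.8)
if prob > threshold:
    print(f"predicted memory failure: p={prob:.2f} > threshold={threshold:.2f}")
```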
Ottsen, Christina Lundsgaard; Berntsen, Dorthe
2015-12-01
Mental time travel is the ability to remember past events and imagine future events. Here, 124 Middle Easterners and 128 Scandinavians generated important past and future events. These different societies present a unique opportunity to examine effects of culture. Findings indicate stronger influence of normative schemas and greater use of mental time travel to teach, inform and direct behaviour in the Middle East compared with Scandinavia. The Middle Easterners generated more events that corresponded to their cultural life script and that contained religious words, whereas the Scandinavians reported events with a more positive mood impact. Effects of gender were mainly found in the Middle East. Main effects of time orientation largely replicated recent findings showing that simulation of future and past events are not necessarily parallel processes. In accordance with the notion that future simulations rely on schema-based construction, important future events showed a higher overlap with life script events than past events in both cultures. In general, cross-cultural discrepancies were larger in future compared with past events. Notably, the high focus in the Middle East on sharing future events to give cultural guidance is consistent with the increased adherence to normative scripts found in this culture. Copyright © 2015 Elsevier Inc. All rights reserved.
Age Moderates the Relationship Between Generativity Concern and Understanding of Wealth.
Li, Tianyuan; Tsang, Vivian Hiu-Ling
2016-01-01
Wealth can be considered a resource to promote either public welfare (i.e., through an altruistic understanding) or personal well-being (i.e., through an egoistic understanding). How people understand wealth can influence the distribution of valuable material resources within a society. The current study examined how generativity concern, the concern for the next generation and for social welfare in the future, influenced people's understanding of wealth and whether age moderated the relationship. A total of 133 participants ranging from 18 to 78 years old were interviewed with four open-ended questions regarding their understanding of wealth. Their generativity concern and demographic information were also recorded. Findings showed that generativity concern was related to a less egoistic and more altruistic understanding of wealth. Moreover, the effect of generativity concern was especially salient for younger adults, but not significant for older adults. The results suggest that generativity concern is a construct that applies to both young and older adults. It can even be more influential to young adults' cognitive conceptualization in certain aspects (e.g., understanding of wealth) than to that of older adults. Future studies can further investigate the general impact of generativity concern as well as the behavioral consequences of people's understanding of wealth. The results were also discussed in the context of lifelong learning.
The Future of the Fighter Pilot ... Will There Be a 6th Generation Fighter?
2012-02-02
Science Teachers' Perspectives about Climate Change
ERIC Educational Resources Information Center
Dawson, Vaille
2012-01-01
Climate change and its effects are likely to present challenging problems for future generations of young people. It is important for Australian students to understand the mechanisms and consequences of climate change. If students are to develop a sophisticated understanding, then science teachers need to be well-informed about climate change…
ERIC Educational Resources Information Center
Taylor, Alison
In 2000, a government-supported foundation called Careers the Next Generation (CNG) in Alberta, Canada, began coordinating summer internships for high school students in information and computer technology (ICT). The participating firms represented a mix of large and small private and public organizations in high-tech and other industries in the…
Climate Change Ignorance: An Unacceptable Legacy
ERIC Educational Resources Information Center
Boon, Helen J.
2015-01-01
Climate change effects will be most acutely felt by future generations. Recent prior research has shown that school students' knowledge of climate change science is very limited in rural Australia. The purpose of this study was to assess the capacity of preservice teachers and parents to transmit climate change information and understanding to…
Battlefield of the Future: How to Achieve Superiority in the Cyberspace Domain
2016-02-01
Spamming is sending unsolicited email advertising for services, products and websites, used as a delivery mechanism for malware and other cyber threats. Spoofing is generating a fake website to impersonate a real website run by a different party. Email spoofing is altering…
Race, Elizabeth; Keane, Margaret M.; Verfaellie, Mieke
2015-01-01
The medial temporal lobe (MTL) makes critical contributions to episodic memory, but its contributions to episodic future thinking remain a matter of debate. By one view, imagining future events relies on MTL mechanisms that also support memory for past events. Alternatively, it has recently been suggested that future thinking is independent of MTL-mediated processes and can be supported by regions outside the MTL. The current study investigated the nature and necessity of MTL involvement in imagining the future and tested the novel hypothesis that the MTL contributes to future thinking by supporting online binding processes related to narrative construction. Human amnesic patients with well-characterized MTL damage and healthy controls constructed narratives about (a) future events, (b) past events, and (c) visually-presented pictures. While all three tasks place similar demands on narrative construction, only the past and future conditions require memory/future thinking to mentally generate relevant narrative information. Patients produced impoverished descriptions of both past and future events but were unimpaired at producing detailed picture narratives. In addition, future-thinking performance positively correlated with episodic memory performance but did not correlate with picture narrative performance. Finally, future-thinking impairments were present when MTL lesions were restricted to the hippocampus and did not depend on the presence of neural damage outside the MTL. These results indicate that the ability to generate and maintain a detailed narrative is preserved in amnesia and suggest that a common MTL mechanism supports both episodic memory and episodic future thinking. PMID:21753003
Race, Elizabeth; Keane, Margaret M; Verfaellie, Mieke
2011-07-13
The medial temporal lobe (MTL) makes critical contributions to episodic memory, but its contributions to episodic future thinking remain a matter of debate. By one view, imagining future events relies on MTL mechanisms that also support memory for past events. Alternatively, it has recently been suggested that future thinking is independent of MTL-mediated processes and can be supported by regions outside the MTL. The current study investigated the nature and necessity of MTL involvement in imagining the future and tested the novel hypothesis that the MTL contributes to future thinking by supporting online binding processes related to narrative construction. Human amnesic patients with well characterized MTL damage and healthy controls constructed narratives about (1) future events, (2) past events, and (3) visually presented pictures. While all three tasks place similar demands on narrative construction, only the past and future conditions require memory/future thinking to mentally generate relevant narrative information. Patients produced impoverished descriptions of both past and future events but were unimpaired at producing detailed picture narratives. In addition, future-thinking performance positively correlated with episodic memory performance but did not correlate with picture narrative performance. Finally, future-thinking impairments were present when MTL lesions were restricted to the hippocampus and did not depend on the presence of neural damage outside the MTL. These results indicate that the ability to generate and maintain a detailed narrative is preserved in amnesia and suggest that a common MTL mechanism supports both episodic memory and episodic future thinking.
How a future energy world could look?
NASA Astrophysics Data System (ADS)
Ewert, M.
2012-10-01
The future energy system will change significantly within the next years as a result of the following megatrends: decarbonization, urbanization, fast technology development, individualization, glocalization (globalization and localization), and changing demographics. Increasing fluctuating renewable production will change the role of non-renewable generation. Distributed energy from renewables and micro-generation will change the direction of the energy flow in the electricity grids. Production will not follow demand; instead, demand will have to follow production. This future system is enabled by the fast technical development of information and communication technologies, which will be present throughout the entire system. In this paper the results of a comprehensive analysis of different scenarios are summarized. The tools used include the analysis of policy trends in the European countries, modelling of the European power grid, modelling of the European power markets, and the analysis of technology developments with cost reduction potentials. With these tools the interactions of the main actors in the energy markets were considered: conventional generation and renewable generation, grid transport, electricity storage (including new storage options from e-mobility, power-to-gas and compressed air energy storage), and demand-side management. The potential application of technologies and investments in new energy technologies were analyzed within existing frameworks and markets as well as through new business models in new markets with different frameworks. The paper presents the overall trend of this analysis by describing a potential future energy world. This world represents only one of numerous options with comparable characteristics.
Brown, Eric W.; Detter, Chris; Gerner-Smidt, Peter; Gilmour, Matthew W.; Harmsen, Dag; Hendriksen, Rene S.; Hewson, Roger; Heymann, David L.; Johansson, Karin; Ijaz, Kashef; Keim, Paul S.; Koopmans, Marion; Kroneman, Annelies; Wong, Danilo Lo Fo; Lund, Ole; Palm, Daniel; Sawanpanyalert, Pathom; Sobel, Jeremy; Schlundt, Jørgen
2012-01-01
The rapid advancement of genome technologies holds great promise for improving the quality and speed of clinical and public health laboratory investigations and for decreasing their cost. The latest generation of genome DNA sequencers can provide highly detailed and robust information on disease-causing microbes, and in the near future these technologies will be suitable for routine use in national, regional, and global public health laboratories. With additional improvements in instrumentation, these next- or third-generation sequencers are likely to replace conventional culture-based and molecular typing methods to provide point-of-care clinical diagnosis and other essential information for quicker and better treatment of patients. Provided there is free-sharing of information by all clinical and public health laboratories, these genomic tools could spawn a global system of linked databases of pathogen genomes that would ensure more efficient detection, prevention, and control of endemic, emerging, and other infectious disease outbreaks worldwide. PMID:23092707
Electricity Generation Baseline Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Logan, Jeffrey; Marcy, Cara; McCall, James
This report was developed by a team of national laboratory analysts over the period October 2015 to May 2016 and is part of a series of studies that provide background material to inform development of the second installment of the Quadrennial Energy Review (QER 1.2). The report focuses specifically on U.S. power sector generation. The report limits itself to the generation sector and does not address in detail parallel issues in electricity end use, transmission and distribution, markets and policy design, and other important segments. The report lists 15 key findings about energy system needs of the future.
Towards a Preservation Content Standard for Earth Observation Data
NASA Technical Reports Server (NTRS)
Ramapriyan, Hampapuram; Lowe, Dawn; Murphy, Kevin
2017-01-01
Information from Earth observing missions (remote sensing with airborne and spaceborne instruments, and in situ measurements such as those from field campaigns) is proliferating in the world. Many agencies across the globe are generating important datasets by collecting measurements from instruments on board aircraft and spacecraft, globally and constantly. The data resulting from such measurements are a valuable resource that needs to be preserved for the benefit of future generations. These observations are the primary record of the Earth's environment and therefore are the key to understanding how conditions in the future will compare to conditions today. Earth science observational data, derived products, and models are used to answer key questions of global significance. In the near term, as long as a mission's data are being used actively for scientific research, it continues to be important to provide easy access to the data and services commensurate with current information technology. For the longer term, when the focus of the research community shifts toward new missions and observations, it is essential to preserve the previous mission data and associated information. This will enable a new user in the future to understand how the data were used for deriving information, knowledge, and policy recommendations; to repeat the experiment to ascertain the validity and possible limitations of conclusions reached in the past; and to provide confidence in long-term trends that depend on data from multiple missions. Organizations that collect, process, and utilize Earth observation data today have a responsibility to ensure that the data and associated content continue to be preserved by them or are gathered and handed off to other organizations for preservation for the benefit of future generations. In order to ensure preservation of the complete content necessary for understanding and reusing the data and derived digital products from today's missions, it is necessary to develop a specification of such preservation content. While there are existing standards that address archival and preservation in general, there are no international standards or specifications today that address what content should be preserved. The purpose of this paper is to outline briefly the existing standards that apply to preservation, describe a recent effort toward an international standard specifying preservation content for Earth observation data and derived digital data products, and outline the remaining work needed to arrive at a standard.
Emerging issues and future directions of the field of health communication.
Hannawa, Annegret F; Kreps, Gary L; Paek, Hye-Jin; Schulz, Peter J; Smith, Sandi; Street, Richard L
2014-01-01
The interdisciplinary intersections between communication science and health-related fields are pervasive, with numerous differences in regard to epistemology, career planning, funding perspectives, educational goals, and cultural orientations. This article identifies and elaborates on these challenges with illustrative examples. Furthermore, concrete suggestions for future scholarship are recommended to facilitate compatible, coherent, and interdisciplinary health communication inquiry. The authors hope that this article helps current and future generations of health communication scholars to make more informed decisions when facing some of the challenges discussed in this article so that they will be able to seize the interdisciplinary and international potential of this unique and important field of study.
Apuzzo, M L; Liu, C Y
2001-10-01
THIS ARTICLE DISCUSSES elements in the definition of modernity and emerging futurism in neurological surgery. In particular, it describes evolution, discovery, and paradigm shifts in the field and forces responsible for their realization. It analyzes the cyclical reinvention of the discipline experienced during the past generation and attempts to identify apertures to the near and more remote future. Subsequently, it focuses on forces and discovery in computational science, imaging, molecular science, biomedical engineering, and information processing as they relate to the theme of minimalism that is evident in the field. These areas are explained in the light of future possibilities offered by the emerging field of nanotechnology with molecular engineering.
Airframe Noise Studies: Review and Future Direction
NASA Technical Reports Server (NTRS)
Rackl, Robert G.; Miller, Gregory; Guo, Yueping; Yamamoto, Kingo
2005-01-01
This report contains the following information: 1) a review of airframe noise research performed under NASA's Advanced Subsonic Transport (AST) program up to the year 2000, 2) a comparison of the year 1992 airframe noise predictions with those using a year 2000 baseline, 3) an assessment of various airframe noise reduction concepts as applied to the year 2000 baseline predictions, and 4) prioritized recommendations for future airframe noise reduction work. NASA's Aircraft Noise Prediction Program was the software used for all noise predictions and assessments. For future work, the recommendations for the immediate future focus on the development of design tools sensitive to airframe noise treatment effects and on improving the basic understanding of noise generation by the landing gear as well as on its reduction.
Statistical downscaling and future scenario generation of temperatures for Pakistan Region
NASA Astrophysics Data System (ADS)
Kazmi, Dildar Hussain; Li, Jianping; Rasul, Ghulam; Tong, Jiang; Ali, Gohar; Cheema, Sohail Babar; Liu, Luliu; Gemmer, Marco; Fischer, Thomas
2015-04-01
Impact studies require climate change information at finer spatial scales than those presently provided by global or regional climate models. This is especially true for regions like South Asia, with complex topography, coastal or island locations, and areas of highly heterogeneous land cover. To deal with this situation, an inexpensive method, statistical downscaling, was adopted. The Statistical DownScaling Model (SDSM) was employed to downscale daily minimum and maximum temperature data of 44 national stations for the base period (1961-1990), and future scenarios were then generated up to 2099. Observed data and predictor data (a product of the National Oceanic and Atmospheric Administration) were calibrated and tested individually and in combination through linear regression. Future scenarios were generated from HadCM3 daily data for the A2 and B2 storylines. The downscaled data were tested and showed a relatively strong relationship with observations in comparison to ECHAM5 data. Generally, the southern half of the country is considered vulnerable in terms of increasing temperatures, but the results of this study project that, in the future, the northern belt in particular faces a possible threat of an increasing tendency in air temperature. In particular, the northern areas, which host the third-largest ice reserves after the polar regions and are an important feeding source for the Indus River, are projected to be vulnerable to increasing temperatures. Consequently, not only the hydro-agricultural sector but also the environmental conditions in the area may be at risk in the future.
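To make the calibration step concrete, the following is a minimal sketch of regression-based statistical downscaling under stated assumptions: hypothetical arrays stand in for station observations, reanalysis predictors and GCM scenario predictors, and the stochastic residual term is simply resampled from the baseline fit. SDSM's actual predictor screening, monthly calibration and variance-inflation steps are omitted.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical placeholder data standing in for station observations, reanalysis
# predictors (baseline 1961-1990) and GCM scenario predictors (e.g., HadCM3 A2).
rng = np.random.default_rng(0)
n_days, n_predictors = 10957, 5
predictors_baseline = rng.normal(size=(n_days, n_predictors))
obs_tmax = 25 + predictors_baseline @ np.array([1.5, -0.8, 0.3, 0.0, 0.6]) \
           + rng.normal(0, 2, n_days)
predictors_future = rng.normal(loc=0.4, size=(n_days, n_predictors))  # shifted future climate

# Calibration: linear regression of observed daily maximum temperature on large-scale predictors.
model = LinearRegression().fit(predictors_baseline, obs_tmax)
print("calibration R^2:", round(model.score(predictors_baseline, obs_tmax), 3))

# Scenario generation: apply the fitted relationship to GCM predictors and add a stochastic
# residual term resampled from the baseline fit (a simplification of SDSM's stochastic component).
residual_sd = np.std(obs_tmax - model.predict(predictors_baseline))
tmax_future = model.predict(predictors_future) + rng.normal(0, residual_sd, n_days)
print("projected change in mean Tmax (degC):", round(tmax_future.mean() - obs_tmax.mean(), 2))
```

In practice each station and calendar month would be screened and calibrated separately before assembling the scenarios out to 2099.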
NASA Astrophysics Data System (ADS)
Goderniaux, Pascal; BrouyèRe, Serge; Blenkinsop, Stephen; Burton, Aidan; Fowler, Hayley J.; Orban, Philippe; Dassargues, Alain
2011-12-01
Several studies have highlighted the potential negative impact of climate change on groundwater reserves, but additional work is required to help water managers plan for future changes. In particular, existing studies provide projections for a stationary climate representative of the end of the century, although information is demanded for the near future. Such time-slice experiments fail to account for the transient nature of climatic changes over the century. Moreover, uncertainty linked to natural climate variability is not explicitly considered in previous studies. In this study we substantially improve upon the state-of-the-art by using a sophisticated transient weather generator in combination with an integrated surface-subsurface hydrological model (Geer basin, Belgium) developed with the finite element modeling software "HydroGeoSphere." This version of the weather generator enables the stochastic generation of large numbers of equiprobable climatic time series, representing transient climate change, and used to assess impacts in a probabilistic way. For the Geer basin, 30 equiprobable climate change scenarios from 2010 to 2085 have been generated for each of six different regional climate models (RCMs). Results show that although the 95% confidence intervals calculated around projected groundwater levels remain large, the climate change signal becomes stronger than that of natural climate variability by 2085. Additionally, the weather generator's ability to simulate transient climate change enabled the assessment of the likely time scale and associated uncertainty of a specific impact, providing managers with additional information when planning further investment. This methodology constitutes a real improvement in the field of groundwater projections under climate change conditions.
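A minimal sketch of the probabilistic impact assessment described above, under assumed placeholder numbers: each of six RCMs contributes 30 equiprobable transient climate series, each series drives one hydrological-model run, and the pooled groundwater-level projections are summarized with a 95% confidence band. The simulation function below is a stand-in for an integrated surface-subsurface model, not the Geer basin HydroGeoSphere model.

```python
import numpy as np

# Placeholder assumptions: 6 RCMs x 30 equiprobable transient climate series (2010-2085),
# each driving one hydrological-model run that returns annual mean groundwater head (m).
rng = np.random.default_rng(1)
n_rcms, n_scenarios = 6, 30
years = np.arange(2010, 2086)

def simulated_groundwater_level(rcm):
    """Stand-in for an integrated surface-subsurface model run forced by one climate series."""
    trend = -0.01 * (1 + 0.3 * rcm) * (years - years[0])   # RCM-dependent drying trend
    variability = rng.normal(0, 0.4, years.size)           # natural climate variability
    return 100.0 + trend + variability

levels = np.array([[simulated_groundwater_level(r) for _ in range(n_scenarios)]
                   for r in range(n_rcms)])                # shape (n_rcms, n_scenarios, n_years)

# Pool all equiprobable members and summarize the impact probabilistically.
pooled = levels.reshape(-1, years.size)
median = np.median(pooled, axis=0)
low, high = np.percentile(pooled, [2.5, 97.5], axis=0)     # 95% confidence band
print(f"2085 median change: {median[-1] - median[0]:+.2f} m "
      f"(95% interval {low[-1] - median[0]:+.2f} to {high[-1] - median[0]:+.2f} m)")
```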
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. House Committee on Science and Technology.
Hearings on the use of computer technology in the health care field are presented to provide information needed by Congress and the Food and Drug Administration to make future policies. Medical computing systems can make interpretations of data on the patient's health and can generate diagnostic recommendations to the physician. Included are…
Standards for Clinical Grade Genomic Databases.
Yohe, Sophia L; Carter, Alexis B; Pfeifer, John D; Crawford, James M; Cushman-Vokoun, Allison; Caughron, Samuel; Leonard, Debra G B
2015-11-01
Next-generation sequencing performed in a clinical environment must meet clinical standards, which requires reproducibility of all aspects of the testing. Clinical-grade genomic databases (CGGDs) are required to classify a variant and to assist in the professional interpretation of clinical next-generation sequencing. Applying quality laboratory standards to the reference databases used for sequence-variant interpretation presents a new challenge for validation and curation. To define CGGD and the categories of information contained in CGGDs and to frame recommendations for the structure and use of these databases in clinical patient care. Members of the College of American Pathologists Personalized Health Care Committee reviewed the literature and existing state of genomic databases and developed a framework for guiding CGGD development in the future. Clinical-grade genomic databases may provide different types of information. This work group defined 3 layers of information in CGGDs: clinical genomic variant repositories, genomic medical data repositories, and genomic medicine evidence databases. The layers are differentiated by the types of genomic and medical information contained and the utility in assisting with clinical interpretation of genomic variants. Clinical-grade genomic databases must meet specific standards regarding submission, curation, and retrieval of data, as well as the maintenance of privacy and security. These organizing principles for CGGDs should serve as a foundation for future development of specific standards that support the use of such databases for patient care.
Use of social media by residency program directors for resident selection.
Cain, Jeff; Scott, Doneka R; Smith, Kelly
2010-10-01
Pharmacy residency program directors' attitudes and opinions regarding the use of social media in residency recruitment and selection were studied. A 24-item questionnaire was developed, pilot tested, revised, and sent to 996 residency program directors via SurveyMonkey.com. Demographic, social media usage, and opinions on social media data were collected and analyzed. A total of 454 residency program directors completed the study (response rate, 46.4%). The majority of respondents were women (58.8%), were members of Generation X (75.4%), and worked in a hospital or health system (80%). Most respondents (73%) rated themselves as either nonusers or novice users of social media. Twenty percent indicated that they had viewed a pharmacy residency applicant's social media information. More than half (52%) had encountered e-professionalism issues, including questionable photos and posts revealing unprofessional attitudes, and 89% strongly agreed or agreed that information voluntarily published online was fair game for judgments on character, attitudes, and professionalism. Only 4% of respondents had reviewed applicants' profiles for residency selection decisions. Of those respondents, 52% indicated that the content had no effect on resident selection. Over half of residency program directors were unsure whether they will use social media information for future residency selection decisions. Residency program directors from different generations had different views regarding social media information and its use in residency applicant selections. Residency program directors anticipated using social media information to aid in future decisions for resident selection and hiring.
ERIC Educational Resources Information Center
Westover, Jennifer M.
2010-01-01
Literacy skills are fundamental for all learners. For students who require augmentative and alternative communication (AAC), strong literacy skills provide a gateway to generative communication, genuine social networking, improved access to academic opportunities, access to information technology and future employment opportunities. However, many…
Conservation in a World of Six Billion: A Grassroots Action Guide.
ERIC Educational Resources Information Center
Hren, Benedict J.
This grassroots action guide features a conservation initiative working to bring the impacts of human population growth, economic development, and natural resource consumption into balance with the limits of nature for the benefit of current and future generations. Contents include information sheets entitled "Six Billion People and Growing," "The…
Profits or people? The informative case of alcohol marketing.
Casswell, Sally
2014-11-28
To analyse influence on alcohol marketing policy in New Zealand. Document and literature review. There is a powerful argument and popular support for restricting alcohol marketing but no significant policy action taken. Greater priority has been placed on the profits of influential corporations compared with protecting the health of future generations of New Zealanders.
Scientific challenges in shrubland ecosystems
William T. Sommers
2001-01-01
A primary goal in land management is to sustain the health, diversity, and productivity of the country's rangelands and shrublands for future generations. Part of this type of sustainable management is to assure the availability and appropriate use of scientific information for decisionmaking. Some of the most challenging scientific problems of shrubland ecosystem management are...
NASA Technical Reports Server (NTRS)
1986-01-01
The past, present, and future status of space technology in Berlin is discussed, including raw material processing, transportation, energy, and information generation and distribution. How Berlin can contribute toward further advancement in this field, individually or in collaboration with international partners is indicated.
The Sustainable University: Green Goals and New Challenges for Higher Education Leaders
ERIC Educational Resources Information Center
Martin, James; Samels, James E.
2012-01-01
Colleges and universities are at the forefront of efforts to preserve the earth's resources for future generations. Carbon neutrality, renewable energy sources, green building strategies, and related initiatives require informed and courageous leaders at all levels of higher education. James Martin and James E. Samels have worked closely with…
Campus Demonstration Sites for Sustainable Systems and Design: Five "Creation" Stories.
ERIC Educational Resources Information Center
Jack, Kathy; Ihara, Dan, Ed.
This paper provides a summary of the development and management of five campus demonstration sites designed to create harmony with natural systems and meet current student needs without compromising the needs of future generations. Information for each campus includes an overview of the site, project origins, the proposal and design process, the…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-29
... Federal role in responding to the Nation's most urgent challenges, ranging from climate change, severe... changes in climate, weather, oceans, and coasts, share knowledge and information with others, and conserve... changing climate system and its impacts. Objective: Integrated assessments of current and future states of...
Toward New Data and Information Management Solutions for Data-Intensive Ecological Research
ERIC Educational Resources Information Center
Laney, Christine Marie
2013-01-01
Ecosystem health is deteriorating in many parts of the world due to direct and indirect anthropogenic pressures. Generating accurate, useful, and impactful models of past, current, and future states of ecosystem structure and function is a complex endeavor that often requires vast amounts of data from multiple sources and knowledge from…
Smoking behavior of Mexicans: patterns by birth-cohort, gender, and education.
Christopoulou, Rebekka; Lillard, Dean R; Balmori de la Miyar, Josè R
2013-06-01
Little is known about historical smoking patterns in Mexico. Policy makers must rely on imprecise predictions of human or fiscal burdens from smoking-related diseases. In this paper we document intergenerational patterns of smoking, project them for future cohorts, and discuss those patterns in the context of Mexico's impressive economic growth. We use retrospectively collected information to generate life-course smoking prevalence rates of five birth-cohorts, by gender and education. With dynamic panel data methods, we regress smoking rates on indicators of economic development. Smoking is most prevalent among men and the highly educated. Smoking rates peaked in the 1980s and have since decreased, slowly on average, and fastest among the highly educated. Development significantly contributed to this decline; a 1 % increase in development is associated with an average decline in smoking prevalence of 0.02 and 0.07 percentage points for women and men, respectively. Mexico's development may have triggered forces that decrease smoking, such as the spread of health information. Although smoking rates are falling, projections suggest that they will be persistently high for several future generations.
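As an illustration of the modelling idea (not the authors' exact dynamic panel estimator), the sketch below builds a hypothetical cohort-by-gender panel of smoking prevalence, adds a lagged dependent variable for persistence, and regresses prevalence on a development indicator with cohort and gender fixed effects; all names and numbers are simulated placeholders.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)

# Hypothetical panel: smoking prevalence (%) by birth cohort and gender over calendar years,
# with a development indicator (e.g., log GDP per capita). All values are simulated placeholders.
rows = []
for cohort in range(5):
    for gender in ("men", "women"):
        prev = 40.0 if gender == "men" else 15.0
        base = 5.0 if gender == "men" else 2.0
        for year in range(1970, 2011):
            develop = 8.0 + 0.02 * (year - 1970) + 0.1 * rng.normal()
            prev = base + 0.85 * prev - 1.5 * (develop - 8.0) + rng.normal(0, 1.0)
            rows.append(dict(cohort=cohort, gender=gender, year=year,
                             prevalence=prev, development=develop))
panel = pd.DataFrame(rows).sort_values(["cohort", "gender", "year"])
panel["prevalence_lag"] = panel.groupby(["cohort", "gender"])["prevalence"].shift(1)
panel = panel.dropna()

# Simplified dynamic specification: lagged prevalence (persistence) plus the development
# indicator, with cohort and gender fixed effects.
model = smf.ols("prevalence ~ prevalence_lag + development + C(cohort) + C(gender)",
                data=panel).fit()
print(model.params[["prevalence_lag", "development"]])
```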
Analysis of the Transport and Fate of Metals Released From ...
This project’s objectives were to provide analysis of water quality following the release of acid mine drainage in the Animas and San Juan Rivers in a timely manner to 1) generate a comprehensive picture of the plume at the river system level, 2) help inform future monitoring efforts and 3) to predict potential secondary effects that could occur from materials that may remain stored within the system. The project focuses on assessing metals contamination during the plume and in the first month following the event. This project’s objectives were to provide analysis of water quality following the release of acid mine drainage from the Gold King Mine in the Animas and San Juan Rivers in a timely manner to 1) generate a comprehensive picture of the plume at the river system level, 2) help inform future monitoring efforts and 3) to predict potential secondary effects that could occur from materials that may remain stored within the system. The project focuses on assessing metals contamination during the plume and in the first month following the event.
The journal of undergraduate neuroscience education: history, challenges, and future developments.
Dunbar, Gary L; Lom, Barbara; Grisham, William; Ramirez, Julio J
2009-01-01
The 'JUNE and You' sessions presented at the July 2008 Undergraduate Neuroscience Education workshop, sponsored jointly by Faculty for Undergraduate Neuroscience (FUN) and Project Kaleidoscope (PKAL), featured background information about the history and mission of the Journal of Undergraduate Neuroscience Education (JUNE), followed by an informative discussion about the challenges facing JUNE, including new ideas for future developments. This article will highlight some of the information and ideas generated and shared at this conference. Critical discussion points included the need to keep members of FUN actively engaged in submitting and reviewing articles for JUNE. Ways in which authors, reviewers, and interested faculty members could best help in promoting the mission and vision of JUNE were discussed. Concerns about recent hackings into the JUNE website were also raised, and possible solutions and measures that can be taken to minimize this in the future were discussed. In addition, ideas for expanding the role of JUNE to provide a forum to evaluate new and emerging website information that is pertinent to undergraduate neuroscience education were discussed. Ideas for future developments of JUNE included revolving postings of articles as they are accepted, providing links to several related websites, and allowing updates for articles that have been previously published in JUNE. Finally, ideas for maintaining and expanding JUNE's stature as the resource for undergraduate neuroscience education included ensuring that JUNE is listed on important search vehicles, such as PubMed.
Van Meijgaard, Jeroen; Fielding, Jonathan E; Kominski, Gerald F
2009-01-01
A comprehensive population health-forecasting model has the potential to interject new and valuable information about the future health status of the population based on current conditions, socioeconomic and demographic trends, and potential changes in policies and programs. Our Health Forecasting Model uses a continuous-time microsimulation framework to simulate individuals' lifetime histories by using birth, risk exposures, disease incidence, and death rates to mark changes in the state of the individual. The model generates a reference forecast of future health in California, including details on physical activity, obesity, coronary heart disease, all-cause mortality, and medical expenditures. We use the model to answer specific research questions, inform debate on important policy issues in public health, support community advocacy, and provide analysis on the long-term impact of proposed changes in policies and programs, thus informing stakeholders at all levels and supporting decisions that can improve the health of populations.
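The continuous-time microsimulation framework can be sketched briefly. The hazard functions, their magnitudes and the obesity-to-coronary-heart-disease-to-mortality structure below are illustrative assumptions rather than the Health Forecasting Model's calibrated inputs, and hazards are treated as locally constant between events.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative annual hazard rates (per person-year); placeholders, not calibrated model inputs.
def chd_incidence_rate(age, obese):
    return 0.0005 * np.exp(0.07 * (age - 40)) * (1.6 if obese else 1.0)

def mortality_rate(age, has_chd):
    return 0.0002 * np.exp(0.09 * (age - 30)) * (2.0 if has_chd else 1.0)

def simulate_life(obese, start_age=20, max_age=110):
    """Simulate one lifetime history with competing exponential waiting times.
    Hazards are treated as constant between events (a simplification)."""
    age, has_chd, chd_age = start_age, False, None
    while age < max_age:
        rates = {"death": mortality_rate(age, has_chd)}
        if not has_chd:
            rates["chd"] = chd_incidence_rate(age, obese)
        total = sum(rates.values())
        age += rng.exponential(1.0 / total)                       # waiting time to next event
        event = rng.choice(list(rates), p=np.array(list(rates.values())) / total)
        if event == "death":
            return age, chd_age
        has_chd, chd_age = True, age                              # CHD onset; keep simulating
    return max_age, chd_age

deaths = [simulate_life(obese=True)[0] for _ in range(20000)]
print("mean age at death, simulated obese cohort: %.1f years" % np.mean(deaths))
```

Aggregating many such simulated lifetimes under alternative risk-exposure or policy assumptions is what yields population-level forecasts of disease burden and expenditures.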
Zhang, Jun; Tian, Gui Yun; Marindra, Adi M J; Sunny, Ali Imam; Zhao, Ao Bo
2017-01-29
In recent years, the antenna and sensor communities have witnessed a considerable integration of radio frequency identification (RFID) tag antennas and sensors because of the impetus provided by the internet of things (IoT) and cyber-physical systems (CPS). Such sensors can find potential applications in structural health monitoring (SHM) because of their passive, wireless, simple, compact, and multimodal nature, particularly in large-scale infrastructures during their lifecycle. The big data from these ubiquitous sensors are expected to generate a big impact on intelligent monitoring. A remarkable number of scientific papers demonstrate the possibility that objects can be remotely tracked and intelligently monitored for their physical/chemical/mechanical properties and environmental conditions. Most of the work focuses on antenna design, and significant information has been generated to demonstrate feasibility. Further information is needed to gain a deep understanding of passive RFID antenna sensor systems in order to make them reliable and practical. Nevertheless, this information is scattered across the literature. This paper comprehensively summarizes and clearly highlights the challenges and state-of-the-art methods of passive RFID antenna sensors and systems in terms of sensing and communication from a system point of view. Future trends are also discussed, and future research and development directions in the UK are suggested as well.
NASA Technical Reports Server (NTRS)
Schaffner, Philip R.; Harrah, Steven; Neece, Robert T.
2012-01-01
The air transportation system of the future will need to support much greater traffic densities than are currently possible, while preserving or improving upon current levels of safety. Concepts are under development to support a Next Generation Air Transportation System (NextGen) that by some estimates will need to support up to three times current capacity by the year 2025. Weather and other atmospheric phenomena, such as wake vortices and volcanic ash, constitute major constraints on airspace system capacity and can present hazards to aircraft if encountered. To support safe operations in the NextGen environment advanced systems for collection and dissemination of aviation weather and environmental information will be required. The envisioned NextGen Network Enabled Weather (NNEW) infrastructure will be a critical component of the aviation weather support services, providing access to a common weather picture for all system users. By taking advantage of Network Enabled Operations (NEO) capabilities, a virtual 4-D Weather Data Cube with aviation weather information from many sources will be developed. One new source of weather observations may be airborne forward-looking sensors, such as the X-band weather radar. Future sensor systems that are the subject of current research include advanced multi-frequency and polarimetric radar, a variety of Lidar technologies, and infrared imaging spectrometers.
Lessons Learned About Public Health from Online Crowd Surveillance
Merchant, Raina; Ungar, Lyle
2013-01-01
The Internet has forever changed the way people access information and make decisions about their healthcare needs. Patients now share information about their health at unprecedented rates on social networking sites such as Twitter and Facebook and on medical discussion boards. In addition to explicitly shared information about health conditions through posts, patients reveal data on their inner fears and desires about health when searching for health-related keywords on search engines. Data are also generated by the use of mobile phone applications that track users' health behaviors (e.g., eating and exercise habits) as well as give medical advice. The data generated through these applications are mined and repackaged by surveillance systems developed by academics, companies, and governments alike to provide insight to patients and healthcare providers for medical decisions. Until recently, most Internet research in public health has been surveillance focused or monitoring health behaviors. Only recently have researchers used and interacted with the crowd to ask questions and collect health-related data. In the future, we expect to move from this surveillance focus to the “ideal” of Internet-based patient-level interventions where healthcare providers help patients change their health behaviors. In this article, we highlight the results of our prior research on crowd surveillance and make suggestions for the future. PMID:25045598
LESSONS LEARNED ABOUT PUBLIC HEALTH FROM ONLINE CROWD SURVEILLANCE.
Hill, Shawndra; Merchant, Raina; Ungar, Lyle
2013-09-10
The Internet has forever changed the way people access information and make decisions about their healthcare needs. Patients now share information about their health at unprecedented rates on social networking sites such as Twitter and Facebook and on medical discussion boards. In addition to explicitly shared information about health conditions through posts, patients reveal data on their inner fears and desires about health when searching for health-related keywords on search engines. Data are also generated by the use of mobile phone applications that track users' health behaviors (e.g., eating and exercise habits) as well as give medical advice. The data generated through these applications are mined and repackaged by surveillance systems developed by academics, companies, and governments alike to provide insight to patients and healthcare providers for medical decisions. Until recently, most Internet research in public health has been surveillance focused or monitoring health behaviors. Only recently have researchers used and interacted with the crowd to ask questions and collect health-related data. In the future, we expect to move from this surveillance focus to the "ideal" of Internet-based patient-level interventions where healthcare providers help patients change their health behaviors. In this article, we highlight the results of our prior research on crowd surveillance and make suggestions for the future.
Satellite Networks: Architectures, Applications, and Technologies
NASA Technical Reports Server (NTRS)
Bhasin, Kul (Compiler)
1998-01-01
Since global satellite networks are moving to the forefront in enhancing the national and global information infrastructures due to communication satellites' unique networking characteristics, a workshop was organized to assess the progress made to date and chart the future. This workshop provided the forum to assess the current state-of-the-art, identify key issues, and highlight the emerging trends in the next-generation architectures, data protocol development, communication interoperability, and applications. Presentations on overview, state-of-the-art in research, development, deployment and applications and future trends on satellite networks are assembled.
Stochastic Modeling of Airlines' Scheduled Services Revenue
NASA Technical Reports Server (NTRS)
Hamed, M. M.
1999-01-01
Airlines' revenue generated from scheduled services accounts for the major share of total revenue. As such, predicting airlines' total scheduled services revenue is of great importance both to governments (in the case of national airlines) and to private airlines. This importance stems from the need to formulate future airline strategic management policies, determine government subsidy levels, and formulate governmental air transportation policies. The prediction of the airlines' total scheduled services revenue is dealt with in this paper. Four key components of airlines' scheduled services are considered: revenues generated from passengers, cargo, mail, and excess baggage. By addressing the revenue generated from each scheduled service separately, air transportation planners and designers are able to enhance their ability to formulate specific strategies for each component. Estimation results clearly indicate that the four stochastic processes (scheduled services components) are represented by different Box-Jenkins ARIMA models. The results demonstrate the appropriateness of the developed models and their ability to provide air transportation planners with future information vital to the planning and design processes.
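A minimal sketch of the component-wise Box-Jenkins approach, using simulated placeholder data rather than any airline's actual revenue series: one seasonal ARIMA model is identified and fitted per scheduled-service component, and each fitted model is then used to forecast that component's future revenue. Only the passenger component is shown; the order chosen here is illustrative, not the order identified in the paper.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(11)

# Hypothetical monthly passenger-revenue series (placeholder figures, millions of dollars).
months = pd.date_range("1990-01", periods=96, freq="MS")
trend = np.linspace(50, 80, months.size)                      # slow revenue growth
seasonal = 6 * np.sin(2 * np.pi * months.month / 12)          # summer travel peak
passenger_revenue = pd.Series(trend + seasonal + rng.normal(0, 2, months.size), index=months)

# One Box-Jenkins model per scheduled-service component; only the passenger component is shown.
fit = ARIMA(passenger_revenue, order=(1, 1, 1), seasonal_order=(0, 1, 1, 12)).fit()
forecast = fit.get_forecast(steps=24)
print(forecast.predicted_mean.tail(12).round(1))              # revenue outlook two years ahead
```

Cargo, mail and excess-baggage revenues would each get their own identified model before the component forecasts are aggregated into total scheduled-services revenue.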
Digamma diagnostics for the mixed-phase generation at NICA
NASA Astrophysics Data System (ADS)
Kukulin, V. I.; Platonova, M. N.
2017-03-01
A novel type of diagnostics for dense and/or hot nuclear matter produced in heavy-ion collisions at NICA and similar future colliders (FAIR, etc.) is suggested. The diagnostics is based on the assumption (confirmed in many experiments worldwide) of intensive generation of light scalar mesons (σ), the subsequent decay of which produces γγ pairs with a mass and width dependent upon the density and temperature of the fireball produced in the collision process. Thus, measurements of the absolute yield, mass and width of the γγ signal carry valuable information about the state of the fireball generated during the high-energy nuclear collision.
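For background, and as a standard relation not specific to this paper, the σ candidates would be reconstructed from the invariant mass of detected photon pairs; for massless photons with energies $E_1$, $E_2$ and opening angle $\theta_{12}$,

$$ M_{\gamma\gamma} = \sqrt{2 E_1 E_2 \left(1 - \cos\theta_{12}\right)}, $$

so the position and width of the reconstructed γγ peak, compared with the vacuum σ parameters, carry the density and temperature information about the fireball.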
Nonlinear metamaterials for holography
Almeida, Euclides; Bitton, Ora
2016-01-01
A hologram is an optical element storing phase and possibly amplitude information, enabling the reconstruction of a three-dimensional image of an object by illumination and scattering of a coherent beam of light; the image is generated at the same wavelength as the input laser beam. In recent years, it was shown that information can be stored in nanometric antennas, giving rise to ultrathin components. Here we demonstrate nonlinear multilayer metamaterial holograms. A background-free image is formed at a new frequency—the third harmonic of the illuminating beam. Using e-beam lithography of multilayer plasmonic nanoantennas, we fabricate polarization-sensitive nonlinear elements such as blazed gratings, lenses and other computer-generated holograms. These holograms are analysed and prospects for future device applications are discussed. PMID:27545581
NASA Astrophysics Data System (ADS)
Sabeur, Z. A.; Denis, H.; Nativi, S.
2012-04-01
The phenomenal advances in information and communication technologies over the last decade offer unprecedented connectivity, with real potential for "Smart living" among large segments of human populations around the world. In particular, Voluntary Groups (VGs) and individuals with an interest in monitoring the state of their local environment can be connected through the internet and collaboratively generate important localised environmental observations. These could be considered the Community Observatories (CO) of the Future Internet (FI). However, a set of FI enablers needs to be deployed for these communities to become effective COs in the Future Internet. For example, these communities will require access to services for the intelligent processing of heterogeneous data and the capture of advanced situation awareness about the environment. This enablement will unlock the communities' true potential for participating in localised monitoring of the environment, in addition to their contribution to the creation of business enterprise. Among the eight Usage Area (UA) projects of the FP7 FI-PPP programme, the ENVIROFI Integrated Project focuses on the specifications of the Future Internet enablers of the Environment UA. The specifications are developed across multiple environmental domains in the context of users' needs for the development of mash-up applications in the Future Internet. They will give users access to real-time, on-demand fused information with advanced situation awareness about the environment at localised scales. The mash-up applications will access rich spatio-temporal information from structured fusion services that aggregate CO information with data from existing environmental monitoring stations established by research organisations and private enterprise. These applications are being developed in ENVIROFI for the atmospheric, marine and biodiversity domains, with the potential to be extended to other domains and scenarios concerning smart and safe living in the Future Internet.
Evolution of an Intelligent Information Fusion System
NASA Technical Reports Server (NTRS)
Campbell, William J.; Cromp, Robert F.
1990-01-01
Consideration is given to the hardware and software needed to manage the enormous amount and complexity of data that the next generation of space-borne sensors will provide. An anthology is presented illustrating the evolution of artificial intelligence, science data processing, and management from the 1960s to the near future. Problems and limitations of technologies, data structures, data standards, and conceptual thinking are addressed. The development of an end-to-end Intelligent Information Fusion System that embodies knowledge of the user's domain-specific goals is proposed.
NASA Astrophysics Data System (ADS)
Guo, Rui; Zhou, Lan; Gu, Shi-Pu; Wang, Xing-Fu; Sheng, Yu-Bo
2017-03-01
The concatenated Greenberger-Horne-Zeilinger (C-GHZ) state is a new type of multipartite entangled state, which has potential applications in future quantum information. In this paper, we propose a protocol for constructing arbitrary C-GHZ entangled states approximately. Different from previous protocols, each logic qubit is encoded in a coherent state. This protocol is based on linear optics, which is feasible with current experimental technology. This protocol may be useful for quantum information processing based on the C-GHZ state.
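For reference, the C-GHZ state of $N$ logic qubits, each encoded in $m$ physical qubits, is conventionally written as

$$ |\Phi_N^{\pm}\rangle = \frac{1}{\sqrt{2}}\Big( |\mathrm{GHZ}_m^{+}\rangle^{\otimes N} \pm |\mathrm{GHZ}_m^{-}\rangle^{\otimes N} \Big), \qquad |\mathrm{GHZ}_m^{\pm}\rangle = \frac{1}{\sqrt{2}}\Big( |0\rangle^{\otimes m} \pm |1\rangle^{\otimes m} \Big); $$

in the coherent-state encoding described here, the physical basis states $|0\rangle$ and $|1\rangle$ would be replaced by quasi-orthogonal coherent states $|\alpha\rangle$ and $|-\alpha\rangle$ (our reading of the abstract, not a detail stated in it).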
Simulation of an ensemble of future climate time series with an hourly weather generator
NASA Astrophysics Data System (ADS)
Caporali, E.; Fatichi, S.; Ivanov, V. Y.; Kim, J.
2010-12-01
There is evidence that climate change is occurring in many regions of the world. The necessity of climate change predictions at the local scale and fine temporal resolution is thus warranted for hydrological, ecological, geomorphological, and agricultural applications that can provide thematic insights into the corresponding impacts. Numerous downscaling techniques have been proposed to bridge the gap between the spatial scales adopted in General Circulation Models (GCM) and regional analyses. Nevertheless, the time and spatial resolutions obtained as well as the type of meteorological variables may not be sufficient for detailed studies of climate change effects at the local scales. In this context, this study presents a stochastic downscaling technique that makes use of an hourly weather generator to simulate time series of predicted future climate. Using a Bayesian approach, the downscaling procedure derives distributions of factors of change for several climate statistics from a multi-model ensemble of GCMs. Factors of change are sampled from their distributions using a Monte Carlo technique to entirely account for the probabilistic information obtained with the Bayesian multi-model ensemble. Factors of change are subsequently applied to the statistics derived from observations to re-evaluate the parameters of the weather generator. The weather generator can reproduce a wide set of climate variables and statistics over a range of temporal scales, from extremes, to the low-frequency inter-annual variability. The final result of such a procedure is the generation of an ensemble of hourly time series of meteorological variables that can be considered as representative of future climate, as inferred from GCMs. The generated ensemble of scenarios also accounts for the uncertainty derived from multiple GCMs used in downscaling. Applications of the procedure in reproducing present and future climates are presented for different locations world-wide: Tucson (AZ), Detroit (MI), and Firenze (Italy). The stochastic downscaling is carried out with eight GCMs from the CMIP3 multi-model dataset (IPCC 4AR, A1B scenario).
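The factor-of-change resampling at the heart of the procedure can be sketched briefly. The distributions, observed statistics and month chosen below are placeholder assumptions, and the step that would re-parameterize and run the hourly weather generator is only indicated by a comment.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical factor-of-change distributions for one month (e.g., July), as would be derived
# from a Bayesian multi-model GCM ensemble; the numbers are placeholders, not results.
d_temp_mean = {"mu": 2.1, "sigma": 0.6}      # additive change in mean temperature (degC)
f_precip_mean = {"mu": 0.92, "sigma": 0.08}  # multiplicative change in mean precipitation

# Observed statistics used to parameterize the hourly weather generator for that month.
obs_stats = {"temp_mean": 26.4, "precip_mean_mm_day": 1.8}

def perturbed_statistics():
    """Draw one equiprobable set of future statistics (one Monte Carlo realization)."""
    dt = rng.normal(d_temp_mean["mu"], d_temp_mean["sigma"])
    fp = rng.normal(f_precip_mean["mu"], f_precip_mean["sigma"])
    return {"temp_mean": obs_stats["temp_mean"] + dt,
            "precip_mean_mm_day": obs_stats["precip_mean_mm_day"] * fp}

# Each sampled set would re-parameterize the weather generator, which then simulates one
# hourly time series; repeating the draw yields an ensemble of future climate scenarios.
ensemble = [perturbed_statistics() for _ in range(1000)]
temps = np.array([m["temp_mean"] for m in ensemble])
print("future July mean temperature: median %.1f degC, 5-95%% range %.1f to %.1f degC"
      % (np.median(temps), *np.percentile(temps, [5, 95])))
```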
Vutukuru, Satish; Carreras-Sospedra, Marc; Brouwer, Jacob; Dabdub, Donald
2011-12-01
Distributed power generation-electricity generation that is produced by many small stationary power generators distributed throughout an urban air basin-has the potential to supply a significant portion of electricity in future years. As a result, distributed generation may lead to increased pollutant emissions within an urban air basin, which could adversely affect air quality. However, the use of combined heating and power with distributed generation may reduce the energy consumption for space heating and air conditioning, resulting in a net decrease of pollutant and greenhouse gas emissions. This work used a systematic approach based on land-use geographical information system data to determine the spatial and temporal distribution of distributed generation emissions in the San Joaquin Valley Air Basin of California and simulated the potential air quality impacts using state-of-the-art three-dimensional computer models. The evaluation of the potential market penetration of distributed generation focuses on the year 2023. In general, the air quality impacts of distributed generation were found to be small due to the restrictive 2007 California Air Resources Board air emission standards applied to all distributed generation units and due to the use of combined heating and power. Results suggest that if distributed generation units were allowed to emit at the current Best Available Control Technology standards (which are less restrictive than the 2007 California Air Resources Board standards), air quality impacts of distributed generation could compromise compliance with the federal 8-hr average ozone standard in the region.
NASA Astrophysics Data System (ADS)
Voisin, N.; Macknick, J.; Fu, T.; O'Connell, M.; Zhou, T.; Brinkman, G.
2017-12-01
Water resources provide multiple critical services to the electrical grid through hydropower technologies, from generation to regulation of the electric grid (frequency, capacity reserve). Water resources can also represent vulnerabilities to the electric grid, as hydropower and thermo-electric facilities require water for operations. In the Western U.S., hydropower and thermo-electric plants that rely on fresh surface water represent 67% of the generating capacity. Prior studies have looked at the impact of change in water availability under future climate conditions on expected generating capacity in the Western U.S., but have not evaluated operational risks or changes resulting from climate. In this study, we systematically assess the impact of change in water availability and air temperatures on power operations, i.e. we take into account the different grid services that water resources can provide to the electric grid (generation, regulation) in the system-level context of inter-regional coordination through the electric transmission network. We leverage the Coupled Model Intercomparison Project Phase 5 (CMIP5) hydrology simulations under historical and future climate conditions, and force the large scale river routing- water management model MOSART-WM along with 2010-level sectoral water demands. Changes in monthly hydropower potential generation (including generation and reserves), as well as monthly generation capacity of thermo-electric plants are derived for each power plant in the Western U.S. electric grid. We then utilize the PLEXOS electricity production cost model to optimize power system dispatch and cost decisions for the 2010 infrastructure under 100 years of historical and future (2050 horizon) hydroclimate conditions. We use economic metrics as well as operational metrics such as generation portfolio, emissions, and reserve margins to assess the changes in power system operations between historical and future normal and extreme water availability conditions. We provide insight on how this information can be used to support resource adequacy and grid expansion studies over the Western U.S. in the context of inter-annual variability and climate change.
Embedded sensor systems for health - providing the tools in future healthcare.
Lindén, Maria; Björkman, Mats
2014-01-01
Wearable, embedded sensor systems for health applications are foreseen to be enablers of future healthcare. They will provide ubiquitous monitoring of multiple parameters without restricting the person to stay at home or in the hospital. By following trend changes in a person's health status, early deteriorations can be detected and treatment can start earlier. Health prevention will also be supported. Such future healthcare requires technology development, including miniaturized sensors, smart textiles and wireless communication. The tremendous amount of data generated by these systems calls for both signal processing and decision support to guarantee the quality of data and avoid an overflow of information. Safe and secure communications have to protect the integrity of the persons monitored.
NASA Technical Reports Server (NTRS)
1980-01-01
A survey instrument was developed and implemented in order to evaluate the current needs for natural resource information in Arizona and to determine which state agencies have information systems capable of coordinating, accessing and analyzing the data. Data and format requirements were determined for the following categories: air quality, animals, cultural resources, geology, land use, soils, water, vegetation, ownership, and social and economic aspects. Hardware and software capabilities were assessed and a data processing plan was developed. Possible future applications with the next generation LANDSAT were also identified.
An insight into cyanobacterial genomics--a perspective.
Lakshmi, Palaniswamy Thanga Velan
2007-05-20
At the turn of the millennium, cyanobacteria deserve to be reviewed in order to understand their past, present and future. The advent of post-genomic research, which encompasses functional genomics, structural genomics, transcriptomics, pharmacogenomics, proteomics and metabolomics, allows a systematic, genome-wide approach to studies of biological systems. Thus, by exploiting genomic and associated protein information through computational analyses, the fledgling information generated by biotechnological analyses can be extrapolated to fill the lacuna of scarce information on cyanobacteria; as an effort in this direction, this paper attempts to highlight the perspectives available and to encourage researchers to concentrate on the field of cyanobacterial informatics.
A New Look at NASA: Strategic Research In Information Technology
NASA Technical Reports Server (NTRS)
Alfano, David; Tu, Eugene (Technical Monitor)
2002-01-01
This viewgraph presentation provides information on research undertaken by NASA to facilitate the development of information technologies. Specific ideas covered here include: 1) Bio/nano technologies: biomolecular and nanoscale systems and tools for assembly and computing; 2) Evolvable hardware: autonomous self-improving, self-repairing hardware and software for survivable space systems in extreme environments; 3) High Confidence Software Technologies: formal methods, high-assurance software design, and program synthesis; 4) Intelligent Controls and Diagnostics: Next generation machine learning, adaptive control, and health management technologies; 5) Revolutionary computing: New computational models to increase capability and robustness to enable future NASA space missions.
Empowering Learners in the Reader/Writer Nature of the Digital Informational Space
ERIC Educational Resources Information Center
O'Byrne, W. Ian
2014-01-01
The Internet is the dominant text of this generation, and through intentional use it may provide opportunities for critical literacy-infused pedagogy. As we consider the online and offline literacy practices that our students will need as future events warrant, the one constant is change. This requires a continual re-defining, and…
NASA Technical Reports Server (NTRS)
Haggerty, James J.
1986-01-01
The major programs that generate new technology and therefore expand the bank of knowledge available for future transfer are outlined. The focal point of this volume is a representative sampling of spinoff products and processes that resulted from technology utilization, or secondary application. The various mechanisms NASA employs to stimulate technology utilization are described, and contact sources for further information are listed in an appendix.
Rio Grande/Rio Bravo Basin Coalition
Sarah Kotchian
1999-01-01
In June 1994, one hundred people gathered for the first Uniting the Basin Conference in El Paso to discuss the state of their basin and to explore ways to improve its sustainability for future generations. One of the recommendations of that conference was the formation of an international non-governmental coalition of groups throughout the Basin to share information...
The Security-Stability-Sustainability Nexus: Environmental Considerations
2009-05-01
Hierarchy Environment [Natural Resources and Ecosystem Services] Sources: Adapted from Maslow, 1943; Butts as reported... Sustainability – capacity to meet the needs of the present without compromising the needs of future generations (3) Sources: (1) Renner, Inventory
76 FR 62457 - Tennessee Valley Authority (Bellefonte Nuclear Plant, Unit 1)
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-07
... (Bellefonte Nuclear Plant, Unit 1) Order I. The Tennessee Valley Authority (TVA, or the applicant) is the... Nuclear Plant (BLN), Units 1 and 2, respectively. The CPs for CPPR-122 and CPPR-123 expire on October 1... option for future power generation at BLN Unit 1. In the letter dated April 25, 2011, TVA informed the...
ERIC Educational Resources Information Center
Brady, Laura Thompson; Fong, Lisa; Waninger, Kendra N.; Eidelman, Steven
2009-01-01
As leaders from the Baby Boomer generation prepare for retirement over the next decade, emerging leaders must be identified and supported in anticipation of a major organizational transition. "Authentic leadership" is a construct that informs the development of values-driven leaders who will bring organizations into the future, just as the…
W.E.B. Du Bois and the Women of Hull-House, 1895-1899.
ERIC Educational Resources Information Center
Deegan, Mary Jo
1988-01-01
Uses correspondence generated by the writing of "The Philadelphia Negro" to describe the collaborative relationship between W.E.B. DuBois and women sociologists. Suggests that this historical bond between Black men and White women in their search for a more egalitarian future has the potential to inform efforts toward greater equity now…
3 CFR 8662 - Proclamation 8662 of April 29, 2011. National Physical Fitness and Sports Month, 2011
Code of Federal Regulations, 2012 CFR
2012-01-01
... and costly diseases like heart disease, diabetes, and obesity. For more information on the President’s.... The health of our sons and daughters is key to our Nation’s future. Unfortunately, childhood obesity... solving the epidemic of childhood obesity within a generation by inspiring children to be physically...
Using Educational Design Research to Inform Teaching and Learning in the Health Professions
ERIC Educational Resources Information Center
Steketee, Carole; Bate, Frank
2013-01-01
Teaching has always been at the core of what it means to practice in the health professions. Health professionals generally accept that as part of their role they will be involved in educating future generations in their discipline. However, whilst health professional educators typically have extensive knowledge and skills in their discipline…
NASA Technical Reports Server (NTRS)
Kempler, Steve; Alcott, Gary; Lynnes, Chris; Leptoukh, Greg; Vollmer, Bruce; Berrick, Steve
2008-01-01
NASA Earth Sciences Division (ESD) has made great investments in the development and maintenance of data management systems and information technologies, to maximize the use of NASA generated Earth science data. With information management system infrastructure in place, mature and operational, very small delta costs are required to fully support data archival, processing, and data support services required by the recommended Decadal Study missions. This presentation describes the services and capabilities of the Goddard Space Flight Center (GSFC) Earth Sciences Data and Information Services Center (GES DISC) and the reusability for these future missions. The GES DISC has developed a series of modular, reusable data management components currently in use. They include data archive and distribution (Simple, Scalable, Script-based, Science [S4] Product Archive aka S4PA), data processing (S4 Processor for Measurements aka S4PM), data search (Mirador), data browse, visualization, and analysis (Giovanni), and data mining services. Information management system components are based on atmospheric scientist inputs. Large development and maintenance cost savings can be realized through their reuse in future missions.
Next-Generation Technologies for Multiomics Approaches Including Interactome Sequencing
Ohashi, Hiroyuki; Miyamoto-Sato, Etsuko
2015-01-01
The development of high-speed analytical techniques such as next-generation sequencing and microarrays allows high-throughput analysis of biological information at a low cost. These techniques contribute to medical and bioscience advancements and provide new avenues for scientific research. Here, we outline a variety of new innovative techniques and discuss their use in omics research (e.g., genomics, transcriptomics, metabolomics, proteomics, and interactomics). We also discuss the possible applications of these methods, including an interactome sequencing technology that we developed, in future medical and life science research. PMID:25649523
Sustainability Science Needs Sustainable Data!
NASA Astrophysics Data System (ADS)
Downs, R. R.; Chen, R. S.
2013-12-01
Sustainability science (SS) is an 'emerging field of research dealing with the interactions between natural and social systems, and with how those interactions affect the challenge of sustainability: meeting the needs of present and future generations while substantially reducing poverty and conserving the planet's life support systems' (Kates, 2011; Clark, 2007). Bettencourt & Kaur (2011) identified more than 20,000 scientific papers published on SS topics since the 1980s with more than 35,000 distinct authors. They estimated that the field is currently growing exponentially, with the number of authors doubling approximately every 8 years. These scholars are undoubtedly using and generating a vast quantity and variety of data and information for both SS research and applications. Unfortunately we know little about what data the SS community is actually using, and whether or not the data that SS scholars generate are being preserved for future use. Moreover, since much SS research is conducted by cross-disciplinary, multi-institutional teams, often scattered around the world, there could well be increased risks of data loss, reduced data quality, inadequate documentation, and poor long-term access and usability. Capabilities and processes therefore need to be established today to support continual, reliable, and efficient preservation of and access to SS data in the future, especially so that they can be reused in conjunction with future data and for new studies not conceived in the original data collection activities. Today's long-term data stewardship challenges include establishing sustainable data governance to facilitate continuing management, selecting data to ensure that limited resources are focused on high priority SS data holdings, securing sufficient rights to allow unforeseen uses, and preparing data to enable use by future communities whose specific research and information needs are not yet known. Adopting sustainable models for archival infrastructures will reduce dependencies on changing priorities and sponsorship that may not continue. Implementing community-based appraisal criteria and selection procedures for data will ensure that limited resources for long-term data management are applied efficiently to data likely to have the most enduring value. Encouraging producers to provide rights for open access to data will support their replication, reuse, integration, and application in a range of SS research and applications in both the near and long term. Identifying modest changes to current data preparation activities to meet preservation goals should reduce expensive post-hoc data and documentation rescue efforts. The NASA Socioeconomic Data and Applications Center (SEDAC), an active archive in the NASA Earth Observing System Data and Information System (EOSDIS), established the SEDAC Long-Term Archive (LTA) in collaboration with the Columbia University Libraries to preserve selected data and information resources for future access and use. A case study of the LTA shows how archives can be organized to foster sustainable data stewardship in a university environment. Lessons learned from the organization planning and the preparation, appraisal, and selection of data for the LTA are described along with enhancements that have been applied to data management by the active archive.
How valid are future generations' arguments for preserving wilderness?
Thomas A. More; James R. Averill; Thomas H. Stevens
2000-01-01
We are often urged to preserve wilderness for the sake of future generations. Future generations consist of potential persons who are mute stakeholders in the decisions of today. Many claims about the rights of future generations or our present obligations to them have been vigorously advanced and just as vigorously denied. Recent theorists, however, have argued for a...
Integrated Sustainable Planning for Industrial Region Using Geospatial Technology
NASA Astrophysics Data System (ADS)
Tiwari, Manish K.; Saxena, Aruna; Katare, Vivek
2012-07-01
Geospatial techniques and their scope of application have undergone an order-of-magnitude change since their advent, and they are now universally accepted as essential, modern tools for mapping and monitoring natural resources as well as amenities and infrastructure. The huge volume of spatial data generated from various Remote Sensing platforms needs proper management (storage, retrieval, manipulation and analysis to extract the desired information), which is beyond the capability of the human brain. This is where computer-aided GIS technology comes in. A GIS with major input from Remote Sensing satellites for natural resource management applications must be able to handle spatiotemporal data and support spatiotemporal queries and other spatial operations. Software and computer-based tools are designed to make tasks easier for the user and to improve the efficiency and quality of information processing. Natural resources are a common heritage that we have shared with past generations, and future generations will inherit these resources from us. Our appetite for resources and our tremendous technological capacity to exploit them at a much larger scale have created a situation where we have started withdrawing from future stocks. The Bhopal capital region attracted the attention of planners from the beginning of the five-year-plan strategy for industrial development. A number of projects were carried out in the individual Districts (Bhopal, Rajgarh, Shajapur, Raisen, Sehore) and gave fruitful results, but no serious effort has been made to involve the entire region, and the latest geospatial techniques (Remote Sensing, GIS, GPS) have not been used to prepare a well-structured computerized database, without which it is very difficult to retrieve, analyze and compare data for monitoring and for planning future development activities.
NASA Technical Reports Server (NTRS)
1999-01-01
The purpose of the Space 2000 Symposium is to present the creativity and achievements of key figures of the 20th century. It offers a retrospective discussion on space exploration. It considers the future of the enterprise, and the legacy that will be left for future generations. The symposium includes panel discussions, smaller session meetings with some panelists, exhibits, and displays. The first session, entitled "From Science Fiction to Science Facts," commences after a brief overview of the symposium. The panel discussions include talks on space exploration over many decades, and the missions of the millennium to search for life on Mars. The second session, "Risks and Rewards of Human Space Exploration," focuses on the training and health risks that astronauts face on their exploratory missions to space. Session three, "Messages and Messengers Informing and Inspiring Space Exploration and the Public," focuses on the use of the TV medium by educators and actors to inform and inspire a wide variety of audiences with adventures of space exploration. Session four, "The Legacy of Carl Sagan," discusses Sagan's influence on scientific research and the general public. In session five, "Space Exploration for a new Generation," two student speakers and the NASA Administrator Daniel S. Goldin address the group. Session six, "Destiny or Delusion? -- Humankind's Place in the Cosmos," ends the symposium with issues of space exploration and some thought-provoking questions. Some of these issues and questions are: what will be the societal implications if we discover the origin of the universe, stars, or life; what will be the impact if scientists find clear evidence of life outside the domains of the Earth; should there be limits to what humans can or should learn; and what visionary steps should space-faring people take now for future generations.
Future Applications of Remote Sensing to Archeological Research
NASA Technical Reports Server (NTRS)
Sever, Thomas L.
2003-01-01
Archeology was one of the first disciplines to use aerial photography in its investigations at the turn of the 20th century. However, the low resolution of the satellite technology that became available in the 1970s limited its application to regional studies. That has recently changed. The arrival of the high-resolution, multi-spectral capabilities of the IKONOS and QUICKBIRD satellites and the scheduled launch of new satellites in the next few years provide an unlimited horizon for future archeological research. In addition, affordable aerial and ground-based remote sensing instrumentation is providing archeologists with information that is not available through traditional methodologies. Although many archeologists are not yet comfortable with remote sensing technology, a new generation has embraced it and is accumulating a wealth of new evidence. They have discovered that through the use of remote sensing it is possible to gather information without disturbing the site and that those cultural resources can be monitored and protected for the future.
An Initial Strategy for Commercial Industry Awareness of the International Space Station
NASA Technical Reports Server (NTRS)
Jorgensen, Catherine A.
1999-01-01
While plans are being developed to utilize the ISS for scientific research and human and microgravity experiments, it is time to consider the future of the ISS as a worldwide commercial marketplace developed from a government owned, operated, and controlled facility. Commercial industry will be able to seize this opportunity to utilize the ISS as a unique manufacturing platform and engineering testbed for advanced technology. NASA has begun the strategic planning of the evolution and commercialization of the ISS. The Pre-Planned Program Improvement (P3I) Working Group at NASA is assessing future ISS needs and technology plans to enhance ISS performance. Some of these enhancements will allow the accommodation of commercial applications and support for the Human Exploration and Development of Space mission. As this information develops, it is essential to disseminate it to commercial industry, targeting not only the private and public space sector but also non-aerospace commercial industries. An approach is presented for early distribution of this information via the ISS Evolution Data Book, which includes ISS baseline system information, baseline utilization and operations plans, advanced technologies, future utilization opportunities, ISS evolution, and Design Reference Missions (DRM). This information source and tool can be used as a catalyst in the commercial world for the generation of ideas and options to enhance the current capabilities of the ISS.
Turbine Control of a Tidal and River Power Generator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muljadi, Eduard; Wright, Alan; Gevorgian, Vahan
As renewable generation has become less expensive during recent decades, and it becomes more accepted by the global population, the focus on renewable generation has expanded to include new types with promising future applications, such as river and tidal generation. The input variations to these types of resources are slower but also steadier than wind or solar generation. The level of water turbulent flow may vary from one place to another; however, the control algorithm can be adjusted to the local environment. This paper describes the hydrokinetic aspects of river and tidal generation based on a river and tidal generator. Although the information given in this paper is not that of an exact generator deployed on site, the data used is representative of a typical river or tidal generator. In this paper, the hydrokinetic and associated electrical controller of the system were not included; however, the focus of this paper is on the hydrodynamic control.
From what should we protect future generations: germ-line therapy or genetic screening?
Mallia, Pierre; ten Have, Henk
2003-01-01
This paper discusses the issue of whether we have responsibilities to future generations with respect to genetic screening, including for purposes of selective abortion or discard. Future generations have been discussed at length among scholars. The concept of 'Guardian for Future Generations' is tackled and its main criticisms discussed. Whilst germ-line cures, it is argued, can only affect family trees, genetic screening and testing can have wider implications. If asking how this may affect future generations is a legitimate question and since we indeed make retrospective moral judgements, it would be wise to consider that future generations will make the same retrospective judgements on us. Moreover such technologies affect present embryos to which we indeed can be considered to have an obligation.
The challenge of selecting tomorrow's police officers from Generations X and Y.
McCafferty, Francis L
2003-01-01
Demands on police officers in the past 30 years have grown dramatically with the increasing threats to social order and personal security. Selection of police officers has always been difficult, but now with the increasing demand and complexity of police work, along with the candidates applying from Generation X and even Generation Y, the selection process has become more critical. The personal characteristics attributed to Generation X--and in the future, to Generation Y--should be factored into the selection process to ensure that those individuals selected as police officers will be able to cope with what has been described as the impossible mandate of police work in a free society. Background information on the X and Y generations is imperative for psychiatrists working with police departments and other law enforcement agencies. This article will explore these areas and construct a paradigm selection process.
Information gathering, management and transferring for geospatial intelligence
NASA Astrophysics Data System (ADS)
Nunes, Paulo; Correia, Anacleto; Teodoro, M. Filomena
2017-07-01
Information is a key asset in modern organizational operations. The success of joint and combined operations with partner organizations depends on the accurate flow of information and knowledge concerning the theatre of operations: provision of resources, environmental evolution, market locations, and where and when an event occurred. As in the past, we cannot conceive of modern operations without maps and geo-spatial information (GI). Information and knowledge management is fundamental to the success of organizational decisions in an uncertain environment. Georeferenced information management is a knowledge management process: it begins with raw data and ends with the generation of knowledge. GI and intelligence systems allow us to integrate all other forms of intelligence and can serve as a main platform to process and display geo-spatial, time-referenced events. Combining explicit knowledge with people's know-how to generate a continuous learning cycle that supports real-time decisions mitigates the fog of everyday competition and provides knowledge supremacy. Extending the preliminary analysis done in [1], this work applies exploratory factor analysis to a questionnaire about GI and intelligence management in an organization, allowing the identification of future lines of action to improve information sharing and exploit the full potential of this important resource.
NASA Astrophysics Data System (ADS)
Lipiec, E.; Ruggiero, P.; Serafin, K.; Bolte, J.; Mills, A.; Corcoran, P.; Stevenson, J.; Lach, D.
2014-12-01
Local decision-makers often lack both the information and tools to reduce their community's overall vulnerability to current and future climate change impacts. Managers are restricted in their actions by the scale of the problem, inherent scientific uncertainty, limits of information exchange, and the global nature of available data, rendering place-based strategies difficult to generate. Several U.S. Pacific Northwest coastal communities are already experiencing chronic erosion and flooding, hazards only to be exacerbated by sea level rise and changing patterns of storminess associated with climate change. To address these issues, a knowledge to action network (KTAN) consisting of local Tillamook County stakeholders and Oregon State University researchers, was formed to project future flooding and erosion impacts and determine possible adaptation policies to reduce vulnerability. Via an iterative scenario planning process, the KTAN has developed four distinct adaptation policy scenarios, including 'Status Quo', 'Hold The Line', 'ReAlign', and 'Laissez-Faire'. These policy scenarios are being integrated with a range of climate change scenarios within the modeling framework Envision, a multi-agent GIS-based tool, which allows for the combination of physical processes data, probabilistic climate change information, coastal flood and erosion models, and stakeholder driven adaptation strategies into distinct plausible future scenarios. Because exact physical and social responses to climate change are impossible to ascertain, information about the differences between possible future scenarios can provide valuable information to decision-makers and the community at large. For example, the fewest projected coastal flood and erosion impacts to buildings occur under the 'ReAlign' policy scenario (i.e., adaptation strategies that move dwellings away from the coast) under both low and high climate change scenarios, especially in comparison to the 'Status Quo' or 'Hold The Line' scenarios. Statistical analysis of the scenario-based variations in impacts to private and public resources can help guide future adaptation policy implementation and support Oregon's coastal communities for years to come.
ABM Drag_Pass Report Generator
NASA Technical Reports Server (NTRS)
Fisher, Forest; Gladden, Roy; Khanampornpan, Teerapat
2008-01-01
dragREPORT software was developed in parallel with abmREPORT, which is described in the preceding article. Both programs were built on the capabilities created during that process. This tool generates a drag_pass report that summarizes vital information from the MRO aerobraking drag_pass build process to facilitate sequence reviews and to provide a high-level summarization of the sequence for mission management. The script extracts information from the ENV, SSF, FRF, SCMFmax, and OPTG files, presenting it in a single, easy-to-check report providing the majority of parameters needed for cross-checking and verification as part of the sequence review process. Prior to dragREPORT, all the needed information was spread across a number of different files, each in a different format. This software is a Perl script that extracts vital summarization information and build-process details from a number of source files into a single, concise report format used to aid the MPST sequence review process and to provide a high-level summarization of the sequence for mission management reference. This software could be adapted for future aerobraking missions to provide similar reports and review and summarization information.
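A minimal sketch of the same report-generation pattern follows, written in Python rather than Perl and assuming a hypothetical "KEY = value" file layout and invented key names; the real ENV/SSF/FRF/SCMFmax/OPTG formats are not reproduced here.

```python
# Sketch only: consolidate selected fields from several source files into one report.
import re
from pathlib import Path

# Hypothetical convention: each source file holds simple "KEY = value" lines.
KEYS_OF_INTEREST = {"SEQUENCE_ID", "START_TIME", "END_TIME", "PERIAPSIS_ALT_KM"}

def extract_fields(path: Path) -> dict:
    """Pull the keys of interest out of one source file."""
    fields = {}
    for line in path.read_text().splitlines():
        m = re.match(r"\s*(\w+)\s*=\s*(.+)", line)
        if m and m.group(1) in KEYS_OF_INTEREST:
            fields[m.group(1)] = m.group(2).strip()
    return fields

def build_report(source_files: list[Path]) -> str:
    """Merge the extracted fields into a single easy-to-check summary report."""
    lines = ["DRAG PASS SUMMARY"]
    for f in source_files:
        lines.append(f"-- {f.name}")
        for key, value in sorted(extract_fields(f).items()):
            lines.append(f"   {key}: {value}")
    return "\n".join(lines)
```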
The second generation intelligent user interface for the crustal dynamics data information system
NASA Technical Reports Server (NTRS)
Short, Nicholas, Jr.; Wattawa, Scott L.
1988-01-01
For the past decade, operations and research projects that support a major portion of NASA's overall mission have experienced a dramatic increase in the volume of generated data and resultant information that is unparalleled in the history of the agency. The effect of such an increase is that most of the science and engineering disciplines are undergoing an information glut, which has occurred, not only because of the amount, but also because of the type of data being collected. This information glut is growing exponentially and is expected to grow for the foreseeable future. Consequently, it is becoming physically and intellectually impossible to identify, access, modify, and analyze the most suitable information. Thus, the dilemma arises that the amount and complexity of information has exceeded and will continue to exceed, using present information systems, the ability of all the scientists and engineers to understand and take advantage of this information. As a result of this information problem, NASA has initiated the Intelligent Data Management (IDM) project to design and develop Advanced Information Management Systems (AIMS). The first effort of the Project was the prototyping of an Intelligent User Interface (IUI) to an operational scientific database using expert systems, natural language processing, and graphics technologies. An overview of the IUI formulation and development for the second phase is presented.
Future HIV Mentoring Programs to Enhance Diversity.
Stoff, David M; Cargill, Victoria A
2016-09-01
This paper proposes a general template to guide future mentoring program development addressing: (i) considerations to ensure an adequate research workforce; (ii) key guidelines and principles of mentoring; and (iii) use of a logic model to develop program milestones, outcomes and evaluation. We focus on these areas to guide and inform the most effective mentoring program components, which we find to be more helpful than identifying specific features and ingredients. Although the focus is on the development of a new generation of investigators from diverse backgrounds, this template may also apply to mentoring programs for other investigators and for disciplines beyond HIV.
[Generation Y : recruitment, retention and development].
Schmidt, C E; Möller, J; Schmidt, K; Gerbershagen, M U; Wappler, F; Limmroth, V; Padosch, S A; Bauer, M
2011-06-01
There is a significant shortage of highly qualified personnel in medicine, especially skilled doctors and nurses. This shortage of qualified labor has led to competition between hospitals. In analyzing the circumstances of this competition, nurses and doctors of the so-called generation Y are of particular importance. Recruitment and retention of these staff members will become a critical success factor for hospitals in the future. An internet search was conducted using the key words "generation Y and medicine, demography, personnel and hospitals". A search in MEDLINE/PubMed for scientific studies on the topic of labor shortage was performed using the key words "personnel, shortage doctors, generation X, baby boomer, personnel and demographic changes, staff". Finally, sources from public institutions and academic medical societies were analyzed. The data were sorted by main categories and relevance for hospitals. Statistical analysis was done using descriptive measures. The analysis confirmed the heterogeneous and complex flood of information on the topics of demography and generations. A comparison of the generations showed that they can be separated into baby boomers (born 1946-1964, "live to work"), generation X (born 1965-1980, "work to live") and generation Y (born 1981 and after, "live while working"). Members of generation Y ("live while working") are oriented toward competence rather than hierarchy. They exchange information using modern communication methods and within networks. The internet and computers are part of their daily routine. Employees of generation Y challenge hospital leadership with increasing demands. However, generation Y can significantly increase professionalization and competitiveness for hospitals.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borth, F.C. III; Thompson, J.W.; Mishaga, J.M.
1996-11-01
Through ComEd Fossil (Generating) Division's Competitive Action Plan (CAP) evaluation, changes have been identified which are necessary to improve generating station performance. These changes are intended to improve both station reliability and financial margins, and are essential for stations to be successful in a competitive marketplace. Plant upgrades, advanced equipment stewardship, and personnel reductions have been identified as necessary steps in achieving industry leadership and competitive advantage. To deal effectively with plant systems and contend in the competitive marketplace, Information Technology (IT) solutions to business problems are being developed. Data acquisition, storage, and retrieval are being automated through use of state-of-the-art Data Historians. Total plant, high resolution, long term process information will be accessed through Local/Wide Area Networks (LAN/WAN) connections from desktop PCs. Generating unit Thermal Performance Monitors accessing the Data Historian will analyze plant and system performance, enabling reductions in operating costs and improvements in process control. As inputs to proactive maintenance toolsets, this data allows anticipation of equipment service needs, advanced service scheduling, and cost/benefit analysis. The ultimate goal is to optimize repair needs with revenue generation. Advanced applications building upon these foundations will bring knowledge of the costs associated with all the products a generating station offers its customer(s). An overall design philosophy along with preliminary results is presented; these results include shortfalls, lessons learned, and future options.
This project’s objectives were to provide analysis of water quality following the release of acid mine drainage in the Animas and San Juan Rivers in a timely manner to 1) generate a comprehensive picture of the plume at the river system level, 2) help inform future monitoring eff...
Better Learning of Chinese Idioms through Storytelling: Current Trend of Multimedia Storytelling
ERIC Educational Resources Information Center
Li, Ee Hui; Hew, Soon Hin
2017-01-01
Storytelling plays a vital role in imparting a nation's traditions, cultural beliefs and history to future generations. It is frequently used for sharing or exchanging information, as it enables messages to be conveyed to the audience easily. Storytelling acts as a tool of human social interaction and is commonly used in education for…
Discussion of future cooperative actions and closing remarks
Patricia L. Pettit
1996-01-01
The knowledge shared and the energy generated by this symposium should not be lost as we leave for our homes and our jobs. We have a great wealth of experience, knowledge, and energy assembled. How can we continue to communicate with each other, share information, involve others, and influence decision makers? The steering committee for this symposium in hopes of...
2001-01-01
Management System (JTIMS) followed, and generated spirited discussion regarding the respective roles of JTIMS and the JLLP. The discussion concluded...waiting for the Director, Joint Staff's signature and should be in official distribution by January 2001. An update on the Joint Training Information
An Investigation into Spike-Based Neuromorphic Approaches for Artificial Olfactory Systems
Osseiran, Adam
2017-01-01
The implementation of neuromorphic methods has delivered promising results for vision and auditory sensors. These methods focus on mimicking the neuro-biological architecture to generate and process spike-based information with minimal power consumption. With increasing interest in developing low-power and robust chemical sensors, the application of neuromorphic engineering concepts for electronic noses has provided an impetus for research focusing on improving these instruments. While conventional e-noses apply computationally expensive and power-consuming data-processing strategies, neuromorphic olfactory sensors implement the biological olfaction principles found in humans and insects to simplify the handling of multivariate sensory data by generating and processing spike-based information. Over the last decade, research on neuromorphic olfaction has established the capability of these sensors to tackle problems that plague the current e-nose implementations such as drift, response time, portability, power consumption and size. This article brings together the key contributions in neuromorphic olfaction and identifies future research directions to develop near-real-time olfactory sensors that can be implemented for a range of applications such as biosecurity and environmental monitoring. Furthermore, we aim to expose the computational parallels between neuromorphic olfaction and gustation for future research focusing on the correlation of these senses. PMID:29125586
Display Provides Pilots with Real-Time Sonic-Boom Information
NASA Technical Reports Server (NTRS)
Haering, Ed; Plotkin, Ken
2013-01-01
Supersonic aircraft generate shock waves that move outward and extend to the ground. As a cone of pressurized air spreads across the landscape along the flight path, it creates a continuous sonic boom along the flight track. Several factors can influence sonic booms: weight, size, and shape of the aircraft; its altitude and flight path; and weather and atmospheric conditions. This technology allows pilots to control the impact of sonic booms. A software system displays the location and intensity of shock waves caused by supersonic aircraft. This technology can be integrated into cockpits or flight control rooms to help pilots minimize sonic boom impact in populated areas. The system processes vehicle and flight parameters as well as data regarding current atmospheric conditions. The display provides real-time information regarding sonic boom location and intensity, enabling pilots to make the necessary flight adjustments to control the timing and location of sonic booms. This technology can be used on current-generation supersonic aircraft, which generate loud sonic booms, as well as future-generation, low-boom aircraft, anticipated to be quiet enough for populated areas.
The Textbook of the Future: What Will It Look Like?
NASA Astrophysics Data System (ADS)
Shipman, Harry L.; Finkelstein, N.; McCray, D.; Mac Low, M.; Zollman, D.
2006-12-01
In May 2006, a group of scientists, publishers, technology gurus, National Science Foundation officers, and other interested parties met for a few days to think collectively about the future of the textbook. We met because: (1) The Web and search engines like Google change the relationship between students and information; if the textbook no longer needs to be encyclopedic, then what is its role? (2) Knowing information is not enough; our students, whether they follow academic or other careers, will need to know how to get information, evaluate it, and use it to solve real-world problems; how can a textbook help students in these environments? (3) The static, comprehensive narrative of a textbook does not always lend itself well to inquiry learning, which is strongly encouraged by science education research and by national K-12 science education standards; how can textbooks support active, student-centered learning and support new faculty as they adopt it? The workshop generated partial and uncertain answers to these questions, providing some ideas for the future, though not a complete roadmap. A metaphor that generated considerable support among the group was the idea of a textbook as a compact travel guide, like the Lonely Planet guides. It should be adaptable, and thus web-based, but it might still exist in paper form. The participants discussed barriers on the path ahead. How will peer review, which many workshop participants value, be incorporated? What incentives could motivate textbook authors and publishers to produce truly innovative products? How will new technologies such as computer simulations and animations, electronic readers, and widely accessible databases reshape the role of the textbook in education? Many workshop participants, including this paper's authors, acknowledge support from the NSF Distinguished Teaching Scholars Program and the NSF CAREER awards program.
NASA Astrophysics Data System (ADS)
Luck, M.; Landis, M.; Gassert, F.; Luo, T.; Reig, P.
2013-12-01
Climate adaptation and strategic planning by states, corporations, and long-term investors require reliable information on the range of possible climatic changes. However, most decision makers are incapable of planning over the century-scale time horizons for which global climate models (GCMs) are developed. Even the most forward-looking actors rarely consider scenarios more than several decades into the future. The mismatch in model design and practical demands poses a challenge in extracting useful information on the decadal scale from global climate change models. Here, we explore options and limitations in generating decadal water supply change projections, as evaluated for the World Resources Institute's Aqueduct project's estimates of future change in water stress. Our approach uses an ensemble of six CMIP5 GCMs, selected to represent a broad lineage of models that best reproduce the mean and standard deviation of recent streamflow records in 18 large river basins, bias corrected to GLDAS-2.0 runoff. We examine sensitivity of point estimates of climate normal supply and water supply variability (interannual and seasonal) at the years 2020, 2030, and 2040, with a focus on using temporal windows of different lengths (11-, 21-, and 31-years) to generate the point estimates. With the aim of creating practical information for non-expert audiences, we will discuss the persistent question of 'how can we balance uncertainty and usability in designing scientific data products?'
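A minimal sketch of the window-length sensitivity test described above follows, using a synthetic annual supply series and hypothetical numbers; it is not the Aqueduct code, only an illustration of how 11-, 21-, and 31-year windows yield different point estimates for a target year.

```python
# Sketch only: climate-normal point estimates from moving windows of different lengths.
import numpy as np
import pandas as pd

def point_estimate(annual: pd.Series, center_year: int, window: int) -> dict:
    """Mean and interannual std over a window (e.g. 11, 21, or 31 years)
    centred on a target year; the series index holds calendar years."""
    half = window // 2
    sel = annual.loc[center_year - half:center_year + half]
    return {"mean": sel.mean(), "std": sel.std(ddof=1), "years_used": len(sel)}

years = np.arange(1990, 2061)
annual = pd.Series(np.random.default_rng(0).gamma(5.0, 20.0, size=len(years)),
                   index=years)  # synthetic annual water supply, arbitrary units
for w in (11, 21, 31):
    print(2030, w, point_estimate(annual, 2030, w))
```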
Omics databases on kidney disease: where they can be found and how to benefit from them.
Papadopoulos, Theofilos; Krochmal, Magdalena; Cisek, Katryna; Fernandes, Marco; Husi, Holger; Stevens, Robert; Bascands, Jean-Loup; Schanstra, Joost P; Klein, Julie
2016-06-01
In the recent decades, the evolution of omics technologies has led to advances in all biological fields, creating a demand for effective storage, management and exchange of rapidly generated data and research discoveries. To address this need, the development of databases of experimental outputs has become a common part of scientific practice in order to serve as knowledge sources and data-sharing platforms, providing information about genes, transcripts, proteins or metabolites. In this review, we present omics databases available currently, with a special focus on their application in kidney research and possibly in clinical practice. Databases are divided into two categories: general databases with a broad information scope and kidney-specific databases distinctively concentrated on kidney pathologies. In research, databases can be used as a rich source of information about pathophysiological mechanisms and molecular targets. In the future, databases will support clinicians with their decisions, providing better and faster diagnoses and setting the direction towards more preventive, personalized medicine. We also provide a test case demonstrating the potential of biological databases in comparing multi-omics datasets and generating new hypotheses to answer a critical and common diagnostic problem in nephrology practice. In the future, employment of databases combined with data integration and data mining should provide powerful insights into unlocking the mysteries of kidney disease, leading to a potential impact on pharmacological intervention and therapeutic disease management.
Zhang, Jun; Tian, Gui Yun; Marindra, Adi M. J.; Sunny, Ali Imam; Zhao, Ao Bo
2017-01-01
In recent years, the antenna and sensor communities have witnessed a considerable integration of radio frequency identification (RFID) tag antennas and sensors because of the impetus provided by the internet of things (IoT) and cyber-physical systems (CPS). Such sensors can find potential applications in structural health monitoring (SHM) because of their passive, wireless, simple, compact, and multimodal nature, particularly in large-scale infrastructure over its lifecycle. The big data from these ubiquitous sensors are expected to have a major impact on intelligent monitoring. A remarkable number of scientific papers demonstrate that objects can be remotely tracked and intelligently monitored for their physical/chemical/mechanical properties and environmental conditions. Most of the work focuses on antenna design, and significant information has been generated to demonstrate feasibility. Further information is needed to gain a deep understanding of passive RFID antenna sensor systems in order to make them reliable and practical; nevertheless, this information is scattered across the literature. This paper comprehensively summarizes and clearly highlights the challenges and state-of-the-art methods of passive RFID antenna sensors and systems in terms of sensing and communication from a system point of view. Future trends are also discussed, and future research and development in the UK are suggested. PMID:28146067
Demonstration of alternative traffic information collection and management technologies
NASA Astrophysics Data System (ADS)
Knee, Helmut E.; Smith, Cy; Black, George; Petrolino, Joe
2004-03-01
Many of the components associated with the deployment of Intelligent Transportation Systems (ITS) to support a traffic management center (TMC), such as remote control cameras, traffic speed detectors, and variable message signs, have been available for many years. Their deployment, however, has been expensive, applied primarily to freeways and interstates, and concentrated in the major metropolitan areas of the US rather than smaller cities. The Knoxville (Tennessee) Transportation Planning Organization is sponsoring a project that will test the integration of several technologies to estimate near-real-time traffic data and information that could eventually be used by travelers to make better and more informed decisions about their travel needs. The uniqueness of this demonstration is that it will seek to predict traffic conditions based on cellular phone signals already being collected by cellular communications companies. Information about the average speed on various portions of local arterials and incident identification (incident location) will be collected and compared to similar data generated by "probe vehicles". Successful validation of the speed information generated from cell phone data will allow traffic data to be generated much more economically using technologies that are minimally infrastructure-invasive. Furthermore, once validated, traffic information could be provided to the traveling public, allowing them to make better decisions about trips. More efficient trip planning and execution can reduce congestion and associated vehicle emissions. This paper will discuss the technologies, the demonstration project, the project details, and future directions.
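For illustration only, a short sketch of the kind of validation comparison described above, using invented segment speeds; it computes bias, mean absolute error, and root-mean-square error between cell-derived and probe-vehicle speeds.

```python
# Sketch only: compare cell-phone-derived speeds against probe-vehicle speeds.
import numpy as np

cell_speed = np.array([52.0, 38.5, 61.2, 27.4])   # estimated from cell signals (mph)
probe_speed = np.array([49.8, 41.0, 59.5, 30.1])  # measured by probe vehicles (mph)

error = cell_speed - probe_speed
print("bias (mph):", error.mean())
print("MAE  (mph):", np.abs(error).mean())
print("RMSE (mph):", np.sqrt((error ** 2).mean()))
```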
Rommelmann, Vanessa; Setel, Philip W.; Hemed, Yusuf; Angeles, Gustavo; Mponezya, Hamisi; Whiting, David; Boerma, Ties
2005-01-01
OBJECTIVE: To examine the costs of complementary information generation activities in a resource-constrained setting and compare the costs and outputs of information subsystems that generate the statistics on poverty, health and survival required for monitoring, evaluation and reporting on health programmes in the United Republic of Tanzania. METHODS: Nine systems used by four government agencies or ministries were assessed. Costs were calculated from budgets and expenditure data made available by information system managers. System coverage, quality assurance and information production were reviewed using questionnaires and interviews. Information production was characterized in terms of 38 key sociodemographic indicators required for national programme monitoring. FINDINGS: In 2002-03 approximately US$ 0.53 was spent per Tanzanian citizen on the nine information subsystems that generated information on 37 of the 38 selected indicators. The census and reporting system for routine health service statistics had the largest participating populations and highest total costs. Nationally representative household surveys and demographic surveillance systems (which are not based on nationally representative samples) produced more than half the indicators and used the most rigorous quality assurance. Five systems produced fewer than 13 indicators and had comparatively high costs per participant. CONCLUSION: Policy-makers and programme planners should be aware of the many trade-offs with respect to system costs, coverage, production, representativeness and quality control when making investment choices for monitoring and evaluation. In future, formal cost-effectiveness studies of complementary information systems would help guide investments in the monitoring, evaluation and planning needed to demonstrate the impact of poverty-reduction and health programmes. PMID:16184275
The Collaborative Information Portal and NASA's Mars Exploration Rover Mission
NASA Technical Reports Server (NTRS)
Mak, Ronald; Walton, Joan
2005-01-01
The Collaborative Information Portal was enterprise software developed jointly by the NASA Ames Research Center and the Jet Propulsion Laboratory for NASA's Mars Exploration Rover mission. Mission managers, engineers, scientists, and researchers used this Internet application to view current staffing and event schedules, download data and image files generated by the rovers, receive broadcast messages, and get accurate times in various Mars and Earth time zones. This article describes the features, architecture, and implementation of this software, and concludes with lessons we learned from its deployment and a look towards future missions.
Moss, Simon A; Wilson, Samuel G; Irons, Melanie; Naivalu, Carmen
2017-12-01
Some research shows that people who often contemplate their future tend to be healthier. Yet the burgeoning literature on mindfulness demonstrates that people who are more attuned to their immediate experiences also enjoy many benefits. To reconcile these principles, many scholars recommend that people should distribute their attention, somewhat evenly, across the past, present, and future, but have not clarified how people should achieve this goal. We test the possibility that people who perceive their future as vivid and certain, called future clarity, might be able to both orient their attention to the future as well as experience mindfulness. Specifically, future clarity could diminish the inclination of people to reach decisions prematurely and dismiss information that contradicts these decisions, called need for closure; these tendencies diminish consideration of future consequences and mindfulness, respectively. In this cross-sectional study, 194 participants completed measures of mindfulness, consideration of future consequences, need for closure, and future clarity. Consistent with hypotheses, future clarity was positively associated with both mindfulness and consideration of future consequences. Need for closure partly mediated these relationships. Accordingly, interventions that empower people to shape and to clarify their future might generate the benefits of both mindfulness and a future orientation. Copyright © 2017 John Wiley & Sons, Ltd.
Intergenerational equity and conservation
NASA Technical Reports Server (NTRS)
Otoole, R. P.; Walton, A. L.
1980-01-01
The issue of intergenerational equity in the use of natural resources is discussed in the context of coal mining conversion. An attempt is made to determine whether there is a clear-cut benefit to future generations in setting minimum coal extraction efficiency standards in mining. It is demonstrated that preserving fossil fuels beyond the economically efficient level is not necessarily beneficial to future generations even in terms of their own preferences. Setting fossil fuel conservation targets for intermediate products (i.e., energy) may increase the quantities of fossil fuels available to future generations and hence lower the costs, but there may be serious disadvantages to future generations as well. The use of relatively inexpensive fossil fuels in this generation may result in more infrastructure development and more knowledge production available to future generations. The value of fossil fuels versus these other endowments in the future depends on many factors which cannot possibly be evaluated at present. Since it cannot be known whether future generations are being helped or harmed, it is recommended that intergenerational equity not be used as a factor in setting coal mine extraction efficiency standards, or in establishing requirements.
Hillenburg, K L; Cederberg, R A; Gray, S A; Hurst, C L; Johnson, G K; Potter, B J
2006-08-01
The digital revolution and growth of the Internet have led to many innovations in the area of electronic learning (e-learning). To survive and prosper, educators must be prepared to respond creatively to these changes. Administrators and information technology specialists at six dental schools and their parent institutions were interviewed regarding their opinions of the impact that e-learning will have on the future of dental education. Interview questions encompassed vision, rate of change, challenges, role of faculty, resources, enrolment, collaboration, responsibility for course design and content, mission and fate of the institution. The objective of this qualitative study was to sample the opinions of educational administrators and information technology specialists from selected US universities regarding the impact of e-learning on dental education to detect trends in their attitudes. Responses to the survey indicated disagreement between administrators and informational technology specialists regarding the rate of change, generation of resources, impact on enrolment, responsibility for course design and content, mission and fate of the university. General agreement was noted with regard to vision, challenges, role of faculty and need for collaboration.
Povey, Sue; Al Aqeel, Aida I; Cambon-Thomsen, Anne; Dalgleish, Raymond; den Dunnen, Johan T; Firth, Helen V; Greenblatt, Marc S; Barash, Carol Isaacson; Parker, Michael; Patrinos, George P; Savige, Judith; Sobrido, Maria-Jesus; Winship, Ingrid; Cotton, Richard GH
2010-01-01
More than 1,000 Web-based locus-specific variation databases (LSDBs) are listed on the Website of the Human Genetic Variation Society (HGVS). These individual efforts, which often relate phenotype to genotype, are a valuable source of information for clinicians, patients, and their families, as well as for basic research. The initiators of the Human Variome Project recently recognized that having access to some of the immense resources of unpublished information already present in diagnostic laboratories would provide critical data to help manage genetic disorders. However, there are significant ethical issues involved in sharing these data worldwide. An international working group presents second-generation guidelines addressing ethical issues relating to the curation of human LSDBs that provide information via a Web-based interface. It is intended that these should help current and future curators and may also inform the future decisions of ethics committees and legislators. These guidelines have been reviewed by the Ethics Committee of the Human Genome Organization (HUGO). Hum Mutat 31:–6, 2010. © 2010 Wiley-Liss, Inc. PMID:20683926
Renewable Electricity Futures: Exploration of a U.S. Grid with 80% Renewable Electricity
NASA Astrophysics Data System (ADS)
Mai, Trieu
2013-04-01
Renewable Electricity Futures is an initial investigation of the extent to which renewable energy supply can meet the electricity demands of the contiguous United States over the next several decades. This study explores the implications and challenges of very high renewable electricity generation levels: from 30% up to 90% (focusing on 80%) of all U.S. electricity generation from renewable technologies in 2050. At such high levels of renewable electricity penetration, the unique characteristics of some renewable resources, specifically geographical distribution and variability and uncertainty in output, pose challenges to the operability of the nation's electric system. The study focuses on key technical implications of this environment from a national perspective, exploring whether the U.S. power system can supply electricity to meet customer demand on an hourly basis with high levels of renewable electricity, including variable wind and solar generation. The study also identifies some of the potential economic, environmental, and social implications of deploying and integrating high levels of renewable electricity in the U.S. The full report and associated supporting information is available at: http://www.nrel.gov/analysis/refutures/.
Understanding the mobile internet to develop the next generation of online medical teaching tools
Christiano, Cynthia; Ferris, Maria
2011-01-01
Healthcare providers (HCPs) use online medical information for self-directed learning and patient care. Recently, the mobile internet has emerged as a new platform for accessing medical information as it allows mobile devices to access online information in a manner compatible with their restricted storage. We investigated mobile internet usage parameters to direct the future development of mobile internet teaching websites. Nephrology On-Demand Mobile (NODM) (http://www.nephrologyondemand.org) was made accessible to all mobile devices. From February 1 to December 31, 2010, HCP use of NODM was tracked using code inserted into the root files. Nephrology On-Demand received 15 258 visits, of which approximately 10% were made to NODM, with the majority coming from the USA. Most access to NODM was through the Apple iOS family of devices and cellular connections were the most frequently used. These findings provide a basis for the future development of mobile nephrology and medical teaching tools. PMID:21659443
Understanding the mobile internet to develop the next generation of online medical teaching tools.
Desai, Tejas; Christiano, Cynthia; Ferris, Maria
2011-01-01
Healthcare providers (HCPs) use online medical information for self-directed learning and patient care. Recently, the mobile internet has emerged as a new platform for accessing medical information as it allows mobile devices to access online information in a manner compatible with their restricted storage. We investigated mobile internet usage parameters to direct the future development of mobile internet teaching websites. Nephrology On-Demand Mobile (NOD(M)) (http://www.nephrologyondemand.org) was made accessible to all mobile devices. From February 1 to December 31, 2010, HCP use of NOD(M) was tracked using code inserted into the root files. Nephrology On-Demand received 15,258 visits, of which approximately 10% were made to NOD(M), with the majority coming from the USA. Most access to NOD(M) was through the Apple iOS family of devices and cellular connections were the most frequently used. These findings provide a basis for the future development of mobile nephrology and medical teaching tools.
Web 2.0, Library 2.0, and Librarian 2.0:Preparing for the 2.0 World
NASA Astrophysics Data System (ADS)
Abram, S.
2007-10-01
There is a global conversation going on right now about the next generation of the web. It's happening under the name of Web 2.0. It's the McLuhanesque hot web where true human interaction takes precedence over merely `cool' information delivery and e-mail. It's about putting information into the real context of our users' lives, research, work and play. Concurrently, a group of information professionals are having a conversation about the vision for what Library 2.0 will look like in this Web 2.0 ecosystem. Some are even going so far as to talk about Web 3.0! Web 2.0 is coming fast and it's BIG! What are the skills and competencies that Librarian 2.0 will need? Come and hear an overview of Web 2.0 and a draft vision for Library 2.0 and an opinion about what adaptations we'll need to make to thrive in this future scenario. Let's talk about the Librarian 2.0 in our users' future!
NASA Astrophysics Data System (ADS)
Collados-Lara, Antonio-Juan; Pulido-Velazquez, David; Pardo-Iguzquiza, Eulogio
2017-04-01
Assessing the impacts of potential future climate change scenarios on precipitation and temperature is essential to design adaptive strategies in water resources systems. The objective of this work is to analyze the possibilities of different statistical downscaling methods to generate potential future scenarios in an Alpine catchment from historical data and the available climate model simulations performed in the frame of the CORDEX EU project. The initial information employed to define these downscaling approaches is the historical climatic data (taken from the Spain02 project for the period 1971-2000 with a spatial resolution of 12.5 km) and the future series provided by climatic models for the horizon period 2071-2100. We have used information coming from nine climate model simulations (obtained from five different regional climate models (RCMs) nested within four different global climate models (GCMs)) from the European CORDEX project. In our application we have focused on the Representative Concentration Pathway (RCP) 8.5 emissions scenario, which is the most unfavorable scenario considered in the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC). For each RCM we have generated future climate series for the period 2071-2100 by applying two different approaches, bias correction and delta change, and five different transformation techniques (first moment correction; first and second moment correction; regression functions; quantile mapping using a distribution-derived transformation; and quantile mapping using empirical quantiles) for both of them. Ensembles of the obtained series were proposed to obtain more representative potential future climate scenarios to be employed in studying potential impacts. In this work we propose an unequally weighted combination of the future series, giving more weight to those coming from the models (in the delta-change approach) or the combinations of model and technique (in the bias-correction approach) that better approximate the basic and drought statistics of the historical data. A multi-objective analysis using basic statistics (mean, standard deviation and asymmetry coefficient) and drought statistics (duration, magnitude and intensity) has been performed to identify which models are better in terms of goodness of fit in reproducing the historical series. The drought statistics have been obtained from the Standardized Precipitation Index (SPI) series using the theory of runs. This analysis allows us to discriminate the best RCM and, within the bias-correction method, the best combination of model and correction technique. We have also analyzed the possibilities of using different stochastic weather generators to approximate the basic and drought statistics of the historical series. These analyses have been performed in our case study in a lumped and in a distributed way in order to assess their sensitivity to the spatial scale. The statistics of the future temperature series obtained with the different ensemble options are quite homogeneous, but precipitation shows a higher sensitivity to the adopted method and spatial scale. The global increments in the mean temperature values are 31.79%, 31.79%, 31.03% and 31.74% for the distributed bias-correction, distributed delta-change, lumped bias-correction and lumped delta-change ensembles, respectively, and in precipitation they are -25.48%, -28.49%, -26.42% and -27.35%, respectively. Acknowledgments: This research work has been partially supported by the GESINHIMPADAPT project (CGL2013-48424-C2-2-R) with Spanish MINECO funds.
We would also like to thank Spain02 and CORDEX projects for the data provided for this study and the R package qmap.
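As a concrete illustration of two of the transformation techniques named in the abstract above (first-moment correction used as a delta change, and empirical quantile mapping used as a bias correction), a minimal sketch in Python might look like the following. The variable names and synthetic series are illustrative assumptions standing in for the Spain02 observations and CORDEX model runs; they are not data or code from the study, which cites the R package qmap.

```python
import numpy as np

def delta_change_first_moment(obs_hist, mod_hist, mod_fut):
    """Delta change, first-moment correction: shift the observed historical
    series by the model-projected change in the mean (additive form,
    appropriate for temperature)."""
    delta = mod_fut.mean() - mod_hist.mean()
    return obs_hist + delta

def bias_correction_empirical_quantiles(obs_hist, mod_hist, mod_fut, n_q=100):
    """Bias correction by empirical quantile mapping: map each future model
    value onto the observed distribution through the quantile it occupies
    in the historical model run."""
    q = np.linspace(0.01, 0.99, n_q)
    mod_q = np.quantile(mod_hist, q)
    obs_q = np.quantile(obs_hist, q)
    # interpolate future model values from model quantiles to observed quantiles
    return np.interp(mod_fut, mod_q, obs_q)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    obs_hist = rng.normal(5.0, 2.0, 10_000)   # stand-in for observed 1971-2000 series
    mod_hist = rng.normal(4.0, 2.5, 10_000)   # stand-in for an RCM control run
    mod_fut = rng.normal(6.0, 2.5, 10_000)    # stand-in for an RCP8.5 2071-2100 run
    print(delta_change_first_moment(obs_hist, mod_hist, mod_fut).mean())
    print(bias_correction_empirical_quantiles(obs_hist, mod_hist, mod_fut).mean())
```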
NASA Technical Reports Server (NTRS)
Applegate, Joseph L.
2014-01-01
This Land Use Control Implementation Plan (LUCIP) has been prepared to inform current and potential future users of the Kennedy Space Center (KSC) Shuttle Flight Operations Contract Generator Maintenance Facility (SFOC; SWMU 081; "the Site") of institutional controls that have been implemented at the Site. Although there are no current unacceptable risks to human health or the environment associated with the SFOC, an institutional land use control (LUC) is necessary to prevent human health exposure to antimony-affected groundwater at the Site. Controls will include periodic inspection, condition certification, and agency notification.
NASA Astrophysics Data System (ADS)
Millstein, D.; Zhai, P.; Menon, S.
2011-12-01
Over the past decade significant reductions of NOx and SOx emissions from coal burning power plants in the U.S. have been achieved due to regulatory action and substitution of new generation towards natural gas and wind power. Low natural gas prices, ever decreasing solar generation costs, and proposed regulatory changes, such as to the Cross State Air Pollution Rule, promise further long-run coal power plant emission reductions. Reduced power plant emissions have the potential to affect ozone and particulate air quality and influence regional climate through aerosol cloud interactions and visibility effects. Here we investigate, on a national scale, the effects on future (~2030) air quality and regional climate of power plant emission regulations in contrast to and combination with policies designed to aggressively promote solar electricity generation. A sophisticated, economic and engineering based, hourly power generation dispatch model is developed to explore the integration of significant solar generation resources (>10% on an energy basis) at various regions across the country, providing detailed estimates of substitution of solar generation for fossil fuel generation resources. Future air pollutant emissions from all sectors of the economy are scaled based on the U.S. Environmental Protection Agency's National Emission Inventory to account for activity changes based on population and economic projections derived from county level U.S. Census data and the Energy Information Administration's Annual Energy Outlook. Further adjustments are made for technological and regulatory changes applicable within various sectors, for example, emission intensity adjustments to on-road diesel trucking due to exhaust treatment and improved engine design. The future year 2030 is selected for the emissions scenarios to allow for the development of significant solar generation resources. A regional climate and air quality model (Weather Research and Forecasting, WRF model) is used to investigate the effects of the various solar generation scenarios given emissions projections that account for changing regulatory environment, economic and population growth, and technological change. The results will help to quantify the potential air quality benefits of promotion of solar electricity generation in regions containing high penetration of coal-fired power generation. Note that current national solar incentives are based only on solar generation capacity. Further investigation of changes to regional climate due to emission reductions of aerosols and relevant precursors will provide insight into the environmental effects that may occur if solar power generation becomes widespread.
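The hourly dispatch logic described in this abstract can be caricatured with a merit-order sketch in which solar output is used first and the residual load is met by gas and then coal, so that added solar capacity displaces fossil generation and its emissions. The capacities, load and solar profiles, and emission factors below are illustrative assumptions, not values from the study's dispatch model.

```python
import numpy as np

HOURS = 24
demand = 40 + 10 * np.sin(np.linspace(0, 2 * np.pi, HOURS))                     # GW, toy profile
solar_cf = np.clip(np.sin(np.linspace(-np.pi / 2, 3 * np.pi / 2, HOURS)), 0, None)  # daytime bump

def dispatch(solar_capacity_gw, gas_capacity_gw=30, coal_capacity_gw=40):
    """Toy merit-order dispatch: solar first, then gas, then coal."""
    solar = np.minimum(solar_capacity_gw * solar_cf, demand)
    residual = demand - solar
    gas = np.minimum(residual, gas_capacity_gw)
    coal = np.minimum(residual - gas, coal_capacity_gw)
    # illustrative emission factors, tCO2 per GWh
    emissions = (gas * 400 + coal * 950).sum()
    return solar.sum(), gas.sum(), coal.sum(), emissions

base = dispatch(solar_capacity_gw=0)
high_solar = dispatch(solar_capacity_gw=25)
print("daily CO2 without solar: %.0f t, with solar: %.0f t" % (base[3], high_solar[3]))
```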
Connecting Biology to Electronics: Molecular Communication via Redox Modality.
Liu, Yi; Li, Jinyang; Tschirhart, Tanya; Terrell, Jessica L; Kim, Eunkyoung; Tsao, Chen-Yu; Kelly, Deanna L; Bentley, William E; Payne, Gregory F
2017-12-01
Biology and electronics are both expert at accessing, analyzing, and responding to information. Biology uses ions, small molecules, and macromolecules to receive, analyze, store, and transmit information, whereas electronic devices receive input in the form of electromagnetic radiation, process the information using electrons, and then transmit output as electromagnetic waves. Generating the capabilities to connect biological and electronic modalities offers exciting opportunities to shape the future of biosensors, point-of-care medicine, and wearable/implantable devices. Redox reactions offer unique opportunities for bio-device communication that spans the molecular modalities of biology and the electrical modality of devices. Here, an approach to search for redox information through interactive electrochemical probing that is analogous to sonar is adopted. The capabilities of this approach to access global chemical information as well as information of specific redox-active chemical entities are illustrated using recent examples. An example of the use of synthetic biology to recognize external molecular information, process this information through intracellular signal transduction pathways, and generate output responses that can be detected by electrical modalities is also provided. Finally, exciting results in the use of redox reactions to actuate biology are provided to illustrate that synthetic biology offers the potential to guide biological response through electrical cues. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
An On-Demand Optical Quantum Random Number Generator with In-Future Action and Ultra-Fast Response
Stipčević, Mario; Ursin, Rupert
2015-01-01
Random numbers are essential for our modern information-based society, e.g. in cryptography. Unlike frequently used pseudo-random generators, physical random number generators do not depend on complex algorithms but rather on a physical process to provide true randomness. Quantum random number generators (QRNG) rely on a process which, even in principle, can be described only by a probabilistic theory. Here we present a conceptually simple implementation, which offers 100% efficiency of producing a random bit upon request and simultaneously exhibits an ultra-low latency. A careful technical and statistical analysis demonstrates its robustness against imperfections of the actually implemented technology and enables the randomness of very long sequences to be estimated quickly. Generated random numbers pass standard statistical tests without any post-processing. The setup described, as well as the theory presented here, demonstrate the maturity and overall understanding of the technology. PMID:26057576
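As a worked example of the kind of routine statistical checking such generators are subjected to (not the authors' own analysis), the sketch below applies the classic monobit frequency test: under the null hypothesis of unbiased, independent bits, the normalized bit-sum is approximately standard normal and a p-value follows from the complementary error function. The bit source here is Python's pseudo-random generator, standing in for hardware output.

```python
import math
import random

def monobit_p_value(bits):
    """Monobit frequency test (in the style of NIST SP 800-22):
    returns a p-value; very small values indicate biased output."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)   # map bits to +/-1 and sum
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))

# toy sequence standing in for QRNG output
bits = [random.getrandbits(1) for _ in range(1_000_000)]
print("monobit p-value:", monobit_p_value(bits))
```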
Extending the Framework of Generativity Theory Through Research: A Qualitative Study.
Rubinstein, Robert L; Girling, Laura M; de Medeiros, Kate; Brazda, Michael; Hannum, Susan
2015-08-01
Based on ethnographic interviews, we discuss three ideas we believe will expand knowledge of older informants' thoughts about and representations of generativity. We adapt the notion of "dividuality" as developed in cultural anthropology to reframe ideas on generativity. The term dividuality refers to a condition of interpersonal or intergenerational connectedness, as distinct from individuality. We also extend previous definitions of generativity by identifying both objects of generative action and temporal and relational frameworks for generative action. We define 4 foci of generativity (people, groups, things, and activities) and 4 spheres of generativity (historical, familial, individual, and relational) based in American culture and with which older informants could easily identify. The approach outlined here also discusses a form of generativity oriented to the past in which relationships with persons in senior generations form a kind of generative action since they are involved in caring for the origins of the self and hence of future generative acts. These 3 elements of a new framework will allow researchers to pose critical questions about generativity among older adults. Such questions include (a) How is the self, as culturally constituted, involved in generative action? and (b) What are the types of generativity within the context of American culture and how are they spoken about? Each of the above points is directly addressed in the data we present below. We defined these domains through extended ethnographic interviews with 200 older women. The article addresses some new ways of thinking about generativity as a construct, which may be useful in understanding the cultural personhood of older Americans. © The Author 2014. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Future Remains: Industrial Heritage at the Hanford Plutonium Works
NASA Astrophysics Data System (ADS)
Freer, Brian
This dissertation argues that U.S. environmental and historic preservation regulations, industrial heritage projects, history, and art only provide partial frameworks for successfully transmitting an informed story into the long range future about nuclear technology and its related environmental legacy. This argument is important because plutonium from nuclear weapons production is toxic to humans in very small amounts, threatens environmental health, has a half-life of 24,110 years and because the industrial heritage project at Hanford is the first time an entire U.S. Department of Energy weapons production site has been designated a U.S. Historic District. This research is situated within anthropological interest in industrial heritage studies, environmental anthropology, applied visual anthropology, as well as wider discourses on nuclear studies. However, none of these disciplines is really designed or intended to be a completely satisfactory frame of reference for addressing this perplexing challenge of documenting and conveying an informed story about nuclear technology and its related environmental legacy into the long range future. Others have thought about this question and have made important contributions toward a potential solution. Examples here include: future generations movements concerning intergenerational equity as evidenced in scholarship, law, and amongst Native American groups; Nez Perce and Confederated Tribes of the Umatilla Indian Reservation responses to the Hanford End State Vision and Hanford's Canyon Disposition Initiative; as well as the findings of organizational scholars on the advantages realized by organizations that have a long term future perspective. While these ideas inform the main line inquiry of this dissertation, the principal approach put forth by the researcher of how to convey an informed story about nuclear technology and waste into the long range future is implementation of the proposed Future Remains clause, as originated by the author, by amendment to two U.S. federal laws: National Historic Preservation Act and Comprehensive Environmental Response, Compensation, and Liability Act. The dissertation provides a case study in public anthropology. The findings of the dissertation include recommendations whereby the Future Remains clause gives historic preservation and cultural resources a leading and ongoing role in facilitating real-time forward looking historical documentation at environmental restoration projects at United States National Priorities List (i.e., "Superfund") sites.
Genomic data-sharing: what will be our legacy?
Callier, Shawneequa; Husain, Rajah; Simpson, Rachel
2014-01-01
Prior to 1974, the Tuskegee Syphilis experiments, expansive use of the HeLa cells, and other blatant instances of research abuse pervaded the medical research field. Ongoing challenges to informed consent, privacy and data-sharing will influence the stories that research participants today share with future generations. This has significant implications for the advancement of genomic science, and the public's perception of genomic research. PMID:24634673
Future War: An Assessment of Aerospace Campaigns in 2010,
1996-01-01
theoretician: "The impending sixth generation of warfare, with its centerpiece of superior data-processing to support precision smart weaponry, will radically...tions concept of " smart push, warrior pull." If JFACC were colocated with the worldwide intelligence manager, unit taskings and the applicable...intelligence information could be distributed concurrently (" smart push"). Intelligence officers sitting alongside the operational tasking officers would
Snow mapping from space platforms
NASA Technical Reports Server (NTRS)
Itten, K. I.
1980-01-01
The paper considers problems of optimum resolution, periodicity, and wavelength bands used for snow mapping. Analog and digital methods were used for application of satellite data; techniques were developed for producing streamflow forecasts, hydroelectric power generation regulation data, irrigation potentials, and information on the availability of drinking water supplies. Future systems will utilize improved spectral band selection, new spectral regions, higher repetition rates, and more rapid access to satellite data.
ERIC Educational Resources Information Center
Pallant, Amy; Lee, Hee-Sun
2017-01-01
During the past several decades, there has been a growing awareness of the ways humans affect Earth systems. As global problems emerge, educating the next generation of citizens to be able to make informed choices related to future outcomes is increasingly important. The challenge for educators is figuring out how to prepare students to think…
ERIC Educational Resources Information Center
Commission of the European Communities, Brussels (Belgium). Directorate-General for Education and Culture.
This document launches a wide public consultation with all those involved in and with an interest in the European Union's (EU's) education, training, and youth programs called Socrates, Tempus, Leonardo da Vinci, and Youth for Europe. It is the first step toward preparing the new generation of programs to start in 2007 and will inform the…
ERIC Educational Resources Information Center
Fluck, A.; Dowden, T.
2013-01-01
Few contemporary pre-service teachers would have completed their schooling with the extensive aid of computers. Yet, classroom use of information and communication technology (ICT) is now ubiquitous in much of the world. Today's pre-service teachers are the "cusp generation" who, at a unique moment in history, straddle the two worlds of…
NASA Technical Reports Server (NTRS)
Short, Nicholas, Jr.; Wattawa, Scott L.
1988-01-01
For the past decade, operations and research projects that support a major portion of NASA's overall mission have experienced a dramatic increase in the volume of generated data and resultant information that is unparalleled in the history of the agency. The effect of such an increase is that most of the science and engineering disciplines are undergoing an information glut, which has occurred, not only because of the amount, but also because of the type of data being collected. This information glut is growing exponentially and is expected to grow for the foreseeable future. Consequently, it is becoming physically and intellectually impossible to identify, access, modify, and analyze the most suitable information. Thus, the dilemma arises that the amount and complexity of information has exceeded, and will continue to exceed using present information systems, the ability of all the scientists and engineers to understand and take advantage of this information. As a result of this information problem, NASA has initiated the Intelligent Data Management (IDM) project to design and develop Advanced Information Management Systems (AIMS). The first effort of the Project was the prototyping of an Intelligent User Interface (IUI) to an operational scientific database using expert systems, natural language processing, and graphics technologies. An overview of the IUI formulation and development for the second phase is presented.
Towards Scalable Entangled Photon Sources with Self-Assembled InAs/GaAs Quantum Dots
NASA Astrophysics Data System (ADS)
Wang, Jianping; Gong, Ming; Guo, G.-C.; He, Lixin
2015-08-01
The biexciton cascade process in self-assembled quantum dots (QDs) provides an ideal system for realizing deterministic entangled photon-pair sources, which are essential to quantum information science. The entangled photon pairs have recently been generated in experiments after eliminating the fine-structure splitting (FSS) of excitons using a number of different methods. Thus far, however, QD-based sources of entangled photons have not been scalable because the wavelengths of QDs differ from dot to dot. Here, we propose a wavelength-tunable entangled photon emitter mounted on a three-dimensional stressor, in which the FSS and exciton energy can be tuned independently, thereby enabling photon entanglement between dissimilar QDs. We confirm these results via atomistic pseudopotential calculations. This provides a first step towards future realization of scalable entangled photon generators for quantum information applications.
Next-generation healthcare: a strategic appraisal.
Montague, Terrence
2009-01-01
Successful next-generation healthcare must deliver timely access and quality for an aging population, while simultaneously promoting disease prevention and managing costs. The key factors for sustained success are a culture with aligned goals and values; coordinated team care that especially engages with physicians and patients; practical information that is collected and communicated reliably; and education in the theory and methods of collaboration, measurement and leadership. Currently, optimal population health is challenged by a high prevalence of chronic disease, with large gaps between best and usual care, a scarcity of health human resources - particularly with the skills, attitudes and training for coordinated team care - and the absence of flexible, reliable clinical measurement systems. However, to make things better, institutional models and supporting technologies are available. In the short term, a first step is to enhance the awareness of the practical opportunities to improve, including the expansion of proven community-based disease management programs that communicate knowledge, competencies and clinical measurements among professional and patient partners, leading to reduced care gaps and improved clinical and economic outcomes. Longer-term success requires two additional steps. One is formal inter-professional training to provide, on an ongoing basis, the polyvalent human resource skills and foster the culture of working with others to improve the care of whole populations. The other is the adoption of reliable information systems, including electronic health records, to allow useful and timely measurement and effective communication of clinical information in real-world settings. A better health future can commence immediately, within existing resources, and be sustained with feasible innovations in provider and patient education and information systems. The future is now.
Kermisch, Celine
2016-12-01
The nuclear community frequently refers to the concept of "future generations" when discussing the management of high-level radioactive waste. However, this notion is generally not defined. In this context, we have to assume a wide definition of the concept of future generations, conceived as people who will live after the contemporary people are dead. This definition thus embraces each generation following ours, without any restriction in time. The aim of this paper is to show that, in the debate about nuclear waste, this broad notion should be further specified, and to clarify the related implications for nuclear waste management policies. Therefore, we provide an ethical analysis of different management strategies for high-level waste in the light of two principles: protection of future generations, based on safety and security, and respect for their choice. This analysis shows that high-level waste management options have different ethical impacts across future generations, depending on whether the memory of the waste and its location is lost or not. We suggest taking this distinction into account by introducing the notions of "close future generations" and "remote future generations", which has important implications for nuclear waste management policies insofar as it stresses that a retrievable disposal has fewer benefits than usually assumed.
Sands, D Z; Wald, J S
2014-08-15
To address current topics in consumer health informatics, a literature review was conducted. Current health care delivery systems need to be more effective in the management of chronic conditions as the population grows older and experiences escalating chronic illness that threatens to consume more health care resources than countries can afford. Most health care systems are positioned poorly to accommodate this. Meanwhile, the availability of ever more powerful and cheaper information and communication technology, both for professionals and consumers, has raised the capacity to gather and process information, communicate more effectively, and monitor the quality of care processes. Adapting health care systems to serve current and future needs requires new streams of data to enable better self-management, improve shared decision making, and provide more virtual care. Changes in reimbursement for health care services, increased adoption of relevant technologies, patient engagement, and calls for data transparency raise the importance of patient-generated health information, remote monitoring, non-visit based care, and other innovative care approaches that foster more frequent contact with patients and better management of chronic conditions.
Social Intelligence: Next Generation Business Intelligence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Troy Hiltbrand
In order for Business Intelligence to truly move beyond where it is today, a shift in approach must occur. Currently, much of what is accomplished in the realm of Business Intelligence relies on reports and dashboards to summarize and deliver information to end users. As we move into the future, we need to get beyond these reports and dashboards to a point where we break out the individual metrics that are embedded in these reports and interact with these components independently. Breaking these pieces of information out of the confines of reports and dashboards will allow them to be dynamically assembled for delivery in the way that makes most sense to each consumer. With this change in ideology, Business Intelligence will move from the concept of collections of objects, or reports and dashboards, to individual objects, or information components. The Next Generation Business Intelligence suite will translate concepts popularized in Facebook, Flickr, and Digg into enterprise-worthy communication vehicles.
Unlocking the potential of the smart grid
NASA Astrophysics Data System (ADS)
Konopko, Joanna
2015-12-01
The term smart grid describes a next-generation electrical power system typified by the increased use of Information and Communication Technologies (ICT) throughout the process of delivering electrical energy. The generation, delivery and consumption of energy, together with all the steps for power transmission and distribution, make the smart grid a complex system. The question is whether the amount, diversity, and uses of such data put the smart grid in the category of Big Data applications, followed by the natural question of what the true value of such data is. In this paper an initial answer to this question is provided, the current state of data generation of the Polish grid is analyzed, and a realistic future scenario is illustrated. The analysis shows that the amount of data generated in the smart grid is comparable to some existing Big Data systems.
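A back-of-envelope estimate of the data volumes at stake can be sketched as follows; the meter count, reading interval, and record size are assumptions chosen for illustration, not figures from the Polish grid analysis.

```python
def annual_meter_data_tb(n_meters, readings_per_day, bytes_per_reading=100):
    """Rough yearly raw data volume from smart meters, in terabytes."""
    bytes_per_year = n_meters * readings_per_day * 365 * bytes_per_reading
    return bytes_per_year / 1e12

# e.g. 15 million meters reporting every 15 minutes (96 readings/day)
print("%.1f TB/year" % annual_meter_data_tb(15_000_000, 96))
```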
Overview of Existing and Future Residential Use Cases for Connected Thermostats
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rotondo, Julia; Johnson, Robert; Gonzalez, Nancy
This paper is intended to help inform future technology deployment opportunities for connected thermostats (CTs), based on investigation and review of the U.S. residential housing and CT markets, as well as existing, emerging, and future use cases for CT hardware and CT-generated data. The CT market has experienced tremendous growth over the last 5 years—both in terms of the number of units sold and the number of firms offering competing products—and can be characterized by its rapid pace of technological innovation. Despite many assuming CTs would become powerful tools for increasing comfort while saving energy, there remains a great deal of uncertainty about the actual energy and cost savings that are likely to be realized from deployment of CTs, particularly under different conditions.
Nuclear Data Uncertainty Quantification: Past, Present and Future
NASA Astrophysics Data System (ADS)
Smith, D. L.
2015-01-01
An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested.
Online Treatment and Virtual Therapists in Child and Adolescent Psychiatry.
Schueller, Stephen M; Stiles-Shields, Colleen; Yarosh, Lana
2017-01-01
Online and virtual therapies are a well-studied and efficacious treatment option for various mental and behavioral health conditions among children and adolescents. However, many interventions have not considered the unique affordances offered by technologies that might align with the capacities and interests of youth users. In this article, the authors discuss learnings from child-computer interaction that can inform future generations of interventions and guide developers, practitioners, and researchers how to best use new technologies for youth populations. The article concludes with innovative examples illustrating future potentials of online and virtual therapies such as gaming and social networking. Copyright © 2016 Elsevier Inc. All rights reserved.
Archiving strategy for USGS EROS center and our future direction
Faundeen, John L.
2010-01-01
The U. S. Geological Survey's Earth Resources Observation and Science Center has the responsibility to acquire, manage, and preserve our Nation's land observations. These records are obtained primarily from airplanes and satellites dating back to the 1930s. The ability to compare landscapes from the past with current information enables change analysis at local and global scales. With new observations added daily, the records management challenges are daunting, involving petabytes of electronic data and tens of thousands of rolls of analog film. This paper focuses upon the appraisal and preservation functions employed to ensure that these records are available for current and future generations.
Ancient Wisdom, Applied Knowledge for a Sustainable Future
NASA Astrophysics Data System (ADS)
Peterson, K.; Philippe, R. Elde; Dardar, T. M. Elde
2017-12-01
Ancient wisdom informs traditional knowledges that guide Indigenous communities on how to interact with the world. These knowledges and the ancient wisdom have been the life-giving forces that have prevented the complete genocide of Indigenous peoples, and they are also the wisdom that is rejuvenating ancient ways that will take the world into a future that embraces the seventh-generation philosophy. Western scientists and agency representatives are learning from the work and wisdom of Native Americans. This presentation will share the ways in which the representatives of two Tribes along the coast of Louisiana have been helping to educate and apply their work with Western scientists.
The Millennial Generation: Developing Leaders for the Future Security Environment
2011-02-15
Strategy Research Project by Colonel Lance... Search snippets cite The Dumbest Generation (Penguin Group, New York, 2009), pp. 8, 10, and National Academy of Sciences, "Generation Y: The Millennials ...Ready or Not, Here..."
Translating Uncertain Sea Level Projections Into Infrastructure Impacts Using a Bayesian Framework
NASA Astrophysics Data System (ADS)
Moftakhari, Hamed; AghaKouchak, Amir; Sanders, Brett F.; Matthew, Richard A.; Mazdiyasni, Omid
2017-12-01
Climate change may affect ocean-driven coastal flooding regimes by both raising the mean sea level (msl) and altering ocean-atmosphere interactions. For reliable projections of coastal flood risk, information provided by different climate models must be considered in addition to associated uncertainties. In this paper, we propose a framework to project future coastal water levels and quantify the resulting flooding hazard to infrastructure. We use Bayesian Model Averaging to generate a weighted ensemble of storm surge predictions from eight climate models for two coastal counties in California. The resulting ensembles combined with msl projections, and predicted astronomical tides are then used to quantify changes in the likelihood of road flooding under representative concentration pathways 4.5 and 8.5 in the near-future (1998-2063) and mid-future (2018-2083). The results show that road flooding rates will be significantly higher in the near-future and mid-future compared to the recent past (1950-2015) if adaptation measures are not implemented.
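A minimal sketch of the weighted-ensemble step described above: each climate model receives a weight reflecting its skill against observations over a training period, and the projection is the weighted combination. The simple likelihood-based weighting and the synthetic surge series below are assumptions for illustration; the paper's Bayesian Model Averaging implementation (typically fitted by expectation-maximization) is not reproduced here.

```python
import numpy as np

def bma_weights(model_hindcasts, observations, sigma=1.0):
    """Weight each model by its Gaussian likelihood against observations over a
    common training period (a simplified stand-in for fully fitted BMA)."""
    log_lik = np.array([
        -0.5 * np.sum((m - observations) ** 2) / sigma**2
        for m in model_hindcasts
    ])
    w = np.exp(log_lik - log_lik.max())
    return w / w.sum()

def bma_projection(weights, model_projections):
    """Weighted ensemble mean of the models' future projections."""
    return np.average(model_projections, axis=0, weights=weights)

rng = np.random.default_rng(1)
obs = rng.normal(0.0, 0.3, 50)                               # training-period surge anomalies
hindcasts = [obs + rng.normal(0, s, 50) for s in (0.2, 0.5, 1.0)]   # three toy models
projections = np.array([rng.normal(0.4, s, 50) for s in (0.2, 0.5, 1.0)])
w = bma_weights(hindcasts, obs)
print("weights:", np.round(w, 3))
print("ensemble-mean projection:", bma_projection(w, projections).mean())
```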
Can We Communicate with Other Generations?
ERIC Educational Resources Information Center
Mellert, Robert B.
Communicating with future generations is not merely a question of "how" to communicate, but also of "what." Our major moral responsibilities to future generations concern the size of future populations, conservation of nonrenewable resources, diversity of the gene pool, and quality of the environment. To determine our…
Bonsignori, Mattia
2014-11-01
The induction of HIV-1 broadly neutralizing antibodies (bnAbs) remains the primary goal of a preventive HIV-1 vaccine, but no HIV-1 vaccine candidate has succeeded in inducing bnAbs. All the bnAbs isolated from chronically HIV-1 infected subjects display one or more traits associated with control by host tolerance and immunoregulatory mechanisms, including reactivity against self antigens. Recent studies on an HIV-1 patient with concurrent systemic lupus erythematosus have informed how similar bnAbs are to typical autoantibodies controlled by immune tolerance mechanisms. Future studies aimed at elucidating the intersection between autoantibodies generated in the context of systemic lupus erythematosus and the development of HIV-1 bnAbs will further our knowledge of specific roadblocks that hamper the production of bnAbs and, ultimately, inform us on how to implement vaccine strategies to circumvent them.
Future of IT, PT and superconductivity technology
NASA Astrophysics Data System (ADS)
Tanaka, Shoji
2003-10-01
Recently, information technology has been developing very rapidly and the total traffic on the Internet is increasing dramatically. The numerous pieces of equipment connected to the Internet must be operated at very high speed, and the electricity consumed by the Internet is also increasing. Superconducting devices with very high speed and very low power consumption must therefore be introduced. These superconducting devices will play very important roles in the future information society. Coated conductors will be used to generate extremely high magnetic fields of beyond 20 T at low temperatures. At liquid nitrogen temperature they can find many applications in a wide range of Power Technology and other industries, since large critical currents and excellent magnetic field dependence have already been achieved in some prototypes of coated conductors. It is becoming certain that the market for superconductivity technology will open between the years 2005 and 2010.
Real time forecasting of near-future evolution.
Gerrish, Philip J; Sniegowski, Paul D
2012-09-07
A metaphor for adaptation that informs much evolutionary thinking today is that of mountain climbing, where horizontal displacement represents change in genotype, and vertical displacement represents change in fitness. If it were known a priori what the 'fitness landscape' looked like, that is, how the myriad possible genotypes mapped onto fitness, then the possible paths up the fitness mountain could each be assigned a probability, thus providing a dynamical theory with long-term predictive power. Such detailed genotype-fitness data, however, are rarely available and are subject to change with each change in the organism or in the environment. Here, we take a very different approach that depends only on fitness or phenotype-fitness data obtained in real time and requires no a priori information about the fitness landscape. Our general statistical model of adaptive evolution builds on classical theory and gives reasonable predictions of fitness and phenotype evolution many generations into the future.
Mechanical vs. informational components of price impact
NASA Astrophysics Data System (ADS)
Doyne Farmer, J.; Zamani, N.
2007-01-01
We study the problem of what causes prices to change. It is well known that trading impacts prices — orders to buy drive the price up, and orders to sell drive it down. We introduce a means of decomposing the total impact of trading into two components, defining the mechanical impact of a trading order as the change in future prices in the absence of any future changes in decision making, and the informational impact as the remainder of the total impact once mechanical impact is removed. This decomposition is performed using order book data from the London Stock Exchange. The average mechanical impact of a market order decays to zero as a function of time, at an asymptotic rate that is consistent with a power law with an exponent of roughly 1.7. In contrast the average informational impact builds to approach a constant value. Initially the impact is entirely mechanical, and is about half as big as the asymptotic informational impact. The size of the informational impact is positively correlated to mechanical impact. For cases where the mechanical impact is zero for all times, we find that the informational impact is negative, i.e. buy market orders that have no mechanical impact at all generate strong negative price responses.
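The decomposition described above lends itself to a simple numerical sketch: given a total impact curve and a mechanical impact curve (the latter obtained in the paper by replaying the order book with no future changes in decision making), the informational component is their difference, and the mechanical decay can be checked against a power law. The arrays below are synthetic assumptions, not London Stock Exchange data.

```python
import numpy as np

def decompose_impact(total_impact, mechanical_impact):
    """Informational impact = total impact minus mechanical impact,
    evaluated at matching time lags."""
    return total_impact - mechanical_impact

def fit_power_law_decay(lags, mechanical_impact):
    """Least-squares slope of log(impact) vs log(lag); the abstract reports
    an exponent of roughly 1.7 for the mechanical component."""
    mask = mechanical_impact > 0
    slope, _ = np.polyfit(np.log(lags[mask]), np.log(mechanical_impact[mask]), 1)
    return -slope  # decay exponent

lags = np.arange(1, 200, dtype=float)
mechanical = 0.5 * lags ** -1.7            # synthetic mechanical impact, decaying to zero
informational = 1.0 - np.exp(-lags / 30)   # synthetic informational impact, building to a constant
total = mechanical + informational
print("informational impact at lag 100:", decompose_impact(total, mechanical)[99])
print("fitted mechanical decay exponent:", round(fit_power_law_decay(lags, mechanical), 2))
```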
Mori, Chisato; Todaka, Emiko
2009-01-01
Recently, we have investigated the relationship between environment and health from a scientific perspective and developed a new academic field, "Sustainable Health Science", that will contribute to creating a healthy environment for future generations. There are three key points in Sustainable Health Science. The first key point is "focusing on future generations": society should improve the environment and prevent possible adverse health effects on future generations (Environmental Preventive Medicine). The second key point is the "precautionary principle". The third key point is "transdisciplinary science", which means that not only medical science but also other scientific fields, such as architectural and engineering science, should be involved. Here, we introduce our recent challenging project, the "Chemiless Town Project", in which a model town is under construction with fewer chemicals. In the project, a trial of an education program and a health-examination system for chemical exposure is going to be conducted. In the future, we aim to establish health examinations of chemical exposure for women of reproductive age so that the risk of adverse health effects to future generations will decrease and they can enjoy a better quality of life. We hope that society will accept the importance of forming a sustainable society for future generations not only with regard to chemicals but also to the whole surrounding environment. As the proverb of American native people tells us, we should live considering the effects on seven generations in the future.
Universally-Usable Interactive Electronic Physics Instructional And Educational Materials
NASA Astrophysics Data System (ADS)
Gardner, John
2000-03-01
Recent developments of technologies that promote full accessibility of electronic information by future generations of people with print disabilities will be described. ("Print disabilities" include low vision, blindness, and dyslexia.) The guiding philosophy of these developments is that information should be created and transmitted in a form that is as display-independent as possible, and that the user should have maximum freedom over how that information is to be displayed. This philosophy leads to maximum usability by everybody and is, in the long run, the only way to assure truly equal access. Research efforts to be described include access to mathematics and scientific notation and to graphs, tables, charts, diagrams, and general object-oriented graphics.
Reading, Jeff; Nowgesic, Earl
2002-09-01
In the past and in the present, research studies and media reports have focused on pathology and dysfunction in aboriginal communities and have often failed to present a true and complete picture of the aboriginal experience. The Canadian Institutes of Health Research Institute of Aboriginal Peoples' Health is a national strategic research initiative led by both the aboriginal and research communities. This initiative aims to improve aboriginal health information, develop research capacity, better translate research into practice, and inform public health policy with the goal of improving the health of indigenous peoples.
[Technological convergence will quickly generate disruptive innovations in oncology].
Coucke, Ph A
2016-06-01
Convergence between information and communication technology and recent developments in medical care will totally change the health care sector. The way we perform diagnosis, treatment and follow-up will undergo disruptive changes in the very near future. We intend to illustrate this statement with a limited selection of examples of radical innovations, especially in the field of oncology. To be totally disruptive and to illustrate the concept of "lateral power" - especially cognitive distribution - the list of references is made up only of internet links. Anyone - patients included - can easily and instantly access this information everywhere.
Reading, Jeff; Nowgesic, Earl
2002-01-01
In the past and in the present, research studies and media reports have focused on pathology and dysfunction in aboriginal communities and have often failed to present a true and complete picture of the aboriginal experience. The Canadian Institutes of Health Research Institute of Aboriginal Peoples’ Health is a national strategic research initiative led by both the aboriginal and research communities. This initiative aims to improve aboriginal health information, develop research capacity, better translate research into practice, and inform public health policy with the goal of improving the health of indigenous peoples. PMID:12197963
Trust evaluation in health information on the World Wide Web.
Moturu, Sai T; Liu, Huan; Johnson, William G
2008-01-01
The impact of health information on the web is mounting and with the Health 2.0 revolution around the corner, online health promotion and management is becoming a reality. User-generated content is at the core of this revolution and brings to the fore the essential question of trust evaluation, a pertinent problem for health applications in particular. Evolving Web 2.0 health applications provide abundant opportunities for research. We identify these applications, discuss the challenges for trust assessment, characterize conceivable variables, list potential techniques for analysis, and provide a vision for future research.
Srinivasa Rao, Mathukumalli; Swathi, Pettem; Rama Rao, Chitiprolu Anantha; Rao, K. V.; Raju, B. M. K.; Srinivas, Karlapudi; Manimanjari, Dammu; Maheswari, Mandapaka
2015-01-01
The present study features the estimation of the number of generations of tobacco caterpillar, Spodoptera litura Fab., on the peanut crop at six locations in India using MarkSim, which provides General Circulation Model (GCM) future data on daily maximum (T.max) and minimum (T.min) air temperatures from six models viz., BCCR-BCM2.0, CNRM-CM3, CSIRO-Mk3.5, ECHams5, INCM-CM3.0 and MIROC3.2, along with an ensemble of the six, from three emission scenarios (A2, A1B and B1). These data were used to predict the future pest scenarios following the growing degree days approach in four different climate periods viz., Baseline-1975, Near future (NF)-2020, Distant future (DF)-2050 and Very Distant future (VDF)-2080. It is predicted that more generations would occur during the three future climate periods with significant variation among scenarios and models. Among the seven models, 1–2 additional generations were predicted during DF and VDF due to higher future temperatures in the CNRM-CM3, ECHams5 and CSIRO-Mk3.5 models. The temperature projections of these models indicated that the generation time would decrease by 18–22% over baseline. Analysis of variance (ANOVA) was used to partition the variation in the predicted number of generations and generation time of S. litura on peanut during the crop season. Geographical location explained 34% of the total variation in number of generations, followed by time period (26%), model (1.74%) and scenario (0.74%). The remaining 14% of the variation was explained by interactions. The increased number of generations and reduced generation time across the six peanut growing locations of India suggest that the incidence of S. litura may increase due to the projected increase in temperatures in future climate change periods. PMID:25671564
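The growing-degree-days approach used in this study can be sketched as follows: daily degree days above a lower developmental threshold are accumulated from daily maximum and minimum temperatures, and a generation is counted each time the accumulation reaches the thermal constant for a full life cycle. The threshold, thermal constant, and toy temperature series below are illustrative assumptions, not the parameters used for S. litura in the paper.

```python
import numpy as np

def degree_days(t_max, t_min, t_base):
    """Simple average-method daily degree days above a base temperature."""
    return np.maximum((t_max + t_min) / 2.0 - t_base, 0.0)

def generations_per_season(t_max, t_min, t_base=10.0, thermal_constant=450.0):
    """Count completed generations: one per accumulation of the assumed
    thermal constant (degree days required for a full life cycle)."""
    dd = degree_days(t_max, t_min, t_base)
    return int(np.floor(dd.sum() / thermal_constant))

rng = np.random.default_rng(2)
days = 120                                   # a toy peanut crop season
t_min = 22 + rng.normal(0, 1.5, days)
t_max = t_min + 10 + rng.normal(0, 1.5, days)
baseline = generations_per_season(t_max, t_min)
warmer = generations_per_season(t_max + 2.0, t_min + 2.0)   # a +2 degC future scenario
print("baseline generations:", baseline, "| +2 degC generations:", warmer)
```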
Assimilating the Future for Better Forecasts and Earlier Warnings
NASA Astrophysics Data System (ADS)
Du, H.; Wheatcroft, E.; Smith, L. A.
2016-12-01
Multi-model ensembles have become popular tools to account for some of the uncertainty due to model inadequacy in weather and climate simulation-based predictions. Current multi-model forecasts focus on combining single-model ensemble forecasts by means of statistical post-processing. Assuming each model is developed independently or with different primary target variables, each is likely to contain different dynamical strengths and weaknesses. Using statistical post-processing, such information is only carried by the simulations under a single model ensemble: no advantage is taken to influence simulations under the other models. A novel methodology, named Multi-model Cross Pollination in Time, is proposed as a multi-model ensemble scheme with the aim of operationally integrating the dynamical information regarding the future from each individual model. The proposed approach generates model states in time by applying data assimilation scheme(s) to yield truly "multi-model trajectories". It is demonstrated to outperform traditional statistical post-processing in the 40-dimensional Lorenz96 flow. Data assimilation approaches are originally designed to improve state estimation from the past to the current time. The aim of this talk is to introduce a framework that uses data assimilation to improve model forecasts at future times (not to argue for any one particular data assimilation scheme). An illustration of applying data assimilation "in the future" to provide early warning of future high-impact events is also presented.
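The idea of exchanging dynamical information between imperfect models by assimilating in forecast time, rather than only post-processing their outputs, can be illustrated very roughly on the 40-variable Lorenz96 system mentioned above. The two imperfect forcings, the nudging-style assimilation, and the state-exchange step below are all assumptions made for illustration; they are not the scheme actually used in the work.

```python
import numpy as np

N, F_TRUE, DT = 40, 8.0, 0.01          # 40-variable Lorenz96, as in the abstract

def tendency(x, forcing):
    """Lorenz96 right-hand side."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def rk4_step(x, forcing, dt=DT):
    k1 = tendency(x, forcing)
    k2 = tendency(x + 0.5 * dt * k1, forcing)
    k3 = tendency(x + 0.5 * dt * k2, forcing)
    k4 = tendency(x + dt * k3, forcing)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

rng = np.random.default_rng(3)
truth = rng.normal(size=N)
for _ in range(1000):                   # spin-up onto the attractor
    truth = rk4_step(truth, F_TRUE)

m1 = truth + rng.normal(0, 0.1, N)      # two imperfect models, slightly perturbed start
m2 = m1.copy()

STEPS, ASSIM_EVERY, GAIN, OBS_NOISE = 500, 25, 0.5, 0.5
for step in range(1, STEPS + 1):
    truth = rk4_step(truth, F_TRUE)
    m1 = rk4_step(m1, 7.0)              # model 1: forcing too weak
    m2 = rk4_step(m2, 9.0)              # model 2: forcing too strong
    if step % ASSIM_EVERY == 0:
        obs = truth + rng.normal(0, OBS_NOISE, N)   # noisy pseudo-observations
        # nudging-style assimilation of the same observations into each model,
        # then a state exchange so the ongoing trajectory mixes both models
        m1 += GAIN * (obs - m1)
        m2 += GAIN * (obs - m2)
        pooled = 0.5 * (m1 + m2)
        m1, m2 = pooled.copy(), pooled.copy()

rmse = np.sqrt(np.mean((0.5 * (m1 + m2) - truth) ** 2))
print("RMSE of the cross-pollinated trajectory vs truth:", round(rmse, 3))
```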
Water supply as a constraint on transmission expansion planning in the Western interconnection
NASA Astrophysics Data System (ADS)
Tidwell, Vincent C.; Bailey, Michael; Zemlick, Katie M.; Moreland, Barbara D.
2016-12-01
Consideration of water supply in transmission expansion planning (TEP) provides a valuable means of managing impacts of thermoelectric generation on limited water resources. Toward this opportunity, thermoelectric water intensity factors and water supply availability (fresh and non-fresh sources) were incorporated into a recent TEP exercise conducted for the electric interconnection in the Western United States. The goal was to inform the placement of new thermoelectric generation so as to minimize issues related to water availability. Although freshwater availability is limited in the West, few instances across five TEP planning scenarios were encountered where water availability impacted the development of new generation. This unexpected result was related to planning decisions that favored the development of low water use generation that was geographically dispersed across the West. These planning decisions were not made because of their favorable influence on thermoelectric water demand; rather, on the basis of assumed future fuel and technology costs, policy drivers and the topology of electricity demand. Results also projected that interconnection-wide thermoelectric water consumption would increase by 31% under the business-as-usual case, while consumption would decrease by 42% under a scenario assuming a low-carbon future. Except in a few instances, new thermoelectric water consumption could be accommodated with less than 10% of the local available water supply; however, limited freshwater supplies and state-level policies could increase use of non-fresh water sources for new thermoelectric generation. Results could have been considerably different if scenarios favoring higher-intensity water use generation technology or potential impacts of climate change had been explored. Conduct of this exercise highlighted the importance of integrating water into all phases of TEP, particularly joint management of decisions that are both directly (e.g., water availability constraint) and indirectly (technology or policy constraints) related to future thermoelectric water demand, as well as the careful selection of scenarios that adequately bound the potential dimensions of water impact.
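The screening logic described above, in which new thermoelectric generation is sited only where its consumptive water demand fits within a share of locally available supply, reduces to simple arithmetic over water intensity factors. The plant size, capacity factor, intensity factor, and supply figure below are illustrative assumptions, not values from the study.

```python
def annual_water_demand_acre_feet(capacity_mw, capacity_factor, gal_per_mwh):
    """Consumptive water demand of a thermoelectric plant per year."""
    mwh_per_year = capacity_mw * capacity_factor * 8760
    gallons = mwh_per_year * gal_per_mwh
    return gallons / 325_851          # gallons per acre-foot

def siting_ok(capacity_mw, capacity_factor, gal_per_mwh,
              available_supply_af, max_fraction=0.10):
    """Flag a candidate site if new demand would exceed a set share
    (here 10%, echoing the threshold mentioned in the abstract)
    of the locally available water supply."""
    demand = annual_water_demand_acre_feet(capacity_mw, capacity_factor, gal_per_mwh)
    return demand <= max_fraction * available_supply_af, demand

# e.g. a 500 MW gas plant with wet cooling (~200 gal/MWh assumed), 60% capacity factor
ok, demand = siting_ok(500, 0.60, 200, available_supply_af=40_000)
print("fits within 10%% of supply: %s (demand %.0f acre-feet/yr)" % (ok, demand))
```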
Water supply as a constraint on transmission expansion planning in the Western interconnection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tidwell, Vincent C.; Bailey, Michael; Zemlick, Katie M.
Here, consideration of water supply in transmission expansion planning (TEP) provides a valuable means of managing impacts of thermoelectric generation on limited water resources. Toward this opportunity, thermoelectric water intensity factors and water supply availability (fresh and non-fresh sources) were incorporated into a recent TEP exercise conducted for the electric interconnection in the Western United States. The goal was to inform the placement of new thermoelectric generation so as to minimize issues related to water availability. Although freshwater availability is limited in the West, few instances across five TEP planning scenarios were encountered where water availability impacted the development of new generation. This unexpected result was related to planning decisions that favored the development of low water use generation that was geographically dispersed across the West. These planning decisions were not made because of their favorable influence on thermoelectric water demand; rather, on the basis of assumed future fuel and technology costs, policy drivers and the topology of electricity demand. Results also projected that interconnection-wide thermoelectric water consumption would increase by 31% under the business-as-usual case, while consumption would decrease by 42% under a scenario assuming a low-carbon future. Except in a few instances, new thermoelectric water consumption could be accommodated with less than 10% of the local available water supply; however, limited freshwater supplies and state-level policies could increase use of non-fresh water sources for new thermoelectric generation. Results could have been considerably different if scenarios favoring higher-intensity water use generation technology or potential impacts of climate change had been explored. Conduct of this exercise highlighted the importance of integrating water into all phases of TEP, particularly joint management of decisions that are both directly (e.g., water availability constraint) and indirectly (technology or policy constraints) related to future thermoelectric water demand, as well as the careful selection of scenarios that adequately bound the potential dimensions of water impact.
Water supply as a constraint on transmission expansion planning in the Western interconnection
Tidwell, Vincent C.; Bailey, Michael; Zemlick, Katie M.; ...
2016-11-21
Here, consideration of water supply in transmission expansion planning (TEP) provides a valuable means of managing impacts of thermoelectric generation on limited water resources. Toward this opportunity, thermoelectric water intensity factors and water supply availability (fresh and non-fresh sources) were incorporated into a recent TEP exercise conducted for the electric interconnection in the Western United States. The goal was to inform the placement of new thermoelectric generation so as to minimize issues related to water availability. Although freshwater availability is limited in the West, few instances across five TEP planning scenarios were encountered where water availability impacted the development of new generation. This unexpected result was related to planning decisions that favored the development of low water use generation that was geographically dispersed across the West. These planning decisions were not made because of their favorable influence on thermoelectric water demand; rather, on the basis of assumed future fuel and technology costs, policy drivers and the topology of electricity demand. Results also projected that interconnection-wide thermoelectric water consumption would increase by 31% under the business-as-usual case, while consumption would decrease by 42% under a scenario assuming a low-carbon future. Except in a few instances, new thermoelectric water consumption could be accommodated with less than 10% of the local available water supply; however, limited freshwater supplies and state-level policies could increase use of non-fresh water sources for new thermoelectric generation. Results could have been considerably different if scenarios favoring higher-intensity water use generation technology or potential impacts of climate change had been explored. Conduct of this exercise highlighted the importance of integrating water into all phases of TEP, particularly joint management of decisions that are both directly (e.g., water availability constraint) and indirectly (technology or policy constraints) related to future thermoelectric water demand, as well as the careful selection of scenarios that adequately bound the potential dimensions of water impact.
The Revised WIPP Passive Institutional Controls Program - A Conceptual Plan - 13145
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patterson, Russ; Klein, Thomas; Van Luik, Abraham
2013-07-01
The Department of Energy/Carlsbad Field Office (DOE/CBFO) is responsible for managing all activities related to the disposal of TRU and TRU-mixed waste in the geologic repository, 650 m below the land surface, at WIPP, near Carlsbad, New Mexico. The main function of the Passive Institutional Controls (PICs) program is to inform future generations of the long-lived radioactive wastes buried beneath their feet in the desert. For the first 100 years after cessation of disposal operations, once the rooms are closed and the shafts leading underground are sealed, WIPP is mandated by law to institute Active Institutional Controls (AICs) with fences, gates, and armed guards on patrol. At the same time, a plan must be in place for how to warn and inform the future, after the AICs are gone, of the consequences of intrusion into the geologic repository disposal area. A plan was put into place during the 1990s covering records management and storage, awareness triggers, permanent marker design concepts, and testing schedules. This work included the input of expert panels and individuals. The plan held up under peer review and met the requirements of the U.S. Environmental Protection Agency (EPA). Today the NEA is coordinating a study called the 'Preservation of Records, Knowledge and Memory (RK and M) Across Generations' to provide the international nuclear waste repository community with a guide on how nuclear record archive programs should be approached and developed. CBFO is cooperating and participating in this project and will apply the knowledge gained to the WIPP program. At the same time, CBFO is well aware that the EPA and others expect DOE to move forward with planning for the future WIPP PICs program, so a plan will be in place in time for WIPP's closure, slated for the early 2030s. The DOE/CBFO WIPP PICs program in place today meets the regulatory criteria, but complete feasibility of implementation is questionable, and it may not conform with the international guidance being developed. That guidance may suggest that the inter-generational equity principle strives to warn the future while not unduly burdening present generations. Building markers and monuments that are out of proportion to the risk presented to the future is not in keeping with generational equity. With this in mind, the DOE/CBFO is developing conceptual plans for re-evaluating and revising the current WIPP PICs program. These conceptual plans will identify scientific and technical work that must be completed to develop a 'new' PICs program that blends the best ideas of the present plan with new ideas from the RK and M project and takes proposed alternative permanent marker designs and materials into consideration. (authors)
Data Acquisition and Mass Storage
NASA Astrophysics Data System (ADS)
Vande Vyvre, P.
2004-08-01
The experiments performed at supercolliders will constitute a new challenge in several disciplines of High Energy Physics and Information Technology. This will definitely be the case for data acquisition and mass storage. The microelectronics, communication, and computing industries are maintaining an exponential increase of the performance of their products. The market of commodity products remains the largest and the most competitive market of technology products. This constitutes a strong incentive to use these commodity products extensively as components to build the data acquisition and computing infrastructures of the future generation of experiments. The present generation of experiments in Europe and in the US already constitutes an important step in this direction. The experience acquired in the design and the construction of the present experiments has to be complemented by a large R&D effort executed with good awareness of industry developments. The future experiments will also be expected to follow major trends of our present world: deliver physics results faster and become more and more visible and accessible. The present evolution of the technologies and the burgeoning of GRID projects indicate that these trends will be made possible. This paper includes a brief overview of the technologies currently used for the different tasks of the experimental data chain: data acquisition, selection, storage, processing, and analysis. The major trends of the computing and networking technologies are then indicated with particular attention paid to their influence on the future experiments. Finally, the vision of future data acquisition and processing systems and their promise for future supercolliders is presented.
Future Dietitian 2025: informing the development of a workforce strategy for dietetics.
Hickson, M; Child, J; Collinson, A
2018-02-01
Healthcare is changing and the professions that deliver it need to adapt and change too. The aim of this research was to inform the development of a workforce strategy for Dietetics for 2020-2030. This included an understanding of the drivers for change, the views of stakeholders and recommendations to prepare the profession for the future. The research included three phases: (i) establishing the context which included a literature and document review (environmental scan); (ii) discovering the profession and professional issues using crowd-sourcing technology; and (iii) articulating the vision for the future using appreciative inquiry. The environmental scan described the current status of the dietetic profession, the changing healthcare environment, the context in which dietitians work and what future opportunities exist for the profession. The online conversation facilitated by crowd-sourcing technology asked the question: 'How can dietitians strengthen their future role, influence and impact?' Dietitians and interested stakeholders (726 and 109, respectively) made 6130 contributions. Seven priorities were identified and fed into the appreciative inquiry event. The event brought together 54 dietitians and analysis of the discussions generated five themes: (i) professional identity; (ii) strong foundations-creating structure and direction for the profession; (iii) amplifying visibility and influence; (iv) embracing advances in science and technology; and (v) career advancement and emerging opportunities. A series of recommendations were made for the next steps in moving the workforce to a new future. The future for dietetics looks bright, embracing technology, as well as exploring different ways of working and new opportunities, as this dynamic profession continues to evolve. © 2017 The British Dietetic Association Ltd.
Evidence and Options for Informed Decision-Making to Achieve Arctic Sustainability
NASA Astrophysics Data System (ADS)
Berkman, P. A.
2017-12-01
This presentation will consider the development of evidence and options for informed decision-making that will need to operate across generations to achieve Arctic sustainability. Context of these Arctic decisions is global, recognizing that we live in an interconnected civilization on a planetary scale, as revealed unambiguously with evidence from the 'world' wars in the first half of the 20th century. First, for clarification, data and evidence are not the same. Data is generated from information and observations to answer specific questions, posed with methods from the natural and social sciences as well as indigenous knowledge. These data reveal patterns and trends in our societies and natural world, underscoring the evidence for decisions to address impacts, issues and resources within, across and beyond the boundaries of nations - recognizing that nations still are the principal jurisdictional unit. However, for this decision-support process to be effective, options (without advocacy) - which can be used or ignored explicitly - must be generated from the evidence, taking into consideration stakeholder perspectives and governance records in a manner that will contribute to informed decision-making. The resulting decisions will involve built elements that require capitalization and technology as well as governance mechanisms coming from diverse jurisdictional authorities. The innovation required is to balance economic prosperity, environmental protection and societal well-being. These three pillars of sustainability further involve stability, balancing urgencies of the moment and of future generations, recognizing that children born today will be alive in the 22nd century. Consequently, options for informed decisions must operate across a continuum of urgencies from security time scales to sustainability time scales. This decision-support process is holistic (international, interdisciplinary and inclusive), reflecting the applications of science diplomacy to balance national interests and common interests for the benefit of all on Earth.
Vaccine Hesitancy and Online Information: The Influence of Digital Networks.
Getman, Rebekah; Helmi, Mohammad; Roberts, Hal; Yansane, Alfa; Cutler, David; Seymour, Brittany
2017-12-01
This article analyzes the digital childhood vaccination information network for vaccine-hesitant parents. The goal of this study was to explore the structure and influence of vaccine-hesitant content online by generating a database and network analysis of vaccine-relevant content. We used Media Cloud, a searchable big-data platform of over 550 million stories from 50,000 media sources, for quantitative and qualitative study of an online media sample based on keyword selection. We generated a hyperlink network map and measured indegree centrality of the sources and vaccine sentiment for a random sample of 450 stories. A total of 28,122 publications from 4,817 sources met inclusion criteria. Clustered communities formed based on shared hyperlinks; communities tended to link within, not among, each other. The plurality of information was provaccine (46.44%, 95% confidence interval [39.86%, 53.20%]). The most influential sources were in the health community (National Institutes of Health, Centers for Disease Control and Prevention) or mainstream media (New York Times); some user-generated sources also had strong influence and were provaccine (Wikipedia). The vaccine-hesitant community rarely interacted with provaccine content and simultaneously used primary provaccine content within vaccine-hesitant narratives. The sentiment of the overall conversation was consistent with scientific evidence. These findings demonstrate an online environment where scientific evidence online drives vaccine information outside of the vaccine-hesitant community but is also prominently used and misused within the robust vaccine-hesitant community. Future communication efforts should take current context into account; more information may not prevent vaccine hesitancy.
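For readers unfamiliar with the network measure used above, the following minimal sketch (Python with the networkx library; the source names and links are invented placeholders, not data from the study) illustrates how in-degree centrality ranks the most-linked-to sources in a directed hyperlink graph.

import networkx as nx

# Each edge u -> v means "page u links to page v"; names are hypothetical.
hyperlinks = [
    ("blog_a", "cdc.gov"), ("blog_b", "cdc.gov"), ("blog_b", "nih.gov"),
    ("forum_c", "blog_a"), ("news_d", "cdc.gov"), ("news_d", "wikipedia.org"),
]

G = nx.DiGraph()
G.add_edges_from(hyperlinks)

# In-degree centrality: fraction of other nodes that link to each source.
centrality = nx.in_degree_centrality(G)
for source, score in sorted(centrality.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{source:15s} {score:.2f}")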
Next-Generation Terrestrial Laser Scanning to Measure Forest Canopy Structure
NASA Astrophysics Data System (ADS)
Danson, M.
2015-12-01
Terrestrial laser scanners (TLS) are now capable of semi-automatic reconstruction of the structure of complete trees or forest stands and have the potential to provide detailed information on tree architecture and foliage biophysical properties. The trends for the next generation of TLS are towards higher resolution, faster scanning and full-waveform data recording, with mobile, multispectral laser devices. The convergence of these technological advances in the next generation of TLS will allow the production of information for forest and woodland mapping and monitoring that is far more detailed, more accurate, and more comprehensive than any available today. This paper describes recent scientific advances in the application of TLS for characterising forest and woodland areas, drawing on the authors' development of the Salford Advanced Laser Canopy Analyser (SALCA), the activities of the Terrestrial Laser Scanner International Interest Group (TLSIIG), and recent advances in laser scanner technology around the world. The key findings illustrated in the paper are that (i) a complete understanding of system measurement characteristics is required for quantitative analysis of TLS data, (ii) full-waveform data recording is required for extraction of forest biophysical variables and, (iii) multi-wavelength systems provide additional spectral information that is essential for classifying different vegetation components. The paper uses a range of recent experimental TLS measurements to support these findings, and sets out a vision for new research to develop an information-rich future-forest information system, populated by mobile autonomous multispectral TLS devices.
Quantum key distribution session with 16-dimensional photonic states.
Etcheverry, S; Cañas, G; Gómez, E S; Nogueira, W A T; Saavedra, C; Xavier, G B; Lima, G
2013-01-01
The secure transfer of information is an important problem in modern telecommunications. Quantum key distribution (QKD) provides a solution to this problem by using individual quantum systems to generate correlated bits between remote parties, that can be used to extract a secret key. QKD with D-dimensional quantum channels provides security advantages that grow with increasing D. However, the vast majority of QKD implementations has been restricted to two dimensions. Here we demonstrate the feasibility of using higher dimensions for real-world quantum cryptography by performing, for the first time, a fully automated QKD session based on the BB84 protocol with 16-dimensional quantum states. Information is encoded in the single-photon transverse momentum and the required states are dynamically generated with programmable spatial light modulators. Our setup paves the way for future developments in the field of experimental high-dimensional QKD.
Replacement Sequence of Events Generator
NASA Technical Reports Server (NTRS)
Fisher, Forest; Gladden, Roy; Wenkert, Daniel; Khanampompan, Teerpat
2008-01-01
The soeWINDOW program automates the generation of an ITAR (International Traffic in Arms Regulations)-compliant sub-RSOE (Replacement Sequence of Events) by extracting a specified temporal window from an RSOE while maintaining page header information. RSOEs contain a significant amount of information that is not ITAR-compliant, yet foreign partners need to see the command details for their instrument, as well as the surrounding commands that provide context for validation. soeWINDOW can serve as an example of how command support products can be made ITAR-compliant for future missions. This software is a Perl script intended for use in the mission operations UNIX environment. It is designed to support the MRO (Mars Reconnaissance Orbiter) instrument team. The tool also provides automated DOM (Distributed Object Manager) storage into the special ITAR-okay DOM collection, and can be used for creating focused RSOEs for product review by any of the MRO teams.
Quantum key distribution session with 16-dimensional photonic states
NASA Astrophysics Data System (ADS)
Etcheverry, S.; Cañas, G.; Gómez, E. S.; Nogueira, W. A. T.; Saavedra, C.; Xavier, G. B.; Lima, G.
2013-07-01
The secure transfer of information is an important problem in modern telecommunications. Quantum key distribution (QKD) provides a solution to this problem by using individual quantum systems to generate correlated bits between remote parties, that can be used to extract a secret key. QKD with D-dimensional quantum channels provides security advantages that grow with increasing D. However, the vast majority of QKD implementations has been restricted to two dimensions. Here we demonstrate the feasibility of using higher dimensions for real-world quantum cryptography by performing, for the first time, a fully automated QKD session based on the BB84 protocol with 16-dimensional quantum states. Information is encoded in the single-photon transverse momentum and the required states are dynamically generated with programmable spatial light modulators. Our setup paves the way for future developments in the field of experimental high-dimensional QKD.
Quantum key distribution session with 16-dimensional photonic states
Etcheverry, S.; Cañas, G.; Gómez, E. S.; Nogueira, W. A. T.; Saavedra, C.; Xavier, G. B.; Lima, G.
2013-01-01
The secure transfer of information is an important problem in modern telecommunications. Quantum key distribution (QKD) provides a solution to this problem by using individual quantum systems to generate correlated bits between remote parties, that can be used to extract a secret key. QKD with D-dimensional quantum channels provides security advantages that grow with increasing D. However, the vast majority of QKD implementations has been restricted to two dimensions. Here we demonstrate the feasibility of using higher dimensions for real-world quantum cryptography by performing, for the first time, a fully automated QKD session based on the BB84 protocol with 16-dimensional quantum states. Information is encoded in the single-photon transverse momentum and the required states are dynamically generated with programmable spatial light modulators. Our setup paves the way for future developments in the field of experimental high-dimensional QKD. PMID:23897033
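For readers unfamiliar with how a D-dimensional BB84 session yields a key, the toy simulation below (a hedged sketch, not the authors' experimental code) illustrates only the basis-sifting step; modeling a mismatched-basis measurement as a discarded, uniformly random outcome is exact for mutually unbiased bases.

import random
import math

D = 16          # dimension of each photonic state (log2(16) = 4 bits per symbol)
N = 10_000      # number of transmitted states

alice_symbols = [random.randrange(D) for _ in range(N)]
alice_bases   = [random.randrange(2) for _ in range(N)]   # 0: computational, 1: Fourier
bob_bases     = [random.randrange(2) for _ in range(N)]

sifted = []
for symbol, basis_a, basis_b in zip(alice_symbols, alice_bases, bob_bases):
    if basis_a == basis_b:
        sifted.append(symbol)   # matching bases: Bob recovers Alice's symbol
    # mismatched bases give a uniformly random outcome and are discarded in sifting

print(f"sifted symbols: {len(sifted)} of {N} "
      f"(~{len(sifted) * math.log2(D):.0f} raw key bits)")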
Peeraer, Jef; Van Petegem, Peter
2015-02-01
Over the last two decades, crucial factors for Information and Communication Technology (ICT) in education have improved significantly in Vietnam. Nevertheless, it is clear that, as in other countries, no educational revolution is taking place. We argue that there is a need for a broad dialogue on the future of ICT in education in Vietnam as discussion of ideas about future possibilities can be instrumental in rationalizing and generating educational change. We explore how a group of key players representing the public and private sector as well as development partners in the field look at the future of ICT in education in the country. Following the Delphi method, these key players assessed in different survey rounds the current situation of ICT in education, identified a series of targets and were asked to assess these targets in respect of their importance. The key players reached a consensus that the purpose of technology integration is to achieve learning goals and enhance learning. However, there is more controversy on targets that could potentially transform education practice in Vietnam. We discuss the value of the Delphi technique and argue for increased participation of all involved stakeholders in policy development on ICT in education. Copyright © 2014 Elsevier Ltd. All rights reserved.
Effectiveness guidance document (EGD) for Chinese medicine trials: a consensus document
2014-01-01
Background There is a need for more Comparative Effectiveness Research (CER) on Chinese medicine (CM) to inform clinical and policy decision-making. This document aims to provide consensus advice for the design of CER trials on CM for researchers. It broadly aims to ensure more adequate design and optimal use of resources in generating evidence for CM to inform stakeholder decision-making. Methods The Effectiveness Guidance Document (EGD) development was based on multiple consensus procedures (survey, written Delphi rounds, interactive consensus workshop, international expert review). To balance aspects of internal and external validity, multiple stakeholders, including patients, clinicians, researchers and payers were involved in creating this document. Results Recommendations were developed for “using available data” and “future clinical studies”. The recommendations for future trials focus on randomized trials and cover the following areas: designing CER studies, treatments, expertise and setting, outcomes, study design and statistical analyses, economic evaluation, and publication. Conclusion The present EGD provides the first systematic methodological guidance for future CER trials on CM and can be applied to single or multi-component treatments. While CONSORT statements provide guidelines for reporting studies, EGDs provide recommendations for the design of future studies and can contribute to a more strategic use of limited research resources, as well as greater consistency in trial design. PMID:24885146
Rescorla, Leslie A
2016-10-01
As summarized in this commentary, the first generation of cross-informant agreement research focused on perceptions of child and adolescent mental health. Contributions of this research include demonstrating that modest cross-informant agreement is a very robust phenomenon, utilizing numerous statistical approaches to measure degree of agreement, and identifying many factors that moderate agreement. An important focus of this work has been using multi-society international comparisons to examine cross-cultural similarities and differences in cross-informant agreement. The articles in this Special Issue represent a significant paradigm shift in which cross-informant agreement is examined as an independent variable predicting a wide variety of outcomes. Furthermore, moving beyond perceptions of adolescent mental health, these articles compare parent and adolescent perceptions of diverse aspects of family functioning (e.g., family conflict, parent-adolescent communication, family relationships, parental authority). Additionally, the research presented in this Special Issue employs innovative and sophisticated statistical techniques. Although the Special Issue represents some first steps toward considering cross-cultural aspects of perceptions of family functioning, much work still needs to be done in this area. Some suggestions for future research strategies to accomplish this goal conclude this commentary.
Popular Imagination and Identity Politics: Reading the Future in "Star Trek: The Next Generation."
ERIC Educational Resources Information Center
Ott, Brian L.; Aoki, Eric
2001-01-01
Analyzes the television series "Star Trek: The Next Generation." Theorizes the relationship between collective visions of the future and the identity politics of the present. Argues that "The Next Generation" invites audiences to participate in a shared sense of the future that constrains human agency and (re)produces the…
Future generations, environmental ethics, and global environmental change
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tonn, B.E.
1994-12-31
The elements of a methodology to be employed by the global community to investigate the consequences of global environmental change upon future generations and global ecosystems are outlined in this paper. The methodology comprises two major components: a possible future worlds model; and a formal, citizen-oriented process to judge whether the possible future worlds potentially inheritable by future generations meet obligational standards. A broad array of descriptors of future worlds can be encompassed within this framework, including survival of ecosystems and other species and satisfaction of human concerns. The methodology expresses fundamental psychological motivations and human myths (journey, renewal, mother earth, and being-in-nature) and incorporates several viewpoints on obligations to future generations (maintaining options, fairness, humility, and the cause of humanity). The methodology overcomes several severe drawbacks of the economic-based methods most commonly used for global environmental policy analysis.
The Predictive Brain State: Timing Deficiency in Traumatic Brain Injury?
Ghajar, Jamshid; Ivry, Richard B.
2015-01-01
Attention and memory deficits observed in traumatic brain injury (TBI) are postulated to result from the shearing of white matter connections between the prefrontal cortex, parietal lobe, and cerebellum that are critical in the generation, maintenance, and precise timing of anticipatory neural activity. These fiber tracts are part of a neural network that generates predictions of future states and events, processes that are required for optimal performance on attention and working memory tasks. The authors discuss the role of this anticipatory neural system for understanding the varied symptoms and potential rehabilitation interventions for TBI. Preparatory neural activity normally allows the efficient integration of sensory information with goal-based representations. It is postulated that an impairment in the generation of this activity in traumatic brain injury (TBI) leads to performance variability as the brain shifts from a predictive to reactive mode. This dysfunction may constitute a fundamental defect in TBI as well as other attention disorders, causing working memory deficits, distractibility, a loss of goal-oriented behavior, and decreased awareness. “The future is not what is coming to meet us, but what we are moving forward to meet.” —Jean-Marie Guyau PMID:18460693
Data Preservation, Information Preservation, and Lifecycle of Information Management at NASA GES DISC
NASA Technical Reports Server (NTRS)
Khayat, Mo; Kempler, Steve; Deshong, Barbara; Johnson, James; Gerasimov, Irina; Esfandiari, Ed; Berganski, Michael; Wei, Jennifer
2014-01-01
Data lifecycle management awareness is common today; planners are more likely to consider lifecycle issues at mission start. NASA remote sensing missions are typically subject to life cycle management plans of the Distributed Active Archive Center (DAAC), and NASA invests in these national centers for the long-term safeguarding and benefit of future generations. As stewards of older missions, it is incumbent upon us to ensure that a comprehensive enough set of information is being preserved to prevent the risk of information loss. This risk is greater when the original data experts have moved on or are no longer available. Items like documentation related to processing algorithms, pre-flight calibration data, or input-output configuration parameters used in product generation are examples of digital artifacts that are sometimes not fully preserved. This is the grey area of information preservation; the importance of these items is not always clear and requires careful consideration. Missing important metadata about intermediate steps used to derive a product could lead to serious challenges in the reproducibility of results or conclusions. Organizations are rapidly recognizing that the focus of life-cycle preservation needs to be enlarged from the strict raw data to the more encompassing arena of information lifecycle management. By understanding what constitutes information, and the complexities involved, we are better equipped to deliver longer lasting value about the original data and derived knowledge (information) from them. The NASA Earth Science Data Preservation Content Specification is an attempt to define the content necessary for long-term preservation. It requires a new lifecycle infrastructure approach along with content repositories to accommodate artifacts other than just raw data. The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) set up an open-source Preservation System capable of long-term archive of digital content to augment its raw data holding. This repository is being used for such missions as HIRDLS, UARS, TOMS, OMI, among others. We will provide a status of this implementation; report on challenges, lessons learned, and detail our plans for future evolution to include other missions and services.
NASA Astrophysics Data System (ADS)
Khayat, M. G.; Deshong, B.; Esfandiari, A. E.; Gerasimov, I. V.; Johnson, J. E.; Kempler, S. J.; Wei, J. C.
2014-12-01
Data lifecycle management awareness is common today; planners are more likely to consider lifecycle issues at mission start. NASA remote sensing missions are typically subject to life cycle management plans of the Distributed Active Archive Center (DAAC), and NASA invests in these national centers for the long-term safeguarding and benefit of future generations. As stewards of older missions, it is incumbent upon us to ensure that a comprehensive enough set of information is being preserved to prevent the risk of "information loss". This risk is greater when the original data experts have moved on or are no longer available. Items like documentation related to processing algorithms, pre-flight calibration data, or input/output configuration parameters used in product generation are examples of digital artifacts that are sometimes not fully preserved. This is the grey area of "information preservation"; the importance of these items is not always clear and requires careful consideration. Missing important "metadata" about intermediate steps used to derive a product could lead to serious challenges in the reproducibility of results or conclusions. Organizations are rapidly recognizing that the focus of life-cycle preservation needs to be enlarged from the strict raw data to the more encompassing arena of "information lifecycle management". By understanding what constitutes information, and the complexities involved, we are better equipped to deliver longer lasting value about the original data and derived knowledge (information) from them. The "NASA Earth Science Data Preservation Content Specification" is an attempt to define the content necessary for long-term preservation. It requires a new lifecycle infrastructure approach along with content repositories to accommodate artifacts other than just raw data. The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) set up an open-source Preservation System capable of long-term archive of digital content to augment its raw data holding. This repository is being used for such missions as HIRDLS, UARS, TOMS, OMI, among others. We will provide a status of this implementation; report on challenges, lessons learned, and detail our plans for future evolution to include other missions and services.
Hypothetical Scenario Generator for Fault-Tolerant Diagnosis
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
The Hypothetical Scenario Generator for Fault-tolerant Diagnostics (HSG) is an algorithm being developed in conjunction with other components of artificial-intelligence systems for automated diagnosis and prognosis of faults in spacecraft, aircraft, and other complex engineering systems. By incorporating prognostic capabilities along with advanced diagnostic capabilities, these developments hold promise to increase the safety and affordability of the affected engineering systems by making it possible to obtain timely and accurate information on the statuses of the systems and to predict impending failures well in advance. The HSG is a specific instance of a hypothetical-scenario generator that implements an innovative approach for performing diagnostic reasoning when data are missing. The special purpose served by the HSG is to (1) look for all possible ways in which the present state of the engineering system can be mapped with respect to a given model and (2) generate a prioritized set of future possible states and the scenarios of which they are parts.
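To make the idea of a prioritized set of hypothetical scenarios concrete, the sketch below (illustrative Python, not the JPL implementation; the component names and priors are assumptions) enumerates the system states consistent with partial observations and ranks them by an assumed prior probability.

import heapq
import itertools

# Assumed toy model: each component is either 'ok' or 'faulty' with a prior probability.
priors = {"pump": {"ok": 0.95, "faulty": 0.05},
          "valve": {"ok": 0.90, "faulty": 0.10},
          "sensor": {"ok": 0.98, "faulty": 0.02}}

observed = {"pump": "ok"}          # data for 'valve' and 'sensor' are missing

def consistent(assignment):
    """A hypothesis is consistent if it agrees with every observed value."""
    return all(assignment[k] == v for k, v in observed.items())

ranked = []
components = list(priors)
for combo in itertools.product(*(priors[c] for c in components)):
    assignment = dict(zip(components, combo))
    if consistent(assignment):
        p = 1.0
        for c in components:
            p *= priors[c][assignment[c]]
        heapq.heappush(ranked, (-p, tuple(sorted(assignment.items()))))

# Pop scenarios in order of decreasing prior probability.
while ranked:
    neg_p, scenario = heapq.heappop(ranked)
    print(f"p={-neg_p:.4f}  {dict(scenario)}")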
Kwon, Yeondae; Natori, Yukikazu
2017-01-01
The proportion of the elderly population in most countries worldwide is increasing dramatically. Therefore, social interest in the fields of health, longevity, and anti-aging has been increasing as well. However, the basic research results obtained from a reductionist approach in biology and a bioinformatic approach in genome science have limited usefulness for generating insights on future health, longevity, and anti-aging-related research on a case by case basis. We propose a new approach that uses our literature mining technique and bioinformatics, which lead to a better perspective on research trends by providing an expanded knowledge base to work from. We demonstrate that our approach provides useful information that deepens insights on future trends which differs from data obtained conventionally, and this methodology is already paving the way for a new field in aging-related research based on literature mining. One compelling example of this is how our new approach can be a useful tool in drug repositioning. PMID:28817730
Docherty, Andrea; Sandhu, Harbinder
2006-01-01
WHAT IS ALREADY KNOWN IN THIS AREA • E-learning is being increasingly used within learning and teaching including its application within healthcare education and service provision. Multiple advantages have been identified including enhanced accessibility and increased flexibility of learning. Guidance on the generic design and development of e-learning courses has been generated. WHAT THIS WORK ADDS • This paper provides a detailed understanding of the barriers and facilitators to e-learning as perceived by students on a continuing professional development (CPD) course and highlights its multifaceted values. In addition, the paper provides evidence-based guidance for the development of courses within CPD utilising e-learning. SUGGESTIONS FOR FUTURE RESEARCH • Future research would benefit from focusing upon the perceptions of staff, including barriers and facilitators to the implementation of e-learning, and awareness of student experience to generate a balanced and informed understanding of e-learning within the context of CPD.
Overview of Existing and Future Residential Use Cases for Connected Thermostats
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rotondo, Julia; Johnson, Robert; Gonzales, Nancy
This paper is intended to help inform future technology deployment opportunities for connected thermostats (CTs), based on investigation and review of the U.S. residential housing and CT markets, as well as existing, emerging, and future use cases for CT hardware and CT-generated data. The CT market has experienced tremendous growth over the last five years — both in terms of the number of units sold and the number of firms offering competing products — and can be characterized by its rapid pace of technological innovation. Despite many assuming CTs would become powerful tools for increasing comfort while saving energy, there remains a great deal of uncertainty about the actual energy and cost savings that are likely to be realized from deployment of CTs, particularly under different conditions.
Event-Driven Technology to Generate Relevant Collections of Near-Realtime Data
NASA Astrophysics Data System (ADS)
Graves, S. J.; Keiser, K.; Nair, U. S.; Beck, J. M.; Ebersole, S.
2017-12-01
Getting the right data when it is needed continues to be a challenge for researchers and decision makers. Event-Driven Data Delivery (ED3), funded by the NASA Applied Science program, is a technology that allows researchers and decision makers to pre-plan what data, information and processes they need to have collected or executed in response to future events. The Information Technology and Systems Center at the University of Alabama in Huntsville (UAH) has developed the ED3 framework in collaboration with atmospheric scientists at UAH, scientists at the Geological Survey of Alabama, and other federal, state and local stakeholders to meet the data preparedness needs for research, decisions and situational awareness. The ED3 framework provides an API that supports the addition of loosely-coupled, distributed event handlers and data processes. This approach allows the easy addition of new events and data processes so the system can scale to support virtually any type of event or data process. Using ED3's underlying services, applications have been developed that monitor for alerts of registered event types and automatically trigger subscriptions that match new events, providing users with a living "album" of results that can continue to be curated as more information for an event becomes available. This capability can allow users to improve capacity for the collection, creation and use of data and real-time processes (data access, model execution, product generation, sensor tasking, social media filtering, etc.), in response to disaster (and other) events by preparing in advance for data and information needs for future events. This presentation will provide an update on the ED3 developments and deployments, and further explain the applicability for utilizing near-realtime data in hazards research, response and situational awareness.
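A minimal sketch of the pre-planned, event-driven subscription pattern described above follows (Python; the function names, event types and data processes are illustrative assumptions, not the ED3 API).

from collections import defaultdict

subscriptions = defaultdict(list)   # event type -> list of registered data processes
albums = defaultdict(list)          # event id   -> accumulated products ("album")

def subscribe(event_type, process):
    """Register a data process to run automatically for matching future events."""
    subscriptions[event_type].append(process)

def handle_event(event_type, event_id, metadata):
    """Trigger every matching subscription and curate the results into an album."""
    for process in subscriptions[event_type]:
        albums[event_id].append(process(metadata))

# Pre-planned processes registered before any event occurs (hypothetical examples).
subscribe("flood", lambda m: f"satellite imagery request for {m['region']}")
subscribe("flood", lambda m: f"stream-gauge time series near {m['region']}")

# A new event alert arrives later; its album is populated automatically.
handle_event("flood", "flood-2017-001", {"region": "Alabama"})
print(albums["flood-2017-001"])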
Pollack, Ari H; Miller, Andrew; Mishra, Sonali R.; Pratt, Wanda
2016-01-01
Participatory design, a method by which system users and stakeholders meaningfully contribute to the development of a new process or technology, has great potential to revolutionize healthcare technology, yet has seen limited adoption. We conducted a design session with eleven physicians working to create a novel clinical information tool utilizing participatory design methods. During the two-hour session, the physicians quickly engaged in the process and generated a large quantity of information, informing the design of a future tool. By utilizing facilitators experienced in design methodology, with detailed domain expertise, and well integrated into the healthcare organization, the participatory design session engaged a group of users who are often disenfranchised with existing processes as well as health information technology in general. We provide insight into why participatory design works with clinicians and provide guiding principles for how to implement these methods in healthcare organizations interested in advancing health information technology. PMID:28269900
Beyond scenario planning: projecting the future using models at Wind Cave National Park (USA)
NASA Astrophysics Data System (ADS)
King, D. A.; Bachelet, D. M.; Symstad, A. J.
2011-12-01
Scenario planning has been used by the National Park Service as a tool for natural resource management planning in the face of climate change. Sets of plausible but divergent future scenarios are constructed from available information and expert opinion and serve as a starting point for deriving climate-smart management strategies. However, qualitative hypotheses about how systems would react to a particular set of conditions assumed from coarse scale climate projections may lack the scientific rigor expected from a federal agency. In an effort to better assess the range of likely futures at Wind Cave National Park, a project was conceived to 1) generate high resolution historic and future climate time series to identify local weather patterns that may or may not persist, 2) simulate the hydrological cycle in this geologically varied landscape and its response to future climate, 3) project vegetation dynamics and ensuing changes in the biogeochemical cycles given grazing and fire disturbances under new climate conditions, and 4) synthesize and compare results with those from the scenario planning exercise. In this framework, we tested a dynamic global vegetation model against local information on vegetation cover, disturbance history and stream flow to better understand the potential resilience of these ecosystems to climate change. We discuss the tradeoffs between a coarse scale application of the model showing regional trends with limited ability to project the fine scale mosaic of vegetation at Wind Cave, and a finer scale approach that can account for local slope effects on water balance and better assess the vulnerability of landscape facets, but requires more intensive data acquisition. We elaborate on the potential for sharing information between models to mitigate the often-limited treatment of biological feedbacks in the physical representations of soil and atmospheric processes.
Anticipatory systems using a probabilistic-possibilistic formalism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsoukalas, L.H.
1989-01-01
A methodology for the realization of the Anticipatory Paradigm in the diagnosis and control of complex systems, such as power plants, is developed. The objective is to synthesize engineering systems as analogs of certain biological systems which are capable of modifying their present states on the basis of anticipated future states. These future states are construed to be the output of predictive, numerical, stochastic or symbolic models. The mathematical basis of the implementation is developed on the basis of a formulation coupling probabilistic (random) and possibilistic (fuzzy) data in the form of an Information Granule. Random data are generated from observations and sensor inputs from the environment. Fuzzy data consists of epistemic information, such as criteria or constraints qualifying the environmental inputs. The approach generates mathematical performance measures upon which diagnostic inferences and control functions are based. Anticipated performance is generated using a fuzzified Bayes formula. Triplex arithmetic is used in the numerical estimation of the performance measures. Representation of the system is based upon a goal-tree within the rule-based paradigm from the field of Applied Artificial Intelligence. The ensuing construction incorporates a coupling of Symbolic and Procedural programming methods. As a demonstration of the possibility of constructing such systems, a model-based system of a nuclear reactor is constructed. A numerical model of the reactor as a damped simple harmonic oscillator is used. The neutronic behavior is described by a point kinetics model with temperature feedback. The resulting system is programmed in OPS5 for the symbolic component and in FORTRAN for the procedural part.
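The following loose illustration (Python; the membership function, hypotheses and numbers are assumptions, and it omits the dissertation's information-granule and triplex-arithmetic machinery) shows one simple way a probabilistic prior can be combined with a fuzzy performance criterion and renormalized in a Bayes-like manner.

def fuzzy_membership_acceptable(temperature):
    """Assumed trapezoidal membership for 'acceptable coolant temperature' (degrees C)."""
    if temperature <= 280 or temperature >= 320:
        return 0.0
    if 290 <= temperature <= 310:
        return 1.0
    if temperature < 290:
        return (temperature - 280) / 10.0
    return (320 - temperature) / 10.0

# Hypothetical plant states, their prior probabilities, and the temperature each predicts.
hypotheses = {"nominal": (0.90, 300.0), "degraded": (0.08, 312.0), "fault": (0.02, 330.0)}

# Weight each prior by the fuzzy membership of its predicted output, then renormalize.
weighted = {h: prior * fuzzy_membership_acceptable(pred)
            for h, (prior, pred) in hypotheses.items()}
total = sum(weighted.values())
posterior = {h: (w / total if total else 0.0) for h, w in weighted.items()}
print(posterior)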
Messner, Donna A; Mohr, Penny; Towse, Adrian
2015-08-01
Explore key factors influencing future expectations for the production of evidence from comparative effectiveness research for drugs in the USA in 2020 and construct three plausible future scenarios. Semistructured key informant interviews and three rounds of modified Delphi with systematic scenario-building methods. Most influential key factors were: health delivery system integration; electronic health record development; exploitation of very large databases and mixed data sources; and proactive patient engagement in research. The scenario deemed most likely entailed uneven development of large integrated health systems with pockets of increased provider risk for patient care, enhanced data collection systems, changing incentives to do comparative effectiveness research and new opportunities for evidence generation partnerships.
Social Neuroscience and Hyperscanning Techniques: Past, Present and Future
Babiloni, Fabio; Astolfi, Laura
2012-01-01
This paper reviews the published literature on the hyperscanning methodologies using hemodynamic or neuro-electric modalities. In particular, we describe how different brain recording devices have been employed in different experimental paradigms to gain information about the subtle nature of human interactions. This review also included papers based on single-subject recordings in which a correlation was found between the activities of different (non-simultaneously recorded) participants in the experiment. The descriptions begin with the methodological issues related to simultaneous measurements, followed by descriptions of the results generated by such approaches. Finally, a discussion of the possible future uses of such new approaches to explore human social interactions will be presented. PMID:22917915
Past, present and future of spike sorting techniques
Rey, Hernan Gonzalo; Pedreira, Carlos; Quian Quiroga, Rodrigo
2015-01-01
Spike sorting is a crucial step to extract information from extracellular recordings. With new recording opportunities provided by the development of new electrodes that allow monitoring hundreds of neurons simultaneously, the scenario for the new generation of algorithms is both exciting and challenging. However, this will require a new approach to the problem and the development of a common reference framework to quickly assess the performance of new algorithms. In this work, we review the basic concepts of spike sorting, including the requirements for different applications, together with the problems faced by presently available algorithms. We conclude by proposing a roadmap stressing the crucial points to be addressed to support the neuroscientific research of the near future. PMID:25931392
Polio vaccination: past, present and future.
Bandyopadhyay, Ananda S; Garon, Julie; Seib, Katherine; Orenstein, Walter A
2015-01-01
Live attenuated oral polio vaccine (OPV) and inactivated polio vaccine (IPV) are the tools being used to achieve eradication of wild polio virus. Because OPV can rarely cause paralysis and generate revertant polio strains, IPV will have to replace OPV after eradication of wild polio virus is certified to sustain eradication of all polioviruses. However, uncertainties remain related to IPV's ability to induce intestinal immunity in populations where fecal-oral transmission is predominant. Although substantial effectiveness and safety data exist on the use and delivery of OPV and IPV, several new research initiatives are currently underway to fill specific knowledge gaps to inform future vaccination policies that would assure polio is eradicated and eradication is maintained.
Management of Dynamic Biomedical Terminologies: Current Status and Future Challenges
Dos Reis, J. C.; Pruski, C.
2015-01-01
Summary Objectives Controlled terminologies and their dependent artefacts provide a consensual understanding of a domain while reducing ambiguities and enabling reasoning. However, the evolution of a domain’s knowledge directly impacts these terminologies and generates inconsistencies in the underlying biomedical information systems. In this article, we review existing work addressing the dynamic aspect of terminologies as well as their effects on mappings and semantic annotations. Methods We investigate approaches related to the identification, characterization and propagation of changes in terminologies, mappings and semantic annotations including techniques to update their content. Results and conclusion Based on the explored issues and existing methods, we outline open research challenges requiring investigation in the near future. PMID:26293859
Madsen, Wendy; McAllister, Margaret; Godden, Judith; Greenhill, Jennene; Reed, Rachel
2009-01-01
This paper draws on the results of a national study of approaches to teaching nursing's history in Australia. We argue that the neglect of history learning within undergraduate nursing and midwifery education is undermining the development in students of a strong professional nursing identity. The data in our study shows that instead of proud, informed professionals, we are at risk of producing a generation of professional orphans -- unaware of who they are and where they've come from, unaware of reasons underlying cultural practices within the profession, lacking in vision for the future, insecure about their capacity to contribute to future directions, and not feeling part of something bigger and more enduring.
ERIC Educational Resources Information Center
Corces-Zimmerman, Chris; Utt, Jamie; Cabrera, Nolan L.
2017-01-01
In this response to the article by Tanner and Corrie, the authors provide three critiques of the methodology and theoretical framing of the study with the hopes of informing future scholarship and practice. Specifically, the three critiques addressed in this paper include the integration of CWS frameworks and YPAR methodology, the application and…
Manning the Next Unmanned Air Force: Developing RPA Pilots of the Future
2013-08-01
is essential since “natural human capacities are becoming mismatched to the enormous data volumes, processing capabilities, and decision speeds that...screening criteria, the tests “are a rich source of information on the attributes of the candidate and have been used to construct a composite...against terrorism than any manned aircraft. From a recruiting point, it is also critical to reach out to this generation of millennials that have a
Szaciłowski, Konrad
2007-01-01
Analogies between photoactive nitric oxide generators and various electronic devices: logic gates and operational amplifiers are presented. These analogies have important biological consequences: application of control parameters allows for better targeting and control of nitric oxide drugs. The same methodology may be applied in the future for other therapeutic strategies and at the same time helps to understand natural regulatory and signaling processes in biological systems.
NASA Astrophysics Data System (ADS)
Kim, H.; Lee, J.; Choi, K.; Lee, I.
2012-07-01
Rapid responses for emergency situations such as natural disasters or accidents often require geo-spatial information describing the on-going status of the affected area. Such geo-spatial information can be promptly acquired by a manned or unmanned aerial vehicle based multi-sensor system that can monitor the emergent situations in near real-time from the air using several kinds of sensors. Thus, we are developing such a real-time aerial monitoring system (RAMS) consisting of both aerial and ground segments. The aerial segment acquires the sensory data about the target areas by a low-altitude helicopter system equipped with sensors such as a digital camera and a GPS/IMU system and transmits them to the ground segment through an RF link in real-time. The ground segment, which is a deployable ground station installed on a truck, receives the sensory data and rapidly processes them to generate ortho-images, DEMs, etc. In order to generate geo-spatial information, in this system, exterior orientation parameters (EOP) of the acquired images are obtained through direct geo-referencing because it is difficult to acquire coordinates of ground points in a disaster area. The main process, from the data acquisition stage to the measurement of EOP, is discussed as follows. First, at the time of data acquisition, image acquisition time synchronized by GPS time is recorded as part of the image file name. Second, the acquired data are then transmitted to the ground segment in real-time. Third, using the processing software of the ground segment, positions/attitudes of acquired images are calculated through a linear interpolation using the GPS time of the received position/attitude data and images. Finally, the EOPs of images are obtained from position/attitude data by deriving the relationships between a camera coordinate system and a GPS/IMU coordinate system. In this study, we evaluated the accuracy of the EOP determined by direct geo-referencing in our system. To perform this, we used the EOP precisely calculated with a digital photogrammetry workstation (DPW) as reference data. The results of the evaluation indicate that the accuracy of the EOP acquired by our system is reasonable in comparison with the performance of the GPS/IMU system. Our system can also acquire precise multi-sensor data to generate the geo-spatial information in emergency situations. In the near future, we plan to complete the development of the rapid generation system of the ground segment. Our system is expected to be able to acquire the ortho-image and DEM on the damaged area in near real-time. Its performance along with the accuracy of the generated geo-spatial information will also be evaluated and reported in future work.
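The interpolation step described above can be sketched as follows (Python/NumPy; the navigation samples and image timestamps are invented, and angle handling is simplified; a real system must also handle angle wrap-around and apply the camera-to-IMU boresight and lever-arm calibration).

import numpy as np

# Assumed GPS/IMU samples: time [s], easting/northing/height [m], omega/phi/kappa [deg]
t_nav   = np.array([100.0, 101.0, 102.0, 103.0])
pos_nav = np.array([[500.0, 900.0, 350.0],
                    [510.0, 905.0, 351.0],
                    [520.0, 910.0, 352.0],
                    [530.0, 915.0, 353.0]])
att_nav = np.array([[1.0, 0.5, 90.0],
                    [1.2, 0.6, 90.5],
                    [1.1, 0.4, 91.0],
                    [0.9, 0.5, 91.5]])

t_images = np.array([100.4, 101.7, 102.2])   # GPS times recorded in the image file names

def interp_columns(t_query, t_ref, values):
    """Linearly interpolate each column of 'values' at the query times."""
    return np.column_stack([np.interp(t_query, t_ref, values[:, i])
                            for i in range(values.shape[1])])

image_positions = interp_columns(t_images, t_nav, pos_nav)
image_attitudes = interp_columns(t_images, t_nav, att_nav)
for t, p, a in zip(t_images, image_positions, image_attitudes):
    print(f"t={t:.1f}s  XYZ={p}  omega/phi/kappa={a}")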
Liu, Hu; Su, Rong-jia; Wu, Min-jie; Zhang, Yi; Qiu, Xiang-jun; Feng, Jian-gang; Xie, Ting; Lu, Shu-liang
2012-06-01
To form a wound information management scheme with objectivity, standardization, and convenience by means of a wound information management system. A wound information management system was set up with the acquisition terminal, the defined wound description, the data bank, and related software. The efficacy of this system was evaluated in clinical practice. The acquisition terminal was composed of a third-generation mobile phone and the software. It was feasible to get access to the wound information, including description, image, and therapeutic plan, from the data bank by mobile phone. During 4 months, a total of 232 wound treatment records were entered, from which standardized data for 38 patients were formed automatically. This system can provide standardized wound information management by standardized techniques of acquisition, transmission, and storage of wound information. It can be used widely in hospitals, especially primary medical institutions. The data resource of the system makes it possible to conduct epidemiological studies with large sample sizes in the future.
Fukushima Daiichi Radionuclide Inventories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cardoni, Jeffrey N.; Jankovsky, Zachary Kyle
Radionuclide inventories are generated to permit detailed analyses of the Fukushima Daiichi meltdowns. This is necessary information for severe accident calculations, dose calculations, and source term and consequence analyses. Inventories are calculated using SCALE6 and compared to values predicted by international researchers supporting the OECD/NEA's Benchmark Study on the Accident at Fukushima Daiichi Nuclear Power Station (BSAF). Both sets of inventory information are acceptable for best-estimate analyses of the Fukushima reactors. Consistent nuclear information for severe accident codes, including radionuclide class masses and core decay powers, is also derived from the SCALE6 analyses. Key nuclide activity ratios are calculated as functions of burnup and nuclear data in order to explore the utility for nuclear forensics and support future decommissioning efforts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makarov, Yuri V.; Du, Pengwei; Etingov, Pavel V.
This planning reference book is a document reflecting a Western Electricity Coordinating Council (WECC) effort to put together multiple sources of information and provide a clear, systemic, comprehensive outline of the problems, both existing and anticipated; their impacts on the system; currently used and proposed solutions by the industry and research community; planning practices; new technologies, equipment, and standards; and expected future trends. This living (periodically updated) document could help WECC and other practicing engineers, especially the younger generation of engineers joining the workforce, to get familiar with a large variety of information related to the integration of variable resources into the WECC system, bypassing in part the need for time-consuming information gathering and learning processes from more experienced engineers or from the literature.
Ajay, Dara; Gangwal, Rahul P; Sangamwar, Abhay T
2015-01-01
Intelligent Patent Analysis Tool (IPAT) is an online data retrieval tool based on a text mining algorithm that extracts specific patent information in a predetermined pattern into an Excel sheet. The software is designed and developed to retrieve and analyze technology information from multiple patent documents and generate various patent landscape graphs and charts. The software is coded in C# in Visual Studio 2010; it extracts publicly available patent information from web pages like Google Patent and simultaneously studies technology trends based on user-defined parameters. In other words, IPAT combined with manual categorization will act as an excellent technology assessment tool in competitive intelligence and due diligence for forecasting future R&D.
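As a hedged illustration of this kind of pattern-based extraction (not the IPAT C# code; the record format and field names are assumptions), the sketch below pulls a few fields out of patent-like text with regular expressions and writes them to a spreadsheet-friendly CSV file.

import csv
import re

# Hypothetical patent summary lines; a real tool would fetch these from web pages.
records = [
    "Publication Number: US1234567B2 | Assignee: Example Corp | Filed: 2012-05-01",
    "Publication Number: EP7654321A1 | Assignee: Sample GmbH | Filed: 2013-09-17",
]

pattern = re.compile(
    r"Publication Number:\s*(?P<number>\S+)\s*\|\s*"
    r"Assignee:\s*(?P<assignee>[^|]+?)\s*\|\s*"
    r"Filed:\s*(?P<filed>\S+)")

with open("patent_landscape.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["number", "assignee", "filed"])
    writer.writeheader()
    for text in records:
        match = pattern.search(text)
        if match:
            writer.writerow(match.groupdict())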
Information Systems for NASA's Aeronautics and Space Enterprises
NASA Technical Reports Server (NTRS)
Kutler, Paul
1998-01-01
The aerospace industry is being challenged to reduce costs and development time as well as utilize new technologies to improve product performance. Information technology (IT) is the key to providing revolutionary solutions to the challenges posed by the increasing complexity of NASA's aeronautics and space missions and the sophisticated nature of the systems that enable them. The NASA Ames vision is to develop technologies enabling the information age, expanding the frontiers of knowledge for aeronautics and space, improving America's competitive position, and inspiring future generations. Ames' missions to accomplish that vision include: 1) performing research to support the American aviation community through the unique integration of computation, experimentation, simulation and flight testing, 2) studying the health of our planet, understanding living systems in space and the origins of the universe, developing technologies for space flight, and 3) researching, developing and delivering information technologies and applications. Information technology may be defined as the use of advanced computing systems to generate data, analyze data, transform data into knowledge, and serve as an aid in the decision-making process. The knowledge from transformed data can be displayed in visual, virtual and multimedia environments. The decision-making process can be fully autonomous or aided by cognitive processes, i.e., computational aids designed to leverage human capacities. IT systems can learn as they go, developing the capability to make decisions or aid the decision-making process on the basis of experiences gained using limited data inputs. In the future, information systems will be used to aid space mission synthesis, virtual aerospace system design, aid damaged aircraft during landing, perform robotic surgery, and monitor the health and status of spacecraft and planetary probes. NASA Ames, through the Center of Excellence for Information Technology Office, is leading the effort in pursuit of revolutionary, IT-based approaches to satisfying NASA's aeronautics and space requirements. The objective of the effort is to incorporate information technologies within each of the Agency's four Enterprises, i.e., Aeronautics and Space Transportation Technology, Earth Science, Human Exploration and Development of Space and Space Sciences. The end results of these efforts for Enterprise programs and projects should be reduced cost, enhanced mission capability and expedited mission completion.
Trust and Dialogue in the Army Profession
2008-05-22
...to solve this gap must also account for future generations as well. This future is the Millennials. This generation, also known as Generation Y... generations were Generation X and Generation Next or Millennials. A characterization of these generations is warranted to understand the... nearly every major and revered institution, from the Presidency to organized religion to corporate America, has been entangled in some type of crime or...
Spectroscopic and Statistical Techniques for Information Recovery in Metabonomics and Metabolomics
NASA Astrophysics Data System (ADS)
Lindon, John C.; Nicholson, Jeremy K.
2008-07-01
Methods for generating and interpreting metabolic profiles based on nuclear magnetic resonance (NMR) spectroscopy, mass spectrometry (MS), and chemometric analysis methods are summarized and the relative strengths and weaknesses of NMR and chromatography-coupled MS approaches are discussed. Given that all data sets measured to date only probe subsets of complex metabolic profiles, we describe recent developments for enhanced information recovery from the resulting complex data sets, including integration of NMR- and MS-based metabonomic results and combination of metabonomic data with data from proteomics, transcriptomics, and genomics. We summarize the breadth of applications, highlight some current activities, discuss the issues relating to metabonomics, and identify future trends.
Spectroscopic and statistical techniques for information recovery in metabonomics and metabolomics.
Lindon, John C; Nicholson, Jeremy K
2008-01-01
Methods for generating and interpreting metabolic profiles based on nuclear magnetic resonance (NMR) spectroscopy, mass spectrometry (MS), and chemometric analysis methods are summarized and the relative strengths and weaknesses of NMR and chromatography-coupled MS approaches are discussed. Given that all data sets measured to date only probe subsets of complex metabolic profiles, we describe recent developments for enhanced information recovery from the resulting complex data sets, including integration of NMR- and MS-based metabonomic results and combination of metabonomic data with data from proteomics, transcriptomics, and genomics. We summarize the breadth of applications, highlight some current activities, discuss the issues relating to metabonomics, and identify future trends.
Water for the Nation: An overview of the USGS Water Resources Division
1998-01-01
The Water Resources Division (WRD) of the U.S. Geological Survey (USGS) provides reliable, impartial, timely information needed to understand the Nation's water resources. WRD actively promotes the use of this information by decisionmakers to: * Minimize the loss of life and property as a result of water-related hazards such as floods, droughts, and land movement. * Effectively manage ground-water and surface-water resources for domestic, agricultural, commercial, industrial, recreational, and ecological uses. * Protect and enhance water resources for human health, aquatic health, and environmental quality. * Contribute to wise physical and economic development of the Nation's resources for the benefit of present and future generations.
Gorelik, Gregory; Shackelford, Todd K
2014-08-27
In this article, we advance the concept of "evolutionary awareness," a metacognitive framework that examines human thought and emotion from a naturalistic, evolutionary perspective. We begin by discussing the evolution and current functioning of the moral foundations on which our framework rests. Next, we discuss the possible applications of such an evolutionarily-informed ethical framework to several domains of human behavior, namely: sexual maturation, mate attraction, intrasexual competition, culture, and the separation between various academic disciplines. Finally, we discuss ways in which an evolutionary awareness can inform our cross-generational activities-which we refer to as "intergenerational extended phenotypes"-by helping us to construct a better future for ourselves, for other sentient beings, and for our environment.
Molecular genetics at the Fort Collins Science Center
Oyler-McCance, S.J.; Stevens, P.D.
2011-01-01
The Fort Collins Science Center operates a molecular genetic and systematics research facility (FORT Molecular Ecology Laboratory) that uses molecular genetic tools to provide genetic information needed to inform natural resource management decisions. For many wildlife species, the data generated have become increasingly important in the development of their long-term management strategies, leading to a better understanding of species diversity, population dynamics and ecology, and future conservation and management needs. The Molecular Ecology Lab serves Federal research and resource management agencies by developing scientifically rigorous research programs using nuclear, mitochondrial and chloroplast DNA to help address many of today's conservation biology and natural resource management issues.
Pause, K.C.; Bonde, R.K.; McGuire, P.M.; Zori, Roberto T.; Gray, B.A.
2006-01-01
Published cytogenetic data for extant cetacean species remain incomplete. In a review of the literature, we found karyotypic information for 6 of the 13 tentatively recognized species of the suborder Mysticeti (baleen whales). Among those yet to be described is the critically endangered North Atlantic right whale (Eubalaena glacialis). Herein, we describe and propose a first-generation G-banded karyotype and ideogram for this species (2n = 42), obtained from peripheral blood chromosome preparations from a stranded male calf. This information may prove useful for future genetic mapping projects and for interspecific and intraspecific genomic comparisons by techniques such as zoo-FISH.
MSFC Space Station Program Commonly Used Acronyms and Abbreviations Listing
NASA Technical Reports Server (NTRS)
Gates, Thomas G.
1988-01-01
The Marshall Space Flight Center maintains an active history program to assure that the foundation of the Center's history is captured and preserved for current and future generations. As part of that overall effort, the Center began a project in 1987 to capture historical information and documentation on the Marshall Center's roles regarding Space Shuttle and Space Station. This document is MSFC Space Station Program Commonly Used Acronyms and Abbreviations Listing. It contains acronyms and abbreviations used in Space Station documentation and in the Historian Annotated Bibliography of Space Station Program. The information may be used by the researcher as a reference tool.
Raman Microscopy: A Noninvasive Method to Visualize the Localizations of Biomolecules in the Cornea.
Kaji, Yuichi; Akiyama, Toshihiro; Segawa, Hiroki; Oshika, Tetsuro; Kano, Hideaki
2017-11-01
In vivo and in situ visualization of biomolecules without pretreatment will be important for diagnosis and treatment of ocular disorders in the future. Recently, multiphoton microscopy, based on the nonlinear interactions between molecules and photons, has been applied to reveal the localizations of various molecules in tissues. We aimed to use multimodal multiphoton microscopy to visualize the localizations of specific biomolecules in rat corneas. Multiphoton images of the corneas were obtained from nonlinear signals of coherent anti-Stokes Raman scattering, third-order sum frequency generation, and second-harmonic generation. The localizations of the adhesion complex-containing basement membrane and Bowman layer were clearly visible in the third-order sum frequency generation images. The fine structure of type I collagen was observed in the corneal stroma in the second-harmonic generation images. The localizations of lipids, proteins, and nucleic acids (DNA/RNA) were obtained in the coherent anti-Stokes Raman scattering images. Imaging technologies have progressed significantly and been applied in medical fields. Optical coherence tomography and confocal microscopy are widely used but do not provide information on the molecular structure of the cornea. By contrast, multiphoton microscopy provides information on the molecular structure of living tissues. Using this technique, we successfully visualized the localizations of various biomolecules including lipids, proteins, and nucleic acids in the cornea. We speculate that multiphoton microscopy will provide essential information on the physiological and pathological conditions of the cornea, as well as molecular localizations in tissues without pretreatment.
A visual analytics approach for pattern-recognition in patient-generated data.
Feller, Daniel J; Burgermaster, Marissa; Levine, Matthew E; Smaldone, Arlene; Davidson, Patricia G; Albers, David J; Mamykina, Lena
2018-06-13
To develop and test a visual analytics tool to help clinicians identify systematic and clinically meaningful patterns in patient-generated data (PGD) while decreasing perceived information overload. Participatory design was used to develop Glucolyzer, an interactive tool featuring hierarchical clustering and a heatmap visualization to help registered dietitians (RDs) identify associative patterns between blood glucose levels and per-meal macronutrient composition for individuals with type 2 diabetes (T2DM). Ten RDs participated in a within-subjects experiment to compare Glucolyzer to a static logbook format. For each representation, participants had 25 minutes to examine 1 month of diabetes self-monitoring data captured by an individual with T2DM and identify clinically meaningful patterns. We compared the quality and accuracy of the observations generated using each representation. Participants generated 50% more observations when using Glucolyzer (98) than when using the logbook format (64) without any loss in accuracy (69% accuracy vs 62%, respectively, p = .17). Participants identified more observations that included ingredients other than carbohydrates using Glucolyzer (36% vs 16%, p = .027). Fewer RDs reported feelings of information overload using Glucolyzer compared to the logbook format. Study participants displayed variable acceptance of hierarchical clustering. Visual analytics have the potential to mitigate provider concerns about the volume of self-monitoring data. Glucolyzer helped dietitians identify meaningful patterns in self-monitoring data without incurring perceived information overload. Future studies should assess whether similar tools can support clinicians in personalizing behavioral interventions that improve patient outcomes.
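To make the clustering-plus-heatmap idea concrete, here is a minimal sketch in the spirit of Glucolyzer: meals are clustered hierarchically on their macronutrient composition and shown beside the corresponding post-meal glucose rise. All data are synthetic and the layout is an assumption; this is not the published tool.

```python
# Minimal sketch of hierarchical clustering plus heatmap for self-monitoring
# data: cluster meals by macronutrient composition and show the post-meal
# glucose response alongside. All data below are synthetic placeholders.
import numpy as np
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import linkage, leaves_list

rng = np.random.default_rng(0)
n_meals = 30
# Columns: carbohydrate, protein, fat (grams) for each logged meal.
macros = rng.uniform([10, 5, 2], [120, 40, 30], size=(n_meals, 3))
# Assume post-meal glucose rise is driven mostly by carbohydrates plus noise.
glucose_rise = 0.8 * macros[:, 0] + rng.normal(0, 10, n_meals)

# Hierarchical clustering of meals on standardized macronutrients.
z = (macros - macros.mean(0)) / macros.std(0)
order = leaves_list(linkage(z, method="ward"))

fig, (ax_hm, ax_bar) = plt.subplots(
    1, 2, figsize=(7, 6), gridspec_kw={"width_ratios": [3, 1]}
)
ax_hm.imshow(z[order], aspect="auto", cmap="viridis")
ax_hm.set_xticks(range(3))
ax_hm.set_xticklabels(["carbs", "protein", "fat"])
ax_hm.set_ylabel("meals (cluster order)")
ax_bar.barh(range(n_meals), glucose_rise[order])
ax_bar.invert_yaxis()
ax_bar.set_xlabel("glucose rise (mg/dL)")
plt.tight_layout()
plt.savefig("glucolyzer_sketch.png")
```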
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bolinger, Mark A; Wiser, Ryan
2008-09-15
For better or worse, natural gas has become the fuel of choice for new power plants being built across the United States. According to the Energy Information Administration (EIA), natural gas-fired units account for nearly 90% of the total generating capacity added in the U.S. between 1999 and 2005 (EIA 2006b), bringing the nationwide market share of gas-fired generation to 19%. Looking ahead over the next decade, the EIA expects this trend to continue, increasing the market share of gas-fired generation to 22% by 2015 (EIA 2007a). Though these numbers are specific to the US, natural gas-fired generation is making similar advances in many other countries as well. A large percentage of the total cost of gas-fired generation is attributable to fuel costs--i.e., natural gas prices. For example, at current spot prices of around $7/MMBtu, fuel costs account for more than 75% of the levelized cost of energy from a new combined cycle gas turbine, and more than 90% of its operating costs (EIA 2007a). Furthermore, given that gas-fired plants are often the marginal supply units that set the market-clearing price for all generators in a competitive wholesale market, there is a direct link between natural gas prices and wholesale electricity prices. In this light, the dramatic increase in natural gas prices since the 1990s should be a cause for ratepayer concern. Figure 1 shows the daily price history of the 'first-nearby' (i.e., closest to expiration) NYMEX natural gas futures contract (black line) at Henry Hub, along with the futures strip (i.e., the full series of futures contracts) from August 22, 2007 (red line). First-nearby prices, which closely track spot prices, have recently been trading within a $7-9/MMBtu range in the United States and, as shown by the futures strip, are expected to remain there through 2012. These price levels are $6/MMBtu higher than the $1-3/MMBtu range seen throughout most of the 1990s, demonstrating significant price escalation for natural gas in the United States over a relatively brief period. Perhaps of most concern is that this dramatic price increase was largely unforeseen. Figure 2 compares the EIA's natural gas wellhead price forecast from each year's Annual Energy Outlook (AEO) going back to 1985 against the average US wellhead price that actually transpired. As shown, our forecasting abilities have proven rather dismal over time, as over-forecasts made in the late 1980s eventually yielded to under-forecasts that have persisted to this day. This historical experience demonstrates that little weight should be placed on any one forecast of future natural gas prices, and that a broad range of future price conditions ought to be considered in planning and investment decisions. Against this backdrop of high, volatile, and unpredictable natural gas prices, increasing the market penetration of renewable generation such as wind, solar, and geothermal power may provide economic benefits to ratepayers by displacing gas-fired generation. These benefits may manifest themselves in several ways. First, the displacement of natural gas-fired generation by increased renewable generation reduces ratepayer exposure to natural gas price risk--i.e., the risk that future gas prices (and by extension future electricity prices) may end up markedly different than expected. Second, this displacement reduces demand for natural gas among gas-fired generators, which, all else equal, will put downward pressure on natural gas prices. Lower natural gas prices in turn benefit both electric ratepayers and other end-users of natural gas. Using analytic approaches that build upon, yet differ from, the past work of others, including Awerbuch (1993, 1994, 2003), Kahn and Stoft (1993), and Humphreys and McClain (1998), this chapter explores each of these two potential 'hedging' benefits of renewable electricity. Though we do not seek to judge whether these two specific benefits outweigh any incremental cost of renewable energy (relative to conventional fuels), we do seek to quantify the magnitude of these two individual benefits. We also note that these benefits are not unique to renewable electricity: other generation (or demand-side) resources whose costs are not tied to natural gas would provide similar benefits.
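A rough back-of-the-envelope check of the quoted fuel-cost share is sketched below; the heat rate, capital cost, capacity factor, and O&M figures are illustrative assumptions rather than values from the chapter.

```python
# Back-of-the-envelope check of the claim that fuel is >75% of the levelized
# cost of a new combined-cycle gas turbine at ~$7/MMBtu. The heat rate,
# capital cost, capacity factor, and O&M figures are illustrative assumptions.
gas_price = 7.0          # $/MMBtu
heat_rate = 7.0          # MMBtu per MWh for a modern combined-cycle plant
fuel_cost = gas_price * heat_rate                    # $/MWh

capital = 950.0          # $/kW overnight capital cost (assumed)
crf = 0.09               # capital recovery factor (~30 yr at ~8%)
capacity_factor = 0.80
capital_levelized = capital * crf / (8760 * capacity_factor) * 1000  # $/MWh
om = 4.0                 # $/MWh fixed + variable O&M, levelized (assumed)

lcoe = fuel_cost + capital_levelized + om
print(f"fuel {fuel_cost:.0f} $/MWh, non-fuel {capital_levelized + om:.0f} $/MWh")
print(f"fuel share of LCOE: {fuel_cost / lcoe:.0%}")
```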
NASA Astrophysics Data System (ADS)
Jack-Scott, E.; Arnott, J. C.; Katzenberger, J.; Davis, S. J.; Delman, E.
2015-12-01
It has been a generational challenge to simultaneously meet the world's energy requirements, while remaining within the bounds of acceptable cost and environmental impact. To this end, substantial research has explored various energy futures on a global scale, leaving decision-makers and the public overwhelmed by information on energy options. In response, this interactive energy table was developed as a comprehensive resource through which users can explore the availability, scalability, and growth potentials of all energy technologies currently in use or development. Extensive research from peer-reviewed papers and reports was compiled and summarized, detailing technology costs, technical considerations, imminent breakthroughs, and obstacles to integration, as well as political, social, and environmental considerations. Energy technologies fall within categories of coal, oil, natural gas, nuclear, solar, wind, hydropower, ocean, geothermal and biomass. In addition to 360 expandable cells of cited data, the interactive table also features educational windows with background information on each energy technology. The table seeks not to advocate for specific energy futures, but to succinctly and accurately centralize peer-reviewed research and information in an interactive, accessible resource. With this tool, decision-makers, researchers and the public alike can explore various combinations of energy technologies and their quantitative and qualitative attributes that can satisfy the world's total primary energy supply (TPES) while making progress towards a near zero carbon future.
NASA Astrophysics Data System (ADS)
McLeod, Jeffrey
The recent increase in U.S. natural gas production made possible through advancements in extraction techniques including hydraulic fracturing has transformed the U.S. energy supply landscape while raising questions regarding the balance of environmental impacts associated with natural gas production and use. Impact areas at issue include emissions of methane and criteria pollutants from natural gas production, alongside changes in emissions from increased use of natural gas in place of coal for electricity generation. In the Rocky Mountain region, these impact areas have been subject to additional scrutiny due to the high level of regional oil and gas production activity and concerns over its links to air quality. Here, the MARKAL (MArket ALlocation) least-cost energy system optimization model in conjunction with the EPA-MARKAL nine-region database has been used to characterize future regional and national emissions of CO2, CH4, VOC, and NOx attributed to natural gas production and use in several sectors of the economy. The analysis is informed by comparing and contrasting a base case, business-as-usual scenario with scenarios featuring variations in future natural gas supply characteristics, constraints affecting the electricity generation mix, carbon emission reduction strategies and increased demand for natural gas in the transportation sector. Emission trends and their associated sensitivities are identified and contrasted between the Rocky Mountain region and the U.S. as a whole. The modeling results of this study illustrate the resilience of the short term greenhouse gas emission benefits associated with fuel switching from coal to gas in the electric sector, but also call attention to the long term implications of increasing natural gas production and use for emissions of methane and VOCs, especially in the Rocky Mountain region. This analysis can help to inform the broader discussion of the potential environmental impacts of future natural gas production and use by illustrating links between relevant economic and environmental variables.
Friggens, Megan M.; Finch, Deborah M.
2015-01-01
Future expected changes in climate and human activity threaten many riparian habitats, particularly in the southwestern U.S. Using Maximum Entropy (MaxEnt3.3.3) modeling, we characterized habitat relationships and generated spatial predictions of habitat suitability for the Lucy's warbler (Oreothlypis luciae), the Southwestern willow flycatcher (Empidonax traillii extimus) and the Western yellow-billed cuckoo (Coccyzus americanus). Our goal was to provide site- and species-specific information that can be used by managers to identify areas for habitat conservation and/or restoration along the Rio Grande in New Mexico. We created models of suitable habitat for each species based on collection and survey samples and climate, biophysical, and vegetation data. We projected habitat suitability under future climates by applying these models to conditions generated from three climate models for 2030, 2060 and 2090. By comparing current and future distributions, we identified how habitats are likely to change as a result of changing climate and the consequences of those changes for these bird species. We also examined whether land ownership of high value sites shifts under changing climate conditions. Habitat suitability models performed well. Biophysical characteristics were more important than climate conditions for predicting habitat suitability, with distance to water being the single most important predictor. Climate, though less important, was still influential and led to declines of suitable habitat of more than 60% by 2090. For all species, suitable habitat tended to shrink over time within the study area leaving a few core areas of high importance. Overall, climate changes will increase habitat fragmentation and reduce breeding habitat patch size. The best strategy for conserving bird species within the Rio Grande will include measures to maintain and restore critical habitat refugia. This study provides an example of a presence-only habitat model that can be used to inform the management of species at intermediate scales. PMID:26700871
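For readers unfamiliar with presence-only habitat modelling, the following simplified sketch conveys the workflow: fit a model on presence versus background points, predict suitability across an environmental grid, then re-predict under an altered climate. The study itself used MaxEnt 3.3.3; here a presence-versus-background logistic regression stands in for it, and the predictors and data are synthetic assumptions.

```python
# Simplified sketch of presence-only habitat modelling. A presence-vs-background
# logistic regression stands in for MaxEnt; the predictors (distance to water,
# temperature) and all data are synthetic assumptions, not the study's inputs.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Environmental grid: distance to water (km) and mean summer temperature (C).
n_cells = 5000
dist_water = rng.exponential(2.0, n_cells)
temperature = rng.normal(25, 3, n_cells)
X = np.column_stack([dist_water, temperature])

# Hidden "true" suitability: birds prefer cells near water and moderate heat.
suitability = 1 / (1 + np.exp(2.0 * dist_water - 4 + 0.1 * (temperature - 25) ** 2))
presence_idx = rng.choice(n_cells, size=200, p=suitability / suitability.sum())

# Presence points (label 1) against random background points (label 0).
background_idx = rng.choice(n_cells, size=1000, replace=False)
X_train = np.vstack([X[presence_idx], X[background_idx]])
y_train = np.concatenate([np.ones(len(presence_idx)), np.zeros(len(background_idx))])

model = LogisticRegression().fit(X_train, y_train)
predicted = model.predict_proba(X)[:, 1]

# "Future climate": warm the grid by 3 C and compare the suitable area.
X_future = X.copy()
X_future[:, 1] += 3.0
predicted_future = model.predict_proba(X_future)[:, 1]
threshold = 0.5
print("suitable cells now:   ", int((predicted > threshold).sum()))
print("suitable cells future:", int((predicted_future > threshold).sum()))
```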
Random numbers certified by Bell's theorem.
Pironio, S; Acín, A; Massar, S; de la Giroday, A Boyer; Matsukevich, D N; Maunz, P; Olmschenk, S; Hayes, D; Luo, L; Manning, T A; Monroe, C
2010-04-15
Randomness is a fundamental feature of nature and a valuable resource for applications ranging from cryptography and gambling to numerical simulation of physical and biological systems. Random numbers, however, are difficult to characterize mathematically, and their generation must rely on an unpredictable physical process. Inaccuracies in the theoretical modelling of such processes or failures of the devices, possibly due to adversarial attacks, limit the reliability of random number generators in ways that are difficult to control and detect. Here, inspired by earlier work on non-locality-based and device-independent quantum information processing, we show that the non-local correlations of entangled quantum particles can be used to certify the presence of genuine randomness. It is thereby possible to design a cryptographically secure random number generator that does not require any assumption about the internal working of the device. Such a strong form of randomness generation is impossible classically and possible in quantum systems only if certified by a Bell inequality violation. We carry out a proof-of-concept demonstration of this proposal in a system of two entangled atoms separated by approximately one metre. The observed Bell inequality violation, featuring near perfect detection efficiency, guarantees that 42 new random numbers are generated with 99 per cent confidence. Our results lay the groundwork for future device-independent quantum information experiments and for addressing fundamental issues raised by the intrinsic randomness of quantum theory.
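The certification logic rests on the CHSH combination of measurement correlations exceeding the classical bound of 2. The toy calculation below uses invented correlator values, and the min-entropy bound f(S) is quoted only as an illustration of the kind of bound used in device-independent randomness protocols.

```python
# Toy illustration of CHSH-based randomness certification. The correlator
# values below are invented, and the min-entropy bound f(S) is quoted as an
# illustration of the kind of bound used in device-independent protocols.
import math

# Estimated correlators E(a, b) for the four measurement-setting pairs.
E = {
    (0, 0): 0.72,
    (0, 1): 0.70,
    (1, 0): 0.69,
    (1, 1): -0.71,
}

# CHSH combination: S <= 2 for any classical (local hidden variable) model.
S = E[(0, 0)] + E[(0, 1)] + E[(1, 0)] - E[(1, 1)]
print(f"CHSH value S = {S:.3f}  (classical bound 2, quantum bound {2*math.sqrt(2):.3f})")

if S > 2:
    # Min-entropy per run certified by the violation (illustrative bound).
    f = 1 - math.log2(1 + math.sqrt(max(0.0, 2 - S**2 / 4)))
    n_runs = 3000
    print(f"certified min-entropy per run: {f:.3f} bits")
    print(f"over {n_runs} runs: about {f * n_runs:.0f} random bits")
else:
    print("no violation: randomness cannot be certified device-independently")
```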
Expectancy-related changes in firing of dopamine neurons depend on orbitofrontal cortex.
Takahashi, Yuji K; Roesch, Matthew R; Wilson, Robert C; Toreson, Kathy; O'Donnell, Patricio; Niv, Yael; Schoenbaum, Geoffrey
2011-10-30
The orbitofrontal cortex has been hypothesized to carry information regarding the value of expected rewards. Such information is essential for associative learning, which relies on comparisons between expected and obtained reward for generating instructive error signals. These error signals are thought to be conveyed by dopamine neurons. To test whether orbitofrontal cortex contributes to these error signals, we recorded from dopamine neurons in orbitofrontal-lesioned rats performing a reward learning task. Lesions caused marked changes in dopaminergic error signaling. However, the effect of lesions was not consistent with a simple loss of information regarding expected value. Instead, without orbitofrontal input, dopaminergic error signals failed to reflect internal information about the impending response that distinguished externally similar states leading to differently valued future rewards. These results are consistent with current conceptualizations of orbitofrontal cortex as supporting model-based behavior and suggest an unexpected role for this information in dopaminergic error signaling.
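The "error signals" at issue are usually formalized as reward prediction errors, delta = r - V(state) at the time of the outcome. The toy simulation below is not the authors' model; it merely illustrates how lumping together externally similar states that lead to differently valued rewards (a crude stand-in for the lesion effect) leaves persistent prediction errors.

```python
# Toy temporal-difference reward-prediction-error example: delta = r - V(state)
# at reward delivery. The two "externally similar" cues leading to differently
# valued rewards are illustrative, not the authors' task or model.
import random

alpha = 0.1  # learning rate

def run_trials(distinguish_states: bool, n: int = 2000):
    """Simulate prediction errors; with the flag set the agent can tell the two
    cues apart, otherwise (a stand-in for the lesion) they are lumped together."""
    values = {}
    errors = []
    for _ in range(n):
        cue = random.choice(["cue_left", "cue_right"])
        reward = 1.0 if cue == "cue_left" else 0.2   # differently valued outcomes
        state = cue if distinguish_states else "cue_lumped"
        v = values.get(state, 0.0)
        delta = reward - v                # prediction error at reward delivery
        values[state] = v + alpha * delta
        errors.append(abs(delta))
    return errors

random.seed(0)
for label, flag in [("intact (states distinguished)", True),
                    ("lesioned (states lumped)", False)]:
    errs = run_trials(flag)
    print(f"{label}: mean |delta| over last 100 trials = {sum(errs[-100:]) / 100:.3f}")
```

With the states distinguished, errors shrink toward zero as values are learned; with them lumped, a sizable error persists at every outcome, which is the qualitative pattern the abstract attributes to the loss of state-distinguishing input.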
Social relevance: toward understanding the impact of the individual in an information cascade
NASA Astrophysics Data System (ADS)
Hall, Robert T.; White, Joshua S.; Fields, Jeremy
2016-05-01
Information Cascades (IC) through a social network occur due to the decision of users to disseminate content. We define this decision process as User Diffusion (UD). IC models typically describe an information cascade by treating a user as a node within a social graph, where a node's reception of an idea is represented by some activation state. The probability of activation then becomes a function of a node's connectedness to other activated nodes as well as, potentially, the history of activation attempts. We enrich this Coarse-Grained User Diffusion (CGUD) model by applying actor type logics to the nodes of the graph. The resulting Fine-Grained User Diffusion (FGUD) model utilizes prior research in actor typing to generate a predictive model regarding the future influence a user will have on an Information Cascade. Furthermore, we introduce a measure of Information Resonance that is used to aid in predictions regarding user behavior.
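The activation process described maps naturally onto an independent-cascade simulation. In the sketch below, per-node "actor types" with different forwarding probabilities stand in for the actor-type logics of the FGUD model; the graph, types, and probabilities are invented for illustration.

```python
# Minimal independent-cascade simulation. Node "types" with different
# forwarding probabilities are a stand-in for actor-type logics; the graph
# and probabilities are invented for illustration.
import random

random.seed(42)
n_nodes = 200
# Random directed graph: each node passes content to 8 random others.
followers = {i: random.sample(range(n_nodes), 8) for i in range(n_nodes)}
# Actor types: "amplifiers" adopt and forward content more readily than "lurkers".
node_type = {i: ("amplifier" if random.random() < 0.2 else "lurker")
             for i in range(n_nodes)}
forward_prob = {"amplifier": 0.4, "lurker": 0.05}

def run_cascade(seeds):
    """Return the set of activated nodes after an independent cascade."""
    active = set(seeds)
    frontier = list(seeds)
    while frontier:
        new_frontier = []
        for node in frontier:
            for neighbour in followers[node]:
                if neighbour in active:
                    continue
                # Activation chance depends on the receiving node's type.
                if random.random() < forward_prob[node_type[neighbour]]:
                    active.add(neighbour)
                    new_frontier.append(neighbour)
        frontier = new_frontier
    return active

sizes = [len(run_cascade(seeds=[0])) for _ in range(100)]
print("mean cascade size over 100 runs:", sum(sizes) / len(sizes))
```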
Central American information system for energy planning (in English; Spanish)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fonseca, M.G.; Lyon, P.C.; Heskett, J.C.
1991-04-01
SICAPE (Sistema de Information Centroamericano para Planificacion Energetica) is an expandable information system designed for energy planning. Its objective is to satisfy ongoing information requirements by means of a menu-driven operational environment. SICAPE is as easily used by the novice computer user as by those with more experience. Moreover, the system is capable of evolving concurrently with the future requirements of the individual country. The expansion is accomplished by menu restructuring as data and user requirements change. The new menu configurations require no programming effort. The use and modification of SICAPE are separate menu-driven processes that allow for rapid data query, minimal training, and effortless continued growth. SICAPE's data is organized by country or region. Information is available in the following areas: energy balance, macroeconomics, electricity generation capacity, and electricity and petroleum product pricing. (JF)
Exploring cluster Monte Carlo updates with Boltzmann machines
NASA Astrophysics Data System (ADS)
Wang, Lei
2017-11-01
Boltzmann machines are physics-informed generative models with broad applications in machine learning. They model the probability distribution of an input data set with latent variables and generate new samples accordingly. Applying Boltzmann machines back to physics, they are ideal recommender systems to accelerate the Monte Carlo simulation of physical systems due to their flexibility and effectiveness. More intriguingly, we show that the generative sampling of the Boltzmann machines can even give rise to different cluster Monte Carlo algorithms. The latent representation of the Boltzmann machines can be designed to mediate complex interactions and identify clusters of the physical system. We demonstrate these findings with concrete examples of the classical Ising model with and without four-spin plaquette interactions. In the future, automatic searches in the algorithm space parametrized by Boltzmann machines may discover more innovative Monte Carlo updates.
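The mechanics of using hidden (latent) variables to propose correlated multi-spin moves can be illustrated with a toy restricted Boltzmann machine doing block Gibbs sampling over ±1 units, as below. The weights are random rather than trained on Ising configurations, so this shows only the sampling machinery, not the paper's actual cluster algorithms.

```python
# Toy restricted Boltzmann machine (RBM) doing block Gibbs sampling over
# +/-1 "spins". The weights are random, not trained on Ising configurations,
# so this only illustrates how latent (hidden) variables generate correlated,
# cluster-like updates of many visible units at once.
import numpy as np

rng = np.random.default_rng(3)
n_visible, n_hidden = 16, 8
W = rng.normal(0, 0.5, size=(n_visible, n_hidden))   # visible-hidden couplings
b = np.zeros(n_visible)                               # visible biases
c = np.zeros(n_hidden)                                # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_hidden(v):
    """P(h_j = +1 | v) for +/-1 units is sigmoid(2 * (c_j + v . W[:, j]))."""
    p = sigmoid(2 * (c + v @ W))
    return np.where(rng.random(n_hidden) < p, 1, -1)

def sample_visible(h):
    p = sigmoid(2 * (b + W @ h))
    return np.where(rng.random(n_visible) < p, 1, -1)

v = np.where(rng.random(n_visible) < 0.5, 1, -1)
for step in range(5):
    h = sample_hidden(v)          # latent variables "read" the configuration
    v_new = sample_visible(h)     # ...and generate a correlated new one
    flipped = int((v_new != v).sum())
    print(f"step {step}: {flipped} of {n_visible} spins changed")
    v = v_new
```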
Inter-satellite optical communications: from SILEX to next generation systems
NASA Astrophysics Data System (ADS)
Laurent, Bernard; Planche, Gilles; Michel, Cyril
2004-06-01
The continuous growth in data rate demand, the importance of real-time commanding and real-time access to information for diverse civilian and military applications, as well as the in-orbit demonstration of optical communication, have boosted interest in such systems for future applications. After a presentation of the different fields of application and their associated performance requirements, this paper presents the possible optical link candidates. Then, the architecture, the design and the performance of new optical terminal generations, which benefit from the SILEX experience and the use of new technologies such as SiC and APS, are detailed. This new optimised generation, highly simplified with respect to SILEX terminals and dimensioned to offer higher data rates, presents attractive mass, volume and power characteristics compatible with simple accommodation on the host vehicle.
Computational Burden Resulting from Image Recognition of High Resolution Radar Sensors
López-Rodríguez, Patricia; Fernández-Recio, Raúl; Bravo, Ignacio; Gardel, Alfredo; Lázaro, José L.; Rufo, Elena
2013-01-01
This paper presents a methodology for high resolution radar image generation and automatic target recognition emphasizing the computational cost involved in the process. In order to obtain focused inverse synthetic aperture radar (ISAR) images certain signal processing algorithms must be applied to the information sensed by the radar. From actual data collected by radar, the stages and algorithms needed to obtain ISAR images are reviewed, including high resolution range profile generation, motion compensation and ISAR formation. Target recognition is achieved by comparing the generated set of actual ISAR images with a database of ISAR images generated by electromagnetic software. High resolution radar image generation and target recognition processes are burdensome and time consuming, so to determine the most suitable implementation platform the analysis of the computational complexity is of great interest. To this end, and since target identification must be completed in real time, the computational burden of both processes, image generation and comparison with a database, is explained separately. Conclusions are drawn about implementation platforms and calculation efficiency in order to reduce time consumption in a possible future implementation. PMID:23609804
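A minimal range-Doppler sketch of the ISAR formation step is given below: synthetic point scatterers produce a range-profile history whose slow-time FFT yields the cross-range dimension. All target and radar parameters are invented, and motion compensation is not modelled.

```python
# Minimal range-Doppler ISAR imaging sketch: synthetic range-profile history
# for a few point scatterers on a rotating target, with the cross-range
# dimension formed by an FFT over the slow-time (pulse) axis. All target and
# radar parameters are invented; no motion compensation is modelled.
import numpy as np

n_range_bins, n_pulses = 64, 64
# Point scatterers: (range bin, normalized Doppler frequency, amplitude).
scatterers = [(20, 0.10, 1.0), (32, -0.05, 0.8), (45, 0.20, 0.6)]

history = np.zeros((n_range_bins, n_pulses), dtype=complex)
pulses = np.arange(n_pulses)
for rbin, f_doppler, amp in scatterers:
    # Each scatterer sits in one range bin and advances in phase pulse to
    # pulse according to its Doppler (cross-range) frequency.
    history[rbin, :] += amp * np.exp(2j * np.pi * f_doppler * pulses)
history += 0.05 * (np.random.randn(n_range_bins, n_pulses)
                   + 1j * np.random.randn(n_range_bins, n_pulses))

# ISAR image: FFT along slow time turns Doppler into the cross-range axis.
image = np.abs(np.fft.fftshift(np.fft.fft(history, axis=1), axes=1))

peak = np.unravel_index(np.argmax(image), image.shape)
print("brightest pixel (range bin, cross-range bin):", peak)
```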
de Medeiros, Kate; Rubinstein, Robert; Ermoshkina, Polina
2015-01-01
Purpose of the Study: This paper examines generativity, social suffering, and culture change in a sample of 16 women aged 65 years or older who emigrated from the former Soviet Union. Key concerns with generativity are identity, which can be strongly rooted in one’s original cultural formation, and a stable life course, which is what ideally enables generative impulses to be cultivated in later life. Design and Methods: To better understand how early social suffering may affect later life generativity, we conducted two 90-min interviews with each of our participants on their past experiences and current views of generativity. Results: The trauma of World War II, poor quality of life in the Soviet Union, scarcity of shelter and supplies, and fear of arrest emerged as common components in social suffering, which affected their identity. Implications: Overall, the theme of broken links to the future—the sense that their current lives were irrelevant to future generations—was strong among informants in their interviews, pointing to the importance of life course stability in relation to certain forms of generativity. PMID:24184859
NASA Astrophysics Data System (ADS)
Kucharski, John; Tkach, Mark; Olszewski, Jennifer; Chaudhry, Rabia; Mendoza, Guillermo
2016-04-01
This presentation demonstrates the application of Climate Risk Informed Decision Analysis (CRIDA) at Zambia's principal water treatment facility, the Iolanda Water Treatment Plant. The water treatment plant is prone to unacceptable failures during periods of low hydropower production at the Kafue Gorge Dam Hydroelectric Power Plant. The case study explores approaches to increasing the water treatment plant's ability to deliver acceptable levels of service under the range of current and potential future climate states. The objective of the study is to investigate alternative investments to build system resilience that might have been informed by the CRIDA process, and to evaluate the extra resource requirements by a bilateral donor agency to implement the CRIDA process. The case study begins with an assessment of the water treatment plant's vulnerability to climate change. It does so by following general principles described in "Confronting Climate Uncertainty in Water Resource Planning and Project Design: the Decision Tree Framework". By utilizing relatively simple bootstrapping methods, a range of possible future climate states is generated while avoiding the use of more complex and costly downscaling methodologies that are beyond the budget and technical capacity of many teams. The resulting climate vulnerabilities, and the uncertainty in the climate states that produce them, are analyzed as part of a "Level of Concern" analysis. CRIDA principles are then applied to this Level of Concern analysis in order to arrive at a set of actionable water management decisions. The principal goal of water resource management is to transform variable, uncertain hydrology into dependable services (e.g., water supply, flood risk reduction, ecosystem benefits, hydropower production). Traditional approaches to climate adaptation require the generation of predicted future climate states but do little to guide decision makers on how this information should affect decision making. In this context it is not surprising that the increased hydrologic variability and uncertainty produced by many climate risk analyses bedevil water resource decision making. The Climate Risk Informed Decision Analysis (CRIDA) approach builds on work found in "Confronting Climate Uncertainty in Water Resource Planning and Project Design: the Decision Tree Framework", which provides guidance on vulnerability assessments. It guides practitioners through a "Level of Concern" analysis where climate vulnerabilities are analyzed to produce actionable alternatives and decisions.
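The "relatively simple bootstrapping" mentioned above can be pictured as resampling the observed hydrologic record and perturbing it to span a range of climate states, as in the sketch below. The inflow record, demand figure, and delta factors are invented placeholders, not the Iolanda or Kafue Gorge data.

```python
# Minimal sketch of a bootstrap used to span a range of climate states from an
# observed record: resample historical annual inflows with replacement, shift
# them by a "delta" factor, and score how often a supply threshold is met.
# All numbers are invented, not the Iolanda data.
import numpy as np

rng = np.random.default_rng(7)
observed_inflow = rng.gamma(shape=4.0, scale=250.0, size=40)  # 40 years, MCM/yr
demand = 900.0                                                # required MCM/yr

def reliability(delta: float, n_boot: int = 2000, horizon: int = 30) -> float:
    """Fraction of bootstrapped 'futures' in which mean inflow meets demand.
    `delta` scales the record to mimic a drier (<1) or wetter (>1) climate."""
    met = 0
    for _ in range(n_boot):
        future = rng.choice(observed_inflow, size=horizon, replace=True) * delta
        met += future.mean() >= demand
    return met / n_boot

for delta in (0.8, 0.9, 1.0, 1.1):
    print(f"climate factor {delta:.1f}: reliability = {reliability(delta):.2f}")
```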
DSSTOX (DISTRIBUTED STRUCTURE-SEARCHABLE ...
Distributed Structure-Searchable Toxicity Database Network. Major trends affecting public toxicity information resources have the potential to significantly alter the future of predictive toxicology. Chemical toxicity screening is undergoing shifts towards greater use of more fundamental information on gene/protein expression patterns and bioactivity and bioassay profiles, the latter generated with high-throughput screening technologies. Curated, systematically organized, and web-accessible toxicity and biological activity data in association with chemical structures, enabling the integration of diverse data information domains, will fuel the next frontier of advancement for QSAR (quantitative structure-activity relationship) and data mining technologies. The DSSTox project is supporting progress towards these goals on many fronts, promoting the use of formalized and structure-annotated toxicity data models, helping to interface these efforts with QSAR modelers, linking data from diverse sources, and creating a large, quality-reviewed, central chemical structure information resource linked to various toxicity data sources.
Emerging issues in public health genomics
Roberts, J. Scott
2014-01-01
This review highlights emerging areas of interest in public health genomics. First, recent advances in newborn screening (NBS) are described, with a focus on practice and policy implications of current and future efforts to expand NBS programs (e.g., via next-generation sequencing). Next, research findings from the rapidly progressing field of epigenetics and epigenomics are detailed, highlighting ways in which our emerging understanding in these areas could guide future intervention and research efforts in public health. We close by considering various ethical, legal and social issues posed by recent developments in public health genomics; these include policies to regulate access to personal genomic information; the need to enhance genetic literacy in both health professionals and the public; and challenges in ensuring that the benefits (and burdens) from genomic discoveries and applications are equitably distributed. Needs for future genomics research that integrates across basic and social sciences are also noted. PMID:25184533
Reflections on the present and future of upper limb prostheses.
Farina, Dario; Amsüss, Sebastian
2016-01-01
Despite progress in research and media attention on active upper limb prostheses, presently the most common commercial upper limb prosthetic devices are not fundamentally different from solutions offered almost one century ago. Limited information transfer for both control and sensory-motor integration and challenges in socket technology have been major obstacles. By analysing the present state-of-the-art and academic achievements, we provide our opinion on the future of upper limb prostheses. We believe that surgical procedures for muscle reinnervation and osseointegration will become increasingly clinically relevant; muscle electrical signals will remain the main clinical means for prosthetic control; and chronic electrode implants, first in muscles (control), then in nerves (sensory feedback), will become viable clinical solutions. After decades of suspended clinically relevant progress, it is foreseeable that a new generation of upper limb prostheses will enter the market in the near future based on such advances, thereby offering substantial clinical benefit for patients.
Development of enterprise architecture in university using TOGAF as framework
NASA Astrophysics Data System (ADS)
Amalia, Endang; Supriadi, Hari
2017-06-01
The University of XYZ is located in Bandung, West Java. It has an information technology (IT) infrastructure which is managed independently. Currently, IT at the University of XYZ employs a complex conventional management pattern that does not result in a fully integrated IT infrastructure. This is not adaptive in addressing solutions to changing business needs and applications. In addition, it impedes the innovative development of sustainable IT services and also contributes to an unnecessarily high workload for managers. This research aims to establish the concept of IS/IT strategic planning. This is used in the development of the IS/IT and in designing the information technology infrastructure based on the framework of The Open Group Architecture Framework (TOGAF) and its Architecture Development Method (ADM). A case study will be done at the University of XYZ using the concept of qualitative research through review of literature and interviews. This study generates the following stages: (1) forming a design using TOGAF and the ADM around nine functional areas of business and proposing 12 application candidates to be developed at XYZ University; (2) generating 11 principles for the development of information technology architecture; (3) creating a portfolio of future applications (McFarlan Grid), placing 6 applications in the strategic quadrant (SIAKAD-T, E-LIBRARY, SIPADU-T, DSS, SIPPM-T, KMS), 2 in the operational quadrant (PMS-T, CRM), and 4 in the support quadrant (MNC-T, NOPEC-T, EMAIL-SYSTEM, SSO); and (4) modelling the enterprise architecture, which could serve as a reference blueprint for the development of information systems and information technology at the University of XYZ.
Probing Cherenkov and Scintillation Light Separation for Next-Generation Neutrino Detectors
NASA Astrophysics Data System (ADS)
Caravaca, J.; Descamps, F. B.; Land, B. J.; Orebi Gann, G. D.; Wallig, J.; Yeh, M.
2017-09-01
The ability to separate Cherenkov and scintillation signals in liquid scintillator detectors would enable outstanding background rejection for next-generation neutrino experiments. Reconstruction of directional information, ring imaging, and sub-Cherenkov threshold detection all have the potential to substantially improve particle and event identification. The Cherenkov-Scintillation Separation (CHESS) experiment uses an array of small, fast photomultipliers (PMTs) and state-of-the-art electronics to demonstrate the reconstruction of a Cherenkov ring in a scintillation medium based on photon hit times and detected charge. This setup has been used to characterize the ability to detect Cherenkov light in a range of target media. We show results with pure organic scintillator (LAB) and the prospects with scintillators with a secondary fluor (LAB/PPO). There are future plans to deploy the newly developed water-based liquid scintillator, a medium with a higher Cherenkov/Scintillation light yield ratio than conventional pure liquid scintillators, enhancing the visibility of the less abundant Cherenkov light in the presence of scintillation light. These results can inform the development of future large-scale detectors, such as the proposed Theia experiment, or other large detectors at underground laboratories such as the far-site of the new Long Baseline Neutrino Facility at the Sanford Underground Research Facility. CHESS detector calibrations and commissioning will be discussed, and the latest results will be presented.
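The physical basis for the separation is timing: Cherenkov light is prompt while scintillation light follows a slower decay. The toy calculation below illustrates an early-time cut on photon hit times; the time constants, photon counts, and cut value are invented numbers, not CHESS parameters.

```python
# Toy illustration of time-based Cherenkov/scintillation separation: Cherenkov
# photons arrive promptly, scintillation photons follow a slower exponential
# decay, so an early-time cut preferentially selects Cherenkov hits. The
# resolutions, decay time, and photon counts are invented numbers.
import numpy as np

rng = np.random.default_rng(11)
sigma_pmt = 0.3        # ns, transit-time spread of fast PMTs (assumed)
tau_scint = 5.0        # ns, scintillation decay constant (assumed)
n_cher, n_scint = 60, 600

t_cher = rng.normal(0.0, sigma_pmt, n_cher)
t_scint = rng.exponential(tau_scint, n_scint) + rng.normal(0.0, sigma_pmt, n_scint)

cut = 1.0  # ns: call everything earlier than this "Cherenkov-like"
sel_cher = (t_cher < cut).sum()
sel_scint = (t_scint < cut).sum()
efficiency = sel_cher / n_cher
purity = sel_cher / (sel_cher + sel_scint)
print(f"time cut {cut} ns: Cherenkov efficiency {efficiency:.2f}, purity {purity:.2f}")
```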
Satellite markers: a simple method for ground truth car pose on stereo video
NASA Astrophysics Data System (ADS)
Gil, Gustavo; Savino, Giovanni; Piantini, Simone; Pierini, Marco
2018-04-01
Automated prediction of the future location of other cars is a must in the context of advanced safety systems. The remote estimation of car pose, and particularly of its heading angle, is key to predicting its future location. Stereo vision systems allow the 3D information of a scene to be recovered. Ground truth in this specific context is associated with referential information about the depth, shape and orientation of the objects present in the traffic scene. Creating 3D ground truth is a measurement and data fusion task associated with the combination of different kinds of sensors. The novelty of this paper is a method to generate ground-truth car pose from video data alone. When the method is applied to stereo video, it also provides the extrinsic camera parameters for each camera at frame level, which are key to quantifying the performance of a moving stereo vision system subjected to undesired vibrations and/or leaning. We developed a video post-processing technique which employs a common camera calibration tool for the 3D ground truth generation. In our case study, we focus on accurate car heading-angle estimation of a moving car under realistic imagery. As outcomes, our satellite marker method provides accurate car pose at frame level, and the instantaneous spatial orientation for each camera at frame level.
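The pose-from-markers idea can be sketched as a perspective-n-point (PnP) solve against markers with known positions in the car frame, for example with OpenCV as below. The marker layout, pixel detections, intrinsics, and the yaw convention are made-up placeholders, not the paper's stereo rig or method details.

```python
# Sketch of recovering a car's pose (and heading angle) from known marker
# positions seen in one camera frame, via a PnP solve. The marker layout,
# pixel detections, and camera intrinsics below are made-up placeholders.
import numpy as np
import cv2

# 3D positions of four markers in the car's own coordinate frame (metres).
object_points = np.array([
    [0.0, 0.0, 0.0],
    [1.5, 0.0, 0.0],
    [1.5, 0.0, 1.2],
    [0.0, 0.0, 1.2],
], dtype=np.float32)

# Where those markers were detected in the image (pixels) -- placeholders.
image_points = np.array([
    [420.0, 380.0],
    [610.0, 375.0],
    [605.0, 240.0],
    [425.0, 245.0],
], dtype=np.float32)

# Pinhole intrinsics -- placeholders for a calibrated camera.
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # assume lens distortion already corrected

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
R, _ = cv2.Rodrigues(rvec)

# Heading (yaw) about the camera's vertical axis, using one common convention.
yaw = np.degrees(np.arctan2(R[0, 2], R[2, 2]))
print("solvePnP ok:", ok)
print("car position in camera frame (m):", tvec.ravel())
print(f"estimated heading angle: {yaw:.1f} degrees")
```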
A Challenge for Radioactive Waste Management: Memory Preservation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Charton, P.; Ouzounian, G.
2008-07-01
ANDRA, the French National Radioactive Waste Management Agency, is responsible for managing all radioactive waste in France over the long term. In the case of short-lived waste for which disposal facilities have a life expectancy of a few centuries, the Agency has set up a system for preserving the memory of those sites. Based on historical analysis on a comparable timescale and on an appraisal of information-conservation means, a series of regulatory as well as technical provisions was made in order to ensure that sound information be transmitted to future generations. Requirements associated with the provisions deal mostly with legibility and a clear understanding of the information, which must be decrypted and understood at least during the lifetime of the facilities (i.e., a few centuries). It must therefore be preserved throughout the same period. Responses to the requirements will be presented, notably on various information-recording media, together with the information-diffusion strategy to the different authorities and structures within French society. A concrete illustration of the achievements made so far is the Centre de la Manche Disposal Facility, which was closed down in 1994 and has been in its post-closure monitoring phase since 2003. In the case of deep geological repositories for long-lived radioactive waste, preserving memory takes a different aspect. First of all, timescales are much longer and are counted in hundreds of thousands of years. It is therefore much more difficult to consider how to maintain the richness of the information over such time periods than it is for short-lived waste. Both the nature and the form of the information to be transmitted must be revised. It would be risky indeed to base memory preservation over the long term on similar mechanisms beyond 1,000 years. Based on the heritage of a much more ancient history, we must seek appropriate means to develop surface markers and, even more, to ensure their conservation over timescales compatible with those of deep geological repositories. It will also be necessary, in the light of the experiments and efforts made to decrypt the messages written in rupestral paintings or in pyramids, to find suitable means of expression that will help not only the next few generations, but much more distant future generations, to grasp the meaning of what we aim to transmit to them. This paper presents the state of the French reflection on memory preservation and transmission over the very long term, for timescales consistent with long-lived radioactive geological waste disposal projects. (author)
NanoSIMS for biological applications: Current practices and analyses
Nunez, Jamie R.; Renslow, Ryan S.; Cliff, III, John B.; ...
2017-09-27
Secondary ion mass spectrometry (SIMS) has become an increasingly utilized tool in biologically-relevant studies. Of these, high lateral resolution methodologies using the NanoSIMS 50/50L have been especially powerful within many biological fields over the past decade. Here, we provide a review of this technology, sample preparation and analysis considerations, examples of recent biological studies, data analysis, and current outlooks. Specifically, we offer an overview of SIMS and development of the NanoSIMS. We describe the major experimental factors that should be considered prior to NanoSIMS analysis and then provide information on best practices for data analysis and image generation, which includes an in-depth discussion of appropriate colormaps. Additionally, we provide an open-source method for data representation that allows simultaneous visualization of secondary electron and ion information within a single image. Lastly, we present a perspective on the future of this technology and where we think it will have the greatest impact in the near future.
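One way to visualize secondary-electron and ion information in a single image is to draw the SE map in grayscale and overlay the ion counts as a colour layer whose opacity follows the signal, as sketched below. Both arrays are random placeholders and this is not the paper's released code.

```python
# Sketch of overlaying an ion-count map on a secondary-electron (SE) image so
# both are visible in one figure: SE as grayscale background, ion counts as a
# colour layer whose opacity scales with signal. Both arrays are random
# placeholders, not NanoSIMS data, and this is not the paper's actual method.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(5)
se_image = rng.normal(0.5, 0.15, (256, 256)).clip(0, 1)       # SE contrast
ion_counts = rng.poisson(2.0, (256, 256)).astype(float)       # e.g. ion counts
yy, xx = np.mgrid[0:256, 0:256]
ion_counts += 40 * np.exp(-(((xx - 128) ** 2 + (yy - 100) ** 2) / 800))  # "cell"

fig, ax = plt.subplots(figsize=(5, 5))
ax.imshow(se_image, cmap="gray", interpolation="nearest")
# Alpha mask: strong ion signal is opaque, background stays transparent.
alpha = (ion_counts / ion_counts.max()).clip(0, 1)
ax.imshow(ion_counts, cmap="viridis", alpha=alpha, interpolation="nearest")
ax.set_axis_off()
plt.savefig("nanosims_overlay.png", dpi=200, bbox_inches="tight")
```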
NASA Astrophysics Data System (ADS)
Little, M. M.; Moe, K.; Komar, G.
2014-12-01
NASA's Earth Science Technology Office (ESTO) manages a wide range of information technology projects under the Advanced Information Systems Technology (AIST) Program. The AIST Program aims to support all phases of NASA's Earth Science program with the goal of enabling new observations and information products, increasing the accessibility and use of Earth observations, and reducing the risk and cost of satellite and ground based information systems. Recent initiatives feature computational technologies to improve information extracted from data streams or model outputs and researchers' tools for Big Data analytics. Data-centric technologies enable research communities to facilitate collaboration and increase the speed with which results are produced and published. In the future NASA anticipates more small satellites (e.g., CubeSats), mobile drones and ground-based in-situ sensors will advance the state-of-the-art regarding how scientific observations are performed, given the flexibility, cost and deployment advantages of new operations technologies. This paper reviews the success of the program and the lessons learned. Infusion of these technologies is challenging and the paper discusses the obstacles and strategies to adoption by the earth science research and application efforts. It also describes alternative perspectives for the future program direction and for realizing the value in the steps to transform observations from sensors to data, to information, and to knowledge, namely: sensor measurement concepts development; data acquisition and management; data product generation; and data exploitation for science and applications.
Kang, Wenjun; Kadri, Sabah; Puranik, Rutika; Wurst, Michelle N; Patil, Sushant A; Mujacic, Ibro; Benhamed, Sonia; Niu, Nifang; Zhen, Chao Jie; Ameti, Bekim; Long, Bradley C; Galbo, Filipo; Montes, David; Iracheta, Crystal; Gamboa, Venessa L; Lopez, Daisy; Yourshaw, Michael; Lawrence, Carolyn A; Aisner, Dara L; Fitzpatrick, Carrie; McNerney, Megan E; Wang, Y Lynn; Andrade, Jorge; Volchenboum, Samuel L; Furtado, Larissa V; Ritterhouse, Lauren L; Segal, Jeremy P
2018-04-24
Next-generation sequencing (NGS) diagnostic assays increasingly are becoming the standard of care in oncology practice. As the scale of an NGS laboratory grows, management of these assays requires organizing large amounts of information, including patient data, laboratory processes, genomic data, as well as variant interpretation and reporting. Although several Laboratory Information Systems and/or Laboratory Information Management Systems are commercially available, they may not meet all of the needs of a given laboratory, in addition to being frequently cost-prohibitive. Herein, we present the System for Informatics in the Molecular Pathology Laboratory, a free and open-source Laboratory Information System/Laboratory Information Management System for academic and nonprofit molecular pathology NGS laboratories, developed at the Genomic and Molecular Pathology Division at the University of Chicago Medicine. The System for Informatics in the Molecular Pathology Laboratory was designed as a modular end-to-end information system to handle all stages of the NGS laboratory workload from test order to reporting. We describe the features of the system, its clinical validation at the Genomic and Molecular Pathology Division at the University of Chicago Medicine, and its installation and testing within a different academic center laboratory (University of Colorado), and we propose a platform for future community co-development and interlaboratory data sharing. Copyright © 2018. Published by Elsevier Inc.
Improved hybrid information filtering based on limited time window
NASA Astrophysics Data System (ADS)
Song, Wen-Jun; Guo, Qiang; Liu, Jian-Guo
2014-12-01
Adopting the entire collected information of users, the hybrid information filtering of heat conduction and mass diffusion (HHM) (Zhou et al., 2010) was successfully proposed to solve the apparent diversity-accuracy dilemma. Since recent behaviors are more effective in capturing users' potential interests, we present an improved hybrid information filtering method that adopts only partial, recent information. We expand the time window to generate a series of training sets, each of which is treated as known information to predict the future links verified against the testing set. The experimental results on the benchmark dataset Netflix indicate that by only using approximately 31% of recent rating records, the accuracy could be improved by an average of 4.22% and the diversity could be improved by 13.74%. In addition, the performance on the MovieLens dataset could be preserved by considering approximately 60% of recent records. Furthermore, we find that the improved algorithm is effective in addressing the cold-start problem. This work could improve information filtering performance and shorten the computational time.
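For context, the hybrid heat-conduction / mass-diffusion score is commonly written as an item-to-item transfer matrix W with a single mixing parameter; the sketch below applies it to a tiny invented rating matrix and mimics the limited time window by discarding older interactions. The data, timestamps, and window are illustrative assumptions, not the Netflix or MovieLens experiments.

```python
# Sketch of the hybrid heat-conduction / mass-diffusion (HHM) score on a small
# user-item bipartite network, as commonly written: the item-to-item transfer
# W[a, b] = (1 / (k_a**(1 - lam) * k_b**lam)) * sum_u A[u, a] * A[u, b] / k_u,
# where lam interpolates between heat conduction and mass diffusion. The tiny
# rating matrix and the time-window truncation are invented for illustration.
import numpy as np

# Rows = users, columns = items; 1 means "collected", with a timestamp matrix
# so that only the most recent interactions are kept (the limited time window).
A_full = np.array([
    [1, 1, 0, 0, 1],
    [0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0],
    [0, 0, 1, 1, 1],
], dtype=float)
timestamps = np.array([
    [1, 5, 0, 0, 9],
    [0, 2, 8, 0, 0],
    [3, 0, 7, 6, 0],
    [0, 0, 4, 9, 8],
], dtype=float)

window_start = 4                      # keep only interactions at time >= 4
A = np.where(timestamps >= window_start, A_full, 0.0)

def hhm_scores(A: np.ndarray, lam: float) -> np.ndarray:
    """Return user-by-item recommendation scores for the hybrid method."""
    k_user = A.sum(axis=1)            # user degrees
    k_item = A.sum(axis=0)            # item degrees
    k_user[k_user == 0] = 1.0         # avoid division by zero
    k_item_safe = np.where(k_item == 0, 1.0, k_item)
    # overlap[a, b] = sum_u A[u, a] * A[u, b] / k_u
    overlap = (A / k_user[:, None]).T @ A
    W = overlap / (k_item_safe[:, None] ** (1 - lam) * k_item_safe[None, :] ** lam)
    scores = A @ W.T                  # each user's resources pushed to items
    return np.where(A > 0, -np.inf, scores)   # mask items already collected

for lam in (0.0, 0.5, 1.0):
    top = hhm_scores(A, lam).argmax(axis=1)
    print(f"lambda={lam}: top recommendation per user = {top}")
```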
NASA Astrophysics Data System (ADS)
Rosner, A.; Letcher, B. H.; Vogel, R. M.
2014-12-01
Predicting streamflow in headwaters and over a broad spatial scale poses unique challenges due to limited data availability. Flow observation gages for headwater streams are less common than for larger rivers, and gages with record lengths of ten years or more are even more scarce. Thus, there is a great need for estimating streamflows in ungaged or sparsely-gaged headwaters. Further, there is often insufficient basin information to develop rainfall-runoff models that could be used to predict future flows under various climate scenarios. Headwaters in the northeastern U.S. are of particular concern to aquatic biologists, as these streams serve as essential habitat for native coldwater fish. In order to understand fish response to past or future environmental drivers, estimates of seasonal streamflow are needed. While there is limited flow data, there is a wealth of data for historic weather conditions. Observed data have been modeled to interpolate a spatially continuous historic weather dataset (Maurer et al., 2002). We present a statistical model developed by pairing streamflow observations with precipitation and temperature information for the same and preceding time-steps. We demonstrate this model's use to predict flow metrics at the seasonal time-step. While not a physical model, this statistical model represents the weather drivers. Since this model can predict flows not directly tied to reference gages, we can generate flow estimates for historic as well as potential future conditions.
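A minimal version of the statistical model described, regressing seasonal flow on same-season and prior-season precipitation and temperature, might look like the sketch below. The synthetic data, linear form, and climate scenario are illustrative assumptions, not the authors' fitted model.

```python
# Sketch of a seasonal statistical flow model: regress a streamflow metric on
# precipitation and temperature for the same and the preceding season. The
# synthetic data and linear form are illustrative assumptions only.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
n_seasons = 200
precip = rng.gamma(3.0, 60.0, n_seasons)          # seasonal precipitation (mm)
temp = rng.normal(10.0, 6.0, n_seasons)           # seasonal mean temperature (C)

# "True" flow depends on current and previous-season climate plus noise.
flow = (0.6 * precip[1:] + 0.3 * precip[:-1]
        - 4.0 * temp[1:] + rng.normal(0, 15, n_seasons - 1)).clip(min=1.0)

# Predictors: current-season and lagged precipitation and temperature.
X = np.column_stack([precip[1:], precip[:-1], temp[1:], temp[:-1]])
model = LinearRegression().fit(X, flow)
print("R^2 on training data:", round(model.score(X, flow), 3))

# Apply the fitted model to a hypothetical future climate: -10% precipitation
# and +2 C warming, to generate flow estimates where no gage record exists.
X_future = X * np.array([0.9, 0.9, 1.0, 1.0]) + np.array([0.0, 0.0, 2.0, 2.0])
print("mean flow, historic vs future scenario:",
      round(flow.mean(), 1), "vs", round(model.predict(X_future).mean(), 1))
```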
Online Treatment and Virtual Therapists in Child and Adolescent Psychiatry
Schueller, Stephen M.; Stiles-Shields, Colleen; Yarosh, Lana
2016-01-01
Online and virtual therapies are a well-studied and efficacious treatment option for various mental and behavioral health conditions among children and adolescents. That said, many interventions have not considered the unique affordances offered by technologies that might align with the capacities and interests of youth users. In this article, we discuss lessons from child-computer interaction that can inform future generations of interventions and guide developers, practitioners, and researchers in how best to utilize new technologies for youth populations. We highlight issues related to usability and user experience, including challenge and feedback, social interaction, and storytelling. We conclude with innovative examples illustrating future potentials of online and virtual therapies, such as gaming and social networking. PMID:27837935
The long hold: Storing data at the National Archives
NASA Technical Reports Server (NTRS)
Thibodeau, Kenneth
1991-01-01
A description of the information collection and storage needs of the National Archives and Records Administration (NARA) is presented. The unique situation of NARA is detailed. Two aspects which make the issue of obsolescence especially complex and costly are dealing with incoherent data and satisfying unknown and unknowable requirements. The data is incoherent because it comes from a wide range of independent sources, covers unrelated subjects, and is organized and encoded in ways that are not only uncontrolled but often unknown until received. NARA's mission to preserve and provide access to records with enduring value makes NARA, in effect, the agent of future generations. NARA's responsibility to the future places it in a perpetual quandary: it is devoted to serving needs that are unknown.
Time Series Analysis of Technology Trends based on the Internet Resources
NASA Astrophysics Data System (ADS)
Kobayashi, Shin-Ichi; Shirai, Yasuyuki; Hiyane, Kazuo; Kumeno, Fumihiro; Inujima, Hiroshi; Yamauchi, Noriyoshi
Information technology has become increasingly important in recent years for the development of our society, bringing changes to nearly everything with incredible speed. Hence, when we investigate R&D themes or plan business strategies in IT, we must understand the overall situation around the target technology area, not just the technology itself. It is especially crucial to understand that situation as a time series in order to anticipate what will happen in the target area in the near future. For this purpose, we developed a method that automatically generates multiple-phased trend maps from Internet content. Furthermore, we introduced quantitative indicators to analyze possible near-future changes. An evaluation of this method produced successful and interesting results.
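The abstract does not specify its quantitative indicators; as a hedged illustration, the following Python sketch computes simple share-of-attention, growth, and acceleration indicators from yearly document counts for one topic.

```python
import numpy as np

def trend_indicators(year_counts):
    """Toy quantitative indicators over yearly document counts for one topic.

    year_counts: dict {year: number of web documents mentioning the topic}.
    Returns, per year, the topic's share of total attention, its growth
    (first difference) and its acceleration (second difference); these are
    illustrative assumptions, not the indicators actually used in the paper.
    """
    years = sorted(year_counts)
    counts = np.array([year_counts[y] for y in years], dtype=float)
    share = counts / counts.sum()
    growth = np.gradient(share)          # trend direction
    accel = np.gradient(growth)          # possible phase change
    return dict(zip(years, zip(share, growth, accel)))
```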
Engaging Living Kidney Donors in a New Paradigm of Postdonation Care.
Newell, K A; Formica, R N; Gill, J S
2016-01-01
Recent studies have highlighted the need for better understanding of the long-term health outcomes of living donors. Barriers to establishment of a dedicated long-term donor follow-up data system in the United States include infrastructure costs and donor retention. We propose providing all previous and future living donors with a lifelong health insurance benefit for the primary purpose of facilitating acquisition of health information after donation as an alternative to establishment of a dedicated donor follow-up data system. Donors would consent to allow collection and analysis of their medical data, and continuation of insurance coverage would require completion of regular health assessments. The extension of health insurance would be analogous to the established practice of paying people for participation in a research study and would provide a mechanism to engage donors in a new paradigm of postdonation care in which donors are actively involved in their own health maintenance. Rather than acting as an inducement for donation, providing donors with the ability to easily contribute information about their health status represents a practical strategy to acquire the long-term medical information necessary to better inform future generations of living kidney donors. © Copyright 2015 The American Society of Transplantation and the American Society of Transplant Surgeons.
Trajectories of water table recovery following the re-vegetation of bare peat
NASA Astrophysics Data System (ADS)
Shuttleworth, Emma; Evans, Martin; Allott, Tim; Maskill, Rachael; Pilkington, Michael; Walker, Jonathan
2016-04-01
The hydrological status of blanket peat influences a wide range of peatland functions, such as runoff generation, water quality, vegetation distribution, and rates of carbon sequestration. The UK supports 15% of the world's blanket peat cover, but much of this vital resource is significantly degraded, impacted by industrial pollution, overgrazing, wildfire, and climatic shifts. These pressures have produced a unique landscape characterised by severe gully erosion and extensive areas of bare peat. This in turn has led water tables to become substantially drawn down, impacting peatland function and limiting the resilience of these landscapes to future changes in climate. The restoration of eroding UK peatlands is a major conservation concern, and landscape-scale interventions through the re-vegetation of bare peat are becoming increasingly extensive in areas of upland Britain. Water table is the primary physical parameter considered in the monitoring of many peatland restoration projects, and there is a wealth of data from individual monitoring programmes which indicates that re-vegetation significantly raises water tables. This paper draws on data from multiple restoration projects carried out by the Moors for the Future Partnership in the Southern Pennines, UK, covering a range of stages in the erosion-restoration continuum, to assess the trajectories of water table recovery following re-vegetation. This will allow us to generate projections of future water table recovery, which will be of benefit to land managers and conservation organisations to inform future restoration initiatives.
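As an illustration of how pooled monitoring data might be turned into projected recovery trajectories, the sketch below fits an assumed exponential approach of water-table depth toward a recovered level as a function of years since re-vegetation; the functional form and starting values are assumptions, not results from the Moors for the Future data.

```python
import numpy as np
from scipy.optimize import curve_fit

def recovery_curve(t, wt_final, wt_drop, rate):
    """Water-table depth (cm below surface) as an exponential approach to a
    recovered depth wt_final, t years after re-vegetation (assumed form)."""
    return wt_final + wt_drop * np.exp(-rate * t)

def fit_recovery(years_since_revegetation, water_table_depth):
    """Fit a pooled recovery trajectory across monitored restoration sites."""
    p0 = [5.0, 20.0, 0.5]   # guesses: recovered depth, initial drawdown, rate
    params, _ = curve_fit(recovery_curve, years_since_revegetation,
                          water_table_depth, p0=p0, maxfev=5000)
    return params           # can be extrapolated to project future recovery
```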
Internally generated hippocampal sequences as a vantage point to probe future-oriented cognition.
Pezzulo, Giovanni; Kemere, Caleb; van der Meer, Matthijs A A
2017-05-01
Information processing in the rodent hippocampus is fundamentally shaped by internally generated sequences (IGSs), expressed during two different network states: theta sequences, which repeat and reset at the ∼8 Hz theta rhythm associated with active behavior, and punctate sharp wave-ripple (SWR) sequences associated with wakeful rest or slow-wave sleep. A potpourri of diverse functional roles has been proposed for these IGSs, resulting in a fragmented conceptual landscape. Here, we advance a unitary view of IGSs, proposing that they reflect an inferential process that samples a policy from the animal's generative model, supported by hippocampus-specific priors. The same inference affords different cognitive functions when the animal is in distinct dynamical modes, associated with specific functional networks. Theta sequences arise when inference is coupled to the animal's action-perception cycle, supporting online spatial decisions, predictive processing, and episode encoding. SWR sequences arise when the animal is decoupled from the action-perception cycle and may support offline cognitive processing, such as memory consolidation, the prospective simulation of spatial trajectories, and imagination. We discuss the empirical bases of this proposal in relation to rodent studies and highlight how the proposed computational principles can shed light on the mechanisms of future-oriented cognition in humans. © 2017 New York Academy of Sciences.
Means and extremes: building variability into community-level climate change experiments.
Thompson, Ross M; Beardall, John; Beringer, Jason; Grace, Mike; Sardina, Paula
2013-06-01
Experimental studies assessing climatic effects on ecological communities have typically applied static warming treatments. Although these studies have been informative, they have usually failed to incorporate either current or predicted future patterns of variability. Future climates are likely to include extreme events which have greater impacts on ecological systems than changes in means alone. Here, we review the studies which have used experiments to assess impacts of temperature on marine, freshwater and terrestrial communities, and classify them into a set of 'generations' based on how they incorporate variability. The majority of studies have failed to incorporate extreme events. In terrestrial ecosystems in particular, experimental treatments have reduced temperature variability, when most climate models predict increased variability. Marine studies have tended not to concentrate on changes in variability, likely in part because the thermal mass of oceans will moderate variation. In freshwaters, climate change experiments have a much shorter history than in the other ecosystems, and have tended to take a relatively simple approach. We propose a new 'generation' of climate change experiments using down-scaled climate models which incorporate predicted changes in climatic variability, and describe a process for generating data which can be applied as experimental climate change treatments. © 2013 John Wiley & Sons Ltd/CNRS.
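A hedged sketch of the proposed treatment-generation step: starting from an observed daily temperature series, it adds a mean shift and inflates anomalies about a running climatology so that variability and extremes change, not just the mean. The parameter values are illustrative assumptions, not values from the paper.

```python
import numpy as np

def variability_treatment(baseline_temp, mean_shift=2.0, variance_scale=1.3, window=15):
    """Build an experimental warming treatment that alters variability.

    baseline_temp: 1-D array of observed daily temperatures.
    mean_shift: uniform warming added to every day (deg C).
    variance_scale: factor by which anomalies about the running climatology
                    are inflated, so extremes become more extreme.
    """
    n = len(baseline_temp)
    clim = np.array([baseline_temp[max(0, i - window):i + window + 1].mean()
                     for i in range(n)])          # running climatology
    anomalies = baseline_temp - clim
    return clim + mean_shift + variance_scale * anomalies
```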
Yoo, Grace J; Kim, Barbara W
2010-06-01
Korean immigration peaked in the mid-1980s, so that large cohorts of post-1965 immigrants are now approaching or entering retirement. As the baby boomer generation ages, few studies have examined how the lack of retirement savings and eldercare plans combined with cultural expectations such as filial piety may pose challenges for aging Korean immigrants and their adult children. This exploratory study examines attitudes and beliefs among 1.5 and 2nd generation Korean American adults regarding filial expectations and support for aging immigrant parents. In-depth interviews conducted with 124 adult children of immigrants show that their attitudes and beliefs around filial care were primarily motivated by feelings of gratitude and a strong sense of responsibility toward their parents. In addition, because Korean immigrant parents often face language and financial barriers, adult children were preparing themselves for future support of their parents' finances, health care and long-term care needs. Although both adult sons and daughters expressed a desire to care for their parents, adult daughters often discussed in detail their concerns and worries about future care of their parents. The findings of this paper illustrate how the intersections of gender, culture, and class inform attitudes and beliefs regarding aging and family support among Korean American families.
NASA Technical Reports Server (NTRS)
Prescott, Glenn; Komar, George (Technical Monitor)
2001-01-01
Future NASA Earth observing satellites will carry high-precision instruments capable of producing large amounts of scientific data. The strategy will be to network these instrument-laden satellites into a web-like array of sensors to facilitate the collection, processing, transmission, storage, and distribution of data and data products - the essential elements of what we refer to as "Information Technology." Many of these Information Technologies will enable the satellite and ground information systems to function effectively in real-time, providing scientists with the capability of customizing data collection activities on a satellite or group of satellites directly from the ground. In future systems, extremely large quantities of data collected by scientific instruments will require the fastest processors, the highest communication channel transfer rates, and the largest data storage capacity to insure that data flows smoothly from the satellite-based instrument to the ground-based archive. Autonomous systems will control all essential processes and play a key role in coordinating the data flow through space-based communication networks. In this paper, we will discuss those critical information technologies for Earth observing satellites that will support the next generation of space-based scientific measurements of planet Earth, and insure that data and data products provided by these systems will be accessible to scientists and the user community in general.
van der Linden, Helma; Austin, Tony; Talmon, Jan
2009-09-01
Future-proof EHR systems must be capable of interpreting information structures for medical concepts that were not available at the build-time of the system. The two-model approach of CEN 13606/openEHR using archetypes achieves this by separating generic clinical knowledge from domain-related knowledge. The presentation of this information can either itself be generic, or require design time awareness of the domain knowledge being employed. To develop a Graphical User Interface (GUI) that would be capable of displaying previously unencountered clinical data structures in a meaningful way. Through "reasoning by analogy" we defined an approach for the representation and implementation of "presentational knowledge". A proof-of-concept implementation was built to validate its implementability and to test for unanticipated issues. A two-model approach to specifying and generating a screen representation for archetype-based information, inspired by the two-model approach of archetypes, was developed. There is a separation between software-related display knowledge and domain-related display knowledge and the toolkit is designed with the reuse of components in mind. The approach leads to a flexible GUI that can adapt not only to information structures that had not been predefined within the receiving system, but also to novel ways of displaying the information. We also found that, ideally, the openEHR Archetype Definition Language should receive minor adjustments to allow for generic binding.
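A minimal sketch of the two-model display idea, assuming a generic mapping from data types to widgets (software-related display knowledge) kept separate from per-archetype presentation hints (domain-related display knowledge); all names and codes are illustrative, not the toolkit's actual API.

```python
# Software-related display knowledge: generic type -> widget mapping
GENERIC_WIDGETS = {
    "DV_QUANTITY": "NumericFieldWithUnits",
    "DV_CODED_TEXT": "DropDownList",
    "DV_TEXT": "TextBox",
    "DV_DATE_TIME": "DatePicker",
}

# Domain-related display knowledge: per-archetype-path presentation hints
DOMAIN_HINTS = {
    "openEHR-EHR-OBSERVATION.blood_pressure.v1/data/systolic": {"group": "Vitals", "order": 1},
}

def widget_for(node_type, archetype_path):
    """Choose a presentation for a possibly unseen archetype node by falling
    back to the generic type mapping when no domain hint exists."""
    widget = GENERIC_WIDGETS.get(node_type, "TextBox")
    hints = DOMAIN_HINTS.get(archetype_path, {})
    return {"widget": widget, **hints}
```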
Downscaling climate change scenarios for apple pest and disease modeling in Switzerland
NASA Astrophysics Data System (ADS)
Hirschi, M.; Stoeckli, S.; Dubrovsky, M.; Spirig, C.; Calanca, P.; Rotach, M. W.; Fischer, A. M.; Duffy, B.; Samietz, J.
2012-02-01
As a consequence of current and projected climate change in temperate regions of Europe, agricultural pests and diseases are expected to occur more frequently and possibly to extend to previously non-affected regions. Given their economic and ecological relevance, detailed forecasting tools for various pests and diseases have been developed, which model their phenology, depending on actual weather conditions, and suggest management decisions on that basis. Assessing the future risk of pest-related damages requires future weather data at high temporal and spatial resolution. Here, we use a combined stochastic weather generator and re-sampling procedure for producing site-specific hourly weather series representing present and future (1980-2009 and 2045-2074 time periods) climate conditions in Switzerland. The climate change scenarios originate from the ENSEMBLES multi-model projections and provide probabilistic information on future regional changes in temperature and precipitation. Hourly weather series are produced by first generating daily weather data for these climate scenarios and then using a nearest neighbor re-sampling approach for creating realistic diurnal cycles. These hourly weather series are then used for modeling the impact of climate change on important life phases of the codling moth and on the number of predicted infection days of fire blight. Codling moth (Cydia pomonella) and fire blight (Erwinia amylovora) are two major pest and disease threats to apple, one of the most important commercial and rural crops across Europe. Results for the codling moth indicate a shift in the occurrence and duration of life phases relevant for pest control. In southern Switzerland, a 3rd generation per season occurs only very rarely under today's climate conditions but is projected to become normal in the 2045-2074 time period. While the potential risk for a 3rd generation is also significantly increasing in northern Switzerland (for most stations from roughly 1% on average today to over 60% in the future for the median climate change signal of the multi-model projections), the actual risk will critically depend on the pace of the adaptation of the codling moth with respect to the critical photoperiod. To control this additional generation, an intensification and prolongation of control measures (e.g. insecticides) will be required, implying an increasing risk of pesticide resistances. For fire blight, the projected changes in infection days are less certain due to uncertainties in the leaf wetness approximation and the simulation of the blooming period. Two compensating effects are projected, warmer temperatures favoring infections are balanced by a temperature-induced advancement of the blooming period, leading to no significant change in the number of infection days under future climate conditions for most stations.
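The nearest-neighbour re-sampling step can be sketched as follows: a generated day borrows the observed diurnal cycle of the most similar observed day and is shifted or rescaled to match the generated daily values. The distance metric and rescaling rules here are assumptions for illustration, not the authors' exact procedure.

```python
import numpy as np

def hourly_from_daily(gen_tmean, gen_precip, obs_daily, obs_hourly):
    """Nearest-neighbour disaggregation of one generated day to hours.

    obs_daily:  (n_days, 2) array of observed daily [tmean, precip].
    obs_hourly: (n_days, 24, 2) array of the corresponding observed hourly
                [temperature, precipitation] profiles.
    """
    # Standardize both daily variables before computing distances
    mu, sd = obs_daily.mean(axis=0), obs_daily.std(axis=0) + 1e-9
    target = (np.array([gen_tmean, gen_precip]) - mu) / sd
    dist = np.linalg.norm((obs_daily - mu) / sd - target, axis=1)
    nn = np.argmin(dist)                                   # most similar observed day

    hours = obs_hourly[nn].copy()
    hours[:, 0] += gen_tmean - obs_daily[nn, 0]            # shift temperature
    obs_p = obs_daily[nn, 1]
    hours[:, 1] *= (gen_precip / obs_p) if obs_p > 0 else 0.0   # scale precipitation
    return hours
```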
Expect the unexpected: screening for secondary findings in clinical genomics research.
Mackley, Michael P; Capps, Benjamin
2017-06-01
Due to decreasing cost, and increasing speed and precision, genomic sequencing in research is resulting in the generation of vast amounts of genetic data. The question of how to manage that information has been an area of significant debate. In particular, there has been much discussion around the issue of 'secondary findings' (SF): findings unrelated to the research that have diagnostic significance. In the following, we consider ethical commentaries, guidelines and policies with respect to large-scale clinical genomics studies. Research participants' autonomy and their informed consent are paramount; policies around SF must be made clear and participants must have the choice as to which results they wish to receive, if any. While many agree that clinically 'actionable' findings should be returned, some question whether they should be actively sought within a research protocol. SF present challenges to a growing field; diverse policies around their management have the potential to hinder collaboration and future research. The impact of returning SF and accurate estimates of their clinical utility are needed to inform future protocol design. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
Della, Lindsay J; Eroglu, Dogan; Bernhardt, Jay M; Edgerton, Erin; Nall, Janice
2008-01-01
Market trend data show that the media marketplace continues to rapidly evolve. Recent research shows that substantial portions of the U.S. media population are "new media" users. Today, more than ever before, media consumers are exposed to multiple media at the same point in time, encouraged to participate in media content generation, and challenged to learn, access, and use the new media that are continually entering the market. These media trends have strong implications for how consumers of health information access, process, and retain health-related knowledge. In this article we review traditional information processing models and theories of interpersonal and mass media access and consumption. We make several theory-based propositions for how traditional information processing and media consumption concepts will function as new media usage continues to increase. These propositions are supported by new media usage data from the Centers for Disease Control and Prevention's entry into the new media market (e.g., podcasting, virtual events, blogging, and webinars). Based on these propositions, we conclude by presenting both opportunities and challenges that public health communicators and marketers will face in the future.
An effective online data monitoring and saving strategy for large-scale climate simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xian, Xiaochen; Archibald, Rick; Mayer, Benjamin
Large-scale climate simulation models have been developed and widely used to generate historical data and study future climate scenarios. These simulation models often have to run for a couple of months to understand the changes in the global climate over the course of decades. This long-duration simulation process creates a huge amount of data with both high temporal and spatial resolution information; however, how to effectively monitor and record the climate changes based on these large-scale simulation results that are continuously produced in real time still remains to be resolved. Due to the slow process of writing data to disk, the current practice is to save a snapshot of the simulation results at a constant, slow rate although the data generation process runs at a very high speed. This study proposes an effective online data monitoring and saving strategy over the temporal and spatial domains with the consideration of practical storage and memory capacity constraints. Finally, our proposed method is able to intelligently select and record the most informative extreme values in the raw data generated from real-time simulations in the context of better monitoring climate changes.
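As a simplified illustration of online selection under a storage budget, the sketch below keeps only the most extreme values seen in a stream of simulation output using a fixed-size heap; the real strategy also accounts for temporal and spatial structure, which this sketch omits.

```python
import heapq

def monitor_extremes(snapshots, budget=1000):
    """Online selection of extreme values from a stream of simulation output
    under a fixed memory budget (simplified sketch, not the paper's method).

    snapshots: iterable yielding (timestep, cell_index, value) tuples as the
               simulation produces them.
    budget: maximum number of records kept in memory / written to disk.
    Returns the records with the largest absolute values seen in the stream.
    """
    heap = []                                  # min-heap keyed on |value|
    for t, cell, value in snapshots:
        item = (abs(value), t, cell, value)
        if len(heap) < budget:
            heapq.heappush(heap, item)
        elif item > heap[0]:
            heapq.heapreplace(heap, item)      # drop the least extreme record
    return sorted(heap, reverse=True)
```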
Cancer prevention and control interventions using social media: user-generated approaches.
Cavallo, David N; Chou, Wen-Ying Sylvia; McQueen, Amy; Ramirez, Amelie; Riley, William T
2014-09-01
Social media are now used by a majority of American internet users. Social media platforms encourage participants to share information with their online social connections and exchange user-generated content. Significant numbers of people are already using social media to share health-related information. As such, social media provide an opportunity for "user-generated" cancer control and prevention interventions that employ users' behavior, knowledge, and existing social networks for the creation and dissemination of interventions. These interventions also enable novel data collection techniques and research designs that will allow investigators to examine real-time behavioral responses to interventions. Emerging social media-based interventions for modifying cancer-related behaviors have been applied to such domains as tobacco use, diet, physical activity, and sexual practices, and several examples are discussed for illustration purposes. Despite some promising early findings, challenges including inadequate user engagement, privacy concerns, and lack of internet access among some groups need to be addressed in future research. Recommendations for advancing the field include stronger partnerships with commercial technology companies, utilization of rapid and adaptive designs to identify successful strategies for user engagement, rigorous and iterative efficacy testing of these strategies, and inclusive methods for intervention dissemination. ©2014 American Association for Cancer Research.
Morris, J A
1999-08-01
A model is proposed in which information from the environment is analysed by complex biological decision-making systems which are highly redundant. A correct response is intelligent behaviour which preserves health; incorrect responses lead to disease. Mutations in genes which code for the redundant systems will accumulate in the genome and impair decision-making. The number of mutant genes will depend upon a balance between the new mutation rate per generation and systems of elimination based on synergistic interaction in redundant systems. This leads to a polygenic pattern of inheritance for intelligence and the common diseases. The model also gives a simple explanation for some of the hitherto puzzling aspects of work on the genetic basis of intelligence including the recorded rise in IQ this century. There is a prediction that health, intelligence and socio-economic position will be correlated generating a health differential in the social hierarchy. Furthermore, highly competitive societies will place those least able to cope in the harshest environment and this will impair health overall. The model points to a need for population monitoring of somatic mutation in order to preserve the health and intelligence of future generations.
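The balance described, new mutations entering each generation versus elimination through synergistic (epistatic) selection in redundant systems, can be illustrated with a toy simulation; the parameter values and fitness function are assumptions, not estimates from the paper.

```python
import numpy as np

def simulate_mutation_load(pop_size=10_000, generations=200,
                           new_mutations_per_genome=1.0,
                           alpha=0.01, beta=0.01, seed=0):
    """Toy mutation-selection balance with synergistic epistasis.

    Each generation adds Poisson-distributed new mutations per genome, and
    fitness declines more than proportionally with mutation load, so selection
    preferentially removes high-load genomes. Returns mean load over time.
    """
    rng = np.random.default_rng(seed)
    load = np.zeros(pop_size)                    # mutations carried per individual
    history = []
    for _ in range(generations):
        load += rng.poisson(new_mutations_per_genome, pop_size)
        fitness = np.exp(-alpha * load - beta * load ** 2 / 2.0)   # synergistic decline
        parents = rng.choice(pop_size, size=pop_size,
                             p=fitness / fitness.sum())            # selection
        load = load[parents]
        history.append(load.mean())
    return np.array(history)
```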
A Review of Current Neuromorphic Approaches for Vision, Auditory, and Olfactory Sensors.
Vanarse, Anup; Osseiran, Adam; Rassau, Alexander
2016-01-01
Conventional vision, auditory, and olfactory sensors generate large volumes of redundant data and as a result tend to consume excessive power. To address these shortcomings, neuromorphic sensors have been developed. These sensors mimic the neuro-biological architecture of sensory organs using aVLSI (analog Very Large Scale Integration) and generate asynchronous spiking output that represents sensing information in ways that are similar to neural signals. This allows for much lower power consumption due to an ability to extract useful sensory information from sparse captured data. The foundation for research in neuromorphic sensors was laid more than two decades ago, but recent developments in the understanding of biological sensing and in advanced electronics have stimulated research on sophisticated neuromorphic sensors that provide numerous advantages over conventional sensors. In this paper, we review the current state-of-the-art in neuromorphic implementation of vision, auditory, and olfactory sensors and identify key contributions across these fields. Bringing together these key contributions, we suggest a future research direction for further development of the neuromorphic sensing field.
NASA Astrophysics Data System (ADS)
Tsuji, Takao; Hara, Ryoichi; Oyama, Tsutomu; Yasuda, Keiichiro
A super distributed energy system is a future energy system in which a large part of demand is fed by a huge number of distributed generators. At some times a node in such a system behaves as a load, and at other times as a generator; the characteristic of each node depends on the customers' decisions. In such a situation, it is very difficult to regulate the voltage profile over the system because of the complexity of the power flows. This paper proposes a novel control method for distributed generators that achieves autonomous, decentralized voltage profile regulation using multi-agent technology. The proposed multi-agent system employs two types of agent: a control agent and a mobile agent. Control agents generate or consume reactive power to regulate the voltage profile of neighboring nodes, and mobile agents transmit the information necessary for VQ control among the control agents. The proposed control method is tested through numerical simulations.
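A schematic sketch of one control agent's decision step, assuming it nudges its reactive power output in proportion to the mean voltage deviation of its neighbourhood as reported by mobile agents; the control law, gains and limits are illustrative assumptions, not the paper's exact method.

```python
def control_agent_step(local_voltage, neighbor_voltages, q_current,
                       v_ref=1.0, gain=0.5, q_max=1.0):
    """One decision step of a control agent in a decentralized voltage scheme.

    local_voltage: this node's voltage (per unit).
    neighbor_voltages: voltages delivered by mobile agents from neighboring nodes.
    q_current: current reactive power setpoint (per unit).
    The agent injects reactive power when the neighbourhood is under-voltage
    and absorbs it when over-voltage, within inverter limits.
    """
    neighborhood = [local_voltage] + list(neighbor_voltages)
    error = v_ref - sum(neighborhood) / len(neighborhood)   # mean deviation
    q_new = q_current + gain * error                         # integral-like update
    return max(-q_max, min(q_max, q_new))                    # respect limits
```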
Chung, Chia-Fang; Dew, Kristin; Cole, Allison; Zia, Jasmine; Fogarty, James; Kientz, Julie A.; Munson, Sean A.
2017-01-01
Patient-generated data is increasingly common in chronic disease care management. Smartphone applications and wearable sensors help patients more easily collect health information. However, current commercial tools often do not effectively support patients and providers in collaboration surrounding these data. This paper examines patient expectations and current collaboration practices around patient-generated data. We survey 211 patients, interview 18 patients, and re-analyze a dataset of 21 provider interviews. We find that collaboration occurs in every stage of self-tracking and that patients and providers create boundary negotiating artifacts to support the collaboration. Building upon current practices with patient-generated data, we use these theories of patient and provider collaboration to analyze misunderstandings and privacy concerns as well as identify opportunities to better support these collaborations. We reflect on the social nature of patient-provider collaboration to suggest future development of the stage-based model of personal informatics and the theory of boundary negotiating artifacts. PMID:28516171
70 years of radiation genetics: Fruit flies, mice and humans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abrahamson, S.
1997-03-01
Radiation protection's function is to protect society from the potential hazards that might occur through the human use of radiation, whether it be from energy production, medical uses or other sources of exposure. To do so, various scientific bodies are called upon to develop risk estimates which will provide society with adequate protection against the adverse effects of radiation, as best we can understand those adverse effects. Geneticists have the added burden, in that they must attempt to provide protection not only to the offspring of the present generation but also to all subsequent generations. While most of us have difficulty in thinking of effects that might be manifest only one or two generations into the future, some have projected potential risks for 50 to 100 generations. Here the author reviews work on fruit flies and mice, and studies of human exposures, which have provided much of the foundational information upon which geneticists can derive conclusions with regard to radiation protection questions.
Toledo-Pereyra, Luis H
2011-01-01
Once an interest in surgical research exists, developing the research idea is of fundamental importance, because without it there can be no research. Where, then, do research ideas come from? Is there a better way to improve our ability to generate them? What factors stimulate the research idea? Anything we do in and out of medicine or surgery can be the force that keeps our minds occupied with future research ideas. Events in the clinical arena and discussions in formal rounds or informal meetings can all be the origin of our thinking in research. Research ideas, then, can come from anywhere, and we should be alert to them. We can be successful in research if we capture and accumulate the ideas as they present themselves in our professional or daily lives. The research environment can help us secure the presence and evolution of an idea. Be aware of changes and future developments, and be ready to recognize and grow the research idea that presents itself during the practice of medicine.
Wiedmann, Thomas O; Suh, Sangwon; Feng, Kuishuang; Lenzen, Manfred; Acquaye, Adolf; Scott, Kate; Barrett, John R
2011-07-01
Future energy technologies will be key for a successful reduction of man-made greenhouse gas emissions. With demand for electricity projected to increase significantly in the future, climate policy goals of limiting the effects of global atmospheric warming can only be achieved if power generation processes are profoundly decarbonized. Energy models, however, have ignored the fact that upstream emissions are associated with any energy technology. In this work we explore methodological options for hybrid life cycle assessment (hybrid LCA) to account for the indirect greenhouse gas (GHG) emissions of energy technologies using wind power generation in the UK as a case study. We develop and compare two different approaches using a multiregion input-output modeling framework - Input-Output-based Hybrid LCA and Integrated Hybrid LCA. The latter utilizes the full-sized Ecoinvent process database. We discuss significance and reliability of the results and suggest ways to improve the accuracy of the calculations. The comparison of hybrid LCA methodologies provides valuable insight into the availability and robustness of approaches for informing energy and environmental policy.
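The core of the input-output-based hybrid calculation can be sketched as direct process emissions plus upstream emissions obtained through the Leontief inverse of the IO model; this is a generic illustration, not the paper's exact formulation or data.

```python
import numpy as np

def io_hybrid_lca(A, f, y_foreground, direct_process_emissions):
    """Input-output-based hybrid LCA sketch.

    A: (n x n) inter-industry technical coefficient matrix of the
       (multi-regional) IO model.
    f: length-n vector of sectoral GHG emission intensities
       (kg CO2e per unit of sectoral output).
    y_foreground: length-n vector of purchases by the foreground system
       (e.g., the wind farm's inputs) mapped onto IO sectors.
    direct_process_emissions: emissions of the foreground processes themselves.

    Total emissions = direct process emissions + f (I - A)^-1 y.
    """
    n = len(f)
    L = np.linalg.inv(np.eye(n) - A)             # Leontief inverse
    upstream = f @ L @ y_foreground              # indirect (upstream) emissions
    return direct_process_emissions + upstream
```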
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-21
The National Agricultural Statistics Service (NASS) is currently conducting the 2009 On-farm Renewable Energy Production (OREP) survey as a follow-on to the 2007 Census of Agriculture. Respondents who answered that they generated energy or electricity in 2007 are eligible for the follow-on survey to determine types of selected energy produced and associated information. NASS is currently accepting stakeholder feedback on future energy related topics and questionnaire content for development of an annual agricultural energy survey.
ERIC Educational Resources Information Center
Bizzo, Nelio, Ed.; Kawasaki, Clarice Sumi, Ed.; Ferracioli, Laercio, Ed.; Leyser da Rosa, Vivian, Ed.
This document is the proceedings of the 10th annual meeting of the International Organization for Science and Technology Education (IOSTE). Papers include: (1) "Liberal Education, Information Assessment and Argumentation in Science-LIA" (Andreas Quale, Anders Isnes, Terje Kristensen, and Ketil Mathiassen); (2) "Placing the History…
Reference software implementation for GIFTS ground data processing
NASA Astrophysics Data System (ADS)
Garcia, R. K.; Howell, H. B.; Knuteson, R. O.; Martin, G. D.; Olson, E. R.; Smuga-Otto, M. J.
2006-08-01
Future satellite weather instruments such as high spectral resolution imaging interferometers pose a challenge to the atmospheric science and software development communities due to the immense data volumes they will generate. An open-source, scalable reference software implementation demonstrating the calibration of radiance products from an imaging interferometer, the Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS), is presented. This paper covers essential design principles laid out in summary system diagrams, lessons learned during implementation and preliminary test results from the GIFTS Information Processing System (GIPS) prototype.
SPAGETTA, a Gridded Weather Generator: Calibration, Validation and its Use for Future Climate
NASA Astrophysics Data System (ADS)
Dubrovsky, Martin; Rotach, Mathias W.; Huth, Radan
2017-04-01
Spagetta is a new (started in 2016) stochastic multi-site multi-variate weather generator (WG). It can produce realistic synthetic daily (or monthly, or annual) weather series representing both present and future climate conditions at multiple sites (grids or stations irregularly distributed in space). The generator, whose model is based on the Wilks' (1999) multi-site extension of the parametric (Richardson's type) single site M&Rfi generator, may be run in two modes: In the first mode, it is run as a classical generator, which is calibrated in the first step using weather data from multiple sites, and only then it may produce arbitrarily long synthetic time series mimicking the spatial and temporal structure of the calibration weather data. To generate the weather series representing the future climate, the WG parameters are modified according to the climate change scenario, typically derived from GCM or RCM simulations. In the second mode, the user provides only basic information (not necessarily to be realistic) on the temporal and spatial auto-correlation structure of the surface weather variables and their mean annual cycle; the generator itself derives the parameters of the underlying autoregressive model, which produces the multi-site weather series. In the latter mode of operation, the user is allowed to prescribe the spatially varying trend, which is superimposed to the values produced by the generator; this feature has been implemented for use in developing the methodology for assessing significance of trends in multi-site weather series (for more details see another EGU-2017 contribution: Huth and Dubrovsky, 2017, Evaluating collective significance of climatic trends: A comparison of methods on synthetic data; EGU2017-4993). This contribution will focus on the first (classical) mode. The poster will present (a) model of the generator, (b) results of the validation tests made in terms of the spatial hot/cold/dry/wet spells, and (c) results of the pilot climate change impact experiment, in which (i) the WG parameters representing the spatial and temporal variability are modified using the climate change scenarios and then (ii) the effect on the above spatial validation indices derived from the synthetic series produced by the modified WG is analysed. In this experiment, the generator is calibrated using the E-OBS gridded daily weather data for several European regions, and the climate change scenarios are derived from the selected RCM simulation (taken from the CORDEX database).
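A minimal sketch of the Wilks-type multi-site generation step that underlies generators of this kind: spatially correlated Gaussian anomalies, made temporally AR(1), are rescaled by per-site climatology. Variable names and defaults are illustrative, and a real generator additionally handles precipitation occurrence, annual cycles and scenario-based parameter modification.

```python
import numpy as np

def generate_multisite_temperature(n_days, site_means, site_sds,
                                   spatial_corr, lag1=0.7, seed=0):
    """Generate daily temperature series at multiple sites.

    site_means, site_sds: per-site daily mean and standard deviation.
    spatial_corr: (n_sites x n_sites) positive-definite correlation matrix
                  of daily anomalies between sites.
    lag1: common lag-1 autocorrelation of the anomalies.
    """
    rng = np.random.default_rng(seed)
    n_sites = len(site_means)
    chol = np.linalg.cholesky(spatial_corr)
    innovation_scale = np.sqrt(1.0 - lag1 ** 2)

    z = chol @ rng.standard_normal(n_sites)      # initial correlated anomaly field
    out = np.empty((n_days, n_sites))
    for t in range(n_days):
        eps = chol @ rng.standard_normal(n_sites)
        z = lag1 * z + innovation_scale * eps    # AR(1) in time, correlated in space
        out[t] = site_means + site_sds * z
    return out
```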
Epigenetics and Future Generations.
Del Savio, Lorenzo; Loi, Michele; Stupka, Elia
2015-10-01
Recent evidence of intergenerational epigenetic programming of disease risk broadens the scope of public health preventive interventions to future generations, i.e. non-existing people. Due to the transmission of epigenetic predispositions, lifestyles such as smoking or an unhealthy diet might affect the health of populations across several generations. While public policy for the health of future generations can be justified through impersonal considerations, such as maximizing aggregate well-being, in this article we explore whether there are rights-based obligations supervening on intergenerational epigenetic programming despite the non-identity argument, which challenges this rationale in the case of policies that affect the number and identity of future people. We propose that rights-based obligations grounded in the interests of non-existing people might fall upon existing people when generations overlap. In particular, if environmental exposure in F0 (i.e. existing people) will affect the health of F2 (i.e. non-existing people) through epigenetic programming, then F1 (i.e. existing and overlapping with both F0 and F2) might face increased costs to address F2's condition in the future: this might generate obligations upon F0 from various distributive principles, such as the principle of equal opportunity for well-being. © 2015 John Wiley & Sons Ltd.
Crosson, Bruce; Benefield, Hope; Cato, M Allison; Sadek, Joseph R; Moore, Anna Bacon; Wierenga, Christina E; Gopinath, Kaundinya; Soltysik, David; Bauer, Russell M; Auerbach, Edward J; Gökçay, Didem; Leonard, Christiana M; Briggs, Richard W
2003-11-01
fMRI was used to determine the frontal, basal ganglia, and thalamic structures engaged by three facets of language generation: lexical status of generated items, the use of semantic vs. phonological information during language generation, and rate of generation. During fMRI, 21 neurologically normal subjects performed four tasks: generation of nonsense syllables given beginning and ending consonant blends, generation of words given a rhyming word, generation of words given a semantic category at a fast rate (matched to the rate of nonsense syllable generation), and generation of words given a semantic category at a slow rate (matched to the rate of generating of rhyming words). Components of a left pre-SMA-dorsal caudate nucleus-ventral anterior thalamic loop were active during word generation from rhyming or category cues but not during nonsense syllable generation. Findings indicate that this loop is involved in retrieving words from pre-existing lexical stores. Relatively diffuse activity in the right basal ganglia (caudate nucleus and putamen) also was found during word-generation tasks but not during nonsense syllable generation. Given the relative absence of right frontal activity during the word generation tasks, we suggest that the right basal ganglia activity serves to suppress right frontal activity, preventing right frontal structures from interfering with language production. Current findings establish roles for the left and the right basal ganglia in word generation. Hypotheses are discussed for future research to help refine our understanding of basal ganglia functions in language generation.
Kohli, R; Tan, J K; Piontek, F A; Ziege, D E; Groot, H
1999-08-01
Changes in health care delivery, reimbursement schemes, and organizational structure have required health organizations to manage the costs of providing patient care while maintaining high levels of clinical and patient satisfaction outcomes. Today, cost information, clinical outcomes, and patient satisfaction results must become more fully integrated if strategic competitiveness and benefits are to be realized in health management decision making, especially in multi-entity organizational settings. Unfortunately, traditional administrative and financial systems are not well equipped to cater to such information needs. This article presents a framework for the acquisition, generation, analysis, and reporting of cost information with clinical outcomes and patient satisfaction in the context of evolving health management and decision-support system technology. More specifically, the article focuses on an enhanced costing methodology for determining and producing improved, integrated cost-outcomes information. Implementation issues and areas for future research in cost-information management and decision-support domains are also discussed.
Derived crop management data for the LandCarbon Project
Schmidt, Gail; Liu, Shu-Guang; Oeding, Jennifer
2011-01-01
The LandCarbon project is assessing potential carbon pools and greenhouse gas fluxes under various scenarios and land management regimes to provide information to support the formulation of policies governing climate change mitigation, adaptation and land management strategies. The project is unique in that spatially explicit maps of annual land cover and land-use change are created at the 250-meter pixel resolution. The project uses vast amounts of data as input to the models, including satellite, climate, land cover, soil, and land management data. Management data have been obtained from the U.S. Department of Agriculture (USDA) National Agricultural Statistics Service (NASS) and USDA Economic Research Service (ERS) that provides information regarding crop type, crop harvesting, manure, fertilizer, tillage, and cover crop (U.S. Department of Agriculture, 2011a, b, c). The LandCarbon team queried the USDA databases to pull historic crop-related management data relative to the needs of the project. The data obtained was in table form with the County or State Federal Information Processing Standard (FIPS) and the year as the primary and secondary keys. Future projections were generated for the A1B, A2, B1, and B2 Intergovernmental Panel on Climate Change (IPCC) Special Report on Emissions Scenarios (SRES) scenarios using the historic data values along with coefficients generated by the project. The PBL Netherlands Environmental Assessment Agency (PBL) Integrated Model to Assess the Global Environment (IMAGE) modeling framework (Integrated Model to Assess the Global Environment, 2006) was used to develop coefficients for each IPCC SRES scenario, which were applied to the historic management data to produce future land management practice projections. The LandCarbon project developed algorithms for deriving gridded data, using these tabular management data products as input. The derived gridded crop type, crop harvesting, manure, fertilizer, tillage, and cover crop products are used as input to the LandCarbon models to represent the historic and the future scenario management data. The overall algorithm to generate each of the gridded management products is based on the land cover and the derived crop type. For each year in the land cover dataset, the algorithm loops through each 250-meter pixel in the ecoregion. If the current pixel in the land cover dataset is an agriculture pixel, then the crop type is determined. Once the crop type is derived, then the crop harvest, manure, fertilizer, tillage, and cover crop values are derived independently for that crop type. The following is the overall algorithm used for the set of derived grids. The specific algorithm to generate each management dataset is discussed in the respective section for that dataset, along with special data handling and a description of the output product.
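The per-pixel derivation loop described above can be sketched as follows, assuming hypothetical lookup callables and table keys; the actual LandCarbon class codes, keys and special-case handling are not reproduced here.

```python
import numpy as np

AGRICULTURE = 1   # land-cover code for agriculture (illustrative)

def derive_management_grids(land_cover, county_fips_grid, crop_type_lookup,
                            management_tables, year):
    """Derive gridded management data for one year of 250 m land cover.

    land_cover: 2-D array of land-cover codes for one year.
    county_fips_grid: 2-D array of county FIPS codes aligned with land_cover.
    crop_type_lookup(fips, year): callable returning a crop-type code.
    management_tables: dict keyed by (variable, fips, year, crop_type) holding
        the tabular USDA-derived values.
    """
    variables = ["harvest", "manure", "fertilizer", "tillage", "cover_crop"]
    grids = {v: np.full(land_cover.shape, np.nan) for v in variables}
    crop_type = np.full(land_cover.shape, -1)

    rows, cols = land_cover.shape
    for i in range(rows):
        for j in range(cols):
            if land_cover[i, j] != AGRICULTURE:
                continue                                   # skip non-agriculture pixels
            fips = county_fips_grid[i, j]
            ct = crop_type_lookup(fips, year)              # derive crop type first
            crop_type[i, j] = ct
            for v in variables:                            # then each management variable
                grids[v][i, j] = management_tables.get((v, fips, year, ct), np.nan)
    return crop_type, grids
```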
Transforming nursing education in a 140-character world: The efficacy of becoming social.
Stevens, Karen Patterson; Nies, Mary A
A generational gap exists across educational settings today. The potential and actual mismatch of learning styles and curriculum delivery suggests that the current educational models are in need of change. The advent of social media has transformed students from passive recipients of information to co-creators and engaged members of a global and information rich community. Responding proactively with social media integration through a responsive curriculum delivery system would serve to enhance student engagement and improve collaborative learning opportunities. Future implications for social media use in research and education will allow for rapid and efficient research to practice dissemination. Copyright © 2017 Elsevier Inc. All rights reserved.
Decomposition Technique for Remaining Useful Life Prediction
NASA Technical Reports Server (NTRS)
Saha, Bhaskar (Inventor); Goebel, Kai F. (Inventor); Saxena, Abhinav (Inventor); Celaya, Jose R. (Inventor)
2014-01-01
The prognostic tool disclosed here decomposes the problem of estimating the remaining useful life (RUL) of a component or sub-system into two separate regression problems: the feature-to-damage mapping and the operational conditions-to-damage-rate mapping. These maps are initially generated in off-line mode. One or more regression algorithms are used to generate each of these maps from measurements (and features derived from these), operational conditions, and ground truth information. This decomposition technique allows for the explicit quantification and management of different sources of uncertainty present in the process. Next, the maps are used in an on-line mode where run-time data (sensor measurements and operational conditions) are used in conjunction with the maps generated in off-line mode to estimate both current damage state as well as future damage accumulation. Remaining life is computed by subtracting the instance when the extrapolated damage reaches the failure threshold from the instance when the prediction is made.
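A hedged sketch of the two-map decomposition: an offline step fits the feature-to-damage and conditions-to-damage-rate regressions, and an online step estimates current damage and accumulates predicted damage under expected future conditions until the failure threshold is crossed. Linear models here stand in for whichever regression algorithms are actually used.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def fit_maps(features, damage, conditions, damage_rate):
    """Offline step: learn the two maps from ground-truth data."""
    feat_to_damage = LinearRegression().fit(features, damage)
    cond_to_rate = LinearRegression().fit(conditions, damage_rate)
    return feat_to_damage, cond_to_rate

def estimate_rul(feat_to_damage, cond_to_rate, current_features,
                 expected_conditions, failure_threshold, dt=1.0):
    """Online step: estimate current damage from run-time features, then
    accumulate predicted damage rate under expected operating conditions
    until the failure threshold is reached. Returns remaining useful life
    in units of dt, or np.inf if the threshold is never crossed."""
    damage = feat_to_damage.predict(current_features.reshape(1, -1))[0]
    for step, cond in enumerate(expected_conditions, start=1):
        rate = cond_to_rate.predict(np.asarray(cond).reshape(1, -1))[0]
        damage += max(rate, 0.0) * dt            # damage assumed non-decreasing
        if damage >= failure_threshold:
            return step * dt
    return np.inf
```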
Numerical simulation of turbulent flow affected by vortex generators in straight channel
NASA Astrophysics Data System (ADS)
Souckova, Natalie; Simurda, David; Uruba, Vaclav
2012-04-01
The presented work is the next step after several experimental examinations of the influence of vortex generators (VGs) on the flow separation occurring on a model of the NACA 63A421 airfoil with a deflected simple flap. A further purpose of this simulation is to obtain information that can be used to prepare a future experimental investigation of the same configuration using the Particle Image Velocimetry (PIV) method. The numerical simulation was performed for a single pair and for two pairs of low-profile VGs of the same size, whose heights were smaller than the boundary layer thickness. Rectangular vane-type VGs in a configuration that generates counter-rotating vortices were examined. The behaviour of the vortices produced by the VG pair or pairs is investigated at several positions downstream of the VGs and will serve as background for the measurements.
Troyer, Angela K; Häfliger, Andrea; Cadieux, Mélanie J; Craik, Fergus I M
2006-03-01
Many older adults are interested in strategies to help them learn new names. We examined the learning conditions that provide maximal benefit to name and face learning. In Experiment 1, consistent with levels-of-processing theory, name recall and recognition by 20 younger and 20 older adults was poorest with physical processing, intermediate with phonemic processing, and best with semantic processing. In Experiment 2, name and face learning in 20 younger and 20 older adults was maximized with semantic processing of names and physical processing of faces. Experiment 3 showed a benefit of self-generation and of intentional learning of name-face pairs in 24 older adults. Findings suggest that memory interventions should emphasize processing names semantically, processing faces physically, self-generating this information, and keeping in mind that memory for the names will be needed in the future.
NASA Technical Reports Server (NTRS)
Graff, P. V.; Foxworth, S.; Miller, R.; Runco, S.; Luckey, M. K.; Maudlin, E.
2018-01-01
Engaging the public with hands-on activities that infuse content related to NASA assets, missions, and science and that reflect authentic scientific practices promotes understanding and generates excitement about NASA science, research, and exploration. These types of activities expose our next generation of explorers to science they may be inspired to pursue as a future STEM career and expose people of all ages to unique, exciting, and authentic aspects of NASA exploration. The activities discussed here (Blue Marble Matches, Lunar Geologist Practice, Let's Discover New Frontiers, Target Asteroid, and Meteorite Bingo) have been developed by Astromaterials Research and Exploration Science (ARES) Science Engagement Specialists in conjunction with ARES Scientists at the NASA Johnson Space Center. Activities are designed to be usable across a variety of educational environments (formal and informal) and reflect authentic scientific content and practices.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marcy, Cara; Beiter, Philipp
2016-09-01
This report provides a high-level indicator of the future electricity demand for additional electric power generation that is not met by existing generation sources between 2015 and 2050. The indicator is applied to coastal regions, including the Great Lakes, to assess the regional opportunity space for offshore wind. An assessment of opportunity space can be a first step in determining the prospects and the system value of a technology. The metric provides the maximal amount of additional generation that is likely required to satisfy load in future years.
Renewable Electricity Futures (Presentation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hand, M.
2012-10-01
This presentation library summarizes findings of NREL's Renewable Electricity Futures study, published in June 2012. RE Futures investigated the challenges and impacts of achieving very high renewable electricity generation levels in the contiguous United States by 2050. It is being presented at the Utility Variable-Generation Integration Group Fall Technical Workshop on October 24, 2012.
Basolateral Amygdala to Orbitofrontal Cortex Projections Enable Cue-Triggered Reward Expectations.
Lichtenberg, Nina T; Pennington, Zachary T; Holley, Sandra M; Greenfield, Venuz Y; Cepeda, Carlos; Levine, Michael S; Wassum, Kate M
2017-08-30
To make an appropriate decision, one must anticipate potential future rewarding events, even when they are not readily observable. These expectations are generated by using observable information (e.g., stimuli or available actions) to retrieve often quite detailed memories of available rewards. The basolateral amygdala (BLA) and orbitofrontal cortex (OFC) are two reciprocally connected key nodes in the circuitry supporting such outcome-guided behaviors. But there is much unknown about the contribution of this circuit to decision making, and almost nothing is known about whether any contribution is via direct, monosynaptic projections, or about the direction of information transfer. Therefore, here we used designer receptor-mediated inactivation of OFC→BLA or BLA→OFC projections to evaluate their respective contributions to outcome-guided behaviors in rats. Inactivation of BLA terminals in the OFC, but not OFC terminals in the BLA, disrupted the selective motivating influence of cue-triggered reward representations over reward-seeking decisions as assayed by Pavlovian-to-instrumental transfer. BLA→OFC projections were also required when a cued reward representation was used to modify Pavlovian conditional goal-approach responses according to the reward's current value. These projections were not necessary when actions were guided by reward expectations generated based on learned action-reward contingencies, or when rewards themselves, rather than stored memories, directed action. These data demonstrate that BLA→OFC projections enable the cue-triggered reward expectations that can motivate the execution of specific action plans and allow adaptive conditional responding. SIGNIFICANCE STATEMENT Deficits in anticipating potential future rewarding events are associated with many psychiatric diseases. Presently, we know little about the neural circuits supporting such reward expectation. Here we show that basolateral amygdala to orbitofrontal cortex projections are required for expectations of specific available rewards to influence reward seeking and decision making. The necessity of these projections was limited to situations in which expectations were elicited by reward-predictive cues. These projections therefore facilitate adaptive behavior by enabling the orbitofrontal cortex to use environmental stimuli to generate expectations of potential future rewarding events. Copyright © 2017 the authors 0270-6474/17/378374-11$15.00/0.
Basolateral Amygdala to Orbitofrontal Cortex Projections Enable Cue-Triggered Reward Expectations
Lichtenberg, Nina T.; Pennington, Zachary T.; Holley, Sandra M.; Greenfield, Venuz Y.; Levine, Michael S.
2017-01-01
To make an appropriate decision, one must anticipate potential future rewarding events, even when they are not readily observable. These expectations are generated by using observable information (e.g., stimuli or available actions) to retrieve often quite detailed memories of available rewards. The basolateral amygdala (BLA) and orbitofrontal cortex (OFC) are two reciprocally connected key nodes in the circuitry supporting such outcome-guided behaviors. But much remains unknown about the contribution of this circuit to decision making, and almost nothing is known about whether any contribution is made via direct, monosynaptic projections, or about the direction of information transfer. Therefore, here we used designer receptor-mediated inactivation of OFC→BLA or BLA→OFC projections to evaluate their respective contributions to outcome-guided behaviors in rats. Inactivation of BLA terminals in the OFC, but not OFC terminals in the BLA, disrupted the selective motivating influence of cue-triggered reward representations over reward-seeking decisions as assayed by Pavlovian-to-instrumental transfer. BLA→OFC projections were also required when a cued reward representation was used to modify Pavlovian conditional goal-approach responses according to the reward's current value. These projections were not necessary when actions were guided by reward expectations generated based on learned action-reward contingencies, or when rewards themselves, rather than stored memories, directed action. These data demonstrate that BLA→OFC projections enable the cue-triggered reward expectations that can motivate the execution of specific action plans and allow adaptive conditional responding. SIGNIFICANCE STATEMENT Deficits anticipating potential future rewarding events are associated with many psychiatric diseases. Presently, we know little about the neural circuits supporting such reward expectation. Here we show that basolateral amygdala to orbitofrontal cortex projections are required for expectations of specific available rewards to influence reward seeking and decision making. The necessity of these projections was limited to situations in which expectations were elicited by reward-predictive cues. These projections therefore facilitate adaptive behavior by enabling the orbitofrontal cortex to use environmental stimuli to generate expectations of potential future rewarding events. PMID:28743727
Measuring research impact: a large cancer research funding programme in Australia.
Bowden, Jacqueline A; Sargent, Nicole; Wesselingh, Steve; Size, Lincoln; Donovan, Claire; Miller, Caroline L
2018-05-09
Measuring research impact is of critical interest to philanthropic and government funding agencies interested in ensuring that the research they fund is both scientifically excellent and has meaningful impact on health and other outcomes. The Beat Cancer Project (BCP) is an AUD $34 m cancer research funding scheme that commenced in 2011. It was initiated by an Australian charity (Cancer Council SA), and supported by the South Australian Government and the state's major universities. This study applied Buxton and Hanney's Payback Framework to assess research impact generated from the BCP after 3 years of funding. Data sources were an audit of peer-reviewed publications from January 2011 to September 2014 from Web of Knowledge and a self-report survey of investigators awarded BCP research funding during its first 3 years of implementation (2011-2013). Of the 104 surveys, 92 (88%) were completed. The BCP performed well across all five categories of the Payback Framework. In terms of knowledge production, 1257 peer-reviewed publications were generated and the mean impact factor of publishing journals increased annually. There were many benefits to future research, with 21 respondents (23%) reporting career advancement, and 110 higher degrees obtained or expected (including 84 PhDs). Overall, 52% of funded projects generated tools for future research. The funded research attracted substantial further income yielding a very high rate of leverage. For every AUD $1 that the cancer charity invested, the BCP gained an additional AUD $6.06. Five projects (5%) had informed policy and 5 (5%) informed product development, with an additional 31 (34%) and 35 (38%) projects, respectively, anticipating doing so. In terms of health and sector and broader economic benefits, 8 (9%) projects had influenced practice or behaviour of health staff and 32 (34%) reportedly would do so in the future. Research impact was a priority of charity and government funders and led to a deliberate funding strategy. Emphasising research impact while maintaining rigorous, competitive processes can achieve the joint objectives of excellence in research, yielding good research impact and a high rate of leverage for philanthropic and public investment, as indicated by these early results.
A candidate concept for display of forward-looking wind shear information
NASA Technical Reports Server (NTRS)
Hinton, David A.
1989-01-01
A concept is proposed which integrates forward-look wind shear information with airplane performance capabilities to predict future airplane energy state as a function of range. The information could be displayed to a crew either in terms of energy height or airspeed deviations. The anticipated benefits of the proposed display information concept are: (1) a wind shear hazard product that scales directly to the performance impact on the airplane and that has intuitive meaning to flight crews; (2) a reduction in flight crew workload by automatic processing of relevant hazard parameters; and (3) a continuous display of predicted airplane energy state if the approach is continued. Such a display may be used to improve pilot situational awareness or improve pilot confidence in wind shear alerts generated by other systems. The display is described and the algorithms necessary for implementation in a simulation system are provided.
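The energy-state idea lends itself to a small worked sketch. Assuming the standard specific-energy-height relation E_h = h + V^2/(2g) and the simplification that a loss of headwind comes straight out of airspeed, a forward-looking headwind profile can be converted into a predicted energy-height deficit versus range. The sensor values and aircraft numbers below are hypothetical, and this is not the paper's actual algorithm.

```python
# Minimal sketch of the "energy height" display idea: E_h = h + V^2 / (2 g).
# A forward-looking sensor returns headwind vs. range ahead (hypothetical values);
# the headwind loss is assumed, simplistically, to reduce airspeed one-for-one.

G = 9.81  # m/s^2

def energy_height(altitude_m, airspeed_ms):
    return altitude_m + airspeed_ms ** 2 / (2.0 * G)

approach_airspeed = 70.0   # m/s, hypothetical
altitude = 300.0           # m
headwind_now = 15.0        # m/s

ranges_m = [0, 1000, 2000, 3000]
headwind_ahead = [15.0, 10.0, 2.0, -5.0]   # shearing to a tailwind

for r, hw in zip(ranges_m, headwind_ahead):
    predicted_airspeed = approach_airspeed - (headwind_now - hw)
    deficit = energy_height(altitude, approach_airspeed) - energy_height(altitude, predicted_airspeed)
    print(f"range {r:5d} m: predicted airspeed {predicted_airspeed:5.1f} m/s, "
          f"energy-height deficit {deficit:6.1f} m")
```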
Collaborative Preservation of At-Risk Data at NOAA's National Centers for Environmental Information
NASA Astrophysics Data System (ADS)
Casey, K. S.; Collins, D.; Cooper, J. M.; Ritchey, N. A.
2017-12-01
The National Centers for Environmental Information (NCEI) serves as the official long term archive of NOAA's environmental data. Adhering to the principles and responsibilities of the Open Archival Information System (OAIS, ISO 14721), and backed by both agency policies and formal legislation, NCEI ensures that these irreplaceable environmental data are preserved and made available for current users and future generations. These goals are achieved through regional, national, and international collaborative efforts like the ICSU World Data System, the Intergovernmental Oceanographic Commission's International Oceanographic Data and Information Exchange (IODE) program, NSF's DataOne, and through specific data preservation projects with partners such as the NOAA Cooperative Institutes, ESIP, and even retired federal employees. Through efforts like these, at-risk data with poor documentation, on aging media, and of unknown format and content are being rescued and made available to the public for widespread reuse.
Worklist handling in workflow-enabled radiological application systems
NASA Astrophysics Data System (ADS)
Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim; von Berg, Jens
2000-05-01
For the next generation of integrated information systems for health care applications, more emphasis has to be put on systems which, by design, support the reduction of cost, the increase in efficiency and the improvement of the quality of services. A substantial contribution to this will be the modeling, optimization, automation and enactment of processes in health care institutions. One of the perceived key success factors for the system integration of processes will be the application of workflow management, with workflow management systems as key technology components. In this paper we address workflow management in radiology. We focus on an important aspect of workflow management, the generation and handling of worklists, which automatically provide workflow participants with work items that reflect tasks to be performed. The display of worklists and the functions associated with work items are the visible part for the end-users of an information system using a workflow management approach. Appropriate worklist design and implementation will influence the user friendliness of a system and will largely influence work efficiency. Technically, in current imaging department information system environments (modality-PACS-RIS installations), a data-driven approach has been taken: worklists -- if present at all -- are generated from filtered views on application databases. In a future workflow-based approach, worklists will be generated by autonomous workflow services based on explicit process models and organizational models. This process-oriented approach will provide us with an integral view of entire health care processes or sub-processes. The paper describes the basic mechanisms of this approach and summarizes its benefits.
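The contrast between the two worklist approaches can be sketched in a few lines. The tiny in-memory "RIS database", the status names, roles, and the process model below are invented for illustration; real modality-PACS-RIS integrations are far richer.

```python
# Minimal sketch contrasting the two worklist approaches described above.

exams = [
    {"id": 1, "status": "SCHEDULED", "modality": "CT"},
    {"id": 2, "status": "IMAGES_AVAILABLE", "modality": "MR"},
    {"id": 3, "status": "REPORT_DRAFTED", "modality": "CT"},
]

# Data-driven approach: the worklist is just a filtered view on application data.
def data_driven_worklist(role):
    wanted = {"radiologist": "IMAGES_AVAILABLE", "transcriptionist": "REPORT_DRAFTED"}
    return [e for e in exams if e["status"] == wanted.get(role)]

# Process-oriented approach: an explicit process model; a workflow service offers
# the next enabled activity as a work item, independent of how the data are stored.
PROCESS_MODEL = ["acquire_images", "read_images", "transcribe_report", "verify_report"]

def next_work_item(completed):
    for activity in PROCESS_MODEL:
        if activity not in completed:
            return activity
    return None

print(data_driven_worklist("radiologist"))   # the MR exam with images available
print(next_work_item(["acquire_images"]))    # 'read_images'
```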
Inventory of Power Plants in the United States, October 1992
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The Inventory of Power Plants in the United States is prepared annually by the Survey Management Division, Office of Coal, Nuclear, Electric and Alternate Fuels, Energy Information Administration (EIA), US Department of Energy (DOE). The purpose of this publication is to provide year-end statistics about electric generating units operated by electric utilities in the United States (the 50 States and the District of Columbia). The publication also provides a 10-year outlook of future generating unit additions. Data summarized in this report are useful to a wide audience including Congress, Federal and State agencies, the electric utility industry, and the general public. Data presented in this report were assembled and published by the EIA to fulfill its data collection and dissemination responsibilities as specified in the Federal Energy Administration Act of 1974 (Public Law 93-275) as amended. The report is organized into the following chapters: Year in Review, Operable Electric Generating Units, and Projected Electric Generating Unit Additions. Statistics presented in these chapters reflect the status of electric generating units as of December 31, 1992.
Reducing a Knowledge-Base Search Space When Data Are Missing
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
This software addresses the problem of how to efficiently execute a knowledge base in the presence of missing data. Computationally, this is an exponentially expensive operation that without heuristics generates a search space of 1 + 2^n possible scenarios, where n is the number of rules in the knowledge base. Even for a knowledge base of the most modest size, say 16 rules, it would produce 65,537 possible scenarios. The purpose of this software is to reduce the complexity of this operation to a more manageable size. The problem that this system solves is to develop an automated approach that can reason in the presence of missing data. This is a meta-reasoning capability that repeatedly calls a diagnostic engine/model to provide prognoses and prognosis tracking. In the big picture, the scenario generator takes as its input the current state of a system, including probabilistic information from Data Forecasting. Using model-based reasoning techniques, it returns an ordered list of fault scenarios that could be generated from the current state, i.e., the plausible future failure modes of the system as it presently stands. The scenario generator models a Potential Fault Scenario (PFS) as a black box, the input of which is a set of states tagged with priorities and the output of which is one or more potential fault scenarios tagged by a confidence factor. The results from the system are used by a model-based diagnostician to predict the future health of the monitored system.
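A small sketch of the combinatorics and of one plausible pruning heuristic (enumerating scenarios only over the highest-priority uncertain rules). The priority values and the cut-off are hypothetical; the report's actual heuristics are not reproduced here.

```python
# Sketch of the scenario explosion and an obvious priority-based pruning idea.
from itertools import combinations

n_rules = 16
print(1 + 2 ** n_rules)   # 65537 scenarios without any pruning, as in the 16-rule example

# Suppose only rules whose antecedents touch missing data are uncertain,
# each carrying a (hypothetical) priority from data forecasting.
uncertain = {"r3": 0.9, "r7": 0.7, "r11": 0.4, "r14": 0.2}

def pruned_scenarios(uncertain_rules, k):
    """Enumerate scenarios over only the k highest-priority uncertain rules."""
    top = sorted(uncertain_rules, key=uncertain_rules.get, reverse=True)[:k]
    for size in range(len(top) + 1):
        for fired in combinations(top, size):
            yield set(fired)

scenarios = list(pruned_scenarios(uncertain, k=2))
print(len(scenarios), scenarios)   # 4 scenarios instead of 2**4 = 16
```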
Cyanobacteria: A Precious Bio-resource in Agriculture, Ecosystem, and Environmental Sustainability.
Singh, Jay Shankar; Kumar, Arun; Rai, Amar N; Singh, Devendra P
2016-01-01
In view of the challenges concerning the agro-ecosystem and environment, recent developments in biotechnology offer a more reliable approach to addressing food security for future generations and resolving complex environmental problems. Several unique features of cyanobacteria, such as oxygenic photosynthesis, high biomass yield, growth on non-arable lands and a wide variety of water sources (contaminated and polluted waters), generation of useful by-products and bio-fuels, enhancement of soil fertility and reduction of greenhouse gas emissions, have collectively established these bio-agents as a precious bio-resource for sustainable development. Cyanobacterial biomass is an effective bio-fertilizer source for improving soil physico-chemical characteristics, such as the water-holding capacity and mineral nutrient status of degraded lands. The unique characteristics of cyanobacteria include their ubiquitous presence, short generation time and capability to fix atmospheric N2. Like other prokaryotic bacteria, cyanobacteria are increasingly applied as bio-inoculants for improving soil fertility and environmental quality. Genetically engineered cyanobacteria carrying novel genes have been devised for the production of a number of bio-fuels such as bio-diesel, bio-hydrogen, bio-methane and syngas, and therefore open new avenues for the generation of bio-fuels in an economically sustainable manner. This review is an effort to compile the valuable information about the qualities of cyanobacteria and their potential role in solving agricultural and environmental problems for the future welfare of the planet.
Cyanobacteria: A Precious Bio-resource in Agriculture, Ecosystem, and Environmental Sustainability
Singh, Jay Shankar; Kumar, Arun; Rai, Amar N.; Singh, Devendra P.
2016-01-01
In view of the challenges concerning the agro-ecosystem and environment, recent developments in biotechnology offer a more reliable approach to addressing food security for future generations and resolving complex environmental problems. Several unique features of cyanobacteria, such as oxygenic photosynthesis, high biomass yield, growth on non-arable lands and a wide variety of water sources (contaminated and polluted waters), generation of useful by-products and bio-fuels, enhancement of soil fertility and reduction of greenhouse gas emissions, have collectively established these bio-agents as a precious bio-resource for sustainable development. Cyanobacterial biomass is an effective bio-fertilizer source for improving soil physico-chemical characteristics, such as the water-holding capacity and mineral nutrient status of degraded lands. The unique characteristics of cyanobacteria include their ubiquitous presence, short generation time and capability to fix atmospheric N2. Like other prokaryotic bacteria, cyanobacteria are increasingly applied as bio-inoculants for improving soil fertility and environmental quality. Genetically engineered cyanobacteria carrying novel genes have been devised for the production of a number of bio-fuels such as bio-diesel, bio-hydrogen, bio-methane and syngas, and therefore open new avenues for the generation of bio-fuels in an economically sustainable manner. This review is an effort to compile the valuable information about the qualities of cyanobacteria and their potential role in solving agricultural and environmental problems for the future welfare of the planet. PMID:27148218
New Methods for Crafting Locally Decision-Relevant Scenarios
NASA Astrophysics Data System (ADS)
Lempert, R. J.
2015-12-01
Scenarios can play an important role in helping decision makers to imagine future worlds, both good and bad, different from the one with which we are familiar and to take concrete steps now to address the risks generated by climate change. At their best, scenarios can effectively represent deep uncertainty; integrate over multiple domains; and enable parties with different expectations and values to expand the range of futures they consider, to see the world from different points of view, and to grapple seriously with the potential implications of surprising or inconvenient futures. These attributes of scenario processes can prove crucial in helping craft effective responses to climate change. But traditional scenario methods can also fail to overcome difficulties related to choosing, communicating, and using scenarios to identify, evaluate, and reach consensus on appropriate policies. Such challenges can limit scenarios' impact in broad public discourse. This talk will demonstrate how new decision support approaches can employ new quantitative tools that allow scenarios to emerge from a process of deliberation with analysis among stakeholders, rather than serve as inputs to it, thereby increasing the impacts of scenarios on decision making. This talk will demonstrate these methods in the design of a decision support tool to help residents of low-lying coastal cities grapple with the long-term risks of sea level rise. In particular, this talk will show how information from the IPCC SSPs can be combined with local information to provide a rich set of locally decision-relevant information.
Development of a Portfolio Management Approach with Case Study of the NASA Airspace Systems Program
NASA Technical Reports Server (NTRS)
Neitzke, Kurt W.; Hartman, Christopher L.
2012-01-01
A portfolio management approach was developed for the National Aeronautics and Space Administration's (NASA's) Airspace Systems Program (ASP). The purpose was to help inform ASP leadership regarding future investment decisions related to its existing portfolio of advanced technology concepts and capabilities (C/Cs) currently under development and to potentially identify new opportunities. The portfolio management approach is general in form and is extensible to other advanced technology development programs. It focuses on individual C/Cs and consists of three parts: 1) concept of operations (con-ops) development, 2) safety impact assessment, and 3) benefit-cost-risk (B-C-R) assessment. The first two parts are recommendations to ASP leaders and will be discussed only briefly, while the B-C-R part relates to the development of an assessment capability and will be discussed in greater detail. The B-C-R assessment capability enables estimation of the relative value of each C/C as compared with all other C/Cs in the ASP portfolio. Value is expressed in terms of a composite weighted utility function (WUF) rating, based on estimated benefits, costs, and risks. Benefit utility is estimated relative to achieving key NAS performance objectives, which are outlined in the ASP Strategic Plan. Risk utility focuses on C/C development and implementation risk, while cost utility focuses on the development and implementation portions of overall C/C life-cycle costs. Initial composite ratings of the ASP C/Cs were successfully generated; however, the limited availability of B-C-R information, which is used as input to the WUF model, reduced the meaningfulness of these initial investment ratings. Development of this approach, however, defined specific information-generation requirements for ASP C/C developers that will increase the meaningfulness of future B-C-R ratings.
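The composite rating can be illustrated with a minimal weighted-utility sketch. The weights, utility values, and concept names below are placeholders; the ASP study's actual utility functions and weights are not reproduced.

```python
# Minimal sketch of a composite weighted-utility rating of the kind described.
# All weights and utilities are hypothetical and scaled 0..1.

WEIGHTS = {"benefit": 0.5, "cost": 0.25, "risk": 0.25}

def composite_rating(utilities):
    """Weighted sum of benefit, cost, and risk utilities."""
    return sum(WEIGHTS[k] * utilities[k] for k in WEIGHTS)

portfolio = {
    "concept_A": {"benefit": 0.8, "cost": 0.4, "risk": 0.6},
    "concept_B": {"benefit": 0.5, "cost": 0.9, "risk": 0.7},
}

for name, u in sorted(portfolio.items(), key=lambda kv: composite_rating(kv[1]), reverse=True):
    print(name, round(composite_rating(u), 3))
```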
A prototype for automation of land-cover products from Landsat Surface Reflectance Data Records
NASA Astrophysics Data System (ADS)
Rover, J.; Goldhaber, M. B.; Steinwand, D.; Nelson, K.; Coan, M.; Wylie, B. K.; Dahal, D.; Wika, S.; Quenzer, R.
2014-12-01
Landsat data records of surface reflectance provide a three-decade history of land surface processes. Due to the vast number of these archived records, development of innovative approaches for automated data mining and information retrieval was necessary. Recently, we created a prototype utilizing open source software libraries for automatically generating annual Anderson Level 1 land cover maps and information products from data acquired by the Landsat Mission for the years 1984 to 2013. The automated prototype was applied to two target areas in northwestern and east-central North Dakota, USA. The approach required the National Land Cover Database (NLCD) and two user-input target acquisition year-days. The Landsat archive was mined for scenes acquired within a 100-day window surrounding these target dates, and then cloud-free pixels were chosen closest to the specified target acquisition dates. The selected pixels were then composited before completing an unsupervised classification using the NLCD. Pixels unchanged in pairs of the NLCD were used for training decision tree models in an iterative process refined with model confidence measures. The decision tree models were applied to the Landsat composites to generate a yearly land cover map and related information products. Results for the target areas captured changes associated with the recent expansion of oil shale production and agriculture driven by economics and policy, such as the increase in biofuel production and reduction in the Conservation Reserve Program. Changes in agriculture, grasslands, and surface water reflect the local hydrological conditions that occurred during the 29-year span. Future enhancements considered for this prototype include a web-based client, ancillary spatial datasets, trends and clustering algorithms, and the forecasting of future land cover.
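Two of the described steps, per-pixel compositing toward a target acquisition date and decision-tree training on pixels whose NLCD label is unchanged, can be sketched as follows. The array shapes, class codes, and synthetic data are illustrative only and do not reflect the prototype's actual implementation.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n_scenes, n_pixels, n_bands = 5, 1000, 6
target_doy = 200

doy = np.array([150, 180, 205, 230, 260])                 # acquisition day-of-year per scene
reflectance = rng.random((n_scenes, n_pixels, n_bands))   # synthetic stand-in for Landsat SR
cloudy = rng.random((n_scenes, n_pixels)) < 0.3           # per-pixel cloud mask

# (1) Composite: for each pixel, keep the cloud-free scene nearest the target date.
penalty = np.abs(doy - target_doy)[:, None] + np.where(cloudy, 1e6, 0)
best_scene = penalty.argmin(axis=0)
composite = reflectance[best_scene, np.arange(n_pixels), :]

# (2) Train on "stable" pixels (same NLCD class in both epochs), then map everything.
nlcd_2006 = rng.integers(1, 5, n_pixels)
nlcd_2011 = nlcd_2006.copy()
changed = rng.random(n_pixels) < 0.1
nlcd_2011[changed] = rng.integers(1, 5, changed.sum())
stable = nlcd_2006 == nlcd_2011

model = DecisionTreeClassifier(max_depth=8).fit(composite[stable], nlcd_2011[stable])
land_cover_map = model.predict(composite)
print(land_cover_map[:10])
```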
How to exploit twitter for public health monitoring?
Denecke, K; Krieck, M; Otrusina, L; Smrz, P; Dolog, P; Nejdl, W; Velasco, E
2013-01-01
Detecting hints of public health threats as early as possible is crucial to prevent harm to the population. However, many disease surveillance strategies rely upon data whose collection requires explicit reporting (data transmitted from hospitals, laboratories or physicians). Collecting reports takes time, so reaction time grows. Moreover, context information on individual cases is often lost in the collection process. This paper describes a system that tries to address these limitations by processing social media for identifying information on public health threats. The primary objective is to study the usefulness of the approach for supporting the monitoring of a population's health status. The developed system works in three main steps: Data from Twitter, blogs, and forums as well as from TV and radio channels are continuously collected and filtered by means of keyword lists. Sentences of relevant texts are classified as relevant or irrelevant using a binary classifier based on support vector machines. By means of statistical methods known from biosurveillance, the relevant sentences are further analyzed and signals are generated automatically when unexpected behavior is detected. From the generated signals a subset is selected for presentation to a user by matching with user queries or profiles. In a set of evaluation experiments, public health experts assessed the generated signals with respect to correctness and relevancy. In particular, it was assessed how many relevant and irrelevant signals are generated during a specific time period. The experiments show that the system provides information on health events identified in social media. Signals are mainly generated from Twitter messages posted by news agencies. Personal tweets, i.e. tweets from persons observing some symptoms, only play a minor role for signal generation given a limited volume of relevant messages. Relevant signals referring to real world outbreaks were generated by the system and monitored by epidemiologists, for example during the European football championship. However, the number of relevant signals among generated signals is still very small: the different experiments yielded a proportion between 5 and 20% of signals regarded as "relevant" by the users. Vaccination or education campaigns communicated via Twitter as well as use of medical terms in other contexts than outbreak reporting led to the generation of irrelevant signals. The aggregation of information into signals results in a reduction of monitoring effort compared to other existing systems. Against expectations, only a few messages are of a personal nature, reporting on personal symptoms. Instead, media reports are distributed over social media channels. Despite the high percentage of irrelevant signals generated by the system, the users reported that the effort in monitoring aggregated information in the form of signals is less demanding than monitoring huge social-media data streams manually. It remains for the future to develop strategies for reducing false alarms.
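A compressed sketch of the described pipeline: keyword filtering, a support-vector sentence classifier, and a simple statistical alarm on daily counts of relevant sentences. The keywords, training sentences, and the mean-plus-two-standard-deviations threshold are stand-ins; the system's actual models and biosurveillance statistics are not reproduced.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

KEYWORDS = {"fever", "vomiting", "outbreak", "measles"}

def keyword_filter(text):
    return any(k in text.lower() for k in KEYWORDS)

# Tiny hypothetical training set for the relevant/irrelevant sentence classifier.
train_sentences = [
    "dozens report fever and vomiting after the festival",     # relevant
    "health agency confirms measles outbreak in the region",   # relevant
    "get your measles shirt at the fan shop",                  # irrelevant
    "feeling feverish about the match tonight",                 # irrelevant
]
train_labels = [1, 1, 0, 0]
clf = make_pipeline(TfidfVectorizer(), LinearSVC()).fit(train_sentences, train_labels)

stream = [
    "suspected measles outbreak reported near the stadium",
    "the team's defence has a fever pitch atmosphere",
]
candidates = [s for s in stream if keyword_filter(s)]
relevant = [s for s in candidates if clf.predict([s])[0] == 1]

# Signal generation: flag a day whose relevant-sentence count exceeds the running
# mean plus two standard deviations (a stand-in for the biosurveillance methods).
daily_counts = np.array([3, 4, 2, 5, 3, 4, 17])
baseline = daily_counts[:-1]
if daily_counts[-1] > baseline.mean() + 2 * baseline.std():
    print("signal generated for the latest day")
```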
Does cross-generational epigenetic inheritance contribute to cultural continuity?
Pembrey, Marcus E
2018-01-01
Abstract Human studies of cross-generational epigenetic inheritance have to consider confounding by social patterning down the generations, often referred to as ‘cultural inheritance’. This raises the question to what extent is ‘cultural inheritance’ itself epigenetically mediated rather than just learnt. Human studies of non-genetic inheritance have demonstrated that, beyond foetal life, experiences occurring in mid-childhood before puberty are the most likely to be associated with cross-generational responses in the next generation(s). It is proposed that cultural continuity is played out along the axis, or ‘payoff’, between responsiveness and stability. During the formative years of childhood a stable family and/or home permits small children to explore and thereby learn. To counter disruptions to this family home ideal, cultural institutions such as local schools, religious centres and market places emerged to provide ongoing stability, holding the received wisdom of the past in an accessible state. This cultural support allows the growing child to freely indulge their responsiveness. Some of these prepubertal experiences induce epigenetic responses that also transfer molecular signals to the gametes through which they contribute to the conception of future offspring. In parallel co-evolution with growing cultural support for increasing responsiveness, ‘runaway’ responsiveness is countered by the positive selection of genetic variants that dampen responsiveness. Testing these ideas within longitudinal multigenerational cohorts will need information on ancestors/parents’ own communities and experiences (Exposome scans) linked to ongoing Phenome scans on grandchildren; coupled with epigenome analysis, metastable epialleles and DNA methylation age. Interactions with genetic variants affecting responsiveness should help inform the broad hypothesis. PMID:29732169
Does cross-generational epigenetic inheritance contribute to cultural continuity?
Pembrey, Marcus E
2018-04-01
Human studies of cross-generational epigenetic inheritance have to consider confounding by social patterning down the generations, often referred to as 'cultural inheritance'. This raises the question to what extent is 'cultural inheritance' itself epigenetically mediated rather than just learnt. Human studies of non-genetic inheritance have demonstrated that, beyond foetal life, experiences occurring in mid-childhood before puberty are the most likely to be associated with cross-generational responses in the next generation(s). It is proposed that cultural continuity is played out along the axis, or 'payoff', between responsiveness and stability. During the formative years of childhood a stable family and/or home permits small children to explore and thereby learn. To counter disruptions to this family home ideal, cultural institutions such as local schools, religious centres and market places emerged to provide ongoing stability, holding the received wisdom of the past in an accessible state. This cultural support allows the growing child to freely indulge their responsiveness. Some of these prepubertal experiences induce epigenetic responses that also transfer molecular signals to the gametes through which they contribute to the conception of future offspring. In parallel co-evolution with growing cultural support for increasing responsiveness, 'runaway' responsiveness is countered by the positive selection of genetic variants that dampen responsiveness. Testing these ideas within longitudinal multigenerational cohorts will need information on ancestors/parents' own communities and experiences (Exposome scans) linked to ongoing Phenome scans on grandchildren; coupled with epigenome analysis, metastable epialleles and DNA methylation age. Interactions with genetic variants affecting responsiveness should help inform the broad hypothesis.
A Role for Science in Responding to Health Crises
Brothers, Reginald; Murata, Christina E.
2016-01-01
The Department of Homeland Security's (DHS) Science and Technology (S&T) Directorate plays a role in public health that extends beyond biodefense. These responsibilities were exercised as part of the 2014-16 Ebola outbreak, leading to productive and beneficial contributions to the international public health response and improved operations in the United States. However, we and others have identified numerous areas for improvement. Based on our successes and lessons learned, we propose a number of ways that DHS, the interagency, and academia can act now to ensure improved responses to future public health crises. These include pre-developing scientific capabilities to respond agnostically to threats, and disease-specific master question lists to organize and inform initial efforts. We are generating DHS-specific playbooks and tools for anticipating future needs and capturing requests from DHS components and our national and international partners, where efforts will also be used to refine and exercise communication and information-sharing practices. These experiences and improvement efforts have encouraged discussions on the role of science in developing government policy, specifically responding to public health crises. We propose specific considerations for both scientists and government decision makers to ensure that the best available science is incorporated into policy and operational decisions to facilitate highly effective responses to future health crises. PMID:27482881
Romig, Barbara D; Tucker, Ann W; Hewitt, Anne M; O'Sullivan Maillet, Julie
2017-01-01
There is limited information and consensus on the future of clinical education and the key factors impacting allied health (AH) clinical training. AH deans identified both opportunities and challenges impacting clinical education based on a proposed educational model. From July 2013 to March 2014, 61 deans whose institutions were 2013 members of the Association of Schools of Allied Health Professions (ASAHP) participated in a three-round Delphi survey. Agreement on the relative importance of and the ability to impact the key factors was analyzed. Impact was evaluated for three groups: individual, collective, and both individual and collective deans. AH deans' responses were summarized and refined; individual items were accepted or rerated until agreement was achieved or study conclusion. Based on the deans' ratings of importance and impact, 159 key factors within 13 clinical education categories emerged as important for the future of clinical education. Agreement was achieved on 107 opportunities and 52 challenges. The Delphi technique generated new information where little existed specific to AH deans' perspectives on AH clinical education. This research supports the Key Factors Impacting Allied Health Clinical Education conceptual model proposed earlier and provides a foundation for AH deans to evaluate opportunities and challenges impacting AH clinical education and to design action plans based on this research.
Cao, Xiancai; Madore, Kevin P; Wang, Dahua; Schacter, Daniel L
2018-09-01
Attachment theories and studies have shown that Internal Working Models (IWMs) can impact autobiographical memory and future-oriented information processing relevant to close relationships. According to the constructive episodic simulation hypothesis (CESH), both remembering the past and imagining the future rely on episodic memory. We hypothesised that one way IWMs may bridge past experiences and future adaptations is via episodic memory. The present study investigated the association between attachment and episodic specificity in attachment-relevant and attachment-irrelevant memory and imagination among young and older adults. We measured the attachment style of 37 young adults and 40 older adults, and then asked them to remember or imagine attachment-relevant and attachment-irrelevant events. Participants' narratives were coded for internal details (i.e., episodic) and external details (e.g., semantic, repetitions). The results showed that across age group, secure individuals generated more internal details and fewer external details in attachment-relevant tasks compared to attachment-irrelevant tasks; these differences were not observed in insecure individuals. These findings support the CESH and provide a new perspective to understand the function of IWMs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Augustine, C.; Bain, R.; Chapman, J.
2012-06-01
The Renewable Electricity Futures (RE Futures) Study investigated the challenges and impacts of achieving very high renewable electricity generation levels in the contiguous United States by 2050. The analysis focused on the sufficiency of the geographically diverse U.S. renewable resources to meet electricity demand over future decades, the hourly operational characteristics of the U.S. grid with high levels of variable wind and solar generation, and the potential implications of deploying high levels of renewables in the future. RE Futures focused on technical aspects of high penetration of renewable electricity; it did not focus on how to achieve such a future through policy or other measures. Given the inherent uncertainties involved with analyzing alternative long-term energy futures as well as the multiple pathways that might be taken to achieve higher levels of renewable electricity supply, RE Futures explored a range of scenarios to investigate and compare the impacts of renewable electricity penetration levels (30%-90%), future technology performance improvements, potential constraints to renewable electricity development, and future electricity demand growth assumptions. RE Futures was led by the National Renewable Energy Laboratory (NREL) and the Massachusetts Institute of Technology (MIT).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Augustine, Chad; Bain, Richard; Chapman, Jamie
2012-06-15
The Renewable Electricity Futures (RE Futures) Study investigated the challenges and impacts of achieving very high renewable electricity generation levels in the contiguous United States by 2050. The analysis focused on the sufficiency of the geographically diverse U.S. renewable resources to meet electricity demand over future decades, the hourly operational characteristics of the U.S. grid with high levels of variable wind and solar generation, and the potential implications of deploying high levels of renewables in the future. RE Futures focused on technical aspects of high penetration of renewable electricity; it did not focus on how to achieve such a future through policy or other measures. Given the inherent uncertainties involved with analyzing alternative long-term energy futures as well as the multiple pathways that might be taken to achieve higher levels of renewable electricity supply, RE Futures explored a range of scenarios to investigate and compare the impacts of renewable electricity penetration levels (30%–90%), future technology performance improvements, potential constraints to renewable electricity development, and future electricity demand growth assumptions. RE Futures was led by the National Renewable Energy Laboratory (NREL) and the Massachusetts Institute of Technology (MIT). Learn more at the RE Futures website: http://www.nrel.gov/analysis/re_futures/
The Joint Confidence Level Paradox: A History of Denial
NASA Technical Reports Server (NTRS)
Butts, Glenn; Linton, Kent
2009-01-01
This paper is intended to provide a reliable methodology for those tasked with generating price tags on construction of facilities (CoF) and research and development (R&D) activities in the NASA performance world. This document consists of a collection of cost-related engineering detail and project fulfillment information from early agency days to the present. Accurate historical detail is the first place to start when determining improved methodologies for future cost and schedule estimating. This paper contains a beneficial proposed cost estimating method for arriving at more reliable numbers for future submissions. When comparing current cost and schedule methods with earlier cost and schedule approaches, it became apparent that NASA's organizational performance paradigm has morphed. Mission fulfillment speed has slowed and cost calculating factors have increased in 21st Century space exploration.
Challenges ahead for mass spectrometry and proteomics applications in epigenetics.
Kessler, Benedikt M
2010-02-01
Inheritance of biological information to future generations depends on the replication of DNA and the Mendelian principle of distribution of genes. In addition, external and environmental factors can influence traits that can be propagated to offspring, but the molecular details of this are only beginning to be understood. The discoveries of DNA methylation and post-translational modifications on chromatin and histones provided entry points for regulating gene expression, an area now defined as epigenetics and epigenomics. Mass spectrometry turned out to be instrumental in uncovering molecular details involved in these processes. The central role of histone post-translational modifications in epigenetics related biological processes has revitalized mass spectrometry based investigations. In this special report, current approaches and future challenges that lay ahead due to the enormous complexity are discussed.
Past, present and future of spike sorting techniques.
Rey, Hernan Gonzalo; Pedreira, Carlos; Quian Quiroga, Rodrigo
2015-10-01
Spike sorting is a crucial step to extract information from extracellular recordings. With new recording opportunities provided by the development of new electrodes that allow monitoring hundreds of neurons simultaneously, the scenario for the new generation of algorithms is both exciting and challenging. However, this will require a new approach to the problem and the development of a common reference framework to quickly assess the performance of new algorithms. In this work, we review the basic concepts of spike sorting, including the requirements for different applications, together with the problems faced by presently available algorithms. We conclude by proposing a roadmap stressing the crucial points to be addressed to support the neuroscientific research of the near future. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
Clinical exome sequencing reports: current informatics practice and future opportunities.
Swaminathan, Rajeswari; Huang, Yungui; Astbury, Caroline; Fitzgerald-Butt, Sara; Miller, Katherine; Cole, Justin; Bartlett, Christopher; Lin, Simon
2017-11-01
The increased adoption of clinical whole exome sequencing (WES) has improved the diagnostic yield for patients with complex genetic conditions. However, the informatics practice for handling information contained in whole exome reports is still in its infancy, as evidenced by the lack of a common vocabulary within clinical sequencing reports generated across genetic laboratories. Genetic testing results are mostly transmitted using portable document format, which can make secondary analysis and data extraction challenging. This paper reviews a sample of clinical exome reports generated by Clinical Laboratory Improvement Amendments-certified genetic testing laboratories at tertiary-care facilities to assess and identify common data elements. Like structured radiology reports, which enable faster information retrieval and reuse, structuring genetic information within clinical WES reports would help facilitate integration of genetic information into electronic health records and enable retrospective research on the clinical utility of WES. We identify elements listed as mandatory according to practice guidelines but are currently missing from some of the clinical reports, which might help to organize the data when stored within structured databases. We also highlight elements, such as patient consent, that, although they do not appear within any of the current reports, may help in interpreting some of the information within the reports. Integrating genetic and clinical information would assist the adoption of personalized medicine for improved patient care and outcomes. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
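One way to picture the kind of structured representation the authors argue for is a simple schema over the common data elements. The field names below are hypothetical and only loosely follow HGVS/ACMG-style reporting; the paper identifies shared elements but does not prescribe this schema.

```python
# Hypothetical structured stand-in for common clinical WES report elements,
# including a consent flag of the kind highlighted as often missing.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ReportedVariant:
    gene: str
    hgvs_c: str                  # coding-level description, e.g. "c.1521_1523delCTT"
    hgvs_p: str                  # protein-level description
    zygosity: str                # "heterozygous" / "homozygous"
    classification: str          # e.g. "pathogenic", "likely benign", "VUS"
    inheritance: Optional[str] = None

@dataclass
class ExomeReport:
    lab_name: str
    indication: str
    consented_for_secondary_findings: bool
    variants: List[ReportedVariant] = field(default_factory=list)

report = ExomeReport(
    lab_name="Example Clinical Lab",           # hypothetical laboratory
    indication="global developmental delay",
    consented_for_secondary_findings=True,
    variants=[ReportedVariant("CFTR", "c.1521_1523delCTT", "p.Phe508del",
                              "heterozygous", "pathogenic", "autosomal recessive")],
)
print(len(report.variants), "reported variant(s)")
```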
Video Creation: A Tool for Engaging Students to Learn Science
NASA Astrophysics Data System (ADS)
Courtney, A. R.
2016-12-01
Students today process information very differently from those of previous generations. They are used to getting their news from 140-character tweets, being entertained by YouTube videos, and Googling everything. Thus, traditional passive methods of content delivery do not work well for many of these millennials. All students, regardless of career goals, need to become scientifically literate to be able to function in a world where scientific issues are of increasing importance. Those who have had experience applying scientific reasoning to real-world problems in the classroom will be better equipped to make informed decisions in the future. The problem to be solved is how to present scientific content in a manner that fosters student learning in today's world. This presentation will describe how the appeal of technology and social communication via creation of documentary-style videos has been used to engage students in learning scientific concepts in a university non-science major course focused on energy and the environment. These video projects place control of the learning experience into the hands of the learner and provide an opportunity to develop critical thinking skills. Students discover how to locate scientifically reliable information by limiting searches to respected sources and to synthesize the information through collaborative content creation to generate a "story". Video projects have a number of advantages over research paper writing. They allow students to develop collaboration skills and be creative in how they deliver the scientific content. Research projects are more effective when the audience is larger than just a teacher. Although our videos are used as peer-teaching tools in the classroom, they are also shown to a larger audience in a public forum to increase the challenge. Video will be the professional communication tool of the future. This presentation will cover the components of the video production process and instructional lessons learned over a seven-year period.
NASA Astrophysics Data System (ADS)
Smith, R.; Kasprzyk, J. R.; Balaji, R.
2017-12-01
In light of deeply uncertain factors like future climate change and population shifts, responsible resource management will require new types of information and strategies. For water utilities, this entails potential expansion and efficient management of water supply infrastructure systems for changes in overall supply; changes in frequency and severity of climate extremes such as droughts and floods; and variable demands, all while accounting for conflicting long and short term performance objectives. Multiobjective Evolutionary Algorithms (MOEAs) are emerging decision support tools that have been used by researchers and, more recently, water utilities to efficiently generate and evaluate thousands of planning portfolios. The tradeoffs between conflicting objectives are explored in an automated way to produce (often large) suites of portfolios that strike different balances of performance. Once generated, the sets of optimized portfolios are used to support relatively subjective assertions of priorities and human reasoning, leading to adoption of a plan. These large tradeoff sets contain information about complex relationships between decisions and between groups of decisions and performance that, until now, has not been quantitatively described. We present a novel use of Multivariate Regression Trees (MRTs) to analyze tradeoff sets to reveal these relationships and critical decisions. Additionally, when MRTs are applied to tradeoff sets developed for different realizations of an uncertain future, they can identify decisions that are robust across a wide range of conditions and produce fundamental insights about the system being optimized.
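The core analysis idea, fitting a multivariate regression tree with portfolio decisions as predictors and the conflicting objectives as responses and then reading off the dominant splits, can be sketched as below. The decision variables, objective relationships, and data are synthetic stand-ins for an MOEA-generated tradeoff set.

```python
# Toy stand-in for a tradeoff set: decisions are predictors, the two conflicting
# objectives are the multi-output responses, and the fitted tree's first split
# and feature importances point at the "critical decisions".

import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
n_portfolios = 500
feature_names = ["expand_reservoir", "conservation_level", "transfer_capacity_mgd"]

decisions = np.column_stack([
    rng.integers(0, 2, n_portfolios),      # build the new reservoir? (0/1)
    rng.random(n_portfolios),              # conservation level (0..1)
    rng.uniform(0, 50, n_portfolios),      # transfer capacity (MGD)
])

cost = (100 * decisions[:, 0] + 40 * decisions[:, 1] + 2 * decisions[:, 2]
        + rng.normal(0, 5, n_portfolios))
reliability = (0.5 * decisions[:, 0] + 0.3 * decisions[:, 1] + 0.004 * decisions[:, 2]
               + rng.normal(0, 0.05, n_portfolios))
objectives = np.column_stack([cost, reliability])

tree = DecisionTreeRegressor(max_depth=3).fit(decisions, objectives)
print("root split:", feature_names[tree.tree_.feature[0]],
      "<=", round(float(tree.tree_.threshold[0]), 2))
print("importances:", dict(zip(feature_names, tree.feature_importances_.round(2))))
```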
Space Weather Effects on Air Transportation
NASA Astrophysics Data System (ADS)
Jones, J. B. L.; Bentley, R. D.; Dyer, C.; Shaw, A.
In Europe, legislation requires the airline industry to monitor the occupational exposure of aircrew to cosmic radiation. However, there are other significant impacts of space weather phenomena on the technological systems used for day-to-day operations which need to be considered by the airlines. These were highlighted by the disruption caused to the industry by the period of significant solar activity in late October and early November 2003. Next generation aircraft will utilize increasingly complex avionics as well as expanded performance envelopes. These and future generation platforms will require the development of a new air-space management infrastructure with improved position accuracy (for route navigation and landing in bad weather) and reduced separation minima in order to cope with the expected growth in air travel. Similarly, greater reliance will be placed upon satellites for command, control, communication and information (C3I) of the operation. However, maximizing the effectiveness of this globally interoperable C3I and ensuring seamless fusion of all components for a safe operation will require a greater understanding of space weather effects, the risks they pose as technology dependence increases, and the inclusion of space weather information into the operation. This paper will review space weather effects on air transport and the increasing risks for future operations caused by them. We will examine how well the effects can be predicted, some of the tools that can be used and the practicalities of using such predictions in an operational scenario. Initial results from the SOARS ESA Space Weather Pilot Project will also be discussed.
Second harmonic generation imaging - a new method for unraveling molecular information of starch.
Zhuo, Zong-Yan; Liao, Chien-Sheng; Huang, Chen-Han; Yu, Jiun-Yann; Tzeng, Yu-Yi; Lo, Wen; Dong, Chen-Yuan; Chui, Hsiang-Chen; Huang, Yu-Chan; Lai, Hsi-Mei; Chu, Shi-Wei
2010-07-01
We present a new method, second harmonic generation (SHG) imaging, for the study of starch structure. SHG imaging can provide the structural organization and molecular orientation information of bio-tissues without centrosymmetry. In recent years, SHG has proven its capability in the study of crystallized bio-molecules such as collagen and myosin. Starch, the most important food source and a promising future energy candidate, has, for a decade, been shown to exhibit a strong SHG response. By comparing SHG intensity from different starch species, we first identified that the SHG-active molecule is amylopectin, which accounts for the crystallinity in starch granules. With the aid of SHG polarization anisotropy, we extracted the complete χ(2) tensor of amylopectin, which reflects the underlying molecular details. Through χ(2) tensor analysis, three-dimensional orientation and packing symmetry of amylopectin are determined. The helical angle of the double-helix in amylopectin is also deduced from the tensor, and the value corresponds well to previous X-ray studies, further verifying amylopectin as the SHG source. It is noteworthy that the nm-sized structure of amylopectin inside a starch granule can be determined by this far-field optical method with 1-μm excitation wavelength. Since SHG is a relatively new tool for plant research, a detailed understanding of SHG in starch structure will be useful for future high-resolution imaging and quantitative analyses for food/energy applications. Copyright © 2010 Elsevier Inc. All rights reserved.
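A heavily simplified sketch of polarization-resolved SHG analysis: fitting intensity versus incident polarization angle to a commonly used cylindrical-symmetry model (Kleinman symmetry assumed) to recover a χ(2) tensor-element ratio. This generic model and the synthetic data are illustrative; the paper's full tensor extraction is more involved.

```python
# Generic polarization-SHG fit, not the paper's actual analysis: assumes a
# cylindrically symmetric harmonophore with Kleinman symmetry so that
# I(theta) ~ A * ( sin(2*theta)^2 + (rho*cos(theta)^2 + sin(theta)^2)^2 ),
# where rho stands in for a chi_zzz/chi_zxx ratio.

import numpy as np
from scipy.optimize import curve_fit

def shg_model(theta, amplitude, rho):
    return amplitude * (np.sin(2 * theta) ** 2 +
                        (rho * np.cos(theta) ** 2 + np.sin(theta) ** 2) ** 2)

theta = np.linspace(0, np.pi, 37)
true_rho = 2.5   # hypothetical tensor-element ratio
data = shg_model(theta, 1.0, true_rho) + np.random.default_rng(2).normal(0, 0.02, theta.size)

popt, _ = curve_fit(shg_model, theta, data, p0=[1.0, 1.0])
print("fitted tensor-element ratio:", round(popt[1], 2))
```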
Jing, Helen G; Madore, Kevin P; Schacter, Daniel L
2017-12-01
A critical adaptive feature of future thinking involves the ability to generate alternative versions of possible future events. However, little is known about the nature of the processes that support this ability. Here we examined whether an episodic specificity induction - brief training in recollecting details of a recent experience that selectively impacts tasks that draw on episodic retrieval - (1) boosts alternative event generation and (2) changes one's initial perceptions of negative future events. In Experiment 1, an episodic specificity induction significantly increased the number of alternative positive outcomes that participants generated to a series of standardized negative events, compared with a control induction not focused on episodic specificity. We also observed larger decreases in the perceived plausibility and negativity of the original events in the specificity condition, where participants generated more alternative outcomes, relative to the control condition. In Experiment 2, we replicated and extended these findings using a series of personalized negative events. Our findings support the idea that episodic memory processes are involved in generating alternative outcomes to anticipated future events, and that boosting the number of alternative outcomes is related to subsequent changes in the perceived plausibility and valence of the original events, which may have implications for psychological well-being. Published by Elsevier B.V.
Heritage, Image and Identity: The Evolution of USAF Leadership
2011-02-16
The up-and-coming “Generation Z” (also known as the “Net or Digital Generation”) is the most connected and high-tech generation ever seen ... for future RPA warrior leaders. The USAF has already laid the groundwork to position “Generation Z” RPA pilots for future senior leadership.
76 FR 23198 - Segregation of Lands-Renewable Energy
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-26
... could be used to carry the power generated from a specific wind or solar energy ROW project, and the... included in a pending or future wind or solar energy generation right-of-way (ROW) application, or public lands identified by the BLM for a potential future wind or solar energy generation ROW authorization...
Gugger, Paul F; Liang, Christina T; Sork, Victoria L; Hodgskiss, Paul; Wright, Jessica W
2018-02-01
Identifying and quantifying the importance of environmental variables in structuring population genetic variation can help inform management decisions for conservation, restoration, or reforestation purposes, in both current and future environmental conditions. Landscape genomics offers a powerful approach for understanding the environmental factors that currently associate with genetic variation, and given those associations, where populations may be most vulnerable under future environmental change. Here, we applied genotyping by sequencing to generate over 11,000 single nucleotide polymorphisms from 311 trees and then used nonlinear, multivariate environmental association methods to examine spatial genetic structure and its association with environmental variation in an ecologically and economically important tree species endemic to Hawaii, Acacia koa. Admixture and principal components analyses showed that trees from different islands are genetically distinct in general, with the exception of some genotypes that match other islands, likely as the result of recent translocations. Gradient forest and generalized dissimilarity models both revealed a strong association between genetic structure and mean annual rainfall. Utilizing a model for projected future climate on the island of Hawaii, we show that predicted changes in rainfall patterns may result in genetic offset, such that trees may no longer be genetically matched to their environment. These findings indicate that knowledge of current and future rainfall gradients can provide valuable information for the conservation of existing populations and also help refine seed transfer guidelines for reforestation or replanting of koa throughout the state.
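The genetic-offset idea can be illustrated with a simplified stand-in (the study itself uses gradient forest and generalized dissimilarity modelling): model allele frequencies as a function of rainfall, then measure how far each site's predicted composition moves under a projected future rainfall value. All data below are synthetic.

```python
# Simplified stand-in for the genetic-offset concept; not the study's method.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
n_sites, n_snps = 40, 25

rainfall_now = rng.uniform(500, 4000, n_sites)    # mm/yr, hypothetical sites
# Synthetic allele frequencies that follow a rainfall cline plus noise.
allele_freq = (1 / (1 + np.exp(-(rainfall_now[:, None] - 2000) / 600))
               + rng.normal(0, 0.05, (n_sites, n_snps)))

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(rainfall_now.reshape(-1, 1), allele_freq)

rainfall_future = rainfall_now * rng.uniform(0.7, 1.1, n_sites)   # projected shift
offset = np.linalg.norm(model.predict(rainfall_future.reshape(-1, 1)) -
                        model.predict(rainfall_now.reshape(-1, 1)), axis=1)
print("sites with the largest predicted genetic offset:", np.argsort(offset)[-5:])
```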
Lunar Exploration and Science in ESA
NASA Astrophysics Data System (ADS)
Carpenter, J.; Houdou, B.; Fisackerly, R.; De Rosa, D.; Espinasse, S.; Hufenbach, B.
2013-09-01
Lunar exploration continues to be a priority for the European Space Agency (ESA) and is recognized as the next step for human exploration beyond low Earth orbit. The Moon is also recognized as an important scientific target providing vital information on the history of the inner solar system, the Earth and the emergence of life, and fundamental information on the formation and evolution of terrestrial planets. The Moon also provides a platform that can be utilized for fundamental science and to prepare the way for exploration deeper into space and towards a human Mars mission, the ultimate exploration goal. Lunar missions can also provide a means of preparing for a Mars sample return mission, which is an important long-term robotic milestone. ESA is preparing for future participation in lunar exploration through a combination of human and robotic activities, in cooperation with international partners. These include activities on the ISS and participation in the US-led Multi-Purpose Crew Vehicle, which is planned for a first unmanned lunar flight in 2017. Planned future activities also include participation in international robotic missions. These activities are performed with a view to generating the technologies, capabilities, knowledge and heritage that will make Europe an indispensable partner in the exploration missions of the future. We present ESA's plans for lunar exploration and the current status of activities. In particular we will show that this programme gives rise to unique scientific opportunities and prepares scientifically and technologically for future exploratory steps.
Three-Dimensional Displays In The Future Flight Station
NASA Astrophysics Data System (ADS)
Bridges, Alan L.
1984-10-01
This review paper summarizes the development and applications of computer techniques for the representation of three-dimensional data in the future flight station. It covers the development of the Lockheed-NASA Advanced Concepts Flight Station (ACFS) research simulators. These simulators contain: a Pilot's Desk Flight Station (PDFS) with five 13-inch diagonal, color, cathode ray tubes on the main instrument panel; a computer-generated day and night visual system; a six-degree-of-freedom motion base; and a computer complex. This paper reviews current research, development, and evaluation of easily modifiable display systems and software requirements for three-dimensional displays that may be developed for the PDFS. This includes the analysis and development of a 3-D representation of the entire flight profile. This 3-D flight path, or "Highway-in-the-Sky", will utilize motion and perspective cues to tightly couple the human responses of the pilot to the aircraft control systems. The use of custom logic, e.g., graphics engines, may provide the processing power and architecture required for 3-D computer-generated imagery (CGI) or visual scene simulation (VSS). Diffraction or holographic head-up displays (HUDs) will also be integrated into the ACFS simulator to permit research on the requirements and use of these "out-the-window" projection systems. Future research may include the retrieval of high-resolution, perspective view terrain maps which could then be overlaid with current weather information or other selectable cultural features.
The mutual causality analysis between the stock and futures markets
NASA Astrophysics Data System (ADS)
Yao, Can-Zhong; Lin, Qing-Wen
2017-07-01
In this paper we employ the conditional Granger causality model to estimate the information flow, and find that the improved model outperforms the Granger causality model in revealing the asymmetric correlation between stocks and futures in the Chinese market. First, we find that information flows estimated by Granger causality tests from futures to stocks are greater than those from stocks to futures. Additionally, average correlation coefficients capture some important characteristics between stock prices and information flows over time. Further, we find that direct information flows estimated by conditional Granger causality tests from stocks to futures are greater than those from futures to stocks. In addition, the substantial increases of information flows and direct information flows exhibit a certain degree of synchronism with the occurrences of important events. Finally, the comparative analysis with the asymmetric ratio and the bootstrap technique demonstrates the slight asymmetry of information flows and the significant asymmetry of direct information flows. It reveals that the information flows from futures to stocks are slightly greater than those in the reverse direction, while the direct information flows from stocks to futures are significantly greater than those in the reverse direction.
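As a rough illustration of the pairwise Granger-causality step underlying the analysis above, the following sketch runs statsmodels' Granger test in both directions on synthetic return series; the data, lag order, and variable names are invented, and the paper's conditional (multivariate) Granger causality model is not reproduced here.

```python
# Minimal sketch: pairwise Granger tests in both directions on synthetic
# stock/futures return series. Data, lag order, and variable names are
# invented; the conditional Granger causality model is not shown.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
n = 500
futures = rng.normal(size=n)
# Stock returns partly driven by the previous day's futures return.
stock = 0.4 * np.roll(futures, 1) + rng.normal(scale=0.8, size=n)
stock[0] = rng.normal()  # discard the wrapped-around first value
data = pd.DataFrame({"stock": stock, "futures": futures})

# grangercausalitytests asks whether the second column helps predict the first.
f_to_s = grangercausalitytests(data[["stock", "futures"]], maxlag=2)
s_to_f = grangercausalitytests(data[["futures", "stock"]], maxlag=2)
print("p(futures -> stock), lag 1:", f_to_s[1][0]["ssr_ftest"][1])
print("p(stock -> futures), lag 1:", s_to_f[1][0]["ssr_ftest"][1])
```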
Carolan, Kate; Verran, Joanna; Crossley, Matthew; Redfern, James; Whitton, Nicola; Amos, Martyn
2018-01-01
Current immunisation levels in England currently fall slightly below the threshold recommended by the World Health Organization, and the three-year trend for vaccination uptake is downwards. Attitudes towards vaccination can affect future decisions on whether or not to vaccinate, and this can have significant public health implications. Interventions can impact future vaccination decisions, and these interventions can take several forms. Relatively little work has been reported on the use of vaccination interventions in young people, who form the next generation of individuals likely to make vaccination decisions. We investigated the impact of two different types of educational intervention on attitudes towards vaccination in young people in England. A cohort of young people (n = 63) was recruited via a local school. This group was divided into three sub-groups; one (n = 21) received a presentation-based intervention, one (n = 26) received an interactive simulation-based intervention, and the third (n = 16) received no intervention. Participants supplied information on (1) their attitudes towards vaccination, and (2) their information needs and views on personal choice concerning vaccination, at three time points: immediately before and after the intervention, and after six months. Neither intervention had a significant effect on participants' attitudes towards vaccination. However, the group receiving the presentation-based intervention saw a sustained uplift in confidence about information needs, which was not observed in the simulation-based intervention group. Our findings with young people are consistent with previous work on vaccination interventions aimed at adults, which have shown limited effectiveness, and which can actually reduce intention to vaccinate. Our findings on the most effective mode of delivery for the intervention should inform future discussion in the growing "games for health" domain, which proposes the use of interactive digital resources in healthcare education.
Looking forward: the effects of photographs on the qualities of future thinking.
Bays, Rebecca B; Wellen, Brianna C M; Greenberg, Katherine S
2018-04-01
Future episodic thinking relies on the reconstruction of remembered experiences. Photographs provide one means of remembering, acting as a "cognitive springboard" for generating related memory qualities. We wondered whether photographs would also invite embellishment of future thought qualities, particularly in the presence (or absence) of associated memories. In two studies participants generated future events in familiar (associated memories) and novel (no associated memories) locations. Half of the participants viewed scene location photographs during event generation. All participants then imagined the events for one minute and completed a self-report measure of content qualities. Results of the current set of studies suggested that for novel locations, no differences in qualities emerged; however, for familiar locations, photographs did not enhance qualities and, in some cases, actually constrained perceptual (Experiments 1 and 2) and sensory (Experiment 1) detail ratings of future thoughts. Thus, photographs did not invite embellishment of future thought details.
Episodic Future Thinking in Generalized Anxiety Disorder
Wu, Jade Q.; Szpunar, Karl K.; Godovich, Sheina A.; Schacter, Daniel L.; Hofmann, Stefan G.
2015-01-01
Research on future-oriented cognition in generalized anxiety disorder (GAD) has primarily focused on worry, while less is known about the role of episodic future thinking (EFT), an imagery-based cognitive process. To characterize EFT in this disorder, we used the experimental recombination procedure, in which 21 GAD and 19 healthy participants simulated positive, neutral and negative novel future events either once or repeatedly, and rated their phenomenological experience of EFT. Results showed that healthy controls spontaneously generated more detailed EFT over repeated simulations. Both groups found EFT easier to generate after repeated simulations, except when GAD participants simulated positive events. They also perceived higher plausibility of negative—not positive or neutral—future events than did controls. These results demonstrate a negativity bias in GAD individuals’ episodic future cognition, and suggest their relative deficit in generating vivid EFT. We discuss implications for the theory and treatment of GAD. PMID:26398003
Mazloomdoost, Danesh; Mehregan, Shervineh; Mahmoudi, Hilda; Soltani, Akbar; Embi, Peter J.
2007-01-01
Studies performed in the US and other Western countries have documented that physicians generate many clinical questions during a typical day and rely on various information sources for answers. Little is known about the information seeking behaviors of physicians practicing in other countries, particularly those with limited Internet connectivity. We conducted this study to document the perceived barriers to information resources used by medical residents in Iran. Our findings reveal that different perceived barriers exist for electronic versus paper-based resources. Notably, paper-based resources are perceived to be limited by residents' time constraints and the availability of resources, whereas electronic resources are limited by the cost of decentralized resources (such as PDAs) and the accessibility of centralized, Internet-based resources. These findings add to the limited literature regarding health information-seeking activities in international healthcare settings, particularly those with limited Internet connectivity, and will supplement future studies of and interventions in such settings. PMID:18693891
Spectral simplicity of apparent complexity. II. Exact complexities and complexity spectra
NASA Astrophysics Data System (ADS)
Riechers, Paul M.; Crutchfield, James P.
2018-03-01
The meromorphic functional calculus developed in Part I overcomes the nondiagonalizability of linear operators that arises often in the temporal evolution of complex systems and is generic to the metadynamics of predicting their behavior. Using the resulting spectral decomposition, we derive closed-form expressions for correlation functions, finite-length Shannon entropy-rate approximates, asymptotic entropy rate, excess entropy, transient information, transient and asymptotic state uncertainties, and synchronization information of stochastic processes generated by finite-state hidden Markov models. This introduces analytical tractability to investigating information processing in discrete-event stochastic processes, symbolic dynamics, and chaotic dynamical systems. Comparisons reveal mathematical similarities between complexity measures originally thought to capture distinct informational and computational properties. We also introduce a new kind of spectral analysis via coronal spectrograms and the frequency-dependent spectra of past-future mutual information. We analyze a number of examples to illustrate the methods, emphasizing processes with multivariate dependencies beyond pairwise correlation. This includes spectral decomposition calculations for one representative example in full detail.
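The closed-form spectral expressions above are not reproduced here, but the following sketch illustrates one of the quantities they compute, the finite-length entropy-rate approximates h(L) = H(L) - H(L-1), by brute-force block-entropy estimation on a simple two-state Markov ("Golden Mean"-like) process; the process, sample size, and block lengths are illustrative assumptions, not the paper's method.

```python
# Brute-force illustration of finite-length entropy-rate approximates
# h(L) = H(L) - H(L-1) for a simple two-state Markov process. This is a
# numerical stand-in, not the closed-form spectral method of the paper.
import numpy as np
from collections import Counter

rng = np.random.default_rng(1)
# Transition matrix over symbols {0, 1}: no two 1s in a row ("Golden Mean").
T = np.array([[0.5, 0.5],
              [1.0, 0.0]])

def simulate(n, T):
    s, out = 0, []
    for _ in range(n):
        s = int(rng.choice(2, p=T[s]))
        out.append(s)
    return out

def block_entropy(seq, L):
    """Shannon entropy (bits) of length-L blocks, estimated from counts."""
    blocks = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    total = sum(blocks.values())
    p = np.array([c / total for c in blocks.values()])
    return float(-(p * np.log2(p)).sum())

seq = simulate(100_000, T)
H = [0.0] + [block_entropy(seq, L) for L in range(1, 9)]
h = [H[L] - H[L - 1] for L in range(1, 9)]
print("h(L) estimates:", np.round(h, 4))  # should approach ~0.667 bits/symbol
```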
UV Remote Sensing Data Products - Turning Data Into Knowledge
NASA Astrophysics Data System (ADS)
Weiss, M.; Paxton, L.; Schaefer, R. K.; Comberiate, J.; Hsieh, S. W.; Romeo, G.; Wolven, B. C.; Zhang, Y.
2013-12-01
The DMSP/SSUSI instruments have been taking UV images of the upper atmosphere for more than a decade. Each of the SSUSI instruments takes complete global UV images on a daily basis. Although this scientific data is very valuable, it is not actionable information. Perhaps the simplest use of SSUSI data is the assimilation of radiances into the GAIM ionospheric forecast model; even then, the data must be massaged to get it into a GAIM-ingestable form. We describe a development effort funded by the DMSP program and the Air Force Weather Agency to turn the raw data into actionable information in the form of SSUSI environmental data parameters and other derived information. We will describe current nowcasts, forecasts, and other related actionable information (e.g. auroral oval forecasts) that is currently generated by the SSUSI ground processing system for AFWA, and also concepts we have for future tools (e.g., geomagnetic storm alerts, scintillation forecasts, HF radio propagation information, auroral radar clutter) to turn more of the SSUSI dataset into actionable knowledge.
Hassan, Lamiece; Swarbrick, Caroline; Sanders, Caroline; Parker, Angela; Machin, Matt; Tully, Mary P; Ainsworth, John
2017-01-01
There are a growing number of mobile phones, watches and electronic devices which can be worn on the body to track aspects of health and well-being, such as daily steps, sleep and exercise. Dementia researchers think that these devices could potentially be used as part of future research projects, for example to help spot changes in daily activity that may signal the early symptoms of dementia. We asked a range of older people, including people living with dementia and their carers, to participate in interactive discussions about how future participants might find using these devices as part of research projects. We also invited volunteers to borrow a range of devices to test at home, giving them further insights. Discussions revealed that people were generally supportive of this type of research, provided they gave informed consent and that devices were discreet, comfortable and easy to use. They also valued technical support and regular feedback on study progress to encourage ongoing participation. These findings were used to develop a pool of devices for researchers, with computer software and written guidance to help plan, design and support studies. Our work shows that when given the right opportunities, people who are affected by dementia can provide valuable insights that can enhance the design, delivery and quality of future research. Background Increasingly, researchers are recognising the potential for connected health devices, including smartphones and smartwatches, to generate high resolution data about patterns of daily activity and health outcomes. One aim of the Dementias Platform UK (DPUK) project is to provide researchers with a secure means to collect, collate and link data generated by such devices, thereby accelerating this type of research in the field of dementia. We aimed to involve members of the public in discussions about the acceptability and feasibility of different devices and research designs to inform the development of a device pool, software platform and written guidance to support future studies. Methods Over 30 people attended a series of interactive workshops, drop-in sessions and meetings in Greater Manchester. This included people living with dementia and cognitive impairments, carers and people without memory problems. Discussions were tailored to suit different audiences and focused on the feasibility and acceptability of a range of different wearable devices and research designs. We also invited volunteers to borrow a device to test at home, enabling further insights from hands-on interactions with devices. Results Discussions revealed that people were supportive of connected health dementia research in principle, provided they gave informed consent and that devices were discreet, comfortable and easy to use. Moreover, they recommended technical support and regular feedback on study progress to encourage ongoing participation. Conclusion By using a range of discussion-based and practical activities, we found it was feasible to involve people affected by dementia and use their insights to shape the development of a software platform and device pool to support future connected health dementia research. We recommend that researchers planning such studies in future pay adequate attention to designing suitable participant information, technical support and mechanisms of providing study progress updates to support sustained engagement from participants.
NASA Astrophysics Data System (ADS)
Yokomizu, Yasunobu
Dispersed generation systems, such as micro gas-turbines and fuel cells, have been installed at some commercial facilities. Smaller dispersed generators, such as solar photovoltaics, have also been installed on a number of individual homes. These trends in the introduction of dispersed generation seem likely to continue, leaving future power systems with an enormous number of dispersed generation units. The present report discusses near-future power distribution systems.
Vaughan, Catherine; Dessai, Suraje
2014-01-01
Climate services involve the generation, provision, and contextualization of information and knowledge derived from climate research for decision making at all levels of society. These services are mainly targeted at informing adaptation to climate variability and change, widely recognized as an important challenge for sustainable development. This paper reviews the development of climate services, beginning with a historical overview, a short summary of improvements in climate information, and a description of the recent surge of interest in climate service development including, for example, the Global Framework for Climate Services, implemented by the World Meteorological Organization in October 2012. It also reviews institutional arrangements of selected emerging climate services across local, national, regional, and international scales. By synthesizing existing literature, the paper proposes four design elements of a climate services evaluation framework. These design elements include: problem identification and the decision-making context; the characteristics, tailoring, and dissemination of the climate information; the governance and structure of the service, including the process by which it is developed; and the socioeconomic value of the service. The design elements are intended to serve as a guide to organize future work regarding the evaluation of when and whether climate services are more or less successful. The paper concludes by identifying future research questions regarding the institutional arrangements that support climate services and nascent efforts to evaluate them. PMID:25798197
Bridging the gap between data, publications, and images
NASA Astrophysics Data System (ADS)
Ritchey, N. A.; Collins, D.; Sprain, M.
2017-12-01
NOAA's National Centers for Environmental Information (NCEI) manages the most comprehensive, accessible, and trusted source of environmental data and information in the US. It archives data from the depths of the ocean to the surface of the sun and from million-year-old sediment records to near real-time satellite observations. NCEI has a wealth of knowledge and experience in long-term data preservation with the goal of supporting today's scientists as well as future generations. In order to reduce fragmentation of data, publications, images, and documentation, and to improve preservation, curation, and stewardship of data, NCEI continues to partner with the NOAA Central Library (NCL). NCEI and NCL have long-established linkages between data metadata, published reports, and data or archival information packages (AIP). We also have analog AIPs that are stored and maintained in the NCL collection and discoverable in both NCEI and NCL collections via the AIP identifier. We are currently working with NCL to establish a workflow for submitting reports to their Institutional Repository and linking the data and report via digital object identifiers. We hope to establish linkages between images of physical samples and the NCL Photo Collection management infrastructure in the future. This presentation will detail how NCEI engages with the NCL in order to fully integrate documentation, images, publications, and data in preservation practices and improve the discovery and usability of NOAA's billion dollar investment in environmental data and information.
NASA Astrophysics Data System (ADS)
Anderson, O. Roger
The rate of information processing during science learning and the efficiency of the learner in mobilizing relevant information in long-term memory as an aid in transmitting newly acquired information to stable storage in long-term memory are fundamental aspects of science content acquisition. These cognitive processes, moreover, may be substantially related in tempo and quality of organization to the efficiency of higher thought processes such as divergent thinking and problem-solving ability that characterize scientific thought. As a contribution to our quantitative understanding of these fundamental information processes, a mathematical model of information acquisition is presented and empirically evaluated in comparison to evidence obtained from experimental studies of science content acquisition. Computer-based models are used to simulate variations in learning parameters and to generate the theoretical predictions to be empirically tested. The initial tests of the predictive accuracy of the model show close agreement between predicted and actual mean recall scores in short-term learning tasks. Implications of the model for human information acquisition and possible future research are discussed in the context of the unique theoretical framework of the model.
NASA Astrophysics Data System (ADS)
Calla, O. P. N.; Mathur, Shubhra; Gadri, Kishan Lal; Jangid, Monika
2016-12-01
In the present paper, permittivity maps of the equatorial lunar surface are generated using brightness temperature (TB) data obtained from the Microwave Radiometer (MRM) of Chang'e-1 and physical temperature (TP) data obtained from Diviner on the Lunar Reconnaissance Orbiter (LRO). Permittivity mapping is not carried out above 60° latitude towards the lunar poles because of large anomalies in the physical temperatures obtained from Diviner. The microwave frequencies used to generate these maps are 3 GHz, 7.8 GHz, 19.35 GHz and 37 GHz, and permittivity values are simulated from the TB values at these four frequencies. A weighted average of the physical temperatures obtained from Diviner is used to compute permittivity at each microwave frequency. Longer microwave wavelengths carry information from deeper layers of the lunar surface than shorter wavelengths. First, microwave emissivity is estimated from the TB values from MRM and the physical temperature (TP) from Diviner; the real part of the permittivity (ε) is then calculated from the estimated emissivity using the Fresnel equations, and the permittivity maps of the equatorial lunar surface are generated. The simulated permittivity values are normalized with respect to density for easy comparison with the permittivity values of Apollo samples as well as with those of the Terrestrial Analogue of Lunar Soil (TALS) JSC-1A. A lower value of the dielectric constant (ε′) indicates that the corresponding lunar surface is smooth and free of rough rocky terrain, so a future lunar astronaut could use these data to select suitable landing sites for future lunar missions. The results of this paper will serve as input to future exploration of the lunar surface.
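As a hedged illustration of the emissivity-to-permittivity step described above, the sketch below inverts the normal-incidence Fresnel reflectivity relation; the function name and numerical values are invented, and the actual retrieval additionally accounts for viewing geometry, subsurface layering, and losses.

```python
# Hedged sketch: estimate the real part of the dielectric constant from
# microwave brightness temperature (TB) and physical temperature (TP),
# using the normal-incidence Fresnel relation. Values are illustrative only.
import numpy as np

def permittivity_from_tb(tb_kelvin, tp_kelvin):
    emissivity = tb_kelvin / tp_kelvin          # e = TB / TP
    reflectivity = 1.0 - emissivity             # r = 1 - e
    root_r = np.sqrt(np.clip(reflectivity, 0.0, 0.999))
    sqrt_eps = (1.0 + root_r) / (1.0 - root_r)  # invert r = ((sqrt(eps)-1)/(sqrt(eps)+1))^2
    return sqrt_eps ** 2

# Illustrative numbers only (not Chang'e-1/Diviner values).
print(permittivity_from_tb(tb_kelvin=230.0, tp_kelvin=250.0))  # ~3.2
```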
Future projection of design storms using a GCM-informed weather generator
NASA Astrophysics Data System (ADS)
Kim, T. W.; Wi, S.; Valdés-Pineda, R.; Valdés, J. B.
2017-12-01
The rainfall Intensity-Duration-Frequency (IDF) curves are one of the most common tools used to provide planners with a description of the frequency of extreme rainfall events of various intensities and durations. Deriving appropriate IDF estimates is therefore important to avoid malfunctions of water structures that could cause extensive damage. Evaluating IDF estimates in the context of climate change has become more important because projections from climate models suggest that the frequency of intense rainfall events will increase in the future due to the increase in greenhouse gas emissions. In this study, the Bartlett-Lewis (BL) stochastic rainfall model is employed to generate annual maximum series of various sub-daily durations for test basins of the Model Parameter Estimation Experiment (MOPEX) project, and to derive the IDF curves in the context of climate changes projected by the North American Regional Climate Change Assessment Program (NARCCAP) models. Our results show that the observed annual maximum rainfall series are reasonably reproduced by the synthetic annual maximum series generated by the BL model. The observed data are perturbed by change factors to incorporate the NARCCAP climate change scenarios into the IDF estimates. The future IDF curves show a significant difference from the historical IDF curves calculated for the period 1968-2000, and overall, the projected IDF curves show an increasing trend over time. The impacts of changes in extreme rainfall on the hydrologic response of the MOPEX basins are also explored. Acknowledgement: This research was supported by a grant [MPSS-NH-2015-79] through the Disaster and Safety Management Institute funded by the Ministry of Public Safety and Security of the Korean government.
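The Bartlett-Lewis generator itself is not reproduced here, but the following sketch illustrates the generic downstream step of fitting a GEV distribution to annual maxima and perturbing the record by an assumed change factor before refitting; the synthetic data and the factor of 1.15 are purely illustrative assumptions.

```python
# Sketch of one generic step behind IDF updating: fit a GEV distribution to
# annual maximum rainfall and apply an assumed climate "change factor".
# The Bartlett-Lewis rainfall generator is not reproduced; data are synthetic.
from scipy.stats import genextreme

# Synthetic 1-hour annual maxima (mm) standing in for a 33-year record.
hist_annual_max = genextreme.rvs(c=-0.1, loc=25, scale=8, size=33, random_state=2)

# Historical 100-year return-level estimate from the fitted GEV.
c, loc, scale = genextreme.fit(hist_annual_max)
i100_hist = genextreme.ppf(1 - 1 / 100, c, loc=loc, scale=scale)

# Perturb the record by an assumed change factor to mimic a climate-informed scenario.
change_factor = 1.15
c_f, loc_f, scale_f = genextreme.fit(hist_annual_max * change_factor)
i100_future = genextreme.ppf(1 - 1 / 100, c_f, loc=loc_f, scale=scale_f)
print(round(float(i100_hist), 1), round(float(i100_future), 1))
```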
U.S. Geological Survey Studies of Energy Resources in Sub-Saharan Africa
1997-01-01
The U.S. Government and the American public need access to information on energy resources in sub-Saharan Africa. Sub-Saharan Africa (mostly Nigeria) produces 5 percent of the world's oil, while supplying the United States with 15 percent of our imports (Energy Information Administration). In the next 10 years, sub-Saharan oil and gas will become increasingly important to the export market. New discoveries in offshore provinces of West Africa ensure a bright future for the region. Projections indicate that increased oil production in sub-Saharan Africa will far outpace the growth of intraregional consumption, providing greater quantities of oil for export (Forman, 1996). Also, West Africa, although a marginal supplier of liquefied natural gas (LNG) today, will become an important LNG source to the international market by the year 2000 (Oil & Gas Journal, 1996). The United States needs up-to-date information about petroleum resources and the energy balance within the region to predict the future role of sub-Saharan Africa as a major oil and gas exporter. The data required to generate the needed information are often scattered across the archives of oil companies and African geological surveys, or in obscure publications. For these reasons, the U.S. Geological Survey is collecting data on sub-Saharan energy and constructing a regional energy bibliography. The team of geoscientists will ensure that this information is available quickly and from a scientifically based, objective viewpoint.
A paternal environmental legacy: evidence for epigenetic inheritance through the male germ line.
Soubry, Adelheid; Hoyo, Cathrine; Jirtle, Randy L; Murphy, Susan K
2014-04-01
Literature on maternal exposures and the risk of epigenetic changes or diseases in the offspring is growing. Paternal contributions are often not considered. However, some animal and epidemiologic studies on various contaminants, nutrition, and lifestyle-related conditions suggest a paternal influence on the offspring's future health. The phenotypic outcomes may have been attributed to DNA damage or mutations, but increasing evidence shows that the inheritance of environmentally induced functional changes of the genome, and related disorders, are (also) driven by epigenetic components. In this essay we suggest the existence of epigenetic windows of susceptibility to environmental insults during sperm development. Changes in DNA methylation, histone modification, and non-coding RNAs are viable mechanistic candidates for a non-genetic transfer of paternal environmental information, from maturing germ cell to zygote. Inclusion of paternal factors in future research will ultimately improve the understanding of transgenerational epigenetic plasticity and health-related effects in future generations. © 2014 The Authors. Bioessays published by WILEY Periodicals, Inc.
Stricker, Thomas; Catenacci, Daniel V T; Seiwert, Tanguy Y
2011-04-01
Cancers arise as a result of an accumulation of genetic aberrations that are either acquired or inborn. Virtually every cancer has its unique set of molecular changes. Technologies have been developed to study cancers and derive molecular characteristics that increasingly have implications for clinical care. Indeed, the identification of key genetic aberrations (molecular drivers) may ultimately translate into dramatic benefit for patients through the development of highly targeted therapies. With the increasing availability of newer, more powerful, and cheaper technologies such as multiplex mutational screening, next generation sequencing, array-based approaches that can determine gene copy numbers, methylation, expression, and others, as well as more sophisticated interpretation of high-throughput molecular information using bioinformatics tools like signatures and predictive algorithms, cancers will routinely be characterized in the near future. This review examines the background information and technologies that clinicians and physician-scientists will need to interpret in order to develop better, personalized treatment strategies. Copyright © 2011 Elsevier Inc. All rights reserved.
Prediction markets and their potential role in biomedical research--a review.
Pfeiffer, Thomas; Almenberg, Johan
2010-01-01
Prediction markets are marketplaces for trading contracts with payoffs that depend on the outcome of future events. Popular examples are markets on the outcome of presidential elections, where contracts pay $1 if a specific candidate wins the election and $0 if someone else wins. Contract prices on prediction markets can be interpreted as forecasts regarding the outcome of future events. Further attractive properties include the potential to aggregate private information, to generate and disseminate a consensus among the market participants, and to offer incentives for the acquisition of information. It has been argued that these properties might be valuable in the context of scientific research. In this review, we give an overview of key properties of prediction markets and discuss potential benefits for science. To illustrate these benefits for biomedical research, we discuss an example application in the context of decision making in research on the genetics of diseases. Moreover, some potential practical problems of prediction market application in science are discussed, and solutions are outlined. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
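The review above is mechanism-agnostic, but as a concrete illustration of how market prices can be read as probability forecasts, the sketch below implements Hanson's logarithmic market scoring rule (LMSR), one common automated market-maker mechanism; the liquidity parameter, trade size, and example question are arbitrary assumptions.

```python
# Illustration of prediction-market prices as probability forecasts, using
# Hanson's logarithmic market scoring rule (LMSR) as one common market maker.
import math

def lmsr_cost(q, b=100.0):
    """Cost function C(q) = b * ln(sum_i exp(q_i / b))."""
    return b * math.log(sum(math.exp(qi / b) for qi in q))

def lmsr_prices(q, b=100.0):
    """Instantaneous prices = current implied outcome probabilities."""
    z = sum(math.exp(qi / b) for qi in q)
    return [math.exp(qi / b) / z for qi in q]

# Two-outcome market, e.g. "candidate gene X is associated with disease Y".
q = [0.0, 0.0]              # shares outstanding for (yes, no)
print(lmsr_prices(q))       # [0.5, 0.5] before any trades

trade = 40.0                # a trader buys 40 "yes" shares
cost = lmsr_cost([q[0] + trade, q[1]]) - lmsr_cost(q)
q[0] += trade
print(round(cost, 2), [round(p, 3) for p in lmsr_prices(q)])  # "yes" price rises
```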
OryzaGenome: Genome Diversity Database of Wild Oryza Species.
Ohyanagi, Hajime; Ebata, Toshinobu; Huang, Xuehui; Gong, Hao; Fujita, Masahiro; Mochizuki, Takako; Toyoda, Atsushi; Fujiyama, Asao; Kaminuma, Eli; Nakamura, Yasukazu; Feng, Qi; Wang, Zi-Xuan; Han, Bin; Kurata, Nori
2016-01-01
The species in the genus Oryza, encompassing nine genome types and 23 species, are a rich genetic resource and may have applications in deeper genomic analyses aiming to understand the evolution of plant genomes. With the advancement of next-generation sequencing (NGS) technology, a flood of Oryza species reference genomes and genomic variation information has become available in recent years. This genomic information, combined with the comprehensive phenotypic information that we are accumulating in our Oryzabase, can serve as an excellent genotype-phenotype association resource for analyzing rice functional and structural evolution, and the associated diversity of the Oryza genus. Here we integrate our previous and future phenotypic/habitat information and newly determined genotype information into a united repository, named OryzaGenome, providing the variant information with hyperlinks to Oryzabase. The current version of OryzaGenome includes genotype information of 446 O. rufipogon accessions derived by imputation and of 17 accessions derived by imputation-free deep sequencing. Two variant viewers are implemented: SNP Viewer as a conventional genome browser interface and Variant Table as a text-based browser for precise inspection of each variant one by one. Portable VCF (variant call format) file or tab-delimited file download is also available. Following these SNP (single nucleotide polymorphism) data, reference pseudomolecules/scaffolds/contigs and genome-wide variation information for almost all of the closely and distantly related wild Oryza species from the NIG Wild Rice Collection will be available in future releases. All of the resources can be accessed through http://viewer.shigen.info/oryzagenome/. © The Author 2015. Published by Oxford University Press on behalf of Japanese Society of Plant Physiologists.
NASA Astrophysics Data System (ADS)
Wang, Xubo; Li, Qi; Yu, Hong; Kong, Lingfeng
2016-12-01
Four successive mass selection lines of the Pacific oyster, Crassostrea gigas, selected for faster growth in breeding programs in China were examined at ten polymorphic microsatellite loci to assess the level of allelic diversity and estimate the effective population size. These data were compared with those of their base population. The results showed that the genetic variation of the four generations was maintained at high levels, with an average allelic richness of 18.8-20.6 and a mean expected heterozygosity of 0.902-0.921, not reduced compared with the base population. Effective population sizes estimated from temporal variances in microsatellite frequencies were smaller than sex ratio-corrected broodstock count estimates. Using a relatively large number of broodstock and keeping an equal sex ratio in the broodstock each generation may have contributed to retaining the original genetic diversity and maintaining a relatively large effective population size. The results obtained in this study showed that the genetic variation was not greatly affected by the mass selection process and that high genetic variation still existed in the mass selection lines, suggesting that there is still potential for increasing the gains in future generations of C. gigas. The present study provided important information for future genetic improvement by selective breeding, and for the design of suitable management guidelines for genetic breeding of C. gigas.
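As a hedged sketch of the temporal method mentioned above, the code below computes Nei and Tajima's Fc statistic and the Waples (1989) sample-size-corrected Ne estimator; the allele frequencies and sample sizes are invented, and the study's exact estimator may differ.

```python
# Hedged sketch of a temporal-method estimate of effective population size
# (Ne) from allele-frequency change between generations, using Nei & Tajima's
# Fc statistic with Waples' sample-size correction. Frequencies are invented.
import numpy as np

def fc_statistic(p0, pt):
    """Nei & Tajima's standardized variance of allele-frequency change."""
    p0, pt = np.asarray(p0, float), np.asarray(pt, float)
    return np.mean((p0 - pt) ** 2 / ((p0 + pt) / 2 - p0 * pt))

def temporal_ne(p0, pt, s0, st, t):
    """Waples (1989) estimator: Ne = t / (2*(Fc - 1/(2*s0) - 1/(2*st)))."""
    fc = fc_statistic(p0, pt)
    return t / (2 * (fc - 1 / (2 * s0) - 1 / (2 * st)))

# Invented allele frequencies at five loci, base population vs. generation 4.
p_base = [0.62, 0.35, 0.48, 0.71, 0.55]
p_gen4 = [0.50, 0.46, 0.40, 0.62, 0.66]
print(round(temporal_ne(p_base, p_gen4, s0=60, st=60, t=4), 1))  # ~74 here
```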
Modeling a space-based quantum link that includes an adaptive optics system
NASA Astrophysics Data System (ADS)
Duchane, Alexander W.; Hodson, Douglas D.; Mailloux, Logan O.
2017-10-01
Quantum Key Distribution uses optical pulses to generate shared random bit strings between two locations. If a high percentage of the optical pulses consist of single photons, then the statistical nature of light and information theory can be used to generate secure shared random bit strings which can then be converted to keys for encryption systems. When these keys are incorporated along with symmetric encryption techniques such as a one-time pad, then this method of key generation and encryption is resistant to future advances in quantum computing which will significantly degrade the effectiveness of current asymmetric key sharing techniques. This research first reviews the transition of Quantum Key Distribution free-space experiments from the laboratory environment to field experiments and, finally, to ongoing space experiments. Next, a propagation model for an optical pulse from low-earth orbit to ground and the effects of turbulence on the transmitted optical pulse are described. An Adaptive Optics system is modeled to correct for the aberrations caused by the atmosphere. The long-term point spread function of the completed low-earth orbit to ground optical system is explored in the results section. Finally, the impact of this optical system and its point spread function on an overall quantum key distribution system as well as the future work necessary to show this impact is described.
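The quantum and optical layers are outside the scope of a short example, but the sketch below shows the classical endpoint referred to above: using a shared random bit string as a one-time pad. The key here is generated locally as a stand-in for a QKD-derived key; sifting, error correction, and privacy amplification are not modeled.

```python
# Sketch of the classical endpoint of QKD: once two parties share a secret
# random bit string, it can serve as a one-time pad. The optical link and
# key-distillation steps are not modeled; the key below is a local stand-in.
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    assert len(key) >= len(data), "one-time pad key must cover the message"
    return bytes(d ^ k for d, k in zip(data, key))

# Stand-in for a QKD-derived shared key (in practice produced by the protocol).
shared_key = secrets.token_bytes(64)

message = b"telemetry frame 042"
ciphertext = xor_bytes(message, shared_key)    # encrypt at one endpoint
recovered = xor_bytes(ciphertext, shared_key)  # decrypt at the other endpoint
assert recovered == message
```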
The SCALE Verified, Archived Library of Inputs and Data - VALID
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marshall, William BJ J; Rearden, Bradley T
The Verified, Archived Library of Inputs and Data (VALID) at ORNL contains high quality, independently reviewed models and results that improve confidence in analysis. VALID is developed and maintained according to a procedure of the SCALE quality assurance (QA) plan. This paper reviews the origins of the procedure and its intended purpose, the philosophy of the procedure, some highlights of its implementation, and the future of the procedure and associated VALID library. The original focus of the procedure was the generation of high-quality models that could be archived at ORNL and applied to many studies. The review process associated with model generation minimized the chances of errors in these archived models. Subsequently, the scope of the library and procedure was expanded to provide high quality, reviewed sensitivity data files for deployment through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE). Sensitivity data files for approximately 400 such models are currently available. The VALID procedure and library continue fulfilling these multiple roles. The VALID procedure is based on the quality assurance principles of ISO 9001 and nuclear safety analysis. Some of these key concepts include: independent generation and review of information, generation and review by qualified individuals, use of appropriate references for design data and documentation, and retrievability of the models, results, and documentation associated with entries in the library. Some highlights of the detailed procedure are discussed to provide background on its implementation and to indicate limitations of data extracted from VALID for use by the broader community. Specifically, external users of data generated within VALID must take responsibility for ensuring that the files are used within the QA framework of their organization and that use is appropriate. The future plans for the VALID library include expansion to include additional experiments from the IHECSBE, to include experiments from areas beyond criticality safety, such as reactor physics and shielding, and to include application models. In the future, external SCALE users may also obtain qualification under the VALID procedure and be involved in expanding the library. The VALID library provides a pathway for the criticality safety community to leverage modeling and analysis expertise at ORNL.
William G. Kepner; I. Shea Burns; David C Goodrich; D. Phillip Guertin; Gabriel S. Sidman; Lainie R. Levick; Wison W.S. Yee; Melissa M.A. Scianni; Clifton S. Meek; Jared B. Vollmer
2016-01-01
Long-term land-use and land cover change and their associated impacts pose critical challenges to sustaining vital hydrological ecosystem services for future generations. In this study, a methodology was developed to characterize potential hydrologic impacts from future urban growth through time. Future growth is represented by housing density maps generated in decadal...
Towards Laser Cooling Trapped Ions with Telecom Light
NASA Astrophysics Data System (ADS)
Dungan, Kristina; Becker, Patrick; Donoghue, Liz; Liu, Jackie; Olmschenk, Steven
2015-05-01
Quantum information has many potential applications in communication, atomic clocks, and the precision measurement of fundamental constants. Trapped ions are excellent candidates for applications in quantum information because of their isolation from external perturbations, and the precise control afforded by laser cooling and manipulation of the quantum state. For many applications in quantum communication, it would be advantageous to interface ions with telecom light. We present progress towards laser cooling and trapping of doubly-ionized lanthanum, which should require only infrared, telecom-compatible light. Additionally, we present progress on optimization of a second-harmonic generation cavity for laser cooling and trapping barium ions, for future sympathetic cooling experiments. This research is supported by the Army Research Office, Research Corporation for Science Advancement, and Denison University.
Real-Time Seismic Displays in Museums Appeal to the Public
NASA Astrophysics Data System (ADS)
Smith, Meagan; Taber, John; Hubenthal, Michael
2006-02-01
Technology provides people with constant access to the latest news, weather, and entertainment. Not surprisingly, the public increasingly demands that the most current information be available for immediate consumption. For museums striving to educate the public and to maintain and expand visitor interest, gone are the days of passively conveying scientific concepts through static displays. Instead, science museums must find creative ways to capture the public's interest: successful advocacy for research funding, solutions to environmental problems, and even future generations' scientific innovation all depend on this. To this end, the continuous collection and dissemination of real-time science information by the scientific community offers museums an opportunity to capitalize on visitors' data addiction and increase the public's interest in, and understanding of, the Earth system.
A parallel data management system for large-scale NASA datasets
NASA Technical Reports Server (NTRS)
Srivastava, Jaideep
1993-01-01
The past decade has experienced a phenomenal growth in the amount of data and resultant information generated by NASA's operations and research projects. A key application is the reprocessing problem which has been identified to require data management capabilities beyond those available today (PRAT93). The Intelligent Information Fusion (IIF) system (ROEL91) is an ongoing NASA project which has similar requirements. Deriving our understanding of NASA's future data management needs based on the above, this paper describes an approach to using parallel computer systems (processor and I/O architectures) to develop an efficient parallel database management system to address the needs. Specifically, we propose to investigate issues in low-level record organizations and management, complex query processing, and query compilation and scheduling.
A Survey on Security and Privacy in Emerging Sensor Networks: From Viewpoint of Close-Loop.
Zhang, Lifu; Zhang, Heng
2016-03-26
Nowadays, as the next generation of sensor networks, Cyber-Physical Systems (CPSs) refer to complex networked systems that have both physical subsystems and cyber components, with information flowing between the different subsystems and components across a communication network, forming a closed loop. New-generation sensor networks are found in a growing number of applications and have received increasing attention from many disciplines. Opportunities and challenges in the design, analysis, verification and validation of sensor networks co-exist, among which security and privacy are two important ingredients. This paper presents a survey of some recent results on the security and privacy aspects of emerging sensor networks from the viewpoint of the closed loop. It also discusses several future research directions under these two umbrellas.
Translating science into the next generation meat quality program for Australian lamb.
Pethick, D W; Ball, A J; Banks, R G; Gardner, G E; Rowe, J B; Jacob, R H
2014-02-01
This paper introduces a series of papers in the form of a special edition that reports phenotypic analyses done in parallel with genotypic analyses for the Australian Sheep Industry Cooperative Research Centre (Sheep CRC) using data generated from the information nucleus flock (INF). This has allowed new knowledge to be gained of the genetic, environment and management factors that impact on the carcase and eating quality, visual appeal, odour and health attributes of Australian lamb meat. The research described involved close collaboration with commercial partners across the supply chain in the sire breeding as well as the meat processing industries. This approach has enabled timely delivery and adoption of research results to industry in an unprecedented way and provides a good model for future research. © 2013.
The east coast petroleum province: Science and society
Jordan, R.R.
1999-01-01
The U.S. Atlantic offshore, especially the mid-Atlantic, was an exciting exploration area from the 1970s into the 1980s. Much pioneering 'frontier' activity in both scientific and policy matters occurred in this area. Although production was not achieved, objective geological evidence indicates that the province does have potential. Major population centers of the mid-Atlantic area demand large amounts of energy and enormous amounts of crude and product are shipped through East Coast waters. Nevertheless, exploration has been shut down by moratoria, environmental concerns, and international pricing. It is suggested that the province will be revisited in the future and that the geologic and environmental information that has been generated at great cost should be preserved for use by the next generation of explorationists and policy-makers.
Brady, Laura Thompson; Fong, Lisa; Waninger, Kendra N; Eidelman, Steven
2009-10-01
As leaders from the Baby Boomer generation prepare for retirement over the next decade, emerging leaders must be identified and supported in anticipation of a major organizational transition. Authentic leadership is a construct that informs the development of values-driven leaders who will bring organizations into the future, just as the previous generation of leaders oversaw the movement of services away from state institutions and into networks of community-based service delivery organizations. The purpose of this exploratory study was to examine executive and emerging leaders' opinions about the unique leadership values, skills, and challenges in organizations that serve individuals with intellectual and developmental disabilities. Themes of defining, developing, and sustaining leaders emerged from the data and are explored through an authentic leadership framework.
Amer, Mona M; Hovey, Joseph D
2007-10-01
This study examined socio-demographic differences in acculturation patterns among early immigrant and second-generation Arab Americans, using data from 120 participants who completed a Web-based study. Although sex, age, education, and income did not significantly relate to the acculturation process, respondents who were female and those who were married reported greater Arab ethnic identity and religiosity. Striking differences were found based on religious affiliation. Christian patterns of acculturation and mental health were consistent with acculturation theory. For Muslims, however, integration was not associated with better mental health, and religiosity was predictive of better family functioning and less depression. The results of this study suggest unique acculturation patterns for Christian and Muslim subgroups that can better inform future research and mental health service.
Rising Expectations: Access to Biomedical Information
Lindberg, D. A. B.; Humphreys, B. L.
2008-01-01
Summary Objective To provide an overview of the expansion in public access to electronic biomedical information over the past two decades, with an emphasis on developments to which the U.S. National Library of Medicine contributed. Methods Review of the increasingly broad spectrum of web-accessible genomic data, biomedical literature, consumer health information, clinical trials data, and images. Results The amount of publicly available electronic biomedical information has increased dramatically over the past twenty years. Rising expectations regarding access to biomedical information were stimulated by the spread of the Internet, the World Wide Web, and advanced searching and linking techniques. These informatics advances simplified and improved access to electronic information and reduced costs, which enabled inter-organizational collaborations to build and maintain large international information resources and also aided outreach and education efforts. The demonstrated benefits of free access to electronic biomedical information encouraged the development of public policies that further increase the amount of information available. Conclusions Continuing rapid growth of publicly accessible electronic biomedical information presents tremendous opportunities and challenges, including the need to ensure uninterrupted access during disasters or emergencies and to manage digital resources so they remain available for future generations. PMID:18587496
Drug Information in Space Medicine
NASA Technical Reports Server (NTRS)
Bayuse, Tina M.
2009-01-01
Published drug information is widely available for terrestrial conditions. However, information on dosing, administration, drug interactions, stability, and side effects is scant as it relates to use in Space Medicine. Multinational crews on board the International Space Station present additional challenges for drug information because medication nomenclature, information available for the drug as well as the intended use for the drug is not standard across countries. This presentation will look at unique needs for drug information and how the information is managed in Space Medicine. A review was conducted of the drug information requests submitted to the Johnson Space Center Pharmacy by Space Medicine practitioners, astronaut crewmembers and researchers. The information requested was defined and cataloged. A list of references used was maintained. The wide range of information was identified. Due to the information needs for the medications in the on-board medical kits, the Drug Monograph Project was created. A standard method for answering specific drug information questions was generated and maintained by the Johnson Space Center Pharmacy. The Drug Monograph Project will be presented. Topic-centered requests, including multinational drug information, drug-induced adverse reactions, and medication events due to the environment will be highlighted. Information management of the drug information will be explained. Future considerations for drug information needs will be outlined.
Improving Earth Science Metadata: Modernizing ncISO
NASA Astrophysics Data System (ADS)
O'Brien, K.; Schweitzer, R.; Neufeld, D.; Burger, E. F.; Signell, R. P.; Arms, S. C.; Wilcox, K.
2016-12-01
ncISO is a package of tools developed at NOAA's National Centers for Environmental Information (NCEI) that facilitates the generation of ISO 19115-2 metadata from NetCDF data sources. The tool currently exists in two iterations: a command line utility and a web-accessible service within the THREDDS Data Server (TDS). Several projects, including NOAA's Unified Access Framework (UAF), depend upon ncISO to generate ISO-compliant metadata from their data holdings and use the resulting information to populate discovery tools such as NCEI's ESRI Geoportal and NOAA's data.noaa.gov CKAN system. In addition to generating ISO 19115-2 metadata, the tool calculates a rubric score based on how well the dataset follows the Attribute Conventions for Dataset Discovery (ACDD). The result of this rubric calculation, along with information about what has been included and what is missing, is displayed in an HTML document generated by the ncISO software package. Recently ncISO has fallen behind in supporting updates to conventions such as the ACDD. With the blessing of the original programmer, NOAA's UAF has been working to modernize the ncISO software base. In addition to upgrading ncISO to utilize version 1.3 of the ACDD, we have been working with partners at Unidata and IOOS to unify the tool's code base. In essence, we are merging the command line capabilities into the same software that will now be used by the TDS service, allowing easier updates when conventions such as the ACDD are updated in the future. In this presentation, we will discuss the work the UAF project has done to support updated conventions within ncISO, as well as describe how the updated tool is helping to improve metadata throughout the earth and ocean sciences.
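The actual ncISO rubric is implemented in the Java package and scores many more elements; purely as an illustration of the idea of an ACDD completeness check, the following sketch inspects a NetCDF file for a small, assumed subset of ACDD global attributes (the file path and attribute list are illustrative, not the tool's real scoring scheme).

```python
# Minimal stand-in for an ACDD-style completeness check on a NetCDF file.
# ncISO's actual rubric scores many more elements; the attribute list here
# is a small illustrative subset of ACDD global attributes.
from netCDF4 import Dataset

ACDD_SUBSET = [
    "title", "summary", "keywords", "Conventions",
    "creator_name", "license", "time_coverage_start", "time_coverage_end",
]

def acdd_subset_score(path):
    with Dataset(path) as ds:
        attrs = set(ds.ncattrs())
    present = [a for a in ACDD_SUBSET if a in attrs]
    missing = [a for a in ACDD_SUBSET if a not in attrs]
    return len(present) / len(ACDD_SUBSET), missing

score, missing = acdd_subset_score("example.nc")  # hypothetical file path
print(f"ACDD subset completeness: {score:.0%}; missing: {missing}")
```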
Translational Biomedical Informatics in the Cloud: Present and Future
Chen, Jiajia; Qian, Fuliang; Yan, Wenying; Shen, Bairong
2013-01-01
Next generation sequencing and other high-throughput experimental techniques of recent decades have driven the exponential growth in publicly available molecular and clinical data. This information explosion has prepared the ground for the development of translational bioinformatics. The scale and dimensionality of data, however, pose obvious challenges in data mining, storage, and integration. In this paper we demonstrated the utility and promise of cloud computing for tackling the big data problems. We also outline our vision that cloud computing could be an enabling tool to facilitate translational bioinformatics research. PMID:23586054
The hospital tech laboratory: quality innovation in a new era of value-conscious care.
Keteyian, Courtland K; Nallamothu, Brahmajee K; Ryan, Andrew M
2017-08-01
For decades, the healthcare industry has been incentivized to develop new diagnostic technologies, but this limitless progress fueled rapidly growing expenditures. With an emphasis on value, the future will favor information synthesis and processing over pure data generation, and hospitals will play a critical role in developing these systems. A Michigan Medicine, IBM, and AirStrip partnership created a robust streaming analytics platform tasked with creating predictive algorithms for critical care with the potential to support clinical decisions and deliver significant value.
NASA Technical Reports Server (NTRS)
1997-01-01
Astronaut Katherine Hire and LEGO-Master Model Builders assisted children from Mississippi, Louisiana and Alabama in the building of a 12-foot tall Space Shuttle made entirely from tiny LEGO bricks at the John C. Stennis Space Center Visitors Center in South Mississippi. The shuttle was part of an exhibit titled 'Travel in Space' World Show which depicts the history of flight and space travel from the Wright brothers to future generations of space vehicles. For more information concerning hours of operation or Visitors Center educational programs, call 1-800-237-1821 in Mississippi and Louisiana or (601) 688-2370.
NASA Technical Reports Server (NTRS)
1997-01-01
More than 2,000 children and adults from Mississippi, Louisiana and Alabama recently built a 12-foot tall Space Shuttle made entirely from tiny LEGO bricks at the John C. Stennis Space Center Visitors Center in South Mississippi. The shuttle was part of an exhibit titled 'Travel in Space' World Show which depicts the history of flight and space travel from the Wright brothers to future generations of space vehicles. For more information concerning hours of operation or Visitors Center educational programs, call 1-800-237-1821 in Mississippi and Louisiana or (601) 688-2370.
Gravitational waves from a first-order electroweak phase transition: a brief review
NASA Astrophysics Data System (ADS)
Weir, David J.
2018-01-01
We review the production of gravitational waves by an electroweak first-order phase transition. The resulting signal is a good candidate for detection at next-generation gravitational wave detectors, such as LISA. Detection of such a source of gravitational waves could yield information about physics beyond the Standard Model that is complementary to that accessible to current and near-future collider experiments. We summarize efforts to simulate and model the phase transition and the resulting production of gravitational waves. This article is part of the Theo Murphy meeting issue `Higgs cosmology'.
NASA Technical Reports Server (NTRS)
Chavez, Patrick F.
1987-01-01
The effort at Sandia National Labs. on the methodologies and techniques being used to generate strict hexahedral finite element meshes from a solid model is described. The functionality of the modeler is used to decompose the solid into a set of nonintersecting meshable finite element primitives. The description of the decomposition is exported, via a Boundary Representative format, to the meshing program which uses the information for complete finite element model specification. Particular features of the program are discussed in some detail along with future plans for development which includes automation of the decomposition using artificial intelligence techniques.
Jones, Lyell K.; Craft, Karolina; Fritz, Joseph V.
2016-01-01
Abstract In 2014, the Centers for Medicare and Medicaid Services began a now annual process of releasing payment data made to physicians and other providers from Medicare Part B. The unprecedented availability of detailed payment information has generated considerable interest among policymakers, the public, and the media, and raised concerns from a number of physician groups. In the current climate of financial transparency, publication of Medicare payment data will likely continue. In an effort to prepare neurologists for future releases of payment data, we review the background, limitations, potential benefits, and appropriate responses to Medicare payment data releases. PMID:29443257
High flexible Hydropower Generation concepts for future grids
NASA Astrophysics Data System (ADS)
Hell, Johann
2017-04-01
The ongoing changes in electric power generation are creating new requirements for classical generating units. As a consequence, a paradigm change in the operation of power systems is necessary, and a new approach to finding solutions is needed. This paper deals with the new requirements on current and future energy systems, with a focus on hydropower generation. A power generation landscape for some European regions is shown, and generation and operational flexibility are explained. Based on the requirements of the Transmission System Operator in the UK, the transient performance of a pumped storage installation is discussed.
O'Toole, J; Keywood, M; Sinclair, M; Leder, K
2009-01-01
The aim of this study was to address existing data gaps and to determine the size distribution of aerosols associated with water-efficient devices during typical domestic activities. This information is important to assist in understanding infection spread during water-using activities and in designing water regulations. Three water-using scenarios were evaluated: i) showering using a water-efficient showerhead; ii) use of a high pressure spray unit for cleaning cars; and iii) toilet flushing using a dual flush low volume flush device. For each scenario a control condition (conventional lower efficiency device) was selected for benchmarking purposes. Shower module results highlighted the complexity of particle generation and removal processes and showed that more than 90% of total particle mass in the breathing zone was attributed to particle diameters greater than 6 μm. Conversely, results for car washing experiments showed that particle diameters up to 6 μm constituted the major part of the total mass generated by both water-efficient and conventional devices. Even under worst-case scenario conditions for toilet flushing, particle measurements were at or below the level of detection of the measuring instrumentation. The data provide information that assists in health risk assessment and in determining future research directions, including methodological aspects.
Can Accelerators Meet the Medical Isotopes Needs of the World?
NASA Astrophysics Data System (ADS)
Ruth, Thomas
2011-10-01
Over 80% of all Nuclear Medicine procedures make use of the radionuclide Tc-99m for SPECT imaging of heart disease, cancer and other disorders. Historically, Tc-99m has been produced from a generator through the decay of Mo-99, where the Mo-99 is a fission product of U-235. Five reactors around the world supply the market. However, these reactors are aging (many over 50 years old) and governments are reluctant to replace them. Therefore researchers have turned to accelerators as a potential source of this important radionuclide. In Canada the government has funded research projects on two accelerator approaches: Mo-100(gamma,n)Mo-99 and Mo-100(p,2n)Tc-99m, where the photons are generated from the conversion of high-powered electrons into Bremsstrahlung radiation and the protons are generated in low-energy cyclotrons (15-25 MeV). The goal of these projects is to provide the Government with sufficient information so that an informed decision can be made with respect to future supplies of medical isotopes for Canada. International interest has been expressed by the IAEA as a way to allow Member States with existing cyclotron programs to take advantage of the direct production route. This talk will describe the challenges with the approaches and the progress to date.
Yielding to desire: the durability of affective preferences.
Rapp, David N; Jacovina, Matthew E; Slaten, Daniel G; Krause, Elise
2014-09-01
People's expectations about the future are guided not just by the contingencies of situations but also by what they hope or wish will happen next. These preferences can inform predictions that run counter to what should or must occur based on the logic of unfolding events. Effects of this type have been regularly identified in studies of judgment and decision making, with individuals' choices often reflecting emotional rather than rational influences. Encouraging individuals to rely less on their emotional considerations has proven a challenge as affective responses are generated quickly and are seemingly informative for decisions. In 6 experiments we examined whether individuals could be encouraged to rely less on their affective preferences when making judgments about future events. Participants read stories in which contexts informed the likelihood of events in ways that might run counter to their preferential investments in particular outcomes. While being less than relevant given the logic of events, participants' affective considerations remained influential despite time allotted for predictive reflection. In contrast, instructional warnings helped attenuate the influence of affective considerations, even under conditions previously shown to encourage preferential biases. The findings are discussed with respect to factors that mediate preference effects, and highlight challenges for overcoming people's reliance on affective contributors to everyday judgments and comprehension.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-23
... the proposed Information Collection Request (ICR) titled: Part 41, Relating to Security Futures... COMMODITY FUTURES TRADING COMMISSION Agency Information Collection Activities; Proposed Collection; Comment Request: Part 41, Relating to Security Futures Products AGENCY: Commodity Futures Trading...
Capone, A; Cicchetti, A; Mennini, F S; Marcellusi, A; Baio, G; Favato, G
2016-01-01
Healthcare expenses will be the most relevant policy issue for most governments in the EU and in the USA. This expenditure can be associated with two major categories: demographic and economic drivers. Factors driving healthcare expenditure were rarely recognised, measured and comprehended. An improvement of health data generation and analysis is mandatory, and in order to tackle healthcare spending growth, it may be useful to design and implement an effective, advanced system to generate and analyse these data. A methodological approach relying upon the Health Data Entanglement (HDE) can be a suitable option. By definition, in the HDE a large number of data sets from several sources are functionally interconnected and computed through learning machines that generate patterns of highly probable future health conditions of a population. The entanglement concept is borrowed from quantum physics and means that multiple particles (information) are linked together in a way such that the measurement of one particle's quantum state (individual health conditions and related economic requirements) determines the possible quantum states of other particles (population health forecasts to predict their impact). The value created by the HDE is based on the combined evaluation of clinical, economic and social effects generated by health interventions. To predict the future health conditions of a population, analyses of data are performed using self-learning AI, in which sequential decisions are based on Bayesian algorithmic probabilities. HDE and AI-based analysis can be adopted to improve the effectiveness of the health governance system in ways that also lead to better quality of care.
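The sequential Bayesian updating the HDE description alludes to can be illustrated with a toy sketch. The prior, likelihoods, and observations below are invented for illustration and are not taken from the paper.

```python
# Toy sketch of sequential Bayesian updating of the probability that a
# patient develops a condition, as new indicators arrive. All numbers
# (prior, likelihoods, observations) are illustrative placeholders.
def bayes_update(prior, p_obs_given_cond, p_obs_given_not_cond):
    numerator = p_obs_given_cond * prior
    evidence = numerator + p_obs_given_not_cond * (1.0 - prior)
    return numerator / evidence

prior = 0.05                      # baseline prevalence (assumed)
observations = [                  # (P(obs | condition), P(obs | no condition))
    (0.80, 0.30),                 # e.g. abnormal lab result
    (0.60, 0.20),                 # e.g. relevant family history
]
for likelihood_cond, likelihood_not in observations:
    prior = bayes_update(prior, likelihood_cond, likelihood_not)
    print(f"updated probability: {prior:.3f}")
```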
Shachak, Aviv; Dow, Rustam; Barnsley, Jan; Tu, Karen; Domb, Sharon; Jadad, Alejandro R; Lemieux-Charles, Louise
2013-06-04
Tutorials and user manuals are important forms of impersonal support for using software applications including electronic medical records (EMRs). Differences between user- and vendor documentation may indicate support needs, which are not sufficiently addressed by the official documentation, and reveal new elements that may inform the design of tutorials and user manuals. What are the differences between user-generated tutorials and manuals for an EMR and the official user manual from the software vendor? Effective design of tutorials and user manuals requires careful packaging of information, balance between declarative and procedural texts, an action- and task-oriented approach, support for error recognition and recovery, and effective use of visual elements. No previous research compared these elements between formal and informal documents. We conducted a mixed-methods study. Seven tutorials and two manuals for an EMR were collected from three family health teams and compared with the official user manual from the software vendor. Documents were qualitatively analyzed using a framework analysis approach in relation to the principles of technical documentation described above. Subsets of the data were quantitatively analyzed using cross-tabulation to compare the types of error information and visual cues in screen captures between user- and vendor-generated manuals. The user-developed tutorials and manuals differed from the vendor-developed manual in that they contained mostly procedural and not declarative information; were customized to the specific workflow, user roles, and patient characteristics; contained more error information related to work processes than to software usage; and used explicit visual cues on screen captures to help users identify window elements. These findings imply that to support EMR implementation, tutorials and manuals need to be customized and adapted to specific organizational contexts and workflows. The main limitation of the study is its generalizability. Future research should address this limitation and may explore alternative approaches to software documentation, such as modular manuals or participatory design.
NASA Astrophysics Data System (ADS)
Njome, Manga S.; Suh, Cheo E.; Chuyong, George; de Wit, Maarten J.
2010-11-01
A study of volcanic risk perception was carried out in rural communities around the Mount Cameroon volcano between August and December 2008. The results indicate that risk perception reflects the levels of threat to which a resident population has previously been exposed. Results of 70 responses to questionnaires show that local knowledge of hazards is high. Most respondents correctly indicated that earthquake and lava flow activities would affect the resident population most in the future. By contrast, respondents' ability to adapt and protect themselves from the effects of future eruptions is poor, and inhabitants would likely shift responsibility for their protection to the requisite experts. This study confirms that there is little knowledge of any existing emergency plan, little or no educational outreach activity, but a high perceived need for information about and implementation of such actions. Knowledge about natural threats is found to be directly related to past exposure to volcanic hazard, and is significantly higher for people living along the southern than those along the northern slopes of Mt. Cameroon. The data also show that the media remains the most accessible channel for hazard communication, and that the internet is a growing information source that should be used to reach out to the younger generation. It is clear from the results of this study that major education and information efforts are required to improve the public's knowledge, confidence in the government, and growing self-reliance, in order to improve both collective and individual capacity to face future volcanic emergencies.
Vortex detection through pressure measurements
NASA Astrophysics Data System (ADS)
Bhide, Aditi
Vortex Generators (VGs) are known to hinder boundary layer separation, a frequently unwanted phenomenon when it comes to external flows over aircraft wings, on-ground vehicles or internal flows within pipes, diffusers and turbomachinery. Boundary layer separation leads to loss of lift, higher drag and, subsequently, energy losses. The vortices generated inhibit boundary layer separation. This thesis is an effort to discern the strength and location of these generated vortices using an array of VGs over a flat plate. Such information may be useful in the future in active control systems for streamwise vortices, which have been proposed to relaminarize turbulent boundary layers. Flow over flat plates, studied in wind tunnel experiments, is examined for pressure variation using an array of pressure ports mounted over the plate and connected to suitable pressure sensors. Pressure coefficient and velocity maps are generated using the data obtained from the Kirsten Wind Tunnel data acquisition system. These represent the nature of the flow field over the plate and are used to locate the vortices and determine their strength. It was found that the vortices can be detected using this method and their strength and location can be estimated.
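As a rough illustration of how pressure-port readings are turned into a pressure coefficient map, the sketch below applies the standard definition Cp = (p - p_inf) / (0.5 * rho * U_inf^2). The grid layout, freestream conditions, and readings are assumed placeholder values, not data from the experiments described above.

```python
# Minimal sketch: compute a pressure-coefficient map from an array of
# pressure-port readings. Grid shape, freestream conditions, and data are
# illustrative; the Cp definition is the standard one.
import numpy as np

p_inf = 101325.0          # freestream static pressure, Pa (assumed)
rho = 1.20                # air density, kg/m^3 (assumed)
U_inf = 20.0              # freestream velocity, m/s (assumed)
q_inf = 0.5 * rho * U_inf**2

# pressures at ports on a spanwise x streamwise grid (synthetic data)
p_ports = p_inf + np.random.uniform(-60.0, 60.0, size=(8, 16))

cp = (p_ports - p_inf) / q_inf          # pressure coefficient at each port
# low-Cp streaks downstream of the vortex generators hint at vortex cores
core_candidate = np.unravel_index(np.argmin(cp), cp.shape)
print("Cp range:", cp.min(), cp.max(), "minimum at port", core_candidate)
```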
Rosenfeld, Alan L; Mandelaris, George A; Tardieu, Philippe B
2006-08-01
The purpose of this paper is to expand on part 1 of this series (published in the previous issue) regarding the emerging future of computer-guided implant dentistry. This article will introduce the concept of rapid-prototype medical modeling as well as describe the utilization and fabrication of computer-generated surgical drilling guides used during implant surgery. The placement of dental implants has traditionally been an intuitive process, whereby the surgeon relies on mental navigation to achieve optimal implant positioning. Through rapid-prototype medical modeling and the stereolithographic process, surgical drilling guides (e.g., SurgiGuide) can be created. These guides are generated from a surgical implant plan created with a computer software system that incorporates all relevant prosthetic information from which the surgical plan is developed. The utilization of computer-generated planning and stereolithographically generated surgical drilling guides embraces the concept of collaborative accountability and supersedes traditional mental navigation on all levels of implant therapy.
Axiope tools for data management and data sharing.
Goddard, Nigel H; Cannon, Robert C; Howell, Fred W
2003-01-01
Many areas of biological research generate large volumes of very diverse data. Managing this data can be a difficult and time-consuming process, particularly in an academic environment where there are very limited resources for IT support staff such as database administrators. The most economical and efficient solutions are those that enable scientists with minimal IT expertise to control and operate their own desktop systems. Axiope provides one such solution, Catalyzer, which acts as a flexible cataloging system for creating structured records describing digital resources. The user is able to specify both the content and structure of the information included in the catalog. Information and resources can be shared by a variety of means, including automatically generated sets of web pages. Federation and integration of this information, where needed, is handled by Axiope's Mercat server. Where there is a need for standardization or compatibility of the structures used by different researchers, this can be achieved later by applying user-defined mappings in Mercat. In this way, large-scale data sharing can be achieved without imposing unnecessary constraints or interfering with the way in which individual scientists choose to record and catalog their work. We summarize the key technical issues involved in scientific data management and data sharing, describe the main features and functionality of Axiope Catalyzer and Axiope Mercat, and discuss future directions and requirements for an information infrastructure to support large-scale data sharing and scientific collaboration.
Next Generation RFID-Based Medical Service Management System Architecture in Wireless Sensor Network
NASA Astrophysics Data System (ADS)
Tolentino, Randy S.; Lee, Kijeong; Kim, Yong-Tae; Park, Gil-Cheol
Radio Frequency Identification (RFID) and Wireless Sensor Networks (WSN) are two important wireless technologies that have a wide variety of applications and provide substantial future potential, especially in healthcare systems. RFID is used to detect the presence and location of objects, while WSN is used to sense and monitor the environment. Integrating RFID with WSN not only provides the identity and location of an object but also provides information regarding the condition of the object carrying the sensor-enabled RFID tag. However, there is no flexible and robust communication infrastructure to integrate these devices into an emergency care setting. An efficient wireless communication substrate for medical devices that addresses ad hoc or fixed network formation, naming and discovery, transmission efficiency of data, data security and authentication, as well as filtration and aggregation of vital sign data, needs to be studied and analyzed. This paper proposes an efficient next-generation architecture for an RFID-based medical service management system in WSN that possesses the essential elements of future medical applications integrated with existing medical practices and technologies: real-time remote monitoring, medication administration, and patient status tracking assisted by embedded wearable wireless sensors integrated into a wireless sensor network.
Systems solutions by lactic acid bacteria: from paradigms to practice
2011-01-01
Lactic acid bacteria are among the powerhouses of the food industry, colonize the surfaces of plants and animals, and contribute to our health and well-being. The genomic characterization of LAB has rocketed and presently over 100 complete or nearly complete genomes are available, many of which serve as scientific paradigms. Moreover, functional and comparative metagenomic studies are taking off and provide a wealth of insight in the activity of lactic acid bacteria used in a variety of applications, ranging from starters in complex fermentations to their marketing as probiotics. In this new era of high throughput analysis, biology has become big science. Hence, there is a need to systematically store the generated information, apply this in an intelligent way, and provide modalities for constructing self-learning systems that can be used for future improvements. This review addresses these systems solutions with a state of the art overview of the present paradigms that relate to the use of lactic acid bacteria in industrial applications. Moreover, an outlook is presented of the future developments that include the transition into practice as well as the use of lactic acid bacteria in synthetic biology and other next generation applications. PMID:21995776
Historic Frontier Processes active in Future Space-Based Mineral Extraction
NASA Astrophysics Data System (ADS)
Gray, D. M.
2000-01-01
The forces that shaped historic mining frontiers are in many cases not bound by geographic or temporal limits. The forces that helped define historic frontiers are active in today's physical and virtual frontiers, and will be present in future space-based frontiers. While frontiers derived from position and technology are primarily economic in nature, non-economic conditions affect the success or failure of individual frontier endeavors, local "mining camps" and even entire frontiers. Frontiers can be defined as the line of activity that divides the established markets and infrastructure of civilization from the unclaimed resources and potential wealth of a wilderness. At the frontier line, ownership of resources is established. The resource can then be developed using capital, energy and information. In a mining setting, the resource is concentrated for economic shipment to the markets of civilization. Profits from the sale of the resource are then used to fund further development of the resource and/or pay investors. Both positional and technical frontiers develop as a series of generations. The profits from each generation of development provides the capital and/or investment incentive for the next round of development. Without profit, the self-replicating process of frontiers stops.
Breuer, Eun-Kyoung Yim; Murph, Mandi M.
2011-01-01
Technological and scientific innovations over the last decade have greatly contributed to improved diagnostics, predictive models, and prognosis among cancers affecting women. In fact, an explosion of information in these areas has almost assured future generations that outcomes in cancer will continue to improve. Herein we discuss the current status of breast, cervical, and ovarian cancers as it relates to screening, disease diagnosis, and treatment options. Among the differences in these cancers, it is striking that breast cancer has multiple predictive tests based upon tumor biomarkers and sophisticated, individualized options for prescription therapeutics while ovarian cancer lacks these tools. In addition, cervical cancer leads the way in innovative, cancer-preventative vaccines and multiple screening options to prevent disease progression. For each of these malignancies, emerging proteomic technologies based upon mass spectrometry, stable isotope labeling with amino acids, high-throughput ELISA, tissue or protein microarray techniques, and click chemistry in the pursuit of activity-based profiling can pioneer the next generation of discovery. We will discuss six of the latest techniques to understand proteomics in cancer and highlight research utilizing these techniques with the goal of improvement in the management of women's cancers. PMID:21886869
Easy to retrieve but hard to believe: metacognitive discounting of the unpleasantly possible.
O'Brien, Ed
2013-06-01
People who recall or forecast many pleasant moments should perceive themselves as happier in the past or future than people who generate few such moments; the same principle should apply to generating unpleasant moments and perceiving unhappiness. Five studies suggest that this is not always true. Rather, people's metacognitive experience of ease of thought retrieval ("fluency") can affect perceived well-being over time beyond actual thought content. The easier it is to recall positive past experiences, the happier people think they were at the time; likewise, the easier it is to recall negative past experiences, the unhappier people think they were. But this is not the case for predicting the future. Although people who easily generate positive forecasts predict more future happiness, people who easily generate negative forecasts do not infer future unhappiness. Given pervasive tendencies to underestimate the likelihood of experiencing negative events, people apparently discount hard-to-believe metacognitive feelings (e.g., easily imagined unpleasant futures). Paradoxically, people's well-being may be maximized when they contemplate some bad moments or just a few good moments.
Misreporting signs of child abuse: the role of decision-making and outcome information.
Lindholm, Torun; Sjöberg, Rickard L; Memon, Amina
2014-02-01
Two studies provided evidence that a decision to report an ambiguous case of child abuse affected subsequent memory of the case information, such that participants falsely recognized details that were not presented in the original information, but that are schematically associated with child abuse. Moreover, post-decision information that the child had later died from abuse influenced the memory reports of participants who had chosen not to report the case, increasing their reports of false schema-consistent details. This suggests that false decision-consistent memories are primarily due to sense-making, schematic processing rather than the motivation to justify the decision. The present findings point to an important mechanism by which decision information can become distorted in retrospect, and emphasize the difficulties of improving future decision-making by contemplating past decisions. The results also indicate that decisions may generate false memories in the apparent absence of external suggestion or misleading information. Implications for decision-making theory and applied practices are discussed. © 2013 Scandinavian Psychological Associations and John Wiley & Sons Ltd.
Global Change Data Center: Mission, Organization, Major Activities, and 2001 Highlights
NASA Technical Reports Server (NTRS)
Wharton, Stephen W. (Technical Monitor)
2002-01-01
Rapid, efficient access to Earth sciences data is fundamental to the Nation's efforts to understand the effects of global environmental changes and their implications for public policy. It becomes a bigger challenge in the future when data volumes increase further and missions with constellations of satellites start to appear. Demands on data storage, data access, network throughput, processing power, and database and information management are increased by orders of magnitude, while budgets remain constant and even shrink. The Global Change Data Center's (GCDC) mission is to provide systems, data products, and information management services to maximize the availability and utility of NASA's Earth science data. The specific objectives are to (1) support Earth science missions by developing and operating systems to generate, archive, and distribute data products and information; (2) develop innovative information systems for processing, archiving, accessing, visualizing, and communicating Earth science data; and (3) develop value-added products and services to promote broader utilization of NASA Earth Sciences Enterprise (ESE) data and information. The ultimate product of GCDC activities is access to data and information to support research, education, and public policy.
Assessment of source probabilities for potential tsunamis affecting the U.S. Atlantic coast
Geist, E.L.; Parsons, T.
2009-01-01
Estimating the likelihood of tsunamis occurring along the U.S. Atlantic coast critically depends on knowledge of tsunami source probability. We review available information on both earthquake and landslide probabilities from potential sources that could generate local and transoceanic tsunamis. Estimating source probability includes defining both size and recurrence distributions for earthquakes and landslides. For the former distribution, source sizes are often distributed according to a truncated or tapered power-law relationship. For the latter distribution, sources are often assumed to occur in time according to a Poisson process, simplifying the way tsunami probabilities from individual sources can be aggregated. For the U.S. Atlantic coast, earthquake tsunami sources primarily occur at transoceanic distances along plate boundary faults. Probabilities for these sources are constrained from previous statistical studies of global seismicity for similar plate boundary types. In contrast, there is presently little information constraining landslide probabilities that may generate local tsunamis. Though there is significant uncertainty in tsunami source probabilities for the Atlantic, results from this study yield a comparative analysis of tsunami source recurrence rates that can form the basis for future probabilistic analyses.
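Under the Poisson assumption described above, probabilities from independent sources aggregate simply. The sketch below computes the probability of at least one tsunami-generating event over an exposure time; the source names and annual rates are placeholders, not estimates from the study.

```python
# Sketch of aggregating tsunami source probabilities under the Poisson
# assumption: independent sources with annual rates r_i give
# P(at least one event in T years) = 1 - exp(-T * sum(r_i)).
# The rates below are placeholders, not results from the study.
import math

annual_rates = {
    "plate-boundary earthquake source A": 1.0e-3,
    "plate-boundary earthquake source B": 5.0e-4,
    "continental-slope landslide source": 2.0e-4,
}
T = 50.0  # exposure time in years

total_rate = sum(annual_rates.values())
p_any = 1.0 - math.exp(-total_rate * T)
print(f"P(>=1 tsunamigenic event in {T:.0f} yr) = {p_any:.3f}")
```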
A Review of Current Neuromorphic Approaches for Vision, Auditory, and Olfactory Sensors
Vanarse, Anup; Osseiran, Adam; Rassau, Alexander
2016-01-01
Conventional vision, auditory, and olfactory sensors generate large volumes of redundant data and as a result tend to consume excessive power. To address these shortcomings, neuromorphic sensors have been developed. These sensors mimic the neuro-biological architecture of sensory organs using aVLSI (analog Very Large Scale Integration) and generate asynchronous spiking output that represents sensing information in ways that are similar to neural signals. This allows for much lower power consumption due to an ability to extract useful sensory information from sparse captured data. The foundation for research in neuromorphic sensors was laid more than two decades ago, but recent developments in understanding of biological sensing and advanced electronics, have stimulated research on sophisticated neuromorphic sensors that provide numerous advantages over conventional sensors. In this paper, we review the current state-of-the-art in neuromorphic implementation of vision, auditory, and olfactory sensors and identify key contributions across these fields. Bringing together these key contributions we suggest a future research direction for further development of the neuromorphic sensing field. PMID:27065784
Evaluation of Factors that Influence Residential Solar Panel Installations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morton, April M.; Omitaomu, Olufemi A.; Kotikot, Susan M.
Though rooftop photovoltaic (PV) systems are the fastest growing source of distributed generation, detailed information about where they are located and who their owners are is often known only to installers and utility companies. This lack of detailed information is a barrier to policy and financial assessment of solar energy generation and use. To bridge the described data gap, Oak Ridge National Laboratory (ORNL) was sponsored by the Department of Energy (DOE) Office of Energy Policy and Systems Analysis (EPSA) to create an automated approach for detecting and characterizing buildings with installed solar panels using high-resolution overhead imagery. Additionally, ORNL was tasked with using machine learning techniques to classify parcels on which solar panels were automatically detected in the Washington, DC, and Boston areas as commercial or residential, and then providing a list of recommended variables and modeling techniques that could be combined with these results to identify attributes that motivate the installation of residential solar panels. This technical report describes the methodology, results, and recommendations in greater detail, including lessons learned and future work.
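A minimal sketch of the commercial-versus-residential parcel classification step is shown below, using scikit-learn. The features, training examples, and model choice are invented for illustration and do not reproduce the report's actual variables or modeling techniques.

```python
# Minimal sketch of classifying parcels with detected solar panels as
# commercial or residential. Features and training data are invented.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# illustrative features: [parcel area (m^2), building footprint (m^2),
#                         detected panel area (m^2), distance to road (m)]
X_train = np.array([
    [600,  180,  25,  8],    # residential-like examples
    [750,  220,  30, 10],
    [5000, 2200, 400, 30],   # commercial-like examples
    [8000, 3500, 650, 45],
])
y_train = np.array(["residential", "residential", "commercial", "commercial"])

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

X_new = np.array([[900, 260, 40, 12]])
print(clf.predict(X_new))    # e.g. ['residential']
```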
NASA Astrophysics Data System (ADS)
Berkman, P. A.
2005-12-01
The World Data Center system emerged in 1957-58 with the International Geophysical Year (which was renamed from the 3rd International Polar Year) to preserve and provide access to scientific data collected from observational programs throughout the Earth system. Fast forward a half century ... access to diverse digital information has become effectively infinite and instantaneous with nearly 20,000 petabytes of information produced and stored on print, optical and magnetic media each year; microprocessor speeds that have increased 5 orders of magnitude since 1972; existence of the Internet; increasing global capacity to collect and transmit information via satellites; availability of powerful search engines; and proliferation of data warehouses like the World Data Centers. The problem is that we already have reached the threshold in our world information society when accessing more information does not equate with generating more knowledge. In 2007-08, the International Council of Science and World Meteorological Organization will convene the next International Polar Year to accelerate our understanding of how the polar regions respond to, amplify and drive changes elsewhere in the Earth system (http://www.ipy.org). Beyond Earth system science, strategies and tools for integrating digital information to discover meaningful relationships among the disparate data would have societal benefits from boardrooms to classrooms. In the same sense that human-launched satellites became a strategic focus that justified national investments in the International Geophysical Year, developing the next generation of knowledge discovery tools is an opportunity for the International Polar Year 2007-08 and its affiliated programs to contribute in an area that is critical to the future of our global community. As H.E. Mr. Adama Samassekou, President of the World Summit on the Information Society, put it: "Knowledge is the common wealth of humanity."
Raisaro, Jean-Louis; McLaren, Paul J; Fellay, Jacques; Cavassini, Matthias; Klersy, Catherine; Hubaux, Jean-Pierre
2018-03-01
Protecting patient privacy is a major obstacle for the implementation of genomic-based medicine. Emerging privacy-enhancing technologies can become key enablers for managing sensitive genetic data. We studied physicians' attitudes toward this kind of technology in order to derive insights that might foster their future adoption for clinical care. We conducted a questionnaire-based survey among 55 physicians of the Swiss HIV Cohort Study who tested the first implementation of a privacy-preserving model for delivering genomic test results. We evaluated their feedback on three different aspects of our model: clinical utility, ability to address privacy concerns and system usability. 38/55 (69%) physicians participated in the study. Two thirds of them acknowledged genetic privacy as a key aspect that needs to be protected to help build patient trust and deploy new-generation medical information systems. All of them successfully used the tool for evaluating their patients' pharmacogenomics risk and 90% were happy with the user experience and the efficiency of the tool. Only 8% of physicians were unsatisfied with the level of information and wanted to have access to the patient's actual DNA sequence. This survey, although limited in size, represents the first evaluation of privacy-preserving models for genomic-based medicine. It has allowed us to derive unique insights that will improve the design of these new systems in the future. In particular, we have observed that a clinical information system that uses homomorphic encryption to provide clinicians with risk information based on sensitive genetic test results can offer information that clinicians feel is sufficient for their needs and appropriately respectful of patients' privacy. The ability of such systems to ensure strong security and privacy guarantees and to provide some analytics on encrypted data has been assessed as a key enabler for the management of sensitive medical information in the near future. Providing clinically relevant information to physicians while protecting patients' privacy in order to comply with regulations is crucial for the widespread use of these new technologies. Copyright © 2017. Published by Elsevier Inc.
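The idea of computing a risk score on encrypted genetic data can be illustrated with additively homomorphic encryption. The toy sketch below uses the third-party python-paillier (phe) package; the variants, weights, and workflow are assumptions for illustration and are not the model evaluated in the study.

```python
# Toy sketch of computing a pharmacogenomic risk score on encrypted data
# with additively homomorphic encryption (python-paillier, "phe" package).
# Variants, weights, and workflow are illustrative placeholders.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair()

# patient side: encrypt genotype indicators (risk-allele counts, assumed)
genotype = {"variant_A": 1, "variant_B": 0, "variant_C": 2}
encrypted = {v: public_key.encrypt(g) for v, g in genotype.items()}

# server side: weight and sum without ever seeing the plaintext genotype
weights = {"variant_A": 0.8, "variant_B": 1.5, "variant_C": 0.3}
encrypted_score = sum(encrypted[v] * w for v, w in weights.items())

# clinician side: only the aggregate risk score is decrypted
print("risk score:", private_key.decrypt(encrypted_score))
```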
A self-sensing magnetorheological damper with power generation
NASA Astrophysics Data System (ADS)
Chen, Chao; Liao, Wei-Hsin
2012-02-01
Magnetorheological (MR) dampers are promising for semi-active vibration control of various dynamic systems. In the current MR damper systems, a separate power supply and dynamic sensor are required. To enable the MR damper to be self-powered and self-sensing in the future, in this paper we propose and investigate a self-sensing MR damper with power generation, which integrates energy harvesting, dynamic sensing and MR damping technologies into one device. This MR damper has self-contained power generation and velocity sensing capabilities, and is applicable to various dynamic systems. It combines the advantages of energy harvesting—reusing wasted energy, MR damping—controllable damping force, and sensing—providing dynamic information for controlling system dynamics. This multifunctional integration would bring great benefits such as energy saving, size and weight reduction, lower cost, high reliability, and less maintenance for the MR damper systems. In this paper, a prototype of the self-sensing MR damper with power generation was designed, fabricated, and tested. Theoretical analyses and experimental studies on power generation were performed. A velocity-sensing method was proposed and experimentally validated. The magnetic-field interference among three functions was prevented by a combined magnetic-field isolation method. Modeling, analysis, and experimental results on damping forces are also presented.
NASA Astrophysics Data System (ADS)
Odaka, Shigeru; Kurihara, Yoshimasa
2016-05-01
We have developed an event generator for direct-photon production in hadron collisions, including associated 2-jet production, in the framework of the GR@PPA event generator. The event generator consistently combines γ + 2-jet production processes with the lowest-order γ + jet and photon-radiation (fragmentation) processes from quantum chromodynamics (QCD) 2-jet production using a subtraction method. The generated events can be fed to general-purpose event generators to facilitate the addition of hadronization and decay simulations. Using the obtained event information, we can simulate photon isolation and hadron-jet reconstruction at the particle (hadron) level. The simulation reasonably reproduces measurement data obtained at the Large Hadron Collider (LHC) concerning not only the inclusive photon spectrum, but also the correlation between the photon and jet. The simulation implies that the contribution of the γ + 2-jet process is very large, especially in low photon-pT (≲ 50 GeV) regions. Discrepancies observed at low pT, although marginal, may indicate the necessity of considering further higher-order processes. An unambiguous particle-level definition of the photon-isolation condition for the signal events should be given explicitly in future measurements.
Production of gravitational waves during preheating with nonminimal coupling
NASA Astrophysics Data System (ADS)
Fu, Chengjie; Wu, Puxun; Yu, Hongwei
2018-04-01
We study the preheating and the in-process production of gravitational waves (GWs) after inflation in which the inflaton is nonminimally coupled to the curvature in a self-interacting quartic potential with the method of lattice simulation. We find that the nonminimal coupling enhances the amplitude of the density spectrum of inflaton quanta, and as a result, the peak value of the GW spectrum generated during preheating is enhanced as well and might reach the limit of detection in future GW experiments. The peaks of the GW spectrum not only exhibit distinctive characteristics as compared to those of minimally coupled inflaton potentials but also imprint information on the nonminimal coupling and the parametric resonance, and thus the detection of these peaks in the future will provide us a new avenue to reveal the physics of the early universe.
Blue Sky Funders Forum - Advancing Environmental Literacy through Funder Collaboration
NASA Astrophysics Data System (ADS)
Chen, A.
2015-12-01
The Blue Sky Funders Forum inspires, deepens, and expands private funding and philanthropic leadership to promote learning opportunities that connect people and nature and promote environmental literacy. Being prepared for the future requires all of us to understand the consequences of how we live on where we live - the connection between people and nature. Learning about the true meaning of that connection is a process that starts in early childhood and lasts a lifetime. Blue Sky brings supporters of this work together to learn from one another and to strategize how to scale up the impact of the effective programs that transform how people interact with their surroundings. By making these essential learning opportunities more accessible in all communities, we broaden and strengthen the constituency that makes well-informed choices, balancing the needs of today with the needs of future generations.
A postmortem and future look at the personality disorders in DSM-5.
Widiger, Thomas A
2013-10-01
It might seem difficult to describe the outcome of the proposals by the American Psychiatric Association's (APA) Personality and Personality Disorders Work Group (PPDWG) to be a success, given that all of the proposals were ultimately rejected. Nevertheless, one can interpret the result as a step forward, because the final outcome might not have been much different if a more conservative approach was adopted at the outset. The PPDWG did provide a significant contribution to the field through the provision of proposals that will likely generate a considerable body of informative research. The Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) effort and suggestions for the future are discussed with respect to magnitude of change, documentation of empirical support, and addressing opposition. PsycINFO Database Record (c) 2013 APA, all rights reserved.
Can An Evolutionary Process Create English Text?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bailey, David H.
Critics of the conventional theory of biological evolution have asserted that while natural processes might result in some limited diversity, nothing fundamentally new can arise from 'random' evolution. In response, biologists such as Richard Dawkins have demonstrated that a computer program can generate a specific short phrase via evolution-like iterations starting with random gibberish. While such demonstrations are intriguing, they are flawed in that they have a fixed, pre-specified future target, whereas in real biological evolution there is no fixed future target, but only a complicated 'fitness landscape'. In this study, a significantly more sophisticated evolutionary scheme is employed to produce text segments reminiscent of a Charles Dickens novel. The aggregate size of these segments is larger than the computer program and the input Dickens text, even when comparing compressed data (as a measure of information content).
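For context, the fixed-target demonstration the passage refers to can be written in a few lines; the sketch below is a compact "weasel"-style program. The study's open-ended, Dickens-based scheme is considerably more sophisticated and is not reproduced here.

```python
# Compact "weasel"-style demonstration: evolve random gibberish toward a
# fixed target phrase by mutation and selection. This illustrates the
# fixed-target demonstrations the passage criticizes, not the study's
# open-ended Dickens-text scheme.
import random
import string

TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = string.ascii_uppercase + " "

def score(candidate):
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(parent, rate=0.05):
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in parent)

parent = "".join(random.choice(ALPHABET) for _ in TARGET)
generation = 0
while parent != TARGET:
    offspring = [mutate(parent) for _ in range(100)]
    parent = max(offspring + [parent], key=score)  # keep the fittest
    generation += 1
print(f"reached target in {generation} generations")
```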
De Brigard, Felipe; Rodriguez, Diana Carolina; Montañés, Patricia
2017-05-01
Although extant evidence suggests that many neural and cognitive mechanisms underlying episodic past, future, and counterfactual thinking overlap, recent results have uncovered differences among these three processes. However, the extent to which there may be age-related differences in the phenomenological characteristics associated with episodic past, future and counterfactual thinking remains unclear. This study used adapted versions of the Memory Characteristics Questionnaire and the Autobiographical Interview in younger and older adults to investigate the subjective experience of episodic past, future and counterfactual thinking. The results suggest that, across all conditions, younger adults generated more internal details than older adults. However, older adults generated more external details for episodic future and counterfactual thinking than younger adults. Additionally, younger and older adults generated more internal details, and gave higher sensory and contextual ratings, for memories rather than future and counterfactual thoughts. Methodological and theoretical consequences for extant theories of mental simulation are discussed. Copyright © 2017 Elsevier Inc. All rights reserved.
The economics (or lack thereof) of aerosol geoengineering
NASA Astrophysics Data System (ADS)
Goes, M.; Keller, K.; Tuana, N.
2009-04-01
Anthropogenic greenhouse gas emissions are changing the Earth's climate and impose substantial risks for current and future generations. What are scientifically sound, economically viable, and ethically defendable strategies to manage these climate risks? Ratified international agreements call for a reduction of greenhouse gas emissions to avoid dangerous anthropogenic interference with the climate system. Recent proposals, however, call for the deployment of a different approach: to geoengineer climate by injecting aerosol precursors into the stratosphere. Published economic studies typically suggest that substituting aerosol geoengineering for abatement of carbon dioxide emissions results in large net monetary benefits. However, these studies neglect the risks of aerosol geoengineering due to (i) the potential for future geoengineering failures and (ii) the negative impacts associated with the aerosol forcing. Here we use a simple integrated assessment model of climate change to analyze potential economic impacts of aerosol geoengineering strategies over a wide range of uncertain parameters such as climate sensitivity, the economic damages due to climate change, and the economic damages due to aerosol geoengineering forcing. The simplicity of the model provides the advantages of parsimony and transparency, but it also imposes severe caveats on the interpretation of the results. For example, the analysis is based on a globally aggregated model and is hence silent on the question of intragenerational distribution of costs and benefits. In addition, the analysis neglects the effects of endogenous learning about the climate system. We show that the risks associated with a future geoengineering failure and negative impacts of aerosol forcings can cause geoengineering strategies to fail an economic cost-benefit test. One key to this finding is that a geoengineering failure would lead to dramatic and abrupt climatic changes. The monetary damages due to this failure can dominate the cost-benefit analysis because the monetary damages of climate change are expected to increase with the rate of change. Substituting aerosol geoengineering for greenhouse gas emission abatement might fail not only an economic cost-benefit test but also an ethical test of distributional justice. Substituting aerosol geoengineering for greenhouse gas emission abatement constitutes a conscious risk transfer to future generations. Intergenerational justice demands distributional justice, namely that present generations may not create benefits for themselves in exchange for burdens on future generations. We use the economic model to quantify this risk transfer to better inform the judgment of whether substituting aerosol geoengineering for carbon dioxide emission abatement fails this ethical test.
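The style of parametric uncertainty analysis described above can be illustrated with a toy Monte Carlo comparison of expected damages under abatement versus geoengineering with a failure risk. All distributions and damage coefficients below are invented placeholders, not the authors' integrated assessment model.

```python
# Toy Monte Carlo sketch of a parametric uncertainty analysis: compare
# damages of a geoengineering strategy (with some probability of failure
# and abrupt warming) against abatement. All numbers are placeholders.
import random

def damages(delta_T, coeff):
    return coeff * delta_T**2          # simple quadratic damage function

N = 100_000
geo_worse = 0
for _ in range(N):
    climate_sensitivity = random.uniform(1.5, 4.5)   # K per CO2 doubling (assumed)
    damage_coeff = random.uniform(0.2, 1.0)          # damage per K^2 (assumed)
    p_failure = random.uniform(0.0, 0.5)             # cumulative failure prob. (assumed)

    d_abatement = damages(0.5 * climate_sensitivity, damage_coeff)
    d_geo_ok = damages(0.1 * climate_sensitivity, damage_coeff)
    d_geo_fail = damages(1.5 * climate_sensitivity, damage_coeff)  # abrupt change
    d_geo = (1 - p_failure) * d_geo_ok + p_failure * d_geo_fail

    geo_worse += d_geo > d_abatement

print(f"geoengineering worse than abatement in {geo_worse / N:.1%} of draws")
```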
Contrast-enhanced endoscopic ultrasonography in digestive diseases.
Hirooka, Yoshiki; Itoh, Akihiro; Kawashima, Hiroki; Ohno, Eizaburo; Itoh, Yuya; Nakamura, Yosuke; Hiramatsu, Takeshi; Sugimoto, Hiroyuki; Sumi, Hajime; Hayashi, Daijiro; Ohmiya, Naoki; Miyahara, Ryoji; Nakamura, Masanao; Funasaka, Kohei; Ishigami, Masatoshi; Katano, Yoshiaki; Goto, Hidemi
2012-10-01
Contrast-enhanced endoscopic ultrasonography (CE-EUS) was introduced in the early 1990s. The concept of the injection of carbon dioxide microbubbles into the hepatic artery as a contrast material (enhanced ultrasonography) led to "endoscopic ultrasonographic angiography". After the arrival of the first-generation contrast agent, high-frequency (12 MHz) EUS brought about the enhancement of EUS images in the diagnosis of pancreatico-biliary diseases, upper gastrointestinal (GI) cancer, and submucosal tumors. The electronic scanning endosonoscope with both radial and linear probes enabled the use of high-end ultrasound machines and depicted the enhancement of both color/power Doppler flow-based imaging and harmonic-based imaging using second-generation contrast agents. Many reports have described the usefulness of the differential diagnosis of pancreatic diseases and other abdominal lesions. Quantitative evaluation of CE-EUS images was an objective method of diagnosis using the time-intensity curve (TIC), but it was limited to the region of interest. Recently developed Inflow Time Mapping™ can be generated from stored clips and used to display the pattern of signal enhancement with time after injection, offering temporal difference of contrast agents and improved tumor characterization. On the other hand, three-dimensional CE-EUS images added new information to the literature, but lacked positional information. Three-dimensional CE-EUS with accurate positional information is awaited. To date, most reports have been related to pancreatic lesions or lymph nodes. Hemodynamic analysis might be of use for diseases in other organs: upper GI cancer diagnosis, submucosal tumors, and biliary disorders, and it might also provide functional information. Studies of CE-EUS in diseases in many other organs will increase in the near future.
Episodic simulation of future events is impaired in mild Alzheimer's disease
Addis, Donna Rose; Sacchetti, Daniel C.; Ally, Brandon A.; Budson, Andrew E.; Schacter, Daniel L.
2009-01-01
Recent neuroimaging studies have demonstrated that both remembering the past and simulating the future activate a core neural network including the medial temporal lobes. Regions of this network, in particular the medial temporal lobes, are prime sites for amyloid deposition and are structurally and functionally compromised in Alzheimer's disease (AD). While we know some functions of this core network, specifically episodic autobiographical memory, are impaired in AD, no study has examined whether future episodic simulation is similarly impaired. We tested the ability of sixteen AD patients and sixteen age-matched controls to generate past and future autobiographical events using an adapted version of the Autobiographical Interview. Participants also generated five remote autobiographical memories from across the lifespan. Event transcriptions were segmented into distinct details, classified as either internal (episodic) or external (non-episodic). AD patients exhibited deficits in both remembering past events and simulating future events, generating fewer internal and external episodic details than healthy older controls. The internal and external detail scores were strongly correlated across past and future events, providing further evidence of the close linkages between the mental representations of past and future. PMID:19497331
Future trends in computer waste generation in India.
Dwivedy, Maheshwar; Mittal, R K
2010-11-01
The objective of this paper is to estimate the future projection of computer waste in India and to subsequently analyze their flow at the end of their useful phase. For this purpose, the study utilizes the logistic model-based approach proposed by Yang and Williams to forecast future trends in computer waste. The model estimates future projection of computer penetration rate utilizing their first lifespan distribution and historical sales data. A bounding analysis on the future carrying capacity was simulated using the three parameter logistic curve. The observed obsolete generation quantities from the extrapolated penetration rates are then used to model the disposal phase. The results of the bounding analysis indicate that in the year 2020, around 41-152 million units of computers will become obsolete. The obsolete computer generation quantities are then used to estimate the End-of-Life outflows by utilizing a time-series multiple lifespan model. Even a conservative estimate of the future recycling capacity of PCs will reach upwards of 30 million units during 2025. Apparently, more than 150 million units could be potentially recycled in the upper bound case. However, considering significant future investment in the e-waste recycling sector from all stakeholders in India, we propose a logistic growth in the recycling rate and estimate the requirement of recycling capacity between 60 and 400 million units for the lower and upper bound case during 2025. Finally, we compare the future obsolete PC generation amount of the US and India. Copyright © 2010 Elsevier Ltd. All rights reserved.
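The three-parameter logistic penetration curve used in bounding analyses of this kind can be sketched as follows; the parameter values and the single fixed lifespan are placeholders, not the paper's fitted estimates.

```python
# Sketch of a three-parameter logistic penetration curve plus a crude
# obsolete-unit estimate using a single fixed lifespan (a simplification
# that ignores replacement sales). Parameter values are placeholders.
import math

def logistic(t, K, r, t0):
    """Computers in use at year t: carrying capacity K, growth rate r,
    inflection year t0."""
    return K / (1.0 + math.exp(-r * (t - t0)))

K = 400e6      # carrying capacity, units (assumed bound)
r = 0.35       # growth rate per year (assumed)
t0 = 2015      # inflection year (assumed)
lifespan = 5   # average first lifespan in years (assumed)

for year in range(2010, 2026, 5):
    in_use = logistic(year, K, r, t0)
    # yearly increment of the in-use stock, lagged by the lifespan,
    # approximates the units becoming obsolete now
    newly_obsolete = (logistic(year - lifespan + 1, K, r, t0)
                      - logistic(year - lifespan, K, r, t0))
    print(f"{year}: in use ~ {in_use/1e6:.0f} M, newly obsolete ~ {newly_obsolete/1e6:.1f} M")
```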
Progress on the Development of Future Airport Surface Wireless Communications Network
NASA Technical Reports Server (NTRS)
Kerczewski, Robert J.; Budinger, James M.; Brooks, David E.; Franklin, Morgan; DeHart, Steve; Dimond, Robert P.; Borden, Michael
2009-01-01
Continuing advances in airport surface management and improvements in airport surface safety are required to enable future growth in air traffic throughout the airspace, as airport arrival and departure delays create a major system bottleneck. These airport management and safety advances will be built upon improved communications, navigation, surveillance, and weather sensing, creating an information environment supporting system automation. The efficient movement of the digital data generated from these systems requires an underlying communications network infrastructure to connect data sources with the intended users with the required quality of service. Current airport surface communications consists primarily of buried copper or fiber cable. Safety related communications with mobile airport surface assets occurs over 25 kHz VHF voice and data channels. The available VHF spectrum, already congested in many areas, will be insufficient to support future data traffic requirements. Therefore, a broadband wireless airport surface communications network is considered a requirement for the future airport component of the air transportation system. Progress has been made on defining the technology and frequency spectrum for the airport surface wireless communications network. The development of a test and demonstration facility and the definition of required testing and standards development are now underway. This paper will review the progress and planned future work.
Renewable Electricity Futures Study Executive Summary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mai, Trieu; Sandor, Debra; Wiser, Ryan
2012-12-01
The Renewable Electricity Futures Study (RE Futures) provides an analysis of the grid integration opportunities, challenges, and implications of high levels of renewable electricity generation for the U.S. electric system. The study is not a market or policy assessment. Rather, RE Futures examines renewable energy resources and many technical issues related to the operability of the U.S. electricity grid, and provides initial answers to important questions about the integration of high penetrations of renewable electricity technologies from a national perspective. RE Futures results indicate that a future U.S. electricity system that is largely powered by renewable sources is possible and that further work is warranted to investigate this clean generation pathway.
CW-pumped telecom band polarization entangled photon pair generation in a Sagnac interferometer.
Li, Yan; Zhou, Zhi-Yuan; Ding, Dong-Sheng; Shi, Bao-Sen
2015-11-02
Polarization entangled photon pair sources are widely used in many quantum information processing applications such as teleportation, quantum communications, quantum computation and high precision quantum metrology. We report on the generation of a continuous-wave pumped 1550 nm polarization entangled photon pair source at telecom wavelength using a type-II periodically poled KTiOPO4 (PPKTP) crystal in a Sagnac interferometer. Hong-Ou-Mandel (HOM) interference measurement yields a signal and idler photon bandwidth of 2.4 nm. High quality of entanglement is verified by various kinds of measurements, for example two-photon interference fringes, Bell inequality and quantum state tomography. The source can be tuned over a broad range against temperature or pump power without loss of visibility. This source will be used in our future experiments such as generation of an orbital angular momentum entangled source at telecom wavelength for quantum frequency up-conversion, entanglement-based quantum key distribution and many other quantum optics experiments at telecom wavelengths.
Use of Open Architecture Middleware for Autonomous Platforms
NASA Astrophysics Data System (ADS)
Naranjo, Hector; Diez, Sergio; Ferrero, Francisco
2011-08-01
Network Enabled Capabilities (NEC) is the vision for next-generation systems in the defence domain formulated by governments, the European Defence Agency (EDA) and the North Atlantic Treaty Organization (NATO). It involves the federation of military information systems, rather than just a simple interconnection, to provide each user with the "right information, right place, right time - and not too much". It defines openness, standardization and flexibility principles in military systems, likewise applicable to civilian space applications. This paper provides the conclusions drawn from the "Architecture for Embarked Middleware" (EMWARE) study, funded by the European Defence Agency (EDA). The aim of the EMWARE project was to provide the information and understanding to facilitate the adoption of informed decisions regarding the specification and implementation of Open Architecture Middleware in future distributed systems, linking it with the NEC goal. The EMWARE project included the definition of four business cases, each devoted to a different field of application (Unmanned Aerial Vehicles, Helicopters, Unmanned Ground Vehicles and the Satellite Ground Segment).
Macías Saint-Gerons, Diego; de la Fuente Honrubia, César; de Andrés Trelles, Fernando; Catalá-López, Ferrán
2016-12-01
The arrival of a new drug onto the market requires many years of prior research, along with continuous evaluation throughout the lifetime of the drug. This warrants pharmacoepidemiological research, which may be defined as the study of the use and effects of drugs in large populations. Nowadays this type of research seems more feasible thanks to the massive expansion of information sources and data (e.g., clinical patient registries, electronic medical records). However, there is a risk of information overload and fragmented evidence; given the enthusiasm aroused by "Big Data", it must be emphasized that such data are mainly observational in nature, and therefore subject to bias and confounding. The application of epidemiological methods in this scenario seems essential for any analysis. In short, managing and using these expanding data sources to generate useful information is the next challenge for the application of research methods in modern pharmacoepidemiology.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-31
... COMMODITY FUTURES TRADING COMMISSION Agency Information Collection Activities: Notice of Intent to Renew Collection, Futures Volume, Open Interest, Price, Deliveries and Exchange of Futures for Physicals AGENCY: Commodity Futures Trading Commission. ACTION: Notice. SUMMARY: The Commodity Futures Trading...
Perspectives on the future of the electric utility industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tonn, B.; Schaffhauser, A.
1994-04-01
This report offers perspectives on the future of the electric utility industry. These perspectives will be used in further research to assess the prospects for Integrated Resource Planning (IRP). The perspectives are developed first by examining economic, political and regulatory, societal, technological, and environmental trends that are (1) national and global in scope and (2) directly related to the electric utility industry. Major national and global trends include increasing global economic competition, increasing political and ethnic strife, rapidly changing technologies, and increasing worldwide concern about the environment. Major trends in the utility industry include increasing competition in generation; changing patterns of electricity demand; increasing use of information technology to control power systems; and increasing implementation of environmental controls. Ways in which the national and global trends may directly affect the utility industry are also explored. The trends are used to construct three global and national scenarios ("business as usual," "technotopia future," and "fortress state") and three electric utility scenarios ("frozen in headlights," "megaelectric," and "discomania"). The scenarios are designed to be thought-provoking descriptions of potential futures, not predictions of the future, although three key variables are identified that will have significant impacts on which future evolves: global climate change, utility technologies, and competition. While emphasis needs to be placed on understanding the electric utility scenarios, the interactions between the two sets of scenarios are also of interest.
O'Grady, Laura A; Witteman, Holly; Wathen, C Nadine
2008-01-01
Background First generation Internet technologies such as mailing lists or newsgroups afforded unprecedented levels of information exchange within a variety of interest groups, including those who seek health information. With the emergence of the World Wide Web, many communication applications were ported to web browsers. One of the driving factors in this phenomenon has been the exchange of experiential or anecdotal knowledge that patients share online, and there is emerging evidence that participation in these forums may be having an impact on people's health decision making. Theoretical frameworks supporting this form of information seeking and learning have yet to be proposed. Results In this article, we propose an adaptation of Kolb's experiential learning theory to begin to formulate an experiential health information processing model that may contribute to our understanding of online health information seeking behaviour in this context. Conclusion An experiential health information processing model is proposed that can be used as a research framework. Future research directions include investigating the utility of this model in the online health information seeking context and studying the impact of collaborating in these online environments on patient decision making and health outcomes. PMID:19087353
Funk, Eric; Riddell, Jeff; Ankel, Felix; Cabrera, Daniel
2018-06-12
Health professions educators face multiple challenges, among them the need to adapt educational methods to new technologies. In recent decades, multiple new digital platforms have appeared in the learning arena, including massive open online courses and social media-based education. The major critique of these novel methods is the inability to ascertain the origin, validity, and accountability of the knowledge that is created, shared, and acquired. Recently, a novel technology based on secured data storage and transmission, called blockchain, has emerged as a way to generate networks where validity, trust, and accountability can be created. Conceptually, blockchain is an open, public, distributed, and secure digital registry where information transactions are secured and have a clear origin, explicit pathways, and concrete value. Health professions education based on the blockchain will potentially allow improved tracking of content and of the individuals who create it, quantification of educational impact on multiple generations of learners, and the building of a relative value of educational interventions. Furthermore, institutions adopting blockchain technology would be able to provide certification and credentialing of healthcare professionals with no intermediaries. There is potential for blockchain to significantly change the future of health professions education and radically transform how patients, professionals, educators, and learners interact around safe, valid, and accountable information.
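To make the registry idea above concrete, the sketch below illustrates the general hash-chaining principle behind such tamper-evident records. It is a minimal toy example in Python with entirely hypothetical field names and institutions, not a description of any system proposed by the authors:

# Toy hash-chained credential ledger illustrating the tamper-evidence idea behind
# blockchain registries. All fields and names are hypothetical; not a production design.
import hashlib
import json
import time

def record_hash(record: dict) -> str:
    """Deterministic SHA-256 digest of a credential record."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

def append_credential(chain: list, learner: str, credential: str, issuer: str) -> None:
    """Append a credential record linked to the hash of the previous record."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = {"learner": learner, "credential": credential, "issuer": issuer,
            "timestamp": time.time(), "prev_hash": prev}
    chain.append({"body": body, "hash": record_hash(body)})

def chain_is_valid(chain: list) -> bool:
    """Verify that every record matches its hash and links to its predecessor."""
    prev = "0" * 64
    for entry in chain:
        if entry["body"]["prev_hash"] != prev or record_hash(entry["body"]) != entry["hash"]:
            return False
        prev = entry["hash"]
    return True

ledger = []
append_credential(ledger, "learner-001", "ACLS certification", "Example Health University")
append_credential(ledger, "learner-001", "Ultrasound module", "Example Health University")
print(chain_is_valid(ledger))  # True; altering any earlier record makes this False

Because each record embeds the hash of its predecessor, altering an earlier credential invalidates every later link, which is the property that makes provenance and accountability auditable.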
Mission-Oriented Sensor Arrays and UAVs - a Case Study on Environmental Monitoring
NASA Astrophysics Data System (ADS)
Figueira, N. M.; Freire, I. L.; Trindade, O.; Simões, E.
2015-08-01
This paper presents a new concept of UAV mission design in geomatics, applied to the generation of thematic maps for a multitude of civilian and military applications. We discuss the architecture of Mission-Oriented Sensor Arrays (MOSA), proposed in Figueira et al. (2013), aimed at splitting and decoupling the mission-oriented part of the system (non safety-critical hardware and software) from the aircraft control systems (safety-critical). As a case study, we present an environmental monitoring application for the automatic generation of thematic maps to track gunshot activity in conservation areas. The MOSA modeled for this application integrates information from a thermal camera and an on-the-ground microphone array. The use of microphone array technology is of particular interest in this paper. These arrays allow estimation of the direction-of-arrival (DOA) of the incoming sound waves. Information about events of interest is obtained by fusing the data provided by the microphone array, captured by the UAV, with information from the thermal image processing. Preliminary results show the feasibility of the on-the-ground sound processing array and the simulation of the main processing module, to be embedded into a UAV in future work. The main contributions of this paper are the proposed MOSA system, including concepts, models and architecture.
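As a minimal illustration of the direction-of-arrival idea mentioned above (a two-microphone, far-field sketch in Python under assumed spacing and sample rate, not the authors' processing chain), the bearing of an incoming sound wave can be estimated from the time difference of arrival between two microphones:

# Minimal far-field DOA estimate from a two-microphone time difference of arrival.
# Geometry and sample rate are assumed values; a real deployment would use a full array.
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s, approximate at 20 degrees C
MIC_SPACING = 0.5        # m between the two microphones (assumed)
SAMPLE_RATE = 48_000     # Hz (assumed)

def estimate_doa(sig_a: np.ndarray, sig_b: np.ndarray) -> float:
    """Return an angle in degrees relative to broadside; sign depends on array geometry."""
    # Cross-correlate to find the lag (in samples) that best aligns the two channels.
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = np.argmax(corr) - (len(sig_b) - 1)
    tdoa = lag / SAMPLE_RATE                       # seconds
    sin_theta = np.clip(SPEED_OF_SOUND * tdoa / MIC_SPACING, -1.0, 1.0)
    return float(np.degrees(np.arcsin(sin_theta)))

# Synthetic check: a gunshot-like impulse reaching mic B 10 samples before mic A.
pulse = np.hanning(64)
sig_a = np.zeros(4096); sig_a[1010:1074] = pulse
sig_b = np.zeros(4096); sig_b[1000:1064] = pulse
print(round(estimate_doa(sig_a, sig_b), 1))  # roughly 8.2 degrees under these assumptions

A real MOSA deployment would use a larger array and more robust correlation, but the geometry-to-angle relationship illustrated here is the same.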
Bridge over troubled waters: A Synthesis Session to connect ...
Lack of access to relevant scientific data has limited decision makers from incorporating scientific information into their management and policy schemes. Yet, there is increasing interest among decision makers and scientists to integrate coastal and marine science into the policy and management process. Strategies designed to build communication between decision makers and scientists can be an effective means to disseminate and/or generate policy relevant scientific information. Here researchers develop, test, and present a workshop model designed to bridge the gap between coastal and marine decision makers and scientists. Researchers identify successful components of such a workshop as well as areas for improvement and recommendations to design and conduct similar workshops in the future. This novel workshop format can be used in other fora to effectively connect decision makers and scientists, and to initiate an iterative process to generate and transfer policy relevant scientific information into evidence-based decisions, an important element in protecting coastal and marine resources. In this paper we develop and present a model for increasing collaboration between scientists and decision makers to promote evidence based decisions. Successes and areas for improvement in the tested model are discussed. This novel workshop model is intended to build and sustain connections, with the ultimate goal of creating better policy and management practices.
Information literacy: using LISTEN project strategies to equip nurses worldwide.
Patterson, Ramona; Carter-Templeton, Heather; Russell, Cynthia
2009-01-01
The 21st century presents a major challenge in the form of information overload. In a profession where new knowledge is ever expanding, nurse educators must equip nurses to find the information they need to provide safe, evidence-based care. Information literacy and information technology competencies have become a priority in nursing education, but inconsistencies in definitions, frameworks, content, and design, combined with ill-equipped faculty, have hindered the development of a transferable model geared toward improving nurses' information literacy. Challenges are compounded for nurses in developing nations, where access to information and training for information literacy are both problematic. This paper describes experiences from the LISTEN project during the first year of a 3-year funded Nurse Education Practice and Retention grant. Designed to improve information literacy competencies of student and workforce nurses, using individualized learning via interactive web-based modules, LISTEN provides on its website a "Did You Know" video dramatizing the importance of information literacy to nurses, and offers resources for information literacy, information technology, and evidence-based nursing practice. Preliminary findings from beta testing reveal the module content is realistic, complete, and logical. The website and video have generated worldwide interest. Future possibilities include nationwide implementation and adaptation for the international arena.
Predictive Technologies: Can Smart Tools Augment the Brain's Predictive Abilities?
Pezzulo, Giovanni; D'Ausilio, Alessandro; Gaggioli, Andrea
2016-01-01
The ability to “look into the future”, namely the capacity to anticipate future states of the environment or of the body, represents a fundamental function of human (and animal) brains. A goalkeeper who tries to guess the ball's direction, a chess player who attempts to anticipate the opponent's next move, or a man in love who tries to calculate the chances of her saying yes: in all these cases, people are simulating possible future states of the world in order to maximize the success of their decisions or actions. Research in neuroscience is showing that our ability to predict the behavior of physical or social phenomena is largely dependent on the brain's ability to integrate current and past information to generate (probabilistic) simulations of the future. But could predictive processing be augmented using advanced technologies? In this contribution, we discuss how computational technologies may be used to support, facilitate or enhance the prediction of future events, by considering exemplificative scenarios across different domains, from simpler sensorimotor decisions to more complex cognitive tasks. We also examine the key scientific and technical challenges that must be faced to turn this vision into reality. PMID:27199648
DMT-TAFM: a data mining tool for technical analysis of futures market
NASA Astrophysics Data System (ADS)
Stepanov, Vladimir; Sathaye, Archana
2002-03-01
Technical analysis of financial markets describes many patterns of market behavior. For practical use, all these descriptions need to be adjusted for each particular trading session. In this paper, we develop a data mining tool for technical analysis of the futures markets (DMT-TAFM), which dynamically generates rules based on the notion of price pattern similarity. The tool consists of three main components. The first component provides visualization of data series on a chart with different ranges, scales, and chart sizes and types. The second component constructs pattern descriptions using sets of polynomials. The third component specifies the training set for mining, defines the similarity notion, and searches for a set of similar patterns. DMT-TAFM is useful to prepare the data, and then reveal and systematize statistical information about similar patterns found in any type of historical price series. We performed experiments with our tool on three decades of trading data for a hundred types of futures. Our results for this data set show that we can prove or disprove many well-known patterns based on real data, reveal new ones, and use the set of relatively consistent patterns found during data mining to develop better futures trading strategies.
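The pattern-similarity notion described above can be sketched as follows; this is a simplified Python illustration with assumed window length, polynomial degree, and tolerance, not the tool's actual implementation. Each normalized price window is summarized by the coefficients of a fitted low-order polynomial, and windows whose coefficient vectors lie close to a template's are treated as similar patterns:

# Simplified sketch of polynomial-based price-pattern similarity.
# Window length, polynomial degree, and tolerance are illustrative assumptions.
import numpy as np

WINDOW = 20   # bars per pattern window
DEGREE = 3    # order of the fitted polynomial

def pattern_signature(prices: np.ndarray) -> np.ndarray:
    """Fit a low-order polynomial to a normalized price window and return its coefficients."""
    x = np.linspace(0.0, 1.0, len(prices))
    y = (prices - prices.mean()) / (prices.std() + 1e-9)   # scale-invariant comparison
    return np.polyfit(x, y, DEGREE)

def find_similar_windows(series: np.ndarray, template: np.ndarray, tol: float = 0.5):
    """Return start indices of windows whose signatures are within `tol` of the template's."""
    target = pattern_signature(template)
    hits = []
    for start in range(len(series) - WINDOW + 1):
        sig = pattern_signature(series[start:start + WINDOW])
        if np.linalg.norm(sig - target) < tol:
            hits.append(start)
    return hits

# Example with synthetic data standing in for historical futures prices.
rng = np.random.default_rng(0)
prices = np.cumsum(rng.normal(0, 1, 500)) + 100.0
template = prices[100:100 + WINDOW]
print(find_similar_windows(prices, template)[:5])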
Patient Health Record Systems Scope and Functionalities: Literature Review and Future Directions
2017-01-01
Background A new generation of user-centric information systems is emerging in health care as patient health record (PHR) systems. These systems create a platform supporting the new vision of health services that empowers patients and enables patient-provider communication, with the goal of improving health outcomes and reducing costs. This evolution has generated new sets of data and capabilities, providing opportunities and challenges at the user, system, and industry levels. Objective The objective of our study was to assess PHR data types and functionalities through a review of the literature to inform the health care informatics community, and to provide recommendations for PHR design, research, and practice. Methods We conducted a review of the literature to assess PHR data types and functionalities. We searched PubMed, Embase, and MEDLINE databases from 1966 to 2015 for studies of PHRs, resulting in 1822 articles, from which we selected a total of 106 articles for a detailed review of PHR data content. Results We present several key findings related to the scope and functionalities in PHR systems. We also present a functional taxonomy and chronological analysis of PHR data types and functionalities, to improve understanding and provide insights for future directions. Functional taxonomy analysis of the extracted data revealed the presence of new PHR data sources such as tracking devices and data types such as time-series data. Chronological data analysis showed an evolution of PHR system functionalities over time, from simple data access to data modification and, more recently, automated assessment, prediction, and recommendation. Conclusions Efforts are needed to improve (1) PHR data quality through patient-centered user interface design and standardized patient-generated data guidelines, (2) data integrity through consolidation of various types and sources, (3) PHR functionality through application of new data analytics methods, and (4) metrics to evaluate clinical outcomes associated with automated PHR system use, and costs associated with PHR data storage and analytics. PMID:29141839
Jeunehomme, Olivier; D'Argembeau, Arnaud
2016-01-01
Recent research suggests that episodic future thoughts can be formed through the same dual mechanisms, direct and generative, as autobiographical memories. However, the prevalence and determinants of the direct production of future event representations remain unclear. Here, we addressed this issue by collecting self-reports of production modes, response times (RTs), and verbal protocols for the production of past and future events in the word cueing paradigm. Across three experiments, we found that both past and future events were frequently reported to come directly to mind in response to the cue, and RTs confirmed that events were produced faster for direct than for generative responses. When looking at the determinants of direct responses, we found that most past and future events that were directly produced had already been thought of on a previous occasion, and the frequency of previous thoughts predicted the occurrence of direct access. The direct production of autobiographical thoughts was also more frequent for past and future events that were judged important and emotionally intense. Collectively, these findings provide novel evidence that the direct production of episodic future thoughts is frequent in the word cueing paradigm and often involves the activation of personally significant "memories of the future."
de Cates, Angharad N.; Broome, Matthew R.
2016-01-01
Over 800,000 people die by suicide each year globally, with non-fatal self-harm 20 times more common. With each episode of self-harm, the risks of future self-harm and suicide increase, as well as personal and healthcare costs. Therefore, early delineation of those at high risk of future self-harm is important. Historically, research has focused on clinical and demographic factors, but risk assessments based on these have low sensitivity to predict repetition. Various neurocognitive factors have been associated with self-harming behavior, but it is less certain if we can use these factors clinically (i) as risk markers to predict future self-harm and (ii) to become therapeutic targets for interventions. Recent systematic reviews and meta-analyses of behavioral tasks and fMRI studies point to an emerging hypothesis for neurocognition in self-harm: an underactive pre-frontal cortex is unable to respond appropriately to non-emotional stimuli, or inhibit a hyperactive emotionally-/threat-driven limbic system. However, there is almost no imaging data examining repetition of self-harm. Extrapolating from the non-repetition data, there may be several potential neurocognitive targets for interventions to prevent repeat self-harm: cognitive training; pharmacological regimes to promote non-emotional neurocognition; or other techniques, such as repetitive transcranial magnetic stimulation. Hence, there is an urgent need for imaging studies examining repetition and to test specific hypotheses. Until we investigate the functional neurocognitive basis underlying repetition of self-harm in a systematic manner using second-generational imaging techniques, we will be unable to inform third-generational imaging and potential future clinical applications. PMID:26858659
Changing Course: navigating the future of the Lower Mississippi River
NASA Astrophysics Data System (ADS)
Cochran, S.
2016-02-01
Changing Course is a design competition to reimagine a more sustainable Lower Mississippi River Delta, bringing teams together from around the world to create innovative visions for one of America's greatest natural resources. Building off of Louisiana's Coastal Master Plan, and answering a key question from that plan, three winning teams (Baird & Associates, Moffatt & Nichol and Studio Misi-Ziibi) have generated designs for how the Mississippi River's water and sediment can be used to maximize rebuilding of delta wetlands while also continuing to meet the needs of navigation, flood protection, and coastal industries and communities. While each of the winning teams offered a different vision, all three identified the same key requirements as critical to sustaining the Mississippi River Delta today and into the future: Reconnecting the Mississippi River to its wetlands to help restore southeast Louisiana's first line of defense against powerful storms and rising sea levels. Planning for a more sustainable delta, including a gradual shift in population to create more protected and resilient communities. Protecting and maximizing the region's port and maritime activities, including a deeper more sustainable navigation channel upriver from Southwest Pass. Increasing economic opportunities in a future smaller delta through expanding shipping capacity, coastal restoration infrastructure, outdoor recreation and tourism and commercial fishing. This session will give a high level overview of the design competition process, results and common themes, similarities and differences in their designs, and how the ideas generated will inform coastal stakeholders and official government processes.
IMAGES: An IMage Archive Generated for Exoplanet Surveys
NASA Astrophysics Data System (ADS)
Tanner, A.
2010-10-01
In the past few years, there has been a menagerie of high contrast imaging surveys, which have resulted in the detection of the first brown dwarfs orbiting main sequence stars and the first directly imaged exo-planetary systems. While these discoveries are scientifically rewarding, they are rare, and the majority of the images collected during these surveys show single target stars. In addition, while papers will report the number of companion non-detections down to a sensitivity limit at a specific distance from the star, the corresponding images are rarely made available to the public. To date, such data exists for over a thousand stars. Thus, we are creating IMAGES, the IMage Archive Generated for Exoplanet Searches, as a repository for high contrast images gathered from published direct imaging sub-stellar and exoplanet companion surveys. This database will serve many purposes, such as 1) facilitating common proper motion confirmation for candidate companions, 2) reducing the number of redundant observations of non-detection fields, 3) providing multiplicity precursor information to better select targets for future exoplanet missions, 4) providing stringent limits on the companion fraction of stars for a wide range of age, spectral type and star formation environment, and 5) providing multi-epoch images of stars with known companions for orbital monitoring. This database will be open to the public, will be searchable and sortable, and will be extremely useful for future direct imaging programs such as GPI and SPHERE, as well as future planet search programs such as JWST and SIM.
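The kind of per-image record such an archive might hold can be pictured with the hypothetical Python sketch below; every field name and value is illustrative only and does not reflect the actual IMAGES schema:

# Hypothetical metadata record for one archived high-contrast image; all fields are
# illustrative assumptions, not the real IMAGES database schema.
from dataclasses import dataclass

@dataclass
class HighContrastImageRecord:
    target_name: str           # host star identifier
    epoch_jd: float            # observation epoch (Julian date)
    instrument: str            # imaging instrument used for the survey
    filter_band: str           # photometric band of the exposure
    contrast_5sigma: dict      # separation (arcsec) -> achievable contrast limit
    candidates_detected: int   # number of candidate companions in the field
    reference: str             # published survey the image came from

record = HighContrastImageRecord(
    target_name="HD 000000",   # placeholder identifier
    epoch_jd=2455000.5,
    instrument="example-AO-imager",
    filter_band="H",
    contrast_5sigma={0.5: 1e-4, 1.0: 1e-5},
    candidates_detected=0,
    reference="Example Survey (2009)",
)
print(record.target_name, record.contrast_5sigma[1.0])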
NASA Astrophysics Data System (ADS)
Rickels, W.; Visbeck, M.; Kronfeld-Goharani, U.; Neumann, B.; Schmidt, J.; van Doorn, E.; Matz-Lück, N.; Ott, K.; Quaas, M.
2013-12-01
The ocean regulates the global climate, provides humans with natural resources such as food, materials, important substances, and energy, and is essential for international trade and recreational and cultural activities. Together with human development and economic growth, free access to, and availability of, ocean resources and services have exerted strong pressure on marine systems, ranging from overfishing, increasing resource extraction, and alteration of coastal zones to various types of thoughtless pollution. International cooperation and effective governance are required to protect the marine environment and promote the sustainable use of marine resources in such a way that due account can be taken of the environmental values of current generations and the needs of future generations. For this purpose, developing and agreeing to devote one of the Sustainable Development Goals (SDGs) specifically to the Ocean and Coasts could prove to be an essential element. The new SDGs will build upon the Millennium Development Goals (MDGs) and replace them by 2015. Ensuring environmental sustainability in a general sense is one of the eight MDGs, but the ocean is not explicitly addressed. Furthermore, the creation of a comprehensive underlying set of ocean sustainability targets and effective indicators would help in assessing the current status of marine systems, diagnosing ongoing trends, and providing information for inclusive, forward-looking, and sustainable ocean governance. To achieve this, we propose to establish a global Future Ocean Spatial Planning (FOSP) process.
The impact of H2S emissions on future geothermal power generation - The Geysers region, California
NASA Technical Reports Server (NTRS)
Leibowitz, L. P.
1977-01-01
The future potential for geothermal power generation in the Geysers region of California is as much as 10 times the current 502 MW(e) capacity. However, environmental factors such as H2S emissions and institutional considerations may play the primary role in determining the rate and ultimate level of development. In this paper a scenario of future geothermal generation capacity and H2S emissions in the Geysers region is presented. Problem areas associated with H2S emissions, H2S abatement processes, plant operations, and government agency resources are described. The impact of H2S emissions on future development and the views of affected organizations are discussed. Potential actions needed to remove these constraints are summarized.
Palombo, D J; Keane, M M; Verfaellie, M
2016-08-01
The capacity to envision the future plays an important role in many aspects of cognition, including our ability to make optimal, adaptive choices. Past work has shown that the medial temporal lobe (MTL) is necessary for decisions that draw on episodic future thinking. By contrast, little is known about the role of the MTL in decisions that draw on semantic future thinking. Accordingly, the present study investigated whether the MTL contributes to one form of decision making, namely intertemporal choice, when such decisions depend on semantic consideration of the future. In an intertemporal choice task, participants must select either a smaller amount of money that is available in the present or a larger amount of money that would be available at a future date. Amnesic individuals with MTL damage and healthy control participants performed such a task in which, prior to making a choice, they engaged in a semantic generation exercise, wherein they generated items that they would purchase with the future reward. In experiment 1, we found that, relative to a baseline condition involving standard intertemporal choice, healthy individuals were more inclined to select a larger, later reward over a smaller, present reward after engaging in semantic future thinking. By contrast, amnesic participants were paradoxically less inclined to wait for a future reward following semantic future thinking. This finding suggests that amnesics may have had difficulty "tagging" the generated item(s) as belonging to the future. Critically, experiment 2 showed that when the generated items were presented alongside the intertemporal choices, both controls and amnesic participants shifted to more patient choices. These findings suggest that the MTL is not needed for making optimal decisions that draw on semantic future thinking as long as scaffolding is provided to support accurate time tagging. Together, these findings stand to better clarify the role of the MTL in decision making. Published by Elsevier Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weinberg, A.M.
Today, for the first time, scientific concerns are being seriously addressed that span hundreds, thousands, or even more years into the future. One is witnessing what the author calls scientific millenarianism. Are such concerns for the distant future exercises in futility, or are they real issues that, to the everlasting gratitude of future generations, this generation has identified, warned about, and even suggested how to cope with? Can the four potential catastrophes (bolide impact, CO2 warming, radioactive wastes, and thermonuclear war) be avoided by technical fixes, institutional responses, religion, or by doing nothing? These are the questions addressed in this paper.
Sensitivity of Regional Hydropower Generation to the Projected Changes in Future Watershed Hydrology
NASA Astrophysics Data System (ADS)
Kao, S. C.; Naz, B. S.; Gangrade, S.
2015-12-01
Hydropower is a key contributor to the renewable energy portfolio due to its established development history and the diverse benefits it provides to the electric power systems. With the projected changes in future watershed hydrology, including shifts in snowmelt timing, increasing occurrence of extreme precipitation, and changes in drought frequency, there is a need to investigate how regional hydropower generation may change correspondingly. To evaluate the sensitivity of watershed storage and hydropower generation to future climate change, a lumped Watershed Runoff-Energy Storage (WRES) model is developed to simulate the annual and seasonal hydropower generation at various hydropower areas in the United States. For each hydropower study area, the WRES model uses the monthly precipitation and naturalized (unregulated) runoff as inputs to perform a runoff mass balance calculation for the total monthly runoff storage in all reservoirs and retention facilities in the watershed, and simulates the monthly regulated runoff release and hydropower generation through the system. The WRES model is developed and calibrated using the historic (1980-2009) monthly precipitation, runoff, and generation data, and then driven by a large set of dynamically- and statistically-downscaled Coupled Model Intercomparison Project Phase 5 climate projections to simulate the change of watershed storage and hydropower generation under different future climate scenarios. The results among different hydropower regions, storage capacities, emission scenarios, and timescales are compared and discussed in this study.
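The monthly mass-balance bookkeeping described for the WRES model can be illustrated with the minimal Python sketch below, which uses a single aggregate reservoir and placeholder coefficients rather than the calibrated model:

# Minimal monthly reservoir mass balance and hydropower sketch in the spirit of a lumped
# watershed runoff-energy storage model. All coefficients and rules are illustrative.
def simulate_hydropower(inflows_m3, capacity_m3, target_release_m3, efficiency=0.85,
                        head_m=50.0, rho=1000.0, g=9.81):
    """Return monthly generation (MWh) for a single aggregate reservoir."""
    seconds_per_month = 30.4 * 24 * 3600
    storage = 0.5 * capacity_m3                             # assumed initial storage
    generation_mwh = []
    for inflow in inflows_m3:
        storage += inflow
        release = min(target_release_m3, storage)           # cannot release more than stored
        storage -= release
        spill = max(0.0, storage - capacity_m3)             # excess is spilled, not generated
        storage -= spill
        power_w = efficiency * rho * g * head_m * (release / seconds_per_month)
        generation_mwh.append(power_w * seconds_per_month / 3.6e9)
    return generation_mwh

# Example: a wet season followed by a dry season (monthly naturalized inflow, m^3).
inflows = [4e8, 6e8, 8e8, 5e8, 2e8, 1e8]
print([round(g_mwh, 1) for g_mwh in simulate_hydropower(inflows, capacity_m3=1.2e9,
                                                        target_release_m3=3e8)])

A climate-sensitivity experiment of the kind described above would then rerun this bookkeeping with downscaled precipitation and runoff projections in place of the synthetic inflows.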
Preserving Our Legacy for Future Generations of Educators
ERIC Educational Resources Information Center
Hearn, Colleen Porter; Crabtree, Kacy E.
2008-01-01
Preserving dance history for future generations includes documenting and maintaining the life and work of dance pioneers who today's dance educators can learn from and imitate. This article offers basic guidelines for conducting interviews; preserving valuable documentation, including photographs and recordings; and unearthing forgotten stories…
Gates, Bob; Statham, Mark
2013-10-01
In England, the numbers of learning disability nurses are declining; a need for urgent attention to workforce planning issues has been advocated. This paper considers the views of lecturers, students and potential students as legitimate stakeholders for future education commissioning for this field of nursing. This project aimed to undertake a strategic review of learning disability nursing educational commissioning, to provide an 'evidence based' evaluation to inform future strategic commissioning of learning disability nursing for one Health Authority, UK. The project adopted a structured multiple methods approach to generate evidence from a number of data sources; this paper reports on the findings from one method (focus groups) used with two groups of stakeholders. Informants comprised 10 learning disability nursing students studying at a Higher Education Institution, 25 health and social care students studying at a Further Education College, and 6 academic staff from 5 universities; all informants were from the south of England. Once completed, transcripts were read in full and subjected to content analysis. The process of content analysis led to the development of 11 theoretical categories that describe the multiplicity of informants' views on issues of importance for this element of the health workforce. The paper concludes by identifying key messages from these informants. It is suggested that both the method and findings have national and international resonance, as stakeholder engagement is a universal issue in health care education commissioning. Copyright © 2013 Elsevier Ltd. All rights reserved.
A Survey on Security and Privacy in Emerging Sensor Networks: From Viewpoint of Close-Loop
Zhang, Lifu; Zhang, Heng
2016-01-01
Nowadays, as the next generation of sensor networks, Cyber-Physical Systems (CPSs) refer to complex networked systems that have both physical subsystems and cyber components, with information flowing between the different subsystems and components across a communication network, forming a closed loop. New-generation sensor networks are found in a growing number of applications and have received increasing attention from many disciplines. Opportunities and challenges in the design, analysis, verification and validation of sensor networks co-exist, among which security and privacy are two important ingredients. This paper presents a survey of some recent results on the security and privacy aspects of emerging sensor networks from the viewpoint of the closed loop. This paper also discusses several future research directions under these two umbrellas. PMID:27023559
Scoppettone, G.G.; Johnson, D.M.; Hereford, M.E.; Rissler, Peter; Fabes, Mark; Salgado, Antonio; Shea, Sean
2012-01-01
Habitat restoration that favors native species can help control non-native species (McShane and others, 2004; Scoppettone and others, 2005; Kennedy and others, 2006). Restoration of Carson Slough and its tributaries present an opportunity to promote habitat types that favor native species over non-natives. Historically, the majority of Ash Meadows spring systems were tributaries to Carson Slough. In 2007 and 2008, a survey of Ash Meadows spring systems was conducted to generate baseline information on the distribution of fishes throughout AMNWR (Scoppettone and others, 2011b). In this study, we conducted a follow-up survey with emphasis on upper Carson Slough. This permitted us to gauge the early effects of spring system restoration on fish populations and to generate further baseline data relevant to future restoration efforts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ming, Yang; Wu, Zi-jian; Xu, Fei, E-mail: feixu@nju.edu.cn
The nonmaximally entangled state is a special kind of entangled state, which has important applications in quantum information processing. It has been generated in quantum circuits based on bulk optical elements. However, corresponding schemes in integrated quantum circuits have been rarely considered. In this Letter, we propose an effective solution for this problem. An electro-optically tunable nonmaximally mode-entangled photon state is generated in an on-chip domain-engineered lithium niobate (LN) waveguide. Spontaneous parametric down-conversion and electro-optic interaction are effectively combined through suitable domain design to transform the entangled state into our desired formation. Moreover, this is a flexible approach to entanglement architectures. Other kinds of reconfigurable entanglements are also achievable through this method. LN provides a very promising platform for future quantum circuit integration.
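For readers unfamiliar with the term, a nonmaximally entangled two-mode state can be written in the following generic textbook form (not the specific state reported here); in such tunable schemes it is typically the amplitude balance that is adjusted:

\[
  \lvert \psi \rangle = \cos\theta\,\lvert 1 \rangle_{a}\lvert 0 \rangle_{b} + e^{i\varphi}\sin\theta\,\lvert 0 \rangle_{a}\lvert 1 \rangle_{b}, \qquad \theta \neq \tfrac{\pi}{4},
\]

where a and b label the two waveguide modes; the state is maximally entangled only when θ = π/4.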
NASA Astrophysics Data System (ADS)
Rodriguez, Sarah L.; Lehman, Kathleen
2017-10-01
This theoretical paper explores the need for enhanced, intersectional computing identity theory for the purpose of developing a diverse group of computer scientists for the future. Greater theoretical understanding of the identity formation process specifically for computing is needed in order to understand how students come to understand themselves as computer scientists. To ensure that the next generation of computer scientists is diverse, this paper presents a case for examining identity development intersectionally, understanding the ways in which women and underrepresented students may have difficulty identifying as computer scientists and be systematically oppressed in their pursuit of computer science careers. Through a review of the available scholarship, this paper suggests that creating greater theoretical understanding of the computing identity development process will inform the way in which educational stakeholders consider computer science practices and policies.