Current and Emerging Technology Approaches in Genomics
Conley, Yvette P.; Biesecker, Leslie G.; Gonsalves, Stephen; Merkle, Carrie J.; Kirk, Maggie; Aouizerat, Bradley E.
2013-01-01
Purpose To introduce current and emerging approaches being utilized in the field of genomics so the reader can conceptually evaluate the literature and appreciate how these approaches are advancing our understanding of health-related issues. Organizing Construct Each approach is described, including how it is advancing research, its potential clinical utility, exemplars of current uses, challenges related to the technologies involved, and, where appropriate, information for understanding the evidence base for its clinical utilization. Web-based resources are included for readers who would like more in-depth information and an opportunity to stay up to date with these approaches and their utility. Conclusions The chosen approaches (genome sequencing, genome-wide association studies, epigenomics, and gene expression) are extremely valuable for collecting research data that help us better understand the pathophysiology of a variety of health-related conditions, and they are also gaining utility for clinical assessment and testing purposes. Clinical Relevance Our increased understanding of the molecular underpinnings of disease will assist with the development of better screening tests, diagnostic tests, prognostic tests, tests that allow for individualized treatments, and tests that facilitate post-treatment surveillance. PMID:23294727
Expected Utility Based Decision Making under Z-Information and Its Application.
Aliev, Rashad R; Mraiziq, Derar Atallah Talal; Huseynov, Oleg H
2015-01-01
Real-world decision-relevant information is often partially reliable. The reasons are partial reliability of the source of information, misperceptions, psychological biases, incompetence, and so forth. Z-numbers based formalization of information (Z-information) represents a natural language (NL) based value of a variable of interest in line with the related NL based reliability. What is important is that Z-information not only is the most general representation of real-world imperfect information but also has the highest descriptive power from the human perception point of view as compared to fuzzy numbers. In this study, we present an approach to decision making under Z-information based on direct computation over Z-numbers. This approach utilizes the expected utility paradigm and is applied to a benchmark decision problem in the field of economics.
The novel use of climate information in water utility planning
NASA Astrophysics Data System (ADS)
Yates, D. N.
2016-12-01
Municipal water utilities have a long history of planning, and yet their traditional use of climate information has been rather static in nature, relying on approaches such as 'safe yield' to design their water infrastructure. New planning paradigms, such as triple-bottom-line approaches that integrate environmental, social, and financial aspects of the water enterprise, have led water utilities to use climate information in a much richer and more informative way. This presentation will describe examples of how climate information, hydrologic modeling, and water systems decision support tools are uniquely blended to help water utilities make informed decisions.
Analytics that Inform the University: Using Data You Already Have
ERIC Educational Resources Information Center
Dziuban, Charles; Moskal, Patsy; Cavanagh, Thomas; Watts, Andre
2012-01-01
The authors describe the University of Central Florida's top-down/bottom-up action analytics approach to using data to inform decision-making. The top-down approach utilizes information about programs, modalities, and college implementation of Web initiatives. The bottom-up approach continuously monitors…
Petrou, Stavros; Kwon, Joseph; Madan, Jason
2018-05-10
Economic analysts are increasingly likely to rely on systematic reviews and meta-analyses of health state utility values to inform the parameter inputs of decision-analytic modelling-based economic evaluations. Beyond the context of economic evaluation, evidence from systematic reviews and meta-analyses of health state utility values can be used to inform broader health policy decisions. This paper provides practical guidance on how to conduct a systematic review and meta-analysis of health state utility values. The paper outlines a number of stages in conducting a systematic review, including identifying the appropriate evidence, study selection, data extraction and presentation, and quality and relevance assessment. The paper outlines three broad approaches that can be used to synthesise multiple estimates of health utilities for a given health state or condition, namely fixed-effect meta-analysis, random-effects meta-analysis and mixed-effects meta-regression. Each approach is illustrated by a synthesis of utility values for a hypothetical decision problem, and software code is provided. The paper highlights a number of methodological issues pertinent to the conduct of meta-analysis or meta-regression. These include the importance of limiting synthesis to 'comparable' utility estimates, for example those derived using common utility measurement approaches and sources of valuation; the effects of reliance on limited or poorly reported published data from primary utility assessment studies; the use of aggregate outcomes within analyses; approaches to generating measures of uncertainty; handling of median utility values; challenges surrounding the disentanglement of utility estimates collected serially within the context of prospective observational studies or prospective randomised trials; challenges surrounding the disentanglement of intervention effects; and approaches to measuring model validity. Areas of methodological debate and avenues for future research are highlighted.
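As a rough, hedged illustration of the random-effects synthesis outlined in this abstract (the paper's own software code is not reproduced here), the Python sketch below pools utility estimates for a single health state with a DerSimonian-Laird estimator; the study values and standard errors are invented.

```python
import numpy as np

def random_effects_pool(utilities, std_errors):
    """Pool utility estimates with a DerSimonian-Laird random-effects model.

    utilities  : per-study mean utility values (hypothetical)
    std_errors : their standard errors
    Returns the pooled utility and its standard error.
    """
    u = np.asarray(utilities, dtype=float)
    se = np.asarray(std_errors, dtype=float)
    w = 1.0 / se**2                                  # fixed-effect (inverse-variance) weights
    u_fixed = np.sum(w * u) / np.sum(w)              # fixed-effect pooled estimate
    q = np.sum(w * (u - u_fixed)**2)                 # Cochran's Q heterogeneity statistic
    df = len(u) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                    # between-study variance (DerSimonian-Laird)
    w_star = 1.0 / (se**2 + tau2)                    # random-effects weights
    u_pooled = np.sum(w_star * u) / np.sum(w_star)
    se_pooled = np.sqrt(1.0 / np.sum(w_star))
    return u_pooled, se_pooled

# Hypothetical utility values for a single health state reported by four studies.
pooled, se = random_effects_pool([0.78, 0.71, 0.84, 0.75], [0.03, 0.05, 0.04, 0.02])
print(f"pooled utility = {pooled:.3f} (SE {se:.3f})")
```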
Image Information Mining Utilizing Hierarchical Segmentation
NASA Technical Reports Server (NTRS)
Tilton, James C.; Marchisio, Giovanni; Koperski, Krzysztof; Datcu, Mihai
2002-01-01
The Hierarchical Segmentation (HSEG) algorithm is an approach for producing high quality, hierarchically related image segmentations. The VisiMine image information mining system utilizes clustering and segmentation algorithms for reducing visual information in multispectral images to a manageable size. The project discussed herein seeks to enhance the VisiMine system through incorporating hierarchical segmentations from HSEG into the VisiMine system.
Utility-preserving transaction data anonymization with low information loss.
Loukides, Grigorios; Gkoulalas-Divanis, Aris
2012-08-01
Transaction data record various information about individuals, including their purchases and diagnoses, and are increasingly published to support large-scale and low-cost studies in domains such as marketing and medicine. However, the dissemination of transaction data may lead to privacy breaches, as it allows an attacker to link an individual's record to their identity. Approaches that anonymize data by eliminating certain values in an individual's record or by replacing them with more general values have been proposed recently, but they often produce data of limited usefulness. This is because these approaches adopt value transformation strategies that do not guarantee data utility in intended applications and objective measures that may lead to excessive data distortion. In this paper, we propose a novel approach for anonymizing data in a way that satisfies data publishers' utility requirements and incurs low information loss. To achieve this, we introduce an accurate information loss measure and an effective anonymization algorithm that explores a large part of the problem space. An extensive experimental study, using click-stream and medical data, demonstrates that our approach permits many times more accurate query answering than the state-of-the-art methods, while it is comparable to them in terms of efficiency.
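To make the notion of information loss under generalization concrete, here is a minimal, generic sketch (not the measure proposed in the paper) that scores each generalized item by the fraction of its domain it covers and averages over a transaction; the domain size and counts are illustrative.

```python
def value_loss(num_leaves_covered, domain_size):
    """Information loss of one generalized value: 0 when the value is left
    intact (covers a single leaf), 1 when it is generalized to the whole domain."""
    if domain_size <= 1:
        return 0.0
    return (num_leaves_covered - 1) / (domain_size - 1)

def record_loss(generalized_record, domain_size):
    """Average loss over all (possibly generalized) items in one transaction."""
    if not generalized_record:
        return 0.0
    return sum(value_loss(n, domain_size) for n in generalized_record) / len(generalized_record)

# A transaction whose three items were generalized to nodes covering 1, 4 and 2
# leaf values of a 10-value item domain (all numbers are illustrative).
print(record_loss([1, 4, 2], domain_size=10))
```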
Utility-preserving transaction data anonymization with low information loss
Loukides, Grigorios; Gkoulalas-Divanis, Aris
2012-01-01
Transaction data record various information about individuals, including their purchases and diagnoses, and are increasingly published to support large-scale and low-cost studies in domains such as marketing and medicine. However, the dissemination of transaction data may lead to privacy breaches, as it allows an attacker to link an individual’s record to their identity. Approaches that anonymize data by eliminating certain values in an individual’s record or by replacing them with more general values have been proposed recently, but they often produce data of limited usefulness. This is because these approaches adopt value transformation strategies that do not guarantee data utility in intended applications and objective measures that may lead to excessive data distortion. In this paper, we propose a novel approach for anonymizing data in a way that satisfies data publishers’ utility requirements and incurs low information loss. To achieve this, we introduce an accurate information loss measure and an effective anonymization algorithm that explores a large part of the problem space. An extensive experimental study, using click-stream and medical data, demonstrates that our approach permits many times more accurate query answering than the state-of-the-art methods, while it is comparable to them in terms of efficiency. PMID:22563145
Information Literacy in Oman's Higher Education: A Descriptive-Inferential Approach
ERIC Educational Resources Information Center
Al-Aufi, Ali; Al-Azri, Hamed
2013-01-01
This study aims to identify the current status of information literacy among students in their final year at Sultan Qaboos University, using the Big6 model for solving information problems. The study utilizes a self-assessment survey approach, with a questionnaire as the tool for data collection. It surveyed undergraduate students of…
Evaluating the Value of Information in the Presence of High Uncertainty
2013-06-01
in this hierarchy is subsumed in the Knowledge and Information layers. If information with high expected value is identified, it then passes up...be, the higher is its value. Based on the idea of expected utility of asking a question [36], Nelson [31] discusses different approaches for...18] formalizes the expected value of a sample of information using the concept of pre-posterior analysis as the expected increase in utility by
Utilizing the Structure and Content Information for XML Document Clustering
NASA Astrophysics Data System (ADS)
Tran, Tien; Kutty, Sangeetha; Nayak, Richi
This paper reports on the experiments and results of a clustering approach used in the INEX 2008 document mining challenge. The clustering approach utilizes both the structure and content information of the Wikipedia XML document collection. A latent semantic kernel (LSK) is used to measure the semantic similarity between XML documents based on their content features. The construction of a latent semantic kernel involves the computing of singular vector decomposition (SVD). On a large feature space matrix, the computation of SVD is very expensive in terms of time and memory requirements. Thus in this clustering approach, the dimension of the document space of a term-document matrix is reduced before performing SVD. The document space reduction is based on the common structural information of the Wikipedia XML document collection. The proposed clustering approach has shown to be effective on the Wikipedia collection in the INEX 2008 document mining challenge.
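A minimal sketch of the latent semantic kernel idea described above, assuming a toy term-document count matrix and an arbitrary rank: the kernel is built from the top-k left singular vectors of the matrix and then used to score document-document similarity. This illustrates the general LSK technique, not the INEX system's code.

```python
import numpy as np

def latent_semantic_kernel(term_doc, rank):
    """Build a latent semantic kernel P = U_k U_k^T from a term-document matrix,
    so the semantic similarity of documents d1, d2 is d1^T P d2."""
    u, s, vt = np.linalg.svd(term_doc, full_matrices=False)
    u_k = u[:, :rank]                       # top-k left singular vectors (term space)
    return u_k @ u_k.T

def semantic_similarity(kernel, d1, d2):
    num = d1 @ kernel @ d2
    den = np.sqrt(d1 @ kernel @ d1) * np.sqrt(d2 @ kernel @ d2)
    return num / den if den else 0.0

# Toy 5-term x 4-document count matrix (illustrative only).
a = np.array([[2, 0, 1, 0],
              [1, 1, 0, 0],
              [0, 3, 0, 1],
              [0, 0, 2, 2],
              [1, 0, 0, 3]], dtype=float)
k = latent_semantic_kernel(a, rank=2)
print(semantic_similarity(k, a[:, 0], a[:, 2]))   # similarity of documents 1 and 3
```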
Technical Standards for Command and Control Information Systems (CCISs) and Information Technology
1994-02-01
formatting, transmitting, receiving, and processing imagery and imagery-related information. The NITFS is in essence the suite of individual standards...also known as Limited Operational Capability-Europe) and the German Joint Analysis System Military Intelligence (JASMIN). Among the approaches being... essence, the other systems utilize a one-level address space where addressing consists of identifying the fire support unit. However, AFATDS utilizes a two
Recruitment of College-Bound Youth through use of the ACT Assessment File
1985-07-01
approach coupled with a personalized approach to information dissemination. Two exploratory studies were conducted to examine the utility of telephone...programs at the DLI. It was reasoned that such a personalized approach in providing information might generate more interest than an impersonal mass... personalized form of contact), 2) an information sheet describing the DLI and educational assistance for veterans, and 3) a business reply card for
Gray, D T; Weinstein, M C
1998-01-01
Decision and cost-utility analyses considered the tradeoffs of treating patent ductus arteriosus (PDA) using conventional surgery versus transcatheter implantation of the Rashkind occluder. Physicians and informed lay parents assigned utility scores to procedure success/complications combinations seen in prognostically similar pediatric patients with isolated PDA treated from 1982 to 1987. Utility scores multiplied by outcome frequencies from a comparative study generated expected utility values for the two approaches. Cost-utility analyses combined these results with simulated provider cost estimates from 1989. On a 0-100 scale (worst to best observed outcome), the median expected utility for surgery was 99.96, versus 98.88 for the occluder. Results of most sensitivity analyses also slightly favored surgery. Expected utility differences based on 1987 data were minimal. With a mean overall simulated cost of $8,838 vs $12,466 for the occluder, surgery was favored in most cost-utility analyses. Use of the inherently less invasive but less successful, more risky, and more costly occluder approach conferred no apparent net advantage in this study. Analyses of comparable current data would be informative.
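The expected-utility step described above is a simple probability-weighted sum; the sketch below shows the arithmetic with invented outcome labels, frequencies, and utility scores (not the study's data).

```python
# Expected utility = sum over outcomes of (outcome frequency x utility score).
# The outcome labels, frequencies and utilities below are illustrative only;
# they are not the values reported in the study.
outcomes = {
    # outcome: (frequency, utility on the paper's 0-100 scale)
    "success, no complication": (0.92, 100.0),
    "success, minor complication": (0.06, 85.0),
    "residual shunt / failure": (0.02, 60.0),
}

expected_utility = sum(freq * util for freq, util in outcomes.values())
print(round(expected_utility, 2))   # 98.3
```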
Tracking Actual Usage: The Attention Metadata Approach
ERIC Educational Resources Information Center
Wolpers, Martin; Najjar, Jehad; Verbert, Katrien; Duval, Erik
2007-01-01
The information overload in learning and teaching scenarios is a main hindering factor for efficient and effective learning. New methods are needed to help teachers and students in dealing with the vast amount of available information and learning material. Our approach aims to utilize contextualized attention metadata to capture behavioural…
The Effects of Variability and Risk in Selection Utility Analysis: An Empirical Comparison.
ERIC Educational Resources Information Center
Rich, Joseph R.; Boudreau, John W.
1987-01-01
Investigated utility estimate variability for the selection utility of using the Programmer Aptitude Test to select computer programmers. Comparison of Monte Carlo results to other risk assessment approaches (sensitivity analysis, break-even analysis, algebraic derivation of the distribution) suggests that distribution information provided by Monte…
Stemflow estimation in a redwood forest using model-based stratified random sampling
Jack Lewis
2003-01-01
Model-based stratified sampling is illustrated by a case study of stemflow volume in a redwood forest. The approach is actually a model-assisted sampling design in which auxiliary information (tree diameter) is utilized in the design of stratum boundaries to optimize the efficiency of a regression or ratio estimator. The auxiliary information is utilized in both the...
Efficient hiding of confidential high-utility itemsets with minimal side effects
NASA Astrophysics Data System (ADS)
Lin, Jerry Chun-Wei; Hong, Tzung-Pei; Fournier-Viger, Philippe; Liu, Qiankun; Wong, Jia-Wei; Zhan, Justin
2017-11-01
Privacy preserving data mining (PPDM) is an emerging research problem that has become critical in the last decades. PPDM consists of hiding sensitive information to ensure that it cannot be discovered by data mining algorithms. Several PPDM algorithms have been developed. Most of them are designed for hiding sensitive frequent itemsets or association rules. Hiding sensitive information in a database can have several side effects, such as hiding other non-sensitive information and introducing redundant information. Finding the set of itemsets or transactions to be sanitised that minimises side effects is an NP-hard problem. In this paper, a genetic algorithm (GA) using transaction deletion is designed to hide sensitive high-utility itemsets for privacy-preserving utility mining (PPUM). A flexible fitness function with three adjustable weights is used to evaluate the goodness of each chromosome for hiding sensitive high-utility itemsets. To speed up the evolution process, the pre-large concept is adopted in the designed algorithm. It reduces the number of database scans required for verifying the goodness of an evaluated chromosome. Substantial experiments are conducted to compare the performance of the designed GA approach (with/without the pre-large concept), with a GA-based approach relying on transaction insertion and a non-evolutionary algorithm, in terms of execution time, side effects, database integrity and utility integrity. Results demonstrate that the proposed algorithm hides sensitive high-utility itemsets with fewer side effects than previous studies, while preserving high database and utility integrity.
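A hedged sketch of what a three-weight fitness function for this kind of sanitization GA could look like; the linear form, the weight values, and the side-effect names are assumptions for illustration, not the paper's exact formulation.

```python
from collections import namedtuple

SideEffects = namedtuple("SideEffects", "hidden_failures missing_itemsets artificial_itemsets")

def fitness(effects, w1=0.6, w2=0.3, w3=0.1):
    """Weighted side-effect count for one candidate deletion (lower is better).
    The three weights correspond to the three adjustable weights mentioned in
    the abstract; the particular values and the linear form are assumptions."""
    return (w1 * effects.hidden_failures
            + w2 * effects.missing_itemsets
            + w3 * effects.artificial_itemsets)

# A chromosome whose deletions leave 1 sensitive itemset exposed, lose 3
# non-sensitive itemsets and create no artificial ones (illustrative numbers).
print(fitness(SideEffects(1, 3, 0)))   # 0.6*1 + 0.3*3 + 0.1*0 = 1.5
```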
Cluster-based query expansion using external collections in medical information retrieval.
Oh, Heung-Seon; Jung, Yuchul
2015-12-01
Utilizing external collections to improve retrieval performance is a challenging research problem because various test collections are created for different purposes. Improving medical information retrieval has also gained much attention as various types of medical documents have become available to researchers ever since they started being stored in machine-processable formats. In this paper, we propose an effective method of utilizing external collections based on the pseudo relevance feedback approach. Our method incorporates the structure of external collections in estimating individual components of the final feedback model. Extensive experiments on three medical collections (TREC CDS, CLEF eHealth, and OHSUMED) were performed, and the results were compared with a representative expansion approach utilizing external collections to show the superiority of our method. Copyright © 2015 Elsevier Inc. All rights reserved.
Managing Space Situational Awareness Using the Space Surveillance Network
2013-11-14
This report examines the use of utility metrics from two forms of expected information gain for each object-sensor pair as well as the approximated stability of the...estimation errors in order to work towards a tasking strategy. The information theoretic approaches use the calculation of Fisher information gain
A Dangerous Idea? Freedom, Children and the Capability Approach to Education
ERIC Educational Resources Information Center
Bessant, Judith
2014-01-01
This article begins by observing how education is currently appreciated primarily for its utility value, a view informed by utilitarianism and neoclassical economic theory. A critique of that framing is offered and an alternative way of valuing education informed by a Capabilities Approach is presented. In doing so, I also observe that while key…
The derivation of scenic utility functions and surfaces and their role in landscape management
John W. Hamilton; Gregory J. Buhyoff; J. Douglas Wellman
1979-01-01
This paper outlines a methodological approach for determining relevant physical landscape features which people use in formulating judgments about scenic utility. This information, coupled with either empirically derived or rationally stipulated regression techniques, may be used to produce scenic utility functions and surfaces. These functions can provide a means for...
Real-time information management environment (RIME)
NASA Astrophysics Data System (ADS)
DeCleene, Brian T.; Griffin, Sean; Matchett, Garry; Niejadlik, Richard
2000-08-01
Whereas data mining and exploitation improve the quality and quantity of information available to the user, there remains a mission requirement to assist the end-user in managing access to this information and ensuring that the appropriate information is delivered to the right user in time to make decisions and take action. This paper discusses TASC's federated architecture for next-generation information management, contrasts the approach against emerging technologies, and quantifies the performance gains. This architecture and implementation, known as the Real-time Information Management Environment (RIME), is based on two key concepts: information utility and content-based channelization. The introduction of utility allows users to express the importance and delivery requirements of their information needs in the context of their mission. Rather than competing for resources on a first-come/first-served basis, the infrastructure employs these utility functions to dynamically react to unanticipated loading by optimizing the delivered information utility. Furthermore, commanders' resource policies shape these functions to ensure that resources are allocated according to military doctrine. Using information about the desired content, channelization identifies opportunities to aggregate users onto shared channels, reducing redundant transmissions. Hence, channelization increases the information throughput of the system and balances sender/receiver processing load.
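As a hedged illustration of utility-driven delivery (not RIME's actual scheduling algorithm), the following sketch greedily fills a constrained link with the messages offering the most utility per kilobyte; the message names, sizes, and utilities are invented.

```python
def deliver_by_utility(messages, capacity_kb):
    """Greedy sketch: send the messages with the highest utility per kilobyte
    until link capacity is exhausted.  'messages' is a list of
    (name, size_kb, utility) tuples; all names and numbers are illustrative."""
    ranked = sorted(messages, key=lambda m: m[2] / m[1], reverse=True)
    sent, used = [], 0.0
    for name, size_kb, utility in ranked:
        if used + size_kb <= capacity_kb:
            sent.append(name)
            used += size_kb
    return sent

queue = [("threat-update", 40, 0.9), ("imagery-tile", 400, 0.7), ("status-ping", 5, 0.2)]
print(deliver_by_utility(queue, capacity_kb=100))   # ['status-ping', 'threat-update']
```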
Moving toward integrated value-based planning: The issues
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chamberlin, J.H.; Braithwait, C.L.
1988-07-01
Integrated Value-Based Planning (IVP) is a promising new planning approach that uses value, as well as cost, as the common denominator for evaluating supply and demand resource options. Planning based on value yields an 'apples to apples' comparison of utility and customer options. The IVP approach can form the cornerstone of a successful market-driven utility planning strategy. This conference will raise questions, discuss issues, and further the exchange of information regarding the tools, concepts, and techniques needed to put IVP into the utility planner's toolbox. This proceedings is more than a compendium of papers. It is designed to let both participants and non-participants exchange information. To this end, listings and cross-listings of papers, speakers, and participant interest areas, along with the ever-invaluable phone numbers, have been included.
Ultrasonic inspection of carbon fiber reinforced plastic by means of sample-recognition methods
NASA Technical Reports Server (NTRS)
Bilgram, R.
1985-01-01
In the case of carbon fiber reinforced plastic (CFRP), it has not yet been possible to detect nonlocal defects and aging-related material degradation with the aid of nondestructive inspection methods. An approach for overcoming these difficulties involves an extension of the ultrasonic inspection procedure based on the use of signal processing and sample-recognition methods. The basic concept behind this approach is the realization that the ultrasonic signal contains information regarding the medium which is not utilized in conventional ultrasonic inspection. However, the analytical study of the physical processes involved is very complex. For this reason, an empirical approach is employed to make use of the information which has not been utilized before. This approach uses reference signals which can be obtained with material specimens of different quality. The implementation of these concepts for the ultrasonic inspection of CFRP laminates is discussed.
ERIC Educational Resources Information Center
Royer, David James
2017-01-01
To best support all students' academic, behavioral, and social needs, an integrated systems approach is necessary. In such systems, all faculty and staff ideally recognize student success is a shared responsibility and collaborate in a data-informed process to define common student behavioral expectations to facilitate success academically,…
Feature-based fusion of medical imaging data.
Calhoun, Vince D; Adali, Tülay
2009-09-01
The acquisition of multiple brain imaging types for a given study is a very common practice. There have been a number of approaches proposed for combining or fusing multitask or multimodal information. These can be roughly divided into those that attempt to study convergence of multimodal imaging, for example, how function and structure are related in the same region of the brain, and those that attempt to study the complementary nature of modalities, for example, utilizing temporal EEG information and spatial functional magnetic resonance imaging information. Within each of these categories, one can attempt data integration (the use of one imaging modality to improve the results of another) or true data fusion (in which multiple modalities are utilized to inform one another). We review both approaches and present a recent computational approach that first preprocesses the data to compute features of interest. The features are then analyzed in a multivariate manner using independent component analysis. We describe the approach in detail and provide examples of how it has been used for different fusion tasks. We also propose a method for selecting which combination of modalities provides the greatest value in discriminating groups. Finally, we summarize and describe future research topics.
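A minimal sketch of the feature-level fusion idea using joint ICA on concatenated per-subject feature vectors; it uses scikit-learn's FastICA on synthetic data, and the feature counts and component number are arbitrary assumptions rather than values from the article.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_subjects = 20
fmri_features = rng.normal(size=(n_subjects, 500))   # e.g. per-voxel contrast values (synthetic)
eeg_features = rng.normal(size=(n_subjects, 200))    # e.g. ERP amplitudes (synthetic)

# Joint-ICA-style fusion: concatenate each subject's feature vectors from both
# modalities, then decompose across subjects into joint independent components.
joint = np.hstack([fmri_features, eeg_features])
ica = FastICA(n_components=5, random_state=0)
subject_loadings = ica.fit_transform(joint)   # (20, 5): how strongly each subject expresses a component
joint_maps = ica.mixing_.T                    # (5, 700): each row couples an fMRI pattern with an EEG pattern

# Columns 0-499 of a joint map relate to the fMRI features, columns 500-699 to
# the EEG features, so a single component links the two modalities.
print(subject_loadings.shape, joint_maps.shape)
```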
NASA Astrophysics Data System (ADS)
Kolodny, Michael A.
2017-05-01
Today's battlefield space is extremely complex, dealing with an enemy that is neither well-defined nor well-understood. Adversaries are comprised of widely-distributed, loosely-networked groups engaging in nefarious activities. Situational understanding is needed by decision makers; understanding of adversarial capabilities and intent is essential. Information needed at any time is dependent on the mission/task at hand. Information sources potentially providing mission-relevant information are disparate and numerous; they include sensors, social networks, fusion engines, internet, etc. Management of these multi-dimensional informational sources is critical. This paper will present a new approach being undertaken to answer the challenge of enhancing battlefield understanding by optimizing the utilization of available informational sources (means) to required missions/tasks as well as determining the "goodness" of the information acquired in meeting the capabilities needed. Requirements are usually expressed in terms of a presumed technology solution (e.g., imagery). A metaphor of the "magic rabbits" was conceived to remove presumed technology solutions from requirements by claiming the "required" technology is obsolete. Instead, intelligent "magic rabbits" are used to provide needed information. The question then becomes: "WHAT INFORMATION DO YOU NEED THE RABBITS TO PROVIDE YOU?" This paper will describe a new approach called Mission-Informed Needed Information - Discoverable, Available Sensing Sources (MINI-DASS) that designs a process that builds information acquisition missions and determines what the "magic rabbits" need to provide in a manner that is machine understandable. Also described is the Missions and Means Framework (MMF) model used, the process flow utilized, the approach to developing an ontology of information source means and the approach for determining the value of the information acquired.
Self and External Monitoring of Reading Comprehension
ERIC Educational Resources Information Center
Shiu, Ling-po; Chen, Qishan
2013-01-01
The present study compared the effectiveness of 2 approaches to remedy the inaccuracy of self-monitoring of reading comprehension. The first approach attempts to enhance self-monitoring by strengthening the cues utilized in monitoring. The second approach replaces self-monitoring with external regulation based on objective evaluative information.…
NASA Astrophysics Data System (ADS)
Li, Y. H.; Shinohara, T.; Satoh, T.; Tachibana, K.
2016-06-01
High-definition and highly accurate road maps are necessary for the realization of automated driving, and road signs are among the most important elements in the road map. Therefore, a technique is necessary which can acquire information about all kinds of road signs automatically and efficiently. Due to the continuous technical advancement of the Mobile Mapping System (MMS), it has become possible to acquire a large number of images and 3D point clouds efficiently, with highly precise position information. In this paper, we present an automatic road sign detection and recognition approach utilizing both images and 3D point clouds acquired by MMS. The proposed approach consists of three stages: 1) detection of road signs from images based on their color and shape features using an object-based image analysis method, 2) filtering out of over-detected candidates utilizing size and position information estimated from the 3D point cloud, candidate regions, and camera information, and 3) road sign recognition using a template matching method after shape normalization. The effectiveness of the proposed approach was evaluated on a test dataset acquired from more than 180 km of different types of roads in Japan. The results show a very high success rate in the detection and recognition of road signs, even under challenging conditions such as discoloration, deformation, and partial occlusion.
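The sketch below illustrates, under assumptions, what stage 1 (color- and shape-based candidate detection) and stage 3 (template matching after normalization) might look like with OpenCV; the HSV thresholds, area and circularity cut-offs, and the matching threshold are illustrative choices, not the values used by the authors, and stage 2 (3D point-cloud filtering) is omitted.

```python
import cv2
import numpy as np

def detect_red_sign_candidates(bgr_image):
    """Stage 1 sketch: threshold red hues in HSV and keep roughly circular blobs."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    candidates = []
    for c in contours:
        area = cv2.contourArea(c)
        perimeter = cv2.arcLength(c, True)
        if area > 200 and perimeter > 0:
            circularity = 4 * np.pi * area / perimeter**2   # ~1.0 for circles
            if circularity > 0.6:
                candidates.append(cv2.boundingRect(c))      # (x, y, w, h)
    return candidates

def match_sign(candidate_patch, template, threshold=0.7):
    """Stage 3 sketch: normalized template matching after resizing the patch."""
    patch = cv2.resize(candidate_patch, (template.shape[1], template.shape[0]))
    score = cv2.matchTemplate(patch, template, cv2.TM_CCOEFF_NORMED).max()
    return score >= threshold, score
```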
Asymmetric information and macroeconomic dynamics
NASA Astrophysics Data System (ADS)
Hawkins, Raymond J.; Aoki, Masanao; Roy Frieden, B.
2010-09-01
We show how macroeconomic dynamics can be derived from asymmetric information. As an illustration of the utility of this approach we derive the equilibrium density, non-equilibrium densities and the equation of motion for the response to a demand shock for productivity in a simple economy. Novel consequences of this approach include a natural incorporation of time dependence into macroeconomics and a common information-theoretic basis for economics and other fields seeking to link micro-dynamics and macro-observables.
Initiative for safe driving and enhanced utilization of crash data
NASA Astrophysics Data System (ADS)
Wagner, John F.
1994-03-01
This initiative addresses the utilization of current technology to increase the efficiency of police officers to complete required Driving Under the Influence (DUI) forms and to enhance their ability to acquire and record crash and accident information. The project is a cooperative program among the New Mexico Alliance for Transportation Research (ATR), Science Applications International Corporation (SAIC), Los Alamos National Laboratory, and the New Mexico State Highway and Transportation Department. The approach utilizes an in-car computer and associated sensors for information acquisition and recording. Los Alamos artificial intelligence technology is leveraged to ensure ease of data entry and use.
Leveraging model-informed approaches for drug discovery and development in the cardiovascular space.
Dockendorf, Marissa F; Vargo, Ryan C; Gheyas, Ferdous; Chain, Anne S Y; Chatterjee, Manash S; Wenning, Larissa A
2018-06-01
Cardiovascular disease remains a significant global health burden, and development of cardiovascular drugs in the current regulatory environment often demands large and expensive cardiovascular outcome trials. Thus, the use of quantitative pharmacometric approaches which can help enable early Go/No Go decision making, ensure appropriate dose selection, and increase the likelihood of successful clinical trials, have become increasingly important to help reduce the risk of failed cardiovascular outcomes studies. In addition, cardiovascular safety is an important consideration for many drug development programs, whether or not the drug is designed to treat cardiovascular disease; modeling and simulation approaches also have utility in assessing risk in this area. Herein, examples of modeling and simulation applied at various stages of drug development, spanning from the discovery stage through late-stage clinical development, for cardiovascular programs are presented. Examples of how modeling approaches have been utilized in early development programs across various therapeutic areas to help inform strategies to mitigate the risk of cardiovascular-related adverse events, such as QTc prolongation and changes in blood pressure, are also presented. These examples demonstrate how more informed drug development decisions can be enabled by modeling and simulation approaches in the cardiovascular area.
NASA Astrophysics Data System (ADS)
Baum, R.; Characklis, G. W.
2016-12-01
Financial hedging solutions have been examined as tools for effectively mitigating water scarcity related financial risks for water utilities, and have become more prevalent as conservation (resulting in reduced revenues) and water transfers (resulting in increased costs) play larger roles in drought management. Individualized financial contracts (i.e. designed for a single utility) provide evidence of the potential benefits of financial hedging. However, individualized contracts require substantial time and information to develop, limiting their widespread implementation. More generalized contracts have also shown promise, and would allow the benefits of risk pooling to be more effectively realized, resulting in less expensive contracts. Risk pooling reduces the probability of an insurer making payouts that deviate significantly from the mean, but given that the financial risks of drought are spatially correlated amongst utilities, these more extreme "fat tail" risks remain. Any group offering these hedging contracts, whether a third-party insurer or a "mutual" comprised of many utilities, will need to balance the costs (i.e. additional risk) and benefits (i.e. returns) of alternative approaches to managing the extreme risks (e.g. through insurance layers). The balance of these different approaches will vary depending on the risk pool being considered, including the number, size and exposure of the participating utilities. This work first establishes a baseline of the tradeoffs between risk and expected return in insuring against the financial risks of water scarcity without alternative hedging approaches for water utilities across all climate divisions of the United States. Then various scenarios are analyzed to provide insight into how to maximize returns for risk pooling portfolios at various risk levels through balancing different insurance layers and hedging approaches. This analysis will provide valuable information for designing optimal financial risk management strategies for water utilities across the United States.
Movement Regulation of Handsprings on Vault
ERIC Educational Resources Information Center
Heinen, Thomas; Vinken, Pia M.; Jeraj, Damian; Velentzas, Konstantinos
2013-01-01
Purpose: Visual information is utilized in gymnastics vaulting. The question remains as to which informational sources are used to regulate handspring performance. The purpose of this study was to examine springboard and vaulting table position as informational sources in gymnastics vaulting. The hypothesis tested was that the approach-run and…
Medical equipment industry in India: Production, procurement and utilization.
Chakravarthi, Indira
2013-01-01
This article presents information on the medical equipment industry in India, covering the production, procurement, and utilization-related activities of key players in the sector, in light of the current policies of liberalization and the growth of a "health-care industry" in India. Policy approaches to medical equipment have been discussed elsewhere.
Link Performance Analysis and monitoring - A unified approach to divergent requirements
NASA Astrophysics Data System (ADS)
Thom, G. A.
Link Performance Analysis and real-time monitoring are generally covered by a wide range of equipment. Bit Error Rate testers provide digital link performance measurements but are not useful during real-time data flows. Real-time performance monitors utilize the fixed overhead content but vary widely from format to format. Link quality information is also available from signal reconstruction equipment in the form of receiver AGC, bit synchronizer AGC, and bit synchronizer soft decision level outputs, but no general approach to utilizing this information exists. This paper presents an approach to link tests, real-time data quality monitoring, and results presentation that utilizes a set of general purpose modules in a flexible architectural environment. The system operates over a wide range of bit rates (up to 150 Mb/s) and employs several measurement techniques, including P/N code or fixed PCM format errors, real-time BER derived from frame sync errors, and Data Quality Analysis derived by counting significant sync status changes. The architecture performs with a minimum of elements in place and permits a phased update of the user's unit in accordance with their needs.
ERIC Educational Resources Information Center
Madkour, M. A. K.
1980-01-01
Discusses underlying assumptions and prerequisites for information development in Arab countries. Administrative and environmental impediments which hinder the optimum utilization of available resources and suggestions for improvements are outlined. A brief bibliography is provided. (Author/RAA)
Approaches to Learning in a Second Year Chemical Engineering Course.
ERIC Educational Resources Information Center
Case, Jennifer M.; Gunstone, Richard F.
2003-01-01
Investigates student approaches to learning in a second year chemical engineering course by means of a qualitative research project which utilized interview and journal data from a group of 11 students. Identifies three approaches to learning: (1) conceptual; (2) algorithmic; and (3) information-based. Presents student responses to a series of…
Development of a Portfolio Management Approach with Case Study of the NASA Airspace Systems Program
NASA Technical Reports Server (NTRS)
Neitzke, Kurt W.; Hartman, Christopher L.
2012-01-01
A portfolio management approach was developed for the National Aeronautics and Space Administration's (NASA's) Airspace Systems Program (ASP). The purpose was to help inform ASP leadership regarding future investment decisions related to its existing portfolio of advanced technology concepts and capabilities (C/Cs) currently under development and to potentially identify new opportunities. The portfolio management approach is general in form and is extensible to other advanced technology development programs. It focuses on individual C/Cs and consists of three parts: 1) concept of operations (con-ops) development, 2) safety impact assessment, and 3) benefit-cost-risk (B-C-R) assessment. The first two parts are recommendations to ASP leaders and will be discussed only briefly, while the B-C-R part relates to the development of an assessment capability and will be discussed in greater detail. The B-C-R assessment capability enables estimation of the relative value of each C/C as compared with all other C/Cs in the ASP portfolio. Value is expressed in terms of a composite weighted utility function (WUF) rating, based on estimated benefits, costs, and risks. Benefit utility is estimated relative to achieving key NAS performance objectives, which are outlined in the ASP Strategic Plan [1]. Risk utility focuses on C/C development and implementation risk, while cost utility focuses on the development and implementation portions of overall C/C life-cycle costs. Initial composite ratings of the ASP C/Cs were successfully generated; however, the limited availability of B-C-R information, which is used as inputs to the WUF model, reduced the meaningfulness of these initial investment ratings. Development of this approach, however, defined specific information-generation requirements for ASP C/C developers that will increase the meaningfulness of future B-C-R ratings.
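As a hedged illustration of a composite weighted utility function rating (the additive form, the weights, and the sub-utility values below are assumptions, not NASA's actual WUF model):

```python
def composite_wuf(benefit_u, cost_u, risk_u, w_benefit=0.5, w_cost=0.25, w_risk=0.25):
    """Composite weighted-utility rating for one concept/capability (C/C).
    Each sub-utility is assumed to be scaled to [0, 1]; the additive form and
    the weight values are illustrative."""
    return w_benefit * benefit_u + w_cost * cost_u + w_risk * risk_u

# A hypothetical C/C with strong benefit utility but mediocre cost and risk utilities.
print(composite_wuf(benefit_u=0.8, cost_u=0.5, risk_u=0.4))   # 0.4 + 0.125 + 0.1 = 0.625
```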
Cieslewicz, Artur; Dutkiewicz, Jakub; Jedrzejek, Czeslaw
2018-01-01
Abstract Information retrieval from biomedical repositories has become a challenging task because of their increasing size and complexity. To facilitate research aimed at improving the search for relevant documents, various information retrieval challenges have been launched. In this article, we present the improved medical information retrieval systems designed by Poznan University of Technology and Poznan University of Medical Sciences as a contribution to the bioCADDIE 2016 challenge, a task focusing on information retrieval from a collection of 794 992 datasets generated from 20 biomedical repositories. The system developed by our team utilizes the Terrier 4.2 search platform enhanced by a query expansion method using word embeddings. This approach, after post-challenge modifications and improvements (with particular regard to assigning proper weights to original and expanded terms), allowed us to achieve the second-best infNDCG measure (0.4539) among the challenge results and an infAP of 0.3978. This demonstrates that proper utilization of word embeddings can be a valuable addition to the information retrieval process. Some analysis is provided of related work involving other bioCADDIE contributions. We discuss the possibility of improving our results by using better word embedding schemes to find candidates for query expansion. Database URL: https://biocaddie.org/benchmark-data PMID:29688372
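A minimal sketch of embedding-based query expansion with down-weighted expansion terms, in the spirit of the approach described above; the embedding vectors are random toy stand-ins, and the neighbour count and expansion weight are arbitrary assumptions rather than the system's tuned values.

```python
import numpy as np

def expand_query(query_terms, embeddings, k=2, expansion_weight=0.4):
    """Return {term: weight}: original terms keep weight 1.0, and each query
    term contributes its k nearest neighbours (cosine similarity) at a reduced
    weight.  'embeddings' maps term -> vector; all values here are toy stand-ins."""
    vocab = list(embeddings)
    mat = np.array([embeddings[t] for t in vocab], dtype=float)
    mat /= np.linalg.norm(mat, axis=1, keepdims=True)
    weights = {t: 1.0 for t in query_terms}
    for term in query_terms:
        if term not in embeddings:
            continue
        v = embeddings[term] / np.linalg.norm(embeddings[term])
        sims = mat @ v
        for idx in np.argsort(sims)[::-1][:k + 1]:        # +1 because the term itself ranks first
            cand = vocab[idx]
            if cand not in weights:
                weights[cand] = round(expansion_weight * float(sims[idx]), 3)
    return weights

rng = np.random.default_rng(1)
toy_embeddings = {t: rng.normal(size=50) for t in
                  ["glioma", "tumor", "rna-seq", "expression", "sequencing"]}
print(expand_query(["glioma", "rna-seq"], toy_embeddings, k=2))
```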
The causal structure of utility conditionals.
Bonnefon, Jean-François; Sloman, Steven A
2013-01-01
The psychology of reasoning is increasingly considering agents' values and preferences, achieving greater integration with judgment and decision making, social cognition, and moral reasoning. Some of this research investigates utility conditionals, "if p then q" statements where the realization of p or q or both is valued by some agents. Various approaches to utility conditionals share the assumption that reasoners make inferences from utility conditionals based on the comparison between the utility of p and the expected utility of q. This article introduces a new parameter in this analysis, the underlying causal structure of the conditional. Four experiments showed that causal structure moderated utility-informed conditional reasoning. These inferences were strongly invited when the underlying structure of the conditional was causal, and significantly less so when the underlying structure of the conditional was diagnostic. This asymmetry was only observed for conditionals in which the utility of q was clear, and disappeared when the utility of q was unclear. Thus, an adequate account of utility-informed conditional inferences requires three components: utility, probability, and causal structure. Copyright © 2012 Cognitive Science Society, Inc.
ERIC Educational Resources Information Center
Sseguya, Haroon; Mazur, Robert; Abbott, Eric; Matsiko, Frank
2012-01-01
Purpose: To examine the status and priorities for agricultural information generation, dissemination and utilization in the context of agricultural innovation systems in southeast Uganda. Design/Methodology/Approach: Group discussions were conducted with six communities in Kamuli district, southeast Uganda. The focus was on information sources and…
Lessons Learned from the Deployment of a Hydrologic Science Observations Data Model
NASA Astrophysics Data System (ADS)
Beran, B.; Valentine, D.; Zaslavsky, I.; van Ingen, C.
2007-12-01
The CUAHSI Hydrologic Information System project is developing information technology infrastructure to support hydrologic science. The CUAHSI Observations Data Model (ODM) is a data model for storing hydrologic observations data in a system designed to optimize data retrieval for integrated analysis of information collected by multiple investigators. ODM v1 provides a distinct view into what information the community has determined is important to store, and what data views the community requires. As we began to work with ODM v1, we discovered a problem with the approach of tightly linking the community views of data to the database model. Design decisions for ODM v1 hindered the ability to utilize the data model as the aggregated information catalog needed for the cyberinfrastructure. Different development groups took different approaches to populating the data model and handling the complexity, ranging from populating the ODM with a bare minimum of constraints to creating a fully constrained data model. This made the integration of different tools difficult. In the end, we decided to utilize the fully populated model, which ensures maximum compatibility with the data sources. Groups also discovered that the data model's central concept was optimized for retrieval of individual observations; in practice, the concept of a data series is better for managing data, yet there is no link between data series and data values in ODM v1. We are beginning to develop ODM v2 as a series of profiles. By utilizing profiles, we intend to make the core information model smaller, more manageable, and simpler to understand and populate. We intend to keep the community semantics, improve the linkages between data series and data values, and enhance data discovery for the CUAHSI cyberinfrastructure.
Interactive Visualization of Computational Fluid Dynamics using Mosaic
NASA Technical Reports Server (NTRS)
Clucas, Jean; Watson, Velvin; Chancellor, Marisa K. (Technical Monitor)
1994-01-01
The Web provides new methods for accessing information world-wide, but the current text-and-pictures approach neither utilizes all the Web's possibilities nor provides for its limitations. While the inclusion of pictures and animations in a paper communicates more effectively than text alone, it is essentially an extension of the concept of "publication." Also, as use of the Web increases, putting images and animations online will quickly load even the "Information Superhighway." We need to find forms of communication that take advantage of the special nature of the Web. This paper presents one approach: the use of the Internet and the Mosaic interface for data sharing and collaborative analysis. We will describe (and, in the presentation, demonstrate) our approach: using FAST (Flow Analysis Software Toolkit), a scientific visualization package, as a data viewer and interactive tool called from Mosaic. Our intent is to stimulate the development of other tools that utilize the unique nature of electronic communication.
Goddard, Katrina A.B.; Knaus, William A.; Whitlock, Evelyn; Lyman, Gary H.; Feigelson, Heather Spencer; Schully, Sheri D.; Ramsey, Scott; Tunis, Sean; Freedman, Andrew N.; Khoury, Muin J.; Veenstra, David L.
2013-01-01
Background The clinical utility is uncertain for many cancer genomic applications. Comparative effectiveness research (CER) can provide evidence to clarify this uncertainty. Objectives To identify approaches to help stakeholders make evidence-based decisions, and to describe potential challenges and opportunities using CER to produce evidence-based guidance. Methods We identified general CER approaches for genomic applications through literature review, the authors’ experiences, and lessons learned from a recent, seven-site CER initiative in cancer genomic medicine. Case studies illustrate the use of CER approaches. Results Evidence generation and synthesis approaches include comparative observational and randomized trials, patient reported outcomes, decision modeling, and economic analysis. We identified significant challenges to conducting CER in cancer genomics: the rapid pace of innovation, the lack of regulation, the limited evidence for clinical utility, and the beliefs that genomic tests could have personal utility without having clinical utility. Opportunities to capitalize on CER methods in cancer genomics include improvements in the conduct of evidence synthesis, stakeholder engagement, increasing the number of comparative studies, and developing approaches to inform clinical guidelines and research prioritization. Conclusions CER offers a variety of methodological approaches to address stakeholders’ needs. Innovative approaches are needed to ensure an effective translation of genomic discoveries. PMID:22516979
Using Global Climate Data to Inform Long-Term Water Planning Decisions
NASA Astrophysics Data System (ADS)
Groves, D. G.; Lempert, R.
2008-12-01
Water managers throughout the world are working to consider climate change in their long-term water planning processes. The best available information regarding plausible future hydrologic conditions is largely derived from global circulation models and from paleoclimate data. To date, there is no single approach for (1) utilizing these data in water management planning tools for analysis and (2) evaluating the myriad of possible adaptation options. This talk will describe several approaches being used at RAND to incorporate global projections of climate change into local, regional, and state-wide long-term water planning. It will draw on current work with the California Department of Water Resources and other local Western water agencies, and a recently completed project with the Inland Empire Utilities Agency. Work to date suggests that climate information can be assimilated into local water planning tools to help identify robust climate adaptation water management strategies.
Application of Project Portfolio Management
NASA Astrophysics Data System (ADS)
Pankowska, Malgorzata
The main goal of the chapter is the presentation of the application of the project portfolio management approach to support the development of e-Municipality and public administration information systems. The models of how people publish and utilize information on the web have been transformed continually: instead of simply viewing static web pages, users publish their own content through blogs and photo- and video-sharing sites. The ICT (Information and Communication Technology) projects for municipalities analysed in this chapter cover a mixture of static web pages, e-Government information systems, and wikis. For the management of such mixtures of ICT projects, the project portfolio management approach is proposed.
Managing for resilience: an information theory-based ...
Ecosystems are complex and multivariate; hence, methods to assess the dynamics of ecosystems should have the capacity to evaluate multiple indicators simultaneously. Most research on identifying leading indicators of regime shifts has focused on univariate methods and simple models, which have limited utility when evaluating real ecosystems, particularly because drivers are often unknown. We discuss some common univariate and multivariate approaches for detecting critical transitions in ecosystems and demonstrate their capabilities via case studies. Synthesis and applications. We illustrate the utility of an information theory-based index for assessing ecosystem dynamics. Trends in this index also provide a sentinel of both abrupt and gradual transitions in ecosystems. In response to the need to identify leading indicators of regime shifts in ecosystems, our research compares traditional indicators and Fisher information, an information theory-based method, by examining four case study systems. Results demonstrate the utility of these methods and offer great promise for quantifying and managing for resilience.
Integrated Artificial Intelligence Approaches for Disease Diagnostics.
Vashistha, Rajat; Chhabra, Deepak; Shukla, Pratyoosh
2018-06-01
Mechanocomputational techniques in conjunction with artificial intelligence (AI) are revolutionizing the interpretation of crucial information from medical data and converting it into optimized and organized information for diagnostics. This is possible due to advances in artificial intelligence, computer-aided diagnostics, virtual assistants, robotic surgery, augmented reality, and AI-based genome editing technologies. Such techniques are serving as products for diagnosing emerging microbial and non-microbial diseases. This article presents a combinatory approach to utilizing these techniques in disease diagnostics and providing therapeutic solutions.
Health Information Retrieval Tool (HIRT)
Nyun, Mra Thinzar; Ogunyemi, Omolola; Zeng, Qing
2002-01-01
The World Wide Web (WWW) is a powerful way to deliver on-line health information, but one major problem limits its value to consumers: content is highly distributed, while relevant and high quality information is often difficult to find. To address this issue, we experimented with an approach that utilizes three-dimensional anatomic models in conjunction with free-text search.
Determinants of Computer Utilization by Extension Personnel: A Structural Equations Approach
ERIC Educational Resources Information Center
Sivakumar, Paramasivan Sethuraman; Parasar, Bibudha; Das, Raghu Nath; Anantharaman, Mathevanpillai
2014-01-01
Purpose: Information technology (IT) has tremendous potential for fostering grassroots development and the Indian government has created various capital-intensive computer networks to promote agricultural development. However, research studies have shown that information technology investments are not always translated into productivity gains due…
Data-driven approach for assessing utility of medical tests using electronic medical records.
Skrøvseth, Stein Olav; Augestad, Knut Magne; Ebadollahi, Shahram
2015-02-01
To precisely define the utility of tests in a clinical pathway through data-driven analysis of the electronic medical record (EMR). The information content was defined in terms of the entropy of the expected value of the test related to a given outcome. A kernel density classifier was used to estimate the necessary distributions. To validate the method, we used data from the EMR of the gastrointestinal department at a university hospital. Blood tests from patients undergoing gastrointestinal surgery were analyzed with respect to a second surgery within 30 days of the index surgery. The information content is clearly reflected in the patient pathway for certain combinations of tests and outcomes. C-reactive protein tests coupled to anastomosis leakage, a severe complication, show a clear pattern of information gain through the patient trajectory, where the greatest gain from the test is 3-4 days after the index surgery. We have defined the information content in a data-driven and information-theoretic way such that the utility of a test can be precisely defined. The results reflect clinical knowledge. In the case we studied, the tests carry little negative impact. The general approach can be expanded to cases that carry a substantial negative impact, such as certain radiological techniques. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
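A hedged sketch of the underlying idea, the expected reduction in outcome entropy given a test value, with class-conditional densities estimated by Gaussian kernel density estimation; the synthetic C-reactive-protein-like data and all numeric choices are invented, and this is not the authors' implementation.

```python
import numpy as np
from scipy.stats import gaussian_kde

def entropy(p):
    p = np.clip(np.asarray(p, dtype=float), 1e-12, 1.0)
    return -np.sum(p * np.log2(p))

def expected_information_gain(test_values, outcomes, grid_size=200):
    """Expected reduction in outcome entropy from observing one test value.
    Class-conditional densities are estimated with Gaussian kernel density
    estimates, loosely mirroring a kernel-density classifier."""
    test_values = np.asarray(test_values, dtype=float)
    outcomes = np.asarray(outcomes)
    classes = np.unique(outcomes)
    priors = np.array([(outcomes == c).mean() for c in classes])
    kdes = [gaussian_kde(test_values[outcomes == c]) for c in classes]
    grid = np.linspace(test_values.min(), test_values.max(), grid_size)
    cond = np.array([k(grid) for k in kdes])              # p(x | class) on the grid
    px = (priors[:, None] * cond).sum(axis=0)             # marginal p(x)
    post = (priors[:, None] * cond) / np.clip(px, 1e-12, None)   # p(class | x)
    h_prior = entropy(priors)
    h_post = np.sum(px * np.apply_along_axis(entropy, 0, post)) / np.sum(px)
    return h_prior - h_post

# Synthetic CRP-like values: complication cases tend to have higher values.
rng = np.random.default_rng(0)
crp = np.concatenate([rng.normal(60, 20, 80), rng.normal(150, 40, 20)])
leak = np.concatenate([np.zeros(80), np.ones(20)])
print(round(expected_information_gain(crp, leak), 3))
```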
Utility-preserving anonymization for health data publishing.
Lee, Hyukki; Kim, Soohyung; Kim, Jong Wook; Chung, Yon Dohn
2017-07-11
Publishing raw electronic health records (EHRs) may be considered a breach of the privacy of individuals because they usually contain sensitive information. A common practice for privacy-preserving data publishing is to anonymize the data before publishing, and thus satisfy privacy models such as k-anonymity. Among various anonymization techniques, generalization is the most commonly used in medical/health data processing. Generalization inevitably causes information loss, and thus, various methods have been proposed to reduce information loss. However, existing generalization-based data anonymization methods cannot avoid excessive information loss and preserve data utility. We propose a utility-preserving anonymization for privacy-preserving data publishing (PPDP). To preserve data utility, the proposed method comprises three parts: (1) a utility-preserving model, (2) counterfeit record insertion, and (3) a catalog of the counterfeit records. We also propose an anonymization algorithm using the proposed method. Our anonymization algorithm applies a full-domain generalization algorithm. We evaluate our method in comparison with an existing method on two aspects: information loss, measured through various quality metrics, and the error rate of analysis results. Across all types of quality metrics, our proposed method shows lower information loss than the existing method. In real-world EHR analysis, the results show only a small error between the data anonymized by the proposed method and the original data. We propose a new utility-preserving anonymization method and an anonymization algorithm using the proposed method. Through experiments on various datasets, we show that the utility of EHRs anonymized by the proposed method is significantly better than that of data anonymized by previous approaches.
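For readers unfamiliar with the k-anonymity privacy model mentioned above, here is a minimal generic check (not the paper's counterfeit-insertion algorithm); the quasi-identifier fields and generalized values are illustrative.

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """Check k-anonymity: every combination of quasi-identifier values must be
    shared by at least k records.  'records' is a list of dicts; the field
    names used below are illustrative."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in groups.values())

generalized = [
    {"age": "30-39", "zip": "481**", "diagnosis": "flu"},
    {"age": "30-39", "zip": "481**", "diagnosis": "asthma"},
    {"age": "40-49", "zip": "482**", "diagnosis": "diabetes"},
    {"age": "40-49", "zip": "482**", "diagnosis": "flu"},
]
print(is_k_anonymous(generalized, ["age", "zip"], k=2))   # True
```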
Multimethod-Multisource Approach for Assessing High-Technology Training Systems.
ERIC Educational Resources Information Center
Shlechter, Theodore M.; And Others
This investigation examined the value of using a multimethod-multisource approach to assess high-technology training systems. The research strategy was utilized to provide empirical information on the instructional effectiveness of the Reserve Component Virtual Training Program (RCVTP), which was developed to improve the training of Army National…
Mental energy: Assessing the motivation dimension.
Barbuto, John E
2006-07-01
Content-based theories of motivation may best utilize the meta-theory of work motivation. Process-based theories may benefit most from adopting Locke and Latham's goal-setting approaches and measures. Decision-making theories should utilize the measurement approach operationalized by Ilgen et al. Sustained effort theories should utilize similar approaches to those used in numerous studies of intrinsic motivation, but the measurement of which is typically observational or attitudinal. This paper explored the implications of the four approaches to studying motivation on the newly established model of mental energy. The approach taken for examining motivation informs the measurement of mental energy. Specific recommendations for each approach were developed and provided. As a result of these efforts, it will now be possible to diagnose, measure, and experimentally test for changes in human motivation, which is one of the three major components of mental energy.
Evaluating Data Clustering Approach for Life-Cycle Facility Control
2013-04-01
produce 90% matching accuracy with noise/variations up to 55%. KEYWORDS: Building Information Modelling (BIM), machine learning, pattern detection...reconciled to building information model elements and ultimately to an expected resource utilization schedule. The motivation for this integration is to...by interoperable data sources and building information models. Building performance modelling and simulation efforts such as those by Maile et al
Yoo, Illhoi; Hu, Xiaohua; Song, Il-Yeol
2007-11-27
A huge amount of biomedical textual information has been produced and collected in MEDLINE for decades. In order to easily utilize biomedical information in the free text, document clustering and text summarization together are used as a solution for the text information overload problem. In this paper, we introduce a coherent graph-based semantic clustering and summarization approach for biomedical literature. Our extensive experimental results show that the approach achieves a 45% improvement in cluster quality and a 72% improvement in clustering reliability, in terms of misclassification index, over Bisecting K-means, a leading document clustering approach. In addition, our approach provides a concise but rich text summary in key concepts and sentences. Our coherent biomedical literature clustering and summarization approach that takes advantage of ontology-enriched graphical representations significantly improves the quality of document clusters and the understandability of documents through summaries.
Yoo, Illhoi; Hu, Xiaohua; Song, Il-Yeol
2007-01-01
Background A huge amount of biomedical textual information has been produced and collected in MEDLINE for decades. In order to easily utilize biomedical information in the free text, document clustering and text summarization together are used as a solution for the text information overload problem. In this paper, we introduce a coherent graph-based semantic clustering and summarization approach for biomedical literature. Results Our extensive experimental results show that the approach achieves a 45% improvement in cluster quality and a 72% improvement in clustering reliability, in terms of misclassification index, over Bisecting K-means, a leading document clustering approach. In addition, our approach provides a concise but rich text summary in key concepts and sentences. Conclusion Our coherent biomedical literature clustering and summarization approach that takes advantage of ontology-enriched graphical representations significantly improves the quality of document clusters and the understandability of documents through summaries. PMID:18047705
A reduced order, test verified component mode synthesis approach for system modeling applications
NASA Astrophysics Data System (ADS)
Butland, Adam; Avitabile, Peter
2010-05-01
Component mode synthesis (CMS) is a very common approach used for the generation of large system models. In general, these modeling techniques can be separated into two categories: those utilizing a combination of constraint modes and fixed interface normal modes and those based on a combination of free interface normal modes and residual flexibility terms. The major limitation of the methods utilizing constraint modes and fixed interface normal modes is the inability to easily obtain the required information from testing; the result of this limitation is that constraint mode-based techniques are primarily used with numerical models. An alternate approach is proposed which utilizes frequency and shape information acquired from modal testing to update reduced order finite element models using exact analytical model improvement techniques. The connection degrees of freedom are then rigidly constrained in the test verified, reduced order model to provide the boundary conditions necessary for constraint modes and fixed interface normal modes. The CMS approach is then used with this test verified, reduced order model to generate the system model for further analysis. A laboratory structure is used to show the application of the technique with both numerical and simulated experimental components to describe the system and validate the proposed approach. Actual test data are then used in the proposed approach. Due to typical measurement data contaminants that are always included in any test, the measured data are further processed to remove contaminants and then used in the proposed approach. The final case, using improved data with the reduced order, test verified components, is shown to produce very acceptable results from the Craig-Bampton component mode synthesis approach. Use of the technique, along with its strengths and weaknesses, is discussed.
Controlled trials to improve antibiotic utilization: a systematic review of experience, 1984-2004.
Parrino, Thomas A
2005-02-01
To review the effectiveness of interventions designed to improve antibiotic prescribing patterns in clinical practice and to draw inferences about the most practical methods for optimizing antibiotic utilization in hospital and ambulatory settings. A literature search using online databases for the years 1975-2004 identified controlled trials of strategies for improving antibiotic utilization. Due to variation in study settings and design, quantitative meta-analysis was not feasible. Therefore, a qualitative literature review was conducted. Forty-one controlled trials met the search criteria. Interventions consisted of education, peer review and feedback, physician participation, rewards and penalties, administrative methods, and combined approaches. Social marketing directed at patients and prescribers was effective in varying contexts, as was implementation of practice guidelines. Authorization systems with structured order entry, formulary restriction, and mandatory consultation were also effective. Peer review and feedback were more effective when combined with dissemination of relevant information or social marketing than when used alone. Several practices were effective in improving antibiotic utilization: social marketing, practice guidelines, authorization systems, and peer review and feedback. Online systems providing clinical information, structured order entry, and decision support may be the most promising approach. Further studies, including economic analyses, are needed to confirm or refute this hypothesis.
Quality of Information Approach to Improving Source Selection in Tactical Networks
2017-02-01
consider the performance of this process based on metrics relating to quality of information: accuracy, timeliness, completeness and reliability. These...that are indicators that the network is meeting these quality requirements. We study effective data rate, social distance, link integrity and the...utility of information as metrics within a multi-genre network to determine the quality of information of its available sources. This paper proposes a
Targeted prodrugs in oral drug delivery: the modern molecular biopharmaceutical approach.
Dahan, Arik; Khamis, Mustafa; Agbaria, Riad; Karaman, Rafik
2012-08-01
The molecular revolution greatly impacted the field of drug design and delivery in general, and the utilization of the prodrug approach in particular. The increasing understanding of membrane transporters has promoted a novel 'targeted-prodrug' approach utilizing carrier-mediated transport to increase intestinal permeability, as well as specific enzymes to promote activation to the parent drug. This article provides the reader with a concise overview of this modern approach to prodrug design. Targeting the oligopeptide transporter PEPT1 for absorption and the serine hydrolase valacyclovirase for activation will be presented as examples of the successful utilization of this approach. Additionally, the use of computational approaches, such as DFT and ab initio molecular orbital methods, in modern prodrug design will be discussed. Overall, in the coming years, more and more information will undoubtedly become available regarding intestinal transporters and potential enzymes that may be exploited for the targeted modern prodrug approach. Hence, the concept of prodrug design can no longer be viewed as merely a chemical modification to solve problems associated with parent compounds. Rather, it opens promising opportunities for precise and efficient drug delivery, as well as enhancement of treatment options and therapeutic efficacy.
Students' Bibliographic Research: Competition Enhances Results.
ERIC Educational Resources Information Center
Engel, Nora; And Others
1996-01-01
Describes an approach to a course about the basics of genetics and molecular biology that utilizes a contest that involves students in accessing information about a topic such as sex determination, neural development, and gene therapy. The general objective of the approach is to inspire students to become familiar with the scientific literature.…
ERIC Educational Resources Information Center
Zachariades, Iacovos; Roberts, Sherron Killingsworth
1995-01-01
Describes an innovative and collaborative approach to helping teacher educators better prepare preservice teachers to utilize technology for effective instruction. A mentoring program that paired graduate students in instructional technology with interested faculty members is discussed, and attitudes of the mentors and the faculty members are…
Task Analysis: A Systematic Approach to Designing New Careers Programs.
ERIC Educational Resources Information Center
Jackson, Vivian C.
This guide presents the primary approaches, tools, and techniques utilized by the New Careers Training Laboratory (NCTL) staff to provide skills in training and to conduct agency task analyses. Much of the technical information has been taken from an earlier NCTL publication by Tita Beal, "A New Careers Guide for Career Development…
A Kindergarten Curriculum Guide for Indian Children: A Bilingual-Bicultural Approach.
ERIC Educational Resources Information Center
National Association for the Education of Young Children, Washington, DC.
A bilingual and bicultural approach is presented for teaching Navajo Indian students by enhancing and utilizing the familiar while broadening and enriching the students' experiences related to the larger American culture. Information is given on the significance of early learning, physical and mental aspects of the five year old, articulation of…
A Transparent Translation from Legacy System Model into Common Information Model: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ding, Fei; Simpson, Jeffrey; Zhang, Yingchen
Advances in smart grid are forcing utilities towards better monitoring, control and analysis of distribution systems, and require extensive cyber-based intelligent systems and applications to realize various functionalities. The ability of systems, or components within systems, to interact and exchange services or information with each other is the key to the success of smart grid technologies, and it requires an efficient information exchange and data sharing infrastructure. The Common Information Model (CIM) is a standard that allows different applications to exchange information about an electrical system, and it has become a widely accepted solution for information exchange among different platforms and applications. However, most existing legacy systems are not developed using CIM, but using their own languages. Integrating such legacy systems is a challenge for utilities, and the appropriate utilization of the integrated legacy systems is even more intricate. Thus, this paper has developed an approach and an open-source tool to translate legacy system models into CIM format. The developed tool is tested on a commercial distribution management system, and simulation results have proved its effectiveness.
Culturally Tailored Depression/Suicide Prevention in Latino Youth: Community Perspectives.
Ford-Paz, Rebecca E; Reinhard, Christine; Kuebbeler, Andrea; Contreras, Richard; Sánchez, Bernadette
2015-10-01
Latino adolescents are at elevated risk for depression and suicide compared to other ethnic groups. Project goals were to gain insight from community leaders about depression risk factors particular to Latino adolescents and generate innovative suggestions to improve cultural relevance of prevention interventions. This project utilized a CBPR approach to enhance cultural relevance, acceptability, and utility of the findings and subsequent program development. Two focus groups of youth and youth-involved Latino community leaders (n = 18) yielded three overarching themes crucial to a culturally tailored depression prevention intervention: (1) utilize a multipronged and sustainable intervention approach, (2) raise awareness about depression in culturally meaningful ways, and (3) promote Latino youth's social connection and cultural enrichment activities. Findings suggest that both adaptation of existing prevention programs and development of hybrid approaches may be necessary to reduce depression/suicide disparities for Latino youth. One such hybrid program informed by community stakeholders is described.
Irizarry, Kristopher J L; Bryant, Doug; Kalish, Jordan; Eng, Curtis; Schmidt, Peggy L; Barrett, Gini; Barr, Margaret C
2016-01-01
Many endangered captive populations exhibit reduced genetic diversity resulting in health issues that impact reproductive fitness and quality of life. Numerous cost effective genomic sequencing and genotyping technologies provide unparalleled opportunity for incorporating genomics knowledge in management of endangered species. Genomic data, such as sequence data, transcriptome data, and genotyping data, provide critical information about a captive population that, when leveraged correctly, can be utilized to maximize population genetic variation while simultaneously reducing unintended introduction or propagation of undesirable phenotypes. Current approaches aimed at managing endangered captive populations utilize species survival plans (SSPs) that rely upon mean kinship estimates to maximize genetic diversity while simultaneously avoiding artificial selection in the breeding program. However, as genomic resources increase for each endangered species, the potential knowledge available for management also increases. Unlike model organisms in which considerable scientific resources are used to experimentally validate genotype-phenotype relationships, endangered species typically lack the necessary sample sizes and economic resources required for such studies. Even so, in the absence of experimentally verified genetic discoveries, genomics data still provides value. In fact, bioinformatics and comparative genomics approaches offer mechanisms for translating these raw genomics data sets into integrated knowledge that enable an informed approach to endangered species management.
Irizarry, Kristopher J. L.; Bryant, Doug; Kalish, Jordan; Eng, Curtis; Schmidt, Peggy L.; Barrett, Gini; Barr, Margaret C.
2016-01-01
Many endangered captive populations exhibit reduced genetic diversity resulting in health issues that impact reproductive fitness and quality of life. Numerous cost effective genomic sequencing and genotyping technologies provide unparalleled opportunity for incorporating genomics knowledge in management of endangered species. Genomic data, such as sequence data, transcriptome data, and genotyping data, provide critical information about a captive population that, when leveraged correctly, can be utilized to maximize population genetic variation while simultaneously reducing unintended introduction or propagation of undesirable phenotypes. Current approaches aimed at managing endangered captive populations utilize species survival plans (SSPs) that rely upon mean kinship estimates to maximize genetic diversity while simultaneously avoiding artificial selection in the breeding program. However, as genomic resources increase for each endangered species, the potential knowledge available for management also increases. Unlike model organisms in which considerable scientific resources are used to experimentally validate genotype-phenotype relationships, endangered species typically lack the necessary sample sizes and economic resources required for such studies. Even so, in the absence of experimentally verified genetic discoveries, genomics data still provides value. In fact, bioinformatics and comparative genomics approaches offer mechanisms for translating these raw genomics data sets into integrated knowledge that enable an informed approach to endangered species management. PMID:27376076
In the Face of Cybersecurity: How the Common Information Model Can Be Used
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skare, Paul; Falk, Herbert; Rice, Mark
2016-01-01
Efforts are underway to combine smart grid information, devices, networking, and emergency response information to create messages that are not dependent on specific standards development organizations (SDOs). This supports a future-proof approach of allowing changes in the canonical data models (CDMs) going forward without having to perform forklift replacements of solutions that use the messages. This also allows end users (electric utilities) to upgrade individual components of a larger system while keeping the message payload definitions intact. The goal is to enable public and private information sharing securely in a standards-based approach that can be integrated into existing operations. We provide an example architecture that could benefit from this multi-SDO, secure message approach. This article also describes how to improve message security.
Overcoming Terminology Barrier Using Web Resources for Cross-Language Medical Information Retrieval
Lu, Wen-Hsiang; Lin, Ray Shih-Jui; Chan, Yi-Che; Chen, Kuan-Hsi
2006-01-01
A number of authoritative medical websites, such as PubMed and MedlinePlus, provide consumers with the most up-to-date health information. However, non-English speakers often encounter not only language barriers (from other languages to English) but also terminology barriers (from laypersons’ terms to professional medical terms) when retrieving information from these websites. Our previous work addresses language barriers by developing a multilingual medical thesaurus, Chinese-English MeSH, while this study presents an approach to overcome terminology barriers based on Web resources. Two techniques were utilized in our approach: monolingual concept mapping using approximate string matching and crosslingual concept mapping using Web resources. The evaluation shows that our approach can significantly improve the performance on MeSH concept mapping and cross-language medical information retrieval. PMID:17238395
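As a toy illustration of monolingual concept mapping by approximate string matching (the paper's actual matching method and thesaurus are not reproduced here; the miniature vocabulary and synonym table are assumptions), the sketch below maps a layperson's phrase to the closest entry in a small list of MeSH-style terms using a standard edit-similarity measure.

```python
import difflib

# Assumed miniature vocabulary of professional (MeSH-style) terms.
MESH_TERMS = ["Myocardial Infarction", "Hypertension", "Diabetes Mellitus",
              "Cerebrovascular Accident", "Gastroesophageal Reflux"]

# Assumed mapping from lay phrases to professional synonyms.
LAY_SYNONYMS = {"heart attack": "Myocardial Infarction",
                "high blood pressure": "Hypertension",
                "stroke": "Cerebrovascular Accident"}

def map_concept(layperson_phrase, cutoff=0.6):
    """Map a layperson's phrase to a professional term.

    First try the synonym table, then fall back to approximate string
    matching against the vocabulary itself.
    """
    phrase = layperson_phrase.strip().lower()
    if phrase in LAY_SYNONYMS:
        return LAY_SYNONYMS[phrase]
    matches = difflib.get_close_matches(layperson_phrase, MESH_TERMS,
                                        n=1, cutoff=cutoff)
    return matches[0] if matches else None

print(map_concept("heart attack"))        # Myocardial Infarction
print(map_concept("diabetis melitus"))    # Diabetes Mellitus (fuzzy match)
```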
Bioinspired Resource Management for Multiple-Sensor Target Tracking Systems
2011-06-20
Section 2, we also present the Rényi α-entropy and α-divergence [13] that are extensively utilized in our information-theoretic approach (cf. [9] and...gain in information. The Rényi α-entropy provides a general scalar measure of uncertainty [10]: $H_\alpha(x_k \mid z_{1:k}) = \frac{1}{1-\alpha} \log \int p^\alpha(x_k \mid z_{1:k}) \, dx_k$ (7) ... it follows that as $\alpha$ approaches unity, the Rényi α-entropy (7) reduces to the Shannon entropy: $H_1(x_k \mid z_{1:k}) = \lim_{\alpha \to 1} H_\alpha(x_k \mid z_{1:k}) = -\int p(x_k \mid z_{1:k}) \log p(x_k \mid z_{1:k}) \, dx_k$
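For intuition only (not taken from the cited report; the probability vector is invented), the snippet below evaluates the discrete Rényi entropy and shows numerically that it approaches the Shannon entropy as α approaches 1.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Discrete Rényi entropy (nats); requires alpha > 0 and alpha != 1."""
    p = np.asarray(p, float)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def shannon_entropy(p):
    p = np.asarray(p, float)
    return -np.sum(p * np.log(p))

p = np.array([0.5, 0.3, 0.15, 0.05])
for alpha in (0.5, 0.9, 0.99, 0.999, 1.001):
    print(f"alpha={alpha:6.3f}  H_alpha={renyi_entropy(p, alpha):.6f}")
print(f"Shannon           H      ={shannon_entropy(p):.6f}")
```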
An integrated approach to system design, reliability, and diagnosis
NASA Technical Reports Server (NTRS)
Patterson-Hine, F. A.; Iverson, David L.
1990-01-01
The requirement for ultradependability of computer systems in future avionics and space applications necessitates a top-down, integrated systems engineering approach for design, implementation, testing, and operation. The functional analyses of hardware and software systems must be combined by models that are flexible enough to represent their interactions and behavior. The information contained in these models must be accessible throughout all phases of the system life cycle in order to maintain consistency and accuracy in design and operational decisions. One approach being taken by researchers at Ames Research Center is the creation of an object-oriented environment that integrates information about system components required in the reliability evaluation with behavioral information useful for diagnostic algorithms. Procedures have been developed at Ames that perform reliability evaluations during design and failure diagnoses during system operation. These procedures utilize information from a central source, structured as object-oriented fault trees. Fault trees were selected because they are a flexible model widely used in aerospace applications and because they give a concise, structured representation of system behavior. The utility of this integrated environment for aerospace applications in light of our experiences during its development and use is described. The techniques for reliability evaluation and failure diagnosis are discussed, and current extensions of the environment and areas requiring further development are summarized.
An integrated approach to system design, reliability, and diagnosis
NASA Astrophysics Data System (ADS)
Patterson-Hine, F. A.; Iverson, David L.
1990-12-01
The requirement for ultradependability of computer systems in future avionics and space applications necessitates a top-down, integrated systems engineering approach for design, implementation, testing, and operation. The functional analyses of hardware and software systems must be combined by models that are flexible enough to represent their interactions and behavior. The information contained in these models must be accessible throughout all phases of the system life cycle in order to maintain consistency and accuracy in design and operational decisions. One approach being taken by researchers at Ames Research Center is the creation of an object-oriented environment that integrates information about system components required in the reliability evaluation with behavioral information useful for diagnostic algorithms. Procedures have been developed at Ames that perform reliability evaluations during design and failure diagnoses during system operation. These procedures utilize information from a central source, structured as object-oriented fault trees. Fault trees were selected because they are a flexible model widely used in aerospace applications and because they give a concise, structured representation of system behavior. The utility of this integrated environment for aerospace applications in light of our experiences during its development and use is described. The techniques for reliability evaluation and failure diagnosis are discussed, and current extensions of the environment and areas requiring further development are summarized.
Placebo non-response measure in sequential parallel comparison design studies.
Rybin, Denis; Doros, Gheorghe; Pencina, Michael J; Fava, Maurizio
2015-07-10
The Sequential Parallel Comparison Design (SPCD) is one of the novel approaches addressing placebo response. The analysis of SPCD data typically classifies subjects as 'placebo responders' or 'placebo non-responders'. Most current methods employed for analysis of SPCD data utilize only a part of the data collected during the trial. A repeated measures model was proposed for analysis of continuous outcomes that permitted the inclusion of information from all subjects into the treatment effect estimation. We describe here a new approach using a weighted repeated measures model that further improves the utilization of data collected during the trial, allowing the incorporation of information that is relevant to the placebo response, and dealing with the problem of possible misclassification of subjects. Our simulations show that when compared to the unweighted repeated measures model method, our approach performs as well or, under certain conditions, better, in preserving the type I error, achieving adequate power and minimizing the mean squared error. Copyright © 2015 John Wiley & Sons, Ltd.
Multimedia Approach and Its Effect in Teaching Mathematics for the Prospective Teachers
ERIC Educational Resources Information Center
Joan, D. R. Robert; Denisia, S. P.
2012-01-01
Multimedia improves the effectiveness of the teaching-learning process in formal and informal settings by utilizing scientific principles. It allows us to sort out information, analyse it, and make meaning for conceptualization and applications suitable for individual learners. The objectives of the study were to measure the…
Selecting essential information for biosurveillance--a multi-criteria decision analysis.
Generous, Nicholas; Margevicius, Kristen J; Taylor-McCabe, Kirsten J; Brown, Mac; Daniel, W Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina
2014-01-01
The National Strategy for Biosurveillance defines biosurveillance as "the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels." However, the strategy does not specify how "essential information" is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as "essential". The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method to apply formal scientific decision theoretical approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of "essential information" for use in biosurveillance systems or processes, and we offer this framework to the global BSV community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system.
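A bare-bones illustration of the multi-attribute utility idea (the criteria, weights, and scores below are invented for the example, not taken from the paper): each candidate data stream is scored on several criteria, the scores are normalized, and a weighted sum ranks the streams.

```python
# Candidate biosurveillance data streams scored (0-10) on assumed criteria.
streams = {
    "emergency department visits": {"timeliness": 8, "coverage": 6, "cost": 4},
    "lab-confirmed case reports":  {"timeliness": 4, "coverage": 9, "cost": 6},
    "web search trends":           {"timeliness": 9, "coverage": 5, "cost": 9},
}

# Assumed criterion weights; higher is better for every criterion here
# (cost has been pre-scored so that 10 means "cheap").
weights = {"timeliness": 0.5, "coverage": 0.3, "cost": 0.2}

def utility(scores, weights):
    """Additive multi-attribute utility: weighted sum of normalized scores."""
    return sum(weights[c] * scores[c] / 10.0 for c in weights)

ranking = sorted(streams.items(), key=lambda kv: utility(kv[1], weights),
                 reverse=True)
for name, scores in ranking:
    print(f"{name:30s} utility = {utility(scores, weights):.2f}")
```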
Model prototype utilization in the analysis of fault tolerant control and data processing systems
NASA Astrophysics Data System (ADS)
Kovalev, I. V.; Tsarev, R. Yu; Gruzenkin, D. V.; Prokopenko, A. V.; Knyazkov, A. N.; Laptenok, V. D.
2016-04-01
A procedure for assessing the profit from implementing a control and data processing system is presented in the paper. The reasonableness of creating and analyzing a model prototype follows from implementing the approach of fault tolerance provision through the inclusion of structural and software redundancy. The developed procedure allows finding the best ratio between the cost of developing and analyzing the model prototype and the earnings from utilizing its results and the information produced. The suggested approach is illustrated by a model example of profit assessment and analysis of a control and data processing system.
Soft-information flipping approach in multi-head multi-track BPMR systems
NASA Astrophysics Data System (ADS)
Warisarn, C.; Busyatras, W.; Myint, L. M. M.
2018-05-01
Inter-track interference is one of the most severe impairments in bit-patterned media recording (BPMR) systems. This impairment can be effectively handled by a modulation code and a multi-head array jointly processing multiple tracks; however, such a modulation constraint has never been utilized to improve the soft-information. Therefore, this paper proposes the utilization of modulation codes with an encoded constraint defined by the criteria for soft-information flipping during a three-track data detection process. Moreover, we also investigate the optimal offset position of the readheads to provide the greatest improvement in system performance. The simulation results indicate that the proposed systems, with and without position jitter, are significantly superior to uncoded systems.
Jenkinson, Garrett; Abante, Jordi; Feinberg, Andrew P; Goutsias, John
2018-03-07
DNA methylation is a stable form of epigenetic memory used by cells to control gene expression. Whole genome bisulfite sequencing (WGBS) has emerged as a gold-standard experimental technique for studying DNA methylation by producing high resolution genome-wide methylation profiles. Statistical modeling and analysis is employed to computationally extract and quantify information from these profiles in an effort to identify regions of the genome that demonstrate crucial or aberrant epigenetic behavior. However, the performance of most currently available methods for methylation analysis is hampered by their inability to directly account for statistical dependencies between neighboring methylation sites, thus ignoring significant information available in WGBS reads. We present a powerful information-theoretic approach for genome-wide modeling and analysis of WGBS data based on the 1D Ising model of statistical physics. This approach takes into account correlations in methylation by utilizing a joint probability model that encapsulates all information available in WGBS methylation reads and produces accurate results even when applied to single WGBS samples with low coverage. Using the Shannon entropy, our approach provides a rigorous quantification of methylation stochasticity in individual WGBS samples genome-wide. Furthermore, it utilizes the Jensen-Shannon distance to evaluate differences in methylation distributions between a test and a reference sample. Differential performance assessment using simulated and real human lung normal/cancer data demonstrates a clear superiority of our approach over DSS, a recently proposed method for WGBS data analysis. Critically, these results demonstrate that marginal methods become statistically invalid when correlations are present in the data. This contribution demonstrates clear benefits and the necessity of modeling joint probability distributions of methylation using the 1D Ising model of statistical physics and of quantifying methylation stochasticity using concepts from information theory. By employing this methodology, substantial improvement of DNA methylation analysis can be achieved by effectively taking into account the massive amount of statistical information available in WGBS data, which is largely ignored by existing methods.
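To make the information-theoretic quantities concrete (a generic illustration, not the authors' genome-wide Ising model; the distributions and bins are invented), the sketch below computes the Shannon entropy of a methylation-level distribution within a region and the Jensen-Shannon distance between a test and a reference region.

```python
import numpy as np
from scipy.spatial.distance import jensenshannon

def shannon_entropy(p, base=2):
    """Entropy of a discrete distribution over methylation levels."""
    p = np.asarray(p, float)
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(base)

# Assumed distributions over the fraction of methylated CpGs in a region
# (bins: 0%, 25%, 50%, 75%, 100%), e.g. estimated from WGBS reads.
reference = np.array([0.70, 0.15, 0.08, 0.05, 0.02])
test      = np.array([0.10, 0.15, 0.20, 0.25, 0.30])

print(f"entropy(reference) = {shannon_entropy(reference):.3f} bits")
print(f"entropy(test)      = {shannon_entropy(test):.3f} bits")

# scipy's jensenshannon returns the JS distance (square root of JS divergence).
print(f"JS distance        = {jensenshannon(reference, test, base=2):.3f}")
```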
Georgino, Madeline M; Kress, Terri; Alexander, Sheila; Beach, Michael
2015-01-01
The purpose of this project was to measure trauma nurses' improvement in familiarity with emergency preparedness and disaster response core competencies, as originally defined by the Emergency Preparedness Information Questionnaire, after a focused educational program. An adapted version of the Emergency Preparedness Information Questionnaire was utilized to measure nurses' familiarity with core competencies pertinent to first responder capabilities. This project utilized a pre- and postsurvey descriptive design and integrated education sessions into the preexisting, mandatory "Trauma Nurse Course" at a large, level I trauma center. A total of 63 nurses completed the intervention during the May and September 2014 sessions. Overall, all 8 competencies demonstrated significant (P < .001; 98% confidence interval) improvements in familiarity. In conclusion, this pilot quality improvement project demonstrated a unique approach to educating nurses to be more ready and comfortable when treating victims of a disaster.
1983-06-06
solubility, Hgb F quantitation (alkali denaturation), clot retraction, unstable Hgb studies, cryofibrinogen, methemoglobin; parasitology: blood, occult and gross...Performance Standards...Types of Information to be Recorded Under Course of Treatment...Factors Contributing to...examine external environmental factors implicating the need for utilization review of ancillary services within an Army Medical Center. (Hereinafter the
McCullagh, Laura; Schmitz, Susanne; Barry, Michael; Walsh, Cathal
2017-11-01
In Ireland, all new drugs for which reimbursement by the healthcare payer is sought undergo a health technology assessment by the National Centre for Pharmacoeconomics. The National Centre for Pharmacoeconomics estimates expected value of perfect information but not partial expected value of perfect information (owing to the computational expense associated with typical methodologies). The objective of this study was to examine the feasibility and utility of estimating partial expected value of perfect information via a computationally efficient, non-parametric regression approach. This was a retrospective analysis of evaluations of drugs for cancer that had been submitted to the National Centre for Pharmacoeconomics (January 2010 to December 2014 inclusive). Drugs were excluded if cost-effective at the submitted price. Drugs were excluded if concerns existed regarding the validity of the applicants' submission or if cost-effectiveness model functionality did not allow required modifications to be made. For each included drug (n = 14), the value of information was estimated at the final reimbursement price, at a threshold equivalent to the incremental cost-effectiveness ratio at that price. The expected value of perfect information was estimated from probabilistic analysis. Partial expected value of perfect information was estimated via a non-parametric approach. Input parameters with a population value of at least €1 million were identified as potential targets for research. All partial estimates were determined within minutes. Thirty parameters (across nine models) each had a value of at least €1 million. These were categorised. Collectively, survival analysis parameters were valued at €19.32 million, health state utility parameters at €15.81 million and parameters associated with the cost of treating adverse effects at €6.64 million. Those associated with drug acquisition costs and with the cost of care were valued at €6.51 million and €5.71 million, respectively. This research demonstrates that the estimation of partial expected value of perfect information via this computationally inexpensive approach could be considered feasible as part of the health technology assessment process for reimbursement purposes within the Irish healthcare system. It might be a useful tool in prioritising future research to decrease decision uncertainty.
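A simplified sketch of how EVPI and a regression-style partial EVPI can be computed from probabilistic sensitivity analysis output (the net-benefit model, parameter names, willingness-to-pay threshold, and binned-mean smoother are assumptions; the Centre's actual models are not reproduced).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Synthetic probabilistic sensitivity analysis: two uncertain parameters.
hazard_ratio = rng.lognormal(mean=-0.2, sigma=0.15, size=n)   # survival parameter
utility_gain = rng.beta(a=8, b=12, size=n)                    # health-state utility

# Net monetary benefit (per patient) of new drug vs comparator, assumed model.
wtp = 45_000  # willingness-to-pay threshold, EUR per QALY (assumption)
nmb_new = wtp * (1.2 * utility_gain / hazard_ratio) - 30_000
nmb_old = wtp * (1.0 * utility_gain) - 5_000
nmb = np.column_stack([nmb_old, nmb_new])

# EVPI = E[max over options] - max over options of E[net benefit].
evpi = nmb.max(axis=1).mean() - nmb.mean(axis=0).max()

# Partial EVPI for one parameter: smooth each option's net benefit on that
# parameter, then apply the same formula to the fitted values. A simple
# binned mean stands in for a non-parametric smoother here.
def partial_evpi(param, nmb, bins=50):
    edges = np.quantile(param, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.digitize(param, edges[1:-1]), 0, bins - 1)
    fitted = np.empty_like(nmb)
    for b in range(bins):
        fitted[idx == b] = nmb[idx == b].mean(axis=0)
    return fitted.max(axis=1).mean() - fitted.mean(axis=0).max()

print(f"EVPI per patient:              {evpi:10.0f} EUR")
print(f"partial EVPI (hazard ratio):   {partial_evpi(hazard_ratio, nmb):10.0f} EUR")
print(f"partial EVPI (health utility): {partial_evpi(utility_gain, nmb):10.0f} EUR")
```

In practice the binned mean would be replaced by a proper non-parametric regression (e.g., a generalized additive model), which is the idea behind the computationally efficient approach the abstract describes.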
Expected Power-Utility Maximization Under Incomplete Information and with Cox-Process Observations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fujimoto, Kazufumi, E-mail: m_fuji@kvj.biglobe.ne.jp; Nagai, Hideo, E-mail: nagai@sigmath.es.osaka-u.ac.jp; Runggaldier, Wolfgang J., E-mail: runggal@math.unipd.it
2013-02-15
We consider the problem of maximization of expected terminal power utility (risk sensitive criterion). The underlying market model is a regime-switching diffusion model where the regime is determined by an unobservable factor process forming a finite state Markov process. The main novelty is due to the fact that prices are observed and the portfolio is rebalanced only at random times corresponding to a Cox process where the intensity is driven by the unobserved Markovian factor process as well. This leads to a more realistic modeling for many practical situations, like in markets with liquidity restrictions; on the other hand it considerably complicates the problem to the point that traditional methodologies cannot be directly applied. The approach presented here is specific to the power-utility. For log-utilities a different approach is presented in Fujimoto et al. (Preprint, 2012).
Optimization Based Data Mining Approach for Forecasting Real-Time Energy Demand
DOE Office of Scientific and Technical Information (OSTI.GOV)
Omitaomu, Olufemi A; Li, Xueping; Zhou, Shengchao
The worldwide concern over environmental degradation, increasing pressure on electric utility companies to meet peak energy demand, and the requirement to avoid purchasing power from the real-time energy market are motivating utility companies to explore new approaches for forecasting energy demand. Until now, most approaches for forecasting energy demand rely on monthly electrical consumption data. The emergence of smart meter data is changing the data space for electric utility companies, and creating opportunities for utility companies to collect and analyze energy consumption data at a much finer temporal resolution of at least 15-minute intervals. While the data granularity provided by smart meters is important, there are still other challenges in forecasting energy demand; these challenges include lack of information about appliance usage and occupant behavior. Consequently, in this paper, we develop an optimization based data mining approach for forecasting real-time energy demand using smart meter data. The objective of our approach is to develop a robust estimation of energy demand without access to these other building and behavior data. Specifically, the forecasting problem is formulated as a quadratic programming problem and solved using the so-called support vector machine (SVM) technique in an online setting. The parameters of the SVM technique are optimized using a simulated annealing approach. The proposed approach is applied to hourly smart meter data for several residential customers over several days.
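A rough sketch in the same spirit (not the paper's implementation): support vector regression forecasts the next interval's load from recent smart-meter readings, with its hyperparameters tuned by simulated annealing. The synthetic load series, lag length, and parameter bounds are assumptions.

```python
import numpy as np
from scipy.optimize import dual_annealing
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic 15-minute smart-meter load with a daily cycle plus noise.
t = np.arange(96 * 30)                                # 30 days of readings
load = 2.0 + np.sin(2 * np.pi * t / 96) + 0.2 * rng.standard_normal(t.size)

# Supervised framing: predict the next reading from the previous 8 readings.
lag = 8
X = np.column_stack([load[i:len(load) - lag + i] for i in range(lag)])
y = load[lag:]
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.25, shuffle=False)

def validation_error(params):
    """Objective for simulated annealing: validation MSE of an SVR model."""
    log_C, log_gamma, epsilon = params
    model = SVR(C=10 ** log_C, gamma=10 ** log_gamma, epsilon=epsilon)
    model.fit(X_tr, y_tr)
    return np.mean((model.predict(X_val) - y_val) ** 2)

# Search over log10(C), log10(gamma), and epsilon (bounds are assumptions).
bounds = [(-1, 3), (-4, 0), (0.01, 0.5)]
result = dual_annealing(validation_error, bounds, maxiter=30,
                        no_local_search=True, seed=0)
print("best (log10 C, log10 gamma, epsilon):", np.round(result.x, 3))
print("validation MSE:", round(result.fun, 4))
```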
MIDAS: Mining differentially activated subpaths of KEGG pathways from multi-class RNA-seq data.
Lee, Sangseon; Park, Youngjune; Kim, Sun
2017-07-15
Pathway based analysis of high throughput transcriptome data is a widely used approach to investigate biological mechanisms. Since a pathway consists of multiple functions, the recent approach is to determine condition specific sub-pathways or subpaths. However, there are several challenges. First, few existing methods utilize explicit gene expression information from RNA-seq. More importantly, subpath activity is usually an average of statistical scores, e.g., correlations, of edges in a candidate subpath, which fails to reflect gene expression quantity information. In addition, none of the existing methods can handle multiple phenotypes. To address these technical problems, we designed and implemented an algorithm, MIDAS, that determines condition specific subpaths, each of which has different activities across multiple phenotypes. MIDAS fully utilizes gene expression quantity information and network centrality information to determine condition specific subpaths. To test the performance of our tool, we used TCGA breast cancer RNA-seq gene expression profiles with five molecular subtypes. 36 differentially activated subpaths were determined. The utility of our method, MIDAS, was demonstrated in four ways. All 36 subpaths are well supported by the literature information. Subsequently, we showed that these subpaths had good discriminant power for classification of the five cancer subtypes and also had prognostic power in terms of survival analysis. Finally, in a performance comparison of MIDAS to a recent subpath prediction method, PATHOME, our method identified more subpaths and many more genes that are well supported by the literature information. http://biohealth.snu.ac.kr/software/MIDAS/. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Review of Reliability-Based Design Optimization Approach and Its Integration with Bayesian Method
NASA Astrophysics Data System (ADS)
Zhang, Xiangnan
2018-03-01
Many uncertain factors arise in practical engineering, such as the external load environment, material properties, geometrical shape, initial conditions, and boundary conditions. Reliability methods measure the structural safety condition and determine the optimal design parameter combination based on probabilistic theory. Reliability-based design optimization (RBDO), which combines reliability theory and optimization, is the most commonly used approach to minimize structural cost or other performance measures under uncertain variables. However, it cannot handle various kinds of incomplete information. The Bayesian approach is utilized to incorporate this kind of incomplete information in its uncertainty quantification. In this paper, the RBDO approach and its integration with the Bayesian method are introduced.
Zander, Thorsten O; Kothe, Christian
2011-04-01
Cognitive monitoring is an approach utilizing realtime brain signal decoding (RBSD) for gaining information on the ongoing cognitive user state. In recent decades this approach has brought valuable insight into the cognition of an interacting human. Automated RBSD can be used to set up a brain-computer interface (BCI) providing a novel input modality for technical systems solely based on brain activity. In BCIs the user usually sends voluntary and directed commands to control the connected computer system or to communicate through it. In this paper we propose an extension of this approach by fusing BCI technology with cognitive monitoring, providing valuable information about the users' intentions, situational interpretations and emotional states to the technical system. We call this approach passive BCI. In the following we give an overview of studies which utilize passive BCI, as well as other novel types of applications resulting from BCI technology. We especially focus on applications for healthy users, and the specific requirements and demands of this user group. Since the presented approach of combining cognitive monitoring with BCI technology is very similar to the concept of BCIs itself we propose a unifying categorization of BCI-based applications, including the novel approach of passive BCI.
Vest, Joshua R; Abramson, Erika
2015-01-01
Health information exchange (HIE) systems facilitate access to patient information for a variety of health care organizations, end users, and clinical and organizational goals. While a complex intervention, organizations' usage of HIE is often conceptualized and measured narrowly. We sought to provide greater specificity to the concept of HIE as an intervention by formulating a typology of organizational HIE usage. We interviewed representatives of a regional health information organization and health care organizations actively using HIE information to change patient utilization and costs. The resultant typology includes three dimensions: user role, usage initiation, and patient set. This approach to categorizing how health care organizations are actually applying HIE information to clinical and business tasks provides greater clarity about HIE as an intervention and helps elucidate the conceptual linkage between HIE and organizational and patient outcomes.
Efficient Discovery of De-identification Policies Through a Risk-Utility Frontier
Xia, Weiyi; Heatherly, Raymond; Ding, Xiaofeng; Li, Jiuyong; Malin, Bradley
2014-01-01
Modern information technologies enable organizations to capture large quantities of person-specific data while providing routine services. Many organizations hope, or are legally required, to share such data for secondary purposes (e.g., validation of research findings) in a de-identified manner. In previous work, it was shown de-identification policy alternatives could be modeled on a lattice, which could be searched for policies that met a prespecified risk threshold (e.g., likelihood of re-identification). However, the search was limited in several ways. First, its definition of utility was syntactic - based on the level of the lattice - and not semantic - based on the actual changes induced in the resulting data. Second, the threshold may not be known in advance. The goal of this work is to build the optimal set of policies that trade-off between privacy risk (R) and utility (U), which we refer to as a R-U frontier. To model this problem, we introduce a semantic definition of utility, based on information theory, that is compatible with the lattice representation of policies. To solve the problem, we initially build a set of policies that define a frontier. We then use a probability-guided heuristic to search the lattice for policies likely to update the frontier. To demonstrate the effectiveness of our approach, we perform an empirical analysis with the Adult dataset of the UCI Machine Learning Repository. We show that our approach can construct a frontier closer to optimal than competitive approaches by searching a smaller number of policies. In addition, we show that a frequently followed de-identification policy (i.e., the Safe Harbor standard of the HIPAA Privacy Rule) is suboptimal in comparison to the frontier discovered by our approach. PMID:25520961
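For illustration only (the policies and their risk and utility values below are simplified stand-ins, not the paper's lattice search or its information-theoretic utility measure): each candidate de-identification policy is assigned a re-identification risk and a utility score, and the non-dominated set forms the R-U frontier.

```python
# Each assumed policy: (name, re-identification risk, utility retained).
policies = [
    ("full detail",        0.90, 1.00),
    ("generalize ZIP",     0.40, 0.85),
    ("generalize ZIP+age", 0.08, 0.70),
    ("Safe-Harbor-like",   0.10, 0.45),
    ("suppress all QIs",   0.01, 0.10),
    ("generalize age",     0.55, 0.80),
]

def ru_frontier(policies):
    """Return policies not dominated in both lower risk and higher utility."""
    frontier = []
    for name, risk, util in policies:
        dominated = any(r <= risk and u >= util and (r, u) != (risk, util)
                        for _, r, u in policies)
        if not dominated:
            frontier.append((name, risk, util))
    return sorted(frontier, key=lambda p: p[1])

for name, risk, util in ru_frontier(policies):
    print(f"{name:20s} risk={risk:.2f} utility={util:.2f}")
```

With these invented numbers the Safe-Harbor-like policy falls off the frontier, mirroring the abstract's observation that a fixed policy can be suboptimal relative to the discovered frontier.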
Efficient Discovery of De-identification Policies Through a Risk-Utility Frontier.
Xia, Weiyi; Heatherly, Raymond; Ding, Xiaofeng; Li, Jiuyong; Malin, Bradley
2013-01-01
Modern information technologies enable organizations to capture large quantities of person-specific data while providing routine services. Many organizations hope, or are legally required, to share such data for secondary purposes (e.g., validation of research findings) in a de-identified manner. In previous work, it was shown de-identification policy alternatives could be modeled on a lattice, which could be searched for policies that met a prespecified risk threshold (e.g., likelihood of re-identification). However, the search was limited in several ways. First, its definition of utility was syntactic - based on the level of the lattice - and not semantic - based on the actual changes induced in the resulting data. Second, the threshold may not be known in advance. The goal of this work is to build the optimal set of policies that trade-off between privacy risk (R) and utility (U), which we refer to as a R-U frontier. To model this problem, we introduce a semantic definition of utility, based on information theory, that is compatible with the lattice representation of policies. To solve the problem, we initially build a set of policies that define a frontier. We then use a probability-guided heuristic to search the lattice for policies likely to update the frontier. To demonstrate the effectiveness of our approach, we perform an empirical analysis with the Adult dataset of the UCI Machine Learning Repository. We show that our approach can construct a frontier closer to optimal than competitive approaches by searching a smaller number of policies. In addition, we show that a frequently followed de-identification policy (i.e., the Safe Harbor standard of the HIPAA Privacy Rule) is suboptimal in comparison to the frontier discovered by our approach.
Implications of sampling design and sample size for national carbon accounting systems
Michael Köhl; Andrew Lister; Charles T. Scott; Thomas Baldauf; Daniel Plugge
2011-01-01
Countries willing to adopt a REDD regime need to establish a national Measurement, Reporting and Verification (MRV) system that provides information on forest carbon stocks and carbon stock changes. Due to the extensive areas covered by forests, the information is generally obtained by sample-based surveys. Most operational sampling approaches utilize a combination of...
The Information Needs of Virtual Users: A Study of Second Life Libraries
ERIC Educational Resources Information Center
Chow, Anthony S.; Baity, C. Chase; Zamarripa, Marilyn; Chappell, Pam; Rachlin, David; Vinson, Curtis
2012-01-01
As virtual worlds continue to proliferate globally, libraries are faced with the question of whether to provide information services to virtual patrons. This study, utilizing a mixed-method approach of interviews, focus groups, and surveys, represents one of the largest studies of virtual libraries attempted to date. Taking a holistic perspective,…
ERIC Educational Resources Information Center
Santa, Annesa Flentje; Cochran, Bryan N.
2008-01-01
The purpose of this study was to inform future Public Service Announcement (PSA) development by examining the potential effectiveness of different types of anti-driving under the influence (DUI) PSAs for persons with different characteristics. PSAs utilizing empathy, fear, and informational approaches were shown to persons recruited from…
An Information Retrieval Approach for Robust Prediction of Road Surface States.
Park, Jae-Hyung; Kim, Kwanho
2017-01-28
Recently, due to the increasing importance of reducing severe vehicle accidents on roads (especially on highways), the automatic identification of road surface conditions, and the provisioning of such information to drivers in advance, have been gaining significant momentum as a proactive solution to decrease the number of vehicle accidents. In this paper, we propose an information retrieval approach that aims to identify road surface states by combining conventional machine-learning techniques and moving average methods. Specifically, when signal information is received from a radar system, our approach attempts to estimate the current state of the road surface based on similar instances observed previously, using a given similarity function. Next, the estimated state is calibrated using the recently estimated states to yield both effective and robust prediction results. To validate the performance of the proposed approach, we established a real-world experimental setting on a section of actual highway in South Korea and conducted a comparison with conventional approaches in terms of accuracy. The experimental results show that the proposed approach successfully outperforms the previously developed methods.
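A simplified sketch of the retrieve-then-smooth idea (the feature values, state labels, neighbor count, and window size are invented; this is not the authors' radar pipeline): the current feature vector is matched against previously observed instances with a similarity function, and the resulting estimate is calibrated with a moving majority over recent estimates.

```python
import numpy as np
from collections import Counter, deque

STATES = ["dry", "wet", "icy"]

def knn_estimate(query, past_features, past_labels, k=5):
    """Label the query by majority vote of its k most similar past instances
    (similarity = negative Euclidean distance)."""
    dists = np.linalg.norm(past_features - query, axis=1)
    nearest = np.argsort(dists)[:k]
    return Counter(past_labels[i] for i in nearest).most_common(1)[0][0]

def calibrated_stream(queries, past_features, past_labels, window=3):
    """Smooth instantaneous estimates with a moving majority over a window."""
    recent = deque(maxlen=window)
    for q in queries:
        recent.append(knn_estimate(q, past_features, past_labels))
        yield Counter(recent).most_common(1)[0][0]

# Synthetic "radar" features: two summary statistics per measurement.
rng = np.random.default_rng(7)
centers = {"dry": (0.2, 0.1), "wet": (0.6, 0.5), "icy": (0.9, 0.8)}
past_labels = np.array([STATES[i % 3] for i in range(300)])
past_features = (np.array([centers[s] for s in past_labels])
                 + 0.1 * rng.standard_normal((300, 2)))

# A stream that drifts from dry toward icy conditions.
stream = (np.array([centers["dry"]] * 5 + [centers["icy"]] * 5)
          + 0.1 * rng.standard_normal((10, 2)))
print(list(calibrated_stream(stream, past_features, past_labels)))
```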
An Information Retrieval Approach for Robust Prediction of Road Surface States
Park, Jae-Hyung; Kim, Kwanho
2017-01-01
Recently, due to the increasing importance of reducing severe vehicle accidents on roads (especially on highways), the automatic identification of road surface conditions, and the provisioning of such information to drivers in advance, have been gaining significant momentum as a proactive solution to decrease the number of vehicle accidents. In this paper, we propose an information retrieval approach that aims to identify road surface states by combining conventional machine-learning techniques and moving average methods. Specifically, when signal information is received from a radar system, our approach attempts to estimate the current state of the road surface based on similar instances observed previously, using a given similarity function. Next, the estimated state is calibrated using the recently estimated states to yield both effective and robust prediction results. To validate the performance of the proposed approach, we established a real-world experimental setting on a section of actual highway in South Korea and conducted a comparison with conventional approaches in terms of accuracy. The experimental results show that the proposed approach successfully outperforms the previously developed methods. PMID:28134859
Nguyen, Bang H; Vo, Phuong H; Doan, Hiep T; McPhee, Stephen J
2006-01-01
Colorectal cancer is the third most common cancer in Vietnamese Americans. Their colorectal screening rates are lower than the rates of whites. Four focus groups were conducted to identify Vietnamese American sources and credibility of health information, media utilization, and intervention approaches. Vietnamese Americans trusted doctors and patient testimonials and had access to, and received most of their health information from, Vietnamese-language print and electronic media. Recommended intervention approaches include promoting doctors' recommendation of screening and using Vietnamese-language mass media, print materials, and oral presentations. Focus groups are useful in determining communication channels and intervention approaches.
NASA Astrophysics Data System (ADS)
Yates, D. N.; Kaatz, L.
2016-12-01
Over the past decade, water utility managers across Colorado have joined together to advance their understanding of the role of climate information in their planning processes. In an unprecedented step, managers from 5 different organizations and agencies pooled their resources and worked collaboratively to better understand the ever evolving role of science in helping them understand the risks, uncertainties, and opportunities that climate uncertainty and change might bring to this semi-arid region. The group developed an ongoing educational process to better understand climate projections (Scale); communicated cohesively with customers and the media (Communication); provided institutional coverage for an often contentious topic (Safety); helped coordinate with other investigations and participants to facilitate education and training (Collaboration); and pooled finances, staff, and expert resources (Resources). We will share this experience and give examples of concrete outcomes.
New Methodology for Estimating Fuel Economy by Vehicle Class
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chin, Shih-Miao; Dabbs, Kathryn; Hwang, Ho-Ling
2011-01-01
Office of Highway Policy Information to develop a new methodology to generate annual estimates of average fuel efficiency and number of motor vehicles registered by vehicle class for Table VM-1 of the Highway Statistics annual publication. This paper describes the new methodology developed under this effort and compares the results of the existing manual method and the new systematic approach. The methodology developed under this study takes a two-step approach. First, the preliminary fuel efficiency rates are estimated based on vehicle stock models for different classes of vehicles. Then, a reconciliation model is used to adjust the initial fuel consumption rates from the vehicle stock models and match the VMT information for each vehicle class and the reported total fuel consumption. This reconciliation model utilizes a systematic approach that produces documentable and reproducible results. The basic framework utilizes a mathematical programming formulation to minimize the deviations between the fuel economy estimates published in the previous year's Highway Statistics and the results from the vehicle stock models, subject to the constraint that fuel consumptions for different vehicle classes must sum to the total fuel consumption estimate published in Table MF-21 of the current year's Highway Statistics. The results generated from this new approach provide a smoother time series for the fuel economies by vehicle class. It also utilizes the most up-to-date and best available data with sound econometric models to generate MPG estimates by vehicle class.
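A small sketch of the reconciliation step as described (the vehicle classes, preliminary MPG values, VMT figures, and reported total are invented): adjust the preliminary per-class fuel consumption rates as little as possible, subject to the constraint that the implied class fuel use sums to the reported total.

```python
import numpy as np
from scipy.optimize import minimize

# Assumed data: annual VMT (billion miles) and preliminary MPG by class.
classes = ["light duty", "buses", "single-unit trucks", "combination trucks"]
vmt = np.array([2800.0, 15.0, 110.0, 180.0])          # billion miles
prelim_mpg = np.array([23.0, 7.2, 7.5, 6.0])          # from vehicle stock models
reported_total_fuel = 155.0                           # billion gallons (assumed total)

prelim_rate = 1.0 / prelim_mpg                        # gallons per mile

def objective(rate):
    # Stay as close as possible to the preliminary rates (relative deviations).
    return np.sum(((rate - prelim_rate) / prelim_rate) ** 2)

constraints = [{"type": "eq",
                "fun": lambda rate: np.dot(vmt, rate) - reported_total_fuel}]
bounds = [(1e-6, None)] * len(classes)

res = minimize(objective, prelim_rate, bounds=bounds, constraints=constraints)
adjusted_mpg = 1.0 / res.x

for name, m0, m1 in zip(classes, prelim_mpg, adjusted_mpg):
    print(f"{name:20s} preliminary {m0:5.2f} MPG -> reconciled {m1:5.2f} MPG")
print(f"implied total fuel: {np.dot(vmt, res.x):.1f} billion gallons")
```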
ERIC Educational Resources Information Center
Garcia-Post, Aine
2016-01-01
This study aimed to add to the information known about the leaders charged with the task of successfully and sustainably turning around a school. Specifically, this research examined the extent to which leaders engaged in turnaround school reform experienced mindfulness. Utilizing a mixed-methods approach, state assigned school grades of schools…
ERIC Educational Resources Information Center
Perera, Indika
2010-01-01
ICT (information and communication technologies) have opened up enormous possibilities for bringing computing into users' daily lives. Every aspect of social need has been touched by ICT, including learning. VL (virtual learning), with a life span of slightly more than a decade, is still looking for approaches to enhance its functions under significant pressure…
ERIC Educational Resources Information Center
Tsui, David; van der Kooy, Derek
2008-01-01
We utilized olfactory-mediated chemotaxis in "Caenorhabditis elegans" to examine the effect of aging on information processing and animal behavior. Wild-type (N2) young adults (day 4) initially approach and eventually avoid a point source of benzaldehyde. Aged adult animals (day 7) showed a stronger initial approach and a delayed avoidance to…
ERIC Educational Resources Information Center
Rasmussen, Ashley B.
2017-01-01
This study utilized a semi-structured interview approach to identify how college methods professors in Nebraska are engaging pre-service K-12 teachers with the Next Generation Science Standards and to determine if this information is being carried over to Nebraska K-12 classrooms. The study attempted to address these items by answering the…
ERIC Educational Resources Information Center
Barney, David C.; Pleban, Francis T.
2018-01-01
Objectives: To provide further information regarding physical education (PE) teachers' perceptions of incorporating music in PE lessons and to evaluate the influence of music on the classroom environment using a qualitative approach. Method: Electronic survey interviews were conducted with 26 veteran PE instructors (10 male, 16 female), from 7…
Costing the Morbidity and Mortality Consequences of Zoonoses Using Health-Adjusted Life Years.
Jordan, H; Dunt, D; Hollingsworth, B; Firestone, S M; Burgman, M
2016-10-01
Governments are routinely involved in the biosecurity of agricultural and food imports and exports. This involves controlling the complex ongoing threat of the broad range of zoonoses: endemic, exotic and newly emerging. Policy-related decision-making in these areas requires accurate information and predictions concerning the effects and potential impacts of zoonotic diseases. The aim of this article was to provide information concerning the development and use of utility-based tools, specifically disability-adjusted life years (DALYs), for measuring the burden of human disease (morbidity and mortality) arising as a consequence of zoonotic infections. Issues and challenges to their use are also considered. Non-monetary utility approaches that are reviewed in this paper form one of a number of tools that can be used to estimate the monetary and non-monetary 'cost' of morbidity- and mortality-related consequences. Other tools derive from cost-of-illness, willingness-to-pay and multicriteria approaches. Utility-based approaches are specifically designed to capture the pain, suffering and loss of functioning associated with diseases, zoonotic and otherwise. These effects are typically complicated to define, measure and subsequently 'cost'. Utility-based measures will not be able to capture all of the effects, especially those that extend beyond the health sector. These will more normally be captured in financial terms. As with other uncommon diseases, the quality of the relevant epidemiological data may not be adequate to support the estimation of losses in utility as a result of zoonoses. Other issues in their use have been identified. New empirical studies have shown some success in addressing these issues. Other issues await further study. It is concluded that, bearing in mind all caveats, utility-based methods are important tools in assessing the magnitude of the impacts of zoonoses on human disease. They make an important contribution to decision-making and priority setting across all sectors. In doing so, they highlight the relative importance of the burden of zoonotic disease globally. © 2014 Blackwell Verlag GmbH.
Asymmetric information and economics
NASA Astrophysics Data System (ADS)
Frieden, B. Roy; Hawkins, Raymond J.
2010-01-01
We present an expression of the economic concept of asymmetric information with which it is possible to derive the dynamical laws of an economy. To illustrate the utility of this approach we show how the assumption of optimal information flow leads to a general class of investment strategies including the well-known Q theory of Tobin. Novel consequences of this formalism include a natural definition of market efficiency and an uncertainty principle relating capital stock and investment flow.
ERIC Educational Resources Information Center
Grant, Darren
2007-01-01
We determine how much observed student performance in microeconomics principles can be attributed, inferentially, to three kinds of student academic "productivity," the instructor, demographics, and unmeasurables. The empirical approach utilizes an ordered probit model that relates student performance in micro to grades in prior…
NASA Astrophysics Data System (ADS)
Behr, Joshua G.; Diaz, Rafael
Non-urgent Emergency Department utilization has been associated with increasing congestion in the flow and treatment of patients and, by extension, affects the quality of care and profitability of the Emergency Department. Interventions designed to divert populations to more appropriate care may be cautiously received by operations managers because of uncertainty about the impact an adopted intervention may have on the two values of congestion and profitability. System Dynamics (SD) modeling and simulation may be used to measure the sensitivity of these two, often-competing, values of congestion and profitability and, thus, provide an additional layer of information designed to inform strategic decision making.
Simmons, David
2011-01-01
This article explores the utility of ethnography in accounting for healers’ understandings of HIV/AIDS—and more generally sexually transmitted infections—and the planning of HIV/AIDS education interventions targeting healers in urban Zimbabwe. I argue that much of the information utilized for planning and implementing such programs is actually based on rapid research procedures (usually single-method survey-based approaches) that do not fully capture healers’ explanatory frameworks. This incomplete information then becomes authoritative knowledge about local ‘traditions' and forms the basis for the design and implementation of training programs. Such decontextualization may, in turn, affect program effectiveness. PMID:21343161
2011-05-01
iTunes illustrate the difference between the centralized approach of digital library systems and the distributed approach of container file formats...metadata in a container file format. Apple’s iTunes uses a centralized metadata approach and allows users to maintain song metadata in a single...one iTunes library to another the metadata must be copied separately or reentered in the new library. This demonstrates the utility of storing metadata
A problem-oriented approach to journal selection for hospital libraries.
Delman, B S
1982-01-01
This paper describes a problem-oriented approach to journal selection (PAJS), including general methodology, theoretical terms, and a brief description of results when the system was applied in three different hospitals. The PAJS system relates the objective information which the MEDLARS data base offers about the universe of biomedical literature to objective, problem-oriented information supplied by the hospital's medical records. The results were manipulated quantitatively to determine (1) the relevance of various journals to each of the hospital's defined significant information problems and (2) the overall utility of each journal to the institution as a whole. The utility information was plotted on a graph to identify the collection of journal titles which would be most useful to the given hospital. Attempts made to verify certain aspects of the whole process are also described. The results suggest that the methodology is generally able to provide an effective library response. The system optimizes resources vis-a-vis information and can be used for both budget allocation and justification. It offers an algorithm to which operations researchers can apply any one of a variety of mathematical programming methods. Although originally intended for librarians in the community hospital environment, the PAJS system is generalizable and has application potential in a variety of special library settings. PMID:6758893
Research utilization in the building industry: decision model and preliminary assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watts, R.L.; Johnson, D.R.; Smith, S.A.
1985-10-01
The Research Utilization Program was conceived as a far-reaching means for managing the interactions of the private sector and the federal research sector as they deal with energy conservation in buildings. The program emphasizes a private-public partnership in planning a research agenda and in applying the results of ongoing and completed research. The results of this task support the hypothesis that the transfer of R and D results to the buildings industry can be accomplished more efficiently and quickly by a systematic approach to technology transfer. This systematic approach involves targeting decision makers, assessing research and information needs, properly formatting information, and then transmitting the information through trusted channels. The purpose of this report is to introduce elements of a market-oriented knowledge base, which would be useful to the Building Systems Division, the Office of Buildings and Community Systems, and their associated laboratories in managing a private-public research partnership on a rational, systematic basis. This report presents conceptual models and data bases that can be used in formulating a technology transfer strategy and in planning technology transfer programs.
Building a knowledge translation platform in Malawi to support evidence-informed health policy.
Berman, Joshua; Mitambo, Collins; Matanje-Mwagomba, Beatrice; Khan, Shiraz; Kachimanga, Chiyembekezo; Wroe, Emily; Mwape, Lonia; van Oosterhout, Joep J; Chindebvu, Getrude; van Schoor, Vanessa; Puchalski Ritchie, Lisa M; Panisset, Ulysses; Kathyola, Damson
2015-12-08
With the support of the World Health Organization's Evidence-Informed Policy Network, knowledge translation platforms have been developed throughout Africa, the Americas, Eastern Europe, and Asia to further evidence-informed national health policy. In this commentary, we discuss the approaches, activities and early lessons learned from the development of a Knowledge Translation Platform in Malawi (KTPMalawi). Through ongoing leadership, as well as financial and administrative support, the Malawi Ministry of Health has strongly signalled its intention to utilize a knowledge translation platform methodology to support evidence-informed national health policy. A unique partnership between Dignitas International, a medical and research non-governmental organization, and the Malawi Ministry of Health, has established KTPMalawi to engage national-level policymakers, researchers and implementers in a coordinated approach to the generation and utilization of health-sector research. Utilizing a methodology developed and tested by knowledge translation platforms across Africa, a stakeholder mapping exercise and initial capacity building workshops were undertaken and a multidisciplinary Steering Committee was formed. This Steering Committee prioritized the development of two initial Communities of Practice to (1) improve data utilization in the pharmaceutical supply chain and (2) improve the screening and treatment of hypertension within HIV-infected populations. Each Community of Practice's mandate is to gather and synthesize the best available global and local evidence and produce evidence briefs for policy that have been used as the primary input into structured deliberative dialogues. While a lack of sustained initial funding slowed its early development, KTPMalawi has greatly benefited from extensive technical support and mentorship by an existing network of global knowledge translation platforms. With the continued support of the Malawi Ministry of Health and the Evidence-Informed Policy Network, KTPMalawi can continue to build on its role in facilitating the use of evidence in the development and refinement of health policy in Malawi.
Simulation of the hyperspectral data from multispectral data using Python programming language
NASA Astrophysics Data System (ADS)
Tiwari, Varun; Kumar, Vinay; Pandey, Kamal; Ranade, Rigved; Agarwal, Shefali
2016-04-01
Multispectral remote sensing (MRS) sensors have proved their potential for acquiring and retrieving information on Land Use Land Cover (LULC) features over the past few decades. These MRS sensors generally acquire data in a limited number of broad spectral bands, typically ranging from 3 to 10 bands. The limited number of bands and the broad spectral bandwidth of MRS sensors become a limitation in detailed LULC studies, as such sensors are not capable of distinguishing spectrally similar LULC features. In contrast, the detailed information available in hyperspectral (HRS) data is spectrally overdetermined and able to distinguish spectrally similar materials on the earth's surface. However, the availability of HRS sensors is presently limited, because of the requirement for sensitive detectors and large storage capacity, which makes acquisition and processing cumbersome and expensive. There is therefore a need to utilize the available MRS data for detailed LULC studies. The spectral reconstruction approach is one technique used for simulating hyperspectral data from available multispectral data. In the present study, the spectral reconstruction approach is utilized for the simulation of hyperspectral data using EO-1 ALI multispectral data. The technique is implemented using the Python programming language, which is open source and has support for advanced image processing libraries and utilities. Overall, 70 bands have been simulated and validated using visual interpretation, statistical analysis, and a classification approach.
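The following sketch illustrates one generic way a spectral-reconstruction style simulation can be set up in Python; the band counts, spectral library, and relative spectral response functions are synthetic placeholders, not the EO-1 ALI configuration used in the study.

```python
# Each pixel's full spectrum is approximated as a linear combination of library
# spectra whose coefficients are fitted to the pixel's multispectral band values.
import numpy as np

n_hs_bands = 70          # hyperspectral bands to simulate
n_ms_bands = 9           # multispectral bands (assumed)
n_lib = 12               # library spectra (field/lab signatures, assumed)

rng = np.random.default_rng(0)
library = rng.random((n_lib, n_hs_bands))      # library spectra (n_lib x 70)
srf = rng.random((n_ms_bands, n_hs_bands))     # relative spectral responses
srf /= srf.sum(axis=1, keepdims=True)          # each MS band integrates HS bands

# MS-band signature of every library spectrum: (n_lib x n_ms_bands)
library_ms = library @ srf.T

def reconstruct_pixel(ms_pixel):
    """Fit library weights to the observed MS values, then expand to 70 bands."""
    weights, *_ = np.linalg.lstsq(library_ms.T, ms_pixel, rcond=None)
    return weights @ library

# Example: a synthetic MS pixel built from two library materials
true_weights = np.zeros(n_lib)
true_weights[[2, 7]] = [0.6, 0.4]
ms_pixel = true_weights @ library_ms
hs_pixel = reconstruct_pixel(ms_pixel)
print(hs_pixel.shape)    # (70,)
```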
Expected Utility Distributions for Flexible, Contingent Execution
NASA Technical Reports Server (NTRS)
Bresina, John L.; Washington, Richard
2000-01-01
This paper presents a method for using expected utility distributions in the execution of flexible, contingent plans. A utility distribution maps the possible start times of an action to the expected utility of the plan suffix starting with that action. The contingent plan encodes a tree of possible courses of action and includes flexible temporal constraints and resource constraints. When execution reaches a branch point, the eligible option with the highest expected utility at that point in time is selected. The utility distributions make this selection sensitive to the runtime context, yet still efficient. Our approach uses predictions of action duration uncertainty as well as expectations of resource usage and availability to determine when an action can execute and with what probability. Execution windows and probabilities inevitably change as execution proceeds, but such changes do not invalidate the cached utility distributions; thus, dynamic updating of utility information is minimized.
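A hedged sketch of the branch-point selection rule described above: each eligible option carries a cached utility distribution mapping start time to expected plan-suffix utility, and the option with the highest value at the current time is chosen. The step-function representation and the option names are illustrative assumptions.

```python
import bisect

class UtilityDistribution:
    """Step-function map from start time to expected utility of the plan suffix."""
    def __init__(self, breakpoints, utilities):
        self.breakpoints = breakpoints      # sorted start times
        self.utilities = utilities          # utility on [breakpoints[i], breakpoints[i+1])

    def expected_utility(self, start_time):
        i = bisect.bisect_right(self.breakpoints, start_time) - 1
        if i < 0 or i >= len(self.utilities):
            return float("-inf")            # outside the option's execution window
        return self.utilities[i]

def select_option(options, now):
    """Pick the eligible option with the highest expected utility at time `now`."""
    return max(options, key=lambda name: options[name].expected_utility(now))

options = {
    "drive_to_rock_A": UtilityDistribution([0, 10, 20, 30], [0.9, 0.7, 0.3]),
    "image_horizon":   UtilityDistribution([0, 15, 40],     [0.5, 0.6]),
}
print(select_option(options, now=18))   # drive_to_rock_A (utility 0.7 vs 0.6 at t=18)
```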
Integrating structure-based and ligand-based approaches for computational drug design.
Wilson, Gregory L; Lill, Markus A
2011-04-01
Methods utilized in computer-aided drug design can be classified into two major categories: structure based and ligand based, using information on the structure of the protein or on the biological and physicochemical properties of bound ligands, respectively. In recent years there has been a trend towards integrating these two methods in order to enhance the reliability and efficiency of computer-aided drug-design approaches by combining information from both the ligand and the protein. This trend resulted in a variety of methods that include: pseudoreceptor methods, pharmacophore methods, fingerprint methods and approaches integrating docking with similarity-based methods. In this article, we will describe the concepts behind each method and selected applications.
Proceedings: 1990 EPRI gas turbine procurement seminar
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDonald, B.L.; Miller, M.N.
1991-06-01
This seminar presents information that enables utilities to implement more cost-effective procurements for gas turbine and combined-cycle power generation equipment. A systematic approach to specification and permitting procedures can lower unit life-cycle cost. APPROACH. Thirty-two staff members from 25 utilities met in Danvers, Massachusetts, October 9-11, 1990. Speakers representing utilities, vendors, and EPRI contractors presented material on recent procurement and startup experiences, permitting considerations, specification strategy, bid evaluation techniques, and a vendor's perspective of utility procurements. KEY POINTS. The seminar focused on specification features, procurement procedures, and bid evaluation techniques designed to implement life-cycle cost-effective procurement consistent with the plant mission. Speakers highlighted the following issues: Experiential case histories of recent procurements and startups, emphasizing how to design procurement procedures that improve plant operating economics; Current trends in permitting for NOx compliance and recent permitting experience; Quantifiable evaluations of vendors' bids for RAM-related characteristics; The means to obtain specifically desired but nonstandard equipment features.
Novel approaches for road congestion mitigation.
DOT National Transportation Integrated Search
2012-07-02
Transportation planning usually aims to solve two problems: the traffic assignment problem and the toll pricing problem. The latter utilizes information from the former in order to find the optimal set of tolls, that is, the set of tolls that lea...
Novel approaches for road congestion minimization.
DOT National Transportation Integrated Search
2012-07-01
Transportation planning usually aims to solve two problems: the traffic assignment problem and the toll pricing problem. The latter utilizes information from the former in order to find the optimal set of tolls, that is, the set of tolls that lea...
Maxwell's demon in biochemical signal transduction with feedback loop
Ito, Sosuke; Sagawa, Takahiro
2015-01-01
Signal transduction in living cells is vital to maintaining life itself, where information transfer in a noisy environment plays a significant role. In a rather different context, the recent intensive research on 'Maxwell's demon' (a feedback controller that utilizes information about individual molecules) has led to a unified theory of information and thermodynamics. Here we combine these two streams of research, and show that the second law of thermodynamics with information reveals the fundamental limit of the robustness of signal transduction against environmental fluctuations. In particular, we find that the degree of robustness is quantitatively characterized by an informational quantity called transfer entropy. Our information-thermodynamic approach is applicable to biological communication inside cells, in which there is no explicit channel coding, in contrast to artificial communication. Our result could open up a novel biophysical approach to understanding information processing in living systems on the basis of the fundamental information-thermodynamics link. PMID:26099556
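To make the quantity mentioned above concrete, the sketch below estimates transfer entropy for two discrete time series with one-step histories; it is a generic estimator, not the specific information-thermodynamic analysis of the paper.

```python
# T_{X->Y} = sum p(y_{t+1}, y_t, x_t) * log2[ p(y_{t+1}|y_t,x_t) / p(y_{t+1}|y_t) ]
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Transfer entropy X->Y in bits for two equal-length discrete series."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))     # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))           # (y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))            # (y_{t+1}, y_t)
    singles_y = Counter(y[:-1])                       # y_t
    n = len(x) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_y1_given_y0x0 = c / pairs_yx[(y0, x0)]
        p_y1_given_y0 = pairs_yy[(y1, y0)] / singles_y[y0]
        te += p_joint * np.log2(p_y1_given_y0x0 / p_y1_given_y0)
    return te

rng = np.random.default_rng(1)
x = rng.integers(0, 2, size=10000)
y = np.roll(x, 1)                       # y copies x with a one-step delay
y[0] = rng.integers(0, 2)
print(f"T_X->Y ~ {transfer_entropy(x.tolist(), y.tolist()):.2f} bits")  # close to 1
```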
2013-07-30
more about STEM. From museums, to gardens, to planetariums and more, Places to Go mobilizes people to explore the STEM resources offered by their...Works website was developed utilizing a phased approach. This approach allowed for informed, periodic updates to the structure, design, and backend ...our web development team, throughout this phase. A significant amount of backend development work on the website, as well as design work was completed
LLCySA: Making Sense of Cyberspace
2014-01-01
data center. His other activities include the development of immersive 3D environments leveraging video-game technology to provide a multiplayer ...exploring data-driven approaches to network protection. Imagine a cyber analyst navigating a three-dimensional (3D) game, walking down virtual office...because of information overload. One approach to this challenge leverages technology utilized in the 3D gaming industry. The video-game medium
An Initial Strategy for Commercial Industry Awareness of the International Space Station
NASA Technical Reports Server (NTRS)
Jorgensen, Catherine A.
1999-01-01
While plans are being developed to utilize the ISS for scientific research, and human and microgravity experiments, it is time to consider the future of the ISS as a world-wide commercial marketplace developed from a government owned, operated and controlled facility. Commercial industry will be able to seize this opportunity to utilize the ISS as a unique manufacturing platform and engineering testbed for advanced technology. NASA has begun the strategic planning of the evolution and commercialization of the ISS. The Pre-Planned Program Improvement (P3I) Working Group at NASA is assessing the future ISS needs and technology plans to enhance ISS performance. Some of these enhancements will allow the accommodation of commercial applications and the Human Exploration and Development of Space mission support. As this information develops, it is essential to disseminate this information to commercial industry, targeting not only the private and public space sector but also the non-aerospace commercial industries. An approach is presented for early distribution of this information via the ISS Evolution Data book that includes ISS baseline system information, baseline utilization and operations plans, advanced technologies, future utilization opportunities, ISS evolution and Design Reference Missions (DRM). This information source and tool can be used as catalyst in the commercial world for the generation of ideas and options to enhance the current capabilities of the ISS.
A country for old men? Long-term home care utilization in Europe.
Balia, Silvia; Brau, Rinaldo
2014-10-01
This paper investigates long-term home care utilization in Europe. Data from the first wave of the Survey of Health, Ageing and Retirement in Europe (SHARE) on formal care (nursing care and paid domestic help) and informal care (support provided by relatives) are used to study both the probability and the quantity of each type of care. The overall process is framed in a fully simultaneous equation system that takes the form of a bivariate two-part model in which the reciprocal interaction between formal and informal care is estimated. Endogeneity and unobservable heterogeneity are addressed using a common latent factor approach. The analysis of the relative impact of age and disability on home care utilization is enriched by the use of a proximity to death (PtD) indicator built using the second wave of SHARE. All these indicators are important predictors of home care utilization. In particular, a strong significant effect of PtD is found in the paid domestic help and informal care models. The relationship between formal and informal care moves from substitutability to complementarity depending on the type of care considered, and the estimated effects are small in absolute size. This might call for a reconsideration of the effectiveness of incentives for informal care as instruments to reduce public expenditure on home care services. Copyright © 2013 John Wiley & Sons, Ltd.
Adding localization information in a fingerprint binary feature vector representation
NASA Astrophysics Data System (ADS)
Bringer, Julien; Despiegel, Vincent; Favre, Mélanie
2011-06-01
At BTAS'10, a new framework for transforming a fingerprint minutiae template into a binary feature vector of fixed length was described. A fingerprint is characterized by its similarity to a fixed-size set of representative local minutiae vicinities. This representative-based approach leads to a fixed-length binary representation and, because the approach is local, it makes it possible to deal with local distortions that may occur between two acquisitions. We extend this construction to incorporate additional information in the binary vector, in particular on the localization of the vicinities. We explore the use of position and orientation information. The performance improvement is promising for use in fast identification algorithms or in privacy protection algorithms.
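The sketch below is a schematic illustration (not the BTAS'10 algorithm itself) of mapping a variable-size minutiae template to a fixed-length binary vector by comparing local vicinity descriptors against a fixed set of representatives, with a few extra bits encoding coarse localization of the best match; the descriptor form, similarity measure, and bit budget are assumptions.

```python
import numpy as np

def to_binary_vector(descriptors, centres, representatives,
                     threshold=0.4, loc_bits=4, image_width=500):
    """descriptors: (n, d) local vicinity descriptors; centres: (n, 2) positions;
    representatives: (N, d) fixed reference vicinities -> fixed-length bit vector."""
    bits = []
    for rep in representatives:
        # toy similarity between each vicinity and the representative
        scores = np.exp(-np.linalg.norm(descriptors - rep, axis=1))
        best = int(np.argmax(scores))
        bits.append(1 if scores[best] >= threshold else 0)
        # extension discussed above: coarse localization (quantized x of best match)
        xbin = min(int(centres[best, 0] / image_width * 2**loc_bits), 2**loc_bits - 1)
        bits.extend(int(b) for b in format(xbin, f"0{loc_bits}b"))
    return np.array(bits, dtype=np.uint8)

rng = np.random.default_rng(0)
vec = to_binary_vector(rng.random((30, 16)), rng.random((30, 2)) * 500,
                       rng.random((64, 16)))
print(vec.shape)   # (64 * (1 + 4),) = (320,) regardless of template size
```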
2006-09-01
expected advancements in information technology and library science offer the best hope of resolving the above concerns. An EWA will be...information technology and library science must be utilized to accomplish this. Some DOD research investment may be required to resolve DOD specific...distributed assessment process that exploits the documentation of all of the CEST issues, advances in information technology and library science, and the
Sharma, Vivekanand; Law, Wayne; Balick, Michael J.; Sarkar, Indra Neil
2017-01-01
The growing amount of data describing historical medicinal uses of plants from digitization efforts provides the opportunity to develop systematic approaches for identifying potential plant-based therapies. However, cataloguing plant use information from natural language text is a challenging task for ethnobotanists. To date, there has been only limited adoption of informatics approaches for supporting the identification of ethnobotanical information associated with medicinal uses. This study explored the feasibility of using biomedical terminologies and natural language processing approaches for extracting relevant plant-associated therapeutic use information from the historical biodiversity literature collection available from the Biodiversity Heritage Library. The results from this preliminary study suggest that there is potential utility in informatics methods for identifying medicinal plant knowledge from digitized resources, and they highlight opportunities for improvement. PMID:29854223
Simultaneous Visualization of Different Utility Networks for Disaster Management
NASA Astrophysics Data System (ADS)
Semm, S.; Becker, T.; Kolbe, T. H.
2012-07-01
Cartographic visualizations of crises are used to create a Common Operational Picture (COP) and reinforce Situational Awareness by presenting and representing relevant information. As nearly all crises affect geospatial entities, geo-data representations have to support location-specific decision-making throughout the crisis. Since operators' attention span and working memory are limiting factors in acquiring and interpreting information, the cartographic presentation has to support individuals in coordinating their activities and in handling highly dynamic situations. The Situational Awareness of operators, in conjunction with a COP, is a key aspect of the decision-making process and essential for arriving at appropriate decisions. Utility networks are among the most complex and most needed systems within a city. The visualization of utility infrastructure in crisis situations is addressed in this paper. The paper provides a conceptual approach for simplifying, aggregating, and visualizing multiple utility networks and their components to meet the requirements of the decision-making process and to support Situational Awareness.
Span graphics display utilities handbook, first edition
NASA Technical Reports Server (NTRS)
Gallagher, D. L.; Green, J. L.; Newman, R.
1985-01-01
The Space Physics Analysis Network (SPAN) is a computer network connecting scientific institutions throughout the United States. This network provides an avenue for timely, correlative research between investigators in a multidisciplinary approach to space physics studies. An objective in the development of SPAN is to make available direct and simplified procedures that scientists can use, without specialized training, to exchange information over the network. Information exchanges include raw and processed data, analysis programs, correspondence, documents, and graphic images. This handbook details procedures that can be used to exchange graphic images over SPAN. The intent is to periodically update this handbook to reflect the constantly changing facilities available on SPAN. The utilities described within reflect an earnest attempt to provide useful descriptions of working utilities that can be used to transfer graphic images across the network. Whether graphic images are representative of satellite observations or theoretical modeling, and whether graphic images are of a device-dependent or device-independent type, the SPAN graphics display utilities handbook will be the user's guide to graphic image exchange.
[Rational drug use: an economic approach to decision making].
Mota, Daniel Marques; da Silva, Marcelo Gurgel Carlos; Sudo, Elisa Cazue; Ortún, Vicente
2008-04-01
The present article approaches rational drug use (RDU) from the economic point of view. The implementation of RDU implies costs and involves the acquisition of knowledge and behavioral changes by several agents. The difficulties in implementing RDU may be due to shortage problems, information asymmetry, lack of information, uncertain clinical decisions, externalities, time-price, incentives for drug prescribers and dispensers, drug prescriber preferences, and marginal utility. Health authorities, among other agencies, must therefore regularize, rationalize, and control drug use to minimize inefficiency in pharmaceutical care and to prevent exposing the population to unnecessary health risks.
An approach for integrating toxicogenomic data in risk assessment: The dibutyl phthalate case study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Euling, Susan Y., E-mail: euling.susan@epa.gov; Thompson, Chad M.; Chiu, Weihsueh A.
An approach for evaluating and integrating genomic data in chemical risk assessment was developed based on the lessons learned from performing a case study for the chemical dibutyl phthalate. A case study prototype approach was first developed in accordance with EPA guidance and recommendations of the scientific community. Dibutyl phthalate (DBP) was selected for the case study exercise. The scoping phase of the dibutyl phthalate case study was conducted by considering the available DBP genomic data, taken together with the entire data set, for whether they could inform various risk assessment aspects, such as toxicodynamics, toxicokinetics, and dose-response. A description of weighing the available dibutyl phthalate data set for utility in risk assessment provides an example of considering genomic data for future chemical assessments. As a result of conducting the scoping process, two questions were selected to focus the case study exercise: (1) Do the DBP toxicogenomic data inform the mechanisms or modes of action? and (2) Do they inform the interspecies differences in toxicodynamics? Principles of the general approach include considering the genomic data in conjunction with all other data to determine their ability to inform the various qualitative and/or quantitative aspects of risk assessment, and evaluating the relationship between the available genomic and toxicity outcome data with respect to study comparability and phenotypic anchoring. Based on experience from the DBP case study, recommendations and a general approach for integrating genomic data in chemical assessment were developed to advance the broader effort to utilize 21st century data in risk assessment. - Highlights: • Performed DBP case study for integrating genomic data in risk assessment • Present approach for considering genomic data in chemical risk assessment • Present recommendations for use of genomic data in chemical risk assessment.
2012-05-07
AFRL-RV-PS-TP-2012-0017: MULTIPLE-ARRAY DETECTION, ASSOCIATION AND LOCATION OF INFRASOUND AND SEISMO-ACOUSTIC EVENTS - UTILIZATION OF GROUND-TRUTH (Contract FA8718-08-C-0008)... infrasound signals from both correlated and uncorrelated noise. Approaches to this problem are implementation of the F-detector, which employs the F
Statistically Based Approach to Broadband Liner Design and Assessment
NASA Technical Reports Server (NTRS)
Jones, Michael G. (Inventor); Nark, Douglas M. (Inventor)
2016-01-01
A broadband liner design optimization includes utilizing in-duct attenuation predictions with a statistical fan source model to obtain optimum impedance spectra over a number of flow conditions for one or more liner locations in a bypass duct. The predicted optimum impedance information is then used with acoustic liner modeling tools to design liners having impedance spectra that most closely match the predicted optimum values. Design selection is based on an acceptance criterion that provides the ability to apply increasing weighting to specific frequencies and/or operating conditions. One or more broadband design approaches are utilized to produce a broadband liner that targets a full range of frequencies and operating conditions.
1980-12-01
experimental approach. Brit. J. Educ. Psychol. 37, 1967, 81-98 - EYSENCK, H.J.: Personality and attainment: an application of psychological principles to...and away from written testing. Accordingly, a large portion of US Army testing employs the hands-on approach to testing. The Hands-On Test (HOT) is...increase the amount of job-related information utilized by raters in appraisals. All three approaches are aimed at increasing accuracy by reducing
NASA Technical Reports Server (NTRS)
Gardner, D. G.; Tejwani, G. D.; Bircher, F. E.; Loboda, J. A.; Van Dyke, D. B.; Chenevert, D. J.
1991-01-01
Details are presented of the approach used in a comprehensive program to utilize exhaust plume diagnostics for rocket engine health-and-condition monitoring and assessing SSME component wear and degradation. This approach incorporates both spectral and video monitoring of the exhaust plume. Video monitoring provides qualitative data for certain types of component wear while spectral monitoring allows both quantitative and qualitative information. Consideration is given to spectral identification of SSME materials and baseline plume emissions.
Central Limit Theorem: New SOCR Applet and Demonstration Activity
ERIC Educational Resources Information Center
Dinov, Ivo D.; Christou, Nicholas; Sanchez, Juana
2008-01-01
Modern approaches for information technology based blended education utilize a variety of novel instructional, computational and network resources. Such attempts employ technology to deliver integrated, dynamically linked, interactive content and multi-faceted learning environments, which may facilitate student comprehension and information…
Ecology, Elementary Teaching Guide.
ERIC Educational Resources Information Center
Gross, Iva Helen
In an effort to provide background information and encourage incorporation of ecological understandings into the curriculum, this teacher's guide has been devised for fourth and fifth grade teachers. It utilizes an activity-oriented approach to discovery and inquiry, outlining behavioral objectives, learning activities, teaching suggestions, and…
Biswas, Rakesh; Maniam, Jayanthy; Lee, Edwin Wen Huo; Gopal, Premalatha; Umakanth, Shashikiran; Dahiya, Sumit; Ahmed, Sayeed
2008-10-01
The hypothesis in the conceptual model was that a user-driven innovation in presently available information and communication technology infrastructure would be able to meet patient and health professional users' information needs and help them attain better health outcomes. An operational model was created to plan a trial on a sample diabetic population using a randomized controlled trial design, assigning one randomly selected group of diabetics to receive an electronic information intervention and analysing whether it would improve their health outcomes in comparison with a matched diabetic population receiving only regular medical intervention. Diabetes was chosen for this particular trial as it is a major chronic illness in Malaysia, as elsewhere in the world. This is in essence a position paper on how the study concept should be organized, intended to stimulate wider discussion prior to beginning the study.
Hawkins, H; Langer, J; Padua, E; Reaves, J
2001-06-01
Activity-based costing (ABC) is a process that enables the estimation of the cost of producing a product or service. More accurate than traditional charge-based approaches, it emphasizes analysis of processes, and more specific identification of both direct and indirect costs. This accuracy is essential in today's healthcare environment, in which managed care organizations necessitate responsible and accountable costing. However, to be successfully utilized, it requires time, effort, expertise, and support. Data collection can be tedious and expensive. By integrating ABC with information management (IM) and systems (IS), organizations can take advantage of the process orientation of both, extend and improve ABC, and decrease resource utilization for ABC projects. In our case study, we have examined the process of a multidisciplinary breast center. We have mapped the constituent activities and established cost drivers. This information has been structured and included in our information system database for subsequent analysis.
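A minimal, generic activity-based costing sketch that makes the allocation logic concrete; the activity pools, cost drivers, and dollar figures are invented for illustration and are not taken from the breast-center case study.

```python
indirect_cost_pools = {            # annual indirect cost per activity pool ($)
    "scheduling_and_registration": 120_000,
    "imaging_suite_overhead":      300_000,
    "multidisciplinary_review":    180_000,
}
cost_drivers = {                   # annual driver volumes (assumed)
    "scheduling_and_registration": 8_000,   # appointments booked
    "imaging_suite_overhead":      5_000,   # imaging studies performed
    "multidisciplinary_review":    1_200,   # cases reviewed in conference
}
driver_rates = {pool: cost / cost_drivers[pool]
                for pool, cost in indirect_cost_pools.items()}

def cost_of_patient(direct_cost, driver_usage):
    """Direct cost plus indirect cost allocated by actual driver consumption."""
    indirect = sum(driver_rates[pool] * units for pool, units in driver_usage.items())
    return direct_cost + indirect

# one hypothetical patient episode: 2 appointments, 1 imaging study, 1 case review
print(cost_of_patient(direct_cost=450.0,
                      driver_usage={"scheduling_and_registration": 2,
                                    "imaging_suite_overhead": 1,
                                    "multidisciplinary_review": 1}))
```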
Delinquent-Victim Youth-Adapting a Trauma-Informed Approach for the Juvenile Justice System.
Rapp, Lisa
2016-01-01
The connection between victimization and later delinquency is well established, and most youth involved with the juvenile justice system have at least one, if not multiple, victimizations in their history. Poly-victimized youth, or those presenting with complex trauma, require specialized assessment and services to prevent deleterious emotional, physical, and social life consequences. Empirical studies have provided information that can guide practitioners' work with these youth and families, yet many of the policies and practices of the juvenile justice system run counter to this model. Many youth-serving organizations are beginning to review their operations to better match a trauma-informed approach, and in this article the author highlights how a trauma-informed care model could be utilized to adapt the juvenile justice system.
Utility of Social Modeling for Proliferation Assessment - Preliminary Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coles, Garill A.; Gastelum, Zoe N.; Brothers, Alan J.
2009-06-01
This Preliminary Assessment draft report will present the results of a literature search and preliminary assessment of the body of research, analysis methods, models and data deemed to be relevant to the Utility of Social Modeling for Proliferation Assessment research. This report will provide: 1) a description of the problem space and the kinds of information pertinent to the problem space, 2) a discussion of key relevant or representative literature, 3) a discussion of models and modeling approaches judged to be potentially useful to the research, and 4) the next steps of this research that will be pursued based on this preliminary assessment. This draft report represents a technical deliverable for the NA-22 Simulations, Algorithms, and Modeling (SAM) program. Specifically this draft report is the Task 1 deliverable for project PL09-UtilSocial-PD06, Utility of Social Modeling for Proliferation Assessment. This project investigates non-traditional use of social and cultural information to improve nuclear proliferation assessment, including nonproliferation assessment, proliferation resistance assessments, safeguards assessments and other related studies. These assessments often use and create technical information about the State’s posture towards proliferation, the vulnerability of a nuclear energy system to an undesired event, and the effectiveness of safeguards. This project will find and fuse social and technical information by explicitly considering the role of cultural, social and behavioral factors relevant to proliferation. The aim of this research is to describe and demonstrate if and how social science modeling has utility in proliferation assessment.
NASA Technical Reports Server (NTRS)
Bradley, D. B.; Cain, J. B., III; Williard, M. W.
1978-01-01
The task was to evaluate the ability of a set of timing/synchronization subsystem features to provide a set of desirable characteristics for the evolving Defense Communications System digital communications network. The set of features related to the approaches by which timing/synchronization information could be disseminated throughout the network and the manner in which this information could be utilized to provide a synchronized network. These features, which could be utilized in a large number of different combinations, included mutual control, directed control, double ended reference links, independence of clock error measurement and correction, phase reference combining, and self organizing.
A Framework of Multi Objectives Negotiation for Dynamic Supply Chain Model
NASA Astrophysics Data System (ADS)
Chai, Jia Yee; Sakaguchi, Tatsuhiko; Shirase, Keiichi
Trends of globalization and advances in Information Technology (IT) have created opportunities for collaborative manufacturing across national borders. A dynamic supply chain utilizes these advances to enable more flexibility in business cooperation. This research proposes a concurrent decision-making framework for a three-echelon dynamic supply chain model. The dynamic supply chain is formed by autonomous negotiation among agents based on a multi-agent approach. Instead of generating negotiation aspects (such as amount, price and due date) arbitrarily, this framework proposes to utilize the information available at the operational level of an organization in order to generate realistic negotiation aspects. The effectiveness of the proposed model is demonstrated by various case studies.
Precision diagnostics: moving towards protein biomarker signatures of clinical utility in cancer.
Borrebaeck, Carl A K
2017-03-01
Interest in precision diagnostics has been fuelled by the concept that early detection of cancer would benefit patients; that is, if detected early, more tumours should be resectable and treatment more efficacious. Serum contains massive amounts of potentially diagnostic information, and affinity proteomics has risen as an accurate approach to decipher this, to generate actionable information that should result in more precise and evidence-based options to manage cancer. To achieve this, we need to move from single to multiplex biomarkers, a so-called signature, that can provide significantly increased diagnostic accuracy. This Opinion article focuses on the progress being made in identifying protein biomarker signatures of clinical utility, using blood-based proteomics.
Computational medicinal chemistry in fragment-based drug discovery: what, how and when.
Rabal, Obdulia; Urbano-Cuadrado, Manuel; Oyarzabal, Julen
2011-01-01
The use of fragment-based drug discovery (FBDD) has increased in the last decade due to the encouraging results obtained to date. In this scenario, computational approaches, together with experimental information, play an important role in guiding and speeding up the process. By default, FBDD is generally considered a constructive approach. However, such additive behavior is not always present; therefore, simple fragment maturation will not always deliver the expected results. In this review, computational approaches utilized in FBDD are reported together with real case studies, where applicability domains are exemplified, in order to analyze them and then maximize their performance and reliability. Thus, proper use of these computational tools can minimize misleading conclusions and preserve the credibility of the FBDD strategy, as well as achieve higher impact in the drug-discovery process. FBDD goes one step beyond a simple constructive approach. A broad set of computational tools (docking, R-group quantitative structure-activity relationship, fragmentation tools, fragment management tools, patent analysis and fragment-hopping, for example) can be utilized in FBDD, providing a clear positive impact if they are utilized in the proper scenario: what, how and when. An initial assessment of additive/non-additive behavior is a critical point in defining the most convenient approach for fragment elaboration.
2005-06-01
cognitive task analysis, organizational information dissemination and interaction, systems engineering, collaboration and communications processes, decision-making processes, and data collection and organization. By blending these diverse disciplines, command centers can be designed to support decision-making, cognitive analysis, information technology, and the human factors engineering aspects of Command and Control (C2). This model can then be used as a baseline when dealing with work in areas of business processes, workflow engineering, information management,
1994-04-01
engineering and construction management services for both military and civil works programs. In FY93, the cost of those programs exceeded $10 billion and...A related issue was to explore the USACE costs, benefits, and barriers to implementing a single Class VI system software package for both the military...provide information in useful ways, track utilization information. A Class III system is defined in AR 25-3. It is a system whose total program costs are
Using Derivative Contracts to Mitigate Water Utility Financial Risks
NASA Astrophysics Data System (ADS)
Characklis, G. W.; Zeff, H.
2012-12-01
As developing new supply capacity has become increasingly expensive and difficult to permit, utilities have become more reliant on temporary demand management programs, such as outdoor water use restrictions, for ensuring reliability during drought. However, a significant fraction of water utility income is often derived from the volumetric sale of water, and such restrictions can lead to substantial revenue losses. Given that many utilities set prices at levels commensurate with recovering costs, these revenue losses can leave them financially vulnerable to budgetary shortfalls during drought. This work explores approaches for mitigating drought-related revenue losses through the use of third-party financial insurance contracts based on weather derivatives. Two different types of contracts are developed, and their efficacy is compared against two more traditional forms of financial hedging used by water utilities: drought surcharges and contingency funds (i.e. self insurance). Strategies involving each of these approaches, as well as their use in combination, are applied under conditions facing the water utility serving Durham, North Carolina. A multi-reservoir model provides information on the scale and timing of droughts, with the financial effects of these events simulated using detailed data derived from utility billing records. Results suggest that third-party derivative contracts, either independently or in combination with more traditional hedging tools (i.e. surcharges, contingency funds), can provide an effective means of reducing a utility's financial vulnerability to drought.
Managing water utility financial risks through third-party index insurance contracts
NASA Astrophysics Data System (ADS)
Zeff, Harrison B.; Characklis, Gregory W.
2013-08-01
As developing new supply capacity has become increasingly expensive and difficult to permit (i.e., regulatory approval), utilities have become more reliant on temporary demand management programs, such as outdoor water use restrictions, for ensuring reliability during drought. However, a significant fraction of water utility income is often derived from the volumetric sale of water, and such restrictions can lead to substantial revenue losses. Given that many utilities set prices at levels commensurate with recovering costs, these revenue losses can leave them financially vulnerable to budgetary shortfalls. This work explores approaches for mitigating drought-related revenue losses through the use of third-party financial insurance contracts based on streamflow indices. Two different types of contracts are developed, and their efficacy is compared against two more traditional forms of financial hedging used by water utilities: Drought surcharges and contingency funds (i.e., self-insurance). Strategies involving each of these approaches, as well as their use in combination, are applied under conditions facing the water utility serving Durham, North Carolina. A multireservoir model provides information on the scale and timing of droughts, and the financial effects of these events are simulated using detailed data derived from utility billing records. Results suggest that third-party index insurance contracts, either independently or in combination with more traditional hedging tools, can provide an effective means of reducing a utility's financial vulnerability to drought.
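A hedged sketch of the general kind of streamflow-index contract described above (not the authors' specific contract design): the utility receives a payout proportional to how far an observed index falls below a contracted strike, which offsets drought-related revenue losses.

```python
def index_insurance_payout(index_value, strike, payout_per_unit, max_payout=None):
    """Payout = payout_per_unit * max(0, strike - index_value), optionally capped."""
    payout = payout_per_unit * max(0.0, strike - index_value)
    return min(payout, max_payout) if max_payout is not None else payout

# illustrative numbers only: annual inflow index in million gallons, with the
# strike set near the level at which use restrictions (and revenue losses) begin
observed_inflow = [41_000, 28_500, 35_200, 22_000]           # four simulated years
for inflow in observed_inflow:
    print(inflow, index_insurance_payout(inflow, strike=30_000,
                                         payout_per_unit=250.0,
                                         max_payout=3_000_000))
```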
Teaching Genocide through GIS: A Transformative Approach
ERIC Educational Resources Information Center
Fitchett, Paul G.; Good, Amy J.
2012-01-01
The utilization of Geographical Information Systems (GIS) and geobrowsers (Google Earth) have become increasingly prevalent in the study of genocide. These applications offer teachers and students the opportunity to analyze historical and contemporary genocidal acts from a critical geographic perspective in which the confluence of historical…
NASA Astrophysics Data System (ADS)
Li, Haifeng; Zhu, Qing; Yang, Xiaoxia; Xu, Linrong
2012-10-01
Concurrent tasks, such as those found in rapid disaster response, are typical of remote sensing applications. The existing composition approach for geographical information processing service chains searches for an optimisation solution in what can be deemed a "selfish" way. This leads to conflicts amongst concurrent tasks and decreases the performance of all service chains. In this study, a non-cooperative game-based mathematical model to analyse the competitive relationships between tasks is proposed. A best response function is used to assure that each task maintains utility optimisation by considering the composition strategies of other tasks and quantifying the conflicts between tasks. Based on this, an iterative algorithm that converges to a Nash equilibrium is presented, the aim being to provide good convergence and maximise the utility of all tasks under concurrent task conditions. Theoretical analyses and experiments showed that the newly proposed method, when compared to existing service composition methods, has better practical utility for all tasks.
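The sketch below shows a toy best-response iteration in the spirit of the approach described: each concurrent task repeatedly switches to the candidate service instance that maximizes its own utility given the other tasks' current choices, stopping when no task wants to deviate (a Nash equilibrium). The congestion-penalty utility and the service names are illustrative assumptions, not the paper's model.

```python
def utility(task, instance, choices, base_quality, penalty=1.0):
    # utility drops with the number of other tasks sharing the same instance
    load = sum(1 for t, inst in choices.items() if t != task and inst == instance)
    return base_quality[instance] - penalty * load

def best_response_iteration(tasks, instances, base_quality, max_rounds=100):
    choices = {t: instances[0] for t in tasks}           # arbitrary initial strategies
    for _ in range(max_rounds):
        changed = False
        for t in tasks:
            best = max(instances, key=lambda s: utility(t, s, choices, base_quality))
            if utility(t, best, choices, base_quality) > utility(t, choices[t], choices, base_quality):
                choices[t], changed = best, True
        if not changed:                                  # no task wants to deviate:
            return choices                               # Nash equilibrium reached
    return choices

tasks = ["flood_mapping", "damage_assessment", "route_planning"]
instances = ["wps_node_a", "wps_node_b"]                 # hypothetical processing services
print(best_response_iteration(tasks, instances,
                              base_quality={"wps_node_a": 5.0, "wps_node_b": 4.5}))
```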
A Probabilistic Approach to Network Event Formation from Pre-Processed Waveform Data
NASA Astrophysics Data System (ADS)
Kohl, B. C.; Given, J.
2017-12-01
The current state of the art for seismic event detection still largely depends on signal detection at individual sensor stations, including picking accurate arrival times and correctly identifying phases, and relies on fusion algorithms to associate individual signal detections to form event hypotheses. But increasing computational capability has enabled progress toward the objective of fully utilizing body-wave recordings in an integrated manner to detect events without the necessity of previously recorded ground truth events. In 2011-2012 Leidos (then SAIC) operated a seismic network to monitor activity associated with geothermal field operations in western Nevada. We developed a new association approach for detecting and quantifying events by probabilistically combining pre-processed waveform data to deal with noisy data and clutter at local distance ranges. The ProbDet algorithm maps continuous waveform data into continuous conditional probability traces using a source model (e.g. Brune earthquake or Mueller-Murphy explosion) to map frequency content and an attenuation model to map amplitudes. Event detection and classification is accomplished by combining the conditional probabilities from the entire network using a Bayesian formulation. This approach was successful in producing a high-Pd, low-Pfa automated bulletin for a local network, and preliminary tests with regional and teleseismic data show that it has promise for global seismic and nuclear monitoring applications. The approach highlights several features that we believe are essential to achieving low-threshold automated event detection: it minimizes the utilization of individual seismic phase detections (in traditional techniques, errors in signal detection, timing, feature measurement and initial phase ID compound and propagate into errors in event formation); it has a formalized framework that utilizes information from non-detecting stations; it has a formalized framework that utilizes source information, in particular the spectral characteristics of events of interest; it is entirely model-based, i.e. it does not rely on a prioris (particularly important for nuclear monitoring); and it does not rely on individualized signal detection thresholds (it is the network solution that matters).
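The following sketch shows only a schematic version of the network combination step (it is not the ProbDet implementation): per-station conditional probability traces are fused under a naive conditional-independence assumption by summing log-likelihood ratios against a common prior, so quiet stations pull the network probability down.

```python
import numpy as np

def combine_station_probabilities(station_probs, prior=1e-4, eps=1e-9):
    """station_probs: (n_stations, n_times) traces of P(event | station data).
    Returns the network-level P(event) per time sample."""
    p = np.clip(np.asarray(station_probs), eps, 1 - eps)
    prior_logit = np.log(prior / (1 - prior))
    # per-station log-likelihood ratios relative to the common prior
    station_logits = np.log(p / (1 - p)) - prior_logit
    network_logit = prior_logit + station_logits.sum(axis=0)
    return 1.0 / (1.0 + np.exp(-network_logit))

# three stations, five time samples: only around sample 2 do all stations agree
traces = [
    [1e-4, 2e-3, 0.30, 5e-3, 1e-4],
    [1e-4, 1e-3, 0.25, 1e-3, 1e-4],
    [1e-4, 5e-4, 0.40, 2e-3, 1e-4],
]
print(np.round(combine_station_probabilities(traces), 4))
```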
NASA Astrophysics Data System (ADS)
Westermayer, C.; Schirrer, A.; Hemedi, M.; Kozek, M.
2013-12-01
An ℋ∞ full-information feedforward design approach for longitudinal-motion prefilter design of a large flexible blended wing body (BWB) aircraft is presented. An existing approach is extended such that specifications concerning command tracking, limited control energy, and manoeuvre load reduction can be addressed simultaneously. The utilized design architecture is described and manual tuning aspects are considered. In order to increase controller tuning efficiency, an automated tuning process based on several optimization criteria is proposed. Moreover, two design methodologies for the parameter-varying design case are investigated. The obtained controller is validated on a high-order nonlinear model, indicating the high potential of the presented approach for flexible aircraft control.
Fuzzy Naive Bayesian model for medical diagnostic decision support.
Wagholikar, Kavishwar B; Vijayraghavan, Sundararajan; Deshpande, Ashok W
2009-01-01
This work relates to the development of computational algorithms to provide decision support to physicians. The authors propose a Fuzzy Naive Bayesian (FNB) model for medical diagnosis, which extends the Fuzzy Bayesian approach proposed by Okuda. A physician-interview-based method is described to define an orthogonal fuzzy symptom information system, required to apply the model. For the purpose of elaboration and elicitation of characteristics, the algorithm is applied to a simple simulated dataset and compared with the conventional Naive Bayes (NB) approach. As a preliminary evaluation of FNB in a real-world scenario, the comparison is repeated on a real fuzzy dataset of 81 patients diagnosed with infectious diseases. The case study on the simulated dataset elucidates that FNB can be optimal over NB for diagnosing patients with imprecise, fuzzy information, on account of the following characteristics: 1) it can model the information that values of some attributes are semantically closer than values of other attributes, and 2) it offers a mechanism to temper exaggerations in patient information. Although the algorithm requires precise training data, its utility for fuzzy training data is argued for. This is supported by the case study on the infectious disease dataset, which indicates optimality of FNB over NB for the infectious disease domain. Further case studies on large datasets are required to establish the utility of FNB.
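Below is an illustrative Fuzzy Naive Bayes scoring function in the general spirit of the model described; weighting crisp conditional probabilities by fuzzy membership degrees is one common formulation, and the diseases, symptoms, and all probabilities here are invented for illustration rather than taken from the paper.

```python
import math

# crisp conditionals P(symptom value | disease), e.g. learned from training data (assumed)
likelihoods = {
    "flu":         {"fever": {"none": 0.1, "mild": 0.3, "high": 0.6},
                    "cough": {"absent": 0.2, "present": 0.8}},
    "common_cold": {"fever": {"none": 0.5, "mild": 0.4, "high": 0.1},
                    "cough": {"absent": 0.3, "present": 0.7}},
}
priors = {"flu": 0.3, "common_cold": 0.7}

def fuzzy_nb_posteriors(fuzzy_symptoms):
    """fuzzy_symptoms: {symptom: {value: membership degree}} with degrees in [0, 1]."""
    scores = {}
    for disease, prior in priors.items():
        log_score = math.log(prior)
        for symptom, memberships in fuzzy_symptoms.items():
            total = sum(memberships.values()) or 1.0
            # membership-weighted mix of the crisp conditional probabilities
            mix = sum(mu / total * likelihoods[disease][symptom][value]
                      for value, mu in memberships.items())
            log_score += math.log(mix)
        scores[disease] = log_score
    z = sum(math.exp(s) for s in scores.values())
    return {d: math.exp(s) / z for d, s in scores.items()}

# patient: fever "somewhere between mild and high", cough clearly present
print(fuzzy_nb_posteriors({"fever": {"mild": 0.4, "high": 0.7},
                           "cough": {"present": 1.0}}))
```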
Pediatric acute gastroenteritis: understanding caregivers' experiences and information needs.
Albrecht, Lauren; Hartling, Lisa; Scott, Shannon D
2017-05-01
Pediatric acute gastroenteritis (AGE) is a common condition with high health care utilization, persistent practice variation, and substantial family burden. An initial approach to resolve these issues is to understand the patient/caregiver experience of this illness. The objective of this study was to describe caregivers' experiences of pediatric AGE and identify their information needs, preferences, and priorities. A qualitative, descriptive study was conducted. Caregivers of a child with AGE were recruited for this study in the pediatric emergency department (ED) at a tertiary hospital in a major urban centre. Individual interviews were conducted (n=15), and a thematic analysis of interview transcripts was completed using a hybrid inductive/deductive approach. Five major themes were identified and described: 1) caregiver management strategies; 2) reasons for going to the ED; 3) treatment and management of AGE in the ED; 4) caregivers' information needs; and 5) additional factors influencing caregivers' experiences and decision-making. A number of subthemes within each major theme were identified and described. This qualitative descriptive study has identified caregiver information needs, preferences, and priorities regarding pediatric AGE. This study also identified inconsistencies in the treatment and management of pediatric AGE at home and in the ED that influence health care utilization and patient outcomes related to pediatric AGE.
Multisource geological data mining and its utilization of uranium resources exploration
NASA Astrophysics Data System (ADS)
Zhang, Jie-lin
2009-10-01
Nuclear energy, as one of the clean energy sources, plays an important role in China's economic development, and according to the national long-term development strategy many more nuclear power plants will be built in the next few years, which makes uranium resources exploration a great challenge. Research and practice on mineral exploration demonstrate that utilizing modern Earth Observing System (EOS) technology and developing new multi-source geological data mining methods are effective approaches to uranium deposit prospecting. Based on data mining and knowledge discovery technology, this paper uses multi-source geological data to characterize the electromagnetic spectral, geophysical, and spatial information of uranium mineralization factors, and provides technical support for uranium prospecting integrated with field remote sensing geological surveys. The multi-source geological data used in this paper include satellite hyperspectral imagery (Hyperion), high-spatial-resolution remote sensing data, uranium geological information, airborne radiometric data, and aeromagnetic and gravity data; related data mining methods have been developed, such as fusion of optical data and Radarsat imagery and integration of remote sensing and geophysical data. Based on the above approaches, the multi-geoscience information of uranium mineralization factors, including complex polystage rock masses, mineralization-controlling faults, and hydrothermal alterations, has been identified, the metallogenic potential of uranium has been evaluated, and several prospective areas have been located.
Estimating Wood Volume for Pinus Brutia Trees in Forest Stands from QUICKBIRD-2 Imagery
NASA Astrophysics Data System (ADS)
Patias, Petros; Stournara, Panagiota
2016-06-01
Knowledge of forest parameters, such as wood volume, is required for sustainable forest management. Collecting such information in the field is laborious and not even feasible in inaccessible areas. In this study, tree wood volume is estimated utilizing remote sensing techniques, which can facilitate the extraction of relevant information. The study area is the University Forest of Taxiarchis, which is located in central Chalkidiki, Northern Greece, and covers an area of 58 km2. The tree species under study is the evergreen conifer P. brutia (Calabrian pine). Three field plots of 10 m radius were used. VHR Quickbird-2 images are used in combination with an allometric relationship connecting tree crown size with diameter at breast height (Dbh), and a volume table developed for Greece. The overall methodology is based on individual tree crown delineation, using (a) the marker-controlled watershed segmentation approach and (b) the GEographic Object-Based Image Analysis approach. The aim of the first approach is to extract separate segments, each of them including a single tree and eventual lower vegetation, shadows, etc. The aim of the second approach is to detect and remove the "noisy" background. In the application of the first approach, the Blue, Green, Red, Infrared, and PCA-1 bands are tested separately. In the application of the second approach, NDVI and image brightness thresholds are utilized. The achieved results are evaluated against field plot data. The observed differences are between -5% and +10%.
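A minimal marker-controlled watershed sketch for crown delineation is given below using scikit-image. It is illustrative only: the band names, NDVI threshold, smoothing, and minimum crown distance are assumptions, not the study's calibrated values, and the study additionally tests individual bands and a GEOBIA workflow.

```python
# Sketch of marker-controlled watershed crown delineation (assumed parameters).
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def delineate_crowns(nir, red, ndvi_threshold=0.3, min_crown_distance=5):
    ndvi = (nir - red) / (nir + red + 1e-9)
    vegetation = ndvi > ndvi_threshold                 # mask out "noisy" background
    smoothed = ndi.gaussian_filter(nir.astype(float), sigma=1)
    # Treetops appear as local brightness maxima and serve as watershed markers.
    peaks = peak_local_max(smoothed, min_distance=min_crown_distance,
                           labels=vegetation)
    markers = np.zeros(nir.shape, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
    # Flood the inverted brightness surface from the treetop markers.
    return watershed(-smoothed, markers, mask=vegetation)

# crowns = delineate_crowns(nir_band, red_band); each label is one candidate crown
# whose diameter can then feed the allometric crown-to-Dbh relationship.
```

Each labeled segment yields a crown diameter, which the allometric relationship converts to Dbh and then, via the volume table, to wood volume per tree.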
Computational modeling of RNA 3D structures, with the aid of experimental restraints
Magnus, Marcin; Matelska, Dorota; Łach, Grzegorz; Chojnowski, Grzegorz; Boniecki, Michal J; Purta, Elzbieta; Dawson, Wayne; Dunin-Horkawicz, Stanislaw; Bujnicki, Janusz M
2014-01-01
In addition to mRNAs whose primary function is transmission of genetic information from DNA to proteins, numerous other classes of RNA molecules exist, which are involved in a variety of functions, such as catalyzing biochemical reactions or performing regulatory roles. In analogy to proteins, the function of RNAs depends on their structure and dynamics, which are largely determined by the ribonucleotide sequence. Experimental determination of high-resolution RNA structures is both laborious and difficult, and therefore, the majority of known RNAs remain structurally uncharacterized. To address this problem, computational structure prediction methods were developed that simulate either the physical process of RNA structure formation (“Greek science” approach) or utilize information derived from known structures of other RNA molecules (“Babylonian science” approach). All computational methods suffer from various limitations that make them generally unreliable for structure prediction of long RNA sequences. However, in many cases, the limitations of computational and experimental methods can be overcome by combining these two complementary approaches with each other. In this work, we review computational approaches for RNA structure prediction, with emphasis on implementations (particular programs) that can utilize restraints derived from experimental analyses. We also list experimental approaches, whose results can be relatively easily used by computational methods. Finally, we describe case studies where computational and experimental analyses were successfully combined to determine RNA structures that would remain out of reach for each of these approaches applied separately. PMID:24785264
Evaluating the Potential of Commercial GIS for Accelerator Configuration Management
DOE Office of Scientific and Technical Information (OSTI.GOV)
T.L. Larrieu; Y.R. Roblin; K. White
2005-10-10
The Geographic Information System (GIS) is a tool used by industries needing to track information about spatially distributed assets. A water utility, for example, must know not only the precise location of each pipe and pump, but also the respective pressure rating and flow rate of each. In many ways, an accelerator such as CEBAF (Continuous Electron Beam Accelerator Facility) can be viewed as an "electron utility". Whereas the water utility uses pipes and pumps, the "electron utility" uses magnets and RF cavities. At Jefferson Lab we are exploring the possibility of implementing ESRI's ArcGIS as the framework for building an all-encompassing accelerator configuration database that integrates location, configuration, maintenance, and connectivity details of all hardware and software. The possibilities of doing so are intriguing. From the GIS, software such as the model server could always extract the most up-to-date layout information maintained by Survey & Alignment for lattice modeling. The Mechanical Engineering department could use ArcGIS tools to generate CAD drawings of machine segments from the same database. Ultimately, the greatest benefit of the GIS implementation could be to liberate operators and engineers from the limitations of the current system-by-system view of machine configuration and allow a more integrated regional approach. The commercial GIS package provides a rich set of tools for database connectivity, versioning, distributed editing, importing and exporting, and graphical analysis and querying, and therefore obviates the need for much custom development. However, formidable challenges to implementation exist, and these challenges are not only technical and manpower issues but also organizational ones. The GIS approach would crosscut organizational boundaries and require departments, which heretofore have had free rein to manage their own data, to cede some control and agree to a centralized framework.
MAGDM linear-programming models with distinct uncertain preference structures.
Xu, Zeshui S; Chen, Jian
2008-10-01
Group decision making with preference information on alternatives is an interesting and important research topic which has been receiving more and more attention in recent years. The purpose of this paper is to investigate multiple-attribute group decision-making (MAGDM) problems with distinct uncertain preference structures. We develop some linear-programming models for dealing with the MAGDM problems, where the information about attribute weights is incomplete, and the decision makers have their preferences on alternatives. The provided preference information can be represented in the following three distinct uncertain preference structures: 1) interval utility values; 2) interval fuzzy preference relations; and 3) interval multiplicative preference relations. We first establish some linear-programming models based on decision matrix and each of the distinct uncertain preference structures and, then, develop some linear-programming models to integrate all three structures of subjective uncertain preference information provided by the decision makers and the objective information depicted in the decision matrix. Furthermore, we propose a simple and straightforward approach in ranking and selecting the given alternatives. It is worth pointing out that the developed models can also be used to deal with the situations where the three distinct uncertain preference structures are reduced to the traditional ones, i.e., utility values, fuzzy preference relations, and multiplicative preference relations. Finally, we use a practical example to illustrate in detail the calculation process of the developed approach.
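To make the linear-programming flavor of such models concrete, here is a toy LP in the same spirit, not the paper's formulations: attribute weights are chosen to best reconcile a normalized decision matrix with interval utility values stated by the decision makers, under incomplete (interval) weight information. The decision matrix, intervals, and weight bounds are all hypothetical, and L1 deviations are linearized with slack variables so scipy.optimize.linprog applies directly.

```python
# Toy weight-derivation LP under incomplete weight information (illustrative only).
import numpy as np
from scipy.optimize import linprog

R = np.array([[0.7, 0.5, 0.9],            # 3 alternatives x 3 attributes, normalized
              [0.6, 0.8, 0.4],
              [0.9, 0.3, 0.6]])
utility_intervals = np.array([[0.6, 0.8], [0.5, 0.7], [0.5, 0.9]])  # stated by DMs
weight_bounds = [(0.2, 0.5), (0.1, 0.4), (0.2, 0.6)]                # incomplete info

m, n = R.shape
mid = utility_intervals.mean(axis=1)
# Decision vector x = [w_1..w_n, e1+..em+, e1-..em-]; minimize total deviation.
c = np.concatenate([np.zeros(n), np.ones(2 * m)])
A_eq = np.zeros((m + 1, n + 2 * m))
A_eq[:m, :n] = R
A_eq[:m, n:n + m] = -np.eye(m)
A_eq[:m, n + m:] = np.eye(m)
A_eq[m, :n] = 1.0                                   # weights sum to one
b_eq = np.concatenate([mid, [1.0]])
bounds = weight_bounds + [(0, None)] * (2 * m)
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
weights = res.x[:n]
print("weights:", weights, "scores:", R @ weights)   # rank alternatives by score
```

The same linearization idea extends to fuzzy or multiplicative preference relations by changing what the slack variables measure deviation from.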
Social media as an information source for rapid flood inundation mapping
NASA Astrophysics Data System (ADS)
Fohringer, J.; Dransch, D.; Kreibich, H.; Schröter, K.
2015-12-01
During and shortly after a disaster, data about the hazard and its consequences are scarce and not readily available. Information provided by eyewitnesses via social media is a valuable information source, which should be explored in a more effective way. This research proposes a methodology that leverages social media content to support rapid inundation mapping, including inundation extent and water depth in the case of floods. The novelty of this approach is the utilization of quantitative data that are derived from photos from eyewitnesses extracted from social media posts and their integration with established data. Due to the rapid availability of these posts compared to traditional data sources such as remote sensing data, areas affected by a flood, for example, can be determined quickly. The challenge is to filter the large number of posts to a manageable amount of potentially useful inundation-related information, as well as to interpret and integrate the posts into mapping procedures in a timely manner. To support rapid inundation mapping we propose a methodology and develop "PostDistiller", a tool to filter geolocated posts from social media services which include links to photos. This spatially distributed, contextualized in situ information is further explored manually. In an application case study during the June 2013 flood in central Europe we evaluate the utilization of this approach to infer spatial flood patterns and inundation depths in the city of Dresden.
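A minimal filtering step in the spirit of PostDistiller might look as follows. This is a sketch, not the tool's actual code: the field names, keyword list, and bounding-box logic are assumptions used to illustrate reducing a large post stream to candidate eyewitness photos.

```python
# Sketch of geolocated, photo-bearing post filtering (assumed schema and keywords).
from datetime import datetime

FLOOD_TERMS = ("flood", "hochwasser", "inundation")   # assumed keyword list

def filter_posts(posts, bbox, t_start, t_end):
    """posts: iterable of dicts with 'lat', 'lon', 'time', 'text', 'photo_url'."""
    lat_min, lat_max, lon_min, lon_max = bbox
    selected = []
    for p in posts:
        if p.get("lat") is None or p.get("photo_url") is None:
            continue                                   # need a location and a photo
        if not (lat_min <= p["lat"] <= lat_max and lon_min <= p["lon"] <= lon_max):
            continue
        if not (t_start <= p["time"] <= t_end):
            continue
        if any(term in p["text"].lower() for term in FLOOD_TERMS):
            selected.append(p)
    return selected

posts = [{"lat": 51.05, "lon": 13.74, "time": datetime(2013, 6, 4, 12),
          "text": "Hochwasser at the Elbe", "photo_url": "https://example.org/1.jpg"}]
print(filter_posts(posts, (50.9, 51.2, 13.5, 13.9),
                   datetime(2013, 6, 1), datetime(2013, 6, 15)))
```

The surviving posts are then inspected manually, as described above, to read water depths off the photos relative to known reference objects.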
Social media as an information source for rapid flood inundation mapping
NASA Astrophysics Data System (ADS)
Fohringer, J.; Dransch, D.; Kreibich, H.; Schröter, K.
2015-07-01
During and shortly after a disaster, data about the hazard and its consequences are scarce and not readily available. Information provided by eye-witnesses via social media is a valuable information source, which should be explored in a more effective way. This research proposes a methodology that leverages social media content to support rapid inundation mapping, including inundation extent and water depth in case of floods. The novelty of this approach is the utilization of quantitative data that are derived from photos from eye-witnesses extracted from social media posts and their integration with established data. Due to the rapid availability of these posts compared to traditional data sources such as remote sensing data, areas affected by a flood, for example, can be determined quickly. The challenge is to filter the large number of posts to a manageable amount of potentially useful inundation-related information as well as their timely interpretation and integration in mapping procedures. To support rapid inundation mapping we propose a methodology and develop a tool to filter geo-located posts from social media services which include links to photos. This spatially distributed, contextualized in-situ information is further explored manually. In an application case study during the June 2013 flood in central Europe we evaluate the utilization of this approach to infer spatial flood patterns and inundation depths in the city of Dresden.
Beyer, Hans-Georg
2014-01-01
The convergence behaviors of so-called natural evolution strategies (NES) and of the information-geometric optimization (IGO) approach are considered. After a review of the NES/IGO ideas, which are based on information geometry, the implications of this philosophy w.r.t. optimization dynamics are investigated considering the optimization performance on the class of positive quadratic objective functions (the ellipsoid model). Exact differential equations describing the approach to the optimizer are derived and solved. It is rigorously shown that the original NES philosophy of optimizing the expected value of the objective functions leads to very slow (i.e., sublinear) convergence toward the optimizer. This is the real reason why state-of-the-art implementations of IGO algorithms optimize the expected value of transformed objective functions, for example, by utility functions based on ranking. It is shown that these utility functions are localized fitness functions that change during the IGO flow. The governing differential equations describing this flow are derived. In the case of convergence, the solutions to these equations exhibit an exponentially fast approach to the optimizer (i.e., linear convergence order). Furthermore, it is proven that the IGO philosophy leads to an adaptation of the covariance matrix that equals, in the asymptotic limit and up to a scalar factor, the inverse of the Hessian of the objective function considered.
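For readers less familiar with NES/IGO, the standard update being analyzed can be summarized schematically as below. This uses textbook NES notation and is not reproduced from the paper; the utility transform W is exactly the ranking-based transformation mentioned in the abstract.

```latex
% Schematic NES/IGO update (standard notation; not taken from the paper).
\begin{align*}
  J(\theta) &= \mathbb{E}_{x \sim p_\theta}\!\left[ W\!\big(f(x)\big) \right],
  \qquad \theta = (m, C) \ \text{for a Gaussian } \mathcal{N}(m, C),\\
  \nabla_\theta J &= \mathbb{E}_{x \sim p_\theta}\!\left[ W\!\big(f(x)\big)\,
      \nabla_\theta \log p_\theta(x) \right]
  \quad \text{(log-likelihood trick)},\\
  \theta_{t+1} &= \theta_t + \eta\, F^{-1}(\theta_t)\, \nabla_\theta J(\theta_t),
  \qquad F = \mathbb{E}\!\left[ \nabla_\theta \log p_\theta\,
      \nabla_\theta \log p_\theta^{\top} \right].
\end{align*}
```

With W equal to the identity (the plain expected objective value), the flow approaches the optimizer only sublinearly on the ellipsoid model; rank-based utilities W are what restore the linear (exponentially fast) convergence order discussed above.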
Density estimation in tiger populations: combining information for strong inference
Gopalaswamy, Arjun M.; Royle, J. Andrew; Delampady, Mohan; Nichols, James D.; Karanth, K. Ullas; Macdonald, David W.
2012-01-01
A productive way forward in studies of animal populations is to efficiently make use of all the information available, either as raw data or as published sources, on critical parameters of interest. In this study, we demonstrate two approaches to the use of multiple sources of information on a parameter of fundamental interest to ecologists: animal density. The first approach produces estimates simultaneously from two different sources of data. The second approach was developed for situations in which initial data collection and analysis are followed up by subsequent data collection and prior knowledge is updated with new data using a stepwise process. Both approaches are used to estimate density of a rare and elusive predator, the tiger, by combining photographic and fecal DNA spatial capture–recapture data. The model, which combined information, provided the most precise estimate of density (8.5 ± 1.95 tigers/100 km2 [posterior mean ± SD]) relative to a model that utilized only one data source (photographic, 12.02 ± 3.02 tigers/100 km2 and fecal DNA, 6.65 ± 2.37 tigers/100 km2). Our study demonstrates that, by accounting for multiple sources of available information, estimates of animal density can be significantly improved.
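As a back-of-the-envelope illustration of why combining independent information tightens an estimate, consider simple inverse-variance pooling of the two single-source summaries quoted above. The paper fits a joint spatial capture-recapture model, which this pooling is not; the sketch only conveys the underlying intuition.

```python
# Precision-weighted pooling of two independent estimates (intuition only;
# NOT the paper's joint spatial capture-recapture model).
import numpy as np

def pool(estimates, sds):
    w = 1.0 / np.square(sds)                  # inverse-variance weights
    mean = np.sum(w * estimates) / np.sum(w)
    sd = np.sqrt(1.0 / np.sum(w))
    return mean, sd

# Photographic-only and fecal-DNA-only summaries quoted in the abstract.
mean, sd = pool(np.array([12.02, 6.65]), np.array([3.02, 2.37]))
print(f"pooled: {mean:.2f} +/- {sd:.2f} tigers/100 km^2")   # tighter than either alone
```

The pooled standard deviation is smaller than either single-source value, mirroring (though not reproducing) the precision gain reported for the combined model.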
Density estimation in tiger populations: combining information for strong inference.
Gopalaswamy, Arjun M; Royle, J Andrew; Delampady, Mohan; Nichols, James D; Karanth, K Ullas; Macdonald, David W
2012-07-01
A productive way forward in studies of animal populations is to efficiently make use of all the information available, either as raw data or as published sources, on critical parameters of interest. In this study, we demonstrate two approaches to the use of multiple sources of information on a parameter of fundamental interest to ecologists: animal density. The first approach produces estimates simultaneously from two different sources of data. The second approach was developed for situations in which initial data collection and analysis are followed up by subsequent data collection and prior knowledge is updated with new data using a stepwise process. Both approaches are used to estimate density of a rare and elusive predator, the tiger, by combining photographic and fecal DNA spatial capture-recapture data. The model, which combined information, provided the most precise estimate of density (8.5 +/- 1.95 tigers/100 km2 [posterior mean +/- SD]) relative to a model that utilized only one data source (photographic, 12.02 +/- 3.02 tigers/100 km2 and fecal DNA, 6.65 +/- 2.37 tigers/100 km2). Our study demonstrates that, by accounting for multiple sources of available information, estimates of animal density can be significantly improved.
Bayesian decoding using unsorted spikes in the rat hippocampus
Layton, Stuart P.; Chen, Zhe; Wilson, Matthew A.
2013-01-01
A fundamental task in neuroscience is to understand how neural ensembles represent information. Population decoding is a useful tool to extract information from neuronal populations based on the ensemble spiking activity. We propose a novel Bayesian decoding paradigm to decode unsorted spikes in the rat hippocampus. Our approach uses a direct mapping between spike waveform features and covariates of interest and avoids accumulation of spike sorting errors. Our decoding paradigm is nonparametric, encoding model-free for representing stimuli, and extracts information from all available spikes and their waveform features. We apply the proposed Bayesian decoding algorithm to a position reconstruction task for freely behaving rats based on tetrode recordings of rat hippocampal neuronal activity. Our detailed decoding analyses demonstrate that our approach is efficient and better utilizes the available information in the nonsortable hash than the standard sorting-based decoding algorithm. Our approach can be adapted to an online encoding/decoding framework for applications that require real-time decoding, such as brain-machine interfaces. PMID:24089403
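A stripped-down version of decoding position directly from unsorted spike waveform features is sketched below. It is illustrative only: the paper's paradigm uses a full marked point-process formulation including spike-rate terms, which this toy kernel-density version omits, and the bin edges and thresholds are assumptions.

```python
# Toy clusterless decoding sketch: per-position-bin feature densities plus a prior.
import numpy as np
from scipy.stats import gaussian_kde

def build_decoder(train_features, train_positions, position_bins):
    """Fit one feature-density estimate per position bin from training spikes.
    train_features: (n_spikes, n_dims); train_positions: (n_spikes,)."""
    kdes, priors = {}, {}
    for b in range(len(position_bins) - 1):
        in_bin = (train_positions >= position_bins[b]) & (train_positions < position_bins[b + 1])
        if in_bin.sum() > 5:                                  # need enough spikes for a KDE
            kdes[b] = gaussian_kde(train_features[in_bin].T)  # expects (n_dims, n_spikes)
            priors[b] = in_bin.mean()                         # crude spike-frequency prior
    return kdes, priors

def decode(kdes, priors, spike_features):
    """Posterior over position bins given features of the spikes in one time window."""
    log_post = {b: np.log(priors[b]) + np.sum(np.log(kdes[b](spike_features.T) + 1e-12))
                for b in kdes}
    bins = np.array(list(log_post))
    lp = np.array([log_post[b] for b in bins])
    p = np.exp(lp - lp.max()); p /= p.sum()
    return bins, p
```

Because no spike sorting step intervenes, every threshold-crossing event, including the low-amplitude "hash", contributes its waveform features directly to the likelihood.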
Grafting: A Technique to Modify Ion Accumulation in Horticultural Crops
Nawaz, Muhammad A.; Imtiaz, Muhammad; Kong, Qiusheng; Cheng, Fei; Ahmed, Waqar; Huang, Yuan; Bie, Zhilong
2016-01-01
Grafting is a centuries-old technique used in plants to obtain economic benefits. Grafting increases nutrient uptake and utilization efficiency in a number of plant species, including fruits, vegetables, and ornamentals. Selected rootstocks of the same species or close relatives are utilized in grafting. Rootstocks absorb more water and ions than self-rooted plants and transport this water and these ions to the aboveground scion. Ion uptake is regulated by a complex communication mechanism between the scion and rootstock. Sugars, hormones, and miRNAs function as long-distance signaling molecules and regulate ion uptake and ion homeostasis by affecting the activity of ion transporters. This review summarizes available information on the effect of rootstock on nutrient uptake and utilization and the mechanisms involved. Information on specific nutrient-efficient rootstocks for different crops of commercial importance is also provided. Several other important approaches, such as interstocking (during double grafting), inarching, use of plant-growth-promoting rhizobacteria, use of arbuscular mycorrhizal fungi, use of plant growth substances (e.g., auxin and melatonin), and use of genetically engineered rootstocks and scions (transgrafting), are highlighted; these approaches can be combined with grafting to enhance nutrient uptake and utilization in commercially important plant species. Whether the rootstock and scion affect each other's soil microbiota and their effect on the nutrient absorption of rootstocks remain largely unknown. Similarly, the physiological and molecular bases of grafting, crease formation, and incompatibility are not fully identified and require investigation. Grafting in horticultural crops can help reveal the basic biology of grafting, the reasons for incompatibility, sensing, and signaling of nutrients, ion uptake and transport, and the mechanism of heavy metal accumulation and restriction in rootstocks. Ion transporter and miRNA-regulated nutrient studies have focused on model and non-grafted plants, and information on grafted plants is limited. Such information will improve the development of nutrient-efficient rootstocks. PMID:27818663
Psychodrama: A Creative Approach for Addressing Parallel Process in Group Supervision
ERIC Educational Resources Information Center
Hinkle, Michelle Gimenez
2008-01-01
This article provides a model for using psychodrama to address issues of parallel process during group supervision. Information on how to utilize the specific concepts and techniques of psychodrama in relation to group supervision is discussed. A case vignette of the model is provided.
A Bayesian Approach to Interactive Retrieval
ERIC Educational Resources Information Center
Tague, Jean M.
1973-01-01
A probabilistic model for interactive retrieval is presented. Bayesian statistical decision theory principles, namely the use of prior and sample information about the relationship of document descriptions to query relevance and the maximization of the expected value of a utility function, are applied to the problem of optimally restructuring search strategies in an…
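The decision-theoretic setup described in this record can be summarized schematically as below, using standard Bayesian retrieval notation; the specific prior and utility function are the paper's own and are not reproduced here.

```latex
% Schematic of Bayesian expected-utility retrieval (standard notation, not the paper's).
\begin{align*}
  p(R \mid d, q) &\propto p(d \mid R, q)\, p(R \mid q)
    && \text{posterior relevance of document } d \text{ to query } q, \\
  a^{*} &= \arg\max_{a \in \mathcal{A}}\;
    \sum_{R \in \{0,1\}} U(a, R)\, p(R \mid d, q)
    && \text{choose the search action with highest expected utility.}
\end{align*}
```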
Challenge: Teacher's Utilization Manual.
ERIC Educational Resources Information Center
Lincoln Public Schools, NE.
Published as a guide to educational television viewing for the gifted, the stated objective is to extend the learning environment, validate and individualize learning, provide resources, and use a nonverbal approach. For each area discussed the text provides information on the target audience, the need and purpose, methods of achieving the…
Exploiting Untapped Information Resources in Earth Science
NASA Astrophysics Data System (ADS)
Ramachandran, R.; Fox, P. A.; Kempler, S.; Maskey, M.
2015-12-01
One of the continuing challenges in any Earth science investigation is the amount of time and effort required for data preparation before analysis can begin. Current Earth science data and information systems have their own shortcomings. For example, the current data search systems are designed with the assumption that researchers find data primarily by metadata searches on instrument or geophysical keywords, assuming that users have sufficient knowledge of the domain vocabulary to be able to effectively utilize the search catalogs. These systems lack support for new or interdisciplinary researchers who may be unfamiliar with the domain vocabulary or the breadth of relevant data available. There is clearly a need to innovate and evolve current data and information systems in order to improve data discovery and exploration capabilities to substantially reduce the data preparation time and effort. We assert that Earth science metadata assets are dark resources, information resources that organizations collect, process, and store for regular business or operational activities but fail to utilize for other purposes. The challenge for any organization is to recognize, identify and effectively utilize the dark data stores in their institutional repositories to better serve their stakeholders. NASA Earth science metadata catalogs contain dark resources consisting of structured information, free form descriptions of data and pre-generated images. With the addition of emerging semantic technologies, such catalogs can be fully utilized beyond their original design intent of supporting current search functionality. In this presentation, we will describe our approach of exploiting these information resources to provide novel data discovery and exploration pathways to science and education communities
NASA Astrophysics Data System (ADS)
Corbett, Jacqueline Marie
Enabled by advanced communication and information technologies, the smart grid represents a major transformation for the electricity sector. Vast quantities of data and two-way communications abilities create the potential for a flexible, data-driven, multi-directional supply and consumption network well equipped to meet the challenges of the next century. For electricity service providers ("utilities"), the smart grid provides opportunities for improved business practices and new business models; however, a transformation of such magnitude is not without risks. Three related studies are conducted to explore the implications of the smart grid on utilities' demand-side activities. An initial conceptual framework, based on organizational information processing theory, suggests that utilities' performance depends on the fit between the information processing requirements and capacities associated with a given demand-side activity. Using secondary data and multiple regression analyses, the first study finds, consistent with OIPT, a positive relationship between utilities' advanced meter deployments and demand-side management performance. However, it also finds that meters with only data collection capacities are associated with lower performance, suggesting the presence of information waste causing operational inefficiencies. In the second study, interviews with industry participants provide partial support for the initial conceptual model, new insights are gained with respect to information processing fit and information waste, and "big data" is identified as a central theme of the smart grid. To derive richer theoretical insights, the third study employs a grounded theory approach examining the experience of one successful utility in detail. Based on interviews and documentary data, the paradox of dynamic stability emerges as an essential enabler of utilities' performance in the smart grid environment. Within this context, the frames of opportunity, control, and data limitation interact to support dynamic stability and contribute to innovation within tradition. The main contributions of this thesis include theoretical extensions to OIPT and the development of an emergent model of dynamic stability in relation to big data. The thesis also adds to the green IS literature and identifies important practical implications for utilities as they endeavour to bring the smart grid to reality.
Robust Inference of Cell-to-Cell Expression Variations from Single- and K-Cell Profiling
Narayanan, Manikandan; Martins, Andrew J.; Tsang, John S.
2016-01-01
Quantifying heterogeneity in gene expression among single cells can reveal information inaccessible to cell-population averaged measurements. However, the expression level of many genes in single cells fall below the detection limit of even the most sensitive technologies currently available. One proposed approach to overcome this challenge is to measure random pools of k cells (e.g., 10) to increase sensitivity, followed by computational “deconvolution” of cellular heterogeneity parameters (CHPs), such as the biological variance of single-cell expression levels. Existing approaches infer CHPs using either single-cell or k-cell data alone, and typically within a single population of cells. However, integrating both single- and k-cell data may reap additional benefits, and quantifying differences in CHPs across cell populations or conditions could reveal novel biological information. Here we present a Bayesian approach that can utilize single-cell, k-cell, or both simultaneously to infer CHPs within a single condition or their differences across two conditions. Using simulated as well as experimentally generated single- and k-cell data, we found situations where each data type would offer advantages, but using both together can improve precision and better reconcile CHP information contained in single- and k-cell data. We illustrate the utility of our approach by applying it to jointly generated single- and k-cell data to reveal CHP differences in several key inflammatory genes between resting and inflammatory cytokine-activated human macrophages, delineating differences in the distribution of ‘ON’ versus ‘OFF’ cells and in continuous variation of expression level among cells. Our approach thus offers a practical and robust framework to assess and compare cellular heterogeneity within and across biological conditions using modern multiplexed technologies. PMID:27438699
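The intuition behind the k-cell strategy can be shown with a toy simulation: pooling k cells lifts the signal above a detection limit while the pool moments still constrain single-cell heterogeneity, since for independent cells the pool mean is k times the single-cell mean and the pool variance is k times the single-cell variance. The paper uses a full Bayesian model rather than this moment-matching shortcut, and the parameters below are hypothetical.

```python
# Toy illustration of recovering single-cell variance from k-cell pools.
import numpy as np

rng = np.random.default_rng(0)
p_on, mu_on, sigma_on, k, n_pools = 0.3, 5.0, 1.5, 10, 2000   # hypothetical CHPs

def single_cells(n):
    on = rng.random(n) < p_on                        # ON/OFF mixture of expression
    return np.where(on, rng.normal(mu_on, sigma_on, n), 0.0)

true = single_cells(100000)
pools = single_cells(k * n_pools).reshape(n_pools, k).sum(axis=1)

print("true single-cell variance:     ", true.var())
print("variance recovered from pools: ", pools.var() / k)     # var_k / k estimates sigma^2
```

The same logic is why single- and k-cell data carry complementary information: the pools gain sensitivity, while the relation between their moments and the single-cell CHPs keeps the heterogeneity identifiable.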
ERIC Educational Resources Information Center
Altmann, Berthold; Brown, William G.
The first-generation Approach by Concept (ABC) storage and retrieval method, a method which utilizes as a subject approach appropriate standardized English-language statements processed and printed in a permuted index format, underwent a performance test, the primary objective of which was to spot deficiencies and to develop a second-generation…
A Lean Approach to Improving SE Visibility in Large Operational Systems Evolution
2013-06-01
This paper describes an example implementation of the concept in a large health care system of systems. To enhance both visibility and flow, the approach utilizes visualization techniques, pull-scheduling processes, … and then provides the results to the requestor as soon as available.
Query by Browsing: An Alternative Hypertext Information Retrieval Method
Frisse, Mark E.; Cousins, Steve B.
1989-01-01
In this paper we discuss our efforts to develop programs which enhance the ability to navigate through large medical hypertext systems. Our approach organizes hypertext index terms into a belief network and uses reader feedback to update the degree of belief in the index terms' utility to a query. We begin by describing various possible configurations for indexes to hypertext. We then describe how belief network calculations can be applied to these indexes. After a brief discussion of early results using manuscripts from a medical handbook, we close with an analysis of our approach's applicability to a wider range of hypertext information retrieval problems.
Martin, J B; Wilkins, A S; Stawski, S K
1998-08-01
The evolving health care environment demands that health care organizations fully utilize information technologies (ITs). The effective deployment of IT requires the development and implementation of a comprehensive IT strategic plan. A number of approaches to health care IT strategic planning exist, but they are outdated or incomplete. The component alignment model (CAM) introduced here recognizes the complexity of today's health care environment, emphasizing continuous assessment and realignment of seven basic components: external environment, emerging ITs, organizational infrastructure, mission, IT infrastructure, business strategy, and IT strategy. The article provides a framework by which health care organizations can develop an effective IT strategic planning process.
NASA Technical Reports Server (NTRS)
Smith, Jeffrey H.
2006-01-01
The need for sufficient quantities of oxygen, water, and fuel resources to support a crew on the surface of Mars presents a critical logistical issue of whether to transport such resources from Earth or manufacture them on Mars. An approach based on the classical Wildcat Drilling Problem of Bayesian decision theory was applied to the problem of finding water in order to compute the expected value of precursor mission sample information. An implicit (required) probability of finding water on Mars was derived from the value of sample information using the expected mass savings of alternative precursor missions.
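A toy expected-value-of-sample-information calculation in the spirit of the Wildcat Drilling formulation is sketched below. All numbers are hypothetical rather than the study's: the decision is between shipping water from Earth and building an in-situ plant, before and after a precursor mission whose detector has imperfect sensitivity and specificity.

```python
# Toy EVSI calculation for the "find water on Mars" decision (hypothetical numbers).
p_water = 0.4                          # prior probability that accessible water exists
ship_mass = 20.0                       # launch mass (t) if all water is shipped
isru_mass = {True: 6.0, False: 28.0}   # in-situ plant mass outcome with / without water
sens, spec = 0.85, 0.90                # precursor detector sensitivity / specificity

def best_expected_mass(p):
    """Pick the option with the lower expected launch mass for belief p."""
    return min(ship_mass, p * isru_mass[True] + (1 - p) * isru_mass[False])

prior_cost = best_expected_mass(p_water)
p_pos = sens * p_water + (1 - spec) * (1 - p_water)           # P(test positive)
post_pos = sens * p_water / p_pos                             # P(water | positive)
post_neg = (1 - sens) * p_water / (1 - p_pos)                 # P(water | negative)
posterior_cost = (p_pos * best_expected_mass(post_pos)
                  + (1 - p_pos) * best_expected_mass(post_neg))
print("expected mass saved by the precursor data:", prior_cost - posterior_cost, "t")
```

Reading the calculation in the other direction, fixing the acceptable mass savings and solving for the prior, gives the kind of implicit (required) probability of finding water that the abstract describes.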
A tactual display aid for primary flight training
NASA Technical Reports Server (NTRS)
Gilson, R. D.
1979-01-01
A means of flight instruction is discussed. In addition to verbal assistance, control feedback was continuously presented via a nonvisual means utilizing touch. A kinesthetic-tactile (KT) display was used as a readout and tracking device for a computer-generated signal of desired angle of attack during the approach and landing. Airspeed and glide path information was presented via KT or visual heads-up display techniques. Performance with the heads-up display of pitch information was shown to be significantly better than performance with the KT pitch display. Testing without the displays showed that novice pilots who had received tactile pitch error information performed both pitch and throttle control tasks significantly better than those who had received the same information from the visual heads-up display of pitch during the test series of approaches to landing.
NASA Technical Reports Server (NTRS)
Voorhees, J. W.; Bucher, N. M.
1983-01-01
The cockpit has been one of the most rapidly changing areas of new aircraft design over the past thirty years. In connection with these developments, a pilot can now be considered a decision maker/system manager as well as a vehicle controller. There is, however, a trend towards an information overload in the cockpit, and information processing problems begin to occur for the rotorcraft pilot. One approach to overcome the arising difficulties is based on the utilization of voice technology to improve the information transfer rate in the cockpit with respect to both input and output. Attention is given to the background of speech technology, the application of speech technology within the cockpit, voice interactive electronic warning system (VIEWS) simulation, and methodology. Information subsystems are considered along with a dynamic simulation study, and data collection.
Band Excitation Kelvin probe force microscopy utilizing photothermal excitation
Collins, Liam; Jesse, Stephen; Balke, Nina; ...
2015-03-13
A multifrequency open loop Kelvin probe force microscopy (KPFM) approach utilizing photothermal as opposed to electrical excitation is developed. Photothermal band excitation (PthBE)-KPFM is implemented here in a grid mode on a model test sample comprising a metal-insulator junction with local charge-patterned regions. Unlike the previously described open loop BE-KPFM, which relies on capacitive actuation of the cantilever, photothermal actuation is shown to be highly sensitive to the electrostatic force gradient even at biases close to the contact potential difference (CPD). PthBE-KPFM is further shown to provide a more localized measurement of true CPD in comparison to the gold standard ambient KPFM approach, amplitude modulated KPFM. In conclusion, PthBE-KPFM data contain information relating to local dielectric properties and electronic dissipation between tip and sample unattainable using conventional single frequency KPFM approaches.
Duggleby, Wendy; Williams, Allison
2016-01-01
The purpose of this article is to discuss methodological and epistemological considerations involved in using qualitative inquiry to develop interventions. These considerations included (a) using diverse methodological approaches and (b) epistemological considerations such as generalization, de-contextualization, and subjective reality. Diverse methodological approaches have the potential to inform different stages of intervention development. Using the development of a psychosocial hope intervention for advanced cancer patients as an example, the authors utilized a thematic study to assess current theories/frameworks and interventions. However, to understand the processes that the intervention needed to target to affect change, grounded theory was used. Epistemological considerations provided a framework to understand and, further, critique the intervention. Using diverse qualitative methodological approaches and examining epistemological considerations were useful in developing an intervention that appears to foster hope in patients with advanced cancer. © The Author(s) 2015.
Segmentation of oil spills in SAR images by using discriminant cuts
NASA Astrophysics Data System (ADS)
Ding, Xianwen; Zou, Xiaolin
2018-02-01
The discriminant cut is used to segment oil spills in synthetic aperture radar (SAR) images. The proposed approach is a region-based one, which is able to capture and utilize spatial information in SAR images. Real SAR images, i.e., ALOS-1 PALSAR and Sentinel-1 SAR images, were collected and used to validate the accuracy of the proposed approach for oil spill segmentation in SAR images. The accuracy of the proposed approach is higher than that of the fuzzy C-means classification method.
Goldfarb, S
1999-03-01
Whether one seeks to reduce inappropriate utilization of resources, improve diagnostic accuracy, increase utilization of effective therapies, or reduce the incidence of complications, the key to change is physician involvement in change. Unfortunately, a simple approach to the problem of inducing change in physician behavior is not available. There is a generally accepted view that expert, best-practice guidelines will improve clinical performance. However, there may be a bias to report positive results and a lack of careful analysis of guideline usage in routine practice in a "postmarketing" study akin to that seen in the pharmaceutical industry. Systems that allow the reliable assessment of quality of outcomes, efficiency of resource utilization, and accurate assessment of the risks associated with the care of given patient populations must be widely available before deciding whether an incentive-based system for providing the full range of medical care is feasible. Decision support focuses on providing information, ideally at the "point of service" and in the context of a particular clinical situation. Rules are self-imposed by physicians and are therefore much more likely to be adopted. As health care becomes corporatized, with increasing numbers of physicians employed by large organizations with the capacity to provide detailed information on the nature and quality of clinical care, it is possible that properly constructed guidelines, appropriate financial incentives, and robust forms of decision support will lead to a physician-led, process improvement approach to more rational and affordable health care.
Utilization of ICT by Moral Education Teachers
ERIC Educational Resources Information Center
Narinasamy, Ilhavenil a/p; Mamat, Wan Hasmah Wan
2013-01-01
Studies show that information and communications technology (ICT) integration in many classrooms today enhances students' learning and skills acquisition. Thus, it is necessary for teachers to integrate ICT in their classrooms. This paper discusses the need to incorporate ICT in Moral Education. This study adopts the qualitative approach design…
A CROSS-SPECIES MODE OF ACTION INFORMATION ASSESSMENT: A CASE STUDY OF BISPHENOL A
A case study assessing the utility of this approach was performed for bisphenol A (BPA). BPA, a component of polycarbonate plastics, epoxy resins, and polyester resins, was selected because it is a high production volume chemical; data have been identified for both vertebrate an...
The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...
Understanding the Stereotype as a Complex Communication Tool: Touchstone Award
ERIC Educational Resources Information Center
Kurylo, Anastacia
2004-01-01
A cognitive approach that views stereotypes as mental tools that function in information processing has dominated recent work on stereotyping. But regardless of their cognitive utility, an association between stereotypes and social injustice (i.e., prejudice, discrimination) has earned stereotypes the label "bad" and consequently something to be…
The large number of diverse chemicals in production or in the environment has motivated medium to high throughput in vitro or small animal approaches to efficiently profile chemical-biological interactions and to utilize this information to assess risks of chemical exposures on h...
Why Audit Communication in Organizations?
ERIC Educational Resources Information Center
White, Noel D.; Greenbaum, Howard H.
The purpose of this paper is to present a common sense proposal, as opposed to a documented proposal, arguing for the adoption of a periodic communication audit procedure in organizations. The paper presents an approach and information the communication consultant can utilize in addressing management practitioners on the topic: "Why Audit…
A new approach to preserve privacy data mining based on fuzzy theory in numerical database
NASA Astrophysics Data System (ADS)
Cui, Run; Kim, Hyoung Joong
2014-01-01
With the rapid development of information techniques, data mining approaches have become one of the most important tools to discover in-depth associations of tuples in large-scale databases. Hence, how to protect private information is a huge challenge, especially during the data mining procedure. In this paper, a new method is proposed for privacy protection which is based on fuzzy theory. The traditional fuzzy approach in this area applies fuzzification to the data without considering its readability. A new style of obscured data expression is introduced to provide more details of the subsets without reducing readability. We also adopt an approach that balances privacy level and utility when forming the suitable subgroups. An experiment is provided to show that this approach is suitable for classification without lowering accuracy. In the future, this approach can be adapted to data streams with suitable modification, owing to the low computational complexity of the fuzzy function.
Effective Sensor Selection and Data Anomaly Detection for Condition Monitoring of Aircraft Engines
Liu, Liansheng; Liu, Datong; Zhang, Yujie; Peng, Yu
2016-01-01
In a complex system, condition monitoring (CM) can collect the system working status. The condition is mainly sensed by the pre-deployed sensors in/on the system. Most existing works study how to utilize the condition information to predict the upcoming anomalies, faults, or failures. There is also some research which focuses on the faults or anomalies of the sensing element (i.e., sensor) to enhance the system reliability. However, existing approaches ignore the correlation between sensor selecting strategy and data anomaly detection, which can also improve the system reliability. To address this issue, we study a new scheme which includes sensor selection strategy and data anomaly detection by utilizing information theory and Gaussian Process Regression (GPR). The sensors that are more appropriate for the system CM are first selected. Then, mutual information is utilized to weight the correlation among different sensors. The anomaly detection is carried out by using the correlation of sensor data. The sensor data sets that are utilized to carry out the evaluation are provided by National Aeronautics and Space Administration (NASA) Ames Research Center and have been used as Prognostics and Health Management (PHM) challenge data in 2008. By comparing the two different sensor selection strategies, the effectiveness of selection method on data anomaly detection is proved. PMID:27136561
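A condensed sketch of the two-stage scheme described above is given below. It is illustrative only: the kernel choices, the 3-sigma threshold, and the target/reference split are assumptions rather than the paper's configuration. Candidate sensors are first ranked by mutual information against a health-related reference signal, and anomalies in one selected sensor are then flagged from the residual of a Gaussian Process Regression prediction based on the other selected sensors.

```python
# Sketch: MI-based sensor selection plus GPR-residual anomaly detection (assumptions noted).
import numpy as np
from sklearn.feature_selection import mutual_info_regression
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def select_sensors(X, reference, top_k=4):
    """X: (n_samples, n_sensors) condition-monitoring data; reference: health indicator."""
    mi = mutual_info_regression(X, reference)
    return np.argsort(mi)[::-1][:top_k], mi

def detect_anomalies(X_train, y_train, X_test, y_test, n_sigma=3.0):
    gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(),
                                   normalize_y=True).fit(X_train, y_train)
    mean, std = gpr.predict(X_test, return_std=True)
    return np.abs(y_test - mean) > n_sigma * std      # True where the sensor misbehaves

# Usage: selected, mi = select_sensors(cm_data, degradation_index)
#        flags = detect_anomalies(train_inputs, train_target, test_inputs, test_target)
```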
Hierarchical Representation Learning for Kinship Verification.
Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul
2017-01-01
Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of human mind and to identify the discriminatory areas of a face that facilitate kinship-cues. The visual stimuli presented to the participants determine their ability to recognize kin relationship using the whole face as well as specific facial regions. The effect of participant gender and age and kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d' , and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed as filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields the state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.
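The soft-biometric fusion mentioned at the end of this abstract can be sketched minimally as a product of per-modality likelihood ratios; the per-modality likelihood models themselves are assumptions here, not the paper's trained fcDBN and SVM components.

```python
# Minimal product-of-likelihood-ratios fusion sketch (illustrative models assumed).
import numpy as np

def fuse(lr_face, lr_kinship, threshold=1.0):
    """Accept a match when the combined likelihood ratio exceeds the threshold."""
    combined = np.asarray(lr_face) * np.asarray(lr_kinship)
    return combined > threshold, combined

accept, score = fuse([4.2, 0.6], [1.8, 0.9])
print(accept, score)      # e.g. first pair accepted, second rejected
```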
Informatics applied to cytology
Hornish, Maryanne; Goulart, Robert A.
2008-01-01
Automation and emerging information technologies are being adopted by cytology laboratories to augment Pap test screening and improve diagnostic accuracy. As a result, informatics, the application of computers and information systems to information management, has become essential for the successful operation of the cytopathology laboratory. This review describes how laboratory information management systems can be used to achieve an automated and seamless workflow process. The utilization of software, electronic databases and spreadsheets to perform necessary quality control measures are discussed, as well as a Lean production system and Six Sigma approach, to reduce errors in the cytopathology laboratory. PMID:19495402
Classification of cognitive systems dedicated to data sharing
NASA Astrophysics Data System (ADS)
Ogiela, Lidia; Ogiela, Marek R.
2017-08-01
This paper presents a classification of new cognitive information systems dedicated to cryptographic data splitting and sharing processes. Cognitive processes of semantic data analysis and interpretation are used to describe new classes of intelligent information and vision systems. In addition, cryptographic data splitting algorithms and cryptographic threshold schemes are used to improve processes of secure and efficient information management with application of such cognitive systems. The utility of the proposed cognitive sharing procedures and distributed data sharing algorithms is also presented. A few possible applications of cognitive approaches for visual information management and encryption are also described.
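For context, the kind of cryptographic threshold scheme referred to above can be illustrated with textbook Shamir (k, n) sharing over a prime field; this is a generic sketch, not the authors' cognitive-system implementation, and the field size is an assumption.

```python
# Textbook Shamir (k, n) secret sharing sketch (not the paper's implementation).
import random

PRIME = 2**127 - 1          # prime field large enough for the secrets being split

def split(secret, n, k):
    """Return n shares; any k of them reconstruct the secret."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = split(123456789, n=5, k=3)
print(reconstruct(shares[:3]) == 123456789)   # any 3 of the 5 shares suffice
```

Splitting data this way means no single holder can reconstruct the protected information, which is the property the cognitive sharing procedures build on.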
Shape design sensitivity analysis using domain information
NASA Technical Reports Server (NTRS)
Seong, Hwal-Gyeong; Choi, Kyung K.
1985-01-01
A numerical method for obtaining accurate shape design sensitivity information for built-up structures is developed and demonstrated through analysis of examples. The basic character of the finite element method, which gives more accurate domain information than boundary information, is utilized for shape design sensitivity improvement. A domain approach for shape design sensitivity analysis of built-up structures is derived using the material derivative idea of structural mechanics and the adjoint variable method of design sensitivity analysis. Velocity elements and B-spline curves are introduced to alleviate difficulties in generating domain velocity fields. The regularity requirements of the design velocity field are studied.
Simulating ensembles of source water quality using a K-nearest neighbor resampling approach.
Towler, Erin; Rajagopalan, Balaji; Seidel, Chad; Summers, R Scott
2009-03-01
Climatological, geological, and water management factors can cause significant variability in surface water quality. As drinking water quality standards become more stringent, the ability to quantify the variability of source water quality becomes more important for decision-making and planning in water treatment for regulatory compliance. However, paucity of long-term water quality data makes it challenging to apply traditional simulation techniques. To overcome this limitation, we have developed and applied a robust nonparametric K-nearest neighbor (K-nn) bootstrap approach utilizing the United States Environmental Protection Agency's Information Collection Rule (ICR) data. In this technique, first an appropriate "feature vector" is formed from the best available explanatory variables. The nearest neighbors to the feature vector are identified from the ICR data and are resampled using a weight function. Repetition of this results in water quality ensembles, and consequently the distribution and the quantification of the variability. The main strengths of the approach are its flexibility, simplicity, and the ability to use a large amount of spatial data with limited temporal extent to provide water quality ensembles for any given location. We demonstrate this approach by applying it to simulate monthly ensembles of total organic carbon for two utilities in the U.S. with very different watersheds and to alkalinity and bromide at two other U.S. utilities.
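A compact sketch of the K-nn bootstrap described above follows. The feature vector, K, and the rank-based weight kernel are assumptions used for illustration rather than the study's calibrated choices: historical records closest to the current conditions are resampled with weights that decay with neighbor rank, yielding an ensemble of plausible water quality values.

```python
# Sketch of K-nearest-neighbor bootstrap resampling for water quality ensembles.
import numpy as np

def knn_resample(feature, library_features, library_values, K=10, n_draws=1000,
                 rng=np.random.default_rng(0)):
    """library_features: (n, d) explanatory variables; library_values: (n,) e.g. TOC."""
    d = np.linalg.norm(library_features - feature, axis=1)
    nearest = np.argsort(d)[:K]
    ranks = np.arange(1, K + 1)
    weights = (1.0 / ranks) / np.sum(1.0 / ranks)     # rank-based weight function
    picks = rng.choice(nearest, size=n_draws, p=weights)
    return library_values[picks]                       # ensemble for this site/month

# ensemble = knn_resample(current_conditions, icr_features, icr_toc)
# np.percentile(ensemble, [10, 50, 90]) then quantifies source water variability.
```

Because the resampling draws only observed values, the ensembles stay physically plausible even when the local record is short, which is the point of borrowing strength from the spatially extensive ICR data.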
Rejection of an innovation: health information management training materials in east Africa.
Gladwin, J; Dixon, R A; Wilson, T D
2002-12-01
A shift towards decentralization in many low-income countries has meant more skills are demanded of primary health care managers, including data and information handling at all levels of the health care system. Ministries of Health are changing their central reporting health information systems to health management information systems with emphasis on managers utilizing information at the point of collection. This paper reports on a research study to investigate the introduction of new information management strategies intended to promote an informational approach to management at the operational health service level in low-income countries. It aims to understand the process taking place when externally developed training materials (PHC MAP), which are intended to strengthen health management information systems, are introduced to potential users in an east African country. A case study has been undertaken and this research has demonstrated that the dynamic equilibrium approach to organizational change is applicable to the introduction of new information management strategies and management approaches in low-income countries. Although PHC MAP developers envisaged a technical innovation needing implementation, potential users saw the situation as one of organizational change. Contributions to theory have been made and many implications for introducing new information systems or the informational approach to management are identified. This theoretical framework could also facilitate the introduction of future information management innovations and would allow practitioners to perceive the introduction of information management innovations as one of organizational change that needs to be managed. Consequently, issues that may facilitate or inhibit adoption could be identified in advance.
Beck, Peter; Truskaller, Thomas; Rakovac, Ivo; Bruner, Fritz; Zanettin, Dominik; Pieber, Thomas R
2009-01-01
5.9% of the Austrian population is affected by diabetes mellitus. Disease Management is a structured treatment approach that is suitable for application to the diabetes mellitus area and often is supported by information technology. This article describes the information systems developed and implemented in the Austrian disease management programme for type 2 diabetes. Several workflows for administration as well as for clinical documentation have been implemented utilizing the Austrian e-Health infrastructure. De-identified clinical data is available for creating feedback reports for providers and programme evaluation.
Distributed photovoltaic systems: Utility interface issues and their present status
NASA Technical Reports Server (NTRS)
Hassan, M.; Klein, J.
1981-01-01
Major technical issues involving the integration of distributed photovoltaics (PV) into electric utility systems are defined and their impacts are described quantitatively. An extensive literature search, interviews, and analysis yielded information about the work in progress and highlighted problem areas in which additional work and research are needed. The findings from the literature search were used to determine whether satisfactory solutions to the problems exist or whether satisfactory approaches to a solution are underway. It was discovered that very few standards, specifications, or guidelines currently exist that will aid industry in integrating PV into the utility system. Specific areas of concern identified are: (1) protection, (2) stability, (3) system unbalance, (4) voltage regulation and reactive power requirements, (5) harmonics, (6) utility operations, (7) safety, (8) metering, and (9) distribution system planning and design.
Boettcher, P J; Tixier-Boichard, M; Toro, M A; Simianer, H; Eding, H; Gandini, G; Joost, S; Garcia, D; Colli, L; Ajmone-Marsan, P
2010-05-01
The genetic diversity of the world's livestock populations is decreasing, both within and across breeds. A wide variety of factors has contributed to the loss, replacement or genetic dilution of many local breeds. Genetic variability within the more common commercial breeds has been greatly decreased by intensely selective breeding programmes. Conservation of livestock genetic variability is thus important, especially when considering possible future changes in production environments. The world has more than 7500 livestock breeds and conservation of all of them is not feasible. Therefore, prioritization is needed. The objective of this article is to review the state of the art in approaches for prioritization of breeds for conservation, particularly those approaches that consider molecular genetic information, and to identify any shortcomings that may restrict their application. The Weitzman method was among the first and best-known approaches for utilization of molecular genetic information in conservation prioritization. This approach balances diversity and extinction probability to yield an objective measure of conservation potential. However, this approach was designed for decision making across species and measures diversity as distinctiveness. For livestock, prioritization will most commonly be performed among breeds within species, so alternatives that measure diversity as co-ancestry (i.e. also within-breed variability) have been proposed. Although these methods are technically sound, their application has generally been limited to research studies; most existing conservation programmes have, in practice, based decisions primarily on extinction risk. The development of user-friendly software incorporating these approaches may increase their rate of utilization.
NASA Astrophysics Data System (ADS)
Jia, Heping; Jin, Wende; Ding, Yi; Song, Yonghua; Yu, Dezhao
2017-01-01
With the expanding share of renewable energy generation and the development of smart grid technologies, flexible demand resources (FDRs) have been utilized as an approach to accommodating renewable energies. However, multiple uncertainties of FDRs may affect the reliable and secure operation of the smart grid. This paper proposes multi-state reliability models for a single FDR and for aggregated FDRs that account for the responsive abilities of FDRs and for random failures of both FDR devices and the information system. The proposed reliability evaluation technique is based on the Lz transform method, which can formulate time-varying reliability indices. A modified IEEE-RTS is used to illustrate the proposed technique.
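As a rough illustration of the multi-state aggregation that underlies Lz-transform-style reliability evaluation, the following Python sketch pools hypothetical FDR state distributions and computes the probability that the aggregate responsive capacity meets a demand level. It is not the authors' implementation; the FDR states, probabilities, fleet size, and demand threshold are assumptions made for illustration only.

```python
from itertools import product

def aggregate(units):
    """Combine independent multi-state units: aggregate performance is the sum of
    individual performances, and state probabilities multiply."""
    agg = {0.0: 1.0}
    for unit in units:
        new = {}
        for (g1, p1), (g2, p2) in product(agg.items(), unit.items()):
            new[g1 + g2] = new.get(g1 + g2, 0.0) + p1 * p2
        agg = new
    return agg

def availability(agg, demand):
    """Probability that aggregate performance meets or exceeds the demand."""
    return sum(p for g, p in agg.items() if g >= demand)

# Hypothetical FDR: full responsive capacity (kW), a degraded share, or nothing
# (device or information-system failure), with assumed probabilities.
fdr = {10.0: 0.85, 5.0: 0.10, 0.0: 0.05}
fleet = [fdr] * 20          # 20 identical FDRs, assumed independent

dist = aggregate(fleet)
print("P(total response >= 150 kW) =", round(availability(dist, 150.0), 4))
```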
Frías-López, Cristina; Sánchez-Herrero, José F; Guirao-Rico, Sara; Mora, Elisa; Arnedo, Miquel A; Sánchez-Gracia, Alejandro; Rozas, Julio
2016-12-15
The development of molecular markers is one of the most important challenges in phylogenetic and genome-wide population genetics studies, especially in studies with non-model organisms. A highly promising approach for obtaining suitable markers is the utilization of genomic partitioning strategies for the simultaneous discovery and genotyping of a large number of markers. Unfortunately, not all markers obtained from these strategies provide enough information for solving multiple evolutionary questions at a reasonable taxonomic resolution. We have developed Development Of Molecular markers In Non-model Organisms (DOMINO), a bioinformatics tool for informative marker development from both next generation sequencing (NGS) data and pre-computed sequence alignments. The application implements popular NGS tools with new utilities in a highly versatile pipeline specifically designed to discover or select personalized markers at different levels of taxonomic resolution. These markers can be directly used to study the taxa surveyed for their design, utilized for further downstream PCR amplification across a broader taxonomic scope, or exploited as suitable templates for bait design in target DNA enrichment techniques. We conducted an exhaustive evaluation of the performance of DOMINO via computer simulations and illustrate its utility for finding informative markers in an empirical dataset. DOMINO is freely available from www.ub.edu/softevol/domino. Contact: elsanchez@ub.edu or jrozas@ub.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
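As a simplified illustration of the kind of filtering such a pipeline performs (this is not DOMINO's actual algorithm), the sketch below scans a multiple sequence alignment in sliding windows and keeps windows whose count of variable sites falls within a target range; the toy alignment, window size, and thresholds are assumptions chosen only to demonstrate the idea.

```python
def variable_sites(columns):
    """Count alignment columns containing more than one distinct base (gaps ignored)."""
    count = 0
    for col in columns:
        bases = {c for c in col if c in "ACGT"}
        if len(bases) > 1:
            count += 1
    return count

def candidate_markers(alignment, window=200, step=50, min_var=5, max_var=40):
    """Slide a window along the alignment and report regions whose variability lies
    in the range desired for the intended taxonomic resolution."""
    length = len(alignment[0])
    markers = []
    for start in range(0, length - window + 1, step):
        cols = [tuple(seq[i] for seq in alignment) for i in range(start, start + window)]
        v = variable_sites(cols)
        if min_var <= v <= max_var:
            markers.append((start, start + window, v))
    return markers

# Toy alignment (in practice, sequences would come from NGS assemblies or pre-computed alignments).
aln = ["ACGTACGTACGT" * 30, "ACGTACGAACGT" * 30, "ACGTTCGTACGT" * 30]
print(candidate_markers(aln, window=60, step=30, min_var=1, max_var=10))
```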
Does topological information matter for power grid vulnerability?
NASA Astrophysics Data System (ADS)
Ouyang, Min; Yang, Kun
2014-12-01
Power grids, which play an important role in supporting the economy of a region as well as the lives of its citizens, could be attacked by terrorists or enemies seeking to damage the region. Depending on different levels of power grid information collected by the terrorists, their attack strategies might be different. This paper groups power grid information into four levels: no information, purely topological information (PTI), topological information with generator and load nodes (GLNI), and full information (including component physical properties and flow parameters information), and then identifies possible attack strategies for each information level. Analyzing and comparing power grid vulnerability under these attack strategies from both the terrorists' and the utility companies' points of view gives rise to an approach to quantify the relative values of three types of information: PTI, GLNI, and component parameter information (CPI). This approach can provide information regarding the extent to which topological information matters for power system vulnerability decisions. Taking several test systems as examples, results show that for small attacks with p ≤ 0.1, CPI matters the most; when taking attack cost into consideration and assuming that the terrorists adopt the optimum cost-efficient attack intensity, CPI has the largest cost-based information value.
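A minimal sketch of the purely-topological-information (PTI) attack flavor discussed above: remove the highest-degree nodes and track the surviving giant component. The synthetic test graph and attack fractions are illustrative only, and the paper's strategies additionally use generator/load and flow-parameter information.

```python
import networkx as nx

def degree_attack(graph, fraction):
    """Remove the top-degree nodes (a purely topological attack) and return the
    surviving fraction of nodes in the largest connected component."""
    g = graph.copy()
    n_remove = int(fraction * g.number_of_nodes())
    targets = sorted(g.degree, key=lambda kv: kv[1], reverse=True)[:n_remove]
    g.remove_nodes_from(node for node, _ in targets)
    if g.number_of_nodes() == 0:
        return 0.0
    giant = max(nx.connected_components(g), key=len)
    return len(giant) / graph.number_of_nodes()

# Illustrative stand-in for a transmission grid topology (not an IEEE test case).
grid = nx.barabasi_albert_graph(300, 2, seed=1)
for p in (0.05, 0.10, 0.20):
    print(f"attack fraction {p:.2f}: surviving giant component = {degree_attack(grid, p):.2f}")
```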
Yamada, Takashi; Tanaka, Yushiro; Hasegawa, Ryuichi; Sakuratani, Yuki; Yamazoe, Yasushi; Ono, Atsushi; Hirose, Akihiko; Hayashi, Makoto
2014-12-01
We propose a category approach to assessing the testicular toxicity of chemicals with a structure similar to ethylene glycol methyl ether (EGME). Based on toxicity information for EGME and related chemicals, and accompanied by adverse outcome pathway information on the testicular toxicity of EGME, this category was defined as chemicals that are metabolized to methoxy- or ethoxyacetic acid, the substances responsible for testicular toxicity. A Japanese chemical inventory was screened using the Hazard Evaluation Support System, which we have developed to support a category approach for predicting the repeated-dose toxicity of chemical substances. Quantitative metabolic information on the related chemicals was then considered, and seventeen chemicals were finally obtained from the inventory as a shortlist for the category. Available data in the literature show that chemicals for which information is available on the metabolic formation of EGME, ethylene glycol ethyl ether, or methoxy- or ethoxyacetic acid do in fact possess testicular toxicity, suggesting that testicular toxicity is a concern, due to metabolic activation, for the remaining chemicals. Our results clearly demonstrate the practical utility of the AOP-based category approach for predicting the repeated-dose toxicity of chemicals. Copyright © 2014 Elsevier Inc. All rights reserved.
Value of information analysis in healthcare: a review of principles and applications.
Tuffaha, Haitham W; Gordon, Louisa G; Scuffham, Paul A
2014-06-01
Economic evaluations are increasingly utilized to inform decisions in healthcare; however, decisions remain uncertain when they are not based on adequate evidence. Value of information (VOI) analysis has been proposed as a systematic approach to measure decision uncertainty and assess whether there is sufficient evidence to support new technologies. The objective of this paper is to review the principles and applications of VOI analysis in healthcare. Relevant databases were systematically searched to identify VOI articles. The findings from the selected articles were summarized and narratively presented. Various VOI methods have been developed and applied to inform decision-making, optimally design research studies, and set research priorities. However, the application of this approach in healthcare remains limited due to technical and policy challenges. There is a need to raise awareness of VOI analysis, simplify its current methods, and align them with the needs of decision-making organizations.
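One core VOI quantity, the expected value of perfect information (EVPI), can be approximated by Monte Carlo as the gap between deciding after uncertainty is resolved and deciding with current information. The net-benefit model and parameter distributions in this sketch are purely illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical net monetary benefit (NMB) model for two strategies, with
# uncertain effectiveness and cost parameters.
effect_new = rng.normal(0.60, 0.15, n)     # QALYs gained, new treatment
effect_old = rng.normal(0.50, 0.10, n)     # QALYs gained, standard care
cost_new = rng.normal(12_000, 2_000, n)
cost_old = rng.normal(9_000, 1_500, n)
wtp = 30_000                               # willingness-to-pay per QALY (assumed)

nmb = np.column_stack([wtp * effect_old - cost_old,   # strategy 0: standard care
                       wtp * effect_new - cost_new])  # strategy 1: new treatment

value_current_info = nmb.mean(axis=0).max()   # choose once, before uncertainty resolves
value_perfect_info = nmb.max(axis=1).mean()   # choose per realization of the parameters
evpi = value_perfect_info - value_current_info

print(f"EVPI per patient ~ {evpi:,.0f}")
```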
A Qualitative Study of Underutilization of the AIDS Drug Assistance Program
Olson, Kristin M.; Godwin, Noah C.; Wilkins, Sara Anne; Mugavero, Michael J.; Moneyham, Linda D.; Slater, Larry Z.; Raper, James L.
2014-01-01
In our previous work, we demonstrated underutilization of the AIDS Drug Assistance Program (ADAP) at an HIV clinic in Alabama. In order to understand barriers and facilitators to utilization of ADAP, we conducted focus groups of ADAP enrollees. Focus groups were stratified by sex, race, and historical medication possession ratio as a measure of program utilization. We grouped factors according to the social-ecological model. We found that multiple levels of influence, including patient and clinic-related factors, influenced utilization of antiretroviral medications. Patients introduced issues that illustrated high-priority needs for ADAP policy and implementation, suggesting that in order to improve ADAP utilization, the following issues must be addressed: patient transportation, ADAP medication refill schedules and procedures, mailing of medications, and the ADAP recertification process. These findings can inform a strategy of approaches to improve ADAP utilization, which may have widespread implications for ADAP programs across the United States. PMID:24503498
A strategic approach for Water Safety Plans implementation in Portugal.
Vieira, Jose M P
2011-03-01
Effective risk assessment and risk management approaches in public drinking water systems can benefit from a systematic process for hazard identification and effective management control based on the Water Safety Plan (WSP) concept. Good results from WSP development and implementation in a small number of Portuguese water utilities have shown that a more ambitious nationwide strategic approach to disseminating this methodology is needed. However, establishing strategic frameworks for the systematic and organic scaling-up of WSP implementation at a national level requires that major constraints be overcome: the lack of legislation and policies and the need for appropriate monitoring tools. This study presents a framework to inform future policy making by understanding the key constraints and needs related to institutional, organizational and research issues for WSP development and implementation in Portugal. This methodological contribution to WSP implementation can be replicated at a global scale. National health authorities and the Regulator may promote changes in legislation and policies. Independent global monitoring and benchmarking are adequate tools for measuring progress over time and for comparing the performance of water utilities. Water utilities' self-assessment must include performance improvement, operational monitoring and verification. Research, education and resource dissemination ensure knowledge acquisition and transfer.
A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility.
Zaballos, Agustín; Navarro, Joan; Martín De Pozuelo, Ramon
2018-02-28
Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a smart grid's data storage and management system prototype by means of SDUs is introduced, which exhibits the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. The experiments conducted endorse the feasibility of this solution and encourage practitioners to direct their efforts towards it.
Examining Teacher Experiences: A Qualitative Study on Inclusion in the Elementary Classroom
ERIC Educational Resources Information Center
Sinclair, Jennifer L.
2017-01-01
This qualitative study utilized a semi-structured interview approach to better understand the experiences of general education teachers (n = 8) with the inclusion of special education students in the general education classroom. By gaining information about the experiences that general education teachers have with supports and services for, as…
An Adaptive Approach for Implementing e-Government in I. R. Iran
ERIC Educational Resources Information Center
Sharifi, Hossein; Zarei, Behrouz
2004-01-01
Acknowledging the necessity of utilizing new electronic, information, and communication technologies, the movement toward implementation of e-government in Iran has recently received the attention of the authorities and policy makers. The premise of the work is that e-enabled government is a momentous opportunity for…
Exploring the Utility of Sequential Analysis in Studying Informal Formative Assessment Practices
ERIC Educational Resources Information Center
Furtak, Erin Marie; Ruiz-Primo, Maria Araceli; Bakeman, Roger
2017-01-01
Formative assessment is a classroom practice that has received much attention in recent years for its established potential at increasing student learning. A frequent analytic approach for determining the quality of formative assessment practices is to develop a coding scheme and determine frequencies with which the codes are observed; however,…
Next Steps for "Big Data" in Education: Utilizing Data-Intensive Research
ERIC Educational Resources Information Center
Dede, Chris
2016-01-01
Data-informed instructional methods offer tremendous promise for increasing the effectiveness of teaching, learning, and schooling. Yet-to-be-developed data science approaches have the potential to dramatically advance instruction for every student and to enhance learning for people of all ages. Next steps that emerged from a recent National…
ERIC Educational Resources Information Center
Squires, Gregory D.; And Others
1979-01-01
Redlining of many urban communities and discrimination against the poor and minorities are common in the insurance industry, and these practices contribute to the deterioration of those communities. The utilization of a structural/disinvestment approach by social scientists should provide additional information about the uneven development of…
LEAD at Lunch: Inquiry, Learning, and Action
ERIC Educational Resources Information Center
Roberts, Cynthia
2012-01-01
This account of practice discusses the author's experience in facilitating a small group of managers in health care over lunchtime utilizing an action learning approach. This was part of a larger leadership development initiative which took place in the organization and the intention was to create a more intimate, informal and safe setting whereby…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-09
... intramural and extramural research efforts that address the combined health effects of multiple environmental... construed as a funding opportunity or grant program. Input from all interested parties is welcome including... program priorities and recommends funding levels to assure maximum utilization of available resources in...
Mobile Phone Applications in Academic Library Services: A Students' Feedback Survey
ERIC Educational Resources Information Center
Karim, Nor Shahriza Abdul; Darus, Siti Hawa; Hussin, Ramlah
2006-01-01
Purpose: This study seeks to explore the utilization of mobile phone services in the educational environment, explore the nature of mobile phone use among university students, and investigate the perception of university students on mobile phone uses in library and information services. Design/methodology/approach: The study used a review of…
Jeremy R. deWaard; Andrew Mitchell; Melody A. Keena; David Gopurenko; Laura M. Boykin; Karen F. Armstrong; Michael G. Pogue; Joao Lima; Robin Floyd; Robert H. Hanner; Leland M. Humble
2010-01-01
This study demonstrates the efficacy of DNA barcodes for diagnosing species of Lymantria and reinforces the view that the approach is an under-utilized resource with substantial potential for biosecurity and surveillance. Biomonitoring agencies currently employing the NB restriction digest system would gather more information by transitioning to the...
A Principled Approach to Utilizing Digital Games in the Language Learning Classroom
ERIC Educational Resources Information Center
Baierschmidt, Jared
2013-01-01
Empirical research into the use of digital games for educational purposes has shown promising results such as increased learner motivation, improved learner retention of information, and increased learner interest in subject matter. Furthermore, in the field of language learning, digital games have been used successfully in a variety of ways such…
Finding a common path: predicting gene function using inferred evolutionary trees.
Reynolds, Kimberly A
2014-07-14
Reporting in Cell, Li and colleagues (2014) describe an innovative method to functionally classify genes using evolutionary information. This approach demonstrates broad utility for eukaryotic gene annotation and suggests an intriguing new decomposition of pathways and complexes into evolutionarily conserved modules. Copyright © 2014 Elsevier Inc. All rights reserved.
Single-Case Research Design: An Alternative Strategy for Evidence-Based Practice
ERIC Educational Resources Information Center
Stapleton, Drue; Hawkins, Andrew
2015-01-01
Objective: The trend of utilizing evidence-based practice (EBP) in athletic training is now requiring clinicians, researchers, educators, and students to be equipped to both engage in and make judgments about research evidence. Single-case design (SCD) research may provide an alternative approach to develop such skills and inform clinical and…
Career Education: Collaboration with the Private Sector. Information Series No. 246.
ERIC Educational Resources Information Center
Bhaerman, Robert D.
This paper reviews three aspects of career education-private sector collaboration: (1) the general and specific approaches that have been utilized during the past 10 years by the career education movement and the private sector in developing career education collaboration in the private sector; (2) the major results of these activities, focusing…
Background: Although engineered nanomaterials (ENM) are currently regulated either in the context of a new chemical, or as a new use of an existing chemical, hazard assessment is still to a large extent reliant on information from historical toxicity studies of the parent compoun...
ERIC Educational Resources Information Center
Holland, Terrill R.
1979-01-01
Clinicians with highly deviant recommendation rates also exhibited extreme differences in their use of the normal and sociopathic categories. It was concluded that biases in the definition and assignment of labels might be reduced by means of a combined clinical-statistical approach to gathering and utilizing diagnostic information. (Author)
ERIC Educational Resources Information Center
Williams, Frederick D.
Presented are the results of a study to determine the perceived needs of environmental control education programs as seen by students, instructors, deans or program directors, and field-related employers in the field of water pollution control. Data were collected utilizing three approaches: survey instruments, information from Water Quality…
ERIC Educational Resources Information Center
Bi, Youyi
2017-01-01
Human-centered design requires thorough understanding of people (e.g. customers, designers, engineers) in order to better satisfy the needs and expectations of all stakeholders in the design process. Designers are able to create better products by incorporating customers' subjective evaluations on products. Engineers can also build better tools…
A data-driven approach for quality assessment of radiologic interpretations.
Hsu, William; Han, Simon X; Arnold, Corey W; Bui, Alex At; Enzmann, Dieter R
2016-04-01
Given the increasing emphasis on delivering high-quality, cost-efficient healthcare, improved methodologies are needed to measure the accuracy and utility of ordered diagnostic examinations in achieving the appropriate diagnosis. Here, we present a data-driven approach for performing automated quality assessment of radiologic interpretations using other clinical information (e.g., pathology) as a reference standard for individual radiologists, subspecialty sections, imaging modalities, and entire departments. Downstream diagnostic conclusions from the electronic medical record are utilized as "truth" to which upstream diagnoses generated by radiology are compared. The described system automatically extracts and compares patient medical data to characterize concordance between clinical sources. Initial results are presented in the context of breast imaging, matching 18,101 radiologic interpretations with 301 pathology diagnoses and achieving a precision and recall of 84% and 92%, respectively. The presented data-driven method highlights the challenges of integrating multiple data sources and the application of information extraction tools to facilitate healthcare quality improvement. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
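A minimal sketch of how concordance against a downstream reference standard reduces to precision and recall once each matched case is labeled; the toy labels below are invented and unrelated to the study's data.

```python
def precision_recall(pairs):
    """pairs: iterable of (radiology_positive, pathology_positive) booleans,
    with pathology treated as the reference standard."""
    pairs = list(pairs)
    tp = sum(1 for r, p in pairs if r and p)
    fp = sum(1 for r, p in pairs if r and not p)
    fn = sum(1 for r, p in pairs if not r and p)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Toy matched cases: (radiology called malignant?, pathology confirmed malignant?)
matched = [(True, True), (True, False), (False, True), (True, True), (False, False)]
p, r = precision_recall(matched)
print(f"precision = {p:.2f}, recall = {r:.2f}")
```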
Student approaches for learning in medicine: what does it tell us about the informal curriculum?
Zhang, Jianzhen; Peterson, Raymond F; Ozolins, Ieva Z
2011-10-21
It has long been acknowledged that medical students frequently focus their learning on that which will enable them to pass examinations, and that they use a range of study approaches and resources in preparing for their examinations. A recent qualitative study identified that in addition to the formal curriculum, students are using a range of resources and study strategies which could be attributed to the informal curriculum. What is not clearly established is the extent to which these informal learning resources and strategies are utilized by medical students. The aim of this study was to establish the extent to which students in a graduate-entry medical program use various learning approaches to assist their learning and preparation for examinations, apart from those resources offered as part of the formal curriculum. A validated survey instrument was administered to 522 medical students. Factor analysis, internal consistency testing, descriptive analysis and comparisons with demographic variables were completed. The factor analysis identified eight scales with acceptable levels of internal consistency, with alpha coefficients between 0.72 and 0.96. Nearly 80% of the students reported that they were overwhelmed by the amount of work perceived necessary to complete the formal curriculum, with 74.3% believing that the informal learning approaches helped them pass the examinations and 61.3% believing that these approaches prepared them to be good doctors. The informal learning activities utilized by students included using past student notes (85.8%) and PBL tutor guides (62.7%), and taking part in self-organised study groups (62.6%) and peer-led tutorials (60.2%). Almost all students accessed the formal school resources for at least 10% of their study time. Students in the first year of the program were more likely to rely on the formal curriculum resources than those in Year 2 (p = 0.008). Curriculum planners should examine the level of use of informal learning activities in their schools, and investigate whether this is to enhance student progress, a result of perceived weaknesses in the delivery and effectiveness of formal resources, or a way to overcome anxiety about the volume of work expected by medical programs.
Towards a Viscous Wall Model for Immersed Boundary Methods
NASA Technical Reports Server (NTRS)
Brehm, Christoph; Barad, Michael F.; Kiris, Cetin C.
2016-01-01
Immersed boundary methods are frequently employed for simulating flows at low Reynolds numbers or for applications where viscous boundary layer effects can be neglected. The primary shortcoming of Cartesian mesh immersed boundary methods is the inability to efficiently resolve thin turbulent boundary layers in high-Reynolds-number flow applications. The inefficiency of resolving the thin boundary layer is associated with the use of constant-aspect-ratio Cartesian grid cells. Conventional CFD approaches can efficiently resolve the large wall-normal gradients by utilizing large-aspect-ratio cells near the wall. This paper presents different approaches for immersed boundary methods to account for the viscous boundary layer interaction with the flow field away from the walls. Different wall modeling approaches proposed in previous research studies are addressed and compared to a new integral boundary layer based approach. In contrast to common wall-modeling approaches that usually utilize only local flow information, the integral boundary layer based approach keeps the streamwise history of the boundary layer. This allows the method to remain effective at much larger y+ values than local wall modeling approaches. After a theoretical discussion of the different approaches, the method is applied to increasingly challenging flow fields including fully attached, separated, and shock-induced separated (laminar and turbulent) flows.
Fottler, Myron D; Dickson, Duncan; Ford, Robert C; Bradley, Kenneth; Johnson, Lee
2006-02-01
The measurement of patient satisfaction is crucial to enhancing customer service and competitive advantage in the health-care industry. While there are numerous approaches to such measurement, this paper provides a case study which compares and contrasts patient and staff perceptions of customer service using both survey and focus group data. Results indicate that there is a high degree of correlation between staff and patient perceptions of customer service based on both survey and focus group data. However, the staff and patient subgroups also provided complementary information regarding patient perceptions of their service experience. Staff members tended to have more negative perceptions of service attributes than did the patients themselves. The focus group results provide complementary information to survey results in terms of greater detail and more managerially relevant information. While these results are derived from a pilot study, they suggest that diversification of data sources beyond patient surveys may enhance the utility of customer service information. If further research can affirm these findings, they create exciting possibilities for gathering valid, reliable and cost-effective customer service information.
Selecting Essential Information for Biosurveillance—A Multi-Criteria Decision Analysis
Generous, Nicholas; Margevicius, Kristen J.; Taylor-McCabe, Kirsten J.; Brown, Mac; Daniel, W. Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina
2014-01-01
The National Strategy for Biosurveillance defines biosurveillance as “the process of gathering, integrating, interpreting, and communicating essential information related to all-hazards threats or disease activity affecting human, animal, or plant health to achieve early detection and warning, contribute to overall situational awareness of the health aspects of an incident, and to enable better decision-making at all levels.” However, the strategy does not specify how “essential information” is to be identified and integrated into the current biosurveillance enterprise, or what metrics qualify information as “essential”. The question of data stream identification and selection requires a structured methodology that can systematically evaluate the tradeoffs between the many criteria that need to be taken into account. Multi-Attribute Utility Theory, a type of multi-criteria decision analysis, can provide a well-defined, structured approach that can offer solutions to this problem. While the use of Multi-Attribute Utility Theory as a practical method to apply formal scientific decision-theoretical approaches to complex, multi-criteria problems has been demonstrated in a variety of fields, this method has never been applied to decision support in biosurveillance. We have developed a formalized decision support analytic framework that can facilitate identification of “essential information” for use in biosurveillance systems or processes, and we offer this framework to the global BSV community as a tool for optimizing the BSV enterprise. To demonstrate utility, we applied the framework to the problem of evaluating data streams for use in an integrated global infectious disease surveillance system. PMID:24489748
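In the spirit of the framework described above, the sketch below scores candidate data streams with a simple weighted additive multi-attribute utility model; the criteria, weights, and single-attribute utilities are invented and do not reflect the authors' elicitation.

```python
# Criteria weights (assumed to be elicited from stakeholders and normalized to sum to 1).
weights = {"timeliness": 0.30, "specificity": 0.25, "coverage": 0.25, "cost": 0.20}

# Single-attribute utilities on a 0-1 scale for each candidate data stream (hypothetical).
streams = {
    "emergency department visits": {"timeliness": 0.8, "specificity": 0.6, "coverage": 0.7, "cost": 0.5},
    "lab-confirmed case reports":  {"timeliness": 0.4, "specificity": 0.9, "coverage": 0.6, "cost": 0.6},
    "web search trends":           {"timeliness": 0.9, "specificity": 0.3, "coverage": 0.9, "cost": 0.9},
}

def additive_utility(scores, weights):
    """Weighted additive aggregation used in simple multi-attribute utility models."""
    return sum(weights[c] * scores[c] for c in weights)

for name, scores in sorted(streams.items(), key=lambda kv: -additive_utility(kv[1], weights)):
    print(f"{name}: {additive_utility(scores, weights):.2f}")
```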
Helping Water Utilities Grapple with Climate Change
NASA Astrophysics Data System (ADS)
Yates, D.; Gracely, B.; Miller, K.
2008-12-01
The Water Research Foundation (WRF), serving the drinking water industry, and the National Center for Atmospheric Research (NCAR) are collaborating on an effort to develop and implement locally-relevant, structured processes to help water utilities consider the impacts and adaptation options that climate variability and change might have on their water systems. Adopting a case-study approach, the structured process includes 1) a problem definition phase, focused on identifying goals, information needs, utility vulnerabilities and possible adaptation options in the face of climate and hydrologic uncertainty; 2) developing and/or modifying system-specific Integrated Water Resource Management (IWRM) models and conducting sensitivity analysis to identify critical variables; 3) developing probabilistic climate change scenarios focused on exploring uncertainties identified as important in the sensitivity analysis in step 2; and 4) implementing the structured process and examining approaches to decision making under uncertainty. Collaborators include eight drinking water utilities and two state agencies: 1) The Inland Empire Utility Agency, CA; 2) The El Dorado Irrigation District, Placerville, CA; 3) Portland Water Bureau, Portland, OR; 4) Colorado Springs Utilities, Colorado Springs, CO; 5) Cincinnati Water, Cincinnati, OH; 6) Massachusetts Water Resources Authority (MWRA), Boston, MA; 7) Durham Water, Durham, NC; and 8) Palm Beach County Water (PBCW), Palm Beach, FL. The California Department of Water Resources and the Colorado Water Conservation Board were the state agencies with which we collaborated.
Chen, Bei-Bei; Gong, Hui-Li; Li, Xiao-Juan; Lei, Kun-Chao; Duan, Guang-Yao; Xie, Jin-Rong
2014-04-01
Long-term over-exploitation of underground resources, together with static and dynamic loads that increase year by year, influences the occurrence and development of regional land subsidence to a certain extent. Using 29 scenes of Envisat ASAR images covering the plain area of Beijing, China, the present paper applied the multi-temporal InSAR method, incorporating both persistent scatterer and small baseline approaches, and obtained monitoring information on regional land subsidence. The authors chose five typical settlement areas under different situations of space development and utilization. With classified land-use information, multi-spectral remote sensing images, and geological data, and adopting GIS spatial analysis methods, the authors analyzed the time-series evolution characteristics of uneven settlement. The comprehensive analysis results suggest that the complexity of space development and utilization affects the trend of uneven settlement: the simpler the situation of space development and utilization, the smaller the settlement gradient and the weaker the trend toward uneven settlement.
[Use of an index of social welfare for health planning at a municipal level].
Ochoa-Díaz López, H; Sánchez-Pérez, H J; Martínez-Guzmán, L A
1996-01-01
This paper analyzes the relationship between a living standards index for small areas based on census data and information on morbidity and health care utilization. The information was gathered through a health interview survey of a random sample of 1 238 households from rural areas of Tlaxcala, Mexico. The population from localities with lower living standards showed significantly higher prevalences of morbidity and worse self-reported health status measures, as compared to localities with higher living standards. On the contrary, higher living standards were related with a greater utilization of health services. The approach proved to be useful in discriminating localities and areas of high and low prevalence of morbidity and utilization of health care services, which in turn could be used to identify those areas where needs are greatest. The implications of the results for health planning and resource allocation (based on population health needs and underlying social conditions) at the local level are discussed.
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
1995-06-01
The basic relationships between stress and strain under cyclic conditions of loading are not at present well understood. It would seem that information of this type is vital for a fundamental approach to understand the fatigue behavior of dynamically loaded structures. In this paper, experimental and computational methods are utilized to study the fatigue behavior of a thin aluminum cantilever plate subjected to dynamic loading. The studies are performed by combining optomechanical and finite element methods. The cantilever plate is loaded periodically by excitation set at a fixed amplitude and at a specific resonance frequency of the plate. By continuously applying this type of loading and using holographic interferometry, the behavior of the plate during a specific period of time is investigated. Quantitative information is obtained from laser vibrometry data which are utilized by a finite element program to calculate strains and stresses assuming a homogeneous and isotropic material and constant strain elements. It is shown that the use of experimental and computational hybrid methodologies allows identification of different zones of the plate that are fatigue critical. This optomechanical approach proves to be a viable tool for understanding of fatigue behavior of mechanical components and for performing optimization of structures subjected to fatigue conditions.
Siminoff, L A; Sandberg, D E
2015-05-01
Specific complaints and grievances from adult patients with disorders of sex development (DSD) and their advocates center on the lack of information, or the misinformation, they were given about their condition, and on feeling stigmatized and shamed by the secrecy surrounding their condition and its management. Many also attribute poor sexual function to damaging genital surgery and/or repeated, insensitive genital examinations. These reports suggest the need to reconsider the decision-making process for the treatment of children born with DSD. This paper proposes that shared decision making, an important concept in adult health care, be operationalized for the major decisions commonly encountered in DSD care and facilitated through the utilization of decision aids and support tools. This approach may help patients and their families make informed decisions that are better aligned with their personal values and goals. It may also lead to greater confidence in decision making, with greater satisfaction and less regret. A brief review of past and current approaches to DSD decision making is provided, along with a review of shared decision making and of decision aids and support tools. A case study explores the need for and potential utility of this suggested new approach. © Georg Thieme Verlag KG Stuttgart · New York.
Elzoghby, Mostafa; Li, Fu; Arafa, Ibrahim. I.; Arif, Usman
2017-01-01
Information fusion from multiple sensors ensures the accuracy and robustness of a navigation system, especially in the absence of global positioning system (GPS) data, which become degraded in many cases. A way to deal with multi-mode estimation for a small fixed-wing unmanned aerial vehicle (UAV) localization framework is proposed, which relies on a Luenberger observer-based linear matrix inequality (LMI) approach. The proposed estimation technique relies on the interaction between multiple measurement modes and a continuous observer. State estimation is performed while switching between multiple active sensors to exploit the available information as much as possible, especially in GPS-denied environments. A Luenberger observer-based projection is implemented as the continuous observer to optimize estimation performance. The observer gain may be chosen by solving a Lyapunov equation by means of an LMI algorithm. Convergence is achieved by utilizing the LMI, based on Lyapunov stability, which keeps the dynamic estimation error bounded through the selection of the observer gain matrix (L). Simulation results are presented for a small fixed-wing UAV localization problem and compared with a single-mode Extended Kalman Filter (EKF), demonstrating the viability of the proposed strategy. PMID:28420214
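A minimal discrete-time Luenberger observer for a toy constant-velocity model, showing the predict-and-correct structure x_hat <- A x_hat + B u + L (y - C x_hat) referred to above. In the paper the gain L is obtained via an LMI; here it is hand-picked purely for illustration, as are the model matrices and noise level.

```python
import numpy as np

dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity state transition
B = np.array([[0.0], [dt]])             # acceleration input
C = np.array([[1.0, 0.0]])              # only position is measured
L = np.array([[0.5], [1.0]])            # observer gain (hand-picked here; via an LMI in the paper)

def observer_step(x_hat, u, y):
    """One Luenberger update: predict with the model, correct with the measurement residual."""
    return A @ x_hat + B @ u + L @ (y - C @ x_hat)

rng = np.random.default_rng(0)
x = np.array([[0.0], [1.0]])            # true state: position, velocity
x_hat = np.zeros((2, 1))                # observer starts with no knowledge
for k in range(50):
    u = np.array([[0.1]])               # constant commanded acceleration
    x = A @ x + B @ u
    y = C @ x + rng.normal(0.0, 0.05)   # noisy position measurement (e.g., GPS or vision)
    x_hat = observer_step(x_hat, u, y)

print("true state:", x.ravel(), " estimate:", x_hat.ravel())
```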
Anonymization of Longitudinal Electronic Medical Records
Tamersoy, Acar; Loukides, Grigorios; Nergiz, Mehmet Ercan; Saygin, Yucel; Malin, Bradley
2013-01-01
Electronic medical record (EMR) systems have enabled healthcare providers to collect detailed patient information from the primary care domain. At the same time, longitudinal data from EMRs are increasingly combined with biorepositories to generate personalized clinical decision support protocols. Emerging policies encourage investigators to disseminate such data in a deidentified form for reuse and collaboration, but organizations are hesitant to do so because they fear such actions will jeopardize patient privacy. In particular, there are concerns that residual demographic and clinical features could be exploited for reidentification purposes. Various approaches have been developed to anonymize clinical data, but they neglect temporal information and are, thus, insufficient for emerging biomedical research paradigms. This paper proposes a novel approach to sharing patient-specific longitudinal data that offers robust privacy guarantees while preserving data utility for many biomedical investigations. Our approach aggregates temporal and diagnostic information using heuristics inspired by sequence alignment and clustering methods. We demonstrate that the proposed approach can generate anonymized data that permit effective biomedical analysis using several patient cohorts derived from the EMR system of the Vanderbilt University Medical Center. PMID:22287248
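A deliberately simplified generalize-and-suppress sketch in the spirit of aggregating longitudinal diagnosis data for privacy: codes are coarsened to 3-character prefixes and any generalized sequence shared by fewer than k patients is suppressed. This is not the paper's alignment/clustering algorithm, and the records below are invented.

```python
from collections import Counter

def generalize(sequence):
    """Coarsen each diagnosis code to its 3-character prefix (a crude generalization step)."""
    return tuple(code[:3] for code in sequence)

def anonymize(records, k=2):
    """Keep a generalized longitudinal sequence only if at least k patients share it;
    otherwise suppress it (replace with None)."""
    generalized = {pid: generalize(seq) for pid, seq in records.items()}
    counts = Counter(generalized.values())
    return {pid: (seq if counts[seq] >= k else None) for pid, seq in generalized.items()}

# Hypothetical longitudinal diagnosis histories (ICD-style codes, ordered by visit).
records = {
    "p1": ["E11.9", "I10", "E11.65"],
    "p2": ["E11.2", "I10", "E11.40"],
    "p3": ["C50.9", "Z51.11"],
}
print(anonymize(records, k=2))
```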
Associative Memory In A Phase Conjugate Resonator Cavity Utilizing A Hologram
NASA Astrophysics Data System (ADS)
Owechko, Y.; Marom, E.; Soffer, B. H.; Dunning, G.
1987-01-01
The principle of information retrieval by association has been suggested as a basis for parallel computing and as the process by which human memory functions [1]. Various associative processors have been proposed that use electronic or optical means. Optical schemes [2-7], in particular those based on holographic principles [3,6,7], are well suited to associative processing because of their high parallelism and information throughput. Previous workers [8] demonstrated that holographically stored images can be recalled by using relatively complicated reference images but did not utilize nonlinear feedback to reduce the large cross talk that results when multiple objects are stored and a partial or distorted input is used for retrieval. These earlier approaches were limited in their ability to reconstruct the output object faithfully from a partial input.
Utilising social media contents for flood inundation mapping
NASA Astrophysics Data System (ADS)
Schröter, Kai; Dransch, Doris; Fohringer, Joachim; Kreibich, Heidi
2016-04-01
Data about the hazard and its consequences are scarce and not readily available during and shortly after a disaster. An information source which should be explored in a more efficient way is eyewitness accounts via social media. This research presents a methodology that leverages social media content to support rapid inundation mapping, including inundation extent and water depth in the case of floods. It uses quantitative data estimated from photos extracted from social media posts, integrated with established data. Due to the rapid availability of these posts compared to traditional data sources such as remote sensing data, areas affected by a flood, for example, can be determined quickly. Key challenges are to filter the large number of posts to a manageable amount of potentially useful inundation-related information, and to interpret and integrate the posts into mapping procedures in a timely manner. We present a methodology and a tool ("PostDistiller") to filter geo-located posts from social media services which include links to photos, and to further explore this spatially distributed, contextualized in situ information for inundation mapping. The June 2013 flood in Dresden is used as an application case study in which we evaluate the utilization of this approach and compare the resulting spatial flood patterns and inundation depths to 'traditional' data sources and mapping approaches like water level observations and remote sensing flood masks. The outcomes of the application case are encouraging. Strengths of the proposed procedure are that information for the estimation of inundation depth is rapidly available, particularly in urban areas where it is of high interest and of great value because alternative information sources like remote sensing data analysis do not perform very well. The uncertainty of derived inundation depth data and the uncontrollable availability of the information sources are major threats to the utility of the approach.
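A toy filter showing the first stage of such a pipeline: keeping only geo-located posts that fall inside the area of interest, mention flood-related terms, and link to a photo, prior to any manual depth estimation. The posts, keyword list, and bounding box are invented for illustration and are not part of the PostDistiller tool.

```python
FLOOD_TERMS = {"flood", "hochwasser", "inundation", "überschwemmung"}

def in_bbox(lat, lon, bbox):
    """bbox = (min_lat, min_lon, max_lat, max_lon)."""
    return bbox[0] <= lat <= bbox[2] and bbox[1] <= lon <= bbox[3]

def filter_posts(posts, bbox):
    """Keep geo-located posts inside the area of interest that mention flood terms
    and link to a photo (candidates for manual water-depth estimation)."""
    kept = []
    for post in posts:
        words = set(post["text"].lower().split())
        if post.get("photo_url") and words & FLOOD_TERMS and in_bbox(post["lat"], post["lon"], bbox):
            kept.append(post)
    return kept

dresden_bbox = (50.98, 13.58, 51.13, 13.88)   # rough bounding box, for illustration only
posts = [
    {"text": "Hochwasser at the Elbe, water up to the window", "lat": 51.05, "lon": 13.74, "photo_url": "http://example.org/1.jpg"},
    {"text": "sunny day in the park", "lat": 51.06, "lon": 13.75, "photo_url": None},
]
print(len(filter_posts(posts, dresden_bbox)), "candidate post(s)")
```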
Visualizing biological reaction intermediates with DNA curtains
NASA Astrophysics Data System (ADS)
Zhao, Yiling; Jiang, Yanzhou; Qi, Zhi
2017-04-01
Single-molecule approaches have tremendous potential for analyzing dynamic biological reactions with heterogeneity that cannot be effectively accessed via traditional ensemble-level biochemical approaches. The approach of deoxyribonucleic acid (DNA) curtains developed by Dr Eric Greene and his research team at Columbia University is a high-throughput single-molecule technique that utilizes fluorescence imaging to visualize protein-DNA interactions directly and allows the acquisition of statistically relevant information from hundreds or even thousands of individual reactions. This review aims to summarize the past, present, and future of DNA curtains, with an emphasis on its applications to solving important biological questions.
Roadmap on quantum optical systems
NASA Astrophysics Data System (ADS)
Dumke, Rainer; Lu, Zehuang; Close, John; Robins, Nick; Weis, Antoine; Mukherjee, Manas; Birkl, Gerhard; Hufnagel, Christoph; Amico, Luigi; Boshier, Malcolm G.; Dieckmann, Kai; Li, Wenhui; Killian, Thomas C.
2016-09-01
This roadmap bundles fast-developing topics in experimental optical quantum sciences, addressing current challenges as well as potential advances in future research. We have focused on three main areas: quantum assisted high precision measurements, quantum information/simulation, and quantum gases. Quantum assisted high precision measurements are discussed in the first three sections, which review optical clocks, atom interferometry, and optical magnetometry. These fields are already successfully utilized in various applied areas. We will discuss approaches to extend this impact even further. In the quantum information/simulation section, we start with the traditionally successful systems based on neutral atoms and ions. In addition to the marvelous demonstrations of systems suitable for quantum information, unsolved challenges remain and will be discussed. We will also review, as an alternative approach, the utilization of hybrid quantum systems based on superconducting quantum devices and ultracold atoms. Novel developments in atomtronics promise unique access to exploring solid-state systems with ultracold gases and are investigated in depth. The sections discussing the continuously fast-developing quantum gases include a review on dipolar heteronuclear diatomic gases, Rydberg gases, and ultracold plasma. Overall, we have assembled a roadmap of selected areas undergoing rapid progress in quantum optics, highlighting current advances and future challenges. These exciting developments and vast advances will shape the field of quantum optics in the future.
Laukkanen, Sanna; Kangas, Annika; Kangas, Jyrki
2002-02-01
Voting theory has a lot in common with utility theory, and especially with group decision-making. An expected-utility-maximising strategy exists in voting situations, as well as in decision-making situations. Therefore, it is natural to utilise the achievements of voting theory in group decision-making as well. Most voting systems are based on a single criterion or on holistic preference information on decision alternatives. However, a voting scheme called multicriteria approval was specifically developed for decision-making situations with multiple criteria. This study considers voting theory from the group decision support point of view and compares it with some other methods applied for similar purposes in natural resource management. A case study is presented in which the approval voting approach is introduced to natural resources planning and tested in a forestry group decision-making process. Applying the multicriteria approval method was found to be a promising approach for handling some challenges typical of forestry group decision support. These challenges include (i) utilising ordinal information in the evaluation of decision alternatives, (ii) being readily understandable to, and treating equally, all stakeholders with different levels of knowledge of the subject considered, (iii) fast and cheap acquisition of preference information from several stakeholders, and (iv) dealing with multiple criteria.
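A small sketch of a multicriteria-approval-style tally, in which each criterion "approves" the alternatives that score at least the criterion's mean and approvals are summed (optionally weighted). The alternatives and scores are invented, and the actual method further classifies alternatives into ordered acceptability classes.

```python
def multicriteria_approval(performance, weights=None):
    """performance[alt][crit] = score of an alternative on a criterion (higher is better).
    A criterion approves an alternative whose score is at least the mean score of all
    alternatives on that criterion; approvals are summed (optionally weighted)."""
    alts = list(performance)
    crits = list(next(iter(performance.values())))
    weights = weights or {c: 1.0 for c in crits}
    means = {c: sum(performance[a][c] for a in alts) / len(alts) for c in crits}
    return {a: sum(weights[c] for c in crits if performance[a][c] >= means[c]) for a in alts}

# Hypothetical forest-plan alternatives scored on three criteria.
plans = {
    "plan A": {"timber revenue": 7, "biodiversity": 3, "recreation": 5},
    "plan B": {"timber revenue": 5, "biodiversity": 6, "recreation": 6},
    "plan C": {"timber revenue": 4, "biodiversity": 8, "recreation": 4},
}
print(sorted(multicriteria_approval(plans).items(), key=lambda kv: -kv[1]))
```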
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adam, J. C.; Stephens, J. C.; Chung, Serena
As managers of agricultural and natural resources are confronted with uncertainties in global change impacts, the complexities associated with the interconnected cycling of nitrogen, carbon, and water present daunting management challenges. Existing models provide detailed information on specific sub-systems (land, air, water, economics, etc). An increasing awareness of the unintended consequences of management decisions resulting from interconnectedness of these sub-systems, however, necessitates coupled regional earth system models (EaSMs). Decision makers' needs and priorities can be integrated into the model design and development processes to enhance decision-making relevance and "usability" of EaSMs. BioEarth is a current research initiative with a focus on the U.S. Pacific Northwest region that explores the coupling of multiple stand-alone EaSMs to generate usable information for resource decision-making. Direct engagement between model developers and non-academic stakeholders involved in resource and environmental management decisions throughout the model development process is a critical component of this effort. BioEarth utilizes a "bottom-up" approach, upscaling a catchment-scale model to basin and regional scales, as opposed to the "top-down" approach of downscaling global models utilized by most other EaSM efforts. This paper describes the BioEarth initiative and highlights opportunities and challenges associated with coupling multiple stand-alone models to generate usable information for agricultural and natural resource decision-making.
NASA Astrophysics Data System (ADS)
Lim, Meng-Hui; Teoh, Andrew Beng Jin
2011-12-01
Biometric discretization derives a binary string for each user based on an ordered set of biometric features. This representative string ought to be discriminative, informative, and privacy protective when it is employed as a cryptographic key in various security applications upon error correction. However, it is commonly believed that satisfying the first and the second criteria simultaneously is not feasible, and a tradeoff between them is always definite. In this article, we propose an effective fixed bit allocation-based discretization approach which involves discriminative feature extraction, discriminative feature selection, unsupervised quantization (quantization that does not utilize class information), and linearly separable subcode (LSSC)-based encoding to fulfill all the ideal properties of a binary representation extracted for cryptographic applications. In addition, we examine a number of discriminative feature-selection measures for discretization and identify the proper way of setting an important feature-selection parameter. Encouraging experimental results vindicate the feasibility of our approach.
An approach for utilizing clinical statements in HL7 RIM to evaluate eligibility criteria.
Bache, Richard; Daniel, Christel; James, Julie; Hussain, Sajjad; McGilchrist, Mark; Delaney, Brendan; Taweel, Adel
2014-01-01
The HL7 RIM (Reference Information Model) is a commonly used standard for the exchange of clinical data and can be employed for integrating the patient care and clinical research domains. Yet it is not sufficiently well specified to ensure a canonical representation of structured clinical data when used for the automated evaluation of eligibility criteria from a clinical trial protocol. We present an approach to further constrain the RIM to create a common information model to hold clinical data. In order to demonstrate our approach, we identified 132 distinct data elements from 10 rich clinical trials. We then defined a taxonomy to (i) identify the types of data elements that would need to be stored and (ii) define the types of predicate that would be used to evaluate them. This informed the definition of a pattern used to represent the data, which was shown to be sufficient for storing and evaluating the clinical statements required by the trials.
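A toy illustration of evaluating one eligibility criterion against constrained clinical statements reduced to (code, value, unit, time) tuples. This is a simplification for illustration, not the RIM-based pattern defined in the paper; the observation code and threshold used here are assumptions.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ClinicalStatement:
    code: str                 # assumed observation code (e.g., a LOINC or local code)
    value: float
    unit: str
    effective: date
    negated: bool = False

def eligible_hba1c(statements, code="4548-4", threshold=7.0, since=date(2013, 1, 1)):
    """Example criterion: most recent HbA1c (code assumed here) after a cutoff date exceeds a threshold."""
    relevant = [s for s in statements
                if s.code == code and not s.negated and s.effective >= since]
    if not relevant:
        return None               # criterion cannot be evaluated from the available data
    latest = max(relevant, key=lambda s: s.effective)
    return latest.value > threshold

patient = [
    ClinicalStatement("4548-4", 6.8, "%", date(2013, 3, 1)),
    ClinicalStatement("4548-4", 7.4, "%", date(2013, 9, 15)),
]
print(eligible_hba1c(patient))    # True: latest value 7.4% exceeds the assumed 7.0% threshold
```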
The NASA Program Management Tool: A New Vision in Business Intelligence
NASA Technical Reports Server (NTRS)
Maluf, David A.; Swanson, Keith; Putz, Peter; Bell, David G.; Gawdiak, Yuri
2006-01-01
This paper describes a novel approach to business intelligence and program management for large technology enterprises like the U.S. National Aeronautics and Space Administration (NASA). Two key distinctions of the approach are that 1) standard business documents are the user interface, and 2) a "schema-less" XML database enables flexible integration of technology information for use by both humans and machines in a highly dynamic environment. The implementation utilizes patent-pending NASA software called the NASA Program Management Tool (PMT) and its underlying "schema-less" XML database called Netmark. Initial benefits of PMT include elimination of discrepancies between business documents that use the same information and "paperwork reduction" for program and project management in the form of reducing the effort required to understand standard reporting requirements and to comply with those reporting requirements. We project that the underlying approach to business intelligence will enable significant benefits in the timeliness, integrity and depth of business information available to decision makers on all organizational levels.
Individual determinants of research utilization by nurses: a systematic review update.
Squires, Janet E; Estabrooks, Carole A; Gustavsson, Petter; Wallin, Lars
2011-01-05
Interventions that have a better than random chance of increasing nurses' use of research are important to the delivery of quality patient care. However, few reports exist of successful research utilization in nursing interventions. Systematic identification and evaluation of individual characteristics associated with and predicting research utilization may inform the development of research utilization interventions. To update the evidence published in a previous systematic review on individual characteristics influencing research utilization by nurses. As part of a larger systematic review on research utilization instruments, 12 online bibliographic databases were searched. Hand searching of specialized journals and an ancestry search was also conducted. Randomized controlled trials, clinical trials, and observational study designs examining the association between individual characteristics and nurses' use of research were eligible for inclusion. Studies were limited to those published in the English, Danish, Swedish, and Norwegian languages. A vote counting approach to data synthesis was taken. A total of 42,770 titles were identified, of which 501 were retrieved. Of these 501 articles, 45 satisfied our inclusion criteria. Articles assessed research utilization in general (n = 39) or kinds of research utilization (n = 6) using self-report survey measures. Individual nurse characteristics were classified according to six categories: beliefs and attitudes, involvement in research activities, information seeking, education, professional characteristics, and socio-demographic/socio-economic characteristics. A seventh category, critical thinking, emerged in studies examining kinds of research utilization. Positive relationships, at statistically significant levels, for general research utilization were found in four categories: beliefs and attitudes, information seeking, education, and professional characteristics. The only characteristic assessed in a sufficient number of studies and with consistent findings for the kinds of research utilization was attitude towards research; this characteristic had a positive association with instrumental and overall research utilization. This review reinforced conclusions in the previous review with respect to positive relationships between general research utilization and: beliefs and attitudes, and current role. Furthermore, attending conferences/in-services, having a graduate degree in nursing, working in a specialty area, and job satisfaction were also identified as individual characteristics important to research utilization. While these findings hold promise as potential targets of future research utilization interventions, there were methodological problems inherent in many of the studies that necessitate their findings be replicated in further research using more robust study designs and multivariate assessment methods.
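A minimal illustration of the vote-counting synthesis used in such reviews: for each characteristic, tally how many studies reported a significant positive, significant negative, or non-significant association with research utilization. The tallies below are invented and are not the review's results.

```python
from collections import Counter, defaultdict

def vote_count(findings):
    """findings: iterable of (characteristic, direction) pairs, where direction is
    'positive', 'negative', or 'none' (no statistically significant association)."""
    table = defaultdict(Counter)
    for characteristic, direction in findings:
        table[characteristic][direction] += 1
    return table

# Hypothetical study-level findings (one entry per study assessing a characteristic).
findings = [
    ("attitude towards research", "positive"),
    ("attitude towards research", "positive"),
    ("attitude towards research", "none"),
    ("years of experience", "none"),
    ("years of experience", "negative"),
]
for characteristic, counts in vote_count(findings).items():
    print(characteristic, dict(counts))
```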
Individual determinants of research utilization by nurses: a systematic review update
2011-01-01
Background Interventions that have a better than random chance of increasing nurses' use of research are important to the delivery of quality patient care. However, few reports exist of successful research utilization in nursing interventions. Systematic identification and evaluation of individual characteristics associated with and predicting research utilization may inform the development of research utilization interventions. Objective To update the evidence published in a previous systematic review on individual characteristics influencing research utilization by nurses. Methods As part of a larger systematic review on research utilization instruments, 12 online bibliographic databases were searched. Hand searching of specialized journals and an ancestry search was also conducted. Randomized controlled trials, clinical trials, and observational study designs examining the association between individual characteristics and nurses' use of research were eligible for inclusion. Studies were limited to those published in the English, Danish, Swedish, and Norwegian languages. A vote counting approach to data synthesis was taken. Results A total of 42,770 titles were identified, of which 501 were retrieved. Of these 501 articles, 45 satisfied our inclusion criteria. Articles assessed research utilization in general (n = 39) or kinds of research utilization (n = 6) using self-report survey measures. Individual nurse characteristics were classified according to six categories: beliefs and attitudes, involvement in research activities, information seeking, education, professional characteristics, and socio-demographic/socio-economic characteristics. A seventh category, critical thinking, emerged in studies examining kinds of research utilization. Positive relationships, at statistically significant levels, for general research utilization were found in four categories: beliefs and attitudes, information seeking, education, and professional characteristics. The only characteristic assessed in a sufficient number of studies and with consistent findings for the kinds of research utilization was attitude towards research; this characteristic had a positive association with instrumental and overall research utilization. Conclusions This review reinforced conclusions in the previous review with respect to positive relationships between general research utilization and: beliefs and attitudes, and current role. Furthermore, attending conferences/in-services, having a graduate degree in nursing, working in a specialty area, and job satisfaction were also identified as individual characteristics important to research utilization. While these findings hold promise as potential targets of future research utilization interventions, there were methodological problems inherent in many of the studies that necessitate their findings be replicated in further research using more robust study designs and multivariate assessment methods. PMID:21208425
An Approach for harmonizing European Water Portals
NASA Astrophysics Data System (ADS)
Pesquer, Lluís; Stasch, Christoph; Masó, Joan; Jirka, Simon; Domingo, Xavier; Guitart, Francesc; Turner, Thomas; Hinderk Jürrens, Eike
2017-04-01
A number of European funded research projects are developing novel solutions for water monitoring, modeling and management. To generate innovations in the water sector, third parties from industry and the public sector need to take up the solutions and bring them into the market. A variety of portals exists to support this move into the market. Examples on the European level are the EIP Water Online Marketplace(1), the WaterInnEU Marketplace(2), the WISE RTD Water knowledge portal(3), the WIDEST ICT for Water Observatory(4) or the SWITCH-ON Virtual Product Market and Virtual Water-Science Laboratory(5). Further innovation portals and initiatives exist on the national or regional level, for example, the Denmark knows water platform(6) or the Dutch water alliance(7). However, the different portals often cover the same projects, the same products and the same services. Since they are technically separated and have their own data models and databases, people need to duplicate information and maintain it at several endpoints. This requires additional efforts and hinders the interoperable exchange between these portals and tools using the underlying data. In this work, we provide an overview of the existing portals and present an approach for harmonizing and integrating common information that is provided across different portals. The approach aims to integrate the common information in a common database utilizing existing vocabularies, where possible. An Application Programming Interface allows accessing the information in a machine-readable way and utilizing it in other applications beyond description and discovery purposes. (1) http://www.eip-water.eu/my-market-place (2) https://marketplace.waterinneu.org (3) http://www.wise-rtd.info/ (4) http://iwo.widest.eu (5) http://www.switch-on-vwsl.eu/ (6) http://www.rethinkwater.dk/ (7) http://wateralliance.nl/
Learning to classify wakes from local sensory information
NASA Astrophysics Data System (ADS)
Alsalman, Mohamad; Colvert, Brendan; Kanso, Eva; Kanso Team
2017-11-01
Aquatic organisms exhibit remarkable abilities to sense local flow signals contained in their fluid environment and to surmise the origins of these flows. For example, fish can discern the information contained in various flow structures and utilize this information for obstacle avoidance and prey tracking. Flow structures created by flapping and swimming bodies are well characterized in the fluid dynamics literature; however, such characterization relies on classical methods that use an external observer to reconstruct global flow fields. The reconstructed flows, or wakes, are then classified according to the unsteady vortex patterns. Here, we propose a new approach for wake identification: we classify the wakes resulting from a flapping airfoil by applying machine learning algorithms to local flow information. In particular, we simulate the wakes of an oscillating airfoil in an incoming flow, extract the downstream vorticity information, and train a classifier to learn the different flow structures and classify new ones. This data-driven approach provides a promising framework for underwater navigation and detection in application to autonomous bio-inspired vehicles.
Kharroubi, Samer A
2017-10-06
Valuations of health state descriptors such as EQ-5D or SF-6D have been conducted in different countries. There is scope to use the results from one country as informative priors to aid the analysis of a study in another, enabling better estimation in the new country than analyzing its data separately. Data from 2 EQ-5D valuation studies were analyzed using the time trade-off technique, where values for 42 health states were devised from representative samples of the UK and US populations. A Bayesian non-parametric approach has been applied to predict the health utilities of the US population, where the UK results were used as informative priors in the model to improve their estimation. The findings showed that employing additional information from the UK data helped produce US utility estimates much more precisely than would have been possible using the US study data alone. It is very plausible that this method would prove useful in countries where conducting large valuation studies is not feasible.
Telehealth and eHealth in nurse practitioner training: current perspectives
Rutledge, Carolyn M; Kott, Karen; Schweickert, Patty A; Poston, Rebecca; Fowler, Christianne; Haney, Tina S
2017-01-01
Telehealth is becoming a vital process for providing access to cost-effective quality care to patients at a distance. As such, it is important for nurse practitioners, often the primary providers for rural and disadvantaged populations, to develop the knowledge, skills, and attitudes needed to utilize telehealth technologies in practice. In reviewing the literature, very little information was found on programs that addressed nurse practitioner training in telehealth. This article provides an overview of both the topics and the techniques that have been utilized for training nurse practitioners and nurse practitioner students in the delivery of care utilizing telehealth. Specifically, this article focuses on topics including 1) defining telehealth, 2) telehealth etiquette, 3) interprofessional collaboration, 4) regulations, 5) reimbursement, 6) security/Health Insurance Portability and Accountability Act (HIPAA), 7) ethical practice in telehealth, and 8) satisfaction of patients and providers. A multimodal approach based on a review of the literature is presented for providing the training: 1) didactics, 2) simulations including standardized patient encounters, 3) practice immersions, and 4) telehealth projects. Studies found that training using the multimodal approach allowed the students to develop comfort, knowledge, and skills needed to embrace the utilization of telehealth in health care. PMID:28721113
NASA Astrophysics Data System (ADS)
Warchoł, Piotr
2018-06-01
The public transportation system of Cuernavaca, Mexico, exhibits random matrix theory statistics. In particular, the fluctuation of times between the arrival of buses on a given bus stop, follows the Wigner surmise for the Gaussian unitary ensemble. To model this, we propose an agent-based approach in which each bus driver tries to optimize his arrival time to the next stop with respect to an estimated arrival time of his predecessor. We choose a particular form of the associated utility function and recover the appropriate distribution in numerical experiments for a certain value of the only parameter of the model. We then investigate whether this value of the parameter is otherwise distinguished within an information theoretic approach and give numerical evidence that indeed it is associated with a minimum of averaged pairwise mutual information.
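As a rough illustration of the agent-based idea only (the update rule, the parameter `alpha`, and all numbers below are assumptions, not the paper's actual utility function), one can simulate drivers who nudge their arrival at each stop toward a target headway behind their predecessor and compare the resulting normalized headway distribution with the GUE Wigner surmise:

```python
import numpy as np

# Schematic sketch: N buses circulate over S stops. Each driver adjusts his
# arrival time toward a target headway behind his predecessor; `alpha` plays
# the role of the model's single parameter. This is an illustrative stand-in,
# not the authors' exact utility-based update.
rng = np.random.default_rng(0)
N, S, alpha, target = 20, 5000, 0.5, 1.0

arrivals = np.cumsum(np.full(N, target)) + rng.normal(0, 0.1, N)
headways = []
for step in range(S):
    travel = rng.exponential(1.0, N)       # noisy travel time to the next stop
    new = arrivals + travel
    gap = new - np.roll(new, 1)            # time behind the predecessor
    gap[0] += N * target                   # unwrap the circular route for the lead bus
    new += alpha * (target - gap)          # too close -> slow down, too far -> speed up
    arrivals = np.sort(new)
    if step > 1000:                        # discard the transient
        h = np.diff(arrivals)
        headways.extend(h / h.mean())      # normalized headways

headways = np.asarray(headways)
s_grid = np.linspace(0, 3, 61)
# GUE Wigner surmise: p(s) = (32/pi^2) s^2 exp(-4 s^2 / pi)
wigner_gue = (32 / np.pi**2) * s_grid**2 * np.exp(-4 * s_grid**2 / np.pi)
hist, _ = np.histogram(headways, bins=s_grid, density=True)
print(hist.round(2))
print(wigner_gue[:-1].round(2))
```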
Integrating predictive information into an agro-economic model to guide agricultural planning
NASA Astrophysics Data System (ADS)
Block, Paul; Zhang, Ying; You, Liangzhi
2017-04-01
Seasonal climate forecasts can inform long-range planning, including water resources utilization and allocation; however, quantifying the economic value of this information is often challenging. For rain-fed farmers, skillful season-ahead predictions may lead to superior planning, as compared to business-as-usual strategies, resulting in additional benefits or reduced losses. In this study, regional-level probabilistic precipitation forecasts of the major rainy season in Ethiopia are fed into an agro-economic model, adapted from the International Food Policy Research Institute, to evaluate economic outcomes (GDP, poverty rates, etc.) as compared with a no-forecast approach. Based on forecasted conditions, farmers can select various actions: adjusting crop area and crop type, purchasing drought-resistant seed, or applying additional fertilizer. Preliminary results favor the forecast-based approach, particularly through crop area reallocation.
Particle tracking and extended object imaging by interferometric super resolution microscopy
NASA Astrophysics Data System (ADS)
Gdor, Itay; Yoo, Seunghwan; Wang, Xiaolei; Daddysman, Matthew; Wilton, Rosemarie; Ferrier, Nicola; Hereld, Mark; Cossairt, Oliver (Ollie); Katsaggelos, Aggelos; Scherer, Norbert F.
2018-02-01
An interferometric fluorescent microscope and a novel theoretical image reconstruction approach were developed and used to obtain super-resolution images of live biological samples and to enable dynamic real-time tracking. The tracking utilizes the information stored in the interference pattern of both the illuminating incoherent light and the emitted light. By periodically shifting the interferometer phase and applying a phase retrieval algorithm, we obtain information that allows localization with sub-2 nm axial resolution at 5 Hz.
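For readers unfamiliar with phase-shifting interferometry, the textbook four-step retrieval is one way such phase information can be recovered (the authors' actual algorithm may differ; this is only an illustrative formula):

\[
\varphi(x,y) = \arctan\!\left(\frac{I_{3\pi/2}(x,y) - I_{\pi/2}(x,y)}{I_{0}(x,y) - I_{\pi}(x,y)}\right),
\]

where \(I_\delta\) denotes the intensity recorded at phase shift \(\delta\); the axial (or emitter) position then follows from the recovered phase.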
Acting to gain information: Real-time reasoning meets real-time perception
NASA Technical Reports Server (NTRS)
Rosenschein, Stan
1994-01-01
Recent advances in intelligent reactive systems suggest new approaches to the problem of deriving task-relevant information from perceptual systems in real time. The author will describe work in progress aimed at coupling intelligent control mechanisms to real-time perception systems, with special emphasis on frame rate visual measurement systems. A model for integrated reasoning and perception will be discussed, and recent progress in applying these ideas to problems of sensor utilization for efficient recognition and tracking will be described.
Comparative Analysis of Begonia Plastid Genomes and Their Utility for Species-Level Phylogenetics
Harrison, Nicola; Harrison, Richard J.
2016-01-01
Recent, rapid radiations make species-level phylogenetics difficult to resolve. We used a multiplexed, high-throughput sequencing approach to identify informative genomic regions to resolve phylogenetic relationships at low taxonomic levels in Begonia from a survey of sixteen species. A long-range PCR method was used to generate draft plastid genomes to provide a strong phylogenetic backbone, identify fast evolving regions and provide informative molecular markers for species-level phylogenetic studies in Begonia. PMID:27058864
An information based approach to improving overhead imagery collection
NASA Astrophysics Data System (ADS)
Sourwine, Matthew J.; Hintz, Kenneth J.
2011-06-01
Recent growth in commercial imaging satellite development has resulted in a complex and diverse set of systems. To simplify this environment for both customer and vendor, an information based sensor management model was built to integrate tasking and scheduling systems. By establishing a relationship between image quality and information, tasking by NIIRS can be utilized to measure the customer's required information content. Focused on a reduction in uncertainty about a target of interest, the sensor manager finds the best sensors to complete the task given the active suite of imaging sensors' functions. This is done through determination of which satellite will meet customer information and timeliness requirements with low likelihood of interference at the highest rate of return.
Bitzer, Sonja; Albertini, Nicola; Lock, Eric; Ribaux, Olivier; Delémont, Olivier
2015-12-01
In an attempt to grasp the effectiveness of forensic science in the criminal justice process, a number of studies introduced some form of performance indicator. However, most of these indicators suffer from different weaknesses, from the definition of forensic science itself to problems of reliability and validity. We suggest the introduction of the concept of utility of the clue as an internal evaluation indicator of forensic science in the investigation. Utility of the clue is defined as the added value of information gained by the use of traces. This concept could be used to assess the contribution of the trace in the context of the case. By extension, a second application of this concept is suggested. By formalising and considering, a priori, the perceived utility of using traces, we introduce the notion of expected utility that could be used as a decision factor when choosing which traces to use, once they have been collected at the crime scene or from an object in the laboratory. In a case-based approach, utility can be assessed in the light of the available information to evaluate the investigative contribution of forensic science. In the decision-making process, the projection or estimation of the utility of the clue is proposed to be a factor to take into account when triaging the set of traces. Copyright © 2015 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Kamaruddin, Saadi Bin Ahmad; Marponga Tolos, Siti; Hee, Pah Chin; Ghani, Nor Azura Md; Ramli, Norazan Mohamed; Nasir, Noorhamizah Binti Mohamed; Ksm Kader, Babul Salam Bin; Saiful Huq, Mohammad
2017-03-01
Neural networks have long been known for their ability to handle complex nonlinear systems without an analytical model and to learn sophisticated nonlinear associations. Theoretically, the best-known algorithm for training the network is backpropagation (BP), which relies on minimization of the mean square error (MSE). However, this algorithm is not fully efficient in the presence of outliers, which usually exist in dynamic data. This paper presents the modelling of the quadriceps muscle using artificial intelligence techniques, namely combined backpropagation neural network nonlinear autoregressive (BPNN-NAR) and backpropagation neural network nonlinear autoregressive moving average (BPNN-NARMA) models, based on functional electrical stimulation (FES). We adapted a particle swarm optimization (PSO) approach to enhance the performance of the backpropagation algorithm. In this research, a series of experiments using FES was conducted, and the data obtained were used to develop the quadriceps muscle model. Training, testing, and validation sets of 934, 200, and 200 observations, respectively, were used in the development of the muscle model. It was found that both BPNN-NAR and BPNN-NARMA performed well in modelling this type of data. In conclusion, the neural network time series models performed reasonably efficiently for nonlinear modelling, such as the active properties of the quadriceps muscle, with a single output, namely muscle force.
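As a minimal sketch of the NAR idea only, the following builds lagged inputs and fits a small MLP with gradient-based backpropagation; the force series, lag order, and network size are placeholders, and the PSO tuning of BP described in the abstract is omitted:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Minimal NAR sketch: predict muscle force y(t) from its own p past values.
# A synthetic series stands in for the FES-evoked force measurements.
rng = np.random.default_rng(1)
t = np.arange(1334)
y = np.sin(0.05 * t) + 0.1 * rng.normal(size=t.size)   # placeholder force signal

p = 5                                                   # autoregressive order (assumed)
X = np.column_stack([y[i:len(y) - p + i] for i in range(p)])
target = y[p:]

# Split loosely mirroring the 934 training / 200 testing observations in the paper
X_tr, y_tr = X[:934], target[:934]
X_te, y_te = X[934:1134], target[934:1134]

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
model.fit(X_tr, y_tr)
mse = np.mean((model.predict(X_te) - y_te) ** 2)
print(f"test MSE: {mse:.4f}")
```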
Near-Earth object hazardous impact: A Multi-Criteria Decision Making approach.
Sánchez-Lozano, J M; Fernández-Martínez, M
2016-11-16
The impact of a near-Earth object (NEO) may release large amounts of energy and cause serious damage. Several NEO hazard studies conducted over the past few years provide forecasts, impact probabilities and assessment ratings, such as the Torino and Palermo scales. These high-risk NEO assessments involve several criteria, including impact energy, mass, and absolute magnitude. The main objective of this paper is to provide the first Multi-Criteria Decision Making (MCDM) approach to classify hazardous NEOs. Our approach applies a combination of two methods from a widely utilized decision making theory. Specifically, the Analytic Hierarchy Process (AHP) methodology is employed to determine the criteria weights, which influence the decision making, and the Technique for Order Performance by Similarity to Ideal Solution (TOPSIS) is used to obtain a ranking of alternatives (potentially hazardous NEOs). In addition, NEO datasets provided by the NASA Near-Earth Object Program are utilized. This approach allows the classification of NEOs by descending order of their TOPSIS ratio, a single quantity that contains all of the relevant information for each object.
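A compact sketch of the TOPSIS step is shown below; the criteria, weights, and NEO values are hypothetical, and in the paper the weights would come from the AHP pairwise comparisons rather than being set by hand:

```python
import numpy as np

def topsis(decision_matrix, weights, benefit_mask):
    """Rank alternatives (rows) by the TOPSIS closeness ratio.

    decision_matrix : (n_alternatives, n_criteria) raw scores
    weights         : criteria weights (e.g. from AHP), summing to 1
    benefit_mask    : True where larger is better, False where smaller is better
    """
    M = np.asarray(decision_matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    # vector-normalize each criterion, then apply the weights
    V = w * M / np.linalg.norm(M, axis=0)
    # ideal and anti-ideal points depend on whether a criterion is benefit or cost
    ideal = np.where(benefit_mask, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit_mask, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)   # closeness ratio in [0, 1]

# Hypothetical NEO criteria: impact energy (Mt), mass (kg), absolute magnitude H
# (a smaller H suggests a larger object, so H is treated as a "cost" criterion).
neos = [[7500, 2.7e10, 19.7],
        [1200, 6.1e9, 21.5],
        [150, 9.8e8, 24.1]]
ratios = topsis(neos, weights=[0.5, 0.3, 0.2], benefit_mask=np.array([True, True, False]))
print(np.argsort(ratios)[::-1])   # alternatives in descending hazard order
```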
Dynamic systems and inferential information processing in human communication.
Grammer, Karl; Fink, Bernhard; Renninger, LeeAnn
2002-12-01
Research in human communication on an ethological basis is almost obsolete. The reasons for this are manifold and lie partially in methodological problems connected to the observation and description of behavior, as well as the nature of human behavior itself. In this chapter, we present a new, non-intrusive, technical approach to the analysis of human non-verbal behavior, which could help to solve the problem of categorization that plagues the traditional approaches. We utilize evolutionary theory to propose a new theory-driven methodological approach to the 'multi-unit multi-channel modulation' problem of human nonverbal communication. Within this concept, communication is seen as context-dependent (the meaning of a signal is adapted to the situation), as a multi-channel and a multi-unit process (a string of many events interrelated in 'communicative' space and time), and as related to the function it serves. Such an approach can be utilized to successfully bridge the gap between evolutionary psychological research, which focuses on social cognition adaptations, and human ethology, which describes everyday behavior in an objective, systematic way.
Augmenting matrix factorization technique with the combination of tags and genres
NASA Astrophysics Data System (ADS)
Ma, Tinghuai; Suo, Xiafei; Zhou, Jinjuan; Tang, Meili; Guan, Donghai; Tian, Yuan; Al-Dhelaan, Abdullah; Al-Rodhaan, Mznah
2016-11-01
Recommender systems play an important role in our daily life and are becoming popular tools for users to find what they are really interested in. Matrix factorization methods, which are popular recommendation methods, have gained considerable attention in recent years. With the rapid growth of the Internet, lots of information has been created, like social network information, tags and so on. Along with these, a few matrix factorization approaches have been proposed which incorporate the personalized information of users or items. However, except for ratings, most of the matrix factorization models have utilized only one kind of information to understand users' interests. Considering the sparsity of information, in this paper, we try to investigate the combination of different information, like tags and genres, to reveal users' interests accurately. With regard to the generalization of genres, a constraint is added when genres are utilized to find users' similar "soulmates". In addition, an item regularizer based on the latent semantic indexing (LSI) method applied to item tags is also considered. Our experiments are conducted on two real datasets: the Movielens dataset and the Douban dataset. The experimental results demonstrate that the combination of tags and genres is really helpful to reveal users' interests.
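The following is a minimal, hedged sketch of adding side information to matrix factorization: plain SGD on ratings plus an illustrative regularizer that pulls an item's factors toward the centroid of items sharing a tag. The paper's actual terms (genre-constrained "soulmate" neighbours and an LSI-based tag regularizer) are richer, and all hyperparameters and data below are placeholders:

```python
import numpy as np

# Toy rating data and tag assignments (placeholders).
rng = np.random.default_rng(0)
n_users, n_items, k = 50, 30, 8
ratings = [(rng.integers(n_users), rng.integers(n_items), rng.integers(1, 6))
           for _ in range(500)]
item_tags = {i: {f"tag{rng.integers(5)}"} for i in range(n_items)}

P = 0.1 * rng.standard_normal((n_users, k))   # user factors
Q = 0.1 * rng.standard_normal((n_items, k))   # item factors
lr, lam, beta = 0.01, 0.05, 0.05              # assumed hyperparameters

def tag_centroid(i):
    """Centroid of factors of items sharing at least one tag with item i."""
    peers = [j for j in range(n_items) if j != i and item_tags[j] & item_tags[i]]
    return Q[peers].mean(axis=0) if peers else Q[i]

for epoch in range(30):
    for u, i, r in ratings:
        err = r - P[u] @ Q[i]
        P[u] += lr * (err * Q[i] - lam * P[u])
        # extra term nudges the item toward its tag neighbourhood
        Q[i] += lr * (err * P[u] - lam * Q[i] - beta * (Q[i] - tag_centroid(i)))

u, i, r = ratings[0]
print(f"true {r}, predicted {P[u] @ Q[i]:.2f}")
```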
Active Detection for Exposing Intelligent Attacks in Control Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weerakkody, Sean; Ozel, Omur; Griffioen, Paul
In this paper, we consider approaches for detecting integrity attacks carried out by intelligent and resourceful adversaries in control systems. Passive detection techniques are often incorporated to identify malicious behavior. Here, the defender utilizes finely-tuned algorithms to process information and make a binary decision, whether the system is healthy or under attack. We demonstrate that passive detection can be ineffective against adversaries with model knowledge and access to a set of input/output channels. We then propose active detection as a tool to detect attacks. In active detection, the defender leverages degrees of freedom he has in the system to detect the adversary. Specifically, the defender will introduce a physical secret kept hidden from the adversary, which can be utilized to authenticate the dynamics. In this regard, we carefully review two approaches for active detection: physical watermarking at the control input, and a moving target approach for generating system dynamics. We examine practical considerations for implementing these technologies and discuss future research directions.
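A toy sketch of the physical-watermarking idea follows: a secret Gaussian signal is added to the control input of a scalar system, and the defender checks that it is correlated with the (delayed) measurements; when an attack severs that path (here crudely modelled as replayed, watermark-free data), the correlation collapses. The system, noise levels, and detection statistic are illustrative, not the authors' formulation:

```python
import numpy as np

# Scalar system x' = a*x + b*(u + w) + process noise, y = x + sensor noise.
# The defender injects a secret watermark w_k on top of the nominal input (u = 0 here).
rng = np.random.default_rng(42)
a, b, T = 0.9, 1.0, 2000
w = rng.normal(0, 0.5, T)                     # secret watermark sequence

def simulate(replay=False):
    x, y = 0.0, np.empty(T)
    for k in range(T):
        x = a * x + b * (0.0 + w[k]) + rng.normal(0, 0.1)
        y[k] = x + rng.normal(0, 0.1)
    if replay:                                 # crude attack: outputs carry no current watermark
        y = rng.normal(0, y.std(), T)
    return y

def detect(y):
    # sample correlation between the delayed watermark and the measurements
    return np.corrcoef(w[:-1], y[1:])[0, 1]

print("healthy  :", round(detect(simulate(replay=False)), 3))
print("attacked :", round(detect(simulate(replay=True)), 3))
```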
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gastelum, Zoe N.; Whitney, Paul D.; White, Amanda M.
2013-07-15
Pacific Northwest National Laboratory has spent several years researching, developing, and validating large Bayesian network models to support integration of open source data sets for nuclear proliferation research. Our current work focuses on generating a set of interrelated models for multi-source assessment of nuclear programs, as opposed to a single comprehensive model. By using this approach, we can break down the models to cover logical sub-problems that can utilize different expertise and data sources. This approach allows researchers to utilize the models individually or in combination to detect and characterize a nuclear program and identify data gaps. The models operate at various levels of granularity, covering a combination of state-level assessments with more detailed models of site or facility characteristics. This paper will describe the current open source-driven, nuclear nonproliferation models under development, the pros and cons of the analytical approach, and areas for additional research.
Arlikatti, Sudha; Peacock, Walter Gillis; Prater, Carla S; Grover, Himanshu; Sekar, Arul S Gnana
2010-07-01
This paper offers a potential measurement solution for assessing disaster impacts and subsequent recovery at the household level by using a modified domestic assets index (MDAI) approach. Assessment of the utility of the domestic assets index first proposed by Bates, Killian and Peacock (1984) has been confined to earthquake areas in the Americas and southern Europe. This paper modifies and extends the approach to the Indian sub-continent and to coastal surge hazards utilizing data collected from 1,000 households impacted by the Indian Ocean tsunami (2004) in the Nagapattinam district of south-eastern India. The analyses suggest that the MDAI scale is a reliable and valid measure of household living conditions and is useful in assessing disaster impacts and tracking recovery efforts over time. It can facilitate longitudinal studies, encourage cross-cultural, cross-national comparisons of disaster impacts and inform national and international donors of the itemized monetary losses from disasters at the household level.
Toward a Responsibility-Catering Prioritarian Ethical Theory of Risk.
Wikman-Svahn, Per; Lindblom, Lars
2018-03-05
Standard tools used in societal risk management such as probabilistic risk analysis or cost-benefit analysis typically define risks in terms of only probabilities and consequences and assume a utilitarian approach to ethics that aims to maximize expected utility. The philosopher Carl F. Cranor has argued against this view by devising a list of plausible aspects of the acceptability of risks that points towards a non-consequentialist ethical theory of societal risk management. This paper revisits Cranor's list to argue that the alternative ethical theory responsibility-catering prioritarianism can accommodate the aspects identified by Cranor and that the elements in the list can be used to inform the details of how to view risks within this theory. An approach towards operationalizing the theory is proposed based on a prioritarian social welfare function that operates on responsibility-adjusted utilities. A responsibility-catering prioritarian ethical approach towards managing risks is a promising alternative to standard tools such as cost-benefit analysis.
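One illustrative way to operationalize such a function (our notation; the paper's exact functional forms may differ) is

\[
W = \sum_{i=1}^{n} g\!\left(\tilde{u}_i\right), \qquad g(u) = \frac{u^{1-\gamma}}{1-\gamma}, \quad \gamma > 0,
\]

where \(\tilde{u}_i\) is individual \(i\)'s expected utility adjusted for the degree to which \(i\) is responsible for bearing the risk, and the concavity of \(g\) gives extra weight to the worse off.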
NASA Astrophysics Data System (ADS)
Pua, Rizza; Park, Miran; Wi, Sunhee; Cho, Seungryong
2016-12-01
We propose a hybrid metal artifact reduction (MAR) approach for computed tomography (CT) that is computationally more efficient than a fully iterative reconstruction method, but at the same time achieves superior image quality to the interpolation-based in-painting techniques. Our proposed MAR method, an image-based artifact subtraction approach, utilizes an intermediate prior image reconstructed via PDART to recover the background information underlying the high density objects. For comparison, prior images generated by total-variation minimization (TVM) algorithm, as a realization of fully iterative approach, were also utilized as intermediate images. From the simulation and real experimental results, it has been shown that PDART drastically accelerates the reconstruction to an acceptable quality of prior images. Incorporating PDART-reconstructed prior images in the proposed MAR scheme achieved higher quality images than those by a conventional in-painting method. Furthermore, the results were comparable to the fully iterative MAR that uses high-quality TVM prior images.
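The general prior-image idea can be sketched as follows, with scikit-image's Radon transform standing in for the CT geometry; the crude masked-image prior below is only a placeholder for the PDART reconstruction, and the phantom and thresholds are illustrative:

```python
import numpy as np
from skimage.transform import radon, iradon

# Simplified stand-in for prior-image MAR: forward-project a prior image,
# replace the metal-corrupted sinogram bins with the prior's values, reconstruct.
img = np.zeros((128, 128))
img[40:90, 40:90] = 1.0                     # soft-tissue block
img[60:65, 60:65] = 20.0                    # metal insert

theta = np.linspace(0., 180., 180, endpoint=False)
sino = radon(img, theta=theta)

metal_mask = img > 10.0
metal_trace = radon(metal_mask.astype(float), theta=theta) > 0.5

prior = np.where(metal_mask, 1.0, img)      # crude prior: metal replaced by tissue value
sino_prior = radon(prior, theta=theta)

sino_corr = np.where(metal_trace, sino_prior, sino)   # in-paint trace with prior values
recon = iradon(sino_corr, theta=theta)
print(recon.shape)
```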
A methodology for comprehensive strategic planning and program prioritization
NASA Astrophysics Data System (ADS)
Raczynski, Christopher Michael
2008-10-01
The process developed in this work, Strategy Optimization for the Allocation of Resources (SOAR), is a strategic planning methodology based on Integrated Product and Process Development and systems engineering techniques. Utilizing a top-down approach, the process starts with the creation of the organization's vision and its measures of effectiveness. These measures are prioritized based on their application to external-world scenarios which will frame the future. The programs which will be used to accomplish this vision are identified by decomposing the problem. Information is gathered on each program's application, cost, schedule, risk, and other pertinent attributes. The relationships between the levels of the hierarchy are mapped utilizing subject matter experts. These connections are then utilized to determine the overall benefit of the programs to the vision of the organization. Through a Multi-Objective Genetic Algorithm (MOGA), a tradespace of potential program portfolios can be created, among which the decision maker can allocate resources. The information and portfolios are presented to the decision maker through a Decision Support System (DSS) which collects and visualizes all the data in a single location. This methodology was tested utilizing a science and technology planning exercise conducted by the United States Navy. A thorough decomposition was defined and technology programs identified which had the potential to provide benefit to the vision. The prioritization of the top-level capabilities was performed through a rank-ordering scheme, and a previous naval application was used to demonstrate a cumulative voting scheme. Voting was performed utilizing the Nominal Group Technique to capture the relationships between the levels of the hierarchy. Interrelationships between the technologies were identified, a MOGA was utilized to optimize portfolios with respect to these constraints, and the information was placed in a DSS. This formulation allowed the decision makers to assess which portfolio could provide the greatest benefit to the Navy while still fitting within the funding profile.
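As a rough illustration of the tradespace concept (not the SOAR tooling itself), one can sample budget-feasible portfolios and keep the benefit/cost non-dominated ones, which is the kind of frontier a MOGA would evolve; the benefits, costs, and budget below are invented:

```python
import numpy as np

# Illustrative tradespace sketch: random program portfolios under a budget, filtered
# to the non-dominated (benefit up, cost down) frontier.
rng = np.random.default_rng(3)
n_programs = 12
benefit = rng.uniform(1, 10, n_programs)     # benefit toward the vision (e.g. from SME mapping)
cost = rng.uniform(1, 8, n_programs)
budget = 25.0

candidates = []
for _ in range(1000):
    pick = rng.random(n_programs) < 0.5
    c = cost[pick].sum()
    if c <= budget:
        candidates.append((benefit[pick].sum(), c, pick))

def non_dominated(points):
    keep = []
    for b, c, pick in points:
        dominated = any(b2 >= b and c2 <= c and (b2 > b or c2 < c) for b2, c2, _ in points)
        if not dominated:
            keep.append((b, c, pick))
    return keep

frontier = non_dominated(candidates)
print(f"{len(frontier)} non-dominated portfolios out of {len(candidates)}")
```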
A Synthesis of Qualitative Studies of Writing Center Tutoring, 1983-2006
ERIC Educational Resources Information Center
Babcock, Rebecca Day; Manning, Kellye; Rogers, Travis; Goff, Courtney; McCain, Amanda
2012-01-01
This book grew out of the desire and necessity to understand just what went on in writing center tutoring sessions. Utilizing previous research--mostly dissertations that have not been widely read--the authors analyze the available data using a grounded theory approach. With information from over 50 sources, the resulting text is not only a…
ERIC Educational Resources Information Center
Ousley, Chris
2010-01-01
This study sought to provide empirical evidence regarding the use of spatial analysis in enrollment management to predict persistence and graduation. The research utilized data from the 2000 U.S. Census and applicant records from The University of Arizona to study the spatial distributions of enrollments. Based on the initial results, stepwise…
The Use of Art in the Medical Decision-Making Process of Oncology Patients
ERIC Educational Resources Information Center
Czamanski-Cohen, Johanna
2012-01-01
The introduction of written informed consent in the 1970s created expectations of shared decision making between doctors and patients that has led to decisional conflict for some patients. This study utilized a collaborative, intrinsic case study approach to the decision-making process of oncology patients who participated in an open art therapy…
2015-07-31
and make the expected decision outcomes. The scenario is based around a scripted storyboard where an organized crime network is operating in a city to...interdicted by law enforcement to disrupt the network. The scenario storyboard was used to develop a probabilistic vehicle traffic model in order to
ERIC Educational Resources Information Center
Yoshimura, Christina Granato; Campbell, Kimberly Brown
2016-01-01
A university in the United States Mountain West utilized grant resources to track counseling services for students who were currently experiencing or who had historically experienced relationship violence, sexual assault and/or stalking. This report reflects on the first 2 years of this program, including an overview of prevalence and reporting…
ERIC Educational Resources Information Center
Sampson, James P., Jr.; Peterson, Gary W.; Reardon, Robert C.; Lenz, Janet G.
2000-01-01
Responds to Jepsen's (this issue [2000]) commentary on Sampson et al.'s theory-based approach to using readiness assessment to improve career services. Three topics are included: the reliability and utility of using readiness assessment measures; verbal ability and the use of cognitive information-processing theory in practice; and the potential…
ERIC Educational Resources Information Center
Moeletsi, M. E.; Mellaart, E. A. R.; Mpandeli, N. S.; Hamandawana, H.
2013-01-01
Purpose: New innovative ways of communicating agrometeorological information are needed to help farmers, especially subsistence/small-scale farmers, to cope with the high climate variability experienced in most parts of southern Africa. Design/methodology/approach: The article introduces an early warning system for farmers. It utilizes short…
Establishing a Community of Inquiry through Hybrid Courses in Clinical Social Work Education
ERIC Educational Resources Information Center
Ferrera, Maria; Ostrander, Noam; Crabtree-Nelson, Sonya
2013-01-01
Utilizing the conceptual framework of Garrison, Anderson, and Archer for critical inquiry, this paper outlines the importance of the community of inquiry (COI) model and how it may inform online social work education. Integrating the COI model, we discuss how online learning in the classroom with a hybrid approach has been used to facilitate…
Stephanie A. Snyder; Jay H. Whitmore; Ingrid E. Schneider; Dennis R. Becker
2008-01-01
This paper presents a geographic information system (GIS)-based method for recreational trail location for all-terrain vehicles (ATVs) which considers environmental factors, as well as rider preferences for trail attributes. The method utilizes the Least-Cost Path algorithm within a GIS framework to optimize trail location. The trail location algorithm considered trail...
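A minimal sketch of the least-cost path step is shown below: Dijkstra's algorithm over a combined cost raster. The cost surface, its values, and the start/goal cells are hypothetical stand-ins for the environmental and rider-preference layers described in the paper:

```python
import heapq
import numpy as np

def least_cost_path(cost, start, goal):
    """Dijkstra over a cost raster: moving onto a cell incurs that cell's cost."""
    rows, cols = cost.shape
    dist = np.full(cost.shape, np.inf)
    prev = {}
    dist[start] = cost[start]
    pq = [(cost[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        if d > dist[r, c]:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr, nc]
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    # walk back from the goal to recover the trail alignment
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

# Hypothetical combined cost surface: environmental sensitivity + inverse rider preference
rng = np.random.default_rng(7)
cost_raster = rng.uniform(1, 5, (50, 50))
cost_raster[20:30, 10:40] = 50.0             # e.g. a wetland to route around
trail = least_cost_path(cost_raster, (0, 0), (49, 49))
print(len(trail), "cells on the proposed trail")
```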
A Communication Model for Teaching a Course in Mass Media and Society.
ERIC Educational Resources Information Center
Crumley, Wilma; Stricklin, Michael
Many professors of mass media and society courses have relied on a teaching model implying that students are sponges soaking up information. A more appropriate model invites concern with an active audience, transaction, the interpersonal mass media mix, a general systems approach, and process and change--in other words, utilization of current and…
ERIC Educational Resources Information Center
Mothibi, Gloria
2015-01-01
E-learning is rapidly becoming a popular and effective learning approach in academic settings due to the widespread use of web-based systems in learning. E-learning involves the utilization of information and communication technology (ICT) to support and improve teaching and learning. The aim of this study was to estimate the relationship between…
ERIC Educational Resources Information Center
Yue, Anthony R.
2011-01-01
Reflecting on the personal experience of teaching human resource management in the Canadian Arctic, the author explores the utility of an existentialist approach to pedagogy. The author outlines select aspects of existentialism that are pertinent to the teaching and discusses the implications of using reflexive existential thought as guidance in a…
A School-Based Evaluation Model for Accelerating the Education of Students At-Risk.
ERIC Educational Resources Information Center
Fetterman, David M.; Haertel, Edward H.
This paper presents ideas for the development and utilization of a comprehensive evaluation plan for an accelerated school. It contains information about the purposes of a comprehensive evaluation, the evaluation design, and the kinds of data that might be gathered and used. The first section, "An Approach to Evaluation: Multiple Purposes and…
Yeung, Kai; Basu, Anirban; Hansen, Ryan N; Watkins, John B; Sullivan, Sean D
2017-02-01
Value-based benefit design has been suggested as an effective approach to managing the high cost of pharmaceuticals in health insurance markets. Premera Blue Cross, a large regional health plan, implemented a value-based formulary (VBF) for pharmaceuticals in 2010 that explicitly used cost-effectiveness analysis (CEA) to inform medication copayments. The objective of the study was to determine the impact of the VBF. Interrupted time series of employer-sponsored plans from 2006 to 2013. Intervention group: 5235 beneficiaries exposed to the VBF. Comparison group: 11,171 beneficiaries in plans without any changes in pharmacy benefits. The VBF assigned medications with lower value (estimated by CEA) to higher copayment tiers and medications with higher value to lower copayment tiers. The primary outcome was medication expenditures from member, health plan, and member plus health plan perspectives. Secondary outcomes were medication utilization, emergency department visits, hospitalizations, office visits, and nonmedication expenditures. In the intervention group after VBF implementation, member medication expenditures increased by $2 per member per month (PMPM) [95% confidence interval (CI), $1-$3] or 9%, whereas health plan medication expenditures decreased by $10 PMPM (CI, $18-$2) or 16%, resulting in a net decrease of $8 PMPM (CI, $15-$2) or 10%, which translates to a net savings of $1.1 million. Utilization of medications moved into lower copayment tiers increased by 1.95 days' supply (CI, 1.29-2.62) or 17%. Total medication utilization, health services utilization, and nonmedication expenditures did not change. Cost-sharing informed by CEA reduced overall medication expenditures without negatively impacting medication utilization, health services utilization, or nonmedication expenditures.
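For readers unfamiliar with the design, a single-group segmented-regression sketch of an interrupted time series is shown below using statsmodels; the data are synthetic, and the study's actual specification also incorporated the comparison group:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative single-group interrupted time series on monthly PMPM expenditures.
rng = np.random.default_rng(0)
months = np.arange(96)                        # 2006-2013
post = (months >= 48).astype(int)             # VBF implemented in 2010
time_since = np.where(post == 1, months - 48, 0)
pmpm = 60 + 0.2 * months - 8 * post - 0.1 * time_since + rng.normal(0, 2, months.size)

df = pd.DataFrame({"pmpm": pmpm, "time": months, "post": post, "time_since": time_since})
model = smf.ols("pmpm ~ time + post + time_since", data=df).fit()
print(model.params[["post", "time_since"]])   # level change and trend change after the VBF
```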
An Introspective Critique of Past, Present, and Future USGS Decision Support
NASA Astrophysics Data System (ADS)
Neff, B. P.; Pavlick, M.
2017-12-01
In response to increasing scrutiny of publicly funded science, the Water Mission Area of USGS is shifting its approach for informing decisions that affect the country. Historically, USGS has focused on providing sound science on cutting edge, societally relevant issues with the expectation that decision makers will take action on this information. In practice, scientists often do not understand or focus on the needs of decision makers and decision makers often cannot or do not utilize information produced by scientists. The Water Mission Area of USGS has recognized that it can better serve the taxpayer by delivering information more relevant to decision making in a form more conducive to its use. To this end, the Water Mission Area of USGS is seeking greater integration with the decision making process to better inform what information it produces. In addition, recognizing that the transfer of scientific knowledge to decision making is fundamentally a social process, USGS is embracing the use of social science to better inform how it delivers scientific information and facilitates its use. This study utilizes qualitative methods to document the evolution of decision support at USGS and provide a rationale for a shift in direction. Challenges to implementation are identified and collaborative opportunities to improve decision making are discussed.
Malhis, Nawar; Butterfield, Yaron S N; Ester, Martin; Jones, Steven J M
2009-01-01
A plethora of alignment tools have been created that are designed to best fit different types of alignment conditions. While some of these are made for aligning Illumina Sequence Analyzer reads, none of these are fully utilizing its probability (prb) output. In this article, we will introduce a new alignment approach (Slider) that reduces the alignment problem space by utilizing each read base's probabilities given in the prb files. Compared with other aligners, Slider has higher alignment accuracy and efficiency. In addition, given that Slider matches bases with probabilities other than the most probable, it significantly reduces the percentage of base mismatches. The result is that its SNP predictions are more accurate than other SNP prediction approaches used today that start from the most probable sequence, including those using base quality.
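A minimal sketch of scoring alignments with per-base probabilities (rather than a single called sequence) is shown below; it ignores Slider's actual indexing and data structures, and the read, reference, and probability values are toy inputs:

```python
import numpy as np

BASES = {"A": 0, "C": 1, "G": 2, "T": 3}

def prb_score(prb, ref_window):
    """Sum of log base-call probabilities for aligning a read to ref_window.

    prb : (read_length, 4) matrix of per-base probabilities (Illumina prb-style)
    """
    idx = [BASES[b] for b in ref_window]
    return float(np.sum(np.log(prb[np.arange(len(idx)), idx] + 1e-9)))

def best_position(prb, reference):
    L = prb.shape[0]
    scores = [prb_score(prb, reference[i:i + L]) for i in range(len(reference) - L + 1)]
    return int(np.argmax(scores)), scores

# Toy example: a 4-base read whose second base is ambiguous between C and T.
prb = np.array([[0.97, 0.01, 0.01, 0.01],
                [0.02, 0.49, 0.02, 0.47],
                [0.01, 0.01, 0.97, 0.01],
                [0.01, 0.01, 0.01, 0.97]])
pos, scores = best_position(prb, "GGACGTTACGT")
print("best alignment starts at", pos)
```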
Industrial applications study. Volume V. Bibliography of relevant literature. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Harry L.; Hamel, Bernard B.; Karamchetty, Som
1976-12-01
This five-volume report represents an initial Phase O evaluation of waste heat recovery and utilization potential in the manufacturing portion of the industrial sector. The scope of this initial phase was limited to the two-digit SIC level and addressed the feasibility of obtaining in-depth energy information in the industrial sector. Within this phase, a successful methodology and approaches for data gathering and assessment are established. Using these approaches, energy use and waste heat profiles were developed at the 2-digit level; with this data, waste heat utilization technologies were evaluated. The first section of the bibliography lists extensive citations for all industries. The next section is composed of an extensive literature search with abstracts for industrial energy conservation. EPA publications on specific industries and general references conclude the publication. (MCW)
Near-Field Spectroscopy with Nanoparticles Deposited by AFM
NASA Technical Reports Server (NTRS)
Anderson, Mark S.
2008-01-01
An alternative approach to apertureless near-field optical spectroscopy involving an atomic-force microscope (AFM) entails less complexity of equipment than does a prior approach. The alternative approach has been demonstrated to be applicable to apertureless near-field optical spectroscopy of the type using an AFM and surface enhanced Raman scattering (SERS), and is expected to be equally applicable in cases in which infrared or fluorescence spectroscopy is used. Apertureless near-field optical spectroscopy is a means of performing spatially resolved analyses of chemical compositions of surface regions of nanostructured materials. In apertureless near-field spectroscopy, it is common practice to utilize nanostructured probe tips or nanoparticles (usually of gold) having shapes and dimensions chosen to exploit plasmon resonances so as to increase spectroscopic-signal strengths. To implement the particular prior approach to which the present approach is an alternative, it is necessary to integrate a Raman spectrometer with an AFM and to utilize a special SERS-active probe tip. The resulting instrumentation system is complex, and the tasks of designing and constructing the system and using the system to acquire spectro-chemical information from nanometer-scale regions on a surface are correspondingly demanding.
Multimodality approach to classifying hand utilization for the clinical breast examination.
Laufer, Shlomi; Cohen, Elaine R; Maag, Anne-Lise D; Kwan, Calvin; Vanveen, Barry; Pugh, Carla M
2014-01-01
The clinical breast examination (CBE) is performed to detect breast pathology. However, little is known regarding clinical technique and how it relates to diagnostic accuracy. We sought to quantify breast examination search patterns and hand utilization with a new data collection and analysis system. Participants performed the CBE while the sensor mapping and video camera system collected performance data. From this data, algorithms were developed that measured the number of hands used during the exam and active examination time. This system is a feasible and reliable method to collect new information on CBE techniques.
Research evidence utilization in policy development by child welfare administrators.
Jack, Susan; Dobbins, Maureen; Tonmyr, Lil; Dudding, Peter; Brooks, Sandy; Kennedy, Betty
2010-01-01
An exploratory qualitative study was conducted to explore how child welfare administrators use research evidence in decision-making. Content analysis revealed that a cultural shift toward evidence-based practice (EBP) is occurring in Canadian child welfare organizations and multiple types of evidence inform policy decisions. Barriers to using evidence include individual, organizational, and environmental factors. Facilitating factors include the development of internal champions and organizational cultures that value EBP. Integrating research into practice and policy decisions requires a multifaceted approach of creating organizational cultures that support research utilization and supporting senior bureaucrats to use research evidence in policy development.
Towards Principles-Based Approaches to Governance of Health-related Research using Personal Data
Laurie, Graeme; Sethi, Nayha
2013-01-01
Technological advances in the quality, availability and linkage potential of health data for research make the need to develop robust and effective information governance mechanisms more pressing than ever before; they also lead us to question the utility of governance devices used hitherto such as consent and anonymisation. This article assesses and advocates a principles-based approach, contrasting this with traditional rule-based approaches, and proposes a model of principled proportionate governance. It is suggested that the approach not only serves as the basis for good governance in contemporary data linkage but also that it provides a platform to assess legal reforms such as the draft Data Protection Regulation. PMID:24416087
Towards Principles-Based Approaches to Governance of Health-related Research using Personal Data.
Laurie, Graeme; Sethi, Nayha
2013-01-01
Technological advances in the quality, availability and linkage potential of health data for research make the need to develop robust and effective information governance mechanisms more pressing than ever before; they also lead us to question the utility of governance devices used hitherto such as consent and anonymisation. This article assesses and advocates a principles-based approach, contrasting this with traditional rule-based approaches, and proposes a model of principled proportionate governance. It is suggested that the approach not only serves as the basis for good governance in contemporary data linkage but also that it provides a platform to assess legal reforms such as the draft Data Protection Regulation.
A Multi-faceted Approach to Promote Comprehension of Online Health Information Among Older Adults.
Chin, Jessie; Moeller, Darcie D; Johnson, Jessica; Duwe, Elise A G; Graumlich, James F; Murray, Michael D; Morrow, Daniel G
2017-03-10
Older adults' self-care often depends on understanding and utilizing health information. Inadequate health literacy among older adults poses a barrier to self-care because it hampers comprehension of this information, particularly when the information is not well-designed. Our goal was to improve comprehension of online health information among older adults with hypertension who varied in health literacy abilities. We identified passages about hypertension self-care from credible websites (typical passages). We used a multi-faceted approach to redesign these passages, revising their content, language, organization and format (revised passages). Older participants read both versions of the passages at their own pace. After each passage, they summarized the passage and then answered questions about the passage. Participants better remembered the revised than the typical passages, summarizing the passages more accurately and taking in information more efficiently (less reading time needed per unit of information remembered). The benefits for reading efficiency were greater for older adults with more health knowledge, suggesting knowledge facilitated comprehension of information in the revised passages. A systematic, multi-faceted approach to designing health documents can promote online learning among older adults with diverse health literacy abilities. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Wilkinson, Joyce E; Rycroft-Malone, Jo; Davies, Huw T O; McCormack, Brendan
2012-12-01
A group of researchers and practitioners interested in advancing knowledge utilization met as a colloquium in Belfast (KU 11) and used a "world café" approach to exploit the social capital and shared understanding built up over previous events to consider the research and practice agenda. We considered three key areas of relevance to knowledge use: (1) understanding the nature of research use, influence and impact; (2) blended and collaborative approaches to knowledge production and use; and (3) supporting sustainability and spread of evidence-informed innovations. The approach enabled the development of artifacts that reflected the three areas and these were analyzed using a creative hermeneutic approach. The themes that emerged and which are outlined in this commentary are not mutually exclusive. There was much overlap in the discussions and therefore of the themes, reflecting the complex nature of knowledge translation work. The agenda that has emerged from KU 11 also reflects the participatory and creative approach in which the meeting was structured and focused, and therefore emphasizes the processual, relational and contingent nature of some of the challenges we face. The past 20 years has seen an explosion in activity around understanding KU, and we have learned much about the difficulties. Whilst the agenda for the next decade may be becoming clearer, colloquia such as KU 11, using creative and engaging approaches, have a key role to play in dissecting, articulating and sharing that agenda. In this way, we also build an ever-expanding international community that is dedicated to working towards increasing the chances of success for better patient care. © 2012 Sigma Theta Tau International.
Description of a user-oriented geographic information system - The resource analysis program
NASA Technical Reports Server (NTRS)
Tilmann, S. E.; Mokma, D. L.
1980-01-01
This paper describes the Resource Analysis Program, an applied geographic information system. Several applications are presented which utilized soil and other natural resource data to develop integrated maps and data analyses. These applications demonstrate the methods of analysis and the philosophy of approach used in the mapping system. The applications are evaluated in reference to four major needs of a functional mapping system: data capture, data libraries, data analysis, and mapping and data display. These four criteria are then used to describe an effort to develop the next generation of applied mapping systems. This approach uses inexpensive microcomputers for field applications and should prove to be a viable entry point for users heretofore unable or unwilling to venture into applied computer mapping.
Armstrong, Joanne; Toscano, Michele; Kotchko, Nancy; Friedman, Sue; Schwartz, Marc D; Virgo, Katherine S; Lynch, Kristian; Andrews, James E; Aguado Loi, Claudia X; Bauer, Joseph E; Casares, Carolina; Teten, Rachel Threet; Kondoff, Matthew R; Molina, Ashley D; Abdollahian, Mehrnaz; Brand, Lana; Walker, Gregory S; Sutphen, Rebecca
2015-02-01
Research to date regarding identification and management of hereditary breast and ovarian cancer syndrome (HBOC) in the U.S. has been confined primarily to academic center-based studies with limited patient engagement. To begin to understand and address the current gaps and disparities in delivery of services for the appropriate identification and optimal risk management of individuals with HBOC, we designed and have initiated the American BRCA Outcomes and Utilization of Testing (ABOUT) Study. ABOUT relies on a collaborative patient advocacy, academic and industry partnership to recruit and engage U.S. individuals who are at increased risk for HBOC and investigate their experiences, decisions and outcomes. It utilizes an extensive research infrastructure, including an interactive web-based data system and electronic interfaces for secure online participation and automated data exchange. We describe the novel recruitment approach that was designed for collaboration with a national commercial health plan partner to identify all individuals for whom a healthcare provider orders a BRCA test and to mail each individual an invitation to participate and a study packet. The study packet contains detailed information about the study, a baseline questionnaire and informed consent for participation in the study, for release of relevant medical and health plan records and for ongoing research engagement. This approach employs patient-reported, laboratory-reported and health plan-reported outcomes and facilitates longitudinal engagement. We believe that the type of innovative methodology and collaborative framework we have developed for ABOUT is an ideal foundation for a patient-powered research network. This approach can make substantial contributions to identifying current and best practices in HBOC, leading to improved strategies for clinical care and optimal health outcomes among individuals with high inherited risk for cancer.
Berlin, Leonard
2014-03-01
Concerns about the possibility of developing cancer due to diagnostic imaging examinations utilizing ionizing radiation exposure are increasing. Research studies of survivors of atomic bomb explosions, nuclear reactor accidents, and other unanticipated exposures to similar radiation have led to varying conclusions regarding the stochastic effects of radiation exposure. That high doses of ionizing radiation cause cancer in humans is generally accepted, but the question of whether diagnostic levels of radiation cause cancer continues to be hotly debated. It cannot be denied that overexposure to ionizing radiation beyond a certain threshold, which has not been exactly determined, does generate cancer. This creates a dilemma: should patients be informed about the possibility that a CT or similar examination might cause cancer later in life? At present, there is no consensus in the radiology community as to whether informed consent must be obtained from a patient before the patient undergoes a CT or similar examination. The author analyzes whether there is a legal duty mandating radiologists to obtain such informed consent but also, irrespective of the law, whether there is an ethical duty that compels radiologists to inform patients of potential adverse effects of ionizing radiation. Over the past decade, there has been a noticeable shift from a benevolent, paternalistic approach to medical care to an autonomy-based, shared-decision-making approach, whereby patient and physician work as partners in determining what is medically best for the patient. Radiologists should discuss the benefits and hazards of imaging with their patients. Copyright © 2014. Published by Elsevier Inc.
Nikfarjam, Azadeh; Sarker, Abeed; O'Connor, Karen; Ginn, Rachel; Gonzalez, Graciela
2015-05-01
Social media is becoming increasingly popular as a platform for sharing personal health-related information. This information can be utilized for public health monitoring tasks, particularly for pharmacovigilance, via the use of natural language processing (NLP) techniques. However, the language in social media is highly informal, and user-expressed medical concepts are often nontechnical, descriptive, and challenging to extract. There has been limited progress in addressing these challenges, and thus far, advanced machine learning-based NLP techniques have been underutilized. Our objective is to design a machine learning-based approach to extract mentions of adverse drug reactions (ADRs) from highly informal text in social media. We introduce ADRMine, a machine learning-based concept extraction system that uses conditional random fields (CRFs). ADRMine utilizes a variety of features, including a novel feature for modeling words' semantic similarities. The similarities are modeled by clustering words based on unsupervised, pretrained word representation vectors (embeddings) generated from unlabeled user posts in social media using a deep learning technique. ADRMine outperforms several strong baseline systems in the ADR extraction task by achieving an F-measure of 0.82. Feature analysis demonstrates that the proposed word cluster features significantly improve extraction performance. It is possible to extract complex medical concepts, with relatively high performance, from informal, user-generated content. Our approach is particularly scalable, suitable for social media mining, as it relies on large volumes of unlabeled data, thus diminishing the need for large, annotated training data sets. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
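A minimal sketch of the word-cluster feature idea follows: k-means over (placeholder) embedding vectors, with the cluster id attached to each token's feature dictionary; the CRF sequence labeler itself (e.g., trained with a CRF toolkit) is omitted, and the vocabulary and vectors below are invented:

```python
import numpy as np
from sklearn.cluster import KMeans

# Cluster stand-in word embeddings and expose the cluster id as a token feature.
rng = np.random.default_rng(0)
vocab = ["headache", "nausea", "dizzy", "zoloft", "prozac", "taking", "makes", "me"]
embeddings = {w: rng.normal(size=50) for w in vocab}     # placeholder for word2vec vectors

km = KMeans(n_clusters=3, n_init=10, random_state=0)
km.fit(np.array([embeddings[w] for w in vocab]))
cluster_of = {w: int(c) for w, c in zip(vocab, km.labels_)}

def token_features(tokens, i):
    w = tokens[i].lower()
    return {
        "word": w,
        "suffix3": w[-3:],
        "cluster": cluster_of.get(w, -1),                 # the embedding-cluster feature
        "prev_cluster": cluster_of.get(tokens[i - 1].lower(), -1) if i > 0 else -1,
    }

post = "zoloft makes me dizzy".split()
print([token_features(post, i) for i in range(len(post))])
```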
Optimal marker-assisted selection to increase the effective size of small populations.
Wang, J
2001-02-01
An approach to the optimal utilization of marker and pedigree information in minimizing the rates of inbreeding and genetic drift at the average locus of the genome (not just the marked loci) in a small diploid population is proposed, and its efficiency is investigated by stochastic simulations. The approach is based on estimating the expected pedigree of each chromosome by using marker and individual pedigree information and minimizing the average coancestry of selected chromosomes by quadratic integer programming. It is shown that the approach is much more effective and much less computationally demanding to implement than previous ones. For pigs with 10 offspring per mother genotyped for two markers (each with four alleles at equal initial frequency) per chromosome of 100 cM, the approach can increase the average effective size for the whole genome by approximately 40 and 55% if mating ratios (the number of females mated with a male) are 3 and 12, respectively, compared with the corresponding values obtained by optimizing between-family selection using pedigree information only. The efficiency of the marker-assisted selection method increases with increasing amount of marker information (number of markers per chromosome, heterozygosity per marker) and family size, but decreases with increasing genome size. For less prolific species, the approach is still effective if the mating ratio is large so that a high marker-assisted selection pressure on the rarer sex can be maintained.
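The core selection criterion, minimizing the average coancestry of the selected set, can be illustrated with a toy brute-force search; the paper solves a far larger version with quadratic integer programming over chromosome-level coancestries. The matrix below is invented.

```python
# Illustrative sketch only: choose k candidates whose average pairwise
# coancestry (kinship) is minimal, by exhaustive search over a toy matrix.
import itertools
import numpy as np

f = np.array([  # toy coancestry matrix among 6 candidates (symmetric)
    [0.50, 0.25, 0.10, 0.05, 0.20, 0.05],
    [0.25, 0.50, 0.05, 0.10, 0.15, 0.10],
    [0.10, 0.05, 0.50, 0.25, 0.05, 0.20],
    [0.05, 0.10, 0.25, 0.50, 0.10, 0.15],
    [0.20, 0.15, 0.05, 0.10, 0.50, 0.25],
    [0.05, 0.10, 0.20, 0.15, 0.25, 0.50],
])
k = 3

def avg_coancestry(subset):
    idx = np.array(subset)
    return f[np.ix_(idx, idx)].mean()  # includes self-coancestry on the diagonal

best = min(itertools.combinations(range(len(f)), k), key=avg_coancestry)
print(best, round(avg_coancestry(best), 4))
```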
Automatic glaucoma diagnosis through medical imaging informatics.
Liu, Jiang; Zhang, Zhuo; Wong, Damon Wing Kee; Xu, Yanwu; Yin, Fengshou; Cheng, Jun; Tan, Ngan Meng; Kwoh, Chee Keong; Xu, Dong; Tham, Yih Chung; Aung, Tin; Wong, Tien Yin
2013-01-01
Computer-aided diagnosis for screening utilizes computer-based analytical methodologies to process patient information. Glaucoma is the leading irreversible cause of blindness. Due to the lack of an effective and standard screening practice, more than 50% of the cases are undiagnosed, which prevents the early treatment of the disease. The objective was to design an automatic glaucoma diagnosis architecture, automatic glaucoma diagnosis through medical imaging informatics (AGLAIA-MII), that combines patient personal data, medical retinal fundus images, and the patient's genome information for screening. A total of 2258 cases from a population study were used to evaluate the screening software. These cases were attributed with patient personal data, retinal images and quality-controlled genome data. Utilizing a multiple kernel learning-based classifier, AGLAIA-MII combined patient personal data, major image features, and important genome single nucleotide polymorphism (SNP) features. Receiver operating characteristic curves were plotted to compare AGLAIA-MII's performance with classifiers using patient personal data, images, and genome SNPs separately. AGLAIA-MII achieved an area under the curve of 0.866, better than the 0.551, 0.722 and 0.810 obtained by the individual personal data, image and genome information components, respectively. AGLAIA-MII also demonstrated a substantial improvement over the current glaucoma screening approach based on intraocular pressure. AGLAIA-MII demonstrates for the first time the capability of integrating patients' personal data, medical retinal image and genome information for automatic glaucoma diagnosis and screening in a large dataset from a population study. It paves the way for a holistic approach for automatic objective glaucoma diagnosis and screening.
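A hedged sketch of the modality-fusion idea follows: per-modality kernels are combined as a weighted sum and fed to a precomputed-kernel SVM. This is a simplification (multiple kernel learning learns the weights; here they are fixed), the data are synthetic, and it is not the AGLAIA-MII implementation.

```python
# Sketch of modality fusion via a weighted sum of per-modality RBF kernels,
# in the spirit of multiple kernel learning (weights fixed here, not learned).
# Synthetic stand-ins for personal data, image features, and genome SNPs.
import numpy as np
from sklearn.metrics import roc_auc_score
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n = 200
y = rng.integers(0, 2, size=n)
personal = rng.normal(size=(n, 5)) + 0.3 * y[:, None]
image = rng.normal(size=(n, 20)) + 0.5 * y[:, None]
genome = rng.normal(size=(n, 50)) + 0.2 * y[:, None]

train, test = np.arange(0, 150), np.arange(150, n)
weights = {"personal": 0.2, "image": 0.5, "genome": 0.3}  # assumed, not learned

def fused_kernel(rows, cols):
    blocks = {"personal": personal, "image": image, "genome": genome}
    return sum(w * rbf_kernel(blocks[m][rows], blocks[m][cols])
               for m, w in weights.items())

clf = SVC(kernel="precomputed", probability=True, random_state=0)
clf.fit(fused_kernel(train, train), y[train])
scores = clf.predict_proba(fused_kernel(test, train))[:, 1]
print("AUC:", round(roc_auc_score(y[test], scores), 3))
```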
Thompson, Ceri R; McKee, Martin
2011-02-01
To explore differences in national approaches to hospital capital planning and financing in three European countries and to understand the roles and positions of the actors involved. Case studies of major new hospital developments were undertaken in each of the study countries (France, Sweden and England), based on a review of documents related to each development and the national framework within which they took place, as well as interviews with key informants. The principal-agent model was used, focusing on identification of differing utilities and information asymmetries. There are substantial differences between countries, for example in relation to the role of the hospital in its own redevelopment, the organisational distance between actors, the institutional level at which decision rights for major investments are exercised, and how principals control the agents. These differences have implications for the processes involved and the nature of economic and health care problems that can arise. There is evidence of, and opportunity for, economic problems in all systems, but these seem to be greater in France and England, where the hospital leads the process, there is limited involvement by the regional bodies, and informational differences appear greater. We conclude that hospital planning processes should be informed by an explicit understanding of the powerful groups involved and their divergent preferences and utilities. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
McCarty, L.S.; Landrum, P.F.; Luoma, S.N.; Meador, J.P.; Merten, A.A.; Shephard, B.K.; van Wezelzz, A.P.
2011-01-01
The tissue residue dose concept has been used, although in a limited manner, in environmental toxicology for more than 100 y. This review outlines the history of this approach and the technical background for organic chemicals and metals. Although the toxicity of both can be explained in tissue residue terms, the relationship between external exposure concentration, body and/or tissues dose surrogates, and the effective internal dose at the sites of toxic action tends to be more complex for metals. Various issues and current limitations related to research and regulatory applications are also examined. It is clear that the tissue residue approach (TRA) should be an integral component in future efforts to enhance the generation, understanding, and utility of toxicity testing data, both in the laboratory and in the field. To accomplish these goals, several key areas need to be addressed: 1) development of a risk-based interpretive framework linking toxicology and ecology at multiple levels of biological organization and incorporating organism-based dose metrics; 2) a broadly applicable, generally accepted classification scheme for modes/mechanisms of toxic action with explicit consideration of residue information to improve both single chemical and mixture toxicity data interpretation and regulatory risk assessment; 3) toxicity testing protocols updated to ensure collection of adequate residue information, along with toxicokinetics and toxicodynamics information, based on explicitly defined toxicological models accompanied by toxicological model validation; 4) continued development of residueeffect databases is needed ensure their ongoing utility; and 5) regulatory guidance incorporating residue-based testing and interpretation approaches, essential in various jurisdictions. ??:2010 SETAC.
An Enhanced Adaptive Management Approach for Remediation of Legacy Mercury in the South River
Foran, Christy M.; Baker, Kelsie M.; Grosso, Nancy R.; Linkov, Igor
2015-01-01
Uncertainties about future conditions and the effects of chosen actions, as well as increasing resource scarcity, have been driving forces in the utilization of adaptive management strategies. However, many applications of adaptive management have been criticized for a number of shortcomings, including a limited ability to learn from actions and a lack of consideration of stakeholder objectives. To address these criticisms, we supplement existing adaptive management approaches with a decision-analytical approach that first informs the initial selection of management alternatives and then allows for periodic re-evaluation or phased implementation of management alternatives based on monitoring information and incorporation of stakeholder values. We describe the application of this enhanced adaptive management (EAM) framework to compare remedial alternatives for mercury in the South River, based on an understanding of the loading and behavior of mercury in the South River near Waynesboro, VA. The outcomes show that the ranking of remedial alternatives is influenced by uncertainty in the mercury loading model, by the relative importance placed on different criteria, and by cost estimates. The process itself demonstrates that a decision model can link project performance criteria, decision-maker preferences, environmental models, and short- and long-term monitoring information with management choices to help shape a remediation approach that provides useful information for adaptive, incremental implementation. PMID:25665032
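The decision-analytical step can be illustrated with a small Monte Carlo multi-criteria ranking: alternatives are scored by a weighted sum of criteria while uncertainty in one criterion (the modeled load reduction) is propagated by sampling. The alternatives, weights, and numbers below are invented for illustration and are not the South River analysis.

```python
# Hedged illustration: rank remedial alternatives by a weighted multi-criteria
# score while propagating uncertainty in the modeled load reduction by
# Monte Carlo. All names and values are placeholders.
import numpy as np

rng = np.random.default_rng(42)
alternatives = ["bank stabilization", "capping", "dredging", "no action"]
weights = np.array([0.5, 0.3, 0.2])  # load reduction, cost, habitat impact

# Mean and std of modeled load reduction (0-1); cost and habitat scores fixed.
load_mu = np.array([0.45, 0.60, 0.75, 0.05])
load_sd = np.array([0.10, 0.15, 0.20, 0.02])
cost_score = np.array([0.8, 0.5, 0.2, 1.0])      # higher = cheaper
habitat_score = np.array([0.7, 0.6, 0.3, 0.9])   # higher = less disruption

n_draws = 10_000
wins = np.zeros(len(alternatives))
for _ in range(n_draws):
    load = np.clip(rng.normal(load_mu, load_sd), 0, 1)
    total = weights @ np.vstack([load, cost_score, habitat_score])
    wins[np.argmax(total)] += 1

for name, p in zip(alternatives, wins / n_draws):
    print(f"{name:20s} ranked first in {p:.1%} of draws")
```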
Least squares restoration of multichannel images
NASA Technical Reports Server (NTRS)
Galatsanos, Nikolas P.; Katsaggelos, Aggelos K.; Chin, Roland T.; Hillery, Allen D.
1991-01-01
Multichannel restoration using both within- and between-channel deterministic information is considered. A multichannel image is a set of image planes that exhibit cross-plane similarity. Existing optimal restoration filters for single-plane images yield suboptimal results when applied to multichannel images, since between-channel information is not utilized. Multichannel least squares restoration filters are developed using the set theoretic and the constrained optimization approaches. A geometric interpretation of the estimates of both filters is given. Color images (three-channel imagery with red, green, and blue components) are considered. Constraints that capture the within- and between-channel properties of color images are developed. Issues associated with the computation of the two estimates are addressed. A spatially adaptive, multichannel least squares filter that utilizes local within- and between-channel image properties is proposed. Experiments using color images are described.
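The following is a minimal numpy sketch of the general idea, not the paper's filters: a three-channel 1-D signal is restored by regularized least squares with a within-channel smoothness penalty and a between-channel similarity penalty, solved in closed form from the normal equations. The blur operator, weights, and signal are invented.

```python
# Minimal sketch: regularized multichannel least squares restoration,
#   x = argmin ||y - A x||^2 + a ||Dw x||^2 + b ||Cb x||^2,
# with A = kron(I3, H) (shared blur), Dw = kron(I3, D) (within-channel
# smoothness) and Cb = kron(C, In) (between-channel similarity).
import numpy as np

n, blur_len = 64, 5
rng = np.random.default_rng(0)

H = sum(np.eye(n, k=-k) for k in range(blur_len)) / blur_len  # moving-average blur
D = np.eye(n) - np.eye(n, k=1)                                # first difference

# Correlated "R,G,B" channels: a common step edge plus channel offsets.
base = (np.arange(n) > n // 2).astype(float)
x_true = np.stack([base + 0.1 * c for c in range(3)])          # (3, n)
y = x_true @ H.T + 0.02 * rng.normal(size=(3, n))              # blurred + noise

# Between-channel coupling: penalize differences between channel pairs.
C = np.array([[1, -1, 0], [0, 1, -1], [-1, 0, 1]], dtype=float)

I3, In = np.eye(3), np.eye(n)
A = np.kron(I3, H)
reg = 0.05 * np.kron(I3, D.T @ D) + 0.05 * np.kron(C.T @ C, In)
x_hat = np.linalg.solve(A.T @ A + reg, A.T @ y.reshape(-1))
print("RMSE:", np.sqrt(np.mean((x_hat - x_true.reshape(-1)) ** 2)).round(4))
```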
Access and Use: Improving Digital Multimedia Consumer Health Information.
Thomas, Alex
2016-01-01
This project enabled novel organisational insight into the comparative utility of a portfolio of consumer health information content, by measuring patterns of attrition (abandonment) in content use. The project used as a case study the event activity log of a fully automated digital information kiosk, located in a community health facility. Direct measurements of the duration of content use were derived from the user interface activity recorded in the kiosk log, thus avoiding issues in using other approaches to collecting this type of data, such as sampling and observer bias. The distribution patterns of 1,383 durations of observed abandonments of use for twenty-eight discrete modules of health information content were visualised using Kaplan-Meier survival plots. Clear patterns of abandonment of content use were exhibited. The method of analysis is cost-effective, scalable and provides deep insight into the utility of health promotion content. The impact on the content producers, platform operators and service users is to improve organisational learning and thus increase the confidence in stakeholders that the service is continuously delivering high quality health and wellbeing benefits.
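A minimal sketch of the survival-curve idea is shown below: a plain Kaplan-Meier estimate of the probability that a session is still engaged with a content module after t seconds. The durations are invented and all abandonments are assumed observed (no censoring); this is not the project's analysis code.

```python
# Kaplan-Meier estimate of "still engaged" probability over time.
import numpy as np

durations = np.array([12, 30, 30, 45, 60, 60, 60, 90, 120, 180])  # seconds

def kaplan_meier(times):
    """Return event times and survival probabilities S(t), no censoring."""
    times = np.sort(times)
    uniq = np.unique(times)
    n_at_risk = np.array([(times >= t).sum() for t in uniq])
    n_events = np.array([(times == t).sum() for t in uniq])
    surv = np.cumprod(1.0 - n_events / n_at_risk)
    return uniq, surv

t, s = kaplan_meier(durations)
for ti, si in zip(t, s):
    print(f"S({ti:>3d}s) = {si:.2f}")
```

Libraries such as lifelines provide the same estimator with support for censored observations and plotting.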
Storage and utilization of HLA genomic data--new approaches to HLA typing.
Helmberg, W
2000-01-01
Currently available DNA-based HLA typing assays can provide detailed information about sequence motifs of a tested sample. It is still a common practice, however, for information acquired by high-resolution sequence specific oligonucleotide probe (SSOP) typing or sequence specific priming (SSP) to be presented in a low-resolution serological format. Unfortunately, this representation can lead to significant loss of useful data in many cases. An alternative to assigning allele equivalents to such DNA typing results is simply to store the observed typing pattern and utilize the information with the help of Virtual DNA Analysis (VDA). Interpretation of the stored typing patterns can then be updated based on newly defined alleles, assuming the sequence motifs detected by the typing reagents are known. Rather than updating reagent specificities in individual laboratories, such updates should be performed in a central, publicly available sequence database. By referring to this database, HLA genomic data can then be stored and transferred between laboratories without loss of information. The 13th International Histocompatibility Workshop offers an ideal opportunity to begin building this common database for the entire human MHC.
Melancon, M.J.; Bengston, David A.; Henshel, Diane S.
1996-01-01
As in mammals and fish, birds respond to many environmental contaminants with induction of hepatic cytochromes P450. In order to monitor cytochromes P450 in a specific avian species, for assessing the status of that species or the habitat it utilizes, it is necessary to have background information on the appropriate assay conditions and the responsiveness of cytochrome P450 induction in that species. Assay of four monooxygenases that give resorufin as product, using a fluorescence microwell plate scanner, has proven to be an effective approach. Information is provided on the incubation conditions and baseline activity for twenty avian species at ages ranging from pipping embryo to adult. Induction responsiveness is presented for sixteen of them. This information can serve as a guide for those who wish to utilize cytochrome P450 as a biomarker for contaminant exposure and effect, to aid in selection of appropriate species, age, and monooxygenase assay(s).
Information dissemination and use: critical components in occupational safety and health.
Schulte, P A; Okun, A; Stephenson, C M; Colligan, M; Ahlers, H; Gjessing, C; Loos, G; Niemeier, R W; Sweeney, M H
2003-11-01
Information dissemination is a mandated, but understudied, requirement of occupational and environmental health laws and voluntary initiatives. Research is needed on the factors that enhance and limit the development, transfer, and use of occupational safety and health (OSH) information. Contemporary changes in the workforce, workplaces, and the nature of work will require new emphasis on the dissemination of information to foster prevention. Legislative and regulatory requirements and voluntary initiatives for dissemination of OSH information were identified and assessed. Literature on information dissemination was reviewed to identify important issues and useful approaches. More than 20 sections of laws and regulations were identified that mandated dissemination of occupational and environmental safety and health information. A four-stage approach for tracking dissemination and considering the flow of information was delineated. Special areas of dissemination were identified: the information needs of the changing workforce; new and young workers; small businesses; and workers with difficulty in understanding or reading English. We offer a framework for dissemination of OSH information and underscore the need to focus on the extent to which decision-makers and others receive and use such information. More solid data are also needed on current investments in disseminating, diffusing and applying OSH information and on the utility of that information. Am. J. Ind. Med. 44:515-531, 2003. Published 2003 Wiley-Liss, Inc.
Future mobile access for open-data platforms and the BBC-DaaS system
NASA Astrophysics Data System (ADS)
Edlich, Stefan; Singh, Sonam; Pfennigstorf, Ingo
2013-03-01
In this paper, we develop an open data platform on multimedia devices to act as a marketplace of data for information seekers and data providers. We explore the important aspects of a Data-as-a-Service (DaaS) offering in the cloud with a mobile access point. The basis of the DaaS service is to act as a marketplace for information, utilizing new technologies and recent scalable polyglot architectures based on NoSql databases. Whereas open-data platforms are beginning to be widely accepted, their mobile use is not. We compare similar products, their approach and a possible mobile usage. We discuss several approaches to address mobile access as a native app, HTML5 and a mobile-first approach, together with several frontend presentation techniques. Big data visualization itself is in its early days, and we explore some possibilities for getting big data / open data accessed by mobile users.
Windows 7 Antiforensics: A Review and a Novel Approach.
Eterovic-Soric, Brett; Choo, Kim-Kwang Raymond; Mubarak, Sameera; Ashman, Helen
2017-07-01
In this paper, we review literature on antiforensics published between 2010 and 2016 and reveal the surprising lack of up-to-date research on this topic. This research aims to contribute to this knowledge gap by investigating different antiforensic techniques for devices running Windows 7, one of the most popular operating systems. An approach which allows for removal or obfuscation of most forensic evidence is then presented. Using the Trojan software DarkComet RAT as a case study, we demonstrate the utility of our approach and that a Trojan Horse infection may be a legitimate possibility, even if there is no evidence of an infection on a seized computer's hard drive. Up-to-date information regarding how forensic artifacts can be compromised will allow relevant stakeholders to make informed decisions when deciding the outcome of legal cases involving digital evidence. © 2017 American Academy of Forensic Sciences.
Developing the skills required for evidence-based practice.
French, B
1998-01-01
The current health care environment requires practitioners with the skills to find and apply the best currently available evidence for effective health care, to contribute to the development of evidence-based practice protocols, and to evaluate the impact of utilizing validated research findings in practice. Current approaches to teaching research are based mainly on gaining skills by participation in the research process. Emphasis on the requirement for rigour in the process of creating new knowledge is assumed to lead to skill in the process of using research information created by others. This article reflects upon the requirements for evidence-based practice, and the degree to which current approaches to teaching research prepare practitioners who are able to find, evaluate and best use currently available research information. The potential for using the principles of systematic review as a teaching and learning strategy for research is explored, and some of the possible strengths and weaknesses of this approach are highlighted.
Bahramian, Hoda; Mohebbi, Simin Z; Khami, Mohammad Reza; Quinonez, Rocio Beatriz
2018-05-10
Pregnant women are vulnerable to a wide range of oral health conditions that could be harmful to their own health and their future child. Despite the usefulness of regular dental service utilization in the prevention and early detection of oral diseases, it is notably low among pregnant women. In this qualitative study, we aimed to explore barriers and facilitators influencing pregnant women's dental service utilization. Using a triangulation approach, we included pregnant women (n = 22) from two public health centers, and midwives (n = 8) and dentists (n = 12) from 12 other public centers in Tehran (Iran). Data was gathered through face-to-face semi-structured interviews and focus group discussions. The analysis of qualitative data was performed using conventional content analysis with MAXQDA10 software. Reported barriers to dental service utilization among pregnant women were categorized under emerging themes: lack of knowledge and misbeliefs, cost of dental care, physiological changes, fear and other psychological conditions, time constraints, dentists' unwillingness to treat pregnant women, cultural taboos and lack of interprofessional collaboration. Solutions proposed by dentists, midwives and pregnant women to improve dental care utilization during pregnancy were categorized under three themes: provision of knowledge, financial support and establishing supportive policies. Understanding perceived barriers to dental service utilization during pregnancy can serve as baseline information for planning and formulating appropriate oral health education, financial support, and legislation tailored for lower income pregnant women, midwives and dentists in countries with a developing oral health care system.
Demiris, G; Thompson, H
2011-01-01
As health care systems face limited resources and workforce shortages to address the complex needs of older adult populations, innovative approaches utilizing information technology can support aging. Smart Home and Ambient Assisted Living (SHAAL) systems utilize advanced and ubiquitous technologies, including sensors and other devices that are integrated into the residential infrastructure or are wearable, to capture data describing activities of daily living and health-related events. This paper highlights how data from SHAAL systems can lead to information and knowledge that ultimately improve clinical outcomes and quality of life for older adults as well as the quality of health care services. We conducted a review of personal health record applications specifically for older adults and approaches to using information to improve elder care. We present a framework that showcases how data captured from SHAAL systems can be processed to provide meaningful information that becomes part of a personal health record. Synthesis and visualization of information resulting from SHAAL systems can lead to knowledge and support education, delivery of tailored interventions and, if needed, transitions in care. Such actions can involve multiple stakeholders as part of shared decision making. SHAAL systems have the potential to support aging and improve quality of life and decision making for older adults and their families. The framework presented in this paper demonstrates how emphasis needs to be placed on extracting meaningful information from new innovative systems that will support decision making. The challenge for informatics designers and researchers is to facilitate an evolution of SHAAL systems expanding beyond demonstration projects to actual interventions that will improve health care for older adults.
Activity-based exploitation of Full Motion Video (FMV)
NASA Astrophysics Data System (ADS)
Kant, Shashi
2012-06-01
Video has been a game-changer in how US forces are able to find, track and defeat their adversaries. With millions of minutes of video being generated from an increasing number of sensor platforms, the DOD has stated that the rapid increase in video is overwhelming their analysts. The manpower required to view and garner useable information from the flood of video is unaffordable, especially in light of current fiscal restraints. "Search" within full-motion video has traditionally relied on human tagging of content, and video metadata, to provision filtering and locate segments of interest in the context of an analyst query. Our approach utilizes a novel machine-vision based approach to index FMV, using object recognition and tracking, and events and activities detection. This approach enables FMV exploitation in real-time, as well as a forensic look-back within archives. This approach can help get the most information out of video sensor collection, help focus the attention of overburdened analysts, form connections in activity over time, and conserve national fiscal resources in exploiting FMV.
Jiang, Zhehan; Skorupski, William
2017-12-12
In many behavioral research areas, multivariate generalizability theory (mG theory) has been typically used to investigate the reliability of certain multidimensional assessments. However, traditional mG-theory estimation, namely using frequentist approaches, has limits, leading researchers to fail to take full advantage of the information that mG theory can offer regarding the reliability of measurements. Alternatively, Bayesian methods provide more information than frequentist approaches can offer. This article presents instructional guidelines on how to implement mG-theory analyses in a Bayesian framework; in particular, BUGS code is presented to fit commonly seen designs from mG theory, including single-facet designs, two-facet crossed designs, and two-facet nested designs. In addition to concrete examples that are closely related to the selected designs and the corresponding BUGS code, a simulated dataset is provided to demonstrate the utility and advantages of the Bayesian approach. This article is intended to serve as a tutorial reference for applied researchers and methodologists conducting mG-theory studies.
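To give a flavor of a Bayesian G-study, the following is a hedged sketch in PyMC (assuming PyMC v4+ is installed); it is not the article's BUGS code and covers only a single-facet person x item design on simulated data, with the generalizability coefficient computed from the sampled variance components.

```python
# Bayesian single-facet p x i G-study sketch (simulated data, PyMC).
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n_p, n_i = 50, 8
person_idx = np.repeat(np.arange(n_p), n_i)
item_idx = np.tile(np.arange(n_i), n_p)
# Simulated scores with known person, item, and residual standard deviations.
scores = (rng.normal(0, 0.9, n_p)[person_idx]
          + rng.normal(0, 0.55, n_i)[item_idx]
          + rng.normal(0, 1.0, n_p * n_i))

with pm.Model():
    sd_p = pm.HalfNormal("sd_person", 2.0)
    sd_i = pm.HalfNormal("sd_item", 2.0)
    sd_e = pm.HalfNormal("sd_resid", 2.0)
    mu = pm.Normal("mu", 0.0, 5.0)
    p_eff = pm.Normal("p_eff", 0.0, sd_p, shape=n_p)
    i_eff = pm.Normal("i_eff", 0.0, sd_i, shape=n_i)
    pm.Normal("y", mu + p_eff[person_idx] + i_eff[item_idx], sd_e,
              observed=scores)
    # Generalizability coefficient for an n_i-item average.
    pm.Deterministic("g_coef", sd_p**2 / (sd_p**2 + sd_e**2 / n_i))
    idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)

print(idata.posterior["g_coef"].mean().item())
```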
Multi-Modal Active Perception for Autonomously Selecting Landing Sites on Icy Moons
NASA Technical Reports Server (NTRS)
Arora, A.; Furlong, P. M.; Wong, U.; Fong, T.; Sukkarieh, S.
2017-01-01
Selecting suitable landing sites is fundamental to achieving many mission objectives in planetary robotic lander missions. However, due to sensing limitations, landing sites which are both safe and scientifically valuable often cannot be determined reliably from orbit, particularly in icy moon missions where orbital sensing data is noisy and incomplete. This paper presents an active perception approach to Entry, Descent and Landing (EDL) which enables the lander to autonomously plan informative descent trajectories, acquire high quality sensing data during descent and exploit this additional information to select higher utility landing sites. Our approach consists of two components: probabilistic modeling of landing site features and approximate trajectory planning using a sampling based planner. The proposed framework allows the lander to plan long-horizon paths and remain robust to noisy data. Results in simulated environments show large performance improvements over alternative approaches and show promise that our approach has strong potential to improve the science return of not only icy moon missions but EDL systems in general.
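A toy illustration of the underlying idea (not the paper's planner) is given below: a noisy orbital prior on site safety is fused with a simulated descent observation by Bayesian updating, and the site with the highest expected utility is selected. All numbers are invented.

```python
# Toy Bayesian update of site safety plus expected-utility site selection.
import numpy as np

rng = np.random.default_rng(3)
sites = ["A", "B", "C", "D"]
prior_safe = np.array([0.60, 0.55, 0.80, 0.40])   # from orbital data
science = np.array([0.9, 0.7, 0.4, 1.0])          # relative science value
truth_safe = np.array([1, 0, 1, 1])               # hidden ground truth

# Descent sensor: reports "hazard-free" with 85% accuracy.
obs = np.where(rng.random(len(sites)) < 0.85, truth_safe, 1 - truth_safe)

# Posterior P(safe | obs) by Bayes' rule with the 0.85 sensor model.
like_obs_given_safe = np.where(obs == 1, 0.85, 0.15)
like_obs_given_unsafe = np.where(obs == 1, 0.15, 0.85)
posterior = (like_obs_given_safe * prior_safe) / (
    like_obs_given_safe * prior_safe + like_obs_given_unsafe * (1 - prior_safe))

expected_utility = posterior * science
for s, p, u in zip(sites, posterior, expected_utility):
    print(f"site {s}: P(safe)={p:.2f}  expected utility={u:.2f}")
print("selected:", sites[int(np.argmax(expected_utility))])
```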
Kaitaniemi, Pekka
2008-04-09
Allometric equations are widely used in many branches of biological science. The potential information content of the normalization constant b in allometric equations of the form Y = bX^a has, however, remained largely neglected. To demonstrate the potential for utilizing this information, I generated a large number of artificial datasets that resembled those that are frequently encountered in biological studies, i.e., relatively small samples including measurement error or uncontrolled variation. The value of X was allowed to vary randomly within the limits describing different data ranges, and a was set to a fixed theoretical value. The constant b was set to a range of values describing the effect of a continuous environmental variable. In addition, a normally distributed random error was added to the values of both X and Y. Two different approaches were then used to model the data. The traditional approach estimated both a and b using a regression model, whereas an alternative approach set the exponent a at its theoretical value and only estimated the value of b. Both approaches produced virtually the same model fit with less than 0.3% difference in the coefficient of determination. Only the alternative approach was able to precisely reproduce the effect of the environmental variable, which was largely lost among noise variation when using the traditional approach. The results show how the value of b can be used as a source of valuable biological information if an appropriate regression model is selected.
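The two fitting strategies can be reproduced on simulated data with a short sketch (illustrative only; sample sizes, noise levels, and the environmental effect on b are invented): in small noisy samples, fixing the exponent a at its theoretical value typically recovers the environmental signal carried by b more cleanly than estimating a and b jointly.

```python
# Free-exponent versus fixed-exponent log-log fits of Y = b * X**a.
import numpy as np

rng = np.random.default_rng(7)
a_true, n_sets, n_obs = 0.75, 300, 10

env = rng.uniform(0, 1, n_sets)          # environmental variable driving b
b_true = 1.0 + 0.5 * env
b_free = np.empty(n_sets)
b_fixed = np.empty(n_sets)

for k in range(n_sets):
    X = rng.uniform(1, 20, n_obs) * np.exp(rng.normal(0, 0.1, n_obs))   # noisy X
    Y = b_true[k] * X**a_true * np.exp(rng.normal(0, 0.15, n_obs))      # noisy Y
    lx, ly = np.log(X), np.log(Y)
    a_hat, logb = np.polyfit(lx, ly, 1)              # (1) both a and b estimated
    b_free[k] = np.exp(logb)
    b_fixed[k] = np.exp(np.mean(ly - a_true * lx))   # (2) a fixed, only b

for name, b in [("free exponent", b_free), ("fixed exponent", b_fixed)]:
    print(f"{name}: corr(b_hat, environment) = {np.corrcoef(env, b)[0, 1]:.2f}")
```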
The New Parent Checklist: A Tool to Promote Parental Reflection.
Keys, Elizabeth M; McNeil, Deborah A; Wallace, Donna A; Bostick, Jason; Churchill, A Jocelyn; Dodd, Maureen M
To design and establish content and face validity of an evidence-informed tool that promotes parental self-reflection during the transition to parenthood. The New Parent Checklist was developed using a three-phase sequential approach: Phase 1 a scoping review and expert consultation to develop and refine a prototype tool; Phase 2 content analysis of parent focus groups; and Phase 3 assessment of utility in a cross-sectional sample of parents completing the New Parent Checklist and a questionnaire. The initial version of the checklist was considered by experts to contain key information. Focus group participants found it useful, appropriate, and nonjudgmental, and offered suggestions to enhance readability, utility, as well as face and content validity. In the cross-sectional survey, 83% of the participants rated the New Parent Checklist as "helpful" or "very helpful" and 90% found the New Parent Checklist "very easy" to use. Open-ended survey responses included predominantly positive feedback. Notable differences existed for some items based on respondents' first language, age, and sex. Results and feedback from all three phases informed the current version, available for download online. The New Parent Checklist is a comprehensive evidence-informed self-reflective tool with promising content and face validity. Depending on parental characteristics and infant age, certain items of the New Parent Checklist have particular utility but may also require further adaptation and testing. Local resources for information and/or support are included in the tool and could be easily adapted by other regions to incorporate their own local resources.
Health-related media use among youth audiences in Senegal
Glik, Deborah; Massey, Philip; Gipson, Jessica; Dieng, Thierno; Rideau, Alexandre; Prelip, Michael
2016-01-01
Lower- and middle-income countries (LMICs) are experiencing rapid changes in access to and use of new internet and digital media technologies. The purpose of this study was to better understand how younger audiences are navigating traditional and newer forms of media technologies, with particular emphasis on the skills and competencies needed to obtain, evaluate and apply health-related information, also defined as health and media literacy. Sixteen focus group discussions were conducted throughout Senegal in September 2012 with youth aged 15–25. Using an iterative coding process based on grounded theory, four themes emerged related to media use for health information among Senegalese youth: (i) media utilization; (ii) barriers and conflicts regarding media utilization; (iii) uses and gratifications and (iv) health and media literacy. Findings suggest that Senegalese youth use a heterogeneous mix of media platforms (i.e. television, radio, internet) and utilization often occurs with family members or friends. Additionally, the need for entertainment, information and connectedness informs media use, mostly concerning sexual and reproductive health information. Importantly, tensions arise as youth balance innovative and interactive technologies with traditional and conservative values, particularly concerning ethical and privacy concerns. Findings support the use of multipronged intervention approaches that leverage both new media and traditional media strategies, and that also address the lack of health and media literacy in this population. Implementing health-related interventions across multiple media platforms provides an opportunity to create an integrated, as opposed to a disparate, user experience. PMID:25113152
Wong, Paul W C; Chan, Wincy S C; Beh, Philip S L; Yau, Fiona W S; Yip, Paul S F; Hawton, Keith
2010-01-01
Ethical issues have been raised about using the psychological autopsy approach in the study of suicide. The impact on informants of control cases who participated in case-control psychological autopsy studies has not been investigated. (1) To investigate whether informants of suicide cases recruited by two approaches (coroners' court and public mortuaries) respond differently to the initial contact by the research team. (2) To explore the reactions, reasons for participation, and comments of both the informants of suicide and control cases to psychological autopsy interviews. (3) To investigate the impact of the interviews on informants of suicide cases about a month after the interviews. A self-report questionnaire was used for the informants of both suicide and control cases. Telephone follow-up interviews were conducted with the informants of suicide cases. The majority of the informants of suicide cases, regardless of the initial route of contact, as well as the control cases were positive about being approached to take part in the study. A minority of informants of suicide and control cases found the experience of talking about their family member to be more upsetting than expected. The telephone follow-up interviews showed that none of the informants of suicide cases reported being distressed by the psychological autopsy interviews. The acceptance rate for our original psychological autopsy study was modest. The findings of this study are useful for future participants and researchers in measuring the potential benefits and risks of participating in similar sensitive research. Psychological autopsy interviews may be utilized as an active engagement approach to reach out to the people bereaved by suicide, especially in places where the postvention work is underdeveloped.
Thresholds for conservation and management: structured decision making as a conceptual framework
Nichols, James D.; Eaton, Mitchell J.; Martin, Julien; Edited by Guntenspergen, Glenn R.
2014-01-01
Ecological thresholds are values of system state variables at which small changes produce substantial changes in system dynamics. They are frequently incorporated into ecological models used to project system responses to management actions. Utility thresholds are components of management objectives and are values of state or performance variables at which small changes yield substantial changes in the value of the management outcome. Decision thresholds are values of system state variables at which small changes prompt changes in management actions in order to reach specified management objectives. Decision thresholds are derived from the other components of the decision process. We advocate a structured decision making (SDM) approach within which the following components are identified: objectives (possibly including utility thresholds), potential actions, models (possibly including ecological thresholds), monitoring program, and a solution algorithm (which produces decision thresholds). Adaptive resource management (ARM) is described as a special case of SDM developed for recurrent decision problems that are characterized by uncertainty. We believe that SDM, in general, and ARM, in particular, provide good approaches to conservation and management. Use of SDM and ARM also clarifies the distinct roles of ecological thresholds, utility thresholds, and decision thresholds in informed decision processes.
Early Amyloidogenic Oligomerization Studied through Fluorescence Lifetime Correlation Spectroscopy
Paredes, Jose M.; Casares, Salvador; Ruedas-Rama, Maria J.; Fernandez, Elena; Castello, Fabio; Varela, Lorena; Orte, Angel
2012-01-01
Amyloidogenic protein aggregation is a persistent biomedical problem. Despite active research in disease-related aggregation, the need for multidisciplinary approaches to the problem is evident. Recent advances in single-molecule fluorescence spectroscopy are valuable for examining heterogenic biomolecular systems. In this work, we have explored the initial stages of amyloidogenic aggregation by employing fluorescence lifetime correlation spectroscopy (FLCS), an advanced modification of conventional fluorescence correlation spectroscopy (FCS) that utilizes time-resolved information. FLCS provides size distributions and kinetics for the oligomer growth of the SH3 domain of α-spectrin, whose N47A mutant forms amyloid fibrils at pH 3.2 and 37 °C in the presence of salt. The combination of FCS with additional fluorescence lifetime information provides an exciting approach to focus on the initial aggregation stages, allowing a better understanding of the fibrillization process, by providing multidimensional information, valuable in combination with other conventional methodologies. PMID:22949804
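The correlation analysis at the heart of FCS can be sketched in a few lines; FLCS additionally applies lifetime-resolved statistical filters to the photon stream, which is not reproduced in this generic illustration on a simulated intensity trace.

```python
# Generic FCS-style autocorrelation sketch:
# G(tau) = <dF(t) dF(t+tau)> / <F>^2 from a simulated intensity trace.
import numpy as np

rng = np.random.default_rng(5)
n = 100_000
# Toy fluorescence trace: slowly varying occupancy plus shot noise.
occupancy = np.convolve(rng.normal(size=n), np.ones(200) / 200, mode="same")
trace = rng.poisson(10 + 3 * occupancy)

mean_f = trace.mean()
delta = trace - mean_f
lags = np.arange(1, 1000, 10)
G = np.array([(delta[:-lag] * delta[lag:]).mean() for lag in lags]) / mean_f**2

print("G at first lags:", np.round(G[:5], 4))
```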
Active link selection for efficient semi-supervised community detection
NASA Astrophysics Data System (ADS)
Yang, Liang; Jin, Di; Wang, Xiao; Cao, Xiaochun
2015-03-01
Several semi-supervised community detection algorithms have been proposed recently to improve the performance of traditional topology-based methods. However, most of them focus on how to integrate supervised information with topology information; few of them pay attention to which information is critical for performance improvement. This leads to large amounts of demand for supervised information, which is expensive or difficult to obtain in most fields. For this problem we propose an active link selection framework, that is we actively select the most uncertain and informative links for human labeling for the efficient utilization of the supervised information. We also disconnect the most likely inter-community edges to further improve the efficiency. Our main idea is that, by connecting uncertain nodes to their community hubs and disconnecting the inter-community edges, one can sharpen the block structure of adjacency matrix more efficiently than randomly labeling links as the existing methods did. Experiments on both synthetic and real networks demonstrate that our new approach significantly outperforms the existing methods in terms of the efficiency of using supervised information. It needs ~13% of the supervised information to achieve a performance similar to that of the original semi-supervised approaches.
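A simplified illustration of the selection idea follows (it is not the authors' algorithm): candidate links are scored by how uncertain their endpoints' community memberships are, using a spectral embedding and soft k-means assignments, and the most uncertain links are queried first. The graph and parameters are placeholders.

```python
# Score existing links by endpoint membership uncertainty and query the top few.
import numpy as np
import networkx as nx
from sklearn.cluster import KMeans
from scipy.special import softmax

G = nx.planted_partition_graph(2, 20, p_in=0.25, p_out=0.15, seed=1)
A = nx.to_numpy_array(G)

# Spectral embedding: leading eigenvectors of the adjacency matrix.
vals, vecs = np.linalg.eigh(A)
emb = vecs[:, -2:]

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(emb)
dist = np.linalg.norm(emb[:, None, :] - km.cluster_centers_[None, :, :], axis=2)
probs = softmax(-dist, axis=1)                      # soft memberships
entropy = -(probs * np.log(probs + 1e-12)).sum(axis=1)

# Rank existing edges by endpoint uncertainty and pick the top queries.
edges = list(G.edges())
scores = [entropy[u] + entropy[v] for u, v in edges]
queries = [edges[i] for i in np.argsort(scores)[::-1][:5]]
print("links to query first:", queries)
```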
NASA Astrophysics Data System (ADS)
Garrote, Luis; Sordo, Alvaro; Iglesias, Ana
2016-04-01
Information is valuable when it improves decision-making (e.g., actions can be adjusted to better suit the situation at hand) and enables the mitigation of damage. However, quantifying the value of information is often difficult. Here we explore a general approach to understanding the economic value of drought information for water managers, framing our approach in the precautionary principle, which reminds us that uncertainty is not a reason to postpone or avoid action. We explore how decision making can disregard uncertain effects, taking a short-term approach and focusing instead on the certain costs and benefits of taking action. Two main questions arise: how do we know that advanced drought information is actually helping decisions, and what is the value of information in the decision process? The approach is applied to several regulated water resources systems in Spain. It first views drought information as a factor in the decision process which can be used by water managers to reduce uncertainty. Second, the value of drought information is the expected gain in a decision outcome (utility) from using additional information. Finally, the gains of improved information are compared with the information collection costs. Here we estimate the value by taking into account the accuracy of the drought information, the subjective probabilities about the value, analyzed as Bayesian probabilities, and the ability or skill of the stakeholders to apply the drought information to modify their actions. Since information may be considered a public good (non-rivalry and non-excludability), it may justify public policy in the provision of information, considering social costs and benefits. The application of the framework to the Spanish case studies shows that information benefits exceed costs when drought frequency is 20-40% above normal values; below these values, uncertainty in the decisions dominates the results; above these values, the management decisions are limited even with perfect information.
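The expected-gain definition of information value can be illustrated with a toy calculation (payoffs, probabilities, and forecast accuracy are invented, not the Spanish case-study values): the value of the forecast is the expected utility of acting on the Bayesian-updated forecast minus the expected utility of acting on the prior alone.

```python
# Toy expected value of (imperfect) drought forecast information.
import numpy as np

p_drought = 0.3          # prior probability of drought
accuracy = 0.8           # P(forecast says drought | drought) = P(no | no)
cost_restrict = -2.0     # utility of restricting supply (always incurred)
loss_unprepared = -10.0  # utility if drought hits with no restriction
utility = np.array([[cost_restrict, cost_restrict],   # action: restrict
                    [0.0, loss_unprepared]])           # action: do nothing
# columns: state = no drought, drought

def best_expected_utility(p):
    exp_u = utility @ np.array([1 - p, p])
    return exp_u.max()

# Without the forecast: act on the prior alone.
eu_prior = best_expected_utility(p_drought)

# With the forecast: average over forecast outcomes, updating by Bayes' rule.
p_say_drought = accuracy * p_drought + (1 - accuracy) * (1 - p_drought)
post_if_yes = accuracy * p_drought / p_say_drought
post_if_no = (1 - accuracy) * p_drought / (1 - p_say_drought)
eu_forecast = (p_say_drought * best_expected_utility(post_if_yes)
               + (1 - p_say_drought) * best_expected_utility(post_if_no))

print("value of the forecast:", round(eu_forecast - eu_prior, 3))
```

Comparing this gain with the cost of producing the forecast gives the cost-benefit test described in the abstract.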
Automating security monitoring and analysis for Space Station Freedom's electric power system
NASA Technical Reports Server (NTRS)
Dolce, James L.; Sobajic, Dejan J.; Pao, Yoh-Han
1990-01-01
Operating a large, space power system requires classifying the system's status and analyzing its security. Conventional algorithms are used by terrestrial electric utilities to provide such information to their dispatchers, but their application aboard Space Station Freedom will consume too much processing time. A new approach for monitoring and analysis using adaptive pattern techniques is presented. This approach yields an on-line security monitoring and analysis algorithm that is accurate and fast; and thus, it can free the Space Station Freedom's power control computers for other tasks.
Hosseini, Masoud; Ahmadi, Maryam; Dixon, Brian E.
2014-01-01
Clinical decision support (CDS) systems can support vaccine forecasting and immunization reminders; however, immunization decision-making requires data from fragmented, independent systems. Interoperability and accurate data exchange between immunization information systems (IIS) are essential for utilizing immunization CDS systems. Service oriented architecture (SOA) and Health Level 7 (HL7) are dominant standards for web-based exchange of clinical information. We implemented a system based on SOA and HL7 v3 to support immunization CDS in Iran. We evaluated system performance by exchanging 1500 immunization records for roughly 400 infants between two IISs. System turnaround time is less than a minute for synchronous operation calls, and the retrieved immunization histories of infants were always identical in the different systems. CDS-generated reports were in accordance with immunization guidelines, and the calculations for next visit times were accurate. Interoperability is rare or nonexistent between IIS. Since inter-state data exchange is rare in the United States, this approach could be a good prototype for achieving interoperability of immunization information. PMID:25954452
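As a hypothetical sketch of the next-visit calculation performed by such a CDS (the schedule intervals below are placeholders, not an official schedule, and the HL7/SOA messaging layer is not shown):

```python
# Compute the next dose due date from an immunization history.
from datetime import date, timedelta

# Assumed minimum intervals (in days) between successive doses per vaccine.
SCHEDULE = {
    "hepB": [0, 30, 150],     # dose 1 at birth, then +30 d, then +150 d
    "pentavalent": [60, 60, 60],
}

def next_dose_due(vaccine, birth_date, doses_given):
    """Return the due date of the next dose, or None if the series is complete."""
    intervals = SCHEDULE[vaccine]
    if len(doses_given) >= len(intervals):
        return None
    anchor = doses_given[-1] if doses_given else birth_date
    return anchor + timedelta(days=intervals[len(doses_given)])

print(next_dose_due("hepB", date(2024, 1, 10), [date(2024, 1, 10)]))
```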
A Random Walk Approach to Query Informative Constraints for Clustering.
Abin, Ahmad Ali
2017-08-09
This paper presents a random walk approach to the problem of querying informative constraints for clustering. The proposed method is based on the properties of the commute time, that is, the expected time taken for a random walk to travel between two nodes and return, on the adjacency graph of the data. Commute time has the nice property that the more short paths connect two given nodes in a graph, the more similar those nodes are. Since computing the commute time takes the Laplacian eigenspectrum into account, we use this property in a recursive fashion to query informative constraints for clustering. At each recursion, the proposed method constructs the adjacency graph of the data and utilizes the spectral properties of the commute time matrix to bipartition the adjacency graph. Thereafter, the proposed method benefits from the commute time distance on the graph to query informative constraints between partitions. This process iterates for each partition until the stop condition becomes true. Experiments on real-world data show the efficiency of the proposed method for constraint selection.
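The commute-time computation underlying this kind of approach is standard: for a graph with Laplacian L and total edge weight vol(G), the commute time between nodes i and j is vol(G) * (L+_ii + L+_jj - 2 L+_ij), where L+ is the Moore-Penrose pseudoinverse of L. A short sketch (illustrative only, not the paper's recursive querying procedure):

```python
# Commute time from the Laplacian pseudoinverse.
import numpy as np
import networkx as nx

G = nx.karate_club_graph()
L = nx.laplacian_matrix(G).toarray().astype(float)
L_pinv = np.linalg.pinv(L)
vol = 2 * G.number_of_edges()

def commute_time(i, j):
    return vol * (L_pinv[i, i] + L_pinv[j, j] - 2 * L_pinv[i, j])

# Nodes joined by many short paths have small commute times.
print(commute_time(0, 1), commute_time(0, 33))
```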
MPEG-7 based video annotation and browsing
NASA Astrophysics Data System (ADS)
Hoeynck, Michael; Auweiler, Thorsten; Wellhausen, Jens
2003-11-01
The huge amount of multimedia data produced worldwide requires annotation in order to enable universal content access and to provide content-based search-and-retrieval functionalities. Since manual video annotation can be time-consuming, automatic annotation systems are required. We review recent approaches to content-based indexing and annotation of videos for different kinds of sports and describe our approach to automatic annotation of equestrian sports videos. We especially concentrate on MPEG-7 based feature extraction and content description, where we apply different visual descriptors for cut detection. Further, we extract the temporal positions of single obstacles on the course by analyzing MPEG-7 edge information. Having determined single shot positions as well as the visual highlights, the information is jointly stored with meta-textual information in an MPEG-7 description scheme. Based on this information, we generate content summaries which can be utilized in a user interface in order to provide content-based access to the video stream, and also for media browsing on a streaming server.
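As a generic stand-in for the cut-detection step (the paper uses MPEG-7 visual descriptors, which are not reproduced here), a simple histogram-difference detector on synthetic grayscale frames looks as follows:

```python
# Histogram-difference shot cut detection on synthetic frames.
import numpy as np

rng = np.random.default_rng(2)
# 60 synthetic frames with an abrupt scene change at frame 30.
frames = [np.clip(rng.normal(80, 10, (120, 160)), 0, 255) for _ in range(30)]
frames += [np.clip(rng.normal(170, 10, (120, 160)), 0, 255) for _ in range(30)]

def hist(frame, bins=32):
    h, _ = np.histogram(frame, bins=bins, range=(0, 255))
    return h / h.sum()

diffs = [np.abs(hist(a) - hist(b)).sum() for a, b in zip(frames, frames[1:])]
threshold = np.mean(diffs) + 3 * np.std(diffs)
cuts = [i + 1 for i, d in enumerate(diffs) if d > threshold]
print("detected cuts at frames:", cuts)
```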
3D shape recovery from image focus using Gabor features
NASA Astrophysics Data System (ADS)
Mahmood, Fahad; Mahmood, Jawad; Zeb, Ayesha; Iqbal, Javaid
2018-04-01
Recovering an accurate and precise depth map from a set of acquired 2-D images of the target object, each having different focus information, is the ultimate goal of 3-D shape recovery. The focus measure algorithm plays an important role in this architecture, as it converts the corresponding color value information into focus information which is then utilized for recovering the depth map. This article introduces Gabor features as a focus measure approach for recovering a depth map from a set of 2-D images. The frequency and orientation representation of Gabor filter features is similar to the human visual system and is normally applied for texture representation. Due to its low computational complexity, sharp focus measure curve, robustness to random noise sources and accuracy, it is considered a superior alternative to most recently proposed 3-D shape recovery approaches. The algorithm is thoroughly investigated on real image sequences and a synthetic image dataset. The efficiency of the proposed scheme is also compared with state-of-the-art 3-D shape recovery approaches. Finally, by means of two global statistical measures, root mean square error and correlation, we claim that this approach, in spite of its simplicity, generates accurate results.
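The shape-from-focus recipe can be sketched with a Gabor-response focus measure (a simplified stand-in for the article's feature set, assuming scikit-image and SciPy are available): for each pixel, the depth index is the frame in the focus stack that maximizes local Gabor energy. The focus stack here is synthetic.

```python
# Shape from focus with a single-Gabor focus measure on a synthetic stack.
import numpy as np
from scipy.ndimage import gaussian_filter, uniform_filter
from skimage.filters import gabor

rng = np.random.default_rng(0)
sharp = rng.random((64, 64))                 # stand-in textured scene
# Synthetic focus stack: frame k is sharpest where the (toy) depth equals k.
depth_true = np.repeat(np.arange(4), 16)[None, :] * np.ones((64, 1), int)
stack = []
for k in range(4):
    blur = gaussian_filter(sharp, sigma=2.0)
    frame = np.where(depth_true == k, sharp, blur)
    stack.append(frame)

def gabor_focus_measure(img):
    """Local energy of a single Gabor filter response, window-averaged."""
    real, imag = gabor(img, frequency=0.25)
    return uniform_filter(real**2 + imag**2, size=9)

focus = np.stack([gabor_focus_measure(f) for f in stack])   # (4, 64, 64)
depth_map = np.argmax(focus, axis=0)
accuracy = (depth_map == depth_true).mean()
print(f"pixels assigned to the correct depth: {accuracy:.0%}")
```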
Proctor, Robert W; Chen, Jing
2015-08-01
The overarching goal is to convey the concept of science of security and the contributions that a scientifically based, human factors approach can make to this interdisciplinary field. Rather than a piecemeal approach to solving cybersecurity problems as they arise, the U.S. government is mounting a systematic effort to develop an approach grounded in science. Because humans play a central role in security measures, research on security-related decisions and actions grounded in principles of human information-processing and decision-making is crucial to this interdisciplinary effort. We describe the science of security and the role that human factors can play in it, and use two examples of research in cybersecurity--detection of phishing attacks and selection of mobile applications--to illustrate the contribution of a scientific, human factors approach. In these research areas, we show that systematic information-processing analyses of the decisions that users make and the actions they take provide a basis for integrating the human component of security science. Human factors specialists should utilize their foundation in the science of applied information processing and decision making to contribute to the science of cybersecurity. © 2015, Human Factors and Ergonomics Society.
Facilitating factors and barriers to malaria research utilization for policy development in Malawi.
Mwendera, Chikondi A; de Jager, Christiaan; Longwe, Herbert; Phiri, Kamija; Hongoro, Charles; Mutero, Clifford M
2016-10-19
Research on various determinants of health is key in providing evidence for policy development, thereby leading to successful interventions. Utilization of research is an intricate process requiring an understanding of contextual factors. The study was conducted to assess enhancing factors and barriers to research utilization for malaria policy development in Malawi. A qualitative research approach was used through in-depth interviews with 39 key informants that included malaria researchers, policy makers, programme managers, and key stakeholders. Purposive sampling and snowballing techniques were used in identifying key informants. Interview transcripts were entered in QSR Nvivo 11 software for coding and analysis. Respondents identified global efforts as key in advancing knowledge translation, while local political will has been conducive to research utilization. Other facilitating factors were the availability of research, the availability of diverse local researchers, and stakeholders supporting knowledge translation. Barriers included lack of platforms for researcher-public engagement, politics, researchers' lack of communication skills, lack of research collaborations, funder-driven research, an unknown World Health Organization policy position, and the lack of a malaria research repository. Overall, the study identified facilitating factors for malaria research utilization in policy development in Malawi. These factors need to be systematically coordinated to address the identified barriers and improve malaria research utilization in policy development. Malaria research can be key in the implementation of evidence-based interventions to reduce the malaria burden and assist in the paradigm shift from malaria control to elimination in Malawi.
ERIC Educational Resources Information Center
Ennis, Robin Parks
2016-01-01
Students with emotional and behavioral disorders (EBD) often struggle to be effective writers. Self-regulated strategy development (SRSD) is one approach to writing instruction that has demonstrated success for students with EBD. However, there is little research exploring its utility to teach writing to students with EBD in social studies. The…
USDA-ARS?s Scientific Manuscript database
The Texas Childhood Obesity Research Demonstration project (TX CORD) uses a systems-oriented approach to address obesity that includes individual and family interventions, community-level action, as well as environmental and policy initiatives. Given that randomization is seldom possible in communit...
ERIC Educational Resources Information Center
White, Robert; Taylor, Shirley
2002-01-01
The British model of nurses as finders, appraisers, and users of research in practice is unattainable, given the technical complexity of research and the skills and time required. Clinical governance mechanisms and accountability demands further undermine the approach. An alternative is development of nursing research specialists and…
ERIC Educational Resources Information Center
Štefaniková, Sona; Prokop, Pavol
2015-01-01
The popularity of science education is decreasing in certain parts of the world and negative attitudes toward science are common in learners from various cultures. Learners' interest in science and the effectiveness of their memory can be enhanced by utilizing modern concepts of an evolutionary-based approach in psychology. Survival-relevant…
"Tell Me a Story": The Use of Narrative as a Learning Tool for Natural Selection
ERIC Educational Resources Information Center
Prins, Renate; Avraamidou, Lucy; Goedhart, Martin
2017-01-01
Grounded within literature pointing to the value of narrative in communicating scientific information, the purpose of this study was to examine the use of stories as a tool for teaching about natural selection in the context of school science. The study utilizes a mixed method, case study approach which focuses on the design, implementation, and…
A Tools-Based Approach to Teaching Data Mining Methods
ERIC Educational Resources Information Center
Jafar, Musa J.
2010-01-01
Data mining is an emerging field of study in Information Systems programs. Although the course content has been streamlined, the underlying technology is still in a state of flux. The purpose of this paper is to describe how we utilized Microsoft Excel's data mining add-ins as a front-end to Microsoft's Cloud Computing and SQL Server 2008 Business…
Eini C. Lowell; Dennis R. Becker; Robert Rummer; Debra Larson; Linda Wadleigh
2008-01-01
This research provides an important step in the conceptualization and development of an integrated wildfire fuels reduction system from silvicultural prescription, through stem selection, harvesting, in-woods processing, transport, and market selection. Decisions made at each functional step are informed by knowledge about subsequent functions. Data on the resource...
A workflow learning model to improve geovisual analytics utility
Roth, Robert E; MacEachren, Alan M; McCabe, Craig A
2011-01-01
Introduction This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. Objectives The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use?) and utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important, if not more so, than iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. Methodology The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on the concept of scientific workflows. Second, we implemented an interface in the G-EX Portal Learn Module to demonstrate the workflow learning model. The workflow interface allows users to drag learning artifacts uploaded to the G-EX Portal onto a central whiteboard and then annotate the workflow using text and drawing tools. Once completed, users can visit the assembled workflow to get an idea of the kind, number, and scale of analysis steps, view individual learning artifacts associated with each node in the workflow, and ask questions about the overall workflow or individual learning artifacts through the associated forums. An example learning workflow in the domain of epidemiology is provided to demonstrate the effectiveness of the approach. Results/Conclusions In the context of geovisual analytics, GIScientists are not only responsible for developing software to facilitate visually-mediated reasoning about large and complex spatiotemporal information, but also for ensuring that this software works. The workflow learning model discussed in this paper and demonstrated in the G-EX Portal Learn Module is one approach to improving the utility of geovisual analytics software. While development of the G-EX Portal Learn Module is ongoing, we expect to release the G-EX Portal Learn Module by Summer 2009. PMID:21983545
Collective coupling in hybrid superconducting circuits
NASA Astrophysics Data System (ADS)
Saito, Shiro
Hybrid quantum systems utilizing superconducting circuits have attracted significant recent attention, not only for quantum information processing tasks but also as a way to explore fundamentally new physics regimes. In this talk, I will discuss two superconducting circuit based hybrid quantum system approaches. The first is a superconducting flux qubit - electron spin ensemble hybrid system in which quantum information manipulated in the flux qubit can be transferred to, stored in and retrieved from the ensemble. Although the coherence time of the ensemble is short, about 20 ns, this is a significant first step to utilize the spin ensemble as quantum memory for superconducting flux qubits. The second approach is a superconducting resonator - flux qubit ensemble hybrid system in which we fabricated a superconducting LC resonator coupled to a large ensemble of flux qubits. Here we observed a dispersive frequency shift of approximately 250 MHz in the resonator's transmission spectrum. This indicates that thousands of flux qubits are coupling to the resonator collectively. Although we need to improve our qubits' inhomogeneity, our system has many potential uses including the creation of new quantum metamaterials, novel applications in quantum metrology and so on. This work was partially supported by JSPS KAKENHI Grant Number 25220601.
Nishio, Shin-Ya; Usami, Shin-Ichi
2017-03-01
Recent advances in next-generation sequencing (NGS) have given rise to new challenges due to the difficulties in variant pathogenicity interpretation and large dataset management, including many kinds of public population databases as well as public or commercial disease-specific databases. Here, we report a new database development tool, named the "Clinical NGS Database," for improving clinical NGS workflow through the unified management of variant information and clinical information. This database software offers a two-feature approach to variant pathogenicity classification. The first of these approaches is a phenotype similarity-based approach. This database allows the easy comparison of the detailed phenotype of each patient with the average phenotype of the same gene mutation at the variant or gene level. It is also possible to browse patients with the same gene mutation quickly. The other approach is a statistical approach to variant pathogenicity classification based on the use of the odds ratio for comparisons between the case and the control for each inheritance mode (families with apparently autosomal dominant inheritance vs. control, and families with apparently autosomal recessive inheritance vs. control). A number of case studies are also presented to illustrate the utility of this database. © 2016 The Authors. Human Mutation published by Wiley Periodicals, Inc.
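The statistical classification described above hinges on an odds ratio comparing case families (per inheritance mode) against controls. The sketch below is a hedged illustration of that idea rather than the database's actual implementation; the counts, threshold of significance, and function name are invented for the example.

```python
from scipy.stats import fisher_exact

def variant_odds_ratio(case_carriers, case_total, control_carriers, control_total):
    """Return (odds ratio, p-value) for a 2x2 carrier/non-carrier table."""
    table = [
        [case_carriers, case_total - case_carriers],
        [control_carriers, control_total - control_carriers],
    ]
    return fisher_exact(table)

# Illustrative counts: a variant in 6 of 40 dominant-inheritance families and 3 of 500 controls.
odds, p = variant_odds_ratio(6, 40, 3, 500)
print(f"AD families vs controls: OR={odds:.1f}, p={p:.2e}")
```

A high odds ratio in the dominant comparison but not the recessive one (or vice versa) would then support pathogenicity under the corresponding inheritance model.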
A Novel Group Decision-Making Method Based on Sensor Data and Fuzzy Information.
Bai, Yu-Ting; Zhang, Bai-Hai; Wang, Xiao-Yi; Jin, Xue-Bo; Xu, Ji-Ping; Su, Ting-Li; Wang, Zhao-Yang
2016-10-28
Algal bloom is a typical phenomenon of the eutrophication of rivers and lakes and makes the water dirty and smelly. It is a serious threat to water security and public health. Most scholars studying solutions for this pollution have studied the principles of remediation approaches, but few have studied the decision-making and selection of the approaches. Existing research uses simplex decision-making information which is highly subjective and uses little of the data from water quality sensors. To utilize these data and solve the rational decision-making problem, a novel group decision-making method is proposed using the sensor data with fuzzy evaluation information. Firstly, the optimal similarity aggregation model of group opinions is built based on the modified similarity measurement of Vague values. Secondly, the approaches' ability to improve the water quality indexes is expressed using Vague evaluation methods. Thirdly, the water quality sensor data are analyzed to match the features of the alternative approaches with grey relational degrees. This allows the best remediation approach to be selected to meet the current water status. Finally, the selection model is applied to the remediation of algal bloom in lakes. The results show this method's rationality and feasibility when using different data from different sources.
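The step that matches sensor data to candidate remediation approaches rests on grey relational degrees. The following is a minimal sketch of one common grey relational analysis formulation, assuming normalized water-quality profiles; the vectors and the distinguishing coefficient rho = 0.5 are illustrative choices, not values from the paper.

```python
import numpy as np

def grey_relational_degree(reference, alternatives, rho=0.5):
    """reference: (n,) sensor-derived profile; alternatives: (m, n) approach feature profiles."""
    diff = np.abs(alternatives - reference)                  # absolute difference sequences
    d_min, d_max = diff.min(), diff.max()
    coeff = (d_min + rho * d_max) / (diff + rho * d_max)     # grey relational coefficients
    return coeff.mean(axis=1)                                # one degree per alternative

reference = np.array([0.8, 0.4, 0.6, 0.7])                   # current water status (normalized)
alternatives = np.array([[0.7, 0.5, 0.6, 0.6],               # remediation approach A
                         [0.3, 0.9, 0.2, 0.8],               # remediation approach B
                         [0.8, 0.4, 0.7, 0.7]])              # remediation approach C
degrees = grey_relational_degree(reference, alternatives)
print("best-matching approach:", int(np.argmax(degrees)), degrees)
```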
Proulx, Michael J.; Gwinnutt, James; Dell’Erba, Sara; Levy-Tzedek, Shelly; de Sousa, Alexandra A.; Brown, David J.
2015-01-01
Vision is the dominant sense for perception-for-action in humans and other higher primates. Advances in sight restoration now utilize the other intact senses to provide information that is normally sensed visually through sensory substitution to replace missing visual information. Sensory substitution devices translate visual information from a sensor, such as a camera or ultrasound device, into a format that the auditory or tactile systems can detect and process, so the visually impaired can see through hearing or touch. Online control of action is essential for many daily tasks such as pointing, grasping and navigating, and adapting to a sensory substitution device successfully requires extensive learning. Here we review the research on sensory substitution for vision restoration in the context of providing the means of online control for action in the blind or blindfolded. It appears that the use of sensory substitution devices utilizes the neural visual system; this suggests the hypothesis that sensory substitution draws on the same underlying mechanisms as unimpaired visual control of action. Here we review the current state of the art for sensory substitution approaches to object recognition, localization, and navigation, and the potential these approaches have for revealing a metamodal behavioral and neural basis for the online control of action. PMID:26599473
Quantifying the costs and benefits of privacy-preserving health data publishing.
Khokhar, Rashid Hussain; Chen, Rui; Fung, Benjamin C M; Lui, Siu Man
2014-08-01
Cost-benefit analysis is a prerequisite for making good business decisions. In the business environment, companies intend to make profit from maximizing information utility of published data while having an obligation to protect individual privacy. In this paper, we quantify the trade-off between privacy and data utility in health data publishing in terms of monetary value. We propose an analytical cost model that can help health information custodians (HICs) make better decisions about sharing person-specific health data with other parties. We examine relevant cost factors associated with the value of anonymized data and the possible damage cost due to potential privacy breaches. Our model guides an HIC to find the optimal value of publishing health data and could be utilized for both perturbative and non-perturbative anonymization techniques. We show that our approach can identify the optimal value for different privacy models, including K-anonymity, LKC-privacy, and ∊-differential privacy, under various anonymization algorithms and privacy parameters through extensive experiments on real-life data. Copyright © 2014 Elsevier Inc. All rights reserved.
A Parametric Geometry Computational Fluid Dynamics (CFD) Study Utilizing Design of Experiments (DOE)
NASA Technical Reports Server (NTRS)
Rhew, Ray D.; Parker, Peter A.
2007-01-01
Design of Experiments (DOE) techniques were applied to the Launch Abort System (LAS) of the NASA Crew Exploration Vehicle (CEV) parametric geometry Computational Fluid Dynamics (CFD) study to efficiently identify and rank the primary contributors to the integrated drag over the vehicle's ascent trajectory. Typical approaches to these types of activities involve developing all possible combinations of geometries changing one variable at a time, analyzing them with CFD, and predicting the main effects on an aerodynamic parameter, which in this application is integrated drag. The original plan for the LAS study team was to generate and analyze more than 1,000 geometry configurations to study 7 geometric parameters. By utilizing DOE techniques the number of geometries was strategically reduced to 84. In addition, critical information on interaction effects among the geometric factors was identified that would not have been possible with the traditional technique. Therefore, the study was performed in less time and provided more information on the geometric main effects and interactions impacting drag generated by the LAS. This paper discusses the methods utilized to develop the experimental design, execution, and data analysis.
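To illustrate how a designed experiment shrinks the run count, the sketch below builds a generic two-level 2^(7-4) fractional factorial for seven factors in eight runs by aliasing the extra factors with interactions of the base factors. This is not the 84-run design used in the LAS study, only a hedged illustration of the principle.

```python
import itertools
import numpy as np

base = np.array(list(itertools.product([-1, 1], repeat=3)))   # full factorial in base factors A, B, C
A, B, C = base[:, 0], base[:, 1], base[:, 2]
design = np.column_stack([A, B, C,
                          A * B,        # D aliased with AB
                          A * C,        # E aliased with AC
                          B * C,        # F aliased with BC
                          A * B * C])   # G aliased with ABC
print(design.shape)   # (8, 7): 8 runs cover 7 two-level factors instead of 2**7 = 128
```

The price of the reduction is confounding: with fewer runs, some interaction effects cannot be separated from main effects, which is why the choice of design resolution matters.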
Ore minerals textural characterization by hyperspectral imaging
NASA Astrophysics Data System (ADS)
Bonifazi, Giuseppe; Picone, Nicoletta; Serranti, Silvia
2013-02-01
The utilization of hyperspectral detection devices for natural resources mapping/exploitation through remote sensing techniques dates back to the early 1970s. From the first devices utilizing a one-dimensional profile spectrometer, HyperSpectral Imaging (HSI) devices have been developed. Thus, from specific customized devices, originally developed by governmental agencies (e.g. NASA, specialized research labs, etc.), a great deal of HSI-based equipment is today available at the commercial level. Parallel to this huge increase in the development and manufacturing of hyperspectral systems aimed at airborne applications, a strong increase also occurred in the development of HSI-based devices for "ground" utilization, that is, sensing units able to operate inside a laboratory, a processing plant and/or in an open field. Thanks to this diffusion, more and more applications have been developed and tested in recent years, including in the materials sector. Such an approach, when successful, is quite challenging, being usually reliable, robust and characterised by lower costs compared with those usually associated with commonly applied off-line and/or on-line analytical approaches. In this paper such an approach is presented with reference to ore minerals characterization. According to the different phases and stages of ore minerals and products characterization, and starting from the analyses of the detected hyperspectral signatures, it is possible to derive useful information about mineral flow stream properties and their physical-chemical attributes. This last aspect can be utilized to define innovative process mineralogy strategies and to implement on-line procedures at the processing level. The present study discusses the effects related to the adoption of different hardware configurations, the utilization of different logics to perform the analysis and the selection of different algorithms according to the different characterization, inspection and quality control actions to apply.
Burken, J.G.; Vroblesky, D.A.; Balouet, J.-C.
2011-01-01
As plants evolved to be extremely proficient in mass transfer with their surroundings and survive as Earth's dominant biomass, they also accumulate and store some contaminants from their surroundings, acting as passive samplers. Novel applications and analytical methods have been utilized to gain information about a wide range of contaminants in the biosphere (soil, water, and air), with information available on both the past (dendrochemistry) and the present (phytoscreening). Collectively these sampling approaches provide rapid, cheap, ecologically friendly, and overall "green" tools termed "Phytoforensics". © 2011 American Chemical Society.
Lane Level Localization; Using Images and HD Maps to Mitigate the Lateral Error
NASA Astrophysics Data System (ADS)
Hosseinyalamdary, S.; Peter, M.
2017-05-01
In urban canyons, where GNSS signals are blocked by buildings, the accuracy of the measured position significantly deteriorates. GIS databases have frequently been utilized to improve the accuracy of the measured position using map matching approaches. In map matching, the measured position is projected onto the road links (centerlines) and the lateral error of the measured position is reduced. With the advancement of data acquisition approaches, high-definition maps which contain extra information, such as road lanes, are generated. These road lanes can be utilized to mitigate the positional error and improve the accuracy of the position. In this paper, the image content of a camera mounted on the platform is utilized to detect the road boundaries in the image. We apply color masks to detect the road marks, apply the Hough transform to fit lines to the left and right road boundaries, find the corresponding road segment in the GIS database, estimate the homography transformation between the global and image coordinates of the road boundaries, and estimate the camera pose with respect to the global coordinate system. The proposed approach is evaluated on a benchmark. The position is measured by a smartphone's GPS receiver, images are taken with the smartphone's camera, and the ground truth is provided using the Real-Time Kinematic (RTK) technique. Results show the proposed approach significantly improves the accuracy of the measured GPS position. The error in the measured GPS position, with average and standard deviation of 11.323 and 11.418 meters, is reduced to an error in the estimated position with average and standard deviation of 6.725 and 5.899 meters.
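A hedged sketch of the image-processing steps described above follows: a color mask isolates road marks, a probabilistic Hough transform fits boundary lines, and a homography is estimated from assumed image-to-map point correspondences. The file name, color thresholds, and correspondences are placeholders, not values from the paper.

```python
import cv2
import numpy as np

img = cv2.imread("frame.jpg")                                # placeholder file name
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
mask = cv2.inRange(hsv, (0, 0, 180), (180, 40, 255))         # whitish road marks
edges = cv2.Canny(mask, 50, 150)
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=60,
                        minLineLength=80, maxLineGap=20)     # left/right boundary candidates

# With boundary points matched to the HD-map (global) coordinates of the same
# road segment, a homography relates image and ground coordinates.
img_pts = np.array([[100, 700], [480, 420], [1180, 700], [800, 420]], dtype=np.float32)
map_pts = np.array([[0.0, 0.0], [0.0, 30.0], [3.5, 0.0], [3.5, 30.0]], dtype=np.float32)
H, _ = cv2.findHomography(img_pts, map_pts)
print(lines is not None, H)
```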
2014-01-01
Background Heterologous gene expression is an important tool for synthetic biology that enables metabolic engineering and the production of non-natural biologics in a variety of host organisms. The translational efficiency of heterologous genes can often be improved by optimizing synonymous codon usage to better match the host organism. However, traditional approaches for optimization neglect to take into account many factors known to influence synonymous codon distributions. Results Here we define an alternative approach for codon optimization that utilizes systems level information and codon context for the condition under which heterologous genes are being expressed. Furthermore, we utilize a probabilistic algorithm to generate multiple variants of a given gene. We demonstrate improved translational efficiency using this condition-specific codon optimization approach with two heterologous genes, the fluorescent protein-encoding eGFP and the catechol 1,2-dioxygenase gene CatA, expressed in S. cerevisiae. For the latter case, optimization for stationary phase production resulted in nearly 2.9-fold improvements over commercial gene optimization algorithms. Conclusions Codon optimization is now often a standard tool for protein expression, and while a variety of tools and approaches have been developed, they do not guarantee improved performance for all hosts or applications. Here, we suggest an alternative method for condition-specific codon optimization and demonstrate its utility in Saccharomyces cerevisiae as a proof of concept. However, this technique should be applicable to any organism for which gene expression data can be generated and is thus of potential interest for a variety of applications in metabolic and cellular engineering. PMID:24636000
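The probabilistic variant generation can be pictured as sampling synonymous codons in proportion to condition-specific usage weights. The sketch below assumes a toy weight table; it is not the authors' algorithm or measured S. cerevisiae data, only an illustration of the sampling idea.

```python
import random

# Toy condition-specific usage weights (illustrative, not measured data).
condition_codon_weights = {
    "M": {"ATG": 1.0},
    "K": {"AAA": 0.58, "AAG": 0.42},
    "F": {"TTT": 0.41, "TTC": 0.59},
}

def sample_variant(protein, weights, seed=None):
    """Sample one synonymous-codon variant of a protein sequence."""
    rng = random.Random(seed)
    codons = []
    for aa in protein:
        table = weights[aa]
        codons.append(rng.choices(list(table), weights=list(table.values()), k=1)[0])
    return "".join(codons)

# Generate three candidate gene variants of a toy peptide.
for s in range(3):
    print(sample_variant("MKF", condition_codon_weights, seed=s))
```

Because the sampling is stochastic, repeated runs yield a family of sequence variants that all respect the target codon distribution, which is the property the paper exploits.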
Providing services to trafficking survivors: Understanding practices across the globe.
Steiner, Jordan J; Kynn, Jamie; Stylianou, Amanda M; Postmus, Judy L
2018-01-01
Human trafficking is a global issue, with survivors representing all genders, ages, races, ethnicities, religions, and countries. However, little research exists that identifies effective practices in supporting survivors of human trafficking. The research that does exist is Western-centric. To fill this gap in the literature, the goal of this research was to understand practices used throughout the globe with adult human trafficking survivors. A qualitative approach was utilized. Providers from 26 countries, across six different continents, were interviewed to allow for a comprehensive and multi-faceted understanding of practices in working with survivors. Participants identified utilizing an empowerment-based, survivor, and human life-centered approach to working with survivors, emphasized the importance of engaging in community level interventions, and highlighted the importance of government recognition of human trafficking. Findings provide information from the perspective of advocates on best practices in the field that can be used by agencies to enhance human trafficking programming.
An approach to investigating linkage for bipolar disorder using large Costa Rican pedigrees
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freimer, N.B.; Reus, V.I.; Vinogradov, S.
1996-05-31
Despite the evidence that major gene effects exist for bipolar disorder (BP), efforts to map BP loci have so far been unsuccessful. A strategy for mapping BP loci is described, focused on investigation of large pedigrees from a genetically homogenous population, that of Costa Rica. This approach is based on the use of a conservative definition of the BP phenotype in preparation for whole genome screening with polymorphic markers. Linkage simulation analyses are utilized to indicate the probability of detecting evidence suggestive of linkage, using these pedigrees. These analyses are performed under a series of single locus models, ranging from recessive to nearly dominant, utilizing both lod score and affected pedigree member analyses. Additional calculations demonstrate that with any of the models employed, most of the information for linkage derives from affected rather than unaffected individuals. 26 refs., 2 figs., 5 tabs.
Building a foundation for continued dialogue between climate science and water resource communities
NASA Astrophysics Data System (ADS)
Vano, J. A.; Arnold, J.; Clark, M. P.; Gutmann, E. D.; Hamman, J.; Nijssen, B.; Wood, A.
2017-12-01
Research into climate change has led to the development of many global climate models, downscaling techniques, and impacts models. This proliferation of information has resulted in insights into how climate change will impact hydrology that are more robust than any single approach, which is helpful for advancing the science. However, the variety of approaches makes navigating what information to use in water resource planning and management challenging. Each technique has strengths and weaknesses and associated uncertainties, and approaches are always being updated. Here we provide user-focused, modularly framed guidance that is designed to be expandable and to allow targeted updates. This guidance includes dos and don'ts for using climate change information in water resource planning and management, written so it can be read at multiple levels. It can provide context for those seeking to understand the general need, opportunities, and challenges of including climate change information. It also provides details (frequently asked questions and examples) and direction to further guidance and resources for those engaged in the technical work. This guidance is intended to provide a foundation for continued dialogue within and between the climate science and application communities, to increase the utility and appropriate use of climate change information.
Yeung, Kai; Basu, Anirban; Hansen, Ryan N.; Watkins, John B.; Sullivan, Sean D.
2016-01-01
Background Value-based benefit design has been suggested as an effective approach to managing the high cost of pharmaceuticals in health insurance markets. Premera Blue Cross, a large regional health plan, implemented a Value-Based Formulary (VBF) for pharmaceuticals in 2010 that explicitly used cost-effectiveness analysis (CEA) to inform medication copayments. Objective To determine the impact of the VBF. Design Interrupted time-series of employer-sponsored plans from 2006 to 2013. Subjects Intervention group: 5,235 beneficiaries exposed to the VBF. Control group: 11,171 beneficiaries in plans without any changes in pharmacy benefits. Intervention The VBF assigned medications with lower value (estimated by CEA) to higher copayment tiers and assigned medications with higher value to lower copayment tiers. Measures Primary outcome was medication expenditures from member, health plan, and member plus health plan perspectives. Secondary outcomes were medication utilization, emergency department visits, hospitalizations, office visits, and non-medication expenditures. Results In the intervention group after VBF implementation, member medication expenditures increased by $2 per member per month (PMPM) (95% CI, $1 to $3) or 9%, while health plan medication expenditures decreased by $10 PMPM (CI, $18 to $2) or 16%, resulting in a net decrease of $8 PMPM (CI, $15 to $2) or 10%, which translates to a net savings of $1.1 million. Utilization of medications moved into lower copayment tiers increased by 1.95 days’ supply (CI, 1.29 to 2.62) or 17%. Total medication utilization, health services utilization and non-medication expenditures did not change. Conclusions Cost-sharing informed by CEA reduced overall medication expenditures without negatively impacting medication utilization, health services utilization or non-medication expenditures. PMID:27579915
Tinago, Chiwoneso B; Annang Ingram, Lucy; Blake, Christine E; Frongillo, Edward A
2017-07-01
Micronutrient deficiencies are prevalent among Zimbabweans with serious health and social implications. Due to the lack of a national micronutrient food fortification policy, the Zimbabwe Ministry of Health and Child Care established a policy for the prevention of maternal micronutrient deficiencies, which centres on pregnant women receiving daily iron and folic acid (IFA) at their first antenatal care visit and throughout pregnancy. Despite these efforts, utilization of IFA supplementation in pregnancy in Zimbabwe is low. This study aimed to understand the experiences and knowledge of IFA supplementation among pregnant women and healthcare workers in Harare, Zimbabwe, and the influence of health-service and social environments on utilization. Semi-structured in-depth interviews were conducted in Shona and English, with pregnant women (n = 24) and healthcare workers (n = 14) providing direct antenatal care services to pregnant women in two high-density community clinics. Data were analysed thematically using NVivo 10. Influences on utilization were at the individual and structural environmental levels. Reasons for low utilization of IFA supplementation included forgetting to take IFA, side effects, misconceptions about IFA, limited access to nutrition information, delayed entry or non-uptake of antenatal care and social norms of pregnant women for IFA supplementation. Utilization was enhanced by knowledge of risks and benefits of supplementation, fear of negative health complications with non-utilization, family support and healthcare worker recommendation for supplementation. Study findings can inform approaches to strengthen micronutrient supplementation utilization to improve the micronutrient status of pregnant women to decrease maternal mortality and improve overall maternal and child health in Zimbabwe. © 2016 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Budi Harja, Herman; Prakosa, Tri; Raharno, Sri; Yuwana Martawirya, Yatna; Nurhadi, Indra; Setyo Nogroho, Alamsyah
2018-03-01
In job-shop production, products have wide variety but small volumes, so every machine tool is shared across production processes with dynamic loads. This dynamic operating condition directly affects the reliability of machine tool components. Hence, the maintenance schedule for every component should be calculated based on the actual usage of that component. This paper describes a study on the development of a monitoring system for obtaining real-time information about the usage of each CNC machine tool component, approached by grouping components based on their operation phase. A special device has been developed for monitoring machine tool component usage by utilizing usage-phase activity data taken from certain electronic components within the CNC machine, namely the adaptor, servo driver and spindle driver, together with additional components such as a microcontroller and relays. The obtained data are utilized to detect machine utilization phases such as the power-on state, machine-ready state or spindle-running state. Experimental results have shown that the developed CNC machine tool monitoring system is capable of obtaining phase information on machine tool usage, as well as its duration, and displays the information in the user interface application.
Student approaches for learning in medicine: What does it tell us about the informal curriculum?
2011-01-01
Background It has long been acknowledged that medical students frequently focus their learning on that which will enable them to pass examinations, and that they use a range of study approaches and resources in preparing for their examinations. A recent qualitative study identified that in addition to the formal curriculum, students are using a range of resources and study strategies which could be attributed to the informal curriculum. What is not clearly established is the extent to which these informal learning resources and strategies are utilized by medical students. The aim of this study was to establish the extent to which students in a graduate-entry medical program use various learning approaches to assist their learning and preparation for examinations, apart from those resources offered as part of the formal curriculum. Methods A validated survey instrument was administered to 522 medical students. Factor analysis and internal consistency, descriptive analysis and comparisons with demographic variables were completed. The factor analysis identified eight scales with acceptable levels of internal consistency with an alpha coefficient between 0.72 and 0.96. Results Nearly 80% of the students reported that they were overwhelmed by the amount of work that was perceived necessary to complete the formal curriculum, with 74.3% believing that the informal learning approaches helped them pass the examinations. 61.3% believed that these approaches prepared them to be good doctors. A variety of informal learning activities utilized by students included using past student notes (85.8%) and PBL tutor guides (62.7%), and being part of self-organised study groups (62.6%), and peer-led tutorials (60.2%). Almost all students accessed the formal school resources for at least 10% of their study time. Students in the first year of the program were more likely to rely on the formal curriculum resources compared to those of Year 2 (p = 0.008). Conclusions Curriculum planners should examine the level of use of informal learning activities in their schools, and investigate whether this is to enhance student progress, a result of perceived weakness in the delivery and effectiveness of formal resources, or to overcome anxiety about the volume of work expected by medical programs. PMID:22013994
NASA Astrophysics Data System (ADS)
Abo-Ezz, E. R.; Essa, K. S.
2016-04-01
A new linear least-squares approach is proposed to interpret magnetic anomalies of buried structures by using a new magnetic anomaly formula. This approach depends on solving different sets of algebraic linear equations in order to invert the depth (z), amplitude coefficient (K), and magnetization angle (θ) of buried structures using magnetic data. The utility and validity of the newly proposed approach have been demonstrated through various reliable synthetic data sets with and without noise. In addition, the method has been applied to field data sets from the USA and India. The best-fitted anomaly has been delineated by estimating the root-mean-square (rms) error. The adequacy of this approach is judged by comparing the obtained results with other available geological or geophysical information.
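The general pattern of such inversions can be sketched as follows: for each trial depth the anomaly formula becomes linear in two coefficients, which are solved by least squares, and the depth with the smallest rms misfit is retained; K and θ then follow from the fitted coefficients. The kernel below is a generic stand-in, not the paper's new anomaly formula, and the whole routine is only an assumed illustration of the idea.

```python
import numpy as np

def basis(x, z):
    """Placeholder shape functions that are linear in the unknown coefficients."""
    r2 = x**2 + z**2
    return np.column_stack([z / r2**1.5, x / r2**1.5])

def invert(x, anomaly, trial_depths):
    best = None
    for z in trial_depths:
        A = basis(x, z)
        coef, *_ = np.linalg.lstsq(A, anomaly, rcond=None)   # linear solve at fixed depth
        rms = np.sqrt(np.mean((A @ coef - anomaly) ** 2))
        if best is None or rms < best[3]:
            best = (z, coef[0], coef[1], rms)
    z, c1, c2, rms = best
    K = np.hypot(c1, c2)                     # amplitude coefficient
    theta = np.degrees(np.arctan2(c2, c1))   # effective magnetization angle
    return z, K, theta, rms

# Synthetic test: generate an anomaly at depth 5 and recover it.
x = np.linspace(-50, 50, 201)
anomaly = basis(x, 5.0) @ np.array([80.0, 35.0])
print(invert(x, anomaly, trial_depths=np.arange(1.0, 10.5, 0.5)))
```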
Autonomous onboard crew operations: A review and developmental approach
NASA Technical Reports Server (NTRS)
Rogers, J. G.
1982-01-01
A review of the literature generated by an intercenter mission approach and consolidation team and their contractors was performed to obtain background information on the development of autonomous operations concepts for future space shuttle and space platform missions. The Boeing 757/767 flight management system was examined to determine the relevance for transfer of the developmental approach and technology to the performance of the crew operations function. Specifically, the engine indications and crew alerting system was studied to determine the relevance of this display for the performance of crew operations onboard the vehicle. It was concluded that the developmental approach and technology utilized in the aeronautics industry would be appropriate for development of an autonomous operations concept for the space platform.
Clerehan, Rosemary; Hirsh, Di; Buchbinder, Rachelle
2009-01-01
While clinicians may routinely use patient information leaflets about drug therapy, a poorly conceived leaflet has the potential to do harm. We previously developed a novel approach to analysing leaflets about a rheumatoid arthritis drug, using an analytic approach based on systemic functional linguistics. The aim of the present study was to verify the validity of the linguistic framework by applying it to two further arthritis drug leaflets. The findings confirmed the applicability of the framework and were used to refine it. A new stage or 'move' in the genre was identified. While the function of many of the moves appeared to be 'to instruct' the patient, the instruction was often unclear. The role relationships expressed in the text were critical to the meaning. As with our previous study, judged on their lexical density, the leaflets resembled academic text. The framework can provide specific tools to assess and produce medication information leaflets to support readers in taking medication. Future work could utilize the framework to evaluate information on other treatments and procedures or on healthcare information more widely.
Community Detection Algorithm Combining Stochastic Block Model and Attribute Data Clustering
NASA Astrophysics Data System (ADS)
Kataoka, Shun; Kobayashi, Takuto; Yasuda, Muneki; Tanaka, Kazuyuki
2016-11-01
We propose a new algorithm to detect the community structure in a network that utilizes both the network structure and vertex attribute data. Suppose we have the network structure together with the vertex attribute data, that is, the information assigned to each vertex associated with the community to which it belongs. The problem addressed in this paper is the detection of the community structure from the information of both the network structure and the vertex attribute data. Our approach is based on a Bayesian formulation that models the posterior probability distribution of the community labels. The detection of the community structure in our method is achieved by using belief propagation and an EM algorithm. We numerically verified the performance of our method using computer-generated networks and real-world networks.
Image detection and compression for memory efficient system analysis
NASA Astrophysics Data System (ADS)
Bayraktar, Mustafa
2015-02-01
Advances in digital signal processing have been progressing towards efficient use of memory and processing. Both factors can be exploited by feasible image-storage techniques that compute the minimum information of an image, which enhances computation in later processes. The Scale Invariant Feature Transform (SIFT) can be utilized to estimate and retrieve an image. In computer vision, SIFT can be implemented to recognize an image by comparing its key features against saved SIFT keypoint descriptors. The main advantage of SIFT is that it not only removes redundant information from an image but also reduces the key points by matching their orientations and aggregating them across different windows of the image [1]. Another key property of this approach is that it works more efficiently on highly contrasted images because its design is based on collecting key points from the contrast shades of the image.
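A minimal sketch of the recognition step described above: extract SIFT keypoint descriptors once, store them, and later match a query image against the saved descriptors with Lowe's ratio test. The file names are placeholders, and an OpenCV build that includes SIFT (opencv-python 4.4 or later) is assumed.

```python
import cv2

sift = cv2.SIFT_create()
bf = cv2.BFMatcher(cv2.NORM_L2)

stored = cv2.imread("stored.png", cv2.IMREAD_GRAYSCALE)   # previously saved reference image
query = cv2.imread("query.png", cv2.IMREAD_GRAYSCALE)     # new image to recognize

_, des_stored = sift.detectAndCompute(stored, None)
_, des_query = sift.detectAndCompute(query, None)

# Lowe's ratio test keeps only distinctive matches between query and stored descriptors.
matches = bf.knnMatch(des_query, des_stored, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]
print(f"{len(good)} good matches")
```

In a storage-constrained setting, only the descriptor arrays (rather than the full images) would need to be retained, which is the memory saving the abstract alludes to.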
RPMIS: The Roswell Park Management Information System
Priore, R.L.; Lane, W.W.; Edgerton, F.T.; Naeher, C.H.; Reese, P.A.
1978-01-01
This paper presents a generalized approach to data entry and editing utilizing formatted video computer terminals. The purpose of the system developed is to facilitate the creation of many small data bases, with a minimum of implementation time, while maintaining extensive editing capability and preserving ease of use by data entry personnel. RPMIS has demonstrated its utility in shortening the time between research activities and clinical application of results. The system allows entry and retrieval of overlapping subsets of the patient's record in an order and format most appropriate to the individual application. It is used for production of synoptic presentations of information from the labs, the ward and the clinic. RPMIS was designed for the clinical trials setting and has been well received and implemented for numerous such studies. Additional uses have included several registries, screening clinics, retrospective studies, and epidemiologic investigations. The system has found fortuitous use in maintaining curriculum vitae, publications lists and continuing medical education credits.
Geothermal reservoir simulation of hot sedimentary aquifer system using FEFLOW®
NASA Astrophysics Data System (ADS)
Nur Hidayat, Hardi; Gala Permana, Maximillian
2017-12-01
The study presents the simulation of a hot sedimentary aquifer for geothermal utilization. A hot sedimentary aquifer (HSA) is a conduction-dominated hydrothermal play type utilizing a deep aquifer heated by near-normal heat flow. One example of an HSA is the Bavarian Molasse Basin in southern Germany. This system typically uses doublet wells: an injection and a production well. The simulation was run for 3650 days of simulation time. The technical feasibility and performance are analysed with regard to the energy extracted from this concept. Several parameters are compared to determine the model performance. Parameters such as reservoir characteristics, temperature information and well information are defined. Several assumptions are also made to simplify the simulation process. The main results of the simulation are the heat period budget, or total extracted heat energy, and the heat rate budget, or heat production rate. A qualitative sensitivity analysis is conducted using five parameters, to each of which lower- and higher-value scenarios are assigned.
An Analysis of Security and Privacy Issues in Smart Grid Software Architectures on Clouds
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simmhan, Yogesh; Kumbhare, Alok; Cao, Baohua
2011-07-09
Power utilities globally are increasingly upgrading to Smart Grids that use bi-directional communication with the consumer to enable an information-driven approach to distributed energy management. Clouds offer features well suited for Smart Grid software platforms and applications, such as elastic resources and shared services. However, the security and privacy concerns inherent in an information rich Smart Grid environment are further exacerbated by their deployment on Clouds. Here, we present an analysis of security and privacy issues in a Smart Grid software architecture operating on different Cloud environments, in the form of a taxonomy. We use the Los Angeles Smart Grid Project that is underway in the largest U.S. municipal utility to drive this analysis that will benefit both Cloud practitioners targeting Smart Grid applications, and Cloud researchers investigating security and privacy.
Use of laser range finders and range image analysis in automated assembly tasks
NASA Technical Reports Server (NTRS)
Alvertos, Nicolas; Dcunha, Ivan
1990-01-01
A proposition to study the effect of filtering processes on range images and to evaluate the performance of two different laser range mappers is made. Median filtering was utilized to remove noise from the range images. First and second order derivatives are then utilized to locate the similarities and dissimilarities between the processed and the original images. Range depth information is converted into spatial coordinates, and a set of coefficients which describe 3-D objects is generated using the algorithm developed in the second phase of this research. Range images of spheres and cylinders are used for experimental purposes. An algorithm was developed to compare the performance of two different laser range mappers based upon the range depth information of surfaces generated by each of the mappers. Furthermore, an approach based on 2-D analytic geometry is also proposed which serves as a basis for the recognition of regular 3-D geometric objects.
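The filtering-and-derivative chain can be sketched as follows: a median filter suppresses noise in a range image, and first- and second-order derivatives are then computed for comparison against the original. The synthetic range image below stands in for real laser range mapper output.

```python
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(0)
range_img = np.fromfunction(lambda i, j: np.hypot(i - 64.0, j - 64.0), (128, 128))
range_img += rng.normal(0.0, 0.5, range_img.shape)      # simulated sensor noise

filtered = median_filter(range_img, size=5)             # noise removal

gy1, gx1 = np.gradient(filtered)                        # first-order derivatives
grad_mag = np.hypot(gx1, gy1)
gy2, gx2 = np.gradient(grad_mag)                        # second-order information
print("residual std after filtering:", float((range_img - filtered).std()))
```

Comparing the derivative images of the filtered and original data highlights where the filter removed noise versus where it altered genuine surface structure.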
Link-Based Similarity Measures Using Reachability Vectors
Yoon, Seok-Ho; Kim, Ji-Soo; Ryu, Minsoo; Choi, Ho-Jin
2014-01-01
We present a novel approach for computing link-based similarities among objects accurately by utilizing the link information pertaining to the objects involved. We discuss the problems with previous link-based similarity measures and propose a novel approach for computing link based similarities that does not suffer from these problems. In the proposed approach each target object is represented by a vector. Each element of the vector corresponds to all the objects in the given data, and the value of each element denotes the weight for the corresponding object. As for this weight value, we propose to utilize the probability of reaching from the target object to the specific object, computed using the “Random Walk with Restart” strategy. Then, we define the similarity between two objects as the cosine similarity of the two vectors. In this paper, we provide examples to show that our approach does not suffer from the aforementioned problems. We also evaluate the performance of the proposed methods in comparison with existing link-based measures, qualitatively and quantitatively, with respect to two kinds of data sets, scientific papers and Web documents. Our experimental results indicate that the proposed methods significantly outperform the existing measures. PMID:24701188
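The reachability-vector construction can be sketched directly: compute Random Walk with Restart probabilities from each target node and take the cosine of the resulting vectors. The toy adjacency matrix, restart probability, and iteration count are illustrative choices, not values from the paper.

```python
import numpy as np

def rwr_vector(adj, start, restart=0.15, iters=100):
    """Reachability vector: Random Walk with Restart probabilities from `start`."""
    P = adj / adj.sum(axis=0, keepdims=True)        # column-stochastic transition matrix
    e = np.zeros(adj.shape[0]); e[start] = 1.0
    r = e.copy()
    for _ in range(iters):
        r = (1 - restart) * P @ r + restart * e     # iterate toward the stationary RWR vector
    return r

def link_similarity(adj, u, v):
    ru, rv = rwr_vector(adj, u), rwr_vector(adj, v)
    return float(ru @ rv / (np.linalg.norm(ru) * np.linalg.norm(rv)))   # cosine similarity

adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
print(link_similarity(adj, 0, 1))
```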
An informatics approach to analyzing the incidentalome.
Berg, Jonathan S; Adams, Michael; Nassar, Nassib; Bizon, Chris; Lee, Kristy; Schmitt, Charles P; Wilhelmsen, Kirk C; Evans, James P
2013-01-01
Next-generation sequencing has transformed genetic research and is poised to revolutionize clinical diagnosis. However, the vast amount of data and inevitable discovery of incidental findings require novel analytic approaches. We therefore implemented for the first time a strategy that utilizes an a priori structured framework and a conservative threshold for selecting clinically relevant incidental findings. We categorized 2,016 genes linked with Mendelian diseases into "bins" based on clinical utility and validity, and used a computational algorithm to analyze 80 whole-genome sequences in order to explore the use of such an approach in a simulated real-world setting. The algorithm effectively reduced the number of variants requiring human review and identified incidental variants with likely clinical relevance. Incorporation of the Human Gene Mutation Database improved the yield for missense mutations but also revealed that a substantial proportion of purported disease-causing mutations were misleading. This approach is adaptable to any clinically relevant bin structure, scalable to the demands of a clinical laboratory workflow, and flexible with respect to advances in genomics. We anticipate that application of this strategy will facilitate pretest informed consent, laboratory analysis, and posttest return of results in a clinical context.
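The bin-based triage can be pictured as a simple filter over variant calls. The sketch below is a toy version under assumed field names, bin labels, and thresholds; the actual bin structure and algorithm used in the study are far richer.

```python
# Illustrative gene-to-bin assignments and variant records (hypothetical names/fields).
GENE_BINS = {"BRCA1": "bin1_actionable", "TTN": "bin3_low_utility"}

def triage(variants, min_quality=50):
    """Keep only calls in an actionable bin that pass a conservative evidence threshold."""
    keep = []
    for v in variants:
        if (GENE_BINS.get(v["gene"]) == "bin1_actionable"
                and v["qual"] >= min_quality
                and v["classification"] in {"pathogenic", "likely_pathogenic"}):
            keep.append(v)
    return keep

calls = [{"gene": "BRCA1", "qual": 99, "classification": "pathogenic"},
         {"gene": "TTN", "qual": 80, "classification": "VUS"}]
print(triage(calls))
```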
Research-based care on an acute inpatient psychiatric unit.
Bartholomew, David; Collier, Elizabeth
Many studies of research-based practice in nursing highlight factors that impede the development of practice. With the aim of adding to this body of knowledge, a modified grounded theory approach was used in order to understand more about these barriers and how individual nurses utilize research in their practice. A selective sample of five staff nurses from one acute inpatient psychiatric unit took part in semi-structured interviews. Three main themes were identified, each with two sub-themes. These were (a) activities to utilize research with (i) a 'systematic' model and (ii) a 'latent' model of research utilization (b) enhancing research utilization with (i) organizational culture and (ii) individual attitude and knowledge and (c) impeding research utilization with (i) resources (ii) resistance to change. It is suggested that for these nurses research utilization occurs through their individual knowledge, skill and motivation coupled with organizational commitment. Recommendation is made that further investigation of the 'systematic' and 'latent' models should be carried out. Additionally, it is suggested that these research findings might be used to inform future training, further research-based initiatives and to raise managerial awareness of the impeding factors of research utilization.
EXTRACTING PRINCIPLE COMPONENTS FOR DISCRIMINANT ANALYSIS OF FMRI IMAGES
Liu, Jingyu; Xu, Lai; Caprihan, Arvind; Calhoun, Vince D.
2009-01-01
This paper presents an approach for selecting optimal components for discriminant analysis. Such an approach is useful when further detailed analyses for discrimination or characterization require dimensionality reduction. Our approach can accommodate a categorical variable such as diagnosis (e.g. schizophrenic patient or healthy control), or a continuous variable like severity of the disorder. This information is utilized as a reference for measuring a component’s discriminant power after principle component decomposition. After sorting each component according to its discriminant power, we extract the best components for discriminant analysis. An application of our reference selection approach is shown using a functional magnetic resonance imaging data set in which the sample size is much less than the dimensionality. The results show that the reference selection approach provides an improved discriminant component set as compared to other approaches. Our approach is general and provides a solid foundation for further discrimination and classification studies. PMID:20582334
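A hedged sketch of the reference-based selection: decompose the data with PCA, score each component by its absolute correlation with the reference variable (diagnosis or severity), and keep the top-ranked components for discriminant analysis. Random data stand in for fMRI features, and correlation is one simple choice of discriminant-power measure, not necessarily the paper's exact statistic.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 500))            # 40 subjects, 500 fMRI-derived features (stand-in data)
y = rng.integers(0, 2, size=40)           # reference variable, e.g. patient vs. control

scores = PCA(n_components=10).fit_transform(X)
power = [abs(np.corrcoef(scores[:, k], y)[0, 1]) for k in range(scores.shape[1])]
ranked = np.argsort(power)[::-1]          # sort components by discriminant power
selected = scores[:, ranked[:3]]          # pass the best components to discriminant analysis
print("component ranking:", ranked)
```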
Employing a Qualitative Description Approach in Health Care Research
Bradshaw, Carmel; Atkinson, Sandra; Doody, Owen
2017-01-01
A qualitative description design is particularly relevant where information is required directly from those experiencing the phenomenon under investigation and where time and resources are limited. Nurses and midwives often have clinical questions suitable to a qualitative approach but little time to develop an exhaustive comprehension of qualitative methodological approaches. Qualitative description research is sometimes considered a less sophisticated approach for epistemological reasons. Another challenge when considering qualitative description design is differentiating qualitative description from other qualitative approaches. This article provides a systematic and robust journey through the philosophical, ontological, and epistemological perspectives, which evidences the purpose of qualitative description research. Methods and rigor issues underpinning qualitative description research are also appraised to provide the researcher with a systematic approach to conduct research utilizing this approach. The key attributes and value of qualitative description research in the health care professions will be highlighted with the aim of extending its usage. PMID:29204457
NASA Astrophysics Data System (ADS)
Rose, K.; Glosser, D.; Bauer, J. R.; Barkhurst, A.
2015-12-01
The products of spatial analyses that leverage the interpolation of sparse, point data to represent continuous phenomena are often presented without clear explanations of the uncertainty associated with the interpolated values. As a result, there is frequently insufficient information provided to effectively support advanced computational analyses and individual research and policy decisions utilizing these results. This highlights the need for a reliable approach capable of quantitatively producing and communicating spatial data analyses and their inherent uncertainties for a broad range of uses. To address this need, we have developed the Variable Grid Method (VGM) and an associated Python tool, which is a flexible approach that can be applied to a variety of analyses and use case scenarios where users need a method to effectively study, evaluate, and analyze spatial trends and patterns while communicating the uncertainty in the underlying spatial datasets. The VGM outputs a simultaneous visualization representative of the spatial data analyses and quantification of underlying uncertainties, which can be calculated using data related to sample density, sample variance, interpolation error, uncertainty calculated from multiple simulations, etc. We will present examples of our research utilizing the VGM to quantify key spatial trends and patterns for subsurface data interpolations and their uncertainties and leverage these results to evaluate storage estimates and potential impacts associated with underground injection for CO2 storage and unconventional resource production and development. The insights provided by these examples identify how the VGM can provide critical information about the relationship between uncertainty and spatial data that is necessary to better support their use in advanced computational analyses and to inform research, management and policy decisions.
Dynamic cone beam CT angiography of carotid and cerebral arteries using canine model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cai Weixing; Zhao Binghui; Conover, David
2012-01-15
Purpose: This research is designed to develop and evaluate a flat-panel detector-based dynamic cone beam CT system for dynamic angiography imaging, which is able to provide both dynamic functional information and dynamic anatomic information from one multirevolution cone beam CT scan. Methods: A dynamic cone beam CT scan acquired projections over four revolutions within a time window of 40 s after contrast agent injection through a femoral vein to cover the entire wash-in and wash-out phases. A dynamic cone beam CT reconstruction algorithm was utilized and a novel recovery method was developed to correct the time-enhancement curve of contrast flow. From the same data set, both projection-based subtraction and reconstruction-based subtraction approaches were utilized and compared to remove the background tissues and visualize the 3D vascular structure to provide the dynamic anatomic information. Results: Through computer simulations, the new recovery algorithm for dynamic time-enhancement curves was optimized and showed excellent accuracy to recover the actual contrast flow. Canine model experiments also indicated that the recovered time-enhancement curves from dynamic cone beam CT imaging agreed well with that of an IV-digital subtraction angiography (DSA) study. The dynamic vascular structures reconstructed using both projection-based subtraction and reconstruction-based subtraction were almost identical as the differences between them were comparable to the background noise level. At the enhancement peak, all the major carotid and cerebral arteries and the Circle of Willis could be clearly observed. Conclusions: The proposed dynamic cone beam CT approach can accurately recover the actual contrast flow, and dynamic anatomic imaging can be obtained with high isotropic 3D resolution. This approach is promising for diagnosis and treatment planning of vascular diseases and strokes.
Community Based Approach to Wind Energy Information Dissemination
DOE Office of Scientific and Technical Information (OSTI.GOV)
Innis, S.
The purpose of the Department of Energy's grant was to transfer to New Mexico and Utah a national award-winning market-based strategy to aggregate demand for wind energy. Their experiences over the past few years in New Mexico and Utah have been quite different. In both states they have developed stronger relationships with utilities and policymakers which will increase the effectiveness of their future advocacy efforts.
Dynamic Tasking of Networked Sensors Using Covariance Information
2010-09-01
has been created under an effort called TASMAN (Tasking Autonomous Sensors in a Multiple Application Network). One of the first studies utilizing this...environment was focused on a novel resource management approach, namely covariance-based tasking. Under this scheme, the state error covariance of...resident space objects (RSO), sensor characteristics, and sensor-target geometry were used to determine the effectiveness of future observations in
David Nicholls; Frank Barnes; Felicia Acrea; Chinling Chen; Lara Y. Buluç; Michele M. Parker
2015-01-01
Federal agencies are mandated to measure, manage, and reduce greenhouse gas (GHG) emissions. The General Services Administration (GSA) Carbon Footprint Tool (CFT) is an online tool built to utilize measured GHG inventories to help Forest Service units streamline reporting and make informed decisions about operational efficiency. In fiscal year 2013, the Forest Service...
ERIC Educational Resources Information Center
Van Auken, Stuart; Chrysler, Earl; Wells, Ludmilla Gricenko; Simkin, Mark
2011-01-01
The authors utilized a gap analysis approach to assess general IS knowledge and skill voids or overages in a specific program context. The authors asked alumni to reveal the emphasis that should have been given to 10 IS knowledge and skill areas and compared the results with the emphasis that was actually given. They proceed by relating the…
Planetary Data Workshop, Part 1
NASA Technical Reports Server (NTRS)
1984-01-01
The community of planetary scientists addresses two general problems regarding planetary science data: (1) important data sets are being permanently lost; and (2) utilization is constrained by difficulties in locating and accessing science data and supporting information necessary for its use. Means to correct the problems, provide science and functional requirements for a systematic and phased approach, and suggest technologies and standards appropriate to the solution were explored.
Improved interior wall detection using designated dictionaries in compressive urban sensing problems
NASA Astrophysics Data System (ADS)
Lagunas, Eva; Amin, Moeness G.; Ahmad, Fauzia; Nájar, Montse
2013-05-01
In this paper, we address sparsity-based imaging of building interior structures for through-the-wall radar imaging and urban sensing applications. The proposed approach utilizes information about common building construction practices to form an appropriate sparse representation of the building layout. With a ground based SAR system, and considering that interior walls are either parallel or perpendicular to the exterior walls, the antenna at each position would receive reflections from the walls parallel to the radar's scan direction as well as from the corners between two meeting walls. We propose a two-step approach for wall detection and localization. In the first step, a dictionary of possible wall locations is used to recover the positions of both interior and exterior walls that are parallel to the scan direction. A follow-on step uses a dictionary of possible corner reflectors to locate wall-wall junctions along the detected wall segments, thereby determining the true wall extents and detecting walls perpendicular to the scan direction. The utility of the proposed approach is demonstrated using simulated data.
Hybrid cooperative spectrum sharing for cognitive radio networks: A contract-based approach
NASA Astrophysics Data System (ADS)
Zhang, Songwei; Mu, Xiaomin; Wang, Ning; Zhang, Dalong; Han, Gangtao
2018-06-01
In order to improve the spectral efficiency, a contract-based hybrid cooperative spectrum sharing approach is proposed in this paper, in which multiple primary users (PUs) and multiple secondary users (SUs) share the primary channels in a hybrid manner. Specifically, the SUs switch their transmission mode between underlay and overlay based on the second-order statistics of the primary links. The average transmission rates of PUs and SUs are analyzed for the two transmission modes, and an optimization problem is formulated to maximize the utility of the PUs under the constraint that the utility of the SUs is nonnegative; the problem is then solved by a contract-based approach in both the global statistical channel state information (S-CSI) scenario and the local S-CSI scenario. Numerical results show that the average transmission rate of the PUs is significantly improved by the proposed method in both scenarios, while the SUs still achieve a good average rate, especially when the number of SUs equals the number of PUs in the local S-CSI scenario.
Decision-Making in Audiology: Balancing Evidence-Based Practice and Patient-Centered Care.
Boisvert, Isabelle; Clemesha, Jennifer; Lundmark, Erik; Crome, Erica; Barr, Caitlin; McMahon, Catherine M
2017-01-01
Health-care service delivery models have evolved from a practitioner-centered approach toward a patient-centered ideal. Concurrently, increasing emphasis has been placed on the use of empirical evidence in decision-making to increase clinical accountability. The way in which clinicians use empirical evidence and client preferences to inform decision-making provides an insight into health-care delivery models utilized in clinical practice. The present study aimed to investigate the sources of information audiologists use when discussing rehabilitation choices with clients, and discuss the findings within the context of evidence-based practice and patient-centered care. To assess the changes that may have occurred over time, this study uses a questionnaire based on one of the few studies of decision-making behavior in audiologists, published in 1989. The present questionnaire was completed by 96 audiologists who attended the World Congress of Audiology in 2014. The responses were analyzed using qualitative and quantitative approaches. Results suggest that audiologists rank clinical test results and client preferences as the most important factors for decision-making. Discussion with colleagues or experts was also frequently reported as an important source influencing decision-making. Approximately 20% of audiologists mentioned utilizing research evidence to inform decision-making when no clear solution was available. Information shared at conferences was ranked low in terms of importance and reliability. This study highlights an increase in awareness of concepts associated with evidence-based practice and patient-centered care within audiology settings, consistent with current research-to-practice dissemination pathways. It also highlights that these pathways may not be sufficient for an effective clinical implementation of these practices.
Neeman, Naama; Isaac, Thomas; Leveille, Suzanne; Dimonda, Clementina; Shin, Jacob Y; Aronson, Mark D; Freedman, Steven D
2012-08-01
Patients often do not fully understand medical information discussed during office visits. This can result in lack of adherence to recommended treatment plans and poorer health outcomes. We developed and implemented a program utilizing an encounter form, which provides structure to the medical interaction and facilitates bidirectional communication and informed decision-making. We conducted a prospective quality improvement intervention at a large tertiary-care academic medical center utilizing the encounter form and studied the effect on patient satisfaction, understanding, and confidence in communicating with physicians. The intervention included 108 patients seen by seven physicians in five sub-specialties. Ninety-eight percent of patients were extremely satisfied (77%) or somewhat satisfied (21%) with the program. Ninety-six percent of patients reported being involved in decisions about their care and treatments, as well as high levels of understanding of the medical information discussed during the visit. Sixty-nine percent of patients reported that they shared the encounter form with their families and friends. Patients' self-confidence in communicating with their doctors increased from a score of 8.1 to 8.7 post-intervention (P-value = 0.0018). When comparing pre- and post-intervention experiences, only 38% of patients felt that their problems and questions had been adequately addressed by their physicians pre-intervention, compared with 94% post-intervention. We introduced a program to enhance physician-patient communication and found that patients were highly satisfied, more informed, and more actively involved in their care. This may be an easily generalizable approach to improving physician-patient communication at outpatient visits.
Image analysis by integration of disparate information
NASA Technical Reports Server (NTRS)
Lemoigne, Jacqueline
1993-01-01
Image analysis often starts with a preliminary segmentation which provides a representation of the scene needed for further interpretation. Segmentation can be performed in several ways, which are categorized as pixel-based, edge-based, and region-based. Each of these approaches is affected differently by various factors, and the final result may be improved by integrating several or all of these methods, thus taking advantage of their complementary nature. In this paper, we propose an approach that integrates pixel-based and edge-based results by utilizing an iterative relaxation technique. This approach has been implemented on a massively parallel computer and tested on remotely sensed imagery from the Landsat Thematic Mapper (TM) sensor.
Pape-Haugaard, Louise; Frank, Lars
2011-01-01
A major obstacle to ensuring ubiquitous information is the use of heterogeneous systems in eHealth. The objective of this paper is to illustrate how an architecture for distributed eHealth databases can be designed without sacrificing the characteristic features of traditional sustainable databases. The approach is first to explain traditional architectures for centralized and homogeneous distributed database computing, followed by a possible architectural framework for achieving sustainability across disparate, i.e. heterogeneous, databases, concluded with a discussion. It is shown that, by using relaxed ACID properties on a service-oriented architecture, it is possible to achieve the data consistency that is essential for sustainable interoperability.
Health-related media use among youth audiences in Senegal.
Glik, Deborah; Massey, Philip; Gipson, Jessica; Dieng, Thierno; Rideau, Alexandre; Prelip, Michael
2016-03-01
Lower- and middle-income countries (LMICs) are experiencing rapid changes in access to and use of new internet and digital media technologies. The purpose of this study was to better understand how younger audiences are navigating traditional and newer forms of media technologies, with particular emphasis on the skills and competencies needed to obtain, evaluate and apply health-related information, also defined as health and media literacy. Sixteen focus group discussions were conducted throughout Senegal in September 2012 with youth aged 15-25. Using an iterative coding process based on grounded theory, four themes emerged related to media use for health information among Senegalese youth. They include the following: (i) media utilization; (ii) barriers and conflicts regarding media utilization; (iii) uses and gratifications and (iv) health and media literacy. Findings suggest that Senegalese youth use a heterogeneous mix of media platforms (i.e. television, radio, internet) and utilization often occurs with family members or friends. Additionally, the need for entertainment, information and connectedness inform media use, mostly concerning sexual and reproductive health information. Importantly, tensions arise as youth balance innovative and interactive technologies with traditional and conservative values, particularly concerning ethical and privacy concerns. Findings support the use of multipronged intervention approaches that leverage both new media, as well as traditional media strategies, and that also address lack of health and media literacy in this population. Implementing health-related interventions across multiple media platforms provides an opportunity to create an integrated, as opposed to a disparate, user experience. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Quality of service management framework for dynamic chaining of geographic information services
NASA Astrophysics Data System (ADS)
Onchaga, Richard
2006-06-01
Dynamic chaining of geographic information services (geo-services) is gaining popularity as a new paradigm for evolving flexible geo-information systems and for providing on-demand access to geo-information. In dynamic chaining, disparate geo-services are discovered and composed at run time to yield more elaborate functionality and create value-added geo-information. Common approaches to service chaining discover and compose disparate geo-services based on the functional capability of individual geo-services. The primary concern of common approaches is thus the emergent behavior of the resulting composite geo-service. However, as geo-services become mundane and take on a greater and more strategic role in mission critical processes, deliverable quality of service (QoS) becomes an important concern. QoS concerns operational characteristics of a service that determine its utility in an application context. To address pertinent QoS requirements, a new approach to service chaining becomes necessary. In this paper we propose a QoS-aware chaining approach in which geo-services are discovered, composed and executed considering both functional and QoS requirements. We prescribe a QoS management framework that defines fundamental principles, concepts and mechanisms which can be applied to evolve an effective distributed computing platform for QoS-aware chaining of geo-services - the so-called geo-service infrastructure. The paper also defines an extensible QoS model for services delivered by dynamic compositions of geo-services. The process of orthophoto generation is used to demonstrate the applicability of the prescribed framework to service-oriented geographic information processing.
Real-time Social Media Data Analytics for Situational Awareness of the Electric Grid
NASA Astrophysics Data System (ADS)
Mao, H.; Chinthavali, S.; Lee, S.; Shankar, M.; Thiagarajan, S.
2016-12-01
With the increasing frequency of extreme events due to climate change, wide-area situational awareness (SA) of the electric grid has become a primary need for federal agencies such as DOE and FEMA for emergency preparedness and recovery purposes. While several sensor feeds from Genscape, GridEye, and PMUs provide a comprehensive view of the transmission grid, national-scale situational awareness tools still rely on utility websites for outage information at the distribution level. The inconsistency and variety of outage websites' data formats make this approach unreliable and also incur large software maintenance costs. Social media has emerged as a valuable medium for utilities to share outage information with their customers. Despite their potential usefulness, extracting relevant data from these social media data streams is challenging due to the inherent noise and irrelevant information such as tips to customers during storms, marketing, etc. In this study, we implement a practical and novel machine learning based data-analytics pipeline (Fig 1) for SA, which extracts real-time tweets from around 300 utility companies and processes these tweets using keyword filtering and a Naïve-Bayes text classifier trained with supervised learning techniques to detect only relevant tweets. We validated the results by comparing them with the results identified by a human analyst over a period of 48 hours, showing around 98.3% accuracy. In addition to the tweets posted by utility companies, millions of Twitter users, who can be considered human "social sensors", report power outages online. Therefore, we use the Twitter Streaming API to extract real-time tweets containing keywords such as "power outage", "blackout", and "power cuts". An advanced natural language processing technique is proposed to identify the geo-locations associated with this power outage data. The detected tweets are visualized as color-coded state- and county-level US maps based on the number of outage tweets posted. Therefore, by analyzing a large number of tweets posted by utilities and the general public, our approach can detect real-time power outages at a national scale. This framework has been integrated into existing SA tools such as VERDE, EARSS and EAGLE-I, which is deployed by the Oak Ridge National Laboratory for the DOE.
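A minimal sketch of the keyword-filter plus Naïve-Bayes stage described above, assuming a tiny hand-labeled training set; the keywords, example tweets, and TF-IDF features are illustrative and not the study's actual pipeline or data.

```python
# A minimal sketch of keyword filtering followed by a Naive Bayes relevance
# classifier. The keyword list and toy labeled tweets are assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

KEYWORDS = ("power outage", "blackout", "power cut", "outage")

def keyword_filter(tweets):
    """Keep only tweets that mention at least one outage-related keyword."""
    return [t for t in tweets if any(k in t.lower() for k in KEYWORDS)]

# Toy training data: 1 = outage report, 0 = irrelevant (tips, marketing, ...)
train_texts = [
    "Crews responding to a power outage affecting 1,200 customers downtown",
    "Blackout reported near Elm St, estimated restoration 9 PM",
    "Storm safety tip: keep a flashlight handy in case of a power outage",
    "Follow us for energy-saving offers this summer",
]
train_labels = [1, 1, 0, 0]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), MultinomialNB())
clf.fit(train_texts, train_labels)

incoming = [
    "Huge blackout on the west side, anyone else without power?",
    "Power outage preparedness checklist for your family",
]
candidates = keyword_filter(incoming)
print(list(zip(candidates, clf.predict(candidates))))
```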
Marenco, Luis; Ascoli, Giorgio A; Martone, Maryann E; Shepherd, Gordon M; Miller, Perry L
2008-09-01
This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information's (NCBI's) Entrez system. The NLB collects these links from each resource and passes them to the NCBI which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to create dynamically its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation.
Disaster loss and social media: Can online information increase flood resilience?
NASA Astrophysics Data System (ADS)
Allaire, Maura C.
2016-09-01
When confronted with natural disasters, individuals around the world increasingly use online resources to become informed of forecasted conditions and advisable actions. This study tests the effectiveness of online information and social media in enabling households to reduce disaster losses. The 2011 Bangkok flood is utilized as a case study since it was one of the first major disasters to affect a substantial population connected to social media. The role of online information is investigated with a mixed methods approach. Both quantitative (propensity score matching) and qualitative (in-depth interviews) techniques are employed. The study relies on two data sources—survey responses from 469 Bangkok households and in-depth interviews with internet users who were a subset of the survey participants. Propensity score matching indicates that social media enabled households to reduce flood losses by an average of 37% (USD 3708 per household), using a nearest neighbor estimator. This reduction is substantial when considering that household flood losses for the matched sample averaged USD 8278. Social media offered information not available from other sources, such as localized and nearly real-time updates of flood location and depth. With this knowledge, households could move belongings to higher ground before floodwaters arrived. These findings suggest that utilizing social media users as sensors could better inform populations during disasters. Overall, the study reveals that online information can enable effective disaster preparedness and reduce losses.
Disaster Loss and Social Media: Can Online Information Increase Flood Resilience?
NASA Astrophysics Data System (ADS)
Allaire, M.
2016-12-01
When confronted with natural disasters, individuals around the world increasingly use online resources to become informed of forecasted conditions and advisable actions. This study tests the effectiveness of online information and social media in enabling households to reduce disaster losses. The 2011 Bangkok flood is utilized as a case study since it was one of the first major disasters to affect a substantial population connected to social media. The role of online information is investigated with a mixed methods approach. Both quantitative (propensity score matching) and qualitative (in-depth interviews) techniques are employed. The study relies on two data sources - survey responses from 469 Bangkok households and in-depth interviews with twenty-three internet users who are a subset of the survey participants. Propensity score matching indicates that social media enabled households to reduce flood losses by an average of 37% (USD 3,708), using a nearest neighbor estimator. This reduction is massive when considering that total flood losses for the full sample averaged USD 4,903. Social media offered information not available from other sources, such as localized and nearly real-time updates of flood location and depth. With this knowledge, households could move belongings to higher ground before floodwaters arrived. These findings suggest that utilizing social media users as sensors could better inform populations during disasters. Overall, the study reveals that online information can enable effective disaster preparedness and reduce losses.
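Both entries describe the same matching estimator; a minimal sketch of nearest-neighbor propensity score matching on synthetic household data follows. The covariates, treatment model, and loss model are invented for illustration and are not the survey data.

```python
# A minimal sketch of 1-nearest-neighbor propensity score matching.
# Column names and the data-generating process are illustrative assumptions.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "income": rng.normal(0, 1, n),
    "flood_depth": rng.normal(0, 1, n),
})
# Treatment: used social media during the flood (depends on covariates)
p_treat = 1 / (1 + np.exp(-(0.8 * df["income"] - 0.3 * df["flood_depth"])))
df["social_media"] = rng.binomial(1, p_treat)
# Outcome: flood losses (treatment reduces losses in this toy model)
df["loss"] = (8000 - 2500 * df["social_media"]
              + 1500 * df["flood_depth"] + rng.normal(0, 800, n))

# Step 1: propensity score from observed covariates
X = df[["income", "flood_depth"]]
df["ps"] = LogisticRegression().fit(X, df["social_media"]).predict_proba(X)[:, 1]

treated = df[df["social_media"] == 1]
control = df[df["social_media"] == 0]

# Step 2: for each treated household, match the control with the closest score
idx = np.abs(control["ps"].values[None, :]
             - treated["ps"].values[:, None]).argmin(axis=1)
att = (treated["loss"].values - control["loss"].values[idx]).mean()
print(f"estimated average effect of social media on losses (ATT): {att:.0f}")
```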
Education and Library Services for Community Information Utilities.
ERIC Educational Resources Information Center
Farquhar, John A.
The concept of "computer utility"--the provision of computing and information service by a utility in the form of a national network to which any person desiring information could gain access--has been gaining interest among the public and among the technical community. This report on planning community information utilities discusses the…
Information Search and Decision Making: The Effects of Age and Complexity on Strategy Use
Queen, Tara L.; Hess, Thomas M.; Ennis, Gilda E.; Dowd, Keith; Grühn, Daniel
2012-01-01
The impact of task complexity on information search strategy and decision quality was examined in a sample of 135 young, middle-aged, and older adults. We were particularly interested in the competing roles of fluid cognitive ability and domain knowledge and experience, with the former being a negative influence and the latter being a positive influence on older adults’ performance. Participants utilized two decision matrices, which varied in complexity, regarding a consumer purchase. Using process tracing software and an algorithm developed to assess decision strategy, we recorded search behavior, strategy selection, and final decision. Contrary to expectations, older adults were not more likely than the younger age groups to engage in information-minimizing search behaviors in response to increases in task complexity. Similarly, adults of all ages used comparable decision strategies and adapted their strategies to the demands of the task. We also examined decision outcomes in relation to participants’ preferences. Overall, it seems that older adults utilize simpler sets of information primarily reflecting the most valued attributes in making their choice. The results of this study suggest that older adults are adaptive in their approach to decision making and this ability may benefit from accrued knowledge and experience. PMID:22663157
Shelby, Ashley; Ernst, Karen
2013-08-01
With little or no evidence-based information to back up claims of vaccine danger, anti-vaccine activists have relied on the power of storytelling to infect an entire generation of parents with fear of and doubt about vaccines. These parent accounts of perceived vaccine injury, coupled with Andrew Wakefield's fraudulent research study linking the MMR vaccine to autism, created a substantial amount of vaccine hesitancy in new parents, which manifests in both vaccine refusal and the adoption of delayed vaccine schedules. The tools used by the medical and public health communities to counteract the anti-vaccine movement include statistics, research, and other evidence-based information, often delivered verbally or in the form of the CDC's Vaccine Information Statements. This approach may not be effective enough on its own to convince vaccine-hesitant parents that vaccines are safe, effective, and crucial to their children's health. Utilizing some of the storytelling strategies used by the anti-vaccine movement, in addition to evidence-based vaccine information, could potentially offer providers, public health officials, and pro-vaccine parents an opportunity to mount a much stronger defense against anti-vaccine messaging.
NASA Astrophysics Data System (ADS)
Pradana, G. W.; Fanida, E. H.; Niswah, F.
2018-01-01
The demand for good governance is directed towards the realization of efficient, effective, and clean government. This is pursued at the national and regional levels through the development and implementation of electronic government concepts. The development of electronic government involves restructuring management systems and work processes in the government environment by optimizing the utilization of information technology. One concrete form of electronic government (e-Gov) implementation at the local level is the Intranet Sub-District program in Sukodono Sub-District, Sidoarjo. Intranet Sub-District is an innovation whose purpose is to make available information on the management, distribution, and storage of official documents, and to optimize the delivery of information and communication in the guidance and supervision of local administration. This paper is descriptive, uses a qualitative approach, and focuses on the implementation of the Intranet Sub-District program in Sukodono Sub-District, Sidoarjo. The findings of the study are the limited number of human resources who have mastered ICT, the uneven network, the adequacy of institutional needs, the existence of budget support from the authorized institution, and an information system that does not yet accommodate all service needs.
Colorectal cancer patients' attitudes towards involvement in decision making.
Beaver, Kinta; Campbell, Malcolm; Craven, Olive; Jones, David; Luker, Karen A; Susnerwala, Shabbir S
2009-03-01
To design and administer an attitude rating scale, exploring colorectal cancer patients' views of involvement in decision making. To examine the impact of socio-demographic and/or treatment-related factors on decision making. To conduct principal components analysis to determine if the scale could be simplified into a number of factors for future clinical utility. An attitude rating scale was constructed based on previous qualitative work and administered to colorectal cancer patients using a cross-sectional survey approach. 375 questionnaires were returned (81.7% response). For patients it was important to be informed and involved in the decision-making process. Information was not always used to make decisions as patients placed their trust in medical expertise. Women had more positive opinions on decision making and were more likely to want to make decisions. Written information was understood to a greater degree than verbal information. The scale could be simplified to a number of factors, indicating clinical utility. Few studies have explored the attitudes of colorectal cancer patients towards involvement in decision making. This study presents new insights into how patients view the concept of participation; important when considering current policy imperatives in the UK of involving service users in all aspects of care and treatment.
NASA Technical Reports Server (NTRS)
Murphy, M. R.
1980-01-01
A resource management approach to aircrew performance is defined and utilized in structuring an analysis of 84 exemplary incidents from the NASA Aviation Safety Reporting System. The distribution of enabling and associated (evolutionary) and recovery factors between and within five analytic categories suggests that resource management training be concentrated on: (1) interpersonal communications, with air traffic control information of major concern; (2) task management, mainly setting priorities and appropriately allocating tasks under varying workload levels; and (3) planning, coordination, and decisionmaking concerned with preventing and recovering from potentially unsafe situations in certain aircraft maneuvers.
Hierarchical nucleus segmentation in digital pathology images
NASA Astrophysics Data System (ADS)
Gao, Yi; Ratner, Vadim; Zhu, Liangjia; Diprima, Tammy; Kurc, Tahsin; Tannenbaum, Allen; Saltz, Joel
2016-03-01
Extracting nuclei is one of the most actively studied topics in digital pathology research. Most studies search for the nuclei (or seeds for the nuclei) directly at the finest resolution available. While such approaches utilize the richest information, it is sometimes difficult for them to address the heterogeneity of nuclei in different tissues. In this work, we propose a hierarchical approach which starts from a lower resolution level and adaptively adjusts the parameters while progressing into finer and finer resolutions. The algorithm is tested on brain and lung cancer images from The Cancer Genome Atlas data set.
A synoptic description of coal basins via image processing
NASA Technical Reports Server (NTRS)
Farrell, K. W., Jr.; Wherry, D. B.
1978-01-01
An existing image processing system is adapted to describe the geologic attributes of a regional coal basin. This scheme handles a map as if it were a matrix, in contrast to more conventional approaches which represent map information in terms of linked polygons. The utility of the image processing approach is demonstrated by a multiattribute analysis of the Herrin No. 6 coal seam in Illinois. Findings include the location of a resource and estimation of tonnage corresponding to constraints on seam thickness, overburden, and Btu value, which are illustrative of the need for new mining technology.
Ding, Xiuhua; Su, Shaoyong; Nandakumar, Kannabiran; Wang, Xiaoling; Fardo, David W
2014-01-01
Large-scale genetic studies are often composed of related participants, and utilizing familial relationships can be cumbersome and computationally challenging. We present an approach to efficiently handle sequencing data from complex pedigrees that incorporates information from rare variants as well as common variants. Our method employs a 2-step procedure that sequentially regresses out correlation from familial relatedness and then uses the resulting phenotypic residuals in a penalized regression framework to test for associations with variants within genetic units. The operating characteristics of this approach are detailed using simulation data based on a large, multigenerational cohort.
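A minimal sketch of the two-step idea under simplifying assumptions: a random-intercept mixed model stands in for a full kinship adjustment in step 1, and a Lasso over variants within a genetic unit stands in for the paper's penalized regression in step 2.

```python
# A minimal sketch of the two-step procedure: regress out familial correlation,
# then test variants via penalized regression on the residuals. The random-
# intercept family model and simulated genotypes are illustrative assumptions.
import numpy as np
import statsmodels.api as sm
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
n_fam, fam_size, n_variants = 60, 5, 50
n = n_fam * fam_size
family = np.repeat(np.arange(n_fam), fam_size)
geno = rng.binomial(2, 0.2, size=(n, n_variants)).astype(float)

fam_effect = rng.normal(0, 1.0, n_fam)[family]           # shared family component
pheno = 0.6 * geno[:, 3] + fam_effect + rng.normal(0, 1.0, n)

# Step 1: regress out familial relatedness (random intercept per family)
m = sm.MixedLM(pheno, np.ones((n, 1)), groups=family).fit()
resid = pheno - m.fittedvalues

# Step 2: penalized regression of residuals on variants within a genetic unit
coef = Lasso(alpha=0.05).fit(geno, resid).coef_
print("variants with nonzero estimated effect:", np.nonzero(coef)[0])
```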
Kannampallil, Thomas G; Franklin, Amy; Mishra, Rashmi; Almoosa, Khalid F; Cohen, Trevor; Patel, Vimla L
2013-01-01
Information in critical care environments is distributed across multiple sources, such as paper charts, electronic records, and support personnel. For decision-making tasks, physicians have to seek, gather, filter and organize information from various sources in a timely manner. The objective of this research is to characterize the nature of physicians' information seeking process, and the content and structure of clinical information retrieved during this process. Eight medical intensive care unit physicians provided a verbal think-aloud as they performed a clinical diagnosis task. Verbal descriptions of physicians' activities, sources of information they used, time spent on each information source, and interactions with other clinicians were captured for analysis. The data were analyzed using qualitative and quantitative approaches. We found that the information seeking process was exploratory and iterative and driven by the contextual organization of information. While there were no significant differences in the overall time spent on paper or electronic records, there was marginally greater relative information gain (i.e., more unique information retrieved per unit time) from electronic records (t(6)=1.89, p=0.1). Additionally, information retrieved from electronic records was at a higher level (i.e., observations and findings) in the knowledge structure than from paper records, reflecting differences in the nature of knowledge utilization across resources. A process of local optimization drove the information seeking process: physicians utilized information that maximized their information gain even though it required significantly more cognitive effort. Implications for the design of health information technology solutions that seamlessly integrate information seeking activities within the workflow, such as enriching the clinical information space and supporting efficient clinical reasoning and decision-making, are discussed. Copyright © 2012 Elsevier B.V. All rights reserved.
A Ranking Approach on Large-Scale Graph With Multidimensional Heterogeneous Information.
Wei, Wei; Gao, Bin; Liu, Tie-Yan; Wang, Taifeng; Li, Guohui; Li, Hang
2016-04-01
Graph-based ranking has been extensively studied and frequently applied in many applications, such as webpage ranking. It aims at mining potentially valuable information from the raw graph-structured data. Recently, with the proliferation of rich heterogeneous information (e.g., node/edge features and prior knowledge) available in many real-world graphs, how to effectively and efficiently leverage all information to improve the ranking performance becomes a new challenging problem. Previous methods only utilize part of such information and attempt to rank graph nodes according to link-based methods, of which the ranking performances are severely affected by several well-known issues, e.g., over-fitting or high computational complexity, especially when the scale of graph is very large. In this paper, we address the large-scale graph-based ranking problem and focus on how to effectively exploit rich heterogeneous information of the graph to improve the ranking performance. Specifically, we propose an innovative and effective semi-supervised PageRank (SSP) approach to parameterize the derived information within a unified semi-supervised learning framework (SSLF-GR), then simultaneously optimize the parameters and the ranking scores of graph nodes. Experiments on the real-world large-scale graphs demonstrate that our method significantly outperforms the algorithms that consider such graph information only partially.
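A minimal sketch of the core building block, a PageRank-style power iteration whose transition probabilities are parameterized by edge features. The SSP approach learns these parameters jointly with the ranking within its semi-supervised framework; here theta is simply fixed, as an illustrative assumption.

```python
# A minimal sketch of feature-parameterized PageRank (theta would be learned
# from supervision in SSP; here it is fixed for illustration).
import numpy as np

rng = np.random.default_rng(3)
n_nodes, n_feat = 6, 3
adj = (rng.random((n_nodes, n_nodes)) < 0.5).astype(float)
np.fill_diagonal(adj, 0)
# edge_feat[i, j] holds a feature vector for edge i -> j (zeros where no edge)
edge_feat = adj[..., None] * rng.random((n_nodes, n_nodes, n_feat))

theta = np.array([1.0, 0.5, -0.3])            # hypothetical learned parameters
scores = np.exp(edge_feat @ theta) * adj       # unnormalized edge weights
# Row-normalize to transition probabilities (dangling rows left as zeros)
P = scores / np.clip(scores.sum(axis=1, keepdims=True), 1e-12, None)

alpha = 0.85
r = np.full(n_nodes, 1.0 / n_nodes)
for _ in range(100):
    r = alpha * P.T @ r + (1 - alpha) / n_nodes
print("node ranking (best first):", np.argsort(-r))
```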
The choice of sample size: a mixed Bayesian / frequentist approach.
Pezeshk, Hamid; Nematollahi, Nader; Maroufy, Vahed; Gittins, John
2009-04-01
Sample size computations are largely based on frequentist or classical methods. In the Bayesian approach the prior information on the unknown parameters is taken into account. In this work we consider a fully Bayesian approach to the sample size determination problem which was introduced by Grundy et al. and developed by Lindley. This approach treats the problem as a decision problem and employs a utility function to find the optimal sample size of a trial. Furthermore, we assume that a regulatory authority, which is deciding on whether or not to grant a licence to a new treatment, uses a frequentist approach. We then find the optimal sample size for the trial by maximising the expected net benefit, which is the expected benefit of subsequent use of the new treatment minus the cost of the trial.
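A minimal sketch of the expected-net-benefit idea: average the frequentist licensing decision over the prior on the treatment effect, subtract a per-patient trial cost, and choose the sample size that maximizes the result. The prior, cost, and benefit figures are illustrative assumptions, not values from the paper.

```python
# A minimal sketch of choosing a sample size by maximizing expected net benefit
# under a prior on the effect, with a frequentist one-sided test as the
# regulator's decision rule. All numbers are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
sigma, alpha = 1.0, 0.025                 # known SD, one-sided test level
benefit_per_unit_effect = 1e6             # benefit of licensing, scaled by true effect
cost_per_patient = 2000.0
prior = stats.norm(loc=0.3, scale=0.2)    # prior on the treatment effect delta

def expected_net_benefit(n, n_sim=20000):
    delta = prior.rvs(n_sim, random_state=rng)
    se = sigma * np.sqrt(2.0 / n)                      # two-arm trial, n per arm
    # Probability the regulator's one-sided z-test succeeds, given delta
    power = 1 - stats.norm.cdf(stats.norm.ppf(1 - alpha) - delta / se)
    return np.mean(power * benefit_per_unit_effect * delta) - cost_per_patient * 2 * n

grid = np.arange(10, 500, 10)
best = grid[np.argmax([expected_net_benefit(n) for n in grid])]
print("per-arm sample size maximizing expected net benefit:", best)
```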
Optimal route discovery for soft QOS provisioning in mobile ad hoc multimedia networks
NASA Astrophysics Data System (ADS)
Huang, Lei; Pan, Feng
2007-09-01
In this paper, we propose an optimal route discovery algorithm for ad hoc multimedia networks whose resources keep changing. First, we use stochastic models to measure network resource availability, based on information about the location and moving pattern of the nodes, as well as the link conditions between neighboring nodes. Then, for a multimedia packet flow to be transmitted from a source to a destination, we formulate the optimal soft-QoS provisioning problem as finding the best route that maximizes the probability of satisfying the desired QoS requirements in terms of maximum delay constraints. Based on the stochastic network resource model, we developed three approaches to solve the formulated problem: a centralized approach serving as the theoretical reference, a distributed approach that is more suitable for practical real-time deployment, and a distributed dynamic approach that utilizes updated time information to optimize the routing for each individual packet. Numerical results demonstrate that, using the route discovered by our distributed algorithm in a changing network environment, multimedia applications can achieve statistically better QoS.
Efficient Bayesian experimental design for contaminant source identification
NASA Astrophysics Data System (ADS)
Zhang, Jiangjiang; Zeng, Lingzao; Chen, Cheng; Chen, Dingjiang; Wu, Laosheng
2015-01-01
In this study, an efficient full Bayesian approach is developed for the optimal sampling well location design and source parameters identification of groundwater contaminants. An information measure, i.e., the relative entropy, is employed to quantify the information gain from concentration measurements in identifying unknown parameters. In this approach, the sampling locations that give the maximum expected relative entropy are selected as the optimal design. After the sampling locations are determined, a Bayesian approach based on Markov Chain Monte Carlo (MCMC) is used to estimate unknown parameters. In both the design and estimation, the contaminant transport equation is required to be solved many times to evaluate the likelihood. To reduce the computational burden, an interpolation method based on the adaptive sparse grid is utilized to construct a surrogate for the contaminant transport equation. The approximated likelihood can be evaluated directly from the surrogate, which greatly accelerates the design and estimation process. The accuracy and efficiency of our approach are demonstrated through numerical case studies. It is shown that the methods can be used to assist in both single sampling location and monitoring network design for contaminant source identifications in groundwater.
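A minimal sketch of the design step, choosing the sampling location with the largest expected information gain (relative entropy), using a nested Monte Carlo estimate and a toy one-dimensional forward model in place of the real contaminant transport simulator and sparse-grid surrogate.

```python
# A minimal sketch of expected-information-gain design selection. The toy
# "plume" forward model, priors, and noise level are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(5)
sigma_obs = 0.05

def forward(theta, x):
    """Toy surrogate: Gaussian plume centered at the unknown source location."""
    src, strength = theta
    return strength * np.exp(-0.5 * ((x - src) / 0.5) ** 2)

def sample_prior(m):
    return np.column_stack([rng.uniform(0, 10, m), rng.uniform(0.5, 2.0, m)])

def expected_information_gain(x, n_outer=200, n_inner=200):
    thetas = sample_prior(n_outer)
    eig = 0.0
    for th in thetas:
        y = forward(th, x) + rng.normal(0, sigma_obs)
        log_lik = -0.5 * ((y - forward(th, x)) / sigma_obs) ** 2
        # log marginal likelihood estimated from an inner prior sample
        inner = sample_prior(n_inner)
        inner_ll = -0.5 * ((y - forward(inner.T, x)) / sigma_obs) ** 2
        log_evidence = np.log(np.mean(np.exp(inner_ll)))
        eig += (log_lik - log_evidence) / n_outer
    return eig

candidates = np.linspace(0, 10, 21)
best = candidates[np.argmax([expected_information_gain(x) for x in candidates])]
print("sampling location with largest expected information gain:", best)
```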
Bodner, Todd E.
2017-01-01
Wilkinson and Task Force on Statistical Inference (1999) recommended that researchers include information on the practical magnitude of effects (e.g., using standardized effect sizes) to distinguish between the statistical and practical significance of research results. To date, however, researchers have not widely incorporated this recommendation into the interpretation and communication of the conditional effects and differences in conditional effects underlying statistical interactions involving a continuous moderator variable where at least one of the involved variables has an arbitrary metric. This article presents a descriptive approach to investigate two-way statistical interactions involving continuous moderator variables where the conditional effects underlying these interactions are expressed in standardized effect size metrics (i.e., standardized mean differences and semi-partial correlations). This approach permits researchers to evaluate and communicate the practical magnitude of particular conditional effects and differences in conditional effects using conventional and proposed guidelines, respectively, for the standardized effect size and therefore provides the researcher important supplementary information lacking under current approaches. The utility of this approach is demonstrated with two real data examples and important assumptions underlying the standardization process are highlighted. PMID:28484404
Cavagnaro, Daniel R; Myung, Jay I; Pitt, Mark A; Kujala, Janne V
2010-04-01
Discriminating among competing statistical models is a pressing issue for many experimentalists in the field of cognitive science. Resolving this issue begins with designing maximally informative experiments. To this end, the problem to be solved in adaptive design optimization is identifying experimental designs under which one can infer the underlying model in the fewest possible steps. When the models under consideration are nonlinear, as is often the case in cognitive science, this problem can be impossible to solve analytically without simplifying assumptions. However, as we show in this letter, a full solution can be found numerically with the help of a Bayesian computational trick derived from the statistics literature, which recasts the problem as a probability density simulation in which the optimal design is the mode of the density. We use a utility function based on mutual information and give three intuitive interpretations of the utility function in terms of Bayesian posterior estimates. As a proof of concept, we offer a simple example application to an experiment on memory retention.
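A minimal sketch of a mutual-information design utility for discriminating two retention models (power vs. exponential) at a single candidate lag; the model forms, priors, and binomial data model are illustrative assumptions rather than the paper's setup.

```python
# A minimal sketch of a mutual-information utility I(model; data) at a
# candidate retention lag, estimated by Monte Carlo. Models and priors are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(6)
n_trials = 30

def p_recall(model, t, a):
    return a * t ** -0.5 if model == "power" else a * np.exp(-0.2 * t)

def utility(t, n_sim=4000):
    models = rng.choice(["power", "exponential"], n_sim)     # uniform model prior
    a = rng.uniform(0.4, 1.0, n_sim)                          # parameter prior
    p = np.array([p_recall(m, t, ai) for m, ai in zip(models, a)])
    y = rng.binomial(n_trials, np.clip(p, 0, 1))

    def entropy(counts):
        q = counts / counts.sum()
        q = q[q > 0]
        return -(q * np.log(q)).sum()

    # I(M; Y) = H(Y) - H(Y | M), estimated from the simulated joint sample
    h_y = entropy(np.bincount(y, minlength=n_trials + 1))
    h_y_given_m = sum(
        (models == m).mean()
        * entropy(np.bincount(y[models == m], minlength=n_trials + 1))
        for m in ("power", "exponential"))
    return h_y - h_y_given_m

lags = [1, 2, 5, 10, 20]
print({t: round(utility(t), 3) for t in lags})
```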
Smart City: Utilization of IT resources to encounter natural disaster
NASA Astrophysics Data System (ADS)
Hartama, D.; Mawengkang, Herman; Zarlis, M.; Sembiring, R. W.
2017-09-01
This study proposes a framework for the utilization of IT resources in the face of natural disasters, using the Smart City concept in urban areas that frequently experience earthquakes, particularly in North Sumatra and Aceh. A Smart City is a city that integrates social development, capital, civic participation, and transportation with the use of information technology to support the preservation of natural resources and improved quality of life. Changes in climate and environment have increased the occurrence of natural disasters in recent decades, with socio-economic impacts on communities. This study suggests a new approach that combines a Geographic Information System (GIS) and Android-based mobile IT to provide geospatial information for disaster response. IT resources and infrastructure implementing Smart Mobility with mobile services can help urban areas function as Smart Cities. This study describes urban growth using the Smart City concept and considers how GIS and mobile systems can improve disaster management, which consists of preparedness, mitigation, response, and recovery from natural disasters.
Advanced algorithms for distributed fusion
NASA Astrophysics Data System (ADS)
Gelfand, A.; Smith, C.; Colony, M.; Bowman, C.; Pei, R.; Huynh, T.; Brown, C.
2008-03-01
The US Military has been undergoing a radical transition from a traditional "platform-centric" force to one capable of performing in a "Network-Centric" environment. This transformation will place all of the data needed to efficiently meet tactical and strategic goals at the warfighter's fingertips. With access to this information, the challenge emerges of fusing data from across the battlespace into an operational picture for real-time situational awareness. In such an environment, centralized fusion approaches will have limited application due to the constraints of real-time communications networks and computational resources. To overcome these limitations, we are developing a formalized architecture for fusion and track adjudication that allows the distribution of fusion processes over a dynamically created and managed information network. This network will support the incorporation and utilization of low-level tracking information within the Army Distributed Common Ground System (DCGS-A) or Future Combat System (FCS). The framework is based on Bowman's Dual Node Network (DNN) architecture, which utilizes a distributed network of interlaced fusion and track adjudication nodes to build and maintain a globally consistent picture across all assets.
Physical Therapy in the Treatment of Central Pain Mechanisms for Female Sexual Pain.
Vandyken, Carolyn; Hilton, Sandra
2017-01-01
The complexity of female sexual pain requires an interdisciplinary approach. Physical therapists trained in pelvic health conditions are well positioned to be active members of an interdisciplinary team addressing the assessment and treatment of female sexual pain. Changes within physical therapy practice in the last ten years have resulted in significant utilization of pelvic floor muscle relaxation and manual therapy techniques to address a variety of pelvic pain conditions, including female sexual pain. However, sexual pain is a complex issue giving credence to the necessity of addressing all of the drivers of the pain experience- biological, psychological and social. This review aims to reconcile current pain science with a plan for integrating a biopsychosocial approach into the evaluation and subsequent treatment for female sexual pain for physical therapists. A literature review of the important components of skilled physical therapy interventions is presented including the physical examination, pain biology education, cognitive behavioral influences in treatment design, motivational interviewing as an adjunct to empathetic practice, and the integration of non-threatening movement and mindfulness into treatment. A single case study is used to demonstrate the biopsychosocial framework utilized in this approach. Appropriate measures for assessing psychosocial factors are readily available and inform a reasoned approach for physical therapy design that addresses both peripheral and central pain mechanisms. Decades of research support the integration of a biopsychosocial approach in the treatment of complex pain, including female sexual pain. It is reasonable for physical therapists to utilize evidence based strategies such as CBT, pain biology education, Mindfulness Based Stress Reduction (MBSR), yoga and imagery based exercises to address the biopsychosocial components of female sexual pain. Copyright © 2016 International Society for Sexual Medicine. Published by Elsevier Inc. All rights reserved.
Ellis, Margaret S.; Affolter, Ronald H.
2007-01-01
The Energy Resources Program of the U.S. Geological Survey promotes and supports coal research to improve the understanding of the coal endowment of the United States. This results in geologically based, non-biased energy information products for policy and decision makers, land and resource managers, other federal and state agencies, the domestic energy industry, foreign governments, nongovernmental groups, academia, and other scientists. A more integrated approach to our coal quality work involves what we call a 'cradle to grave' approach. These types of studies focus not on just one aspect of the coal but rather on how or where different quality parameters form and (or) occur and what happens to them through the mining, production, transport, utilization, and waste disposal process. An extensive suite of coal quality, mineralogical, petrological, and leaching analyses is performed on samples taken from the different phases of the coal utilization process. This report consists of a tutorial that was given on June 10, 2007 at the 32nd International Technical Conference on Coal Utilization & Fuel Systems, The Power of Coal, Clearwater Coal Conference in Clearwater, Florida, USA. This tutorial covers how these studies are conducted and the importance of providing improved, comprehensive, science-based data sets for policy and decision makers.
A statistical approach to combining multisource information in one-class classifiers
Simonson, Katherine M.; Derek West, R.; Hansen, Ross L.; ...
2017-06-08
A new method is introduced in this paper for combining information from multiple sources to support one-class classification. The contributing sources may represent measurements taken by different sensors of the same physical entity, repeated measurements by a single sensor, or numerous features computed from a single measured image or signal. The approach utilizes the theory of statistical hypothesis testing, and applies Fisher's technique for combining p-values, modified to handle nonindependent sources. Classifier outputs take the form of fused p-values, which may be used to gauge the consistency of unknown entities with one or more class hypotheses. The approach enables rigorous assessment of classification uncertainties, and allows for traceability of classifier decisions back to the constituent sources, both of which are important for high-consequence decision support. Application of the technique is illustrated in two challenge problems, one for skin segmentation and the other for terrain labeling. Finally, the method is seen to be particularly effective for relatively small training samples.
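A minimal sketch of Fisher's combining rule, with a simple scaled chi-square adjustment for correlated sources in the spirit of Brown's method; the paper's exact modification for nonindependent sources may differ.

```python
# A minimal sketch of Fisher's p-value combination, with an optional
# moment-matched correction for dependent sources (Brown-style assumption).
import numpy as np
from scipy import stats

def fisher_combine(pvals, cov=None):
    """Combine p-values; `cov` is the covariance matrix of the -2*log(p) terms
    (omit it for independent sources)."""
    pvals = np.asarray(pvals, dtype=float)
    stat = -2.0 * np.log(pvals).sum()
    k = len(pvals)
    if cov is None:                          # independent: chi-square with df = 2k
        return stats.chi2.sf(stat, df=2 * k)
    mean, var = 2.0 * k, cov.sum()           # moment-match a scaled chi-square
    c, df = var / (2.0 * mean), 2.0 * mean ** 2 / var
    return stats.chi2.sf(stat / c, df=df)

# Example: three feature-level p-values for one unknown entity vs. a class model
print(fisher_combine([0.03, 0.20, 0.08]))
```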
Contact detection for nanomanipulation in a scanning electron microscope.
Ru, Changhai; To, Steve
2012-07-01
Nanomanipulation systems require accurate knowledge of the end-effector position in all three spatial coordinates, XYZ, for reliable manipulation of nanostructures. Although the images acquired by a scanning electron microscope (SEM) provide high resolution XY information, the lack of depth information in the Z-direction makes 3D nanomanipulation time-consuming. Existing approaches for contact detection of end-effectors inside SEM typically utilize fragile touch sensors that are difficult to integrate into a nanomanipulation system. This paper presents a method for determining the contact between an end-effector and a target surface during nanomanipulation inside SEM, purely based on the processing of SEM images. A depth-from-focus method is used in the fast approach of the end-effector to the substrate, followed by fine contact detection. Experimental results demonstrate that the contact detection approach is capable of achieving an accuracy of 21.5 nm at 50,000× magnification while inducing little end-effector damage. Copyright © 2012 Elsevier B.V. All rights reserved.
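A minimal sketch of a depth-from-focus cue of the kind described above: a variance-of-Laplacian focus measure is computed on an image patch at each Z stage position, and the sharpest frame indicates the in-focus height. The synthetic blur stack is for illustration only and does not reproduce the paper's SEM processing.

```python
# A minimal sketch of a depth-from-focus focus measure over a Z stack.
# The synthetic blurred patches stand in for real SEM image patches.
import numpy as np
from scipy.ndimage import gaussian_filter, laplace

def focus_measure(patch):
    """Variance of the Laplacian: higher means sharper (better focused)."""
    return laplace(patch.astype(float)).var()

def best_focus_index(z_stack):
    """z_stack: sequence of 2-D image patches taken at known Z stage positions."""
    return int(np.argmax([focus_measure(p) for p in z_stack]))

# Synthetic demonstration: the least-blurred frame should be selected
rng = np.random.default_rng(7)
sharp = rng.random((64, 64))
z_stack = [gaussian_filter(sharp, sigma=s) for s in (4.0, 2.0, 0.5, 2.0, 4.0)]
print("sharpest frame index:", best_focus_index(z_stack))   # expected: 2
```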
The indicator performance estimate approach to determining acceptable wilderness conditions
NASA Astrophysics Data System (ADS)
Hollenhorst, Steven; Gardner, Lisa
1994-11-01
Using data from a study conducted in the Cranberry Wilderness Area of West Virginia, United States, this paper describes how a modified importance—performance approach can be used to prioritize wilderness indicators and determine how much change from the pristine is acceptable. The approach uses two key types of information: (1) indicator importance, or visitor opinion as to which wilderness indicators have the greatest influence on their experience, and (2) management performance, or the extent to which actual indicator conditions exceed or are within visitor expectations. Performance was represented by calculating indicator performance estimates (IPEs), as defined by standardized differences between actual conditions and visitor preferences for each indicator. The results for each indicator are then presented graphically on a four-quadrant matrix for objective interpretation. Each quadrant represents a management response: keep up the good work, concentrate here, low priority, or possible overkill. The technique allows managers to more systematically and effectively utilize information routinely collected during the limits of acceptable change wilderness planning process.
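A minimal sketch of computing indicator performance estimates and placing indicators on the four-quadrant matrix; the indicator names, ratings, and standardization are illustrative numbers, not the Cranberry Wilderness data.

```python
# A minimal sketch of indicator performance estimates (IPEs) and quadrant
# assignment. All indicator names and values are illustrative assumptions.
import numpy as np

indicators = ["litter", "encounters per day", "campsite impact", "trail erosion"]
importance = np.array([4.6, 3.9, 4.2, 3.1])     # mean rated importance (1-5)
actual     = np.array([2.1, 6.5, 3.0, 2.2])     # observed condition
preferred  = np.array([1.0, 4.0, 2.5, 3.0])     # visitor preference
spread     = np.array([1.2, 2.5, 1.1, 1.4])     # SD used to standardize the gap

ipe = (actual - preferred) / spread              # > 0 means worse than preferred

for name, imp, gap in zip(indicators, importance, ipe):
    if imp >= importance.mean():
        quadrant = "concentrate here" if gap > 0 else "keep up the good work"
    else:
        quadrant = "low priority" if gap > 0 else "possible overkill"
    print(f"{name:20s} IPE={gap:+.2f} -> {quadrant}")
```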
Gillum, Tameka L
2014-10-01
Research is clear that violence against college women is a problem that warrants alternative prevention approaches to addressing and reducing its prevalence and creating safer campuses for women and men. Banyard's presentation gave us food for thought as we consider what such novel approaches may look like. New and innovative approaches that are multifaceted, comprehensive, and informed by theory are key. The ecological model can inform our understanding of the issue, the risk and protective factors associated, and the design and implementation of prevention efforts. It is critically important to engage college students in these efforts to create interventions that are culturally appropriate for college students. We must also meet students where they are, utilizing social marketing campaigns and capitalizing on social media and the use of communication technologies. Together, such efforts will facilitate our ultimate goal of reducing, if not eliminating, violence against women on college campuses. © The Author(s) 2014.
Bilal, Selamawit M; Spigt, Mark; Dinant, Geert Jan; Blanco, Roman
2015-03-01
Universal access to Sexual and Reproductive Health (SRH) services for adolescents was added as a target to the revised Millennium Development Goals framework in 2005. However, the utilization of SRH services among adolescents and their sexual activity are not well explored in Ethiopia, with the result that there is no well-designed and sustainable school-based intervention for high school students. We aimed to investigate the utilization of sexual and reproductive health services and sexual activity, and to provide evidence-based information and recommendations for possible interventions. A cross-sectional survey was conducted among 1031 female and male high school students aged 14-19 years in Mekelle town, Tigray Region, North Ethiopia. Utilization of sexual and reproductive health services and sexual activity were investigated using a self-administered questionnaire. One out of five students had used SRH services in the past year. The primary reason for visiting the SRH services was to receive information. The mean age at first sexual intercourse was 15.7 years, and one-quarter of the students had multiple sexual partners. Unwanted pregnancies and abortions were reported by female students. SRH services are known and used by students. However, sexual activity at an early age among high school students and unwanted pregnancies and abortions among female students still call for attention. Therefore, providing accurate SRH information on safe sex and enhancing family-student discussion could be a good approach to promoting adolescent SRH. Copyright © 2014 Elsevier B.V. All rights reserved.
An integrative health information systems approach for facilitating strategic planning in hospitals.
Killingsworth, Brenda; Newkirk, Henry E; Seeman, Elaine
2006-01-01
This article presents a framework for developing strategic information systems (SISs) for hospitals. It proposes a SIS formulation process which incorporates complexity theory, strategic/organizational analysis theory, and conventional MIS development concepts. Within the formulation process, four dimensions of SIS are proposed as well as an implementation plan. A major contribution of this article is the development of a hospital SIS framework which permits an organization to fluidly respond to external, interorganizational, and intraorganizational influences. In addition, this article offers a checklist which managers can utilize in developing an SIS in health care.
Shifting landscapes: immigrant women and postpartum depression.
Morrow, Marina; Smith, Jules E; Lai, Yuan; Jaswal, Suman
2008-07-01
Utilizing an ethnographic narrative approach, we explored in the Canadian context the experiences of three groups of first-generation Punjabi-speaking, Cantonese-speaking, and Mandarin-speaking immigrant women with depression after childbirth. The information emerging from women's narratives of their experiences reveals the critical importance of the sociocultural context of childbirth in understanding postpartum depression. We suggest that an examination of women's narratives about their experiences of postpartum depression can broaden the understanding of the kinds of perinatal supports women need beyond health care provision and yet can also usefully inform the practice of health care professionals.
Continual improvement: A bibliography with indexes, 1992-1993
NASA Technical Reports Server (NTRS)
1994-01-01
This bibliography lists 606 references to reports and journal articles entered into the NASA Scientific and Technical Information Database during 1992 to 1993. Topics cover the philosophy and history of Continual Improvement (CI), basic approaches and strategies for implementation, and lessons learned from public and private sector models. Entries are arranged according to the following categories: Leadership for Quality, Information and Analysis, Strategic Planning for CI, Human Resources Utilization, Management of Process Quality, Supplier Quality, Assessing Results, Customer Focus and Satisfaction, TQM Tools and Philosophies, and Applications. Indexes include subject, personal author, corporate source, contract number, report number, and accession number.
NASA Astrophysics Data System (ADS)
Ivanov, A. V.; Reva, I. L.; Babin, A. A.
2018-04-01
The article examines how different placements of vibration transducers affect the protection of rooms used for confidential negotiations. A setup for remote vibration-based eavesdropping on window glass via the electro-optical channel, one of the most typical technical channels of information leakage, was investigated. The modern "Sonata-AB" system (model 4B) is used as the active protection tool. Factors influencing the efficiency of the protection-tool configuration have been determined. The results allow the user to reduce the masking interference level, as well as parasitic noise, while preserving the room's protection properties.
Self-Taught Low-Rank Coding for Visual Learning.
Li, Sheng; Li, Kang; Fu, Yun
2018-03-01
The lack of labeled data presents a common challenge in many computer vision and machine learning tasks. Semisupervised learning and transfer learning methods have been developed to tackle this challenge by utilizing auxiliary samples from the same domain or from a different domain, respectively. Self-taught learning, which is a special type of transfer learning, has fewer restrictions on the choice of auxiliary data. It has shown promising performance in visual learning. However, existing self-taught learning methods usually ignore the structure information in data. In this paper, we focus on building a self-taught coding framework, which can effectively utilize the rich low-level pattern information abstracted from the auxiliary domain, in order to characterize the high-level structural information in the target domain. By leveraging a high quality dictionary learned across auxiliary and target domains, the proposed approach learns expressive codings for the samples in the target domain. Since many types of visual data have been proven to contain subspace structures, a low-rank constraint is introduced into the coding objective to better characterize the structure of the given target set. The proposed representation learning framework is called self-taught low-rank (S-Low) coding, which can be formulated as a nonconvex rank-minimization and dictionary learning problem. We devise an efficient majorization-minimization augmented Lagrange multiplier algorithm to solve it. Based on the proposed S-Low coding mechanism, both unsupervised and supervised visual learning algorithms are derived. Extensive experiments on five benchmark data sets demonstrate the effectiveness of our approach.
Thermodynamic Studies for Drug Design and Screening
Garbett, Nichola C.; Chaires, Jonathan B.
2012-01-01
Introduction A key part of drug design and development is the optimization of molecular interactions between an engineered drug candidate and its binding target. Thermodynamic characterization provides information about the balance of energetic forces driving binding interactions and is essential for understanding and optimizing molecular interactions. Areas covered This review discusses the information that can be obtained from thermodynamic measurements and how this can be applied to the drug development process. Current approaches for the measurement and optimization of thermodynamic parameters are presented, specifically higher throughput and calorimetric methods. Relevant literature for this review was identified in part by bibliographic searches for the period 2004 – 2011 using the Science Citation Index and PUBMED and the keywords listed below. Expert opinion The most effective drug design and development platform comes from an integrated process utilizing all available information from structural, thermodynamic and biological studies. Continuing evolution in our understanding of the energetic basis of molecular interactions and advances in thermodynamic methods for widespread application are essential to realize the goal of thermodynamically-driven drug design. Comprehensive thermodynamic evaluation is vital early in the drug development process to speed drug development towards an optimal energetic interaction profile while retaining good pharmacological properties. Practical thermodynamic approaches, such as enthalpic optimization, thermodynamic optimization plots and the enthalpic efficiency index, have now matured to provide proven utility in design process. Improved throughput in calorimetric methods remains essential for even greater integration of thermodynamics into drug design. PMID:22458502
Characterization of PTO and Idle Behavior for Utility Vehicles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duran, Adam W.; Konan, Arnaud M.; Miller, Eric S.
This report presents the results of analyses performed on utility vehicle data composed primarily of aerial lift bucket trucks sampled from the National Renewable Energy Laboratory's Fleet DNA database to characterize power takeoff (PTO) and idle operating behavior for utility trucks. Two major data sources were examined in this study: a 75-vehicle sample of Odyne electric PTO (ePTO)-equipped vehicles drawn from multiple fleets spread across the United States and 10 conventional PTO-equipped Pacific Gas and Electric fleet vehicles operating in California. Novel data mining approaches were developed to identify PTO and idle operating states for each of the datasets using telematics and controller area network/onboard diagnostics data channels. These methods were applied to the individual datasets and aggregated to develop utilization curves and distributions describing PTO and idle behavior in both absolute and relative operating terms. This report also includes background information on the source vehicles, development of the analysis methodology, and conclusions regarding the study's findings.
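As a hedged illustration of how classified telematics samples might be aggregated into the absolute and relative utilization figures described above (the report's actual data channels and state-identification logic are not reproduced), a minimal pandas sketch with hypothetical column names:

    import pandas as pd

    # Hypothetical 1 Hz telematics log with an already-classified operating state
    # per sample ("drive", "idle", "pto"); the column names are illustrative only.
    log = pd.DataFrame({
        "timestamp": pd.date_range("2024-01-01 08:00", periods=6, freq="s"),
        "state": ["drive", "idle", "idle", "pto", "pto", "drive"],
    })

    # Absolute operating time per state (each row represents one second here).
    absolute = log["state"].value_counts()

    # Relative utilization: share of total operating time spent in each state.
    relative = absolute / absolute.sum()
    print(absolute.to_dict())
    print(relative.round(2).to_dict())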
Observability and Estimation of Distributed Space Systems via Local Information-Exchange Networks
NASA Technical Reports Server (NTRS)
Rahmani, Amirreza; Mesbahi, Mehran; Fathpour, Nanaz; Hadaegh, Fred Y.
2008-01-01
In this work, we develop an approach to formation estimation by explicitly characterizing the formation's system-theoretic attributes in terms of the underlying inter-spacecraft information-exchange network. In particular, we approach the formation observer/estimator design by relaxing the requirement that a centralized observer/estimator have access to the global state information and, in turn, providing an analysis and synthesis framework for formation observers/estimators that rely on local measurements. The novelty of our approach hinges upon the explicit examination of the underlying distributed spacecraft network in the realm of guidance, navigation, and control algorithmic analysis and design. The overarching goal of our general research program, some of whose results are reported in this paper, is the development of distributed spacecraft estimation algorithms that are scalable, modular, and robust to variations in the topology and link characteristics of the formation information-exchange network. In this work, we consider the observability of a spacecraft formation from a single observation node and utilize the agreement protocol as a mechanism for observing formation states from local measurements. Specifically, we show how the symmetry structure of the network, characterized in terms of its automorphism group, directly relates to the observability of the corresponding multi-agent system. The ramifications of this notion of observability over networks are then explored in the context of distributed formation estimation.
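A minimal numerical sketch of the single-node observability check alluded to above, using agreement-protocol dynamics x_dot = -Lx on an illustrative four-node path graph; the graph and the choice of observation node are assumptions for illustration, not the paper's formation model.

    import numpy as np

    # Path graph on 4 spacecraft (illustrative); L is the graph Laplacian.
    adjacency = np.array([[0, 1, 0, 0],
                          [1, 0, 1, 0],
                          [0, 1, 0, 1],
                          [0, 0, 1, 0]])
    L = np.diag(adjacency.sum(axis=1)) - adjacency

    # Agreement-protocol dynamics x_dot = -L x, observed from node 0 only.
    A = -L
    C = np.zeros((1, 4))
    C[0, 0] = 1.0

    # Kalman observability matrix [C; CA; CA^2; CA^3] and its rank test.
    O = np.vstack([C @ np.linalg.matrix_power(A, k) for k in range(4)])
    print("observable from node 0:", np.linalg.matrix_rank(O) == 4)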
Context-based virtual metrology
NASA Astrophysics Data System (ADS)
Ebersbach, Peter; Urbanowicz, Adam M.; Likhachev, Dmitriy; Hartig, Carsten; Shifrin, Michael
2018-03-01
Hybrid and data feed-forward methodologies are well established for advanced optical process control solutions in high-volume semiconductor manufacturing. Appropriate information from previous measurements, transferred into advanced optical model(s) at following step(s), provides enhanced accuracy and exactness of the measured topographic (thicknesses, critical dimensions, etc.) and material parameters. In some cases, hybrid or feed-forward data are missing or invalid for dies or for a whole wafer. We focus on virtual metrology approaches to re-create hybrid or feed-forward data inputs in high-volume manufacturing. We discuss reconstruction of missing data inputs, which is based on various interpolation and extrapolation schemes and uses information about the wafer's process history. Moreover, we demonstrate a data reconstruction approach based on machine learning techniques utilizing the optical model and measured spectra. Finally, we investigate metrics that allow one to assess the error margin of virtual data inputs.
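As a hedged sketch of the machine-learning flavor of virtual metrology described above (not the authors' implementation), a regression model trained on wafers whose feed-forward input was actually measured can supply a virtual input where it is missing; the features and numbers below are entirely hypothetical.

    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.default_rng(1)

    # Hypothetical training set: process-history features for wafers whose
    # feed-forward input (e.g., a prior-layer thickness) was measured upstream.
    X_train = rng.normal(size=(200, 5))          # tool/chamber/prior-step summaries
    y_train = 100 + 5 * X_train[:, 0] + rng.normal(scale=0.5, size=200)   # thickness (nm)

    model = GradientBoostingRegressor().fit(X_train, y_train)

    # Wafers (or dies) with missing hybrid/feed-forward data: predict a virtual input.
    X_missing = rng.normal(size=(3, 5))
    print(model.predict(X_missing).round(2))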
NASA Technical Reports Server (NTRS)
Brubaker, N.; Jedlovec, G. J.
2004-01-01
With the preliminary release of AIRS Level 1 and 2 data to the scientific community, there is a growing need for an accurate AIRS cloud mask for data assimilation studies and for producing products derived from cloud-free radiances. Current cloud information provided with the AIRS data is limited or based on simplified threshold tests. A multispectral cloud detection approach has been developed for AIRS that utilizes its hyperspectral capabilities to detect clouds based on specific cloud signatures across the shortwave and longwave infrared window regions. This new AIRS cloud mask has been validated against the existing AIRS Level 2 cloud product and cloud information derived from MODIS. Preliminary results for both day and night applications over the continental U.S. are encouraging. Details of the cloud detection approach and validation results will be presented at the conference.
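A heavily simplified infrared window threshold test, in the spirit of the multispectral signatures mentioned above; the channels, thresholds, and logic are placeholders for illustration and are not the validated AIRS cloud mask.

    import numpy as np

    def simple_ir_cloud_test(bt_longwave, bt_shortwave, bt_surface_estimate,
                             cold_offset=8.0, diff_threshold=2.5):
        """Illustrative two-test infrared cloud flag (not the operational AIRS mask).

        Test 1: pixel much colder than the expected clear-sky surface temperature.
        Test 2: large shortwave/longwave window brightness-temperature difference.
        """
        cold_test = bt_longwave < (bt_surface_estimate - cold_offset)
        diff_test = np.abs(bt_shortwave - bt_longwave) > diff_threshold
        return cold_test | diff_test

    bt_lw = np.array([292.0, 278.0, 290.0])   # longwave window brightness temperatures (K)
    bt_sw = np.array([292.5, 281.0, 295.5])   # shortwave window brightness temperatures (K)
    print(simple_ir_cloud_test(bt_lw, bt_sw, bt_surface_estimate=294.0))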
A Group Based Key Sharing and Management Algorithm for Vehicular Ad Hoc Networks
Moharram, Mohammed Morsi; Azam, Farzana
2014-01-01
Vehicular ad hoc networks (VANETs) are a special type of ad hoc network involving vehicles on roads. As in other ad hoc networks, a broadcast approach is typically used for data dissemination. Blind broadcast to each and every node results in the exchange of useless and irrelevant messages and hence creates overhead. Unicasting is not preferred in ad hoc networks, due to the dynamic topology and its resource requirements compared with broadcasting. Simple broadcasting techniques create several problems related to privacy, disturbance, and resource utilization. In this paper, we propose a media mixing algorithm to decide what information should be provided to each user and how to provide it. Results obtained through simulation show that fewer keys need to be shared compared with simple broadcasting. Privacy is also enhanced through this approach. PMID:24587749
Definition of the 2005 flight deck environment
NASA Technical Reports Server (NTRS)
Alter, K. W.; Regal, D. M.
1992-01-01
A detailed description of the functional requirements necessary to complete any normal commercial flight or to handle any plausible abnormal situation is provided. This analysis is enhanced with an examination of possible future developments and constraints in the areas of air traffic organization and flight deck technologies (including new devices and procedures) which may influence the design of 2005 flight decks. This study includes a discussion on the importance of a systematic approach to identifying and solving flight deck information management issues, and a description of how the present work can be utilized as part of this approach. While the intent of this study was to investigate issues surrounding information management in 2005-era supersonic commercial transports, this document may be applicable to any research endeavor related to future flight deck system design in either supersonic or subsonic airplane development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wirick, D.W.; Montgomery, G.E.; Wagman, D.C.
1995-09-01
One technology that can help utilities remain financially viable in competitive markets and help utilities and regulators better serve the public is information technology. Because geography is an important part of an electric, natural gas, telecommunications, or water utility, computer-based Geographic Information Systems (GIS) and related Automated Mapping/Facilities Management systems are emerging as core technologies for managing an ever-expanding variety of formerly manual or paper-based tasks. This report focuses on GIS as an example of the types of information systems that can be used by utilities and regulatory commissions. Chapter 2 provides general information about information systems and the effects of information on organizations; Chapter 3 explores the conversion of an organization to an information-based one; Chapters 4 and 5 set out GIS as an example of the use of information technologies to transform the operations of utilities and commissions; Chapter 6 describes the use of GIS and other information systems for organizational reengineering efforts; and Chapter 7 examines the regulatory treatment of information systems.
Security and Interdependency in a Public Cloud: A Game Theoretic Approach
2014-08-29
maximum utility can be reached (i.e., Pareto efficiency). However, the examples of perverse incentives and information inequality (where this feedback...interdependent structure. Cloud computing gives way to two types of interdependent relationships: cloud host-to-client and cloud client-to-client ... Client-to-client interdependency is much less studied than the above-mentioned cloud host-to-client relationship. Although, it can still carry the
Shuttle Electrical Power Analysis Program (SEPAP); single string circuit analysis report
NASA Technical Reports Server (NTRS)
Murdock, C. R.
1974-01-01
An evaluation is reported of the data obtained from an analysis of the distribution network characteristics of the shuttle during a spacelab mission. A description of the approach utilized in the development of the computer program and data base is provided and conclusions are drawn from the analysis of the data. Data sheets are provided for information to support the detailed discussion on each computer run.
Multi-Sensor Information Integration and Automatic Understanding
2008-05-27
distributions for target tracks and class which are utilized by an active learning cueing management framework to optimally task the appropriate sensor...modality to cued regions of interest. Moreover, this active learning approach also facilitates analyst cueing to help resolve track ambiguities in complex...scenes. We intend to leverage SIG’s active learning with analyst cueing under future efforts with ONR and other DoD agencies. Obtaining long-term
Multi-Sensor Information Integration and Automatic Understanding
2008-08-27
distributions for target tracks and class which are utilized by an active learning cueing management framework to optimally task the appropriate sensor modality...to cued regions of interest. Moreover, this active learning approach also facilitates analyst cueing to help resolve track ambiguities in complex...scenes. We intend to leverage SIG’s active learning with analyst cueing under future efforts with ONR and other DoD agencies. Obtaining long-term
Human Rights-Based Approaches to Mental Health
Bradley, Valerie J.; Sahakian, Barbara J.
2016-01-01
Abstract The incidence of human rights violations in mental health care across nations has been described as a “global emergency” and an “unresolved global crisis.” The relationship between mental health and human rights is complex and bidirectional. Human rights violations can negatively impact mental health. Conversely, respecting human rights can improve mental health. This article reviews cases where an explicitly human rights-based approach was used in mental health care settings. Although the included studies did not exhibit a high level of methodological rigor, the qualitative information obtained was considered useful and informative for future studies. All studies reviewed suggest that human-rights based approaches can lead to clinical improvements at relatively low costs. Human rights-based approaches should be utilized for legal and moral reasons, since human rights are fundamental pillars of justice and civilization. The fact that such approaches can contribute to positive therapeutic outcomes and, potentially, cost savings, is additional reason for their implementation. However, the small sample size and lack of controlled, quantitative measures limit the strength of conclusions drawn from included studies. More objective, high quality research is needed to ascertain the true extent of benefits to service users and providers. PMID:27781015
Human Rights-Based Approaches to Mental Health: A Review of Programs.
Porsdam Mann, Sebastian; Bradley, Valerie J; Sahakian, Barbara J
2016-06-01
The incidence of human rights violations in mental health care across nations has been described as a "global emergency" and an "unresolved global crisis." The relationship between mental health and human rights is complex and bidirectional. Human rights violations can negatively impact mental health. Conversely, respecting human rights can improve mental health. This article reviews cases where an explicitly human rights-based approach was used in mental health care settings. Although the included studies did not exhibit a high level of methodological rigor, the qualitative information obtained was considered useful and informative for future studies. All studies reviewed suggest that human-rights based approaches can lead to clinical improvements at relatively low costs. Human rights-based approaches should be utilized for legal and moral reasons, since human rights are fundamental pillars of justice and civilization. The fact that such approaches can contribute to positive therapeutic outcomes and, potentially, cost savings, is additional reason for their implementation. However, the small sample size and lack of controlled, quantitative measures limit the strength of conclusions drawn from included studies. More objective, high quality research is needed to ascertain the true extent of benefits to service users and providers.
Pattern Activity Clustering and Evaluation (PACE)
NASA Astrophysics Data System (ADS)
Blasch, Erik; Banas, Christopher; Paul, Michael; Bussjager, Becky; Seetharaman, Guna
2012-06-01
With the vast amount of network information available on activities of people (i.e. motions, transportation routes, and site visits) there is a need to explore the salient properties of data that detect and discriminate the behavior of individuals. Recent machine learning approaches include methods of data mining, statistical analysis, clustering, and estimation that support activity-based intelligence. We seek to explore contemporary methods in activity analysis using machine learning techniques that discover and characterize behaviors that enable grouping, anomaly detection, and adversarial intent prediction. To evaluate these methods, we describe the mathematics and potential information theory metrics to characterize behavior. A scenario is presented to demonstrate the concept and metrics that could be useful for layered sensing behavior pattern learning and analysis. We leverage work on group tracking, learning and clustering approaches; as well as utilize information theoretical metrics for classification, behavioral and event pattern recognition, and activity and entity analysis. The performance evaluation of activity analysis supports high-level information fusion of user alerts, data queries and sensor management for data extraction, relations discovery, and situation analysis of existing data.
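As a rough sketch of pairing a clustering step with an information-theoretic score for behavior pattern analysis (one possible workflow consistent with the description above, not the PACE implementation), consider:

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.metrics import mutual_info_score

    rng = np.random.default_rng(2)

    # Hypothetical per-track activity features (e.g., stop count, route entropy, dwell time).
    features = np.vstack([rng.normal(0, 1, size=(50, 3)),
                          rng.normal(3, 1, size=(50, 3))])
    true_behavior = np.array([0] * 50 + [1] * 50)   # notional ground-truth behavior labels

    clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)

    # Information-theoretic score of how well the clusters capture the behaviors (in nats).
    print("mutual information:", round(mutual_info_score(true_behavior, clusters), 3))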
Ultrasonic imaging of material flaws exploiting multipath information
NASA Astrophysics Data System (ADS)
Shen, Xizhong; Zhang, Yimin D.; Demirli, Ramazan; Amin, Moeness G.
2011-05-01
In this paper, we consider ultrasonic imaging for the visualization of flaws in a material. Ultrasonic imaging is a powerful nondestructive testing (NDT) tool which assesses material conditions via the detection, localization, and classification of flaws inside a structure. Multipath exploitations provide extended virtual array apertures and, in turn, enhance imaging capability beyond the limitation of traditional multisensor approaches. We utilize reflections of ultrasonic signals which occur when encountering different media and interior discontinuities. The waveforms observed at the physical as well as virtual sensors yield additional measurements corresponding to different aspect angles. Exploitation of multipath information addresses unique issues observed in ultrasonic imaging. (1) Utilization of physical and virtual sensors significantly extends the array aperture for image enhancement. (2) Multipath signals extend the angle of view of the narrow beamwidth of the ultrasound transducers, allowing improved visibility and array design flexibility. (3) Ultrasonic signals experience difficulty in penetrating a flaw, thus the aspect angle of the observation is limited unless access to other sides is available. The significant extension of the aperture makes it possible to yield flaw observation from multiple aspect angles. We show that data fusion of physical and virtual sensor data significantly improves the detection and localization performance. The effectiveness of the proposed multipath exploitation approach is demonstrated through experimental studies.
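The toy delay-and-sum sketch below represents one back-wall bounce with mirror-image "virtual" elements and fuses them with the physical aperture; the geometry, wave speed, and single-spike traces are synthetic assumptions rather than the experimental configuration reported above.

    import numpy as np

    c = 6000.0            # assumed longitudinal wave speed in the part (m/s)
    fs = 50e6             # sampling rate (Hz)
    flaw = np.array([0.010, 0.015])     # true flaw position (m), used only to synthesize data

    # Physical array elements on the top surface, plus "virtual" elements mirrored
    # about a back wall at depth d (a simple stand-in for one multipath bounce).
    physical = np.stack([np.linspace(-0.01, 0.01, 8), np.zeros(8)], axis=1)
    d = 0.030
    virtual = physical.copy()
    virtual[:, 1] = 2 * d
    elements = np.vstack([physical, virtual])

    # Synthesize pulse-echo traces: a single spike at the two-way travel time.
    n = 4096
    traces = np.zeros((len(elements), n))
    for i, e in enumerate(elements):
        t = 2 * np.linalg.norm(flaw - e) / c
        traces[i, int(round(t * fs))] = 1.0

    # Delay-and-sum over a small image grid, fusing physical and virtual apertures.
    xs = np.linspace(-0.005, 0.025, 61)
    zs = np.linspace(0.005, 0.025, 41)
    image = np.zeros((len(zs), len(xs)))
    for iz, z in enumerate(zs):
        for ix, x in enumerate(xs):
            p = np.array([x, z])
            for e, trace in zip(elements, traces):
                idx = int(round(2 * np.linalg.norm(p - e) / c * fs))
                if idx < n:
                    image[iz, ix] += trace[idx]

    peak = np.unravel_index(image.argmax(), image.shape)
    print("estimated flaw position (m):", xs[peak[1]], zs[peak[0]])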
Cho, Na-Eun; Chang, Jongwha; Atems, Bebonchu
2014-11-01
To determine the impact of health information technology (HIT) adoption and hospital-physician integration on hospital efficiency. Using 2010 data from the American Hospital Association's (AHA) annual survey, the AHA IT survey, supplemented by the CMS Case Mix Index, and the US Census Bureau's small area income and poverty estimates, we examined how the adoption of HIT and the employment of physicians affected hospital efficiency and whether they were substitutes or complements. The sample included 2173 hospitals. We employed a 2-stage approach. In the first stage, data envelopment analysis was used to estimate the technical efficiency of hospitals. In the second stage, we used instrumental variable approaches, notably 2-stage least squares and the generalized method of moments, to examine the effects of IT adoption and integration on hospital efficiency. We found that HIT adoption and hospital-physician integration, when considered separately, each have statistically significant positive impacts on hospital efficiency. We also found that hospitals that adopted HIT with employed physicians achieved less efficiency than hospitals that adopted HIT without employed physicians. Although HIT adoption and hospital-physician integration both seem to be key parts of improving hospital efficiency when one or the other is utilized individually, they can hurt hospital efficiency when utilized together.
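For the first stage, the sketch below solves input-oriented CCR data envelopment analysis as a linear program; the hospital inputs and outputs are toy numbers, and this textbook envelopment form is an assumption rather than the exact specification used in the study.

    import numpy as np
    from scipy.optimize import linprog

    def dea_ccr_input_efficiency(X, Y):
        """Input-oriented CCR technical efficiency (theta) for each decision-making unit.

        X: (n_dmu, n_inputs) input matrix; Y: (n_dmu, n_outputs) output matrix.
        """
        n, m = X.shape
        s = Y.shape[1]
        scores = []
        for o in range(n):
            c = np.r_[1.0, np.zeros(n)]                      # minimize theta
            # Inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
            A_inputs = np.hstack([-X[o].reshape(m, 1), X.T])
            # Outputs: -sum_j lambda_j * y_rj <= -y_ro
            A_outputs = np.hstack([np.zeros((s, 1)), -Y.T])
            res = linprog(c, A_ub=np.vstack([A_inputs, A_outputs]),
                          b_ub=np.r_[np.zeros(m), -Y[o]], method="highs")
            scores.append(res.x[0])
        return np.array(scores)

    # Toy hospitals: inputs = (beds, FTEs), output = case-mix-adjusted discharges.
    X = np.array([[100.0, 300.0], [120.0, 280.0], [90.0, 400.0]])
    Y = np.array([[5000.0], [5200.0], [4100.0]])
    print(dea_ccr_input_efficiency(X, Y).round(3))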
Three-Way Analysis of Spectrospatial Electromyography Data: Classification and Interpretation
Kauppi, Jukka-Pekka; Hahne, Janne; Müller, Klaus-Robert; Hyvärinen, Aapo
2015-01-01
Classifying multivariate electromyography (EMG) data is an important problem in prosthesis control as well as in neurophysiological studies and diagnosis. With modern high-density EMG sensor technology, it is possible to capture the rich spectrospatial structure of the myoelectric activity. We hypothesize that multi-way machine learning methods can efficiently utilize this structure in classification as well as reveal interesting patterns in it. To this end, we investigate the suitability of existing three-way classification methods to EMG-based hand movement classification in spectrospatial domain, as well as extend these methods by sparsification and regularization. We propose to use Fourier-domain independent component analysis as preprocessing to improve classification and interpretability of the results. In high-density EMG experiments on hand movements across 10 subjects, three-way classification yielded higher average performance compared with state-of-the art classification based on temporal features, suggesting that the three-way analysis approach can efficiently utilize detailed spectrospatial information of high-density EMG. Phase and amplitude patterns of features selected by the classifier in finger-movement data were found to be consistent with known physiology. Thus, our approach can accurately resolve hand and finger movements on the basis of detailed spectrospatial information, and at the same time allows for physiological interpretation of the results. PMID:26039100
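A hedged sketch of one pipeline in this spirit (ICA-style preprocessing of spectrospatial features followed by a sparse, regularized linear classifier); the data are synthetic noise, so the accuracy sits near chance, and none of this reproduces the authors' three-way method.

    import numpy as np
    from sklearn.decomposition import FastICA
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(5)

    # Hypothetical spectrospatial EMG features: trials x (channels * frequency bins),
    # e.g., log band power per high-density channel; labels are hand-movement classes.
    n_trials, n_channels, n_bins = 120, 16, 8
    X = rng.normal(size=(n_trials, n_channels * n_bins))
    y = rng.integers(0, 3, size=n_trials)          # three movement classes (toy labels)

    # ICA to obtain more interpretable components, then an L1-regularized (sparse)
    # classifier over the component activations.
    components = FastICA(n_components=10, random_state=0).fit_transform(X)
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
    print("cross-validated accuracy:", cross_val_score(clf, components, y, cv=5).mean().round(2))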
Athilingam, Ponrathi; Osorio, Richard E; Kaplan, Howard; Oliver, Drew; O'neachtain, Tara; Rogal, Philip J
2016-02-01
Health education is an important component of multidisciplinary disease management of heart failure. The educational information given at the time of discharge after hospitalization or at initial diagnosis is often overwhelming to patients and is often lost or never consulted again. Therefore, the aim of this developmental project was to embed interactive heart failure education in a mobile platform. A patient-centered approach, grounded in several learning theories including Mayer's Cognitive Theory of Multimedia Learning, Sweller's Cognitive Load, Instructional Design Approach, and Problem-Based Learning, was utilized to develop and test the mobile app. Ten heart failure patients, who attended an outpatient heart failure clinic, completed beta testing. A validated self-confidence questionnaire was utilized to assess patients' confidence in using the mobile app. All participants (100%) reported moderate to extreme confidence in using the app, 95% were very likely to use the app, and 100% reported that the design was easy to navigate and that the heart failure content was appropriate. All patients reported that having the information accessible on their mobile phone, like having a health coach, was a positive. Clinicians and nurses validated the content. Thus, embedding health education in a mobile app is proposed as a means of promoting persistent engagement to improve health outcomes.
Uses of NHANES biomarker data for chemical risk ...
Background. Each year, the US NHANES measures hundreds of chemical biomarkers in samples from thousands of study participants. These biomarker measurements are meant to track trends and identify subsets of the US population with elevated exposures. There is now interest in further utilizing the NHANES data to inform chemical risk assessments. Objectives. This article highlights: 1) the extent to which NHANES chemical biomarker data have been evaluated, 2) groups of chemicals that have been studied, 3) data analysis approaches, and 4) opportunities for using these data to inform chemical risk assessments. Methods. A literature search (1999-2013) was performed to identify publications in which NHANES data were reported. Manual curation identified only the subset of publications that clearly utilized chemical biomarker data. This subset was evaluated for chemical groupings, data analysis approaches, and overall trends. Results. A small percentage of yearly NHANES-related publications reported on chemical biomarkers (8% yearly average). Of eleven chemical groups, metals/metalloids were most frequently evaluated (49%), followed by pesticides (9%) and environmental phenols (7%). Studies of multiple chemical groups were also common (8%). Publications linking chemical biomarkers to health metrics have increased dramatically in recent years. New studies are addressing challenges related to NHANES data interpretation in health risk contexts. Conclusions. This articl
A traveling salesman approach for predicting protein functions.
Johnson, Olin; Liu, Jing
2006-10-12
Protein-protein interaction information can be used to predict unknown protein functions and to help study biological pathways. Here we present a new approach utilizing the classic Traveling Salesman Problem to study the protein-protein interactions and to predict protein functions in budding yeast Saccharomyces cerevisiae. We apply the global optimization tool from combinatorial optimization algorithms to cluster the yeast proteins based on the global protein interaction information. We then use this clustering information to help us predict protein functions. We use our algorithm together with the direct neighbor algorithm [1] on characterized proteins and compare the prediction accuracy of the two methods. We show our algorithm can produce better predictions than the direct neighbor algorithm, which only considers the immediate neighbors of the query protein. Our method is a promising one to be used as a general tool to predict functions of uncharacterized proteins and a successful sample of using computer science knowledge and algorithms to study biological problems.
A traveling salesman approach for predicting protein functions
Johnson, Olin; Liu, Jing
2006-01-01
Background Protein-protein interaction information can be used to predict unknown protein functions and to help study biological pathways. Results Here we present a new approach utilizing the classic Traveling Salesman Problem to study the protein-protein interactions and to predict protein functions in budding yeast Saccharomyces cerevisiae. We apply the global optimization tool from combinatorial optimization algorithms to cluster the yeast proteins based on the global protein interaction information. We then use this clustering information to help us predict protein functions. We use our algorithm together with the direct neighbor algorithm [1] on characterized proteins and compare the prediction accuracy of the two methods. We show our algorithm can produce better predictions than the direct neighbor algorithm, which only considers the immediate neighbors of the query protein. Conclusion Our method is a promising one to be used as a general tool to predict functions of uncharacterized proteins and a successful sample of using computer science knowledge and algorithms to study biological problems. PMID:17147783
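As a toy illustration of the general idea, ordering proteins along a TSP-style tour over an interaction-derived distance and predicting an uncharacterized protein's function from its annotated tour neighbors, with entirely hypothetical proteins and annotations:

    import numpy as np

    # Toy interaction-derived distance matrix (symmetric; smaller = stronger interaction).
    rng = np.random.default_rng(3)
    n = 8
    D = rng.uniform(1, 10, size=(n, n))
    D = (D + D.T) / 2
    np.fill_diagonal(D, 0)

    # Nearest-neighbor heuristic for a TSP-style tour over the proteins.
    unvisited = set(range(1, n))
    tour = [0]
    while unvisited:
        nxt = min(unvisited, key=lambda j: D[tour[-1], j])
        tour.append(nxt)
        unvisited.remove(nxt)

    # Hypothetical annotations; protein 4 is uncharacterized (None).
    functions = {0: "kinase", 1: "kinase", 2: "transport", 3: "kinase",
                 4: None, 5: "transport", 6: "transport", 7: "kinase"}

    # Predict from the annotated neighbors along the tour (majority vote).
    pos = tour.index(4)
    neighbors = [tour[(pos - 1) % n], tour[(pos + 1) % n]]
    votes = [functions[j] for j in neighbors if functions[j] is not None]
    prediction = max(set(votes), key=votes.count) if votes else "unknown"
    print("tour:", tour, "-> predicted function for protein 4:", prediction)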
Soneja, Sutyajeet I; Tielsch, James M; Khatry, Subarna K; Curriero, Frank C; Breysse, Patrick N
2016-03-01
Black carbon (BC) is a major contributor to hydrological cycle change and glacial retreat within the Indo-Gangetic Plain (IGP) and surrounding region. However, significant variability exists for estimates of BC regional concentration. Existing inventories within the IGP suffer from limited representation of rural sources, reliance on idealized point source estimates (e.g., utilization of emission factors or fuel-use estimates for cooking along with demographic information), and difficulty in distinguishing sources. Inventory development utilizes two approaches, termed top-down and bottom-up, which rely on various sources including transport models, emission factors, and remote sensing applications. Large discrepancies exist for BC source attribution throughout the IGP depending on the approach utilized. Cooking with biomass fuels, a major contributor to BC production, has great source apportionment variability. Areas tied to research on cookstove and biomass fuel use that have been recognized as requiring attention to improve emission inventory estimates include emission factors, particulate matter speciation, and better quantification of regional/economic sectors. However, limited attention has been given to understanding ambient small-scale spatial variation of BC between cooking and non-cooking periods in low-resource environments. Understanding the indoor-to-outdoor relationship of BC emissions due to cooking at a local level is a top priority for improving emission inventories, as many health and climate applications rely upon accurate emission inventories.
The Effects of Climate Model Similarity on Local, Risk-Based Adaptation Planning
NASA Astrophysics Data System (ADS)
Steinschneider, S.; Brown, C. M.
2014-12-01
The climate science community has recently proposed techniques to develop probabilistic projections of climate change from ensemble climate model output. These methods provide a means to incorporate the formal concept of risk, i.e., the product of impact and probability, into long-term planning assessments for local systems under climate change. However, approaches for pdf development often assume that different climate models provide independent information for the estimation of probabilities, despite model similarities that stem from a common genealogy. Here we utilize an ensemble of projections from the Coupled Model Intercomparison Project Phase 5 (CMIP5) to develop probabilistic climate information, with and without an accounting of inter-model correlations, and use it to estimate climate-related risks to a local water utility in Colorado, U.S. We show that the tail risk of extreme climate changes in both mean precipitation and temperature is underestimated if model correlations are ignored. When coupled with impact models of the hydrology and infrastructure of the water utility, the underestimation of extreme climate changes substantially alters the quantification of risk for water supply shortages by mid-century. We argue that progress in climate change adaptation for local systems requires the recognition that there is less information in multi-model climate ensembles than previously thought. Importantly, adaptation decisions cannot be limited to the spread in one generation of climate models.
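A simple numerical illustration of why ignoring inter-model correlation can understate tail risk, assuming an exchangeable correlation structure and made-up projected changes; this is not the authors' statistical model.

    import numpy as np
    from scipy import stats

    # Projected mid-century precipitation changes (%) from 10 models (illustrative numbers).
    changes = np.array([-2., 1., -4., 0., -3., -1., -5., 2., -2., -3.])
    n = len(changes)
    sigma = changes.std(ddof=1)

    # Case 1: models treated as independent, so the ensemble-mean variance is sigma^2 / n.
    var_indep = sigma**2 / n

    # Case 2: a common inter-model error correlation rho (e.g., shared genealogy).
    rho = 0.4                                            # illustrative correlation
    Sigma = sigma**2 * (rho * np.ones((n, n)) + (1 - rho) * np.eye(n))
    var_corr = np.ones(n) @ Sigma @ np.ones(n) / n**2

    # Probability of an extreme drying (change below -6%) under each Gaussian fit.
    mean = changes.mean()
    for label, var in [("independent", var_indep), ("correlated", var_corr)]:
        print(label, stats.norm.cdf(-6.0, loc=mean, scale=np.sqrt(var)).round(4))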
Critically reflective work behavior of health care professionals.
Groot, Esther de; Jaarsma, Debbie; Endedijk, Maaike; Mainhard, Tim; Lam, Ineke; Simons, Robert-Jan; Beukelen, Peter van
2012-01-01
Better understanding of critically reflective work behavior (CRWB), an approach for work-related informal learning, is important in order to gain more profound insight in the continuing development of health care professionals. A survey, developed to measure CRWB and its predictors, was distributed to veterinary professionals. The authors specified a model relating CRWB to a Perceived Need for Lifelong Learning, Perceived Workload, and Opportunities for Feedback. Furthermore, research utilization was added to the concept of CRWB. The model was tested against the data, using structural equation modeling (SEM). The model was well represented by the data. Four factors that reflect aspects of CRWB were distinguished: (1) individual CRWB; (2) being critical in interactions with others; (3) cross-checking of information; and (4) openness to new findings. The latter 2 originated from the factor research utilization in CRWB. The Perceived Need for Lifelong Learning predicts CRWB. Neither Perceived Workload nor Opportunities for Feedback of other practitioners was related to CRWB. The results suggest that research utilization, such as cross-checking information and openness to new findings, is essential for CRWB. Furthermore, perceptions of the need for lifelong learning are more relevant for CRWB of health care professionals than qualities of the workplace. Copyright © 2012 The Alliance for Continuing Education in the Health Professions, the Society for Academic Continuing Medical Education, and the Council on CME, Association for Hospital Medical Education.
Optimal directional view angles for remote-sensing missions
NASA Technical Reports Server (NTRS)
Kimes, D. S.; Holben, B. N.; Tucker, C. J.; Newcomb, W. W.
1984-01-01
The present investigation is concerned with the directional, off-nadir viewing of terrestrial scenes using remote-sensing systems from aircraft and satellite platforms, taking into account advantages of such an approach over strictly nadir viewing systems. Directional reflectance data collected for bare soil and several different vegetation canopies in NOAA-7 AVHRR bands 1 and 2 were analyzed. Optimum view angles were recommended for two strategies. The first strategy views the utility of off-nadir measurements as extending spatial and temporal coverage of the target area. The second strategy views the utility of off-nadir measurements as providing additional information about the physical characteristics of the target. Conclusions regarding the two strategies are discussed.
The application of volume-outcome contouring in data warehousing.
Studnicki, James; Berndt, Donald J; Luther, Stephen L; Fisher, John W
2004-01-01
Despite a compelling body of published research on the nature of provider volume and clinical outcomes, healthcare executives and policymakers have not managed to develop and implement systems that are useful in directing patients to higher volume providers via selective referral or avoidance. A specialized data warehouse application, utilizing hospital discharge data linked to physician biographical information, allows detailed analysis of physician and hospital volume and the resulting pattern (contour) of related outcomes such as mortality, complications, and medical errors. The approach utilizes a historical repository of hospital discharge data in which the outcomes of interest, important patient characteristics and risk factors used in severity-adjusting of the outcomes are derived from the coding structure of the data.
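A hedged pandas sketch of the basic volume-outcome contouring idea (group providers by discharge volume and tabulate an outcome rate per volume band); the identifiers and rates are synthetic and severity adjustment is omitted for brevity.

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(6)

    # Hypothetical discharge-level records linked to a provider identifier; a real
    # application would also carry severity and risk-adjustment fields.
    discharges = pd.DataFrame({
        "provider_id": rng.integers(1, 61, size=5000),
        "died": (rng.random(5000) < 0.04).astype(int),
    })

    # Volume per provider, then the volume-outcome "contour": mortality by volume band.
    per_provider = discharges.groupby("provider_id").agg(volume=("died", "size"),
                                                         deaths=("died", "sum"))
    per_provider["volume_band"] = pd.qcut(per_provider["volume"].rank(method="first"),
                                          3, labels=["low", "medium", "high"])
    band_totals = per_provider.groupby("volume_band", observed=True)[["deaths", "volume"]].sum()
    print((band_totals["deaths"] / band_totals["volume"]).round(3))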
Links between social environment and health care utilization and costs.
Brault, Marie A; Brewster, Amanda L; Bradley, Elizabeth H; Keene, Danya; Tan, Annabel X; Curry, Leslie A
2018-01-01
The social environment influences health outcomes for older adults and could be an important target for interventions to reduce costly medical care. We sought to understand which elements of the social environment distinguish communities that achieve lower health care utilization and costs from communities that experience higher health care utilization and costs for older adults with complex needs. We used a sequential explanatory mixed methods approach. We classified community performance based on three outcomes: rate of hospitalizations for ambulatory care sensitive conditions, all-cause risk-standardized hospital readmission rates, and Medicare spending per beneficiary. We conducted in-depth interviews with key informants (N = 245) from organizations providing health or social services. Higher performing communities were distinguished by several aspects of social environment, and these features were lacking in lower performing communities: 1) strong informal support networks; 2) partnerships between faith-based organizations and health care and social service organizations; and 3) grassroots organizing and advocacy efforts. Higher performing communities share similar social environmental features that complement the work of health care and social service organizations. Many of the supportive features and programs identified in the higher performing communities were developed locally and with limited governmental funding, providing opportunities for improvement.
Deshpande, Aparna; Menon, Ajit; Perri, Matthew; Zinkhan, George
2004-01-01
The growth in direct-to-consumer advertising (DTCA) over the past two decades has facilitated the communication of prescription drug information directly to consumers. Data from a 1999 national survey are employed to determine the factors influencing consumers' opinions of the utility of DTC ads for health care decision making. We also analyze whether consumers use DTC ad information in health care decision making and identify the key drivers of such information utilization. The study results suggest that consumers have positive opinions of DTCA utility, varying across demographics and perceptions of certain advertisement features. Specifically, consumers value information about both risks and benefits, but the perception of risk information is more important in shaping opinions of ad utility than the perception of benefit information. Consumers still perceive, however, that the quality of benefit information in DTC ads is better than that of risk information. Opinions about ad utility significantly influence whether information from DTC ads is used in health care decision making.
Supply of genetic information--amount, format, and frequency.
Misztal, I; Lawlor, T J
1999-05-01
The volume and complexity of genetic information is increasing because of new traits and better models. New traits may include reproduction, health, and carcass. More comprehensive models include the test day model in dairy cattle or a growth model in beef cattle. More complex models, which may include nonadditive effects such as inbreeding and dominance, also provide additional information. The amount of information per animal may increase drastically if DNA marker typing becomes routine and quantitative trait loci information is utilized. In many industries, evaluations are run more frequently. They result in faster genetic progress and improved management and marketing opportunities but also in extra costs and information overload. Adopting new technology and making some organizational changes can help realize all the added benefits of the improvements to the genetic evaluation systems at an acceptable cost. Continuous genetic evaluation, in which new records are accepted and breeding values are updated continuously, will relieve time pressures. An online mating system with access to both genetic and marketing information can result in mating recommendations customized for each user. Such a system could utilize inbreeding and dominance information that cannot efficiently be accommodated in the current sire summaries or off-line mating programs. The new systems will require a new organizational approach in which the task of scientists and technicians will not be simply running the evaluations but also providing the research, design, supervision, and maintenance required in the entire system of evaluation, decision making, and distribution.
De Stefano, Manuela; Lanzillo, Roberta; Esposito, Sabrina; Moshtari, Fatemeh; Rullani, Francesco; Piscopo, Kyrie; Buonanno, Daniela; Brescia Morra, Vincenzo; Gallo, Antonio; Tedeschi, Gioacchino; Bonavita, Simona
2017-01-01
Background Social media are a vital link for people with health concerns who find in Web communities a valid and comforting source for information exchange, debate, and knowledge enrichment. This aspect is important for people affected by chronic diseases like multiple sclerosis (MS), who are very well informed about the disease but are vulnerable to hopes of being cured or saved by therapies whose efficacy is not always scientifically proven. To improve health-related coping and social interaction for people with MS, we created an MS social network (SMsocialnetwork.com) with a medical team constantly online to intervene promptly when false or inappropriate medical information are shared. Objective The goal of this study was to assess the impact of SMsocialnetwork.com on the health-related coping and social interaction of people with MS by analyzing areas of interest through a Web-based survey. Methods Referring to previous marketing studies analyzing the online platform’s role in targeted health care, we conducted a 39-item Web-based survey. We then performed a construct validation procedure using a factorial analysis, gathering together like items of the survey related to different areas of interest such as utility, proximity, sharing, interaction, solving uncertainty, suggestion attitude, and exploration. Results We collected 130 Web-based surveys. The areas of interest analysis demonstrated that the users positively evaluated SMsocialnetwork.com to obtain information, approach and solve problems, and to make decisions (utility: median 4.2); improve feeling of closeness (proximity: median 5); catalyze relationships and text general personal opinions (sharing: median 5.6); get in touch with other users to receive innovative, effective, and practical solutions (interaction, solving uncertainty, and suggestion attitude medians were respectively: 4.1, 3, and 3); and share information about innovative therapeutic approaches and treatment options (suggestion attitude: median: 3.3). Conclusions SMsocialnetwork.com was perceived by users to be a useful tool to support health-related coping and social interaction, and may suggest a new kind of therapeutic alliance between physicians and people with MS. PMID:28710056
Internet access and utilization for health information among university students in Islamabad.
Shaikh, Irshad Ali; Shaikh, Masood Ali; Kamal, Anila; Masood, Sobia
2008-01-01
The Internet has changed the way we live and work. The advent of this technology has fundamentally transformed our lives, much as the invention of the automobile changed how our lives and cities looked and worked. Practically no information is available on the use of the Internet for health by the people of Pakistan. The objectives of the study were to assess the access to and utilization patterns of the Internet by university students in Islamabad, with emphasis on seeking health care information. An anonymous, self-administered, and pre-tested questionnaire, with questions on access to and usage patterns of the Internet, seeking health care information online, and beliefs about the reliability of such information, was distributed only to students enrolled in master's or higher degree programs. A total of 600 students were approached and 598 (99.7%) completed the questionnaires. The mean age of students was 23.5 years (range 19-40). The majority of students (423) were enrolled in a master's program. Four hundred and sixty-eight students (78.26%) had access to a computer either at home or at their university hostel, while 304 (50.84%) students had Internet access at home or in their university hostel. Out of the 304 students who reported having access to the Internet in the past three months, 139 (43.4%) replied affirmatively to the question of having used the Internet to seek health care information, and 109 (78.4%) of these thought that such information was reliable. Out of the 139 students who had used the Internet for seeking health information, 35 (25.2%) replied affirmatively to the question of having discussed health information obtained from the Internet with the doctor/physician whom they visited for any illness/treatment. The majority of Islamabad university students in this study had access to a computer and the Internet. The young and healthy state of this educated age group perhaps accounts for the limited use of the Internet for seeking health care-related information. However, the perceived high reliability of Internet-obtained health information needs to be further studied in terms of the websites utilized for seeking such information.
Garfield, Susan; Polisena, Julie; S Spinner, Daryl; Postulka, Anne; Y Lu, Christine; Tiwana, Simrandeep K; Faulkner, Eric; Poulios, Nick; Zah, Vladimir; Longacre, Michael
2016-01-01
Health technology assessments (HTAs) are increasingly used to inform coverage, access, and utilization of medical technologies including molecular diagnostics (MDx). Although MDx are used to screen patients and inform disease management and treatment decisions, there is no uniform approach to their evaluation by HTA organizations. The International Society for Pharmacoeconomics and Outcomes Research Devices and Diagnostics Special Interest Group reviewed diagnostic-specific HTA programs and identified elements representing common and best practices. MDx-specific HTA programs in Europe, Australia, and North America were characterized by methodology, evaluation framework, and impact. Published MDx HTAs were reviewed, and five representative case studies of test evaluations were developed: United Kingdom (National Institute for Health and Care Excellence's Diagnostics Assessment Programme, epidermal growth factor receptor tyrosine kinase mutation), United States (Palmetto's Molecular Diagnostic Services Program, OncotypeDx prostate cancer test), Germany (Institute for Quality and Efficiency in Healthcare, human papillomavirus testing), Australia (Medical Services Advisory Committee, anaplastic lymphoma kinase testing for non-small cell lung cancer), and Canada (Canadian Agency for Drugs and Technologies in Health, Rapid Response: Non-invasive Prenatal Testing). Overall, the few HTA programs that have MDx-specific methods do not provide clear parameters of acceptability related to clinical and analytic performance, clinical utility, and economic impact. The case studies highlight similarities and differences in evaluation approaches across HTAs in the performance metrics used (analytic and clinical validity, clinical utility), evidence requirements, and how value is measured. Not all HTAs are directly linked to reimbursement outcomes. To improve MDx HTAs, organizations should provide greater transparency, better communication and collaboration between industry and HTA stakeholders, clearer links between HTA and funding decisions, explicit recognition of and rationale for differential approaches to laboratory-developed versus regulatory-approved test, and clear evidence requirements. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Wolf, Douglas C.; Bachman, Ammie; Barrett, Gordon; Bellin, Cheryl; Goodman, Jay I.; Jensen, Elke; Moretto, Angelo; McMullin, Tami; Pastoor, Timothy P.; Schoeny, Rita; Slezak, Brian; Wend, Korinna; Embry, Michelle R.
2016-01-01
ABSTRACT The HESI-led RISK21 effort has developed a framework supporting the use of twenty-first century technology in obtaining and using information for chemical risk assessment. This framework represents a problem formulation-based, exposure-driven, tiered data acquisition approach that leads to an informed decision on human health safety to be made when sufficient evidence is available. It provides a transparent and consistent approach to evaluate information in order to maximize the ability of assessments to inform decisions and to optimize the use of resources. To demonstrate the application of the framework’s roadmap and matrix, this case study evaluates a large number of chemicals that could be present in drinking water. The focus is to prioritize which of these should be considered for human health risk as individual contaminants. The example evaluates 20 potential drinking water contaminants, using the tiered RISK21 approach in combination with graphical representation of information at each step, using the RISK21 matrix. Utilizing the framework, 11 of the 20 chemicals were assigned low priority based on available exposure data alone, which demonstrated that exposure was extremely low. The remaining nine chemicals were further evaluated, using refined estimates of toxicity based on readily available data, with three deemed high priority for further evaluation. In the present case study, it was determined that the greatest value of additional information would be from improved exposure models and not from additional hazard characterization. PMID:26451723
Wolf, Douglas C; Bachman, Ammie; Barrett, Gordon; Bellin, Cheryl; Goodman, Jay I; Jensen, Elke; Moretto, Angelo; McMullin, Tami; Pastoor, Timothy P; Schoeny, Rita; Slezak, Brian; Wend, Korinna; Embry, Michelle R
2016-01-01
The HESI-led RISK21 effort has developed a framework supporting the use of twenty-first century technology in obtaining and using information for chemical risk assessment. This framework represents a problem formulation-based, exposure-driven, tiered data acquisition approach that leads to an informed decision on human health safety to be made when sufficient evidence is available. It provides a transparent and consistent approach to evaluate information in order to maximize the ability of assessments to inform decisions and to optimize the use of resources. To demonstrate the application of the framework's roadmap and matrix, this case study evaluates a large number of chemicals that could be present in drinking water. The focus is to prioritize which of these should be considered for human health risk as individual contaminants. The example evaluates 20 potential drinking water contaminants, using the tiered RISK21 approach in combination with graphical representation of information at each step, using the RISK21 matrix. Utilizing the framework, 11 of the 20 chemicals were assigned low priority based on available exposure data alone, which demonstrated that exposure was extremely low. The remaining nine chemicals were further evaluated, using refined estimates of toxicity based on readily available data, with three deemed high priority for further evaluation. In the present case study, it was determined that the greatest value of additional information would be from improved exposure models and not from additional hazard characterization.
Ascoli, Giorgio A.; Martone, Maryann E.; Shepherd, Gordon M.; Miller, Perry L.
2009-01-01
This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information’s (NCBI’s) Entrez system. The NLB collects these links from each resource and passes them to the NCBI which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to create dynamically its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation. PMID:18975149
Medical pluralism of the Chinese in London: an exploratory study.
Rochelle, Tina L; Marks, David F
2010-11-01
This study was designed to examine the extent of medical pluralism among the Chinese in London. Members of the London Chinese community were recruited through Chinese organizations in London and participated in six focus groups. A total of 48 Chinese men and women aged 24-74 years were asked to talk about their health behaviour and health utilization patterns. Transcripts of the focus group discussions underwent thematic analysis to explore and describe the utilization of traditional Chinese medicine (TCM) and Western medicine (WM) of informants and factors that impacted on utilization. Findings focus on participants' evaluation of TCM and WM as two systems of health provision, how informants used these two health systems, and the reasons associated with use of these two systems. Utilization of TCM and WM varied. Concurrent use of TCM and WM was common. The National Health Service was generally perceived as difficult to use, with concerns over the language barrier, and communicating with and being able to trust health providers. The UK TCM trade was perceived as being aimed at the non-Chinese market and there were issues of trust related to the regulation of UK TCM. Although none of these issues are unique to the Chinese in the UK, previous experience with different approaches to health care, particularly TCM, may make the experience of such barriers more extreme.
Early Treatment in HCV: Is it a Cost-Utility Option from the Italian Perspective?
Marcellusi, Andrea; Viti, Raffaella; Damele, Francesco; Cammà, Calogero; Taliani, Gloria; Mennini, Francesco Saverio
2016-08-01
In Italy, the Italian Pharmaceutical Agency (AIFA) criteria used F3-F4 fibrosis stages as the threshold to prioritise treatment with interferon (IFN)-free regimens, while in genotype 1 chronic hepatitis C (G1 CHC) patients with liver fibrosis stage 2, an approach with pegylated interferon (PEG-IFN)-based triple therapy with simeprevir was suggested. The key clinical question is whether, in an era of financial constraints, the application of a universal IFN-free strategy in naïve G1 CHC patients is feasible within a short time horizon. The aim of this study is to perform an economic analysis to estimate the cost-utility of early innovative therapy in Italy for managing hepatitis C virus (HCV)-infected patients. The incremental cost-utility analysis was carried out to quantify the benefits of the early-treatment approach in HCV subjects. A Markov simulation model including direct and indirect costs and health outcomes was developed from the Italian National Healthcare Service and societal perspectives. A total of 5000 Monte Carlo simulations were performed on two distinct scenarios: the standard of care (SoC), which includes 14,000 genotype 1 patients in Italy treated with innovative interferon-free regimens in liver fibrosis stages 3 and 4 (F3-F4), versus the early-treatment scenario (ETS), in which 2000 patients were additionally treated with simeprevir plus PEG-IFN and ribavirin in fibrosis stage 2 (F2) (based on Italian Medicines Agency AIFA reimbursement criteria). A systematic literature review was carried out to identify epidemiological and economic data, which were subsequently used to inform the model. Furthermore, a one-way probabilistic sensitivity analysis was performed to measure the relationship between the main parameters of the model and the cost-utility results. The model shows that, in terms of incremental cost-effectiveness ratio (ICER) per quality-adjusted life year (QALY) gained, the ETS appeared to be a cost-effective option from both the societal (ICER = EUR 11,396) and NHS (ICER = EUR 14,733) perspectives over a time period of 10 years. The cost-utility of the ETS becomes more favourable as the time horizon of the analysis extends (ICER = EUR 6778 per QALY at 20 years and EUR 4474 per QALY at 30 years). From the societal perspective, the ETS represents the dominant option at a time horizon of 30 years. If we consider the subgroup of treated patients (16,000 patients, of whom 2000 were not treated in the SoC), the ETS was dominant after only 5 years and cost-effective after only 2 years of simulation. The one-way sensitivity analysis on the main variables confirmed the robustness of the model for the early-treatment approach. Our model represents a tool for policy makers and health-care professionals, and provides information on the cost-utility of the early-treatment approach in HCV-infected patients in Italy. Starting innovative treatment regimens earlier keeps HCV-infected patients in better health and reduces the incidence of HCV-related events, generating a gain both in terms of patients' health and in terms of correct resource allocation.
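To make the ICER arithmetic concrete, a toy Markov cohort sketch follows; the states, transition probabilities, costs, and utilities are hypothetical placeholders and are not the parameters of the published Italian model.

    import numpy as np

    def markov_cohort(trans, costs, utilities, cycles, disc=0.03):
        """Tiny cohort Markov model: total discounted cost and QALYs per patient."""
        state = np.array([1.0, 0.0, 0.0])          # everyone starts in the first state
        total_cost = total_qaly = 0.0
        for t in range(cycles):
            df = 1.0 / (1.0 + disc) ** t
            total_cost += df * state @ costs
            total_qaly += df * state @ utilities
            state = state @ trans
        return total_cost, total_qaly

    # Illustrative 3-state model: mild fibrosis (F2), advanced disease, death.
    trans_soc = np.array([[0.90, 0.09, 0.01], [0.0, 0.93, 0.07], [0.0, 0.0, 1.0]])
    trans_ets = np.array([[0.97, 0.02, 0.01], [0.0, 0.93, 0.07], [0.0, 0.0, 1.0]])
    costs = np.array([1500.0, 9000.0, 0.0])        # EUR per cycle per state (hypothetical)
    utilities = np.array([0.85, 0.60, 0.0])        # QALYs per cycle per state (hypothetical)

    c_soc, q_soc = markov_cohort(trans_soc, costs, utilities, cycles=10)
    c_ets, q_ets = markov_cohort(trans_ets, costs, utilities, cycles=10)
    c_ets += 15000.0                               # up-front early-treatment cost (hypothetical)

    icer = (c_ets - c_soc) / (q_ets - q_soc)
    print("ICER (EUR per QALY gained):", round(icer))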
A decision theoretical approach for diffusion promotion
NASA Astrophysics Data System (ADS)
Ding, Fei; Liu, Yun
2009-09-01
In order to maximize the cost efficiency of scarce marketing resources, marketers face the problem of which group of consumers to target for promotions. We propose a decision theoretical approach to model this strategic situation. In one promotion model that we develop, marketers balance the probability of successful persuasion against the expected profit on a diffusion scale before making their decisions. In the other promotion model, the cost of identifying influence information is considered, and marketers are allowed to ignore individual heterogeneity. We apply the proposed approach to two threshold influence models, evaluate the utility of each promotion action, and discuss the best strategy. Our results show that efforts to target influentials or easily influenced people might be redundant under some conditions.
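A minimal sketch of the expected-utility decision rule implied by such a framing, with entirely hypothetical persuasion probabilities, diffusion profits, and promotion costs:

    # Score each candidate target group by expected utility:
    # (probability of successful persuasion) x (expected diffusion profit) - promotion cost.
    groups = {
        "influentials":      {"p_persuade": 0.15, "expected_profit": 50_000, "cost": 4_000},
        "easily_influenced": {"p_persuade": 0.60, "expected_profit": 9_000,  "cost": 1_500},
        "mass_market":       {"p_persuade": 0.30, "expected_profit": 20_000, "cost": 6_000},
    }

    def expected_utility(g):
        return g["p_persuade"] * g["expected_profit"] - g["cost"]

    for name, g in groups.items():
        print(f"{name}: expected utility = {expected_utility(g):.0f}")
    print("best target under these assumptions:",
          max(groups, key=lambda name: expected_utility(groups[name])))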
Clinical approaches to infertility in the bitch.
Wilborn, Robyn R; Maxwell, Herris S
2012-05-01
When presented with the apparently infertile bitch, the practitioner must sort through a myriad of facts, historical events, and diagnostic tests to uncover the etiology of the problem. Many bitches that present for infertility are reproductively normal and are able to conceive with appropriate intervention and breeding management. An algorithmic approach is helpful in cases of infertility, where simple questions lead to the next appropriate step. Most bitches can be categorized as either cyclic or acyclic, and then further classified based on historical data and diagnostic testing. Each female has a unique set of circumstances that can affect her reproductive potential. By utilizing all available information and a logical approach, the clinician can narrow the list of differentials and reach a diagnosis more quickly.
An integrated remote sensing approach for identifying ecological range sites. [Parker Mountain]
NASA Technical Reports Server (NTRS)
Jaynes, R. A.
1983-01-01
A model approach for identifying ecological range sites was applied to high elevation sagebrush-dominated rangelands on Parker Mountain, in south-central Utah. The approach utilizes map information derived from both high altitude color infrared photography and LANDSAT digital data, integrated with soils, geological, and precipitation maps. Identification of the ecological range site for a given area requires an evaluation of all relevant environmental factors which combine to give that site the potential to produce characteristic types and amounts of vegetation. A table is presented which allows the user to determine ecological range site based upon an integrated use of the maps which were prepared. The advantages of identifying ecological range sites through an integrated photo interpretation/LANDSAT analysis are discussed.
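A minimal sketch of the table-lookup step described above: each mapped environmental factor acts as a categorical layer, and their combination keys into an ecological range site. The category labels and site names below are hypothetical, not those from the Parker Mountain study.

```python
# Hypothetical lookup: combinations of mapped factors key into a range site.
range_site_table = {
    ("mountain loam", "big sagebrush", "14-18 in"): "Upland Loam (Mountain Big Sagebrush)",
    ("shallow loam", "black sagebrush", "10-14 in"): "Shallow Loam (Black Sagebrush)",
}

def classify_site(soil, vegetation, precipitation_zone):
    """Return the ecological range site for one map cell, if tabulated."""
    return range_site_table.get(
        (soil, vegetation, precipitation_zone),
        "unclassified - field check required",
    )

print(classify_site("mountain loam", "big sagebrush", "14-18 in"))
```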
The role of data assimilation in maximizing the utility of geospace observations (Invited)
NASA Astrophysics Data System (ADS)
Matsuo, T.
2013-12-01
Data assimilation can facilitate maximizing the utility of existing geospace observations by offering an ultimate marriage of inductive (data-driven) and deductive (first-principles based) approaches to addressing critical questions in space weather. Assimilative approaches that incorporate dynamical models are, in particular, capable of making a diverse set of observations consistent with physical processes included in a first-principles model, and allowing unobserved physical states to be inferred from observations. These points will be demonstrated in the context of the application of an ensemble Kalman filter (EnKF) to a thermosphere and ionosphere general circulation model. An important attribute of this approach is that the feedback between plasma and neutral variables is self-consistently treated both in the forecast model as well as in the assimilation scheme. This takes advantage of the intimate coupling between the thermosphere and ionosphere described in general circulation models to enable the inference of unobserved thermospheric states from the relatively plentiful observations of the ionosphere. Given the ever-growing infrastructure for the global navigation satellite system, this is indeed a promising prospect for geospace data assimilation. In principle, similar approaches can be applied to any geospace observing systems to extract more geophysical information from a given set of observations than would otherwise be possible.
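For readers unfamiliar with the machinery, the following is a minimal stochastic ensemble Kalman filter analysis step on a toy state vector, illustrating how observing only the "ionospheric" components still updates the "thermospheric" ones through ensemble cross-covariances; it is not the thermosphere-ionosphere general circulation model setup itself.

```python
import numpy as np

# Toy stochastic EnKF analysis step. The first n_obs state components stand
# in for observable "ionospheric" quantities, the rest for unobserved
# "thermospheric" ones; all numbers are synthetic.
rng = np.random.default_rng(1)
n_state, n_obs, n_ens = 8, 3, 40

X = rng.normal(size=(n_state, n_ens))                 # forecast ensemble (state x members)
H = np.zeros((n_obs, n_state))
H[:, :n_obs] = np.eye(n_obs)                          # observe the ionospheric part only
R = 0.25 * np.eye(n_obs)                              # observation-error covariance
y = rng.normal(size=n_obs)                            # the observations

A = X - X.mean(axis=1, keepdims=True)                 # ensemble anomalies
Pf = A @ A.T / (n_ens - 1)                            # sample forecast covariance
K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)        # Kalman gain

# Perturbed-observation update: each member assimilates a noisy copy of y.
Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
Xa = X + K @ (Y - H @ X)                              # analysis ensemble

# Unobserved components are corrected through the cross-covariances in K.
print("mean thermospheric increment:", (Xa - X).mean(axis=1)[n_obs:])
```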
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watson, William A.; Litovitz, Toby L.; Belson, Martin G.
2005-09-01
The Toxic Exposure Surveillance System (TESS) is a uniform data set of US poison center cases. Categories of information include the patient, the caller, the exposure, the substance(s), clinical toxicity, treatment, and medical outcome. The TESS database was initiated in 1985, and provides a baseline of more than 36.2 million cases through 2003. The database has been utilized for a number of safety evaluations. Consideration of the strengths and limitations of TESS data must be incorporated into data interpretation. Real-time toxicovigilance was initiated in 2003 with continuous uploading of new cases from all poison centers to a central database. Real-time toxicovigilance utilizing general and specific approaches is systematically run against TESS, further increasing the potential utility of poison center experiences as a means of early identification of potential public health threats.
Classification of Chemicals Based On Structured Toxicity ...
Thirty years and millions of dollars worth of pesticide registration toxicity studies, historically stored as hardcopy and scanned documents, have been digitized into highly standardized and structured toxicity data within the Toxicity Reference Database (ToxRefDB). Toxicity-based classifications of chemicals were performed as a model application of ToxRefDB. These endpoints will ultimately provide the anchoring toxicity information for the development of predictive models and biological signatures utilizing in vitro assay data. Utilizing query and structured data mining approaches, toxicity profiles were uniformly generated for greater than 300 chemicals. Based on observation rate, species concordance and regulatory relevance, individual and aggregated effects have been selected to classify the chemicals providing a set of predictable endpoints. ToxRefDB exhibits the utility of transforming unstructured toxicity data into structured data and, furthermore, into computable outputs, and serves as a model for applying such data to address modern toxicological problems.
Modeling the Dynamic Interrelations between Mobility, Utility, and Land Asking Price
NASA Astrophysics Data System (ADS)
Hidayat, E.; Rudiarto, I.; Siegert, F.; Vries, W. D.
2018-02-01
Limited and insufficient information about the dynamic interrelation among mobility, utility, and land price is the main motivation for this research. Several studies, using a variety of approaches and variables, have been conducted to model land price; however, most of these models appear to generate primarily static land prices. Research is therefore required to compare, design, and validate models which calculate and/or compare the inter-relational changes of mobility, utility, and land price. The applied method combines a literature review, expert interviews, and statistical analysis. The result is a newly improved mathematical model which has been validated and is suitable for the case study location. This improved model consists of 12 appropriate variables. It can be applied in Salatiga city, the case study location, to support better land-use planning and mitigate uncontrolled urban growth.
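Since the abstract does not list the 12 variables, the sketch below uses hypothetical mobility and utility indicators in an ordinary-least-squares asking-price model simply to illustrate the kind of statistical fitting involved.

```python
import numpy as np

# Hypothetical predictors only; the study's 12 variables are not listed in
# the abstract. Ordinary least squares fits a linear asking-price model.
rng = np.random.default_rng(7)
n = 200
X = np.column_stack([
    rng.uniform(5, 60, n),        # travel time to city centre (min) - mobility
    rng.uniform(0, 1, n),         # paved-road access index          - mobility
    rng.uniform(0, 1, n),         # piped-water coverage             - utility
    rng.uniform(0, 1, n),         # electricity reliability          - utility
])
beta_true = np.array([-12.0, 150.0, 90.0, 60.0])
price = 500 + X @ beta_true + rng.normal(0, 30, n)    # asking price per m2

X1 = np.column_stack([np.ones(n), X])                 # add intercept
beta_hat, *_ = np.linalg.lstsq(X1, price, rcond=None)
print("estimated coefficients:", np.round(beta_hat, 1))
```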
Adapting legume crops to climate change using genomic approaches.
Mousavi-Derazmahalleh, Mahsa; Bayer, Philipp E; Hane, James K; Valliyodan, Babu; Nguyen, Henry T; Nelson, Matthew N; Erskine, William; Varshney, Rajeev K; Papa, Roberto; Edwards, David
2018-03-30
Our agricultural system, and hence food security, is threatened by a combination of pressures, such as increasing population, the impacts of climate change, and the need for more sustainable development. Evolutionary adaptation may help some species to overcome environmental changes through new selection pressures driven by climate change. However, the success of evolutionary adaptation depends on various factors, one of which is the extent of genetic variation available within a species. Genomic approaches provide an exceptional opportunity to identify genetic variation that can be employed in crop improvement programs. In this review, we illustrate some of the routinely used genomics-based methods as well as recent breakthroughs, which facilitate assessment of genetic variation and discovery of adaptive genes in legumes. Although additional information is needed, the current utility of selection tools indicates a robust ability to utilize existing variation among legumes to address the challenges of climate uncertainty. © 2018 The Authors. Plant, Cell & Environment Published by John Wiley & Sons Ltd.
From instinct to evidence: the role of data in country decision-making in Chile.
Aguilera, Ximena Paz; Espinosa-Marty, Consuelo; Castillo-Laborde, Carla; Gonzalez, Claudia
2016-01-01
The Chilean health system has undergone profound reforms since 1990, while going through many political upheavals and facing demographic, health, and economic transformations. Developing an evidence-informed process required the best possible use of available data, as well as sustained efforts to improve the information systems. The objectives were to examine, from a historical perspective, the use of data during the health reforms undertaken in Chile since 1990, and to identify the factors that have determined its utilization and improvement. A qualitative methodological approach was followed to review the case study of the Chilean experience with data in decision-making. We used as the primary source our first-hand experience as officials of the Ministry of Health (MOH) and the Ministry of Finance during the reform period considered. Second, a literature review was conducted, using documents from official sources, historical accounts, books, policy reports, and articles about the reform process, looking for the use of data. The Chilean health care reform process was intensive in the utilization and production of information. In this context, the MOH conducted several studies, ranging from the burden of disease, efficacy of interventions, cost-effectiveness, out-of-pocket payments, and fiscal impact to social preferences, among others. Policy and prioritization frameworks developed by international agencies influenced the use of data and the studies' agenda. Health systems in Latin America have struggled to adapt to changing health needs caused by demographic transition and economic growth. Health reforms in Chile provide lessons from this sustained effort, based on data and scientific grounds, with lights and shadows. Tradition, receptiveness to foreign ideas, and benchmarking with international data determined this approach, facilitated by the political influence of physicians and other technocrats. Moreover, internationally comparable statistics are shown to play a significant role in policy debate.
Decision-Making in Audiology: Balancing Evidence-Based Practice and Patient-Centered Care
Clemesha, Jennifer; Lundmark, Erik; Crome, Erica; Barr, Caitlin; McMahon, Catherine M.
2017-01-01
Health-care service delivery models have evolved from a practitioner-centered approach toward a patient-centered ideal. Concurrently, increasing emphasis has been placed on the use of empirical evidence in decision-making to increase clinical accountability. The way in which clinicians use empirical evidence and client preferences to inform decision-making provides an insight into health-care delivery models utilized in clinical practice. The present study aimed to investigate the sources of information audiologists use when discussing rehabilitation choices with clients, and discuss the findings within the context of evidence-based practice and patient-centered care. To assess the changes that may have occurred over time, this study uses a questionnaire based on one of the few studies of decision-making behavior in audiologists, published in 1989. The present questionnaire was completed by 96 audiologists who attended the World Congress of Audiology in 2014. The responses were analyzed using qualitative and quantitative approaches. Results suggest that audiologists rank clinical test results and client preferences as the most important factors for decision-making. Discussion with colleagues or experts was also frequently reported as an important source influencing decision-making. Approximately 20% of audiologists mentioned utilizing research evidence to inform decision-making when no clear solution was available. Information shared at conferences was ranked low in terms of importance and reliability. This study highlights an increase in awareness of concepts associated with evidence-based practice and patient-centered care within audiology settings, consistent with current research-to-practice dissemination pathways. It also highlights that these pathways may not be sufficient for an effective clinical implementation of these practices. PMID:28752808
Anatomisation with slicing: a new privacy preservation approach for multiple sensitive attributes.
Susan, V Shyamala; Christopher, T
2016-01-01
An enormous quantity of personal health information has become available in recent decades, and tampering with any part of this information poses a great risk to the health-care field. Existing anonymization methods, such as generalization and bucketization, are suited only to single-sensitive-attribute, low-dimensional data. In this paper, an anonymization technique is proposed that combines the benefits of anatomization with an enhanced slicing approach, adhering to the principles of k-anonymity and l-diversity, in order to handle high-dimensional data with multiple sensitive attributes. The anatomization approach dissociates the correlation observed between the quasi-identifier attributes and the sensitive attributes (SA) and yields two separate tables with non-overlapping attributes. In the enhanced slicing algorithm, vertical partitioning groups the correlated SA in the ST (sensitive table) together and thereby reduces dimensionality by employing an advanced clustering algorithm. To obtain the optimal bucket size, tuple partitioning is conducted by MFA. The experimental outcomes indicate that the proposed method can preserve the privacy of data with numerous SA. The anatomization approach minimizes the loss of information, and the slicing algorithm helps preserve correlation and utility, which in turn reduces data dimensionality and information loss. The advanced clustering algorithms prove their efficiency by minimizing time and complexity. Furthermore, this work adheres to the principles of k-anonymity and l-diversity and thus avoids privacy threats such as membership, identity, and attribute disclosure.
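The toy sketch below illustrates only the anatomization step on a four-row table: the microdata are split into a quasi-identifier table (QIT) and a sensitive table (ST) that share nothing but a group id. The records and group size are invented, and the clustering and MFA steps are omitted.

```python
import random

# Four invented records; real data would be grouped to satisfy l-diversity.
records = [
    {"age": 34, "zip": "47906", "diagnosis": "flu",      "income": "low"},
    {"age": 36, "zip": "47907", "diagnosis": "diabetes", "income": "high"},
    {"age": 52, "zip": "47305", "diagnosis": "flu",      "income": "mid"},
    {"age": 55, "zip": "47302", "diagnosis": "cancer",   "income": "mid"},
]
group_size = 2

qit, st = [], []                     # quasi-identifier table, sensitive table
for i, rec in enumerate(records):
    gid = i // group_size
    qit.append({"group": gid, "age": rec["age"], "zip": rec["zip"]})
    st.append({"group": gid, "diagnosis": rec["diagnosis"], "income": rec["income"]})

random.seed(0)
random.shuffle(st)                   # publish ST in an order unrelated to QIT rows

print(qit)
print(st)
```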
Using field inversion to quantify functional errors in turbulence closures
NASA Astrophysics Data System (ADS)
Singh, Anand Pratap; Duraisamy, Karthik
2016-04-01
A data-informed approach is presented with the objective of quantifying errors and uncertainties in the functional forms of turbulence closure models. The approach creates modeling information from higher-fidelity simulations and experimental data. Specifically, a Bayesian formalism is adopted to infer discrepancies in the source terms of transport equations. A key enabling idea is the transformation of the functional inversion procedure (which is inherently infinite-dimensional) into a finite-dimensional problem in which the distribution of the unknown function is estimated at discrete mesh locations in the computational domain. This allows for the use of an efficient adjoint-driven inversion procedure. The output of the inversion is a full-field of discrepancy that provides hitherto inaccessible modeling information. The utility of the approach is demonstrated by applying it to a number of problems including channel flow, shock-boundary layer interactions, and flows with curvature and separation. In all these cases, the posterior model correlates well with the data. Furthermore, it is shown that even if limited data (such as surface pressures) are used, the accuracy of the inferred solution is improved over the entire computational domain. The results suggest that, by directly addressing the connection between physical data and model discrepancies, the field inversion approach materially enhances the value of computational and experimental data for model improvement. The resulting information can be used by the modeler as a guiding tool to design more accurate model forms, or serve as input to machine learning algorithms to directly replace deficient modeling terms.
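A minimal one-dimensional analogue of field inversion is sketched below: a spatial discrepancy field multiplying a source term is inferred by minimizing the mismatch to data, with finite-difference gradients standing in for the adjoint-driven gradients used in the paper. The governing equation and fields are toy choices, not a turbulence closure.

```python
import numpy as np

# Toy 1-D model: du/dx = beta(x)*source(x) - u, marched with forward Euler.
nx = 40
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]
source = np.exp(-((x - 0.5) / 0.15) ** 2)             # baseline source term

def solve(beta):
    u = np.zeros(nx)
    for i in range(nx - 1):
        u[i + 1] = u[i] + dx * (beta[i] * source[i] - u[i])
    return u

beta_true = 1.0 + 0.5 * np.sin(2 * np.pi * x)         # hidden model discrepancy
data = solve(beta_true)                               # "higher-fidelity" data

def objective(beta, lam=1e-3):
    # data mismatch plus a weak prior pulling beta toward 1 (no discrepancy)
    return np.sum((solve(beta) - data) ** 2) + lam * np.sum((beta - 1.0) ** 2)

beta = np.ones(nx)                                    # prior guess
eps, step = 1e-6, 0.5
for _ in range(300):                                  # plain gradient descent
    base = objective(beta)
    grad = np.zeros(nx)
    for j in range(nx):                               # finite-difference gradient
        b = beta.copy()
        b[j] += eps
        grad[j] = (objective(b) - base) / eps
    beta -= step * grad

misfit = lambda b: float(np.sum((solve(b) - data) ** 2))
print("data misfit prior/posterior:", round(misfit(np.ones(nx)), 6), round(misfit(beta), 6))
```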
Supporting tactical intelligence using collaborative environments and social networking
NASA Astrophysics Data System (ADS)
Wollocko, Arthur B.; Farry, Michael P.; Stark, Robert F.
2013-05-01
Modern military environments place an increased emphasis on the collection and analysis of intelligence at the tactical level. The deployment of analytical tools at the tactical level helps support the Warfighter's need for rapid collection, analysis, and dissemination of intelligence. However, given the lack of experience and staffing at the tactical level, most of the available intelligence is not exploited. Tactical environments are staffed by a new generation of intelligence analysts who are well-versed in modern collaboration environments and social networking. An opportunity exists to enhance tactical intelligence analysis by exploiting these personnel strengths, but is dependent on appropriately designed information sharing technologies. Existing social information sharing technologies enable users to publish information quickly, but do not unite or organize information in a manner that effectively supports intelligence analysis. In this paper, we present an alternative approach to structuring and supporting tactical intelligence analysis that combines the benefits of existing concepts, and provide detail on a prototype system embodying that approach. Since this approach employs familiar collaboration support concepts from social media, it enables new-generation analysts to identify the decision-relevant data scattered among databases and the mental models of other personnel, increasing the timeliness of collaborative analysis. Also, the approach enables analysts to collaborate visually to associate heterogeneous and uncertain data within the intelligence analysis process, increasing the robustness of collaborative analyses. Utilizing this familiar dynamic collaboration environment, we hope to achieve a significant reduction of time and skill required to glean actionable intelligence in these challenging operational environments.
Treatment of Children's Fears: A Strategic Utilization Approach.
ERIC Educational Resources Information Center
Protinsky, Howard
1985-01-01
Describes briefly Milton Erickson's strategic utilization approach to therapy. Discusses the usefulness of this approach in treating children's fears. Presents two case histories in which the approach successfully eliminated the fear of the child. (BH)
DuBard, C Annette; Jackson, Carlos T
2018-04-01
Care management of high-cost/high-needs patients is an increasingly common strategy to reduce health care costs. A variety of targeting methodologies have emerged to identify patients with high historical or predicted health care utilization, but the more pertinent question for program planners is how to identify those who are most likely to benefit from care management intervention. This paper describes the evolution of complex care management targeting strategies in Community Care of North Carolina's (CCNC) work with the statewide non-dual Medicaid population, culminating in the development of an "Impactability Score" that uses administrative data to predict achievable savings. It describes CCNC's pragmatic approach for estimating intervention effects in a historical cohort of 23,455 individuals, using a control population of 14,839 to determine expected spending at an individual level, against which actual spending could be compared. The actual-to-expected spending difference was then used as the dependent variable in a multivariate model to determine the predictive contribution of a multitude of demographic, clinical, and utilization characteristics. The coefficients from this model yielded the information required to build predictive models for prospective use. Model variables related to medication adherence and historical utilization unexplained by disease burden proved to be more important predictors of impactability than any given diagnosis or event, disease profile, or overall costs of care. Comparison of this approach to alternative targeting strategies (emergency department super-utilizers, inpatient super-utilizers, or patients with highest Hierarchical Condition Category risk scores) suggests a 2- to 3-fold higher return on investment using impactability-based targeting.
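A hedged sketch of the two-stage idea, on synthetic data only: fit an expected-spending model on a control cohort, compute the actual-to-expected difference for managed patients, and regress that difference on baseline characteristics so new candidates can be scored. Variable names and effect sizes are invented, not CCNC's model.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stage 1: expected-spending model fitted on a synthetic control cohort.
n_ctrl = 5000
risk_ctrl = rng.gamma(2.0, 1.0, n_ctrl)               # e.g. an HCC-style risk score
spend_ctrl = 2000 + 3000 * risk_ctrl + rng.normal(0, 1500, n_ctrl)
b_ctrl, *_ = np.linalg.lstsq(
    np.column_stack([np.ones(n_ctrl), risk_ctrl]), spend_ctrl, rcond=None)

# Stage 2: among managed patients, model achieved savings (expected - actual).
n_cm = 2000
risk = rng.gamma(2.0, 1.0, n_cm)
adherence_gap = rng.uniform(0, 1, n_cm)               # poor medication adherence
unexplained_util = rng.uniform(0, 1, n_cm)            # utilization unexplained by disease burden
expected = b_ctrl[0] + b_ctrl[1] * risk
actual = expected - 4000 * adherence_gap - 2500 * unexplained_util + rng.normal(0, 1500, n_cm)
savings = expected - actual

Z = np.column_stack([np.ones(n_cm), risk, adherence_gap, unexplained_util])
coef, *_ = np.linalg.lstsq(Z, savings, rcond=None)
print("impactability model coefficients:", np.round(coef, 1))
# New candidates would then be ranked by their predicted savings, Z_new @ coef.
```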
A rhetorical approach to environmental information sharing
NASA Astrophysics Data System (ADS)
Woolf, Andrew
2014-05-01
'Faceted search' has recently been widely adopted as a powerful information discovery framework, enabling users to navigate a complex landscape of information by successive refinement along key dimensions. The compelling user experience that results has seen adoption of faceted search by online retailers, media outlets, and encyclopedic publishers. A key challenge with faceted browse is the choice of suitable search dimensions, or facets. Conventional facet analysis adopts principles of exclusivity and exhaustiveness, identifying facets by their relevance to the subject and their discrimination ability (Spiteri, 1998). The rhetoricians of ancient Greece defined seven dimensions ('circumstances') of analytical enquiry: who, what, when, where, why, in what way, by what means. These provide a broadly applicable framework that may be seen in Ranganathan's classic ('PMEST') scheme for facet analysis. The utility of the 'Five Ws' is also manifest in their adoption in daily discourse and pedagogical frameworks. If we apply the 'Five Ws' to environmental information, we arrive at a model very close to the 'O&M' (ISO 19156) conceptual model for standardised exchange of environmental observations and measurements data: who maps to metadata; what to observed property; when to time of observation; where to feature of interest; why to metadata; and how to procedure. Thus, we adopt an approach for distributed environmental information sharing which factors the architecture into components aligned with the 'Five Ws' (or O&M). We give an overview of this architecture and its information classes, components, interfaces and standards. We also describe how it extends the classic SDI architecture to provide additional specific benefit for environmental information. Finally, we offer a perspective on the architecture which may be seen as a 'brokering' overlay to environmental information resources, enabling an O&M-conformant view. The approach presented is being adopted by the Australian Bureau of Meteorology as the basis for a National Environmental Information Infrastructure.
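The facet-to-O&M alignment can be captured as a simple data structure; the sketch below does so informally, with field names loosely following ISO 19156 and purely invented example values.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Observation:                       # informal take on O&M's OM_Observation
    observed_property: str               # what
    phenomenon_time: datetime            # when
    feature_of_interest: str             # where
    procedure: str                       # how / by what means
    metadata: dict = field(default_factory=dict)   # who, why

obs = Observation(
    observed_property="air_temperature",
    phenomenon_time=datetime(2014, 5, 1, 12, 0),
    feature_of_interest="station:070351",
    procedure="platinum resistance thermometer, 1-minute mean",
    metadata={"who": "Bureau of Meteorology", "why": "routine synoptic observation"},
)
print(obs.observed_property, "@", obs.feature_of_interest)
```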
The evolution of information-driven safeguards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Budlong-sylvester, Kory W; Pilat, Joseph F
2010-10-14
From the adoption of the Model Additional Protocol and integrated safeguards in the 1990s, to current International Atomic Energy Agency (IAEA) efforts to deal with cases of noncompliance, the question of how the Agency can best utilize all the information available to it remains of great interest and increasing importance. How might the concept of 'information-driven' safeguards (IDS) evolve in the future? The ability of the Agency to identify and resolve anomalies has always been important and has emerged as a core Agency function in recent years as the IAEA has had to deal with noncompliance in Iran and the Democratic People's Republic of Korea (DPRK). Future IAEA safeguards implementation should be designed with the goal of facilitating and enhancing this vital capability. In addition, the Agency should utilize all the information it possesses, including its in-house assessments and expertise, to direct its safeguards activities. At the State level, knowledge of proliferation possibilities is currently being used to guide the analytical activities of the Agency and to develop inspection plans. How far can this approach be extended? Does it apply across State boundaries? Should it dictate a larger fraction of safeguards activities? Future developments in IDS should utilize the knowledge resident within the Agency to ensure that safeguards resources flow to where they are most needed in order to address anomalies first and foremost, but also to provide greater confidence in conclusions regarding the absence of undeclared nuclear activities. The elements of such a system and related implementation issues are assessed in this paper.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Price, D.A.
1995-12-31
Under the Acid Rain Program, by statute and regulation, affected utility units are allocated annual allowances. Each allowance permits a unit to emit one ton of SO2 during or after a specified year. At year end, utilities must hold allowances equal to or greater than the cumulative SO2 emissions throughout the year from their affected units. The program has been developing, on a staged basis, two major computer-based information systems: the Allowance Tracking System (ATS) for tracking creation, transfer, and ultimate use of allowances; and the Emissions Tracking System (ETS) for transmission, receipt, processing, and inventory of continuous emissions monitoring (CEM) data. The systems collectively form a logical Acid Rain Data System (ARDS). ARDS will be the largest information system ever used to operate and evaluate an environmental program. The paper describes the progressive software engineering approach the Acid Rain Program has been using to develop ARDS. Iterative software version releases, keyed to critical program deadlines, add the functionality required to support specific statutory and regulatory provisions. Each software release also incorporates continual improvements for efficiency, user-friendliness, and lower life-cycle costs. The program is migrating the independent ATS and ETS systems into a logically coordinated True-Up processing model, to support the end-of-year reconciliation for balancing allowance holdings against annual emissions and compliance plans for Phase 1 affected utility units. The paper provides specific examples and data to illustrate exciting applications of today's information technology in ARDS.
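The end-of-year True-Up logic reduces, at its core, to a holdings-versus-emissions comparison per affected unit; the sketch below illustrates that reconciliation with invented figures and makes no claim about the actual ATS/ETS data model.

```python
# Invented figures; each affected unit must hold allowances >= annual SO2 tons.
units = {
    "unit_A": {"allowances_held": 12_000, "so2_tons": 11_400},
    "unit_B": {"allowances_held": 8_500,  "so2_tons": 9_100},
}

for name, u in units.items():
    balance = u["allowances_held"] - u["so2_tons"]
    status = "in compliance" if balance >= 0 else f"short {-balance} allowances"
    print(f"{name}: {status}")
```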
2009-09-01
[Extraction fragment of a trajectory-planning report; only scattered text and figure captions survive. Recoverable content: the intent is not to promote one way as the best, but to show there are several ways to define the problem; Figure 71 ('Final Orientation/Obstacle Scenario') compares running cost against distance from an obstacle for varying values of p; and a sliding-door example shows a major weakness of trajectory planning from snapshots in a dynamic environment.]
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2014-08-01
In this pilot project, the Building America Partnership for Improved Residential Construction and Florida Power and Light are collaborating to retrofit a large number of homes using a phased approach to both simple and deep retrofits. This project will provide the information necessary to significantly reduce energy use through larger community-scale projects in collaboration with utilities, program administrators and other market leader stakeholders.
A Patient-specific Approach to Hospital Cost Accounting
Macdonald, Larry K.; Reuter, Louis F.
1973-01-01
The hospital facilities and manpower used for the various procedures of a delivery suite are identified and measured as the basis for determining individual patient costs. The method of measuring staff and facility requirements, including the “cost of readiness” and the cost of inherent inefficiencies, generates detailed information that can be used in determining utilization ranges for budgeting decisions, for planning space needs, for personnel scheduling, and for patient billing. PMID:4269322
Intelligent multi-sensor integrations
NASA Technical Reports Server (NTRS)
Volz, Richard A.; Jain, Ramesh; Weymouth, Terry
1989-01-01
Growth in the intelligence of space systems requires the use and integration of data from multiple sensors. Generic tools are being developed for extracting and integrating information obtained from multiple sources. The full spectrum of issues is addressed, ranging from data acquisition, to characterization of sensor data, to adaptive systems for utilizing the data. In particular, the project has three major aspects: multisensor processing, an adaptive approach to object recognition, and distributed sensor system integration.
A team approach to an undergraduate interprofessional communication course.
Doucet, Shelley; Buchanan, Judy; Cole, Tricia; McCoy, Carolyn
2013-05-01
Interprofessional communication is a team-taught upper-level undergraduate course for Nursing and Health Sciences students. In addition to teaching fundamental communication skills, this course weaves interprofessional competencies into weekly learning activities and assignments. The utilization of the principles and practices of team-based learning in the classroom enhances the attainment and practice of communication and interprofessional collaboration skills. Lessons learned from conducting informal course evaluations and delivering the course multiple times are presented.
Peter, Samuel C; Whelan, James P; Pfund, Rory A; Meyers, Andrew W
2018-06-14
Although readability has been traditionally operationalized and even become synonymous with the concept of word and sentence length, modern text analysis theory and technology have shifted toward multidimensional comprehension-based analytic techniques. In an effort to make use of these advancements and demonstrate their general utility, 6 commonly used measures of gambling disorder were submitted to readability analyses using 2 of these advanced approaches, Coh-Metrix and Question Understanding Aid (QUAID), and one traditional approach, the Flesch-Kincaid Grade Level. As hypothesized, significant variation was found across measures, with some questionnaires emerging as more appropriate than others for use in samples that may include individuals with low literacy. Recommendations are made for the use of these modern approaches to readability to inform decisions on measure selection and development. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
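Of the three approaches, only the traditional one has a simple closed form; the sketch below computes the Flesch-Kincaid Grade Level with a crude vowel-group syllable heuristic and applies it to an invented questionnaire-style sentence. Coh-Metrix and QUAID are multidimensional tools with no comparable one-line formula.

```python
import re

def count_syllables(word):
    """Crude heuristic: count vowel groups, with a silent-e correction."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    n = len(groups)
    if word.lower().endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def fk_grade(text):
    """Flesch-Kincaid Grade Level = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * len(words) / len(sentences) + 11.8 * syllables / len(words) - 15.59

item = "In the past year, how often have you bet more than you could really afford to lose?"
print(round(fk_grade(item), 1))
```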
3D-model building of the jaw impression
NASA Astrophysics Data System (ADS)
Ahmed, Moumen T.; Yamany, Sameh M.; Hemayed, Elsayed E.; Farag, Aly A.
1997-03-01
A novel approach is proposed to obtain a record of the patient's occlusion using computer vision. Data acquisition is obtained using intra-oral video cameras. The technique utilizes shape from shading to extract 3D information from 2D views of the jaw, and a novel technique for 3D data registration using genetic algorithms. The resulting 3D model can be used for diagnosis, treatment planning, and implant purposes. The overall purpose of this research is to develop a model-based vision system for orthodontics to replace traditional approaches. This system will be flexible, accurate, and will reduce the cost of orthodontic treatments.
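As a loose stand-in for the registration idea (not the paper's method or data), the sketch below aligns two 3-D point sets with known correspondences using a toy genetic algorithm over rotation and translation parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def rotation(ax, ay, az):
    """Rotation matrix from three Euler-style angles."""
    cx, sx, cy, sy, cz, sz = np.cos(ax), np.sin(ax), np.cos(ay), np.sin(ay), np.cos(az), np.sin(az)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def fitness(params, src, dst):
    """Negative mean squared distance after applying the candidate transform."""
    R, t = rotation(*params[:3]), params[3:]
    return -np.mean(np.sum((src @ R.T + t - dst) ** 2, axis=1))

src = rng.normal(size=(300, 3))                            # one partial surface view
true = np.array([0.3, -0.2, 0.4, 1.0, -0.5, 2.0])
dst = src @ rotation(*true[:3]).T + true[3:]               # the other view

pop = rng.uniform(-1, 1, (80, 6)) * np.r_[np.pi, np.pi, np.pi, 3.0, 3.0, 3.0]
for _ in range(150):
    scores = np.array([fitness(p, src, dst) for p in pop])
    elite = pop[np.argsort(scores)[-20:]]                  # selection
    parents = elite[rng.integers(0, 20, (80, 2))]
    child = np.where(rng.random((80, 6)) < 0.5, parents[:, 0], parents[:, 1])  # crossover
    pop = child + rng.normal(scale=0.02, size=(80, 6))     # mutation
    pop[0] = elite[-1]                                     # elitism: keep the best

best = max(pop, key=lambda p: fitness(p, src, dst))
print("final mean squared alignment error:", round(-fitness(best, src, dst), 4))
```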
A mixed methods assessment of coping with pediatric cancer
Alderfer, Melissa A.; Deatrick, Janet A.; Marsac, Meghan L.
2014-01-01
The purpose of this study was to describe child coping and parent coping assistance with cancer-related stressors during treatment. Fifteen children (aged 6-12) with cancer and their parents (N = 17) completed semi-structured interviews and self-report measures to assess coping and coping assistance. Results suggest families utilized a broad array of approach and avoidance strategies to manage cancer and its treatment. Quantitative and qualitative assessments provided complementary and unique contributions to understanding coping among children with cancer and their parents. Using a mixed methods approach to assess coping provides a richer understanding of families’ experiences, which can better inform clinical practice. PMID:24428250
Fluorescence Spectroscopy for the Monitoring of Food Processes.
Ahmad, Muhammad Haseeb; Sahar, Amna; Hitzmann, Bernd
Different analytical techniques have been used to examine the complexity of food samples. Among them, fluorescence spectroscopy cannot be ignored when developing rapid and non-invasive analytical methodologies. It is one of the most sensitive spectroscopic approaches employed in the identification, classification, authentication, quantification, and optimization of different parameters during food handling, processing, and storage, and it relies on different chemometric tools. Chemometrics helps retrieve useful information from spectral data used in the characterization of food samples. This contribution discusses in detail the potential of fluorescence spectroscopy of different foods, such as dairy, meat, fish, eggs, edible oil, cereals, fruit, and vegetables, for qualitative and quantitative analysis with different chemometric approaches.
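A typical chemometric step of the kind alluded to above, sketched on synthetic data: principal component regression linking simulated fluorescence emission spectra to a hidden process parameter. The bands, wavelengths, and 'quality' variable are invented.

```python
import numpy as np

rng = np.random.default_rng(11)
wavelengths = np.linspace(400, 600, 120)              # emission wavelengths (nm)
n_samples = 60

quality = rng.uniform(0, 1, n_samples)                # hidden process parameter
band1 = np.exp(-((wavelengths - 440) / 15) ** 2)      # invented emission band
band2 = np.exp(-((wavelengths - 520) / 20) ** 2)      # invented emission band
spectra = (np.outer(1 - quality, band1) + np.outer(quality, band2)
           + rng.normal(0, 0.02, (n_samples, len(wavelengths))))

# PCA by SVD on mean-centred spectra, then regression on the leading scores.
Xc = spectra - spectra.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:3].T                                # first three principal components
B = np.column_stack([np.ones(n_samples), scores])
coef, *_ = np.linalg.lstsq(B, quality, rcond=None)
pred = B @ coef
r2 = 1 - np.sum((quality - pred) ** 2) / np.sum((quality - quality.mean()) ** 2)
print("calibration R^2:", round(float(r2), 3))
```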