A demonstrative model of a lunar base simulation on a personal computer
NASA Technical Reports Server (NTRS)
1985-01-01
The initial demonstration model of a lunar base simulation is described. This initial model was developed at the personal computer level to demonstrate feasibility and technique before proceeding to a larger computer-based model. The demonstration model was built with Lotus Symphony Version 1.1 software on a personal computer running the MS-DOS operating system. The personal computer-based model determined the applicability of lunar base modeling techniques developed at an LSPI/NASA workshop. In addition, the personal computer-based demonstration model defined a modeling structure that could be employed in a larger, more comprehensive VAX-based lunar base simulation. Refinement of this personal computer model and the development of a VAX-based model are planned in the near future.
Computer Models of Personality: Implications for Measurement
ERIC Educational Resources Information Center
Cranton, P. A.
1976-01-01
Current research on computer models of personality is reviewed and categorized under five headings: (1) models of belief systems; (2) models of interpersonal behavior; (3) models of decision-making processes; (4) prediction models; and (5) theory-based simulations of specific processes. The use of computer models in personality measurement is…
Computer-based personality judgments are more accurate than those made by humans
Youyou, Wu; Kosinski, Michal; Stillwell, David
2015-01-01
Judging others’ personalities is an essential skill in successful social living, as personality is a key driver behind people’s interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants’ Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy. PMID:25583507
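The headline numbers above are Pearson correlations between judged and self-reported trait scores. A hedged illustration of that accuracy metric (the scores below are toy values invented for this sketch, not the study's data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy self-reported trait scores and two judges' predictions (hypothetical)
self_report = [3.1, 4.0, 2.5, 4.6, 3.8]
computer    = [3.0, 4.2, 2.7, 4.4, 3.9]   # hypothetical model output
friend      = [3.5, 3.6, 3.4, 4.0, 3.2]   # hypothetical human judgments

r_computer = pearson_r(self_report, computer)
r_friend = pearson_r(self_report, friend)
```

In the study, the same comparison is made at scale: higher r against the self-report questionnaire means the more accurate judge.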
Archetype-Based Modeling of Persona for Comprehensive Personality Computing from Personal Big Data.
Guo, Ao; Ma, Jianhua
2018-02-25
A model describing the wide variety of human behaviours called personality is becoming increasingly popular among researchers due to the widespread availability of personal big data generated from the use of prevalent digital devices, e.g., smartphones and wearables. Such an approach can be used to model an individual and even digitally clone a person, e.g., a Cyber-I (cyber individual). This work is aimed at establishing a unique and comprehensive description of an individual to mesh with various personalized services and applications. An extensive research literature exists on or related to psychological modelling, i.e., automatic personality computing. However, the integrity and accuracy of the results from current automatic personality computing are insufficient for the elaborate modeling in Cyber-I due to an insufficient number of data sources. To reach a comprehensive psychological description of a person, it is critical to bring in heterogeneous data sources that can provide plenty of personal data, e.g., physiological data and Internet data. In addition, instead of calculating personality traits from personal data directly, an approach based on a personality model derived from the theories of Carl Gustav Jung is used to measure a human subject's persona. Therefore, this research focuses on designing an archetype-based model of persona covering an individual's facets in different situations to approach a comprehensive personality model. Using personal big data to measure a specific persona in a certain scenario, our research is designed to ensure the accuracy and integrity of the generated personality model.
Geometric Data Perturbation-Based Personal Health Record Transactions in Cloud Computing
Balasubramaniam, S.; Kavitha, V.
2015-01-01
Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud. PMID:25767826
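Geometric data perturbation, as named above, typically combines rotation, translation, and optional noise so that distance-based computations still work on the masked data after outsourcing. A minimal sketch under that assumption (toy 2-D records, not the authors' scheme):

```python
import math
import random

def perturb(records, angle, shift, noise=0.0, seed=0):
    """Geometric data perturbation sketch: rotate, translate, and
    optionally add Gaussian noise to 2-D records before outsourcing.
    Rotation and translation preserve pairwise Euclidean distances,
    so distance-based queries still work on the perturbed data."""
    rng = random.Random(seed)
    cos_a, sin_a = math.cos(angle), math.sin(angle)
    out = []
    for x, y in records:
        rx = cos_a * x - sin_a * y
        ry = sin_a * x + cos_a * y
        out.append((rx + shift[0] + rng.gauss(0, noise),
                    ry + shift[1] + rng.gauss(0, noise)))
    return out

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

# Hypothetical 2-D health-record values (illustrative only)
original = [(120.0, 80.0), (140.0, 90.0), (110.0, 70.0)]
masked = perturb(original, angle=0.7, shift=(15.0, -8.0))
```

The masked records look nothing like the originals, yet their pairwise distances are unchanged, which is what lets the cloud run similarity queries without seeing the raw data.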
Modeling the Impact of Motivation, Personality, and Emotion on Social Behavior
NASA Astrophysics Data System (ADS)
Miller, Lynn C.; Read, Stephen J.; Zachary, Wayne; Rosoff, Andrew
Models seeking to predict human social behavior must contend with multiple sources of individual and group variability that underlie social behavior. One set of interrelated factors that strongly contribute to that variability - motivations, personality, and emotions - has been only minimally incorporated in previous computational models of social behavior. The Personality, Affect, Culture (PAC) framework is a theory-based computational model that addresses this gap. PAC is used to simulate social agents whose social behavior varies according to their personalities and emotions, which, in turn, vary according to their motivations and underlying motive control parameters. Examples involving disease spread and counter-insurgency operations show how PAC can be used to study behavioral variability in different social contexts.
CAT Model with Personalized Algorithm for Evaluation of Estimated Student Knowledge
ERIC Educational Resources Information Center
Andjelic, Svetlana; Cekerevac, Zoran
2014-01-01
This article presents an original model of computer adaptive testing and grade formation, based on scientifically recognized theories. The core of the model is a personalized algorithm for selecting questions depending on the accuracy of the answer to the previous question. The test is divided into three basic levels of difficulty, and the…
Navab, Nassir; Hennersperger, Christoph; Frisch, Benjamin; Fürst, Bernhard
2016-10-01
In the last decade, many researchers in medical image computing and computer-assisted interventions across the world focused on the development of the Virtual Physiological Human (VPH), aiming at changing the practice of medicine from the classification and treatment of diseases to the modeling and treatment of patients. These projects resulted in major advancements in segmentation, registration, and morphological, physiological and biomechanical modeling based on state-of-the-art medical imaging as well as other sensory data. However, a major issue which has not yet come into focus is personalizing intra-operative imaging, allowing for optimal treatment. In this paper, we discuss the personalization of the imaging and visualization process, with particular focus on satisfying the challenging requirements of computer-assisted interventions. We discuss such requirements and review a series of scientific contributions made by our research team to tackle some of these major challenges. Copyright © 2016. Published by Elsevier B.V.
USDA-ARS?s Scientific Manuscript database
Service-oriented architectures allow modelling engines to be hosted over the Internet, abstracting physical hardware configuration and software deployments from model users. Many existing environmental models are deployed as desktop applications running on users' personal computers (PCs). Migration ...
Roth, Christian J; Becher, Tobias; Frerichs, Inéz; Weiler, Norbert; Wall, Wolfgang A
2017-04-01
Providing optimal personalized mechanical ventilation for patients with acute or chronic respiratory failure is still a challenge within a clinical setting for each case anew. In this article, we integrate electrical impedance tomography (EIT) monitoring into a powerful patient-specific computational lung model to create an approach for personalizing protective ventilatory treatment. The underlying computational lung model is based on a single computed tomography scan and able to predict global airflow quantities, as well as local tissue aeration and strains, for any ventilation maneuver. For validation, a novel "virtual EIT" module is added to our computational lung model, allowing us to simulate EIT images based on the patient's thorax geometry and the results of our numerically predicted tissue aeration. Clinically measured EIT images are not used to calibrate the computational model; thus, they provide an independent method to validate the computational predictions at high temporal resolution. The performance of this coupling approach has been tested in an example patient with acute respiratory distress syndrome. The method shows good agreement between computationally predicted and clinically measured airflow data and EIT images. These results imply that the proposed framework can be used for numerical prediction of patient-specific responses to certain therapeutic measures before applying them to an actual patient. In the long run, the definition of patient-specific optimal ventilation protocols might be assisted by computational modeling. NEW & NOTEWORTHY In this work, we present a patient-specific computational lung model that is able to predict global and local ventilatory quantities for a given patient and any selected ventilation protocol. For the first time, such a predictive lung model is equipped with a virtual electrical impedance tomography module allowing real-time validation of the computed results against patient measurements. First promising results obtained in an acute respiratory distress syndrome patient show the potential of this approach for personalized, computationally guided optimization of mechanical ventilation in the future. Copyright © 2017 the American Physiological Society.
Image-Based Predictive Modeling of Heart Mechanics.
Wang, V Y; Nielsen, P M F; Nash, M P
2015-01-01
Personalized biophysical modeling of the heart is a useful approach for noninvasively analyzing and predicting in vivo cardiac mechanics. Three main developments support this style of analysis: state-of-the-art cardiac imaging technologies, modern computational infrastructure, and advanced mathematical modeling techniques. In vivo measurements of cardiac structure and function can be integrated using sophisticated computational methods to investigate mechanisms of myocardial function and dysfunction, and can aid in clinical diagnosis and developing personalized treatment. In this article, we review the state-of-the-art in cardiac imaging modalities, model-based interpretation of 3D images of cardiac structure and function, and recent advances in modeling that allow personalized predictions of heart mechanics. We discuss how using such image-based modeling frameworks can increase the understanding of the fundamental biophysics behind cardiac mechanics, and assist with diagnosis, surgical guidance, and treatment planning. Addressing the challenges in this field will require a coordinated effort from both the clinical-imaging and modeling communities. We also discuss future directions that can be taken to bridge the gap between basic science and clinical translation.
A vectorial semantics approach to personality assessment.
Neuman, Yair; Cohen, Yochai
2014-04-23
Personality assessment and, specifically, the assessment of personality disorders have traditionally been indifferent to computational models. Computational personality is a new field that involves the automatic classification of individuals' personality traits that can be compared against gold-standard labels. In this context, we introduce a new vectorial semantics approach to personality assessment, which involves the construction of vectors representing personality dimensions and disorders, and the automatic measurements of the similarity between these vectors and texts written by human subjects. We evaluated our approach by using a corpus of 2468 essays written by students who were also assessed through the five-factor personality model. To validate our approach, we measured the similarity between the essays and the personality vectors to produce personality disorder scores. These scores and their correspondence with the subjects' classification of the five personality factors reproduce patterns well-documented in the psychological literature. In addition, we show that, based on the personality vectors, we can predict each of the five personality factors with high accuracy.
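The vector-similarity step described above can be illustrated with a simple bag-of-words cosine similarity. The seed words and texts below are invented for this sketch and are not the paper's actual lexicon or corpus:

```python
import math
from collections import Counter

def bow(text):
    """Bag-of-words vector as a Counter of lowercase tokens."""
    return Counter(text.lower().split())

def cosine(u, v):
    """Cosine similarity between two sparse count vectors.
    Counter returns 0 for absent words, so iterating over u suffices."""
    dot = sum(u[w] * v[w] for w in u)
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# Hypothetical seed words for one personality dimension (illustrative)
extraversion_vec = bow("party friends talk social outgoing energetic")
essay = bow("i love to talk and meet friends at every party i can find")
neutral = bow("the report summarizes quarterly figures and budget tables")

score_essay = cosine(extraversion_vec, essay)
score_neutral = cosine(extraversion_vec, neutral)
```

An essay sharing vocabulary with the dimension vector scores higher than an unrelated text; the paper builds its personality vectors from much richer semantic representations, but the scoring principle is the same.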
Computational modeling in melanoma for novel drug discovery.
Pennisi, Marzio; Russo, Giulia; Di Salvatore, Valentina; Candido, Saverio; Libra, Massimo; Pappalardo, Francesco
2016-06-01
There is a growing body of evidence highlighting the applications of computational modeling in the field of biomedicine. It has recently been applied to the in silico analysis of cancer dynamics. In the era of precision medicine, this analysis may allow the discovery of new molecular targets useful for the design of novel therapies and for overcoming resistance to anticancer drugs. According to its molecular behavior, melanoma represents an interesting tumor model in which computational modeling can be applied. Melanoma is an aggressive tumor of the skin with a poor prognosis for patients with advanced disease as it is resistant to current therapeutic approaches. This review discusses the basics of computational modeling in melanoma drug discovery and development. Discussion includes the in silico discovery of novel molecular drug targets, the optimization of immunotherapies and personalized medicine trials. Mathematical and computational models are gradually being used to help understand biomedical data produced by high-throughput analysis. The use of advanced computer models allowing the simulation of complex biological processes provides hypotheses and supports experimental design. The research in fighting aggressive cancers, such as melanoma, is making great strides. Computational models represent the key component to complement these efforts. Due to the combinatorial complexity of new drug discovery, a systematic approach based only on experimentation is not possible. Computational and mathematical models are necessary for bringing cancer drug discovery into the era of omics, big data and personalized medicine.
An u-Service Model Based on a Smart Phone for Urban Computing Environments
NASA Astrophysics Data System (ADS)
Cho, Yongyun; Yoe, Hyun
In urban computing environments, all services should be based on the interaction between humans and their surroundings, which occurs frequently and ordinarily at home and in the office. This paper proposes a u-service model based on a smart phone for urban computing environments. The suggested service model includes a context-aware and personalized service scenario development environment that can instantly describe a user's u-service demand or situation information with smart devices. To this end, the architecture of the suggested service model consists of a graphical service editing environment for smart devices, a u-service platform, and an infrastructure with sensors and WSN/USN. The graphic editor expresses contexts as execution conditions of a new service through an ontology-based context model. The service platform handles the service scenario according to contexts. With the suggested service model, a user in urban computing environments can quickly and easily create a u-service or a new service using smart devices.
Tissue-scale, personalized modeling and simulation of prostate cancer growth
NASA Astrophysics Data System (ADS)
Lorenzo, Guillermo; Scott, Michael A.; Tew, Kevin; Hughes, Thomas J. R.; Zhang, Yongjie Jessica; Liu, Lei; Vilanova, Guillermo; Gomez, Hector
2016-11-01
Recently, mathematical modeling and simulation of diseases and their treatments have enabled the prediction of clinical outcomes and the design of optimal therapies on a personalized (i.e., patient-specific) basis. This new trend in medical research has been termed “predictive medicine.” Prostate cancer (PCa) is a major health problem and an ideal candidate to explore tissue-scale, personalized modeling of cancer growth for two main reasons: First, it is a small organ, and, second, tumor growth can be estimated by measuring serum prostate-specific antigen (PSA, a PCa biomarker in blood), which may enable in vivo validation. In this paper, we present a simple continuous model that reproduces the growth patterns of PCa. We use the phase-field method to account for the transformation of healthy cells to cancer cells and use diffusion-reaction equations to compute nutrient consumption and PSA production. To accurately and efficiently compute tumor growth, our simulations leverage isogeometric analysis (IGA). Our model is shown to reproduce a known shape instability from a spheroidal pattern to fingered growth. Results of our computations indicate that such shift is a tumor response to escape starvation, hypoxia, and, eventually, necrosis. Thus, branching enables the tumor to minimize the distance from inner cells to external nutrients, contributing to cancer survival and further development. We have also used our model to perform tissue-scale, personalized simulation of a PCa patient, based on prostatic anatomy extracted from computed tomography images. This simulation shows tumor progression similar to that seen in clinical practice.
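The phase-field approach mentioned above represents tumor tissue as a continuous field phi (approximately 1 in tumor, 0 in healthy tissue) evolved by a reaction-diffusion equation. A one-dimensional Allen-Cahn-style sketch with illustrative parameters (not the paper's model, coefficients, or isogeometric discretization):

```python
def step(phi, dx=1.0, dt=0.01, eps=1.0, mob=1.0):
    """One explicit Euler step of a 1-D Allen-Cahn phase field:
        dphi/dt = mob * (eps * laplacian(phi) - W'(phi)),
    with double-well derivative W'(phi) = 2*phi*(1-phi)*(1-2*phi),
    whose stable wells at phi=0 (healthy) and phi=1 (tumor) keep the
    field close to the two tissue states while diffusion moves the
    interface. Boundary cells are held fixed."""
    n = len(phi)
    new = phi[:]
    for i in range(1, n - 1):
        lap = (phi[i - 1] - 2 * phi[i] + phi[i + 1]) / dx ** 2
        dw = 2 * phi[i] * (1 - phi[i]) * (1 - 2 * phi[i])
        new[i] = phi[i] + dt * mob * (eps * lap - dw)
    return new

# Small tumor seed in the middle of healthy tissue
phi = [0.0] * 40
for i in range(18, 22):
    phi[i] = 1.0
for _ in range(100):
    phi = step(phi)
```

The real model couples such a field to nutrient diffusion-reaction equations and PSA production in 3-D; this sketch only shows the phase-field mechanism for tracking the tumor-healthy interface.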
Computational neuroanatomy: ontology-based representation of neural components and connectivity.
Rubin, Daniel L; Talos, Ion-Florin; Halle, Michael; Musen, Mark A; Kikinis, Ron
2009-02-05
A critical challenge in neuroscience is organizing, managing, and accessing the explosion in neuroscientific knowledge, particularly anatomic knowledge. We believe that explicit knowledge-based approaches to make neuroscientific knowledge computationally accessible will be helpful in tackling this challenge and will enable a variety of applications exploiting this knowledge, such as surgical planning. We developed ontology-based models of neuroanatomy to enable symbolic lookup, logical inference and mathematical modeling of neural systems. We built a prototype model of the motor system that integrates descriptive anatomic and qualitative functional neuroanatomical knowledge. In addition to modeling normal neuroanatomy, our approach provides an explicit representation of abnormal neural connectivity in disease states, such as common movement disorders. The ontology-based representation encodes both structural and functional aspects of neuroanatomy. The ontology-based models can be evaluated computationally, enabling development of automated computer reasoning applications. Neuroanatomical knowledge can be represented in machine-accessible format using ontologies. Computational neuroanatomical approaches such as described in this work could become a key tool in translational informatics, leading to decision support applications that inform and guide surgical planning and personalized care for neurological disease in the future.
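One thing an ontology-based connectivity model enables is automated inference over neural pathways, e.g., finding every structure downstream of a given one (which is how a lesion's effects could be traced for surgical planning). A toy sketch with hypothetical structure names, not the authors' actual ontology:

```python
def reachable(connections, start):
    """Transitive closure over a directed connectivity graph:
    every structure receiving projections, directly or indirectly,
    from `start` -- a minimal stand-in for ontology-based inference."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        for nxt in connections.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

# Hypothetical motor-pathway fragment (illustrative names only)
connections = {
    "motor_cortex": ["internal_capsule"],
    "internal_capsule": ["spinal_cord"],
    "spinal_cord": ["muscle"],
}
downstream = reachable(connections, "motor_cortex")
```

A real ontology adds typed relations, properties, and logical axioms on top of this bare graph, but reachability queries of this kind are the simplest form of the "automated computer reasoning" the abstract describes.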
Computing of Learner's Personality Traits Based on Digital Annotations
ERIC Educational Resources Information Center
Omheni, Nizar; Kalboussi, Anis; Mazhoud, Omar; Kacem, Ahmed Hadj
2017-01-01
Researchers in education are interested in modeling learners' profiles and adapting their learning experiences accordingly. When learners read and interact with their reading materials, they engage in unconscious practices, such as annotation, which may be a key feature of their personalities. Annotation activity requires readers to be active, to think…
Registration of surface structures using airborne focused ultrasound.
Sundström, N; Börjesson, P O; Holmer, N G; Olsson, L; Persson, H W
1991-01-01
A low-cost measuring system, based on a personal computer combined with standard equipment for complex measurements and signal processing, has been assembled. Such a system increases the possibilities for small hospitals and clinics to finance advanced measuring equipment. A description of equipment developed for airborne ultrasound together with a personal computer-based system for fast data acquisition and processing is given. Two air-adapted ultrasound transducers with high lateral resolution have been developed. Furthermore, a few results for fast and accurate estimation of signal arrival time are presented. The theoretical estimation models developed are applied to skin surface profile registrations.
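Estimating signal arrival time, as mentioned above, is commonly done by locating the peak of the cross-correlation between a reference pulse and the received signal; the sketch below uses synthetic data and is not the authors' estimation model:

```python
def cross_correlation_delay(ref, rx):
    """Return the lag (in samples) at which the received signal rx
    best matches the reference pulse ref, by brute-force
    cross-correlation."""
    best_lag, best_val = 0, float("-inf")
    for lag in range(len(rx) - len(ref) + 1):
        val = sum(r * x for r, x in zip(ref, rx[lag:lag + len(ref)]))
        if val > best_val:
            best_lag, best_val = lag, val
    return best_lag

# Synthetic ultrasound-like pulse embedded at sample 37
pulse = [0.0, 0.5, 1.0, 0.5, -0.5, -1.0, -0.5, 0.0]
received = [0.0] * 37 + pulse + [0.0] * 20
delay = cross_correlation_delay(pulse, received)
```

Dividing the lag by the sampling rate gives the arrival time; practical systems refine this with sub-sample interpolation and use FFT-based correlation for speed.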
Development of qualification guidelines for personal computer-based aviation training devices.
DOT National Transportation Integrated Search
1995-02-01
Recent advances in the capabilities of personal computers have resulted in an increase in the number of flight simulation programs made available as Personal Computer-Based Aviation Training Devices (PCATDs). The potential benefits of PCATDs have been...
Neic, Aurel; Campos, Fernando O; Prassl, Anton J; Niederer, Steven A; Bishop, Martin J; Vigmond, Edward J; Plank, Gernot
2017-10-01
Anatomically accurate and biophysically detailed bidomain models of the human heart have proven a powerful tool for gaining quantitative insight into the links between electrical sources in the myocardium and the concomitant current flow in the surrounding medium, as they represent their relationship mechanistically based on first principles. Such models are increasingly considered as a clinical research tool with the perspective of being used, ultimately, as a complementary diagnostic modality. An important prerequisite in many clinical modeling applications is the ability of models to faithfully replicate potential maps and electrograms recorded from a given patient. However, while the personalization of electrophysiology models based on the gold-standard bidomain formulation is in principle feasible, the associated computational expenses are significant, rendering their use incompatible with clinical time frames. In this study we report on the development of a novel computationally efficient reaction-eikonal (R-E) model for modeling extracellular potential maps and electrograms. Using a biventricular human electrophysiology model, which incorporates a topologically realistic His-Purkinje system (HPS), we demonstrate by comparing against a high-resolution reaction-diffusion (R-D) bidomain model that the R-E model predicts extracellular potential fields, electrograms as well as ECGs at the body surface with high fidelity and offers vast computational savings greater than three orders of magnitude. Due to their efficiency, R-E models are ideally suited to forward simulations in clinical modeling studies which attempt to personalize electrophysiological model features.
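The eikonal component of an R-E model computes local activation times from a conduction-velocity field instead of solving the full reaction-diffusion system. That idea can be sketched as a shortest-path computation on a grid, with edge cost distance/velocity (an illustrative approximation, not the authors' solver or mesh):

```python
import heapq

def activation_times(velocity, source, h=1.0):
    """Eikonal-style activation map on a 2-D grid via Dijkstra:
    time to reach each node from the pacing site `source`, where
    entering a node costs h / local conduction velocity."""
    rows, cols = len(velocity), len(velocity[0])
    times = {source: 0.0}
    pq = [(0.0, source)]
    while pq:
        t, (r, c) = heapq.heappop(pq)
        if t > times.get((r, c), float("inf")):
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nt = t + h / velocity[nr][nc]
                if nt < times.get((nr, nc), float("inf")):
                    times[(nr, nc)] = nt
                    heapq.heappush(pq, (nt, (nr, nc)))
    return times

# Uniform conduction velocity except one slow row (e.g., a scar border)
vel = [[1.0] * 5 for _ in range(5)]
for c in range(5):
    vel[2][c] = 0.2  # slow row delays any wavefront crossing it
tmap = activation_times(vel, source=(0, 0))
```

The reaction part of an R-E model then plays a stored action-potential waveform at each node's activation time, which is where the large speedup over reaction-diffusion bidomain simulation comes from.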
Agur, Zvia; Elishmereni, Moran; Kheifetz, Yuri
2014-01-01
Despite its great promise, personalized oncology still faces many hurdles, and it is increasingly clear that targeted drugs and molecular biomarkers alone yield only modest clinical benefit. One reason is the complex relationship between biomarkers and the patient's response to drugs, which obscures the true weight of the biomarkers in the patient's overall response. This complexity can be disentangled by computational models that integrate the effects of personal biomarkers into a simulator of drug-patient dynamic interactions to predict clinical outcomes. Several computational tools have been developed for personalized oncology, notably evidence-based tools for simulating pharmacokinetics and Bayesian-estimated tools for predicting survival. We describe representative statistical and mathematical tools and discuss their merits, shortcomings, and preliminary clinical validation attesting to their potential. Yet the individualization power of mathematical models alone, or of statistical models alone, is limited. More accurate and versatile personalization tools can be constructed by a new application of the statistical/mathematical nonlinear mixed-effects modeling (NLMEM) approach, which until recently has been used only in drug development. Using these advanced tools, clinical data from patient populations can be integrated with mechanistic models of disease and physiology to generate personal mathematical models. Upon more substantial validation in the clinic, this approach will hopefully be applied in personalized clinical trials ("P-trials"), thereby aiding the establishment of personalized medicine within mainstream clinical oncology. © 2014 Wiley Periodicals, Inc.
Development of a Personalized Educational Computer Game Based on Students' Learning Styles
ERIC Educational Resources Information Center
Hwang, Gwo-Jen; Sung, Han-Yu; Hung, Chun-Ming; Huang, Iwen; Tsai, Chin-Chung
2012-01-01
In recent years, many researchers have been engaged in the development of educational computer games; however, previous studies have indicated that, without supportive models that take individual students' learning needs or difficulties into consideration, students might only show temporary interest during the learning process, and their learning…
Computer simulation: A modern day crystal ball?
NASA Technical Reports Server (NTRS)
Sham, Michael; Siprelle, Andrew
1994-01-01
It has long been the desire of managers to be able to look into the future and predict the outcome of decisions. With the advent of computer simulation and the tremendous capability provided by personal computers, that desire can now be realized. This paper presents an overview of computer simulation and modeling and discusses the capabilities of Extend, an icon-driven, Macintosh-based software tool that brings the power of simulation to the average computer user. An example of an Extend-based model is presented in the form of the Space Transportation System (STS) Processing Model. The STS Processing Model produces eight shuttle launches per year, yet takes only about ten minutes to run. In addition, statistical data such as facility utilization, wait times, and processing bottlenecks are produced. The addition or deletion of resources, such as orbiters or facilities, can be easily modeled and their impact analyzed. Through the use of computer simulation, it is possible to look into the future to see the impact of today's decisions.
Research and the Personal Computer.
ERIC Educational Resources Information Center
Blackburn, D. A.
1989-01-01
Discussed is the history and elements of the personal computer. Its uses as a laboratory assistant and generic toolkit for mathematical analysis and modeling are included. The future of the personal computer in research is addressed. (KR)
El-Kalioby, Mohamed; Abouelhoda, Mohamed; Krüger, Jan; Giegerich, Robert; Sczyrba, Alexander; Wall, Dennis P; Tonellato, Peter
2012-01-01
Background Bioinformatics services have been traditionally provided in the form of a web-server that is hosted at institutional infrastructure and serves multiple users. This model, however, is not flexible enough to cope with the increasing number of users, increasing data size, and new requirements in terms of speed and availability of service. The advent of cloud computing suggests a new service model that provides an efficient solution to these problems, based on the concepts of "resources-on-demand" and "pay-as-you-go". However, cloud computing has not yet been introduced within bioinformatics servers due to the lack of usage scenarios and software layers that address the requirements of the bioinformatics domain. Results In this paper, we provide different use case scenarios for providing cloud computing based services, considering both the technical and financial aspects of the cloud computing service model. These scenarios are for individual users seeking computational power as well as bioinformatics service providers aiming at provision of personalized bioinformatics services to their users. We also present elasticHPC, a software package and a library that facilitates the use of high performance cloud computing resources in general and the implementation of the suggested bioinformatics scenarios in particular. Concrete examples that demonstrate the suggested use case scenarios with whole bioinformatics servers and major sequence analysis tools like BLAST are presented. Experimental results with large datasets are also included to show the advantages of the cloud model. Conclusions Our use case scenarios and the elasticHPC package are steps towards the provision of cloud based bioinformatics services, which would help in overcoming the data challenge of recent biological research. All resources related to elasticHPC and its web-interface are available at http://www.elasticHPC.org. PMID:23281941
Computational neuroanatomy: ontology-based representation of neural components and connectivity
Rubin, Daniel L; Talos, Ion-Florin; Halle, Michael; Musen, Mark A; Kikinis, Ron
2009-01-01
Background A critical challenge in neuroscience is organizing, managing, and accessing the explosion in neuroscientific knowledge, particularly anatomic knowledge. We believe that explicit knowledge-based approaches to make neuroscientific knowledge computationally accessible will be helpful in tackling this challenge and will enable a variety of applications exploiting this knowledge, such as surgical planning. Results We developed ontology-based models of neuroanatomy to enable symbolic lookup, logical inference and mathematical modeling of neural systems. We built a prototype model of the motor system that integrates descriptive anatomic and qualitative functional neuroanatomical knowledge. In addition to modeling normal neuroanatomy, our approach provides an explicit representation of abnormal neural connectivity in disease states, such as common movement disorders. The ontology-based representation encodes both structural and functional aspects of neuroanatomy. The ontology-based models can be evaluated computationally, enabling development of automated computer reasoning applications. Conclusion Neuroanatomical knowledge can be represented in machine-accessible format using ontologies. Computational neuroanatomical approaches such as described in this work could become a key tool in translational informatics, leading to decision support applications that inform and guide surgical planning and personalized care for neurological disease in the future. PMID:19208191
Zao, John K.; Gan, Tchin-Tze; You, Chun-Kai; Chung, Cheng-En; Wang, Yu-Te; Rodríguez Méndez, Sergio José; Mullen, Tim; Yu, Chieh; Kothe, Christian; Hsiao, Ching-Teng; Chu, San-Liang; Shieh, Ce-Kuen; Jung, Tzyy-Ping
2014-01-01
EEG-based brain-computer interfaces (BCIs) face basic challenges in real-world applications. The technical difficulties in developing truly wearable BCI systems capable of making reliable real-time predictions of users' cognitive states in dynamic real-life situations may seem almost insurmountable at times. Fortunately, recent advances in miniature sensors, wireless communication, and distributed computing technologies offer promising ways to bridge these chasms. In this paper, we report an attempt to develop a pervasive on-line EEG-BCI system using state-of-the-art technologies, including multi-tier Fog and Cloud Computing, semantic Linked Data search, and adaptive prediction/classification models. To verify our approach, we implemented a pilot system employing wireless dry-electrode EEG headsets and MEMS motion sensors as the front-end devices, Android mobile phones as the personal user interfaces, compact personal computers as the near-end Fog Servers, and the computer clusters hosted by the Taiwan National Center for High-performance Computing (NCHC) as the far-end Cloud Servers. We succeeded in conducting synchronous multi-modal global data streaming in March 2013 and in running a multi-player on-line EEG-BCI game in September 2013. We are currently working with the ARL Translational Neuroscience Branch to use our system in real-life personal stress monitoring, and with the UCSD Movement Disorder Center to conduct in-home Parkinson's disease patient monitoring experiments. We shall proceed to develop the necessary BCI ontology and to introduce automatic semantic annotation and progressive model-refinement capability to our system. PMID:24917804
Communication for Scientists and Engineers: A "Computer Model" in the Basic Course.
ERIC Educational Resources Information Center
Haynes, W. Lance
Successful speech should rest not on prepared notes and outlines but on genuine oral discourse based on "data" fed into the "software" in the computer which already exists within each person. Writing cannot speak for itself, nor can it continually adjust itself to accommodate diverse response. Moreover, no matter how skillfully…
A CLIPS based personal computer hardware diagnostic system
NASA Technical Reports Server (NTRS)
Whitson, George M.
1991-01-01
Often the person designated to repair personal computers has little or no knowledge of how to repair a computer. Described here is a simple expert system to aid these inexperienced repair people. The first component of the system leads the repair person through a number of simple system checks such as making sure that all cables are tight and that the dip switches are set correctly. The second component of the system assists the repair person in evaluating error codes generated by the computer. The final component of the system applies a large knowledge base to attempt to identify the component of the personal computer that is malfunctioning. We have implemented and tested our design with a full system to diagnose problems for an IBM compatible system based on the 8088 chip. In our tests, the inexperienced repair people found the system very useful in diagnosing hardware problems.
Relationship between Norm-internalization and Cooperation in N-person Prisoners' Dilemma Games
NASA Astrophysics Data System (ADS)
Matsumoto, Mitsutaka
In this paper, I discuss the problem of "order in social situations" using a computer simulation of the iterated N-person prisoners' dilemma game. It has been claimed that, in the case of the 2-person prisoners' dilemma, repetition of games and the reciprocal use of the "tit-for-tat" strategy promote the possibility of cooperation. However, in cases of the N-person prisoners' dilemma where N is greater than 2, this logic does not work effectively. The most essential problem is the so-called "sanctioning problem". In this paper, I first discuss the sanctioning problems introduced by Axelrod and Keohane in 1986. Based on the model formalized by Axelrod, I propose a new model that adds a mechanism of player payoff change to Axelrod's model. I call this mechanism norm-internalization and call the model the "norm-internalization game". Second, using the model, I investigate the relationship between agents' norm-internalization (payoff alteration) and the possibilities of cooperation. The results of computer simulation indicate that an unequal distribution of the cooperating norm and a uniform distribution of the sanctioning norm are more effective in establishing cooperation. I discuss the mathematical features and the implications of the results for social science.
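The payoff structure underlying an N-person prisoners' dilemma of the kind simulated above can be sketched in a few lines of Python. This is a generic textbook formulation with illustrative benefit/cost values b and c, not the payoffs of Axelrod's model or the norm-internalization game:

```python
def npd_payoff(cooperates: bool, n_cooperators: int, n: int,
               b: float = 2.0, c: float = 1.0) -> float:
    """Payoff in an N-person prisoners' dilemma.

    Each cooperator pays cost c; the total benefit b * n_cooperators is
    shared equally among all n players. Defectors enjoy the shared benefit
    without paying the cost, so defection dominates individually even
    though universal cooperation beats universal defection.
    """
    share = b * n_cooperators / n
    return share - c if cooperates else share

n = 5
# Defection dominates: whatever the others do, defecting pays more than
# cooperating (with k other cooperators, a cooperator makes k+1 in total).
for k in range(n):
    assert npd_payoff(False, k, n) > npd_payoff(True, k + 1, n)

# Yet all-cooperate beats all-defect -- the dilemma:
assert npd_payoff(True, n, n) > npd_payoff(False, 0, n)
```

The dilemma requires b/n < c < b; with b = 2, c = 1, and n = 5 this holds, which is why both assertions pass.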
Personality and emotion-based high-level control of affective story characters.
Su, Wen-Poh; Pham, Binh; Wardhani, Aster
2007-01-01
Human emotional behavior, personality, and body language are the essential elements in the recognition of a believable synthetic story character. This paper presents an approach using story scripts and action descriptions in a form similar to the content description of storyboards to predict specific personality and emotional states. By adopting the Abridged Big Five Circumplex (AB5C) Model of personality from the study of psychology as a basis for a computational model, we construct a hierarchical fuzzy rule-based system to facilitate the personality and emotion control of the body language of a dynamic story character. The story character can consistently perform specific postures and gestures based on his/her personality type. Story designers can devise a story context in the form of our story interface which predictably motivates personality and emotion values to drive the appropriate movements of the story characters. Our system takes advantage of relevant knowledge described by psychologists and researchers of storytelling, nonverbal communication, and human movement. Our ultimate goal is to facilitate the high-level control of a synthetic character.
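As a rough illustration of how a fuzzy rule base can map personality and emotion values to a body-language parameter, here is a minimal Mamdani-style sketch. The two rules, the membership functions, and the trait ranges are invented for illustration; they are not the AB5C-based hierarchical system described above:

```python
def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def gesture_expansiveness(extraversion: float, arousal: float) -> float:
    """Tiny fuzzy inference: fire two rules, then defuzzify by taking the
    weighted average of each rule's output level."""
    # Rule 1: IF extraversion is high AND arousal is high THEN expansive (1.0)
    w1 = min(tri(extraversion, 0.4, 1.0, 1.6), tri(arousal, 0.4, 1.0, 1.6))
    # Rule 2: IF extraversion is low THEN reserved (0.2)
    w2 = tri(extraversion, -0.6, 0.0, 0.6)
    if w1 + w2 == 0:
        return 0.5  # no rule fires: neutral default
    return (w1 * 1.0 + w2 * 0.2) / (w1 + w2)

assert abs(gesture_expansiveness(0.9, 0.9) - 1.0) < 1e-9  # extravert, aroused
assert abs(gesture_expansiveness(0.1, 0.5) - 0.2) < 1e-9  # introvert
```

A production system would layer many such rules per trait and emotion, but the fire-then-defuzzify pattern is the same.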
Sorensen, Mads Solvsten; Mosegaard, Jesper; Trier, Peter
2009-06-01
Existing virtual simulators for middle ear surgery are based on 3-dimensional (3D) models from computed tomographic or magnetic resonance imaging data, in which image quality is limited by the lack of detail (maximum, approximately 50 voxels/mm3), natural color, and texture of the source material. Virtual training often requires the purchase of a program, a customized computer, and expensive peripherals dedicated exclusively to this purpose. The Visible Ear freeware library of digital images from a fresh-frozen human temporal bone was segmented and real-time volume rendered as a 3D model of high fidelity, true color, and great anatomic detail and realism of the surgically relevant structures. A haptic drilling model was developed for surgical interaction with the 3D model. Realistic visualization in high fidelity (approximately 125 voxels/mm3) and true color, in 2D or optional anaglyph stereoscopic 3D, was achieved on a standard Core 2 Duo personal computer with a GeForce 8800 GTX graphics card, and surgical interaction was provided through a relatively inexpensive (approximately $2,500) Phantom Omni haptic 3D pointing device. This prototype is published for download (approximately 120 MB) as freeware at http://www.alexandra.dk/ves/index.htm. With increasing personal computer performance, future versions may include enhanced resolution (up to 8,000 voxels/mm3) and realistic interaction with deformable soft-tissue components such as skin, tympanic membrane, dura, and cholesteatomas, features some of which are not possible with computed tomographic-/magnetic resonance imaging-based systems.
Integrative approaches to computational biomedicine
Coveney, Peter V.; Diaz-Zuccarini, Vanessa; Graf, Norbert; Hunter, Peter; Kohl, Peter; Tegner, Jesper; Viceconti, Marco
2013-01-01
The new discipline of computational biomedicine is concerned with the application of computer-based techniques and particularly modelling and simulation to human health. Since 2007, this discipline has been synonymous, in Europe, with the name given to the European Union's ambitious investment in integrating these techniques with the eventual aim of modelling the human body as a whole: the virtual physiological human. This programme and its successors are expected, over the next decades, to transform the study and practice of healthcare, moving it towards the priorities known as ‘4P's’: predictive, preventative, personalized and participatory medicine.
Deformable torso phantoms of Chinese adults for personalized anatomy modelling.
Wang, Hongkai; Sun, Xiaobang; Wu, Tongning; Li, Congsheng; Chen, Zhonghua; Liao, Meiying; Li, Mengci; Yan, Wen; Huang, Hui; Yang, Jia; Tan, Ziyu; Hui, Libo; Liu, Yue; Pan, Hang; Qu, Yue; Chen, Zhaofeng; Tan, Liwen; Yu, Lijuan; Shi, Hongcheng; Huo, Li; Zhang, Yanjun; Tang, Xin; Zhang, Shaoxiang; Liu, Changjian
2018-04-16
In recent years, there has been increasing demand for personalized anatomy modelling for medical and industrial applications, such as ergonomics device development, clinical radiological exposure simulation, biomechanics analysis, and 3D animation character design. In this study, we constructed deformable torso phantoms that can be deformed to match the personal anatomy of Chinese male and female adults. The phantoms were created based on a training set of 79 trunk computed tomography (CT) images (41 males and 38 females) from normal Chinese subjects. Major torso organs were segmented from the CT images, and the statistical shape model (SSM) approach was used to learn the inter-subject anatomical variations. To match the personal anatomy, the phantoms were registered to individual body surface scans or medical images using the active shape model method. The constructed SSM demonstrated anatomical variations in body height, fat quantity, respiratory status, organ geometry, male muscle size, and female breast size. The masses of the deformed phantom organs were consistent with Chinese population organ mass ranges. To validate the performance of personal anatomy modelling, the phantoms were registered to the body surface scan and CT images. The registration accuracy measured from 22 test CT images showed a median Dice coefficient over 0.85, a median volume recovery coefficient (RCvlm) between 0.85 and 1.1, and a median averaged surface distance (ASD) < 1.5 mm. We hope these phantoms can serve as computational tools for personalized anatomy modelling for the research community. © 2018 Anatomical Society.
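The overlap metrics quoted above are standard and easy to compute; a minimal sketch for binary segmentations represented as sets of voxel indices follows (the averaged surface distance is omitted, since it requires surface geometry):

```python
def dice(a: set, b: set) -> float:
    """Dice similarity coefficient: 2*|A & B| / (|A| + |B|); 1.0 = perfect."""
    if not a and not b:
        return 1.0  # two empty masks agree perfectly by convention
    return 2 * len(a & b) / (len(a) + len(b))

def volume_recovery(registered: set, reference: set) -> float:
    """Ratio of registered organ volume to reference volume (ideal: 1.0)."""
    return len(registered) / len(reference)

# Toy example: two organ masks sharing 2 of their 3 voxels each.
a, b = {1, 2, 3}, {2, 3, 4}
assert abs(dice(a, b) - 2 / 3) < 1e-12
assert volume_recovery(a, b) == 1.0
```

In practice the sets would come from flattening two voxel masks of equal shape; the arithmetic is unchanged.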
Web-Based Real Time Earthquake Forecasting and Personal Risk Management
NASA Astrophysics Data System (ADS)
Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.
2012-12-01
Earthquake forecasts have been computed by a variety of countries and economies world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. One example is the Working Group on California Earthquake Probabilities that has been responsible for the official California earthquake forecast since 1988. However, in a time of increasingly severe global financial constraints, we are now moving inexorably towards personal risk management, wherein mitigating risk is becoming the responsibility of individual members of the public. Under these circumstances, open access to a variety of web-based tools, utilities and information is a necessity. Here we describe a web-based system that has been operational since 2009 at www.openhazards.com and www.quakesim.org. Models for earthquake physics and forecasting require input data, along with model parameters. The models we consider are the Natural Time Weibull (NTW) model for regional earthquake forecasting, together with models for activation and quiescence. These models use small earthquakes ("seismicity-based models") to forecast the occurrence of large earthquakes, either through varying rates of small earthquake activity, or via an accumulation of this activity over time. These approaches use data-mining algorithms combined with the ANSS earthquake catalog. The basic idea is to compute large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Each of these approaches has computational challenges associated with computing forecast information in real time. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we show that real-time forecasting is possible at a grid scale of 0.1°.
We show how the Reliability and ROC tests allow us to judge data completeness and estimate error. It is clear from much of the analysis that data quality is a major limitation on the accurate computation of earthquake probabilities. We discuss the challenges and pitfalls in serving up these datasets over the web.
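The core idea described above, a large-earthquake probability that grows with the count of small earthquakes accumulated since the last large event, can be sketched with a Weibull distribution in "natural time". The parameter values below are placeholders for illustration, not fitted NTW values:

```python
import math

def large_quake_probability(n_small: int, n0: float = 500.0,
                            beta: float = 1.4) -> float:
    """Weibull CDF in natural time: probability that a large earthquake
    has occurred by the time n_small small events have accumulated since
    the last large one. n0 is the characteristic small-event count and
    beta the shape parameter; both are illustrative, not published fits."""
    return 1.0 - math.exp(-((n_small / n0) ** beta))

# Probability grows monotonically with accumulated small-event activity.
probs = [large_quake_probability(n) for n in (0, 250, 500, 1000)]
assert probs == sorted(probs) and probs[0] == 0.0
```

At n_small = n0 the probability is 1 - 1/e, about 0.63, the usual Weibull scale-point interpretation.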
ERIC Educational Resources Information Center
Naemi, Bobby; Seybert, Jacob; Robbins, Steven; Kyllonen, Patrick
2014-01-01
This report introduces the "WorkFORCE"™ Assessment for Job Fit, a personality assessment utilizing the "FACETS"™ core capability, which is based on innovations in forced-choice assessment and computer adaptive testing. The instrument is derived from the five-factor model (FFM) of personality and encompasses a broad spectrum of…
Peretz, Chava; Korczyn, Amos D; Shatil, Evelyn; Aharonson, Vered; Birnboim, Smadar; Giladi, Nir
2011-01-01
Many studies have suggested that cognitive training can result in cognitive gains in healthy older adults. We investigated whether personalized computerized cognitive training provides greater benefits than those obtained by playing conventional computer games. This was a randomized double-blind interventional study. Self-referred healthy older adults (n = 155, 68 ± 7 years old) were assigned to either a personalized, computerized cognitive training or to a computer games group. Cognitive performance was assessed at baseline and after 3 months by a neuropsychological assessment battery. Differences in cognitive performance scores between and within groups were evaluated using mixed effects models in 2 approaches: adherence only (AO; n = 121) and intention to treat (ITT; n = 155). Both groups improved in cognitive performance. The improvement in the personalized cognitive training group was significant (p < 0.03, AO and ITT approaches) in all 8 cognitive domains. However, in the computer games group it was significant (p < 0.05) in only 4 (AO) or 6 domains (ITT). In the AO analysis, personalized cognitive training was significantly more effective than playing games in improving visuospatial working memory (p = 0.0001), visuospatial learning (p = 0.0012) and focused attention (p = 0.0019). Personalized, computerized cognitive training appears to be more effective than computer games in improving cognitive performance in healthy older adults. Further studies are needed to evaluate the ecological validity of these findings. Copyright © 2011 S. Karger AG, Basel.
ERIC Educational Resources Information Center
Sim, KwongNui; Butson, Russell
2014-01-01
This scoping study examines the degree to which twenty-two undergraduate students used their personal computers to support their academic study. The students were selected based on their responses to a questionnaire aimed at gauging their degree of computer skill. Computer activity data were harvested from the personal computers of eighteen…
Definitions of database files and fields of the Personal Computer-Based Water Data Sources Directory
Green, J. Wayne
1991-01-01
This report describes the data-base files and fields of the personal computer-based Water Data Sources Directory (WDSD). The personal computer-based WDSD was derived from the U.S. Geological Survey (USGS) mainframe computer version. The mainframe version of the WDSD is a hierarchical data-base design. The personal computer-based WDSD is a relational data-base design. This report describes the data-base files and fields of the relational data-base design in dBASE IV (the use of brand names in this abstract is for identification purposes only and does not constitute endorsement by the U.S. Geological Survey) for the personal computer. The WDSD contains information on (1) the type of organization, (2) the major orientation of water-data activities conducted by each organization, (3) the names, addresses, and telephone numbers of offices within each organization from which water data may be obtained, (4) the types of data held by each organization and the geographic locations within which these data have been collected, (5) alternative sources of an organization's data, (6) the designation of liaison personnel in matters related to water-data acquisition and indexing, (7) the volume of water data indexed for the organization, and (8) information about other types of data and services available from the organization that are pertinent to water-resources activities.
Role of Statistical Random-Effects Linear Models in Personalized Medicine.
Diaz, Francisco J; Yeh, Hung-Wen; de Leon, Jose
2012-03-01
Some empirical studies and recent developments in pharmacokinetic theory suggest that statistical random-effects linear models are valuable tools for simultaneously describing patient populations as a whole and patients as individuals. This remarkable characteristic indicates that these models may be useful in the development of personalized medicine, which aims at finding treatment regimes that are appropriate for particular patients, not just for the average patient. In fact, published developments show that random-effects linear models may provide a solid theoretical framework for drug dosage individualization in chronic diseases. In particular, individualized dosages computed with these models by means of an empirical Bayesian approach may produce better results than dosages computed with some methods routinely used in therapeutic drug monitoring. This is further supported by published empirical and theoretical findings showing that random-effects linear models may provide accurate representations of phase III and IV steady-state pharmacokinetic data and may be useful for dosage computations. These models have applications in the design of clinical algorithms for drug dosage individualization in chronic diseases; the computation of dose correction factors; the computation of the minimum number of blood samples from a patient needed to calculate an optimal individualized drug dosage in therapeutic drug monitoring; the measurement of the clinical importance of clinical, demographic, environmental, or genetic covariates; the study of drug-drug interactions in clinical settings; the implementation of computational tools for web-site-based evidence farming; the design of pharmacogenomic studies; and the development of a pharmacological theory of dosage individualization.
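The shrinkage behind such empirical-Bayes individualization can be illustrated for the simplest random-effects model, y_ij = mu + b_i + e_ij: the estimate of a patient's true mean pulls the patient's own sample mean toward the population mean by a weight set by the variance components. This is a textbook sketch, not the dosage algorithm of the studies cited above:

```python
def eb_patient_mean(patient_obs, pop_mean, var_between, var_within):
    """Empirical-Bayes (BLUP) estimate of one patient's true mean response
    under y_ij = mu + b_i + e_ij, with b_i ~ N(0, var_between) and
    e_ij ~ N(0, var_within)."""
    n = len(patient_obs)
    patient_mean = sum(patient_obs) / n
    # Shrinkage weight: tends to 1 as the patient contributes more data or
    # as patients differ more; tends to 0 when measurement noise dominates.
    w = n * var_between / (n * var_between + var_within)
    return w * patient_mean + (1 - w) * pop_mean

# Equal variances, two observations: the patient's data get weight 2/3.
est = eb_patient_mean([10.0, 12.0], pop_mean=8.0, var_between=1.0, var_within=1.0)
assert abs(est - (2 / 3 * 11.0 + 1 / 3 * 8.0)) < 1e-12
```

In dosage individualization the same shrinkage estimate of the patient-specific parameter would feed a dose formula; here only the estimation step is shown.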
Shen, Wenfeng; Wei, Daming; Xu, Weimin; Zhu, Xin; Yuan, Shizhong
2010-10-01
Biological computations like electrocardiological modelling and simulation usually require high-performance computing environments. This paper introduces an implementation of parallel computation for computer simulation of electrocardiograms (ECGs) in a personal computer environment with an Intel Core(TM) 2 Quad Q6600 CPU and a GeForce 8800GT GPU, with software support from OpenMP and CUDA. It was tested in three parallelization device setups: (a) a four-core CPU without a general-purpose GPU, (b) a general-purpose GPU plus one core of the CPU, and (c) a four-core CPU plus a general-purpose GPU. To effectively take advantage of a multi-core CPU and a general-purpose GPU, an algorithm based on load-prediction dynamic scheduling was developed and applied to setup (c). In a simulation with 1600 time steps, the speedup of the parallel computation as compared to the serial computation was 3.9 in setup (a), 16.8 in setup (b), and 20.0 in setup (c). This study demonstrates that a current PC with a multi-core CPU and a general-purpose GPU provides a good environment for parallel computations in biological modelling and simulation studies. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
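The idea behind load-prediction dynamic scheduling, partitioning each batch of work in proportion to each device's predicted throughput so the CPU and GPU finish at roughly the same time, can be sketched as follows. This is a generic sketch; the rates and the partitioning rule are illustrative, not the paper's algorithm:

```python
def split_steps(total: int, cpu_rate: float, gpu_rate: float):
    """Split `total` work units between CPU and GPU in proportion to their
    predicted throughputs (units/s), so both should finish together."""
    gpu_units = round(total * gpu_rate / (cpu_rate + gpu_rate))
    return total - gpu_units, gpu_units

# With a GPU predicted 3x faster than the CPU, it receives 3/4 of the work,
# so both devices take the same wall-clock time (25/1.0 == 75/3.0).
cpu, gpu = split_steps(100, cpu_rate=1.0, gpu_rate=3.0)
assert (cpu, gpu) == (25, 75)
```

A dynamic scheduler would re-measure the rates after each batch and recompute the split, adapting to changing load.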
VISUAL PLUMES CONCEPTS TO POTENTIALLY ADAPT OR ADOPT IN MODELING PLATFORMS SUCH AS VISJET
Windows-based programs share many familiar features and components. For example, file dialogue windows are familiar to most Windows-based personal computer users. Such program elements are desirable because the user is already familiar with how they function, obviating the need f...
Emura, Takeshi; Nakatochi, Masahiro; Matsui, Shigeyuki; Michimae, Hirofumi; Rondeau, Virginie
2017-01-01
Developing a personalized risk prediction model of death is fundamental for improving patient care and touches on the realm of personalized medicine. The increasing availability of genomic information and large-scale meta-analytic data sets for clinicians has motivated the extension of traditional survival prediction based on the Cox proportional hazards model. The aim of our paper is to develop a personalized risk prediction formula for death according to genetic factors and dynamic tumour progression status based on meta-analytic data. To this end, we extend the existing joint frailty-copula model to a model allowing for high-dimensional genetic factors. In addition, we propose a dynamic prediction formula to predict death given tumour progression events possibly occurring after treatment or surgery. For clinical use, we implement the computation software of the prediction formula in the joint.Cox R package. We also develop a tool to validate the performance of the prediction formula by assessing the prediction error. We illustrate the method with the meta-analysis of individual patient data on ovarian cancer patients.
Theory and Programs for Dynamic Modeling of Tree Rings from Climate
Paul C. van Deusen; Jennifer Koretz
1988-01-01
Computer programs written in GAUSS(TM) for IBM compatible personal computers are described that perform dynamic tree ring modeling with climate data; the underlying theory is also described. The programs and a separate users manual are available from the authors, although users must have the GAUSS software package on their personal computer. An example application of...
A user-oriented and computerized model for estimating vehicle ride quality
NASA Technical Reports Server (NTRS)
Leatherwood, J. D.; Barker, L. M.
1984-01-01
A simplified empirical model and computer program for estimating passenger ride comfort within air and surface transportation systems are described. The model is based on subjective ratings from more than 3000 persons who were exposed to controlled combinations of noise and vibration in the passenger ride quality apparatus. This model has the capability of transforming individual elements of a vehicle's noise and vibration environment into subjective discomfort units and then combining the subjective units to produce a single discomfort index typifying passenger acceptance of the environment. The computational procedures required to obtain discomfort estimates are discussed, and a user-oriented ride comfort computer program is described. Examples illustrating application of the simplified model to helicopter and automobile ride environments are presented.
Personalized mitral valve closure computation and uncertainty analysis from 3D echocardiography.
Grbic, Sasa; Easley, Thomas F; Mansi, Tommaso; Bloodworth, Charles H; Pierce, Eric L; Voigt, Ingmar; Neumann, Dominik; Krebs, Julian; Yuh, David D; Jensen, Morten O; Comaniciu, Dorin; Yoganathan, Ajit P
2017-01-01
Intervention planning is essential for successful Mitral Valve (MV) repair procedures. Finite-element models (FEM) of the MV could be used to achieve this goal, but the translation to the clinical domain is challenging. Many input parameters for the FEM models, such as tissue properties, are not known. In addition, only simplified MV geometry models can be extracted from non-invasive modalities such as echocardiography imaging, lacking major anatomical details such as the complex chordae topology. A traditional approach for FEM computation is to use a simplified model (also known as a parachute model) of the chordae topology, which connects the papillary muscle tips to the free edges and selected basal points. Building on the existing parachute model, a new and comprehensive MV model was developed that utilizes a novel chordae representation capable of approximating regional connectivity. In addition, a fully automated personalization approach was developed for the chordae rest length, removing the need for tedious manual parameter selection. Based on the MV model extracted during mid-diastole (open MV), the MV geometric configuration at peak systole (closed MV) was computed according to the FEM model. In this work the focus was placed on validating the MV closure computation. The method is evaluated on ten in vitro ovine cases, where in addition to echocardiography imaging, high-resolution μCT imaging is available for accurate validation. Copyright © 2016 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Lai, K. Robert; Lan, Chung Hsien
2006-01-01
This work presents a novel method for modeling collaborative learning as multi-issue agent negotiation using fuzzy constraints. Agent negotiation is an iterative process, through which, the proposed method aggregates student marks to reduce personal bias. In the framework, students define individual fuzzy membership functions based on their…
A Computational Model of Learners Achievement Emotions Using Control-Value Theory
ERIC Educational Resources Information Center
Muñoz, Karla; Noguez, Julieta; Neri, Luis; Mc Kevitt, Paul; Lunney, Tom
2016-01-01
Game-based Learning (GBL) environments make instruction flexible and interactive. Positive experiences depend on personalization. Student modelling has focused on affect. Three methods are used: (1) recognizing the physiological effects of emotion, (2) reasoning about emotion from its origin and (3) an approach combining 1 and 2. These have proven…
The Individual Virtual Eye: a Computer Model for Advanced Intraocular Lens Calculation
Einighammer, Jens; Oltrup, Theo; Bende, Thomas; Jean, Benedikt
2010-01-01
Purpose To describe the individual virtual eye, a computer model of a human eye with respect to its optical properties. It is based on measurements of an individual person, and one of its major applications is calculating intraocular lenses (IOLs) for cataract surgery. Methods The model is constructed from an eye's geometry, including axial length and topographic measurements of the anterior corneal surface. All optical components of a pseudophakic eye are modeled with computational methods. A spline-based interpolation method efficiently includes data from corneal topographic measurements. The geometrical optical properties, such as the wavefront aberration, are simulated with real ray-tracing using Snell's law. Optical components can be calculated using computational optimization procedures. The geometry of customized aspheric IOLs was calculated for 32 eyes and the resulting wavefront aberration was investigated. Results The more complex the calculated IOL is, the lower the residual wavefront error is. Spherical IOLs are only able to correct for the defocus, while toric IOLs also eliminate astigmatism. Spherical aberration is additionally reduced by aspheric and toric aspheric IOLs. The efficient implementation of time-critical numerical ray-tracing and optimization procedures allows for short calculation times, which may lead to a practicable method integrated in some device. Conclusions The individual virtual eye allows for simulations and calculations regarding geometrical optics for individual persons. This leads to clinical applications like IOL calculation, with the potential to overcome the limitations of those current calculation methods that are based on paraxial optics, as shown here by calculating customized aspheric IOLs.
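Real ray-tracing with Snell's law, as used here, rests on the vector refraction formula. The sketch below is a generic textbook implementation, not code from the paper; the function name and conventions (unit direction and normal, with the normal pointing against the incident ray) are assumptions.

```python
import math

def refract(d, n, n1, n2):
    """Refract unit direction d at a surface with unit normal n (pointing
    against the incoming ray), using Snell's law in vector form."""
    r = n1 / n2
    cos_i = -(d[0]*n[0] + d[1]*n[1] + d[2]*n[2])
    sin_t2 = r*r * (1.0 - cos_i*cos_i)
    if sin_t2 > 1.0:
        return None  # total internal reflection: no refracted ray
    cos_t = math.sqrt(1.0 - sin_t2)
    k = r * cos_i - cos_t
    return tuple(r*d[i] + k*n[i] for i in range(3))
```

At normal incidence the ray passes straight through, and for a dense-to-rare transition beyond the critical angle the function reports total internal reflection.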
COSP - A computer model of cyclic oxidation
NASA Technical Reports Server (NTRS)
Lowell, Carl E.; Barrett, Charles A.; Palmer, Raymond W.; Auping, Judith V.; Probst, Hubert B.
1991-01-01
A computer model useful in predicting the cyclic oxidation behavior of alloys is presented. The model considers the oxygen uptake due to scale formation during the heating cycle and the loss of oxide due to spalling during the cooling cycle. The balance between scale formation and scale loss is modeled and used to predict weight change and metal loss kinetics. A simple uniform spalling model is compared to a more complex random spall site model. In nearly all cases, the simpler uniform spall model gave predictions as accurate as the more complex model. The model has been applied to several nickel-base alloys which, depending upon composition, form Al2O3 or Cr2O3 during oxidation. The model has been validated by several experimental approaches. Versions of the model that run on a personal computer are available.
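The balance between parabolic scale growth and uniform spalling can be illustrated with a toy iteration. This is not the actual COSP code; kp, q, and dt are hypothetical parameters. Each cycle the retained oxide thickens parabolically, then a fixed fraction spalls on cool-down, driving the retained oxide weight toward a steady state.

```python
import math

def cyclic_oxidation(cycles, kp=0.01, q=0.05, dt=1.0):
    """Toy uniform-spall model: return the retained-oxide weight after
    each cycle (kp: parabolic rate constant, q: spall fraction)."""
    t_eff = 0.0      # effective oxidation time of the retained scale
    history = []
    for _ in range(cycles):
        t_eff += dt
        w_oxide = math.sqrt(kp * t_eff)   # parabolic scale growth
        w_oxide *= (1.0 - q)              # uniform spall on cool-down
        t_eff = w_oxide**2 / kp           # time equivalent of retained scale
        history.append(w_oxide)
    return history
```

With these illustrative constants the retained oxide weight rises monotonically and levels off where growth per cycle equals spall loss; the full COSP model additionally tracks metal loss and supports random spall sites.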
Towards Open-World Person Re-Identification by One-Shot Group-Based Verification.
Zheng, Wei-Shi; Gong, Shaogang; Xiang, Tao
2016-03-01
Solving the problem of matching people across non-overlapping multi-camera views, known as person re-identification (re-id), has received increasing interest in computer vision. In a real-world application scenario, a watch-list (gallery set) of a handful of known target people is provided with very few (in many cases only a single) image(s) (shots) per target. Existing re-id methods are largely unsuitable for addressing this open-world re-id challenge because they are designed for (1) a closed-world scenario where the gallery and probe sets are assumed to contain exactly the same people, (2) person-wise identification whereby the model attempts to verify exhaustively against each individual in the gallery set, and (3) learning a matching model using multi-shots. In this paper, a novel transfer local relative distance comparison (t-LRDC) model is formulated to address the open-world person re-identification problem by one-shot group-based verification. The model is designed to mine and transfer useful information from a labelled open-world non-target dataset. Extensive experiments demonstrate that the proposed approach outperforms both non-transfer learning and existing transfer learning based re-id methods.
Valence-Dependent Belief Updating: Computational Validation
Kuzmanovic, Bojana; Rigoux, Lionel
2017-01-01
People tend to update beliefs about their future outcomes in a valence-dependent way: they are likely to incorporate good news and to neglect bad news. However, belief formation is a complex process which depends not only on motivational factors such as the desire for favorable conclusions, but also on multiple cognitive variables such as prior beliefs, knowledge about personal vulnerabilities and resources, and the size of the probabilities and estimation errors. Thus, we applied computational modeling in order to test for valence-induced biases in updating while formally controlling for relevant cognitive factors. We compared biased and unbiased Bayesian models of belief updating, and specified alternative models based on reinforcement learning. The experiment consisted of 80 trials with 80 different adverse future life events. In each trial, participants estimated the base rate of one of these events and estimated their own risk of experiencing the event before and after being confronted with the actual base rate. Belief updates corresponded to the difference between the two self-risk estimates. Valence-dependent updating was assessed by comparing trials with good news (better-than-expected base rates) with trials with bad news (worse-than-expected base rates). After receiving bad relative to good news, participants' updates were smaller and deviated more strongly from rational Bayesian predictions, indicating a valence-induced bias. Model comparison revealed that the biased (i.e., optimistic) Bayesian model of belief updating better accounted for data than the unbiased (i.e., rational) Bayesian model, confirming that the valence of the new information influenced the amount of updating. Moreover, alternative computational modeling based on reinforcement learning demonstrated higher learning rates for good than for bad news, as well as a moderating role of personal knowledge. 
Finally, in this specific experimental context, the approach based on reinforcement learning was superior to the Bayesian approach. The computational validation of valence-dependent belief updating represents a novel support for a genuine optimism bias in human belief formation. Moreover, the precise control of relevant cognitive variables justifies the conclusion that the motivation to adopt the most favorable self-referential conclusions biases human judgments. PMID:28706499
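The reinforcement-learning account with valence-dependent learning rates can be sketched as a single update rule: the estimation error is scaled by a larger learning rate for good news than for bad news. The learning-rate values below are illustrative, not the fitted values from the study.

```python
# Minimal sketch of valence-dependent belief updating: the self-risk
# estimate moves toward the observed base rate, faster for good news
# (base rate lower than feared) than for bad news. lr_good and lr_bad
# are hypothetical parameters.

def update_belief(prior_risk, base_rate, lr_good=0.7, lr_bad=0.3):
    """Return the new self-risk estimate after seeing the base rate."""
    error = base_rate - prior_risk
    good_news = base_rate < prior_risk   # lower risk than feared
    lr = lr_good if good_news else lr_bad
    return prior_risk + lr * error

good = update_belief(prior_risk=40.0, base_rate=20.0)  # good news
bad = update_belief(prior_risk=40.0, base_rate=60.0)   # bad news
```

For the same 20-point estimation error, the update after good news is larger in magnitude than after bad news, which is the asymmetry the model comparison detected.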
An, Gary; Bartels, John; Vodovotz, Yoram
2011-03-01
The clinical translation of promising basic biomedical findings, whether derived from reductionist studies in academic laboratories or as the product of extensive high-throughput and -content screens in the biotechnology and pharmaceutical industries, has reached a period of stagnation in which ever higher research and development costs are yielding ever fewer new drugs. Systems biology and computational modeling have been touted as potential avenues by which to break through this logjam. However, few mechanistic computational approaches are utilized in a manner that is fully cognizant of the inherent clinical realities in which the drugs developed through this ostensibly rational process will be ultimately used. In this article, we present a Translational Systems Biology approach to inflammation. This approach is based on the use of mechanistic computational modeling centered on inherent clinical applicability, namely that a unified suite of models can be applied to generate in silico clinical trials, individualized computational models as tools for personalized medicine, and rational drug and device design based on disease mechanism.
ERIC Educational Resources Information Center
Krus, David J.; And Others
This paper describes a test which attempts to measure a group of personality traits by analyzing the actual behavior of the participant in a computer-simulated game. ECHO evolved from an extension and computerization of Horstein and Deutsch's allocation game. The computerized version of ECHO requires subjects to make decisions about the allocation…
ERIC Educational Resources Information Center
Liew, Tze Wei; Tan, Su-Mae; Seydali, Rouzbeh
2014-01-01
In this article, the effects of personalized narration in multimedia learning on learners' computer perceptions and task-related attitudes were examined. Twenty-six field independent and 22 field dependent participants studied the computer-based multimedia lessons on C-Programming, either with personalized narration or non-personalized narration.…
Climate@Home: Crowdsourcing Climate Change Research
NASA Astrophysics Data System (ADS)
Xu, C.; Yang, C.; Li, J.; Sun, M.; Bambacus, M.
2011-12-01
Climate change deeply impacts human wellbeing. Significant amounts of resources have been invested in building super-computers that are capable of running advanced climate models, which help scientists understand climate change mechanisms, and predict its trend. Although climate change influences all human beings, the general public is largely excluded from the research. On the other hand, scientists are eagerly seeking communication mediums for effectively enlightening the public on climate change and its consequences. The Climate@Home project is devoted to connect the two ends with an innovative solution: crowdsourcing climate computing to the general public by harvesting volunteered computing resources from the participants. A distributed web-based computing platform will be built to support climate computing, and the general public can 'plug in' their personal computers to participate in the research. People contribute the spare computing power of their computers to run a computer model, which is used by scientists to predict climate change. Traditionally, only super-computers could handle such a large processing load. By orchestrating massive amounts of personal computers to perform atomized data processing tasks, investments in new super-computers, energy consumed by super-computers, and carbon release from super-computers are reduced. Meanwhile, the platform forms a social network of climate researchers and the general public, which may be leveraged to raise climate awareness among the participants. A portal is to be built as the gateway to the climate@home project. Three types of roles and the corresponding functionalities are designed and supported. The end users include the citizen participants, climate scientists, and project managers. Citizen participants connect their computing resources to the platform by downloading and installing a computing engine on their personal computers. Computer climate models are defined at the server side.
Climate scientists configure computer model parameters through the portal user interface. After model configuration, scientists then launch the computing task. Next, data is atomized and distributed to computing engines that are running on citizen participants' computers. Scientists will receive notifications on the completion of computing tasks, and examine modeling results via visualization modules of the portal. Computing tasks, computing resources, and participants are managed by project managers via portal tools. A portal prototype has been built for proof of concept. Three forums have been set up for different groups of users to share information on the science aspect, technology aspect, and educational outreach aspect. A Facebook account has been set up to distribute messages via the most popular social networking platform. New threads are synchronized from the forums to Facebook. A mapping tool displays geographic locations of the participants and the status of tasks on each client node. A group of users have been invited to test functions such as forums, blogs, and computing resource monitoring.
Qualification and Approval of Personal Computer-Based Aviation Training Devices
DOT National Transportation Integrated Search
1997-05-12
This Advisory Circular (AC) provides information and guidance to potential training device manufacturers and aviation training consumers concerning a means, acceptable to the Administrator, by which personal computer-based aviation training devices (...
Role of Statistical Random-Effects Linear Models in Personalized Medicine
Diaz, Francisco J; Yeh, Hung-Wen; de Leon, Jose
2012-01-01
Some empirical studies and recent developments in pharmacokinetic theory suggest that statistical random-effects linear models are valuable tools that allow describing simultaneously patient populations as a whole and patients as individuals. This remarkable characteristic indicates that these models may be useful in the development of personalized medicine, which aims at finding treatment regimes that are appropriate for particular patients, not just appropriate for the average patient. In fact, published developments show that random-effects linear models may provide a solid theoretical framework for drug dosage individualization in chronic diseases. In particular, individualized dosages computed with these models by means of an empirical Bayesian approach may produce better results than dosages computed with some methods routinely used in therapeutic drug monitoring. This is further supported by published empirical and theoretical findings that show that random effects linear models may provide accurate representations of phase III and IV steady-state pharmacokinetic data, and may be useful for dosage computations. These models have applications in the design of clinical algorithms for drug dosage individualization in chronic diseases; in the computation of dose correction factors; computation of the minimum number of blood samples from a patient that are necessary for calculating an optimal individualized drug dosage in therapeutic drug monitoring; measure of the clinical importance of clinical, demographic, environmental or genetic covariates; study of drug-drug interactions in clinical settings; the implementation of computational tools for web-site-based evidence farming; design of pharmacogenomic studies; and in the development of a pharmacological theory of dosage individualization. PMID:23467392
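The empirical Bayesian dosage individualization described above relies on shrinkage: a patient-level estimate is a variance-weighted compromise between the patient's own data and the population mean. Below is a minimal sketch for a random-intercept model with illustrative numbers, not values or code from the paper.

```python
# Empirical Bayes (BLUP-style) estimate for a random-intercept linear
# model: the patient mean is shrunk toward the population mean, with
# weight set by the between-patient (tau2) and within-patient (sigma2)
# variances. All numbers are hypothetical.

def eb_estimate(y_patient, pop_mean, tau2, sigma2):
    """Return the shrinkage estimate of a patient-level mean."""
    n = len(y_patient)
    ybar = sum(y_patient) / n
    w = tau2 / (tau2 + sigma2 / n)   # shrinkage weight in [0, 1)
    return pop_mean + w * (ybar - pop_mean)

est = eb_estimate([12.0, 14.0, 13.0], pop_mean=10.0, tau2=4.0, sigma2=2.0)
```

As more blood samples accrue (larger n), the weight w approaches 1 and the estimate approaches the patient's own mean, which is why these models can quantify the minimum number of samples needed for a reliable individualized dosage.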
Chien, Tsair-Wei; Shao, Yang; Kuo, Shu-Chun
2017-01-10
Many continuous item responses (CIRs) are encountered in healthcare settings, but no one uses item response theory's (IRT) probabilistic modeling to present graphical presentations for interpreting CIR results. A computer module that is programmed to deal with CIRs is required. Our aims were to present a computer module, validate it, and verify its usefulness in dealing with CIR data, and then to apply the model to real healthcare data in order to show how the CIR model can be applied in healthcare settings, with an example regarding a safety attitude survey. Using Microsoft Excel VBA (Visual Basic for Applications), we designed a computer module that minimizes the residuals and calculates the model's expected scores according to person responses across items. Rasch models based on a Wright map and on KIDMAP were demonstrated to interpret results of the safety attitude survey. The author-made CIR module yielded OUTFIT mean square (MNSQ) and person measures equivalent to those yielded by the professional Rasch Winsteps software. The probabilistic modeling of the CIR module provides messages that are much more valuable to users and shows the advantage of CIR over classical test theory. Because of advances in computer technology, healthcare users who are familiar with MS Excel can easily apply the study's CIR module to deal with continuous variables, benefiting from comparisons of data with a logistic distribution and from model fit statistics.
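The core computation, finding the person measure whose model-expected scores minimize the residuals against observed continuous responses, can be illustrated as follows. This is a toy re-implementation in Python, not the authors' Excel VBA module; the logistic expected-score function and the grid search are assumptions.

```python
import math

def expected(theta, delta):
    """Model-expected continuous score in [0, 1] for person measure
    theta and item difficulty delta (logistic form, an assumption)."""
    return 1.0 / (1.0 + math.exp(-(theta - delta)))

def person_measure(responses, difficulties, lo=-4.0, hi=4.0, step=0.01):
    """Grid-search the theta minimizing the sum of squared residuals."""
    best_theta, best_ss = lo, float("inf")
    theta = lo
    while theta <= hi:
        ss = sum((x - expected(theta, d)) ** 2
                 for x, d in zip(responses, difficulties))
        if ss < best_ss:
            best_theta, best_ss = theta, ss
        theta += step
    return best_theta

# Responses generated from theta = 1.0 should be recovered closely.
diffs = [-1.0, 0.0, 1.0]
resp = [expected(1.0, d) for d in diffs]
theta_hat = person_measure(resp, diffs)
```

Residual-based fit statistics such as OUTFIT MNSQ can then be computed from the same residuals at the estimated theta.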
Model for disease dynamics of a waterborne pathogen on a random network.
Li, Meili; Ma, Junling; van den Driessche, P
2015-10-01
A network epidemic SIWR model for cholera and other diseases that can be transmitted via the environment is developed and analyzed. The person-to-person contacts are modeled by a random contact network, and the contagious environment is modeled by an external node that connects to every individual. The model is adapted from the Miller network SIR model, and in the homogeneous mixing limit becomes the Tien and Earn deterministic cholera model without births and deaths. The dynamics of our model shows excellent agreement with stochastic simulations. The basic reproduction number R0 is computed, and on a Poisson network shown to be the sum of the basic reproduction numbers of the person-to-person and person-to-water-to-person transmission pathways. However, on other networks, R0 depends nonlinearly on the transmission along the two pathways. Type reproduction numbers are computed and quantify measures to control the disease. Equations giving the final epidemic size are obtained.
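The Poisson-network decomposition quoted above, R0 = R0_person + R0_water, makes control calculations transparent: the fraction by which waterborne transmission must be cut to push R0 below 1 follows directly, in the spirit of a type reproduction number. The sketch below illustrates this arithmetic with made-up numbers; it is not the paper's exact formulation.

```python
# Illustrative arithmetic for the additive R0 decomposition on a
# Poisson contact network. Component values are hypothetical.

def r0_total(r0_person, r0_water):
    """Poisson-network decomposition quoted in the abstract."""
    return r0_person + r0_water

def water_control_needed(r0_person, r0_water):
    """Fraction by which the water pathway must be reduced to bring the
    total R0 below 1; None if person-to-person spread alone suffices
    to sustain the epidemic."""
    if r0_person >= 1.0:
        return None
    return max(0.0, 1.0 - (1.0 - r0_person) / r0_water)

r0 = r0_total(0.8, 0.9)               # above 1: an epidemic is possible
cut = water_control_needed(0.8, 0.9)  # required reduction of water pathway
```

On non-Poisson networks, where R0 depends nonlinearly on the two pathways, this simple additive calculation no longer applies.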
ERIC Educational Resources Information Center
Fenton, Ginger D.; LaBorde, Luke F.; Radhakrishna, Rama B.; Brown, J. Lynne; Cutter, Catherine N.
2006-01-01
Computer-based training is increasingly favored by food companies for training workers due to convenience, self-pacing ability, and ease of use. The objectives of this study were to determine if personal hygiene training, offered through a computer-based method, is as effective as a face-to-face method in knowledge acquisition and improved…
Constructing Scientific Arguments Using Evidence from Dynamic Computational Climate Models
NASA Astrophysics Data System (ADS)
Pallant, Amy; Lee, Hee-Sun
2015-04-01
Modeling and argumentation are two important scientific practices students need to develop throughout their school years. In this paper, we investigated how middle and high school students (N = 512) construct a scientific argument based on evidence from computational models with which they simulated climate change. We designed scientific argumentation tasks with three increasingly complex dynamic climate models. Each scientific argumentation task consisted of four parts: multiple-choice claim, open-ended explanation, five-point Likert scale uncertainty rating, and open-ended uncertainty rationale. We coded 1,294 scientific arguments in terms of a claim's consistency with current scientific consensus and whether explanations were model-based or knowledge-based, and categorized the sources of uncertainty (personal vs. scientific). We used chi-square and ANOVA tests to identify significant patterns. Results indicate that (1) a majority of students incorporated models as evidence to support their claims, (2) most students used model output results shown on graphs to confirm their claim rather than to explain simulated molecular processes, (3) students' dependence on model results and their uncertainty rating diminished as the dynamic climate models became more and more complex, (4) some students' misconceptions interfered with observing and interpreting model results or simulated processes, and (5) students' uncertainty sources reflected more frequently their assessment of personal knowledge or abilities related to the tasks than their critical examination of scientific evidence resulting from models. These findings have implications for teaching and research related to the integration of scientific argumentation and modeling practices to address complex Earth systems.
Computer-Based Script Training for Aphasia: Emerging Themes from Post-Treatment Interviews
ERIC Educational Resources Information Center
Cherney, Leora R.; Halper, Anita S.; Kaye, Rosalind C.
2011-01-01
This study presents results of post-treatment interviews following computer-based script training for persons with chronic aphasia. Each of the 23 participants received 9 weeks of AphasiaScripts training. Post-treatment interviews were conducted with the person with aphasia and/or a significant other person. The 23 interviews yielded 584 coded…
Requirements for benchmarking personal image retrieval systems
NASA Astrophysics Data System (ADS)
Bouguet, Jean-Yves; Dulong, Carole; Kozintsev, Igor; Wu, Yi
2006-01-01
It is now common to have accumulated tens of thousands of personal pictures. Efficient access to that many pictures can only be done with a robust image retrieval system. This application is of high interest to Intel processor architects. It is highly compute intensive, and could motivate end users to upgrade their personal computers to the next generations of processors. A key question is how to assess the robustness of a personal image retrieval system. Personal image databases are very different from the digital libraries that have been used by many content-based image retrieval systems [1]. For example, a personal image database has a lot of pictures of people, but a small set of different people, typically family, relatives, and friends. Pictures are taken in a limited set of places like home, work, school, and vacation destinations. The most frequent queries are searches for people and for places. These attributes, and many others, affect how a personal image retrieval system should be benchmarked, and benchmarks need to be different from existing ones based on art images or medical images, for example. The attributes of the data set do not change the list of components needed for the benchmarking of such systems as specified in [2]: data sets, query tasks, ground truth, evaluation measures, and benchmarking events. This paper proposes a way to build these components to be representative of personal image databases, and of the corresponding usage models.
A personal computer-based nuclear magnetic resonance spectrometer
NASA Astrophysics Data System (ADS)
Job, Constantin; Pearson, Robert M.; Brown, Michael F.
1994-11-01
Nuclear magnetic resonance (NMR) spectroscopy using personal computer-based hardware has the potential of enabling the application of NMR methods to fields where conventional state of the art equipment is either impractical or too costly. With such a strategy for data acquisition and processing, disciplines including civil engineering, agriculture, geology, archaeology, and others have the possibility of utilizing magnetic resonance techniques within the laboratory or conducting applications directly in the field. Another aspect is the possibility of utilizing existing NMR magnets which may be in good condition but unused because of outdated or nonrepairable electronics. Moreover, NMR applications based on personal computer technology may open up teaching possibilities at the college or even secondary school level. The goal of developing such a personal computer (PC)-based NMR standard is facilitated by existing technologies including logic cell arrays, direct digital frequency synthesis, use of PC-based electrical engineering software tools to fabricate electronic circuits, and the use of permanent magnets based on neodymium-iron-boron alloy. Utilizing such an approach, we have been able to place essentially an entire NMR spectrometer console on two printed circuit boards, with the exception of the receiver and radio frequency power amplifier. Future upgrades to include the deuterium lock and the decoupler unit are readily envisioned. The continued development of such PC-based NMR spectrometers is expected to benefit from the fast growing, practical, and low cost personal computer market.
Automated method for structural segmentation of nasal airways based on cone beam computed tomography
NASA Astrophysics Data System (ADS)
Tymkovych, Maksym Yu.; Avrunin, Oleg G.; Paliy, Victor G.; Filzow, Maksim; Gryshkov, Oleksandr; Glasmacher, Birgit; Omiotek, Zbigniew; DzierŻak, RóŻa; Smailova, Saule; Kozbekova, Ainur
2017-08-01
The work is dedicated to the problem of segmenting human nasal airways using cone beam computed tomography. We propose a specialized approach to the structural segmentation of nasal airways that uses spatial information and symmetrisation of the structures. The proposed stages can be used to construct a virtual three-dimensional model of the nasal airways and to produce full-scale personalized atlases. We built such a virtual model of the nasal airways, which can be used for constructing specialized medical atlases and for aerodynamics research.
An E-learning System based on Affective Computing
NASA Astrophysics Data System (ADS)
Duo, Sun; Song, Lu Xue
In recent years, e-learning has become very popular. But current e-learning systems cannot instruct students effectively because they do not consider the emotional state in the context of instruction. The theory of "affective computing" can address this problem: it extends the computer's intelligence beyond the purely cognitive. In this paper, we construct an emotionally intelligent e-learning system based on affective computing. A dimensional model is put forward to recognize and analyze the student's emotional state, and a virtual teacher's avatar is offered to regulate the student's learning psychology, with a teaching style chosen according to the student's personality traits. A "man-to-man" learning environment is built to simulate traditional classroom pedagogy in the system.
Bryant, Stephanie J; Vernerey, Franck J
2018-01-01
Biomimetic and biodegradable synthetic hydrogels are emerging as a promising platform for cell encapsulation and tissue engineering. Notably, synthetic-based hydrogels offer highly programmable macroscopic properties (e.g., mechanical, swelling and transport properties) and degradation profiles through control over several tunable parameters (e.g., the initial network structure, degradation kinetics and behavior, and polymer properties). One component to success is the ability to maintain structural integrity as the hydrogel transitions to neo-tissue. This seamless transition is complicated by the fact that cellular activity is highly variable among donors. Thus, computational models provide an important tool in tissue engineering due to their unique ability to explore the coupled processes of hydrogel degradation and neo-tissue growth across multiple length scales. In addition, such models provide new opportunities to develop predictive computational tools to overcome the challenges with designing hydrogels for different donors. In this report, programmable properties of synthetic-based hydrogels and their relation to the hydrogel's structural properties and their evolution with degradation are reviewed. This is followed by recent progress on the development of computational models that describe hydrogel degradation with neo-tissue growth when cells are encapsulated in a hydrogel. Finally, the potential for predictive models to enable patient-specific hydrogel designs for personalized tissue engineering is discussed. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
A self-taught artificial agent for multi-physics computational model personalization.
Neumann, Dominik; Mansi, Tommaso; Itu, Lucian; Georgescu, Bogdan; Kayvanpour, Elham; Sedaghat-Hamedani, Farbod; Amr, Ali; Haas, Jan; Katus, Hugo; Meder, Benjamin; Steidl, Stefan; Hornegger, Joachim; Comaniciu, Dorin
2016-12-01
Personalization is the process of fitting a model to patient data, a critical step towards the application of multi-physics computational models in clinical practice. Designing robust personalization algorithms is often a tedious, time-consuming, model- and data-specific process. We propose to use artificial intelligence concepts to learn this task, inspired by how human experts perform it manually. The problem is reformulated in terms of reinforcement learning. In an off-line phase, Vito, our self-taught artificial agent, learns a representative decision process model through exploration of the computational model: it learns how the model behaves under change of parameters. The agent then automatically learns an optimal strategy for on-line personalization. The algorithm is model-independent; applying it to a new model requires only adjusting a few hyper-parameters of the agent and defining the observations to match. Full knowledge of the model itself is not required. Vito was tested in a synthetic scenario, showing that it could learn how to optimize cost functions generically. Vito was then applied to the inverse problem of cardiac electrophysiology and to the personalization of a whole-body circulation model. The results suggested that Vito could achieve equivalent, if not better, goodness of fit than standard methods, while being more robust (up to 11% higher success rates) and converging faster (up to seven times). Our artificial intelligence approach could thus make personalization algorithms generalizable and self-adaptable to any patient and any model. Copyright © 2016. Published by Elsevier B.V.
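The off-line/on-line split described above can be illustrated with a deliberately toy sketch (this shows only the general idea, not the authors' Vito agent; the forward model and parameter grid are hypothetical):

```python
# Illustrative sketch: an off-line phase explores a toy computational
# model over its parameter space; an on-line phase personalizes by
# choosing the explored parameter whose simulated output best matches
# the patient's measured data. Not the paper's actual algorithm.

def model(theta):
    return 2.0 * theta + 1.0   # stand-in for a multi-physics model

# Off-line: learn how the model behaves under change of parameters.
table = [(i / 10.0, model(i / 10.0)) for i in range(0, 51)]

def personalize(measurement):
    # On-line: pick the explored parameter minimizing the misfit.
    return min(table, key=lambda pair: abs(pair[1] - measurement))[0]

theta_fit = personalize(measurement=4.2)  # true parameter is 1.6
```

A reinforcement-learning agent replaces the exhaustive table with a learned strategy for navigating parameter space, which matters when each forward-model evaluation is expensive.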
Computational Modeling of Inflammation and Wound Healing
Ziraldo, Cordelia; Mi, Qi; An, Gary; Vodovotz, Yoram
2013-01-01
Objective Inflammation is both central to proper wound healing and a key driver of chronic tissue injury via a positive-feedback loop incited by incidental cell damage. We seek to derive actionable insights into the role of inflammation in wound healing in order to improve outcomes for individual patients. Approach To date, dynamic computational models have been used to study the time evolution of inflammation in wound healing. Emerging clinical data on histo-pathological and macroscopic images of evolving wounds, as well as noninvasive measures of blood flow, suggested the need for tissue-realistic, agent-based, and hybrid mechanistic computational simulations of inflammation and wound healing. Innovation We developed a computational modeling system, Simple Platform for Agent-based Representation of Knowledge, to facilitate the construction of tissue-realistic models. Results A hybrid equation–agent-based model (ABM) of pressure ulcer formation in both spinal cord-injured and -uninjured patients was used to identify control points that reduce stress caused by tissue ischemia/reperfusion. An ABM of arterial restenosis revealed new dynamics of cell migration during neointimal hyperplasia that match histological features, but contradict the currently prevailing mechanistic hypothesis. ABMs of vocal fold inflammation were used to predict inflammatory trajectories in individuals, possibly allowing for personalized treatment. Conclusions The intertwined inflammatory and wound healing responses can be modeled computationally to make predictions in individuals, simulate therapies, and gain mechanistic insights. PMID:24527362
GAS eleven node thermal model (GEM)
NASA Technical Reports Server (NTRS)
Butler, Dan
1988-01-01
The Eleven Node Thermal Model (GEM) of the Get Away Special (GAS) container was originally developed based on the results of thermal tests of the GAS container. The model was then used in the thermal analysis and design of several NASA/GSFC GAS experiments, including the Flight Verification Payload, the Ultraviolet Experiment, and the Capillary Pumped Loop. The model description details the five cu ft container both with and without an insulated end cap. Mass specific heat values are also given so that transient analyses can be performed. A sample problem for each configuration is included as well so that GEM users can verify their computations. The model can be run on most personal computers with a thermal analyzer solution routine.
NASA Astrophysics Data System (ADS)
Cleves, Ann E.; Jain, Ajay N.
2008-03-01
Inductive bias is the set of assumptions that a person or procedure makes in making a prediction based on data. Different methods for ligand-based predictive modeling have different inductive biases, with a particularly sharp contrast between 2D and 3D similarity methods. A unique aspect of ligand design is that the data that exist to test methodology have been largely man-made, and that this process of design involves prediction. By analyzing the molecular similarities of known drugs, we show that the inductive bias of the historic drug discovery process has a very strong 2D bias. In studying the performance of ligand-based modeling methods, it is critical to account for this issue in dataset preparation, use of computational controls, and in the interpretation of results. We propose specific strategies to explicitly address the problems posed by inductive bias considerations.
An, Gary; Bartels, John; Vodovotz, Yoram
2011-01-01
The clinical translation of promising basic biomedical findings, whether derived from reductionist studies in academic laboratories or from extensive high-throughput and high-content screens in the biotechnology and pharmaceutical industries, has reached a period of stagnation in which ever higher research and development costs are yielding ever fewer new drugs. Systems biology and computational modeling have been touted as potential avenues by which to break through this logjam. However, few mechanistic computational approaches are utilized in a manner that is fully cognizant of the inherent clinical realities in which the drugs developed through this ostensibly rational process will ultimately be used. In this article, we present a Translational Systems Biology approach to inflammation. This approach is based on the use of mechanistic computational modeling centered on inherent clinical applicability, namely that a unified suite of models can be applied to generate in silico clinical trials, individualized computational models as tools for personalized medicine, and rational drug and device design based on disease mechanism. PMID:21552346
Psychopathy-related traits and the use of reward and social information: a computational approach
Brazil, Inti A.; Hunt, Laurence T.; Bulten, Berend H.; Kessels, Roy P. C.; de Bruijn, Ellen R. A.; Mars, Rogier B.
2013-01-01
Psychopathy is often linked to disturbed reinforcement-guided adaptation of behavior in both clinical and non-clinical populations. Recent work suggests that these disturbances might be due to a deficit in actively using information to guide changes in behavior. However, how much information is actually used to guide behavior is difficult to observe directly. Therefore, we used a computational model to estimate the use of information during learning. Thirty-six female subjects were recruited based on their total scores on the Psychopathic Personality Inventory (PPI), a self-report psychopathy list, and performed a task involving simultaneous learning of reward-based and social information. A Bayesian reinforcement-learning model was used to parameterize the use of each source of information during learning. Subsequently, we used the subscales of the PPI to assess psychopathy-related traits, and the traits that were strongly related to the model's parameters were isolated through a formal variable selection procedure. Finally, we assessed how these covaried with model parameters. We succeeded in isolating key personality traits believed to be relevant for psychopathy that can be related to model-based descriptions of subject behavior. Use of reward-history information was negatively related to levels of trait anxiety and fearlessness, whereas use of social advice decreased as the perceived ability to manipulate others and lack of anxiety increased. These results corroborate previous findings suggesting that sub-optimal use of different types of information might be implicated in psychopathy. They also further highlight the importance of considering the potential of computational modeling to understand the role of latent variables, such as the weight people give to various sources of information during goal-directed behavior, when conducting research on psychopathy-related traits and in the field of forensic psychiatry. PMID:24391615
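A minimal sketch of the kind of model described above (not the authors' exact Bayesian formulation; the weights and learning rate are hypothetical parameters of the sort a model fit would estimate per subject): reward-history information is tracked by a delta-rule update, and choice values combine that estimate with social advice, each weighted by a use-of-information parameter.

```python
# Sketch only: learning from two information sources, reward history
# and social advice, each weighted by a parameter quantifying how much
# that source is used. Parameter values below are illustrative.

def choice_value(reward_estimate, advice, w_reward, w_social):
    """Combine reward-history information and social advice."""
    return w_reward * reward_estimate + w_social * advice

def update_reward_estimate(estimate, outcome, learning_rate=0.3):
    """Delta-rule update of the reward estimate from feedback."""
    return estimate + learning_rate * (outcome - estimate)

# Example trial sequence: advice says "good" (1.0), outcomes are mixed.
estimate = 0.5
for outcome in [1, 0, 1, 1]:
    estimate = update_reward_estimate(estimate, outcome)
value = choice_value(estimate, advice=1.0, w_reward=0.7, w_social=0.3)
```

Fitting the per-subject weights to observed choices is what lets such a model relate "use of information" to trait scores such as the PPI subscales.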
Fernández Peruchena, Carlos M; Prado-Velasco, Manuel
2010-01-01
Diabetes mellitus (DM) has a growing incidence and prevalence in modern societies, pushed by aging and changing lifestyles. Despite the huge resources dedicated to improving quality of life, mortality, and morbidity rates, these remain very poor. In this work, DM pathology is reviewed from clinical and metabolic points of view, as are mathematical models related to DM, with the aim of justifying an evolution of DM therapies towards correction of the physiological metabolic loops involved. We analyze the reliability of mathematical models, from the perspective of virtual physiological human (VPH) initiatives, for generating and integrating the customized knowledge about patients needed for that evolution. Wearable smart sensors play a key role in this framework, as they provide the patient's information to the models. A telehealthcare computational architecture is presented, based on distributed smart sensors (first processing layer) and personalized physiological mathematical models integrated in Human Physiological Images (HPI) computational components (second processing layer). This technology was designed for renal disease telehealthcare in earlier works and promotes crossroads between smart sensors and the VPH initiative. We suggest that it is able to support a truly personalized, preventive, and predictive healthcare model for the delivery of evolved DM therapies. PMID:21625646
Kannan, Srimathi; Schulz, Amy; Israel, Barbara; Ayra, Indira; Weir, Sheryl; Dvonch, Timothy J.; Rowe, Zachary; Miller, Patricia; Benjamin, Alison
2008-01-01
Background Computer tailoring and personalizing recommendations for dietary health-promoting behaviors are in accordance with community-based participatory research (CBPR) principles, which emphasize research that benefits the participants and community involved. Objective To describe the CBPR process utilized to computer-generate and disseminate personalized nutrition feedback reports (NFRs) for Detroit Healthy Environments Partnership (HEP) study participants. Methods The CBPR process included discussion and feedback from HEP partners on several draft personalized reports. The nutrition feedback process included defining the feedback objectives; prioritizing the nutrients; customizing the report design; reviewing and revising the NFR template and readability; producing and disseminating the report; and participant follow-up. Lessons Learned Application of CBPR principles in designing the NFR resulted in a reader-friendly product with useful recommendations to promote heart health. Conclusions A CBPR process can enhance computer tailoring of personalized NFRs to address racial and socioeconomic disparities in cardiovascular disease (CVD). PMID:19337572
Computer Administering of the Psychological Investigations: Set-Relational Representation
NASA Astrophysics Data System (ADS)
Yordzhev, Krasimir
Computer administering of a psychological investigation is the computer representation of the entire procedure of psychological assessment: test construction, test implementation, results evaluation, storage and maintenance of the resulting database, and its statistical processing, analysis, and interpretation. This article discusses a mathematical description of psychological assessment with the aid of personality tests, using set theory and relational algebra. A relational model of the data needed to design a computer system for automating certain psychological assessments is given, and the finite sets and relations on them that are necessary for creating a personality psychological test are described. The described model could be used to develop real software for computer administering of any psychological test, with full automation of the whole process. A software project for computer administering of personality psychological tests is suggested.
An efficient and scalable deformable model for virtual reality-based medical applications.
Choi, Kup-Sze; Sun, Hanqiu; Heng, Pheng-Ann
2004-09-01
Modeling of tissue deformation is of great importance to virtual reality (VR)-based medical simulations. Considerable effort has been dedicated to the development of interactively deformable virtual tissues. In this paper, an efficient and scalable deformable model is presented for virtual-reality-based medical applications. It considers deformation as a localized force transmittal process which is governed by algorithms based on breadth-first search (BFS). The computational speed is scalable to facilitate real-time interaction by adjusting the penetration depth. Simulated annealing (SA) algorithms are developed to optimize the model parameters by using the reference data generated with the linear static finite element method (FEM). The mechanical behavior and timing performance of the model have been evaluated. The model has been applied to simulate the typical behavior of living tissues and anisotropic materials. Integration with a haptic device has also been achieved on a generic personal computer (PC) platform. The proposed technique provides a feasible solution for VR-based medical simulations and has the potential for multi-user collaborative work in virtual environment.
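The breadth-first force-transmittal idea described above can be sketched as follows (an illustration of the general scheme, not the authors' code; the mesh, decay factor, and depth cap are hypothetical): a displacement applied at the contacted node is attenuated and propagated outward layer by layer, and the penetration depth caps how many layers deform, which is what makes the computation scalable.

```python
# Sketch: BFS-limited propagation of a contact displacement over a
# mesh adjacency structure. Depth caps the deformed region.
from collections import deque

def propagate(adjacency, contact, displacement, depth, decay=0.5):
    """Return per-node displacement after BFS-limited propagation."""
    disp = {contact: displacement}
    frontier, level = deque([contact]), {contact: 0}
    while frontier:
        node = frontier.popleft()
        if level[node] >= depth:
            continue  # scalability: stop at the penetration depth
        for nbr in adjacency[node]:
            if nbr not in level:
                level[nbr] = level[node] + 1
                disp[nbr] = disp[node] * decay
                frontier.append(nbr)
    return disp

# Toy mesh: a chain of five nodes, poked at node 0, depth 2.
mesh = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
d = propagate(mesh, contact=0, displacement=1.0, depth=2)
```

Raising `depth` trades interactivity for accuracy, which is the scalability knob the abstract refers to.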
Applications of Artificial Intelligence in Education--A Personal View.
ERIC Educational Resources Information Center
Richer, Mark H.
1985-01-01
Discusses: how artificial intelligence (AI) can advance education; if the future of software lies in AI; the roots of intelligent computer-assisted instruction; protocol analysis; reactive environments; LOGO programming language; student modeling and coaching; and knowledge-based instructional programs. Numerous examples of AI programs are cited.…
Chiêm, Jean-Christophe; Van Durme, Thérèse; Vandendorpe, Florence; Schmitz, Olivier; Speybroeck, Niko; Cès, Sophie; Macq, Jean
2014-08-01
Various elderly case management projects have been implemented in Belgium. This type of long-term health care intervention involves contextual factors and human interactions. The underlying complex mechanisms can be usefully informed by field experts' knowledge, which is hard to make explicit. However, computer simulation has been suggested as one possible method of overcoming the difficulty of articulating such elicited qualitative views. A simulation model of case management was designed using an agent-based methodology, based on the initial qualitative research material. Variables and rules of interaction were formulated into a simple conceptual framework. This model was implemented and used as support for a structured discussion with experts in case management. The rigorous formulation provided by the agent-based methodology clarified the descriptions of the interventions and the problems encountered regarding: the diverse network topologies of health care actors in the project; the adaptation time required by the intervention; the communication between the health care actors; the institutional context; the organization of the care; and the role of the case manager and his or her personal ability to interpret the informal demands of the frail older person. The simulation model should be seen primarily as a tool for thinking and learning. A number of insights were gained as part of a valuable cognitive process. Computer simulation supporting field experts' elicitation can lead to better-informed decisions in the organization of complex health care interventions. © 2013 John Wiley & Sons, Ltd.
DOT National Transportation Integrated Search
2007-08-01
This research was conducted to develop and test a personal computer-based study procedure (PCSP) with secondary task loading for use in human factors laboratory experiments in lieu of a driving simulator to test reading time and understanding of traf...
Personalization through the Application of Inverse Bayes to Student Modeling
ERIC Educational Resources Information Center
Lang, Charles William McLeod
2015-01-01
Personalization, the idea that teaching can be tailored to each student's needs, has been a goal of the educational enterprise for at least 2,500 years (Regian, Shute, & Shute, 2013, p.2). Recently personalization has picked up speed with the advent of mobile computing, the Internet, and increases in computer processing power. These changes…
Jiang-Jun, Zhou; Min, Zhao; Ya-Bo, Yan; Wei, Lei; Ren-Fa, Lv; Zhi-Yu, Zhu; Rong-Jian, Chen; Wei-Tao, Yu; Cheng-Fei, Du
2014-03-01
Finite element analysis was used to compare preoperative and postoperative stress distributions in a bone-healing model of femur fracture, to determine whether the broken ends of the fractured bone would refracture after fixation dislodgement one year after intramedullary nailing. Methods: Using fast, personalized imaging, bone-healing models of femur fracture were constructed from multi-slice spiral computed tomography data using the Mimics, Geomagic Studio, and Abaqus software packages. The intramedullary pin was removed by Boolean operations before fixation was dislodged. Loads were applied to each model to simulate a person standing on one leg. The von Mises stress distribution, the maximum stress, and its location were observed. Results: According to 10 display groups based on material assignment, the nodes of maximum and minimum von Mises stress were the same before and after dislodgement, and all nodes of maximum von Mises stress were outside the fracture line. The maximum von Mises stress node was situated at the bottom quarter of the femur. The von Mises stress distribution was identical before and after surgery. Conclusion: Fast, personalized model establishment can simulate fixation dislodgement before operation, and personalized finite element analysis successfully predicted whether nail dislodgement would disrupt the femur fracture.
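For reference, the von Mises stress reported by such analyses is computed from the six Cauchy stress components by the standard formula, sketched here:

```python
# Standard von Mises equivalent stress from the Cauchy stress
# components (normal stresses sx, sy, sz; shear stresses txy, tyz, tzx).
import math

def von_mises(sx, sy, sz, txy, tyz, tzx):
    return math.sqrt(0.5 * ((sx - sy) ** 2 + (sy - sz) ** 2
                            + (sz - sx) ** 2)
                     + 3.0 * (txy ** 2 + tyz ** 2 + tzx ** 2))

# Sanity check: uniaxial tension reduces to the applied stress itself.
sigma = von_mises(100.0, 0.0, 0.0, 0.0, 0.0, 0.0)  # 100.0
```

A finite element package evaluates this at every node; the analysis above then compares the location and magnitude of the maximum against the fracture line.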
Biehler, J; Wall, W A
2018-02-01
If computational models are ever to be used in high-stakes decision making in clinical practice, the use of personalized models and predictive simulation techniques is a must. This entails rigorous quantification of uncertainties as well as harnessing available patient-specific data to the greatest extent possible. Although researchers are beginning to realize that taking uncertainty in model input parameters into account is a necessity, the predominantly used probabilistic description for these uncertain parameters is based on elementary random variable models. In this work, we set out for a comparison of different probabilistic models for uncertain input parameters using the example of an uncertain wall thickness in finite element models of abdominal aortic aneurysms. We provide the first comparison between a random variable and a random field model for the aortic wall and investigate the impact on the probability distribution of the computed peak wall stress. Moreover, we show that the uncertainty about the prevailing peak wall stress can be reduced if noninvasively available, patient-specific data are harnessed for the construction of the probabilistic wall thickness model. Copyright © 2017 John Wiley & Sons, Ltd.
1980-01-01
Key words: Carbon Monoxide (CO), Computer Program, Carboxyhemoglobin. The report describes a computer program, developed from an empirical equation proposed by several researchers (derived from reference 1 and detailed in reference 3), which predicts the instantaneous amount of carboxyhemoglobin (COHb) in the blood of a person based upon the amount of carbon monoxide inhaled.
NASA Astrophysics Data System (ADS)
Kuznetsov, P. G.; Tverdokhlebov, S. I.; Goreninskii, S. I.; Bolbasov, E. N.; Popkov, A. V.; Kulbakin, D. E.; Grigoryev, E. G.; Cherdyntseva, N. V.; Choinzonov, E. L.
2017-09-01
The present work demonstrates the possibility of producing personalized implants from bioresorbable polymers designed for the replacement of bone defects. The stages of creating a personalized implant are described: obtaining a 3D model from a computed tomogram, refining the model with respect to the shape of the bone fitment bore using Autodesk Meshmixer software, and 3D printing from bioresorbable polymers. The results of implanting bioresorbable polymer scaffolds in pre-clinical tests on laboratory animals are shown. The biological properties of new bioresorbable polymers based on poly(lactic acid) were studied during subcutaneous, intramuscular, bone, and intraosseous implantation in laboratory animals. In all cases, no fibrous capsule formed around the bioresorbable polymer over time. The study also yielded conclusions on osteogenesis intensity depending on the initial state of the bone tissue.
Emancipative Educational Technology.
ERIC Educational Resources Information Center
Boyd, Gary M.
1996-01-01
Presents a theoretical systems model for computer-mediated conferencing. Discusses Habermas' criteria for emancipative discourse (i.e., liberation or emancipation is increasing a person's abilities and opportunities to make rational choices about matters important to that person). Demonstrates why computer conferencing is best suited to…
A simple node and conductor data generator for SINDA
NASA Technical Reports Server (NTRS)
Gottula, Ronald R.
1992-01-01
This paper presents a simple, automated method to generate NODE and CONDUCTOR DATA for thermal math models. The method uses personal computer spreadsheets to create SINDA inputs. It was developed to make SINDA modeling less time consuming and serves as an alternative to graphical methods. Anyone with some experience using a personal computer can easily implement this process. The user develops spreadsheets to automatically calculate capacitances and conductances based on material properties and dimensional data. The necessary node and conductor information is then taken from the spreadsheets and automatically arranged into the proper format, ready for insertion directly into the SINDA model. This technique provides a number of benefits to the SINDA user, such as a reduction in the number of hand calculations and the ability to very quickly generate a parametric set of NODE and CONDUCTOR DATA blocks. It also provides advantages over graphical thermal modeling systems by retaining the analyst's complete visibility into the thermal network and by permitting user comments anywhere within the DATA blocks.
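The kind of calculation such spreadsheets automate can be sketched with the standard lumped-parameter relations (nodal capacitance C = rho * V * cp, conduction conductance G = k * A / L); the material values, dimensions, and output layout below are hypothetical, not the paper's actual format:

```python
# Sketch: compute lumped thermal capacitance and conductance and
# emit SINDA-style data lines. Dimensions and IDs are illustrative.

def node_line(node_id, rho, volume, cp):
    capacitance = rho * volume * cp          # J/K
    return f"{node_id}, 20.0, {capacitance:.4f}"

def conductor_line(cond_id, n1, n2, k, area, length):
    conductance = k * area / length          # W/K
    return f"{cond_id}, {n1}, {n2}, {conductance:.4f}"

# An aluminum plate node and a conduction path between nodes 1 and 2.
print(node_line(1, 2700.0, 1e-5, 900.0))
print(conductor_line(10, 1, 2, 167.0, 1e-4, 0.05))
```

Parametric sweeps then reduce to regenerating these lines from a changed dimension column, which is the time saving the paper describes.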
ERIC Educational Resources Information Center
Moran, Mark; Hawkes, Mark; El Gayar, Omar
2010-01-01
Many educational institutions have implemented ubiquitous or required laptop, notebook, or tablet personal computing programs for their students. Yet, limited evidence exists to validate integration and acceptance of the technology among student populations. This research examines student acceptance of mobile computing devices using a modification…
Sato, Emi; Matsuda, Kouhei
2018-06-11
The purpose of this study was to examine cerebral blood flow in the frontal cortex during personality self-rating tasks. Our two hypotheses were that (1) cerebral blood flow varies with the rating condition and (2) cerebral blood flow varies with personality traits. The experiment measured cerebral blood flow under three personal computer rating conditions and two questionnaire conditions. Comparing the rating conditions, a t-test indicated that cerebral blood flow was higher in the questionnaire conditions than in the personal computer conditions. With respect to the Big Five, correlation analysis showed that cerebral blood flow during the personality rating task varied with the trait of agreeableness. A five-cluster analysis of individual differences indicated that certain personality traits were related to factors that increased or decreased cerebral blood flow. An analysis of variance showed significant effects of openness to experience and Behavioural Activation System drive; participants with high intellectual curiosity were motivated in this experiment, so their cerebral blood flow may have increased. The significance of this experiment is that, by employing performance measures, we could examine differences in physical changes based on personality traits. © 2018 International Union of Psychological Science.
BIBLIO: A Computer System Designed to Support the Near-Library User Model of Information Retrieval.
ERIC Educational Resources Information Center
Belew, Richard K.; Holland, Maurita Peterson
1988-01-01
Description of the development of the Information Exchange Facility, a prototype microcomputer-based personal bibliographic facility, covers software selection, user selection, overview of the system, and evaluation. The plan for an integrated system, BIBLIO, and the future role of libraries are discussed. (eight references) (MES)
Three-Dimensional Space to Assess Cloud Interoperability
2013-03-01
...collection of network-enabled services that guarantees to provide a scalable, easily accessible, reliable, and personalized computing infrastructure, based on... are used in research to describe cloud models, such as SaaS (Software as a Service), PaaS (Platform as a Service), and IaaS (Infrastructure as a Service).
Jacobs, Robin J; Caballero, Joshua; Ownby, Raymond L; Kane, Michael N
2014-11-30
Low health literacy is associated with poor medication adherence in persons with human immunodeficiency virus (HIV), which can lead to poor health outcomes. As linguistic minorities, Spanish-dominant Hispanics (SDH) face challenges such as difficulties in obtaining and understanding accurate information about HIV and its treatment. Traditional health education methods (e.g., pamphlets, talking) may not be as effective as delivery through alternate venues. Technology-based health information interventions have the potential to be readily available on desktop computers or over the Internet. The purpose of this research was to adapt a theoretically based computer application (initially developed for English-speaking HIV-positive persons) that provides linguistically and culturally appropriate tailored health education to Spanish-dominant Hispanics with HIV (HIV+ SDH). A mixed-methods approach using quantitative and qualitative interviews with 25 HIV+ SDH persons and 5 key informants, guided by the Information-Motivation-Behavioral Skills (IMB) model, was used to investigate cultural factors influencing medication adherence in HIV+ SDH persons. We used a triangulation approach to identify major themes within cultural contexts relevant to understanding factors related to motivation to adhere to treatment. From these data we adapted an automated computer-based health literacy intervention to be delivered in Spanish. Culture-specific motivational factors for treatment adherence in HIV+ SDH persons that emerged from the data were stigma, familismo (family), mood, and social support. Using these data, we developed a culturally and linguistically adapted, tailored intervention that provides information about HIV infection, treatment, and medication-related problem-solving skills (proven effective in English-speaking populations) and that can be delivered using touch-screen computers, tablets, and smartphones, to be tested in a future study.
Using a theoretically-grounded Internet-based eHealth education intervention that builds on knowledge and also targets core cultural determinants of adherence may prove a highly effective approach to improve health literacy and medication decision-making in this group.
92. VIEW OF CHART RECORDERS AND PERSONAL COMPUTER LINING NORTHEAST CORNER OF AUTOPILOT ROOM - Vandenberg Air Force Base, Space Launch Complex 3, Launch Operations Building, Napa & Alden Roads, Lompoc, Santa Barbara County, CA
Computationally modeling interpersonal trust.
Lee, Jin Joo; Knox, W Bradley; Wormwood, Jolie B; Breazeal, Cynthia; Desteno, David
2013-01-01
We present a computational model capable of predicting, with above-human accuracy, the degree of trust a person has toward a novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identified nonverbal cues that signal untrustworthy behavior and demonstrated the human mind's readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior human-subjects experiments, when incorporated into the feature-engineering process, permits a computational model to outperform both human predictions and a baseline model built in naiveté of this domain knowledge. We then present the construction of hidden Markov models to investigate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derive sequence-based temporal features that further improve the accuracy of our computational model. The multi-step research process presented in this paper combines the strengths of experimental manipulation and machine learning not only to design a computational trust model but also to further our understanding of the dynamics of interpersonal trust.
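The core idea of scoring cue sequences under trust-level-specific hidden Markov models can be sketched as follows. This is a minimal illustration, not the paper's model: the cue vocabulary, state count, and all probabilities below are invented, and the paper's actual parameters are learned from interaction data.

```python
# Sketch: score a sequence of discretized nonverbal cues under two HMMs
# (one per trust level) and choose the more likely model. All numbers
# here are illustrative placeholders, not values from the paper.

def forward_likelihood(obs, start, trans, emit):
    """Forward algorithm in plain probabilities (fine for short sequences)."""
    alpha = [start[s] * emit[s][obs[0]] for s in range(len(start))]
    for o in obs[1:]:
        alpha = [
            sum(alpha[sp] * trans[sp][s] for sp in range(len(start))) * emit[s][o]
            for s in range(len(start))
        ]
    return sum(alpha)

# Two hypothetical 2-state models over 3 cue symbols
# (0 = lean back, 1 = face touch, 2 = hand gesture).
high_trust = dict(start=[0.6, 0.4],
                  trans=[[0.7, 0.3], [0.4, 0.6]],
                  emit=[[0.5, 0.2, 0.3], [0.3, 0.3, 0.4]])
low_trust = dict(start=[0.3, 0.7],
                 trans=[[0.5, 0.5], [0.2, 0.8]],
                 emit=[[0.2, 0.6, 0.2], [0.1, 0.7, 0.2]])

def classify(obs):
    """Label a cue sequence with whichever trust model explains it better."""
    p_hi = forward_likelihood(obs, **high_trust)
    p_lo = forward_likelihood(obs, **low_trust)
    return "high trust" if p_hi > p_lo else "low trust"
```

In the paper's setting, the learned transition structures of such models are then inspected to derive the sequence-based temporal features mentioned above.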
Normalized distance aggregation of discriminative features for person reidentification
NASA Astrophysics Data System (ADS)
Hou, Li; Han, Kang; Wan, Wanggen; Hwang, Jenq-Neng; Yao, Haiyan
2018-03-01
We propose an effective person reidentification method based on normalized distance aggregation of discriminative features. Our framework is built on the integration of three high-performance discriminative feature extraction models, including local maximal occurrence (LOMO), feature fusion net (FFN), and a concatenation of LOMO and FFN called LOMO-FFN, through two fast and discriminant metric learning models, i.e., cross-view quadratic discriminant analysis (XQDA) and large-scale similarity learning (LSSL). More specifically, we first represent all the cross-view person images using LOMO, FFN, and LOMO-FFN, respectively, and then apply each extracted feature representation to train XQDA and LSSL, respectively, to obtain the optimized individual cross-view distance metric. Finally, cross-view person matching is computed as the sum of the optimized individual cross-view distances after min-max normalization. Experimental results show the effectiveness of the proposed algorithm on three challenging datasets (VIPeR, PRID450s, and CUHK01).
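The final fusion step described above (min-max normalizing each metric's distances, then summing) can be sketched as follows. The feature/metric names and distance values are illustrative only; in the actual method each list would come from one trained model pairing (e.g., LOMO + XQDA).

```python
# Sketch of distance-level fusion by min-max normalization:
# each feature/metric pairing produces one distance per gallery candidate;
# distances are rescaled to [0, 1] per pairing and summed element-wise.

def min_max_normalize(dists):
    lo, hi = min(dists), max(dists)
    if hi == lo:                       # degenerate case: all equal
        return [0.0 for _ in dists]
    return [(d - lo) / (hi - lo) for d in dists]

def aggregate(distance_lists):
    """Sum min-max-normalized distance lists element-wise."""
    normalized = [min_max_normalize(d) for d in distance_lists]
    return [sum(col) for col in zip(*normalized)]

# Hypothetical distances from two model/metric pairs for four candidates.
lomo_xqda = [2.0, 5.0, 3.0, 8.0]
ffn_lssl = [0.1, 0.9, 0.2, 0.5]
fused = aggregate([lomo_xqda, ffn_lssl])
best = min(range(len(fused)), key=fused.__getitem__)  # best-matching index
```

Normalizing before summing keeps any one metric's distance scale from dominating the aggregate ranking.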
The Physician's Workstation: Recording a Physical Examination Using a Controlled Vocabulary
Cimino, James J.; Barnett, G. Octo
1987-01-01
A system has been developed which runs on MS-DOS personal computers and serves as an experimental model of a physician's workstation. The program provides an interface to a controlled vocabulary which allows rapid selection of appropriate terms and modifiers for entry of clinical information. Because it captures patient descriptions, it has the ability to serve as an intermediary between the physician and computer-based medical knowledge resources. At present, the vocabulary permits rapid, reliable representation of cardiac physical examination findings.
Nair, Pradeep S; John, Eugene B
2007-01-01
Aligning specific sequences against a very large number of other sequences is a central task in bioinformatics. With the widespread availability of personal computers in biology laboratories, sequence alignment is now often performed locally. This makes it necessary to analyse the performance of personal computers on sequence-alignment bioinformatics benchmarks. In this paper, we analyse the performance of a personal computer on the popular BLAST and FASTA sequence alignment suites. Results indicate that these benchmarks have a large number of recurring operations and use memory operations extensively, and suggest that performance could be improved with a larger L1 cache.
Neural simulations on multi-core architectures.
Eichner, Hubert; Klug, Tobias; Borst, Alexander
2009-01-01
Neuroscience is witnessing increasing knowledge about the anatomy and electrophysiological properties of neurons and their connectivity, leading to an ever increasing computational complexity of neural simulations. At the same time, a rather radical change in personal computer technology emerges with the establishment of multi-cores: high-density, explicitly parallel processor architectures for both high performance as well as standard desktop computers. This work introduces strategies for the parallelization of biophysically realistic neural simulations based on the compartmental modeling technique and results of such an implementation, with a strong focus on multi-core architectures and automation, i.e. user-transparent load balancing.
Visualizing ultrasound through computational modeling
NASA Technical Reports Server (NTRS)
Guo, Theresa W.
2004-01-01
The Doppler Ultrasound Hematocrit Project (DHP) hopes to find non-invasive methods of determining a person's blood characteristics. Because of the limits of microgravity and the space travel environment, it is important to find non-invasive methods of evaluating the health of persons in space. Presently, there is no well-developed method of determining blood composition non-invasively. This project hopes to use ultrasound and Doppler signals to evaluate hematocrit, the percentage by volume of red blood cells within whole blood. These non-invasive techniques may also be developed for use on Earth for trauma patients, where invasive measures might be detrimental. Computational modeling is a useful tool for collecting preliminary information and predictions for the laboratory research. We hope to find and develop a computer program that will be able to simulate the ultrasound signals the project will work with. Simulated models of test conditions will more easily show what might be expected from laboratory results, thus helping the research group make informed decisions before and during experimentation. There are several existing MATLAB-based computer programs available, designed to interpret and simulate ultrasound signals. These programs will be evaluated to find which is best suited to the project's needs. The criteria of evaluation are: 1) the program must be able to specify transducer properties and transmitting and receiving signals; 2) the program must be able to simulate ultrasound signals through different attenuating mediums; 3) the program must be able to process moving targets in order to simulate the Doppler effects associated with blood flow; 4) the program should be user friendly and adaptable to various models. After a computer program is chosen, two simulation models will be constructed, to simulate and interpret an RF data signal and a Doppler signal.
Noury, N; Hadidi, T
2012-12-01
We propose a simulator of human activities collected with presence sensors in our experimental Health Smart Home, "Habitat Intelligent pour la Sante" (HIS). We recorded 1492 days of data on several experimental HIS installations during the French national project "AILISA". On these real data, we built a mathematical model of the behavior of the data series based on hidden Markov models (HMMs). The model is then played on a computer to produce simulated data series, with added flexibility to adjust the parameters for various scenarios. We also tested several methods to measure the similarity between our real and simulated data. Our simulator can produce large databases that can be used to evaluate algorithms that raise an alarm in case of loss of autonomy. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
A Simplified Model for Detonation Based Pressure-Gain Combustors
NASA Technical Reports Server (NTRS)
Paxson, Daniel E.
2010-01-01
A time-dependent model is presented which simulates the essential physics of a detonative or otherwise constant volume, pressure-gain combustor for gas turbine applications. The model utilizes simple, global thermodynamic relations to determine an assumed instantaneous and uniform post-combustion state in one of many envisioned tubes comprising the device. A simple, second order, non-upwinding computational fluid dynamic algorithm is then used to compute the (continuous) flowfield properties during the blowdown and refill stages of the periodic cycle which each tube undergoes. The exhausted flow is averaged to provide mixed total pressure and enthalpy which may be used as a cycle performance metric for benefits analysis. The simplicity of the model allows for nearly instantaneous results when implemented on a personal computer. The results compare favorably with higher resolution numerical codes which are more difficult to configure, and more time consuming to operate.
Personalized glucose forecasting for type 2 diabetes using data assimilation
Albers, David J.; Gluckman, Bruce; Ginsberg, Henry; Hripcsak, George; Mamykina, Lena
2017-01-01
Type 2 diabetes leads to premature death and reduced quality of life for 8% of Americans. Nutrition management is critical to maintaining glycemic control, yet it is difficult to achieve due to the high individual differences in glycemic response to nutrition. Anticipating glycemic impact of different meals can be challenging not only for individuals with diabetes, but also for expert diabetes educators. Personalized computational models that can accurately forecast an impact of a given meal on an individual’s blood glucose levels can serve as the engine for a new generation of decision support tools for individuals with diabetes. However, to be useful in practice, these computational engines need to generate accurate forecasts based on limited datasets consistent with typical self-monitoring practices of individuals with type 2 diabetes. This paper uses three forecasting machines: (i) data assimilation, a technique borrowed from atmospheric physics and engineering that uses Bayesian modeling to infuse data with human knowledge represented in a mechanistic model, to generate real-time, personalized, adaptable glucose forecasts; (ii) model averaging of data assimilation output; and (iii) dynamical Gaussian process model regression. The proposed data assimilation machine, the primary focus of the paper, uses a modified dual unscented Kalman filter to estimate states and parameters, personalizing the mechanistic models. Model selection is used to make a personalized model selection for the individual and their measurement characteristics. The data assimilation forecasts are empirically evaluated against actual postprandial glucose measurements captured by individuals with type 2 diabetes, and against predictions generated by experienced diabetes educators after reviewing a set of historical nutritional records and glucose measurements for the same individual. 
The evaluation suggests that the data assimilation forecasts compare well with specific glucose measurements and match or exceed in accuracy expert forecasts. We conclude by examining ways to present predictions as forecast-derived range quantities and evaluate the comparative advantages of these ranges. PMID:28448498
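The data-assimilation idea above can be illustrated with a greatly simplified sketch: a scalar linear Kalman filter that blends a crude "decay toward baseline" glucose model with noisy self-monitoring measurements. The paper itself uses a modified dual unscented Kalman filter with mechanistic glucose models; all the dynamics and noise constants below are invented for illustration.

```python
# Simplified data-assimilation sketch: scalar linear Kalman filter.
# Model (invented): glucose drifts toward a baseline, x' = a*x + b,
# with process noise Q; measurements z carry observation noise R.

def kalman_step(x, P, z, a=0.9, b=10.0, Q=4.0, R=25.0):
    """One predict/update cycle given state estimate x, variance P, reading z."""
    # Predict: propagate state and uncertainty through the model
    x_pred = a * x + b
    P_pred = a * P * a + Q
    # Update: blend prediction with the measurement via the Kalman gain
    K = P_pred / (P_pred + R)
    x_new = x_pred + K * (z - x_pred)
    P_new = (1 - K) * P_pred
    return x_new, P_new

# Assimilate three hypothetical postprandial glucose readings (mg/dL).
state, var = 120.0, 100.0
for measurement in [150.0, 140.0, 130.0]:
    state, var = kalman_step(state, var, measurement)
```

The gain K weighs model versus data by their relative uncertainties, which is the mechanism that lets such a filter personalize forecasts as an individual's measurements accumulate.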
Building an Integrated Environment for Multimedia
NASA Technical Reports Server (NTRS)
1997-01-01
Multimedia courseware on the solar system and earth science suitable for use in elementary, middle, and high schools was developed under this grant. The courseware runs on Silicon Graphics, Incorporated (SGI) workstations and personal computers (PCs). There is also a version of the courseware accessible via the World Wide Web. Accompanying multimedia database systems were also developed to enhance the multimedia courseware. The database systems accompanying the PC software are based on the relational model, while the database systems accompanying the SGI software are based on the object-oriented model.
Study on Thermal Conductivity of Personal Computer Aluminum-Magnesium Alloy Casing
NASA Astrophysics Data System (ADS)
Liao, MeiHong
With the rapid development of computer technology, simulating the micro-scale motion of atoms in a material to analyze its macro-scale properties has become an important subject. Aluminum-magnesium alloys are often used in personal computer casings; this article puts forward a heat-conduction model of the material and numerical methods for evaluating its heat-transfer performance.
NASA Astrophysics Data System (ADS)
Zhu, Hou; Hu, Bin
2017-03-01
Human flesh search, as a new networked crowd behavior, can on the one hand help us find specific information, and on the other hand may lead to privacy leaks and violations of human rights. In order to study the mechanism of human flesh search (HFS), this paper proposes a simulation model based on agent-based modeling and complex networks. The computational experiments show some useful results. Discovered information quantity and involved personal ratio are highly correlated, and most net citizens either will or will not take part in the human flesh search. Knowledge quantity does not influence the involved personal ratio, but does influence whether HFS can find the target person. When knowledge concentrates on hub nodes, the discovered information quantity is either perfect or almost zero. The emotion of net citizens influences both the discovered information quantity and the involved personal ratio. Concretely, when net citizens are calm about the search topic, the target will hardly be found; but when net citizens are agitated, the target will be found easily.
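The style of agent-based model on a network described above can be sketched in a few lines. This is a toy reconstruction under stated assumptions, not the paper's model: the network generator, the agitation-driven participation rule, and all thresholds are invented for illustration.

```python
import random

# Toy agent-based sketch: net citizens on a random network join the search
# when their agitation, scaled by the fraction of participating neighbors,
# exceeds a threshold. All parameters are illustrative placeholders.

random.seed(1)
N = 200
neighbors = {i: set() for i in range(N)}
for _ in range(600):                       # random graph edges
    a, b = random.randrange(N), random.randrange(N)
    if a != b:
        neighbors[a].add(b)
        neighbors[b].add(a)

agitation = [random.random() for _ in range(N)]   # emotion per citizen
knowledge = [random.random() for _ in range(N)]   # private info per citizen
involved = {i for i in range(N) if agitation[i] > 0.9}  # initial participants

changed = True
while changed:                             # cascade until no one else joins
    changed = False
    for i in range(N):
        if i in involved or not neighbors[i]:
            continue
        active = sum(1 for j in neighbors[i] if j in involved)
        if agitation[i] * active / len(neighbors[i]) > 0.25:
            involved.add(i)
            changed = True

ratio = len(involved) / N                          # involved personal ratio
info = sum(knowledge[i] for i in involved)         # discovered information
```

Sweeping the agitation distribution or concentrating `knowledge` on high-degree nodes in such a sketch reproduces the kind of experiments the paper reports.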
INDIVIDUAL-BASED MODELS: POWERFUL OR POWER STRUGGLE?
Willem, L; Stijven, S; Hens, N; Vladislavleva, E; Broeckhove, J; Beutels, P
2015-01-01
Individual-based models (IBMs) offer endless possibilities to explore various research questions but come with high model complexity and computational burden. Large-scale IBMs have become feasible, but the novel hardware architectures require adapted software. The increased model complexity also requires systematic exploration to gain thorough system understanding. We elaborate on the development of IBMs for vaccine-preventable infectious diseases and on model exploration with active learning. Investment in IBM simulator code can lead to significant runtime reductions. We found large performance differences due to data locality: sorting the population once reduced simulation time by a factor of two, and storing person attributes separately instead of using person objects also seemed more efficient. Next, we improved model performance by up to 70% by structuring potential contacts based on health status before processing disease transmission. The active learning approach we present is based on iterative surrogate modelling and model-guided experimentation. Symbolic regression is used for nonlinear response-surface modelling with automatic feature selection. We illustrate our approach using an IBM for influenza vaccination. After optimizing the parameter space, we observed an inverse relationship between vaccination coverage and the clinical attack rate, reinforced by herd immunity. These insights can be used to focus and optimise research activities, and to reduce both dimensionality and decision uncertainty.
Modeling of Pedestrian Flows Using Hybrid Models of Euler Equations and Dynamical Systems
NASA Astrophysics Data System (ADS)
Bärwolff, Günter; Slawig, Thomas; Schwandt, Hartmut
2007-09-01
In recent years, various systems have been developed for controlling, planning, and predicting the traffic of persons and vehicles, in particular under security aspects. Going beyond pure counting and statistical models, approaches based on well-known concepts originally developed in very different research areas, namely continuum mechanics and computer science, have been found to be very adequate and accurate. In the present paper, we outline a continuum mechanical approach for the description of pedestrian flow.
SIGMA--A Graphical Approach to Teaching Simulation.
ERIC Educational Resources Information Center
Schruben, Lee W.
1992-01-01
SIGMA (Simulation Graphical Modeling and Analysis) is a computer graphics environment for building, testing, and experimenting with discrete event simulation models on personal computers. It uses symbolic representations (computer animation) to depict the logic of large, complex discrete event systems for easier understanding and has proven itself…
Developing a multimodal biometric authentication system using soft computing methods.
Malcangi, Mario
2015-01-01
Robust personal authentication is becoming ever more important in computer-based applications. Among a variety of methods, biometrics offers several advantages, mainly in embedded system applications. Hard and soft multi-biometrics, combined with hard and soft computing methods, can be applied to improve the personal authentication process and to generalize its applicability. This chapter describes the embedded implementation of a multi-biometric (voiceprint and fingerprint) multimodal identification system based on hard computing methods (DSP) for feature extraction and matching, an artificial neural network (ANN) for soft feature pattern matching, and a fuzzy logic engine (FLE) for data fusion and decision making.
A computer-based physics laboratory apparatus: Signal generator software
NASA Astrophysics Data System (ADS)
Thanakittiviroon, Tharest; Liangrocapart, Sompong
2005-09-01
This paper describes a computer-based physics laboratory apparatus to replace expensive instruments such as high-precision signal generators. This apparatus uses a sound card in a common personal computer to give sinusoidal signals with an accurate frequency that can be programmed to give different frequency signals repeatedly. An experiment on standing waves on an oscillating string uses this apparatus. In conjunction with interactive lab manuals, which have been developed using personal computers in our university, we achieve a complete set of low-cost, accurate, and easy-to-use equipment for teaching a physics laboratory.
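The software side of such an apparatus (synthesizing a sinusoid at an exact, programmable frequency for the sound card to play) can be sketched as follows using only the Python standard library. The sample rate, amplitude, and frequency are illustrative; the paper's implementation details are not specified in the abstract.

```python
import io
import math
import struct
import wave

# Sketch: synthesize a sine tone at an exact frequency as 16-bit PCM samples,
# the same data a sound card would output. Parameters are illustrative.

def sine_samples(freq_hz, seconds, rate=44100, amp=0.8):
    """Return 16-bit integer samples of a sine wave at freq_hz."""
    n = int(rate * seconds)
    return [int(32767 * amp * math.sin(2 * math.pi * freq_hz * t / rate))
            for t in range(n)]

def to_wav_bytes(samples, rate=44100):
    """Pack mono 16-bit samples into an in-memory WAV file."""
    buf = io.BytesIO()
    with wave.open(buf, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(rate)
        w.writeframes(struct.pack("<%dh" % len(samples), *samples))
    return buf.getvalue()

tone = sine_samples(440.0, 0.1)   # 440 Hz tone, 100 ms
wav = to_wav_bytes(tone)          # playable through any audio API or file
```

Because the frequency is a program parameter, a sequence of different-frequency signals (as in the standing-wave experiment) reduces to a loop over calls to `sine_samples`.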
NASA Tech Briefs, February 2001. Volume 25, No. 2
NASA Technical Reports Server (NTRS)
2001-01-01
The topics include: 1) Application Briefs; 2) National Design Engineering Show Preview; 3) Marketing Inventions to Increase Income; 4) A Personal-Computer-Based Physiological Training System; 5) Reconfigurable Arrays of Transistors for Evolvable Hardware; 6) Active Tactile Display Device for Reading by a Blind Person; 7) Program Automates Management of IBM VM Computer Systems; 8) System for Monitoring the Environment of a Spacecraft Launch; 9) Measurement of Stresses and Strains in Muscles and Tendons; 10) Optical Measurement of Temperatures in Muscles and Tendons; 11) Small Low-Temperature Thermometer With Nanokelvin Resolution; 12) Heterodyne Interferometer With Phase-Modulated Carrier; 13) Rechargeable Batteries Based on Intercalation in Graphite; 14) Signal Processor for Doppler Measurements in Icing Research; 15) Model Optimizes Drying of Wet Sheets; 16) High-Performance POSS-Modified Polymeric Composites; 17) Model Simulates Semi-Solid Material Processing; 18) Modular Cryogenic Insulation; 19) Passive Venting for Alleviating Helicopter Tail-Boom Loads; 20) Computer Program Predicts Rocket Noise; 21) Process for Polishing Bare Aluminum to High Optical Quality; 22) External Adhesive Pressure-Wall Patch; 23) Java Implementation of Information-Sharing Protocol; 24) Electronic Bulletin Board Publishes Schedules in Real Time; 25) Apparatus Would Extract Water From the Martian Atmosphere; 26) Review of Research on Supercritical vs Subcritical Fluids; 27) Hybrid Regenerative Water-Recycling System; 28) Study of Fusion-Driven Plasma Thruster With Magnetic Nozzle; 29) Liquid/Vapor-Hydrazine Thruster Would Produce Small Impulses; and 30) Thruster Based on Sublimation of Solid Hydrazine
Eren, Nurhan
2014-12-01
In this study, we aimed to develop two reliable and valid assessment instruments for investigating the level of difficulty mental health workers experience while working with patients with personality disorders and the attitudes they develop toward these patients. The research was carried out based on the general screening model. The study sample consisted of 332 mental health workers in several mental health clinics of Turkey, with a certain amount of experience in working with personality disorders, who were selected with a random assignment method. In order to collect data, the Personal Information Questionnaire, the Difficulty of Working with Personality Disorders Scale (PD-DWS), and the Attitudes Towards Patients with Personality Disorders Scale (PD-APS), which are being examined for reliability and validity, were applied. To determine construct validity, the Adjective Check List, Maslach Burnout Inventory, and State-Trait Anxiety Inventory were used. Exploratory factor analysis was used for investigating structural validity, and Cronbach alpha, Spearman-Brown, and Guttman split-half reliability analyses were utilized to examine reliability. Item reliability and validity computations were also carried out by investigating the corrected item-total correlations and discriminative indexes of the items in the scales. For the PD-DWS, the KMO test value was .946, and a significant difference was found for the Bartlett sphericity test (p<.001); the computed test-retest reliability coefficient was .702, and the Cronbach alpha value of the total test score was .952. For the PD-APS, the KMO value was .925; a significant difference was found in the Bartlett sphericity test (p<.001); the computed test-retest reliability coefficient was .806; and the Cronbach alpha value of the total test score was .913. Analyses on both scales were based on total scores.
It was found that PD-DWS and PD-APS have good psychometric properties, measuring the structure that is being investigated, are compatible with other scales, have high levels of internal reliability between their items, and are consistent across time. Therefore, it was concluded that both scales are valid and reliable instruments.
Espinosa, Pablo; Pfeiffer, Ruth M; García-Casado, Zaida; Requena, Celia; Landi, Maria Teresa; Kumar, Rajiv; Nagore, Eduardo
2016-01-01
Melanoma survivors are at an increased risk of developing other malignancies, including keratinocyte skin cancer (KSC). While it is known that many risk factors for melanoma also impact risk of KSC in the general population, no previous study has investigated risk factors for KSC development in melanoma patients. We assessed associations of personal and clinical characteristics, including skin phenotype and variations in the melanocortin 1 receptor (MC1R) gene, with KSC risk in melanoma patients. We used prospective follow-up information on 1200 patients treated for melanoma at the Instituto Valenciano de Oncología, Spain, between 2000 and 2011. We computed hazard ratios and 95% confidence intervals (CIs) for the association of clinical, personal and genetic characteristics with risk of KSC, squamous cell carcinoma (SCC), or basal cell carcinoma (BCC) from Cox proportional hazard models. Five-year cumulative incidence based on competing risk models of SCC, BCC or KSC overall was computed using multivariate subdistribution hazard models. To assess predictive performance of the models, we computed areas under the receiver-operating characteristic curves (AUCs, discriminatory power) using cross-validation. Median follow-up was 57.2 months; a KSC was detected in 163 patients (13.6%). In multivariable Cox models, age, sex, sunburns, chronic sun exposure, past personal history of non-melanoma skin cancer or other non-cutaneous neoplasia, and the MC1R variants p.D294H and p.R163Q were significantly associated with KSC risk. A cumulative incidence model including age, sex, personal history of KSC, and of other non-cutaneous neoplasia had an AUC of 0.76 (95% CI: 0.71-0.80). When p.D294H and p.R163Q variants were added to the model, the AUC increased to 0.81 (95% CI: 0.77-0.84) (p-value for difference <0.0001). In addition to age, sex, skin characteristics, and sun exposure, p.R163Q and p.D294H MC1R variants significantly increased KSC risk among melanoma patients. 
Our findings may help identify patients who could benefit most from preventive measures. Copyright © 2015 Elsevier Ltd. All rights reserved.
Development of a personal-computer-based intelligent tutoring system
NASA Technical Reports Server (NTRS)
Mueller, Stephen J.
1988-01-01
A large number of Intelligent Tutoring Systems (ITSs) have been built since they were first proposed in the early 1970's. Research conducted on the use of the best of these systems has demonstrated their effectiveness in tutoring in selected domains. A prototype ITS for tutoring students in the use of CLIPS language: CLIPSIT (CLIPS Intelligent Tutor) was developed. For an ITS to be widely accepted, not only must it be effective, flexible, and very responsive, it must also be capable of functioning on readily available computers. While most ITSs have been developed on powerful workstations, CLIPSIT is designed for use on the IBM PC/XT/AT personal computer family (and their clones). There are many issues to consider when developing an ITS on a personal computer such as the teaching strategy, user interface, knowledge representation, and program design methodology. Based on experiences in developing CLIPSIT, results on how to address some of these issues are reported and approaches are suggested for maintaining a powerful learning environment while delivering robust performance within the speed and memory constraints of the personal computer.
Brem, M H; Böhner, C; Brenning, A; Gelse, K; Radkow, T; Blanke, M; Schlechtweg, P M; Neumann, G; Wu, I Y; Bautz, W; Hennig, F F; Richter, H
2006-11-01
To compare the diagnostic value of low-cost computer monitors and a Picture Archiving and Communication System (PACS) workstation for the evaluation of cervical spine fractures in the emergency room. Two groups of readers blinded to the diagnoses (2 radiologists and 3 orthopaedic surgeons) independently assessed digital radiographs of the cervical spine (anterior-posterior, oblique, and trans-oral dens views). The radiographs of 57 patients who arrived consecutively at the emergency room in 2004 with clinical suspicion of a cervical spine injury were evaluated. The diagnostic value of these radiographs was scored on a 3-point scale (1 = diagnosis not possible/bad image quality, 2 = diagnosis uncertain, 3 = clear diagnosis of fracture or no fracture) on a PACS workstation and on two different liquid crystal display (LCD) personal computer monitors. The images were randomised to avoid memory effects. We used logistic mixed-effects models to determine the possible effects of monitor type on the evaluation of x-ray images. To determine the overall effects of monitor type, this variable was used as a fixed effect, and the image number and reader group (radiologist or orthopaedic surgeon) were used as random effects on display quality. Group-specific effects were examined, with the reader group and additional fixed effects as terms. A significance level of 0.05 was established for assessing the contribution of each fixed effect to the model. Overall, the diagnostic score did not differ significantly between standard personal computer monitors and the PACS workstation (both p values were 0.78). Low-cost LCD personal computer monitors may be useful in establishing a diagnosis of cervical spine fractures in the emergency room.
PHREEQCI; a graphical user interface for the geochemical computer program PHREEQC
Charlton, Scott R.; Macklin, Clifford L.; Parkhurst, David L.
1997-01-01
PhreeqcI is a Windows-based graphical user interface for the geochemical computer program PHREEQC. PhreeqcI provides the capability to generate and edit input data files, run simulations, and view text files containing simulation results, all within the framework of a single interface. PHREEQC is a multipurpose geochemical program that can perform speciation, inverse, reaction-path, and 1D advective reaction-transport modeling. Interactive access to all of the capabilities of PHREEQC is available with PhreeqcI. The interface is written in Visual Basic and will run on personal computers under the Windows 3.1, Windows 95, and Windows NT operating systems.
Computer Integrated Manufacturing: Physical Modelling Systems Design. A Personal View.
ERIC Educational Resources Information Center
Baker, Richard
A computer-integrated manufacturing (CIM) Physical Modeling Systems Design project was undertaken in a time of rapid change in the industrial, business, technological, training, and educational areas in Australia. A specification of a manufacturing physical modeling system was drawn up. Physical modeling provides a flexibility and configurability…
Distributed Computing Environment for Mine Warfare Command
1993-06-01
based system to a decentralized network of personal computers over the past several years. This thesis analyzes the progress of the evolution as of May of 1992.
Measurement properties of the Spinal Cord Injury-Functional Index (SCI-FI) short forms.
Heinemann, Allen W; Dijkers, Marcel P; Ni, Pengsheng; Tulsky, David S; Jette, Alan
2014-07-01
To evaluate the psychometric properties of the Spinal Cord Injury-Functional Index (SCI-FI) short forms (basic mobility, self-care, fine motor, ambulation, manual wheelchair, and power wheelchair) based on internal consistency; correlations among the short forms, the full item banks, and a 10-item computer adaptive test version; the magnitude of ceiling and floor effects; and test information functions. Cross-sectional cohort study. Six rehabilitation hospitals in the United States. Individuals with traumatic spinal cord injury (N=855) recruited from 6 national Spinal Cord Injury Model Systems facilities. Not applicable. SCI-FI full item bank, 10-item computer adaptive test, and parallel short form scores. The SCI-FI short forms (with separate versions for individuals with paraplegia and tetraplegia) demonstrate very good internal consistency, group-level reliability, excellent correlations between short forms and scores based on the total item bank, and minimal ceiling and floor effects (except ceiling effects for persons with paraplegia on self-care, fine motor, and power wheelchair ability and floor effects for persons with tetraplegia on self-care, fine motor, and manual wheelchair ability). The test information functions are acceptable across the range of scores where most persons in the sample performed. Clinicians and researchers should consider the SCI-FI short forms when computer adaptive testing is not feasible. Copyright © 2014 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Voisin, Sophie; Pinto, Frank M; Morin-Ducote, Garnetta
2013-01-01
Purpose: The primary aim of the present study was to test the feasibility of predicting diagnostic errors in mammography by merging radiologists' gaze behavior and image characteristics. A secondary aim was to investigate group-based and personalized predictive models for radiologists of variable experience levels. Methods: The study was performed for the clinical task of assessing the likelihood of malignancy of mammographic masses. Eye-tracking data and diagnostic decisions for 40 cases were acquired from 4 Radiology residents and 2 breast imaging experts as part of an IRB-approved pilot study. Gaze behavior features were extracted from the eye-tracking data. Computer-generated and BIRADS image features were extracted from the images. Finally, machine learning algorithms were used to merge gaze and image features for predicting human error. Feature selection was thoroughly explored to determine the relative contribution of the various features. Group-based and personalized user modeling was also investigated. Results: Diagnostic error can be predicted reliably by merging gaze behavior characteristics from the radiologist and textural characteristics from the image under review. Leveraging data collected from multiple readers produced a reasonable group model (AUC=0.79). Personalized user modeling was far more accurate for the more experienced readers (average AUC of 0.837 ± 0.029) than for the less experienced ones (average AUC of 0.667 ± 0.099). The best performing group-based and personalized predictive models involved combinations of both gaze and image features. Conclusions: Diagnostic errors in mammography can be predicted reliably by leveraging the radiologists' gaze behavior and image content.
An Interactive Computer-Based Training Program for Beginner Personal Computer Maintenance.
ERIC Educational Resources Information Center
Summers, Valerie Brooke
A computer-assisted instructional program, which was developed for teaching beginning computer maintenance to employees of Unisys, covered external hardware maintenance, proper diskette care, making software backups, and electro-static discharge prevention. The procedure used in developing the program was based upon the Dick and Carey (1985) model…
2011-01-01
Background To develop a web-based computer adaptive testing (CAT) application for efficiently collecting data regarding workers' perceptions of job satisfaction, we examined whether a 37-item Job Content Questionnaire (JCQ-37) could evaluate the job satisfaction of individual employees as a single construct. Methods The JCQ-37 makes data collection via CAT on the internet easy, viable and fast. A Rasch rating scale model was applied to analyze data from 300 randomly selected hospital employees who participated in job-satisfaction surveys in 2008 and 2009 via non-adaptive and computer-adaptive testing, respectively. Results Of the 37 items on the questionnaire, 24 items fit the model fairly well. Person-separation reliability for the 2008 surveys was 0.88. Measures from both years and item-8 job satisfaction for groups were successfully evaluated through item-by-item analyses by using t-tests. Workers aged 26-35 felt that job satisfaction was significantly worse in 2009 than in 2008. Conclusions The Web-CAT developed in the present paper was shown to be more efficient than traditional computer-based or pen-and-paper assessments at collecting data regarding workers' perceptions of job content. PMID:21496311
2016-10-01
Can non-specific cellular immunity protect HIV-infected persons with very low CD4 counts? Presented at Conference on Integrating Psychology and...Under Review. 50. Nierenberg B, Cooper S, Feuer SJ, Broderick G. Applying Network Medicine to Chronic Illness: A Model for Integrating Psychology ...function in these subjects as compared to GW era sedentary healthy controls. We applied an integrative systems-based approach rooted in computational
Computational Modeling of Tissue Self-Assembly
NASA Astrophysics Data System (ADS)
Neagu, Adrian; Kosztin, Ioan; Jakab, Karoly; Barz, Bogdan; Neagu, Monica; Jamison, Richard; Forgacs, Gabor
As a theoretical framework for understanding the self-assembly of living cells into tissues, Steinberg proposed the differential adhesion hypothesis (DAH) according to which a specific cell type possesses a specific adhesion apparatus that combined with cell motility leads to cell assemblies of various cell types in the lowest adhesive energy state. Experimental and theoretical efforts of four decades turned the DAH into a fundamental principle of developmental biology that has been validated both in vitro and in vivo. Based on computational models of cell sorting, we have developed a DAH-based lattice model for tissues in interaction with their environment and simulated biological self-assembly using the Monte Carlo method. The present brief review highlights results on specific morphogenetic processes with relevance to tissue engineering applications. Our own work is presented on the background of several decades of theoretical efforts aimed to model morphogenesis in living tissues. Simulations of systems involving about 10^5 cells have been performed on high-end personal computers with CPU times of the order of days. Studied processes include cell sorting, cell sheet formation, and the development of endothelialized tubes from rings made of spheroids of two randomly intermixed cell types, when the medium in the interior of the tube was different from the external one. We conclude by noting that computer simulations based on mathematical models of living tissues yield useful guidelines for laboratory work and can catalyze the emergence of innovative technologies in tissue engineering.
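As a loose illustration of the DAH-based lattice approach described above, the sketch below runs Metropolis Monte Carlo swaps of two cell types plus medium on a small periodic lattice; the interaction energies, lattice size, and temperature are illustrative assumptions, not the authors' parameters.

```python
import math
import random

# Interaction energies between site types (0 = medium, 1 and 2 = cell types).
# More negative = stronger adhesion; all values are illustrative assumptions.
J = {(0, 0): 0.0, (0, 1): -0.5, (0, 2): -0.5,
     (1, 1): -3.0, (1, 2): -1.0, (2, 2): -3.0}

def bond(a, b):
    return J[(min(a, b), max(a, b))]

def neighbors(i, j, n):
    # 4-neighborhood with periodic boundaries.
    return [((i + di) % n, (j + dj) % n)
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]

def local_energy(grid, i, j):
    n = len(grid)
    return sum(bond(grid[i][j], grid[a][b]) for a, b in neighbors(i, j, n))

def total_energy(grid):
    n = len(grid)
    # Every bond appears in two local energies, hence the factor 1/2.
    return 0.5 * sum(local_energy(grid, i, j)
                     for i in range(n) for j in range(n))

def metropolis_sweep(grid, rng, temperature=0.2):
    n = len(grid)
    for _ in range(n * n):
        i, j = rng.randrange(n), rng.randrange(n)
        a, b = rng.choice(neighbors(i, j, n))
        if grid[i][j] == grid[a][b]:
            continue
        before = local_energy(grid, i, j) + local_energy(grid, a, b)
        grid[i][j], grid[a][b] = grid[a][b], grid[i][j]
        d_e = local_energy(grid, i, j) + local_energy(grid, a, b) - before
        if d_e > 0 and rng.random() >= math.exp(-d_e / temperature):
            grid[i][j], grid[a][b] = grid[a][b], grid[i][j]  # reject: undo swap

rng = random.Random(0)
n = 16
sites = [1] * 100 + [2] * 100 + [0] * (n * n - 200)
rng.shuffle(sites)
grid = [sites[k * n:(k + 1) * n] for k in range(n)]
e0 = total_energy(grid)
for _ in range(200):
    metropolis_sweep(grid, rng)
print(e0, total_energy(grid))  # energy drops as like cells cluster
```

Because swaps conserve the number of each cell type, the dynamics sort cells rather than create them, which is the essence of DAH-driven self-assembly.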
Language Model Applications to Spelling with Brain-Computer Interfaces
Mora-Cortes, Anderson; Manyakov, Nikolay V.; Chumerin, Nikolay; Van Hulle, Marc M.
2014-01-01
Within the Ambient Assisted Living (AAL) community, Brain-Computer Interfaces (BCIs) have raised great hopes as they provide alternative communication means for persons with disabilities, bypassing the need for speech and other motor activities. Although significant advancements have been realized in the last decade, applications of language models (e.g., word prediction, completion) have only recently started to appear in BCI systems. The main goal of this article is to review the language model applications that supplement non-invasive BCI-based communication systems by discussing their potential and limitations, and to discern future trends. First, a brief overview of the most prominent BCI spelling systems is given, followed by an in-depth discussion of the language models applied to them. These language models are classified according to their functionality in the context of BCI-based spelling: the static/dynamic nature of the user interface, the use of error correction and predictive spelling, and the potential to improve their classification performance by using language models. To conclude, the review offers an overview of the advantages and challenges of implementing language models in BCI-based communication systems in conjunction with other AAL technologies. PMID:24675760
John R. Mills
1989-01-01
The timber resource inventory model (TRIM) has been adapted to run on personal computers. The personal computer version of TRIM (PC-TRIM) is more widely used than its mainframe parent. Errors that existed in previous versions of TRIM have been corrected. Information is presented to help users with program input and output management in the DOS environment, to...
Security of Personal Computer Systems: A Management Guide.
ERIC Educational Resources Information Center
Steinauer, Dennis D.
This report describes management and technical security considerations associated with the use of personal computer systems as well as other microprocessor-based systems designed for use in a general office environment. Its primary objective is to identify and discuss several areas of potential vulnerability and associated protective measures. The…
Brunette, Mary F; Rotondi, Armando J; Ben-Zeev, Dror; Gottlieb, Jennifer D; Mueser, Kim T; Robinson, Delbert G; Achtyes, Eric D; Gingerich, Susan; Marcy, Patricia; Schooler, Nina R; Meyer-Kalos, Piper; Kane, John M
2016-04-01
Despite advances in schizophrenia treatment, symptom relapses and rehospitalizations impede recovery for many people and are a principal driver of the high cost of care. Technology-delivered or technology-enhanced treatment may be a cost-effective way to provide flexible, personalized evidence-based treatments directly to people in their homes and communities. However, evidence for the safety, acceptability, and efficacy of such interventions is only now being established. The authors of this Open Forum describe a novel, technology-based approach to prevent relapse after a hospitalization for psychosis, the Health Technology Program (HTP), which they developed. HTP provides in-person relapse prevention planning that directs use of tailored, technology-based treatment based on cognitive-behavioral therapy for psychosis, family psychoeducation for schizophrenia, and prescriber decision support through a Web-based program that solicits information from clients at every visit. Technology-based treatments are delivered through smartphones and computers.
Design Process of a Goal-Based Scenario on Computing Fundamentals
ERIC Educational Resources Information Center
Beriswill, Joanne Elizabeth
2014-01-01
In this design case, an instructor developed a goal-based scenario (GBS) for undergraduate computer fundamentals students to apply their knowledge of computer equipment and software. The GBS, entitled the MegaTech Project, presented the students with descriptions of the everyday activities of four persons needing to purchase a computer system. The…
A 3D visualization and simulation of the individual human jaw.
Muftić, Osman; Keros, Jadranka; Baksa, Sarajko; Carek, Vlado; Matković, Ivo
2003-01-01
A new biomechanical three-dimensional (3D) model of the human mandible, based on a computer-generated virtual model, is proposed. Using maps obtained from special photographs of the face of a real subject, it is possible to attribute personality to the virtual character, while computer animation offers movements and characteristics within the confines of the space and time of the virtual world. A simple two-dimensional model of the jaw cannot explain the biomechanics, in which the muscular forces acting through the occlusal and condylar surfaces are in a state of 3D equilibrium. In the model, all forces are resolved into components according to a selected coordinate system. The muscular forces act on the jaw at the level needed for chewing, maintaining a kind of mandibular balance that prevents dislocation and the loading of nonarticular tissues. The work uses a new approach to computer-generated animation of virtual 3D characters (called "Body SABA"), packaged as a single low-cost, easy-to-operate object package.
Virtual working systems to support R&D groups
NASA Astrophysics Data System (ADS)
Dew, Peter M.; Leigh, Christine; Drew, Richard S.; Morris, David; Curson, Jayne
1995-03-01
The paper reports on the progress at Leeds University to build a Virtual Science Park (VSP) to enhance the University's ability to interact with industry and grow its applied research and workplace learning activities. The VSP exploits advances in real-time collaborative computing and networking to provide an environment that meets the objectives of physically based science parks without the need for the organizations to relocate. It provides an integrated set of services (e.g. virtual consultancy, work-based learning) built around a structured person-centered information model. This model supports the integration of tools for: (a) navigating around the information space; (b) browsing information stored within the VSP database; (c) communicating through a variety of person-to-person collaborative tools; and (d) managing the information stored in the VSP, including the relationships to other information that support the underlying model. The paper gives an overview of a generic virtual working system based on X.500 directory services and the World-Wide Web that can be used to support the Virtual Science Park. Finally, the paper discusses some of the research issues that need to be addressed to fully realize a Virtual Science Park.
Mosler, Hans-Joachim; Martens, Thomas
2008-09-01
Agent-based computer simulation was used to create artificial communities in which each individual was constructed according to the principles of the elaboration likelihood model of Petty and Cacioppo [1986. The elaboration likelihood model of persuasion. In: Berkowitz, L. (Ed.), Advances in Experimental Social Psychology. Academic Press, New York, NY, pp. 123-205]. Campaigning strategies and community characteristics were varied systematically to understand and test their impact on attitudes towards environmental protection. The results show that strong arguments influence a green (environmentally concerned) population with many contacts most effectively, while peripheral cues have the greatest impact on a non-green population with fewer contacts. Overall, deeper information scrutiny increases the impact of strong arguments but is especially important for convincing green populations. Campaigns involving person-to-person communication are superior to mass-media campaigns because they can be adapted to recipients' characteristics.
Using the cloud to speed-up calibration of watershed-scale hydrologic models (Invited)
NASA Astrophysics Data System (ADS)
Goodall, J. L.; Ercan, M. B.; Castronova, A. M.; Humphrey, M.; Beekwilder, N.; Steele, J.; Kim, I.
2013-12-01
This research focuses on using the cloud to address computational challenges associated with hydrologic modeling. One example is calibration of a watershed-scale hydrologic model, which can take days of execution time on typical computers. While parallel algorithms for model calibration exist and some researchers have used multi-core computers or clusters to run these algorithms, these solutions do not fully address the challenge because (i) calibration can still be too time consuming even on multicore personal computers and (ii) few in the community have the time and expertise needed to manage a compute cluster. Given this, another option for addressing this challenge that we are exploring through this work is the use of the cloud for speeding up calibration of watershed-scale hydrologic models. The cloud used in this capacity provides a means for renting a specific number and type of machines for only the time needed to perform a calibration run. The cloud allows one to precisely balance the duration of the calibration against the financial cost so that, if the budget allows, the calibration can be performed more quickly by renting more machines. Focusing specifically on the SWAT hydrologic model and a parallel version of the DDS calibration algorithm, we show significant speed-up across a range of watershed sizes using up to 256 cores to perform a model calibration. The tool provides a simple web-based user interface and the ability to monitor job submission during the calibration process. Finally, the talk concludes with initial work to leverage the cloud for other tasks associated with hydrologic modeling, including preparing inputs for constructing place-based hydrologic models.
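The duration/cost balance described above is back-of-envelope arithmetic; the sketch below uses Amdahl's law with a hypothetical serial runtime, per-core-hour price, and parallel fraction (none of these numbers come from the abstract).

```python
# Back-of-envelope duration/cost tradeoff for renting cloud cores, using
# Amdahl's law. The serial runtime, price, and parallel fraction below are
# hypothetical; the abstract reports only that calibration can take days.
def calibration_time_hours(serial_hours, cores, parallel_fraction=0.95):
    # Amdahl's law: only the parallelizable fraction scales with core count.
    return serial_hours * ((1 - parallel_fraction) + parallel_fraction / cores)

def rental_cost(serial_hours, cores, usd_per_core_hour=0.05,
                parallel_fraction=0.95):
    hours = calibration_time_hours(serial_hours, cores, parallel_fraction)
    return hours * cores * usd_per_core_hour

serial = 48.0  # a hypothetical two-day serial calibration
for cores in (1, 16, 64, 256):
    print(cores, calibration_time_hours(serial, cores),
          rental_cost(serial, cores))
```

The table this prints makes the tradeoff concrete: more cores shorten the wall-clock time but raise the total rental bill, since the serial fraction keeps some cores idle.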
NASA Astrophysics Data System (ADS)
Aditya, K.; Biswadeep, G.; Kedar, S.; Sundar, S.
2017-11-01
Demand for human-computer communication has grown in recent years. The new generation of autonomous technology aspires to give computer interfaces emotional states that consider both the user and the system environment. The existing computational model is based on artificial intelligence, augmented externally by multi-modal expression with semi-human characteristics. The main problem with multi-modal expression is that the hardware control given to the Artificial Intelligence (AI) is very limited. In this project, we therefore try to give the AI more control over the hardware. Two main parts, Speech-to-Text (STT) and Text-to-Speech (TTS) engines, are used to accomplish this. The hardware comprises a Raspberry Pi 3, a speaker, and a microphone; the programming is done in Python.
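The STT-to-TTS wiring described above can be sketched as a loop with injectable backends. The command table, GPIO strings, and function names here are hypothetical; on a real Raspberry Pi 3 the stub listen/speak callables would be replaced by actual STT/TTS engines.

```python
# Minimal sketch of the STT -> decision logic -> TTS loop described above.
# handle_command, the command table, and the GPIO strings are hypothetical;
# on a Raspberry Pi 3 the listen/speak stubs would be real STT/TTS engines.
def handle_command(text):
    commands = {
        "lights on": "GPIO17=HIGH",   # illustrative hardware action
        "lights off": "GPIO17=LOW",
    }
    return commands.get(text.strip().lower(), "UNKNOWN")

def assistant_loop(listen, speak, act, max_turns=10):
    """Wire STT (listen), TTS (speak) and hardware control (act) together."""
    for _ in range(max_turns):
        text = listen()
        if text is None:              # end of input
            break
        action = handle_command(text)
        if action == "UNKNOWN":
            speak("Sorry, I did not understand.")
        else:
            act(action)
            speak("Done: " + text)

# Stub backends so the loop can be exercised without a microphone or speaker.
heard = iter(["Lights ON", "make tea", None])
spoken, actions = [], []
assistant_loop(lambda: next(heard), spoken.append, actions.append)
print(actions, spoken)
```

Keeping the recognizer and synthesizer behind plain callables is what lets the decision logic, and hence the AI's hardware control, be tested off-device.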
Rotary engine performance computer program (RCEMAP and RCEMAPPC): User's guide
NASA Technical Reports Server (NTRS)
Bartrand, Timothy A.; Willis, Edward A.
1993-01-01
This report is a user's guide for a computer code that simulates the performance of several rotary combustion engine configurations. It is intended to assist prospective users in getting started with RCEMAP and/or RCEMAPPC. RCEMAP (Rotary Combustion Engine performance MAP generating code) is the mainframe version, while RCEMAPPC is a simplified subset designed for the personal computer, or PC, environment. Both versions are based on an open, zero-dimensional combustion system model for the prediction of instantaneous pressures, temperature, chemical composition and other in-chamber thermodynamic properties. Both versions predict overall engine performance and thermal characteristics, including bmep, bsfc, exhaust gas temperature, average material temperatures, and turbocharger operating conditions. Required inputs include engine geometry, materials, constants for use in the combustion heat release model, and turbomachinery maps. Illustrative examples and sample input files for both versions are included.
NASA Astrophysics Data System (ADS)
Deng, Dongdong; Murphy, Michael J.; Hakim, Joe B.; Franceschi, William H.; Zahid, Sohail; Pashakhanloo, Farhad; Trayanova, Natalia A.; Boyle, Patrick M.
2017-09-01
Atrial fibrillation (AF) is the most common sustained cardiac arrhythmia, causing morbidity and mortality in millions worldwide. The atria of patients with persistent AF (PsAF) are characterized by the presence of extensive and distributed atrial fibrosis, which facilitates the formation of persistent reentrant drivers (RDs, i.e., spiral waves), which promote fibrillatory activity. Targeted catheter ablation of RD-harboring tissues has shown promise as a clinical treatment for PsAF, but the outcomes remain sub-par. Personalized computational modeling has been proposed as a means of non-invasively predicting optimal ablation targets in individual PsAF patients, but it remains unclear how RD localization dynamics are influenced by inter-patient variability in the spatial distribution of atrial fibrosis, action potential duration (APD), and conduction velocity (CV). Here, we conduct simulations in computational models of fibrotic atria derived from the clinical imaging of PsAF patients to characterize the sensitivity of RD locations to these three factors. We show that RDs consistently anchor to boundaries between fibrotic and non-fibrotic tissues, as delineated by late gadolinium-enhanced magnetic resonance imaging, but that changes in APD/CV can enhance or attenuate the likelihood that an RD will anchor to a specific site. These findings show that the level of uncertainty present in patient-specific atrial models reconstructed without any invasive measurements (i.e., incorporating each individual's unique distribution of fibrotic tissue from medical imaging alongside an average representation of AF-remodeled electrophysiology) is sufficiently high that a personalized ablation strategy based on targeting simulation-predicted RD trajectories alone may not produce the desired result.
Optimizing agent-based transmission models for infectious diseases.
Willem, Lander; Stijven, Sean; Tijskens, Engelbert; Beutels, Philippe; Hens, Niel; Broeckhove, Jan
2015-06-02
Infectious disease modeling and computational power have evolved such that large-scale agent-based models (ABMs) have become feasible. However, the increasing hardware complexity requires adapted software designs to achieve the full potential of current high-performance workstations. We have found large performance differences with a discrete-time ABM for close-contact disease transmission due to data locality. Sorting the population according to the social contact clusters reduced simulation time by a factor of two. Data locality and model performance can also be improved by storing person attributes separately instead of using person objects. Next, decreasing the number of operations by sorting people by health status before processing disease transmission also has a large impact on model performance. Depending on the clinical attack rate, target population and computer hardware, the introduction of the sort phase decreased the run time from 26% up to more than 70%. We have investigated the application of parallel programming techniques and found that the speedup is significant but drops quickly with the number of cores. We observed that the effect of scheduling and workload chunk size is model specific and can make a large difference. Investment in performance optimization of ABM simulator code can lead to significant run time reductions. The key steps are straightforward: choosing the data structure for the population and sorting people by health status before effecting disease propagation. We believe these conclusions to be valid for a wide range of infectious disease ABMs. We recommend that future studies evaluate the impact of data management, algorithmic procedures and parallelization on model performance.
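The sort-before-transmission idea can be sketched as follows; the population layout, transmission probability, and all-contacts mixing assumption are illustrative, not the paper's model.

```python
import random

# Sketch of the "sort by health status" optimization: after the sort phase,
# susceptible and infectious people occupy contiguous slices, so the
# transmission loop touches only those slices instead of testing everyone.
# Homogeneous mixing and the transmission probability are illustrative.
S, I, R = 0, 1, 2  # susceptible, infectious, recovered

def transmission_step(population, p_transmit, rng):
    population.sort()                         # S block first, then I, then R
    n_s = population.count(S)
    n_i = population.count(I)
    p_infected = 1 - (1 - p_transmit) ** n_i  # at least one contact transmits
    for idx in range(n_s):                    # susceptibles occupy 0..n_s-1
        if rng.random() < p_infected:
            population[idx] = I

rng = random.Random(42)
pop = [S] * 95 + [I] * 5
transmission_step(pop, 0.02, rng)
print(pop.count(I))
```

The performance benefit comes from the contiguous slices: the inner loop skips recovered people entirely and walks sequential memory, which is the data-locality effect the abstract measures.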
Portals for Real-Time Earthquake Data and Forecasting: Challenge and Promise (Invited)
NASA Astrophysics Data System (ADS)
Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Feltstykket, R.; Donnellan, A.; Glasscoe, M. T.
2013-12-01
Earthquake forecasts have been computed by a variety of countries world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. However, recent events clearly demonstrate that mitigating personal risk is becoming the responsibility of individual members of the public. Open access to a variety of web-based forecasts, tools, utilities and information is therefore required. Portals for data and forecasts present particular challenges, and require the development of both apps and the client/server architecture to deliver the basic information in real time. The basic forecast model we consider is the Natural Time Weibull (NTW) method (JBR et al., Phys. Rev. E, 86, 021106, 2012). This model uses small earthquakes ('seismicity-based' models) to forecast the occurrence of large earthquakes, via data-mining algorithms combined with the ANSS earthquake catalog. This method computes large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Localizing these forecasts in space so that global forecasts can be computed in real time presents special algorithmic challenges, which we describe in this talk. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we compute real-time global forecasts at a grid scale of 0.1°. We analyze and monitor the performance of these models using the standard tests, which include the Reliability/Attributes and Receiver Operating Characteristic (ROC) tests. It is clear from much of the analysis that data quality is a major limitation on the accurate computation of earthquake probabilities. We discuss the challenges of serving up these datasets over the web on web-based platforms such as those at www.quakesim.org, www.e-decider.org, and www.openhazards.com.
NASA Astrophysics Data System (ADS)
Taki, Tsuyoshi; Hasegawa, Jun-ichi
1998-12-01
This paper proposes a basic feature for the quantitative measurement and evaluation of the group behavior of persons. This feature, called the 'dominant region', is a kind of sphere of influence for each person in the group. The dominant region is defined as the region where the person can arrive earlier than any other person, and can be formulated as a Voronoi region modified by replacing the distance function with a time function. This time function is calculated based on a computational model of the moving ability of the person. As an application of the dominant region, we present a motion analysis system for soccer games. The purpose of this system is to evaluate teamwork quantitatively based on the movement of all the players in the game. From experiments using motion pictures of actual games, it is suggested that the proposed feature is useful for the measurement and evaluation of group behavior in team sports. This basic feature may be applied to other team ball games, such as American football, basketball, handball and water polo.
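The dominant region can be sketched on a discrete grid; in the sketch below, arrival time is simply distance divided by a maximum speed, a stand-in for the paper's fuller computational model of moving ability, and the player positions and speeds are made up.

```python
import math

# "Dominant region" sketch: each grid cell belongs to the player who can reach
# it first. Arrival time = distance / top speed, a simplified stand-in for the
# paper's computational model of moving ability; positions/speeds are made up.
def arrival_time(player, x, y):
    px, py, speed = player
    return math.hypot(x - px, y - py) / speed

def dominant_regions(players, width, height):
    return [[min(range(len(players)),
                 key=lambda k: arrival_time(players[k], x, y))
             for x in range(width)] for y in range(height)]

# Two players: a fast one on the left, a slower one on the right.
players = [(10.0, 10.0, 8.0), (30.0, 10.0, 4.0)]
owner = dominant_regions(players, 40, 20)
areas = [sum(row.count(k) for row in owner) for k in range(len(players))]
print(areas)  # the faster player commands the larger dominant region
```

With equal speeds this reduces to an ordinary Voronoi tessellation; unequal speeds bend the boundary toward the slower player, which is exactly the modification the time function introduces.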
DOE Office of Scientific and Technical Information (OSTI.GOV)
Voisin, Sophie; Tourassi, Georgia D.; Pinto, Frank
2013-10-15
Purpose: The primary aim of the present study was to test the feasibility of predicting diagnostic errors in mammography by merging radiologists' gaze behavior and image characteristics. A secondary aim was to investigate group-based and personalized predictive models for radiologists of variable experience levels. Methods: The study was performed for the clinical task of assessing the likelihood of malignancy of mammographic masses. Eye-tracking data and diagnostic decisions for 40 cases were acquired from four Radiology residents and two breast imaging experts as part of an IRB-approved pilot study. Gaze behavior features were extracted from the eye-tracking data. Computer-generated and BIRADS image features were extracted from the images. Finally, machine learning algorithms were used to merge gaze and image features for predicting human error. Feature selection was thoroughly explored to determine the relative contribution of the various features. Group-based and personalized user modeling was also investigated. Results: Machine learning can be used to predict diagnostic error by merging gaze behavior characteristics from the radiologist and textural characteristics from the image under review. Leveraging data collected from multiple readers produced a reasonable group model [area under the ROC curve (AUC) = 0.792 ± 0.030]. Personalized user modeling was far more accurate for the more experienced readers (AUC = 0.837 ± 0.029) than for the less experienced ones (AUC = 0.667 ± 0.099). The best performing group-based and personalized predictive models involved combinations of both gaze and image features. Conclusions: Diagnostic errors in mammography can be predicted to a good extent by leveraging the radiologists' gaze behavior and image content.
Personality Characteristics and Performance on Computer Assisted Instruction and Programmed Text.
ERIC Educational Resources Information Center
Blitz, Allan N.; Smith, Timothy
An empirical study investigated whether personality characteristics have a bearing on an individual's success with particular modes of instruction, in this case, computer-assisted instruction (CAI) and the programmed text (PT). The study was developed in an attempt to establish useful criteria on which to base a rationale for choosing suitable…
Spjuth, Ola; Karlsson, Andreas; Clements, Mark; Humphreys, Keith; Ivansson, Emma; Dowling, Jim; Eklund, Martin; Jauhiainen, Alexandra; Czene, Kamila; Grönberg, Henrik; Sparén, Pär; Wiklund, Fredrik; Cheddad, Abbas; Pálsdóttir, Þorgerður; Rantalainen, Mattias; Abrahamsson, Linda; Laure, Erwin; Litton, Jan-Eric; Palmgren, Juni
2017-09-01
We provide an e-Science perspective on the workflow from risk factor discovery and classification of disease to evaluation of personalized intervention programs. As case studies, we use personalized prostate and breast cancer screenings. We describe an e-Science initiative in Sweden, e-Science for Cancer Prevention and Control (eCPC), which supports biomarker discovery and offers decision support for personalized intervention strategies. The generic eCPC contribution is a workflow with 4 nodes applied iteratively, and the concept of e-Science signifies systematic use of tools from the mathematical, statistical, data, and computer sciences. The eCPC workflow is illustrated through 2 case studies. For prostate cancer, an in-house personalized screening tool, the Stockholm-3 model (S3M), is presented as an alternative to prostate-specific antigen testing alone. S3M is evaluated in a trial setting and plans for rollout in the population are discussed. For breast cancer, new biomarkers based on breast density and molecular profiles are developed and the US multicenter Women Informed to Screen Depending on Measures (WISDOM) trial is referred to for evaluation. While current eCPC data management uses a traditional data warehouse model, we discuss eCPC-developed features of a coherent data integration platform. E-Science tools are a key part of an evidence-based process for personalized medicine. This paper provides a structured workflow from data and models to evaluation of new personalized intervention strategies. The importance of multidisciplinary collaboration is emphasized. Importantly, the generic concepts of the suggested eCPC workflow are transferrable to other disease domains, although each disease will require tailored solutions. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.
Laber, Eric B; Zhao, Ying-Qi; Regh, Todd; Davidian, Marie; Tsiatis, Anastasios; Stanford, Joseph B; Zeng, Donglin; Song, Rui; Kosorok, Michael R
2016-04-15
A personalized treatment strategy formalizes evidence-based treatment selection by mapping patient information to a recommended treatment. Personalized treatment strategies can produce better patient outcomes while reducing cost and treatment burden. Thus, among clinical and intervention scientists, there is a growing interest in conducting randomized clinical trials when one of the primary aims is estimation of a personalized treatment strategy. However, at present, there are no appropriate sample size formulae to assist in the design of such a trial. Furthermore, because the sampling distribution of the estimated outcome under an estimated optimal treatment strategy can be highly sensitive to small perturbations in the underlying generative model, sample size calculations based on standard (uncorrected) asymptotic approximations or computer simulations may not be reliable. We offer a simple and robust method for powering a single stage, two-armed randomized clinical trial when the primary aim is estimating the optimal single stage personalized treatment strategy. The proposed method is based on inverting a plugin projection confidence interval and is thereby regular and robust to small perturbations of the underlying generative model. The proposed method requires elicitation of two clinically meaningful parameters from clinical scientists and uses data from a small pilot study to estimate nuisance parameters, which are not easily elicited. The method performs well in simulated experiments and is illustrated using data from a pilot study of time to conception and fertility awareness. Copyright © 2015 John Wiley & Sons, Ltd.
NASA Technical Reports Server (NTRS)
White, P. R.; Little, R. R.
1985-01-01
A research effort was undertaken to develop personal computer based software for vibrational analysis. The software was developed to analytically determine the natural frequencies and mode shapes for the uncoupled lateral vibrations of the blade and counterweight assemblies used in a single bladed wind turbine. The uncoupled vibration analysis was performed in both the flapwise and chordwise directions for static rotor conditions. The effects of rotation on the uncoupled flapwise vibration of the blade and counterweight assemblies were evaluated for various rotor speeds up to 90 rpm. The theory, used in the vibration analysis codes, is based on a lumped mass formulation for the blade and counterweight assemblies. The codes are general so that other designs can be readily analyzed. The input for the codes is generally interactive to facilitate usage. The output of the codes is both tabular and graphical. Listings of the codes are provided. Predicted natural frequencies of the first several modes show reasonable agreement with experimental results. The analysis codes were originally developed on a DEC PDP 11/34 minicomputer and then downloaded and modified to run on an ITT XTRA personal computer. Studies conducted to evaluate the efficiency of running the programs on a personal computer as compared with the minicomputer indicated that, with the proper combination of hardware and software options, the efficiency of using a personal computer exceeds that of a minicomputer.
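The lumped-mass formulation can be illustrated with a minimal two-degree-of-freedom spring-mass chain fixed at one end; the closed-form solution below is generic textbook mechanics, not the report's blade/counterweight code, and the masses and stiffnesses are arbitrary.

```python
import math

# Two lumped masses m1, m2 in a chain fixed at one end: spring k1 connects the
# support to m1, spring k2 connects m1 to m2. Undamped free vibration gives
# det(K - w^2 M) = 0 with K = [[k1 + k2, -k2], [-k2, k2]], M = diag(m1, m2),
# i.e. the quadratic m1*m2*w^4 - (m2*(k1 + k2) + m1*k2)*w^2 + k1*k2 = 0.
def chain_frequencies_2dof(m1, m2, k1, k2):
    a = m1 * m2
    b = -(m2 * (k1 + k2) + m1 * k2)
    c = k1 * k2
    disc = math.sqrt(b * b - 4 * a * c)
    w2_low, w2_high = (-b - disc) / (2 * a), (-b + disc) / (2 * a)
    return [math.sqrt(w2_low) / (2 * math.pi),
            math.sqrt(w2_high) / (2 * math.pi)]  # natural frequencies in Hz

# Equal masses and stiffnesses, with k = 4*pi^2 chosen so the two mode
# frequencies come out as (sqrt(5)-1)/2 Hz and (sqrt(5)+1)/2 Hz.
freqs = chain_frequencies_2dof(1.0, 1.0, 4 * math.pi ** 2, 4 * math.pi ** 2)
print(freqs)
```

A general n-mass blade model follows the same pattern: assemble banded K and M matrices and solve the generalized eigenproblem numerically, which is well within the reach of the personal computers discussed above.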
Personalized modeling for real-time pressure ulcer prevention in sitting posture.
Luboz, Vincent; Bailet, Mathieu; Boichon Grivot, Christelle; Rochette, Michel; Diot, Bruno; Bucki, Marek; Payan, Yohan
2018-02-01
Ischial pressure ulcers are an important risk for every paraplegic person and a major public health issue. Pressure ulcers appear following excessive compression of the buttocks' soft tissues by bony structures, particularly the ischial and sacral bones. Current prevention techniques are mainly based on daily skin inspection to spot red patches or injuries. Nevertheless, most pressure ulcers occur internally and are difficult to detect early. Estimating internal strains within soft tissues could help to evaluate the risk of pressure ulcer. A subject-specific biomechanical model could be used to assess internal strains from measured skin surface pressures. However, a realistic 3D non-linear finite element buttock model, with different layers of tissue materials for skin, fat and muscles, requires minutes to hours to compute, ruling out its use in a real-time daily prevention context. In this article, we propose to accelerate these computations by using a reduced order modeling (ROM) technique based on proper orthogonal decompositions of the pressure and strain fields coupled with a machine learning method. ROM allows strains to be evaluated inside the model interactively (i.e. in less than a second) for any pressure field measured below the buttocks. In our case, with only 19 modes of variation of the pressure patterns, a divergence of one percent is observed in the evaluated strain field compared with the full-scale simulation. This reduced model could therefore be a first step towards interactive pressure ulcer prevention in a daily set-up. Copyright © 2017 Tissue Viability Society. Published by Elsevier Ltd. All rights reserved.
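The reduced-order pipeline can be sketched in a few lines: proper orthogonal decomposition (truncated SVD) of pressure snapshots, then a regression from modal coordinates to strain fields. The data below are synthetic and the least-squares map is a stand-in for the paper's machine-learning step; only the 19-mode truncation follows the abstract.

```python
import numpy as np

# Synthetic training snapshots lying (by construction) in a 19-mode
# subspace, standing in for offline finite-element runs.
rng = np.random.default_rng(0)
N, n_p, n_s, k = 200, 64, 128, 19    # snapshots, pressure/strain sizes, modes
P = rng.normal(size=(N, k)) @ rng.normal(size=(k, n_p))   # pressure maps
S = P @ rng.normal(size=(n_p, n_s)) * 0.01                # strain fields

# POD: truncated SVD of the centered pressure snapshots.
P_mean, S_mean = P.mean(axis=0), S.mean(axis=0)
U, sv, Vt = np.linalg.svd(P - P_mean, full_matrices=False)
coeffs = (P - P_mean) @ Vt[:k].T     # modal coordinates of each snapshot

# Regression from modal coordinates to (centered) strain fields.
W, *_ = np.linalg.lstsq(coeffs, S - S_mean, rcond=None)

def predict_strain(pressure_map):
    """Interactive-rate strain estimate from a measured pressure map."""
    return S_mean + ((pressure_map - P_mean) @ Vt[:k].T) @ W

err = np.abs(predict_strain(P[0]) - S[0]).max()
```

The online cost is two small matrix-vector products, which is what makes sub-second evaluation on embedded hardware plausible.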
An integrated framework for detecting suspicious behaviors in video surveillance
NASA Astrophysics Data System (ADS)
Zin, Thi Thi; Tin, Pyke; Hama, Hiromitsu; Toriu, Takashi
2014-03-01
In this paper, we propose an integrated framework for detecting suspicious behaviors in video surveillance systems deployed in public places such as railway stations, airports, and shopping malls. In particular, people loitering suspiciously, unattended objects left behind, and suspicious objects exchanged between persons are common security concerns in airports and other transit scenarios. Detecting them involves understanding the scene and events, analyzing human movements, recognizing controllable objects, and observing the effect of human movement on those objects. In the proposed framework, a multiple-background modeling technique, a high-level motion feature extraction method, and embedded Markov chain models are integrated for detecting suspicious behaviors in real-time video surveillance systems. Specifically, the framework employs a probability-based multiple-background modeling technique to detect moving objects. The velocity and distance measures are then computed as high-level motion features of the objects of interest. By combining the computed features with the first passage time probabilities of the embedded Markov chain, suspicious behaviors are analyzed to detect loitering persons, objects left behind, and human interactions such as fighting. The proposed framework has been tested using standard public datasets and our own video surveillance scenarios.
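First passage times of a Markov chain, the quantity the framework thresholds, are cheap to compute from the transition matrix: deleting the target state leaves a linear system (I - Q) t = 1 for the expected hitting times. A sketch with an invented three-state behaviour chain (the states and probabilities are illustrative, not the paper's model):

```python
import numpy as np

# Illustrative behaviour chain: 0 = walking, 1 = standing, 2 = loitering.
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.5, 0.2],
              [0.1, 0.3, 0.6]])

def mean_first_passage(P, target):
    """Expected number of steps to first reach `target` from each state."""
    n = P.shape[0]
    others = [i for i in range(n) if i != target]
    Q = P[np.ix_(others, others)]       # transitions among non-target states
    t = np.linalg.solve(np.eye(n - 1) - Q, np.ones(n - 1))
    out = np.zeros(n)
    out[others] = t
    return out

# A short expected passage time into the "loitering" state from a track's
# current state could then be flagged as suspicious.
t = mean_first_passage(P, target=2)
```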
NASA Astrophysics Data System (ADS)
Pakos, Wojciech
2015-09-01
The paper presents a numerical analysis of harmonically excited vibration of a cable-stayed footbridge caused by a load function simulating crouching (squats), while the static tension in chosen cables is changed. The intentional synchronized motion (e.g., squats) of a single person or a group of persons on the footbridge at a frequency close to a natural frequency of the structure may lead to resonant vibrations with large amplitudes. Appropriate tension changes in some cables detune the resonance: the change in structural stiffness shifts the natural frequency away from the excitation frequency. The research was carried out on a 3D computer model of a real structure, a cable-stayed steel footbridge in Leśnica, a quarter of Wrocław, Poland, with the help of standard FEM-based computer software (COSMOS/M System).
Estimating Skin Cancer Risk: Evaluating Mobile Computer-Adaptive Testing.
Djaja, Ngadiman; Janda, Monika; Olsen, Catherine M; Whiteman, David C; Chien, Tsair-Wei
2016-01-22
Response burden is a major detriment to questionnaire completion rates. Computer adaptive testing may offer advantages over non-adaptive testing, including reduction of numbers of items required for precise measurement. Our aim was to compare the efficiency of non-adaptive (NAT) and computer adaptive testing (CAT) facilitated by Partial Credit Model (PCM)-derived calibration to estimate skin cancer risk. We used a random sample from a population-based Australian cohort study of skin cancer risk (N=43,794). All 30 items of the skin cancer risk scale were calibrated with the Rasch PCM. A total of 1000 cases generated following a normal distribution (mean [SD] 0 [1]) were simulated using three Rasch models with three fixed-item (dichotomous, rating scale, and partial credit) scenarios, respectively. We calculated the comparative efficiency and precision of CAT and NAT (shortening of questionnaire length and the count difference number ratio less than 5% using independent t tests). We found that use of CAT led to smaller person standard error of the estimated measure than NAT, with substantially higher efficiency but no loss of precision, reducing response burden by 48%, 66%, and 66% for dichotomous, Rating Scale Model, and PCM models, respectively. CAT-based administrations of the skin cancer risk scale could substantially reduce participant burden without compromising measurement precision. A mobile computer adaptive test was developed to help people efficiently assess their skin cancer risk.
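The core CAT loop the study evaluates is simple to sketch for the dichotomous Rasch case: administer the item with the highest Fisher information at the current ability estimate, then re-estimate ability. The item bank and grid estimator below are illustrative, not the study's 30-item calibration (which used the polytomous Partial Credit Model).

```python
import numpy as np

# Illustrative dichotomous-Rasch item bank (difficulty parameters).
difficulties = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])

def p_correct(theta, b):
    """Rasch probability of a positive response."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def next_item(theta, administered):
    """Pick the unadministered item with maximal Fisher information
    I(theta) = p(theta)(1 - p(theta)) at the current ability estimate."""
    p = p_correct(theta, difficulties)
    info = p * (1 - p)
    info[list(administered)] = -np.inf
    return int(np.argmax(info))

def estimate_theta(items, responses, grid=np.linspace(-4, 4, 161)):
    """Grid maximum-likelihood ability estimate from responses so far."""
    ll = np.zeros_like(grid)
    for i, r in zip(items, responses):
        p = p_correct(grid, difficulties[i])
        ll += np.log(p) if r else np.log(1 - p)
    return grid[np.argmax(ll)]
```

Because information peaks where difficulty matches ability, the adaptive test converges on informative items quickly, which is the mechanism behind the 48-66% reduction in response burden reported above.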
NASA Astrophysics Data System (ADS)
Wenzel, H.; Wünsche, H. J.
1988-11-01
A description is given of a numerical model of a semiconductor laser with a quasioptic waveguide (index guide). This model can be used on a personal computer. The model can be used to find the radiation field distributions in the vertical and lateral directions, the pump currents at the threshold, and also to solve dynamic rate equations.
FleXConf: A Flexible Conference Assistant Using Context-Aware Notification Services
NASA Astrophysics Data System (ADS)
Armenatzoglou, Nikos; Marketakis, Yannis; Kriara, Lito; Apostolopoulos, Elias; Papavasiliou, Vicky; Kampas, Dimitris; Kapravelos, Alexandros; Kartsonakis, Eythimis; Linardakis, Giorgos; Nikitaki, Sofia; Bikakis, Antonis; Antoniou, Grigoris
Integrating context-aware notification services to ubiquitous computing systems aims at the provision of the right information to the right users, at the right time, in the right place, and on the right device, and constitutes a significant step towards the realization of the Ambient Intelligence vision. In this paper, we present FlexConf, a semantics-based system that supports location-based, personalized notification services for the assistance of conference attendees. Its special features include an ontology-based representation model, rule-based context-aware reasoning, and a novel positioning system for indoor environments.
Games as Artistic Medium: Interfacing Complexity Theory in Game-Based Art Pedagogy
ERIC Educational Resources Information Center
Patton, Ryan Matthew
2011-01-01
Having computer skills, let alone access to a personal computer, has become a necessary component of contemporary Western society and many parts of the world. Digital media literacy involves youth being able to view, participate in, and make creative works with technologies in personal and meaningful ways. Games, defined in this study as…
Partitioning-based mechanisms under personalized differential privacy.
Li, Haoran; Xiong, Li; Ji, Zhanglong; Jiang, Xiaoqian
2017-05-01
Differential privacy has recently emerged in private statistical aggregate analysis as one of the strongest privacy guarantees. A limitation of the model is that it provides the same privacy protection for all individuals in the database. However, it is common that data owners may have different privacy preferences for their data. Consequently, a global differential privacy parameter may provide excessive privacy protection for some users, while insufficient for others. In this paper, we propose two partitioning-based mechanisms, privacy-aware and utility-based partitioning, to handle personalized differential privacy parameters for each individual in a dataset while maximizing utility of the differentially private computation. The privacy-aware partitioning is to minimize the privacy budget waste, while utility-based partitioning is to maximize the utility for a given aggregate analysis. We also develop a t-round partitioning to take full advantage of remaining privacy budgets. Extensive experiments using real datasets show the effectiveness of our partitioning mechanisms.
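The basic idea of partitioning under personalized budgets can be sketched as follows. This is a simplified illustration, not the paper's privacy-aware or utility-based algorithms: users are grouped by their declared budget, and each group's count is released with Laplace noise calibrated to the group's (here, common) budget, so no user receives less protection than they asked for.

```python
import numpy as np

rng = np.random.default_rng(1)
values = rng.integers(0, 2, size=100)           # private bits, one per user
eps = rng.choice([0.1, 0.5, 1.0], size=100)     # personal privacy budgets

def private_count(values, eps):
    """Personalized-DP count: per-budget partitions, Laplace mechanism.
    Sensitivity of a count query is 1, so scale = 1 / epsilon."""
    total = 0.0
    for e in np.unique(eps):
        group = values[eps == e]
        total += group.sum() + rng.laplace(scale=1.0 / e)
    return total

noisy = private_count(values, eps)
```

A single global parameter would have to use the smallest epsilon (0.1) for everyone; partitioning lets the high-budget groups contribute with far less noise, which is the utility gain the paper optimizes.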
Papageorgiou, Elpiniki I; Jayashree Subramanian; Karmegam, Akila; Papandrianos, Nikolaos
2015-11-01
Breast cancer is the most deadly disease affecting women, and it is thus natural for women aged 40-49 years who have a family history of breast cancer or other related cancers to assess their personal risk of developing familial breast cancer (FBC). Moreover, as each woman possesses a different level of risk depending on her family history, genetic predisposition and personal medical history, an individualized care setting mechanism needs to be identified so that appropriate risk assessment, counseling, screening, and prevention options can be determined by health care professionals. The presented work aims at developing a soft-computing-based medical decision support system using a Fuzzy Cognitive Map (FCM) that assists health care professionals in deciding the individualized care setting mechanism based on the FBC risk level of a given woman. The FCM-based FBC risk management system uses nonlinear Hebbian learning (NHL) to learn causal weights from 40 patient records and achieves 95% diagnostic accuracy. The results obtained from the proposed model are in concurrence with the comprehensive risk evaluation tool based on the Tyrer-Cuzick model for 38/40 patient cases (95%). In addition, the proposed model identifies high-risk women with higher prediction accuracy than the standard Gail and NSABP models. The testing accuracy of the proposed model using the 10-fold cross-validation technique outperforms other standard machine-learning-based inference engines as well as previous FCM-based risk prediction methods for BC. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rest, J.
1995-08-01
This report describes the primary physical models that form the basis of the DART mechanistic computer model for calculating fission-product-induced swelling of aluminum dispersion fuels; the calculated results are compared with test data. In addition, DART calculates irradiation-induced changes in the thermal conductivity of the dispersion fuel, as well as fuel restructuring due to aluminum-fuel reaction, amorphization, and recrystallization. Input instructions for execution on mainframe, workstation, and personal computers are provided, as is a description of DART output. The theory of fission gas behavior and its effect on fuel swelling is discussed. The behavior of these fission products in both crystalline and amorphous fuel and in the presence of irradiation-induced recrystallization and crystalline-to-amorphous-phase change phenomena is presented, as are models for these irradiation-induced processes.
Puskaric, Marin; von Helversen, Bettina; Rieskamp, Jörg
2017-08-01
Social information such as observing others can improve performance in decision making. In particular, social information has been shown to be useful when finding the best solution on one's own is difficult, costly, or dangerous. However, past research suggests that when making decisions people do not always consider other people's behaviour when it is at odds with their own experiences. Furthermore, the cognitive processes guiding the integration of social information with individual experiences are still under debate. Here, we conducted two experiments to test whether information about other persons' behaviour influenced people's decisions in a classification task. Furthermore, we examined how social information is integrated with individual learning experiences by testing different computational models. Our results show that social information had a small but reliable influence on people's classifications. The best computational model suggests that in categorization people first make up their own mind based on the non-social information, which is then updated by the social information.
Smart learning services based on smart cloud computing.
Kim, Svetlana; Song, Su-Mi; Yoon, Yong-Ik
2011-01-01
Context-aware technologies can make e-learning services smarter and more efficient since context-aware services are based on the user's behavior. To add those technologies into existing e-learning services, a service architecture model is needed to transform the existing e-learning environment, which is situation-aware, into an environment that understands context as well. The context-awareness in e-learning may include awareness of the user profile and terminal context. In this paper, we propose a new notion of service that provides context-awareness to smart learning content in a cloud computing environment. We suggest the elastic four smarts (E4S)--smart pull, smart prospect, smart content, and smart push--concept for cloud services so that smart learning services are possible. The E4S focuses on meeting the users' needs by collecting and analyzing users' behavior, prospecting future services, building corresponding contents, and delivering the contents through the cloud computing environment. Users' behavior can be collected through mobile devices such as smart phones that have built-in sensors. As a result, the proposed smart e-learning model in a cloud computing environment provides personalized and customized learning services to its users.
Muscle parameters estimation based on biplanar radiography.
Dubois, G; Rouch, P; Bonneau, D; Gennisson, J L; Skalli, W
2016-11-01
The evaluation of muscle and joint forces in vivo is still a challenge. Musculoskeletal models are used to compute forces based on movement analysis. Most of them are built from a scaled generic model based on cadaver measurements, which provides a low level of personalization, or from magnetic resonance images, which provide a personalized model in the lying position. This study proposed an original two-step method to obtain a subject-specific musculoskeletal model in 30 min, based solely on biplanar X-rays. First, the subject-specific 3D geometry of bones and skin envelopes was reconstructed from biplanar X-ray radiography. Then, 2200 corresponding control points were identified between a reference model and the subject-specific X-ray model. Finally, the shape of 21 lower limb muscles was estimated using a non-linear transformation between the control points in order to fit the muscle shapes of the reference model to the X-ray model. Twelve musculoskeletal models were reconstructed and compared to their references. The muscle volume was not accurately estimated, with a standard deviation (SD) ranging from 10 to 68%. However, this method provided an accurate estimation of the muscle line of action, with an SD of the length difference lower than 2% and a positioning error lower than 20 mm. The moment arm was also well estimated, with SD lower than 15% for most muscles, which was significantly better than the scaled generic model. This method opens the way to a quick modeling approach for gait analysis based on biplanar radiography.
Developing Educational Computer Animation Based on Human Personality Types
ERIC Educational Resources Information Center
Musa, Sajid; Ziatdinov, Rushan; Sozcu, Omer Faruk; Griffiths, Carol
2015-01-01
Computer animation in the past decade has become one of the most noticeable features of technology-based learning environments. By its definition, it refers to simulated motion pictures showing movement of drawn objects, and is often defined as the art in movement. Its educational application known as educational computer animation is considered…
Computing Information Value from RDF Graph Properties
DOE Office of Scientific and Technical Information (OSTI.GOV)
al-Saffar, Sinan; Heileman, Gregory
2010-11-08
Information value has been implicitly utilized and mostly non-subjectively computed in information retrieval (IR) systems. We explicitly define and compute the value of an information piece as a function of two parameters, the first is the potential semantic impact the target information can subjectively have on its recipient's world-knowledge, and the second parameter is trust in the information source. We model these two parameters as properties of RDF graphs. Two graphs are constructed, a target graph representing the semantics of the target body of information and a context graph representing the context of the consumer of that information. We compute information value subjectively as a function of both potential change to the context graph (impact) and the overlap between the two graphs (trust). Graph change is computed as a graph edit distance measuring the dissimilarity between the context graph before and after the learning of the target graph. A particular application of this subjective information valuation is in the construction of a personalized ranking component in Web search engines. Based on our method, we construct a Web re-ranking system that personalizes the information experience for the information-consumer.
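The two-parameter valuation can be sketched with RDF triples as Python sets. This is a deliberate simplification: the paper measures impact via graph edit distance on the context graph, whereas the sketch below uses set difference and overlap as crude proxies, with invented triples and an invented weighting.

```python
# Context graph: the consumer's current world-knowledge as a triple set.
context = {("alice", "knows", "bob"), ("bob", "worksAt", "lab")}
# Target graph: the candidate piece of information.
target = {("bob", "worksAt", "lab"), ("bob", "authorOf", "paper1")}

def information_value(context, target, w_impact=0.5, w_trust=0.5):
    """Value = weighted mix of impact (novel triples that would change
    the context graph) and trust (triples already corroborated)."""
    n = max(len(target), 1)
    impact = len(target - context) / n   # proxy for graph change
    trust = len(target & context) / n    # proxy for source overlap
    return w_impact * impact + w_trust * trust

v = information_value(context, target)
```

A re-ranker would then sort search results by this score, personalizing the ordering per context graph.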
Design and Implementation of an MC68020-Based Educational Computer Board
1989-12-01
device and the other for a Macintosh personal computer. A stored program can be installed in 8K bytes of Programmable Read Only Memory (PROM) to initialize... MHz. It includes four Static Random Access Memory (SRAM) chips which provide a storage of 32K bytes, and two Programmable Array Logic (PAL) chips...
Legato: Personal Computer Software for Analyzing Pressure-Sensitive Paint Data
NASA Technical Reports Server (NTRS)
Schairer, Edward T.
2001-01-01
'Legato' is personal computer software for analyzing radiometric pressure-sensitive paint (PSP) data. The software is written in the C programming language and executes under Windows 95/98/NT operating systems. It includes all operations normally required to convert pressure-paint image intensities to normalized pressure distributions mapped to physical coordinates of the test article. The program can analyze data from both single- and bi-luminophore paints and provides for both in situ and a priori paint calibration. In addition, there are functions for determining paint calibration coefficients from calibration-chamber data. The software is designed as a self-contained, interactive research tool that requires as input only the bare minimum of information needed to accomplish each function, e.g., images, model geometry, and paint calibration coefficients (for a priori calibration) or pressure-tap data (for in situ calibration). The program includes functions that can be used to generate needed model geometry files for simple model geometries (e.g., airfoils, trapezoidal wings, rotor blades) based on the model planform and airfoil section. All data files except images are in ASCII format and thus are easily created, read, and edited. The program does not use database files. This simplifies setup but makes the program inappropriate for analyzing massive amounts of data from production wind tunnels. Program output consists of Cartesian plots, false-colored real and virtual images, pressure distributions mapped to the surface of the model, assorted ASCII data files, and a text file of tabulated results. Graphical output is displayed on the computer screen and can be saved as publication-quality (PostScript) files.
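Radiometric PSP data reduction of the kind Legato performs commonly rests on a Stern-Volmer-type calibration relating intensity ratio to pressure ratio, I_ref/I = A + B(p/p_ref). A minimal sketch of that conversion; the coefficients are illustrative, not Legato's, and real processing adds image registration and temperature correction.

```python
# Illustrative Stern-Volmer calibration coefficients (A + B should be 1
# so that reference intensity maps to reference pressure).
A, B = 0.15, 0.85

def pressure_ratio(I_ref, I):
    """Invert I_ref/I = A + B * (p/p_ref) for the pressure ratio p/p_ref."""
    return (I_ref / I - A) / B

# At reference conditions (I == I_ref) the recovered ratio is 1.
r = pressure_ratio(1.0, 1.0)
```

Applied pixel-wise to wind-off and wind-on images, this yields the normalized pressure distribution that is then mapped to the model geometry.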
NASA Astrophysics Data System (ADS)
Zhang, Ying; Feng, Yuanming; Wang, Wei; Yang, Chengwen; Wang, Ping
2017-03-01
A novel and versatile “bottom-up” approach is developed to estimate the radiobiological effect of clinical radiotherapy. The model consists of multi-scale Monte Carlo simulations from the organ to the cell level. At the cellular level, accumulated damage is computed using a spectrum-based accumulation algorithm and a predefined cellular damage database. The damage repair mechanism is modeled by an expanded reaction-rate two-lesion kinetic model, which was calibrated by replicating a radiobiological experiment. Multi-scale modeling is then performed on a lung cancer patient under conventional fractionated irradiation. The cell killing effects of two representative voxels (the isocenter and a peripheral voxel of the tumor) are computed and compared. At the microscopic level, the nucleus dose and damage yields vary among the nuclei within the voxels. A slightly larger percentage of cDSB yield is observed for the peripheral voxel (55.0%) than for the isocenter one (52.5%). For the isocenter voxel, the survival fraction increases monotonically as the oxygen level is reduced. Under an extreme anoxic condition (0.001%), the survival fraction is calculated to be 80% and the hypoxia reduction factor reaches a maximum value of 2.24. In conclusion, with biology-related variations, the proposed multi-scale approach is more versatile than existing approaches for evaluating personalized radiobiological effects in radiotherapy.
Bipartite graphs as models of population structures in evolutionary multiplayer games.
Peña, Jorge; Rochat, Yannick
2012-01-01
By combining evolutionary game theory and graph theory, "games on graphs" study the evolutionary dynamics of frequency-dependent selection in population structures modeled as geographical or social networks. Networks are usually represented by means of unipartite graphs, and social interactions by two-person games such as the famous prisoner's dilemma. Unipartite graphs have also been used for modeling interactions going beyond pairwise interactions. In this paper, we argue that bipartite graphs are a better alternative to unipartite graphs for describing population structures in evolutionary multiplayer games. To illustrate this point, we make use of bipartite graphs to investigate, by means of computer simulations, the evolution of cooperation under the conventional and the distributed N-person prisoner's dilemma. We show that several implicit assumptions arising from the standard approach based on unipartite graphs (such as the definition of replacement neighborhoods, the intertwining of individual and group diversity, and the large overlap of interaction neighborhoods) can have a large impact on the resulting evolutionary dynamics. Our work provides a clear example of the importance of construction procedures in games on graphs, of the suitability of bigraphs and hypergraphs for computational modeling, and of the importance of concepts from social network analysis such as centrality, centralization and bipartite clustering for the understanding of dynamical processes occurring on networked population structures.
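A bipartite population structure is naturally encoded as a membership matrix (individuals by groups), and the N-person prisoner's dilemma payoff then follows from each group's public good. A minimal sketch with an invented 4-individual, 2-group structure and illustrative game parameters:

```python
import numpy as np

# Bipartite structure as a membership matrix B: B[i, g] = 1 iff
# individual i belongs to group g. Illustrative 4 x 2 example.
B = np.array([[1, 0],
              [1, 1],
              [0, 1],
              [1, 0]])
coop = np.array([1, 0, 1, 1])   # strategies: 1 = cooperate, 0 = defect
r, c = 3.0, 1.0                 # enhancement factor and cooperation cost

def payoffs(B, coop, r, c):
    """Accumulated N-person PD (public goods) payoff over all of an
    individual's groups: share of the enhanced pot, minus own cost."""
    out = np.zeros(len(coop), dtype=float)
    for g in range(B.shape[1]):
        members = np.flatnonzero(B[:, g])
        pot = r * c * coop[members].sum() / len(members)
        out[members] += pot - c * coop[members]
    return out

pay = payoffs(B, coop, r, c)
```

Note how the defector (individual 1) free-rides in both of its groups, illustrating the dilemma; an evolutionary simulation would update strategies from these payoffs over a replacement neighborhood defined on the same bigraph.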
Data base development and research and editorial support
NASA Technical Reports Server (NTRS)
1988-01-01
The Life Sciences Bibliographic Data Base was created in 1981 and subsequently expanded. A systematic, professional system was developed to collect, organize, and disseminate information about scientific publications resulting from research. The data base consists of bibliographic information and hard copies of all research papers published by Life Sciences-supported investigators. Technical improvements were instituted in the database. To minimize costs, take advantage of advances in personal computer technology, and achieve maximum flexibility and control, the data base was transferred from the JSC computer to personal computers at George Washington University (GWU). GWU also performed a range of related activities such as conducting in-depth searches on a variety of subjects, retrieving scientific literature, preparing presentations, summarizing research progress, answering correspondence requiring reference support, and providing writing and editorial support.
Incorporating CLIPS into a personal-computer-based Intelligent Tutoring System
NASA Technical Reports Server (NTRS)
Mueller, Stephen J.
1990-01-01
A large number of Intelligent Tutoring Systems (ITS's) have been built since they were first proposed in the early 1970's. Research conducted on the use of the best of these systems has demonstrated their effectiveness in tutoring in selected domains. Computer Sciences Corporation, Applied Technology Division, Houston Operations has been tasked by the Spacecraft Software Division at NASA/Johnson Space Center (NASA/JSC) to develop a number of ITS's in a variety of domains and on many different platforms. This paper will address issues facing the development of an ITS on a personal computer using the CLIPS (C Language Integrated Production System) language. For an ITS to be widely accepted, not only must it be effective, flexible, and very responsive, it must also be capable of functioning on readily available computers. There are many issues to consider when using CLIPS to develop an ITS on a personal computer. Some of these issues are the following: when to use CLIPS and when to use a procedural language such as C, how to maximize speed and minimize memory usage, and how to decrease the time required to load your rule base once you are ready to deliver the system. Based on experiences in developing the CLIPS Intelligent Tutoring System (CLIPSITS) on an IBM PC clone and an intelligent Physics Tutor on a Macintosh 2, this paper reports results on how to address some of these issues. It also suggests approaches for maintaining a powerful learning environment while delivering robust performance within the speed and memory constraints of the personal computer.
Expertise transfer for expert system design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boose, J.H.
This book is about the Expertise Transfer System, a computer program which interviews experts and helps them build expert systems, i.e., computer programs that use knowledge from experts to make decisions and judgements under conditions of uncertainty. The techniques are useful to anyone who uses decision-making information based on the expertise of others. The methods can also be applied to personal decision-making. The interviewing methodology is borrowed from a branch of psychology called Personal Construct Theory. It is not necessary to use a computer to take advantage of the techniques from Personal Construct Theory; the fundamental procedures used by the Expertise Transfer System can be performed using paper and pencil. It is not necessary that the reader understand very much about computers to understand the ideas in this book. The few relevant concepts from computer science and expert systems that are needed are explained in a straightforward manner. Ideas from Personal Construct Psychology are also introduced as needed.
Tsai, Tsung-Yuan; Li, Jing-Sheng; Wang, Shaobai; Li, Pingyue; Kwon, Young-Min; Li, Guoan
2015-01-01
The statistical shape model (SSM) method that uses 2D images of the knee joint to predict the three-dimensional (3D) joint surface model has been reported in the literature. In this study, we constructed a SSM database using 152 human computed tomography (CT) knee joint models, including the femur, tibia and patella and analysed the characteristics of each principal component of the SSM. The surface models of two in vivo knees were predicted using the SSM and their 2D bi-plane fluoroscopic images. The predicted models were compared to their CT joint models. The differences between the predicted 3D knee joint surfaces and the CT image-based surfaces were 0.30 ± 0.81 mm, 0.34 ± 0.79 mm and 0.36 ± 0.59 mm for the femur, tibia and patella, respectively (average ± standard deviation). The computational time for each bone of the knee joint was within 30 s using a personal computer. The analysis of this study indicated that the SSM method could be a useful tool to construct 3D surface models of the knee with sub-millimeter accuracy in real time. Thus, it may have a broad application in computer-assisted knee surgeries that require 3D surface models of the knee.
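At its core an SSM of this kind is a PCA over aligned training shapes; a new shape is expressed as the mean plus a few principal modes, and fitting to fluoroscopic silhouettes amounts to optimizing those mode coefficients. A minimal sketch with random vectors standing in for the 152 aligned knee surfaces (the fluoroscopic fitting step is not shown):

```python
import numpy as np

# Stand-in training set: each row is a flattened vector of surface-point
# coordinates for one aligned training shape (synthetic data here).
rng = np.random.default_rng(0)
n_shapes, n_coords = 152, 300
shapes = rng.normal(size=(n_shapes, n_coords))

# PCA of the centered shapes via SVD; rows of Vt are the shape modes.
mean_shape = shapes.mean(axis=0)
U, sv, Vt = np.linalg.svd(shapes - mean_shape, full_matrices=False)

def project(shape, k):
    """Coefficients of `shape` on the first k principal modes."""
    return (shape - mean_shape) @ Vt[:k].T

def reconstruct(coeffs, k):
    """Shape instance generated from k mode coefficients."""
    return mean_shape + coeffs @ Vt[:k]

# With all nonzero modes retained, training shapes are reproduced exactly.
k = n_shapes - 1
s0 = reconstruct(project(shapes[0], k), k)
```

In practice far fewer modes are kept, and the 2D/3D fit searches coefficient space for the shape whose projected contours best match the bi-plane fluoroscopic images.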
Computational Psychometrics for Modeling System Dynamics during Stressful Disasters.
Cipresso, Pietro; Bessi, Alessandro; Colombo, Desirée; Pedroli, Elisa; Riva, Giuseppe
2017-01-01
Disasters can be very stressful events. However, computational models of stress require data that might be very difficult to collect during disasters. Moreover, personal experiences are not repeatable, so it is not possible to collect bottom-up information when building a coherent model. To overcome these problems, we propose the use of computational models and virtual reality integration to recreate disaster situations, while examining possible dynamics in order to understand human behavior and relative consequences. By providing realistic parameters associated with disaster situations, computational scientists can work more closely with emergency responders to improve the quality of interventions in the future.
Ahamed, Nizam U; Sundaraj, Kenneth; Poo, Tarn S
2013-03-01
This article describes the design of a robust, inexpensive, easy-to-use, small, and portable online electromyography acquisition system for monitoring electromyography signals during rehabilitation. This single-channel (one-muscle) system was connected via the universal serial bus port to a programmable Windows operating system handheld tablet personal computer for storage and analysis of the data by the end user. The raw electromyography signals were amplified in order to convert them to an observable scale. The inherent 50 Hz noise (Malaysia) from power-line electromagnetic interference was then eliminated using a single-hybrid IC notch filter. These signals were sampled by a signal processing module and converted into 24-bit digital data. An algorithm was developed and programmed to transmit the digital data to the computer, where it was reassembled and displayed using software. Finally, the device was furnished with a graphical user interface to display the streaming muscle-activity signal on the handheld tablet personal computer. This battery-operated system was tested on the biceps brachii muscles of 20 healthy subjects, and the results were compared to those obtained with a commercial single-channel (one-muscle) electromyography acquisition system. For activities involving muscle contractions, the results obtained using the developed device were comparable (across various statistical parameters) to those obtained from a commercially available physiological signal monitoring system, for both male and female subjects. In addition, the key advantage of this developed system over conventional desktop personal computer-based acquisition systems is its portability due to the use of a tablet personal computer, on which the results are accessible graphically as well as stored in text (comma-separated value) form.
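The 50 Hz mains rejection described above is done in hardware (a hybrid-IC notch filter); as a software illustration, an equivalent biquad notch can be sketched in pure Python, using the standard RBJ biquad coefficient formulas (the sampling rate and Q below are hypothetical, not taken from the article):

```python
import math

def notch_coeffs(f0, fs, q=30.0):
    """Biquad notch coefficients (RBJ cookbook form), normalised by a0."""
    w0 = 2 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2 * q)
    b = [1.0, -2 * math.cos(w0), 1.0]
    a = [1 + alpha, -2 * math.cos(w0), 1 - alpha]
    return [c / a[0] for c in b], [c / a[0] for c in a]

def filt(b, a, x):
    """Direct-form I filtering of sequence x."""
    y, x1, x2, y1, y2 = [], 0.0, 0.0, 0.0, 0.0
    for v in x:
        out = b[0] * v + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
        x2, x1 = x1, v
        y2, y1 = y1, out
        y.append(out)
    return y

fs = 1000.0
b, a = notch_coeffs(50.0, fs)
mains = [math.sin(2 * math.pi * 50 * n / fs) for n in range(2000)]   # interference
clean = filt(b, a, mains)                # 50 Hz component is suppressed
low = [math.sin(2 * math.pi * 5 * n / fs) for n in range(2000)]      # in-band signal
passed = filt(b, a, low)                 # 5 Hz component passes nearly unchanged
```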
Comparative study of methods for recognition of an unknown person's action from a video sequence
NASA Astrophysics Data System (ADS)
Hori, Takayuki; Ohya, Jun; Kurumisawa, Jun
2009-02-01
This paper proposes a tensor-decomposition-based method that can recognize an unknown person's action from a video sequence, where the unknown person is not included in the database (tensor) used for the recognition. The tensor consists of persons, actions, and time-series image features. For the observed unknown person's action, one of the actions stored in the tensor is assumed. Using the motion signature obtained from this assumption, the unknown person's actions are synthesized, and the actions of one of the persons in the tensor are replaced by the synthesized actions. Then, the core tensor for the replaced tensor is computed. This process is repeated over the actions and persons, and for each iteration the difference between the replaced and original core tensors is computed. The assumption that gives the minimal difference is the action recognition result. For the time-series image features to be stored in the tensor and to be extracted from the observed video sequence, a feature based on the contour shape of the human body silhouette is used. To show the validity of the proposed method, it is experimentally compared with a nearest-neighbor rule and a principal-component-analysis-based method. Experiments using seven kinds of actions performed by 33 persons show that the proposed method achieves better recognition accuracies for the seven actions than the other methods.
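The core-tensor computation at the heart of such methods can be sketched with a higher-order SVD (HOSVD): unfold the persons × actions × features tensor along each mode, take the left singular vectors as factor matrices, and project the tensor onto them. A toy example with random data (the dimensions are illustrative, not those of the experiments):

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding of a tensor into a matrix."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T):
    """Factor matrices (one per mode) and the core tensor of T."""
    Us = [np.linalg.svd(unfold(T, m), full_matrices=False)[0]
          for m in range(T.ndim)]
    core = T
    for mode, U in enumerate(Us):
        # Project mode `mode` onto the factor matrix U (contract with U.T).
        core = np.moveaxis(np.tensordot(U.T, core, axes=(1, mode)), 0, mode)
    return core, Us

rng = np.random.default_rng(1)
T = rng.normal(size=(5, 7, 6))       # persons x actions x features (toy sizes)
core, Us = hosvd(T)

# Multiplying the core back by the factor matrices reconstructs the tensor.
recon = core
for mode, U in enumerate(Us):
    recon = np.moveaxis(np.tensordot(U, recon, axes=(1, mode)), 0, mode)
```

The recognition loop in the paper compares such core tensors before and after substituting synthesized actions into the database tensor.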
Orhan, Umut; Erdogmus, Deniz; Roark, Brian; Purwar, Shalini; Hild, Kenneth E.; Oken, Barry; Nezamfar, Hooman; Fried-Oken, Melanie
2013-01-01
Event related potentials (ERP) corresponding to a stimulus in electroencephalography (EEG) can be used to detect the intent of a person for brain computer interfaces (BCI). This paradigm is widely utilized to build letter-by-letter text input systems using BCI. Nevertheless, a BCI typewriter that depends only on EEG responses will not be sufficiently accurate for single-trial operation in general, and existing systems utilize multi-trial schemes to achieve accuracy at the cost of speed. Hence, the incorporation of a language-model-based prior or additional evidence is vital to improve accuracy and speed. In this paper, we study the effects of Bayesian fusion of an n-gram language model with a regularized discriminant analysis ERP detector for EEG-based BCIs. The letter classification accuracies are rigorously evaluated for varying language model orders as well as the number of ERP-inducing trials. The results demonstrate that the language models contribute significantly to letter classification accuracy. Specifically, we find that a BCI speller supported by a 4-gram language model may achieve the same performance using 3-trial ERP classification for the initial letters of words and single-trial ERP classification for the subsequent ones. Overall, fusion of evidence from EEG and language models yields a significant opportunity to increase the word rate of a BCI-based typing system. PMID:22255652
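Bayesian fusion of an ERP likelihood with an n-gram language-model prior amounts to multiplying the two distributions over candidate letters and renormalising. A toy sketch with made-up probabilities (the numbers are purely illustrative, not from the study):

```python
def fuse(erp_likelihood, lm_prior):
    """Posterior over letters: p(letter|EEG, history) ∝ p(EEG|letter)·p(letter|history)."""
    post = {c: erp_likelihood[c] * lm_prior[c] for c in erp_likelihood}
    z = sum(post.values())
    return {c: p / z for c, p in post.items()}

# Hypothetical single-trial ERP scores over three candidate letters:
erp = {"a": 0.30, "o": 0.35, "q": 0.35}
# Hypothetical 4-gram prior given the preceding letters of the word:
lm = {"a": 0.50, "o": 0.45, "q": 0.05}

posterior = fuse(erp, lm)
```

Here the noisy ERP evidence alone cannot distinguish "o" from "q", but the language-model prior resolves the ambiguity, which is how the fusion allows fewer ERP trials per letter.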
Inventory of environmental impact models related to energy technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Owen, P.T.; Dailey, N.S.; Johnson, C.A.
The purpose of this inventory is to identify and collect data on computer simulations and computational models related to the environmental effects of energy source development, energy conversion, or energy utilization. Information for 33 data fields was sought for each model reported. All of the information which could be obtained within the time allotted for completion of the project is presented for each model listed. Efforts will be continued toward acquiring the needed information. Readers who are interested in these particular models are invited to contact ESIC for assistance in locating them. In addition to the standard bibliographic information, other data fields of interest to modelers, such as computer hardware and software requirements, algorithms, applications, and existing model validation information, are included. Indexes are provided for contact person, acronym, keyword, and title. The models are grouped into the following categories: atmospheric transport, air quality, aquatic transport, terrestrial food chains, soil transport, aquatic food chains, water quality, dosimetry and human effects, animal effects, plant effects, and generalized environmental transport. Within these categories, the models are arranged alphabetically by last name of the contact person.
The Cell Collective: Toward an open and collaborative approach to systems biology
2012-01-01
Background Despite decades of new discoveries in biomedical research, the overwhelming complexity of cells has been a significant barrier to a fundamental understanding of how cells work as a whole. As such, the holistic study of biochemical pathways requires computer modeling. Due to the complexity of cells, it is not feasible for one person or group to model the cell in its entirety. Results The Cell Collective is a platform that allows the world-wide scientific community to create these models collectively. Its interface enables users to build and use models without specifying any mathematical equations or computer code - addressing one of the major hurdles with computational research. In addition, this platform allows scientists to simulate and analyze the models in real-time on the web, including the ability to simulate loss/gain of function and test what-if scenarios in real time. Conclusions The Cell Collective is a web-based platform that enables laboratory scientists from across the globe to collaboratively build large-scale models of various biological processes, and simulate/analyze them in real time. In this manuscript, we show examples of its application to a large-scale model of signal transduction. PMID:22871178
Terrace Layout Using a Computer Assisted System
USDA-ARS?s Scientific Manuscript database
Development of a web-based terrace design tool based on the MOTERR program is presented, along with representative layouts for conventional and parallel terrace systems. Using digital elevation maps and geographic information systems (GIS), this tool utilizes personal computers to rapidly construct ...
ERIC Educational Resources Information Center
Loh, Christian Sebastian
2001-01-01
Examines how mobile computers, or personal digital assistants (PDAs), can be used in a Web-based learning environment. Topics include wireless networks on college campuses; online learning; Web-based learning technologies; synchronous and asynchronous communication via the Web; content resources; Web connections; and collaborative learning. (LRW)
ERIC Educational Resources Information Center
Mechling, Linda C.; Gast, David L.; Langone, John
2002-01-01
A study evaluated use of computer-based video instruction to teach generalized reading of grocery store aisle signs and the location of the corresponding grocery items to four students (ages 9-17) with mental retardation. The computer-based video program was successful in teaching generalized reading of signs and the location of items. (Contains…
A PC-Based Controller for Dextrous Arms
NASA Technical Reports Server (NTRS)
Fiorini, Paolo; Seraji, Homayoun; Long, Mark
1996-01-01
This paper describes the architecture and performance of a PC-based controller for 7-DOF dextrous manipulators. The computing platform is a 486-based personal computer equipped with a bus extender to access the robot Multibus controller, together with a single board computer as the graphical engine, and with a parallel I/O board to interface with a force-torque sensor mounted on the manipulator wrist.
Deformed Palmprint Matching Based on Stable Regions.
Wu, Xiangqian; Zhao, Qiushi
2015-12-01
Palmprint recognition (PR) is an effective technology for personal recognition. A main problem, which deteriorates the performance of PR, is the deformations of palmprint images. This problem becomes more severe on contactless occasions, in which images are acquired without any guiding mechanisms, and hence critically limits the applications of PR. To solve the deformation problems, in this paper, a model for non-linearly deformed palmprint matching is derived by approximating non-linear deformed palmprint images with piecewise-linear deformed stable regions. Based on this model, a novel approach for deformed palmprint matching, named key point-based block growing (KPBG), is proposed. In KPBG, an iterative M-estimator sample consensus algorithm based on scale invariant feature transform features is devised to compute piecewise-linear transformations to approximate the non-linear deformations of palmprints, and then, the stable regions complying with the linear transformations are decided using a block growing algorithm. Palmprint feature extraction and matching are performed over these stable regions to compute matching scores for decision. Experiments on several public palmprint databases show that the proposed models and the KPBG approach can effectively solve the deformation problem in palmprint verification and outperform the state-of-the-art methods.
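KPBG approximates non-linear palmprint deformation with piecewise-linear transformations estimated from keypoint matches. The inner step, fitting one linear (affine) transform to matched point pairs by least squares, can be sketched as follows; the point coordinates are synthetic, and the full method additionally uses SIFT matching and an iterative M-estimator sample consensus loop, which are not shown:

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares 2-D affine transform mapping src points onto dst points."""
    A = np.hstack([src, np.ones((len(src), 1))])   # homogeneous (n, 3) design matrix
    # Solve A @ M ≈ dst for the 3x2 affine parameter matrix M.
    M, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return M

def apply_affine(M, pts):
    return np.hstack([pts, np.ones((len(pts), 1))]) @ M

# Synthetic matched keypoints: a slight shear plus a translation.
src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
true_M = np.array([[1.1, 0.05], [-0.05, 0.95], [2.0, -1.0]])
dst = apply_affine(true_M, src)

M = fit_affine(src, dst)    # recovers the deformation of one stable region
```

In the paper, such a transform is fitted per region, and the block-growing step keeps only the regions whose pixels comply with the fitted linear model.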
Bounthavong, Mark; Pruitt, Larry D; Smolenski, Derek J; Gahm, Gregory A; Bansal, Aasthaa; Hansen, Ryan N
2018-02-01
Introduction Home-based telebehavioural healthcare improves access to mental health care for patients restricted by travel burden. However, there is limited evidence assessing the economic value of home-based telebehavioural health care compared to in-person care. We sought to compare the economic impact of home-based telebehavioural health care and in-person care for depression among current and former US service members. Methods We performed trial-based cost-minimisation and cost-utility analyses to assess the economic impact of home-based telebehavioural health care versus in-person behavioural care for depression. Our analyses focused on the payer perspective (Department of Defense and Department of Veterans Affairs) at three months. We also performed a scenario analysis where all patients possessed video-conferencing technology that was approved by these agencies. The cost-utility analysis evaluated the impact of different depression categories on the incremental cost-effectiveness ratio. One-way and probabilistic sensitivity analyses were performed to test the robustness of the model assumptions. Results In the base case analysis the total direct cost of home-based telebehavioural health care was higher than in-person care (US$71,974 versus US$20,322). Assuming that patients possessed government-approved video-conferencing technology, home-based telebehavioural health care was less costly compared to in-person care (US$19,177 versus US$20,322). In one-way sensitivity analyses, the proportion of patients possessing personal computers was a major driver of direct costs. In the cost-utility analysis, home-based telebehavioural health care was dominant when patients possessed video-conferencing technology. Results from probabilistic sensitivity analyses did not differ substantially from base case results. 
Discussion Home-based telebehavioural health care is dependent on the cost of supplying video-conferencing technology to patients but offers the opportunity to increase access to care. Health-care policies centred on implementation of home-based telebehavioural health care should ensure that these technologies are able to be successfully deployed on patients' existing technology.
Intentions of hospital nurses to work with computers: based on the theory of planned behavior.
Shoham, Snunith; Gonen, Ayala
2008-01-01
The purpose of this study was to determine registered nurses' attitudes related to intent to use computers in the hospital setting as a predictor of their future behavior. The study was further aimed at identifying the relationship between these attitudes and selected sociological, professional, and personal factors and to describe a research model integrating these various factors. The study was based on the theory of planned behavior. A random sample of 411 registered nurses was selected from a single large medical center in Israel. The study tool was a Likert-style questionnaire. Nine different indices were used: (1) behavioral intention toward computer use; (2) general attitudes toward computer use; (3) nursing attitudes toward computer use; (4) threat involved in computer use; (5) challenge involved in computer use; (6) organizational climate; (7) departmental climate; (8) attraction to technological innovations/innovativeness; (9) self-efficacy, ability to control behavior. Strong significant positive correlations were found between the nurses' attitudes (general attitudes and nursing attitudes), self-efficacy, innovativeness, and intentions to use computers. Higher correlations were found between departmental climate and attitudes than between organizational climate and attitudes. The threat and challenge that are involved in computer use were shown as important mediating variables to the understanding of the process of predicting attitudes and intentions toward using computers.
Issues in implementing a knowledge-based ECG analyzer for personal mobile health monitoring.
Goh, K W; Kim, E; Lavanya, J; Kim, Y; Soh, C B
2006-01-01
Advances in sensor technology, personal mobile devices, and wireless broadband communications are enabling the development of an integrated personal mobile health monitoring system that can provide patients with a useful tool to assess their own health and manage their personal health information anytime and anywhere. Personal mobile devices, such as PDAs and mobile phones, are becoming more powerful integrated information management tools and play a major role in many people's lives. We focus on designing a health-monitoring system for people who suffer from cardiac arrhythmias. We have developed computer simulation models to evaluate the performance of appropriate electrocardiogram (ECG) analysis techniques that can be implemented on personal mobile devices. This paper describes an ECG analyzer to perform ECG beat and episode detection and classification. We have obtained promising preliminary results from our study. Also, we discuss several key considerations when implementing a mobile health monitoring solution. The mobile ECG analyzer would become a front-end patient health data acquisition module, which is connected to the Personal Health Information Management System (PHIMS) for data repository.
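ECG beat detection on a resource-limited mobile device can be illustrated, in a highly simplified form, by thresholded peak picking with a refractory period (real analyzers such as the one described use considerably more robust detection; the signal below is synthetic, and the threshold and refractory values are hypothetical):

```python
def detect_beats(sig, fs, thresh, refractory=0.25):
    """Indices of local maxima above `thresh`, at least `refractory` s apart
    (a crude stand-in for full QRS detection)."""
    beats, last = [], -10 ** 9
    for i in range(1, len(sig) - 1):
        if sig[i] > thresh and sig[i] >= sig[i - 1] and sig[i] > sig[i + 1]:
            if (i - last) / fs >= refractory:
                beats.append(i)
                last = i
    return beats

fs = 250                                  # Hz, a common ECG sampling rate
sig = [0.0] * 2000
for k in range(200, 2000, 200):           # simulated R-peaks every 0.8 s (75 bpm)
    sig[k - 1], sig[k], sig[k + 1] = 0.3, 1.0, 0.3

beats = detect_beats(sig, fs, thresh=0.5)
rate_bpm = 60.0 * fs * (len(beats) - 1) / (beats[-1] - beats[0])
```

Beat-to-beat intervals derived this way are the raw material for the arrhythmia episode classification the analyzer performs.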
A trunk ranging system based on binocular stereo vision
NASA Astrophysics Data System (ADS)
Zhao, Xixuan; Kan, Jiangming
2017-07-01
Trunk ranging is an essential function for autonomous forestry robots. Traditional trunk ranging systems based on personal computers are not convenient in practical application. This paper examines the implementation of a trunk ranging system based on binocular vision theory via TI's DaVinci DM37x system. The system is smaller and more reliable than one implemented using a personal computer. It calculates the three-dimensional information from the images acquired by the binocular cameras, producing the targeting and ranging results. The experimental results show that the measurement error is small and the system design is feasible for autonomous forestry robots.
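Binocular ranging of this kind ultimately reduces to triangulation: for a rectified stereo pair, depth follows from disparity as Z = f·B/d, with focal length f in pixels, baseline B in metres, and disparity d in pixels. A minimal sketch (the focal length, baseline, and disparity values are hypothetical):

```python
def depth_from_disparity(f_px, baseline_m, disparity_px):
    """Depth Z = f * B / d for a rectified binocular rig."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return f_px * baseline_m / disparity_px

# A trunk seen with 24 px disparity by an 800 px focal length, 12 cm baseline rig:
z_m = depth_from_disparity(800.0, 0.12, 24.0)   # 4.0 m to the trunk
```

The embedded implementation spends most of its effort on the stereo matching that produces the disparity; once d is known, the range computation itself is this one line.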
Detection of people in military and security context imagery
NASA Astrophysics Data System (ADS)
Shannon, Thomas M. L.; Spier, Emmet H.; Wiltshire, Ben
2014-10-01
A high level of manual visual surveillance of complex scenes is dependent solely on the awareness of human operators whereas an autonomous person detection solution could assist by drawing their attention to potential issues, in order to reduce cognitive burden and achieve more with less manpower. Our research addressed the challenge of the reliable identification of persons in a scene who may be partially obscured by structures or by handling weapons or tools. We tested the efficacy of a recently published computer vision approach based on the construction of cascaded, non-linear classifiers from part-based deformable models by assessing performance using imagery containing infantrymen in the open or when obscured, undertaking low level tactics or acting as civilians using tools. Results were compared with those obtained from published upright pedestrian imagery. The person detector yielded a precision of approximately 65% for a recall rate of 85% for military context imagery as opposed to a precision of 85% for the upright pedestrian image cases. These results compared favorably with those reported by the authors when applied to a range of other on-line imagery databases. Our conclusion is that the deformable part-based model method may be a potentially useful people detection tool in the challenging environment of military and security context imagery.
Construction of a three-dimensional interactive model of the skull base and cranial nerves.
Kakizawa, Yukinari; Hongo, Kazuhiro; Rhoton, Albert L
2007-05-01
The goal was to develop an interactive three-dimensional (3-D) computerized anatomic model of the skull base for teaching microneurosurgical anatomy and for operative planning. The 3-D model was constructed using commercially available software (Maya 6.0 Unlimited; Alias Systems Corp., Delaware, MD), a personal computer, four cranial specimens, and six dry bones. Photographs from at least two angles of the superior and lateral views were imported to the 3-D software. Many photographs were needed to produce the model in anatomically complex areas. Careful dissection was needed to expose important structures in the two views. Landmarks, including foramen, bone, and dura mater, were used as reference points. The 3-D model of the skull base and related structures was constructed using more than 300,000 remodeled polygons. The model can be viewed from any angle. It can be rotated 360 degrees in any plane using any structure as the focal point of rotation. The model can be reduced or enlarged using the zoom function. Variable transparencies could be assigned to any structures so that the structures at any level can be seen. Anatomic labels can be attached to the structures in the 3-D model for educational purposes. This computer-generated 3-D model can be observed and studied repeatedly without the time limitations and stresses imposed by surgery. This model may offer the potential to create interactive surgical exercises useful in evaluating multiple surgical routes to specific target areas in the skull base.
Hilton, Joan F.; Barkoff, Lynsey; Chang, Olivia; Halperin, Lindsay; Ratanawongsa, Neda; Sarkar, Urmimala; Leykin, Yan; Muñoz, Ricardo F.; Thom, David H.; Kahn, James S.
2012-01-01
Background Personal health records (PHR) may improve patients' health by providing access to and context for health information. Among patients receiving care at a safety-net HIV/AIDS clinic, we examined the hypothesis that a mental health (MH) or substance use (SU) condition represents a barrier to engagement with web-based health information, as measured by consent to participate in a trial that provided access to personal (PHR) or general (non-PHR) health information portals and by completion of baseline study surveys posted there. Methods Participants were individually trained to access and navigate individualized online accounts and to complete study surveys. In response to need, during accrual months 4 to 12 we enhanced participant training to encourage survey completion with the help of staff. Using logistic regression models, we estimated odds ratios for study participation and for survey completion by combined MH/SU status, adjusted for levels of computer competency, on-study training, and demographics. Results Among 2,871 clinic patients, 70% had MH/SU conditions, with depression (38%) and methamphetamine use (17%) most commonly documented. Middle-aged patients and those with a MH/SU condition were over-represented among study participants (N = 338). Survey completion was statistically independent of MH/SU status (OR, 1.85 [95% CI, 0.93–3.66]) but tended to be higher among those with MH/SU conditions. Completion rates were low among beginner computer users, regardless of training level (<50%), but adequate among advanced users (>70%). Conclusions Among patients attending a safety-net clinic, MH/SU conditions were not barriers to engagement with web-based health information. Instead, level of computer competency was useful for identifying individuals requiring substantial computer training in order to fully participate in the study. Intensive on-study training was insufficient to enable beginner computer users to complete study surveys. PMID:22363761
Learning gestures for customizable human-computer interaction in the operating room.
Schwarz, Loren Arthur; Bigdelou, Ali; Navab, Nassir
2011-01-01
Interaction with computer-based medical devices in the operating room is often challenging for surgeons due to sterility requirements and the complexity of interventional procedures. Typical solutions, such as delegating the interaction task to an assistant, can be inefficient. We propose a method for gesture-based interaction in the operating room that surgeons can customize to personal requirements and interventional workflow. Given training examples for each desired gesture, our system learns low-dimensional manifold models that enable recognizing gestures and tracking particular poses for fine-grained control. By capturing the surgeon's movements with a few wireless body-worn inertial sensors, we avoid issues of camera-based systems, such as sensitivity to illumination and occlusions. Using a component-based framework implementation, our method can easily be connected to different medical devices. Our experiments show that the approach is able to robustly recognize learned gestures and to distinguish these from other movements.
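The manifold-based gesture models can be illustrated, in much-simplified form, by a linear (PCA) embedding of sensor feature vectors with one centroid per gesture. The inertial-sensor features below are synthetic, and the real system uses non-linear manifold models and per-pose tracking, which are not shown:

```python
import numpy as np

def train_gesture_model(samples_by_gesture, dim=2):
    """Fit a low-dimensional linear embedding and one centroid per gesture."""
    X = np.vstack(list(samples_by_gesture.values()))
    mean = X.mean(axis=0)
    # Principal directions of the pooled training data.
    P = np.linalg.svd(X - mean, full_matrices=False)[2][:dim]
    centroids = {g: ((s - mean) @ P.T).mean(axis=0)
                 for g, s in samples_by_gesture.items()}
    return mean, P, centroids

def recognize(x, mean, P, centroids):
    """Classify a new sensor vector by its nearest gesture centroid."""
    z = (x - mean) @ P.T
    return min(centroids, key=lambda g: float(np.linalg.norm(z - centroids[g])))

rng = np.random.default_rng(2)
data = {   # two hypothetical gestures in a 6-D inertial feature space
    "wave":   rng.normal([3, 0, 0, 0, 0, 0], 0.1, size=(20, 6)),
    "circle": rng.normal([0, 3, 0, 0, 0, 0], 0.1, size=(20, 6)),
}
mean, P, cents = train_gesture_model(data)
```

Distance to the nearest learned manifold also gives a natural rejection criterion for movements that are not gestures at all.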
Acoustic Biometric System Based on Preprocessing Techniques and Linear Support Vector Machines
del Val, Lara; Izquierdo-Fuente, Alberto; Villacorta, Juan J.; Raboso, Mariano
2015-01-01
Drawing on the results of an acoustic biometric system based on a MSE classifier, a new biometric system has been implemented. This new system preprocesses acoustic images, extracts several parameters and finally classifies them, based on Support Vector Machine (SVM). The preprocessing techniques used are spatial filtering, segmentation—based on a Gaussian Mixture Model (GMM) to separate the person from the background, masking—to reduce the dimensions of images—and binarization—to reduce the size of each image. An analysis of classification error and a study of the sensitivity of the error versus the computational burden of each implemented algorithm are presented. This allows the selection of the most relevant algorithms, according to the benefits required by the system. A significant improvement of the biometric system has been achieved by reducing the classification error, the computational burden and the storage requirements. PMID:26091392
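The classification stage above is based on an SVM. As a minimal illustration, a linear SVM can be trained with Pegasos-style subgradient descent on the hinge loss; the 2-D synthetic features below stand in for the extracted acoustic-image parameters, and the sketch omits the bias term (so the toy classes are placed symmetrically about the origin):

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=100, seed=0):
    """Pegasos-style stochastic subgradient descent on the hinge loss; y in {-1,+1}."""
    rng = np.random.default_rng(seed)
    w, t = np.zeros(X.shape[1]), 0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            t += 1
            eta = 1.0 / (lam * t)            # decaying step size
            margin = y[i] * (X[i] @ w)
            w *= (1 - eta * lam)             # regularisation shrinkage
            if margin < 1:                   # hinge-loss violation
                w += eta * y[i] * X[i]
    return w

rng = np.random.default_rng(4)
pos = rng.normal([1.0, 1.0], 0.2, size=(20, 2))     # class +1 feature vectors
neg = rng.normal([-1.0, -1.0], 0.2, size=(20, 2))   # class -1 feature vectors
X = np.vstack([pos, neg])
y = np.array([1.0] * 20 + [-1.0] * 20)

w = train_linear_svm(X, y)
acc = float((np.sign(X @ w) == y).mean())
```

In the paper, the SVM input is the output of the preprocessing chain (spatial filtering, GMM segmentation, masking, binarization), which is what keeps the classifier small and fast.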
Parallel approach to identifying the well-test interpretation model using a neurocomputer
NASA Astrophysics Data System (ADS)
May, Edward A., Jr.; Dagli, Cihan H.
1996-03-01
The well test is one of the primary diagnostic and predictive tools used in the analysis of oil and gas wells. In these tests, a pressure recording device is placed in the well and the pressure response is recorded over time under controlled flow conditions. The interpreted results are indicators of the well's ability to flow and the damage done to the formation surrounding the wellbore during drilling and completion. The results are used for many purposes, including reservoir modeling (simulation) and economic forecasting. The first step in the analysis is the identification of the Well-Test Interpretation (WTI) model, which determines the appropriate solution method. Mis-identification of the WTI model occurs due to noise and non-ideal reservoir conditions. Previous studies have shown that a feed-forward neural network using the backpropagation algorithm can be used to identify the WTI model. One of the drawbacks to this approach is, however, training time, which can run into days of CPU time on personal computers. In this paper a similar neural network is applied using both a personal computer and a neurocomputer. Input data processing, network design, and performance are discussed and compared. The results show that the neurocomputer greatly eases the burden of training and allows the network to outperform a similar network running on a personal computer.
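The WTI classifier described above is a feed-forward network trained with backpropagation. A minimal NumPy sketch of such a network on synthetic stand-in features (the layer sizes, data, and labels are all hypothetical, and the real study used pressure-derivative curves as input):

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(64, 4))                              # stand-in input features
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)  # stand-in model label

# One hidden tanh layer, sigmoid output, squared-error loss.
W1 = rng.normal(scale=0.5, size=(4, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1 / (1 + np.exp(-z))

losses, lr = [], 0.5
for _ in range(500):
    H = np.tanh(X @ W1 + b1)                 # forward pass
    out = sigmoid(H @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))
    # Backpropagation of the mean squared error.
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    d_H = (d_out @ W2.T) * (1 - H ** 2)
    W2 -= lr * H.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_H;   b1 -= lr * d_H.sum(axis=0)
```

The full-batch gradient computation in each iteration is exactly the part that parallelizes well, which is why the neurocomputer in the study cut training from days to a practical duration.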
Description of 'REQUEST-KYUSHYU' for KYUKEICHO regional data base
NASA Astrophysics Data System (ADS)
Takimoto, Shin'ichi
Kyushu Economic Research Association (an incorporated foundation) recently initiated the regional database service 'REQUEST-Kyushu'. It is a full-scale database compiled from the information and know-how that the Association has accumulated over forty years. It comprises a regional information database of journal and newspaper articles and a statistical information database of economic statistics. The former is searched on a personal computer, and the search result (original text) is then sent by facsimile. The latter is also searched on a personal computer, where the data can be processed, edited, or downloaded. This paper describes the characteristics, content, and system outline of 'REQUEST-Kyushu'.
Dynamic Enforcement of Knowledge-based Security Policies
2011-04-05
foster and maintain relationships by sharing information with friends and fans. These services store users' personal information and use it to customize… Facebook selects ads based on age, gender, and even sexual preference [2]. Unfortunately, once personal information is collected, users have limited… could use a storage server (e.g., running on their home network) that handles personal…
A scalable parallel black oil simulator on distributed memory parallel computers
NASA Astrophysics Data System (ADS)
Wang, Kun; Liu, Hui; Chen, Zhangxin
2015-11-01
This paper presents our work on developing a parallel black oil simulator for distributed memory computers based on our in-house parallel platform. The parallel simulator is designed to overcome the performance issues of common simulators that are implemented for personal computers and workstations. The finite difference method is applied to discretize the black oil model. In addition, some advanced techniques are employed to strengthen the robustness and parallel scalability of the simulator, including an inexact Newton method, matrix decoupling methods, and algebraic multigrid methods. A new multi-stage preconditioner is proposed to accelerate the solution of linear systems from the Newton methods. Numerical experiments show that our simulator is scalable and efficient, and is capable of simulating extremely large-scale black oil problems with tens of millions of grid blocks using thousands of MPI processes on parallel computers.
Methods for Improving the User-Computer Interface. Technical Report.
ERIC Educational Resources Information Center
McCann, Patrick H.
This summary of methods for improving the user-computer interface is based on a review of the pertinent literature. Requirements of the personal computer user are identified and contrasted with computer designer perspectives towards the user. The user's psychological needs are described, so that the design of the user-computer interface may be…
Connectionist Models for Intelligent Computation
1989-07-26
PERSONAL AUTHOR(S): H.H. Chen and Y.C. Lee. Project Title: Connectionist Models for Intelligent Computation. Contract/Grant No.: AFOSR-87-0388. Contract/Grant Period of Performance: Sept. 1… The report covers the underlying principles, architectures, and applications of artificial neural networks for intelligent computation. Approach: We use both numerical…
Geil, Mark D
2007-01-01
Computer-aided design (CAD) and computer-aided manufacturing systems have been adapted for specific use in prosthetics, providing practitioners with a means to digitally capture the shape of a patient's limb, modify the socket model using software, and automatically manufacture either a positive model to be used in the fabrication of a socket or the socket itself. The digital shape captured is a three-dimensional (3-D) model from which standard anthropometric measures can be easily obtained. This study recorded six common anthropometric dimensions from CAD shape files of three foam positive models of the residual limbs of persons with transtibial amputations. Two systems were used to obtain 3-D models of the residual limb, a noncontact optical system and a contact-based electromagnetic field system, and both experienced practitioners and prosthetics students conducted measurements. Measurements were consistent; the mean range (difference of maximum and minimum) across all measurements was 0.96 cm. Both systems provided similar results, and both groups used the systems consistently. Students were slightly more consistent than practitioners but not to a clinically significant degree. Results also compared favorably with traditional measurement, with differences versus hand measurements of about 5 mm. These results support the routine use of digital shape capture for collection of patient volume information.
HOPE information system review
NASA Astrophysics Data System (ADS)
Suzuki, Yoshiaki; Nishiyama, Kenji; Ono, Shuuji; Fukuda, Kouin
1992-08-01
An overview of the review conducted on the H-2 Orbiting Plane (HOPE) is presented. A prototype model was constructed by inputting various technical information proposed by related laboratories. In particular, an operation flow was constructed that clarifies the correlations between the various analysis items, judgement criteria, technical data, and interfaces with other systems. A technical information database and retrieval systems were studied. A Macintosh personal computer was selected for information shaping because of its excellent function, performance, operability, and software completeness.
Song, Hongning; Zhou, Qing; Zhang, Lan; Deng, Qing; Wang, Yijia; Hu, Bo; Tan, Tuantuan; Chen, Jinling; Pan, Yiteng; He, Fazhi
2017-01-01
Abstract The novel 3-dimensional printing (3DP) technique has shown its ability to assist personalized cardiac intervention therapy. This study aimed to determine the feasibility of 3D-printed left atrial appendage (LAA) models based on 3D transesophageal echocardiography (3D TEE) data and their application value in treating LAA occlusions. Eighteen patients with transcatheter LAA occlusion and preprocedure 3D TEE and cardiac computed tomography were enrolled. 3D TEE volumetric data of the LAA were acquired and postprocessed for 3DP. Two types of 3D models of the LAA (ie, hard chamber model and flexible wall model) were printed by a 3D printer. The morphological classification and lobe identification of the LAA were assessed by the 3D chamber model, and LAA dimensions were measured via the 3D wall model. Additionally, a simulation operative rehearsal was performed on the 3D models in cases of challenging LAA morphology for the purpose of understanding the interactions between the device and the model. Three-dimensional TEE volumetric data of the LAA were successfully reprocessed and printed as 3D LAA chamber models and 3D LAA wall models in all patients. The consistency of the morphological classifications of the LAA based on 3D models and cardiac computed tomography was 0.92 (P < .01). The LAA ostium dimensions and depth measured using the 3D models did not differ significantly from those measured on 3D TEE (P > .05). A simulation occlusion was successfully performed on the 3D model of the 2 challenging cases and compared with the real procedure. The echocardiographic 3DP technique is feasible and accurate in reflecting the spatial morphology of the LAA, which may be promising for the personalized planning of transcatheter LAA occlusion. PMID:28930824
Development of a dynamic computational model of social cognitive theory.
Riley, William T; Martin, Cesar A; Rivera, Daniel E; Hekler, Eric B; Adams, Marc A; Buman, Matthew P; Pavel, Misha; King, Abby C
2016-12-01
Social cognitive theory (SCT) is among the most influential theories of behavior change and has been used as the conceptual basis of health behavior interventions for smoking cessation, weight management, and other health behaviors. SCT and other behavior theories were developed primarily to explain differences between individuals, but explanatory theories of within-person behavioral variability are increasingly needed as new technologies allow for intensive longitudinal measures and interventions adapted from these inputs. These within-person explanatory theoretical applications can be modeled as dynamical systems. SCT constructs, such as reciprocal determinism, are inherently dynamical in nature, but SCT has not been modeled as a dynamical system. This paper describes the development of a dynamical system model of SCT using fluid analogies and control systems principles drawn from engineering. Simulations of this model were performed to assess if the model performed as predicted based on theory and empirical studies of SCT. This initial model generates precise and testable quantitative predictions for future intensive longitudinal research. Dynamic modeling approaches provide a rigorous method for advancing health behavior theory development and refinement and for guiding the development of more potent and efficient interventions.
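The fluid-analogy idea described above can be illustrated with a toy first-order "inventory" that fills in response to an input. This is a minimal sketch under assumed constants (gain `K`, time constant `tau`), not the paper's actual multi-construct SCT model:

```python
# Minimal sketch of a fluid-analogy dynamical model: a single behavior
# inventory y responds to an input u (e.g., an intervention dose) following
# tau * dy/dt = -y + K * u. Constants here are illustrative assumptions.

def simulate_first_order(u, K=2.0, tau=4.0, dt=1.0, y0=0.0):
    """Forward-Euler simulation of tau*dy/dt = -y + K*u."""
    y = y0
    trajectory = []
    for u_t in u:
        y = y + (dt / tau) * (-y + K * u_t)
        trajectory.append(y)
    return trajectory

# A sustained step input drives the inventory toward the steady state K * u.
traj = simulate_first_order([1.0] * 60)
print(round(traj[-1], 3))
```

Control-systems treatments of behavior theory couple several such inventories (self-efficacy, outcome expectancy, behavior) so that the reciprocal-determinism loops become explicit transfer functions.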
NASA Astrophysics Data System (ADS)
Aydiner, Ekrem; Cherstvy, Andrey G.; Metzler, Ralf
2018-01-01
We study by Monte Carlo simulations a kinetic exchange trading model for both fixed and distributed saving propensities of the agents and rationalize the person and wealth distributions. We show that the newly introduced wealth distribution - that may be more amenable in certain situations - features a different power-law exponent, particularly for distributed saving propensities of the agents. For open agent-based systems, we analyze the person and wealth distributions and find that the presence of trap agents alters their amplitude, leaving however the scaling exponents nearly unaffected. For an open system, we show that the total wealth - for different trap agent densities and saving propensities of the agents - decreases in time according to the classical Kohlrausch-Williams-Watts stretched exponential law. Interestingly, this decay does not depend on the trap agent density, but rather on the saving propensities. The system relaxation for fixed and distributed saving schemes is found to differ.
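The closed-system variant of such a kinetic exchange model is straightforward to sketch. The update rule below is the generic fixed-saving-propensity formulation (each agent keeps a fraction `lam` and the rest is randomly redistributed between the trading pair); agent count, propensity, and step count are illustrative assumptions, not the paper's setup:

```python
import random

# Kinetic wealth exchange with a fixed saving propensity lam: in each trade,
# agents i and j keep lam of their wealth and split the pooled remainder by a
# uniform random fraction eps. A closed system conserves total wealth.

def trade(w, lam=0.5, steps=20000, rng=random.Random(0)):
    n = len(w)
    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        if i == j:
            continue
        eps = rng.random()
        pool = (1 - lam) * (w[i] + w[j])  # wealth put on the table
        w[i], w[j] = lam * w[i] + eps * pool, lam * w[j] + (1 - eps) * pool
    return w

wealth = trade([1.0] * 100)
print(round(sum(wealth), 6))  # total wealth is conserved in a closed system
```

Introducing "trap" agents that absorb wealth without returning it turns this into the open system of the abstract, where total wealth decays over time.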
Eye gaze correction with stereovision for video-teleconferencing.
Yang, Ruigang; Zhang, Zhengyou
2004-07-01
The lack of eye contact in desktop video teleconferencing substantially reduces the effectiveness of video contents. While expensive and bulky hardware is available on the market to correct eye gaze, researchers have been trying to provide a practical software-based solution to bring video-teleconferencing one step closer to the mass market. This paper presents a novel approach: Based on stereo analysis combined with rich domain knowledge (a personalized face model), we synthesize, using graphics hardware, a virtual video that maintains eye contact. A 3D stereo head tracker with a personalized face model is used to compute initial correspondences across two views. More correspondences are then added through template and feature matching. Finally, all the correspondence information is fused together for view synthesis using view morphing techniques. The combined methods greatly enhance the accuracy and robustness of the synthesized views. Our current system is able to generate an eye-gaze corrected video stream at five frames per second on a commodity 1 GHz PC.
Silvey, Garry M.; Macri, Jennifer M.; Lee, Paul P.; Lobach, David F.
2005-01-01
New mobile computing devices including personal digital assistants (PDAs) and tablet computers have emerged to facilitate data collection at the point of care. Unfortunately, little research has been reported regarding which device is optimal for a given care setting. In this study we created and compared functionally identical applications on a Palm operating system-based PDA and a Windows-based tablet computer for point-of-care documentation of clinical observations by eye care professionals when caring for patients with diabetes. Eye-care professionals compared the devices through focus group sessions and through validated usability surveys. We found that the application on the tablet computer was preferred over the PDA for documenting the complex data related to eye care. Our findings suggest that the selection of a mobile computing platform depends on the amount and complexity of the data to be entered; the tablet computer functions better for high volume, complex data entry, and the PDA, for low volume, simple data entry. PMID:16779128
The Acceptance of Computer Technology by Teachers in Early Childhood Education
ERIC Educational Resources Information Center
Jeong, Hye In; Kim, Yeolib
2017-01-01
This study investigated kindergarten teachers' decision-making process regarding the acceptance of computer technology. We incorporated the Technology Acceptance Model framework, in addition to computer self-efficacy, subjective norm, and personal innovativeness in education technology as external variables. The data were obtained from 160…
Tsai, Tsung-Yuan; Li, Jing-Sheng; Wang, Shaobai; Li, Pingyue; Kwon, Young-Min; Li, Guoan
2013-01-01
The statistical shape model (SSM) method that uses 2D images of the knee joint to predict the 3D joint surface model has been reported in literature. In this study, we constructed a SSM database using 152 human CT knee joint models, including the femur, tibia and patella and analyzed the characteristics of each principal component of the SSM. The surface models of two in vivo knees were predicted using the SSM and their 2D bi-plane fluoroscopic images. The predicted models were compared to their CT joint models. The differences between the predicted 3D knee joint surfaces and the CT image-based surfaces were 0.30 ± 0.81 mm, 0.34 ± 0.79 mm and 0.36 ± 0.59 mm for the femur, tibia and patella, respectively (average ± standard deviation). The computational time for each bone of the knee joint was within 30 seconds using a personal computer. The analysis of this study indicated that the SSM method could be a useful tool to construct 3D surface models of the knee with sub-millimeter accuracy in real time. Thus it may have a broad application in computer assisted knee surgeries that require 3D surface models of the knee. PMID:24156375
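The core of an SSM of this kind is a principal-component decomposition of training shapes: a mean shape plus a few modes of variation, with new shapes expressed as coefficients on those modes. A minimal sketch on stand-in random data (the study's SSM is built from 152 CT knee surfaces, not this toy):

```python
import numpy as np

# Statistical shape model sketch: training shapes are flattened landmark
# vectors; PCA (via SVD of the centered data) yields a mean shape and
# principal modes, and a shape is approximated as mean + modes.T @ coeffs.

rng = np.random.default_rng(0)
n_shapes, n_coords = 50, 12
shapes = rng.normal(size=(n_shapes, n_coords))      # stand-in landmark data

mean = shapes.mean(axis=0)
_, _, vt = np.linalg.svd(shapes - mean, full_matrices=False)
modes = vt[:5]                                      # top 5 principal modes

target = shapes[0]
coeffs = modes @ (target - mean)                    # project onto the modes
reconstruction = mean + modes.T @ coeffs

err = np.linalg.norm(target - reconstruction)
print(err < np.linalg.norm(target - mean))          # modes reduce the error
```

In the 2D-to-3D setting, the mode coefficients are instead optimized so that the projected model silhouette matches the bi-plane fluoroscopic images.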
Short assessment of the Big Five: robust across survey methods except telephone interviewing.
Lang, Frieder R; John, Dennis; Lüdtke, Oliver; Schupp, Jürgen; Wagner, Gert G
2011-06-01
We examined measurement invariance and age-related robustness of a short 15-item Big Five Inventory (BFI-S) of personality dimensions, which is well suited for applications in large-scale multidisciplinary surveys. The BFI-S was assessed in three different interviewing conditions: computer-assisted or paper-assisted face-to-face interviewing, computer-assisted telephone interviewing, and a self-administered questionnaire. Randomized probability samples from a large-scale German panel survey and a related probability telephone study were used in order to test method effects on self-report measures of personality characteristics across early, middle, and late adulthood. Exploratory structural equation modeling was used in order to test for measurement invariance of the five-factor model of personality trait domains across different assessment methods. For the short inventory, findings suggest strong robustness of self-report measures of personality dimensions among young and middle-aged adults. In old age, telephone interviewing was associated with greater distortions in reliable personality assessment. It is concluded that the greater mental workload of telephone interviewing limits the reliability of self-report personality assessment. Face-to-face surveys and self-administrated questionnaire completion are clearly better suited than phone surveys when personality traits in age-heterogeneous samples are assessed.
ERIC Educational Resources Information Center
Fritsch, Helmut
A project was conducted to increase as well as to professionalize communication between tutors and learners in a West German university's distance education program by the use of personal computers. Two tutors worked on the systematic development of a PC-based correcting system. The goal, apart from developing general language skills in English,…
Increased Memory Load during Task Completion when Procedures Are Presented on Mobile Screens
ERIC Educational Resources Information Center
Byrd, Keena S.; Caldwell, Barrett S.
2011-01-01
The primary objective of this research was to compare procedure-based task performance using three common mobile screen sizes: ultra mobile personal computer (7 in./17.8 cm), personal digital assistant (3.5 in./8.9 cm), and SmartPhone (2.8 in./7.1 cm). Subjects used these three screen sizes to view and execute a computer maintenance procedure.…
ERIC Educational Resources Information Center
Khribi, Mohamed Koutheair; Jemni, Mohamed; Nasraoui, Olfa
2009-01-01
In this paper, we describe an automatic personalization approach aiming to provide online automatic recommendations for active learners without requiring their explicit feedback. Recommended learning resources are computed based on the current learner's recent navigation history, as well as exploiting similarities and dissimilarities among…
Computational Psychometrics for Modeling System Dynamics during Stressful Disasters
Cipresso, Pietro; Bessi, Alessandro; Colombo, Desirée; Pedroli, Elisa; Riva, Giuseppe
2017-01-01
Disasters can be very stressful events. However, computational models of stress require data that might be very difficult to collect during disasters. Moreover, personal experiences are not repeatable, so it is not possible to collect bottom-up information when building a coherent model. To overcome these problems, we propose the use of computational models and virtual reality integration to recreate disaster situations, while examining possible dynamics in order to understand human behavior and relative consequences. By providing realistic parameters associated with disaster situations, computational scientists can work more closely with emergency responders to improve the quality of interventions in the future. PMID:28861026
A personal computer-based, multitasking data acquisition system
NASA Technical Reports Server (NTRS)
Bailey, Steven A.
1990-01-01
A multitasking data acquisition system was written to simultaneously collect meteorological radar and telemetry data from two sources. This system is based on the personal computer architecture. Data is collected via two asynchronous serial ports and is deposited to disk. The system is written in both the C programming language and assembler. It consists of three parts: a multitasking kernel for data collection, a shell with pull-down windows as the user interface, and a graphics processor for editing data and creating coded messages. An explanation of both system principles and program structure is presented.
Uncertainty quantification for personalized analyses of human proximal femurs.
Wille, Hagen; Ruess, Martin; Rank, Ernst; Yosibash, Zohar
2016-02-29
Computational models for the personalized analysis of human femurs contain uncertainties in bone material properties and loads, which affect the simulation results. To quantify their influence, we developed a probabilistic framework based on polynomial chaos (PC) that propagates stochastic input variables through any computational model. We considered a stochastic E-ρ relationship and a stochastic hip contact force, representing realistic variability of experimental data. Their influence on the prediction of principal strains (ϵ1 and ϵ3) was quantified for one human proximal femur, including sensitivity and reliability analysis. Large variabilities in the principal strain predictions were found in the cortical shell of the femoral neck, with coefficients of variation of ≈40%. Between 60 and 80% of the variance in ϵ1 and ϵ3 is attributable to the uncertainty in the E-ρ relationship, while ≈10% is caused by the load magnitude and 5-30% by the load direction. Principal strain directions were unaffected by material and loading uncertainties. The antero-superior and medial inferior sides of the neck exhibited the largest probabilities for tensile and compression failure, although all were very small (pf < 0.001). In summary, uncertainty quantification with PC has been demonstrated to efficiently and accurately describe the influence of very different stochastic inputs, which increases the credibility and explanatory power of personalized analyses of human proximal femurs. Copyright © 2015 Elsevier Ltd. All rights reserved.
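The computation underlying a polynomial-chaos propagation can be shown on a toy problem: a standard-normal input pushed through a nonlinear model, with output moments obtained from probabilists' Gauss-Hermite quadrature (the collocation backbone of non-intrusive PC). This is a one-dimensional illustration only; the paper propagates a stochastic E-ρ law and hip load through a finite-element model:

```python
import numpy as np

# Non-intrusive uncertainty propagation sketch: moments of model(X) for
# X ~ N(0, 1) via probabilists' Gauss-Hermite quadrature. model() here is a
# stand-in with known exact moments (lognormal), so the result is checkable.

def model(x):
    return np.exp(x)            # stand-in for the expensive simulation

# Probabilists' Gauss-Hermite nodes/weights (weight function exp(-x^2/2)).
nodes, weights = np.polynomial.hermite_e.hermegauss(20)
weights = weights / np.sqrt(2 * np.pi)   # normalize to the standard normal pdf

mean_pc = np.sum(weights * model(nodes))
var_pc = np.sum(weights * model(nodes) ** 2) - mean_pc ** 2

# Exact lognormal mean for comparison: E[exp(X)] = exp(1/2).
print(abs(mean_pc - np.exp(0.5)) < 1e-8)
```

With 20 deterministic model evaluations the moments match the analytic values to near machine precision, whereas plain Monte Carlo would need orders of magnitude more samples; this efficiency is what makes PC attractive for patient-specific finite-element studies.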
Sanchez, Rosanna; Weinflash, Noah; Myers, Catherine E.
2016-01-01
Decisions based on trust are critical for human social interaction. We judge the trustworthiness of partners in social interactions based on a number of partner characteristics as well as experiences with those partners. These decisions are also influenced by personality. The current study examined how the personality trait of behavioral inhibition, which involves the tendency to avoid or withdraw from novelty in both social and non-social situations, is related to explicit ratings of trustworthiness as well as decisions made in the trust game. In the game, healthy young adults interacted with three fictional partners who were portrayed as trustworthy, untrustworthy or neutral through biographical information. Participants could choose to keep $1 or send $3 of virtual money to a partner. The partner could then choose to send $1.5 back to the participant or to keep the entire amount. On any trial in which the participant chose to send, the partner always reciprocated with 50% probability, irrespective of how that partner was portrayed in the biography. Behavioral inhibition was assessed through a self-report questionnaire. Finally, a reinforcement learning computational model was fit to the behavior of each participant. Self-reported ratings of trust confirmed that all participants, irrespective of behavioral inhibition, perceived differences in the moral character of the three partners (trustworthiness of good > neutral > bad partner). Decisions made in the game showed that inhibited participants tended to trust the neutral partner less than uninhibited participants. In contrast, this was not reflected in the ratings of the neutral partner (either pre- or post-game), indicating a dissociation between ratings of trustworthiness and decisions made by inhibited participants. 
Computational modeling showed that this was due to lower initial trust of the neutral partner rather than a higher learning rate associated with loss, suggesting an implicit bias against the neutral partner. Overall, the results suggest inhibited individuals may be predisposed to interpret neutral or ambiguous information more negatively which could, at least in part, account for the tendency to avoid unfamiliar people characteristic of behaviorally inhibited temperament, as well as its relationship to anxiety disorders. PMID:27004148
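The reinforcement-learning account above — lower initial trust of the neutral partner, rather than a different loss learning rate — can be sketched with a Rescorla-Wagner-style update. Parameter names, payoffs, and the share rule below are illustrative assumptions, not the paper's fitted model:

```python
import random

# Trust-game sketch: each partner has an initial trust value v0; outcomes of
# sharing update v by a learning rate alpha times the prediction error. The
# agent shares only while its expected value of sharing is positive.

def play(v0, alpha=0.2, trials=50, p_reciprocate=0.5, rng=random.Random(1)):
    v = v0
    shares = 0
    for _ in range(trials):
        if v > 0:                      # share only when expected value positive
            shares += 1
            reward = 0.5 if rng.random() < p_reciprocate else -1.0
            v += alpha * (reward - v)  # prediction-error update
    return v, shares

# A lower initial trust (as inferred for inhibited participants toward the
# neutral partner) yields fewer sharing decisions under identical feedback.
_, shares_low = play(v0=0.0)
_, shares_high = play(v0=1.0)
print(shares_low <= shares_high)
```

Fitting such a model per participant lets initial-value and learning-rate explanations be compared directly, which is how the dissociation reported above was identified.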
U.S. hydropower resource assessment for Idaho
DOE Office of Scientific and Technical Information (OSTI.GOV)
Conner, A.M.; Francfort, J.E.
1998-08-01
The US Department of Energy is developing an estimate of the undeveloped hydropower potential in the US. The Hydropower Evaluation Software (HES) is a computer model that was developed by the Idaho National Engineering and Environmental Laboratory for this purpose. HES measures the undeveloped hydropower resources available in the US, using uniform criteria for measurement. The software was developed and tested using hydropower information and data provided by the Southwestern Power Administration. It is a menu-driven program that allows the personal computer user to assign environmental attributes to potential hydropower sites, calculate development suitability factors for each site based on the environmental attributes present, and generate reports based on these suitability factors. This report describes the resource assessment results for the State of Idaho.
An Inter-Personal Information Sharing Model Based on Personalized Recommendations
NASA Astrophysics Data System (ADS)
Kamei, Koji; Funakoshi, Kaname; Akahani, Jun-Ichi; Satoh, Tetsuji
In this paper, we propose an inter-personal information sharing model among individuals based on personalized recommendations. In the proposed model, we define an information resource as shared between people when both of them consider it important --- not merely when they both possess it. In other words, the model defines the importance of information resources based on personalized recommendations from identifiable acquaintances. The proposed method is based on a collaborative filtering system that focuses on evaluations from identifiable acquaintances. It utilizes both user evaluations for documents and their contents. In other words, each user profile is represented as a matrix of credibility assigned to the other users' evaluations in each domain of interest. We extended the content-based collaborative filtering method to distinguish other users to whom the documents should be recommended. We also applied a concept-based vector space model to represent the domains of interest instead of the previous method which represented them by a term-based vector space model. We introduce a personalized concept-base compiled from each user's information repository to improve the information retrieval in the user's environment. Furthermore, the concept-spaces change from user to user since they reflect the personalities of the users. Because of different concept-spaces, the similarity between a document and a user's interest varies for each user. As a result, a user receives recommendations from other users who have different viewpoints, achieving inter-personal information sharing based on personalized recommendations. This paper also describes an experimental simulation of our information sharing model. In our laboratory, five participants accumulated a personal repository of e-mails and web pages from which they built their own concept-base. Then we estimated the user profiles according to personalized concept-bases and sets of documents which others evaluated. 
We simulated inter-personal recommendation based on the user profiles and evaluated the performance of the recommendation method by comparing the recommended documents to the result of the content-based collaborative filtering.
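The content-plus-collaborative scoring idea can be sketched in a few lines: a document is worth recommending to a user when an acquaintance rated it highly and it is similar to the user's interest vector in concept space. The per-user concept bases and credibility matrix of the paper are collapsed here into a single assumed `credibility` weight:

```python
import math

# Simplified recommendation score: credibility-weighted acquaintance rating
# multiplied by cosine similarity between the document and the user's
# interests, both expressed as concept-space vectors.

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def score(doc_vec, interest_vec, acquaintance_rating, credibility=0.8):
    return credibility * acquaintance_rating * cosine(doc_vec, interest_vec)

doc = [1.0, 0.0, 2.0]          # concept-space vector of a document
interests = [2.0, 0.0, 1.0]    # the receiving user's interest vector
print(round(score(doc, interests, acquaintance_rating=1.0), 3))
```

Because each user's concept-base differs, the same document yields a different similarity — and hence a different score — for each potential recipient, which is the mechanism behind the personalized sharing described above.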
Buniatian, A A; Sablin, I N; Flerov, E V; Mierbekov, E M; Broĭtman, O G; Shevchenko, V V; Shitikov, I I
1995-01-01
The creation of computer monitoring systems (CMS) for operating rooms is one of the most important applications of personal computers in anesthesiology. The authors developed a PC RS/AT-based CMS and effectively used it for more than 2 years. This system permits comprehensive monitoring in cardiosurgical operations by real-time processing of the values of arterial and central venous pressure, pressure in the pulmonary artery, bioelectrical activity of the brain, and two temperature values. Use of this CMS helped appreciably improve patients' safety during surgery. The possibility to assess brain function by computer monitoring of the EEG simultaneously with central hemodynamics and body temperature permits the anesthesiologist to objectively assess the depth of anesthesia and to diagnose cerebral hypoxia. The automated anesthesiological chart issued by the CMS after surgery reliably reflects the patient's status and the measures taken by the anesthesiologist.
ERIC Educational Resources Information Center
Pike, Ronald E.; Pittman, Jason M.; Hwang, Drew
2017-01-01
This paper investigates the use of a cloud computing environment to facilitate the teaching of web development at a university in the Southwestern United States. A between-subjects study of students in a web development course was conducted to assess the merits of a cloud computing environment instead of personal computers for developing websites.…
A video, text, and speech-driven realistic 3-d virtual head for human-machine interface.
Yu, Jun; Wang, Zeng-Fu
2015-05-01
A multiple-inputs-driven realistic facial animation system based on a 3-D virtual head for human-machine interface is proposed. The system can be driven independently by video, text, and speech, and thus can interact with humans through diverse interfaces. The combination of a parameterized model and a muscular model is used to obtain a tradeoff between computational efficiency and high realism of 3-D facial animation. The online appearance model is used to track 3-D facial motion from video in the framework of particle filtering, and multiple measurements, i.e., the pixel color values of the input image and the Gabor wavelet coefficients of the illumination ratio image, are fused to reduce the influence of lighting and person dependence on the construction of the online appearance model. The tri-phone model is used to reduce the computational consumption of visual co-articulation in speech-synchronized viseme synthesis without sacrificing performance. Objective and subjective experiments show that the system is suitable for human-machine interaction.
García-Sáez, Gema; Rigla, Mercedes; Martínez-Sarriegui, Iñaki; Shalom, Erez; Peleg, Mor; Broens, Tom; Pons, Belén; Caballero-Ruíz, Estefanía; Gómez, Enrique J; Hernando, M Elena
2014-03-01
The risks associated with gestational diabetes (GD) can be reduced with an active treatment able to improve glycemic control. Advances in mobile health can provide new patient-centric models for GD to create personalized health care services, increase patient independence and improve patients' self-management capabilities, and potentially improve their treatment compliance. In these models, decision-support functions play an essential role. The telemedicine system MobiGuide provides personalized medical decision support for GD patients that is based on computerized clinical guidelines and adapted to a mobile environment. The patient's access to the system is supported by a smartphone-based application that enhances the efficiency and ease of use of the system. We formalized the GD guideline into a computer-interpretable guideline (CIG). We identified several workflows that provide decision-support functionalities to patients and 4 types of personalized advice to be delivered through a mobile application at home, which is a preliminary step to providing decision-support tools in a telemedicine system: (1) therapy, to help patients to comply with medical prescriptions; (2) monitoring, to help patients to comply with monitoring instructions; (3) clinical assessment, to inform patients about their health conditions; and (4) upcoming events, to deal with patients' personal context or special events. The whole process to specify patient-oriented decision support functionalities ensures that it is based on the knowledge contained in the GD clinical guideline and thus follows evidence-based recommendations but at the same time is patient-oriented, which could enhance clinical outcomes and patients' acceptance of the whole system. © 2014 Diabetes Technology Society.
System for assisted mobility using eye movements based on electrooculography.
Barea, Rafael; Boquete, Luciano; Mazo, Manuel; López, Elena
2002-12-01
This paper describes an eye-control method based on electrooculography (EOG) to develop a system for assisted mobility. One of its most important features is its modularity, making it adaptable to the particular needs of each user according to the type and degree of handicap involved. An eye model based on the electrooculographic signal is proposed and its validity is studied. Several human-machine interfaces (HMI) based on EOG are discussed, focusing our study on guiding and controlling a wheelchair for disabled people, where the control is actually effected by eye movements within the socket. Different techniques and guidance strategies are then shown with comments on the advantages and disadvantages of each one. The system consists of a standard electric wheelchair with an on-board computer, sensors, and a graphic user interface run by the computer. This eye-control method can also be applied to handle graphical interfaces, where the eye is used as a computer mouse. Results obtained show that this control technique could be useful in multiple applications, such as mobility and communication aids for handicapped persons.
NASA Astrophysics Data System (ADS)
Gupta, V.; Gupta, N.; Gupta, S.; Field, E.; Maechling, P.
2003-12-01
Modern laptop computers, and personal computers, can provide capabilities that are, in many ways, comparable to workstations or departmental servers. However, this doesn't mean we should run all computations on our local computers. We have identified several situations in which it is preferable to implement our seismological application programs in a distributed, server-based, computing model. In this model, application programs on the user's laptop, or local computer, invoke programs that run on an organizational server, and the results are returned to the invoking system. Situations in which a server-based architecture may be preferred include: (a) a program is written in a language, or written for an operating environment, that is unsupported on the local computer, (b) software libraries or utilities required to execute a program are not available on the user's computer, (c) a computational program is physically too large, or computationally too expensive, to run on a user's computer, (d) a user community wants to enforce a consistent method of performing a computation by standardizing on a single implementation of a program, and (e) the computational program may require current information, that is not available to all client computers. Until recently, distributed, server-based, computational capabilities were implemented using client/server architectures. In these architectures, client programs were often written in the same language, and they executed in the same computing environment, as the servers. Recently, a new distributed computational model, called Web Services, has been developed. Web Services are based on Internet standards such as XML, SOAP, WSDL, and UDDI. Web Services offer the promise of platform, and language, independent distributed computing. To investigate this new computational model, and to provide useful services to the SCEC Community, we have implemented several computational and utility programs using a Web Service architecture. 
We have hosted these Web Services as a part of the SCEC Community Modeling Environment (SCEC/CME) ITR Project (http://www.scec.org/cme). We have implemented Web Services for several of the reasons cited previously. For example, we implemented a FORTRAN-based Earthquake Rupture Forecast (ERF) as a Web Service for use by client computers that don't support a FORTRAN runtime environment. We implemented a Generic Mapping Tool (GMT) Web Service for use by systems that don't have local access to GMT. We implemented a Hazard Map Calculator Web Service to execute Hazard calculations that are too computationally intensive to run on a local system. We implemented a Coordinate Conversion Web Service to enforce a standard and consistent method for converting between UTM and Lat/Lon. Our experience developing these services indicates both strengths and weaknesses in current Web Service technology. Client programs that utilize Web Services typically need network access, a significant disadvantage at times. Programs with simple input and output parameters were the easiest to implement as Web Services, while programs with complex parameter types required a significant amount of additional development. We also noted that Web Services are very data-oriented, and adapting object-oriented software into the Web Service model proved problematic. Also, the Web Service approach of converting data types into XML format for network transmission has significant inefficiencies for some data sets.
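The data-oriented style and the XML-serialization overhead noted above can be illustrated with a minimal round trip; the element names here are invented for illustration and are not the SCEC/CME service schema:

```python
import xml.etree.ElementTree as ET

# Data-oriented service sketch: inputs and outputs cross the wire as XML.
# Note the size overhead relative to a binary encoding, one of the
# inefficiencies observed when shipping large data sets this way.

def to_xml(values):
    root = ET.Element("samples")
    for v in values:
        ET.SubElement(root, "value").text = repr(v)
    return ET.tostring(root)

def from_xml(payload):
    return [float(e.text) for e in ET.fromstring(payload).iter("value")]

data = [1.5, -2.25, 3.0]
wire = to_xml(data)
assert from_xml(wire) == data               # lossless round trip
print(len(wire), "bytes to ship 3 floats")  # vs. 24 bytes as raw doubles
```

For small, simply typed parameters this self-describing format is convenient; for dense numerical arrays, the multiplicative size and parsing cost is exactly the inefficiency the abstract reports.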
Ad Hoc modeling, expert problem solving, and R&T program evaluation
NASA Technical Reports Server (NTRS)
Silverman, B. G.; Liebowitz, J.; Moustakis, V. S.
1983-01-01
A simplified cost and time (SCAT) analysis program utilizing personal-computer technology is presented and demonstrated in the case of the NASA-Goddard end-to-end data system. The difficulties encountered in implementing complex program-selection and evaluation models in the research and technology field are outlined. The prototype SCAT system described here is designed to allow user-friendly ad hoc modeling in real time and at low cost. A worksheet constructed on the computer screen displays the critical parameters and shows how each is affected when one is altered experimentally. In the NASA case, satellite data-output and control requirements, ground-facility data-handling capabilities, and project priorities are intricately interrelated. Scenario studies of the effects of spacecraft phaseout or new spacecraft on throughput and delay parameters are shown. The use of a network of personal computers for higher-level coordination of decision-making processes is suggested, as a complement or alternative to complex large-scale modeling.
Cloud Computing for Pharmacometrics: Using AWS, NONMEM, PsN, Grid Engine, and Sonic
Sanduja, S; Jewell, P; Aron, E; Pharai, N
2015-01-01
Cloud computing allows pharmacometricians to access advanced hardware, network, and security resources available to expedite analysis and reporting. Cloud-based computing environments are available at a fraction of the time and effort when compared to traditional local datacenter-based solutions. This tutorial explains how to get started with building your own personal cloud computer cluster using Amazon Web Services (AWS), NONMEM, PsN, Grid Engine, and Sonic. PMID:26451333
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zuo, Wangda; McNeil, Andrew; Wetter, Michael
2013-05-23
Building designers are increasingly relying on complex fenestration systems to reduce energy consumed for lighting and HVAC in low energy buildings. Radiance, a lighting simulation program, has been used to conduct daylighting simulations for complex fenestration systems. Depending on the configuration, the simulation can take hours or even days on a personal computer. This paper describes how to accelerate the matrix multiplication portion of a Radiance three-phase daylight simulation by conducting parallel computing on the heterogeneous hardware of a personal computer. The algorithm was optimized and the computational part was implemented in parallel using OpenCL. The speed of the new approach was evaluated using various daylighting simulation cases on a multicore central processing unit and a graphics processing unit. Based on the measurements and analysis of the time usage for the Radiance daylighting simulation, further speedups can be achieved by using fast I/O devices and storing the data in a binary format.
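The three-phase method's core computation is a chain of matrix products, i = V T D s (view matrix, fenestration/BSDF transmission matrix, daylight matrix, sky vector). The hedged sketch below uses made-up dimensions and random data, not Radiance's actual basis sizes, to show the chain and why precomputing part of it pays off:

```python
import random

def matmul(a, b):
    """Multiply matrix a (m x n) by matrix b (n x p); matrices are lists of rows."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def rand_matrix(rows, cols):
    return [[random.random() for _ in range(cols)] for _ in range(rows)]

random.seed(0)
n_sensors, n_patches, n_sky = 20, 30, 40      # illustrative dimensions only
V = rand_matrix(n_sensors, n_patches)          # sensor points x window patches
T = rand_matrix(n_patches, n_patches)          # fenestration (BSDF) transmission
D = rand_matrix(n_patches, n_sky)              # window patches x sky patches
s = [[random.random()] for _ in range(n_sky)]  # sky vector for one time step

# Associativity lets T*D be precomputed once; each additional time step
# (new sky vector s) then needs only two cheap multiplies. This matrix
# stage is what the paper parallelizes with OpenCL.
TD = matmul(T, D)
i = matmul(V, matmul(TD, s))
print(len(i), len(i[0]))
```

An annual simulation repeats the last two multiplies for thousands of sky vectors, which is why accelerating this stage dominates the total runtime.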
Implicit prosody mining based on the human eye image capture technology
NASA Astrophysics Data System (ADS)
Gao, Pei-pei; Liu, Feng
2013-08-01
Eye tracking has become a principal method for analyzing recognition issues in human-computer interaction, and capturing images of the human eye is the key problem in eye tracking. Building on this, a new human-computer interaction method is introduced to enrich the forms of speech synthesis. We propose an Implicit Prosody mining method based on human eye image capture: parameters extracted from images of the eyes during reading are used to control and drive prosody generation in speech synthesis and to establish a prosodic model with high simulation accuracy. The duration model is a key issue in prosody generation. For the duration model, this paper puts forward a new idea: obtaining the gaze duration of the eyes during reading from the captured eye images, and synchronously controlling this duration and the pronunciation duration in speech synthesis. Eye movement during reading is a comprehensive, multi-factor interactive process involving gaze, saccades, and regressions. Therefore, which information to extract from the eye images must be considered, and the gaze regularities of the eyes must be obtained as references for modeling. Based on an analysis of three current eye-movement control models and the characteristics of Implicit Prosody reading, the relative independence between the text speech-processing system and the eye-movement control system is discussed. It is shown that, under the same level of text familiarity, the gaze duration during reading and the internal voice pronunciation duration are synchronous. An eye gaze duration model based on the prosodic structure of the Chinese language is presented, replacing previous machine-learning and probability-forecasting methods, to capture readers' real internal reading rhythm and to synthesize speech with personalized rhythm.
This research enriches the forms of human-computer interaction and has practical significance and application prospects for assisted speech interaction for people with disabilities. Experiments show that Implicit Prosody mining based on human eye image capture makes the synthesized speech more flexible and expressive.
FaceWarehouse: a 3D facial expression database for visual computing.
Cao, Chen; Weng, Yanlin; Zhou, Shun; Tong, Yiying; Zhou, Kun
2014-03-01
We present FaceWarehouse, a database of 3D facial expressions for visual computing applications. We use Kinect, an off-the-shelf RGBD camera, to capture 150 individuals aged 7-80 from various ethnic backgrounds. For each person, we captured the RGBD data of her different expressions, including the neutral expression and 19 other expressions such as mouth-opening, smile, kiss, etc. For every RGBD raw data record, a set of facial feature points on the color image such as eye corners, mouth contour, and the nose tip are automatically localized, and manually adjusted if better accuracy is required. We then deform a template facial mesh to fit the depth data as closely as possible while matching the feature points on the color image to their corresponding points on the mesh. Starting from these fitted face meshes, we construct a set of individual-specific expression blendshapes for each person. These meshes with consistent topology are assembled as a rank-3 tensor to build a bilinear face model with two attributes: identity and expression. Compared with previous 3D facial databases, for every person in our database, there is a much richer matching collection of expressions, enabling depiction of most human facial actions. We demonstrate the potential of FaceWarehouse for visual computing with four applications: facial image manipulation, face component transfer, real-time performance-based facial image animation, and facial animation retargeting from video to image.
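The bilinear "identity x expression" model built from the rank-3 tensor can be sketched in miniature. This is a hedged illustration with made-up dimensions and random data, not FaceWarehouse's actual core tensor; it only shows the contraction of the core with identity and expression weight vectors:

```python
import random

random.seed(1)
N_COORDS, N_ID, N_EXP = 9, 5, 4   # toy sizes: vertex coords x identities x expressions
core = [[[random.random() for _ in range(N_EXP)]
         for _ in range(N_ID)] for _ in range(N_COORDS)]

def bilinear_face(core, w_id, w_exp):
    """face[v] = sum_i sum_e core[v][i][e] * w_id[i] * w_exp[e]"""
    return [sum(core[v][i][e] * w_id[i] * w_exp[e]
                for i in range(len(w_id)) for e in range(len(w_exp)))
            for v in range(len(core))]

w_id = [0.5, 0.5, 0.0, 0.0, 0.0]   # blend of two identities
w_exp = [1.0, 0.0, 0.0, 0.0]       # one-hot: pick the first expression
face = bilinear_face(core, w_id, w_exp)
print(len(face))
```

Fixing one weight vector and varying the other is what makes the two attributes separable: holding w_id constant while changing w_exp animates one person's face.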
Rosenow, Felix; van Alphen, Natascha; Becker, Albert; Chiocchetti, Andreas; Deichmann, Ralf; Deller, Thomas; Freiman, Thomas; Freitag, Christine M; Gehrig, Johannes; Hermsen, Anke M; Jedlicka, Peter; Kell, Christian; Klein, Karl Martin; Knake, Susanne; Kullmann, Dimitri M; Liebner, Stefan; Norwood, Braxton A; Omigie, Diana; Plate, Karlheinz; Reif, Andreas; Reif, Philipp S; Reiss, Yvonne; Roeper, Jochen; Ronellenfitsch, Michael W; Schorge, Stephanie; Schratt, Gerhard; Schwarzacher, Stephan W; Steinbach, Joachim P; Strzelczyk, Adam; Triesch, Jochen; Wagner, Marlies; Walker, Matthew C; von Wegner, Frederic; Bauer, Sebastian
2017-11-01
Despite the availability of more than 15 new "antiepileptic drugs", the proportion of patients with pharmacoresistant epilepsy has remained constant at about 20-30%. Furthermore, no disease-modifying treatments shown to prevent the development of epilepsy following an initial precipitating brain injury or to reverse established epilepsy have been identified to date. This is likely in part due to the polyetiologic nature of epilepsy, which in turn requires personalized medicine approaches. Recent advances in imaging, pathology, genetics and epigenetics have led to new pathophysiological concepts and the identification of monogenic causes of epilepsy. In the context of these advances, the First International Symposium on Personalized Translational Epilepsy Research (1st ISymPTER) was held in Frankfurt on September 8, 2016, to discuss novel approaches and future perspectives for personalized translational research. These included new developments and ideas in a range of experimental and clinical areas such as deep phenotyping, quantitative brain imaging, EEG/MEG-based analysis of network dysfunction, tissue-based translational studies, innate immunity mechanisms, microRNA as treatment targets, functional characterization of genetic variants in human cell models and rodent organotypic slice cultures, personalized treatment approaches for monogenic epilepsies, blood-brain barrier dysfunction, therapeutic focal tissue modification, computational modeling for target and biomarker identification, and cost analysis in (monogenic) disease and its treatment. This report on the meeting proceedings is aimed at stimulating much needed investments of time and resources in personalized translational epilepsy research. Part I includes the clinical phenotyping and diagnostic methods, EEG network-analysis, biomarkers, and personalized treatment approaches. In Part II, experimental and translational approaches will be discussed (Bauer et al., 2017) [1].
Bauer, Sebastian; van Alphen, Natascha; Becker, Albert; Chiocchetti, Andreas; Deichmann, Ralf; Deller, Thomas; Freiman, Thomas; Freitag, Christine M; Gehrig, Johannes; Hermsen, Anke M; Jedlicka, Peter; Kell, Christian; Klein, Karl Martin; Knake, Susanne; Kullmann, Dimitri M; Liebner, Stefan; Norwood, Braxton A; Omigie, Diana; Plate, Karlheinz; Reif, Andreas; Reif, Philipp S; Reiss, Yvonne; Roeper, Jochen; Ronellenfitsch, Michael W; Schorge, Stephanie; Schratt, Gerhard; Schwarzacher, Stephan W; Steinbach, Joachim P; Strzelczyk, Adam; Triesch, Jochen; Wagner, Marlies; Walker, Matthew C; von Wegner, Frederic; Rosenow, Felix
2017-11-01
Despite the availability of more than 15 new "antiepileptic drugs", the proportion of patients with pharmacoresistant epilepsy has remained constant at about 20-30%. Furthermore, no disease-modifying treatments shown to prevent the development of epilepsy following an initial precipitating brain injury or to reverse established epilepsy have been identified to date. This is likely in part due to the polyetiologic nature of epilepsy, which in turn requires personalized medicine approaches. Recent advances in imaging, pathology, genetics, and epigenetics have led to new pathophysiological concepts and the identification of monogenic causes of epilepsy. In the context of these advances, the First International Symposium on Personalized Translational Epilepsy Research (1st ISymPTER) was held in Frankfurt on September 8, 2016, to discuss novel approaches and future perspectives for personalized translational research. These included new developments and ideas in a range of experimental and clinical areas such as deep phenotyping, quantitative brain imaging, EEG/MEG-based analysis of network dysfunction, tissue-based translational studies, innate immunity mechanisms, microRNA as treatment targets, functional characterization of genetic variants in human cell models and rodent organotypic slice cultures, personalized treatment approaches for monogenic epilepsies, blood-brain barrier dysfunction, therapeutic focal tissue modification, computational modeling for target and biomarker identification, and cost analysis in (monogenic) disease and its treatment. This report on the meeting proceedings is aimed at stimulating much needed investments of time and resources in personalized translational epilepsy research. This Part II includes the experimental and translational approaches and a discussion of the future perspectives, while the diagnostic methods, EEG network analysis, biomarkers, and personalized treatment approaches were addressed in Part I [1].
Personal Computer System for Automatic Coronary Venous Flow Measurement
Dew, Robert B.
1985-01-01
We developed an automated system based on an IBM PC/XT personal computer to measure coronary venous blood flow during cardiac catheterization. Flow is determined by a thermodilution technique in which a cold saline solution is infused through a catheter into the coronary venous system. Regional temperature fluctuations sensed by the catheter are used to determine great cardiac vein and coronary sinus blood flow. The computer system replaces manual methods of acquiring and analyzing temperature data related to flow measurement, thereby increasing the speed and accuracy with which repetitive flow determinations can be made.
A Personal Computer-Based Head-Spine Model
1998-09-01
the CHSM. CHSM was comprised of the pelvis, the thoracolumbar spine, a single beam representation of the cervical spine, the head, the rib cage, and...developing the private sector HSM-PC project follows the Phase II program Work Plan, but continues into a Phase III SBIR program internally funded by...on completing the head and neck portion of HSM-PC, which as described in the Confidence Assessment Plan (CA Plan) will be known as the Head Cervical
2014-12-01
Primary Military Occupational Specialty PRO Proficiency Q-Q Quantile-Quantile RSS Residual Sum of Squares SI Shop Information T&R Training and...construct multivariate linear regression models to estimate Marines' Computed Tier Score and time to achieve E-4 based on their individual personal...Science (GS) score, ASVAB Mathematics Knowledge (MK) score, ASVAB Paragraph Comprehension (PC) score, weight, and whether a Marine receives a weight
Privacy Management and Networked PPD Systems - Challenges and Solutions.
Ruotsalainen, Pekka; Pharow, Peter; Petersen, Francoise
2015-01-01
Modern personal portable health devices (PPDs) are increasingly part of a larger, inhomogeneous information system. Information collected by sensors is stored and processed in global clouds. Services are often free of charge, but at the same time service providers' business models are based on the disclosure of users' intimate health information. Health data processed in PPD networks are not regulated by healthcare-specific legislation. In PPD networks, there is no guarantee that stakeholders share the same ethical principles as the user. Service providers often have their own security and privacy policies, and they rarely offer users the possibility to define their own privacy policies or to adapt existing ones. All of this raises serious ethical and privacy concerns. In this paper, the authors analyze privacy challenges in PPD networks from the user's viewpoint using a system modeling method and propose that the principle "Personal Health Data under Personal Control" must be generally accepted at the global level. Among possible implementations of this principle, the authors propose encryption, computer-understandable privacy policies, and privacy labels or trust-based privacy management methods. The latter can be realized using an infrastructural trust calculation and monitoring service. A first step is to require the protection of personal health information and to make the proposed principle internationally mandatory. This requires both regulatory and standardization activities, and the availability of open and certified software applications which all service providers can implement. One of those applications should be an independent trust verifier.
ERIC Educational Resources Information Center
Karsh, Kathryn G.
This final report describes activities of a federally funded project which developed an educational computer-assisted instructional program for persons with severe disabilities. A preliminary review of the literature identified specific inadequacies of most software for this population, such as: too few examples of a task or concept thus limiting…
Zittrain, Jonathan
2008-10-28
Ubiquitous computing means network connectivity everywhere, linking devices and systems as small as a drawing pin and as large as a worldwide product distribution chain. What could happen when people are so readily networked? This paper explores issues arising from two possible emerging models of ubiquitous human computing: fungible networked brainpower and collective personal vital sign monitoring.
Nontrivial, Nonintelligent, Computer-Based Learning.
ERIC Educational Resources Information Center
Bork, Alfred
1987-01-01
This paper describes three interactive computer programs used with personal computers to present science learning modules for all ages. Developed by groups of teachers at the Educational Technology Center at the University of California, Irvine, these instructional materials do not use the techniques of contemporary artificial intelligence. (GDC)
2012-01-01
Background In recent years, computer simulation models have supported the development of pandemic influenza preparedness policies. However, U.S. policymakers have raised several concerns about the practical use of these models. In this review paper, we examine the extent to which the current literature already addresses these concerns and identify means of enhancing the current models for higher operational use. Methods We surveyed PubMed and other sources for published research literature on simulation models for influenza pandemic preparedness. We identified 23 models published between 1990 and 2010 that consider single-region (e.g., country, province, city) outbreaks and multi-pronged mitigation strategies. We developed a plan for examination of the literature based on the concerns raised by the policymakers. Results While examining the concerns about the adequacy and validity of data, we found that though the epidemiological data supporting the models appear to be adequate, they should be validated through as many updates as possible during an outbreak. Interfaces for accessing and retrieving demographic data and translating it into model parameters must also be improved. Regarding the concern about the credibility and validity of modeling assumptions, we found that the models often simplify reality to reduce computational burden. Such simplifications may be permissible if they do not interfere with the performance assessment of the mitigation strategies. We also agreed with the concern that social behavior is inadequately represented in pandemic influenza models. Our review showed that the models consider only a few social-behavioral aspects, including contact rates, withdrawal from work or school due to symptom appearance or to care for sick relatives, and compliance with social distancing, vaccination, and antiviral prophylaxis.
The concern about the degree of accessibility of the models is palpable: we found three models that are currently accessible to the public, while others are seeking public accessibility. Policymakers would prefer models scalable to any population size that can be downloaded and run on personal computers. But scaling models to larger populations would often require computational resources that personal computers and laptops cannot provide. As a limitation, we note that some existing models could not be included in our review due to their limited available documentation discussing the choice of relevant parameter values. Conclusions To adequately address the concerns of the policymakers, we need continuing model enhancements in critical areas including: updating of epidemiological data during a pandemic, smooth handling of large demographic databases, incorporation of a broader spectrum of social-behavioral aspects, updating of information on contact patterns, adoption of recent methodologies for collecting human mobility data, and improvement of computational efficiency and accessibility. PMID:22463370
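The compartment structure underlying many of the reviewed models can be sketched in a deliberately minimal form as a deterministic SEIR model. All parameter values below are illustrative, not taken from any model in the review:

```python
# Minimal deterministic SEIR sketch (susceptible-exposed-infectious-recovered).
# beta: transmission rate, sigma: 1/latent period, gamma: 1/infectious period.
def seir(population, i0, beta, sigma, gamma, days):
    s, e, i, r = population - i0, 0.0, float(i0), 0.0
    history = []                      # daily infectious prevalence
    for _ in range(days):
        new_exposed = beta * s * i / population
        new_infectious = sigma * e
        new_recovered = gamma * i
        s -= new_exposed
        e += new_exposed - new_infectious
        i += new_infectious - new_recovered
        r += new_recovered
        history.append(i)
    return history

# Illustrative run: R0 = beta/gamma = 2 in a city of one million.
curve = seir(population=1_000_000, i0=10, beta=0.5, sigma=1/3, gamma=1/4, days=200)
print(round(max(curve)))
```

This toy model runs instantly on any personal computer; the scaling concern in the review arises when such compartments are replaced by millions of individually simulated agents with contact networks and behavior.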
Bipartite Graphs as Models of Population Structures in Evolutionary Multiplayer Games
Peña, Jorge; Rochat, Yannick
2012-01-01
By combining evolutionary game theory and graph theory, “games on graphs” study the evolutionary dynamics of frequency-dependent selection in population structures modeled as geographical or social networks. Networks are usually represented by means of unipartite graphs, and social interactions by two-person games such as the famous prisoner’s dilemma. Unipartite graphs have also been used for modeling interactions going beyond pairwise interactions. In this paper, we argue that bipartite graphs are a better alternative to unipartite graphs for describing population structures in evolutionary multiplayer games. To illustrate this point, we make use of bipartite graphs to investigate, by means of computer simulations, the evolution of cooperation under the conventional and the distributed N-person prisoner’s dilemma. We show that several implicit assumptions arising from the standard approach based on unipartite graphs (such as the definition of replacement neighborhoods, the intertwining of individual and group diversity, and the large overlap of interaction neighborhoods) can have a large impact on the resulting evolutionary dynamics. Our work provides a clear example of the importance of construction procedures in games on graphs, of the suitability of bigraphs and hypergraphs for computational modeling, and of the importance of concepts from social network analysis such as centrality, centralization and bipartite clustering for the understanding of dynamical processes occurring on networked population structures. PMID:22970237
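The distributed N-person prisoner's dilemma on a bigraph can be illustrated with a hedged sketch: players form one node set, groups the other, and each group plays a public goods game. The payoff rule below is a standard public-goods formulation with illustrative values, not necessarily the paper's exact parameterization:

```python
# Public goods game on a bipartite player-group graph.
# Cooperators pay `cost` into each group's pot; the pot is multiplied by
# `r` and shared equally among that group's members.
def public_goods_payoffs(groups, strategies, r=3.0, cost=1.0):
    """groups: list of player-index lists (the group side of the bigraph).
    strategies: dict player -> 1 (cooperate) or 0 (defect)."""
    payoffs = {p: 0.0 for p in strategies}
    for members in groups:
        pot = sum(strategies[p] * cost for p in members) * r
        share = pot / len(members)
        for p in members:
            payoffs[p] += share - strategies[p] * cost
    return payoffs

groups = [[0, 1, 2], [2, 3]]          # player 2 belongs to both groups
strategies = {0: 1, 1: 1, 2: 0, 3: 1}  # player 2 defects
print(public_goods_payoffs(groups, strategies))
```

In this tiny example the defector (player 2) free-rides in two groups at once, earning the highest payoff; making group membership explicit on the bipartite side is exactly the modeling flexibility the paper argues for.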
DRI Model of the U.S. Economy -- Model Documentation
1993-01-01
Provides documentation on Data Resources, Inc., DRI Model of the U.S. Economy and the DRI Personal Computer Input/Output Model. It also describes the theoretical basis, structure and functions of both DRI models; and contains brief descriptions of the models and their equations.
ERIC Educational Resources Information Center
Hedman, Leif; Sharafi, Parvaneh
2004-01-01
This case study explores how educational training and clinical practice that uses personal computers (PCs) and Personal Digital Assistants (PDAs) to access Internet-based medical information, affects the engagement modes of students, flow experience components, and IT-competence. A questionnaire assessing these variables was administered before…
Gul, Ahmet; Erman, Burak
2018-01-16
Prediction of peptide binding on specific human leukocyte antigens (HLA) has long been studied with successful results. We herein describe the effects of entropy and dynamics by investigating the binding stabilities of 10 nanopeptides on various HLA Class I alleles using a theoretical model based on molecular dynamics simulations. The fluctuational entropies of the peptides are estimated over a temperature range of 310-460 K. The estimated entropies correlate well with experimental binding affinities of the peptides: peptides that have higher binding affinities have lower entropies compared to non-binders, which have significantly larger entropies. The computation of the entropies is based on a simple model that requires short molecular dynamics trajectories and allows for approximate but rapid determination. The paper draws attention to the long neglected dynamic aspects of peptide binding, and provides a fast computation scheme that allows for rapid scanning of large numbers of peptides on selected HLA antigens, which may be useful in defining the right peptides for personal immunotherapy.
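The entropy estimate described above can be caricatured with a hedged, simplified sketch that is not the authors' model: a per-coordinate Gaussian (quasi-harmonic-style) fluctuational entropy computed from positional variances along a synthetic trajectory, ignoring cross-correlations. A "tight" binder with small fluctuations should come out lower-entropy than a "loose" non-binder:

```python
import math
import random

KB = 0.0019872  # Boltzmann constant in kcal/(mol K)

def fluctuation_entropy(trajectory):
    """Per-coordinate Gaussian entropy S = sum_j (KB/2) ln(2*pi*e*var_j).
    trajectory: list of frames, each a list of coordinates."""
    n_frames = len(trajectory)
    n_coords = len(trajectory[0])
    entropy = 0.0
    for j in range(n_coords):
        mean = sum(frame[j] for frame in trajectory) / n_frames
        var = sum((frame[j] - mean) ** 2 for frame in trajectory) / n_frames
        entropy += 0.5 * KB * math.log(2 * math.pi * math.e * var)
    return entropy

# Synthetic trajectories standing in for MD output: a tightly bound
# peptide fluctuates less than a loosely bound one.
random.seed(0)
tight = [[random.gauss(0, 0.1) for _ in range(3)] for _ in range(500)]
loose = [[random.gauss(0, 1.0) for _ in range(3)] for _ in range(500)]
print(fluctuation_entropy(tight) < fluctuation_entropy(loose))
```

The cheapness of this kind of estimate (short trajectories, simple statistics) is what makes rapid scanning of many candidate peptides plausible, as the abstract argues.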
Hydrodynamics Analysis and CFD Simulation of Portal Venous System by TIPS and LS.
Wang, Meng; Zhou, Hongyu; Huang, Yaozhen; Gong, Piyun; Peng, Bing; Zhou, Shichun
2015-06-01
In cirrhotic patients, portal hypertension is often associated with hyperdynamic changes. Transjugular intrahepatic portosystemic shunt (TIPS) and laparoscopic splenectomy are both treatments for liver cirrhosis with portal hypertension. However, the two interventions have different effects on hemodynamics after the operation, and the probabilities of triggering portal vein thrombosis (PVT) differ. How the hemodynamics of the portal venous system evolve after the two operations remains unknown. Based on ultrasound measurements and established numerical methods, CFD techniques are applied to analyze hemodynamic changes after TIPS and laparoscopic splenectomy. In this paper, we applied two 3-D flow models to the hemodynamic analysis of two patients, one who received a TIPS and one a laparoscopic splenectomy, both therapies for diseases induced by portal hypertension. The computer simulations give a quantitative analysis of the interplay between hemodynamics and TIPS or splenectomy. In conclusion, the presented computational model can be used for the theoretical analysis of TIPS and laparoscopic splenectomy, and clinical decisions for personalized treatment could be made based on the simulation results.
Computer-Based and Paper-Based Measurement of Semantic Knowledge
1989-01-01
of Personality Assessment, 34, 353-361. McArthur, D. L., & Choppin, B. H. (1984). Computerized diagnostic testing. Journal of Educational...Computers in Human Behavior, 1, 49-58. Lushene, R. E., O'Neil, H. F., & Dunn, T. (1974). Equivalent validity of a completely computerized MMPI. Journal
Computer-aided personal interviewing. A new technique for data collection in epidemiologic surveys.
Birkett, N J
1988-03-01
Most epidemiologic studies involve the collection of data directly from selected respondents. Traditionally, interviewers are provided with the interview in booklet form on paper and answers are recorded therein. On receipt at the study office, the interview results are coded, transcribed, and keypunched for analysis. The author's team has developed a method of personal interviewing which uses a structured interview stored on a lap-sized computer. Responses are entered into the computer and are subject to immediate error-checking and correction. All skip-patterns are automatic. Data entry to the final data-base involves no manual data transcription. A pilot evaluation with a preliminary version of the system using tape-recorded interviews in a test/re-test methodology revealed a slightly higher error rate, probably related to weaknesses in the pilot system and the training process. Computer interviews tended to be longer but other features of the interview process were not affected by computer. The author's team has now completed 2,505 interviews using this system in a community-based blood pressure survey. It has been well accepted by both interviewers and respondents. Failure to complete an interview on the computer was uncommon (5 per cent) and well-handled by paper back-up questionnaires. The results show that computer-aided personal interviewing in the home is feasible but that further evaluation is needed to establish the impact of this methodology on overall data quality.
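The two features the abstract emphasizes, immediate error-checking and automatic skip-patterns, can be sketched as follows. Question wording, ranges, and field names are hypothetical, not taken from the author's instrument:

```python
# Hypothetical interview script: each question may carry a valid-answer set,
# a numeric range check, or a skip-pattern dependency on an earlier answer.
QUESTIONS = [
    {"id": "smokes", "valid": {"yes", "no"}},
    {"id": "cigs_per_day", "valid_range": (1, 100), "skip_unless": ("smokes", "yes")},
    {"id": "systolic_bp", "valid_range": (60, 300)},
]

def run_interview(answers):
    """answers: dict question id -> raw response. Returns a validated record."""
    record = {}
    for q in QUESTIONS:
        dep = q.get("skip_unless")
        if dep and record.get(dep[0]) != dep[1]:
            record[q["id"]] = None          # skip-pattern applied automatically
            continue
        value = answers[q["id"]]
        if "valid" in q and value not in q["valid"]:
            raise ValueError(f"{q['id']}: invalid response {value!r}")
        if "valid_range" in q:
            lo, hi = q["valid_range"]
            if not lo <= int(value) <= hi:
                raise ValueError(f"{q['id']}: {value} out of range")
            value = int(value)
        record[q["id"]] = value
    return record

print(run_interview({"smokes": "no", "cigs_per_day": "20", "systolic_bp": "120"}))
```

Catching the out-of-range or inconsistent answer while the interviewer is still with the respondent, rather than at keypunching time, is the methodological gain the paper evaluates.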
Chenoweth, Lynn; Vickland, Victor; Stein-Parbury, Jane; Jeon, Yun-Hee; Kenny, Patricia; Brodaty, Henry
2015-10-01
To answer questions on the essential components (services, operations and resources) of a person-centered aged care home (iHome) using computer simulation. iHome was developed with AnyLogic software using extant study data obtained from 60 Australian aged care homes, 900+ clients and 700+ aged care staff. Bayesian analysis of simulated trial data will determine the influence of different iHome characteristics on care service quality and client outcomes. Interim results: A person-centered aged care home (socio-cultural context) and care/lifestyle services (interactional environment) can produce positive outcomes for aged care clients (subjective experiences) in the simulated environment. Further testing will define essential characteristics of a person-centered care home.
Itasaka, H; Matsumata, T; Taketomi, A; Yamamoto, K; Yanaga, K; Takenaka, K; Akazawa, K; Sugimachi, K
1994-12-01
A simple outpatient follow-up system was developed on a laptop personal computer to assist in the management of patients with hepatocellular carcinoma after hepatic resection. Since it is based on a non-relational database program and the graphical user interface of the Macintosh operating system, those who are not computer specialists can use it. It is helpful for promptly recognizing the current status and problems of the patients, diagnosing recurrences of the disease, and preventing loss of cases to follow-up. The portability of the computer also facilitates the use of these data everywhere, such as in clinical conferences and laboratories.
New GIS Watershed Analysis Tools for Soil Characterization and Erosion and Sedimentation Modeling
A comprehensive procedure for computing soil erosion and sediment delivery metrics has been developed which utilizes a suite of automated scripts and a pair of processing-intensive executable programs operating on a personal computer platform.
Manifest: A computer program for 2-D flow modeling in Stirling machines
NASA Technical Reports Server (NTRS)
Gedeon, David
1989-01-01
A computer program named Manifest is discussed. Manifest is a program one might want to use to model the fluid dynamics in the manifolds commonly found between the heat exchangers and regenerators of Stirling machines; but not just in the manifolds - in the regenerators as well. And in all sorts of other places too, such as: in heaters or coolers, or perhaps even in cylinder spaces. There are probably non-Stirling uses for Manifest also. In broad strokes, Manifest will: (1) model oscillating internal compressible laminar fluid flow in a wide range of two-dimensional regions, either filled with porous materials or empty; (2) present a graphics-based user-friendly interface, allowing easy selection and modification of region shape and boundary condition specification; (3) run on a personal computer, or optionally (in the case of its number-crunching module) on a supercomputer; and (4) allow interactive examination of the solution output so the user can view vector plots of flow velocity, contour plots of pressure and temperature at various locations and tabulate energy-related integrals of interest.
Ball, J.W.; Nordstrom, D. Kirk; Zachmann, D.W.
1987-01-01
A FORTRAN 77 version of the PL/1 computer program for the geochemical model WATEQ2, which computes major and trace element speciation and mineral saturation for natural waters, has been developed. The code (WATEQ4F) has been adapted to execute on an IBM PC or compatible microcomputer. Two versions of the code are available, one operating with IBM Professional FORTRAN and an 8087 or 80287 numeric coprocessor, and one which operates without a numeric coprocessor using Microsoft FORTRAN 77. The calculation procedure is identical to WATEQ2, which has been installed on many mainframes and minicomputers. Limited data base revisions include the addition of the following ions: AlHSO4(++), BaSO4, CaHSO4(++), FeHSO4(++), NaF, SrCO3, and SrHCO3(+). This report provides the reactions and references for the data base revisions, instructions for program operation, and an explanation of the input and output files. Attachments contain sample output from three water analyses used as test cases and the complete FORTRAN source listing. U.S. Geological Survey geochemical simulation program PHREEQE and mass balance program BALANCE also have been adapted to execute on an IBM PC or compatible microcomputer with a numeric coprocessor and the IBM Professional FORTRAN compiler. (Author's abstract)
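The mineral-saturation bookkeeping such speciation codes perform can be illustrated with a hedged sketch: the saturation index SI = log10(IAP/Ksp), shown here for calcite with illustrative round-number activities (these are not values from the WATEQ4F database):

```python
import math

def saturation_index(ion_activities, log_ksp):
    """ion_activities: dict of activities for the ions in the mineral's
    dissolution reaction; IAP is their product. SI = log10(IAP) - log10(Ksp)."""
    iap = 1.0
    for activity in ion_activities.values():
        iap *= activity
    return math.log10(iap) - log_ksp

# Calcite: CaCO3 = Ca++ + CO3--  (log Ksp about -8.48 at 25 C)
si = saturation_index({"Ca++": 1e-3, "CO3--": 1e-5}, log_ksp=-8.48)
print(round(si, 2))
```

A positive SI flags the water as supersaturated with respect to the mineral (here, calcite would tend to precipitate); a full code like WATEQ4F first computes the free-ion activities themselves by solving the coupled speciation equations.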
NASA Technical Reports Server (NTRS)
Jansen, B. J., Jr.
1998-01-01
The features of the data acquisition and control systems of the NASA Langley Research Center's Jet Noise Laboratory are presented. The Jet Noise Laboratory is a facility that simulates realistic mixed flow turbofan jet engine nozzle exhaust systems in simulated flight. The system is capable of acquiring data for a complete take-off assessment of noise and nozzle performance. This paper describes the development of an integrated system to control and measure the behavior of model jet nozzles featuring dual independent high pressure combusting air streams with wind tunnel flow. The acquisition and control system is capable of simultaneous measurement of forces, moments, static and dynamic model pressures and temperatures, and jet noise. The design concepts for the coordination of the control computers and multiple data acquisition computers and instruments are discussed. The control system design and implementation are explained, describing the features, equipment, and the experiences of using a primarily personal computer-based system. Areas for future development are examined.
Behavioral personal digital assistants: The seventh generation of computing
Stephens, Kenneth R.; Hutchison, William R.
1992-01-01
Skinner (1985) described two divergent approaches to developing computer systems that would behave with some approximation to intelligence. The first approach, which corresponds to the mainstream of artificial intelligence and expert systems, models intelligence as a set of production rules that incorporate knowledge and a set of heuristics for inference and symbol manipulation. The alternative is a system that models the behavioral repertoire as a network of associations between antecedent stimuli and operants, and adapts when supplied with reinforcement. The latter approach is consistent with developments in the field of “neural networks.” The authors describe how an existing adaptive network software system, based on behavior analysis and developed since 1983, can be extended to provide a new generation of software systems capable of acquiring verbal behavior. This effort will require the collaboration of the academic and commercial sectors of the behavioral community, but the end result will enable a generational change in computer systems and support for behavior analytic concepts. PMID:22477053
Wilaiprasitporn, Theerawit; Yagi, Tohru
2015-01-01
This research demonstrates the orientation-modulated attention effect on visual evoked potential. We combined this finding with our previous findings about the motion-modulated attention effect and used the result to develop novel visual stimuli for a personal identification number (PIN) application based on a brain-computer interface (BCI) framework. An electroencephalography amplifier with a single electrode channel was sufficient for our application. A computationally inexpensive algorithm and small datasets were used in processing. Seven healthy volunteers participated in experiments to measure offline performance. Mean accuracy was 83.3% at 13.9 bits/min. Encouraged by these results, we plan to continue developing the BCI-based personal identification application toward real-time systems.
NASA Astrophysics Data System (ADS)
Chiavassa, S.; Aubineau-Lanièce, I.; Bitar, A.; Lisbona, A.; Barbet, J.; Franck, D.; Jourdain, J. R.; Bardiès, M.
2006-02-01
Dosimetric studies are necessary for all patients treated with targeted radiotherapy. In order to attain the precision required, we have developed Oedipe, a dosimetric tool based on the MCNPX Monte Carlo code. The anatomy of each patient is considered in the form of a voxel-based geometry created using computed tomography (CT) images or magnetic resonance imaging (MRI). Oedipe enables dosimetry studies to be carried out at the voxel scale. Validation of the results obtained by comparison with existing methods is complex because there are multiple sources of variation: calculation methods (different Monte Carlo codes, point kernel), patient representations (model or specific) and geometry definitions (mathematical or voxel-based). In this paper, we validate Oedipe by taking each of these parameters into account independently. Monte Carlo methodology requires long calculation times, particularly in the case of voxel-based geometries, and this is one of the limits of personalized dosimetric methods. However, our results show that the use of voxel-based geometry as opposed to a mathematically defined geometry decreases the calculation time two-fold, due to an optimization of the MCNPX2.5e code. It is therefore possible to envisage the use of Oedipe for personalized dosimetry in the clinical context of targeted radiotherapy.
Lindblom, Katrina; Gregory, Tess; Flight, Ingrid H K; Zajac, Ian
2011-01-01
Objective This study investigated the efficacy of an internet-based personalized decision support (PDS) tool designed to aid in the decision to screen for colorectal cancer (CRC) using a fecal occult blood test. We tested whether the efficacy of the tool in influencing attitudes to screening was mediated by perceived usability and acceptability, and considered the role of computer self-efficacy and computer anxiety in these relationships. Methods Eighty-one participants aged 50–76 years worked through the on-line PDS tool and completed questionnaires on computer self-efficacy, computer anxiety, attitudes to and beliefs about CRC screening before and after exposure to the PDS, and perceived usability and acceptability of the tool. Results Repeated measures ANOVA found that PDS exposure led to a significant increase in knowledge about CRC and screening, and more positive attitudes to CRC screening as measured by factors from the Preventive Health Model. Perceived usability and acceptability of the PDS mediated changes in attitudes toward CRC screening (but not CRC knowledge), and computer self-efficacy and computer anxiety were significant predictors of individuals' perceptions of the tool. Conclusion Interventions designed to decrease computer anxiety, such as computer courses and internet training, may improve the acceptability of new health information technologies including internet-based decision support tools, increasing their impact on behavior change. PMID:21857024
A New Approach to Personalization: Integrating E-Learning and M-Learning
ERIC Educational Resources Information Center
Nedungadi, Prema; Raman, Raghu
2012-01-01
Most personalized learning systems are designed for either personal computers (e-learning) or mobile devices (m-learning). Our research has resulted in a cloud-based adaptive learning system that incorporates mobile devices into a classroom setting. This system is fully integrated into the formative assessment process and, most importantly,…
Thomas, Neil; Farhall, John; Foley, Fiona; Rossell, Susan L; Castle, David; Ladd, Emma; Meyer, Denny; Mihalopoulos, Cathrine; Leitan, Nuwan; Nunan, Cassy; Frankish, Rosalie; Smark, Tara; Farnan, Sue; McLeod, Bronte; Sterling, Leon; Murray, Greg; Fossey, Ellie; Brophy, Lisa; Kyrios, Michael
2016-09-07
Psychosocial interventions have an important role in promoting recovery in people with persisting psychotic disorders such as schizophrenia. Readily available digital technology provides a means of developing therapeutic resources for use together by practitioners and mental health service users. As part of the Self-Management and Recovery Technology (SMART) research program, we have developed an online resource providing materials on illness self-management and personal recovery based on the Connectedness-Hope-Identity-Meaning-Empowerment (CHIME) framework. Content is communicated using videos featuring persons with lived experience of psychosis discussing how they have navigated issues in their own recovery. This was developed to be suitable for use on a tablet computer during sessions with a mental health worker to promote discussion about recovery. This is a rater-blinded randomised controlled trial comparing a low intensity recovery intervention of eight one-to-one face-to-face sessions with a mental health worker using the SMART website alongside routine care, versus an eight-session comparison condition, befriending. The recruitment target is 148 participants with a schizophrenia-related disorder or mood disorder with a history of psychosis, recruited from mental health services in Victoria, Australia. Following baseline assessment, participants are randomised to intervention, and complete follow up assessments at 3, 6 and 9 months post-baseline. The primary outcome is personal recovery measured using the Process of Recovery Questionnaire (QPR). Secondary outcomes include positive and negative symptoms assessed with the Positive and Negative Syndrome Scale, subjective experiences of psychosis, emotional symptoms, quality of life and resource use. Mechanisms of change via effects on self-stigma and self-efficacy will be examined.
This protocol describes a novel intervention that tests new therapeutic methods, including in-session tablet computer use and video-based peer modelling. It also informs a possible low intensity intervention model potentially viable for delivery across the mental health workforce. NCT02474524, 24 May 2015, retrospectively registered during the recruitment phase.
Besio, Walter G; Cao, Hongbao; Zhou, Peng
2008-04-01
For persons with severe disabilities, a brain-computer interface (BCI) may be a viable means of communication. The Laplacian electroencephalogram (EEG) has been shown to improve classification in EEG recognition. In this work, the effectiveness of signals from tripolar concentric electrodes and disc electrodes was compared for use as a BCI. Two sets of left/right hand motor imagery EEG signals were acquired. An autoregressive (AR) model was developed for feature extraction, with a Mahalanobis distance based linear classifier for classification. An exhaustive selection algorithm was employed to analyze three factors before feature extraction. The factors analyzed were: 1) the length of data in each trial to be used, 2) the start position of the data, and 3) the order of the AR model. The results showed that tripolar concentric electrodes generated significantly higher classification accuracy than disc electrodes.
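The feature-extraction-plus-classification pipeline described above (AR coefficients fed to a Mahalanobis distance based classifier) can be sketched in a few lines of Python. This is an illustrative reconstruction under assumed details, not the authors' implementation; the toy signal and class means below are invented for the example.

```python
import numpy as np

def ar_features(x, order):
    """Least-squares AR(order) coefficients of a 1-D signal, used as features."""
    # Row t of the design matrix holds [x[t-1], ..., x[t-order]]
    X = np.column_stack([x[order - k - 1 : len(x) - k - 1] for k in range(order)])
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def mahalanobis_classify(feat, class_means, inv_cov):
    """Assign feat to the class whose mean is nearest in Mahalanobis distance."""
    dists = [float((feat - m) @ inv_cov @ (feat - m)) for m in class_means]
    return int(np.argmin(dists))

# Toy check: a noiseless AR(1) signal x[t] = 0.5 * x[t-1] recovers its coefficient
x = np.array([0.5 ** i for i in range(50)])
coeffs = ar_features(x, order=1)    # approximately [0.5]
label = mahalanobis_classify(coeffs, [np.array([0.5]), np.array([-0.5])], np.eye(1))
```

In a real BCI the inverse covariance would be estimated from pooled training features, and the factors the abstract lists (trial length, start position, AR order) would be selected before this step.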
ERIC Educational Resources Information Center
Miller, Christopher; Mazur, Joan M.
2001-01-01
A person-centered instructional design model was developed for virtual, Web-based environments, based on the work of Carl Rogers. This model attempts to address several issues raised in the literature. A person-centered instructional model is described and contrasted with instructionalist and constructivist approaches. Theoretical and practical…
Ukwatta, Eranga; Arevalo, Hermenegild; Li, Kristina; Yuan, Jing; Qiu, Wu; Malamas, Peter; Wu, Katherine C.
2016-01-01
Accurate representation of myocardial infarct geometry is crucial to patient-specific computational modeling of the heart in ischemic cardiomyopathy. We have developed a methodology for segmentation of left ventricular (LV) infarct from clinically acquired, two-dimensional (2D), late-gadolinium enhanced cardiac magnetic resonance (LGE-CMR) images, for personalized modeling of ventricular electrophysiology. The infarct segmentation was expressed as a continuous min-cut optimization problem, which was solved using its dual formulation, the continuous max-flow (CMF). The optimization objective comprised a smoothness term and a data term that quantified the similarity between image intensity histograms of segmented regions and those of a set of training images. A manual segmentation of the LV myocardium was used to initialize and constrain the developed method. The three-dimensional geometry of the infarct was reconstructed from its segmentation using an implicit, shape-based interpolation method. The proposed methodology was extensively evaluated using metrics based on geometry, and outcomes of individualized electrophysiological simulations of cardiac dysfunction. Several existing LV infarct segmentation approaches were implemented and compared with the proposed method. Our results demonstrated that the CMF method was more accurate than the existing approaches in reproducing expert manual LV infarct segmentations, and in electrophysiological simulations. The infarct segmentation method we have developed and comprehensively evaluated in this study constitutes an important step in advancing clinical applications of personalized simulations of cardiac electrophysiology. PMID:26731693
Translational Systems Biology and Voice Pathophysiology
Li, Nicole Y. K.; Abbott, Katherine Verdolini; Rosen, Clark; An, Gary; Hebda, Patricia A.; Vodovotz, Yoram
2011-01-01
Objectives/Hypothesis: Personalized medicine has been called upon to tailor healthcare to an individual's needs. Evidence-based medicine (EBM) has advocated using randomized clinical trials with large populations to evaluate treatment effects. However, due to large variations across patients, the results are likely not to apply to an individual patient. We suggest that a complementary, systems biology approach using computational modeling may help tackle biological complexity in order to improve ultimate patient care. The purpose of the article is: 1) to review the pros and cons of EBM, and 2) to discuss the alternative systems biology method and present its utility in clinical voice research. Study Design: Tutorial. Methods: Literature review and discussion. Results: We propose that translational systems biology can address many of the limitations of EBM pertinent to voice and other health care domains, and thus complement current health research models. In particular, recent work using mathematical modeling suggests that systems biology has the ability to quantify the highly complex biologic processes underlying voice pathophysiology. Recent data support the premise that this approach can be applied specifically in the case of phonotrauma and surgically induced vocal fold trauma, and may have particular power to address personalized medicine. Conclusions: We propose that evidence around vocal health and disease be expanded beyond a population-based method to consider more fully issues of complexity and systems interactions, especially in implementing personalized medicine in voice care and beyond. PMID:20025041
Jeunet, Camille; N'Kaoua, Bernard; Subramanian, Sriram; Hachet, Martin; Lotte, Fabien
2015-01-01
Mental-Imagery based Brain-Computer Interfaces (MI-BCIs) allow their users to send commands to a computer using their brain-activity alone (typically measured by ElectroEncephaloGraphy-EEG), which is processed while they perform specific mental tasks. While very promising, MI-BCIs remain barely used outside laboratories because of the difficulty encountered by users to control them. Indeed, although some users obtain good control performances after training, a substantial proportion remains unable to reliably control an MI-BCI. This huge variability in user-performance led the community to look for predictors of MI-BCI control ability. However, these predictors were only explored for motor-imagery based BCIs, and mostly for a single training session per subject. In this study, 18 participants were instructed to learn to control an EEG-based MI-BCI by performing 3 MI-tasks, 2 of which were non-motor tasks, across 6 training sessions, on 6 different days. Relationships between the participants' BCI control performances and their personality, cognitive profile and neurophysiological markers were explored. While no relevant relationships with neurophysiological markers were found, strong correlations between MI-BCI performances and mental-rotation scores (reflecting spatial abilities) were revealed. Also, a predictive model of MI-BCI performance based on psychometric questionnaire scores was proposed. A leave-one-subject-out cross validation process revealed the stability and reliability of this model: it enabled to predict participants' performance with a mean error of less than 3 points. This study determined how users' profiles impact their MI-BCI control ability and thus clears the way for designing novel MI-BCI training protocols, adapted to the profile of each user. PMID:26625261
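The leave-one-subject-out validation used above for the questionnaire-based predictive model can be sketched as follows. This is a generic illustration with synthetic data and an ordinary least-squares model; the study's actual predictors and model are not reproduced here.

```python
import numpy as np

def loso_mae(X, y):
    """Leave-one-subject-out cross-validation of a linear model;
    returns the mean absolute prediction error over held-out subjects."""
    n = len(y)
    errors = []
    for i in range(n):
        mask = np.arange(n) != i
        A = np.column_stack([np.ones(n - 1), X[mask]])    # intercept + predictors
        w, *_ = np.linalg.lstsq(A, y[mask], rcond=None)   # fit on n - 1 subjects
        pred = np.concatenate(([1.0], X[i])) @ w          # predict the held-out one
        errors.append(abs(pred - y[i]))
    return float(np.mean(errors))

# Synthetic "questionnaire scores" for 18 subjects, 3 predictors
rng = np.random.default_rng(0)
X = rng.normal(size=(18, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + 0.01 * rng.normal(size=18)
mae = loso_mae(X, y)    # small here, since the synthetic data are almost exactly linear
```

Because each subject is held out in turn, the error estimate reflects generalization to an unseen user, which is the property the study's "mean error of less than 3 points" refers to.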
ERIC Educational Resources Information Center
Lund, David M.; Hildreth, Donna
A case study investigated an instructional model that incorporated the personal computer and Hyperstudio (tm) software into an assignment to write and illustrate an interactive, multimedia story. Subjects were 21 students in a fifth-grade homeroom in a public school (with a state-mandated minimum 45% ratio of minority students achieved by busing…
Arc Habitat Suitability Index computer software
Thomas M. Juntti; Mark A. Rumble
2006-01-01
This user manual describes the Arc Habitat Suitability Index (ArcHSI), which is a geographical information system (GIS) model that estimates the ability of an area to meet the food and cover requirements of an animal species. The components and parameters of the model occur in tables and can be easily edited or otherwise modified. ArcHSI runs on personal computers with...
Optics Program Modified for Multithreaded Parallel Computing
NASA Technical Reports Server (NTRS)
Lou, John; Bedding, Dave; Basinger, Scott
2006-01-01
A powerful high-performance computer program for simulating and analyzing adaptive and controlled optical systems has been developed by modifying the serial version of the Modeling and Analysis for Controlled Optical Systems (MACOS) program to impart capabilities for multithreaded parallel processing on computing systems ranging from supercomputers down to Symmetric Multiprocessing (SMP) personal computers. The modifications included the incorporation of OpenMP, a portable and widely supported application programming interface that can be used to explicitly add multithreaded parallelism to an application program under a shared-memory programming model. OpenMP was applied to parallelize ray-tracing calculations, one of the major computing components in MACOS. Multithreading is also used in the diffraction propagation of light in MACOS, based on POSIX (Portable Operating System Interface) threads, or pthreads. In tests of the parallelized version of MACOS, the speedup in ray-tracing calculations was found to be linear, or proportional to the number of processors, while the speedup in diffraction calculations ranged from 50 to 60 percent, depending on the type and number of processors. The parallelized version of MACOS is portable, and, to the user, its interface is basically the same as that of the original serial version of MACOS.
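MACOS itself uses OpenMP and pthreads; the same pattern, parallelizing independent per-ray computations under a shared-memory model, can be sketched in Python with a thread pool. The trace_ray function below is a toy stand-in for one ray-trace computation, not MACOS code.

```python
from concurrent.futures import ThreadPoolExecutor
import math

def trace_ray(angle):
    """Toy stand-in for one independent ray-trace computation."""
    # A ray's accumulated path quantity through a toy system (illustrative only)
    return sum(math.cos(angle * k) for k in range(1000))

angles = [i * 0.01 for i in range(100)]

# Each ray is independent, so the loop parallelizes cleanly -- the same
# property OpenMP exploits in MACOS's ray-tracing loops. map preserves order,
# so the parallel results match a serial run exactly.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(trace_ray, angles))
```

In C with OpenMP the equivalent would be a `#pragma omp parallel for` over the ray loop; the key design requirement in both cases is that iterations share no mutable state.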
Sharma, Nandita; Gedeon, Tom
2012-12-01
Stress is a major and growing concern in our day and age, adversely impacting both individuals and society. Stress research has a wide range of benefits, from improving personal operations and learning to increasing work productivity, making it an interesting and socially beneficial area of research. This survey reviews sensors that have been used to measure stress and investigates techniques for modelling stress. It discusses non-invasive and unobtrusive sensors for measuring computed stress, a term we coin in the paper. The focus of the discussion is sensors that do not impede everyday activities and that could be used by those who would like to monitor stress levels on a regular basis (e.g. vehicle drivers, patients with illnesses linked to stress). Computational techniques have the capacity to determine optimal sensor fusion and automate data analysis for stress recognition and classification. Several computational techniques have been developed to model stress based on techniques such as Bayesian networks, artificial neural networks, and support vector machines, which this survey investigates. The survey concludes with a summary and provides possible directions for further computational stress research. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
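As a minimal illustration of the kind of model the survey covers, the sketch below trains a perceptron (a one-neuron stand-in for the neural-network and SVM approaches discussed) on synthetic two-sensor features. The sensor names and all numbers are invented for the example.

```python
import numpy as np

# Toy two-sensor feature vectors: [heart_rate_z, skin_conductance_z]
rng = np.random.default_rng(42)
calm = rng.normal([-1.0, -1.0], 0.3, size=(50, 2))
stressed = rng.normal([1.0, 1.0], 0.3, size=(50, 2))
X = np.vstack([calm, stressed])
y = np.array([0] * 50 + [1] * 50)       # 0 = calm, 1 = stressed

# Minimal perceptron: mistake-driven updates to a linear decision boundary
w, b = np.zeros(2), 0.0
for _ in range(20):
    for xi, yi in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0
        w += (yi - pred) * xi
        b += (yi - pred)

accuracy = np.mean([(1 if xi @ w + b > 0 else 0) == yi for xi, yi in zip(X, y)])
```

Real stress-recognition systems fuse many more channels and use stronger learners, but the structure, features from unobtrusive sensors mapped to a stress label, is the same.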
36 CFR 1236.2 - What definitions apply to this part?
Code of Federal Regulations, 2014 CFR
2014-07-01
... users but does not retain any transmission data), data systems used to collect and process data that have been organized into data files or data bases on either personal computers or mainframe computers...
36 CFR 1236.2 - What definitions apply to this part?
Code of Federal Regulations, 2011 CFR
2011-07-01
... users but does not retain any transmission data), data systems used to collect and process data that have been organized into data files or data bases on either personal computers or mainframe computers...
36 CFR 1236.2 - What definitions apply to this part?
Code of Federal Regulations, 2012 CFR
2012-07-01
... users but does not retain any transmission data), data systems used to collect and process data that have been organized into data files or data bases on either personal computers or mainframe computers...
36 CFR 1236.2 - What definitions apply to this part?
Code of Federal Regulations, 2010 CFR
2010-07-01
... users but does not retain any transmission data), data systems used to collect and process data that have been organized into data files or data bases on either personal computers or mainframe computers...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-23
... methods of administration (e.g., computer assisted personal interviews [CAPI], audio computer assisted self-interviews [ACASI], web-based interviews). Cognitive testing of these materials and methods will...
Verkhivker, Gennady M
2016-01-01
The human protein kinome presents one of the largest protein families that orchestrate functional processes in complex cellular networks and, when perturbed, can cause various cancers. The abundance and diversity of genetic, structural, and biochemical data underlies the complexity of mechanisms by which targeted and personalized drugs can combat mutational profiles in protein kinases. Coupled with the evolution of systems biology approaches, genomic and proteomic technologies are rapidly identifying and characterizing novel resistance mechanisms with the goal of informing the rational design of personalized kinase drugs. Integration of experimental and computational approaches can help to bring these data into a unified conceptual framework and develop robust models for predicting clinical drug resistance. In the current study, we employ a battery of synergistic computational approaches that integrate genetic, evolutionary, biochemical, and structural data to characterize the effect of cancer mutations in protein kinases. We provide a detailed structural classification and analysis of genetic signatures associated with oncogenic mutations. By integrating genetic and structural data, we employ network modeling to dissect mechanisms of kinase drug sensitivities to oncogenic EGFR mutations. Using biophysical simulations and analysis of protein structure networks, we show that conformation-specific drug binding of Lapatinib may elicit resistant mutations in the EGFR kinase that are linked with ligand-mediated changes in the residue interaction networks and the global network properties of key residues responsible for the structural stability of specific functional states. A strong network dependency on high-centrality residues in the conformation-specific Lapatinib-EGFR complex may explain the vulnerability of drug binding to a broad spectrum of mutations and the emergence of drug resistance.
Our study offers a systems-based perspective on drug design by unravelling complex relationships between robustness of targeted kinase genes and binding specificity of targeted kinase drugs. We discuss how these approaches can exploit advances in chemical biology and network science to develop novel strategies for rationally tailored and robust personalized drug therapies.
Bańkowski, Robert; Wiadrowska, Bozena; Beresińska, Martyna; Ludwicki, Jan K; Noworyta-Głowacka, Justyna; Godyń, Artur; Doruchowski, Grzegorz; Hołownicki, Ryszard
2013-01-01
Faulty but still operating agricultural pesticide sprayers may pose an unacceptable health risk for operators. The computerized models designed to calculate exposure and risk for pesticide sprayers, used as an aid in the evaluation and further authorisation of plant protection products, may also be applied to assess the health risk for operators when faulty sprayers are used. The objective was to evaluate the impact of different exposure scenarios on the health risk for operators using faulty agricultural spraying equipment by means of computer modelling. The exposure modelling was performed for 15 pesticides (5 insecticides, 7 fungicides and 3 herbicides). The critical parameter, i.e. the toxicological end-point on which the risk assessment was based, was the no-observed-adverse-effect level (NOAEL). This enabled risk to be estimated under various exposure conditions, such as the pesticide concentration in the plant protection product and the type of sprayed crop, as well as the number of treatments. Computer modelling was based on the UK POEM model, including determination of the acceptable operator exposure level (AOEL). Thus the degree of operator exposure could be defined during pesticide treatment whether or not personal protection equipment had been employed. Data used for computer modelling were obtained from simulated pesticide-substitute treatments using variously damaged knapsack sprayers. These substitute preparations contained markers that allowed computer simulations to be made, analogous to real-life exposure situations, in a dose-dependent fashion. Exposures were estimated from operator dosimetry under 'field' conditions for low, medium and high target field crops. The exposure modelling for high target field crops demonstrated exceedance of the AOEL in all simulated treatment cases (100%) using damaged sprayers, irrespective of the type of damage and whether or not individual protective measures had been adopted.
For low and medium target field crops, exceedances ranged between 40% and 80% of cases. Computer modelling may be considered a practical tool for hazard assessment when faulty agricultural sprayers are used. It may also be applied to programming the quality checks and maintenance systems of this equipment.
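The core comparison in such exposure models, an estimated systemic dose set against the AOEL, reduces to simple arithmetic. The sketch below is a generic illustration with hypothetical numbers; it is not the UK POEM model itself, which derives its dose terms from detailed surrogate exposure tables.

```python
def operator_exposure_ratio(dermal_mg_day, inhalation_mg_day, dermal_absorption,
                            body_weight_kg, aoel_mg_per_kg_day):
    """Systemic operator dose expressed as a fraction of the AOEL.
    Values > 1.0 indicate the AOEL is exceeded (unacceptable risk)."""
    systemic = (dermal_mg_day * dermal_absorption + inhalation_mg_day) / body_weight_kg
    return systemic / aoel_mg_per_kg_day

# Hypothetical numbers, not taken from the study:
ratio = operator_exposure_ratio(dermal_mg_day=30.0, inhalation_mg_day=0.5,
                                dermal_absorption=0.1, body_weight_kg=60.0,
                                aoel_mg_per_kg_day=0.02)
# ratio > 1 here, i.e. this hypothetical scenario would exceed the AOEL
```

A damaged sprayer enters such a model through inflated dermal and inhalation dose terms, which is why the simulated faulty-equipment scenarios above exceed the AOEL so often.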
ParticleCall: A particle filter for base calling in next-generation sequencing systems
2012-01-01
Background: Next-generation sequencing systems are capable of rapid and cost-effective DNA sequencing, thus enabling routine sequencing tasks and taking us one step closer to personalized medicine. Accuracy and lengths of their reads, however, are yet to surpass those provided by the conventional Sanger sequencing method. This motivates the search for computationally efficient algorithms capable of reliable and accurate detection of the order of nucleotides in short DNA fragments from the acquired data. Results: In this paper, we consider Illumina's sequencing-by-synthesis platform, which relies on reversible terminator chemistry, and describe the acquired signal by reformulating its mathematical model as a Hidden Markov Model. Relying on this model and sequential Monte Carlo methods, we develop a parameter estimation and base calling scheme called ParticleCall. ParticleCall is tested on a data set obtained by sequencing phiX174 bacteriophage using Illumina's Genome Analyzer II. The results show that the developed base calling scheme is significantly more computationally efficient than the best performing unsupervised method currently available, while achieving the same accuracy. Conclusions: The proposed ParticleCall provides more accurate calls than Illumina's base calling algorithm, Bustard. At the same time, ParticleCall is significantly more computationally efficient than other recent schemes with similar performance, rendering it more feasible for high-throughput sequencing data analysis. Improvement of base calling accuracy will have immediate beneficial effects on the performance of downstream applications such as SNP and genotype calling. ParticleCall is freely available at https://sourceforge.net/projects/particlecall. PMID:22776067
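A bootstrap particle filter of the kind ParticleCall builds on can be sketched for a toy discrete HMM. This is a generic illustration, not ParticleCall's signal model: real base calling works on fluorescence intensities, whereas here the "observations" are already noisy base symbols.

```python
import random

def particle_filter(observations, states, n_particles, p_stay, p_correct):
    """Minimal bootstrap particle filter over a discrete state space.
    A particle stays in its state with probability p_stay; an observation
    matches the true state with probability p_correct (toy emission model)."""
    particles = [random.choice(states) for _ in range(n_particles)]
    calls = []
    for obs in observations:
        # Predict: propagate each particle through the transition model
        particles = [p if random.random() < p_stay else random.choice(states)
                     for p in particles]
        # Weight: emission likelihood of the observation given each particle
        weights = [p_correct if p == obs else (1 - p_correct) / (len(states) - 1)
                   for p in particles]
        # Resample proportionally to the weights, then call the modal state
        particles = random.choices(particles, weights=weights, k=n_particles)
        calls.append(max(set(particles), key=particles.count))
    return calls

random.seed(1)
calls = particle_filter("ACGTACGT", "ACGT", n_particles=300,
                        p_stay=0.7, p_correct=0.97)
```

The predict/weight/resample cycle is the sequential Monte Carlo core; ParticleCall additionally estimates the HMM parameters (phasing, signal decay, and so on) from the data.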
Yue, Xiao; Wang, Huiju; Jin, Dawei; Li, Mingqiang; Jiang, Wei
2016-10-01
Healthcare data are a valuable source of healthcare intelligence. Sharing of healthcare data is one essential step toward making healthcare systems smarter and improving the quality of healthcare service. Healthcare data, a personal asset of the patient, should be owned and controlled by the patient, instead of being scattered across different healthcare systems, which prevents data sharing and puts patient privacy at risk. Blockchain has demonstrated, in the financial field, that trusted, auditable computing is possible using a decentralized network of peers accompanied by a public ledger. In this paper, we propose an app (called Healthcare Data Gateway (HGD)) architecture based on blockchain to enable patients to own, control and share their own data easily and securely without violating privacy, which provides a new potential way to improve the intelligence of healthcare systems while keeping patient data private. Our proposed purpose-centric access model ensures patients own and control their healthcare data; a simple unified Indicator-Centric Schema (ICS) makes it possible to organize all kinds of personal healthcare data practically and easily. We also point out that secure multi-party computing (MPC) is one promising solution to enable an untrusted third party to conduct computation over patient data without violating privacy.
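The tamper-evidence property that the blockchain layer provides can be illustrated with a minimal hash-chained ledger. This is a sketch of the general mechanism only, not the HGD architecture, and the record fields are invented.

```python
import hashlib
import json

def make_block(record: dict, prev_hash: str) -> dict:
    """Append a block whose hash covers both the record and the previous hash,
    so changing any earlier record invalidates every later link."""
    body = json.dumps(record, sort_keys=True) + prev_hash
    return {"record": record, "prev_hash": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def verify_chain(chain) -> bool:
    """Recompute each block's hash and check it points at its predecessor."""
    for prev, blk in zip(chain, chain[1:]):
        body = json.dumps(blk["record"], sort_keys=True) + blk["prev_hash"]
        if (blk["prev_hash"] != prev["hash"]
                or blk["hash"] != hashlib.sha256(body.encode()).hexdigest()):
            return False
    return True

genesis = make_block({"event": "genesis"}, "0" * 64)
chain = [genesis,
         make_block({"patient": "p1", "event": "lab_result"}, genesis["hash"])]
ok_before = verify_chain(chain)
chain[1]["record"]["event"] = "tampered"   # any edit breaks the stored hash
ok_after = verify_chain(chain)
```

A real blockchain adds consensus among peers and, in designs like the one above, access-control and MPC layers on top of this basic integrity guarantee.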
Home-Based Risk of Falling Assessment Test Using a Closed-Loop Balance Model.
Ayena, Johannes C; Zaibi, Helmi; Otis, Martin J-D; Menelas, Bob-Antoine J
2016-12-01
The aim of this study is to improve and facilitate the methods used to assess the risk of falling at home among older people, through the computation of a risk of falling in real time during daily activities. To support real-time computation of the risk of falling, a closed-loop balance model is proposed and compared with the One-Leg Standing Test (OLST). This balance model allows studying the postural response of a person subjected to an unpredictable perturbation. Twenty-nine volunteers participated in this study to evaluate the effectiveness of the proposed system: seventeen older participants, comprising ten healthy elderly subjects (68.4 ± 5.5 years) and seven Parkinson's disease (PD) subjects (66.28 ± 8.9 years), and twelve healthy young adults (28.27 ± 3.74 years). Our work suggests that there is a relationship between the OLST score and the risk of falling based on center of pressure measurements from four low-cost force sensors located inside an instrumented insole, a relationship that could be predicted using our suggested closed-loop balance model. For long-term monitoring at home, this system could be included in an electronic medical record and could be useful as a diagnostic aid tool.
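The center-of-pressure measurement from the four insole force sensors reduces to a force-weighted average of the sensor coordinates. The sensor layout and force values below are hypothetical, for illustration only.

```python
def center_of_pressure(forces, positions):
    """Center of pressure of an instrumented insole: the force-weighted
    average of the sensor coordinates (forces in N, positions in metres)."""
    total = sum(forces)
    x = sum(f * px for f, (px, _) in zip(forces, positions)) / total
    y = sum(f * py for f, (_, py) in zip(forces, positions)) / total
    return x, y

# Hypothetical four-sensor layout in the insole frame: a heel pair (y = 0)
# and a forefoot pair (y = 0.20 m)
positions = [(0.02, 0.00), (0.06, 0.00), (0.02, 0.20), (0.06, 0.20)]
cop = center_of_pressure([100.0, 100.0, 300.0, 300.0], positions)
# More force on the forefoot pair pulls the CoP forward (larger y)
```

In the closed-loop model, the trajectory of this point over time, rather than a single sample, is what characterizes the postural response to a perturbation.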
Haller, Toomas; Leitsalu, Liis; Fischer, Krista; Nuotio, Marja-Liisa; Esko, Tõnu; Boomsma, Dorothea Irene; Kyvik, Kirsten Ohm; Spector, Tim D; Perola, Markus; Metspalu, Andres
2017-01-01
Ancestry information at the individual level can be a valuable resource for personalized medicine; for medical, demographic, and historical research; and for tracing back personal history. We report a new method for quantitatively determining personal genetic ancestry based on genome-wide data. Numerical ancestry component scores are assigned to individuals based on comparisons with reference populations. These comparisons are conducted with an existing analytical pipeline making use of genotype phasing, similarity matrix computation, and our addition, multidimensional best fitting by MixFit. The method is demonstrated by studying the Estonian and Finnish populations in geographical context. We show the main differences in the genetic composition of these otherwise close European populations and how they have influenced each other. The components of our analytical pipeline are freely available computer programs and scripts, one of which was developed in-house (available at: www.geenivaramu.ee/en/tools/mixfit).
NASA Astrophysics Data System (ADS)
Ponomarev, A. A.; Mamadaliev, R. A.; Semenova, T. V.
2016-10-01
The article presents a brief overview of the current state of computed tomography in oil and gas production in Russia and worldwide. The operation of the SkyScan 1172 computed microtomograph is also described, along with practical examples of its application in solving geological problems.
Pirolli, Peter
2016-08-01
Computational models were developed in the ACT-R neurocognitive architecture to address some aspects of the dynamics of behavior change. The simulations aim to address the day-to-day goal achievement data available from mobile health systems. The models refine current psychological theories of self-efficacy, intended effort, and habit formation, and provide an account for the mechanisms by which goal personalization, implementation intentions, and remindings work.
Harmonic and anharmonic oscillations investigated by using a microcomputer-based Atwood's machine
NASA Astrophysics Data System (ADS)
Pecori, Barbara; Torzo, Giacomo; Sconza, Andrea
1999-03-01
We describe how the Atwood's machine, interfaced to a personal computer through a rotary encoder, is suited for investigating harmonic and anharmonic oscillations, exploiting the buoyancy force acting on a body immersed in water. We report experimental studies of oscillators produced by driving forces of the type F = -kx^n with n = 1, 2, 3, and F = -k sgn(x). Finally, we suggest how this apparatus can be used for showing students a macroscopic model of interatomic forces.
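As a rough sketch of the dynamics these driving forces produce, the equation of motion m·x'' = -k·sgn(x)·|x|^n can be integrated numerically; it reduces to the harmonic case for n = 1 and to the constant restoring force F = -k sgn(x) for n = 0. All parameter values below are illustrative, not those of the apparatus.

```python
import math

# Symplectic-Euler integration of m*x'' = -k*sgn(x)*|x|^n, released from
# rest at x0. Parameters (m, k, x0, dt) are assumed values for the sketch.

def quarter_period(n, x0=0.1, m=1.0, k=10.0, dt=1e-4, t_max=10.0):
    """Time of the first zero crossing, i.e. one quarter of the period."""
    x, v = x0, 0.0
    for i in range(int(t_max / dt)):
        f = -math.copysign(k * abs(x) ** n, x) if x != 0.0 else 0.0
        v += f / m * dt                 # kick
        x += v * dt                     # drift
        if x < 0.0:
            return (i + 1) * dt
    return None

# For n = 1 the full period should approach 2*pi*sqrt(m/k).
T_harmonic = 4 * quarter_period(1)
```

Comparing the measured period against such simulations for n = 1, 2, 3 is one way to check how well the apparatus realizes each force law.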
Person Re-Identification via Distance Metric Learning With Latent Variables.
Sun, Chong; Wang, Dong; Lu, Huchuan
2017-01-01
In this paper, we propose an effective person re-identification method with latent variables, which represents a pedestrian as a mixture of a holistic model and a number of flexible models. Three types of latent variables are introduced to model uncertain factors in the re-identification problem: vertical misalignments, horizontal misalignments, and leg posture variations. The distance between two pedestrians can be determined by minimizing a given distance function with respect to the latent variables, and then used to conduct the re-identification task. In addition, we develop a latent metric learning method for learning an effective metric matrix, which can be solved in an iterative manner: once the latent information is specified, the metric matrix can be obtained using standard metric learning methods; with the computed metric matrix, the latent variables can be determined by exhaustively searching the state space. Finally, extensive experiments are conducted on seven databases to evaluate the proposed method. The experimental results demonstrate that our method achieves better performance than competing algorithms.
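The core idea of a latent-variable distance can be sketched in a few lines: the distance between a probe and a gallery sample is the minimum, over all candidate latent states (e.g. discrete misalignment shifts), of a Mahalanobis-style distance under a learned metric M. This is a simplified illustration, not the authors' implementation; the feature dimension and identity metric are assumed.

```python
import numpy as np

def latent_distance(x, gallery_variants, M):
    """min over latent states z of (x - y_z)^T M (x - y_z)."""
    return min(float((x - y) @ M @ (x - y)) for y in gallery_variants)

rng = np.random.default_rng(0)
x = rng.normal(size=4)                              # probe feature vector
variants = [rng.normal(size=4) for _ in range(3)]   # one vector per latent state
M = np.eye(4)                                       # learned metric; identity for the sketch
d = latent_distance(x, variants, M)
```

In the iterative scheme the paper describes, this exhaustive minimization over latent states alternates with re-estimating M from the currently selected states.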
Spruijt-Metz, Donna; Hekler, Eric; Saranummi, Niilo; Intille, Stephen; Korhonen, Ilkka; Nilsen, Wendy; Rivera, Daniel E; Spring, Bonnie; Michie, Susan; Asch, David A; Sanna, Alberto; Salcedo, Vicente Traver; Kukakfa, Rita; Pavel, Misha
2015-09-01
Adverse and suboptimal health behaviors and habits are responsible for approximately 40% of preventable deaths, in addition to their unfavorable effects on quality of life and economics. Our current understanding of human behavior is largely based on static "snapshots" of human behavior, rather than ongoing, dynamic feedback loops of behavior in response to ever-changing biological, social, personal, and environmental states. This paper first discusses how new technologies (i.e., mobile sensors, smartphones, ubiquitous computing, and cloud-enabled processing/computing) and emerging systems modeling techniques enable the development of new, dynamic, and empirical models of human behavior that could facilitate just-in-time adaptive, scalable interventions. The paper then describes concrete steps toward the creation of robust dynamic mathematical models of behavior, including: (1) establishing "gold standard" measures; (2) creating a behavioral ontology that provides a shared language and understanding and enables dynamic theorizing across disciplines; (3) developing data sharing resources; and (4) facilitating improved sharing of mathematical models and tools to support rapid aggregation of the models. We conclude with a discussion of what might be incorporated into a "knowledge commons," which could help bring these disparate activities together into a unified system and structure for organizing knowledge about behavior.
Relative User Ratings of MMPI-2 Computer-Based Test Interpretations
ERIC Educational Resources Information Center
Williams, John E.; Weed, Nathan C.
2004-01-01
There are eight commercially available computer-based test interpretations (CBTIs) for the Minnesota Multiphasic Personality Inventory-2 (MMPI-2), of which few have been empirically evaluated. Prospective users of these programs have little scientific data to guide choice of a program. This study compared ratings of these eight CBTIs. Test users…
Peter D. Knopp; Susan L. Stout
2014-01-01
This user's guide for the SILVAH computer program, version 6.2, supersedes the 1992 user's guide (Gen. Tech. Rep. NE-162). Designed for stand-alone Windows-based personal computers, SILVAH recommends a silvicultural prescription for a forest stand based on a summary and analysis of field inventory data. The program also includes a simulator that can be used...
36 CFR § 1236.2 - What definitions apply to this part?
Code of Federal Regulations, 2013 CFR
2013-07-01
... users but does not retain any transmission data), data systems used to collect and process data that have been organized into data files or data bases on either personal computers or mainframe computers...
EVALUATION OF VENTILATION PERFORMANCE FOR INDOOR SPACE
The paper discusses a personal-computer-based application of computational fluid dynamics that can be used to determine the turbulent flow field and time-dependent/steady-state contaminant concentration distributions within an isothermal indoor space. (NOTE: Ventilation performance ...
Computational techniques to enable visualizing shapes of objects of extra spatial dimensions
NASA Astrophysics Data System (ADS)
Black, Don Vaughn, II
Envisioning extra dimensions beyond the three of common experience is a daunting challenge for three dimensional observers. Intuition relies on experience gained in a three dimensional environment. Gaining experience with virtual four dimensional objects and virtual three manifolds in four-space on a personal computer may provide the basis for an intuitive grasp of four dimensions. To enable such a capability, it is first necessary to devise and implement a computationally tractable method to visualize, explore, and manipulate objects of dimension beyond three on the personal computer. This dissertation describes a technology for converting a representation of higher dimensional models into a format that may be displayed in real time on graphics cards available in many off-the-shelf personal computers. As a result, an opportunity has been created to experience the shape of four dimensional objects on the desktop computer. The ultimate goal has been to give the user a tangible and memorable experience with mathematical models of four dimensional objects, such that the user can see the model from any selected vantage point. Using a 4D GUI, an arbitrary convex hull or 3D silhouette of the 4D model can be rotated, panned, scrolled, and zoomed until a suitable dimensionally reduced view, or Aspect, is obtained. The 4D GUI then allows the user to manipulate a 3-flat hyperplane cutting tool to slice the model at an arbitrary orientation and position to extract or "pluck" an embedded 3D slice, or Aspect, from the embedding four-space. This plucked 3D Aspect can be viewed from all angles via a conventional 3D viewer using three POV viewports, and optionally exported to a third-party CAD viewer for further manipulation.
Plucking and Manipulating the Aspect provides a tangible experience for the end-user in the same manner as any 3D Computer Aided Design viewing and manipulation tool does for the engineer or a 3D video game provides for the nascent student.
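The "plucking" of a 3D slice from four-space can be illustrated with a toy version of the idea: intersecting the edges of a unit tesseract (4-cube) with the axis-aligned hyperplane w = c. Each edge that crosses the hyperplane contributes one interpolated 3D vertex. This is a simplification for intuition only; the dissertation's cutting tool handles arbitrary hyperplane orientations and positions.

```python
from itertools import product

def slice_tesseract(c):
    """3D vertices where edges of the unit 4-cube cross the hyperplane w = c."""
    verts = list(product([0.0, 1.0], repeat=4))
    pts = []
    for a in verts:
        for b in verts:
            # an edge joins vertices differing in exactly one coordinate
            if a < b and sum(p != q for p, q in zip(a, b)) == 1:
                wa, wb = a[3], b[3]
                if wa != wb and min(wa, wb) <= c <= max(wa, wb):
                    t = (c - wa) / (wb - wa)
                    pts.append(tuple(a[i] + t * (b[i] - a[i]) for i in range(3)))
    return pts

cross_section = slice_tesseract(0.5)   # for the unit 4-cube: a unit cube's corners
```

The eight points returned are the corners of a unit cube, the familiar fact that a tesseract's axis-aligned cross-section is a cube.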
Visual privacy by context: proposal and evaluation of a level-based visualisation scheme.
Padilla-López, José Ramón; Chaaraoui, Alexandros Andre; Gu, Feng; Flórez-Revuelta, Francisco
2015-06-04
Privacy in image and video data has become an important subject since cameras are being installed in an increasing number of public and private spaces. Specifically, in assisted living, intelligent monitoring based on computer vision can allow one to provide risk detection and support services that increase people's autonomy at home. In the present work, a level-based visualisation scheme is proposed to provide visual privacy when human intervention is necessary, such as at telerehabilitation and safety assessment applications. Visualisation levels are dynamically selected based on the previously modelled context. In this way, different levels of protection can be provided, maintaining the necessary intelligibility required for the applications. Furthermore, a case study of a living room, where a top-view camera is installed, is presented. Finally, the performed survey-based evaluation indicates the degree of protection provided by the different visualisation models, as well as the personal privacy preferences and valuations of the users.
Personal computer study of finite-difference methods for the transonic small disturbance equation
NASA Technical Reports Server (NTRS)
Bland, Samuel R.
1989-01-01
Calculation of unsteady flow phenomena requires careful attention to the numerical treatment of the governing partial differential equations. The personal computer provides a convenient and useful tool for the development of meshes, algorithms, and boundary conditions needed to provide time accurate solution of these equations. The one-dimensional equation considered provides a suitable model for the study of wave propagation in the equations of transonic small disturbance potential flow. Numerical results for effects of mesh size, extent, and stretching, time step size, and choice of far-field boundary conditions are presented. Analysis of the discretized model problem supports these numerical results. Guidelines for suitable mesh and time step choices are given.
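A minimal example of the kind of one-dimensional finite-difference experiment the report describes is the first-order upwind scheme for the model advection equation u_t + c·u_x = 0, whose stability constraint (CFL = c·Δt/Δx ≤ 1) directly exhibits the mesh-size and time-step interplay discussed above. The grid, parameters, and boundary treatment below are illustrative, not the report's.

```python
import numpy as np

def upwind_advection(u0, c, dx, dt, steps):
    """First-order upwind scheme for u_t + c*u_x = 0 (c > 0 assumed)."""
    u = np.asarray(u0, dtype=float).copy()
    cfl = c * dt / dx
    assert 0.0 < cfl <= 1.0, "time step too large for this mesh (CFL > 1)"
    for _ in range(steps):
        u[1:] -= cfl * (u[1:] - u[:-1])   # one-sided (upwind) difference
        # u[0] is held fixed: a crude far-field boundary placeholder
    return u

x = np.linspace(0.0, 1.0, 101)
u0 = np.exp(-200.0 * (x - 0.3) ** 2)      # Gaussian pulse centred at x = 0.3
u = upwind_advection(u0, c=1.0, dx=x[1] - x[0], dt=0.005, steps=40)
```

Varying dx, dt, and the boundary condition in such a sandbox is exactly the kind of study a personal computer makes convenient before committing to a full transonic solver.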
Probability-based collaborative filtering model for predicting gene-disease associations.
Zeng, Xiangxiang; Ding, Ningxiang; Rodríguez-Patón, Alfonso; Zou, Quan
2017-12-28
Accurately predicting pathogenic human genes has been challenging in recent research. Considering extensive gene-disease data verified by biological experiments, we can apply computational methods to perform accurate predictions with reduced time and expenses. We propose a probability-based collaborative filtering model (PCFM) to predict pathogenic human genes. Several kinds of data sets, containing data of humans and data of other nonhuman species, are integrated in our model. Firstly, on the basis of a typical latent factorization model, we propose model I with an average heterogeneous regularization. Secondly, we develop modified model II with personal heterogeneous regularization to enhance the accuracy of aforementioned models. In this model, vector space similarity or Pearson correlation coefficient metrics and data on related species are also used. We compared the results of PCFM with the results of four state-of-arts approaches. The results show that PCFM performs better than other advanced approaches. PCFM model can be leveraged for predictions of disease genes, especially for new human genes or diseases with no known relationships.
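The general shape of a regularized latent-factor model of the kind PCFM builds on can be sketched as follows. The dimensions, learning rate, regularization weight, and toy association matrix are all assumed for illustration and are not the paper's data or code.

```python
import numpy as np

# Gradient descent on an L2-regularized latent-factor model: genes and
# diseases get k-dimensional latent vectors, fitted so G @ D.T approximates
# the known gene-disease association matrix R (toy data).

rng = np.random.default_rng(1)
n_genes, n_diseases, k = 6, 5, 3
R = (rng.random((n_genes, n_diseases)) < 0.3).astype(float)

G = 0.1 * rng.normal(size=(n_genes, k))
D = 0.1 * rng.normal(size=(n_diseases, k))
lr, lam = 0.02, 0.01

mse_start = float(np.mean((R - G @ D.T) ** 2))
for _ in range(2000):
    E = R - G @ D.T                     # residual over all gene-disease pairs
    G += lr * (E @ D - lam * G)         # regularized gradient steps
    D += lr * (E.T @ G - lam * D)
scores = G @ D.T                        # predicted association strengths
mse_end = float(np.mean((R - scores) ** 2))
```

The paper's "personal heterogeneous regularization" replaces the single uniform λ here with per-entity penalties informed by similarity metrics and cross-species data.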
NASA Astrophysics Data System (ADS)
Sylvan, David
At least since Adam Smith's The Wealth of Nations, it has been understood that social systems can be considered as having emergent properties not reducible to the actions of individuals. The appeal of this idea is obvious, no different now than in Smith's time: that aggregates of persons can be ordered without such order being intended or enforced by any particular person or persons. A search for such an "invisible hand" is what brings many of us to the study of complexity and the construction of various types of computational models aimed at capturing it. However, in proceeding along these lines, we have tended to focus on particular types of social systems — what I will in this paper call "thin" systems, such as markets and populations — and ignored other types, such as groups, whose base interactions are "thick," i.e., constructed as one of many possibilities, by the participants, at the moment in which they take place. These latter systems are not only ubiquitous but pose particular modeling problems for students of complexity: the local interactions are themselves complex and the systems display no strongly emergent features.
ERIC Educational Resources Information Center
Cassel, Russell N.; Sumantardja, Elmira N.
1982-01-01
Describes the Type-A personality as the result of a mad pursuit of excellence, characteristic of Western culture. Relaxation training and stress reduction in management, combined with careful ordering of priorities for single-goal attainment, result in Type-C personalities, which implies the development of coping skills for achieving goals.…
Computational strategy for quantifying human pesticide exposure based upon a saliva measurement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Timchalk, Charles; Weber, Thomas J.; Smith, Jordan N.
The National Research Council of the National Academies report, Toxicity Testing in the 21st Century: A Vision and Strategy, highlighted the importance of quantitative exposure data for evaluating human toxicity risk and noted that biomonitoring is a critical tool for quantitatively evaluating exposure in both environmental and occupational settings. Direct measurement of chemical exposures using personal monitoring provides the most accurate estimation of a subject's true exposure, and non-invasive methods have also been advocated for quantifying the pharmacokinetics and bioavailability of drugs and xenobiotics. In this regard, there is a need to identify chemicals that are readily cleared in saliva at concentrations that can be quantified to support the implementation of this approach. The current manuscript describes the use of computational modeling approaches, closely coupled to in vivo and in vitro experiments, to predict the salivary uptake and clearance of xenobiotics. The primary mechanisms by which xenobiotics leave the blood and enter saliva are thought to involve paracellular transport, passive transcellular diffusion, or transcellular active transport, with the majority of drugs and xenobiotics cleared from plasma into saliva by passive diffusion. The transcellular or paracellular diffusion of unbound chemicals from plasma to saliva has been computationally modeled using a combination of compartmental and physiologically based approaches. Of key importance for determining the plasma:saliva partitioning was the use of a modified Schmitt algorithm that calculates partitioning based upon tissue composition, pH, chemical pKa, and plasma protein binding.
Sensitivity analysis of key model parameters specifically identified that both protein binding and pKa (for weak acids and bases) had the most significant impact on the determination of partitioning, and that there were clear species-dependent differences based upon physiological variance between rats and humans. Ongoing efforts are focused on extending this modeling strategy to an in vitro salivary acinar cell-based system that will be used to experimentally determine and computationally predict salivary gland uptake and clearance for a broad range of xenobiotics. Hence, it is envisioned that a combination of salivary biomonitoring and computational modeling will enable the non-invasive measurement of both environmental and occupational exposure in human populations using saliva.
Mind-life continuity: A qualitative study of conscious experience.
Hipólito, Inês; Martins, Jorge
2017-12-01
There are two fundamental models to understanding the phenomenon of natural life. One is the computational model, which is based on the symbolic thinking paradigm. The other is the biological organism model. The common difficulty attributed to these paradigms is that their reductive tools allow the phenomenological aspects of experience to remain hidden behind yes/no responses (behavioral tests), or brain 'pictures' (neuroimaging). Hence, one of the problems regards how to overcome methodological difficulties towards a non-reductive investigation of conscious experience. It is our aim in this paper to show how cooperation between Eastern and Western traditions may shed light for a non-reductive study of mind and life. This study focuses on the first-person experience associated with cognitive and mental events. We studied phenomenal data as a crucial fact for the domain of living beings, which, we expect, can provide the ground for a subsequent third-person study. The intervention with Jhana meditation, and its qualitative assessment, provided us with experiential profiles based upon subjects' evaluations of their own conscious experiences. The overall results should move towards an integrated or global perspective on mind where neither experience nor external mechanisms have the final word. Copyright © 2017. Published by Elsevier Ltd.
Weyand, Sabine; Takehara-Nishiuchi, Kaori; Chau, Tom
2015-10-30
Near-infrared spectroscopy (NIRS) brain-computer interfaces (BCIs) enable users to interact with their environment using only cognitive activities. This paper presents the results of a comparison of four methodological frameworks used to select a pair of tasks to control a binary NIRS-BCI; specifically, three novel personalized task paradigms and the state-of-the-art prescribed task framework were explored. Three types of personalized task selection approaches were compared, including: user-selected mental tasks using weighted slope scores (WS-scores), user-selected mental tasks using pair-wise accuracy rankings (PWAR), and researcher-selected mental tasks using PWAR. These paradigms, along with the state-of-the-art prescribed mental task framework, where mental tasks are selected based on the most commonly used tasks in literature, were tested by ten able-bodied participants who took part in five NIRS-BCI sessions. The frameworks were compared in terms of their accuracy, perceived ease-of-use, computational time, user preference, and length of training. Most notably, researcher-selected personalized tasks resulted in significantly higher accuracies, while user-selected personalized tasks resulted in significantly higher perceived ease-of-use. It was also concluded that PWAR minimized the amount of data that needed to be collected; while, WS-scores maximized user satisfaction and minimized computational time. In comparison to the state-of-the-art prescribed mental tasks, our findings show that overall, personalized tasks appear to be superior to prescribed tasks with respect to accuracy and perceived ease-of-use. The deployment of personalized rather than prescribed mental tasks ought to be considered and further investigated in future NIRS-BCI studies. Copyright © 2015 Elsevier B.V. All rights reserved.
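The pair-wise accuracy ranking (PWAR) idea described above reduces to ranking candidate task pairs by the offline classification accuracy achieved when discriminating them. A minimal sketch follows; the task names and accuracy values are hypothetical, not the study's data.

```python
# Rank candidate BCI task pairs by offline pairwise classification accuracy
# (a sketch of the PWAR selection idea; all values are assumed).

def rank_task_pairs(pair_accuracy):
    """Return (pair, accuracy) tuples sorted from best to worst."""
    return sorted(pair_accuracy.items(), key=lambda kv: kv[1], reverse=True)

pair_accuracy = {
    ("mental math", "rest"): 0.78,
    ("word generation", "rest"): 0.71,
    ("mental math", "word generation"): 0.64,
}
best_pair, best_acc = rank_task_pairs(pair_accuracy)[0]
```

The study's comparison then hinges on who applies this ranking, the researcher or the user, and on whether a subjective weighted-slope score is used instead.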
Providing Assistive Technology Applications as a Service Through Cloud Computing.
Mulfari, Davide; Celesti, Antonio; Villari, Massimo; Puliafito, Antonio
2015-01-01
Users with disabilities interact with personal computers (PCs) using assistive technology (AT) software solutions. Such applications run on the PC that a person with a disability commonly uses. However, the configuration of AT applications is not trivial at all, especially when the user needs to work on a PC that does not allow him/her to rely on his/her AT tools (e.g., at work, at university, or at an Internet point). In this paper, we discuss how cloud computing provides a valid technological solution to enhance such a scenario. With the emergence of cloud computing, many applications are executed on top of virtual machines (VMs). Virtualization allows us to achieve a software implementation of a real computer able to execute a standard operating system and any kind of application. In this paper we propose to build personalized VMs running AT programs and settings. By using remote desktop technology, our solution enables users to control their customized virtual desktop environment by means of an HTML5-based web interface running on any computer equipped with a browser, wherever they are.
The IEA/ORAU Long-Term Global Energy- CO2 Model: Personal Computer Version A84PC
Edmonds, Jae A.; Reilly, John M.; Boden, Thomas A. [CDIAC; Reynolds, S. E. [CDIAC; Barns, D. W.
1995-01-01
The IBM A84PC version of the Edmonds-Reilly model has the capability to calculate both CO2 and CH4 emission estimates by source and region. Population, labor productivity, end-use energy efficiency, income effects, price effects, resource base, technological change in energy production, environmental costs of energy production, market-penetration rate of energy-supply technology, solar and biomass energy costs, synfuel costs, and the number of forecast periods may be interactively inspected and altered producing a variety of global and regional CO2 and CH4 emission scenarios for 1975 through 2100. Users are strongly encouraged to see our instructions for downloading, installing, and running the model.
Crew appliance computer program manual, volume 1
NASA Technical Reports Server (NTRS)
Russell, D. J.
1975-01-01
Trade studies of numerous appliance concepts for advanced spacecraft galley, personal hygiene, housekeeping, and other areas were made to determine which best satisfy the space shuttle orbiter and modular space station mission requirements. Analytical models of selected appliance concepts not currently included in the G-189A Generalized Environmental/Thermal Control and Life Support Systems (ETCLSS) Computer Program subroutine library were developed. The new appliance subroutines are given along with complete analytical model descriptions, solution methods, user's input instructions, and validation run results. The appliance components modeled were integrated with G-189A ETCLSS models for shuttle orbiter and modular space station, and results from computer runs of these systems are presented.
Personality from a cognitive-biological perspective
NASA Astrophysics Data System (ADS)
Neuman, Yair
2014-12-01
The term "personality" is used to describe a distinctive and relatively stable set of mental traits that aims to explain the organism's behavior. The concept of personality that emerged in human psychology has also been applied to the study of non-human organisms, from birds to horses. In this paper, I critically review the concept of personality from an interdisciplinary perspective and point to some ideas that may be used for developing a cognitive-biological theory of personality. Integrating theories and research findings from fields such as cognitive ethology, clinical psychology, and neuroscience, I argue that the common denominator of various personality theories is a set of neural systems for threat/trust management, together with their emotional, cognitive, and behavioral dimensions. In this context, personality may also be conceived as a meta-heuristic that both human and non-human organisms apply to model and predict the behavior of others. The paper concludes by suggesting a minimal computational model of personality that may guide future research.
Heliport noise model (HNM) version 1 user's guide
DOT National Transportation Integrated Search
1988-02-01
This document contains the instructions to execute the Heliport Noise Model (HNM), Version 1. HNM Version 1 is a computer tool for determining the total impact of helicopter noise at and around heliports. The model runs on IBM PC/XT/AT personal compu...
A First Step towards a Clinical Decision Support System for Post-traumatic Stress Disorders.
Ma, Sisi; Galatzer-Levy, Isaac R; Wang, Xuya; Fenyö, David; Shalev, Arieh Y
2016-01-01
PTSD is distressing and debilitating, following a non-remitting course in about 10% to 20% of trauma survivors. Numerous risk indicators of PTSD have been identified, but individual-level prediction remains elusive. As an effort to bridge the gap between scientific discovery and practical application, we designed and implemented a clinical decision support pipeline to provide clinically relevant recommendations for trauma survivors. To meet the specific challenge of early prediction, this work uses data obtained within ten days of a traumatic event. The pipeline creates a personalized predictive model for each individual and computes quality metrics for each predictive model. Clinical recommendations are made based on both the model's prediction and its quality, thus avoiding potentially detrimental recommendations based on insufficient information or a suboptimal model. The current pipeline outperforms acute stress disorder, a commonly used clinical risk factor for PTSD development, in terms of both sensitivity and specificity.
26 CFR 1.168(j)-1T - Questions and answers concerning tax-exempt entity leasing rules (temporary).
Code of Federal Regulations, 2011 CFR
2011-04-01
... technological equipment” means (1) any computer or peripheral equipment, (2) any high technology telephone..., electromechanical, or computer-based high technology equipment which is tangible personal property used in the... before the expiration of its physical useful life. High technology medical equipment may include computer...
Visual Persons Behavior Diary Generation Model based on Trajectories and Pose Estimation
NASA Astrophysics Data System (ADS)
Gang, Chen; Bin, Chen; Yuming, Liu; Hui, Li
2018-03-01
The behavior patterns of persons are an important output of surveillance analysis. This paper focuses on a generation model for a visual diary of person behavior. The pipeline comprises person detection, tracking, and behavior classification. The paper adopts the deep convolutional neural model YOLOv2 (You Only Look Once) for the person detection module. Multi-person tracking is built on this detection framework, with the Hungarian assignment algorithm used for matching. The person appearance model combines an HSV color model with a hash-code model, and person motion is estimated by a Kalman filter. Detected persons are matched against existing tracklets through appearance and motion-location distances using the Hungarian assignment method. A long, continuous trajectory for each person is obtained by a spatial-temporal linking algorithm, and face recognition information is used to identify the trajectory. Trajectories with identification information can then be used to generate a visual diary of person behavior, based on scene context information and person action estimation. The relevant modules were tested on public data sets and on our own captured video sets. The results show that the method can generate visual person behavior diaries with reasonable accuracy.
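The matching step in such a tracker can be sketched as a Hungarian assignment over a cost matrix that mixes an appearance distance with a motion-location distance (the location being a Kalman-predicted position). The weights, gating threshold, and toy feature vectors below are assumed values, not the paper's; the assignment itself uses SciPy's `linear_sum_assignment`.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def match(tracks, detections, w_app=0.5, w_mot=0.5, gate=1.0):
    """Assign detections to tracklets by minimum combined cost (gated)."""
    cost = np.zeros((len(tracks), len(detections)))
    for i, t in enumerate(tracks):
        for j, d in enumerate(detections):
            app = np.linalg.norm(t["feat"] - d["feat"])        # appearance distance
            mot = np.linalg.norm(t["pred_pos"] - d["pos"])     # motion/location distance
            cost[i, j] = w_app * app + w_mot * mot
    rows, cols = linear_sum_assignment(cost)                   # Hungarian algorithm
    return [(i, j) for i, j in zip(rows, cols) if cost[i, j] < gate]

tracks = [
    {"feat": np.array([1.0, 0.0]), "pred_pos": np.array([10.0, 20.0])},
    {"feat": np.array([0.0, 1.0]), "pred_pos": np.array([40.0, 50.0])},
]
detections = [
    {"feat": np.array([0.1, 0.9]), "pos": np.array([40.2, 49.8])},
    {"feat": np.array([0.9, 0.1]), "pos": np.array([10.3, 20.1])},
]
pairs = match(tracks, detections, w_app=0.5, w_mot=0.01, gate=5.0)
```

Unmatched detections would then spawn new tracklets, and unmatched tracklets would coast on their Kalman predictions before being dropped.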
ERIC Educational Resources Information Center
Tardif-Williams, Christine Y.; Owen, Frances; Feldman, Maurice; Tarulli, Donato; Griffiths, Dorothy; Sales, Carol; McQueen-Fuentes, Glenys; Stoner, Karen
2007-01-01
We tested the effectiveness of an interactive, video CD-ROM in teaching persons with intellectual disabilities (ID) about their human rights. Thirty-nine participants with ID were trained using both a classroom activity-based version of the training program and the interactive CD-ROM in a counterbalanced presentation. All individuals were pre- and…
Analysis of brute-force break-ins of a palmprint authentication system.
Kong, Adams W K; Zhang, David; Kamel, Mohamed
2006-10-01
Biometric authentication systems are widely applied because they offer inherent advantages over classical knowledge-based and token-based personal-identification approaches. This has led to the development of products using palmprints as biometric traits and their use in several real applications. However, as biometric systems are vulnerable to replay, database, and brute-force attacks, such potential attacks must be analyzed before biometric systems are massively deployed in security systems. This correspondence proposes a projected multinomial distribution for studying the probability of successfully using brute-force attacks to break into a palmprint system. To validate the proposed model, we have conducted a simulation. Its results demonstrate that the proposed model can accurately estimate the probability. The proposed model indicates that it is computationally infeasible to break into the palmprint system using brute-force attacks.
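The basic quantity behind any such brute-force analysis is the chance that at least one of k independent random guesses is accepted, given a per-attempt false-acceptance probability p. This elementary formula is a hedged simplification for intuition, not the authors' projected multinomial model; the rate and attempt count below are illustrative.

```python
def break_in_probability(p, attempts):
    """Chance that at least one of `attempts` independent guesses is accepted."""
    return 1.0 - (1.0 - p) ** attempts

# With a false-acceptance rate of 1e-8, even a million attempts succeed
# only about 1% of the time, the sense in which an attack is infeasible.
p_million = break_in_probability(1e-8, 10**6)
```

The paper's contribution is to derive a realistic per-attempt probability p for palmprint features, which is what the simulation validates.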
Modeling Trait Anxiety: From Computational Processes to Personality
Raymond, James G.; Steele, J. Douglas; Seriès, Peggy
2017-01-01
Computational methods are increasingly being applied to the study of psychiatric disorders. Often, this involves fitting models to the behavior of individuals with subclinical character traits that are known vulnerability factors for the development of psychiatric conditions. Anxiety disorders can be examined with reference to the behavior of individuals high in “trait” anxiety, which is a known vulnerability factor for the development of anxiety and mood disorders. However, it is not clear how this self-report measure relates to neural and behavioral processes captured by computational models. This paper reviews emerging computational approaches to the study of trait anxiety, specifying how interacting processes susceptible to analysis using computational models could drive a tendency to experience frequent anxious states and promote vulnerability to the development of clinical disorders. Existing computational studies are described in the light of this perspective and appropriate targets for future studies are discussed. PMID:28167920
Multi-tasking computer control of video related equipment
NASA Technical Reports Server (NTRS)
Molina, Rod; Gilbert, Bob
1989-01-01
The flexibility, cost-effectiveness and widespread availability of personal computers now make it possible to completely integrate the previously separate elements of video post-production into a single device. Specifically, a personal computer, such as the Commodore-Amiga, can perform multiple and simultaneous tasks from an individual unit. Relatively low cost, minimal space requirements, and user-friendliness provide the most favorable environment for the many phases of video post-production. Computers are well known for their basic abilities to process numbers, text and graphics and to reliably perform repetitive and tedious functions efficiently. These capabilities can now serve as either additions or alternatives to existing video post-production methods. A present example of computer-based video post-production technology is the RGB CVC (Computer and Video Creations) WorkSystem. A wide variety of integrated functions are made possible with an Amiga computer at the heart of the system.
S3DB core: a framework for RDF generation and management in bioinformatics infrastructures
2010-01-01
Background Biomedical research is set to greatly benefit from the use of semantic web technologies in the design of computational infrastructure. However, beyond well-defined research initiatives, substantial issues of data heterogeneity, source distribution, and privacy currently stand in the way of the personalization of medicine. Results A computational framework for bioinformatic infrastructure was designed to deal with the heterogeneous data sources and the sensitive mixture of public and private data that characterizes the biomedical domain. This framework consists of a logical model built with semantic web tools, coupled with a Markov process that propagates user operator states. An accompanying open source prototype was developed to meet a series of applications that range from collaborative multi-institution data acquisition efforts to data analysis applications that need to quickly traverse complex data structures. This report describes the two abstractions underlying the S3DB-based infrastructure, logical and numerical, and discusses its generality beyond the immediate confines of existing implementations. Conclusions The emergence of the "web as a computer" requires a formal model for the different functionalities involved in reading and writing to it. The S3DB core model proposed was found to address the design criteria of biomedical computational infrastructure, such as those supporting large scale multi-investigator research, clinical trials, and molecular epidemiology. PMID:20646315
Simulation framework for intelligent transportation systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ewing, T.; Doss, E.; Hanebutte, U.
1996-10-01
A simulation framework has been developed for a large-scale, comprehensive, scaleable simulation of an Intelligent Transportation System (ITS). The simulator is designed for running on parallel computers and distributed (networked) computer systems, but can run on standalone workstations for smaller simulations. The simulator currently models instrumented smart vehicles with in-vehicle navigation units capable of optimal route planning and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide two-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. Realistic modeling of variations of the posted driving speed are based on human factors studies that take into consideration weather, road conditions, driver personality and behavior, and vehicle type. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on parallel computers, such as ANL's IBM SP-2, for large-scale problems. A novel feature of the approach is that vehicles are represented by autonomous computer processes which exchange messages with other processes. The vehicles have a behavior model which governs route selection and driving behavior, and can react to external traffic events much like real vehicles. With this approach, the simulation is scaleable to take advantage of emerging massively parallel processor (MPP) systems.
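The vehicles-as-autonomous-processes design described above can be illustrated with a toy message-passing sketch; here Python threads and queues stand in for the distributed processes, and all names and message contents are illustrative only, not the simulator's actual protocol:

```python
import queue
import threading

class Vehicle(threading.Thread):
    """Toy autonomous vehicle agent: reports its position to the Traffic
    Management Center (TMC) and reacts to advisories it receives."""
    def __init__(self, name, inbox, tmc_inbox):
        super().__init__()
        self.name, self.inbox, self.tmc_inbox = name, inbox, tmc_inbox

    def run(self):
        # Report position to the TMC, then block until an advisory
        # arrives and react to it (e.g. by re-planning the route).
        self.tmc_inbox.put((self.name, "position", (1.2, 3.4)))
        msg = self.inbox.get()
        if msg == "advisory:congestion":
            self.tmc_inbox.put((self.name, "rerouted", None))

tmc_inbox = queue.Queue()   # messages from vehicles to the TMC
inbox = queue.Queue()       # messages from the TMC to one vehicle
v = Vehicle("car-1", inbox, tmc_inbox)
v.start()
inbox.put("advisory:congestion")  # TMC sends a two-way advisory
v.join()
```

In the actual simulator the agents are separate processes on parallel or networked machines, which is what makes the design scale; threads and queues merely mimic that structure on one machine.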
Virtual reality for dermatologic surgery: virtually a reality in the 21st century.
Gladstone, H B; Raugi, G J; Berg, D; Berkley, J; Weghorst, S; Ganter, M
2000-01-01
In the 20th century, virtual reality has predominantly played a role in training pilots and in the entertainment industry. Despite much publicity, virtual reality did not live up to its perceived potential. During the past decade, it has also been applied for medical uses, particularly as training simulators, for minimally invasive surgery. Because of advances in computer technology, virtual reality is on the cusp of becoming an effective medical educational tool. At the University of Washington, we are developing a virtual reality soft tissue surgery simulator. Based on fast finite element modeling and using a personal computer, this device can simulate three-dimensional human skin deformations with real-time tactile feedback. Although there are many cutaneous biomechanical challenges to solve, it will eventually provide more realistic dermatologic surgery training for medical students and residents than the currently used models.
Physical activity interventions using mass media, print media, and information technology.
Marcus, B H; Owen, N; Forsyth, L H; Cavill, N A; Fridinger, F
1998-11-01
Media-based physical activity interventions include a variety of print, graphic, audiovisual, and broadcast media programs intended to influence behavior change. New information technology allows print to be delivered in personalized, interactive formats that may enhance efficacy. Media-based interventions have been shaped by conceptual models from health education, Social Cognitive Theory, the Transtheoretical Model, and Social Marketing frameworks. We reviewed 28 studies of media-based interventions of which seven were mass media campaigns at the state or national level and the remaining 21 were delivered through health care, the workplace, or in the community. Recall of mass-media messages generally was high, but mass-media campaigns had very little impact on physical activity behavior. Interventions using print and/or telephone were effective in changing behavior in the short term. Studies in which there were more contacts and interventions tailored to the target audience were most effective. A key issue for research on media-based physical activity interventions is reaching socially disadvantaged groups for whom access, particularly to new forms of communication technology, may be limited. There is a clear need for controlled trials comparing different forms and intensities of media-based physical activity interventions. Controlled studies of personalized print, interactive computer-mediated programs, and web-based formats for program delivery also are needed. The integration of media-based methods into public and private sector service delivery has much potential for innovation.
Beyer, Jonathan A.; Lumley, Mark A.; Latsch, Deborah A.; Oberleitner, Lindsay M.S.; Carty, Jennifer N.; Radcliffe, Alison M.
2014-01-01
Standard written emotional disclosure (WED) about stress, which is private and unguided, yields small health benefits. The effect of providing individualized guidance to writers may enhance WED, but has not been tested. This trial of computer-based WED compared two novel therapist-guided forms of WED—advance guidance (before sessions) or real-time guidance (during sessions, through instant messaging)—to both standard WED and control writing; it also tested Big 5 personality traits as moderators of guided WED. Young adult participants (n = 163) with unresolved stressful experiences were randomized to conditions, had three, 30-min computer-based writing sessions, and were reassessed 6 weeks later. Contrary to hypotheses, real-time guidance WED had poorer outcomes than the other conditions on several measures, and advance guidance WED also showed some poorer outcomes. Moderator analyses revealed that participants with low baseline agreeableness, low extraversion, or high conscientiousness had relatively poor responses to guidance. We conclude that providing guidance for WED, especially in real-time, may interfere with emotional processing of unresolved stress, particularly for people whose personalities have poor fit with this interactive form of WED. PMID:24266598
NASA Astrophysics Data System (ADS)
Wu, Huaying; Wang, Li Zhong; Wang, Yantao; Yuan, Xiaolei
2018-05-01
The blade or grinding surface of a hypervelocity grinding wheel may break loose and fly out if the rotation rate of the machine spindle is too high; as a projectile, it may severely endanger field personnel. A critical thickness model for the protective plate of a high-speed machine is studied in this paper. For ease of analysis, the shapes of possible impact objects flying from the machine are simplified to a sharp-nose model, a ball-nose model and a flat-nose model, whose front-end shapes represent point, line and surface contact, respectively. Impact analysis based on the Johnson-Cook (J-C) material model is performed for low-carbon steel plates of different thicknesses. A critical thickness computational model for the protective plate of a high-speed machine is established from the damage characteristics of the thin plate, relating plate thickness to the mass, shape, size and impact speed of the impact object. An air cannon is used for impact testing, and the model's accuracy is validated. This model can guide selection of the thickness of the single-layer outer protective plate of a high-speed machine.
PERSONAL COMPUTERS AND ENVIRONMENTAL ENGINEERING
This article discusses how personal computers can be applied to environmental engineering. After explaining some of the differences between mainframe and personal computers, we will review the development of personal computers and describe the areas of data management, interactive...
Perceptions about computers and the internet in a pediatric clinic population.
Carroll, Aaron E; Zimmerman, Frederick J; Rivara, Frederick P; Ebel, Beth E; Christakis, Dimitri A
2005-01-01
A digital divide with respect to computer and Internet access has been noted in numerous studies and reports. Equally important to ownership is comfort with computers and Internet technology, and concerns about privacy of personal data. To measure how households in a pediatric clinic vary in their attitudes toward computers, concerns about Internet confidentiality, and comfort using the Internet and whether these views are associated with household income or education. A phone survey was administered to a population-based sample of parents with children aged 0 to 11 years. All children received medical care from a community-based clinic network serving patients in King County, Wash. Eighty-eight percent of respondents used a computer once a week or more, and 83% of respondents reported favorable feelings toward computers. Although 97% of respondents were willing to share personal information over the Internet, many respondents considered data security important. While household income and parental education were associated with comfort and familiarity with computers, the effect is small. Respondents who already owned a computer and had Internet access did not differ in their perceptions according to socioeconomic or educational attainment. Most families like using computers and feel comfortable using the Internet regardless of socioeconomic status. Fears about the digital divide's impact on the attitudes of parents toward computers or their comfort using the Internet should not be seen as a barrier to developing Internet-based health interventions for a pediatric clinic population.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Compliant Microcomputers, Including Personal Computers, Monitors and Printers. 1552.239-103 Section 1552.239... Star Compliant Microcomputers, Including Personal Computers, Monitors and Printers. As prescribed in... Personal Computers, Monitors, and Printers (APR 1996) (a) The Contractor shall provide computer products...
Code of Federal Regulations, 2011 CFR
2011-10-01
... Compliant Microcomputers, Including Personal Computers, Monitors and Printers. 1552.239-103 Section 1552.239... Star Compliant Microcomputers, Including Personal Computers, Monitors and Printers. As prescribed in... Personal Computers, Monitors, and Printers (APR 1996) (a) The Contractor shall provide computer products...
Code of Federal Regulations, 2010 CFR
2010-10-01
... Compliant Microcomputers, Including Personal Computers, Monitors and Printers. 1552.239-103 Section 1552.239... Star Compliant Microcomputers, Including Personal Computers, Monitors and Printers. As prescribed in... Personal Computers, Monitors, and Printers (APR 1996) (a) The Contractor shall provide computer products...
Consequence assessment of large rock slope failures in Norway
NASA Astrophysics Data System (ADS)
Oppikofer, Thierry; Hermanns, Reginald L.; Horton, Pascal; Sandøy, Gro; Roberts, Nicholas J.; Jaboyedoff, Michel; Böhme, Martina; Yugsi Molina, Freddy X.
2014-05-01
Steep glacially carved valleys and fjords in Norway are prone to many landslide types, including large rockslides, rockfalls, and debris flows. Large rockslides and their secondary effects (rockslide-triggered displacement waves, inundation behind landslide dams and outburst floods from failure of landslide dams) pose a significant hazard to the population living in the valleys and along the fjord shorelines. The Geological Survey of Norway performs systematic mapping of unstable rock slopes in Norway and has detected more than 230 unstable slopes with significant postglacial deformation. This large number necessitates prioritisation of follow-up activities, such as more detailed investigations, periodic displacement measurements, continuous monitoring and early-warning systems. Prioritisation is achieved through a hazard and risk classification system, which has been developed by a panel of international and Norwegian experts (www.ngu.no/en-gb/hm/Publications/Reports/2012/2012-029). The risk classification system combines a qualitative hazard assessment with a consequences assessment focusing on potential life losses. The hazard assessment is based on a series of nine geomorphological, engineering geological and structural criteria, as well as displacement rates, past events and other signs of activity. We present a method for consequence assessment comprising four main steps: 1. computation of the volume of the unstable rock slope; 2. run-out assessment based on the volume-dependent angle of reach (Fahrböschung) or detailed numerical run-out modelling; 3. assessment of possible displacement wave propagation and run-up based on empirical relations or modelling in 2D or 3D; and 4. estimation of the number of persons exposed to rock avalanches or displacement waves. 
Volume computation of an unstable rock slope is based on the sloping local base level technique, which uses a digital elevation model to create a second-order curved surface between the mapped extent of the unstable rock slope. This surface represents the possible basal sliding surface of an unstable rock slope. The elevation difference between this surface and the topographic surface estimates the volume of the unstable rock slope. A tool has been developed for the present study to adapt the curvature parameters of the computed surface to local geological and structural conditions. The obtained volume is then used to define the angle of reach of a possible rock avalanche from the unstable rock slope by using empirically derived angle of reach vs. volume relations. Run-out area is calculated using FlowR; the software is widely used for run-out assessment of debris flows and is adapted here for assessment of rock avalanches, including their potential to ascend opposing slopes. Under certain conditions, more sophisticated and complex numerical run-out models are also used. For rock avalanches with potential to reach a fjord or a lake, the propagation and run-up area of triggered displacement waves is assessed. Empirical relations of wave run-up height as a function of rock avalanche volume and distance from impact location are derived from a national and international inventory of landslide-triggered displacement waves. These empirical relations are used in first-level hazard assessment and, where necessary, followed by 2D or 3D displacement wave modelling. Finally, the population exposed in the rock avalanche run-out area and in the run-up area of a possible displacement wave is assessed taking into account different population groups: inhabitants, persons in critical infrastructure (hospitals and other emergency services), persons in schools and kindergartens, persons at work or in shops, tourists, persons on ferries and so on. 
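The volume step above reduces to summing, over the DEM cells inside the mapped unstable area, the elevation difference between the topographic surface and the inferred basal sliding surface. A minimal sketch of that summation in Python, assuming both surfaces are already gridded on the same cells (the actual tool additionally fits and adapts the curved basal surface, which is not reproduced here):

```python
import numpy as np

def unstable_volume(dem, basal, cell_area):
    """Volume between the topographic surface (dem) and the inferred
    basal sliding surface (basal), both given as elevation grids on the
    same cells; cell_area is the horizontal area of one DEM cell."""
    # Column thickness per cell; clip guards against cells where the
    # fitted basal surface locally rises above the topography.
    thickness = np.clip(dem - basal, 0.0, None)
    return float(thickness.sum() * cell_area)
```

For example, a 2x2 grid with thicknesses of 2, 3, 2 and 3 m on 25 m² cells gives 10 m x 25 m² = 250 m³.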
Exposure levels are defined for each population group and vulnerability values are set for the rock avalanche run-out area (100%) and the run-up area of a possible displacement wave (70%). Finally, the total number of persons within the hazard area is calculated taking into account exposure and vulnerability. The method for consequence assessment is currently tested through several case studies in Norway and, thereafter, applied to all unstable rock slopes in the country to assess their risk level. Follow-up activities (detailed investigations, periodic displacement measurements or continuous monitoring and early-warning systems) can then be prioritized based on the risk level and with a standard approach for the whole of Norway.
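The final consequence step described above amounts to a weighted sum over population groups: persons at risk = count x exposure x zone vulnerability, with vulnerability 1.0 in the run-out area and 0.7 in the wave run-up area. A minimal sketch (function name and example numbers are hypothetical, not from the study):

```python
# Zone vulnerabilities as stated in the abstract: 100% in the rock
# avalanche run-out area, 70% in the displacement wave run-up area.
VULNERABILITY = {"runout": 1.0, "wave_runup": 0.7}

def persons_at_risk(groups):
    """groups: list of (count, exposure_fraction, zone) per population
    group, e.g. inhabitants, tourists, persons on ferries."""
    return sum(count * exposure * VULNERABILITY[zone]
               for count, exposure, zone in groups)

# Hypothetical example: 40 inhabitants always present in the run-out
# area, 120 ferry passengers present 10% of the time in the wave zone.
example = [(40, 1.0, "runout"), (120, 0.10, "wave_runup")]
print(persons_at_risk(example))  # 40*1.0*1.0 + 120*0.1*0.7 = 48.4
```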
A Cloud Computing Approach to Personal Risk Management: The Open Hazards Group
NASA Astrophysics Data System (ADS)
Graves, W. R.; Holliday, J. R.; Rundle, J. B.
2010-12-01
According to the California Earthquake Authority, only about 12% of current California residences are covered by any form of earthquake insurance, down from about 30% in 1996 following the 1994 M6.7 Northridge earthquake. Part of the reason for this decreasing rate of insurance uptake is the high deductible, either 10% or 15% of the value of the structure, and the relatively high cost of the premiums, as much as thousands of dollars per year. The earthquake insurance industry is composed of the CEA, a public-private partnership; modeling companies that produce damage and loss models similar to the FEMA HAZUS model; and financial companies such as the insurance, reinsurance, and investment banking companies in New York, London, the Cayman Islands, Zurich, Dubai, Singapore, and elsewhere. In setting earthquake insurance rates, financial companies rely on models like HAZUS that calculate risk and exposure. In California, the process begins with an official earthquake forecast by the Working Group on California Earthquake Probabilities. Modeling companies use these 30-year earthquake probabilities as inputs to their attenuation and damage models to estimate the possible damage factors from scenario earthquakes. Economic loss is then estimated from processes such as structural failure, lost economic activity, demand surge, and fire following the earthquake. Once the potential losses are known, rates can be set so that a target ruin probability of less than 1% or so can be assured. Open Hazards Group was founded with the idea that the global public might be interested in a personal estimate of earthquake risk, computed using data supplied by the public, with models running in a cloud computing environment. These models process data from the ANSS catalog, updated at least daily, to produce rupture forecasts that are backtested with standard Reliability/Attributes and Receiver Operating Characteristic tests, among others. 
Models for attenuation and structural damage are then used in a computationally efficient workflow to produce real-time estimates of damage and loss for individual structures. All models are based on techniques that either have been published in the literature or will soon be published. Using these results, members of the public can gain an appreciation of their risk of exposure to damage from destructive earthquakes, information that has heretofore only been available to a few members of the financial and insurance industries.
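A Receiver Operating Characteristic backtest of the kind mentioned above reduces to tabulating hit rates against false alarm rates as the forecast threshold is varied. A minimal sketch, with illustrative data only (the actual backtests operate on gridded rupture forecasts against the ANSS catalog):

```python
def roc_points(scores, outcomes, thresholds):
    """(false alarm rate, hit rate) pairs for a probabilistic forecast,
    the basis of a Receiver Operating Characteristic backtest.
    scores: forecast probabilities; outcomes: whether the event occurred."""
    pts = []
    for t in thresholds:
        hits   = sum(1 for s, o in zip(scores, outcomes) if s >= t and o)
        misses = sum(1 for s, o in zip(scores, outcomes) if s < t and o)
        fas    = sum(1 for s, o in zip(scores, outcomes) if s >= t and not o)
        cns    = sum(1 for s, o in zip(scores, outcomes) if s < t and not o)
        pts.append((fas / (fas + cns), hits / (hits + misses)))
    return pts
```

A forecast that separates the observed events cleanly from the non-events yields the ideal point (0.0, 1.0) at an intermediate threshold; the diagonal of the ROC plane corresponds to a forecast with no skill.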
Models-3 is a flexible system designed to simplify the development and use of air quality models and other environmental decision support tools. It is designed for applications ranging from regulatory and policy analysis to understanding the complex interactions of atmospheric...
Petrican, Raluca; Todorov, Alexander; Grady, Cheryl
2016-01-01
Character judgments, based on facial appearance, impact both perceivers’ and targets’ interpersonal decisions and behaviors. Nonetheless, the resilience of such effects in the face of longer acquaintanceship duration is yet to be determined. To address this question, we had 51 elderly long-term married couples complete self and informant versions of a Big Five Inventory. Participants were also photographed, while they were requested to maintain an emotionally neutral expression. A subset of the initial sample completed a shortened version of the Big Five Inventory in response to the pictures of other opposite sex participants (with whom they were unacquainted). Oosterhof and Todorov’s (2008) computer-based model of face evaluation was used to generate facial trait scores on trustworthiness, dominance, and attractiveness, based on participants’ photographs. Results revealed that structural facial characteristics, suggestive of greater trustworthiness, predicted positively biased, global informant evaluations of a target’s personality, among both spouses and strangers. Among spouses, this effect was impervious to marriage length. There was also evidence suggestive of a Dorian Gray effect on personality, since facial trustworthiness predicted not only spousal and stranger, but also self-ratings of extraversion. Unexpectedly, though, follow-up analyses revealed that (low) facial dominance, rather than (high) trustworthiness, was the strongest predictor of self-rated extraversion. Our present findings suggest that subtle emotional cues, embedded in the structure of emotionally neutral faces, exert long-lasting effects on personality judgments even among very well-acquainted targets and perceivers. PMID:27330234
Enabling smart personalized healthcare: a hybrid mobile-cloud approach for ECG telemonitoring.
Wang, Xiaoliang; Gui, Qiong; Liu, Bingwei; Jin, Zhanpeng; Chen, Yu
2014-05-01
The severe challenges of the skyrocketing healthcare expenditure and the fast aging population highlight the needs for innovative solutions supporting more accurate, affordable, flexible, and personalized medical diagnosis and treatment. Recent advances of mobile technologies have made mobile devices a promising tool to manage patients' own health status through services like telemedicine. However, the inherent limitations of mobile devices make them less effective in computation- or data-intensive tasks such as medical monitoring. In this study, we propose a new hybrid mobile-cloud computational solution to enable more effective personalized medical monitoring. To demonstrate the efficacy and efficiency of the proposed approach, we present a case study of mobile-cloud based electrocardiograph monitoring and analysis and develop a mobile-cloud prototype. The experimental results show that the proposed approach can significantly enhance the conventional mobile-based medical monitoring in terms of diagnostic accuracy, execution efficiency, and energy efficiency, and holds the potential in addressing future large-scale data analysis in personalized healthcare.
GPU-based RFA simulation for minimally invasive cancer treatment of liver tumours.
Mariappan, Panchatcharam; Weir, Phil; Flanagan, Ronan; Voglreiter, Philip; Alhonnoro, Tuomas; Pollari, Mika; Moche, Michael; Busse, Harald; Futterer, Jurgen; Portugaller, Horst Rupert; Sequeiros, Roberto Blanco; Kolesnik, Marina
2017-01-01
Radiofrequency ablation (RFA) is one of the most popular and well-standardized minimally invasive cancer treatments (MICT) for liver tumours, employed where surgical resection has been contraindicated. Less-experienced interventional radiologists (IRs) require an appropriate planning tool for the treatment to help avoid incomplete treatment and so reduce the tumour recurrence risk. Although a few tools are available to predict the ablation lesion geometry, the process is computationally expensive. Also, in our implementation, a few patient-specific parameters are used to improve the accuracy of the lesion prediction. Advanced heterogeneous computing using personal computers, incorporating the graphics processing unit (GPU) and the central processing unit (CPU), is proposed to predict the ablation lesion geometry. The most recent GPU technology is used to accelerate the finite element approximation of Pennes' bioheat equation and a three-state cell model. Patient-specific input parameters are used in the bioheat model to improve accuracy of the predicted lesion. A fast GPU-based RFA solver is developed to predict the lesion by doing most of the computational tasks in the GPU, while reserving the CPU for concurrent tasks such as lesion extraction based on the heat deposition at each finite element node. The solver takes less than 3 min for a treatment duration of 26 min. When the model receives patient-specific input parameters, the deviation between real and predicted lesion is below 3 mm. A multi-centre retrospective study indicates that the fast RFA solver is capable of providing the IR with the predicted lesion in the short time period before the intervention begins when the patient has been clinically prepared for the treatment.
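Pennes' bioheat equation underlying the solver balances heat conduction, blood perfusion toward arterial temperature, and a heat source. A toy 1D explicit finite-difference step illustrates the balance of terms (the actual solver is a GPU finite element code; all tissue parameters below are placeholder values, not patient-specific):

```python
import numpy as np

def pennes_step(T, dt, dx, k=0.5, rho_c=3.6e6, w_b=0.004,
                rho_b_c_b=4.0e6, T_a=37.0, Q=0.0):
    """One explicit time step of a 1D Pennes bioheat model:
    rho*c * dT/dt = k * d2T/dx2 + w_b*rho_b*c_b*(T_a - T) + Q.
    T is the temperature profile (deg C); parameters are placeholders."""
    # Central-difference Laplacian of the temperature profile.
    lap = (np.roll(T, 1) - 2 * T + np.roll(T, -1)) / dx**2
    lap[0] = lap[-1] = 0.0  # crude insulated boundaries for the sketch
    dT = (k * lap + w_b * rho_b_c_b * (T_a - T) + Q) / rho_c
    return T + dt * dT
```

With Q > 0 modeling the RF heat deposition near the electrode, repeated steps raise local tissue temperature while perfusion pulls it back toward 37 °C, which is the competition the lesion prediction resolves.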
Quality of Care as an Emergent Phenomenon out of a Small-World Network of Relational Actors.
Fiorini, Rodolfo; De Giacomo, Piero; Marconi, Pier Luigi; L'Abate, Luciano
2014-01-01
In Healthcare Decision Support Systems, the development and evaluation of effective "Quality of Care" (QOC) indicators in simulation-based training are a key feature in developing resilient and antifragile organization scenarios. Is it possible to conceive of QOC not only as the result of a voluntary and rational decision, imposed or not, but also as an overall-system "emergent phenomenon" arising out of a small-world network of relational synthetic actors, endowed with their own personality profiles to simulate human behaviour (called "subjects" for short)? To answer this question and to observe phenomena of real emergence, we would normally need computational models of high complexity, with heavy computational loads and long computation times. Nevertheless, the intrinsic self-reflexive functional logical closure of De Giacomo's Elementary Pragmatic Model (EPM) makes it possible to run simulation examples quickly and effectively and to classify the outcomes that grow out of a small-world network of relational subjects. It thus becomes possible to observe and learn how much strategic systemic interventions can induce context conditions that facilitate QOC, improving the effectiveness of specific actions which might otherwise be paradoxically counterproductive. Early results are encouraging enough to use EPM as a basic building block in designing a more powerful Evolutive Elementary Pragmatic Model (E2PM) for a real-emergence computational model that copes with ontological uncertainty at the system level.
Hansen, D J; Toy, V M; Deininger, R A; Collopy, T K
1983-06-01
Three of the most popular microcomputers, the TRS-80 Model I, the APPLE II+, and the IBM Personal Computer, were connected to a spirometer for data acquisition and analysis. Simple programs were written which allow the collection, analysis and storage of the data produced during spirometry. Three examples demonstrate the relative ease of automating spirometers.
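A spirometry analysis program of the kind described typically derives the forced vital capacity (FVC) and the one-second forced expiratory volume (FEV1) from sampled volume-time data. A minimal sketch under that assumption (function name and sample data are illustrative, not from the paper):

```python
def fev1_fvc(times, volumes):
    """FVC = total exhaled volume; FEV1 = volume exhaled in the first
    second, linearly interpolated between samples. times in seconds
    (ascending, starting at exhalation onset), volumes in litres."""
    fvc = volumes[-1]
    for (t0, v0), (t1, v1) in zip(zip(times, volumes),
                                  zip(times[1:], volumes[1:])):
        if t0 <= 1.0 <= t1:
            # Linear interpolation of volume at t = 1 s.
            fev1 = v0 + (v1 - v0) * (1.0 - t0) / (t1 - t0)
            break
    else:
        fev1 = fvc  # exhalation completed within the first second
    return fev1, fvc
```

For example, samples at t = 0, 0.5, 1.5 and 3.0 s with volumes 0, 2.0, 3.0 and 4.0 L give FEV1 = 2.5 L and FVC = 4.0 L.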
Advanced Technology for Portable Personal Visualization
1991-03-01
walking. The building model is created using AutoCAD. Realism is enhanced by calculating a radiosity solution for the lighting model. This has an added...lighting, color combinations and decor. Due to the computationally intensive nature of the radiosity solution, modeling changes cannot be made on-line
[Measurement of intracranial hematoma volume by personal computer].
DU, Wanping; Tan, Lihua; Zhai, Ning; Zhou, Shunke; Wang, Rui; Xue, Gongshi; Xiao, An
2011-01-01
To explore a method for intracranial hematoma volume measurement on a personal computer, forty cases of various intracranial hematomas were measured by computed tomography with quantitative software and by a personal computer with Photoshop CS3 software, respectively. The data from the two methods were analyzed and compared. There was no difference between the data from computed tomography and from the personal computer (P>0.05). A personal computer with Photoshop CS3 software can measure the volume of various intracranial hematomas precisely, rapidly and simply, and can be recommended for clinical medicolegal identification.
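The image-editor approach presumably amounts to counting hematoma pixels on each CT slice and multiplying by the pixel area and slice thickness (the classic slice-summation idea). A hedged sketch of that computation; all numbers and the function name are hypothetical:

```python
def hematoma_volume(pixel_counts, pixel_area_mm2, slice_thickness_mm):
    """Volume in mL from per-slice hematoma pixel counts.
    pixel_counts: selected hematoma pixels on each CT slice."""
    area_mm2 = [n * pixel_area_mm2 for n in pixel_counts]  # area per slice
    volume_mm3 = sum(area_mm2) * slice_thickness_mm        # slice summation
    return volume_mm3 / 1000.0                             # mm^3 -> mL

# Hypothetical example: three slices of 5000, 6000 and 4000 selected
# pixels at 0.25 mm^2 per pixel, with 5 mm slice spacing.
print(hematoma_volume([5000, 6000, 4000], 0.25, 5.0))  # 18.75 (mL)
```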
Computer and internet use by persons after traumatic spinal cord injury.
Goodman, Naomi; Jette, Alan M; Houlihan, Bethlyn; Williams, Steve
2008-08-01
To determine whether computer and internet use by persons post spinal cord injury (SCI) is sufficiently prevalent and broad-based to consider using this technology as a long-term treatment modality for patients who have sustained SCI. A multicenter cohort study. Twenty-six past and current U.S. regional Model Spinal Cord Injury Systems. Patients with traumatic SCI (N=2926) with follow-up interviews between 2004 and 2006, conducted at 1 or 5 years postinjury. Not applicable. Results revealed that 69.2% of participants with SCI used a computer; 94.2% of computer users accessed the internet. Among computer users, 19.1% used assistive devices for computer access. Of the internet users, 68.6% went online 5 to 7 days a week. The most frequent use for internet was e-mail (90.5%) and shopping sites (65.8%), followed by health sites (61.1%). We found no statistically significant difference in computer use by sex or level of neurologic injury, and no difference in internet use by level of neurologic injury. Computer and internet access differed significantly by age, with use decreasing as age group increased. The highest computer and internet access rates were seen among participants injured before the age of 18. Computer and internet use varied by race: 76% of white compared with 46% of black subjects were computer users (P<.001), and 95.3% of white respondents who used computers used the internet, compared with 87.6% of black respondents (P<.001). Internet use increased with education level (P<.001): eighty-six percent of participants who did not graduate from high school or receive a degree used the internet, while over 97% of those with a college or associate's degree did. While the internet holds considerable potential as a long-term treatment modality after SCI, limited access to the internet by those who are black, those injured after age 18, and those with less education does reduce its usefulness in the short term for these subgroups.
Robb, P; Pawlowski, B
1990-05-01
The results of measuring the ray trace speed and compilation speed of thirty-nine computers in fifty-seven configurations, ranging from personal computers to supercomputers, are described. A correlation of ray trace speed has been made with the LINPACK benchmark which allows the ray trace speed to be estimated using LINPACK performance data. The results indicate that the latest generation of workstations, using CPUs based on RISC (Reduced Instruction Set Computer) technology, are as fast or faster than mainframe computers in compute-bound situations.
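Estimating ray trace speed from LINPACK performance amounts to fitting a simple regression to the measured machines and evaluating it for a new one. A sketch with hypothetical numbers (the paper's actual correlation is not reproduced here):

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for y = m*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical benchmark pairs: LINPACK MFLOPS vs. measured ray trace
# speed (surfaces per second) for four machines.
linpack = [1.0, 5.0, 10.0, 40.0]
ray_speed = [2.0, 10.0, 20.0, 80.0]
m, b = fit_line(linpack, ray_speed)
estimate = m * 25.0 + b  # predicted ray trace speed for a 25-MFLOPS machine
```

In practice the correlation would be fitted on log-transformed speeds if the relationship is multiplicative rather than linear; the linear fit is just the simplest instance of the idea.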
Software Accelerates Computing Time for Complex Math
NASA Technical Reports Server (NTRS)
2014-01-01
Ames Research Center awarded Newark, Delaware-based EM Photonics Inc. SBIR funding to utilize graphics processing unit (GPU) technology, traditionally used for computer video games, to develop high-performance computing software called CULA. The software gives users the ability to run complex algorithms on personal computers with greater speed. As a result of the NASA collaboration, the number of employees at the company has increased 10 percent.
Wang, Chunfei; Zhang, Guang; Wu, Taihu; Zhan, Ningbo; Wang, Yaling
2016-03-01
High-quality cardiopulmonary resuscitation contributes to cardiac arrest survival. The traditional chest compression (CC) standard, which neglects individual differences, uses unified standards for compression depth and compression rate in practice. In this study, an effective and personalized CC method for automatic mechanical compression devices is provided. We rebuild Charles F. Babbs' human circulation model with a coronary perfusion pressure (CPP) simulation module and propose a closed-loop controller based on a fuzzy control algorithm for CCs, which adjusts the CC depth according to the CPP. The performance of the fuzzy controller is evaluated against a traditional proportional-integral-derivative (PID) controller in computer simulation studies. The simulation results demonstrate that the fuzzy closed-loop controller achieves a shorter regulation time, fewer oscillations, and smaller overshoot than the traditional PID controller for CPP regulation and maintenance.
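As a rough illustration of the closed-loop idea (not the paper's actual controller or circulation model), a fuzzy step controller can map the CPP error to a depth adjustment through triangular membership functions. All membership widths and actuator steps below are hypothetical:

```python
# Sketch of a fuzzy closed-loop step controller that adjusts chest
# compression (CC) depth from the coronary perfusion pressure (CPP) error.
# Membership widths, rule outputs, and the starting depth are hypothetical.

def memberships(error_mmhg):
    """Triangular memberships for the CPP error: negative, zero, positive."""
    neg = max(0.0, min(1.0, -error_mmhg / 10.0))
    pos = max(0.0, min(1.0, error_mmhg / 10.0))
    zero = max(0.0, 1.0 - abs(error_mmhg) / 10.0)
    return neg, zero, pos

def fuzzy_depth_step(target_cpp, measured_cpp):
    """Weighted-average defuzzification: depth change (cm) per control step."""
    neg, zero, pos = memberships(target_cpp - measured_cpp)
    # Rules: CPP too low (positive error) -> deepen; too high -> shallower.
    actions = (-0.2, 0.0, +0.2)  # cm, hypothetical actuator steps
    total = neg + zero + pos
    return (neg * actions[0] + zero * actions[1] + pos * actions[2]) / total

depth = 5.0  # cm, starting compression depth
depth += fuzzy_depth_step(target_cpp=25.0, measured_cpp=15.0)  # deepens
```

A measured CPP well below target drives the depth up by the full step; near the target, the zero membership dominates and the depth holds steady.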
Applications of personal computers in geophysics
NASA Astrophysics Data System (ADS)
Lee, W. H. K.; Lahr, J. C.; Habermann, R. E.
Since 1981, the use of personal computers (PCs) to increase productivity has become widespread. At present, more than 5 million personal computers are in operation for business, education, engineering, and scientific purposes. Activities within AGU reflect this trend: KOSMOS, the AGU electronic network, was introduced this year, and the AGU Committee on Personal Computers, chaired by W. H. K. Lee (U.S. Geological Survey, Menlo Park, Calif.), was recently formed. In addition, in conjunction with the 1986 AGU Fall Meeting, this committee is organizing a personal computer session and hands-on demonstrations to promote applications of personal computers in geophysics.
A Dedicated Computational Platform for Cellular Monte Carlo T-CAD Software Tools
2015-07-14
computer that establishes an encrypted Virtual Private Network (OpenVPN [44]) based on the Secure Socket Layer (SSL) paradigm. Each user is given a...security certificate for each device used to connect to the computing nodes. Stable OpenVPN clients are available for Linux, Microsoft Windows, Apple OSX...platform is granted by an encrypted connection based on the Secure Socket Layer (SSL) protocol, and implemented in the OpenVPN Virtual Personal Network
In person versus Computer Screening for Intimate Partner Violence Among Pregnant Patients
Dado, Diane; Schussler, Sara; Hawker, Lynn; Holland, Cynthia L.; Burke, Jessica G.; Cluss, Patricia A.
2012-01-01
Objective To compare in person versus computerized screening for intimate partner violence (IPV) in a hospital-based prenatal clinic and explore women’s assessment of the screening methods. Methods We compared patient IPV disclosures on a computerized questionnaire to audio-taped first obstetric visits with an obstetric care provider and performed semi-structured interviews with patient participants who reported experiencing IPV. Results Two-hundred and fifty patient participants and 52 provider participants were in the study. Ninety-one (36%) patients disclosed IPV either via computer or in person. Of those who disclosed IPV, 60 (66%) disclosed via both methods, but 31 (34%) disclosed IPV via only one of the two methods. Twenty-three women returned for interviews. They recommended using both types together. While computerized screening was felt to be non-judgmental and more anonymous, in person screening allowed for tailored questioning and more emotional connection with the provider. Conclusion Computerized screening allowed disclosure without fear of immediate judgment. In person screening allows more flexibility in wording of questions regarding IPV and opportunity for interpersonal rapport. Practice Implications Both computerized or self-completed screening and in person screening are recommended. Providers should address IPV using non-judgmental, descriptive language, include assessments for psychological IPV, and repeat screening in person, even if no patient disclosure occurs via computer. PMID:22770815
In person versus computer screening for intimate partner violence among pregnant patients.
Chang, Judy C; Dado, Diane; Schussler, Sara; Hawker, Lynn; Holland, Cynthia L; Burke, Jessica G; Cluss, Patricia A
2012-09-01
To compare in person versus computerized screening for intimate partner violence (IPV) in a hospital-based prenatal clinic and explore women's assessment of the screening methods. We compared patient IPV disclosures on a computerized questionnaire to audio-taped first obstetric visits with an obstetric care provider and performed semi-structured interviews with patient participants who reported experiencing IPV. Two-hundred and fifty patient participants and 52 provider participants were in the study. Ninety-one (36%) patients disclosed IPV either via computer or in person. Of those who disclosed IPV, 60 (66%) disclosed via both methods, but 31 (34%) disclosed IPV via only one of the two methods. Twenty-three women returned for interviews. They recommended using both types together. While computerized screening was felt to be non-judgmental and more anonymous, in person screening allowed for tailored questioning and more emotional connection with the provider. Computerized screening allowed disclosure without fear of immediate judgment. In person screening allows more flexibility in wording of questions regarding IPV and opportunity for interpersonal rapport. Both computerized or self-completed screening and in person screening are recommended. Providers should address IPV using non-judgmental, descriptive language, include assessments for psychological IPV, and repeat screening in person, even if no patient disclosure occurs via computer. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Designing Interaction for Next Generation Personal Computing
NASA Astrophysics Data System (ADS)
de Michelis, Giorgio; Loregian, Marco; Moderini, Claudio; Marti, Patrizia; Colombo, Cesare; Bannon, Liam; Storni, Cristiano; Susani, Marco
Over two decades of research in the field of Interaction Design and Computer Supported Cooperative Work has convinced us that the current design of workstations no longer fits users’ needs. It is time to design new personal computers based on metaphors alternative to the desktop. With this SIG, we are seeking to involve international HCI professionals in the challenges of designing products that are radically new and that tackle the many different issues facing modern knowledge workers. We would like to engage a wider cross-section of the community: our focus will be on issues of development and participation and the impact of different values in our work.
Pre- and post-processing for Cosmic/NASTRAN on personal computers and mainframes
NASA Technical Reports Server (NTRS)
Kamel, H. A.; Mobley, A. V.; Nagaraj, B.; Watkins, K. W.
1986-01-01
An interface between Cosmic/NASTRAN and GIFTS has recently been released, combining the powerful pre- and post-processing capabilities of GIFTS with Cosmic/NASTRAN's analysis capabilities. The interface operates on a wide range of computers, even linking Cosmic/NASTRAN and GIFTS when the two are on different computers. GIFTS offers a wide range of elements for use in model construction, each translated by the interface into the nearest Cosmic/NASTRAN equivalent; and the options of automatic or interactive modelling and loading in GIFTS make pre-processing easy and effective. The interface itself includes the programs GFTCOS, which creates the Cosmic/NASTRAN input deck (and, if desired, control deck) from the GIFTS Unified Data Base; COSGFT, which translates the displacements from the Cosmic/NASTRAN analysis back into GIFTS; and HOSTR, which handles stress computations for a few higher-order elements available in the interface, but not supported by the GIFTS processor STRESS. Finally, the versatile display options in GIFTS post-processing allow the user to examine the analysis results through an especially wide range of capabilities, including such possibilities as creating composite loading cases, plotting in color, and animating the analysis.
EasyModeller: A graphical interface to MODELLER
2010-01-01
Background MODELLER is a program for automated protein homology modeling. It is one of the most widely used tools for homology or comparative modeling of protein three-dimensional structures, but most users find it difficult to start with MODELLER as it is command-line based and requires knowledge of basic Python scripting to use efficiently. Findings The study was designed with the aim of developing "EasyModeller", a frontend graphical interface to MODELLER built using Perl/Tk, which can be used as a standalone tool on the Windows platform with MODELLER and Python preinstalled. It helps inexperienced users to perform modeling, assessment, visualization, and optimization of protein models in a simple and straightforward way. Conclusion EasyModeller provides a straightforward graphical interface and functions as a stand-alone tool which can be used on a standard personal computer with Microsoft Windows as the operating system. PMID:20712861
Blow, Nikolaus; Biswas, Pradipta
2017-01-01
As computers become more and more essential for everyday life, people who cannot use them are missing out on an important tool. The predominant method of interaction with a screen is a mouse, and difficulty in using a mouse can be a huge obstacle for people who would otherwise gain great value from using a computer. If mouse pointing were to be made easier, then a large number of users may be able to begin using a computer efficiently where they may previously have been unable to. The present article aimed to improve pointing speeds for people with arm or hand impairments. The authors investigated different smoothing and prediction models on a stored data set involving 25 people, and the best of these algorithms were chosen. A web-based prototype was developed combining a polynomial smoothing algorithm with a time-weighted gradient target prediction model. The adapted interface gave an average improvement of 13.5% in target selection times in a 10-person study of representative users of the system. A demonstration video of the system is available at https://youtu.be/sAzbrKHivEY.
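The two ingredients named above, smoothing the noisy pointer trajectory and predicting the intended target, can be sketched roughly as follows. Exponential smoothing and linear gradient extrapolation stand in for the paper's polynomial smoothing and time-weighted gradient models, which are not reproduced here:

```python
# Illustrative stand-ins for pointer smoothing and target prediction.
# The paper's actual polynomial and time-weighted gradient algorithms
# are not reproduced; these are simplified analogues.

def smooth(points, alpha=0.5):
    """Exponentially smooth a sequence of (x, y) pointer samples."""
    sx, sy = points[0]
    out = [(sx, sy)]
    for x, y in points[1:]:
        sx = alpha * x + (1 - alpha) * sx
        sy = alpha * y + (1 - alpha) * sy
        out.append((sx, sy))
    return out

def predict(points, steps=5):
    """Extrapolate along the most recent gradient (velocity) of motion."""
    (x0, y0), (x1, y1) = points[-2], points[-1]
    dx, dy = x1 - x0, y1 - y0
    return (x1 + steps * dx, y1 + steps * dy)
```

Smoothing damps hand tremor in the raw samples; extrapolating the gradient lets the interface expand or snap to the target the pointer is heading toward before it arrives.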
NASA Astrophysics Data System (ADS)
Langer-Osuna, Jennifer
2015-03-01
This paper draws on the constructs of hybridity, figured worlds, and cultural capital to examine how a group of African-American students in a technology-driven, project-based algebra classroom utilized the computer as a resource to coordinate personal and mathematical positional identities during group work. Analyses of several vignettes of small group dynamics highlight how hybridity was established as the students engaged in multiple on-task and off-task computer-based activities, each of which drew on different lived experiences and forms of cultural capital. The paper ends with a discussion on how classrooms that make use of student-led collaborative work, and where students are afforded autonomy, have the potential to support the academic engagement of students from historically marginalized communities.
Application of Computer Simulation to Teach ATM Access to Individuals with Intellectual Disabilities
ERIC Educational Resources Information Center
Davies, Daniel K.; Stock, Steven E.; Wehmeyer, Michael L.
2003-01-01
This study investigates use of computer simulation for teaching ATM use to adults with intellectual disabilities. ATM-SIM is a computer-based trainer used for teaching individuals with intellectual disabilities how to use an automated teller machine (ATM) to access their personal bank accounts. In the pilot evaluation, a prototype system was…
Argonne simulation framework for intelligent transportation systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ewing, T.; Doss, E.; Hanebutte, U.
1996-04-01
A simulation framework has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS). The simulator is designed to run on parallel computers and distributed (networked) computer systems; however, a version for a stand-alone workstation is also available. The ITS simulator includes an Expert Driver Model (EDM) of instrumented "smart" vehicles with in-vehicle navigation units. The EDM is capable of performing optimal route planning and communicating with Traffic Management Centers (TMC). A dynamic road map database is used for optimum route planning, where the data is updated periodically to reflect any changes in road or weather conditions. The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide 2-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces that include human-factors studies to support safety and operational research. Realistic modeling of variations of the posted driving speed is based on human factor studies that take into consideration weather, road conditions, driver's personality and behavior, and vehicle type. The simulator has been developed on a distributed system of networked UNIX computers, but is designed to run on ANL's IBM SP-X parallel computer system for large-scale problems. A novel feature of the developed simulator is that vehicles will be represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. Vehicle processes interact with each other and with ITS components by exchanging messages. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.
Launch Site Computer Simulation and its Application to Processes
NASA Technical Reports Server (NTRS)
Sham, Michael D.
1995-01-01
This paper provides an overview of computer simulation, the Lockheed developed STS Processing Model, and the application of computer simulation to a wide range of processes. The STS Processing Model is an icon driven model that uses commercial off the shelf software and a Macintosh personal computer. While it usually takes one year to process and launch 8 space shuttles, with the STS Processing Model this process is computer simulated in about 5 minutes. Facilities, orbiters, or ground support equipment can be added or deleted and the impact on launch rate, facility utilization, or other factors measured as desired. This same computer simulation technology can be used to simulate manufacturing, engineering, commercial, or business processes. The technology does not require an 'army' of software engineers to develop and operate, but instead can be used by the layman with only a minimal amount of training. Instead of making changes to a process and realizing the results after the fact, with computer simulation, changes can be made and processes perfected before they are implemented.
An Ecological Framework for Cancer Communication: Implications for Research
Intille, Stephen S; Zabinski, Marion F
2005-01-01
The field of cancer communication has undergone a major revolution as a result of the Internet. As recently as the early 1990s, face-to-face, print, and the telephone were the dominant methods of communication between health professionals and individuals in support of the prevention and treatment of cancer. Computer-supported interactive media existed, but this usually required sophisticated computer and video platforms that limited availability. The introduction of point-and-click interfaces for the Internet dramatically improved the ability of non-expert computer users to obtain and publish information electronically on the Web. Demand for Web access has driven computer sales for the home setting and improved the availability, capability, and affordability of desktop computers. New advances in information and computing technologies will lead to similarly dramatic changes in the affordability and accessibility of computers. Computers will move from the desktop into the environment and onto the body. Computers are becoming smaller, faster, more sophisticated, more responsive, less expensive, and—essentially—ubiquitous. Computers are evolving into much more than desktop communication devices. New computers include sensing, monitoring, geospatial tracking, just-in-time knowledge presentation, and a host of other information processes. The challenge for cancer communication researchers is to acknowledge the expanded capability of the Web and to move beyond the approaches to health promotion, behavior change, and communication that emerged during an era when language- and image-based interpersonal and mass communication strategies predominated. Ecological theory has been advanced since the early 1900s to explain the highly complex relationships among individuals, society, organizations, the built and natural environments, and personal and population health and well-being. 
This paper provides background on ecological theory, advances an Ecological Model of Internet-Based Cancer Communication intended to broaden the vision of potential uses of the Internet for cancer communication, and provides some examples of how such a model might inform future research and development in cancer communication. PMID:15998614
A Practical Model for Forecasting New Freshman Enrollment during the Application Period.
ERIC Educational Resources Information Center
Paulsen, Michael B.
1989-01-01
A simple and effective model for forecasting freshman enrollment during the application period is presented step by step. The model requires minimal and readily available information, uses a simple linear regression analysis on a personal computer, and provides updated monthly forecasts. (MSE)
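The forecasting approach described, a simple linear regression relating applications received to eventual enrollment, can be sketched in a few lines; the data values below are hypothetical:

```python
# Minimal sketch of an enrollment forecast via ordinary least squares,
# the kind of regression the model runs on a personal computer.
# All application and enrollment figures are hypothetical.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Applications received by March 1 vs. final freshman enrollment, by year.
apps = [1200, 1350, 1280, 1500, 1420]
enrolled = [610, 690, 650, 760, 720]
a, b = fit_line(apps, enrolled)
forecast = a + b * 1460  # updated forecast for a year with 1460 applications
```

Refitting each month on applications-to-date gives the updated monthly forecasts the abstract mentions.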
Affect, Risk and Uncertainty in Decision-Marking an Integrated Computational-Empirical Approach
2009-07-26
developed by Hudlicka (2002; 2003). MAMID was designed with the explicit purpose to model the effects of affective states and personality traits on...influenced by risk and uncertainty? • How do personality traits and affective states facilitate or prevent the expression of particular types of
Detection of abnormal item based on time intervals for recommender systems.
Gao, Min; Yuan, Quan; Ling, Bin; Xiong, Qingyu
2014-01-01
With the rapid development of e-business, personalized recommendation has become a core competence for enterprises to gain profits and improve customer satisfaction. Although collaborative filtering is the most successful approach for building a recommender system, it suffers from "shilling" attacks. In recent years, research on shilling attacks has advanced greatly. However, existing approaches suffer from serious problems: attack-model dependency and high computational cost. To solve these problems, an approach for the detection of abnormal items is proposed in this paper. First, two common features of all attack models are analyzed. A revised bottom-up discretized approach based on time intervals and these features is then proposed for the detection. The distributions of ratings in different time intervals are compared to detect anomalies based on the calculation of the chi-square statistic (χ²). We evaluated our approach on four types of items, which are defined according to the life cycles of these items. The experimental results show that the proposed approach achieves a high detection rate with low computational cost when the number of attack profiles is more than 15. It improves the efficiency of shilling attack detection by narrowing down the suspicious users.
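The core of the detection step, comparing a time interval's rating distribution against the item's overall distribution with a chi-square statistic, can be sketched as follows. The interval construction and threshold here are simplified illustrations, not the paper's bottom-up discretization:

```python
# Sketch of chi-square anomaly detection over rating histograms.
# The fixed threshold and hand-built interval are illustrative only.

def chi_square(observed, expected):
    """Chi-square statistic between observed and expected count vectors."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected) if e > 0)

def is_abnormal(window_counts, overall_counts, threshold=9.49):
    """Flag a time interval whose rating histogram deviates from the norm.

    9.49 is the 0.05 critical value for chi-square with 4 degrees of
    freedom (a 5-point rating scale); used here purely for illustration.
    """
    total_w = sum(window_counts)
    total_o = sum(overall_counts)
    expected = [total_w * c / total_o for c in overall_counts]
    return chi_square(window_counts, expected) > threshold

# A burst of uniform 5-star ratings stands out against a balanced history.
normal_history = [20, 40, 80, 40, 20]   # counts of ratings 1..5
attack_window = [0, 0, 0, 5, 45]
flag = is_abnormal(attack_window, normal_history)
```

An interval that merely mirrors the item's long-run distribution yields a near-zero statistic, while a push-attack burst of top ratings is flagged immediately.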
Categorization-based stranger avoidance does not explain the uncanny valley effect.
MacDorman, Karl F; Chattopadhyay, Debaleena
2017-04-01
The uncanny valley hypothesis predicts that an entity appearing almost human risks eliciting cold, eerie feelings in viewers. Categorization-based stranger avoidance theory identifies the cause of this feeling as categorizing the entity into a novel category. This explanation is doubtful because stranger is not a novel category in adults; infants do not avoid strangers while the category stranger remains novel; infants old enough to fear strangers prefer photographs of strangers to those more closely resembling a familiar person; and the uncanny valley's characteristic eeriness is seldom felt when meeting strangers. We repeated our original experiment with a more realistic 3D computer model and found no support for categorization-based stranger avoidance theory. By contrast, realism inconsistency theory explains cold, eerie feelings elicited by transitions between instances of two different, mutually exclusive categories, given that at least one category is anthropomorphic: Cold, eerie feelings are caused by prediction error from perceiving some features as features of the first category and other features as features of the second category. In principle, realism inconsistency theory can explain not only negative evaluations of transitions between real and computer modeled humans but also between different vertebrate species. Copyright © 2017 Elsevier B.V. All rights reserved.
Predicting personality traits related to consumer behavior using SNS analysis
NASA Astrophysics Data System (ADS)
Baik, Jongbum; Lee, Kangbok; Lee, Soowon; Kim, Yongbum; Choi, Jayoung
2016-07-01
Modeling a user profile is one of the important factors in devising a personalized recommendation. The traditional approach for modeling a user profile in computer science is to collect and generalize the user's buying behavior or preference history, generated from the user's interactions with recommender systems. According to consumer behavior research, however, internal factors such as personality traits influence a consumer's buying behavior. Existing studies have tried to adapt the Big 5 personality traits to personalized recommendations. However, although studies have shown that these traits can be useful to some extent for personalized recommendation, the causal relationship between the Big 5 personality traits and the buying behaviors of actual consumers has not been validated. In this paper, we propose a novel method for predicting the four personality traits (Extroversion, Public Self-consciousness, Desire for Uniqueness, and Self-esteem) that correlate with buying behaviors. The proposed method automatically constructs a user-personality-traits prediction model for each user by analyzing the user's behavior on a social networking service. The experimental results from an analysis of the collected Facebook data show that the proposed method can predict user-personality traits with greater precision than methods that use the variables proposed in previous studies.
Web-based tailored nutrition education: results of a randomized controlled trial.
Oenema, A; Brug, J; Lechner, L
2001-12-01
There is ample evidence that printed, computer-tailored nutrition education is a more effective tool for motivating people to change to healthier diets than general nutrition education. New technology is now providing more advanced ways of delivering tailored messages, e.g. via the World Wide Web (WWW). Before disseminating a tailored intervention via the web, it is important to investigate the potential of web-based tailored nutrition education. The present study investigated the immediate impact of web-based computer-tailored nutrition education on personal awareness and intentions related to intake of fat, fruit and vegetables. A randomized controlled trial, with a pre-test-post-test control group design was conducted. Significant differences in awareness and intention to change were found between the intervention and control group at post-test. The tailored intervention was appreciated better, was rated as more personally relevant, and had more subjective impact on opinion and intentions to change than the general nutrition information. Computer literacy had no effect on these ratings. The results indicate that interactive, web-based computer-tailored nutrition education can lead to changes in determinants of behavior. Future research should be aimed at longer-term (behavioral) effects and the practicability of distributing tailored interventions via the WWW.
SCOTCH: Secure Counting Of encrypTed genomiC data using a Hybrid approach.
Chenghong, Wang; Jiang, Yichen; Mohammed, Noman; Chen, Feng; Jiang, Xiaoqian; Al Aziz, Md Momin; Sadat, Md Nazmus; Wang, Shuang
2017-01-01
As genomic data are usually at large scale and highly sensitive, it is essential to enable both efficient and secure analysis, by which the data owner can securely delegate both computation and storage on untrusted public cloud. Counting query of genotypes is a basic function for many downstream applications in biomedical research (e.g., computing allele frequency, calculating chi-squared statistics, etc.). Previous solutions show promise on secure counting of outsourced data but the efficiency is still a big limitation for real world applications. In this paper, we propose a novel hybrid solution to combine a rigorous theoretical model (homomorphic encryption) and the latest hardware-based infrastructure (i.e., Software Guard Extensions) to speed up the computation while preserving the privacy of both data owners and data users. Our results demonstrated efficiency by using the real data from the personal genome project.
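The counting query at the heart of SCOTCH can be illustrated with the homomorphic half of the design alone. The sketch below uses textbook Paillier encryption with deliberately tiny, insecure primes; SCOTCH's actual construction pairs homomorphic encryption with SGX enclaves and is not reproduced here:

```python
# Toy additively homomorphic counting: the cloud multiplies ciphertexts
# of 0/1 genotype indicators, which adds the plaintexts, so only the key
# holder ever learns the count. Primes are tiny and insecure on purpose;
# real Paillier keys use primes of ~1024 bits or more.
import math
import random

p, q = 17, 19                      # toy primes for illustration only
n, n2 = p * q, (p * q) ** 2
lam = math.lcm(p - 1, q - 1)
mu = pow((pow(n + 1, lam, n2) - 1) // n, -1, n)

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return pow(n + 1, m, n2) * pow(r, n, n2) % n2

def decrypt(c):
    return (pow(c, lam, n2) - 1) // n * mu % n

# Each patient submits an encrypted indicator: 1 = carries the genotype.
indicators = [1, 0, 1, 1, 0, 1]
ciphertexts = [encrypt(b) for b in indicators]
# The untrusted server multiplies ciphertexts without decrypting anything.
encrypted_count = math.prod(ciphertexts) % n2
count = decrypt(encrypted_count)   # only the key holder recovers the sum
```

The server never sees a plaintext; randomized encryption also hides which patients carry the genotype, since two encryptions of the same bit differ.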
SCOTCH: Secure Counting Of encrypTed genomiC data using a Hybrid approach
Chenghong, Wang; Jiang, Yichen; Mohammed, Noman; Chen, Feng; Jiang, Xiaoqian; Al Aziz, Md Momin; Sadat, Md Nazmus; Wang, Shuang
2017-01-01
As genomic data are usually at large scale and highly sensitive, it is essential to enable both efficient and secure analysis, by which the data owner can securely delegate both computation and storage on untrusted public cloud. Counting query of genotypes is a basic function for many downstream applications in biomedical research (e.g., computing allele frequency, calculating chi-squared statistics, etc.). Previous solutions show promise on secure counting of outsourced data but the efficiency is still a big limitation for real world applications. In this paper, we propose a novel hybrid solution to combine a rigorous theoretical model (homomorphic encryption) and the latest hardware-based infrastructure (i.e., Software Guard Extensions) to speed up the computation while preserving the privacy of both data owners and data users. Our results demonstrated efficiency by using the real data from the personal genome project. PMID:29854245
Research on Multi-Person Parallel Modeling Method Based on Integrated Model Persistent Storage
NASA Astrophysics Data System (ADS)
Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Liu, Ying
2018-03-01
This paper mainly studies a multi-person parallel modeling method based on integrated model persistent storage. The integrated model refers to a set of MDDT modeling graphics system that can carry out multi-angle, multi-level and multi-stage description of aerospace general embedded software. Persistent storage refers to converting the data model in memory into a storage model and converting the storage model back into a data model in memory, where the data model refers to the object model and the storage model is a binary stream. Multi-person parallel modeling refers to the need for multi-person collaboration, separation of roles, and even real-time remote synchronized modeling.
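The persistence idea, flattening an in-memory object model to a binary stream and restoring it, can be sketched minimally as follows; the record layout and fields are hypothetical, not the MDDT storage model:

```python
# Minimal round-trip between an in-memory object model and a binary
# stream (the "storage model"). The (id, x, y) record layout is a
# hypothetical stand-in for the real modeling elements.
import struct

def to_stream(model):
    """Serialize a list of (id, x, y) model elements to a binary stream."""
    out = struct.pack("<I", len(model))          # element count header
    for nid, x, y in model:
        out += struct.pack("<Iff", nid, x, y)    # fixed 12-byte records
    return out

def from_stream(data):
    """Rebuild the in-memory object model from the binary stream."""
    count = struct.unpack_from("<I", data)[0]
    nodes, offset = [], 4
    for _ in range(count):
        nid, x, y = struct.unpack_from("<Iff", data, offset)
        nodes.append((nid, x, y))
        offset += struct.calcsize("<Iff")
    return nodes

model = [(1, 1.5, 2.5), (2, 0.0, -1.0)]
restored = from_stream(to_stream(model))  # round-trips the object model
```

A shared stream format like this is also what lets multiple modelers synchronize: each client serializes its edits and merges streams received from collaborators.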
Miao, Yinbin; Ma, Jianfeng; Liu, Ximeng; Wei, Fushan; Liu, Zhiquan; Wang, Xu An
2016-11-01
Online personal health record (PHR) systems are increasingly inclined to shift data storage and search operations to the cloud server so as to enjoy elastic resources and lessen the computational burden of cloud storage. As multiple patients' data is always stored in the cloud server simultaneously, it is a challenge to guarantee the confidentiality of PHR data and allow data users to search encrypted data in an efficient and privacy-preserving way. To this end, we design a secure cryptographic primitive called attribute-based multi-keyword search over encrypted personal health records in a multi-owner setting, supporting both fine-grained access control and multi-keyword search via Ciphertext-Policy Attribute-Based Encryption. Formal security analysis proves our scheme is selectively secure against chosen-keyword attack. As a further contribution, we conduct empirical experiments over a real-world dataset to show its feasibility and practicality in a broad range of actual scenarios without incurring additional computational burden.
Privacy Policy Enforcement for Ambient Ubiquitous Services
NASA Astrophysics Data System (ADS)
Oyomno, Were; Jäppinen, Pekka; Kerttula, Esa
Ubiquitous service providers leverage miniaturised computing terminals equipped with wireless capabilities to offer new service models. These models are pivoted on personal and inexpensive terminals to customise services to individual preferences. Portability, small size and compact keyboards are a few of the features popularising mobile terminals. These features enable the storing and carrying of ever-increasing proportions of personal data and the ability to use them in service adaptations. Ubiquitous services automate deeper soliciting of personal data transparently, without the need for user interactions. Transparent solicitation, acquisition and handling of personal data legitimise privacy concerns regarding disclosure, retention and re-use of the data. This study presents a policy enforcement framework for ubiquitous services that safeguards the handling of users' personal data and monitors adherence to stipulated privacy policies. Enforcement structures oriented towards usability and scalability are presented.
ERIC Educational Resources Information Center
Motschnig-Pitrik, Renate; Mallich, Katharina
2004-01-01
Web-based technology increases the hours we spend sitting in front of the screens of our computers. But can it also be used in a way to improve our social skills? The blended learning paradigm of Person-Centered e-Learning (PCeL) precisely aims to achieve intellectual as well as social and personal development by combining the benefits of online…
When Everybody Anticipates in a Different Way …
NASA Astrophysics Data System (ADS)
Kindler, Eugene
2002-09-01
The paper is oriented to the computer modeling of anticipatory systems in which there is more than one anticipating individual. The anticipations of these individuals can mutually differ. In such a case we can meet four main cases: (1) the anticipating persons engage in a dialogue to reach some agreement, and in this way they can optimize the anticipation; (2) one of the anticipating persons is a teacher of the other ones and can show them where their anticipation should have been better; (3) the anticipating persons compete, each of them expecting to make the best anticipation and wishing to apply it in order to make the other ones weaker; (4) the anticipating persons do not mutually communicate. A human often anticipates by imagining the possible processes of the future, performing a certain "mental simulation"; nowadays, a human uses computer simulation to replace that (insufficient) mental simulation. All the variants were simulated so that the human imagining was transferred to a computer simulation. Thus systems containing several simulating elements were simulated. Experiences with this "nested" simulation and its applications are described.
Physician Utilization of a Hospital Information System: A Computer Simulation Model
Anderson, James G.; Jay, Stephen J.; Clevenger, Stephen J.; Kassing, David R.; Perry, Jane; Anderson, Marilyn M.
1988-01-01
The purpose of this research was to develop a computer simulation model that represents the process through which physicians enter orders into a hospital information system (HIS). Computer simulation experiments were performed to estimate the effects of two methods of order entry on outcome variables. The results of the computer simulation experiments were used to perform a cost-benefit analysis to compare the two different means of entering medical orders into the HIS. The results indicate that the use of personal order sets to enter orders into the HIS will result in a significant reduction in manpower, salaries and fringe benefits, and errors in order entry.
Synchronous computer mediated group discussion.
Gallagher, Peter
2005-01-01
Over the past 20 years, focus groups have become increasingly popular with nursing researchers as a data collection method, as has the use of computer-based technologies to support all forms of nursing research. This article describes the conduct of a series of focus groups in which the participants were in the same room as part of a "real-time" discussion during which they also used personal computers as an interface between each other and the moderator. Synchronous Computer Mediated Group Discussion differed from other forms of focus group discussion in that participants used personal computers rather than verbal expressions to respond to specific questions, to engage in communication with other participants, and to record their thoughts. This form of focus group maintained many of the features of spoken exchanges, a cornerstone of the focus group, while capturing the advantages of online discussion.
NASA Astrophysics Data System (ADS)
Park, Sang Chul
1989-09-01
We develop a mathematical analysis model to calculate the probability of intercept (POI) for a ground-based communication intercept (COMINT) system. The POI is a measure of the effectiveness of the intercept system. We define the POI as the product of the probability of detection and the probability of coincidence. The probability of detection is a measure of the receiver's capability to detect a signal in the presence of noise. The probability of coincidence is the probability that an intercept system is available, actively listening in the proper frequency band, in the right direction, and at the same time that the signal is received. We investigate the behavior of the POI with respect to the observation time, the separation distance, antenna elevations, the frequency of the signal, and the receiver bandwidths. We observe that the coincidence characteristic between the receiver scanning parameters and the signal parameters is the key factor determining the time needed to obtain a given POI. This model can be used to find the optimal parameter combination to maximize the POI in a given scenario. We extend this model to multiple systems. The analysis runs on a personal computer for portability, and the model is flexible enough to be easily adapted to different situations.
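The product form of the POI described above can be sketched in a few lines. The functional forms and every parameter below are illustrative assumptions, not the paper's actual model: detection is modeled as a Gaussian-in-dB SNR fluctuation, and the coincidence factors (frequency, direction, time) are treated as independent fractions.

```python
import math

def prob_detection(snr_db: float, threshold_db: float = 10.0, sigma_db: float = 3.0) -> float:
    """Probability that the received SNR exceeds the detection threshold,
    assuming a Gaussian-in-dB fluctuation (illustrative only)."""
    z = (snr_db - threshold_db) / sigma_db
    return 0.5 * math.erfc(-z / math.sqrt(2))

def prob_coincidence(freq_fraction: float, direction_fraction: float, time_fraction: float) -> float:
    """Probability that the scanning receiver is tuned to the right band,
    pointed in the right direction, and listening while the signal is on air.
    The three fractions are assumed independent (a simplification)."""
    return freq_fraction * direction_fraction * time_fraction

def prob_intercept(pd: float, pc: float) -> float:
    """POI = probability of detection x probability of coincidence."""
    return pd * pc

pd = prob_detection(snr_db=13.0)        # one sigma above threshold
pc = prob_coincidence(0.25, 0.5, 0.8)   # hypothetical scan parameters
poi = prob_intercept(pd, pc)
```

In this sketch, sweeping the hypothetical scan fractions (or observation time folded into `time_fraction`) reproduces the kind of parameter study the abstract describes.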
Kling, Daniel; Tillmar, Andreas; Egeland, Thore; Mostad, Petter
2015-09-01
Several applications necessitate an unbiased determination of relatedness, be it in linkage or association studies or in a forensic setting. An appropriate model to compute the joint probability of some genetic data for a set of persons given some hypothesis about the pedigree structure is then required. The increasing number of markers available through high-density SNP microarray typing and NGS technologies intensifies the demand, where using a large number of markers may lead to biased results due to strong dependencies between closely located loci, both within pedigrees (linkage) and in the population (allelic association or linkage disequilibrium (LD)). We present a new general model, based on a Markov chain for inheritance patterns and another Markov chain for founder allele patterns, the latter allowing us to account for LD. We also demonstrate a specific implementation for X chromosomal markers that allows for computation of likelihoods based on hypotheses of alleged relationships and genetic marker data. The algorithm can simultaneously account for linkage, LD, and mutations. We demonstrate its feasibility using simulated examples. The algorithm is implemented in the software FamLinkX, providing a user-friendly GUI for Windows systems (FamLinkX, as well as further usage instructions, is freely available at www.famlink.se ). Our software provides the necessary means to solve cases where no previous implementation exists. In addition, the software has the possibility to perform simulations in order to further study the impact of linkage and LD on computed likelihoods for an arbitrary set of markers.
Computer program for the reservoir model of metabolic crossroads.
Ribeiro, J M; Juzgado, D; Crespo, E; Sillero, A
1990-01-01
A program of 344 statements, written in BASIC and adapted to run on personal computers (PCs), has been developed to simulate the reservoir model of metabolic crossroads. The program draws the holes of the reservoir with shapes reflecting the Vmax, Km (S0.5), and cooperativity coefficients (n) of the enzymes, and calculates both the actual velocities and the percentage contribution of every enzyme to the overall removal of their common substrate.
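The velocity and percentage-contribution calculation the abstract describes follows directly from Hill kinetics (which reduces to Michaelis-Menten when n = 1). The enzyme names and parameter values below are hypothetical, chosen only to illustrate the computation:

```python
def hill_rate(s: float, vmax: float, s05: float, n: float = 1.0) -> float:
    """Enzyme velocity at substrate concentration s (Hill equation)."""
    return vmax * s**n / (s05**n + s**n)

def crossroad(s: float, enzymes: dict) -> tuple:
    """Actual velocity of each enzyme at the common substrate concentration,
    plus each enzyme's percentage of the total substrate removal."""
    v = {name: hill_rate(s, *params) for name, params in enzymes.items()}
    total = sum(v.values())
    pct = {name: 100.0 * vi / total for name, vi in v.items()}
    return v, pct

# Hypothetical three-enzyme crossroad: (Vmax, S0.5, n) per enzyme
enzymes = {"E1": (10.0, 0.5, 1.0), "E2": (5.0, 2.0, 2.0), "E3": (2.0, 0.1, 1.0)}
v, pct = crossroad(0.5, enzymes)
```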
Introduction to Financial Projection Models. Business Management Instructional Software.
ERIC Educational Resources Information Center
Pomeroy, Robert W., III
This guidebook and teacher's guide accompany a personal computer software program and introduce the key elements of financial projection modeling to project the financial statements of an industrial enterprise. The student will then build a model on an electronic spreadsheet. The guidebook teaches the purpose of a financial model and the steps…
32 CFR 806b.35 - Balancing protection.
Code of Federal Regulations, 2014 CFR
2014-07-01
..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...
A Web-based home welfare and care services support system using a pen type image sensor.
Ogawa, Hidekuni; Yonezawa, Yoshiharu; Maki, Hiromichi; Sato, Haruhiko; Hahn, Allen W; Caldwell, W Morton
2003-01-01
A long-term care insurance law for elderly persons was put into force two years ago in Japan. Home Helpers, who are employed by hospitals, care companies, or the welfare office, provide home welfare and care services for the elderly, such as cooking, bathing, washing, cleaning, and shopping. We developed a web-based home welfare and care services support system using wireless Internet mobile phones and Internet client computers, which employs a pen-type image sensor. The pen-type image sensor is used by the elderly as the entry device for their care requests. The client computer sends the requests to the server computer in the Home Helper central office, and the server computer automatically transfers them to the Home Helper's mobile phone. This newly developed support system is easily operated by elderly persons and enables Home Helpers to save a significant amount of time and travel.
Development of a computational model of glucose toxicity in the progression of diabetes mellitus.
Perez-Rivera, Danilo T; Torres-Torres, Veronica L; Torres-Colon, Abraham E; Cruz-Aponte, Maytee
2016-10-01
Diabetes mellitus is a disease characterized by a range of metabolic complications involving an individual's blood glucose levels and their main regulator, insulin. These complications can vary largely from person to person depending on the patient's current biophysical state. Biomedical research makes daily strides toward impacting the lives of patients with a variety of diseases, including diabetes. One large stride is the development of techniques to help physicians "personalize medicine". From available physiological data, biological understanding of the system, and dimensional analysis, a differential equation-based mathematical model was built in a sequential manner, to elucidate clearly how each parameter correlates with the patient's current physiological state. We developed a simple mathematical model that accurately simulates the dynamics between glucose, insulin, and pancreatic β-cells throughout disease progression, with constraints to maintain biological relevance. The current framework is clearly capable of tracking the patient's progress through the disease, dependent on factors such as latent insulin resistance or an attrited β-cell population. A further goal is to develop tools that allow direct and feasible testing of how effective a given plan of treatment would be at returning the patient to a desirable biophysical state.
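The coupled glucose/insulin/β-cell dynamics described above can be sketched as a minimal three-variable ODE system integrated with forward Euler. The equation forms and every parameter value here are illustrative assumptions in the general spirit of such models, not the authors' actual model:

```python
def step(G: float, I: float, B: float, dt: float = 0.01) -> tuple:
    """One Euler step of a toy glucose (G), insulin (I), beta-cell mass (B)
    model. All parameters and functional forms are hypothetical."""
    R0, EG0, SI = 1.0, 0.1, 0.5   # glucose input, effectiveness, insulin sensitivity
    sigma, k = 0.5, 0.3           # beta-cell secretion capacity, insulin clearance
    d0, r = 0.01, 0.005           # beta-cell death rate, glucose-driven replication
    dG = R0 - (EG0 + SI * I) * G                  # glucose balance
    dI = sigma * B * G**2 / (1 + G**2) - k * I    # saturating secretion minus clearance
    dB = (r * G - d0) * B                         # slow beta-cell mass adaptation
    return G + dt * dG, I + dt * dI, B + dt * dB

# Simulate 10 time units from an elevated-glucose initial state
G, I, B = 5.0, 0.0, 1.0
for _ in range(1000):
    G, I, B = step(G, I, B)
```

The separation of time scales (fast G and I, slow B) is what lets such models track disease progression through gradual β-cell attrition.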
Enhancing Privacy in Participatory Sensing Applications with Multidimensional Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forrest, Stephanie; He, Wenbo; Groat, Michael
2013-01-01
Participatory sensing applications rely on individuals to share personal data to produce aggregated models and knowledge. In this setting, privacy concerns can discourage widespread adoption of new applications. We present a privacy-preserving participatory sensing scheme based on negative surveys for both continuous and multivariate categorical data. Without relying on encryption, our algorithms enhance the privacy of sensed data in an energy and computation efficient manner. Simulations and implementation on Android smart phones illustrate how multidimensional data can be aggregated in a useful and privacy-enhancing manner.
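For the categorical case, the negative-survey idea mentioned above admits a short sketch: each participant reports one category they do NOT belong to, chosen uniformly among the other t-1, so the expected tally satisfies E[y_i] = (n - x_i)/(t - 1) and the aggregator can estimate x_i = n - (t - 1)·y_i. This is a generic negative-survey reconstruction, not necessarily the exact algorithm of the cited scheme:

```python
def reconstruct(negative_counts: list) -> list:
    """Estimate true category counts from negative-survey tallies.
    Since E[y_i] = (n - x_i)/(t - 1), invert to x_i = n - (t - 1) * y_i."""
    n = sum(negative_counts)   # one negative report per participant
    t = len(negative_counts)
    return [n - (t - 1) * y for y in negative_counts]

# Example: 100 participants, 4 categories, tallied negative reports
estimates = reconstruct([20, 25, 25, 30])   # -> [40, 25, 25, 10]
```

No individual report reveals a participant's true category, yet the aggregate estimates remain useful, which is the privacy/utility trade-off the abstract highlights.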
Negotiation Performance: Antecedents, Outcomes, and Training Recommendations
2011-10-01
Tutorial, cognitive apprenticeship, instructional conversation, independent programmed instruction, and computer-based instruction … declarative and procedural knowledge, as well as the more distal antecedents of individual difference variables (e.g., cognitive ability, personality) and psychological processes (e.g., cognitive, motivational, and emotional).
A Curriculum Model for Teaching Telecommunications to Middle and Secondary School Students.
ERIC Educational Resources Information Center
Daughenbaugh, Richard L.
This curriculum guide is intended for use in teaching a unit on telecommunications to students with a basic understanding of computing. Introductory materials spell out the purpose of the unit--to provide an introduction to the sending and receiving of electronic information using a personal computer system and the telephone communications…
Developing Instructional Applications at the Secondary Level. The Computer as a Tool.
ERIC Educational Resources Information Center
McManus, Jack; And Others
Case studies are presented for seven Los Angeles area (California) high schools that worked with Pepperdine University in the IBM/ETS (International Business Machines/Educational Testing Service) Model Schools program, a project which provided training for selected secondary school teachers in the use of personal computers and selected software as…
Processing Diabetes Mellitus Composite Events in MAGPIE.
Brugués, Albert; Bromuri, Stefano; Barry, Michael; Del Toro, Óscar Jiménez; Mazurkiewicz, Maciej R; Kardas, Przemyslaw; Pegueroles, Josep; Schumacher, Michael
2016-02-01
The focus of this research is the definition of programmable expert Personal Health Systems (PHS) to monitor patients affected by chronic diseases, using agent-oriented programming and mobile computing to represent the interactions among the components of the system. The paper also discusses issues of knowledge representation within the medical domain when dealing with temporal patterns concerning the physiological values of the patient. In the presented agent-based PHS, doctors can personalize, for each patient, monitoring rules that can be defined graphically. Furthermore, to achieve better scalability, the computations for monitoring the patients are distributed among their devices rather than being performed on a centralized server. The system is evaluated using data from 21 diabetic patients to detect temporal patterns according to a set of defined monitoring rules. The system's scalability is evaluated by comparing it with a centralized approach. The evaluation concerning the detection of temporal patterns highlights the system's ability to monitor chronic patients affected by diabetes. Regarding scalability, the results show that an approach exploiting mobile computing is more scalable than a centralized approach, and therefore more likely to satisfy the needs of next-generation PHSs. PHSs are becoming an adopted technology to deal with the surge of patients affected by chronic illnesses. This paper discusses architectural choices that make an agent-based PHS more scalable through a distributed mobile computing approach. It also discusses how to model the medical knowledge in the PHS so that it is modifiable at run time. The evaluation highlights the necessity of distributing the reasoning to the mobile part of the system, and shows that modifiable rules are able to deal with changes in the lifestyle of patients affected by chronic illnesses.
Indexing and retrieving motions of characters in close contact.
Ho, Edmond S L; Komura, Taku
2009-01-01
Human motion indexing and retrieval are important for animators because of the need to search a database for motions that can be blended and concatenated. Most previous research on human motion indexing and retrieval computes the Euclidean distance of joint angles or joint positions. Such approaches are difficult to apply in cases where multiple characters are closely interacting with each other, as the relationships between the characters are not encoded in the representation. In this research, we propose a topology-based approach to indexing the motions of two human characters in close contact. We compute and encode how the two bodies are tangled based on the concept of rational tangles. The encoded relationships, which we define as a TangleList, are used to determine the similarity of pairs of postures. Using our method, we can index and retrieve motions such as one person piggy-backing another, one person assisting another in walking, and two persons dancing or wrestling. Our method is useful for managing a motion database of multiple characters. We can also produce motion graph structures of two characters closely interacting with each other by interpolating and concatenating topologically similar postures and motion clips, which are applicable to 3D computer games and computer animation.
Nutritional metabolomics: Progress in addressing complexity in diet and health
Jones, Dean P.; Park, Youngja; Ziegler, Thomas R.
2013-01-01
Nutritional metabolomics is rapidly maturing to use small molecule chemical profiling to support integration of diet and nutrition in complex biosystems research. These developments are critical to facilitate transition of nutritional sciences from population-based to individual-based criteria for nutritional research, assessment and management. This review addresses progress in making these approaches manageable for nutrition research. Important concept developments concerning the exposome, predictive health and complex pathobiology, serve to emphasize the central role of diet and nutrition in integrated biosystems models of health and disease. Improved analytic tools and databases for targeted and non-targeted metabolic profiling, along with bioinformatics, pathway mapping and computational modeling, are now used for nutrition research on diet, metabolism, microbiome and health associations. These new developments enable metabolome-wide association studies (MWAS) and provide a foundation for nutritional metabolomics, along with genomics, epigenomics and health phenotyping, to support integrated models required for personalized diet and nutrition forecasting. PMID:22540256
Project EASE: a study to test a psychosocial model of epilepsy medication management.
DiIorio, Collen; Shafer, Patricia Osborne; Letz, Richard; Henry, Thomas R; Schomer, Donal L; Yeager, Kate
2004-12-01
The purpose of this study was to test a psychosocial model of medication self-management among people with epilepsy. This model was based primarily on social cognitive theory and included personal (self-efficacy, outcome expectations, goals, stigma, and depressive symptoms), social (social support), and provider (patient satisfaction and desire for control) variables. Participants for the study were enrolled at research sites in Atlanta, Georgia, and Boston, Massachusetts, and completed computer-based assessments that included measures of the study variables listed above. The mean age of the 317 participants was 43.3 years; about 50% were female, and 81% were white. Self-efficacy and patient satisfaction explained the most variance in medication management. Social support was related to self-efficacy; stigma to self-efficacy and depressive symptoms; and self-efficacy to outcome expectations and depressive symptoms. Findings reinforce that medication-taking behavior is affected by a complex set of interactions among psychosocial variables.
NASA Astrophysics Data System (ADS)
Maes, Pieter-Jan; Amelynck, Denis; Leman, Marc
2012-12-01
In this article, a computational platform is presented, entitled "Dance-the-Music", that can be used in a dance educational context to explore and learn the basics of dance steps. By introducing a method based on spatiotemporal motion templates, the platform facilitates training basic step models from sequentially repeated dance figures performed by a dance teacher. Movements are captured with an optical motion capture system. The teacher's models can be visualized from a first-person perspective to instruct students how to perform the specific dance steps in the correct manner. Moreover, recognition algorithms, based on a template matching method, can determine the quality of a student's performance in real time by means of multimodal monitoring techniques. The results of an evaluation study suggest that Dance-the-Music is effective in helping dance students to master the basics of dance figures.
Age synthesis and estimation via faces: a survey.
Fu, Yun; Guo, Guodong; Huang, Thomas S
2010-11-01
Human age, as an important personal trait, can be directly inferred from distinct patterns emerging in the facial appearance. Driven by rapid advances in computer graphics and machine vision, computer-based age synthesis and estimation via faces have become particularly prevalent topics recently because of their rapidly emerging real-world applications, such as forensic art, electronic customer relationship management, security control and surveillance monitoring, biometrics, entertainment, and cosmetology. Age synthesis is defined as rerendering a face image aesthetically with natural aging and rejuvenating effects on the individual face. Age estimation is defined as labeling a face image automatically with the exact age (year) or the age group (year range) of the individual face. Because of their particularity and complexity, both problems are attractive yet challenging to computer-based application system designers. Large efforts from both academia and industry have been devoted over the last few decades. In this paper, we survey the state-of-the-art techniques in face image-based age synthesis and estimation. Existing models, popular algorithms, system performances, technical difficulties, popular face aging databases, evaluation protocols, and promising future directions are provided with systematic discussions.
Directory of Energy Information Administration model abstracts 1988
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1988-01-01
This directory contains descriptions of each basic and auxiliary model, including the title, acronym, purpose, and type, followed by more detailed information on characteristics, uses, and requirements. For developing models, limited information is provided. Sources for additional information are identified. Included in this directory are 44 EIA models active as of February 1, 1988, 16 of which operate on personal computers. Models that run on personal computers are identified by "PC" as part of their acronyms. The main body of this directory is an alphabetical listing of all basic and auxiliary EIA models. Appendix A identifies major EIA modeling systems and the models within these systems, and Appendix B identifies EIA models by type (basic or auxiliary). Appendix C lists developing models and contact persons for those models. A basic model is one designated by the EIA Administrator as being sufficiently important to require sustained support and public scrutiny. An auxiliary model is one designated by the EIA Administrator as being used only occasionally in analyses, and therefore requiring minimal levels of documentation. A developing model is one designated by the EIA Administrator as being under development and yet of sufficient interest to require a basic level of documentation at a future date. EIA also leases models developed by proprietary software vendors. Documentation for these "proprietary" models is the responsibility of the companies from which they are leased. EIA has recently leased models from Chase Econometrics, Inc., Data Resources, Inc. (DRI), the Oak Ridge National Laboratory (ORNL), and Wharton Econometric Forecasting Associates (WEFA). Leased models are not abstracted here.
Lai, Tsai-Ya; Larson, Elaine L; Rockoff, Maxine L; Bakken, Suzanne
2008-01-01
The Tailored Interventions for management of DEpressive Symptoms (TIDES) program was designed, based on social cognitive theory, to provide tailored, computer-based education on key elements and self-care strategies for depressive symptoms in persons living with HIV/AIDS (PLWHAs). Based on an extension of the Technology Acceptance Model (TAM), a cross-sectional design was used to assess the acceptance of the HIV TIDES prototype and explore the relationships among system acceptance factors proposed in the conceptual model. Thirty-two PLWHAs were recruited from HIV/AIDS clinics. The majority were African American (68.8%), male (65.6%), with high school or lower education (68.7%), and in their 40s (62.5%). Participants spent an average of 10.4 minutes (SD = 5.6) using HIV TIDES. The PLWHAs rated the system as easy to use (mean = 9.61, SD = 0.76) and useful (mean = 9.50, SD = 1.16). The high ratings of behavioral intention to use (mean = 9.47, SD = 1.24) suggest that HIV TIDES has the potential to be accepted and used by PLWHAs. Four factors were positively correlated with behavioral intention to use: perceived usefulness (r = 0.61), perceived ease of use (r = 0.61), internal control (r = 0.59), and external control (r = 0.46). Computer anxiety (r = -0.80), tailoring path (r = -0.35), and depressive symptoms (r = -0.49) were negatively correlated with behavioral intention to use. The results of this study provide evidence of the acceptability of HIV TIDES by PLWHAs. Individuals are expected to be empowered through participating in the interactive process to generate their self-care plan. HIV TIDES enables information sharing about depression prevention and health promotion and has the potential to reframe the traditional patient-provider relationship.
Costa, P T; McCrae, R R
1997-02-01
The Revised NEO Personality Inventory (NEO-PI-R) consists of 30 facet scales that define the broad domains of the Five-Factor Model of personality. No major revisions of the basic model are anticipated in the near future. Despite their popularity, social desirability and inconsistency scales will not be added to the NEO-PI-R because their validity and utility have not yet been demonstrated. Among possible changes are minor modifications in wording and more extensive adaptations for adolescents and for populations with low reading levels. Contextualized (e.g., work-related) versions of the instrument will be further explored. Many changes are more easily implemented on the computer than the print version of the instrument.
Personality from a cognitive-biological perspective.
Neuman, Yair
2014-12-01
The term "personality" is used to describe a distinctive and relatively stable set of mental traits that aim to explain the organism's behavior. The concept of personality that emerged in human psychology has also been applied to the study of non-human organisms from birds to horses. In this paper, I critically review the concept of personality from an interdisciplinary perspective and point to some ideas that may be used for developing a cognitive-biological theory of personality. Integrating theories and research findings from various fields such as cognitive ethology, clinical psychology, and neuroscience, I argue that the common denominator of various personality theories is neural systems of threat/trust management and their emotional, cognitive, and behavioral dimensions. In this context, personality may also be conceived as a meta-heuristic that both human and non-human organisms apply to model and predict the behavior of others. The paper concludes by suggesting a minimal computational model of personality that may guide future research. Copyright © 2014 Elsevier B.V. All rights reserved.
Investigation of a computer virus outbreak in the pharmacy of a tertiary care teaching hospital.
Bailey, T C; Reichley, R M
1992-10-01
A computer virus outbreak was recognized, verified, defined, investigated, and controlled using an infection control approach. The pathogenesis and epidemiology of computer virus infection are reviewed. Case-control study. Pharmacy of a tertiary care teaching institution. On October 28, 1991, 2 personal computers in the drug information center manifested symptoms consistent with the "Jerusalem" virus infection. The same day, a departmental personal computer began playing "Yankee Doodle," a sign of "Doodle" virus infection. An investigation of all departmental personal computers identified the "Stoned" virus in an additional personal computer. Controls were functioning virus-free personal computers within the department. Cases were associated with users who brought diskettes from outside the department (5/5 cases versus 5/13 controls, p = .04) and with College of Pharmacy student users (3/5 cases versus 0/13 controls, p = .012). The detection of a virus-infected diskette or personal computer was associated with the number of 5 1/4-inch diskettes in the files of personal computers, a surrogate for rate of media exchange (mean = 17.4 versus 152.5, p = .018, Wilcoxon rank sum test). After education of departmental personal computer users regarding appropriate computer hygiene and installation of virus protection software, no further spread of personal computer viruses occurred, although 2 additional Stoned-infected and 1 Jerusalem-infected diskettes were detected. We recommend that virus detection software be installed on personal computers where the interchange of diskettes among computers is necessary, that write-protect tabs be placed on all program master diskettes and data diskettes where data are being read and not written, that in the event of a computer virus outbreak, all available diskettes be quarantined and scanned by virus detection software, and to facilitate quarantine and scanning in an outbreak, that diskettes be stored in organized files.
Riley, Elizabeth N.; Peterson, Sarah J.; Smith, Gregory T.
2017-01-01
While the overall stability of personality across the lifespan has been well-documented, one does see incremental changes in a number of personality traits, changes that may impact overall life trajectories in both positive and negative ways. In this chapter, we present a new, developmentally-oriented and integrative model of the factors that might lead to personality change, drawing from the theoretical and empirical work of prior models (e.g. Caspi & Roberts, 2001; Roberts et al., 2005) as well as from our own longitudinal studies of personality change and risky behavior engagement in children, adolescents, and young adults (Boyle et al., 2016; Riley & Smith, 2016; Riley et al., 2016). We focus on change in the trait of urgency, which is a high-risk personality trait that represents the tendency to act rashly when highly emotional. We explore processes of both biologically-based personality change in adolescence, integrating neurocognitive and puberty-based models, as well as behavior-based personality change, in which behaviors and the personality traits underlying those behaviors are incrementally reinforced and shaped over time. One implication of our model for clinical psychology is the apparent presence of a positive feedback loop of risk, in which maladaptive behaviors increase high-risk personality traits, which in turn further increase the likelihood of maladaptive behaviors, a process that continues far beyond the initial experiences of maladaptive behavior engagement. Finally, we examine important future directions for continuing work on personality change, including trauma-based personality change and more directive (e.g., therapeutic) approaches aimed at shaping personality. PMID:29109672
Code of Federal Regulations, 2013 CFR
2013-10-01
... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.705(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a) Definitions...
NASA Technical Reports Server (NTRS)
1994-01-01
The Data Egg, a prototype chord key-based data entry device, can be used autonomously or as an auxiliary keyboard with a personal computer. Data is entered by pressing combinations of seven buttons positioned where the fingers naturally fall when clasping the device. An experienced user can enter text at 30 to 35 words per minute. No transcription is required. The input is downloaded into a computer and printed. The Data Egg can be used by an astronaut in space, a journalist, a bedridden person, etc. It was developed by a Jet Propulsion Laboratory engineer. Product is not currently manufactured.
Development of a PC-based ground support system for a small satellite instrument
NASA Astrophysics Data System (ADS)
Deschambault, Robert L.; Gregory, Philip R.; Spenler, Stephen; Whalen, Brian A.
1993-11-01
The importance of effective ground support for the remote control and data retrieval of a satellite instrument cannot be overstated. Problems with ground support may include the need to base personnel at a ground tracking station for extended periods, and the delay between the instrument observation and the processing of the data by the science team. Flexible solutions to such problems in the case of small satellite systems are provided by using low-cost, powerful personal computers and off-the-shelf software for data acquisition and processing, and by using the Internet as a communication pathway to enable scientists to view and manipulate satellite data in real time at any ground location. The personal computer-based ground support system is illustrated for the case of the cold plasma analyzer flown on the Freja satellite. Commercial software was used as building blocks for writing the ground support equipment software. Several levels of hardware support, including unit tests and development, functional tests, and integration, were provided by portable and desktop personal computers. Satellite stations in Saskatchewan and Sweden were linked to the science team via phone lines and the Internet, which provided remote control through a central point. These successful strategies will be used on future small satellite space programs.
IAServ: an intelligent home care web services platform in a cloud for aging-in-place.
Su, Chuan-Jun; Chiang, Chang-Yu
2013-11-12
As the elderly population has been rapidly expanding and the core tax-paying population has been shrinking, the need for adequate elderly health and housing services continues to grow while the resources to provide such services are becoming increasingly scarce. Thus, increasing the efficiency of the delivery of healthcare services through the use of modern technology is a pressing issue. The seamless integration of such enabling technologies as ontology, intelligent agents, web services, and cloud computing is transforming healthcare from hospital-based treatments to home-based self-care and preventive care. A ubiquitous healthcare platform based on this technological integration, which synergizes service providers with patients' needs, has yet to be developed to provide personalized healthcare services at the right time, in the right place, and in the right manner. This paper presents the development and overall architecture of IAServ (the Intelligent Aging-in-place Home care Web Services Platform), which provides personalized healthcare services ubiquitously in a cloud computing setting to support the most desirable and cost-efficient method of care for the aged: aging in place. IAServ is expected to offer intelligent, pervasive, accurate and contextually-aware personal care services. Architecturally, the implemented IAServ leverages web services and cloud computing to provide economic, scalable, and robust healthcare services over the Internet.
NASA Astrophysics Data System (ADS)
Zakharova, Natalia; Piskovatsky, Nicolay; Gusev, Anatoly
2014-05-01
Development of Informational-Computational Systems (ICS) for data assimilation procedures is one of the multidisciplinary problems whose study and solution require modern results from different disciplines and recent developments in mathematical modeling; the theory of adjoint equations and optimal control; inverse problems; numerical methods theory; numerical algebra; and scientific computing. These problems are studied at the Institute of Numerical Mathematics of the Russian Academy of Sciences (INM RAS) in ICS for personal computers. In this work the results of the special database development for the ICS "INM RAS - Black Sea" are presented. The input information for the ICS is discussed, and some special data-processing procedures are described. The results of forecasts using the ICS "INM RAS - Black Sea" with operational observation data assimilation are also presented. This study was supported by the Russian Foundation for Basic Research (project No 13-01-00753) and by the Presidium Program of the Russian Academy of Sciences (project P-23 "Black sea as an imitational ocean model"). References 1. V.I. Agoshkov, M.V. Assovskii, S.A. Lebedev, Numerical simulation of Black Sea hydrothermodynamics taking into account tide-forming forces. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No.1, pp. 5-31. 2. E.I. Parmuzin, V.I. Agoshkov, Numerical solution of the variational assimilation problem for sea surface temperature in the model of the Black Sea dynamics. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No.1, pp. 69-94. 3. V.B. Zalesny, N.A. Diansky, V.V. Fomin, S.N. Moshonkin, S.G. Demyshev, Numerical model of the circulation of Black Sea and Sea of Azov. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No.1, pp. 95-111. 4. Agoshkov V.I., Assovsky M.B., Giniatulin S.V., Zakharova N.B., Kuimov G.V., Parmuzin E.I., Fomin V.V. Informational Computational system of variational assimilation of observation data "INM RAS - Black sea" // Ecological safety of coastal and shelf zones and complex use of shelf resources: Collection of scientific works. Issue 26, Volume 2. National Academy of Sciences of Ukraine, Marine Hydrophysical Institute, Sebastopol, 2012. Pages 352-360. (In Russian)
User assessment of smoke-dispersion models for wildland biomass burning.
Steve Breyfogle; Sue A. Ferguson
1996-01-01
Several smoke-dispersion models, which currently are available for modeling smoke from biomass burns, were evaluated for ease of use, availability of input data, and output data format. The input and output components of all models are listed, and differences in model physics are discussed. Each model was installed and run on a personal computer with a simple-case...
Revealing Real-Time Emotional Responses: a Personalized Assessment based on Heartbeat Dynamics
NASA Astrophysics Data System (ADS)
Valenza, Gaetano; Citi, Luca; Lanatá, Antonio; Scilingo, Enzo Pasquale; Barbieri, Riccardo
2014-05-01
Emotion recognition through computational modeling and analysis of physiological signals has been widely investigated in the last decade. Most of the proposed emotion recognition systems require relatively long-time series of multivariate records and do not provide accurate real-time characterizations using short-time series. To overcome these limitations, we propose a novel personalized probabilistic framework able to characterize the emotional state of a subject through the analysis of heartbeat dynamics exclusively. The study includes thirty subjects presented with a set of standardized images gathered from the international affective picture system, alternating levels of arousal and valence. Due to the intrinsic nonlinearity and nonstationarity of the RR interval series, a specific point-process model was devised for instantaneous identification considering autoregressive nonlinearities up to the third-order according to the Wiener-Volterra representation, thus tracking very fast stimulus-response changes. Features from the instantaneous spectrum and bispectrum, as well as the dominant Lyapunov exponent, were extracted and considered as input features to a support vector machine for classification. Results, estimating emotions every 10 seconds, achieve an overall accuracy of 79.29% in recognizing four emotional states based on the circumplex model of affect, with 79.15% on the valence axis and 83.55% on the arousal axis.
Importance of Personalized Health-Care Models: A Case Study in Activity Recognition.
Zdravevski, Eftim; Lameski, Petre; Trajkovik, Vladimir; Pombo, Nuno; Garcia, Nuno
2018-01-01
Novel information and communication technologies create possibilities to change the future of health care. Ambient Assisted Living (AAL) is seen as a promising supplement to current care models. The main goal of AAL solutions is to apply ambient intelligence technologies to enable elderly people to continue to live in their preferred environments. Applying trained models from health data is challenging because the personalized environments could differ significantly from the ones which provided the training data. This paper investigates the effects on activity recognition accuracy, using a single accelerometer, of personalized models compared to models built on the general population. In addition, we propose a collaborative-filtering-based approach which provides a balance between fully personalized models and generic models. The results show that the accuracy could be improved to 95% with fully personalized models, and up to 91.6% with collaborative-filtering-based models, which is significantly better than common models that exhibit an accuracy of 85.1%. The collaborative filtering approach seems to provide highly personalized models with substantial accuracy, while overcoming the cold start problem that is common for fully personalized models.
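The trade-off the abstract describes — generic models when personal data is scarce, personalized models as data accumulates — can be sketched as a simple shrinkage blend. This is an illustrative sketch only, not the paper's collaborative-filtering algorithm; the weighting constant `k` and the model callables are hypothetical:

```python
def blended_prediction(user_features, personal_model, general_model,
                       n_personal_samples, k=10.0):
    """Blend a personalized model with a population model: with few
    personal training samples the general model dominates, and the
    weight shifts toward the personal model as data accumulates.
    (Illustrative sketch, not the paper's exact method.)"""
    alpha = n_personal_samples / (n_personal_samples + k)
    return (alpha * personal_model(user_features)
            + (1 - alpha) * general_model(user_features))
```

With `n_personal_samples == k` the two models contribute equally; as `n_personal_samples` grows, the prediction converges to the personal model's output.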
Davidson, R W
1985-01-01
The increasing need to communicate and exchange data can be handled by personal microcomputers. The necessity of transferring information stored in one type of personal computer to another type of personal computer is often encountered when integrating multiple sources of information stored in different and incompatible computers in medical research and practice. A practical example is demonstrated with two relatively inexpensive, commonly used computers, the IBM PC jr. and the Apple IIe. The basic input/output (I/O) interface chips for serial communication in each computer are joined together using a null connector and cable to form a communications link. Using the BASIC (Beginner's All-purpose Symbolic Instruction Code) computer language and the Disk Operating System (DOS), the communications handshaking protocol and file transfer are established between the two computers. The BASIC programming languages used are Applesoft (Apple personal computer) and PC BASIC (IBM personal computer).
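The kind of framed, checksummed transfer such a serial link needs can be sketched in modern terms. This is a hypothetical framing scheme for illustration (length header plus XOR checksum), not the original BASIC listing:

```python
def frame(payload: bytes) -> bytes:
    """Wrap a payload with a 2-byte length header and a trailing XOR
    checksum, mimicking a simple serial file-transfer protocol."""
    checksum = 0
    for b in payload:
        checksum ^= b
    return len(payload).to_bytes(2, "big") + payload + bytes([checksum])

def unframe(packet: bytes) -> bytes:
    """Validate and strip the framing added by frame()."""
    length = int.from_bytes(packet[:2], "big")
    payload, checksum = packet[2:2 + length], packet[2 + length]
    actual = 0
    for b in payload:
        actual ^= b
    if actual != checksum:
        raise ValueError("checksum mismatch")
    return payload
```

A receiver that round-trips `unframe(frame(data))` recovers the original bytes; a corrupted byte changes the XOR and raises an error, which is the cue to request retransmission.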
Geiss, Karla; Meyer, Martin
2013-09-01
Standardized mortality ratios and standardized incidence ratios are widely used in cohort studies to compare mortality or incidence in a study population to that in the general population on an age-time-specific basis, but their computation is not included in standard statistical software packages. Here we present a user-friendly Microsoft Windows program for computing standardized mortality ratios and standardized incidence ratios based on calculation of exact person-years at risk stratified by sex, age, and calendar time. The program offers flexible import of different file formats for input data and easy handling of general population reference rate tables, such as mortality or incidence tables exported from cancer registry databases. The application of the program is illustrated with two examples using empirical data from the Bavarian Cancer Registry. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
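The core calculation described — observed events divided by the events expected if stratum-specific reference rates applied to the cohort's person-years — can be sketched as follows. The strata keys and values here are invented for illustration; the cited program additionally computes exact person-years from individual follow-up dates:

```python
def smr(observed_events, person_years, reference_rates):
    """Standardized mortality/incidence ratio: observed events divided
    by expected events, where expected = sum over strata (keyed by
    e.g. sex, age band, calendar period) of person-years times the
    general-population reference rate for that stratum."""
    expected = sum(person_years[s] * reference_rates[s] for s in person_years)
    return observed_events / expected
```

For example, a cohort with 1000 person-years in one stratum (rate 0.01/year) and 2000 in another (rate 0.005/year) has 20 expected events; 30 observed events give an SMR of 1.5, i.e. 50% excess mortality relative to the reference population.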
A FORTRAN program for multivariate survival analysis on the personal computer.
Mulder, P G
1988-01-01
In this paper a FORTRAN program is presented for multivariate survival or life table regression analysis in a competing-risks situation. The relevant failure rate (for example, a particular disease or mortality rate) is modelled as a log-linear function of a vector of (possibly time-dependent) explanatory variables. The explanatory variables may also include the variable time itself, which is useful for parameterizing piecewise exponential time-to-failure distributions in a Gompertz-like or Weibull-like way as a more efficient alternative to Cox's proportional hazards model. Maximum likelihood estimates of the coefficients of the log-linear relationship are obtained from the iterative Newton-Raphson method. The program runs on a personal computer under DOS; running time is quite acceptable, even for large samples.
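The Newton-Raphson estimation the abstract mentions can be sketched for the simplest case, an intercept-only log-linear hazard λ = exp(β) with total exposure T and d observed events, where the log-likelihood is ll(β) = dβ − T·exp(β). This is a minimal illustration of the method, not the multivariate FORTRAN program itself:

```python
import math

def fit_log_rate(events: int, exposure: float, iters: int = 25) -> float:
    """Newton-Raphson MLE of beta in the intercept-only log-linear
    hazard model lambda = exp(beta), maximizing
    ll(beta) = events*beta - exposure*exp(beta)."""
    beta = 0.0
    for _ in range(iters):
        lam = math.exp(beta)
        score = events - exposure * lam   # first derivative of ll
        hessian = -exposure * lam         # second derivative of ll
        beta -= score / hessian           # Newton step
    return beta
```

For this one-parameter case the closed form is β̂ = log(d/T), so the iteration can be checked directly; the multivariate version replaces the scalar score and Hessian with a gradient vector and Hessian matrix over all coefficients.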
Public-Private Partnerships in Cloud-Computing Services in the Context of Genomic Research.
Granados Moreno, Palmira; Joly, Yann; Knoppers, Bartha Maria
2017-01-01
Public-private partnerships (PPPs) have been increasingly used to spur and facilitate innovation in a number of fields. In healthcare, the purpose of using a PPP is commonly to develop and/or provide vaccines and drugs against communicable diseases, mainly in developing or underdeveloped countries. With the advancement of technology and of the area of genomics, these partnerships also focus on large-scale genomic research projects that aim to advance the understanding of diseases that have a genetic component and to develop personalized treatments. This new focus has created new forms of PPPs that involve information technology companies, which provide computing infrastructure and services to store, analyze, and share the massive amounts of data genomic-related projects produce. In this article, we explore models of PPPs proposed to handle, protect, and share the genomic data collected and to further develop genomic-based medical products. We also identify the reasons that make these models suitable and the challenges they have yet to overcome. To achieve this, we describe the details and complexities of MSSNG, the International Cancer Genome Consortium, and the 100,000 Genomes Project, the three PPPs that focus on large-scale genomic research to better understand the genetic components of autism, cancer, rare diseases, and infectious diseases with the intention to find appropriate treatments. Organized as PPPs and employing cloud-computing services, the three projects have advanced quickly and are likely to be important sources of research and development for future personalized medicine. However, there still are unresolved matters relating to conflicts of interest, commercialization, and data control.
Learning from the challenges encountered by past PPPs allowed us to establish that developing guidelines to adequately manage personal health information stored in clouds and ensuring the protection of data integrity and privacy would be critical steps in the development of future PPPs.
Machine learning-based method for personalized and cost-effective detection of Alzheimer's disease.
Escudero, Javier; Ifeachor, Emmanuel; Zajicek, John P; Green, Colin; Shearer, James; Pearson, Stephen
2013-01-01
Diagnosis of Alzheimer's disease (AD) is often difficult, especially early in the disease process at the stage of mild cognitive impairment (MCI). Yet, it is at this stage that treatment is most likely to be effective, so there would be great advantages in improving the diagnosis process. We describe and test a machine learning approach for personalized and cost-effective diagnosis of AD. It uses locally weighted learning to tailor a classifier model to each patient and computes the sequence of biomarkers most informative or cost-effective to diagnose patients. Using ADNI data, we classified AD versus controls and MCI patients who progressed to AD within a year, against those who did not. The approach performed similarly to considering all data at once, while significantly reducing the number (and cost) of the biomarkers needed to achieve a confident diagnosis for each patient. Thus, it may contribute to a personalized and effective detection of AD, and may prove useful in clinical settings.
Spectral analysis method and sample generation for real time visualization of speech
NASA Astrophysics Data System (ADS)
Hobohm, Klaus
A method for translating speech signals into optical models, characterized by high sound discrimination and learnability and designed to provide deaf persons with feedback for controlling their way of speaking, is presented. Important properties of speech production and perception processes, and of the organs involved in these mechanisms, are recalled in order to define requirements for speech visualization. It is established that the spectral representation of the time, frequency, and amplitude resolution of hearing must be faithful, and that continuous variations of the acoustic parameters of the speech signal must be depicted by continuous variation of the images. A color table was developed for dynamic illustration, and sonograms were generated with five spectral analysis methods such as Fourier transformations and linear prediction coding. For evaluating sonogram quality, test persons had to recognize consonant-vowel-consonant words, and an optimized analysis method was achieved with a fast Fourier transformation and a postprocessor. A hardware concept for a real time speech visualization system, based on multiprocessor technology in a personal computer, is presented.
Principles of three-dimensional printing and clinical applications within the abdomen and pelvis.
Bastawrous, Sarah; Wake, Nicole; Levin, Dmitry; Ripley, Beth
2018-04-04
Improvements in technology and reduction in costs have led to widespread interest in three-dimensional (3D) printing. 3D-printed anatomical models contribute to personalized medicine, surgical planning, and education across medical specialties, and these models are rapidly changing the landscape of clinical practice. A physical object that can be held in one's hands allows for significant advantages over standard two-dimensional (2D) or even 3D computer-based virtual models. Radiologists have the potential to play a significant role as consultants and educators across all specialties by providing 3D-printed models that enhance clinical care. This article reviews the basics of 3D printing, including how models are created from imaging data, clinical applications of 3D printing within the abdomen and pelvis, implications for education and training, limitations, and future directions.
Visual Privacy by Context: Proposal and Evaluation of a Level-Based Visualisation Scheme
Padilla-López, José Ramón; Chaaraoui, Alexandros Andre; Gu, Feng; Flórez-Revuelta, Francisco
2015-01-01
Privacy in image and video data has become an important subject since cameras are being installed in an increasing number of public and private spaces. Specifically, in assisted living, intelligent monitoring based on computer vision can allow one to provide risk detection and support services that increase people's autonomy at home. In the present work, a level-based visualisation scheme is proposed to provide visual privacy when human intervention is necessary, such as at telerehabilitation and safety assessment applications. Visualisation levels are dynamically selected based on the previously modelled context. In this way, different levels of protection can be provided, maintaining the necessary intelligibility required for the applications. Furthermore, a case study of a living room, where a top-view camera is installed, is presented. Finally, the performed survey-based evaluation indicates the degree of protection provided by the different visualisation models, as well as the personal privacy preferences and valuations of the users. PMID:26053746
Enhancing collaborative filtering by user interest expansion via personalized ranking.
Liu, Qi; Chen, Enhong; Xiong, Hui; Ding, Chris H Q; Chen, Jian
2012-02-01
Recommender systems suggest a few items from many possible choices to the users by understanding their past behaviors. In these systems, the user behaviors are influenced by the hidden interests of the users. Learning to leverage the information about user interests is often critical for making better recommendations. However, existing collaborative-filtering-based recommender systems are usually focused on exploiting the information about the user's interaction with the systems; the information about latent user interests is largely underexplored. To that end, inspired by topic models, in this paper, we propose a novel collaborative-filtering-based recommender system by user interest expansion via personalized ranking, named iExpand. The goal is to build an item-oriented model-based collaborative-filtering framework. The iExpand method introduces a three-layer, user-interests-item, representation scheme, which leads to more accurate ranking recommendation results with less computation cost and helps the understanding of the interactions among users, items, and user interests. Moreover, iExpand strategically deals with many issues that exist in traditional collaborative-filtering approaches, such as the overspecialization problem and the cold-start problem. Finally, we evaluate iExpand on three benchmark data sets, and experimental results show that iExpand can lead to better ranking performance than state-of-the-art methods with a significant margin.
NASA Technical Reports Server (NTRS)
Yates, Leslie A.
1992-01-01
Software for an automated film-reading system that uses personal computers and digitized shadowgraphs is described. The software identifies pixels associated with fiducial-line and model images, and least-squares procedures are used to calculate the positions and orientations of the images. Automated position and orientation readings for sphere and cone models are compared to those obtained using a manual film reader. When facility calibration errors are removed from these readings, the accuracy of the automated readings is better than the pixel resolution, and it is equal to, or better than, the manual readings. The effects of film-reading and facility-calibration errors on calculated aerodynamic coefficients are discussed.
Thermospheric dynamics - A system theory approach
NASA Technical Reports Server (NTRS)
Codrescu, M.; Forbes, J. M.; Roble, R. G.
1990-01-01
A system theory approach to thermospheric modeling is developed, based upon a linearization method which is capable of preserving nonlinear features of a dynamical system. The method is tested using a large, nonlinear, time-varying system, namely the thermospheric general circulation model (TGCM) of the National Center for Atmospheric Research. In the linearized version an equivalent system, defined for one of the desired TGCM output variables, is characterized by a set of response functions that is constructed from corresponding quasi-steady state and unit sample response functions. The linearized version of the system runs on a personal computer and produces an approximation of the desired TGCM output field height profile at a given geographic location.
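The linearized surrogate described — characterizing one TGCM output by response functions and applying them to an input history — amounts, for a linear time-invariant approximation, to discrete convolution of the input with a unit sample (impulse) response. The following is a generic sketch of that operation, not the TGCM-specific code:

```python
def linear_response(input_sequence, unit_sample_response):
    """Approximate a system's output as the discrete convolution of an
    input sequence with its unit sample (impulse) response, the core
    operation behind a linearized surrogate of a dynamical model."""
    n, m = len(input_sequence), len(unit_sample_response)
    return [
        sum(input_sequence[k] * unit_sample_response[t - k]
            for k in range(max(0, t - m + 1), min(t + 1, n)))
        for t in range(n + m - 1)
    ]
```

Feeding in a unit impulse `[1, 0, 0]` simply reproduces the response function, which is the defining property; for a general forcing history the output is the superposition of delayed, scaled copies of that response.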
correlcalc: Two-point correlation function from redshift surveys
NASA Astrophysics Data System (ADS)
Rohin, Yeluripati
2017-11-01
correlcalc calculates the two-point correlation function (2pCF) of galaxies/quasars using redshift surveys. It can be used for any assumed geometry or cosmology model. Using BallTree algorithms to reduce the computational effort for large datasets, it is a parallelised code suitable for running on clusters as well as personal computers. It takes redshift (z), Right Ascension (RA), and Declination (DEC) data of galaxies and random catalogs as inputs in the form of ASCII or FITS files. If a random catalog is not provided, it generates one of the desired size based on the input redshift distribution and a mangle polygon file (in .ply format) describing the survey geometry. It also calculates different realisations of the (3D) anisotropic 2pCF. Optionally it makes HEALPix maps of the survey, providing visualization.
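The estimator at the heart of such codes combines data-data (DD), data-random (DR), and random-random (RR) pair counts; the widely used Landy-Szalay form is ξ = (DD̃ − 2DR̃ + RR̃)/RR̃ with normalized counts. The sketch below uses 1-D positions and brute-force counting for brevity, in place of correlcalc's 3-D, BallTree-accelerated implementation:

```python
from itertools import combinations

def pair_count(points, r_lo, r_hi):
    """Count unordered pairs with separation in [r_lo, r_hi)."""
    return sum(1 for a, b in combinations(points, 2)
               if r_lo <= abs(a - b) < r_hi)

def cross_count(pts_a, pts_b, r_lo, r_hi):
    """Count ordered cross pairs with separation in [r_lo, r_hi)."""
    return sum(1 for a in pts_a for b in pts_b
               if r_lo <= abs(a - b) < r_hi)

def landy_szalay(data, rand, r_lo, r_hi):
    """Landy-Szalay estimator xi = (DD~ - 2 DR~ + RR~) / RR~, with
    each count normalised by its total number of pairs."""
    nd, nr = len(data), len(rand)
    dd = pair_count(data, r_lo, r_hi) / (nd * (nd - 1) / 2)
    rr = pair_count(rand, r_lo, r_hi) / (nr * (nr - 1) / 2)
    dr = cross_count(data, rand, r_lo, r_hi) / (nd * nr)
    return (dd - 2 * dr + rr) / rr
```

A positive ξ in a separation bin indicates an excess of data pairs at that scale relative to the random (unclustered) catalog; production codes replace the O(N²) loops with tree-based neighbor searches.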
Banjar, Haneen; Adelson, David; Brown, Fred; Chaudhri, Naeem
2017-01-01
The use of intelligent techniques in medicine has brought a ray of hope in terms of treating leukaemia patients. Personalized treatment uses patient's genetic profile to select a mode of treatment. This process makes use of molecular technology and machine learning, to determine the most suitable approach to treating a leukaemia patient. Until now, no reviews have been published from a computational perspective concerning the development of personalized medicine intelligent techniques for leukaemia patients using molecular data analysis. This review studies the published empirical research on personalized medicine in leukaemia and synthesizes findings across studies related to intelligence techniques in leukaemia, with specific attention to particular categories of these studies to help identify opportunities for further research into personalized medicine support systems in chronic myeloid leukaemia. A systematic search was carried out to identify studies using intelligence techniques in leukaemia and to categorize these studies based on leukaemia type and also the task, data source, and purpose of the studies. Most studies used molecular data analysis for personalized medicine, but future advancement for leukaemia patients requires molecular models that use advanced machine-learning methods to automate decision-making in treatment management to deliver supportive medical information to the patient in clinical practice. PMID:28812013
Bringing computational models of bone regeneration to the clinic.
Carlier, Aurélie; Geris, Liesbet; Lammens, Johan; Van Oosterwyck, Hans
2015-01-01
Although the field of bone regeneration has experienced great advancements in the last decades, integrating all the relevant, patient-specific information into a personalized diagnosis and optimal treatment remains a challenging task due to the large number of variables that affect bone regeneration. Computational models have the potential to cope with this complexity, to improve the fundamental understanding of bone regeneration processes, and to predict and optimize patient-specific treatment strategies. However, the current use of computational models in daily orthopedic practice is very limited or nonexistent. We have identified three key hurdles that limit the translation of computational models of bone regeneration from bench to bedside. First, there is a clear mismatch between the scope of existing models and what is clinically required. Second, most computational models are confronted with limited quantitative information of insufficient quality, hampering the determination of patient-specific parameter values. Third, current computational models are only corroborated with animal models, whereas a thorough (retrospective and prospective) assessment of the computational model will be crucial to convince health care providers of its capabilities. These challenges must be addressed so that computational models of bone regeneration can reach their true potential, resulting in the advancement of individualized care and reduction of the associated health care costs. © 2015 Wiley Periodicals, Inc.
ERIC Educational Resources Information Center
Williams, Fred D.
An adventure game is a role-playing game that usually, but not always, has some fantasy aspect. The role-playing aspect is the key element because players become personally involved when they assume a role, and defeat becomes personal and less acceptable than in other types of games. Computer-based role-playing games are extremely popular because…
Computer Based Language Training: A Conversation with Duane M. Rumbaugh and Mary Ann Romski.
ERIC Educational Resources Information Center
Thomas, M. Angele
1981-01-01
An interview with Duane Rumbaugh and Mary Ann Romski, researchers on the use of alternative communication systems for severely and profoundly retarded persons, focuses on applications from their primate research and the use of a computerized keyboard system to investigate language acquisition in severely retarded persons. (CL)
Lifelong Learning for the 21st Century.
ERIC Educational Resources Information Center
Goodnight, Ron
The Lifelong Learning Center for the 21st Century was proposed to provide personal renewal and technical training for employees at a major United States automotive manufacturing company when it implemented a new, computer-based Computer Numerical Controlled (CNC) machining, robotics, and high technology facility. The employees needed training for…
Factors Influencing Trainee Participation in Computer Software Applications Training.
ERIC Educational Resources Information Center
Alexander, Melody Webler
1993-01-01
Participants (n=130) who had completed training in WordPerfect, Lotus 1-2-3, and dBase III+ completed a questionnaire related to demographic characteristics and factors that influence training participation. Trainees are participating in computer training for personal reasons, seeking convenient time, location, and length. Child care or…
ERIC Educational Resources Information Center
Mizell, Al P.; Centini, Barry M.
The role of telecommunications in establishing the electronic classroom in distance education is illustrated. Using a computer-based doctoral program and the UNIX operating system as an example, how a personal computer and modem may be combined with a telephone line for instructional delivery is described. A number of issues must be addressed in…
75 FR 21630 - Proposed Data Collections Submitted for Public Comment and Recommendations
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-26
... participate in the second phase. Quantitative surveys will be administered by computers and personal... administer a survey, conduct interviews and offer HIV rapid testing in Black Men who have sex with Men (BMSM...-minute eligibility screening interview. The baseline computer-based survey will take 45 minutes. The...
Real-World Physics: A Portable MBL for Field Measurements.
ERIC Educational Resources Information Center
Albergotti, Clifton
1994-01-01
Uses a moderately priced digital multimeter that has output and software compatible with personal computers to make a portable, computer-based data-acquisition system. The system can measure voltage, current, frequency, capacitance, transistor hFE, and temperature. Describes field measures of velocity, acceleration, and temperature as function of…
Are Opinions Based on Science: Modelling Social Response to Scientific Facts
Iñiguez, Gerardo; Tagüeña-Martínez, Julia; Kaski, Kimmo K.; Barrio, Rafael A.
2012-01-01
As scientists we like to think that modern societies and their members base their views, opinions and behaviour on scientific facts. This is not necessarily the case, even though we are all (over-) exposed to information flow through various channels of media, i.e. newspapers, television, radio, internet, and web. It is thought that this is mainly due to the conflicting information on the mass media and to the individual attitude (formed by cultural, educational and environmental factors), that is, one external factor and another personal factor. In this paper we will investigate the dynamical development of opinion in a small population of agents by means of a computational model of opinion formation in a co-evolving network of socially linked agents. The personal and external factors are taken into account by assigning an individual attitude parameter to each agent, and by subjecting all to an external but homogeneous field to simulate the effect of the media. We then adjust the field strength in the model by using actual data on scientific perception surveys carried out in two different populations, which allow us to compare two different societies. We interpret the model findings with the aid of simple mean field calculations. Our results suggest that scientifically sound concepts are more difficult to acquire than concepts not validated by science, since opposing individuals organize themselves in close communities that prevent opinion consensus. PMID:22905117
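The three ingredients of the model, an individual attitude parameter, the opinions of socially linked neighbors, and a homogeneous external media field, can be sketched in a few lines. The update rule and the blend weights below are invented for illustration and are not the authors' actual equations:

```python
def step(opinions, neighbors, attitude, field, lr=0.1):
    """One synchronous update of a toy opinion-formation model.

    Each agent moves toward a blend of (i) its own fixed attitude,
    (ii) the mean opinion of its social neighbors, and (iii) an external
    media field. The blend weights (0.3/0.5/0.2) are arbitrary.
    """
    new = []
    for i, x in enumerate(opinions):
        nbrs = neighbors[i]
        social = sum(opinions[j] for j in nbrs) / len(nbrs) if nbrs else x
        target = 0.3 * attitude[i] + 0.5 * social + 0.2 * field
        new.append(x + lr * (target - x))
    return new

# Four agents on a ring, neutral attitudes, a weak positive media field.
neighbors = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
opinions = [-0.5, 0.2, 0.4, -0.1]
attitude = [0.0, 0.0, 0.0, 0.0]
for _ in range(500):
    opinions = step(opinions, neighbors, attitude, field=0.3)
# With neutral attitudes the agents settle at the consensus fixed point
# x = (0.2 * 0.3) / 0.5 = 0.12, i.e. the field alone sets the consensus.
```

Raising the attitude weight relative to the social weight is one simple way to reproduce the paper's qualitative finding that strongly opinionated individuals can block consensus.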
Automatic glaucoma diagnosis through medical imaging informatics.
Liu, Jiang; Zhang, Zhuo; Wong, Damon Wing Kee; Xu, Yanwu; Yin, Fengshou; Cheng, Jun; Tan, Ngan Meng; Kwoh, Chee Keong; Xu, Dong; Tham, Yih Chung; Aung, Tin; Wong, Tien Yin
2013-01-01
Computer-aided diagnosis for screening utilizes computer-based analytical methodologies to process patient information. Glaucoma is the leading irreversible cause of blindness. Due to the lack of an effective and standard screening practice, more than 50% of cases are undiagnosed, which prevents early treatment of the disease. This study aimed to design an automatic glaucoma diagnosis architecture, automatic glaucoma diagnosis through medical imaging informatics (AGLAIA-MII), that combines patient personal data, medical retinal fundus images, and patient genome information for screening. 2258 cases from a population study were used to evaluate the screening software. These cases were attributed with patient personal data, retinal images and quality controlled genome data. Utilizing a multiple kernel learning-based classifier, AGLAIA-MII combined patient personal data, major image features, and important genome single nucleotide polymorphism (SNP) features. Receiver operating characteristic curves were plotted to compare AGLAIA-MII's performance with classifiers using patient personal data, images, and genome SNPs separately. AGLAIA-MII was able to achieve an area under the curve of 0.866, better than the 0.551, 0.722 and 0.810 achieved by the individual personal data, image and genome information components, respectively. AGLAIA-MII also demonstrated a substantial improvement over the current glaucoma screening approach based on intraocular pressure. AGLAIA-MII demonstrates for the first time the capability of integrating patients' personal data, medical retinal images and genome information for automatic glaucoma diagnosis and screening in a large dataset from a population study. It paves the way for a holistic approach to automatic objective glaucoma diagnosis and screening.
Computational Psychometrics in Communication and Implications in Decision Making.
Cipresso, Pietro; Villani, Daniela; Repetto, Claudia; Bosone, Lucia; Balgera, Anna; Mauri, Maurizio; Villamira, Marco; Antonietti, Alessandro; Riva, Giuseppe
2015-01-01
Recent investigations emphasized the role of communication features on behavioral trust and reciprocity in economic decision making but no studies have been focused on the effect of communication on affective states in such a context. Thanks to advanced methods of computational psychometrics, in this study, affective states were deeply examined using simultaneous and synchronized recordings of gazes and psychophysiological signals in 28 female students during an investment game. Results showed that participants experienced different affective states according to the type of communication (personal versus impersonal). In particular, participants involved in personal communication felt more relaxed than participants involved in impersonal communication. Moreover, personal communication influenced reciprocity and participants' perceptions about trust and reciprocity. Findings were interpreted in the light of the Arousal/Valence Model and self-disclosure process. PMID:26339285
The Rise of K-12 Blended Learning: Profiles of Emerging Models
ERIC Educational Resources Information Center
Staker, Heather
2011-01-01
Some innovations change everything. The rise of personal computers in the 1970s decimated the mini-computer industry. TurboTax forever changed tax accounting, and MP3s made libraries of compact discs obsolete. These innovations bear the traits of what Harvard Business School Professor Clayton M. Christensen terms a "disruptive innovation."…
System capacity and economic modeling computer tool for satellite mobile communications systems
NASA Technical Reports Server (NTRS)
Wiedeman, Robert A.; Wen, Doong; Mccracken, Albert G.
1988-01-01
A unique computer modeling tool that combines an engineering tool with a financial analysis program is described. The resulting combination yields a flexible economic model that can predict the cost effectiveness of various mobile systems. Cost modeling is necessary in order to ascertain if a given system with a finite satellite resource is capable of supporting itself financially and to determine what services can be supported. Personal computer techniques using Lotus 1-2-3 are used for the model in order to provide as universal an application as possible such that the model can be used and modified to fit many situations and conditions. The output of the engineering portion of the model consists of a channel capacity analysis and link calculations for several qualities of service using up to 16 types of earth terminal configurations. The outputs of the financial model are a revenue analysis, an income statement, and a cost model validation section.
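At its simplest, the engineering-to-financial coupling the abstract describes reduces to a capacity figure feeding a revenue projection. A deliberately toy sketch of that coupling, with all names and numbers invented:

```python
def breakeven_years(capex, opex_per_year, channels, utilization,
                    price_per_channel_year):
    """Toy cost-model step: a channel-capacity figure from the
    'engineering' side feeds an annual revenue projection on the
    'financial' side. Returns the years of operating margin needed to
    repay the up-front cost, or None if the system never pays for itself."""
    revenue = channels * utilization * price_per_channel_year
    margin = revenue - opex_per_year
    return capex / margin if margin > 0 else None

# Invented figures: $100M satellite, $10M/yr operations, 5000 channels
# sold at 60% utilization for $10k per channel-year.
years = breakeven_years(100e6, 10e6, 5000, 0.6, 10e3)  # -> 5.0
```

The spreadsheet version would carry the same quantities through linked worksheets rather than function arguments; the dependency structure is the point, not the arithmetic.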
Video-based depression detection using local Curvelet binary patterns in pairwise orthogonal planes.
Pampouchidou, Anastasia; Marias, Kostas; Tsiknakis, Manolis; Simos, Panagiotis; Fan Yang; Lemaitre, Guillaume; Meriaudeau, Fabrice
2016-08-01
Depression is an increasingly prevalent mood disorder. This is the reason why the field of computer-based depression assessment has been gaining the attention of the research community during the past couple of years. The present work proposes two algorithms for depression detection, one Frame-based and the second Video-based, both employing Curvelet transform and Local Binary Patterns. The main advantage of these methods is that they have significantly lower computational requirements, as the extracted features are of very low dimensionality. This is achieved by modifying the previously proposed algorithm which considers Three-Orthogonal-Planes, to only Pairwise-Orthogonal-Planes. Performance of the algorithms was tested on the benchmark dataset provided by the Audio/Visual Emotion Challenge 2014, with the person-specific system achieving 97.6% classification accuracy, and the person-independent one yielding promising preliminary results of 74.5% accuracy.
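The core descriptor, Local Binary Pattern histograms pooled over orthogonal planes of a video volume, can be sketched directly. The Curvelet transform stage is omitted here, and only single central slices are used, so this is an illustrative stand-in rather than the paper's pipeline:

```python
import numpy as np

def lbp_codes(img):
    """Basic 3x3 Local Binary Pattern: each interior pixel gets an
    8-bit code by thresholding its 8 neighbors against the center."""
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = img.shape
    center = img[1:-1, 1:-1]
    codes = np.zeros((h - 2, w - 2), dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        nbr = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        codes |= (nbr >= center).astype(np.uint8) << bit
    return codes

def lbp_histogram(img):
    """256-bin histogram of LBP codes: the per-plane feature vector."""
    return np.bincount(lbp_codes(img).ravel(), minlength=256)

def lbp_pop_features(volume):
    """Concatenate LBP histograms from one spatial (XY) and two
    spatio-temporal (XT, YT) slices of a (T, H, W) video volume,
    a toy stand-in for the pairwise-orthogonal-planes descriptor."""
    t, h, w = volume.shape
    xy = lbp_histogram(volume[t // 2])        # one frame
    xt = lbp_histogram(volume[:, h // 2, :])  # fixed row over time
    yt = lbp_histogram(volume[:, :, w // 2])  # fixed column over time
    return np.concatenate([xy, xt, yt])
```

The dimensionality argument in the abstract is visible here: three 256-bin histograms per region, independent of frame count.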
Desktop Computing Integration Project
NASA Technical Reports Server (NTRS)
Tureman, Robert L., Jr.
1992-01-01
The Desktop Computing Integration Project for the Human Resources Management Division (HRMD) of LaRC was designed to help division personnel use personal computing resources to perform job tasks. The three goals of the project were to involve HRMD personnel in desktop computing, link mainframe data to desktop capabilities, and to estimate training needs for the division. The project resulted in increased usage of personal computers by Awards specialists, an increased awareness of LaRC resources to help perform tasks, and personal computer output that was used in presentation of information to center personnel. In addition, the necessary skills for HRMD personal computer users were identified. The Awards Office was chosen for the project because of the consistency of their data requests and the desire of employees in that area to use the personal computer.
Chan, Steven; Hwang, Tiffany; Wong, Alice; Bauer, Amy M.
2017-01-01
Mobile health (mHealth), telemedicine and other technology-based services facilitate mental health service delivery and may be considered part of an e-mental health (eMH) spectrum of care. Web- and Internet-based resources provide a great opportunity for the public, patients, healthcare providers and others to improve wellness, practice prevention and reduce suffering from illnesses. Mobile apps offer portability for access anytime/anywhere, are inexpensive compared with traditional desktop computers, and have additional features (e.g., context-aware interventions and sensors with real-time feedback). This paper discusses mobile mental health (mMH) options, as part of a broader framework of eMH options. The evidence-based literature shows that many people have an openness to technology as a way to help themselves, change behaviors and engage additional clinical services. Studies show that traditional video-based synchronous telepsychiatry (TP) is as good as in-person service, but mHealth outcomes have rarely been directly compared to in-person and other eMH care options. Similarly, technology options added to in-person care or combined with others have not been evaluated nor linked with specific goals and desired outcomes. Skills and competencies for clinicians are needed for mHealth, social media and other new technologies in the eMH spectrum, in addition to research by randomized trials and study of health service delivery models with an emphasis on effectiveness. PMID:28894744
An enhanced mobile-healthcare emergency system based on extended chaotic maps.
Lee, Cheng-Chi; Hsu, Che-Wei; Lai, Yan-Ming; Vasilakos, Athanasios
2013-10-01
Mobile Healthcare (m-Healthcare) systems, namely smartphone applications of pervasive computing that utilize wireless body sensor networks (BSNs), have recently been proposed to provide smartphone users with health monitoring services and have received great attention. An m-Healthcare system with flaws, however, may leak the smartphone user's personal information and cause security, privacy preservation, or user anonymity problems. In 2012, Lu et al. proposed a secure and privacy-preserving opportunistic computing (SPOC) framework for mobile-Healthcare emergency. The brilliant SPOC framework can opportunistically gather resources on the smartphone, such as computing power and energy, to process the computing-intensive personal health information (PHI) in case of an m-Healthcare emergency with minimal privacy disclosure. To balance the hazard of PHI privacy disclosure against the necessity of PHI processing and transmission in an m-Healthcare emergency, in their SPOC framework, Lu et al. introduced an efficient user-centric privacy access control system built on an attribute-based access control mechanism and a new privacy-preserving scalar product computation (PPSPC) technique. However, we found that Lu et al.'s protocol still has security flaws in user anonymity and mutual authentication. To fix those problems and further enhance the computation efficiency of Lu et al.'s protocol, in this article, the authors present an improved mobile-Healthcare emergency system based on extended chaotic maps. The new system is capable of not only providing flawless user anonymity and mutual authentication but also reducing the computation cost.
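The "extended chaotic maps" in protocols of this family are typically Chebyshev polynomials, whose semigroup property T_r(T_s(x)) = T_rs(x) plays the role that modular exponentiation plays in Diffie-Hellman key agreement. A numeric sketch of that property (real protocols evaluate an enhanced variant over a finite field rather than in floating point, precisely to avoid the numeric issues this toy version has):

```python
import math

def chebyshev(n, x):
    """Chebyshev polynomial on [-1, 1]: T_n(x) = cos(n * arccos(x))."""
    return math.cos(n * math.acos(x))

# Semigroup property: composing with secret integers r and s commutes,
# so both parties derive the same value, Diffie-Hellman style.
x = 0.53          # public base point
r, s = 7, 11      # private keys (toy sizes)
alice_key = chebyshev(r, chebyshev(s, x))   # Alice applies r to Bob's T_s(x)
bob_key = chebyshev(s, chebyshev(r, x))     # Bob applies s to Alice's T_r(x)
```

Both values equal T_77(0.53) up to rounding; the security of practical schemes rests on the hardness of recovering n from T_n(x) in the finite-field setting.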
A PC based fault diagnosis expert system
NASA Technical Reports Server (NTRS)
Marsh, Christopher A.
1990-01-01
The Integrated Status Assessment (ISA) prototype expert system performs system level fault diagnosis using rules and models created by the user. The ISA evolved from concepts to a stand-alone demonstration prototype using OPS5 on a LISP Machine. The LISP based prototype was rewritten in C and the C Language Integrated Production System (CLIPS) to run on a Personal Computer (PC) and a graphics workstation. The ISA prototype has been used to demonstrate fault diagnosis functions of Space Station Freedom's Operation Management System (OMS). This paper describes the development of the ISA prototype from early concepts to the current PC/workstation version used today and describes future areas of development for the prototype.
NASA Technical Reports Server (NTRS)
Gupta, K. K.
1997-01-01
A multidisciplinary, finite element-based, highly graphics-oriented, linear and nonlinear analysis capability that includes such disciplines as structures, heat transfer, linear aerodynamics, computational fluid dynamics, and controls engineering has been achieved by integrating several new modules in the original STARS (STructural Analysis RoutineS) computer program. Each individual analysis module is general-purpose in nature and is effectively integrated to yield aeroelastic and aeroservoelastic solutions of complex engineering problems. Examples of advanced NASA Dryden Flight Research Center projects analyzed by the code in recent years include the X-29A, F-18 High Alpha Research Vehicle/Thrust Vectoring Control System, B-52/Pegasus Generic Hypersonics, National AeroSpace Plane (NASP), SR-71/Hypersonic Launch Vehicle, and High Speed Civil Transport (HSCT) projects. Extensive graphics capabilities exist for convenient model development and postprocessing of analysis results. The program is written in modular form in standard FORTRAN language to run on a variety of computers, such as the IBM RISC/6000, SGI, DEC, Cray, and personal computer; associated graphics codes use OpenGL and IBM/graPHIGS language for color depiction. This program is available from COSMIC, the NASA agency for distribution of computer programs.
Three-dimensional (3D) printing and its applications for aortic diseases.
Hangge, Patrick; Pershad, Yash; Witting, Avery A; Albadawi, Hassan; Oklu, Rahmi
2018-04-01
Three-dimensional (3D) printing is a process which generates prototypes from virtual objects in computer-aided design (CAD) software. Since 3D printing enables the creation of customized objects, it is a rapidly expanding field in an age of personalized medicine. We discuss the use of 3D printing in surgical planning, training, and creation of devices for the treatment of aortic diseases. 3D printing can provide operators with a hands-on model to interact with complex anatomy, enable prototyping of devices for implantation based upon anatomy, or even provide pre-procedural simulation. Potential exists to expand upon current uses of 3D printing to create personalized implantable devices such as grafts. Future studies should aim to demonstrate the impact of 3D printing on outcomes to make this technology more accessible to patients with complex aortic diseases.
NASA Astrophysics Data System (ADS)
Qin, Cheng-Zhi; Zhan, Lijun
2012-06-01
As one of the important tasks in digital terrain analysis, the calculation of flow accumulations from gridded digital elevation models (DEMs) usually involves two steps in a real application: (1) using an iterative DEM preprocessing algorithm to remove the depressions and flat areas commonly contained in real DEMs, and (2) using a recursive flow-direction algorithm to calculate the flow accumulation for every cell in the DEM. Because both algorithms are computationally intensive, quick calculation of the flow accumulations from a DEM (especially for a large area) presents a practical challenge to personal computer (PC) users. In recent years, rapid increases in hardware capacity of the graphics processing units (GPUs) provided in modern PCs have made it possible to meet this challenge in a PC environment. Parallel computing on GPUs using a compute-unified-device-architecture (CUDA) programming model has been explored to speed up the execution of the single-flow-direction algorithm (SFD). However, the parallel implementation on a GPU of the multiple-flow-direction (MFD) algorithm, which generally performs better than the SFD algorithm, has not been reported. Moreover, GPU-based parallelization of the DEM preprocessing step in the flow-accumulation calculations has not been addressed. This paper proposes a parallel approach to calculate flow accumulations (including both iterative DEM preprocessing and a recursive MFD algorithm) on a CUDA-compatible GPU. For the parallelization of an MFD algorithm (MFD-md), two different parallelization strategies using a GPU are explored. The first parallelization strategy, which has been used in the existing parallel SFD algorithm on GPU, has the problem of computing redundancy. Therefore, we designed a parallelization strategy based on graph theory. The application results show that the proposed parallel approach to calculate flow accumulations on a GPU performs much faster than either sequential algorithms or other parallel GPU-based algorithms based on existing parallelization strategies.
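For readers unfamiliar with the underlying computation, a serial single-flow-direction (D8) accumulation on a depression-free DEM looks like the sketch below; the MFD variant the paper parallelizes instead splits each cell's flow among all lower neighbors. Function and variable names are invented for this illustration:

```python
def flow_accumulation_d8(dem):
    """Serial D8 (single-flow-direction) accumulation on a small,
    depression-free DEM given as a list of lists. Each cell drains to
    its steepest lower 8-neighbor; a cell's accumulation counts every
    cell (itself included) whose flow passes through it."""
    h, w = len(dem), len(dem[0])
    nbrs = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
            (0, 1), (1, -1), (1, 0), (1, 1)]
    downhill = {}
    for r in range(h):
        for c in range(w):
            best, drop = None, 0.0
            for dr, dc in nbrs:
                rr, cc = r + dr, c + dc
                if 0 <= rr < h and 0 <= cc < w and dem[r][c] - dem[rr][cc] > drop:
                    best, drop = (rr, cc), dem[r][c] - dem[rr][cc]
            if best is not None:
                downhill[(r, c)] = best
    # Visit cells from highest to lowest so every upstream count is final
    # before being passed downstream (valid because flow is strictly downhill).
    acc = {(r, c): 1 for r in range(h) for c in range(w)}
    for cell in sorted(acc, key=lambda rc: dem[rc[0]][rc[1]], reverse=True):
        if cell in downhill:
            acc[downhill[cell]] += acc[cell]
    return acc
```

The elevation-ordered sweep is what makes the computation look sequential; the paper's contribution is restructuring exactly this dependency chain (for MFD, plus the preprocessing step) so it maps onto CUDA threads.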
A Neural Network Model of the Structure and Dynamics of Human Personality
ERIC Educational Resources Information Center
Read, Stephen J.; Monroe, Brian M.; Brownstein, Aaron L.; Yang, Yu; Chopra, Gurveen; Miller, Lynn C.
2010-01-01
We present a neural network model that aims to bridge the historical gap between dynamic and structural approaches to personality. The model integrates work on the structure of the trait lexicon, the neurobiology of personality, temperament, goal-based models of personality, and an evolutionary analysis of motives. It is organized in terms of two…
The Watershed Health Assessment Tools-Investigating Fisheries (WHAT-IF) is a decision-analysis modeling toolkit for personal computers that supports watershed and fisheries management. The WHAT-IF toolkit includes a relational database, help-system functions and documentation, a...
Highway Effects on Vehicle Performance
DOT National Transportation Integrated Search
2001-01-01
A user-friendly model for personal computers, "Vehicle/Highway Performance Predictor," was developed to estimate fuel consumption and exhaust emissions related to modes of vehicle operations on highways of various configurations and traffic controls ...
The impact of computer science in molecular medicine: enabling high-throughput research.
de la Iglesia, Diana; García-Remesal, Miguel; de la Calle, Guillermo; Kulikowski, Casimir; Sanz, Ferran; Maojo, Víctor
2013-01-01
The Human Genome Project and the explosion of high-throughput data have transformed the areas of molecular and personalized medicine, which are producing a wide range of studies and experimental results and providing new insights for developing medical applications. Research in many interdisciplinary fields is resulting in data repositories and computational tools that support a wide diversity of tasks: genome sequencing, genome-wide association studies, analysis of genotype-phenotype interactions, drug toxicity and side effects assessment, prediction of protein interactions and diseases, development of computational models, biomarker discovery, and many others. The authors of the present paper have developed several inventories covering tools, initiatives and studies in different computational fields related to molecular medicine: medical informatics, bioinformatics, clinical informatics and nanoinformatics. With these inventories, created by mining the scientific literature, we have carried out several reviews of these fields, providing researchers with a useful framework to locate, discover, search and integrate resources. In this paper we present an analysis of the state-of-the-art as it relates to computational resources for molecular medicine, based on results compiled in our inventories, as well as results extracted from a systematic review of the literature and other scientific media. The present review is based on the impact of their related publications and the available data and software resources for molecular medicine. It aims to provide information that can be useful to support ongoing research and work to improve diagnostics and therapeutics based on molecular-level insights.
Knijnenburg, Theo A.; Klau, Gunnar W.; Iorio, Francesco; Garnett, Mathew J.; McDermott, Ultan; Shmulevich, Ilya; Wessels, Lodewyk F. A.
2016-01-01
Mining large datasets using machine learning approaches often leads to models that are hard to interpret and not amenable to the generation of hypotheses that can be experimentally tested. We present ‘Logic Optimization for Binary Input to Continuous Output’ (LOBICO), a computational approach that infers small and easily interpretable logic models of binary input features that explain a continuous output variable. Applying LOBICO to a large cancer cell line panel, we find that logic combinations of multiple mutations are more predictive of drug response than single gene predictors. Importantly, we show that the use of the continuous information leads to robust and more accurate logic models. LOBICO implements the ability to uncover logic models around predefined operating points in terms of sensitivity and specificity. As such, it represents an important step towards practical application of interpretable logic models. PMID:27876821
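LOBICO's selection problem, finding a small Boolean formula over binary input features whose {0,1} output best fits a continuous response, can be brute-forced at toy scale. The sketch below scores single literals and pairwise AND/OR combinations by within-group sum of squares; LOBICO itself solves the same selection as an integer program and scales to larger formulas. All names are invented:

```python
from itertools import combinations

def sse(pred, y):
    """Within-group sum of squares when y is split by a {0,1} prediction."""
    groups = {0: [], 1: []}
    for p, v in zip(pred, y):
        groups[p].append(v)
    total = 0.0
    for vals in groups.values():
        if vals:
            m = sum(vals) / len(vals)
            total += sum((v - m) ** 2 for v in vals)
    return total

def fit_logic(X, y):
    """Score every literal and every pairwise AND/OR of literals
    (with negation) and return the (formula, prediction) pair whose
    binary output best separates the continuous response y."""
    literals = []
    for j in range(len(X[0])):
        literals.append((f"x{j}", [row[j] for row in X]))
        literals.append((f"~x{j}", [1 - row[j] for row in X]))
    candidates = list(literals)
    for (na, a), (nb, b) in combinations(literals, 2):
        candidates.append((f"({na} AND {nb})", [p & q for p, q in zip(a, b)]))
        candidates.append((f"({na} OR {nb})", [p | q for p, q in zip(a, b)]))
    return min(candidates, key=lambda c: sse(c[1], y))

# Binary mutation matrix and a continuous drug response that is high
# only when both mutations co-occur (toy data).
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0.1, 0.0, 0.2, 1.0]
formula, pred = fit_logic(X, y)
```

Using the continuous y (rather than thresholding it first) is the point the abstract stresses: the group means carry information that a pre-binarized response would discard.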
Model-Based Approaches for Teaching and Practicing Personality Assessment.
Blais, Mark A; Hopwood, Christopher J
2017-01-01
Psychological assessment is a complex professional skill. Competence in assessment requires an extensive knowledge of personality, neuropsychology, social behavior, and psychopathology, a background in psychometrics, familiarity with a range of multimethod tools, cognitive flexibility, skepticism, and interpersonal sensitivity. This complexity makes assessment a challenge to teach and learn, particularly as the investment of resources and time in assessment has waned in psychological training programs over the last few decades. In this article, we describe 3 conceptual models that can assist teaching and learning psychological assessments. The transtheoretical model of personality provides a personality systems-based framework for understanding how multimethod assessment data relate to major personality systems and can be combined to describe and explain complex human behavior. The quantitative psychopathology-personality trait model is an empirical model based on the hierarchical organization of individual differences. Application of this model can help students understand diagnostic comorbidity and symptom heterogeneity, focus on more meaningful high-order domains, and identify the most effective assessment tools for addressing a given question. The interpersonal situation model is rooted in interpersonal theory and can help students connect test data to here-and-now interactions with patients. We conclude by demonstrating the utility of these models using a case example.
An ontology for factors affecting tuberculosis treatment adherence behavior in sub-Saharan Africa
Ogundele, Olukunle Ayodeji; Moodley, Deshendran; Pillay, Anban W; Seebregts, Christopher J
2016-01-01
Purpose Adherence behavior is a complex phenomenon influenced by diverse personal, cultural, and socioeconomic factors that may vary between communities in different regions. Understanding the factors that influence adherence behavior is essential in predicting which individuals and communities are at risk of nonadherence. This is necessary for supporting resource allocation and intervention planning in disease control programs. Currently, there is no known concrete and unambiguous computational representation of factors that influence tuberculosis (TB) treatment adherence behavior that is useful for prediction. This study developed a computer-based conceptual model for capturing and structuring knowledge about the factors that influence TB treatment adherence behavior in sub-Saharan Africa (SSA). Methods An extensive review of existing categorization systems in the literature was used to develop a conceptual model that captured scientific knowledge about TB adherence behavior in SSA. The model was formalized as an ontology using the web ontology language. The ontology was then evaluated for its comprehensiveness and applicability in building predictive models. Conclusion The outcome of the study is a novel ontology-based approach for curating and structuring scientific knowledge of adherence behavior in patients with TB in SSA. The ontology takes an evidence-based approach by explicitly linking factors to published clinical studies. Factors are structured around five dimensions: factor type, type of effect, regional variation, cross-dependencies between factors, and treatment phase. The ontology is flexible and extendable and provides new insights into the nature of and interrelationship between factors that influence TB adherence. PMID:27175067
NASA Astrophysics Data System (ADS)
Montillo, Albert; Song, Qi; Das, Bipul; Yin, Zhye
2015-03-01
Parsing volumetric computed tomography (CT) into 10 or more salient organs simultaneously is a challenging task with many applications such as personalized scan planning and dose reporting. In the clinic, pre-scan data can come in the form of very low dose volumes acquired just prior to the primary scan or from an existing primary scan. To localize organs in such diverse data, we propose a new learning-based framework that we call hierarchical pictorial structures (HPS), which builds multiple levels of models in a tree-like hierarchy that mirrors the natural decomposition of human anatomy from gross structures to finer structures. Each node of our hierarchical model learns (1) the local appearance and shape of structures, and (2) a generative global model that learns probabilistic, structural arrangement. Our main contribution is twofold. First, we embed the pictorial structures approach in a hierarchical framework, which reduces test-time image interpretation and allows for the incorporation of additional geometric constraints that robustly guide model fitting in the presence of noise. Second, we guide our HPS framework with probabilistic cost maps extracted by random decision forests trained on volumetric 3D HOG features, which makes our model fast to train, fast to apply to novel test data, and gives it a high degree of invariance to shape distortion and imaging artifacts. All steps require approximately 3 minutes to compute, and all organs are located with suitably high accuracy for our clinical applications such as personalized scan planning for radiation dose reduction. We assess our method using a database of volumetric CT scans from 81 subjects with widely varying age and pathology and with simulated ultra-low dose cadaver pre-scan data.
Lechner, William J; MacGlashan, James; Wray, Tyler B; Littman, Michael L
2017-01-01
Background Computer-delivered interventions have been shown to be effective in reducing alcohol consumption in heavy drinking college students. However, these computer-delivered interventions rely on mouse, keyboard, or touchscreen responses for interactions between the users and the computer-delivered intervention. The principles of motivational interviewing suggest that in-person interventions may be effective, in part, because they encourage individuals to think through and speak aloud their motivations for changing a health behavior, which current computer-delivered interventions do not allow. Objective The objective of this study was to take the initial steps toward development of a voice-based computer-delivered intervention that can ask open-ended questions and respond appropriately to users’ verbal responses, more closely mirroring a human-delivered motivational intervention. Methods We developed (1) a voice-based computer-delivered intervention that was run by a human controller and that allowed participants to speak their responses to scripted prompts delivered by speech generation software and (2) a text-based computer-delivered intervention that relied on the mouse, keyboard, and computer screen for all interactions. We randomized 60 heavy drinking college students to interact with the voice-based computer-delivered intervention and 30 to interact with the text-based computer-delivered intervention and compared their ratings of the systems as well as their motivation to change drinking and their drinking behavior at 1-month follow-up. Results Participants reported that the voice-based computer-delivered intervention engaged positively with them in the session and delivered content in a manner consistent with motivational interviewing principles. 
At 1-month follow-up, participants in the voice-based computer-delivered intervention condition reported significant decreases in quantity, frequency, and problems associated with drinking, and increased perceived importance of changing drinking behaviors. In comparison to the text-based computer-delivered intervention condition, those assigned to voice-based computer-delivered intervention reported significantly fewer alcohol-related problems at the 1-month follow-up (incident rate ratio 0.60, 95% CI 0.44-0.83, P=.002). The conditions did not differ significantly on perceived importance of changing drinking or on measures of drinking quantity and frequency of heavy drinking. Conclusions Results indicate that it is feasible to construct a series of open-ended questions and a bank of responses and follow-up prompts that can be used in a future fully automated voice-based computer-delivered intervention that may mirror more closely human-delivered motivational interventions to reduce drinking. Such efforts will require using advanced speech recognition capabilities and machine-learning approaches to train a program to mirror the decisions made by human controllers in the voice-based computer-delivered intervention used in this study. In addition, future studies should examine enhancements that can increase the perceived warmth and empathy of voice-based computer-delivered intervention, possibly through greater personalization, improvements in the speech generation software, and embodying the computer-delivered intervention in a physical form. PMID:28659259
Patient Similarity in Prediction Models Based on Health Data: A Scoping Review
Sharafoddini, Anis; Dubin, Joel A
2017-01-01
Background Physicians and health policy makers are required to make predictions during their decision making in various medical problems. Many advances have been made in predictive modeling toward outcome prediction, but these innovations target an average patient and are insufficiently adjustable for individual patients. One developing idea in this field is individualized predictive analytics based on patient similarity. The goal of this approach is to identify patients who are similar to an index patient and derive insights from the records of similar patients to provide personalized predictions. Objective The aim is to summarize and review published studies describing computer-based approaches for predicting patients’ future health status based on health data and patient similarity, identify gaps, and provide a starting point for related future research. Methods The method involved (1) conducting the review by performing automated searches in Scopus, PubMed, and ISI Web of Science, selecting relevant studies by first screening titles and abstracts then analyzing full-texts, and (2) documenting by extracting publication details and information on context, predictors, missing data, modeling algorithm, outcome, and evaluation methods into a matrix table, synthesizing data, and reporting results. Results After duplicate removal, 1339 articles were screened in abstracts and titles and 67 were selected for full-text review. In total, 22 articles met the inclusion criteria. Within included articles, hospitals were the main source of data (n=10). Cardiovascular disease (n=7) and diabetes (n=4) were the dominant patient diseases. Most studies (n=18) used neighborhood-based approaches in devising prediction models. Two studies showed that patient similarity-based modeling outperformed population-based predictive methods. Conclusions Interest in patient similarity-based predictive modeling for diagnosis and prognosis has been growing.
In addition to raw/coded health data, wavelet transform and term frequency-inverse document frequency methods were employed to extract predictors. Selecting predictors with potential to highlight special cases and defining new patient similarity metrics were among the gaps identified in the existing literature that provide starting points for future work. Patient status prediction models based on patient similarity and health data offer exciting potential for personalizing and ultimately improving health care, leading to better patient outcomes. PMID:28258046
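The neighborhood-based approach that most of the included studies used can be sketched as a k-nearest-neighbor prediction: find the patients most similar to an index patient and derive the prediction from their outcomes. The records, features, distance scaling, and majority vote below are hypothetical illustrations, not data or metrics from any reviewed study:

```python
import math

# Hypothetical patient records: (age, systolic BP, BMI) plus a binary outcome.
patients = [
    ((45, 120, 24.0), 0), ((62, 150, 31.5), 1), ((50, 135, 28.0), 0),
    ((70, 160, 33.0), 1), ((38, 118, 22.5), 0), ((66, 155, 30.0), 1),
    ((55, 140, 27.5), 0), ((72, 165, 34.5), 1),
]

def distance(a, b):
    # Euclidean distance on crudely scaled features; a real system would
    # standardize each predictor and learn a patient similarity metric.
    scales = (10.0, 20.0, 5.0)
    return math.sqrt(sum(((x - y) / s) ** 2 for x, y, s in zip(a, b, scales)))

def predict(index_patient, k=3):
    """Predict the outcome for an index patient from its k nearest neighbors."""
    neighbors = sorted(patients, key=lambda p: distance(p[0], index_patient))[:k]
    votes = sum(outcome for _, outcome in neighbors)
    return 1 if votes * 2 > k else 0

print(predict((68, 158, 32.0)))  # nearest records are the high-risk patients
```

The review's identified gaps (better similarity metrics, predictors that highlight special cases) correspond to the `distance` function and feature choice here, which a production model would have to define far more carefully.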
Creation and use of a survey instrument for comparing mobile computing devices.
Macri, Jennifer M; Lee, Paul P; Silvey, Garry M; Lobach, David F
2005-01-01
Both personal digital assistants (PDAs) and tablet computers have emerged to facilitate data collection at the point of care. However, little research has been reported comparing these mobile computing devices in specific care settings. In this study we present an approach for comparing functionally identical applications on a Palm operating system-based PDA and a Windows-based tablet computer for point-of-care documentation of clinical observations by eye care professionals when caring for patients with diabetes. Eye-care professionals compared the devices through focus group sessions and through validated usability surveys. This poster describes the development and use of the survey instrument used for comparing mobile computing devices.
Wearable computer technology for dismounted applications
NASA Astrophysics Data System (ADS)
Daniels, Reginald
2010-04-01
Small computing devices which rival the compact size of traditional personal digital assistants (PDA) have recently established a market niche. These computing devices are small enough to be considered unobtrusive for humans to wear. The computing devices are also powerful enough to run full multi-tasking general purpose operating systems. This paper will explore the wearable computer information system for dismounted applications recently fielded for ground-based US Air Force use. The environments that the information systems are used in will be reviewed, as well as a description of the net-centric, ground-based warrior. The paper will conclude with a discussion regarding the importance of intuitive, usable, and unobtrusive operator interfaces for dismounted operators.
Generic algorithms for high performance scalable geocomputing
NASA Astrophysics Data System (ADS)
de Jong, Kor; Schmitz, Oliver; Karssenberg, Derek
2016-04-01
During the last decade, the characteristics of computing hardware have changed considerably. For example, instead of a single general purpose CPU core, personal computers nowadays contain multiple cores per CPU and often general purpose accelerators, like GPUs. Additionally, compute nodes are often grouped together to form clusters or a supercomputer, providing enormous amounts of compute power. For existing earth simulation models to be able to use modern hardware platforms, their compute intensive parts must be rewritten. This can be a major undertaking and may involve many technical challenges. Compute tasks must be distributed over CPU cores, offloaded to hardware accelerators, or distributed to different compute nodes. And ideally, all of this should be done in such a way that the compute task scales well with the hardware resources. This presents two challenges: 1) how to make good use of all the compute resources and 2) how to make these compute resources available for developers of simulation models, who may not (want to) have the required technical background for distributing compute tasks. The first challenge requires the use of specialized technology (e.g.: threads, OpenMP, MPI, OpenCL, CUDA). The second challenge requires the abstraction of the logic handling the distribution of compute tasks from the model-specific logic, hiding the technical details from the model developer. To assist the model developer, we are developing a C++ software library (called Fern) containing algorithms that can use all CPU cores available in a single compute node (distributing tasks over multiple compute nodes will be done at a later stage). The algorithms are grid-based (finite difference) and include local and spatial operations such as convolution filters. The algorithms handle distribution of the compute tasks to CPU cores internally. In the resulting model the low-level details of how this is done are separated from the model-specific logic representing the modeled system.
This contrasts with practices in which code for distributing compute tasks is mixed with model-specific code, and it results in a more maintainable model. For flexibility and efficiency, the algorithms are configurable at compile-time with respect to the following aspects: data type, value type, no-data handling, input value domain handling, and output value range handling. This makes the algorithms usable in very different contexts, without the need for making intrusive changes to existing models when using them. Applications that benefit from using the Fern library include the construction of forward simulation models in (global) hydrology (e.g. PCR-GLOBWB (Van Beek et al. 2011)), ecology, geomorphology, or land use change (e.g. PLUC (Verstegen et al. 2014)) and manipulation of hyper-resolution land surface data such as digital elevation models and remote sensing data. Using the Fern library, we have also created an add-on to the PCRaster Python Framework (Karssenberg et al. 2010) allowing its users to speed up their spatio-temporal models, sometimes by changing just a single line of Python code in their model. In our presentation we will give an overview of the design of the algorithms, providing examples of different contexts where they can be used to replace existing sequential algorithms, including the PCRaster environmental modeling software (www.pcraster.eu). We will show how the algorithms can be configured to behave differently when necessary. References Karssenberg, D., Schmitz, O., Salamon, P., De Jong, K. and Bierkens, M.F.P., 2010, A software framework for construction of process-based stochastic spatio-temporal models and data assimilation. Environmental Modelling & Software, 25, pp. 489-502, Link. Best Paper Award 2010: Software and Decision Support. Van Beek, L. P. H., Y. Wada, and M. F. P. Bierkens. 2011. Global monthly water stress: 1. Water balance and water availability. Water Resources Research. 47. Verstegen, J. A., D. Karssenberg, F.
van der Hilst, and A. P. C. Faaij. 2014. Identifying a land use change cellular automaton by Bayesian data assimilation. Environmental Modelling & Software 53:121-136.
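The abstraction described above, a grid-based local operation whose distribution over CPU cores is hidden from the model developer, can be sketched as follows. Fern itself is a C++ library; this Python sketch with a 3x3 mean filter and a thread-based row decomposition is purely illustrative of the pattern, not Fern's API:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical grid: a small raster; the filter is a 3x3 moving average,
# the kind of local/spatial operation the abstract describes.
grid = [[float(r * 4 + c) for c in range(4)] for r in range(4)]
rows, cols = len(grid), len(grid[0])

def filtered_row(r):
    """Compute one output row of a 3x3 mean filter (edges use fewer cells)."""
    out = []
    for c in range(cols):
        window = [grid[rr][cc]
                  for rr in range(max(0, r - 1), min(rows, r + 2))
                  for cc in range(max(0, c - 1), min(cols, c + 2))]
        out.append(sum(window) / len(window))
    return out

# Distribute rows over worker threads; each task is independent, so the same
# decomposition generalizes to cores or, later, to multiple compute nodes.
with ThreadPoolExecutor(max_workers=4) as pool:
    result = list(pool.map(filtered_row, range(rows)))

print(result[1][1])  # interior cell: mean of its full 3x3 neighborhood
```

The point of the pattern is that the model code calls one filter function and never sees the worker pool, which is exactly the separation of distribution logic from model-specific logic that the abstract argues for.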
Wieland, L. Susan; Falzon, Louise; Sciamanna, Chris N; Trudeau, Kimberlee J; Folse, Suzanne Brodney; Schwartz, Joseph E; Davidson, Karina W
2014-01-01
Background The World Health Organization (WHO) estimates that the number of obese or overweight individuals worldwide will increase to 1.5 billion by 2015. Chronic diseases associated with overweight or obesity include diabetes, heart disease, hypertension and stroke. Objectives To assess the effects of interactive computer-based interventions for weight loss or weight maintenance in overweight or obese people. Search methods We searched several electronic databases, including CENTRAL, MEDLINE, EMBASE, CINAHL, LILACS and PsycINFO, through 25 May 2011. We also searched clinical trials registries to identify studies. We scanned reference lists of included studies and relevant systematic reviews. Selection criteria Studies were included if they were randomized controlled trials or quasi-randomized controlled trials that evaluated interactive computer-based weight loss or weight maintenance programs in adults with overweight or obesity. We excluded trials if the duration of the intervention was less than four weeks or the loss to follow-up was greater than 20% overall. Data collection and analysis Two authors independently extracted study data and assessed risk of bias. Where interventions, control conditions, outcomes and time frames were similar between studies, we combined study data using meta-analysis. Main results We included 14 weight loss studies with a total of 2537 participants, and four weight maintenance studies with a total of 1603 participants. Treatment duration was between four weeks and 30 months. At six months, computer-based interventions led to greater weight loss than minimal interventions (mean difference (MD) −1.5 kg; 95% confidence interval (CI) −2.1 to −0.9; two trials) but less weight loss than in-person treatment (MD 2.1 kg; 95% CI 0.8 to 3.4; one trial). 
At six months, computer-based interventions were superior to a minimal control intervention in limiting weight regain (MD −0.7 kg; 95% CI −1.2 to −0.2; two trials), but not superior to infrequent in-person treatment (MD 0.5 kg; 95% CI −0.5 to 1.6; two trials). We did not observe consistent differences in dietary or physical activity behaviors between intervention and control groups in either weight loss or weight maintenance trials. Three weight loss studies estimated the costs of computer-based interventions compared to usual care; however, two of the studies were 11 and 28 years old, and recent advances in technology render these estimates unlikely to be applicable to current or future interventions, while the third study was conducted in active duty military personnel, and it is unclear whether the costs are relevant to other settings. One weight loss study reported the cost-effectiveness ratio for a weekly in-person weight loss intervention relative to a computer-based intervention as USD 7177 (EUR 5678) per life year gained (80% CI USD 3055 to USD 60,291 (EUR 2417 to EUR 47,702)). It is unclear whether this could be extrapolated to other studies. No data were identified on adverse events, morbidity, complications or health-related quality of life. Authors’ conclusions Compared to no intervention or minimal interventions (pamphlets, usual care), interactive computer-based interventions are an effective intervention for weight loss and weight maintenance. Compared to in-person interventions, interactive computer-based interventions result in smaller weight losses and lower levels of weight maintenance. The amount of additional weight loss, however, is relatively small and of brief duration, making the clinical significance of these differences unclear. PMID:22895964
ERIC Educational Resources Information Center
Miller, Christopher T.; Mazur, Joan M.
A person-centered model of instruction has been developed for use in designing instruction in virtual, Web-based environments. This model, based on the work of Carl Rogers, attempts to address several issues raised in the literature regarding: (1) the changing role of instructors and students; (2) the broadening of the notion of learning outcomes;…
Closed-loop dialog model of face-to-face communication with a photo-real virtual human
NASA Astrophysics Data System (ADS)
Kiss, Bernadette; Benedek, Balázs; Szijárto, Gábor; Takács, Barnabás
2004-01-01
We describe an advanced Human Computer Interaction (HCI) model that employs photo-realistic virtual humans to provide digital media users with information, learning services and entertainment in a highly personalized and adaptive manner. The system can be used as a computer interface or as a tool to deliver content to end-users. We model the interaction process between the user and the system as part of a closed-loop dialog taking place between the participants. This dialog exploits the most important characteristics of a face-to-face communication process, including the use of non-verbal gestures and meta-communication signals to control the flow of information. Our solution is based on a Virtual Human Interface (VHI) technology that was specifically designed to be able to create emotional engagement between the virtual agent and the user, thus increasing the efficiency of learning and/or absorbing any information broadcast through this device. The paper reviews the basic building blocks and technologies needed to create such a system and discusses its advantages over other existing methods.
[Registration technology for mandibular angle osteotomy based on augmented reality].
Zhu, Ming; Chai, Gang; Zhang, Yan; Ma, Xiao-Fei; Yu, Zhe-Yuan; Zhu, Yi-Jia
2010-12-01
To establish an effective path for registering the operative plan to a real model of the mandible made by rapid prototyping (RP) technology. Computed tomography (CT) was performed on 20 patients to create 3D images, and computer-aided operation planning information was merged with the 3D images. A dental cast was then used to fix the signal that could be recognized by the software. The dental cast was converted to 3D data with a laser scanner and a program named Rapidform running on a personal computer, which matched the dental cast to the mandible image to generate the virtual image. Registration was then achieved by a video monitoring system. By using this technology, the virtual image of the mandible and the cutting planes could both overlay the real model of the mandible made by RP. This study found an effective way to perform registration by using a dental cast, and this approach might be a powerful option for the registration of augmented reality. Supported by Program for Innovation Research Team of Shanghai Municipal Education Commission.
Ng, Wai-Yin; Chau, Chi-Kwan
2014-01-15
This study evaluated the effectiveness of different configurations for two building design elements, namely building permeability and setback, proposed for mitigating air pollutant exposure problems in isolated deep canyons by using an indirect exposure approach. The indirect approach predicted the exposures of three different population subgroups (i.e. pedestrians, shop vendors and residents) by multiplying the pollutant concentrations with the duration of exposure within a specific micro-environment. In this study, the pollutant concentrations for different configurations were predicted using a computational fluid dynamics model. The model was constructed based on the Reynolds-Averaged Navier-Stokes (RANS) equations with the standard k-ε turbulence model. Fifty-one canyon configurations with aspect ratios of 2, 4, 6 and different building permeability values (ratio of building spacing to the building façade length) or different types of building setback (recess of a high building from the road) were examined. The findings indicated that personal exposures of shop vendors were extremely high if they were present inside a canyon without any setback or separation between buildings and when the prevailing wind was perpendicular to the canyon axis. Building separation and building setbacks were effective in reducing personal air exposures in canyons with perpendicular wind, although their effectiveness varied with different configurations. Increasing the permeability value from 0 to 10% significantly lowered the personal exposures on the different population subgroups. Likewise, the personal exposures could also be reduced by the introduction of building setbacks despite their effects being strongly influenced by the aspect ratio of a canyon. Equivalent findings were observed if the reduction in the total development floor area (the total floor area permitted to be developed within a particular site area) was also considered. 
These findings were employed to formulate a hierarchy decision making model to guide the planning of deep canyons in high density urban cities. © 2013 Elsevier B.V. All rights reserved.
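The indirect exposure approach used in the study above, multiplying the pollutant concentration in each microenvironment by the time a subgroup spends there, reduces to a simple weighted sum. The concentrations, time budgets, and subgroup labels below are hypothetical values for illustration, not results from the study:

```python
# Hypothetical microenvironment data for one canyon configuration:
# pollutant concentration (ug/m3) and daily time spent (hours) per subgroup.
concentration = {"street": 80.0, "shopfront": 120.0, "residence": 35.0}

time_budget = {
    "pedestrian": {"street": 1.0},
    "shop_vendor": {"shopfront": 10.0, "street": 0.5},
    "resident": {"residence": 14.0, "street": 0.5},
}

def daily_exposure(subgroup):
    """Indirect exposure: sum of concentration x duration over microenvironments."""
    return sum(concentration[env] * hours
               for env, hours in time_budget[subgroup].items())

for group in time_budget:
    print(group, daily_exposure(group), "ug/m3*h")
```

Even with invented numbers, the structure shows why shop vendors dominate the exposure ranking in the study: long durations in the highest-concentration microenvironment multiply together.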
Curran, Patrick J.; Howard, Andrea L.; Bainter, Sierra; Lane, Stephanie T.; McGinley, James S.
2014-01-01
Objective Although recent statistical and computational developments allow for the empirical testing of psychological theories in ways not previously possible, one particularly vexing challenge remains: how to optimally model the prospective, reciprocal relations between two constructs as they developmentally unfold over time. Several analytic methods currently exist that attempt to model these types of relations, and each approach is successful to varying degrees. However, none provide the unambiguous separation of between-person and within-person components of stability and change over time, components that are often hypothesized to exist in the psychological sciences. The goal of our paper is to propose and demonstrate a novel extension of the multivariate latent curve model to allow for the disaggregation of these effects. Method We begin with a review of the standard latent curve models and describe how these primarily capture between-person differences in change. We then extend this model to allow for regression structures among the time-specific residuals to capture within-person differences in change. Results We demonstrate this model using an artificial data set generated to mimic the developmental relation between alcohol use and depressive symptomatology spanning five repeated measures. Conclusions We obtain a specificity of results from the proposed analytic strategy that is not available from other existing methodologies. We conclude with potential limitations of our approach and directions for future research. PMID:24364798
Babbitt, Callie W; Kahhat, Ramzy; Williams, Eric; Babbitt, Gregory A
2009-07-01
Product lifespan is a fundamental variable in understanding the environmental impacts associated with the life cycle of products. Existing life cycle and materials flow studies of products, almost without exception, consider lifespan to be constant over time. To determine the validity of this assumption, this study provides an empirical documentation of the long-term evolution of personal computer lifespan, using a major U.S. university as a case study. Results indicate that over the period 1985-2000, computer lifespan (purchase to "disposal") decreased steadily from a mean of 10.7 years in 1985 to 5.5 years in 2000. The distribution of lifespan also evolved, becoming narrower over time. Overall, however, lifespan distribution was broader than normally considered in life cycle assessments or materials flow forecasts of electronic waste management for policy. We argue that these results suggest that at least for computers, the assumption of constant lifespan is problematic and that it is important to work toward understanding the dynamics of use patterns. We modify an age-structured model of population dynamics from biology as a modeling approach to describe product life cycles. Lastly, the purchase share and generation of obsolete computers from the higher education sector is estimated using different scenarios for the dynamics of product lifespan.
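An age-structured accounting of obsolete units, of the kind the authors adapt from population biology, can be sketched as a sum over purchase cohorts of past sales weighted by a cohort-specific lifespan distribution. The sales figures and the uniform lifespan window below are invented for illustration; only the declining mean lifespan (10.7 years in 1985 toward roughly 5.5 in 2000) echoes the study's finding:

```python
# Hypothetical annual sales (units) for a sector.
sales = {1996: 1000, 1997: 1200, 1998: 1400, 1999: 1600, 2000: 1800}

def lifespan_pmf(purchase_year):
    """Probability a unit bought in purchase_year is retired after k years.
    Mean lifespan shrinks for later cohorts, loosely following the study's
    observed trend; the uniform 3-year window is an illustrative assumption."""
    mean = 10.7 - 0.35 * (purchase_year - 1985)   # ~5.5 years by 2000
    lo, hi = int(mean) - 1, int(mean) + 1
    years = list(range(lo, hi + 1))
    return {k: 1.0 / len(years) for k in years}

def obsolete_in(year):
    """Units retired in `year`, summed over purchase cohorts (age-structured)."""
    total = 0.0
    for bought, units in sales.items():
        pmf = lifespan_pmf(bought)
        total += units * pmf.get(year - bought, 0.0)
    return total

print(round(obsolete_in(2003)))
```

Replacing the constant lifespan of conventional materials-flow forecasts with a cohort-dependent distribution, as here, is exactly the change the authors argue matters for electronic-waste projections.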
Mindfulness as a personal resource to reduce work stress in the job demands-resources model.
Grover, Steven L; Teo, Stephen T T; Pick, David; Roche, Maree
2017-10-01
Based on the job demands-resources (JD-R) model, this study examines the different ways that the personal resource of mindfulness reduces stress. Structural equation modeling based on data from 415 Australian nurses shows that mindfulness relates directly and negatively to work stress and perceptions of emotional demands, as well as buffering the effect of emotional demands on psychological stress. This study contributes to the literature by employing empirical analysis to the task of unravelling how personal resources function within the JD-R model. It also introduces mindfulness as a personal resource in the JD-R model. Copyright © 2016 John Wiley & Sons, Ltd.
Patricia K. Lebow; Henry Spelter; Peter J. Ince
2003-01-01
This report provides documentation and user information for FPL-PELPS, a personal computer price endogenous linear programming system for economic modeling. Originally developed to model the North American pulp and paper industry, FPL-PELPS follows its predecessors in allowing the modeling of any appropriate sector to predict consumption, production and capacity by...
Modeling and simulation in biomedicine.
Aarts, J.; Möller, D.; van Wijk van Brievingh, R.
1991-01-01
A group of researchers and educators in The Netherlands, Germany and Czechoslovakia have developed and adapted mathematical computer models of phenomena in the field of physiology and biomedicine for use in higher education. The models are graphical and highly interactive, and are all written in Turbo Pascal or the mathematical simulation language PSI. An educational shell has been developed to launch the models. The shell allows students to interact with the models and teachers to edit the models, to add new models and to monitor the achievements of the students. The models and the shell have been implemented on an MS-DOS personal computer. This paper describes the features of the modeling package and presents the modeling and simulation of the heart muscle as an example. PMID:1807745
Exploring Factors That Influence Technology-Based Distractions in Bring Your Own Device Classrooms
ERIC Educational Resources Information Center
Kay, Robin; Benzimra, Daniel; Li, Jia
2017-01-01
Previous research on distractions and the use of mobile devices (personal digital assistants, tablet personal computers, or laptops) has been conducted almost exclusively in higher education. The purpose of the current study was to examine the frequency and influence of distracting behaviors in Bring Your Own Device secondary school classrooms.…
Accessing a personalized bibliography with a searchable system on the World Wide Web
Malchus B. Baker; Daniel P. Huebner; Peter F. Ffolliott
2000-01-01
Researchers, educators, and land management personnel routinely construct bibliographies to assist them in managing publications that relate to their work. These personalized bibliographies are unique and valuable to others in the same discipline. This paper presents a computer data base system that provides users with the ability to search a bibliography through...
Choi, Okkyung; Han, SangYong
2007-01-01
Ubiquitous Computing makes it possible to determine in real time the location and situations of service requesters in a web service environment, as it enables access to computers at any time and in any place. Though research on various aspects of ubiquitous commerce is progressing at enterprises and research centers, both domestically and overseas, analysis of a customer's personal preferences based on the semantic web and rule-based services using semantics is not currently being conducted. This paper proposes a Ubiquitous Computing Services System that enables both rule-based and semantics-based search, so that the electronic space and the physical space can be combined into one, making real-time search for web services and the construction of efficient web services possible.
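An illustrative sketch of the rule-based service matching such a search relies on; the paper's actual semantics-based matching is not specified in this abstract, and every service, key, and rule below is hypothetical.

```python
# Hypothetical service catalog; in a real system these would come from
# service registrations in the ubiquitous computing environment.
services = [
    {"name": "coffee_shop", "location": "lobby", "open": True},
    {"name": "print_center", "location": "floor2", "open": False},
]

def matches(service, rules):
    # A rule is a (key, expected_value) pair; a service matches when
    # every rule holds for its attributes.
    return all(service.get(key) == value for key, value in rules)

def find_services(catalog, rules):
    # Return the names of all services satisfying every rule.
    return [s["name"] for s in catalog if matches(s, rules)]

result = find_services(services, [("location", "lobby"), ("open", True)])
print(result)
```

A semantics-based search would additionally match services whose attributes are related through an ontology rather than exactly equal.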
Improved Gaussian Beam-Scattering Algorithm
NASA Technical Reports Server (NTRS)
Lock, James A.
1995-01-01
The localized model of the beam-shape coefficients for Gaussian beam-scattering theory by a spherical particle provides a great simplification in the numerical implementation of the theory. We derive an alternative form for the localized coefficients that is more convenient for numerical computation and that provides physical insight into the details of the scattering process. We construct a FORTRAN program for Gaussian beam scattering with the localized model and compare its computer run time on a personal computer with that of a traditional Mie scattering program and with three other published methods for computing Gaussian beam scattering. We show that the analytical form of the beam-shape coefficients makes evident that the excitation rate of morphology-dependent resonances is greatly enhanced for far off-axis incidence of the Gaussian beam.
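For reference, the localized approximation mentioned above is commonly quoted in the Gaussian-beam scattering literature (this form is not taken from the abstract itself) as giving, for an on-axis beam, the beam-shape coefficients

```latex
g_n = \exp\!\left[-s^2\left(n+\tfrac{1}{2}\right)^2\right],
\qquad s = \frac{1}{k w_0},
```

where $w_0$ is the beam waist radius and $k$ the wavenumber. The plane-wave (Mie) limit $g_n \to 1$ is recovered as $w_0 \to \infty$, which is why the localized model reduces gracefully to standard Mie theory for weakly focused beams.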
Steerability Analysis of Tracked Vehicles: Theory and User’s Guide for Computer Program TVSTEER
1986-08-01
Baladi, George Y.; Barnes, Donald E.; Berger, Rebecca P. Structures Laboratory, Department of the Army, Waterways Experiment Station, Corps of Engineers, P.O. Box... The mathematical model was formulated by Drs. George Y. Baladi and Behzad Rohani. The logic and computer programming were accomplished by Dr. Baladi and
On-Site to On-Line: Barriers to the Use of Computers for Continuing Education.
ERIC Educational Resources Information Center
Mamary, Edward M.; Charles, Patricia
2000-01-01
A survey of 1,120 physicians, nurse practitioners, and physician assistants identified their top preferences for continuing education delivery methods: in-person conferences, print-based self-study, and CD-ROM. Least favored were interactive audioconferences. Although most had computer access, traditional methods were more frequently used; lack of…
The Cybermobile: A Gateway for Public Access to Network-Based Information.
ERIC Educational Resources Information Center
Drumm, John E.; Groom, Frank M.
1997-01-01
Though the bookmobile has fallen on hard times, the cybermobile, a technology platform combining personal computing, CD-ROMs, fiber network, and wireless access to the Internet, may be the next step in mobile library services. Discusses standard vehicle, computer hardware, software, wireless access, and alliances with users, vendors, and community…
Am I Extravert or Introvert? Considering the Personality Effect toward e-Learning System
ERIC Educational Resources Information Center
Al-Dujaily, Amal; Kim, Jieun; Ryu, Hokyoung
2013-01-01
A concern of computer-based learning system design is how to accommodate learners' individual differences during learning activities. Previous research suggests that adaptive e-learning systems can effectively address such individual differences and, consequently, they enable more directed tutoring via computer-assisted instruction. In this paper,…
ERIC Educational Resources Information Center
Sisson, Lee Hansen; And Others
This paper describes the use of commercially-available software for the Apple Computer to augment diagnostic evaluations of learning disabled children and to enhance "learning to learn" strategies at the application/transfer level of learning. A short rationale discusses levels of evaluation and learning, using a model that synthesizes the ideas…
Multi person detection and tracking based on hierarchical level-set method
NASA Astrophysics Data System (ADS)
Khraief, Chadia; Benzarti, Faouzi; Amiri, Hamid
2018-04-01
In this paper, we propose an efficient unsupervised method for multi-person tracking based on a hierarchical level-set approach. The proposed method uses both edge and region information in order to effectively detect objects. The persons are tracked on each frame of the sequence by minimizing an energy functional that combines color, texture and shape information. These features are encoded in a covariance matrix that serves as a region descriptor. The present method is fully automated, without the need to manually specify the initial contour of the level-set; it is based on combined person detection and background subtraction. The edge-based term is employed to maintain a stable evolution, guide the segmentation towards apparent boundaries and inhibit region fusion. The computational cost of the level-set is reduced by using a narrow-band technique. Experiments on challenging video sequences show the effectiveness of the proposed method.
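A minimal sketch of the covariance region descriptor used above. The per-pixel feature vector here is simply [x, y, R, G, B]; the paper's actual color/texture/shape features are not fully specified in this abstract, so this is an illustrative simplification.

```python
import numpy as np

def covariance_descriptor(region):
    # region: (H, W, 3) float array of color values.
    # Per-pixel features: pixel coordinates (a crude shape cue) plus the
    # three color channels; the descriptor is their covariance matrix.
    h, w, _ = region.shape
    ys, xs = np.mgrid[0:h, 0:w]
    feats = np.column_stack([xs.ravel(), ys.ravel(), region.reshape(-1, 3)])
    return np.cov(feats, rowvar=False)  # 5x5 symmetric matrix

# Hypothetical region: a random 16x16 color patch.
patch = np.random.default_rng(1).random((16, 16, 3))
C = covariance_descriptor(patch)
print(C.shape)
```

Because the descriptor is a small fixed-size matrix regardless of region size, regions of the same person in successive frames can be compared cheaply, which suits a per-frame tracking loop.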
ERIC Educational Resources Information Center
Carman, Carol A.
2011-01-01
One of the underutilized tools in gifted identification is personality-based measures. A multiple confirmatory factor analysis was utilized to examine the relationships between traditional identification methods and personality-based measures. The pattern of correlations indicated this model could be measuring two constructs, one related to…
Students' Acceptance of Tablet PCs in the Classroom
ERIC Educational Resources Information Center
Ifenthaler, Dirk; Schweinbenz, Volker
2016-01-01
In recent years digital technologies, such as tablet personal computers (TPCs), have become an integral part of a school's infrastructure and are seen as a promising way to facilitate students' learning processes. This study empirically tested a theoretical model derived from the technology acceptance model containing key constructs developed in…
ERIC Educational Resources Information Center
Finch, Harold L.; Tatham, Elaine L.
This document presents a modified cohort survival model which can be of use in making enrollment projections. The model begins by analytically profiling an area's residents. Each person's demographic characteristics--sex, age, place of residence--are recorded in the computer memory. Four major input variables are then incorporated into the model:…
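A minimal sketch of the cohort-survival recurrence underlying such a model: next year's enrollment in each grade is this year's previous-grade enrollment times a survival ratio. The document's four input variables are not enumerated in this excerpt, so only the core recurrence is shown; all numbers below are hypothetical.

```python
def project_enrollment(entering, cohorts, survival, years):
    # cohorts[g]: current enrollment in grade g
    # survival[g]: fraction of grade g advancing to grade g+1
    # entering: size of the new entering class (held constant here)
    history = [list(cohorts)]
    for _ in range(years):
        cohorts = [entering] + [cohorts[g] * survival[g]
                                for g in range(len(cohorts) - 1)]
        history.append(list(cohorts))
    return history

# Three grades, 95% survival between grades, projected two years ahead.
history = project_enrollment(100, [100, 95, 90], [0.95, 0.95], years=2)
print(history[-1])
```

A fuller model would vary the entering class and survival ratios using the demographic inputs the document describes (sex, age, place of residence).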
Implementation of a Personal Computer Based Parameter Estimation Program
1992-03-01
[Report documentation page garbled in this excerpt; only nomenclature fragments are recoverable: L, M, N are the X, Y, Z moment components; u, v, w are the X, Y, Z velocity components; V is the velocity vector and V its magnitude; w_g is the Z velocity due to gust.]
Active optical control system design of the SONG-China Telescope
NASA Astrophysics Data System (ADS)
Ye, Yu; Kou, Songfeng; Niu, Dongsheng; Li, Cheng; Wang, Guomin
2012-09-01
The standard SONG node control-system structure is presented. The active optical control system of the project is a distributed system comprising a host computer and an intelligent slave controller. The host control computer collects information from the wavefront sensor and sends commands to the slave computer to realize closed-loop control. The intelligent controller is a programmable logic controller (PLC) system; combining an industrial personal computer (IPC) with the PLC yields a powerful and reliable control system.
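A minimal sketch of the closed-loop structure described above: the host reads the wavefront sensor, computes a correction, and commands the slave controller. Everything here is simulated in-process (the real system uses an IPC host and a PLC slave), and the function names, noise level, and simple proportional law are all illustrative assumptions.

```python
import random

random.seed(0)  # deterministic simulated sensor noise

def read_wavefront_error(position, target=0.0):
    # Simulated wavefront sensor: alignment error plus measurement noise.
    return (position - target) + random.gauss(0.0, 0.001)

def slave_apply(position, command):
    # The slave controller actuates exactly the commanded correction.
    return position + command

def host_control_loop(steps=50, gain=0.5):
    position = 1.0  # initial misalignment, arbitrary units
    for _ in range(steps):
        error = read_wavefront_error(position)
        position = slave_apply(position, -gain * error)  # proportional law
    return position

final_position = host_control_loop()
print(abs(final_position))
```

With a gain below 1 the residual error shrinks geometrically each iteration until it reaches the sensor-noise floor, which is the qualitative behavior a closed-loop active optics system aims for.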
2006-10-01
Investigation of Item-Pair Presentation and Construct Validity of the Navy Computer Adaptive Personality Scales (NCAPS). Christina M. Underhill, Ph.D. NPRST-TN-06-9, October 2006. Approved for public release; distribution is unlimited. This report documents one of the steps in our development of the Navy Computer Adaptive Personality Scales (NCAPS). NCAPS is a computer adaptive personality measure
47 CFR 15.102 - CPU boards and power supplies used in personal computers.
Code of Federal Regulations, 2013 CFR
2013-10-01
... computers. 15.102 Section 15.102 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL RADIO FREQUENCY DEVICES Unintentional Radiators § 15.102 CPU boards and power supplies used in personal computers. (a... modifications that must be made to a personal computer, peripheral device, CPU board or power supply during...
47 CFR 15.102 - CPU boards and power supplies used in personal computers.
Code of Federal Regulations, 2011 CFR
2011-10-01
... computers. 15.102 Section 15.102 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL RADIO FREQUENCY DEVICES Unintentional Radiators § 15.102 CPU boards and power supplies used in personal computers. (a... modifications that must be made to a personal computer, peripheral device, CPU board or power supply during...
47 CFR 15.102 - CPU boards and power supplies used in personal computers.
Code of Federal Regulations, 2010 CFR
2010-10-01
... computers. 15.102 Section 15.102 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL RADIO FREQUENCY DEVICES Unintentional Radiators § 15.102 CPU boards and power supplies used in personal computers. (a... modifications that must be made to a personal computer, peripheral device, CPU board or power supply during...
47 CFR 15.102 - CPU boards and power supplies used in personal computers.
Code of Federal Regulations, 2014 CFR
2014-10-01
... computers. 15.102 Section 15.102 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL RADIO FREQUENCY DEVICES Unintentional Radiators § 15.102 CPU boards and power supplies used in personal computers. (a... modifications that must be made to a personal computer, peripheral device, CPU board or power supply during...
47 CFR 15.102 - CPU boards and power supplies used in personal computers.
Code of Federal Regulations, 2012 CFR
2012-10-01
... computers. 15.102 Section 15.102 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL RADIO FREQUENCY DEVICES Unintentional Radiators § 15.102 CPU boards and power supplies used in personal computers. (a... modifications that must be made to a personal computer, peripheral device, CPU board or power supply during...
Design Tools for Accelerating Development and Usage of Multi-Core Computing Platforms
2014-04-01
...multicore PDSP platforms. The GPU-based capabilities of TDIF are currently oriented towards NVIDIA GPUs, based on the Compute Unified Device Architecture (CUDA) programming language [NVIDIA 2007], which can be viewed as an extension of C. The multicore PDSP capabilities currently in TDIF are oriented
Code of Federal Regulations, 2014 CFR
2014-04-01
...), for purposes of computing the payor's net foreign base company income (as defined in § 1.954-1T(a)(4)), net insurance income (as defined in § 1.954-1T(a)(6)), or income described in sections 952(a)(3), (4... computing the payor's net foreign base company income (as defined in § 1.954-1T(a)(4)), net insurance income...