Sample records for central computing group

  1. Closely Spaced Independent Parallel Runway Simulation.

    DTIC Science & Technology

    1984-10-01

    facility consists of the Central Computer Facility, the Controller Laboratory, and the Simulator Pilot Complex. CENTRAL COMPUTER FACILITY. The Central Computer Facility consists of a group of mainframes, minicomputers, and associated peripherals which host the operational and data acquisition... in the Controller Laboratory and convert their verbal directives into a keyboard entry which is transmitted to the Central Computer Complex, where

  2. The Organization and Evaluation of a Computer-Assisted, Centralized Immunization Registry.

    ERIC Educational Resources Information Center

    Loeser, Helen; And Others

    1983-01-01

    Evaluation of a computer-assisted, centralized immunization registry after one year shows that 93 percent of eligible health practitioners initially agreed to provide data and that 73 percent continue to do so. Immunization rates in audited groups have improved significantly. (GC)

  3. Centralized Authorization Using a Direct Service, Part II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wachsmann, A

    Authorization is the process of deciding if entity X is allowed to have access to resource Y. Determining the identity of X is the job of the authentication process. One task of authorization in computer networks is to define and determine which user has access to which computers in the network. On Linux, the tendency exists to create a local account for each single user who should be allowed to log on to a computer. This is typically the case because a user not only needs login privileges to a computer but also additional resources like a home directory to actually do some work. Creating a local account on every computer takes care of all this. The problem with this approach is that these local accounts can be inconsistent with each other. The same user name could have a different user ID and/or group ID on different computers. Even more problematic is when two different accounts share the same user ID and group ID on different computers: User joe on computer1 could have user ID 1234 and group ID 56 and user jane on computer2 could have the same user ID 1234 and group ID 56. This is a big security risk in case shared resources like NFS are used. These two different accounts are the same for an NFS server so that these users can wipe out each other's files. The solution to this inconsistency problem is to have only one central, authoritative data source for this kind of information and a means of providing all your computers with access to this central source. This is what a "Directory Service" is. The two directory services most widely used for centralizing authorization data are the Network Information Service (NIS, formerly known as Yellow Pages or YP) and Lightweight Directory Access Protocol (LDAP).
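
    The inconsistency problem described above is usually solved by making every host resolve account data from the same directory rather than from local files. As a hedged illustration only (not taken from the article), a minimal Python sketch using the third-party ldap3 library might look like the following; the server name, base DN, and schema are assumptions:

        # Hypothetical sketch: resolve one user's numeric IDs from a central LDAP
        # directory instead of local /etc/passwd entries. Host, base DN and
        # attribute names are invented for illustration.
        from ldap3 import ALL, Connection, Server

        def lookup_account(username):
            server = Server("ldap.example.org", get_info=ALL)
            conn = Connection(server, auto_bind=True)   # anonymous bind, illustration only
            conn.search(
                search_base="ou=people,dc=example,dc=org",
                search_filter=f"(uid={username})",
                attributes=["uidNumber", "gidNumber", "homeDirectory"],
            )
            entry = conn.entries[0]
            # Every host that queries the same directory sees the same numeric IDs,
            # which removes the UID/GID collisions described in the abstract.
            return entry.uidNumber.value, entry.gidNumber.value, entry.homeDirectory.value

        # Example (requires a reachable directory): print(lookup_account("joe"))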

  4. A Computer Program for Training Eccentric Reading in Persons with Central Scotoma

    ERIC Educational Resources Information Center

    Kasten, Erich; Haschke, Peggy; Meinhold, Ulrike; Oertel-Verweyen, Petra

    2010-01-01

    This article explores the effectiveness of a computer program--Xcentric viewing--for training eccentric reading in persons with central scotoma. The authors conducted a small study to investigate whether this program increases the reading capacities of individuals with age-related macular degeneration (AMD). Instead of a control group, they…

  5. Mechanistic experimental pain assessment in computer users with and without chronic musculoskeletal pain.

    PubMed

    Ge, Hong-You; Vangsgaard, Steffen; Omland, Øyvind; Madeleine, Pascal; Arendt-Nielsen, Lars

    2014-12-06

    Musculoskeletal pain from the upper extremity and shoulder region is commonly reported by computer users. However, the functional status of central pain mechanisms, i.e., central sensitization and conditioned pain modulation (CPM), has not been investigated in this population. The aim was to evaluate sensitization and CPM in computer users with and without chronic musculoskeletal pain. Pressure pain threshold (PPT) mapping in the neck-shoulder (15 points) and the elbow (12 points) was assessed together with PPT measurement at mid-point in the tibialis anterior (TA) muscle among 47 computer users with chronic pain in the upper extremity and/or neck-shoulder pain (pain group) and 17 pain-free computer users (control group). Induced pain intensities and profiles over time were recorded using a 0-10 cm electronic visual analogue scale (VAS) in response to different levels of pressure stimuli on the forearm with a new technique of dynamic pressure algometry. The efficiency of CPM was assessed using cuff-induced pain as conditioning pain stimulus and PPT at TA as test stimulus. The demographics, job seniority and number of working hours/week using a computer were similar between groups. The PPTs measured at all 15 points in the neck-shoulder region were not significantly different between groups. There were no significant differences between groups in either PPTs or pain intensity induced by dynamic pressure algometry. No significant difference in PPT was observed in TA between groups. During CPM, a significant increase in PPT at TA was observed in both groups (P < 0.05) without significant differences between groups. For the chronic pain group, higher clinical pain intensity, lower PPT values from the neck-shoulder and higher pain intensity evoked by the roller were all correlated with less efficient descending pain modulation (P < 0.05). This suggests that the excitability of the central pain system is normal in a large group of computer users with low pain intensity chronic upper extremity and/or neck-shoulder pain and that increased excitability of the pain system cannot explain the reported pain. However, computer users with higher pain intensity and lower PPTs were found to have decreased efficiency in descending pain modulation.

  6. Scientific Visualization, Seeing the Unseeable

    ScienceCinema

    LBNL

    2017-12-09

    June 24, 2008 Berkeley Lab lecture: Scientific visualization transforms abstract data into readily comprehensible images, provides a vehicle for "seeing the unseeable," and plays a central role in both experimental and computational sciences. Wes Bethel, who heads the Scientific Visualization Group in the Computational Research Division, presents an overview of visualization and computer graphics, current research challenges, and future directions for the field.

  7. Effects of Multidimensional Concept Maps on Fourth Graders' Learning in Web-Based Computer Course

    ERIC Educational Resources Information Center

    Huang, Hwa-Shan; Chiou, Chei-Chang; Chiang, Heien-Kun; Lai, Sung-Hsi; Huang, Chiun-Yen; Chou, Yin-Yu

    2012-01-01

    This study explores the effect of multidimensional concept mapping instruction on students' learning performance in a web-based computer course. The subjects consisted of 103 fourth graders from an elementary school in central Taiwan. They were divided into three groups: multidimensional concept map (MCM) instruction group, Novak concept map (NCM)…

  8. Protocols for Handling Messages Between Simulation Computers

    NASA Technical Reports Server (NTRS)

    Balcerowski, John P.; Dunnam, Milton

    2006-01-01

    Practical Simulator Network (PSimNet) is a set of data-communication protocols designed especially for use in handling messages between computers that are engaging cooperatively in real-time or nearly-real-time training simulations. In a typical application, computers that provide individualized training at widely dispersed locations would communicate, by use of PSimNet, with a central host computer that would provide a common computational- simulation environment and common data. Originally intended for use in supporting interfaces between training computers and computers that simulate the responses of spacecraft scientific payloads, PSimNet could be especially well suited for a variety of other applications -- for example, group automobile-driver training in a classroom. Another potential application might lie in networking of automobile-diagnostic computers at repair facilities to a central computer that would compile the expertise of numerous technicians and engineers and act as an expert consulting technician.
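
    The summary does not spell out PSimNet's wire format, so the following is only a rough, hypothetical sketch of the hub-and-spoke pattern it describes: a training computer pushing a state-update message to the central host computer. The length-prefixed JSON framing, host name, and field names are assumptions for illustration, not PSimNet itself:

        # Hypothetical sketch of a training computer sending a state update to a
        # central simulation host over TCP, framed as a 4-byte length prefix
        # followed by a JSON payload. Not the actual PSimNet protocol.
        import json
        import socket
        import struct

        def send_state(host, port, state):
            payload = json.dumps(state).encode("utf-8")
            header = struct.pack("!I", len(payload))      # big-endian message length
            with socket.create_connection((host, port)) as sock:
                sock.sendall(header + payload)

        # Example (requires a listening host):
        # send_state("sim-host.example", 9000, {"station": 3, "vehicle_speed_mps": 27.5})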

  9. An Overview of the Computational Physics and Methods Group at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Randal Scott

    CCS Division was formed to strengthen the visibility and impact of computer science and computational physics research on strategic directions for the Laboratory. Both computer science and computational science are now central to scientific discovery and innovation. They have become indispensable tools for all other scientific missions at the Laboratory. CCS Division forms a bridge between external partners and Laboratory programs, bringing new ideas and technologies to bear on today’s important problems and attracting high-quality technical staff members to the Laboratory. The Computational Physics and Methods Group CCS-2 conducts methods research and develops scientific software aimed at the latest and emerging HPC systems.

  10. Autonomously Organized and Funded IT Groups

    ERIC Educational Resources Information Center

    Nichol, Bruce

    2004-01-01

    Central IT organizations under stress often cannot offer a high level of service to groups with above-average support needs. An example of such a group would be a well-funded, research-oriented computer science department. Several factors contribute to the increased demand on IT organizations. Given the availability of relatively…

  11. Vertebrobasilar system computed tomographic angiography in central vertigo

    PubMed Central

    Paşaoğlu, Lale

    2017-01-01

    Abstract The incidence of vertigo in the population is 20% to 30% and one-fourth of the cases are related to central causes. The aim of this study was to evaluate computed tomography angiography (CTA) findings of the vertebrobasilar system in central vertigo without stroke. CTA and magnetic resonance images of patients with vertigo were retrospectively evaluated. One hundred twenty-nine patients suspected of having central vertigo according to history, physical examination, and otological and neurological tests without signs of infarction on diffusion-weighted magnetic resonance imaging were included in the study. The control group included 120 patients with similar vascular disease risk factors but without vertigo. Vertebral and basilar artery diameters, hypoplasias, exit-site variations of vertebral artery, vertebrobasilar tortuosity, and stenosis of ≥50% detected on CTA were recorded for all patients. Independent-samples t test was used in variables with normal distribution, and Mann–Whitney U test in non-normal distribution. The difference of categorical variable distribution according to groups was analyzed with χ2 and/or Fisher exact test. Vertebral artery hypoplasia and ≥50% stenosis were seen more often in the vertigo group (P = 0.000, <0.001). Overall 78 (60.5%) vertigo patients had ≥50% stenosis, 54 (69.2%) had stenosis at V1 segment, 9 (11.5%) at V2 segment, 2 (2.5%) at V3 segment, and 13 (16.6%) at V4 segment. Both vertigo and control groups had similar basilar artery hypoplasia and ≥50% stenosis rates (P = 0.800, >0.05). CTA may be helpful to clarify the association between abnormal CTA findings of vertebral arteries and central vertigo. This article reveals the opportunity to diagnose posterior circulation abnormalities causing central vertigo with a feasible method such as CTA. PMID:28328808

  12. Vertebrobasilar system computed tomographic angiography in central vertigo.

    PubMed

    Paşaoğlu, Lale

    2017-03-01

    The incidence of vertigo in the population is 20% to 30% and one-fourth of the cases are related to central causes. The aim of this study was to evaluate computed tomography angiography (CTA) findings of the vertebrobasilar system in central vertigo without stroke. CTA and magnetic resonance images of patients with vertigo were retrospectively evaluated. One hundred twenty-nine patients suspected of having central vertigo according to history, physical examination, and otological and neurological tests without signs of infarction on diffusion-weighted magnetic resonance imaging were included in the study. The control group included 120 patients with similar vascular disease risk factors but without vertigo. Vertebral and basilar artery diameters, hypoplasias, exit-site variations of vertebral artery, vertebrobasilar tortuosity, and stenosis of ≥50% detected on CTA were recorded for all patients. Independent-samples t test was used in variables with normal distribution, and Mann-Whitney U test in non-normal distribution. The difference of categorical variable distribution according to groups was analyzed with χ2 and/or Fisher exact test. Vertebral artery hypoplasia and ≥50% stenosis were seen more often in the vertigo group (P = 0.000, <0.001). Overall 78 (60.5%) vertigo patients had ≥50% stenosis, 54 (69.2%) had stenosis at V1 segment, 9 (11.5%) at V2 segment, 2 (2.5%) at V3 segment, and 13 (16.6%) at V4 segment. Both vertigo and control groups had similar basilar artery hypoplasia and ≥50% stenosis rates (P = 0.800, >0.05). CTA may be helpful to clarify the association between abnormal CTA findings of vertebral arteries and central vertigo. This article reveals the opportunity to diagnose posterior circulation abnormalities causing central vertigo with a feasible method such as CTA.

  13. The Effects of a Computerized Study Program on the Acquisition of Science Vocabulary

    ERIC Educational Resources Information Center

    Rollins, Karen F.

    2012-01-01

    The following study examined the difference in science vocabulary acquisition comparing computer-assisted learning and a traditional study review sheet. Fourth and fifth grade students from a suburban school in central Texas were randomly selected and randomly assigned to either experimental group or control group. Both groups were given a…

  14. Scientific Visualization: The Modern Oscilloscope for "Seeing the Unseeable" (LBNL Summer Lecture Series)

    ScienceCinema

    Bethel, E. Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Computational Research Division and Scientific Visualization Group]

    2018-05-07

    Summer Lecture Series 2008: Scientific visualization transforms abstract data into readily comprehensible images, provides a vehicle for "seeing the unseeable," and plays a central role in both experimental and computational sciences. Wes Bethel, who heads the Scientific Visualization Group in the Computational Research Division, presents an overview of visualization and computer graphics, current research challenges, and future directions for the field.

  15. Organizational Communication: Theoretical Implications of Communication Technology Applications.

    ERIC Educational Resources Information Center

    Danowski, James A.

    Communication technology (CT), which involves the use of computers in private and group communication, has had a major impact on theory and research in organizational communication over the past 30 years. From the 1950s to the early 1970s, mainframe computers were seen as managerial tools in creating more centralized organizational structures.…

  16. Observing Engineering Student Teams from the Organization Behavior Perspective Using Linguistic Analysis of Student Reflections and Focus Group Interviews

    ERIC Educational Resources Information Center

    Kearney, Kerri S.; Damron, Rebecca; Sohoni, Sohum

    2015-01-01

    This paper investigates group/team development in computer engineering courses at a University in the Central USA from the perspective of organization behavior theory, specifically Tuckman's model of the stages of group development. The investigation, conducted through linguistic analysis of student reflection essays, and through focus group…

  17. Root morphology and development of labial inversely impacted maxillary central incisors in the mixed dentition: a retrospective cone-beam computed tomography study.

    PubMed

    Sun, Hao; Wang, Yi; Sun, Chaofan; Ye, Qingsong; Dai, Weiwei; Wang, Xiuying; Xu, Qingchao; Pan, Sisi; Hu, Rongdang

    2014-12-01

    The aim of this study was to analyze 3-dimensional data of root morphology and development in labial inversely impacted maxillary central incisors. Cone-beam computed tomography images from 41 patients with impacted incisors were divided into early and late dental age groups according to their dental age. Sagittal slices in which the labiolingual width of the tooth was the widest in the axial view were evaluated. The inverse angle, the dilaceration angle, and the length of both impacted and homonym teeth were evaluated with SimPlant Pro software (version 13.0; Materialise Dental NV, Leuven, Belgium). The Student t test indicated that the lengths of the impacted teeth were significantly shorter than those of the homonym teeth (P <0.05), and the root lengths of the early dental age group were significantly shorter than those of the late dental age group. The results from chi-square tests indicated that the incidence of dilacerations was significantly higher in the late dental age group when compared with the early dental age group. Multiple regression analyses indicated that the independent variables for root length of the impacted teeth were dental age (β = 0.958; P <0.001) and length of the nondilacerated part of the root (β = 0.435; P <0.001). Dilaceration was more common in the late dental age group. The roots of labial inversely impacted maxillary central incisors continue developing, but their potential is limited. Copyright © 2014 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.

  18. Summary of Research Academic Departments, 1987-1988

    DTIC Science & Technology

    1988-12-01

    quantify the computer system’s ability to enhance learning of the course... engineering students and their faculty with roughly equivalent computers; one group... Sponsor: Naval Academy Instructional Development Advisory Committee. To understand mathematics, a student must... also to explain the central concepts... Mathematics Department. The project will attempt to move toward these goals by preparing extra resources for in-class and extra instruction... Students

  19. CRYSNET manual. Informal report. [Hardware and software of crystallographic computing network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None,

    1976-07-01

    This manual describes the hardware and software which together make up the crystallographic computing network (CRYSNET). The manual is intended as a users' guide and also provides general information for persons without any experience with the system. CRYSNET is a network of intelligent remote graphics terminals that are used to communicate with the CDC Cyber 70/76 computing system at the Brookhaven National Laboratory (BNL) Central Scientific Computing Facility. Terminals are in active use by four research groups in the field of crystallography. A protein data bank has been established at BNL to store in machine-readable form atomic coordinates and other crystallographic data for macromolecules. The bank currently includes data for more than 20 proteins. This structural information can be accessed at BNL directly by the CRYSNET graphics terminals. More than two years of experience has been accumulated with CRYSNET. During this period, it has been demonstrated that the terminals, which provide access to a large, fast third-generation computer, plus stand-alone interactive graphics capability, are useful for computations in crystallography, and in a variety of other applications as well. The terminal hardware, the actual operations of the terminals, and the operations of the BNL Central Facility are described in some detail, and documentation of the terminal and central-site software is given. (RWR)

  20. Student leadership in small group science inquiry

    NASA Astrophysics Data System (ADS)

    Oliveira, Alandeom W.; Boz, Umit; Broadwell, George A.; Sadler, Troy D.

    2014-09-01

    Background: Science educators have sought to structure collaborative inquiry learning through the assignment of static group roles. This structural approach to student grouping oversimplifies the complexities of peer collaboration and overlooks the highly dynamic nature of group activity. Purpose: This study addresses this issue of oversimplification of group dynamics by examining the social leadership structures that emerge in small student groups during science inquiry. Sample: Two small student groups investigating the burning of a candle under a jar participated in this study. Design and method: We used a mixed-method research approach that combined computational discourse analysis (computational quantification of social aspects of small group discussions) with microethnography (qualitative, in-depth examination of group discussions). Results: While in one group social leadership was decentralized (i.e., students shared control over topics and tasks), the second group was dominated by a male student (centralized social leadership). Further, decentralized social leadership was found to be paralleled by higher levels of student cognitive engagement. Conclusions: It is argued that computational discourse analysis can provide science educators with a powerful means of developing pedagogical models of collaborative science learning that take into account the emergent nature of group structures and highly fluid nature of student collaboration.

  1. Synergies and Distinctions between Computational Disciplines in Biomedical Research: Perspective from the Clinical and Translational Science Award Programs

    PubMed Central

    Bernstam, Elmer V.; Hersh, William R.; Johnson, Stephen B.; Chute, Christopher G.; Nguyen, Hien; Sim, Ida; Nahm, Meredith; Weiner, Mark; Miller, Perry; DiLaura, Robert P.; Overcash, Marc; Lehmann, Harold P.; Eichmann, David; Athey, Brian D.; Scheuermann, Richard H.; Anderson, Nick; Starren, Justin B.; Harris, Paul A.; Smith, Jack W.; Barbour, Ed; Silverstein, Jonathan C.; Krusch, David A.; Nagarajan, Rakesh; Becich, Michael J.

    2010-01-01

    Clinical and translational research increasingly requires computation. Projects may involve multiple computationally-oriented groups including information technology (IT) professionals, computer scientists and biomedical informaticians. However, many biomedical researchers are not aware of the distinctions among these complementary groups, leading to confusion, delays and sub-optimal results. Although written from the perspective of clinical and translational science award (CTSA) programs within academic medical centers, the paper addresses issues that extend beyond clinical and translational research. The authors describe the complementary but distinct roles of operational IT, research IT, computer science and biomedical informatics using a clinical data warehouse as a running example. In general, IT professionals focus on technology. The authors distinguish between two types of IT groups within academic medical centers: central or administrative IT (supporting the administrative computing needs of large organizations) and research IT (supporting the computing needs of researchers). Computer scientists focus on general issues of computation such as designing faster computers or more efficient algorithms, rather than specific applications. In contrast, informaticians are concerned with data, information and knowledge. Biomedical informaticians draw on a variety of tools, including but not limited to computers, to solve information problems in health care and biomedicine. The paper concludes with recommendations regarding administrative structures that can help to maximize the benefit of computation to biomedical research within academic health centers. PMID:19550198

  2. Awareness in the Home: The Nuances of Relationships, Domestic Coordination and Communication

    NASA Astrophysics Data System (ADS)

    Greenberg, Saul; Neustaedter, Carman; Elliot, Kathryn

    Computing has changed dramatically over the last decade. While some changes arose from technological advances, the most profound effects are in how technologies are used by everyday people for activities other than task-oriented work. Computers are now central to new ways of engaging in play, interpersonal and small group communication, community interaction, entertainment, personal creativity dissemination, personal publication, and so on. We are particularly interested in domestic computing, where technology mediates how families and other inhabitants interact within the context of the home. While domestic computing can incorporate many things, we focus in this chapter on the role awareness plays in domestic coordination and communication.

  3. Prevalence of C-shaped canals in mandibular second and third molars in a central India population: A cone beam computed tomography analysis.

    PubMed

    Wadhwani, Shefali; Singh, Mahesh Pratap; Agarwal, Manish; Somasundaram, Pavithra; Rawtiya, Manjusha; Wadhwani, P K

    2017-01-01

    To evaluate the prevalence of C-shaped root canals in mandibular molars using cone beam computed tomography (CBCT) in a subpopulation of Central India. CBCT scans of patients from diagnostic imaging center were selected in accordance with the criteria given by Fan et al. (2004) for C-shaped canals. A total of 238 CBCT scans fulfilled the inclusion criteria and were thereby divided into two groups: Group 1: Images showing C-shaped canal configuration in mandibular second molars. Group 2: Images showing C-shaped canal configuration in mandibular third molars. The frequency and distribution of canals and their configuration along with the position of lingual/buccal grooves in the images were evaluated, and the data was analyzed. CBCT evaluation showed that 9.7% of second molars and 8% of third molars had C-shaped canals. A prominent buccal groove was seen in these teeth. The data showed a significant difference (P = 0.038) for the presence of such anatomy on the right side for mandibular third molars. The study showed a significant prevalence of C-shaped canal configuration in the subpopulation studied.

  4. Prevalence of C-shaped canals in mandibular second and third molars in a central India population: A cone beam computed tomography analysis

    PubMed Central

    Wadhwani, Shefali; Singh, Mahesh Pratap; Agarwal, Manish; Somasundaram, Pavithra; Rawtiya, Manjusha; Wadhwani, P. K.

    2017-01-01

    Introduction: To evaluate the prevalence of C-shaped root canals in mandibular molars using cone beam computed tomography (CBCT) in a subpopulation of Central India. Materials and Methods: CBCT scans of patients from diagnostic imaging center were selected in accordance with the criteria given by Fan et al. (2004) for C-shaped canals. A total of 238 CBCT scans fulfilled the inclusion criteria and were thereby divided into two groups: Group 1: Images showing C-shaped canal configuration in mandibular second molars. Group 2: Images showing C-shaped canal configuration in mandibular third molars. The frequency and distribution of canals and their configuration along with the position of lingual/buccal grooves in the images were evaluated, and the data was analyzed. Results: CBCT evaluation showed that 9.7% of second molars and 8% of third molars had C-shaped canals. A prominent buccal groove was seen in these teeth. The data showed a significant difference (P = 0.038) for the presence of such anatomy on the right side for mandibular third molars. Conclusion: The study showed a significant prevalence of C-shaped canal configuration in the subpopulation studied. PMID:29386785

  5. The role of the host in a cooperating mainframe and workstation environment, volumes 1 and 2

    NASA Technical Reports Server (NTRS)

    Kusmanoff, Antone; Martin, Nancy L.

    1989-01-01

    In recent years, advancements made in computer systems have prompted a move from centralized computing based on timesharing a large mainframe computer to distributed computing based on a connected set of engineering workstations. A major factor in this advancement is the increased performance and lower cost of engineering workstations. The shift to distributed computing from centralized computing has led to challenges associated with the residency of application programs within the system. In a combined system of multiple engineering workstations attached to a mainframe host, the question arises as to how a system designer assigns applications between the larger mainframe host and the smaller, yet powerful, workstation. The concepts related to real time data processing are analyzed and systems are displayed which use a host mainframe and a number of engineering workstations interconnected by a local area network. In most cases, distributed systems can be classified as having a single function or multiple functions and as executing programs in real time or nonreal time. In a system of multiple computers, the degree of autonomy of the computers is important; a system with one master control computer generally differs in reliability, performance, and complexity from a system in which all computers share the control. This research is concerned with generating general criteria principles for software residency decisions (host or workstation) for a diverse yet coupled group of users (the clustered workstations) which may need the use of a shared resource (the mainframe) to perform their functions.

  6. PAGANI Toolkit: Parallel graph-theoretical analysis package for brain network big data.

    PubMed

    Du, Haixiao; Xia, Mingrui; Zhao, Kang; Liao, Xuhong; Yang, Huazhong; Wang, Yu; He, Yong

    2018-05-01

    The recent collection of unprecedented quantities of neuroimaging data with high spatial resolution has led to brain network big data. However, a toolkit for fast and scalable computational solutions is still lacking. Here, we developed the PArallel Graph-theoretical ANalysIs (PAGANI) Toolkit based on a hybrid central processing unit-graphics processing unit (CPU-GPU) framework with a graphical user interface to facilitate the mapping and characterization of high-resolution brain networks. Specifically, the toolkit provides flexible parameters for users to customize computations of graph metrics in brain network analyses. As an empirical example, the PAGANI Toolkit was applied to individual voxel-based brain networks with ∼200,000 nodes that were derived from a resting-state fMRI dataset of 624 healthy young adults from the Human Connectome Project. Using a personal computer, this toolbox completed all computations in ∼27 h for one subject, which is markedly less than the 118 h required with a single-thread implementation. The voxel-based functional brain networks exhibited prominent small-world characteristics and densely connected hubs, which were mainly located in the medial and lateral fronto-parietal cortices. Moreover, the female group had significantly higher modularity and nodal betweenness centrality mainly in the medial/lateral fronto-parietal and occipital cortices than the male group. Significant correlations between the intelligence quotient and nodal metrics were also observed in several frontal regions. Collectively, the PAGANI Toolkit shows high computational performance and good scalability for analyzing connectome big data and provides a friendly interface without the complicated configuration of computing environments, thereby facilitating high-resolution connectomics research in health and disease. © 2018 Wiley Periodicals, Inc.
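
    The graph metrics named in this abstract (nodal betweenness centrality, modularity) are standard; as a small illustration, assuming the Python networkx library and a toy random graph rather than the CPU-GPU PAGANI Toolkit and real connectome data, they can be computed as follows:

        # Illustrative only: betweenness centrality and modularity on a toy graph
        # with networkx, standing in for the much larger voxel-based networks
        # analyzed by the PAGANI Toolkit.
        import networkx as nx
        from networkx.algorithms import community

        G = nx.erdos_renyi_graph(n=200, p=0.05, seed=42)

        betweenness = nx.betweenness_centrality(G)            # nodal betweenness centrality
        hubs = sorted(betweenness, key=betweenness.get, reverse=True)[:5]

        communities = community.greedy_modularity_communities(G)
        Q = community.modularity(G, communities)               # modularity of that partition

        print("top hub nodes:", hubs)
        print("modules:", len(communities), "modularity:", round(Q, 3))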

  7. Randomized controlled within-subject evaluation of digital and conventional workflows for the fabrication of lithium disilicate single crowns. Part II: CAD-CAM versus conventional laboratory procedures.

    PubMed

    Sailer, Irena; Benic, Goran I; Fehmer, Vincent; Hämmerle, Christoph H F; Mühlemann, Sven

    2017-07-01

    Clinical studies are needed to evaluate the entire digital and conventional workflows in prosthetic dentistry. The purpose of the second part of this clinical study was to compare the laboratory production time for tooth-supported single crowns made with 4 different digital workflows and 1 conventional workflow and to compare these crowns clinically. For each of 10 participants, a monolithic crown was fabricated in lithium disilicate-reinforced glass ceramic (IPS e.max CAD). The computer-aided design and computer-aided manufacturing (CAD-CAM) systems were Lava C.O.S. CAD software and centralized CAM (group L), Cares CAD software and centralized CAM (group iT), Cerec Connect CAD software and lab side CAM (group CiL), and Cerec Connect CAD software with centralized CAM (group CiD). The conventional fabrication (group K) included a wax pattern of the crown and heat pressing according to the lost-wax technique (IPS e.max Press). The time for the fabrication of the casts and the crowns was recorded. Subsequently, the crowns were clinically evaluated and the corresponding treatment times were recorded. The Paired Wilcoxon test with the Bonferroni correction was applied to detect differences among treatment groups (α=.05). The total mean (±standard deviation) active working time for the dental technician was 88 ±6 minutes in group L, 74 ±12 minutes in group iT, 74 ±5 minutes in group CiL, 92 ±8 minutes in group CiD, and 148 ±11 minutes in group K. The dental technician spent significantly more working time for the conventional workflow than for the digital workflows (P<.001). No statistically significant differences were found between group L and group CiD or between group iT and group CiL. No statistical differences in time for the clinical evaluation were found among groups, indicating similar outcomes (P>.05). Irrespective of the CAD-CAM system, the overall laboratory working time for a digital workflow was significantly shorter than for the conventional workflow, since the dental technician needed less active working time. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
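
    The statistical comparison described here (paired Wilcoxon tests on working times with a Bonferroni correction across group comparisons) can be sketched as follows; the minute values below are invented for illustration and scipy is assumed, so this reproduces the method, not the study's data:

        # Illustrative paired Wilcoxon signed-rank test with Bonferroni correction.
        # Values are made-up working times (minutes) for ten crowns in two workflows.
        from scipy.stats import wilcoxon

        conventional = [150, 139, 160, 142, 155, 148, 144, 151, 146, 145]
        digital = [90, 85, 95, 88, 92, 86, 84, 91, 87, 82]

        stat, p = wilcoxon(conventional, digital)
        n_comparisons = 10                     # number of pairwise group comparisons
        p_adjusted = min(1.0, p * n_comparisons)
        print(f"W = {stat}, raw p = {p:.4f}, Bonferroni-adjusted p = {p_adjusted:.4f}")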

  8. LaRC local area networks to support distributed computing

    NASA Technical Reports Server (NTRS)

    Riddle, E. P.

    1984-01-01

    The Langley Research Center's (LaRC) Local Area Network (LAN) effort is discussed. LaRC initiated the development of a LAN to support a growing distributed computing environment at the Center. The purpose of the network is to provide an improved capability (over interactive and RJE terminal access) for sharing multivendor computer resources. Specifically, the network will provide a data highway for the transfer of files between mainframe computers, minicomputers, work stations, and personal computers. An important influence on the overall network design was the vital need of LaRC researchers to efficiently utilize the large CDC mainframe computers in the central scientific computing facility. Although there was a steady migration from a centralized to a distributed computing environment at LaRC in recent years, the work load on the central resources increased. Major emphasis in the network design was on communication with the central resources within the distributed environment. The network to be implemented will allow researchers to utilize the central resources, distributed minicomputers, work stations, and personal computers to obtain the proper level of computing power to efficiently perform their jobs.

  9. The Effects of Using Dynamic Geometry on Eighth Grade Students' Achievement and Attitude towards Triangles

    ERIC Educational Resources Information Center

    Turk, Halime Samur; Akyuz, Didem

    2016-01-01

    This study investigates the effects of dynamic geometry based computer instruction on eighth grade students' achievement in geometry and their attitudes toward geometry and technology compared to traditional instruction. Central to the study was a controlled experiment, which contained experimental and control groups both instructed by the same…

  10. 4d $\mathcal{N}=2$ theories with disconnected gauge groups

    DOE PAGES

    Argyres, Philip C.; Martone, Mario

    2017-03-28

    In this paper we present a beautifully consistent web of evidence for the existence of interacting 4d rank-1 $\mathcal{N}=2$ SCFTs obtained from gauging discrete subgroups of global symmetries of other existing 4d rank-1 $\mathcal{N}=2$ SCFTs. The global symmetries that can be gauged involve a non-trivial combination of discrete subgroups of U(1)_R, the low-energy EM duality group SL(2,Z), and the outer automorphism group of the flavor symmetry algebra, Out(F). The theories that we construct are remarkable in many ways: (i) two of them have exceptional F_4 and G_2 flavor groups; (ii) they substantially complete the picture of the landscape of rank-1 $\mathcal{N}=2$ SCFTs as they realize all but one of the remaining consistent rank-1 Seiberg-Witten geometries that we previously constructed but were not associated to known SCFTs; and (iii) some of them have enlarged $\mathcal{N}=3$ SUSY, and have not been previously constructed. They are also examples of SCFTs which violate the Shapere-Tachikawa relation between the conformal central charges and the scaling dimension of the Coulomb branch vev. Here, we propose a modification of the formulas computing these central charges from the topologically twisted Coulomb branch partition function which correctly compute them for discretely gauged theories.

  11. 21 CFR 1305.24 - Central processing of orders.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... or more registered locations and maintains a central processing computer system in which orders are... order with all linked records on the central computer system. (b) A company that has central processing... the company owns and operates. ...

  12. 21 CFR 1305.24 - Central processing of orders.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... or more registered locations and maintains a central processing computer system in which orders are... order with all linked records on the central computer system. (b) A company that has central processing... the company owns and operates. ...

  13. 21 CFR 1305.24 - Central processing of orders.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... or more registered locations and maintains a central processing computer system in which orders are... order with all linked records on the central computer system. (b) A company that has central processing... the company owns and operates. ...

  14. 21 CFR 1305.24 - Central processing of orders.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... or more registered locations and maintains a central processing computer system in which orders are... order with all linked records on the central computer system. (b) A company that has central processing... the company owns and operates. ...

  15. 21 CFR 1305.24 - Central processing of orders.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... or more registered locations and maintains a central processing computer system in which orders are... order with all linked records on the central computer system. (b) A company that has central processing... the company owns and operates. ...

  16. 4d $\mathcal{N}=2$ theories with disconnected gauge groups

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Argyres, Philip C.; Martone, Mario

    In this paper we present a beautifully consistent web of evidence for the existence of interacting 4d rank-1 $\mathcal{N}=2$ SCFTs obtained from gauging discrete subgroups of global symmetries of other existing 4d rank-1 $\mathcal{N}=2$ SCFTs. The global symmetries that can be gauged involve a non-trivial combination of discrete subgroups of U(1)_R, the low-energy EM duality group SL(2,Z), and the outer automorphism group of the flavor symmetry algebra, Out(F). The theories that we construct are remarkable in many ways: (i) two of them have exceptional F_4 and G_2 flavor groups; (ii) they substantially complete the picture of the landscape of rank-1 $\mathcal{N}=2$ SCFTs as they realize all but one of the remaining consistent rank-1 Seiberg-Witten geometries that we previously constructed but were not associated to known SCFTs; and (iii) some of them have enlarged $\mathcal{N}=3$ SUSY, and have not been previously constructed. They are also examples of SCFTs which violate the Shapere-Tachikawa relation between the conformal central charges and the scaling dimension of the Coulomb branch vev. Here, we propose a modification of the formulas computing these central charges from the topologically twisted Coulomb branch partition function which correctly compute them for discretely gauged theories.

  17. Multipurpose Arcade Combat Simulator (MACS) Basic Rifle Marksmanship (BRM) Program

    DTIC Science & Technology

    1989-10-01

    that shooters are aiming at the center of mass of each target to the best of their abilities. It then computes the central point of the three-round shot... group, measures the distance between this central point and the actual center of the target, and uses this distance as a constant offset value, which...
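
    The computation described in this snippet (central point of a three-round shot group and its constant offset from the true target center) is simple vector arithmetic; a minimal sketch with invented screen coordinates is:

        # Sketch of the offset computation described above: average the three shot
        # positions, then take the vector from that centroid to the target center.
        # Coordinates are hypothetical screen units.
        shots = [(102.0, 97.5), (99.0, 95.0), (104.5, 99.5)]    # three-round shot group
        target_center = (100.0, 100.0)

        cx = sum(x for x, _ in shots) / len(shots)
        cy = sum(y for _, y in shots) / len(shots)

        offset = (target_center[0] - cx, target_center[1] - cy)  # constant aim correction
        distance = (offset[0] ** 2 + offset[1] ** 2) ** 0.5
        print("shot-group center:", (cx, cy), "offset:", offset, "distance:", round(distance, 2))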

  18. Centralized vs. decentralized nursing stations: effects on nurses' functional use of space and work environment.

    PubMed

    Zborowsky, Terri; Bunker-Hellmich, Lou; Morelli, Agneta; O'Neill, Mike

    2010-01-01

    Evidence-based findings of the effects of nursing station design on nurses' work environment and work behavior are essential to improve conditions and increase retention among these fundamental members of the healthcare delivery team. The purpose of this exploratory study was to investigate how nursing station design (i.e., centralized and decentralized nursing station layouts) affected nurses' use of space, patient visibility, noise levels, and perceptions of the work environment. Advances in information technology have enabled nurses to move away from traditional centralized paper-charting stations to smaller decentralized work stations and charting substations located closer to, or inside of, patient rooms. Improved understanding of the trade-offs presented by centralized and decentralized nursing station design has the potential to provide useful information for future nursing station layouts. This information will be critical for understanding the nurse environment "fit." The study used an exploratory design with both qualitative and quantitative methods. Qualitative data regarding the effects of nursing station design on nurses' health and work environment were gathered by means of focus group interviews. Quantitative data-gathering techniques included place- and person-centered space use observations, patient visibility assessments, sound level measurements, and an online questionnaire regarding perceptions of the work environment. Nurses on all units were observed most frequently performing telephone, computer, and administrative duties. Time spent using telephones, computers, and performing other administrative duties was significantly higher in the centralized nursing stations. Consultations with medical staff and social interactions were significantly less frequent in decentralized nursing stations. There were no indications that either centralized or decentralized nursing station designs resulted in superior visibility. Sound levels measured in all nursing stations exceeded recommended levels during all shifts. No significant differences were identified in nurses' perceptions of work control-demand-support in centralized and decentralized nursing station designs. The "hybrid" nursing design model in which decentralized nursing stations are coupled with centralized meeting rooms for consultation between staff members may strike a balance between the increase in computer duties and the ongoing need for communication and consultation that addresses the conflicting demands of technology and direct patient care.

  19. Fluid Centrality: A Social Network Analysis of Social-Technical Relations in Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Enriquez, Judith Guevarra

    2010-01-01

    In this article, centrality is explored as a measure of computer-mediated communication (CMC) in networked learning. Centrality measure is quite common in performing social network analysis (SNA) and in analysing social cohesion, strength of ties and influence in CMC, and computer-supported collaborative learning research. It argues that measuring…

  20. Visitor meanings of place: using computer content analysis to examine visitor meanings at three national capitol sites

    Treesearch

    Wei-Li Jasmine Chen; Chad L. Pierskalla; Theresa L. Goldman; David L. Larsen

    2002-01-01

    A mixed-method study designed to explore the meanings, interest, and connections visitors ascribe to three National Park Service sites: National Capital Parks Central, Rock Creek Park, and George Washington Memorial Parkway's Great Falls Park. The researchers employed the focus group interview technique and asked visitors prior to and then after an interpretive...

  1. Introduction to the LaRC central scientific computing complex

    NASA Technical Reports Server (NTRS)

    Shoosmith, John N.

    1993-01-01

    The computers and associated equipment that make up the Central Scientific Computing Complex of the Langley Research Center are briefly described. The electronic networks that provide access to the various components of the complex and a number of areas that can be used by Langley and contractor staff for special applications (scientific visualization, image processing, software engineering, and grid generation) are also described. Flight simulation facilities that use the central computers are described. Management of the complex, procedures for its use, and available services and resources are discussed. This document is intended for new users of the complex, for current users who wish to keep apprised of changes, and for visitors who need to understand the role of central scientific computers at Langley.

  2. Degree-Pruning Dynamic Programming Approaches to Central Time Series Minimizing Dynamic Time Warping Distance.

    PubMed

    Sun, Tao; Liu, Hongbo; Yu, Hong; Chen, C L Philip

    2016-06-28

    The central time series crystallizes the common patterns of the set it represents. In this paper, we propose a global constrained degree-pruning dynamic programming (g(dp)²) approach to obtain the central time series through minimizing dynamic time warping (DTW) distance between two time series. The DTW matching path theory with global constraints is proved theoretically for our degree-pruning strategy, which is helpful to reduce the time complexity and computational cost. Our approach can achieve the optimal solution between two time series. An approximate method to the central time series of multiple time series [called m_g(dp)²] is presented based on DTW barycenter averaging and our g(dp)² approach by considering a hierarchical merging strategy. As illustrated by the experimental results, our approaches provide better within-group sum of squares and robustness than other relevant algorithms.
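
    For readers unfamiliar with the distance being minimized, a plain (unconstrained, unpruned) dynamic-programming implementation of DTW is sketched below; this is the textbook recurrence only, not the degree-pruning g(dp)² method proposed in the paper:

        # Baseline dynamic time warping distance between two numeric sequences,
        # using the standard O(len(a) * len(b)) recurrence. Shown to fix notation;
        # it omits the global constraints and degree pruning of the paper.
        import math

        def dtw_distance(a, b):
            n, m = len(a), len(b)
            D = [[math.inf] * (m + 1) for _ in range(n + 1)]
            D[0][0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = abs(a[i - 1] - b[j - 1])
                    D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
            return D[n][m]

        print(dtw_distance([0, 1, 2, 3, 2, 1], [0, 1, 1, 2, 3, 2, 1]))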

  3. Refueling Strategies for a Team of Cooperating AUVs

    DTIC Science & Technology

    2011-01-01

    manager, and thus the constraint a centrally managed underwater network imposes on the mission. Task management utilizing Robust Decentralized Task ...the computational complexity. A bid based approach to task management has also been studied as a possible means of decentralization of group task ...currently performing another task. In [18], ground robots perform distributed task allocation using the ASyMTRy-D algorithm, which is based on CNP

  4. Peripheral Distribution of Thrombus Does Not Affect Outcomes After Surgical Pulmonary Embolectomy.

    PubMed

    Pasrija, Chetan; Shah, Aakash; George, Praveen; Mohammed, Isa; Brigante, Francis A; Ghoreishi, Mehrdad; Jeudy, Jean; Taylor, Bradley S; Gammie, James S; Griffith, Bartley P; Kon, Zachary N

    2018-04-04

    Thrombus located distal to the main or primary pulmonary arteries has been previously viewed as a relative contraindication to surgical pulmonary embolectomy. We compared outcomes for surgical pulmonary embolectomy for submassive and massive pulmonary embolism (PE) in patients with central versus peripheral thrombus burden. All consecutive patients (2011-2016) undergoing surgical pulmonary embolectomy at a single center were retrospectively reviewed. Based on computed tomographic angiography of each patient, central PE was defined as any thrombus originating within the lateral pericardial borders (main or right/left pulmonary arteries). Peripheral PE was defined as thrombus exclusively beyond the lateral pericardial borders, involving the lobar pulmonary arteries or distal. The primary outcome was in-hospital and 90-day survival. 70 patients were identified: 52 (74%) with central PE and 18 (26%) with peripheral PE. Preoperative vital signs and right ventricular dysfunction were similar between the two groups. Compared to the central PE cohort, operative time was significantly longer in the peripheral PE group (191 vs. 210 minutes, p<0.005). Median right ventricular dysfunction decreased from moderate dysfunction preoperatively to no dysfunction at discharge in both groups. Overall 90-day survival was 94%, with 100% survival in patients with submassive PE in both cohorts. This single center experience demonstrates excellent overall outcomes for surgical pulmonary embolectomy with resolution of right ventricular dysfunction, and comparable morbidity and mortality for central and peripheral PE. In an experienced center and when physiologically warranted, surgical pulmonary embolectomy for peripheral distribution of thrombus is both technically feasible and effective. Copyright © 2018. Published by Elsevier Inc.

  5. Computer graphics and the graphic artist

    NASA Technical Reports Server (NTRS)

    Taylor, N. L.; Fedors, E. G.; Pinelli, T. E.

    1985-01-01

    A centralized computer graphics system is being developed at the NASA Langley Research Center. This system was required to satisfy multiuser needs, ranging from presentation quality graphics prepared by a graphic artist to 16-mm movie simulations generated by engineers and scientists. While the major thrust of the central graphics system was directed toward engineering and scientific applications, hardware and software capabilities to support the graphic artists were integrated into the design. This paper briefly discusses the importance of computer graphics in research; the central graphics system in terms of systems, software, and hardware requirements; the application of computer graphics to graphic arts, discussed in terms of the requirements for a graphic arts workstation; and the problems encountered in applying computer graphics to the graphic arts. The paper concludes by presenting the status of the central graphics system.

  6. A Privacy-Preserving Platform for User-Centric Quantitative Benchmarking

    NASA Astrophysics Data System (ADS)

    Herrmann, Dominik; Scheuer, Florian; Feustel, Philipp; Nowey, Thomas; Federrath, Hannes

    We propose a centralised platform for quantitative benchmarking of key performance indicators (KPI) among mutually distrustful organisations. Our platform offers users the opportunity to request an ad-hoc benchmarking for a specific KPI within a peer group of their choice. Architecture and protocol are designed to provide anonymity to its users and to hide the sensitive KPI values from other clients and the central server. To this end, we integrate user-centric peer group formation, exchangeable secure multi-party computation protocols, short-lived ephemeral key pairs as pseudonyms, and attribute certificates. We show by empirical evaluation of a prototype that the performance is acceptable for reasonably sized peer groups.
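
    One standard way to realize the "exchangeable secure multi-party computation protocols" mentioned here, for an additive KPI such as a sum or mean over the peer group, is additive secret sharing. The toy Python sketch below illustrates that general idea only; it is not the authors' platform or protocol:

        # Toy additive secret sharing over a prime field: each organisation splits
        # its private KPI value into random shares, one per peer, so that only the
        # group total (and hence the mean) can be reconstructed.
        import random

        P = 2 ** 61 - 1          # prime modulus, large enough for the KPI values

        def share(value, n_parties):
            shares = [random.randrange(P) for _ in range(n_parties - 1)]
            shares.append((value - sum(shares)) % P)
            return shares

        kpis = [120, 340, 95]                                # private values of three peers
        all_shares = [share(v, len(kpis)) for v in kpis]

        # Party i locally sums the i-th share received from every peer ...
        partial_sums = [sum(col) % P for col in zip(*all_shares)]
        # ... and publishing only those partial sums reveals the total, not the inputs.
        total = sum(partial_sums) % P
        print("benchmark sum:", total, "mean:", total / len(kpis))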

  7. 31 CFR 285.7 - Salary offset.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Secretary, has waived certain requirements of the Computer Matching and Privacy Protection Act of 1988, 5 U... process known as centralized salary offset computer matching, identify Federal employees who owe delinquent nontax debt to the United States. Centralized salary offset computer matching is the computerized...

  8. Machine Learning-based Texture Analysis of Contrast-enhanced MR Imaging to Differentiate between Glioblastoma and Primary Central Nervous System Lymphoma.

    PubMed

    Kunimatsu, Akira; Kunimatsu, Natsuko; Yasaka, Koichiro; Akai, Hiroyuki; Kamiya, Kouhei; Watadani, Takeyuki; Mori, Harushi; Abe, Osamu

    2018-05-16

    Although advanced MRI techniques are increasingly available, imaging differentiation between glioblastoma and primary central nervous system lymphoma (PCNSL) is sometimes confusing. We aimed to evaluate the performance of image classification by support vector machine, a method of traditional machine learning, using texture features computed from contrast-enhanced T1-weighted images. This retrospective study on preoperative brain tumor MRI included 76 consecutive, initially treated patients with glioblastoma (n = 55) or PCNSL (n = 21) from one institution, consisting of an independent training group (n = 60: 44 glioblastomas and 16 PCNSLs) and test group (n = 16: 11 glioblastomas and 5 PCNSLs) sequentially separated by time periods. A total set of 67 texture features was computed on routine contrast-enhanced T1-weighted images of the training group, and the top four most discriminating features were selected as input variables to train support vector machine classifiers. These features were then evaluated on the test group with subsequent image classification. The area under the receiver operating characteristic curves on the training data was calculated at 0.99 (95% confidence interval [CI]: 0.96-1.00) for the classifier with a Gaussian kernel and 0.87 (95% CI: 0.77-0.95) for the classifier with a linear kernel. On the test data, both of the classifiers showed prediction accuracy of 75% (12/16) of the test images. Although further improvement is needed, our preliminary results suggest that machine learning-based image classification may provide complementary diagnostic information on routine brain MRI.
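
    The classification step described (a support vector machine with a Gaussian kernel on a handful of selected texture features) can be sketched with scikit-learn; the feature matrix below is random placeholder data, not the study's MRI texture features:

        # Illustrative SVM classification step with an RBF (Gaussian) kernel using
        # scikit-learn. X stands in for four selected texture features per patient;
        # y encodes glioblastoma (0) versus PCNSL (1). Data are random placeholders.
        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        X_train, y_train = rng.normal(size=(60, 4)), rng.integers(0, 2, size=60)
        X_test, y_test = rng.normal(size=(16, 4)), rng.integers(0, 2, size=16)

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
        clf.fit(X_train, y_train)
        print("test accuracy:", clf.score(X_test, y_test))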

  9. 31 CFR 285.7 - Salary offset.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... requirements of the Computer Matching and Privacy Protection Act of 1988, 5 U.S.C. 552a, as amended, for... known as centralized salary offset computer matching, identify Federal employees who owe delinquent nontax debt to the United States. Centralized salary offset computer matching is the computerized...

  10. 31 CFR 285.7 - Salary offset.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... requirements of the Computer Matching and Privacy Protection Act of 1988, 5 U.S.C. 552a, as amended, for... known as centralized salary offset computer matching, identify Federal employees who owe delinquent nontax debt to the United States. Centralized salary offset computer matching is the computerized...

  11. 31 CFR 285.7 - Salary offset.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... requirements of the Computer Matching and Privacy Protection Act of 1988, 5 U.S.C. 552a, as amended, for... known as centralized salary offset computer matching, identify Federal employees who owe delinquent nontax debt to the United States. Centralized salary offset computer matching is the computerized...

  12. 31 CFR 285.7 - Salary offset.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... requirements of the Computer Matching and Privacy Protection Act of 1988, 5 U.S.C. 552a, as amended, for... known as centralized salary offset computer matching, identify Federal employees who owe delinquent nontax debt to the United States. Centralized salary offset computer matching is the computerized...

  13. Radar Detection Models in Computer Supported Naval War Games

    DTIC Science & Technology

    1979-06-08

    revealed a requirement for the effective centralized management of computer supported war game development and employment in the U.S. Navy. A...considerations and supports the requirement for centralized management of computerized war game development. Therefore it is recommended that a central...managerial and fiscal authority be established for computerized tactical war game development. This central authority should ensure that new games

  14. En Garde: Fencing at Kansas City's Central Computers Unlimited/Classical Greek Magnet High School, 1991-1995

    ERIC Educational Resources Information Center

    Poos, Bradley W.

    2015-01-01

    Central High School in Kansas City, Missouri is one of the oldest schools west of the Mississippi and the first public high school built in Kansas City. Kansas City's magnet plan resulted in Central High School being rebuilt as the Central Computers Unlimited/Classical Greek Magnet High School, a school that was designed to offer students an…

  15. Dormitory renovation project reduces energy use by 69%

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kokayko, M.J.

    1997-06-01

    Baldwin Hall is a three-story, 46,000 ft{sup 2} (4,273 m{sup 2}) dormitory on the campus of Allegheny College in Meadville, Pa. The building was originally built in the 1950s; an additional wing was added in the 1970s so that it has about 37,000 ft{sup 2} (3,437 m{sup 2}). The building contains approximately 100 double-occupancy student rooms; three common bathroom groups per floor; central study, lounge, and computer areas; and a laundry. Design for the renovation started in the winter of 1993; construction took place in the summer of 1994. The major goals of the renovation were: (1) to replace the entire building heating system (central boiler plant, distribution piping, and room heating terminals); (2) add a ventilation system within the building; (3) upgrade the building electrical system; (4) provide computer data cabling and cable TV wiring to each room; and (5) improve room and hallway lighting and finishes.

  16. Tensor models, Kronecker coefficients and permutation centralizer algebras

    NASA Astrophysics Data System (ADS)

    Geloun, Joseph Ben; Ramgoolam, Sanjaye

    2017-11-01

    We show that the counting of observables and correlators for a 3-index tensor model is organized by the structure of a family of permutation centralizer algebras. These algebras are shown to be semi-simple, and their Wedderburn-Artin decompositions into matrix blocks are given in terms of Clebsch-Gordan coefficients of symmetric groups. The matrix basis for the algebras also gives an orthogonal basis for the tensor observables which diagonalizes the Gaussian two-point functions. The centres of the algebras are associated with correlators which are expressible in terms of Kronecker coefficients (Clebsch-Gordan multiplicities of symmetric groups). The color-exchange symmetry present in the Gaussian model, as well as in a large class of interacting models, is used to refine the description of the permutation centralizer algebras. This discussion is extended to a general number of colors d: it is used to prove the integrality of an infinite family of number sequences related to color-symmetrizations of colored graphs, and expressible in terms of symmetric group representation theory data. Generalizing a connection between matrix models and Belyi maps, correlators in Gaussian tensor models are interpreted in terms of covers of singular 2-complexes. There is an intriguing difference between matrix and higher-rank tensor models in the computational complexity of superficially comparable correlators of observables parametrized by Young diagrams.

  17. Education and Training Practices: 2010 and Beyond

    DTIC Science & Technology

    1989-05-01

    compute attentional load at different points in the acquisition of piloting skills, or to determine fidelity standards for visual and auditory stimuli in...electrical/magnetic cerebral stimulation. Instructions given out to all attendees for each Working Group were deliberately designed to depict...in the use of non-invasive electronic stimulation of targeted areas of the Central Nervous System (CNS). Research should therefore be directed more

  18. Comparison of interpupillary distance and combined mesiodistal width of maxillary central incisor teeth in two ethnic groups of Northeast India: An in vivo study.

    PubMed

    Barman, Jogeswar; Serin, Sangma

    2018-01-01

    Anthropometric measurements of the face can be used as a guide in selecting properly sized anterior teeth. The aim of this study is to evaluate the relationship between the interpupillary distance (IPD) and the combined mesiodistal width of maxillary central incisors (MDW of MCIs) to establish their morphometric criterion and their significance in two ethnic groups of Northeast India. A total of 120 participants consisting of 60 indigenous students each from Assam and Meghalaya in the age group of 18-25 years were selected after obtaining their written consent. Standardized frontal facial photographs of all the participants were taken using a digital camera in such a manner that the maxillary anterior teeth were visible. The photographs were uploaded onto a computer and saved in a file. Anthropometric measurements of IPD and combined MDW of MCIs in centimeters were made both with the Adobe Photoshop® 7.0 software program and manually with a digital vernier caliper on photographs developed to the same size of 15 cm × 10 cm. Data obtained were tabulated and analyzed using Student's t-test and the Pearson correlation test. The present study reveals a positive correlation, with a high degree of statistical significance (P < 0.01), between IPD and combined mesiodistal width of the maxillary central incisors among all the samples irrespective of gender and ethnicity. IPD can be used as a guide in determining the suitable mesiodistal dimension of the maxillary central incisors.
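
    The statistics reported here, a Pearson correlation between IPD and combined MDW of MCIs and Student's t-test for group comparisons, are straightforward to compute. The SciPy sketch below uses made-up measurements, since the study's raw data are not given.

    ```python
    # Hedged sketch with made-up measurements: correlate interpupillary distance (IPD)
    # with combined mesiodistal width of the maxillary central incisors (MDW of MCIs).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    ipd = rng.normal(6.2, 0.3, size=60)           # cm, placeholder values for one group
    mdw = 0.28 * ipd + rng.normal(0, 0.05, 60)    # cm, placeholder values

    r, p = stats.pearsonr(ipd, mdw)
    print(f"Pearson r = {r:.2f}, P = {p:.3g}")

    # Between-group comparison of IPD (e.g., the two ethnic-group samples)
    ipd_group2 = rng.normal(6.0, 0.3, size=60)
    t, p_t = stats.ttest_ind(ipd, ipd_group2)
    print(f"t = {t:.2f}, P = {p_t:.3g}")
    ```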

  19. Central Fetal Monitoring With and Without Computer Analysis: A Randomized Controlled Trial.

    PubMed

    Nunes, Inês; Ayres-de-Campos, Diogo; Ugwumadu, Austin; Amin, Pina; Banfield, Philip; Nicoll, Antony; Cunningham, Simon; Sousa, Paulo; Costa-Santos, Cristina; Bernardes, João

    2017-01-01

    To evaluate whether intrapartum fetal monitoring with computer analysis and real-time alerts decreases the rate of newborn metabolic acidosis or obstetric intervention when compared with visual analysis. A randomized clinical trial carried out in five hospitals in the United Kingdom evaluated women with singleton, vertex fetuses of 36 weeks of gestation or greater during labor. Continuous central fetal monitoring by computer analysis and online alerts (experimental arm) was compared with visual analysis (control arm). Fetal blood sampling and electrocardiographic ST waveform analysis were available in both arms. The primary outcome was incidence of newborn metabolic acidosis (pH less than 7.05 and base deficit greater than 12 mmol/L). Prespecified secondary outcomes included operative delivery, use of fetal blood sampling, low 5-minute Apgar score, neonatal intensive care unit admission, hypoxic-ischemic encephalopathy, and perinatal death. A sample size of 3,660 per group (N=7,320) was planned to be able to detect a reduction in the rate of metabolic acidosis from 2.8% to 1.8% (two-tailed α of 0.05 with 80% power). From August 2011 through July 2014, 32,306 women were assessed for eligibility and 7,730 were randomized: 3,961 to computer analysis and online alerts, and 3,769 to visual analysis. Baseline characteristics were similar in both groups. Metabolic acidosis occurred in 16 participants (0.40%) in the experimental arm and 22 participants (0.58%) in the control arm (relative risk 0.69 [0.36-1.31]). No statistically significant differences were found in the incidence of secondary outcomes. Compared with visual analysis, computer analysis of fetal monitoring signals with real-time alerts did not significantly reduce the rate of metabolic acidosis or obstetric intervention. A lower-than-expected rate of newborn metabolic acidosis was observed in both arms of the trial. ISRCTN Registry, http://www.isrctn.com, ISRCTN42314164.

  20. The PLATO IV Architecture.

    ERIC Educational Resources Information Center

    Stifle, Jack

    The PLATO IV computer-based instructional system consists of a large scale centrally located CDC 6400 computer and a large number of remote student terminals. This is a brief and general description of the proposed input/output hardware necessary to interface the student terminals with the computer's central processing unit (CPU) using available…

  1. Central Computational Facility CCF communications subsystem options

    NASA Technical Reports Server (NTRS)

    Hennigan, K. B.

    1979-01-01

    A MITRE study which investigated the communication options available to support both the remaining Central Computational Facility (CCF) computer systems and the proposed U1108 replacements is presented. The facilities utilized to link the remote user terminals with the CCF were analyzed and guidelines to provide more efficient communications were established.

  2. Embedding global and collective in a torus network with message class map based tree path selection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Dong; Coteus, Paul W.; Eisley, Noel A.

    Embodiments of the invention provide a method, system and computer program product for embedding a global barrier and global interrupt network in a parallel computer system organized as a torus network. The computer system includes a multitude of nodes. In one embodiment, the method comprises taking inputs from a set of receivers of the nodes, dividing the inputs from the receivers into a plurality of classes, combining the inputs of each of the classes to obtain a result, and sending said result to a set of senders of the nodes. Embodiments of the invention provide a method, system and computer program product for embedding a collective network in a parallel computer system organized as a torus network. In one embodiment, the method comprises adding to a torus network a central collective logic to route messages among at least a group of nodes in a tree structure.

  3. Centralized Authentication with Kerberos 5, Part I

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wachsmann, A

    Account administration in a distributed Unix/Linux environment can become very complicated and messy if done by hand. Large sites use special tools to deal with this problem. I will describe how even very small installations, like your three-computer network at home, can take advantage of the very same tools. The problem in a distributed environment is that password and shadow files need to be changed individually on each machine if an account change occurs. Account changes include: password change, addition/removal of accounts, name change of an account (UID/GID changes are a big problem in any case), additional or removed login privileges to a (group of) computer(s), etc. In this article, I will show how Kerberos 5 solves the authentication problem in a distributed computing environment. A second article will describe a solution for the authorization problem.

  4. Technical Challenges and Opportunities of Centralizing Space Science Mission Operations (SSMO) at NASA Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Ido, Haisam; Burns, Rich

    2015-01-01

    The NASA Goddard Space Science Mission Operations project (SSMO) is performing a technical cost-benefit analysis for centralizing and consolidating operations of a diverse set of missions into a unified and integrated technical infrastructure. The presentation will focus on the notion of normalizing spacecraft operations processes, workflows, and tools. It will also show the processes of creating a standardized open architecture; common security models and implementations; and interface, service, automation, notification, alerting, logging, publish/subscribe, and middleware capabilities. The presentation will also discuss how to leverage traditional capabilities along with virtualization, cloud computing services, control groups and containers, and possibly Big Data concepts.

  5. Computing at DESY — current setup, trends and strategic directions

    NASA Astrophysics Data System (ADS)

    Ernst, Michael

    1998-05-01

    Since the HERA experiments H1 and ZEUS started data taking in '92, the computing environment at DESY has changed dramatically. After running mainframe-centred computing for more than 20 years, DESY switched to a heterogeneous, fully distributed computing environment within only about two years, in almost every corner where computing has its applications. The computing strategy was highly influenced by the needs of the user community. The collaborations are usually limited by current technology, and their ever-increasing demands are the driving force for central computing to always move close to the technology edge. While DESY's central computing has multidecade experience in running Central Data Recording/Central Data Processing for HEP experiments, the most challenging task today is to provide clear and homogeneous concepts in the desktop area. Given that lowest-level commodity hardware draws more and more attention, combined with the financial constraints we already face today, we quickly need concepts for integrated support of a versatile device which has the potential to move into basically any computing area in HEP. Though commercial solutions, especially those addressing PC management and support issues, are expected to come to market in the next 2-3 years, we need to provide suitable solutions now. PCs are currently bought at DESY at a rate of about 30 per month; without such concepts this will absorb any available manpower in central computing and still leave hundreds of people unhappy. Though certainly not the only area, the desktop issue is one of the most important ones where we need HEP-wide collaboration to a large extent, and right now. Taking into account that there is traditionally no room for R&D at DESY, collaboration, meaning sharing experience and development resources within the HEP community, is a predominant factor for us.

  6. Computer simulation of the cumulative effects of brushland fire-management policies

    NASA Astrophysics Data System (ADS)

    Bonnicksen, Thomas M.

    1980-01-01

    A mathematical model simulates the cumulative volume of debris produced from brushland watersheds. Application of this model to a 176-km2 (68-mi2) watershed along the southern flank of the Central San Gabriel Mountains permits assessment of expected debris production associated with alternative fire-management policies. The political implications of simulated debris production are evaluated through a conceptual model that links interest groups to particular successional stages in brushland watersheds by means of the resources claimed by each group. It is concluded that, in theory, a rotation burn policy would provide benefits to more interest groups concerned about southern California's brushland watersheds than does the current fire-exclusion policy.

  7. Network dynamics of social influence in the wisdom of crowds

    PubMed Central

    Brackbill, Devon; Centola, Damon

    2017-01-01

    A longstanding problem in the social, biological, and computational sciences is to determine how groups of distributed individuals can form intelligent collective judgments. Since Galton’s discovery of the “wisdom of crowds” [Galton F (1907) Nature 75:450–451], theories of collective intelligence have suggested that the accuracy of group judgments requires individuals to be either independent, with uncorrelated beliefs, or diverse, with negatively correlated beliefs [Page S (2008) The Difference: How the Power of Diversity Creates Better Groups, Firms, Schools, and Societies]. Previous experimental studies have supported this view by arguing that social influence undermines the wisdom of crowds. These results showed that individuals’ estimates became more similar when subjects observed each other’s beliefs, thereby reducing diversity without a corresponding increase in group accuracy [Lorenz J, Rauhut H, Schweitzer F, Helbing D (2011) Proc Natl Acad Sci USA 108:9020–9025]. By contrast, we show general network conditions under which social influence improves the accuracy of group estimates, even as individual beliefs become more similar. We present theoretical predictions and experimental results showing that, in decentralized communication networks, group estimates become reliably more accurate as a result of information exchange. We further show that the dynamics of group accuracy change with network structure. In centralized networks, where the influence of central individuals dominates the collective estimation process, group estimates become more likely to increase in error. PMID:28607070

  8. Network dynamics of social influence in the wisdom of crowds.

    PubMed

    Becker, Joshua; Brackbill, Devon; Centola, Damon

    2017-06-27

    A longstanding problem in the social, biological, and computational sciences is to determine how groups of distributed individuals can form intelligent collective judgments. Since Galton's discovery of the "wisdom of crowds" [Galton F (1907) Nature 75:450-451], theories of collective intelligence have suggested that the accuracy of group judgments requires individuals to be either independent, with uncorrelated beliefs, or diverse, with negatively correlated beliefs [Page S (2008) The Difference: How the Power of Diversity Creates Better Groups, Firms, Schools, and Societies ]. Previous experimental studies have supported this view by arguing that social influence undermines the wisdom of crowds. These results showed that individuals' estimates became more similar when subjects observed each other's beliefs, thereby reducing diversity without a corresponding increase in group accuracy [Lorenz J, Rauhut H, Schweitzer F, Helbing D (2011) Proc Natl Acad Sci USA 108:9020-9025]. By contrast, we show general network conditions under which social influence improves the accuracy of group estimates, even as individual beliefs become more similar. We present theoretical predictions and experimental results showing that, in decentralized communication networks, group estimates become reliably more accurate as a result of information exchange. We further show that the dynamics of group accuracy change with network structure. In centralized networks, where the influence of central individuals dominates the collective estimation process, group estimates become more likely to increase in error.
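
    The mechanism examined in these experiments, repeated revision of numeric estimates in light of network neighbors' estimates, is commonly modeled as DeGroot-style averaging. The sketch below is not the authors' code; it simply contrasts a decentralized ring with a centralized star under neighbor averaging, using an assumed true value and noise level.

    ```python
    # Illustrative simulation (not the authors' implementation): average-of-neighbors
    # belief updating on a decentralized ring vs. a centralized star network.
    import numpy as np
    import networkx as nx

    def simulate(graph, true_value=100.0, rounds=5, seed=0):
        rng = np.random.default_rng(seed)
        beliefs = true_value + rng.normal(0, 20, graph.number_of_nodes())
        for _ in range(rounds):
            new = beliefs.copy()
            for node in graph.nodes:
                nbrs = list(graph.neighbors(node)) + [node]
                new[node] = beliefs[nbrs].mean()   # revise toward the neighborhood mean
            beliefs = new
        return abs(beliefs.mean() - true_value)    # group error after social influence

    ring = nx.cycle_graph(40)                      # decentralized network
    star = nx.star_graph(39)                       # centralized network (node 0 dominates)
    print("ring error:", simulate(ring), "star error:", simulate(star))
    ```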

  9. Integrating Micro-computers with a Centralized DBMS: ORACLE, SEED AND INGRES

    NASA Technical Reports Server (NTRS)

    Hoerger, J.

    1984-01-01

    Users of ADABAS, a relational-like data base management system, and its data base programming language NATURAL are acquiring microcomputers with hopes of solving their individual word processing, office automation, decision support, and simple data processing problems. As processor speeds, memory sizes, and disk storage capacities increase, individual departments begin to maintain "their own" data base on "their own" micro-computer. This situation can adversely affect several of the primary goals set for implementing a centralized DBMS. In order to avoid this potential problem, these micro-computers must be integrated with the centralized DBMS. An easy-to-use and flexible means for transferring logical data base files between the central data base machine and micro-computers must be provided. Some of the problems encountered in an effort to accomplish this integration, and possible solutions, are discussed.

  10. Educational technology in care management: technological profile of nurses in Portuguese hospitals.

    PubMed

    Landeiro, Maria José Lumini; Freire, Rosa Maria Albuquerque; Martins, Maria Manuela; Martins, Teresa Vieira; Peres, Heloísa Helena Ciqueto

    2015-12-01

    Objective: To identify the technological profile of nurses in Portuguese hospitals. Method: A quantitative exploratory study conducted in two hospitals in the northern region and one in the central region of Portugal. The sample was randomly selected and included 960 nurses. Results: Of the participants, 420 (46.1%) used computers, 196 (23.4%) reported having knowledge about using computers for teaching, 174 (21.1%) used computers to teach, 112 (15.1%) recognized that using computers can be a technological means to supplement classroom training, 477 (61.6%) would like to receive training on using computers, and 382 (40.9%) reported self-learning of information technology. In relation to distance education, 706 (74.9%) reported they were familiar with it and 752 (76.4%) indicated an interest in participating in training using this modality. Conclusion: Organizations should be mindful of the technological profile shown by this group of nurses and look for ways to introduce educational technologies in the management of care.

  11. Embedding global barrier and collective in torus network with each node combining input from receivers according to class map for output to senders

    DOEpatents

    Chen, Dong; Coteus, Paul W; Eisley, Noel A; Gara, Alan; Heidelberger, Philip; Senger, Robert M; Salapura, Valentina; Steinmacher-Burow, Burkhard; Sugawara, Yutaka; Takken, Todd E

    2013-08-27

    Embodiments of the invention provide a method, system and computer program product for embedding a global barrier and global interrupt network in a parallel computer system organized as a torus network. The computer system includes a multitude of nodes. In one embodiment, the method comprises taking inputs from a set of receivers of the nodes, dividing the inputs from the receivers into a plurality of classes, combining the inputs of each of the classes to obtain a result, and sending said result to a set of senders of the nodes. Embodiments of the invention provide a method, system and computer program product for embedding a collective network in a parallel computer system organized as a torus network. In one embodiment, the method comprises adding to a torus network a central collective logic to route messages among at least a group of nodes in a tree structure.

  12. Hand-held computer operating system program for collection of resident experience data.

    PubMed

    Malan, T K; Haffner, W H; Armstrong, A Y; Satin, A J

    2000-11-01

    To describe a system for recording resident experience involving hand-held computers with the Palm Operating System (3Com, Inc., Santa Clara, CA). Hand-held personal computers (PCs) are popular, easy to use, inexpensive, portable, and can share data among other operating systems. Residents in our program carry individual hand-held database computers to record Residency Review Committee (RRC) reportable patient encounters. Each resident's data are transferred to a single central relational database compatible with Microsoft Access (Microsoft Corporation, Redmond, WA). Patient data entry and subsequent transfer to a central database are accomplished with commercially available software that requires minimal computer expertise to implement and maintain. The central database can then be used for statistical analysis or to create required RRC resident experience reports. As a result, the data collection and transfer process takes less time for residents and program director alike than paper-based or central computer-based systems. The system of collecting resident encounter data using hand-held computers with the Palm Operating System is easy to use, relatively inexpensive, accurate, and secure. The user-friendly system provides prompt, complete, and accurate data, enhancing the education of residents while facilitating the job of the program director.

  13. The effects of computer simulation versus hands-on dissection and the placement of computer simulation within the learning cycle on student achievement and attitude

    NASA Astrophysics Data System (ADS)

    Hopkins, Kathryn Susan

    The value of dissection as an instructional strategy has been debated, but not evidenced in research literature. The purpose of this study was to examine the efficacy of using computer simulated frog dissection as a substitute for traditional hands-on frog dissection and to examine the possible enhancement of achievement by combining the two strategies in a specific sequence. In this study, 134 biology students at two Central Texas schools were divided into the five following treatment groups: computer simulation of frog dissection, computer simulation before dissection, traditional hands-on frog dissection, dissection before computer simulation, and textual worksheet materials. The effects on achievement were evaluated by labeling 10 structures on three diagrams, identifying 11 pinned structures on a prosected frog, and answering 9 multiple-choice questions over the dissection process. Attitude was evaluated using a thirty item survey with a five-point Likert scale. The quasi-experimental design was pretest/post-test/post-test nonequivalent group for both control and experimental groups, a 2 x 2 x 5 completely randomized factorial design (gender, school, five treatments). The pretest/post-test design was incorporated to control for prior knowledge using analysis of covariance. The dissection only group evidenced a significantly higher performance than all other treatments except dissection-then-computer on the post-test segment requiring students to label pinned anatomical parts on a prosected frog. Interactions between treatment and school in addition to interaction between treatment and gender were found to be significant. The diagram and attitude post-tests evidenced no significant difference. Results on the nine multiple-choice questions about dissection procedures indicated a significant difference between schools. The interaction between treatment and school was also found to be significant. On a delayed post-test, a significant difference in gender was found on the diagram labeling segment of the post-test. Males were reported to have the higher score. Since existing research conflicts with this study's results, additional research using authentic assessment is recommended. Instruction should be aligned with dissection content and process objectives for each treatment group, and the teacher variable should be controlled.

  14. Does non-central nervous system tuberculosis increase the risk of ischemic stroke? A population-based propensity score-matched follow-up study.

    PubMed

    Wu, Chueh-Hung; Chen, Li-Sheng; Yen, Ming-Fang; Chiu, Yueh-Hsia; Fann, Ching-Yuan; Chen, Hsiu-Hsi; Pan, Shin-Liang

    2014-01-01

    Previous studies on the association between tuberculosis and the risk of developing ischemic stroke have generated inconsistent results. We therefore performed a population-based, propensity score-matched longitudinal follow-up study to investigate whether contracting non-central nervous system (CNS) tuberculosis leads to an increased risk of ischemic stroke. We used a logistic regression model that includes age, sex, pre-existing comorbidities and socioeconomic status as covariates to compute the propensity score. A total of 5804 persons with at least three ambulatory visits in 2001 with the principal diagnosis of non-CNS tuberculosis were enrolled in the tuberculosis group. The non-tuberculosis group consisted of 5804 propensity score-matched subjects without tuberculosis. The three-year ischemic stroke-free survival rates for these 2 groups were estimated using the Kaplan-Meier method. Stratified Cox proportional hazards regression was used to estimate the effect of tuberculosis on the occurrence of ischemic stroke. During the three-year follow-up, 176 subjects in the tuberculosis group (3.0%) and 207 in the non-tuberculosis group (3.6%) had ischemic stroke. The hazard ratio for developing ischemic stroke in the tuberculosis group was 0.92 compared to the non-tuberculosis group (95% confidence interval: 0.73-1.14, P = 0.4299). Non-CNS tuberculosis does not increase the risk of subsequent ischemic stroke.
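
    The design described above, a logistic regression propensity score built from age, sex, pre-existing comorbidities and socioeconomic status followed by matching of exposed to unexposed subjects, can be sketched briefly. The covariates, the simulated data and the simple greedy 1:1 nearest-neighbor matching below are illustrative assumptions, not the study's exact procedure.

    ```python
    # Hedged sketch: propensity score estimation and 1:1 nearest-neighbor matching
    # on simulated covariates (age, sex, comorbidity count, socioeconomic index).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)
    n = 2000
    X = np.column_stack([
        rng.normal(55, 12, n),        # age
        rng.integers(0, 2, n),        # sex
        rng.poisson(1.0, n),          # number of pre-existing comorbidities
        rng.normal(0, 1, n),          # socioeconomic index
    ])
    treated = rng.integers(0, 2, n).astype(bool)   # placeholder exposure (tuberculosis)

    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]

    # Greedy matching of each treated subject to the closest still-unmatched control
    available = set(np.flatnonzero(~treated))
    pairs = []
    for i in np.flatnonzero(treated):
        cands = np.array(sorted(available))
        if cands.size == 0:
            break
        j = cands[np.argmin(np.abs(ps[cands] - ps[i]))]
        pairs.append((i, j))
        available.remove(j)
    print("matched pairs:", len(pairs))
    ```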

  15. Pneumatic tube system transport does not alter platelet function in optical and whole blood aggregometry, prothrombin time, activated partial thromboplastin time, platelet count and fibrinogen in patients on anti-platelet drug therapy

    PubMed Central

    Enko, Dietmar; Mangge, Harald; Münch, Andreas; Niedrist, Tobias; Mahla, Elisabeth; Metzler, Helfried; Prüller, Florian

    2017-01-01

    Introduction: The aim of this study was to assess pneumatic tube system (PTS) alteration of platelet function, by the light transmission aggregometry (LTA) and whole blood aggregometry (WBA) methods, and of the results of platelet count, prothrombin time (PT), activated partial thromboplastin time (APTT), and fibrinogen. Materials and methods: Venous blood was collected into six 4.5 mL VACUETTE® 9NC coagulation sodium citrate 3.8% tubes (Greiner Bio-One International GmbH, Kremsmünster, Austria) from 49 intensive care unit (ICU) patients on dual anti-platelet therapy and immediately hand carried to the central laboratory. Blood samples were divided into 2 groups: Group 1 samples (N = 49) underwent PTS (4 m/s) transport from the central laboratory to the distant laboratory and back to the central laboratory, whereas Group 2 samples (N = 49) were excluded from PTS forces. In both groups, LTA and WBA stimulated with collagen, adenosine-5'-diphosphate (ADP), arachidonic acid (AA) and thrombin-receptor-activated-peptide 6 (TRAP-6), as well as platelet count, PT, APTT, and fibrinogen, were performed. Results: No statistically significant differences were observed between blood samples with (Group 1) and without (Group 2) PTS transport (P values from 0.064 to 0.968). The AA-induced LTA (bias: 68.57%) exceeded the bias acceptance limit of ≤ 25%. Conclusions: Blood sample transportation with the computer-controlled PTS in our hospital had no statistically significant effects on platelet aggregation determined in patients on anti-platelet therapy. Although AA-induced LTA showed a significant bias, the diagnostic accuracy was not influenced. PMID:28392742

  16. Identifying a system of predominant negative symptoms: Network analysis of three randomized clinical trials.

    PubMed

    Levine, Stephen Z; Leucht, Stefan

    2016-12-01

    Reasons for the recent mixed success of research into negative symptoms may be informed by conceptualizing negative symptoms as a system that is identifiable from network analysis. We aimed to identify: (I) negative symptom systems; (II) central negative symptoms within each system; and (III) differences between the systems, based on network analysis of negative symptoms at baseline, endpoint and for change. Patients with chronic schizophrenia and predominant negative symptoms participated in three clinical trials that compared placebo and amisulpride to 60 days (n=487). Network analyses were computed from the Scale for the Assessment of Negative Symptoms (SANS) scores for baseline and endpoint severity, and from estimated change based on mixed models. Symptoms central to each network were identified. The networks were contrasted for connectivity with permutation tests. Network analysis showed that the baseline and endpoint symptom severity systems formed symptom groups of Affect, Poor responsiveness, Lack of interest, and Apathy-inattentiveness. The baseline and endpoint networks did not significantly differ in terms of connectivity, but both significantly (P<0.05) differed from the change network. In the change network, the apathy-inattentiveness symptom group split into three other groups. The most central symptoms were Decreased Spontaneous Movements at baseline and endpoint, and Poverty of Speech for estimated change. Results provide preliminary evidence for: (I) a replicable negative symptom severity system; and (II) symptoms with high centrality (e.g., Decreased Spontaneous Movements) that may be future treatment targets, following replication to ensure the current results generalize to other samples. Copyright © 2016 Elsevier B.V. All rights reserved.
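
    Network analyses of symptom severity, as used here, typically treat symptoms as nodes and associations between item scores as weighted edges, with centrality indices flagging the most interconnected symptoms. The sketch below builds such a network from a plain correlation matrix of simulated SANS-like item scores and ranks nodes by strength; it is a generic illustration, not the paper's estimation procedure (which also involved mixed models and permutation tests of connectivity).

    ```python
    # Generic sketch: build a symptom network from item correlations and rank node centrality.
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(3)
    items = ["affect_flat", "poverty_speech", "apathy", "inattentiveness", "anhedonia"]
    scores = rng.normal(size=(487, len(items)))           # placeholder item scores (n = 487)
    corr = np.corrcoef(scores, rowvar=False)

    G = nx.Graph()
    G.add_nodes_from(items)
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            w = abs(corr[i, j])
            if w > 0.05:                                  # arbitrary sparsity threshold
                G.add_edge(items[i], items[j], weight=w)

    # Node "strength" (sum of incident edge weights) as a simple centrality index
    strength = {n: sum(d["weight"] for _, _, d in G.edges(n, data=True)) for n in G.nodes}
    print(sorted(strength.items(), key=lambda kv: -kv[1]))  # most central symptoms first
    ```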

  17. Grid site availability evaluation and monitoring at CMS

    DOE PAGES

    Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe; ...

    2017-10-01

    The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute resources ranging from a hundred to well over ten thousand computing cores and storage from tens of TBytes to tens of PBytes. In such a large computing setup, scheduled and unscheduled outages occur continually and are not allowed to significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC, the site evaluation and monitoring system is being overhauled to enable faster detection of and reaction to failures and a more dynamic handling of computing resources. Furthermore, enhancements to better distinguish site from central service issues and to make evaluations more transparent and informative to site support staff are planned.

  18. Grid site availability evaluation and monitoring at CMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe

    The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute resources ranging from a hundred to well over ten thousand computing cores and storage from tens of TBytes to tens of PBytes. In such a large computing setup, scheduled and unscheduled outages occur continually and are not allowed to significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC, the site evaluation and monitoring system is being overhauled to enable faster detection of and reaction to failures and a more dynamic handling of computing resources. Furthermore, enhancements to better distinguish site from central service issues and to make evaluations more transparent and informative to site support staff are planned.

  19. Grid site availability evaluation and monitoring at CMS

    NASA Astrophysics Data System (ADS)

    Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe; Lammel, Stephan; Sciabà, Andrea

    2017-10-01

    The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute resources ranging from a hundred to well over ten thousand computing cores and storage from tens of TBytes to tens of PBytes. In such a large computing setup, scheduled and unscheduled outages occur continually and are not allowed to significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC, the site evaluation and monitoring system is being overhauled to enable faster detection of and reaction to failures and a more dynamic handling of computing resources. Enhancements to better distinguish site from central service issues and to make evaluations more transparent and informative to site support staff are planned.

  20. A DICOM based radiotherapy plan database for research collaboration and reporting

    NASA Astrophysics Data System (ADS)

    Westberg, J.; Krogh, S.; Brink, C.; Vogelius, I. R.

    2014-03-01

    Purpose: To create a central radiotherapy (RT) plan database for dose analysis and reporting, capable of calculating and presenting statistics on user defined patient groups. The goal is to facilitate multi-center research studies with easy and secure access to RT plans and statistics on protocol compliance. Methods: RT institutions are able to send data to the central database using DICOM communications on a secure computer network. The central system is composed of a number of DICOM servers, an SQL database and in-house developed software services to process the incoming data. A web site within the secure network allows the user to manage their submitted data. Results: The RT plan database has been developed in Microsoft .NET and users are able to send DICOM data between RT centers in Denmark. Dose-volume histogram (DVH) calculations performed by the system are comparable to those of conventional RT software. A permission system was implemented to ensure access control and easy, yet secure, data sharing across centers. The reports contain DVH statistics for structures in user defined patient groups. The system currently contains over 2200 patients in 14 collaborations. Conclusions: A central RT plan repository for use in multi-center trials and quality assurance was created. The system provides an attractive alternative to dummy runs by enabling continuous monitoring of protocol conformity and plan metrics in a trial.
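
    A cumulative dose-volume histogram of the kind the system reports reduces, at its core, to binning the dose values of the voxels inside a structure mask. The sketch below is a hedged, minimal illustration on an assumed dose grid and boolean mask; real DICOM RT handling (dose grid scaling, conversion of structure contours to masks) is considerably more involved and is not shown.

    ```python
    # Minimal sketch: cumulative dose-volume histogram (DVH) from a dose grid and a structure mask.
    import numpy as np

    rng = np.random.default_rng(4)
    dose = rng.gamma(shape=9, scale=5, size=(60, 128, 128))   # Gy, placeholder dose grid
    mask = np.zeros_like(dose, dtype=bool)
    mask[20:40, 40:80, 40:80] = True                          # placeholder structure volume

    structure_dose = dose[mask]
    bins = np.arange(0, structure_dose.max() + 1.0, 1.0)      # 1 Gy bins
    hist, edges = np.histogram(structure_dose, bins=bins)
    cumulative = 100.0 * hist[::-1].cumsum()[::-1] / structure_dose.size  # % volume >= dose

    d_mean = structure_dose.mean()
    v20 = 100.0 * (structure_dose >= 20.0).mean()             # % volume receiving >= 20 Gy
    print(f"Dmean = {d_mean:.1f} Gy, V20 = {v20:.1f} %")
    ```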

  1. Automatic measurement; Mesures automatiques (in French)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ringeard, C.

    1974-11-28

    By its ability to link up operations sequentially and store the data collected, the computer can introduce a statistical approach into the evaluation of a result. To benefit fully from the advantages of automation, a special effort was made to reduce the programming time to a minimum and to simplify link-ups between the existing system and instruments from different sources. The practical solution adopted by the test laboratory of the C.E.A. Centralized Administration Group (GEC) is given.

  2. Percolation Centrality: Quantifying Graph-Theoretic Impact of Nodes during Percolation in Networks

    PubMed Central

    Piraveenan, Mahendra; Prokopenko, Mikhail; Hossain, Liaquat

    2013-01-01

    A number of centrality measures are available to determine the relative importance of a node in a complex network, and betweenness is prominent among them. However, the existing centrality measures are not adequate in network percolation scenarios (such as during infection transmission in a social network of individuals, spreading of computer viruses on computer networks, or transmission of disease over a network of towns) because they do not account for the changing percolation states of individual nodes. We propose a new measure, percolation centrality, that quantifies relative impact of nodes based on their topological connectivity, as well as their percolation states. The measure can be extended to include random walk based definitions, and its computational complexity is shown to be of the same order as that of betweenness centrality. We demonstrate the usage of percolation centrality by applying it to a canonical network as well as simulated and real world scale-free and random networks. PMID:23349699
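
    Percolation centrality, as proposed here, is a betweenness-like sum in which each source's contribution is weighted by its percolation state. The brute-force sketch below follows that general form, PC(v) ∝ Σ_{s≠v≠r} (σ_sr(v)/σ_sr) · x_s / (Σ_i x_i − x_v), where x_s is the percolation state of node s; it is an illustrative, unoptimized implementation on a standard example graph, not the paper's reference code.

    ```python
    # Illustrative brute-force percolation centrality (betweenness-like sum weighted by
    # the percolation state of the source node).
    import networkx as nx

    def percolation_centrality(G, x):
        """x: dict mapping node -> percolation state in [0, 1]."""
        n = G.number_of_nodes()
        total_x = sum(x.values())
        pc = {v: 0.0 for v in G.nodes}
        for s in G.nodes:
            for r in G.nodes:
                if s == r:
                    continue
                try:
                    paths = list(nx.all_shortest_paths(G, s, r))
                except nx.NetworkXNoPath:
                    continue
                sigma = len(paths)
                for v in G.nodes:
                    if v in (s, r):
                        continue
                    sigma_v = sum(1 for p in paths if v in p)   # shortest paths through v
                    pc[v] += (sigma_v / sigma) * x[s] / (total_x - x[v])
        return {v: val / (n - 2) for v, val in pc.items()}

    G = nx.karate_club_graph()
    states = {v: (1.0 if v in (0, 33) else 0.0) for v in G.nodes}   # two "percolated" seed nodes
    print(max(percolation_centrality(G, states).items(), key=lambda kv: kv[1]))
    ```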

  3. Die spacer thickness reproduction for central incisor crown fabrication with combined computer-aided design and 3D printing technology: an in vitro study.

    PubMed

    Hoang, Lisa N; Thompson, Geoffrey A; Cho, Seok-Hwan; Berzins, David W; Ahn, Kwang Woo

    2015-05-01

    The inability to control die spacer thickness has been reported. However, little information is available on the congruency between the computer-aided design parameters for die spacer thickness and the actual printout. The purpose of this study was to evaluate the accuracy and precision of the die spacer thickness achieved by combining computer-aided design and 3-dimensional printing technology. An ivorine maxillary central incisor was prepared for a ceramic crown. The prepared tooth was duplicated by using polyvinyl siloxane duplicating silicone, and 80 die-stone models were produced from Type IV dental stone. The dies were randomly divided into 5 groups with assigned die spacer thicknesses of 25 μm, 45 μm, 65 μm, 85 μm, and 105 μm (n=16). The printed resin copings, obtained from a printer (ProJet DP 3000; 3D Systems), were cemented onto their respective die-stone models with self-adhesive resin cement and stored at room temperature until sectioning into halves in a buccolingual direction. The internal gap was measured at 5 defined locations per side of the sectioned die. Images of the printed resin coping/die-stone model internal gap dimensions were obtained with an inverted bright field metallurgical microscope at ×100 magnification. The acquired digital image was calibrated, and measurements were made using image analysis software. Mixed models (α=.05) were used to evaluate accuracy. A false discovery rate at 5% was used to adjust for multiple testing. Coefficient of variation was used to determine the precision for each group and was evaluated statistically with the Wald test (α=.05). The accuracy, expressed in terms of the mean differences between the prescribed die spacer thickness and the measured internal gap (standard deviation), was 50 μm (11) for the 25 μm group simulated die spacer thickness, 30 μm (10) for the 45 μm group, 15 μm (14) for the 65 μm group, 3 μm (23) for the 85 μm group, and -10 μm (32) for the 105 μm group. The precision mean of the measurements, expressed as a coefficient of variation, ranged between 14% and 33% for the 5 groups. For the accuracy evaluation, statistically significant differences were found for all the groups, except the group of 85 μm. For the precision assessment, the coefficient of variation was above 10% for all groups, showing the printer's inability to reproduce the uniform internal gap within the same group. Copyright © 2015 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  4. Automatic Mexican sign language and digits recognition using normalized central moments

    NASA Astrophysics Data System (ADS)

    Solís, Francisco; Martínez, David; Espinosa, Oscar; Toxqui, Carina

    2016-09-01

    This work presents a framework for automatic Mexican sign language and digits recognition based on computer vision system using normalized central moments and artificial neural networks. Images are captured by digital IP camera, four LED reflectors and a green background in order to reduce computational costs and prevent the use of special gloves. 42 normalized central moments are computed per frame and used in a Multi-Layer Perceptron to recognize each database. Four versions per sign and digit were used in training phase. 93% and 95% of recognition rates were achieved for Mexican sign language and digits respectively.
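
    Normalized central moments are translation- and scale-invariant shape descriptors: the central moments μ_pq = Σ_x Σ_y (x − x̄)^p (y − ȳ)^q I(x, y) are normalized as η_pq = μ_pq / μ_00^{1 + (p+q)/2}. The sketch below computes a few of them for a placeholder binary silhouette; the camera pipeline, the specific 42-feature set, and the neural network classifier used in the paper are not reproduced.

    ```python
    # Hedged sketch: normalized central moments eta_pq of a binary image region.
    import numpy as np

    def normalized_central_moments(img, orders):
        ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
        m00 = img.sum()
        xbar = (xs * img).sum() / m00
        ybar = (ys * img).sum() / m00
        feats = {}
        for p, q in orders:
            mu_pq = (((xs - xbar) ** p) * ((ys - ybar) ** q) * img).sum()
            feats[(p, q)] = mu_pq / m00 ** (1 + (p + q) / 2.0)   # scale normalization
        return feats

    # Placeholder "hand" silhouette: a filled rectangle in a 64x64 frame
    img = np.zeros((64, 64))
    img[10:50, 20:40] = 1.0
    print(normalized_central_moments(img, orders=[(2, 0), (0, 2), (1, 1), (3, 0), (0, 3)]))
    ```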

  5. Value of 18F-FDG PET/CT in diagnosing chronic Q fever in patients with central vascular disease.

    PubMed

    Hagenaars, J C J P; Wever, P C; Vlake, A W; Renders, N H M; van Petersen, A S; Hilbink, M; de Jager-Leclercq, M G L; Moll, F L; Koning, O H J; Hoekstra, C J

    2016-08-01

    The aim of this study is to describe the value of 2-deoxy-2-[18F]fluoro-D-glucose positron emission tomography/computed tomography (18F-FDG PET/CT) in diagnosing chronic Q fever in patients with central vascular disease and the added value of 18F-FDG PET/CT in the diagnostic combination strategy as described in the Dutch consensus guideline for diagnosing chronic Q fever. 18F-FDG PET/CT was performed in patients with an abdominal aortic aneurysm or aorto-iliac reconstruction and chronic Q fever, diagnosed by serology and positive PCR for Coxiella burnetii DNA in blood and/or tissue (PCR-positive study group). Patients with an abdominal aortic aneurysm or aorto-iliac reconstruction without clinical and serological findings indicating Q fever infection served as a control group. Patients with a serological profile of chronic Q fever and a negative PCR in blood were included in additional analyses (PCR-negative study group). Thirteen patients were evaluated in the PCR-positive study group and 22 patients in the control group. 18F-FDG PET/CT indicated vascular infection in 6/13 patients in the PCR-positive study group and 2/22 patients in the control group. 18F-FDG PET/CT demonstrated a sensitivity of 46% (95% CI: 23-71%), specificity of 91% (95% CI: 71-99%), positive predictive value of 75% (95% CI:41-93%) and negative predictive value of 74% (95% CI: 55-87%). In the PCR-negative study group, 18F-FDG PET/CT was positive in 10/20 patients (50%). The combination of 18F-FDG PET/CT, as an imaging tool for identifying a focus of infection, and Q fever serology is a valid diagnostic strategy for diagnosing chronic Q fever in patients with central vascular disease.
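
    The diagnostic performance figures quoted above follow directly from the reported two-by-two counts (6 of 13 PET/CT-positive scans in the PCR-positive study group, 2 of 22 in the control group); the short computation below simply reproduces them.

    ```python
    # Reproduce the reported diagnostic metrics from the 2x2 counts given in the abstract.
    tp, fn = 6, 7      # PCR-positive study group: 6/13 PET/CT positive
    fp, tn = 2, 20     # control group: 2/22 PET/CT positive

    sensitivity = tp / (tp + fn)            # 6/13  ~ 46%
    specificity = tn / (tn + fp)            # 20/22 ~ 91%
    ppv = tp / (tp + fp)                    # 6/8   = 75%
    npv = tn / (tn + fn)                    # 20/27 ~ 74%
    print(f"sens {sensitivity:.0%}, spec {specificity:.0%}, PPV {ppv:.0%}, NPV {npv:.0%}")
    ```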

  6. Round Robin Study: Molecular Simulation of Thermodynamic Properties from Models with Internal Degrees of Freedom.

    PubMed

    Schappals, Michael; Mecklenfeld, Andreas; Kröger, Leif; Botan, Vitalie; Köster, Andreas; Stephan, Simon; García, Edder J; Rutkai, Gabor; Raabe, Gabriele; Klein, Peter; Leonhard, Kai; Glass, Colin W; Lenhard, Johannes; Vrabec, Jadran; Hasse, Hans

    2017-09-12

    Thermodynamic properties are often modeled by classical force fields which describe the interactions on the atomistic scale. Molecular simulations are used for retrieving thermodynamic data from such models, and many simulation techniques and computer codes are available for that purpose. In the present round robin study, the following fundamental question is addressed: Will different user groups working with different simulation codes obtain coinciding results within the statistical uncertainty of their data? A set of 24 simple simulation tasks is defined and solved by five user groups working with eight molecular simulation codes: DL_POLY, GROMACS, IMC, LAMMPS, ms2, NAMD, Tinker, and TOWHEE. Each task consists of the definition of (1) a pure fluid that is described by a force field and (2) the conditions under which that property is to be determined. The fluids are four simple alkanes: ethane, propane, n-butane, and iso-butane. All force fields consider internal degrees of freedom: OPLS, TraPPE, and a modified OPLS version with bond stretching vibrations. Density and potential energy are determined as a function of temperature and pressure on a grid which is specified such that all states are liquid. The user groups worked independently and reported their results to a central instance. The full set of results was disclosed to all user groups only at the end of the study. During the study, the central instance gave only qualitative feedback. The results reveal the challenges of carrying out molecular simulations. Several iterations were needed to eliminate gross errors. For most simulation tasks, the remaining deviations between the results of the different groups are acceptable from a practical standpoint, but they are often outside of the statistical errors of the individual simulation data. However, there are also cases where the deviations are unacceptable. This study highlights similarities between computer experiments and laboratory experiments, which are both subject not only to statistical error but also to systematic error.

  7. Teaching Computer Languages and Elementary Theory for Mixed Audiences at University Level

    NASA Astrophysics Data System (ADS)

    Christiansen, Henning

    2004-09-01

    Theoretical issues of computer science are traditionally taught in a way that presupposes a solid mathematical background and are usually considered more or less inaccessible for students without this. An effective methodology is described which has been developed for a target group of university students with different backgrounds such as natural science or humanities. It has been developed for a course that integrates theoretical material on computer languages and abstract machines with practical programming techniques. Prolog used as meta-language for describing language issues is the central instrument in the approach: Formal descriptions become running prototypes that are easy and appealing to test and modify, and can be extended into analyzers, interpreters, and tools such as tracers and debuggers. Experience shows a high learning curve, especially when the principles are extended into a learning-by-doing approach having the students to develop such descriptions themselves from an informal introduction.

  8. A Real-Time High Performance Computation Architecture for Multiple Moving Target Tracking Based on Wide-Area Motion Imagery via Cloud and Graphic Processing Units

    PubMed Central

    Liu, Kui; Wei, Sixiao; Chen, Zhijiang; Jia, Bin; Chen, Genshe; Ling, Haibin; Sheaff, Carolyn; Blasch, Erik

    2017-01-01

    This paper presents the first attempt at combining Cloud with Graphic Processing Units (GPUs) in a complementary manner within the framework of a real-time high performance computation architecture for the application of detecting and tracking multiple moving targets based on Wide Area Motion Imagery (WAMI). More specifically, the GPU and Cloud Moving Target Tracking (GC-MTT) system applied a front-end web based server to perform the interaction with Hadoop and highly parallelized computation functions based on the Compute Unified Device Architecture (CUDA©). The introduced multiple moving target detection and tracking method can be extended to other applications such as pedestrian tracking, group tracking, and Patterns of Life (PoL) analysis. The cloud and GPUs based computing provides an efficient real-time target recognition and tracking approach as compared to methods when the work flow is applied using only central processing units (CPUs). The simultaneous tracking and recognition results demonstrate that a GC-MTT based approach provides drastically improved tracking with low frame rates over realistic conditions. PMID:28208684

  9. A Real-Time High Performance Computation Architecture for Multiple Moving Target Tracking Based on Wide-Area Motion Imagery via Cloud and Graphic Processing Units.

    PubMed

    Liu, Kui; Wei, Sixiao; Chen, Zhijiang; Jia, Bin; Chen, Genshe; Ling, Haibin; Sheaff, Carolyn; Blasch, Erik

    2017-02-12

    This paper presents the first attempt at combining Cloud with Graphic Processing Units (GPUs) in a complementary manner within the framework of a real-time high performance computation architecture for the application of detecting and tracking multiple moving targets based on Wide Area Motion Imagery (WAMI). More specifically, the GPU and Cloud Moving Target Tracking (GC-MTT) system applied a front-end web based server to perform the interaction with Hadoop and highly parallelized computation functions based on the Compute Unified Device Architecture (CUDA©). The introduced multiple moving target detection and tracking method can be extended to other applications such as pedestrian tracking, group tracking, and Patterns of Life (PoL) analysis. The cloud and GPUs based computing provides an efficient real-time target recognition and tracking approach as compared to methods when the work flow is applied using only central processing units (CPUs). The simultaneous tracking and recognition results demonstrate that a GC-MTT based approach provides drastically improved tracking with low frame rates over realistic conditions.

  10. Computer input and output files associated with ground-water-flow simulations of the Albuquerque Basin, central New Mexico, 1901-94, with projections to 2020; (supplement one to U.S. Geological Survey Water-resources investigations report 94-4251)

    USGS Publications Warehouse

    Kernodle, J.M.

    1996-01-01

    This report presents the computer input files required to run the three-dimensional ground-water-flow model of the Albuquerque Basin, central New Mexico, documented in Kernodle and others (Kernodle, J.M., McAda, D.P., and Thorn, C.R., 1995, Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, 1901-1994, with projections to 2020: U.S. Geological Survey Water-Resources Investigations Report 94-4251, 114 p.). Output files resulting from the computer simulations are included for reference.

  11. Experiments in Computing: A Survey

    PubMed Central

    Moisseinen, Nella

    2014-01-01

    Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general. PMID:24688404

  12. Experiments in computing: a survey.

    PubMed

    Tedre, Matti; Moisseinen, Nella

    2014-01-01

    Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general.

  13. Locating the Acupoint Baihui (GV20) Beneath the Cerebral Cortex with MRI Reconstructed 3D Neuroimages.

    PubMed

    Shen, Ein-Yiao; Chen, Fun-Jou; Chen, Yun-Yin; Lin, Ming-Fan

    2011-01-01

    Baihui (GV20) is one of the most important acupoints of the Du meridian (the government vessel) and is commonly used in neurology and psychiatry and as a distal point of anorectal disorders by general practitioners. The anatomical relationship between the scalp region of the acupoint and the underlying corresponding cortex remains obscure. In this study, we first prepared the indicator for MRI scanning on a GE 1.5 T excite machine in a mode suitable for 3D reconstruction. The 3D Avizo software system (version 6.0, Mercury Computer Systems, Inc., Germany) was then used for image processing and the resulting data subsequently analyzed using descriptive statistics and analysis of variance (ANOVA). The mean distance from the Baihui anterior to the central sulcus in the adult group was greater than that in the child group (22.7 ± 2.2 and 19.7 ± 2.2 mm, resp., P = .042), whereas in the child group the distance between the Baihui anterior and the precentral sulcus was greater than in the adult group (6.8 ± 0.8 and 3.8 ± 0.8 mm, resp., P < .001). This MRI presentation demonstrates that the location of Baihui (GV20) can be identified using the distance from the central or precentral sulcus.

  14. U.S. EPA computational toxicology programs: Central role of chemical-annotation efforts and molecular databases

    EPA Science Inventory

    EPA’s National Center for Computational Toxicology is engaged in high-profile research efforts to improve the ability to more efficiently and effectively prioritize and screen thousands of environmental chemicals for potential toxicity. A central component of these efforts invol...

  15. Statistical inference on genetic data reveals the complex demographic history of human populations in central Asia.

    PubMed

    Palstra, Friso P; Heyer, Evelyne; Austerlitz, Frédéric

    2015-06-01

    The demographic history of modern humans constitutes a combination of expansions, colonizations, contractions, and remigrations. The advent of large scale genetic data combined with statistically refined methods facilitates inference of this complex history. Here we study the demographic history of two genetically admixed ethnic groups in Central Asia, an area characterized by high levels of genetic diversity and a history of recurrent immigration. Using Approximate Bayesian Computation, we infer that the timing of admixture markedly differs between the two groups. Admixture in the traditionally agricultural Tajiks could be dated back to the onset of the Neolithic transition in the region, whereas admixture in Kyrgyz is more recent, and may have involved the westward movement of Turkic peoples. These results are confirmed by a coalescent method that fits an isolation-with-migration model to the genetic data, with both Central Asian groups having received gene flow from the extremities of Eurasia. Interestingly, our analyses also uncover signatures of gene flow from Eastern to Western Eurasia during Paleolithic times. In conclusion, the high genetic diversity currently observed in these two Central Asian peoples most likely reflects the effects of recurrent immigration that likely started before historical times. Conversely, conquests during historical times may have had a relatively limited genetic impact. These results emphasize the need for a better understanding of the genetic consequences of transmission of culture and technological innovations, as well as those of invasions and conquests. © The Author 2015. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  16. Normal body mass index with central obesity has increased risk of coronary artery calcification in Korean patients with chronic kidney disease.

    PubMed

    Lee, Mi Jung; Park, Jung Tak; Park, Kyoung Sook; Kwon, Young Eun; Han, Seung Hyeok; Kang, Shin-Wook; Choi, Kyu Hun; Oh, Kook-Hwan; Park, Sue Kyung; Chae, Dong Wan; Lee, Kyubeck; Hwang, Young-Hwan; Kim, Soo Wan; Kim, Yeong Hoon; Kang, Sun Woo; Lee, Joongyub; Ahn, Curie; Yoo, Tae-Hyun

    2016-12-01

    In chronic kidney disease (CKD), overweight and mild obesity have shown the lowest cardiovascular (CV) risk. However, central obesity has been directly associated with CV risk in these patients. This bidirectional relationship of body mass index (BMI) and central obesity prompted us to evaluate CV risk based on a combination of BMI and waist-to-hip ratio (WHR) in nondialysis CKD patients. We included 1078 patients with CKD stage 2 through 5 (nondialysis) enrolled in a nationwide prospective cohort of Korea. Patients were divided into 3 groups by BMI (normal BMI, 18.5-22.9 kg/m²; overweight, 23.0-27.4 kg/m²; and obese, ≥27.5 kg/m²) and were dichotomized by a sex-specific median WHR (0.92 in males and 0.88 in females). Coronary artery calcification (CAC) was determined by multislice computed tomography. CAC (score above 10 Agatston units) was found in 477 patients. Multivariate logistic regression analysis indicated that BMI was not independently associated with CAC. However, WHR showed an independent linear and significant association with CAC (odds ratio, 1.036; 95% confidence interval, 1.007-1.065 per 0.01 increase). Furthermore, when patients were categorized into 6 groups according to a combination of BMI and WHR, the normal BMI but higher WHR group had the highest risk of CAC compared with the normal BMI with lower WHR group (2.104; 1.074-4.121). Thus, a normal BMI with central obesity was associated with the highest risk of CAC, suggesting that considering BMI and WHR, 2 surrogates of obesity, can help to discriminate CV risk in Korean nondialysis CKD patients. Copyright © 2016 International Society of Nephrology. Published by Elsevier Inc. All rights reserved.
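
    A minimal illustrative sketch (not the study's analysis code) of the six-group categorization described in the abstract above: three BMI strata crossed with a sex-specific median waist-to-hip ratio split. The cut-points come from the abstract; the function and field names are hypothetical.

        def bmi_stratum(bmi: float) -> str:
            """Classify BMI (kg/m²) using the cut-points quoted in the abstract."""
            if bmi < 23.0:
                return "normal"      # 18.5-22.9
            if bmi < 27.5:
                return "overweight"  # 23.0-27.4
            return "obese"           # >= 27.5

        def whr_stratum(whr: float, sex: str) -> str:
            """Dichotomize WHR at the sex-specific median (0.92 male / 0.88 female)."""
            median = 0.92 if sex == "male" else 0.88
            return "higher_WHR" if whr >= median else "lower_WHR"

        def risk_group(bmi: float, whr: float, sex: str) -> str:
            return f"{bmi_stratum(bmi)}/{whr_stratum(whr, sex)}"

        # Example: a normal-BMI patient with central obesity, the highest-risk group reported.
        print(risk_group(bmi=21.5, whr=0.95, sex="male"))  # -> "normal/higher_WHR"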

  17. The ATLAS Production System Evolution: New Data Processing and Analysis Paradigm for the LHC Run2 and High-Luminosity

    NASA Astrophysics Data System (ADS)

    Barreiro, F. H.; Borodin, M.; De, K.; Golubkov, D.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Padolski, S.; Wenaus, T.; ATLAS Collaboration

    2017-10-01

    The second generation of the ATLAS Production System, called ProdSys2, is a distributed workload manager that runs hundreds of thousands of jobs daily, from dozens of different ATLAS-specific workflows, across more than a hundred heterogeneous sites. It achieves high utilization by combining dynamic job definition based on many criteria, such as input and output size, memory requirements and CPU consumption, with manageable scheduling policies, and by supporting different kinds of computational resources, such as GRID, clouds, supercomputers and volunteer computers. The system dynamically assigns a group of jobs (a task) to a group of geographically distributed computing resources. Dynamic assignment and resource utilization are among the major features of the system; they did not exist in the earliest versions of the production system, where the Grid resources topology was predefined using national and/or geographical patterns. The Production System has a sophisticated job fault-recovery mechanism, which allows multi-Terabyte tasks to be run efficiently without human intervention. We have implemented a “train” model and open-ended production, which allow tasks to be submitted automatically as soon as a new set of data is available, and physics-group data processing and analysis to be chained with central production by the experiment. We present an overview of the ATLAS Production System and the features and architecture of its major components: task definition, web user interface and monitoring. We describe the important design decisions and lessons learned from operational experience during the first year of LHC Run2. We also report the performance of the designed system and how various workflows, such as data (re)processing, Monte-Carlo and physics group production, and user analysis, are scheduled and executed within one production system on heterogeneous computing resources.
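
    The toy scheduler below only illustrates the idea described in the abstract above, namely spreading a task's jobs over heterogeneous resources that satisfy its requirements; it is not ProdSys2/PanDA code, and the Resource/Task classes and the proportional-share heuristic are assumptions.

        from dataclasses import dataclass

        @dataclass
        class Resource:
            name: str
            kind: str            # e.g. "grid", "cloud", "hpc", "volunteer"
            free_cores: int
            mem_per_core_gb: float

        @dataclass
        class Task:
            name: str
            jobs: int            # number of jobs grouped into the task
            mem_per_job_gb: float
            cores_per_job: int

        def assign(task: Task, resources: list) -> dict:
            """Spread a task's jobs over resources satisfying its memory requirement,
            proportionally to the cores each resource currently has free."""
            eligible = [r for r in resources
                        if r.mem_per_core_gb * task.cores_per_job >= task.mem_per_job_gb]
            total = sum(r.free_cores for r in eligible)
            if total == 0:
                return {}
            shares, remaining = {}, task.jobs
            for r in eligible:
                n = min(remaining, round(task.jobs * r.free_cores / total))
                shares[r.name] = n
                remaining -= n
            if remaining > 0:                      # put any rounding leftover on the first site
                shares[eligible[0].name] += remaining
            return shares

        pool = [Resource("GRID-DE", "grid", 4000, 2.0),
                Resource("HPC-A", "hpc", 10000, 1.5),
                Resource("Cloud-A", "cloud", 1000, 4.0)]
        print(assign(Task("mc_simul", jobs=5000, mem_per_job_gb=2.0, cores_per_job=1), pool))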

  18. Distributed Computing with Centralized Support Works at Brigham Young.

    ERIC Educational Resources Information Center

    McDonald, Kelly; Stone, Brad

    1992-01-01

    Brigham Young University (Utah) has addressed the need for maintenance and support of distributed computing systems on campus by implementing a program patterned after a national business franchise, providing the support and training of a centralized administration but allowing each unit to operate much as an independent small business.…

  19. Design of the central region in the Gustaf Werner cyclotron at the Uppsala university

    NASA Astrophysics Data System (ADS)

    Toprek, Dragan; Reistad, Dag; Lundstrom, Bengt; Wessman, Dan

    2002-07-01

    This paper describes the design of the central region in the Gustaf Werner cyclotron for h=1, 2 and 3 modes of acceleration. The electric field distribution in the inflector and in the four acceleration gaps has been numerically calculated from an electric potential map produced by the program RELAX3D. The geometry of the central region has been tested with the computations of orbits carried out by means of the computer code CYCLONE. The optical properties of the spiral inflector and the central region were studied by using the programs CASINO and CYCLONE, respectively.
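
    Potential maps of the kind mentioned above are produced by relaxing Laplace's equation on a grid with electrodes held at fixed potentials. The sketch below is a generic 2-D Jacobi relaxation, not RELAX3D itself, and the toy geometry and voltages are assumptions for illustration only.

        import numpy as np

        def relax_potential(phi, fixed, n_iter=2000):
            """phi: initial potential grid; fixed: boolean mask of electrode/boundary
            nodes held at constant potential. Returns the relaxed potential map."""
            phi = phi.copy()
            for _ in range(n_iter):
                avg = 0.25 * (np.roll(phi, 1, 0) + np.roll(phi, -1, 0)
                              + np.roll(phi, 1, 1) + np.roll(phi, -1, 1))
                phi = np.where(fixed, phi, avg)    # keep electrode nodes fixed
            return phi

        # Toy geometry: a grounded box with a 10 kV electrode strip in the middle.
        grid = np.zeros((60, 60))
        fixed = np.zeros_like(grid, dtype=bool)
        fixed[0, :] = fixed[-1, :] = fixed[:, 0] = fixed[:, -1] = True   # grounded walls
        grid[28:32, 20:40] = 10_000.0
        fixed[28:32, 20:40] = True
        phi = relax_potential(grid, fixed)
        Ey, Ex = np.gradient(-phi)                 # electric field from the potential map
        print(float(phi[30, 30]), float(np.hypot(Ex, Ey).max()))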

  20. Development, Evaluation and Implementation of Chief Complaint Groupings to Activate Data Collection: A Multi-Center Study of Clinical Decision Support for Children with Head Trauma.

    PubMed

    Deakyne, S J; Bajaj, L; Hoffman, J; Alessandrini, E; Ballard, D W; Norris, R; Tzimenatos, L; Swietlik, M; Tham, E; Grundmeier, R W; Kuppermann, N; Dayan, P S

    2015-01-01

    Overuse of cranial computed tomography scans in children with blunt head trauma unnecessarily exposes them to radiation. The Pediatric Emergency Care Applied Research Network (PECARN) blunt head trauma prediction rules identify children who do not require a computed tomography scan. Electronic health record (EHR) based clinical decision support (CDS) may effectively implement these rules but must only be provided for appropriate patients in order to minimize excessive alerts. To develop, implement and evaluate site-specific groupings of chief complaints (CC) that accurately identify children with head trauma, in order to activate data collection in an EHR. As part of a 13 site clinical trial comparing cranial computed tomography use before and after implementation of CDS, four PECARN sites centrally developed and locally implemented CC groupings to trigger a clinical trial alert (CTA) to facilitate the completion of an emergency department head trauma data collection template. We tested and chose CC groupings to attain high sensitivity while maintaining at least moderate specificity. Due to variability in CCs available, identical groupings across sites were not possible. We noted substantial variability in the sensitivity and specificity of seemingly similar CC groupings between sites. The implemented CC groupings had sensitivities greater than 90% with specificities between 75-89%. During the trial, formal testing and provider feedback led to tailoring of the CC groupings at some sites. CC groupings can be successfully developed and implemented across multiple sites to accurately identify patients who should have a CTA triggered to facilitate EHR data collection. However, CC groupings will necessarily vary in order to attain high sensitivity and moderate-to-high specificity. In future trials, the balance between sensitivity and specificity should be considered based on the nature of the clinical condition, including prevalence and morbidity, in addition to the goals of the intervention being considered.
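
    A minimal sketch (assumed, not the trial's implementation) of how a chief-complaint grouping can be scored against a reference standard of head-trauma visits, reflecting the sensitivity/specificity trade-off discussed in the abstract above. The grouping and visit data are hypothetical.

        def score_grouping(cc_grouping, visits):
            """visits: list of (chief_complaint, is_head_trauma) pairs.
            Returns (sensitivity, specificity) of triggering the CTA on this grouping."""
            tp = fp = tn = fn = 0
            for cc, is_head_trauma in visits:
                triggered = cc in cc_grouping
                if triggered and is_head_trauma:
                    tp += 1
                elif triggered:
                    fp += 1
                elif is_head_trauma:
                    fn += 1
                else:
                    tn += 1
            sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
            specificity = tn / (tn + fp) if (tn + fp) else 0.0
            return sensitivity, specificity

        # Hypothetical example: a grouping would be tuned per site until sensitivity
        # exceeds 0.90 while specificity stays in the moderate-to-high range.
        grouping = {"head injury", "fall", "laceration - head"}
        sample = [("head injury", True), ("fall", True), ("fever", False),
                  ("laceration - head", False), ("cough", False)]
        print(score_grouping(grouping, sample))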

  1. Constraining the Single-degenerate Channel of Type Ia Supernovae with Stable Iron-group Elements in SNR 3C 397

    NASA Astrophysics Data System (ADS)

    Dave, Pranav; Kashyap, Rahul; Fisher, Robert; Timmes, Frank; Townsley, Dean; Byrohl, Chris

    2017-05-01

    Recent Suzaku X-ray spectra of supernova remnant (SNR) 3C 397 indicate enhanced stable iron group element abundances of Ni, Mn, Cr, and Fe. Seeking to address key questions about the progenitor and explosion mechanism of 3C 397, we compute nucleosynthetic yields from a suite of multidimensional hydrodynamics models in the near-Chandrasekhar-mass, single-degenerate paradigm for Type Ia supernovae (SNe Ia). Varying the progenitor white dwarf (WD) internal structure, composition, ignition, and explosion mechanism, we find that the best match to the observed iron peak elements of 3C 397 are dense (central density ≥6 × 10⁹ g cm⁻³), low-carbon WDs that undergo a weak, centrally ignited deflagration, followed by a subsequent detonation. The amount of ⁵⁶Ni produced is consistent with a normal or bright normal SNe Ia. A pure deflagration of a centrally ignited, low central density (≃2 × 10⁹ g cm⁻³) progenitor WD, frequently considered in the literature, is also found to produce good agreement with 3C 397 nucleosynthetic yields, but leads to a subluminous SN Ia event, in conflict with X-ray line width data. Additionally, in contrast to prior work that suggested a large supersolar metallicity for the WD progenitor for SNR 3C 397, we find satisfactory agreement for solar- and subsolar-metallicity progenitors. We discuss a range of implications our results have for the single-degenerate channel.

  2. Illuminating cancer health disparities using ethnogenetic layering (EL) and phenotype segregation network analysis (PSNA).

    PubMed

    Jackson, Fatimah L C

    2006-01-01

    Resolving cancer health disparities continues to befuddle simplistic racial models. The racial groups alluded to in biomedicine, public health, and epidemiology are often profoundly substructured. EL and PSNA are computationally assisted techniques that focus on microethnic group (MEG) substructure. Geographical variations in cancer may be due to differences in MEG ancestry or similar environmental exposures to a recognized carcinogen. Examples include breast and prostate cancers in the Chesapeake Bay region and Bight of Biafra biological ancestry, hypertension and stroke in the Carolina Coast region and Central African biological ancestry, and pancreatic cancer in the Mississippi Delta region and dietary/medicinal exposure to safrol from Sassafras albidum.

  3. Computed tomography demonstrates abnormalities of contralateral ear in subjects with unilateral sensorineural hearing loss.

    PubMed

    Marcus, Sonya; Whitlow, Christopher T; Koonce, James; Zapadka, Michael E; Chen, Michael Y; Williams, Daniel W; Lewis, Meagan; Evans, Adele K

    2014-02-01

    Prior studies have associated gross inner ear abnormalities with pediatric sensorineural hearing loss (SNHL) using computed tomography (CT). No studies to date have specifically investigated morphologic inner ear abnormalities involving the contralateral unaffected ear in patients with unilateral SNHL. The purpose of this study is to evaluate contralateral inner ear structures of subjects with unilateral SNHL but no grossly abnormal findings on CT. IRB-approved retrospective analysis of pediatric temporal bone CT scans. 97 temporal bone CT scans, previously interpreted as "normal" based upon previously accepted guidelines by board certified neuroradiologists, were assessed using 12 measurements of the semicircular canals, cochlea and vestibule. The control-group consisted of 72 "normal" temporal bone CTs with underlying SNHL in the subject excluded. The study-group consisted of 25 normal-hearing contralateral temporal bones in subjects with unilateral SNHL. Multivariate analysis of covariance (MANCOVA) was then conducted to evaluate for differences between the study and control group. Cochlea basal turn lumen width was significantly greater in magnitude and central lucency of the lateral semicircular canal bony island was significantly lower in density for audiometrically normal ears of subjects with unilateral SNHL compared to controls. Abnormalities of the inner ear were present in the contralateral audiometrically normal ears of subjects with unilateral SNHL. These data suggest that patients with unilateral SNHL may have a more pervasive disease process that results in abnormalities of both ears. The findings of a cochlea basal turn lumen width disparity >5% from "normal" and/or a lateral semicircular canal bony island central lucency disparity of >5% from "normal" may indicate inherent risk to the contralateral unaffected ear in pediatric patients with unilateral sensorineural hearing loss. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  4. Scale Space for Camera Invariant Features.

    PubMed

    Puig, Luis; Guerrero, José J; Daniilidis, Kostas

    2014-09-01

    In this paper we propose a new approach to compute the scale space of any central projection system, such as catadioptric, fisheye or conventional cameras. Since these systems can be explained using a unified model, the single parameter that defines each type of system is used to automatically compute the corresponding Riemannian metric. This metric, combined with the partial differential equations framework on manifolds, allows us to compute the Laplace-Beltrami (LB) operator, enabling the computation of the scale space of any central projection system. Scale space is essential for the intrinsic scale selection and neighborhood description in features like SIFT. We perform experiments with synthetic and real images to validate the generalization of our approach to any central projection system. We compare our approach with the best existing methods, showing competitive results in all types of cameras: catadioptric, fisheye, and perspective.

  5. Childhood Adversity and Pain Sensitization.

    PubMed

    You, Dokyoung Sophia; Meagher, Mary W

    Childhood adversity is a vulnerability factor for chronic pain. However, the underlying pain mechanisms influenced by childhood adversity remain unknown. The aim of the current study was to evaluate the impact of childhood adversity on dynamic pain sensitivity in young adults. After screening for childhood adverse events and health status, healthy individuals reporting low (below median; n = 75) or high levels of adversity (the top 5%; n = 51) were invited for pain testing. Both groups underwent heat pain threshold and temporal summation of second pain (TSSP) testing after reporting depressive symptoms. TSSP refers to a progressive increase in pain intensity with repetition of identical noxious stimuli and is attributed to central sensitization. Changes in pain ratings over time (slope) were computed for TSSP sensitization and decay of subsequent aftersensations. The high-adversity group showed greater TSSP sensitization (mean slope = 0.75, a positive slope; SD = 1.78) and a trend toward a slower decay (mean slope = -11.9, SD = 3.4), whereas the low-adversity group showed minimal sensitization (mean slope = 0.07, a near-zero slope; SD = 1.77), F(1,123) = 5.84, p = .017, and faster decay (mean slope = -13.1, SD = 3.4), F(1,123) = 3.79, p = .054. This group difference remained significant even after adjusting for adult depressive symptoms (p = .033). No group difference was found in heat pain threshold (p = .85). Lastly, the high-adversity group showed blunted cardiac and skin conductance responses. These findings suggest that enhancement of central sensitization may provide a mechanism underlying the pain hypersensitivity and chronicity linked to childhood adversity.
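
    A small sketch (assumed, not the study's analysis code) of the slope computation described above: an ordinary least-squares slope of pain ratings across the repeated identical stimuli, computed per participant, where a positive slope indicates temporal summation. The rating series is made up for illustration.

        def tssp_slope(ratings):
            """OLS slope of pain ratings against stimulus index 1..n."""
            n = len(ratings)
            xs = range(1, n + 1)
            x_mean = sum(xs) / n
            y_mean = sum(ratings) / n
            num = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ratings))
            den = sum((x - x_mean) ** 2 for x in xs)
            return num / den

        # Hypothetical rating series (0-100 scale) for one trial of repeated heat pulses:
        print(round(tssp_slope([42, 44, 45, 47, 49, 50, 52, 53, 55, 56]), 2))  # ≈ 1.57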

  6. Economics of Computing: The Case of Centralized Network File Servers.

    ERIC Educational Resources Information Center

    Solomon, Martin B.

    1994-01-01

    Discusses computer networking and the cost effectiveness of decentralization, including local area networks. A planned experiment with a centralized approach to the operation and management of file servers at the University of South Carolina is described that hopes to realize cost savings and the avoidance of staffing problems. (Contains four…

  7. 51. VIEW OF LORAL ADS 100A COMPUTERS LOCATED CENTRALLY ON ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    51. VIEW OF LORAL ADS 100A COMPUTERS LOCATED CENTRALLY ON NORTH WALL OF TELEMETRY ROOM (ROOM 106). SLC-3W CONTROL ROOM IS VISIBLE IN BACKGROUND THROUGH WINDOW IN NORTH WALL. - Vandenberg Air Force Base, Space Launch Complex 3, Launch Operations Building, Napa & Alden Roads, Lompoc, Santa Barbara County, CA

  8. Inter-hemispheric functional connectivity disruption in children with prenatal alcohol exposure

    PubMed Central

    Wozniak, Jeffrey R.; Mueller, Bryon A.; Muetzel, Ryan L.; Bell, Christopher J.; Hoecker, Heather L.; Nelson, Miranda L.; Chang, Pi-Nian; Lim, Kelvin O.

    2010-01-01

    Background MRI studies, including recent diffusion tensor imaging (DTI) studies, have shown corpus callosum abnormalities in children prenatally exposed to alcohol, especially in the posterior regions. These abnormalities appear across the range of Fetal Alcohol Spectrum Disorders (FASD). Several studies have demonstrated cognitive correlates of callosal abnormalities in FASD including deficits in visual-motor skill, verbal learning, and executive functioning. The goal of this study was to determine if inter-hemispheric structural connectivity abnormalities in FASD are associated with disrupted inter-hemispheric functional connectivity and disrupted cognition. Methods Twenty-one children with FASD and 23 matched controls underwent a six minute resting-state functional MRI scan as well as anatomical imaging and DTI. Using a semiautomated method, we parsed the corpus callosum and delineated seven inter-hemispheric white matter tracts with DTI tractography. Cortical regions of interest (ROIs) at the distal ends of these tracts were identified. Right-left correlations in resting fMRI signal were computed for these sets of ROIs and group comparisons were done. Correlations with facial dysmorphology, cognition, and DTI measures were computed. Results A significant group difference in inter-hemispheric functional connectivity was seen in a posterior set of ROIs, the para-central region. Children with FASD had functional connectivity that was 12% lower than controls in this region. Sub-group analyses were not possible due to small sample size, but the data suggest that there were effects across the FASD spectrum. No significant association with facial dysmorphology was found. Para-central functional connectivity was significantly correlated with DTI mean diffusivity, a measure of microstructural integrity, in posterior callosal tracts in controls but not in FASD. Significant correlations were seen between these structural and functional measures and Wechsler perceptual reasoning ability. Conclusions Inter-hemispheric functional connectivity disturbances were observed in children with FASD relative to controls. The disruption was measured in medial parietal regions (para-central) that are connected by posterior callosal fiber projections. We have previously shown microstructural abnormalities in these same posterior callosal regions and the current study suggests a possible relationship between the two. These measures have clinical relevance as they are associated with cognitive functioning. PMID:21303384

  9. Design of the central region in the Warsaw K-160 cyclotron

    NASA Astrophysics Data System (ADS)

    Toprek, Dragan; Sura, Josef; Choinski, Jaroslav; Czosnyka, Tomas

    2001-08-01

    This paper describes the design of the central region for h=2 and 3 modes of acceleration in the Warsaw K-160 cyclotron. The central region is unique and compatible with the two above-mentioned harmonic modes of operation. Only one spiral type inflector will be used. The electric field distribution in the inflector and in the four acceleration gaps has been numerically calculated from an electric potential map produced by the program RELAX3D. The geometry of the central region has been tested with the computations of orbits carried out by means of the computer code CYCLONE. The optical properties of the spiral inflector and the central region were studied by using the programs CASINO and CYCLONE, respectively.

  10. Managing data from multiple disciplines, scales, and sites to support synthesis and modeling

    USGS Publications Warehouse

    Olson, R. J.; Briggs, J. M.; Porter, J.H.; Mah, Grant R.; Stafford, S.G.

    1999-01-01

    The synthesis and modeling of ecological processes at multiple spatial and temporal scales involves bringing together and sharing data from numerous sources. This article describes a data and information system model that facilitates assembling, managing, and sharing diverse data from multiple disciplines, scales, and sites to support integrated ecological studies. Cross-site scientific-domain working groups coordinate the development of data associated with their particular scientific working group, including decisions about data requirements, data to be compiled, data formats, derived data products, and schedules across the sites. The Web-based data and information system consists of nodes for each working group plus a central node that provides data access, project information, data query, and other functionality. The approach incorporates scientists and computer experts in the working groups and provides incentives for individuals to submit documented data to the data and information system.

  11. Animals as Mobile Biological Sensors for Forest Fire Detection.

    PubMed

    Sahin, Yasar Guneri

    2007-12-04

    This paper proposes a mobile biological sensor system that can assist in early detection of forest fires, one of the most dreaded natural disasters on the earth. The main idea presented in this paper is to utilize animals with sensors as Mobile Biological Sensors (MBS). The devices used in this system are animals which are native animals living in forests, sensors (thermo and radiation sensors with GPS features) that measure the temperature and transmit the location of the MBS, access points for wireless communication and a central computer system which classifies animal actions. The system offers two different methods, firstly: access points continuously receive data about animals' location using GPS at certain time intervals and the gathered data is then classified and checked to see if there is a sudden movement (panic) of the animal groups: this method is called animal behavior classification (ABC). The second method can be defined as thermal detection (TD): the access points get the temperature values from the MBS devices and send the data to a central computer to check for instant changes in the temperatures. This system may be used for many purposes other than fire detection, namely animal tracking, poaching prevention and detecting instantaneous animal death.

  12. Animals as Mobile Biological Sensors for Forest Fire Detection

    PubMed Central

    2007-01-01

    This paper proposes a mobile biological sensor system that can assist in early detection of forest fires, one of the most dreaded natural disasters on the earth. The main idea presented in this paper is to utilize animals with sensors as Mobile Biological Sensors (MBS). The devices used in this system are animals which are native animals living in forests, sensors (thermo and radiation sensors with GPS features) that measure the temperature and transmit the location of the MBS, access points for wireless communication and a central computer system which classifies animal actions. The system offers two different methods, firstly: access points continuously receive data about animals' location using GPS at certain time intervals and the gathered data is then classified and checked to see if there is a sudden movement (panic) of the animal groups: this method is called animal behavior classification (ABC). The second method can be defined as thermal detection (TD): the access points get the temperature values from the MBS devices and send the data to a central computer to check for instant changes in the temperatures. This system may be used for many purposes other than fire detection, namely animal tracking, poaching prevention and detecting instantaneous animal death. PMID:28903281
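
    A rough sketch (assumed, not the paper's implementation) of the thermal-detection (TD) check described in the two records above: flag a possible fire when an MBS reports a sudden temperature jump relative to its recent readings. The window size and threshold are hypothetical.

        from collections import deque

        class ThermalDetector:
            def __init__(self, window=10, jump_threshold_c=8.0):
                self.readings = deque(maxlen=window)   # recent temperatures for one animal
                self.jump_threshold_c = jump_threshold_c

            def update(self, temperature_c):
                """Return True if the new reading is an abrupt rise over the recent average."""
                alarm = False
                if self.readings:
                    baseline = sum(self.readings) / len(self.readings)
                    alarm = (temperature_c - baseline) >= self.jump_threshold_c
                self.readings.append(temperature_c)
                return alarm

        detector = ThermalDetector()
        for t in [21.5, 21.8, 22.0, 22.1, 34.0]:       # last reading simulates a nearby fire
            if detector.update(t):
                print(f"ALERT: sudden temperature rise to {t} °C")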

  13. The origins of lactase persistence in Europe.

    PubMed

    Itan, Yuval; Powell, Adam; Beaumont, Mark A; Burger, Joachim; Thomas, Mark G

    2009-08-01

    Lactase persistence (LP) is common among people of European ancestry, but with the exception of some African, Middle Eastern and southern Asian groups, is rare or absent elsewhere in the world. Lactase gene haplotype conservation around a polymorphism strongly associated with LP in Europeans (−13,910 C/T) indicates that the derived allele is recent in origin and has been subject to strong positive selection. Furthermore, ancient DNA work has shown that the −13,910*T (derived) allele was very rare or absent in early Neolithic central Europeans. It is unlikely that LP would provide a selective advantage without a supply of fresh milk, and this has led to a gene-culture coevolutionary model where lactase persistence is only favoured in cultures practicing dairying, and dairying is more favoured in lactase persistent populations. We have developed a flexible demic computer simulation model to explore the spread of lactase persistence, dairying, other subsistence practices and unlinked genetic markers in Europe and western Asia's geographic space. Using data on −13,910*T allele frequency and farming arrival dates across Europe, and approximate Bayesian computation to estimate parameters of interest, we infer that the −13,910*T allele first underwent selection among dairying farmers around 7,500 years ago in a region between the central Balkans and central Europe, possibly in association with the dissemination of the Neolithic Linearbandkeramik culture over Central Europe. Furthermore, our results suggest that natural selection favouring a lactase persistence allele was not higher in northern latitudes through an increased requirement for dietary vitamin D. Our results provide a coherent and spatially explicit picture of the coevolution of lactase persistence and dairying in Europe.

  14. The Origins of Lactase Persistence in Europe

    PubMed Central

    Itan, Yuval; Powell, Adam; Beaumont, Mark A.; Burger, Joachim; Thomas, Mark G.

    2009-01-01

    Lactase persistence (LP) is common among people of European ancestry, but with the exception of some African, Middle Eastern and southern Asian groups, is rare or absent elsewhere in the world. Lactase gene haplotype conservation around a polymorphism strongly associated with LP in Europeans (−13,910 C/T) indicates that the derived allele is recent in origin and has been subject to strong positive selection. Furthermore, ancient DNA work has shown that the −13,910*T (derived) allele was very rare or absent in early Neolithic central Europeans. It is unlikely that LP would provide a selective advantage without a supply of fresh milk, and this has led to a gene-culture coevolutionary model where lactase persistence is only favoured in cultures practicing dairying, and dairying is more favoured in lactase persistent populations. We have developed a flexible demic computer simulation model to explore the spread of lactase persistence, dairying, other subsistence practices and unlinked genetic markers in Europe and western Asia's geographic space. Using data on −13,910*T allele frequency and farming arrival dates across Europe, and approximate Bayesian computation to estimate parameters of interest, we infer that the −13,910*T allele first underwent selection among dairying farmers around 7,500 years ago in a region between the central Balkans and central Europe, possibly in association with the dissemination of the Neolithic Linearbandkeramik culture over Central Europe. Furthermore, our results suggest that natural selection favouring a lactase persistence allele was not higher in northern latitudes through an increased requirement for dietary vitamin D. Our results provide a coherent and spatially explicit picture of the coevolution of lactase persistence and dairying in Europe. PMID:19714206
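
    The sketch below is a generic approximate Bayesian computation (ABC) rejection sampler, included only to illustrate the inference approach named in the two records above; it is not the authors' demic simulation model, and the prior, tolerance, and toy simulator are placeholders.

        import random

        def abc_rejection(observed_summary, simulate_model, n_draws=10000, tolerance=0.05):
            """Draw a parameter from the prior, simulate a summary statistic, and keep
            the draw if the simulated summary falls within `tolerance` of the observed one."""
            accepted = []
            for _ in range(n_draws):
                theta = random.uniform(0.0, 0.2)       # prior on, e.g., a selection coefficient
                if abs(simulate_model(theta) - observed_summary) <= tolerance:
                    accepted.append(theta)
            return accepted                            # samples from the approximate posterior

        # Toy stand-in simulator: the summary statistic rises noisily with the parameter.
        def toy_simulator(theta):
            return 4.0 * theta + random.gauss(0.0, 0.02)

        posterior = abc_rejection(observed_summary=0.3, simulate_model=toy_simulator)
        print(len(posterior), sum(posterior) / max(len(posterior), 1))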

  15. Command, Control, Communications, Computers and Intelligence Electronic Warfare (C4IEW) Project Book, Fiscal Year 1994. (Non-FOUO Version)

    DTIC Science & Technology

    1994-04-01

    TSW-7A, AIR TRAFFIC CONTROL CENTRAL (ATCC) ... AN/TTC-41(V), CENTRAL OFFICE, TELEPHONE, AUTOMATIC ... MISSILE COUNTERMEASURE DEVICE (MCD) ... a Handheld Terminal Unit (HTU), Portable Computer Unit (PCU), Transportable Computer Unit (TCU), and compatible NOI peripheral devices. All but the ... CLASSIFICATION: ASARC-III, Jun 80, Standard. ... AN/TIC-39 IS A MOBILE, AUTOMATIC, MODULAR ELECTRONIC CIRCUIT SWITCH UNDER PROCESSOR CONTROL WITH INTEGRAL

  16. In silico Interrogation of Insect Central Complex Suggests Computational Roles for the Ellipsoid Body in Spatial Navigation.

    PubMed

    Fiore, Vincenzo G; Kottler, Benjamin; Gu, Xiaosi; Hirth, Frank

    2017-01-01

    The central complex in the insect brain is a composite of midline neuropils involved in processing sensory cues and mediating behavioral outputs to orchestrate spatial navigation. Despite recent advances, however, the neural mechanisms underlying sensory integration and motor action selections have remained largely elusive. In particular, it is not yet understood how the central complex exploits sensory inputs to realize motor functions associated with spatial navigation. Here we report an in silico interrogation of central complex-mediated spatial navigation with a special emphasis on the ellipsoid body. Based on known connectivity and function, we developed a computational model to test how the local connectome of the central complex can mediate sensorimotor integration to guide different forms of behavioral outputs. Our simulations show integration of multiple sensory sources can be effectively performed in the ellipsoid body. This processed information is used to trigger continuous sequences of action selections resulting in self-motion, obstacle avoidance and the navigation of simulated environments of varying complexity. The motor responses to perceived sensory stimuli can be stored in the neural structure of the central complex to simulate navigation relying on a collective of guidance cues, akin to sensory-driven innate or habitual behaviors. By comparing behaviors under different conditions of accessible sources of input information, we show the simulated insect computes visual inputs and body posture to estimate its position in space. Finally, we tested whether the local connectome of the central complex might also allow the flexibility required to recall an intentional behavioral sequence, among different courses of actions. Our simulations suggest that the central complex can encode combined representations of motor and spatial information to pursue a goal and thus successfully guide orientation behavior. Together, the observed computational features identify central complex circuitry, and especially the ellipsoid body, as a key neural correlate involved in spatial navigation.

  17. In silico Interrogation of Insect Central Complex Suggests Computational Roles for the Ellipsoid Body in Spatial Navigation

    PubMed Central

    Fiore, Vincenzo G.; Kottler, Benjamin; Gu, Xiaosi; Hirth, Frank

    2017-01-01

    The central complex in the insect brain is a composite of midline neuropils involved in processing sensory cues and mediating behavioral outputs to orchestrate spatial navigation. Despite recent advances, however, the neural mechanisms underlying sensory integration and motor action selections have remained largely elusive. In particular, it is not yet understood how the central complex exploits sensory inputs to realize motor functions associated with spatial navigation. Here we report an in silico interrogation of central complex-mediated spatial navigation with a special emphasis on the ellipsoid body. Based on known connectivity and function, we developed a computational model to test how the local connectome of the central complex can mediate sensorimotor integration to guide different forms of behavioral outputs. Our simulations show integration of multiple sensory sources can be effectively performed in the ellipsoid body. This processed information is used to trigger continuous sequences of action selections resulting in self-motion, obstacle avoidance and the navigation of simulated environments of varying complexity. The motor responses to perceived sensory stimuli can be stored in the neural structure of the central complex to simulate navigation relying on a collective of guidance cues, akin to sensory-driven innate or habitual behaviors. By comparing behaviors under different conditions of accessible sources of input information, we show the simulated insect computes visual inputs and body posture to estimate its position in space. Finally, we tested whether the local connectome of the central complex might also allow the flexibility required to recall an intentional behavioral sequence, among different courses of actions. Our simulations suggest that the central complex can encode combined representations of motor and spatial information to pursue a goal and thus successfully guide orientation behavior. Together, the observed computational features identify central complex circuitry, and especially the ellipsoid body, as a key neural correlate involved in spatial navigation. PMID:28824390

  18. 20 CFR 404.1588 - Your responsibility to tell us of events that may change your disability status.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... may change your disability status. 404.1588 Section 404.1588 Employees' Benefits SOCIAL SECURITY... issue a receipt to you or your representative at least until a centralized computer file that records... centralized computer file is in place, we will continue to issue receipts to you or your representative if you...

  19. 20 CFR 404.1588 - Your responsibility to tell us of events that may change your disability status.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... may change your disability status. 404.1588 Section 404.1588 Employees' Benefits SOCIAL SECURITY... issue a receipt to you or your representative at least until a centralized computer file that records... centralized computer file is in place, we will continue to issue receipts to you or your representative if you...

  20. 20 CFR 404.1588 - Your responsibility to tell us of events that may change your disability status.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... may change your disability status. 404.1588 Section 404.1588 Employees' Benefits SOCIAL SECURITY... issue a receipt to you or your representative at least until a centralized computer file that records... centralized computer file is in place, we will continue to issue receipts to you or your representative if you...

  1. 20 CFR 404.1588 - Your responsibility to tell us of events that may change your disability status.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... may change your disability status. 404.1588 Section 404.1588 Employees' Benefits SOCIAL SECURITY... issue a receipt to you or your representative at least until a centralized computer file that records... centralized computer file is in place, we will continue to issue receipts to you or your representative if you...

  2. Researcher's guide to the NASA Ames Flight Simulator for Advanced Aircraft (FSAA)

    NASA Technical Reports Server (NTRS)

    Sinacori, J. B.; Stapleford, R. L.; Jewell, W. F.; Lehman, J. M.

    1977-01-01

    Performance, limitations, supporting software, and current checkout and operating procedures are presented for the flight simulator, in terms useful to the researcher who intends to use it. Suggestions to help the researcher prepare the experimental plan are also given. The FSAA's central computer, cockpit, and visual and motion systems are addressed individually but their interaction is considered as well. Data required, available options, user responsibilities, and occupancy procedures are given in a form that facilitates the initial communication required with the NASA operations' group.

  3. Molecular Imaging and Precision Medicine in Uterine and Ovarian Cancers.

    PubMed

    Zukotynski, Katherine A; Kim, Chun K

    2017-10-01

    Gynecologic cancer is a heterogeneous group of diseases both functionally and morphologically. Today, PET coupled with computed tomography (PET/CT) or PET/MR imaging play a central role in the precision medicine algorithm of patients with gynecologic malignancy. In particular, PET/CT and PET/MR imaging are molecular imaging techniques that not only are useful tools for initial staging and restaging but provide anatomofunctional insight and can serve as predictive and prognostic biomarkers of response in patients with gynecologic malignancy. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. An Implemented Strategy for Campus Connectivity and Cooperative Computing.

    ERIC Educational Resources Information Center

    Halaris, Antony S.; Sloan, Lynda W.

    1989-01-01

    ConnectPac, a software package developed at Iona College to allow a computer user to access all services from a single personal computer, is described. ConnectPac uses mainframe computing to support a campus computing network, integrating personal and centralized computing into a menu-driven user environment. (Author/MLW)

  5. Description of data base management systems activities

    NASA Technical Reports Server (NTRS)

    1983-01-01

    One of the major responsibilities of the JPL Computing and Information Services Office is to develop and maintain a JPL plan for providing computing services to the JPL management and administrative community that will lead to improved productivity. The CISO plan to accomplish this objective has been titled 'Management and Administrative Support Systems' (MASS). The MASS plan is based on the continued use of JPL's IBM 3032 Computer system for administrative computing and for the MASS functions. The current candidate administrative Data Base Management Systems required to support the MASS include ADABASE, Cullinane IDMS and TOTAL. Previous uses of administrative Data Base Systems have been applied to specific local functions rather than in a centralized manner with elements common to the many user groups. Limited capacity data base systems have been installed in microprocessor based office automation systems in a few Project and Management Offices using Ashton-Tate dBASE II. These experiences plus some other localized in house DBMS uses have provided an excellent background for developing user and system requirements for a single DBMS to support the MASS program.

  6. Computational biology and bioinformatics in Nigeria.

    PubMed

    Fatumo, Segun A; Adoga, Moses P; Ojo, Opeolu O; Oluwagbemi, Olugbenga; Adeoye, Tolulope; Ewejobi, Itunuoluwa; Adebiyi, Marion; Adebiyi, Ezekiel; Bewaji, Clement; Nashiru, Oyekanmi

    2014-04-01

    Over the past few decades, major advances in the field of molecular biology, coupled with advances in genomic technologies, have led to an explosive growth in the biological data generated by the scientific community. The critical need to process and analyze such a deluge of data and turn it into useful knowledge has caused bioinformatics to gain prominence and importance. Bioinformatics is an interdisciplinary research area that applies techniques, methodologies, and tools in computer and information science to solve biological problems. In Nigeria, bioinformatics has recently played a vital role in the advancement of biological sciences. As a developing country, the importance of bioinformatics is rapidly gaining acceptance, and bioinformatics groups comprised of biologists, computer scientists, and computer engineers are being constituted at Nigerian universities and research institutes. In this article, we present an overview of bioinformatics education and research in Nigeria. We also discuss professional societies and academic and research institutions that play central roles in advancing the discipline in Nigeria. Finally, we propose strategies that can bolster bioinformatics education and support from policy makers in Nigeria, with potential positive implications for other developing countries.

  7. Computational Biology and Bioinformatics in Nigeria

    PubMed Central

    Fatumo, Segun A.; Adoga, Moses P.; Ojo, Opeolu O.; Oluwagbemi, Olugbenga; Adeoye, Tolulope; Ewejobi, Itunuoluwa; Adebiyi, Marion; Adebiyi, Ezekiel; Bewaji, Clement; Nashiru, Oyekanmi

    2014-01-01

    Over the past few decades, major advances in the field of molecular biology, coupled with advances in genomic technologies, have led to an explosive growth in the biological data generated by the scientific community. The critical need to process and analyze such a deluge of data and turn it into useful knowledge has caused bioinformatics to gain prominence and importance. Bioinformatics is an interdisciplinary research area that applies techniques, methodologies, and tools in computer and information science to solve biological problems. In Nigeria, bioinformatics has recently played a vital role in the advancement of biological sciences. As a developing country, the importance of bioinformatics is rapidly gaining acceptance, and bioinformatics groups comprised of biologists, computer scientists, and computer engineers are being constituted at Nigerian universities and research institutes. In this article, we present an overview of bioinformatics education and research in Nigeria. We also discuss professional societies and academic and research institutions that play central roles in advancing the discipline in Nigeria. Finally, we propose strategies that can bolster bioinformatics education and support from policy makers in Nigeria, with potential positive implications for other developing countries. PMID:24763310

  8. Relationship of central incisor implant placement to the ridge configuration anterior to the nasopalatine canal in dentate and partially edentulous individuals: a comparative study

    PubMed Central

    2015-01-01

    Background. The aims of this study were to investigate the ridge contour anterior to the nasopalatine canal, and the difference between the incidences of the nasopalatine canal perforation in dentate and partially edentulous patients by cone-beam computed tomography. Methods. Cone-beam computed tomography scan images from 72 patients were selected from database and divided into dentate and partially edentulous groups. The configuration of the ridge anterior to the canal including palatal concavity depth, palatal concavity height, palatal concavity angle, bone height coronal to the incisive foramen, and bone width anterior to the canal was measured. A virtual implant placement procedure was used, and the incidences of perforation were evaluated after implant placement in the cingulum position with the long axis along with the designed crown. Results. Comparing with variable values from dentate patients, the palatal concavity depth and angle were greater by 0.9 mm and 4°, and bone height was shorter by 1.1 mm in partially edentulous patients, respectively. Bone width in edentulous patients was narrower than in dentate patients by 1.2 mm at incisive foramen level and 0.9 mm at 8 mm subcrestal level, respectively. After 72 virtual cylindrical implants (4.1 × 12 mm) were placed, a total of 12 sites (16.7%) showed a perforation and three-fourths occurred in partially edentulous patients. After replacing with 72 tapered implants (4.3 × 13 mm), only 6 implants (8.3%) broke into the canal in the partially edentulous patient group. Conclusions. The nasopalatine canal may get close to the implant site and the bone width anterior to the canal decreases after the central incisor extraction. The incidence of nasopalatine canal perforation may occur more commonly during delayed implant placement in central incisor missing patients. PMID:26557434

  9. The DFVLR main department for central data processing, 1976 - 1983

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Data processing, equipment and systems operation, operative and user systems, user services, computer networks and communications, text processing, computer graphics, and high power computers are discussed.

  10. Multiphasic Health Testing in the Clinic Setting

    PubMed Central

    LaDou, Joseph

    1971-01-01

    The economy of automated multiphasic health testing (AMHT) activities patterned after the high-volume Kaiser program can be realized in low-volume settings. AMHT units have been operated at daily volumes of 20 patients in three separate clinical environments. These programs have displayed economics entirely compatible with cost figures published by the established high-volume centers. This experience, plus the expanding capability of small, general purpose, digital computers (minicomputers), indicates that a group of six or more physicians generating 20 laboratory appraisals per day can economically justify a completely automated multiphasic health testing facility. This system would reside in the clinic or hospital where it is used and can be configured to do analyses such as electrocardiography and generate laboratory reports, and communicate with large computer systems in university medical centers. Experience indicates that the most effective means of implementing these benefits of automation is to make them directly available to the medical community with the physician playing the central role. Economic justification of a dedicated computer through low-volume health testing then allows, as a side benefit, automation of administrative as well as other diagnostic activities—for example, patient billing, computer-aided diagnosis, and computer-aided therapeutics. PMID:4935771

  11. Locating the Acupoint Baihui (GV20) Beneath the Cerebral Cortex with MRI Reconstructed 3D Neuroimages

    PubMed Central

    Shen, Ein-Yiao; Chen, Fun-Jou; Chen, Yun-Yin; Lin, Ming-Fan

    2011-01-01

    Baihui (GV20) is one of the most important acupoints of the Du meridian (the government vessel) and is commonly used in neurology and psychiatry and as a distal point of anorectal disorders by general practitioners. The anatomical relationship between the scalp region of the acupoint and the underlying corresponding cortex remains obscure. In this study, we first prepared the indicator for MRI scanning on a GE 1.5 T excite machine in a mode suitable for 3D reconstruction. The 3D Avizo software system (version 6.0, Mercury Computer Systems, Inc., Germany) was then used for image processing and the resulting data subsequently analyzed using descriptive statistics and analysis of variance (ANOVA). The mean distance from the Baihui anterior to the central sulcus in the adult group was greater than that in the child group (22.7 ± 2.2 and 19.7 ± 2.2 mm, resp., P = .042), whereas in the child group the distance between the Baihui anterior and the precentral sulcus was greater than in the adult group (6.8 ± 0.8 and 3.8 ± 0.8 mm, resp., P < .001). This MRI presentation demonstrates that the location of Baihui (GV20) can be identified using the distance from the central or precentral sulcus. PMID:21785620

  12. Beam orbit simulation in the central region of the RIKEN AVF cyclotron

    NASA Astrophysics Data System (ADS)

    Toprek, Dragan; Goto, Akira; Yano, Yasushige

    1999-04-01

    This paper describes the modification design of the central region for the h=2 mode of acceleration in the RIKEN AVF cyclotron. We made a small modification to the electrode shape in the central region to optimize the beam transmission. The central region is equipped with an axial injection system. The spiral type inflector is used for axial injection. The electric field distribution in the inflector and in the four acceleration gaps has been numerically calculated from an electric potential map produced by the program RELAX3D. The magnetic field is measured. The geometry of the central region has been tested with the computations of orbits carried out by means of the computer code CYCLONE. The optical properties of the spiral inflector and the central region are studied by using the programs CASINO and CYCLONE, respectively. We have also made an effort to minimize the inflector fringe field effects using the RELAX3D program.

  13. Tomographic Rayleigh-wave group velocities in the Central Valley, California centered on the Sacramento/San Joaquin Delta

    USGS Publications Warehouse

    Fletcher, Jon Peter B.; Erdem, Jemile; Seats, Kevin; Lawrence, Jesse

    2016-01-01

    If shaking from a local or regional earthquake in the San Francisco Bay region were to rupture levees in the Sacramento/San Joaquin Delta then brackish water from San Francisco Bay would contaminate the water in the Delta: the source of fresh water for about half of California. As a prelude to a full shear-wave velocity model that can be used in computer simulations and further seismic hazard analysis, we report on the use of ambient noise tomography to build a fundamental-mode, Rayleigh-wave group velocity model for the region around the Sacramento/San Joaquin Delta in the western Central Valley, California. Recordings from the vertical component of about 31 stations were processed to compute the spatial distribution of Rayleigh wave group velocities. Complex coherency between pairs of stations was stacked over 8 months to more than a year. Dispersion curves were determined from 4 to about 18 seconds. We calculated average group velocities for each period and inverted for deviations from the average for a matrix of cells that covered the study area. Smoothing using the first difference is applied. Cells of the model were about 5.6 km in either dimension. Checkerboard tests of resolution, which are dependent on station density, suggest that the resolving ability of the array is reasonably good within the middle of the array with resolution between 0.2 and 0.4 degrees. Overall, low velocities in the middle of each image reflect the deeper sedimentary syncline in the Central Valley. In detail, the model shows several centers of low velocity that may be associated with gross geologic features such as faulting along the western margin of the Central Valley, oil and gas reservoirs, and large cross cutting features like the Stockton arch. At shorter periods around 5.5s, the model's western boundary between low and high velocities closely follows regional fault geometry and the edge of a residual isostatic gravity low. In the eastern part of the valley, the boundaries of the low velocity zone and gravity anomaly are better aligned at longer periods (around 10.5s) suggesting that the eastern edge of the gravity low is associated with deeper structure. There is a strong correspondence between a low in gravity near the Kirby Hills fault and low velocities from the ambient noise tomography. At longer periods, higher velocities creep in from the east and narrow the overall dimension defined by the lower velocities. Overall, there is a strong correspondence between the shape and location of low velocities in the Rayleigh wave velocity images, and geological and geophysical features.

  14. Tomographic Rayleigh wave group velocities in the Central Valley, California, centered on the Sacramento/San Joaquin Delta

    NASA Astrophysics Data System (ADS)

    Fletcher, Jon B.; Erdem, Jemile; Seats, Kevin; Lawrence, Jesse

    2016-04-01

    If shaking from a local or regional earthquake in the San Francisco Bay region were to rupture levees in the Sacramento/San Joaquin Delta, then brackish water from San Francisco Bay would contaminate the water in the Delta: the source of freshwater for about half of California. As a prelude to a full shear-wave velocity model that can be used in computer simulations and further seismic hazard analysis, we report on the use of ambient noise tomography to build a fundamental mode, Rayleigh wave group velocity model for the region around the Sacramento/San Joaquin Delta in the western Central Valley, California. Recordings from the vertical component of about 31 stations were processed to compute the spatial distribution of Rayleigh wave group velocities. Complex coherency between pairs of stations was stacked over 8 months to more than a year. Dispersion curves were determined from 4 to about 18 s. We calculated average group velocities for each period and inverted for deviations from the average for a matrix of cells that covered the study area. Smoothing using the first difference is applied. Cells of the model were about 5.6 km in either dimension. Checkerboard tests of resolution, which are dependent on station density, suggest that the resolving ability of the array is reasonably good within the middle of the array with resolution between 0.2 and 0.4°. Overall, low velocities in the middle of each image reflect the deeper sedimentary syncline in the Central Valley. In detail, the model shows several centers of low velocity that may be associated with gross geologic features such as faulting along the western margin of the Central Valley, oil and gas reservoirs, and large crosscutting features like the Stockton arch. At shorter periods around 5.5 s, the model's western boundary between low and high velocities closely follows regional fault geometry and the edge of a residual isostatic gravity low. In the eastern part of the valley, the boundaries of the low-velocity zone and gravity anomaly are better aligned at longer periods (around 10.5 s) suggesting that the eastern edge of the gravity low is associated with deeper structure. There is a strong correspondence between a low in gravity near the Kirby Hills fault and low velocities from the ambient noise tomography. At longer periods, higher velocities creep in from the east and narrow the overall dimension defined by the lower velocities. Overall, there is a strong correspondence between the shape and location of low velocities in the Rayleigh wave velocity images, and geological and geophysical features.
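
    A schematic sketch (assumed, not the authors' code) of the inversion step described in the two records above: solve for per-cell deviations from the average group slowness, with a first-difference smoothing constraint appended to the least-squares system. The cell ordering is simplified to one dimension and the synthetic paths are made up.

        import numpy as np

        def invert_cells(G, d, n_cells, smooth=1.0):
            """G: (n_paths, n_cells) path lengths in each cell; d: travel-time residuals
            relative to the average group velocity. Returns per-cell slowness deviations."""
            D = np.zeros((n_cells - 1, n_cells))       # first-difference smoothing operator
            for i in range(n_cells - 1):
                D[i, i], D[i, i + 1] = -1.0, 1.0
            A = np.vstack([G, smooth * D])
            b = np.concatenate([d, np.zeros(n_cells - 1)])
            m, *_ = np.linalg.lstsq(A, b, rcond=None)
            return m

        # Tiny synthetic example: 3 ray paths crossing 4 cells of ~5.6 km.
        G = np.array([[5.6, 5.6, 0.0, 0.0],
                      [0.0, 5.6, 5.6, 0.0],
                      [0.0, 0.0, 5.6, 5.6]])
        d = np.array([0.10, 0.25, 0.12])               # seconds of residual travel time
        print(invert_cells(G, d, n_cells=4))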

  15. BNL ATLAS Grid Computing

    ScienceCinema

    Michael Ernst

    2017-12-09

    As the sole Tier-1 computing facility for ATLAS in the United States and the largest ATLAS computing center worldwide, Brookhaven provides a large portion of the overall computing resources for U.S. collaborators and serves as the central hub for storing,

  16. Free recall behaviour in children with and without spelling impairment: the impact of working memory subcapacities.

    PubMed

    Malstädt, Nadine; Hasselhorn, Marcus; Lehmann, Martin

    2012-11-01

    This study examined supraspan free recall in children with and without spelling impairment. A repeated free recall task involving overt rehearsal and three computer-based adaptive working memory tasks were administered to 54 eight-year-old children. Children without spelling impairments tended to recall more items than did those children with spelling deficits. Video analyses revealed that recall behaviour was similar in impaired and unimpaired children, indicating that both groups applied similar learning activities. Group differences in number of recalled items were attributed to differences in working memory subcapacities between children with and without spelling impairment, especially with regard to central executive and phonological loop functioning. Copyright © 2012 John Wiley & Sons, Ltd.

  17. Constraining the Single-degenerate Channel of Type Ia Supernovae with Stable Iron-group Elements in SNR 3C 397

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dave, Pranav; Kashyap, Rahul; Fisher, Robert

    Recent Suzaku X-ray spectra of supernova remnant (SNR) 3C 397 indicate enhanced stable iron group element abundances of Ni, Mn, Cr, and Fe. Seeking to address key questions about the progenitor and explosion mechanism of 3C 397, we compute nucleosynthetic yields from a suite of multidimensional hydrodynamics models in the near-Chandrasekhar-mass, single-degenerate paradigm for Type Ia supernovae (SNe Ia). Varying the progenitor white dwarf (WD) internal structure, composition, ignition, and explosion mechanism, we find that the best match to the observed iron peak elements of 3C 397 are dense (central density ≥6 × 10⁹ g cm⁻³), low-carbon WDs that undergo a weak, centrally ignited deflagration, followed by a subsequent detonation. The amount of ⁵⁶Ni produced is consistent with a normal or bright normal SNe Ia. A pure deflagration of a centrally ignited, low central density (≃2 × 10⁹ g cm⁻³) progenitor WD, frequently considered in the literature, is also found to produce good agreement with 3C 397 nucleosynthetic yields, but leads to a subluminous SN Ia event, in conflict with X-ray line width data. Additionally, in contrast to prior work that suggested a large supersolar metallicity for the WD progenitor for SNR 3C 397, we find satisfactory agreement for solar- and subsolar-metallicity progenitors. We discuss a range of implications our results have for the single-degenerate channel.

  18. A Faster Parallel Algorithm and Efficient Multithreaded Implementations for Evaluating Betweenness Centrality on Massive Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Madduri, Kamesh; Ediger, David; Jiang, Karl

    2009-05-29

    We present a new lock-free parallel algorithm for computing betweenness centrality of massive small-world networks. With minor changes to the data structures, our algorithm also achieves better spatial cache locality compared to previous approaches. Betweenness centrality is a key algorithm kernel in the HPCS SSCA#2 Graph Analysis benchmark, which has been extensively used to evaluate the performance of emerging high-performance computing architectures for graph-theoretic computations. We design optimized implementations of betweenness centrality and the SSCA#2 benchmark for two hardware multithreaded systems: a Cray XMT system with the ThreadStorm processor, and a single-socket Sun multicore server with the UltraSparc T2 processor. For a small-world network of 134 million vertices and 1.073 billion edges, the 16-processor XMT system and the 8-core Sun Fire T5120 server achieve TEPS scores (an algorithmic performance count for the SSCA#2 benchmark) of 160 million and 90 million respectively, which corresponds to more than a 2X performance improvement over the previous parallel implementations. To better characterize the performance of these multithreaded systems, we correlate the SSCA#2 performance results with data from the memory-intensive STREAM and RandomAccess benchmarks. Finally, we demonstrate the applicability of our implementation to analyze massive real-world datasets by computing approximate betweenness centrality for a large-scale IMDb movie-actor network.
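
    The kernel being parallelized above is Brandes' betweenness centrality algorithm. A minimal sequential Python sketch of that kernel is shown below for reference; the lock-free, multithreaded data structures that are the paper's actual contribution are not reproduced, and the small example graph is invented.

    ```python
    from collections import deque, defaultdict

    def betweenness_centrality(adj):
        """Sequential Brandes algorithm for an unweighted graph.
        adj maps each vertex to an iterable of its neighbours; for an
        undirected graph the resulting scores should be halved."""
        bc = dict.fromkeys(adj, 0.0)
        for s in adj:
            # Phase 1: BFS from s, counting shortest paths (sigma).
            stack = []
            preds = defaultdict(list)
            sigma = dict.fromkeys(adj, 0.0)
            sigma[s] = 1.0
            dist = dict.fromkeys(adj, -1)
            dist[s] = 0
            queue = deque([s])
            while queue:
                v = queue.popleft()
                stack.append(v)
                for w in adj[v]:
                    if dist[w] < 0:
                        dist[w] = dist[v] + 1
                        queue.append(w)
                    if dist[w] == dist[v] + 1:
                        sigma[w] += sigma[v]
                        preds[w].append(v)
            # Phase 2: back-propagate pair dependencies (delta).
            delta = dict.fromkeys(adj, 0.0)
            while stack:
                w = stack.pop()
                for v in preds[w]:
                    delta[v] += (sigma[v] / sigma[w]) * (1.0 + delta[w])
                if w != s:
                    bc[w] += delta[w]
        return bc

    # Small undirected example, given as symmetric adjacency lists.
    adj = {0: [1], 1: [0, 2, 3], 2: [1, 3], 3: [1, 2, 4], 4: [3]}
    print(betweenness_centrality(adj))
    ```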

  19. Eigenvector centrality for geometric and topological characterization of porous media

    NASA Astrophysics Data System (ADS)

    Jimenez-Martinez, Joaquin; Negre, Christian F. A.

    2017-07-01

    Solving flow and transport through complex geometries such as porous media is computationally difficult. Such calculations usually involve the solution of a system of discretized differential equations, which could lead to extreme computational cost depending on the size of the domain and the accuracy of the model. Geometric simplifications like pore networks, where the pores are represented by nodes and the pore throats by edges connecting pores, have been proposed. These models, despite their ability to preserve the connectivity of the medium, have difficulties capturing preferential paths (high velocity) and stagnation zones (low velocity), as they do not consider the specific relations between nodes. Nonetheless, network theory approaches, where a complex network is a graph, can help to simplify and better understand fluid dynamics and transport in porous media. Here we present an alternative method to address these issues based on eigenvector centrality, which has been corrected to overcome the centralization problem and modified to introduce a bias in the centrality distribution along a particular direction to address the flow and transport anisotropy in porous media. We compare the model predictions with millifluidic transport experiments, which shows that, albeit simple, this technique is computationally efficient and has potential for predicting preferential paths and stagnation zones for flow and transport in porous media. We propose to use the eigenvector centrality probability distribution to compute the entropy as an indicator of the "mixing capacity" of the system.
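
    A bare-bones version of the centrality computation described here can be written in a few lines: power iteration for the leading eigenvector of the pore-network adjacency matrix, followed by the Shannon entropy of the resulting distribution as the proposed "mixing capacity" indicator. The directional bias and the centralization correction discussed in the abstract are omitted, and the toy network below is invented.

    ```python
    import numpy as np

    def eigenvector_centrality(A, iters=1000, tol=1e-10):
        """Power iteration for the leading eigenvector of a non-negative,
        symmetric pore-network adjacency matrix A."""
        x = np.ones(A.shape[0]) / A.shape[0]
        for _ in range(iters):
            x_next = A @ x
            x_next /= np.linalg.norm(x_next)
            if np.linalg.norm(x_next - x) < tol:
                return np.abs(x_next)
            x = x_next
        return np.abs(x)

    def mixing_entropy(centrality):
        """Shannon entropy of the normalized centrality distribution, used as a
        rough indicator of the network's mixing capacity."""
        p = centrality / centrality.sum()
        p = p[p > 0]
        return float(-(p * np.log(p)).sum())

    # Toy pore network: 5 pores (nodes), pore throats as edges.
    A = np.array([[0, 1, 1, 0, 0],
                  [1, 0, 1, 1, 0],
                  [1, 1, 0, 1, 0],
                  [0, 1, 1, 0, 1],
                  [0, 0, 0, 1, 0]], dtype=float)
    c = eigenvector_centrality(A)
    print(c.round(3), round(mixing_entropy(c), 3))
    ```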

  20. [CONTENT OF TRANS FATTY ACIDS IN FOOD PRODUCTS IN SPAIN].

    PubMed

    Robledo de Dios, Teresa; Dal Re Saavedra, M Ángeles; Villar Villalba, Carmen; Pérez-Farinós, Napoleón

    2015-09-01

    Trans fatty acids are associated with several health disorders, such as ischemic heart disease or diabetes mellitus. The aim was to assess the content of trans fatty acids in food products in Spain, and the percentage of trans fatty acids with respect to total fatty acids. A total of 443 food products were acquired in Spain and classified into groups. The fatty acid content was analyzed using gas chromatography. Estimates of central tendency and variability of the content of trans fatty acids in each food group were computed (in g of trans fatty acids/100 g of product). The percentage of trans fatty acids with respect to total fatty acids was calculated in each group. The 443 products were grouped into 42 groups. The median trans fatty acid content was less than 0.55 g/100 g of product in all groups except one. 83% of groups had less than 2% of trans fatty acids, and 71% of groups had less than 1%. The content of trans fatty acids in food products in Spain is low and does not currently pose a public health problem. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
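
    The descriptive computation described here (central tendency of trans fatty acid content per food group and the share of trans fatty acids among total fatty acids) amounts to a grouped summary; a small pandas sketch with invented example values is shown below.

    ```python
    import pandas as pd

    # Hypothetical records mirroring the study design; the numbers are invented.
    df = pd.DataFrame({
        "group": ["biscuits", "biscuits", "margarine", "margarine"],
        "tfa_g_per_100g": [0.30, 0.45, 0.20, 0.25],
        "total_fa_g_per_100g": [20.0, 22.0, 60.0, 55.0],
    })

    # Share of trans fatty acids among total fatty acids, per product.
    df["tfa_pct_of_fat"] = 100 * df["tfa_g_per_100g"] / df["total_fa_g_per_100g"]

    # Central tendency and spread per food group.
    print(df.groupby("group")[["tfa_g_per_100g", "tfa_pct_of_fat"]].agg(["median", "max"]))
    ```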

  1. Gauging Spatial Symmetries and the Classification of Topological Crystalline Phases

    NASA Astrophysics Data System (ADS)

    Thorngren, Ryan; Else, Dominic V.

    2018-01-01

    We put the theory of interacting topological crystalline phases on a systematic footing. These are topological phases protected by space-group symmetries. Our central tool is an elucidation of what it means to "gauge" such symmetries. We introduce the notion of a crystalline topological liquid and argue that most (and perhaps all) phases of interest are likely to satisfy this criterion. We prove a crystalline equivalence principle, which states that in Euclidean space, crystalline topological liquids with symmetry group G are in one-to-one correspondence with topological phases protected by the same symmetry G, but acting internally, where if an element of G is orientation reversing, it is realized as an antiunitary symmetry in the internal symmetry group. As an example, we explicitly compute, using group cohomology, a partial classification of bosonic symmetry-protected topological phases protected by crystalline symmetries in (3+1) dimensions for 227 of the 230 space groups. For the 65 space groups not containing orientation-reversing elements (Sohncke groups), there are no cobordism invariants that may contribute phases beyond group cohomology, so we conjecture that our classification is complete.

  2. Evaluating Computer Technology Integration in a Centralized School System

    ERIC Educational Resources Information Center

    Eteokleous, N.

    2008-01-01

    The study evaluated the current situation in Cyprus elementary classrooms regarding computer technology integration in an attempt to identify ways of expanding teachers' and students' experiences with computer technology. It examined how Cypriot elementary teachers use computers, and the factors that influence computer integration in their…

  3. Methodological considerations for the evaluation of EEG mapping data: a practical example based on a placebo/diazepam crossover trial.

    PubMed

    Jähnig, P; Jobert, M

    1995-01-01

    Quantitative EEG is a sensitive method for measuring pharmacological effects on the central nervous system. Nowadays, computers enable EEG data to be stored and spectral parameters to be computed for signals obtained from a large number of electrode locations. However, the statistical analysis of such vast amounts of EEG data is complicated due to the limited number of subjects usually involved in pharmacological studies. In the present study, data from a trial aimed at comparing diazepam and placebo were used to investigate different properties of EEG mapping data and to compare different methods of data analysis. Both the topography and the temporal changes of EEG activity were investigated using descriptive data analysis, which is based on an inspection of patterns of pd values (descriptive p values) assessed for all pair-wise tests for differences in time or treatment. An empirical measure (tri-mean) for the computation of group maps is suggested, allowing a better description of group effects with skewed data of small sample size. Finally, both the investigation of maps based on principal component analysis and the notion of distance between maps are discussed and applied to the analysis of the data collected under diazepam treatment, exemplifying the evaluation of pharmacodynamic drug effects.
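
    The "tri-mean" suggested for computing group maps is not defined further in the abstract; assuming it refers to Tukey's trimean, a per-electrode group map could be computed as in the following sketch, where the subject-by-electrode data are simulated.

    ```python
    import numpy as np

    def trimean(values, axis=0):
        """Tukey's trimean, (Q1 + 2*median + Q3) / 4 -- a robust location
        estimate that down-weights outliers in small, skewed samples."""
        q1, med, q3 = np.percentile(values, [25, 50, 75], axis=axis)
        return (q1 + 2.0 * med + q3) / 4.0

    # Hypothetical group data: 12 subjects x 19 electrodes of spectral power.
    rng = np.random.default_rng(0)
    power = rng.lognormal(mean=0.0, sigma=0.8, size=(12, 19))

    group_map = trimean(power, axis=0)   # one robust value per electrode
    print(group_map.round(3))
    ```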

  4. Conceptual Modeling in the Time of the Revolution: Part II

    NASA Astrophysics Data System (ADS)

    Mylopoulos, John

    Conceptual Modeling was a marginal research topic at the very fringes of Computer Science in the 60s and 70s, when the discipline was dominated by topics focusing on programs, systems and hardware architectures. Over the years, however, the field has moved to centre stage and has come to claim a central role both in Computer Science research and practice in diverse areas, such as Software Engineering, Databases, Information Systems, the Semantic Web, Business Process Management, Service-Oriented Computing, Multi-Agent Systems, Knowledge Management, and more. The transformation was greatly aided by the adoption of standards in modeling languages (e.g., UML), and model-based methodologies (e.g., Model-Driven Architectures) by the Object Management Group (OMG) and other standards organizations. We briefly review the history of the field over the past 40 years, focusing on the evolution of key ideas. We then note some open challenges and report on-going research, covering topics such as the representation of variability in conceptual models, capturing model intentions, and models of laws.

  5. Cone-beam computed tomography analysis of curved root canals after mechanical preparation with three nickel-titanium rotary instruments

    PubMed Central

    Elsherief, Samia M.; Zayet, Mohamed K.; Hamouda, Ibrahim M.

    2013-01-01

    Cone beam computed tomography is a 3-dimensional high resolution imaging method. The purpose of this study was to compare the effects of 3 different NiTi rotary instruments used to prepare curved root canals on the final shape of the curved canals and total amount of root canal transportation by using cone-beam computed tomography. A total of 81 mesial root canals from 42 extracted human mandibular molars, with a curvature ranging from 15 to 45 degrees, were selected. Canals were randomly divided into 3 groups of 27 each. After preparation with Protaper, Revo-S and Hero Shaper, the amount of transportation and centering ability that occurred were assessed by using cone beam computed tomography. Utilizing pre- and post-instrumentation radiographs, straightening of the canal curvatures was determined with a computer image analysis program. Canals were metrically assessed for changes (surface area, changes in curvature and transportation) during canal preparation by using software SimPlant; instrument failures were also recorded. Mean total widths and outer and inner width measurements were determined on each central canal path and differences were statistically analyzed. The results showed that all instruments maintained the original canal curvature well with no significant differences between the different files (P = 0.226). During preparation there was failure of only one file (the protaper group). In conclusion, under the conditions of this study, all instruments maintained the original canal curvature well and were safe to use. Areas of uninstrumented root canal wall were left in all regions using the various systems. PMID:23885273

  6. AGIS: Integration of new technologies used in ATLAS Distributed Computing

    NASA Astrophysics Data System (ADS)

    Anisenkov, Alexey; Di Girolamo, Alessandro; Alandes Pradillo, Maria

    2017-10-01

    The variety of the ATLAS Distributed Computing infrastructure requires a central information system to define the topology of computing resources and to store the different parameters and configuration data needed by various ATLAS software components. The ATLAS Grid Information System (AGIS) is the system designed to integrate configuration and status information about resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing applications and services. Being an intermediate middleware system between clients and external information sources (like central BDII, GOCDB, MyOSG), AGIS defines the relations between experiment-specific resources and physical distributed computing capabilities. Having been in production during LHC Run 1, AGIS became the central information system for Distributed Computing in ATLAS and is continuously evolving to fulfil new user requests, enable enhanced operations and follow the extension of the ATLAS Computing model. The ATLAS Computing model and the data structures used by Distributed Computing applications and services are continuously evolving to fit newer requirements from the ADC community. In this note, we describe the evolution and recent developments of AGIS functionalities related to the integration of new technologies that have recently become widely used in ATLAS Computing, such as flexible utilization of opportunistic Cloud and HPC resources, ObjectStore services integration for the Distributed Data Management (Rucio) and ATLAS workload management (PanDA) systems, and the unified storage protocol declarations required for PanDA Pilot site movers, among others. Improvements of the information model and general updates are also shown; in particular, we explain how other collaborations outside ATLAS could benefit from the system as a computing resources information catalogue. AGIS is evolving towards a common information system, not coupled to a specific experiment.

  7. Computational and Biochemical Docking of the Irreversible Cocaine Analog RTI 82 Directly Demonstrates Ligand Positioning in the Dopamine Transporter Central Substrate-binding Site*

    PubMed Central

    Dahal, Rejwi Acharya; Pramod, Akula Bala; Sharma, Babita; Krout, Danielle; Foster, James D.; Cha, Joo Hwan; Cao, Jianjing; Newman, Amy Hauck; Lever, John R.; Vaughan, Roxanne A.; Henry, L. Keith

    2014-01-01

    The dopamine transporter (DAT) functions as a key regulator of dopaminergic neurotransmission via re-uptake of synaptic dopamine (DA). Cocaine binding to DAT blocks this activity and elevates extracellular DA, leading to psychomotor stimulation and addiction, but the mechanisms by which cocaine interacts with DAT and inhibits transport remain incompletely understood. Here, we addressed these questions using computational and biochemical methodologies to localize the binding and adduction sites of the photoactivatable irreversible cocaine analog 3β-(p-chlorophenyl)tropane-2β-carboxylic acid, 4′-azido-3′-iodophenylethyl ester ([125I]RTI 82). Comparative modeling and small molecule docking indicated that the tropane pharmacophore of RTI 82 was positioned in the central DA active site with an orientation that juxtaposed the aryliodoazide group for cross-linking to rat DAT Phe-319. This prediction was verified by focused methionine substitution of residues flanking this site followed by cyanogen bromide mapping of the [125I]RTI 82-labeled mutants and by the substituted cysteine accessibility method protection analyses. These findings provide positive functional evidence linking tropane pharmacophore interaction with the core substrate-binding site and support a competitive mechanism for transport inhibition. This synergistic application of computational and biochemical methodologies overcomes many uncertainties inherent in other approaches and furnishes a schematic framework for elucidating the ligand-protein interactions of other classes of DA transport inhibitors. PMID:25179220

  8. The Lilongwe Central Hospital Patient Management Information System: A Success in Computer-Based Order Entry Where One Might Least Expect It

    PubMed Central

    Douglas, GP; Deula, RA; Connor, SE

    2003-01-01

    Computer-based order entry is a powerful tool for enhancing patient care. A pilot project in the pediatric department of the Lilongwe Central Hospital (LCH) in Malawi, Africa has demonstrated that computer-based order entry (COE): 1) can be successfully deployed and adopted in resource-poor settings, 2) can be built, deployed and sustained at relatively low cost and with local resources, and 3) has a greater potential to improve patient care in developing than in developed countries. PMID:14728338

  9. Unified algorithm of cone optics to compute solar flux on central receiver

    NASA Astrophysics Data System (ADS)

    Grigoriev, Victor; Corsi, Clotilde

    2017-06-01

    Analytical algorithms to compute the flux distribution on a central receiver are considered a faster alternative to ray tracing. Many variants of these algorithms exist, with HFLCAL and UNIZAR being the most recognized and verified. In this work, a generalized algorithm is presented that is valid for an arbitrary sun shape with radial symmetry. Heliostat mirrors can have a nonrectangular profile, and the effects of shading and blocking, strong defocusing, and astigmatism can be taken into account. The algorithm is suitable for parallel computing and can benefit from hardware acceleration of polygon texturing.

  10. [Groupamatic 360 C1 and automated blood donor processing in a transfusion center].

    PubMed

    Guimbretiere, J; Toscer, M; Harousseau, H

    1978-03-01

    Automation of the donor management flow path is controlled by: --a 3-slip "port a punch" card, --the groupamatic unit with results sorted out on punch paper tape, --the management computer off-line connected to the groupamatic. Data tracking at blood collection time is done by punching a card with the donor card used as a master card. The groupamatic performs: --a standard blood grouping with one run for registered donors and two runs for new donors, --a phenotyping with two runs, --a screening of irregular antibodies. The management computer checks the correlation between the data of the two runs or the data of a single run and that of the previous file. It updates the data resident in the central file and prints out: --the controls of the different blood groups for the red cell panel, --the listing of error messages, --the listing of emergency call-ups, --the listing of collected blood units when they arrive at the blood center, with quantitative and qualitative information such as: number of blood units collected, donor addresses, etc., --statistics, --donor cards, --diplomas.

  11. Participant, Rater, and Computer Measures of Coherence in Posttraumatic Stress Disorder

    PubMed Central

    Rubin, David C.; Deffler, Samantha A.; Ogle, Christin M.; Dowell, Nia M.; Graesser, Arthur C.; Beckham, Jean C.

    2015-01-01

    We examined the coherence of trauma memories in a trauma-exposed community sample of 30 adults with and 30 without PTSD. The groups had similar categories of traumas and were matched on multiple factors that could affect the coherence of memories. We compared the transcribed oral trauma memories of participants with their most important and most positive memories. A comprehensive set of 28 measures of coherence including 3 ratings by the participants, 7 ratings by outside raters, and 18 computer-scored measures, provided a variety of approaches to defining and measuring coherence. A MANOVA indicated differences in coherence among the trauma, important, and positive memories, but not between the diagnostic groups or their interaction with these memory types. Most differences were small in magnitude; in some cases, the trauma memories were more, rather than less, coherent than the control memories. Where differences existed, the results agreed with the existing literature, suggesting that factors other than the incoherence of trauma memories are most likely to be central to the maintenance of PTSD and thus its treatment. PMID:26523945

  12. Centralized Planning for Multiple Exploratory Robots

    NASA Technical Reports Server (NTRS)

    Estlin, Tara; Rabideau, Gregg; Chien, Steve; Barrett, Anthony

    2005-01-01

    A computer program automatically generates plans for a group of robotic vehicles (rovers) engaged in geological exploration of terrain. The program rapidly generates multiple command sequences that can be executed simultaneously by the rovers. Starting from a set of high-level goals, the program creates a sequence of commands for each rover while respecting hardware constraints and limitations on resources of each rover and of hardware (e.g., a radio communication terminal) shared by all the rovers. First, a separate model of each rover is loaded into a centralized planning subprogram. The centralized planning software uses the models of the rovers plus an iterative repair algorithm to resolve conflicts posed by demands for resources and by constraints associated with all the rovers and the shared hardware. During repair, heuristics are used to make planning decisions that will result in solutions that will be better and will be found faster than would otherwise be possible. In particular, techniques from prior solutions of the multiple-traveling-salesmen problem are used as heuristics to generate plans in which the paths taken by the rovers to assigned scientific targets are shorter than they would otherwise be.
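
    To make the multiple-traveling-salesmen flavor of these path heuristics concrete, the sketch below greedily assigns science targets to the closest rover path end. It is only an illustration of that class of heuristic, not the iterative-repair planner described above; the rover start positions and targets are invented.

    ```python
    import math

    def assign_targets(rover_starts, targets):
        """Greedy nearest-neighbour heuristic in the spirit of
        multiple-traveling-salesmen planning: each target is given to the
        rover whose current path end is closest, shortening total travel."""
        paths = {r: [pos] for r, pos in rover_starts.items()}
        remaining = list(targets)
        while remaining:
            best = None
            for t in remaining:
                for r, path in paths.items():
                    d = math.dist(path[-1], t)
                    if best is None or d < best[0]:
                        best = (d, r, t)
            _, r, t = best
            paths[r].append(t)          # extend the winning rover's path
            remaining.remove(t)
        return paths

    rovers = {"rover1": (0.0, 0.0), "rover2": (10.0, 0.0)}
    targets = [(1.0, 2.0), (9.0, 1.0), (5.0, 5.0), (2.0, 8.0)]
    print(assign_targets(rovers, targets))
    ```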

  13. General-purpose interface bus for multiuser, multitasking computer system

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.; Roth, Don J.; Stang, David B.

    1990-01-01

    The architecture of a multiuser, multitasking, virtual-memory computer system intended for use by a medium-size research group is described. There are three central processing units (CPU) in the configuration, each with 16 MB memory, and two 474 MB hard disks attached. CPU 1 is designed for data analysis and contains an array processor for fast Fourier transformations. In addition, CPU 1 shares display images viewed with the image processor. CPU 2 is designed for image analysis and display. CPU 3 is designed for data acquisition and contains 8 GPIB channels and an analog-to-digital conversion input/output interface with 16 channels. Up to 9 users can access the third CPU simultaneously for data acquisition. Focus is placed on the optimization of hardware interfaces and software, facilitating instrument control, data acquisition, and processing.

  14. Functional Analysis and Preliminary Specifications for a Single Integrated Central Computer System for Secondary Schools and Junior Colleges. Interim Report.

    ERIC Educational Resources Information Center

    1968

    The present report proposes a central computing facility and presents the preliminary specifications for such a system. It is based, in part, on the results of earlier studies by two previous contractors on behalf of the U.S. Office of Education. The recommendations are based upon the present contractor's considered evaluation of the earlier…

  15. Geometric and topological characterization of porous media: insights from eigenvector centrality

    NASA Astrophysics Data System (ADS)

    Jimenez-Martinez, J.; Negre, C.

    2017-12-01

    Solving flow and transport through complex geometries such as porous media involves an extreme computational cost. Simplifications such as pore networks, where the pores are represented by nodes and the pore throats by edges connecting pores, have been proposed. These models have the ability to preserve the connectivity of the medium. However, they have difficulties capturing preferential paths (high velocity) and stagnation zones (low velocity), as they do not consider the specific relations between nodes. Network theory approaches, where the complex network is conceptualized like a graph, can help to simplify and better understand fluid dynamics and transport in porous media. To address this issue, we propose a method based on eigenvector centrality. It has been corrected to overcome the centralization problem and modified to introduce a bias in the centrality distribution along a particular direction which allows considering the flow and transport anisotropy in porous media. The model predictions are compared with millifluidic transport experiments, showing that this technique is computationally efficient and has potential for predicting preferential paths and stagnation zones for flow and transport in porous media. Entropy computed from the eigenvector centrality probability distribution is proposed as an indicator of the "mixing capacity" of the system.

  16. The effect of soft tissue distraction on deformity recurrence after centralization for radial longitudinal deficiency.

    PubMed

    Manske, M Claire; Wall, Lindley B; Steffen, Jennifer A; Goldfarb, Charles A

    2014-05-01

    To assess recurrence and complications in children with radial longitudinal deficiency treated with or without external fixator soft tissue distraction prior to centralization. Thirteen upper extremities treated with centralization alone were compared with 13 treated with ring fixator distraction followed by centralization. Resting wrist position between the 2 groups was compared before surgery, approximately 2 years after surgery (midterm), and at final follow-up, which was at a mean of 10 years for the centralization-alone group and 6 years for the distraction group. Radiographs were reviewed for hand-forearm angle, hand-forearm position, volar carpal subluxation, ulnar length, and physeal integrity. The clinical resting wrist position was improved significantly after surgery and at final follow-up in both groups, but recurrence was worse at final follow-up in the distraction group patients. Radiographically, in the centralization alone group, the hand-forearm angle improved from 53° before surgery to 13° at midterm but worsened to 27° at final follow-up. In the distraction group, the hand-forearm angle improved from 53° before surgery to 21° at midterm but worsened to 36° at final follow-up. The hand-forearm position improved between preoperative and final assessment in both groups, but at final follow-up, the centralization-alone group had a significantly better position. Volar subluxation was 4 mm improved in the centralization alone group and 2 mm worse in the distraction group at final follow-up. Centralization, with or without distraction with an external fixator, resulted in improved alignment of the wrist. Distraction facilitated centralization, but it did not prevent deformity recurrence and was associated with a worse final radial deviation and volar subluxation position compared with wrists treated with centralization alone. Therapeutic III. Copyright © 2014 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  17. TOWARD A COMPUTER BASED INSTRUCTIONAL SYSTEM.

    ERIC Educational Resources Information Center

    GARIGLIO, LAWRENCE M.; RODGERS, WILLIAM A.

    THE INFORMATION FOR THIS REPORT WAS OBTAINED FROM VARIOUS COMPUTER ASSISTED INSTRUCTION INSTALLATIONS. COMPUTER BASED INSTRUCTION REFERS TO A SYSTEM AIMED AT INDIVIDUALIZED INSTRUCTION, WITH THE COMPUTER AS CENTRAL CONTROL. SUCH A SYSTEM HAS 3 MAJOR SUBSYSTEMS--INSTRUCTIONAL, RESEARCH, AND MANAGERIAL. THIS REPORT EMPHASIZES THE INSTRUCTIONAL…

  18. Does methamphetamine affect bone metabolism?

    PubMed

    Tomita, Masafumi; Katsuyama, Hironobu; Watanabe, Yoko; Okuyama, Toshiko; Fushimi, Shigeko; Ishikawa, Takaki; Nata, Masayuki; Miyamoto, Osamu

    2014-05-07

    There is a close relationship between the central nervous system activity and bone metabolism. Therefore, methamphetamine (METH), which stimulates the central nervous system, is expected to affect bone turnover. The aim of this study was to investigate the role of METH in bone metabolism. Mice were divided into 3 groups, the control group receiving saline injections, and the 5 and 10mg/kg METH groups (n=6 in each group). All groups received an injection of saline or METH every other day for 8 weeks. Bone mineral density (BMD) was assessed by X-ray computed tomography. We examined biochemical markers and histomorphometric changes in the second cancellous bone of the left femoral distal end. The animals that were administered 5mg/kg METH showed an increased locomotor activity, whereas those receiving 10mg/kg displayed an abnormal and stereotyped behavior. Serum calcium and phosphorus concentrations were normal compared to the controls, whereas the serum protein concentration was lower in the METH groups. BMD was unchanged in all groups. Bone formation markers such as alkaline phosphatase and osteocalcin significantly increased in the 5mg/kg METH group, but not in the 10mg/kg METH group. In contrast, bone resorption markers such as C-terminal telopeptides of type I collagen and tartrate-resistant acid phosphatase 5b did not change in any of the METH groups. Histomorphometric analyses were consistent with the biochemical markers data. A significant increase in osteoblasts, especially in type III osteoblasts, was observed in the 5mg/kg METH group, whereas other parameters of bone resorption and mineralization remained unchanged. These results indicate that bone remodeling in this group was unbalanced. In contrast, in the 10mg/kg METH group, some parameters of bone formation were significantly or slightly decreased, suggesting a low turnover metabolism. Taken together, our results suggest that METH had distinct dose-dependent effects on bone turnover and that METH might induce adverse effects, leading to osteoporosis. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  19. Central tarsal bone fractures in horses not used for racing: Computed tomographic configuration and long-term outcome of lag screw fixation.

    PubMed

    Gunst, S; Del Chicca, F; Fürst, A E; Kuemmerle, J M

    2016-09-01

    There are no reports on the configuration of equine central tarsal bone fractures based on cross-sectional imaging and clinical and radiographic long-term outcome after internal fixation. To report clinical, radiographic and computed tomographic findings of equine central tarsal bone fractures and to evaluate the long-term outcome of internal fixation. Retrospective case series. All horses diagnosed with a central tarsal bone fracture at our institution in 2009-2013 were included. Computed tomography and internal fixation using lag screw technique was performed in all patients. Medical records and diagnostic images were reviewed retrospectively. A clinical and radiographic follow-up examination was performed at least 1 year post operatively. A central tarsal bone fracture was diagnosed in 6 horses. Five were Warmbloods used for showjumping and one was a Quarter Horse used for reining. All horses had sagittal slab fractures that began dorsally, ran in a plantar or plantaromedial direction and exited the plantar cortex at the plantar or plantaromedial indentation of the central tarsal bone. Marked sclerosis of the central tarsal bone was diagnosed in all patients. At long-term follow-up, 5/6 horses were sound and used as intended although mild osteophyte formation at the distal intertarsal joint was commonly observed. Central tarsal bone fractures in nonracehorses had a distinct configuration but radiographically subtle additional fracture lines can occur. A chronic stress related aetiology seems likely. Internal fixation of these fractures based on an accurate diagnosis of the individual fracture configuration resulted in a very good prognosis. © 2015 EVJ Ltd.

  20. Cloud Based Educational Systems and Its Challenges and Opportunities and Issues

    ERIC Educational Resources Information Center

    Paul, Prantosh Kr.; Lata Dangwal, Kiran

    2014-01-01

    Cloud Computing (CC) is actually a set of hardware, software, networks, storage, and services that an interface combines to deliver aspects of computing as a service. Cloud Computing (CC) uses central remote servers to maintain data and applications. Practically, Cloud Computing (CC) is an extension of Grid computing with independency and…

  1. 77 FR 26660 - Guidelines for the Transfer of Excess Computers or Other Technical Equipment Pursuant to Section...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-07

    ....usda.gov . SUPPLEMENTARY INFORMATION: A. Background A proposed rule was published in the Federal.... Computers or other technical equipment means central processing units, laptops, desktops, computer mouses...

  2. 41 CFR 105-56.017 - Centralized salary offset computer match.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... offset computer match. 105-56.017 Section 105-56.017 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...

  3. 41 CFR 105-56.017 - Centralized salary offset computer match.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... offset computer match. 105-56.017 Section 105-56.017 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...

  4. 41 CFR 105-56.027 - Centralized salary offset computer match.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... offset computer match. 105-56.027 Section 105-56.027 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...

  5. 41 CFR 105-56.017 - Centralized salary offset computer match.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... offset computer match. 105-56.017 Section 105-56.017 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...

  6. 41 CFR 105-56.027 - Centralized salary offset computer match.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... offset computer match. 105-56.027 Section 105-56.027 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...

  7. 41 CFR 105-56.027 - Centralized salary offset computer match.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... offset computer match. 105-56.027 Section 105-56.027 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...

  8. 41 CFR 105-56.027 - Centralized salary offset computer match.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... offset computer match. 105-56.027 Section 105-56.027 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...

  9. 41 CFR 105-56.027 - Centralized salary offset computer match.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... offset computer match. 105-56.027 Section 105-56.027 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...

  10. 41 CFR 105-56.017 - Centralized salary offset computer match.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... offset computer match. 105-56.017 Section 105-56.017 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...

  11. 41 CFR 105-56.017 - Centralized salary offset computer match.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... offset computer match. 105-56.017 Section 105-56.017 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...

  12. [Multidimensional Strategy Regarding the Reduction of Central-Line Associated Infection in Pediatric Intensive Care].

    PubMed

    Rodrigues, Jorge; Dias, Andrea; Oliveira, Guiomar; Farela Neves, José

    2016-06-01

    To determine the central-line associated bloodstream infection rate after implementation of central venous catheter-care practice bundles and guidelines and to compare it with the previous central-line associated bloodstream infection rate. A prospective, longitudinal, observational descriptive study with an exploratory component was performed in a Pediatric Intensive Care Unit during five months. The universe was composed of every child admitted to the Pediatric Intensive Care Unit who had a central venous catheter inserted. A comparative study with historical controls was performed to evaluate the result of the intervention (group 1 versus group 2). Seventy-five children were included, with a median age of 23 months: 22 (29.3%) newborns; 28 (37.3%) with recent surgery and 32 (43.8%) with underlying illness. A total of 105 central venous catheters were inserted, the majority of children receiving a single central venous catheter (69.3%), with a mean duration of 6.8 ± 6.7 days. The most common type of central venous catheter was the short-term, non-tunneled central venous catheter (45.7%), while the subclavian and brachial flexure veins were the most frequent insertion sites (both 25.7%). There were no cases of central-line associated bloodstream infection reported during this study. Compared with historical controls (group 1), both groups were similar regarding age, gender, department of origin and place of central venous catheter insertion. In the current study (group 2), the median length of stay was higher, while the mean duration of central venous catheter (excluding peripherally inserted central line) was similar in both groups. There were no statistical differences regarding central venous catheter caliber and number of lumens. Fewer children admitted to the Pediatric Intensive Care Unit had a central venous catheter inserted in group 2, with no significant difference between single or multiple central venous catheters. After multidimensional strategy implementation there was no reported central-line associated bloodstream infection. Conclusions: Efforts must be made to preserve the same degree of multidimensional prevention, in order to confirm the effective reduction of the central-line associated bloodstream infection rate and to allow its maintenance.
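
    The rate being determined here is conventionally expressed per 1000 central-line days; assuming that standard (NHSN-style) definition, which the abstract does not spell out, the calculation is a one-liner, illustrated with the catheter counts and mean duration reported above.

    ```python
    def clabsi_rate(infections, central_line_days):
        """Standard CLABSI rate: infections per 1000 central-line days
        (the usual NHSN-style denominator; assumed, not stated in the abstract)."""
        return 1000.0 * infections / central_line_days

    # Example: 0 infections over roughly 105 catheters x 6.8 line-days each.
    print(round(clabsi_rate(0, 105 * 6.8), 2))
    ```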

  13. Regional subsidence history and 3D visualization with MATLAB of the Vienna Basin, central Europe

    NASA Astrophysics Data System (ADS)

    Lee, E.; Novotny, J.; Wagreich, M.

    2013-12-01

    This study reconstructed the subsidence history using backstripping and 3D visualization techniques to understand the tectonic evolution of the Neogene Vienna Basin. Backstripping removes the compaction effect of sediment loading and quantifies the tectonic subsidence. The amount of decompaction was calculated from porosity-depth relationships evaluated from seismic velocity data acquired in two boreholes. About 100 wells were investigated to quantify the subsidence history of the Vienna Basin. The wells were sorted into 10 groups (N1-4 in the northern part, C1-4 in the central part, and L1-2 in the northernmost and easternmost parts) based on their position within the same block bordered by major faults. To visualize 3D subsidence maps, the wells were arranged into a set of 3D points based on their map location (x, y) and depths (z1, z2, z3 ...). The division of the stratigraphic column and age range was arranged based on the Central Paratethys regional Stages. The thin-plate spline (TPS) can be employed to reconstruct a smooth surface from a set of 3D points; in this study, MATLAB, a numerical computing environment, was used to calculate the TPS interpolation function. The basic physical model of the TPS is based on the bending behavior of a thin metal sheet that is constrained only by a sparse set of fixed points. In the Lower Miocene, the 3D subsidence maps show strong evidence that the pre-Neogene basement of the Vienna Basin was subsiding along the borders of the Alpine-Carpathian nappes. This subsidence event is represented by a piggy-back basin developed on top of the NW-ward moving thrust sheets. In the late Lower Miocene, Groups C and N display a subsidence pattern typical of a pull-apart basin, with a very high subsidence event (0.2 - 1.0 km/Ma). After this event, Group N shows markedly decreasing subsidence, consistent with the thin-skinned extension that has been regarded as the extension model of the Vienna Basin in the literature. The subsidence in Group C, however, decreases gradually, which indicates a trend of increasing thermal subsidence during the Middle to Upper Miocene. The traditional model cannot explain the thermal subsidence observed in the central part. This study therefore supports a non-uniform extension model, changing from thin-skinned extension in the northern part to thick-skinned extension in the central part, and the 3D subsidence maps suggest the existence of a decoupling between lithospheric and crustal extension along the Steinberg Fault. Group L shows very different subsidence trends compared to Groups C and N: a subsidence halt occurred in the late Lower Miocene. After the halt, Group L1 shows small tectonic and subsidence events. Some earlier studies reported that the area of Group L1 was uplifted during the early Middle Miocene, so it can be concluded that the missing sediments were eroded by the local uplift. The subsidence of Group L2, in contrast, stopped completely, suggesting that Group L2 was not influenced by the extension of the strike-slip fault system.
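
    The surface-reconstruction step (thin-plate spline interpolation of well points, done in MATLAB in the study) can be reproduced in Python with SciPy's RBF interpolator; the well coordinates and depths below are invented placeholders, not data from the study.

    ```python
    import numpy as np
    from scipy.interpolate import RBFInterpolator

    # Hypothetical well data: (x, y) map coordinates in km and basement depth in km.
    wells = np.array([[0.0, 0.0], [5.0, 1.0], [2.0, 6.0], [8.0, 4.0], [4.0, 9.0]])
    depth = np.array([1.2, 2.8, 2.1, 4.0, 3.3])

    # Thin-plate spline surface through the scattered well points.
    tps = RBFInterpolator(wells, depth, kernel="thin_plate_spline")

    # Evaluate the surface on a regular grid for mapping/visualization.
    gx, gy = np.meshgrid(np.linspace(0, 10, 50), np.linspace(0, 10, 50))
    grid = np.column_stack([gx.ravel(), gy.ravel()])
    surface = tps(grid).reshape(gx.shape)
    print(surface.shape, surface.min().round(2), surface.max().round(2))
    ```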

  14. Computer-Enriched Instruction (CEI) Is Better for Preview Material Instead of Review Material: An Example of a Biostatistics Chapter, the Central Limit Theorem

    ERIC Educational Resources Information Center

    See, Lai-Chu; Huang, Yu-Hsun; Chang, Yi-Hu; Chiu, Yeo-Ju; Chen, Yi-Fen; Napper, Vicki S.

    2010-01-01

    This study examines the timing of computer-enriched instruction (CEI), before or after a traditional lecture, to determine the cross-over effect, period effect, and learning effect arising from the sequencing of instruction. A 2 x 2 cross-over design was used with CEI to teach the central limit theorem (CLT). Two sequences of graduate students in nursing…
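
    For context on the chapter content itself, a few lines of Python reproduce the kind of central limit theorem demonstration such a CEI module typically presents: means of skewed samples become approximately normal, with spread shrinking as 1/sqrt(n). This is an illustration of the CLT, not the study's courseware.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    for n in (2, 10, 50):
        # 10,000 sample means of n exponential(1) draws each.
        sample_means = rng.exponential(scale=1.0, size=(10_000, n)).mean(axis=1)
        print(f"n={n:3d}  mean={sample_means.mean():.3f}  "
              f"std={sample_means.std():.3f}  (theory: {1/np.sqrt(n):.3f})")
    ```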

  15. 23. VIEW OF THE FIRST FLOOR PLAN. THE FIRST FLOOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    23. VIEW OF THE FIRST FLOOR PLAN. THE FIRST FLOOR HOUSED ADMINISTRATIVE OFFICES, THE CENTRAL COMPUTING, UTILITY SYSTEMS, ANALYTICAL LABORATORIES, AND MAINTENANCE SHOPS. THE ORIGINAL DRAWING HAS BEEN ARCHIVED ON MICROFILM. THE DRAWING WAS REPRODUCED AT THE BEST QUALITY POSSIBLE. LETTERS AND NUMBERS IN THE CIRCLES INDICATE FOOTER AND/OR COLUMN LOCATIONS. - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO

  16. Bipartite graphs as models of population structures in evolutionary multiplayer games.

    PubMed

    Peña, Jorge; Rochat, Yannick

    2012-01-01

    By combining evolutionary game theory and graph theory, "games on graphs" study the evolutionary dynamics of frequency-dependent selection in population structures modeled as geographical or social networks. Networks are usually represented by means of unipartite graphs, and social interactions by two-person games such as the famous prisoner's dilemma. Unipartite graphs have also been used for modeling interactions going beyond pairwise interactions. In this paper, we argue that bipartite graphs are a better alternative to unipartite graphs for describing population structures in evolutionary multiplayer games. To illustrate this point, we make use of bipartite graphs to investigate, by means of computer simulations, the evolution of cooperation under the conventional and the distributed N-person prisoner's dilemma. We show that several implicit assumptions arising from the standard approach based on unipartite graphs (such as the definition of replacement neighborhoods, the intertwining of individual and group diversity, and the large overlap of interaction neighborhoods) can have a large impact on the resulting evolutionary dynamics. Our work provides a clear example of the importance of construction procedures in games on graphs, of the suitability of bigraphs and hypergraphs for computational modeling, and of the importance of concepts from social network analysis such as centrality, centralization and bipartite clustering for the understanding of dynamical processes occurring on networked population structures.
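
    A minimal sketch of the bipartite setup argued for here is shown below: one side of the graph holds individuals, the other holds groups, and each group plays an N-person prisoner's dilemma (public goods game). The group memberships, the cost c and the enhancement factor r are invented for illustration, not taken from the paper.

    ```python
    # Bipartite population structure: groups -> members (individuals).
    groups = {
        "g1": ["a", "b", "c"],
        "g2": ["b", "c", "d"],
        "g3": ["a", "d"],
    }
    strategy = {"a": 1, "b": 0, "c": 1, "d": 1}   # 1 = cooperate, 0 = defect
    c, r = 1.0, 3.0                               # contribution cost, enhancement factor

    def payoff(player):
        """Accumulate the player's public-goods payoff over all groups it joins."""
        total = 0.0
        for members in groups.values():
            if player not in members:
                continue
            contributions = sum(strategy[m] for m in members)
            share = r * c * contributions / len(members)   # equally shared pot
            total += share - c * strategy[player]          # minus own contribution
        return total

    print({p: round(payoff(p), 2) for p in strategy})
    ```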

  17. Infrared-Proximity-Sensor Modules For Robot

    NASA Technical Reports Server (NTRS)

    Parton, William; Wegerif, Daniel; Rosinski, Douglas

    1995-01-01

    Collision-avoidance system for articulated robot manipulators uses infrared proximity sensors grouped together in array of sensor modules. Sensor modules, called "sensorCells," distributed processing board-level products for acquiring data from proximity-sensors strategically mounted on robot manipulators. Each sensorCell self-contained and consists of multiple sensing elements, discrete electronics, microcontroller and communications components. Modules connected to central control computer by redundant serial digital communication subsystem including both serial and a multi-drop bus. Detects objects made of various materials at distance of up to 50 cm. For some materials, such as thermal protection system tiles, detection range reduced to approximately 20 cm.

  18. TFTR diagnostic control and data acquisition system

    NASA Astrophysics Data System (ADS)

    Sauthoff, N. R.; Daniels, R. E.

    1985-05-01

    General computerized control and data-handling support for TFTR diagnostics is presented within the context of the Central Instrumentation, Control and Data Acquisition (CICADA) System. Procedures, hardware, the interactive man-machine interface, event-driven task scheduling, system-wide arming and data acquisition, and a hierarchical data base of raw data and results are described. Similarities in data structures involved in control, monitoring, and data acquisition afford a simplification of the system functions, based on ``groups'' of devices. Emphases and optimizations appropriate for fusion diagnostic system designs are provided. An off-line data reduction computer system is under development.

  19. TFTR diagnostic control and data acquisition system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sauthoff, N.R.; Daniels, R.E.; PPL Computer Division

    1985-05-01

    General computerized control and data-handling support for TFTR diagnostics is presented within the context of the Central Instrumentation, Control and Data Acquisition (CICADA) System. Procedures, hardware, the interactive man--machine interface, event-driven task scheduling, system-wide arming and data acquisition, and a hierarchical data base of raw data and results are described. Similarities in data structures involved in control, monitoring, and data acquisition afford a simplification of the system functions, based on ''groups'' of devices. Emphases and optimizations appropriate for fusion diagnostic system designs are provided. An off-line data reduction computer system is under development.

  20. HAL/SM system functional design specification. [systems analysis and design analysis of central processing units

    NASA Technical Reports Server (NTRS)

    Ross, C.; Williams, G. P. W., Jr.

    1975-01-01

    The functional design of a preprocessor and subsystems is described. A structure chart and a data flow diagram are included for each subsystem. Also, a group of intermodule interface definitions (one definition per module) is included immediately following the structure chart and data flow for a particular subsystem. Each of these intermodule interface definitions consists of the identification of the module, the function the module is to perform, the identification and definition of parameter interfaces to the module, and any design notes associated with the module. Also described are compilers and computer libraries.

  1. Two-way cable television project

    NASA Astrophysics Data System (ADS)

    Wilkens, H.; Guenther, P.; Kiel, F.; Kraus, F.; Mahnkopf, P.; Schnee, R.

    1982-02-01

    The market demand for a multiuser computer system with interactive services was studied. Mean system work load at peak use hours was estimated and the complexity of dialog with a central computer was determined. Man machine communication by broadband cable television transmission, using digital techniques, was assumed. The end to end system is described. It is user friendly, able to handle 10,000 subscribers, and provides color television display. The central computer system architecture with remote audiovisual terminals is depicted and software is explained. Signal transmission requirements are dealt with. International availability of the test system, including sample programs, is indicated.

  2. Imaging performance of annular apertures. II - Line spread functions

    NASA Technical Reports Server (NTRS)

    Tschunko, H. F. A.

    1978-01-01

    Line images formed by aberration-free optical systems with annular apertures are investigated in the whole range of central obstruction ratios. Annular apertures form line images with central and side line groups. The number of lines in each line group is given by the ratio of the outer diameter of the annular aperture divided by the width of the annulus. The theoretical energy fraction of 0.889 in the central line of the image formed by an unobstructed aperture increases for centrally obstructed apertures to 0.932 for the central line group. Energy fractions for the central and side line groups are practically constant for all obstruction ratios and for each line group. The illumination of rectangular secondary apertures of various length/width ratios by apertures of various obstruction ratios is discussed.

  3. Computer Instructional Aids for Undergraduate Control Education. 1978 Edition.

    ERIC Educational Resources Information Center

    Volz, Richard A.; And Others

    This work represents the development of computer tools for undergraduate students. Emphasis is on automatic control theory using hybrid and digital computation. The routine calculations of control system analysis are presented as students would use them on the University of Michigan's central digital computer and the time-shared graphic terminals…

  4. Embracing the Cloud: Six Ways to Look at the Shift to Cloud Computing

    ERIC Educational Resources Information Center

    Ullman, David F.; Haggerty, Blake

    2010-01-01

    Cloud computing is the latest paradigm shift for the delivery of IT services. Where previous paradigms (centralized, decentralized, distributed) were based on fairly straightforward approaches to technology and its management, cloud computing is radical in comparison. The literature on cloud computing, however, suffers from many divergent…

  5. Canal Transportation and Centering Ability of ProTaper and SafeSider in Preparation of Curved Root Canals: A CBCT Evaluation

    PubMed Central

    Delgoshayi, Negar; Abbasi, Mansoure; Bakhtiar, Hengameh; Sakhdari, Shirin; Ghannad, Setareh; Ellini, Mohammad Reza

    2018-01-01

    Introduction: Maintaining the original central canal path is an important parameter in efficient root canal preparation. Instruments causing minimal changes in the original canal path are preferred for this purpose. This study sought to compare canal transportation and centering ability of ProTaper and SafeSider instruments in curved mesiobuccal root canals of mandibular first molars using cone beam computed tomography (CBCT). Methods and Materials: In this experimental study, 30 mesiobuccal root canals of extracted human mandibular first molars with 20° to 40° curvature were randomly divided into two groups (n=15). After mounting in putty, preoperative CBCT scans of the teeth were obtained. Root canals in group A were shaped using S1, S2, F1 and F2 of the ProTaper system. Root canals in group B were instrumented to size 25 using the SafeSider system according to the manufacturers' instructions. Postoperative CBCT scans were then obtained. The distance between the external root surface and internal canal wall was measured at the mesial and distal at 1, 3 and 7 mm from the apex. The values measured on primary and secondary CBCT scans were compared to assess possible changes in the original central canal path and canal transportation. Data were compared using the t-test and repeated measures ANOVA. Results: ProTaper and SafeSider were significantly different in terms of canal transportation and centering ability, and ProTaper was significantly superior to SafeSider in this respect (P<0.001). Conclusion: ProTaper (in contrast to SafeSider) is well capable of maintaining the original central canal path with the least amount of transportation. PMID:29707022
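
    The transportation and centering measures derived from the mesial and distal wall thicknesses are commonly defined following Gambill et al.; assuming those standard formulas (the abstract does not state them), a small sketch with invented measurements is:

    ```python
    def transportation(m1, m2, d1, d2):
        """Canal transportation after Gambill et al. (assumed definition):
        m1/d1 = mesial/distal wall thickness before preparation,
        m2/d2 = the same measurements after preparation."""
        return abs((m1 - m2) - (d1 - d2))

    def centering_ratio(m1, m2, d1, d2):
        """Centering ability: ratio of the smaller to the larger wall change;
        1.0 means the instrument stayed perfectly centred."""
        dm, dd = m1 - m2, d1 - d2
        lo, hi = sorted([dm, dd], key=abs)
        return 1.0 if hi == 0 else abs(lo) / abs(hi)

    # Hypothetical measurements (mm) at the 3 mm level for one canal.
    print(transportation(1.10, 0.95, 1.05, 1.00))            # 0.10 mm of transportation
    print(round(centering_ratio(1.10, 0.95, 1.05, 1.00), 2))  # 0.33
    ```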

  6. [Comparison of effectiveness and safety between Twisted File technique and ProTaper Universal rotary full sequence based on micro-computed tomography].

    PubMed

    Chen, Xiao-bo; Chen, Chen; Liang, Yu-hong

    2016-02-18

    To evaluate the efficacy and safety of two types of rotary nickel-titanium systems (Twisted File and ProTaper Universal) for root canal preparation based on micro-computed tomography (micro-CT). Twenty extracted molars (including 62 canals) were divided into two experimental groups and instrumented to #25/0.08 following the recommended protocols, using the Twisted File rotary nickel-titanium system (TF) and the ProTaper Universal rotary nickel-titanium system (PU), respectively. Time for root canal instrumentation (the accumulated time for every single file) was recorded. The apical 0-3 mm of the root surface was observed under an optical stereomicroscope at 25× magnification and the presence of crack lines was noted. The root canals were scanned with micro-CT before and after root canal preparation; three-dimensional images of the canals were reconstructed, and the amount of canal central transportation in the two groups was calculated and compared. A shorter preparation time [(0.53 ± 0.14) min] was observed in the TF group, while the preparation time of the PU group was (2.06 ± 0.39) min (P<0.05). At the mid-root level, shaping with TF resulted in less canal center transportation than with PU [(0.070 ± 0.056) mm vs. (0.097 ± 0.084) mm, P<0.05]. No instrument separation was observed in either group, and no cracks were found in either group on micro-CT images or under the optical stereomicroscope at 25× magnification. Compared with ProTaper Universal, Twisted File took less time for root canal preparation and exhibited better shaping ability with less canal transportation.

  7. A Scheduling Algorithm for Computational Grids that Minimizes Centralized Processing in Genome Assembly of Next-Generation Sequencing Data

    PubMed Central

    Lima, Jakelyne; Cerdeira, Louise Teixeira; Bol, Erick; Schneider, Maria Paula Cruz; Silva, Artur; Azevedo, Vasco; Abelém, Antônio Jorge Gomes

    2012-01-01

    Improvements in genome sequencing techniques have resulted in generation of huge volumes of data. As a consequence of this progress, the genome assembly stage demands even more computational power, since the incoming sequence files contain large amounts of data. To speed up the process, it is often necessary to distribute the workload among a group of machines. However, this requires hardware and software solutions specially configured for this purpose. Grid computing tries to simplify this process of aggregating resources, but does not always offer the best possible performance due to the heterogeneity and decentralized management of its resources. Thus, it is necessary to develop software that takes these peculiarities into account. To achieve this purpose, we developed an algorithm that adapts the de novo assembly software ABySS to operate efficiently in grids. We ran ABySS with and without our algorithm in the grid simulator SimGrid. Tests showed that our algorithm is viable, flexible, and scalable even in a heterogeneous environment, and that it improved genome assembly time in computational grids without changing assembly quality. PMID:22461785
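
    (The paper's own scheduling algorithm is not reproduced here. As a generic illustration of distributing assembly workload across heterogeneous grid nodes, the sketch below uses a simple greedy heuristic: the largest remaining chunk goes to the node whose queue, weighted by node speed, frees up first. Chunk sizes and node speeds are invented.)

        import heapq

        def distribute(chunk_sizes, node_speeds):
            """Assign each chunk (largest first) to the node whose current queue,
            weighted by that node's speed, finishes earliest."""
            heap = [(0.0, i) for i in range(len(node_speeds))]  # (projected finish time, node)
            heapq.heapify(heap)
            assignment = {i: [] for i in range(len(node_speeds))}
            for size in sorted(chunk_sizes, reverse=True):
                finish, node = heapq.heappop(heap)
                assignment[node].append(size)
                heapq.heappush(heap, (finish + size / node_speeds[node], node))
            return assignment

        # e.g. five read chunks (MB) spread over three nodes of unequal speed
        print(distribute([300, 120, 500, 80, 250], node_speeds=[1.0, 2.5, 0.5]))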

  8. Computer and photogrammetric general land use study of central north Alabama

    NASA Technical Reports Server (NTRS)

    Jayroe, R. R.; Larsen, P. A.; Campbell, C. W.

    1974-01-01

    The object of this report is to acquaint potential users with two computer programs developed at NASA's Marshall Space Flight Center. They were used in producing a land use survey and maps of central north Alabama from Earth Resources Technology Satellite (ERTS) digital data. The report describes in detail the thought processes and analysis procedures used from the initiation of the land use study to its completion, as well as a photogrammetric study that was used in conjunction with the computer analysis to produce similar land use maps. The results of the land use demonstration indicate that, with respect to computer time and cost, such a study may be economically and realistically feasible on a statewide basis.

  9. Constructing Confidence Intervals for Reliability Coefficients Using Central and Noncentral Distributions.

    ERIC Educational Resources Information Center

    Weber, Deborah A.

    Greater understanding and use of confidence intervals is central to changes in statistical practice (G. Cumming and S. Finch, 2001). Reliability coefficients and confidence intervals for reliability coefficients can be computed using a variety of methods. Estimating confidence intervals includes both central and noncentral distribution approaches.…

  10. Modeling Memory for Language Understanding.

    DTIC Science & Technology

    1982-02-01

    Research on natural language understanding by computer has shown that the nature and organization of memory plays a central role in the understanding mechanism. Further, we claim that such reminding is at the root of how we learn. Issues such as these have played an important part in shaping the…

  11. Computer input and output files associated with ground-water-flow simulations of the Albuquerque Basin, central New Mexico, 1901-95, with projections to 2020; (supplement three to U.S. Geological Survey Water-resources investigations report 94-4251)

    USGS Publications Warehouse

    Kernodle, J.M.

    1996-01-01

    This report presents the computer input files required to run the three-dimensional ground-water-flow model of the Albuquerque Basin, central New Mexico, documented in Kernodle and others (Kernodle, J.M., McAda, D.P., and Thorn, C.R., 1995, Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, 1901-1994, with projections to 2020: U.S. Geological Survey Water-Resources Investigations Report 94-4251, 114 p.) and revised by Kernodle (Kernodle, J.M., 1998, Simulation of ground-water flow in the Albuquerque Basin, 1901-95, with projections to 2020 (supplement two to U.S. Geological Survey Water-Resources Investigations Report 94-4251): U.S. Geological Survey Open-File Report 96-209, 54 p.). Output files resulting from the computer simulations are included for reference.

  12. Hydrogen-bond landscapes, geometry and energetics of squaric acid and its mono- and dianions: a Cambridge Structural Database, IsoStar and computational study.

    PubMed

    Allen, Frank H; Cruz-Cabeza, Aurora J; Wood, Peter A; Bardwell, David A

    2013-10-01

    As part of a programme of work to extend central-group coverage in the Cambridge Crystallographic Data Centre's (CCDC) IsoStar knowledge base of intermolecular interactions, we have studied the hydrogen-bonding abilities of squaric acid (H2SQ) and its mono- and dianions (HSQ(-) and SQ(2-)) using the Cambridge Structural Database (CSD) along with dispersion-corrected density functional theory (DFT-D) calculations for a range of hydrogen-bonded dimers. The -OH and -C=O groups of H2SQ, HSQ(-) and SQ(2-) are potent donors and acceptors, as indicated by their hydrogen-bond geometries in available crystal structures in the CSD, and by the attractive energies calculated for their dimers with acetone and methanol, which were used as model acceptors and donors. The two anions have sufficient examples in the CSD for their addition as new central groups in IsoStar. It is also shown that charge- and resonance-assisted hydrogen bonds involving H2SQ and HSQ(-) are similar in strength to those made by carboxylate COO(-) acceptors, while hydrogen bonds made by the dianion SQ(2-) are somewhat stronger. The study reinforces the value of squaric acid and its anions as cocrystal formers and their actual and potential importance as isosteric replacements for carboxylic acid and carboxylate functions.

  13. Apollo lunar descent guidance

    NASA Technical Reports Server (NTRS)

    Klumpp, A. R.

    1974-01-01

    Apollo lunar-descent guidance transfers the Lunar Module from a near-circular orbit to touchdown, traversing a 17 deg central angle and a 15 km altitude in 11 min. A group of interactive programs in an onboard computer guide the descent, controlling attitude and the descent propulsion system throttle. A ground-based program pre-computes guidance targets. The concepts involved in this guidance are described. Explicit and implicit guidance are discussed, guidance equations are derived, and the earlier Apollo explicit equation is shown to be an inferior special case of the later implicit equation. Interactive guidance, by which the two-man crew selects a landing site in favorable terrain and directs the trajectory there, is discussed. Interactive terminal-descent guidance enables the crew to control the essentially vertical descent rate in order to land in minimum time with safe contact speed. The attitude maneuver routine uses concepts that make gimbal lock inherently impossible.

  14. A quantum approach to homomorphic encryption

    PubMed Central

    Tan, Si-Hui; Kettlewell, Joshua A.; Ouyang, Yingkai; Chen, Lin; Fitzsimons, Joseph F.

    2016-01-01

    Encryption schemes often derive their power from the properties of the underlying algebra on the symbols used. Inspired by group theoretic tools, we use the centralizer of a subgroup of operations to present a private-key quantum homomorphic encryption scheme that enables a broad class of quantum computation on encrypted data. The quantum data is encoded on bosons of distinct species in distinct spatial modes, and the quantum computations are manipulations of these bosons in a manner independent of their species. A particular instance of our encoding hides up to a constant fraction of the information encrypted. This fraction can be made arbitrarily close to unity with overhead scaling only polynomially in the message length. This highlights the potential of our protocol to hide a non-trivial amount of information, and is suggestive of a large class of encodings that might yield better security. PMID:27658349

  15. Decision making in recurrent neuronal circuits.

    PubMed

    Wang, Xiao-Jing

    2008-10-23

    Decision making has recently emerged as a central theme in neurophysiological studies of cognition, and experimental and computational work has led to the proposal of a cortical circuit mechanism of elemental decision computations. This mechanism depends on slow recurrent synaptic excitation balanced by fast feedback inhibition, which not only instantiates attractor states for forming categorical choices but also long transients for gradually accumulating evidence in favor of or against alternative options. Such a circuit endowed with reward-dependent synaptic plasticity is able to produce adaptive choice behavior. While decision threshold is a core concept for reaction time tasks, it can be dissociated from a general decision rule. Moreover, perceptual decisions and value-based economic choices are described within a unified framework in which probabilistic choices result from irregular neuronal activity as well as iterative interactions of a decision maker with an uncertain environment or other unpredictable decision makers in a social group.

  16. Geoinformatics in the public service: building a cyberinfrastructure across the geological surveys

    USGS Publications Warehouse

    Allison, M. Lee; Gundersen, Linda C.; Richard, Stephen M.; Keller, G. Randy; Baru, Chaitanya

    2011-01-01

    Advanced information technology infrastructure is increasingly being employed in the Earth sciences to provide researchers with efficient access to massive central databases and to integrate diversely formatted information from a variety of sources. These geoinformatics initiatives enable manipulation, modeling and visualization of data in a consistent way, and are helping to develop integrated Earth models at various scales, and from the near surface to the deep interior. This book uses a series of case studies to demonstrate computer and database use across the geosciences. Chapters are thematically grouped into sections that cover data collection and management; modeling and community computational codes; visualization and data representation; knowledge management and data integration; and web services and scientific workflows. Geoinformatics is a fascinating and accessible introduction to this emerging field for readers across the solid Earth sciences and an invaluable reference for researchers interested in initiating new cyberinfrastructure projects of their own.

  17. Oxytocin promotes human ethnocentrism

    PubMed Central

    De Dreu, Carsten K. W.; Greer, Lindred L.; Van Kleef, Gerben A.; Shalvi, Shaul; Handgraaf, Michel J. J.

    2011-01-01

    Human ethnocentrism—the tendency to view one's group as centrally important and superior to other groups—creates intergroup bias that fuels prejudice, xenophobia, and intergroup violence. Grounded in the idea that ethnocentrism also facilitates within-group trust, cooperation, and coordination, we conjecture that ethnocentrism may be modulated by brain oxytocin, a peptide shown to promote cooperation among in-group members. In double-blind, placebo-controlled designs, males self-administered oxytocin or placebo and privately performed computer-guided tasks to gauge different manifestations of ethnocentric in-group favoritism as well as out-group derogation. Experiments 1 and 2 used the Implicit Association Test to assess in-group favoritism and out-group derogation. Experiment 3 used the infrahumanization task to assess the extent to which humans ascribe secondary, uniquely human emotions to their in-group and to an out-group. Experiments 4 and 5 confronted participants with the option to save the life of a larger collective by sacrificing one individual, nominated as in-group or as out-group. Results show that oxytocin creates intergroup bias because oxytocin motivates in-group favoritism and, to a lesser extent, out-group derogation. These findings call into question the view of oxytocin as an indiscriminate “love drug” or “cuddle chemical” and suggest that oxytocin has a role in the emergence of intergroup conflict and violence. PMID:21220339

  18. BridgeRank: A novel fast centrality measure based on local structure of the network

    NASA Astrophysics Data System (ADS)

    Salavati, Chiman; Abdollahpouri, Alireza; Manbari, Zhaleh

    2018-04-01

    Ranking nodes in complex networks has become an important task in many application domains. In a complex network, influential nodes are those with the greatest spreading ability, and identifying influential nodes by their spreading ability is a fundamental task in applications such as viral marketing. One of the most important centrality measures for ranking nodes is closeness centrality, which is effective but suffers from a high computational complexity of O(n³). This paper improves on closeness centrality by utilizing the local structure of nodes and presents a new ranking algorithm, called BridgeRank centrality. The proposed method computes a local centrality value for each node. For this purpose, communities are first detected and the relationships between communities are ignored. Then, by applying a centrality measure within each community, one best critical node is extracted from each community. Finally, the nodes are ranked by computing the sum of their shortest path lengths to the obtained critical nodes. We have also modified the proposed method by weighting the original BridgeRank and selecting several nodes from each community based on the density of that community. Our method can find the best nodes with high spreading ability at low time complexity, which makes it applicable to large-scale networks. To evaluate the performance of the proposed method, we use the SIR diffusion model. Finally, experiments on real and artificial networks show that our method identifies influential nodes efficiently and achieves better performance compared to other recent methods.
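
    (A rough reading of the procedure is: detect communities, keep one locally central "bridge" node per community, then score every node by its total shortest-path distance to those critical nodes. The networkx sketch below follows that reading; degree is used as the within-community centrality, which is an assumption rather than the measure used in the paper.)

        import networkx as nx
        from networkx.algorithms.community import greedy_modularity_communities

        def bridgerank_like(G):
            communities = greedy_modularity_communities(G)
            # one critical node per community (degree as the local centrality)
            critical = [max(c, key=G.degree) for c in communities]
            scores = {}
            for v in G:
                total = 0
                for c in critical:
                    try:
                        total += nx.shortest_path_length(G, v, c)
                    except nx.NetworkXNoPath:
                        total += len(G)  # penalty when a critical node is unreachable
                scores[v] = total
            return sorted(G, key=scores.get)  # smaller total distance = more influential

        print(bridgerank_like(nx.karate_club_graph())[:5])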

  19. Central Computer Science Concepts to Research-Based Teacher Training in Computer Science: An Experimental Study

    ERIC Educational Resources Information Center

    Zendler, Andreas; Klaudt, Dieter

    2012-01-01

    The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…

  20. Root resorption during orthodontic treatment.

    PubMed

    Walker, Sally

    2010-01-01

    Medline, Embase, LILACS, The Cochrane Library (Cochrane Database of Systematic Reviews, CENTRAL, and Cochrane Oral Health Group Trials Register) Web of Science, EBM Reviews, Computer Retrieval of Information on Scientific Project (CRISP, www.crisp.cit.nih.gov), On-Line Computer Library Center (www.oclc.org), Google Index to Scientific and Technical Proceedings, PAHO (www.paho.org), WHOLis (www.who.int/library/databases/en), BBO (Brazilian Bibliography of Dentistry), CEPS (Chinese Electronic Periodical Services), Conference materials (www.bl.uk/services/bsds/dsc/conference.html), ProQuest Dissertation Abstracts and Thesis database, TrialCentral (www.trialscentral.org), National Research Register (www.controlled-trials.com), www.Clinicaltrials.gov and SIGLE (System for Information on Grey Literature in Europe). Randomised controlled trials including split mouth design, recording the presence or absence of external apical root resorption (EARR) by treatment group at the end of the treatment period. Data were extracted independently by two reviewers using specially designed and piloted forms. Quality was also assessed independently by the same reviewers. After evaluating titles and abstracts, 144 full articles were obtained of which 13 articles, describing 11 trials, fulfilled the criteria for inclusion. Differences in the methodological approaches and reporting results made quantitative statistical comparisons impossible. Evidence suggests that comprehensive orthodontic treatment causes increased incidence and severity of root resorption, and heavy forces might be particularly harmful. Orthodontically induced inflammatory root resorption is unaffected by archwire sequencing, bracket prescription, and self-ligation. Previous trauma and tooth morphology are unlikely causative factors. There is some evidence that a two- to three-month pause in treatment decreases total root resorption. The results were inconclusive in the clinical management of root resorption, but there is evidence to support the use of light forces, especially with incisor intrusion.

  1. Organising a University Computer System: Analytical Notes.

    ERIC Educational Resources Information Center

    Jacquot, J. P.; Finance, J. P.

    1990-01-01

    Thirteen trends in university computer system development are identified, system user requirements are analyzed, critical system qualities are outlined, and three options for organizing a computer system are presented. The three systems include a centralized network, local network, and federation of local networks. (MSE)

  2. Best Practice Guidelines for Computer Technology in the Montessori Early Childhood Classroom.

    ERIC Educational Resources Information Center

    Montminy, Peter

    1999-01-01

    Presents a draft for a principle-centered position statement of a Montessori early childhood program in central Pennsylvania, on the pros and cons of computer use in a Montessori 3-6 classroom. Includes computer software rating form. (Author/KB)

  3. Data processing for water monitoring system

    NASA Technical Reports Server (NTRS)

    Monford, L.; Linton, A. T.

    1978-01-01

    Water monitoring data acquisition system is structured about central computer that controls sampling and sensor operation, and analyzes and displays data in real time. Unit is essentially separated into two systems: computer system, and hard wire backup system which may function separately or with computer.

  4. Root resorption due to orthodontic treatment using self-ligating and conventional brackets : A cone-beam computed tomography study.

    PubMed

    Aras, Isil; Unal, Idil; Huniler, Gencer; Aras, Aynur

    2018-05-01

    The purpose of the present study was to compare external root resorption (ERR) volumetrically in maxillary incisors induced by orthodontic treatment using self-ligating brackets (Damon Q, DQ) or conventional brackets (Titanium Orthos, TO) with the help of cone-beam computed tomography (CBCT). A sample of 32 subjects, with Angle Class I malocclusion and anterior crowding of 4-10 mm, was divided randomly into two groups: a DQ group, in which self-ligating DQ brackets with Damon archwires were used; and a TO group, in which conventional TO brackets with large Orthos archwires were applied. The study was conducted using CBCT scans taken before (T1), and near the end (9 months after the initiation of treatment; T2) of the orthodontic treatment. The extent of ERR was determined volumetrically using Mimics software. Changes in root volume were evaluated by repeated-measures analysis of variance as well as by paired and independent t-tests. While significant differences were found between T1 and T2 for root volume in both groups (p < 0.05), there was no difference between the groups regarding the amount (mm³ or relative change) of ERR (p > 0.05). Maxillary central and lateral incisors showed similar volume loss (p > 0.05). Furthermore, the TO group showed a higher prevalence of palatal and proximal slanted RR compared with the DQ group (p < 0.05). It is not possible to suggest superiority of one bracket system over the other considering only root resorption pattern or amount. The higher incidence of slanted RR found in patients treated with the TO system warrants further research to identify possible specific causes.

  5. Integrating Xgrid into the HENP distributed computing model

    NASA Astrophysics Data System (ADS)

    Hajdu, L.; Kocoloski, A.; Lauret, J.; Miller, M.

    2008-07-01

    Modern Macintosh computers feature Xgrid, a distributed computing architecture built directly into Apple's OS X operating system. While the approach is radically different from those generally expected by the Unix based Grid infrastructures (Open Science Grid, TeraGrid, EGEE), opportunistic computing on Xgrid is nonetheless a tempting and novel way to assemble a computing cluster with a minimum of additional configuration. In fact, it requires only the default operating system and authentication to a central controller from each node. OS X also implements arbitrarily extensible metadata, allowing an instantly updated file catalog to be stored as part of the filesystem itself. The low barrier to entry allows an Xgrid cluster to grow quickly and organically. This paper and presentation will detail the steps that can be taken to make such a cluster a viable resource for HENP research computing. We will further show how to provide to users a unified job submission framework by integrating Xgrid through the STAR Unified Meta-Scheduler (SUMS), making task and job submission effortlessly within reach for those users already using the tool for traditional Grid or local cluster job submission. We will discuss additional steps that can be taken to make an Xgrid cluster a full partner in grid computing initiatives, focusing on Open Science Grid integration. MIT's Xgrid system currently supports the work of multiple research groups in the Laboratory for Nuclear Science, and has become an important tool for generating simulations and conducting data analyses at the Massachusetts Institute of Technology.

  6. The genetics of shovel shape in maxillary central incisors in man.

    PubMed

    Blanco, R; Chakraborty, R

    1976-03-01

    From dental casts of 94 parent-offspring and 127 full-sib pairs, sampled from two Chilean populations, shovelling indices are computed to measure the degree of shovelling of maxillary central incisors quantitatively. Genetic correlations are computed to determine the role of genetic factors in explaining the variation in this trait. Assuming only hereditary factors to be responsible for the transmission of shovel shape, 68% of total variability is ascribed to the additive effect of genes.
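
    (Under the purely additive model assumed in the abstract, the expected parent-offspring covariance is half the additive genetic variance, so heritability can be estimated as twice the parent-offspring correlation. The sketch below uses invented index values, not the Chilean data.)

        import numpy as np

        parent = np.array([2.1, 3.4, 1.8, 2.9, 3.0, 2.5])  # shovelling index, one parent each
        child  = np.array([2.3, 3.1, 2.0, 2.6, 3.2, 2.4])  # shovelling index, their offspring

        r = np.corrcoef(parent, child)[0, 1]  # parent-offspring correlation
        h2 = 2 * r                            # h^2 ~ 2r when cov(parent, offspring) = V_A / 2
        print(round(r, 2), round(h2, 2))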

  7. The Effects of Closed-Loop Medical Devices on the Autonomy and Accountability of Persons and Systems.

    PubMed

    Kellmeyer, Philipp; Cochrane, Thomas; Müller, Oliver; Mitchell, Christine; Ball, Tonio; Fins, Joseph J; Biller-Andorno, Nikola

    2016-10-01

    Closed-loop medical devices such as brain-computer interfaces are an emerging and rapidly advancing neurotechnology. The target patients for brain-computer interfaces (BCIs) are often severely paralyzed, and thus particularly vulnerable in terms of personal autonomy, decisionmaking capacity, and agency. Here we analyze the effects of closed-loop medical devices on the autonomy and accountability of both persons (as patients or research participants) and neurotechnological closed-loop medical systems. We show that although BCIs can strengthen patient autonomy by preserving or restoring communicative abilities and/or motor control, closed-loop devices may also create challenges for moral and legal accountability. We advocate the development of a comprehensive ethical and legal framework to address the challenges of emerging closed-loop neurotechnologies like BCIs and stress the centrality of informed consent and refusal as a means to foster accountability. We propose the creation of an international neuroethics task force with members from medical neuroscience, neuroengineering, computer science, medical law, and medical ethics, as well as representatives of patient advocacy groups and the public.

  8. [The laboratory of tomorrow. Particular reference to hematology].

    PubMed

    Cazal, P

    1985-01-01

    A serious prediction can only be an extrapolation of recent developments. To be exact, the development has to continue in the same direction, which is only a probability. Probable development of hematological technology: Progress in methods. Development of new labelling methods: radio-elements, antibodies. Monoclonal antibodies. Progress in equipment: Cell counters and their adaptation to routine hemograms is a certainty. From analyzers: a promise that will perhaps become reality. Coagulometers: progress still to be made. Hemagglutination detectors and their application to grouping: good achievements, but the market is too limited. Computerization and automation: What form will the computerizing take? What will the computer do? Who will the computer control? What should the automatic analyzers be? Two current levels. Relationships between the automatic analysers and the computer. rapidity, fidelity and above all, reliability. Memory: large capacity and easy access. Disadvantages: conservatism and technical dependency. How can they be avoided? Development of the environment: Laboratory input: outside supplies, electricity, reagents, consumables. Samples and their identification. Output: distribution of results and communication problems. Centralization or decentralization? What will tomorrow's laboratory be? 3 hypotheses: optimistic, pessimistic, and balanced.

  9. Diversity of bilateral synaptic assemblies for binaural computation in midbrain single neurons.

    PubMed

    He, Na; Kong, Lingzhi; Lin, Tao; Wang, Shaohui; Liu, Xiuping; Qi, Jiyao; Yan, Jun

    2017-11-01

    Binaural hearing confers many beneficial functions but our understanding of its underlying neural substrates is limited. This study examines the bilateral synaptic assemblies and binaural computation (or integration) in the central nucleus of the inferior colliculus (ICc) of the auditory midbrain, a key convergent center. Using in-vivo whole-cell patch-clamp, the excitatory and inhibitory postsynaptic potentials (EPSPs/IPSPs) of single ICc neurons to contralateral, ipsilateral and bilateral stimulation were recorded. According to the contralateral and ipsilateral EPSP/IPSP, 7 types of bilateral synaptic assemblies were identified. These include EPSP-EPSP (EE), E-IPSP (EI), E-no response (EO), II, IE, IO and complex-mode (CM) neurons. The CM neurons showed frequency- and/or amplitude-dependent EPSPs/IPSPs to contralateral or ipsilateral stimulation. Bilateral stimulation induced EPSPs/IPSPs that could be larger than (facilitation), similar to (ineffectiveness) or smaller than (suppression) those induced by contralateral stimulation. Our findings have allowed our group to characterize novel neural circuitry for binaural computation in the midbrain. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. A Faster Parallel Algorithm and Efficient Multithreaded Implementations for Evaluating Betweenness Centrality on Massive Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Madduri, Kamesh; Ediger, David; Jiang, Karl

    2009-02-15

    We present a new lock-free parallel algorithm for computing betweenness centrality of massive small-world networks. With minor changes to the data structures, our algorithm also achieves better spatial cache locality compared to previous approaches. Betweenness centrality is a key algorithm kernel in HPCS SSCA#2, a benchmark extensively used to evaluate the performance of emerging high-performance computing architectures for graph-theoretic computations. We design optimized implementations of betweenness centrality and the SSCA#2 benchmark for two hardware multithreaded systems: a Cray XMT system with the Threadstorm processor, and a single-socket Sun multicore server with the UltraSPARC T2 processor. For a small-world network of 134 million vertices and 1.073 billion edges, the 16-processor XMT system and the 8-core Sun Fire T5120 server achieve TEPS scores (an algorithmic performance count for the SSCA#2 benchmark) of 160 million and 90 million respectively, which corresponds to more than a 2X performance improvement over the previous parallel implementations. To better characterize the performance of these multithreaded systems, we correlate the SSCA#2 performance results with data from the memory-intensive STREAM and RandomAccess benchmarks. Finally, we demonstrate the applicability of our implementation to analyze massive real-world datasets by computing approximate betweenness centrality for a large-scale IMDb movie-actor network.
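
    (Exact betweenness needs a shortest-path sweep from every vertex, which is the cost the parallel implementations attack. The same approximation-by-sampling idea used for the IMDb network is available off the shelf in networkx, as the small sketch below shows on a synthetic small-world graph.)

        import networkx as nx

        G = nx.watts_strogatz_graph(n=2_000, k=6, p=0.1, seed=1)  # small-world stand-in

        exact = nx.betweenness_centrality(G)                 # Brandes: one sweep per vertex
        approx = nx.betweenness_centrality(G, k=200, seed=1)  # accumulate from 200 sampled pivots only

        top = max(G, key=exact.get)
        print(round(exact[top], 4), round(approx[top], 4))   # the sampled estimate tracks the exact value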

  11. Brain-computer interface technology: a review of the Second International Meeting.

    PubMed

    Vaughan, Theresa M; Heetderks, William J; Trejo, Leonard J; Rymer, William Z; Weinrich, Michael; Moore, Melody M; Kübler, Andrea; Dobkin, Bruce H; Birbaumer, Niels; Donchin, Emanuel; Wolpaw, Elizabeth Winter; Wolpaw, Jonathan R

    2003-06-01

    This paper summarizes the Brain-Computer Interfaces for Communication and Control, The Second International Meeting, held in Rensselaerville, NY, in June 2002. Sponsored by the National Institutes of Health and organized by the Wadsworth Center of the New York State Department of Health, the meeting addressed current work and future plans in brain-computer interface (BCI) research. Ninety-two researchers representing 38 different research groups from the United States, Canada, Europe, and China participated. The BCIs discussed at the meeting use electroencephalographic activity recorded from the scalp or single-neuron activity recorded within cortex to control cursor movement, select letters or icons, or operate neuroprostheses. The central element in each BCI is a translation algorithm that converts electrophysiological input from the user into output that controls external devices. BCI operation depends on effective interaction between two adaptive controllers, the user who encodes his or her commands in the electrophysiological input provided to the BCI, and the BCI that recognizes the commands contained in the input and expresses them in device control. Current BCIs have maximum information transfer rates of up to 25 b/min. Achievement of greater speed and accuracy requires improvements in signal acquisition and processing, in translation algorithms, and in user training. These improvements depend on interdisciplinary cooperation among neuroscientists, engineers, computer programmers, psychologists, and rehabilitation specialists, and on adoption and widespread application of objective criteria for evaluating alternative methods. The practical use of BCI technology will be determined by the development of appropriate applications and identification of appropriate user groups, and will require careful attention to the needs and desires of individual users.

  12. Extension of a streamwise upwind algorithm to a moving grid system

    NASA Technical Reports Server (NTRS)

    Obayashi, Shigeru; Goorjian, Peter M.; Guruswamy, Guru P.

    1990-01-01

    A new streamwise upwind algorithm was derived to compute unsteady flow fields with the use of a moving-grid system. The temporally nonconservative LU-ADI (lower-upper-factored, alternating-direction-implicit) method was applied for time marching computations. A comparison of the temporally nonconservative method with a time-conservative implicit upwind method indicates that the solutions are insensitive to the conservative properties of the implicit solvers when practical time steps are used. Using this new method, computations were made for an oscillating wing at a transonic Mach number. The computed results confirm that the present upwind scheme captures the shock motion better than the central-difference scheme based on the Beam-Warming algorithm. The new upwind option of the code allows larger time steps and thus is more efficient, even though it requires slightly more computational time per time step than the central-difference option.
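
    (The behaviour reported, an upwind scheme capturing a moving front that a central-difference scheme smears or oscillates around, already shows up in 1-D linear advection. The sketch below is a generic illustration and has no connection to the actual flow solver.)

        import numpy as np

        n_cells, n_steps = 200, 120
        c, dx, dt = 1.0, 1.0 / 200, 0.004                    # CFL = c*dt/dx = 0.8
        x = np.linspace(0.0, 1.0, n_cells, endpoint=False)
        u_up = np.where((x > 0.1) & (x < 0.3), 1.0, 0.0)     # square pulse
        u_ct = u_up.copy()

        for _ in range(n_steps):
            # first-order upwind: diffusive but monotone
            u_up -= c * dt / dx * (u_up - np.roll(u_up, 1))
            # central difference in space with explicit time stepping: oscillates at the front
            u_ct -= c * dt / (2.0 * dx) * (np.roll(u_ct, -1) - np.roll(u_ct, 1))

        print(round(float(u_up.min()), 3), round(float(u_ct.min()), 3))  # upwind stays >= 0, central undershoots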

  13. [Personal computer-based computer monitoring system of the anesthesiologist (2-year experience in development and use)].

    PubMed

    Buniatian, A A; Sablin, I N; Flerov, E V; Mierbekov, E M; Broĭtman, O G; Shevchenko, V V; Shitikov, I I

    1995-01-01

    Creation of computer monitoring systems (CMS) for operating rooms is one of the most important spheres of personal computer employment in anesthesiology. The authors developed a PC RS/AT-based CMS and effectively used it for more than 2 years. This system permits comprehensive monitoring in cardiosurgical operations by processing in real time the values of arterial and central venous pressure, pressure in the pulmonary artery, bioelectrical activity of the brain, and two temperature values. Use of this CMS helped appreciably improve patients' safety during surgery. The possibility of assessing brain function by computer monitoring of the EEG simultaneously with central hemodynamics and body temperature permits the anesthesiologist to objectively assess the depth of anesthesia and to diagnose cerebral hypoxia. The automated anesthesiological chart issued by the CMS after surgery reliably reflects the patient's status and the measures taken by the anesthesiologist.

  14. Comparative evaluation of fracture resistance under static and fatigue loading of endodontically treated teeth restored with carbon fiber posts, glass fiber posts, and an experimental dentin post system: an in vitro study.

    PubMed

    Ambica, Khetarpal; Mahendran, Kavitha; Talwar, Sangeeta; Verma, Mahesh; Padmini, Govindaswamy; Periasamy, Ravishankar

    2013-01-01

    This investigation sought to compare the fracture resistance under static and fatigue loading of endodontically treated teeth restored with fiber-reinforced composite posts and experimental dentin posts milled from human root dentin by using computer-aided design/computer-aided manufacturing. Seventy maxillary central incisors were obturated and divided into 4 groups: control group without any post (n = 10), carbon fiber post group (n = 20), glass fiber post group (n = 20), and dentin post group (n = 20). Control group teeth were prepared to a height of 5 mm. In all other teeth, post space was prepared; a post was cemented, and a core build-up was provided. Half the samples from each group were statically loaded until failure, and the remaining half were subjected to cyclic loading, followed by monostatic load until fracture. One-way analysis of variance and Bonferroni multiple comparisons revealed a significant difference among test groups. The control group demonstrated highest fracture resistance (935.03 ± 33.53 N), followed by the dentin post group (793.12 ± 33.69 N), glass fiber post group (603.44 ± 46.67 N), and carbon fiber post group (497.19 ± 19.27 N) under static loading. These values reduced to 786.69 ± 29.64 N, 646.34 ± 26.56 N, 470 ± 36.34 N, and 379.71 ± 13.95 N, respectively, after cyclic loading. Results suggest that human dentin can serve as post material under static and fatigue loading. Although at an early stage in research, the use of dentin posts in root-filled teeth looks promising. Copyright © 2013 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  15. Increased Retinal Thinning after Combination of Internal Limiting Membrane Peeling and Silicone Oil Endotamponade in Proliferative Diabetic Retinopathy.

    PubMed

    Kaneko, Hiroki; Matsuura, Toshiyuki; Takayama, Kei; Ito, Yasuki; Iwase, Takeshi; Ueno, Shinji; Nonobe, Norie; Yasuda, Shunsuke; Kataoka, Keiko; Terasaki, Hiroko

    2017-01-01

    The aim of this study was to examine the change in retinal thickness after vitrectomy with internal limiting membrane (ILM) peeling and/or silicone oil (SO) endotamponade in proliferative diabetic retinopathy (PDR). The actual amount and ratio of changes in the retinal thickness were calculated. Compared to control eyes in the ILM peeling (-)/SO (-) group, the central, superior inner, and temporal inner retina in the ILM peeling (+)/SO (-) group, the central and superior inner retina in the ILM peeling (-)/SO (+) group, and the central, inferior inner, temporal inner, and nasal inner retina in the ILM peeling (+)/SO (+) group showed a significant reduction of the retinal thickness. The central, superior inner, and temporal inner retina in the ILM peeling (+)/SO (-) group, the central and superior inner retina in the ILM peeling (-)/SO (+) group, and the central, superior inner, inferior inner, and temporal inner retina in the ILM peeling (+)/SO (+) group showed a significantly increased reduction rate of the retinal thickness compared to the control group. Macular retinal thinning in PDR was observed after ILM peeling and SO endotamponade, and it was increased by the combination of these 2 factors. © 2017 S. Karger AG, Basel.

  16. A computational modeling of semantic knowledge in reading comprehension: Integrating the landscape model with latent semantic analysis.

    PubMed

    Yeari, Menahem; van den Broek, Paul

    2016-09-01

    It is a well-accepted view that the prior semantic (general) knowledge that readers possess plays a central role in reading comprehension. Nevertheless, computational models of reading comprehension have not integrated the simulation of semantic knowledge and online comprehension processes under a unified mathematical algorithm. The present article introduces a computational model that integrates the landscape model of comprehension processes with latent semantic analysis representation of semantic knowledge. In three sets of simulations of previous behavioral findings, the integrated model successfully simulated the activation and attenuation of predictive and bridging inferences during reading, as well as centrality estimations and recall of textual information after reading. Analyses of the computational results revealed new theoretical insights regarding the underlying mechanisms of the various comprehension phenomena.
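
    (The latent-semantic-analysis half of the integrated model can be approximated in a few lines: build a term-document matrix, take a truncated SVD, and read semantic relatedness off cosine similarities in the reduced space. The three sentences below are placeholders, not materials from the study.)

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.decomposition import TruncatedSVD
        from sklearn.metrics.pairwise import cosine_similarity

        texts = [
            "the knight drew his sword and charged into battle",
            "a soldier raised his sword in battle",
            "interest rates rose at the central bank",
        ]

        X = TfidfVectorizer().fit_transform(texts)                        # term-document matrix
        Z = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)  # reduced semantic space
        print(cosine_similarity(Z))  # the two battle sentences land close together in the reduced space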

  17. Crustal structure of the Pannonian-Carpathian region, Central Europe, from ambient noise tomography

    NASA Astrophysics Data System (ADS)

    Ren, Y.; Stuart, G. W.; Houseman, G. A.; Carpathian Basins Project Working Group

    2010-12-01

    The Pannonian Basin of Central Europe is a major extensional basin surrounded by the Carpathian Mountains. During the evolution of the Carpathian-Pannonian region, extension of the crust and lithosphere created several inter-related basins of which the Pannonian basin is the largest. Imaging the seismic velocity structure of the crust and the upper mantle may help us understand the structure and geodynamic evolution of this part of central Europe. Here, we use ambient noise tomography to investigate the crust and uppermost mantle structure in the region. We have collected and processed continuous data from 56 temporary stations deployed in the Carpathian Basins Project (CBP) for 16 months (2005-2007) and 41 permanent broadband stations; this dataset enables the most well-resolved images of the S-wave structure of the region yet obtained. We computed the cross-correlation between vertical component seismograms from pairs of stations and stacked the correlated waveforms over 1-2 years to estimate the Rayleigh wave Green’s function. Frequency-time analysis is used to measure the group velocity dispersion curves, which are then inverted for the group velocity maps. Our 4-10 s group velocity maps exhibit low velocity anomalies which clearly defined the major sediment depo-centers in the Carpathian region. A broad low velocity anomaly in the center of the 5 s group velocity map can be associated with the Pannonian Basin, whereas an anomaly in the southeastern region is related to the Moesian platform. Further east, the Vienna Basin can also be seen on our maps. A fast anomaly in the central region can be associated with the Mid-Hungarian line. At periods from 18 to 24 seconds, group velocities become increasingly sensitive to crustal thickness. The maps also reveal low-velocity anomalies associated with the Carpathians. The low velocity anomalies are probably caused by deeper crustal roots beneath the mountain ranges which occur due to isostatic compensation. CBP working group: G. Houseman, G. Stuart, Y. Ren, B. Dando, P. Lorinczi, School of Earth and Environment, University of Leeds, UK; E. Hegedus, A. Kovács, I. Török, I. László, R. Csabafi, Eötvös Loránd Geophysical Institute, Budapest, Hungary; E. Brüeckl, H. Hausmann, W. Loderer, T-U Wien, Vienna, Austria; S. Radovanovic, V. Kovacevic, D. Valcic, S. Petrovic-Cacic, G. Krunic, Seismological Survey of Serbia, Belgrade, Serbia; A. Brisbourne, D. Hawthorn, A. Horleston, V. Lane, SEIS-UK, Leicester University, UK.
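
    (The core processing step, cross-correlating simultaneous noise records at two stations and stacking over many windows so that the coherent part emerges as an empirical Green's function, is short enough to sketch. The synthetic traces below stand in for the CBP data.)

        import numpy as np
        from scipy.signal import fftconvolve

        rng = np.random.default_rng(0)
        fs, n_windows, n = 1.0, 30, 86_400            # 1 Hz data, 30 day-long windows

        stack = np.zeros(2 * n - 1)
        for _ in range(n_windows):
            wavefield = rng.standard_normal(n + 600)
            a = wavefield[:n]                          # station A
            b = np.roll(wavefield, 120)[:n]            # station B sees the same field 120 s later
            b = b + 0.5 * rng.standard_normal(n)       # plus incoherent local noise
            stack += fftconvolve(a, b[::-1], mode="full")   # cross-correlation, then stack

        lag = np.arange(-(n - 1), n) / fs
        print(lag[np.argmax(stack)])   # peak near -120 s: the interstation travel time emerges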

  18. 28 CFR 25.8 - System safeguards.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... justice agency computer site must have adequate physical security to protect against any unauthorized... Index is stored electronically for use in an FBI computer environment. The NICS central computer will... authorized personnel who have identified themselves and their need for access to a system security officer...

  19. 28 CFR 25.8 - System safeguards.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... justice agency computer site must have adequate physical security to protect against any unauthorized... Index is stored electronically for use in an FBI computer environment. The NICS central computer will... authorized personnel who have identified themselves and their need for access to a system security officer...

  20. 28 CFR 25.8 - System safeguards.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... justice agency computer site must have adequate physical security to protect against any unauthorized... Index is stored electronically for use in an FBI computer environment. The NICS central computer will... authorized personnel who have identified themselves and their need for access to a system security officer...

  1. 28 CFR 25.8 - System safeguards.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... justice agency computer site must have adequate physical security to protect against any unauthorized... Index is stored electronically for use in an FBI computer environment. The NICS central computer will... authorized personnel who have identified themselves and their need for access to a system security officer...

  2. 28 CFR 25.8 - System safeguards.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... justice agency computer site must have adequate physical security to protect against any unauthorized... Index is stored electronically for use in an FBI computer environment. The NICS central computer will... authorized personnel who have identified themselves and their need for access to a system security officer...

  3. Understanding Emergency Care Delivery Through Computer Simulation Modeling.

    PubMed

    Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L

    2018-02-01

    In 2017, Academic Emergency Medicine convened a consensus conference entitled, "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This article, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. This article discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, system dynamics modeling, discrete-event simulation, and agent-based simulation), along with problems amenable to their use and examples relevant to emergency care. Also discussed is an introduction to available software modeling platforms and how to explore their use for research, along with a research agenda for computer simulation modeling. Through this article, our goal is to enhance adoption of computer simulation, a set of methods that hold great promise in addressing emergency care organization and design challenges. © 2017 by the Society for Academic Emergency Medicine.
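
    (Of the four approaches, discrete-event simulation is the workhorse for patient-flow questions. The self-contained sketch below models a single-queue emergency department with a fixed number of beds; all rates are invented and the model is illustrative only.)

        import heapq, random

        def simulate_ed(n_patients=10_000, n_beds=10, arrival_rate=1.5, service_rate=0.2, seed=1):
            """Event-driven M/M/c-style model: mean wait (hours) for a bed."""
            random.seed(seed)
            free_at = [0.0] * n_beds        # time at which each bed next becomes free
            heapq.heapify(free_at)
            t, waits = 0.0, []
            for _ in range(n_patients):
                t += random.expovariate(arrival_rate)      # next arrival
                earliest_bed = heapq.heappop(free_at)
                start = max(t, earliest_bed)               # wait if no bed is free yet
                waits.append(start - t)
                heapq.heappush(free_at, start + random.expovariate(service_rate))
            return sum(waits) / len(waits)

        print(simulate_ed(n_beds=10))   # rerun with different bed counts to see the effect on waiting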

  4. Acceleration and Velocity Sensing from Measured Strain

    NASA Technical Reports Server (NTRS)

    Pak, Chan-Gi; Truax, Roger

    2015-01-01

    A simple approach for computing acceleration and velocity of a structure from the strain is proposed in this study. First, deflection and slope of the structure are computed from the strain using a two-step theory. Frequencies of the structure are computed from the time histories of strain using a parameter estimation technique together with an autoregressive moving average model. From deflection, slope, and frequencies of the structure, acceleration and velocity of the structure can be obtained using the proposed approach. Simple harmonic motion is assumed for the acceleration computations, and the central difference equation with a linear autoregressive model is used for the computations of velocity. A cantilevered rectangular wing model is used to validate the simple approach. The quality of the computed deflection, acceleration, and velocity values is independent of the number of fibers. The central difference equation with a linear autoregressive model proposed in this study follows the target response with reasonable accuracy. Therefore, the handicap of the backward difference equation, phase shift, is successfully overcome.
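
    (The two derived quantities can be written down directly: under the simple-harmonic-motion assumption the acceleration is -(2*pi*f)^2 times the deflection, and the velocity follows from a central difference of the deflection history. The sketch below uses a synthetic single-mode signal and omits the autoregressive element of the paper's method.)

        import numpy as np

        fs = 200.0                                   # sample rate, Hz
        t = np.arange(0.0, 2.0, 1.0 / fs)
        f_mode = 3.2                                 # identified modal frequency, Hz
        w = 2.0 * np.pi * f_mode
        defl = 0.01 * np.sin(w * t)                  # deflection reconstructed from strain (m)

        acc = -(w ** 2) * defl                       # SHM assumption: a = -(2*pi*f)^2 * x

        vel = np.empty_like(defl)                    # central difference for velocity
        vel[1:-1] = (defl[2:] - defl[:-2]) * fs / 2.0
        vel[0], vel[-1] = vel[1], vel[-2]            # crude end-point handling

        print(round(np.abs(vel).max(), 3), round(0.01 * w, 3))   # peak velocity ~ amplitude * omega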

  5. Sleep spindle alterations in patients with Parkinson's disease

    PubMed Central

    Christensen, Julie A. E.; Nikolic, Miki; Warby, Simon C.; Koch, Henriette; Zoetmulder, Marielle; Frandsen, Rune; Moghadam, Keivan K.; Sorensen, Helge B. D.; Mignot, Emmanuel; Jennum, Poul J.

    2015-01-01

    The aim of this study was to identify changes of sleep spindles (SS) in the EEG of patients with Parkinson's disease (PD). Five sleep experts manually identified SS at a central scalp location (C3-A2) in 15 PD and 15 age- and sex-matched control subjects. Each SS was given a confidence score, and by using a group consensus rule, 901 SS were identified and characterized by their (1) duration, (2) oscillation frequency, (3) maximum peak-to-peak amplitude, (4) percent-to-peak amplitude, and (5) density. Between-group comparisons were made for all SS characteristics computed, and significant changes for PD patients vs. control subjects were found for duration, oscillation frequency, maximum peak-to-peak amplitude and density. Specifically, SS density was lower, duration was longer, oscillation frequency slower and maximum peak-to-peak amplitude higher in patients vs. controls. We also computed inter-expert reliability in SS scoring and found a significantly lower reliability in scoring definite SS in patients when compared to controls. How neurodegeneration in PD could influence SS characteristics is discussed. We also note that the SS morphological changes observed here may affect automatic detection of SS in patients with PD or other neurodegenerative disorders (NDDs). PMID:25983685
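
    (Once a spindle has been marked, the characteristics compared between groups reduce to a few lines of signal arithmetic; the sketch below runs on a synthetic spindle and invented totals, not the study's recordings.)

        import numpy as np

        fs = 100.0                                     # EEG sample rate, Hz
        t = np.arange(0.0, 0.9, 1.0 / fs)              # a 0.9 s marked spindle
        spindle = 20.0 * np.hanning(t.size) * np.sin(2 * np.pi * 13.0 * t)   # microvolts

        duration = t.size / fs                                          # (1) duration, s
        crossings = np.count_nonzero(np.diff(np.signbit(spindle).astype(int)))
        osc_freq = crossings / (2.0 * duration)                         # (2) oscillation frequency, Hz
        p2p = spindle.max() - spindle.min()                             # (3) max peak-to-peak amplitude, uV
        density = 310 / 200.0                                           # (5) spindles per minute, given 310 SS in 200 min

        print(duration, round(osc_freq, 1), round(p2p, 1), round(density, 2))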

  6. Developing an Advanced Environment for Collaborative Computing

    NASA Technical Reports Server (NTRS)

    Becerra-Fernandez, Irma; Stewart, Helen; DelAlto, Martha; DelAlto, Martha; Knight, Chris

    1999-01-01

    Knowledge management in general tries to organize and make available important know-how, whenever and wherever it is needed. Today, organizations rely on decision-makers to produce "mission critical" decisions that are based on inputs from multiple domains. The ideal decision-maker has a profound understanding of specific domains that influence the decision-making process, coupled with the experience that allows them to act quickly and decisively on the information. In addition, learning companies benefit by not repeating costly mistakes, and by reducing time-to-market in Research & Development projects. Group decision-making tools can help companies make better decisions by capturing the knowledge from groups of experts. Furthermore, companies that capture their customers' preferences can improve their customer service, which translates to larger profits. Therefore collaborative computing provides a common communication space, improves sharing of knowledge, provides a mechanism for real-time feedback on the tasks being performed, helps to optimize processes, and results in a centralized knowledge warehouse. This paper presents the research directions of a project which seeks to augment an advanced collaborative web-based environment called Postdoc with workflow capabilities. Postdoc is a "government-off-the-shelf" document management software developed at NASA-Ames Research Center (ARC).

  7. Computerized Library Serves Six Colleges

    ERIC Educational Resources Information Center

    Blankenship, Ted

    1973-01-01

    The Associated Colleges of Central Kansas have a cooperative library program that gives students access to 300,000 volumes and 2,800 periodicals. This is possible through a central computer book listing and a telephone hotline. (PG)

  8. Confidence Sharing: An Economic Strategy for Efficient Information Flows in Animal Groups

    PubMed Central

    Korman, Amos; Greenwald, Efrat; Feinerman, Ofer

    2014-01-01

    Social animals may share information to obtain a more complete and accurate picture of their surroundings. However, physical constraints on communication limit the flow of information between interacting individuals in a way that can cause an accumulation of errors and deteriorated collective behaviors. Here, we theoretically study a general model of information sharing within animal groups. We take an algorithmic perspective to identify efficient communication schemes that are, nevertheless, economic in terms of communication, memory and individual internal computation. We present a simple and natural algorithm in which each agent compresses all information it has gathered into a single parameter that represents its confidence in its behavior. Confidence is communicated between agents by means of active signaling. We motivate this model by novel and existing empirical evidences for confidence sharing in animal groups. We rigorously show that this algorithm competes extremely well with the best possible algorithm that operates without any computational constraints. We also show that this algorithm is minimal, in the sense that further reduction in communication may significantly reduce performances. Our proofs rely on the Cramér-Rao bound and on our definition of a Fisher Channel Capacity. We use these concepts to quantify information flows within the group which are then used to obtain lower bounds on collective performance. The abstract nature of our model makes it rigorously solvable and its conclusions highly general. Indeed, our results suggest confidence sharing as a central notion in the context of animal communication. PMID:25275649
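
    (One concrete way to read the single-confidence-parameter idea is inverse-variance weighting: if each agent carries an estimate together with its Fisher information, two interacting agents can merge estimates weighted by confidence, and the merged confidence is simply the sum. The toy sketch below illustrates that reading; it is not the paper's formal model.)

        def merge(est_a, conf_a, est_b, conf_b):
            """Combine two independent estimates weighted by confidence
            (confidence = Fisher information = 1 / variance)."""
            conf = conf_a + conf_b
            est = (conf_a * est_a + conf_b * est_b) / conf
            return est, conf

        # agent A is fairly sure the target bears 40 deg; agent B weakly believes 10 deg
        print(merge(40.0, 4.0, 10.0, 1.0))   # -> (34.0, 5.0): pulled only slightly toward B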

  9. Confidence sharing: an economic strategy for efficient information flows in animal groups.

    PubMed

    Korman, Amos; Greenwald, Efrat; Feinerman, Ofer

    2014-10-01

    Social animals may share information to obtain a more complete and accurate picture of their surroundings. However, physical constraints on communication limit the flow of information between interacting individuals in a way that can cause an accumulation of errors and deteriorated collective behaviors. Here, we theoretically study a general model of information sharing within animal groups. We take an algorithmic perspective to identify efficient communication schemes that are, nevertheless, economic in terms of communication, memory and individual internal computation. We present a simple and natural algorithm in which each agent compresses all information it has gathered into a single parameter that represents its confidence in its behavior. Confidence is communicated between agents by means of active signaling. We motivate this model by novel and existing empirical evidences for confidence sharing in animal groups. We rigorously show that this algorithm competes extremely well with the best possible algorithm that operates without any computational constraints. We also show that this algorithm is minimal, in the sense that further reduction in communication may significantly reduce performances. Our proofs rely on the Cramér-Rao bound and on our definition of a Fisher Channel Capacity. We use these concepts to quantify information flows within the group which are then used to obtain lower bounds on collective performance. The abstract nature of our model makes it rigorously solvable and its conclusions highly general. Indeed, our results suggest confidence sharing as a central notion in the context of animal communication.

  10. Conformal field algebras with quantum symmetry from the theory of superselection sectors

    NASA Astrophysics Data System (ADS)

    Mack, Gerhard; Schomerus, Volker

    1990-11-01

    According to the theory of superselection sectors of Doplicher, Haag, and Roberts, field operators which make transitions between different superselection sectors—i.e. different irreducible representations of the observable algebra—are to be constructed by adjoining localized endomorphisms to the algebra of local observables. We find the relevant endomorphisms of the chiral algebra of observables in the minimal conformal model with central charge c=1/2 (Ising model). We show by explicit and elementary construction how they determine a representation of the braid group B∞ which is associated with a Temperley-Lieb-Jones algebra. We recover fusion rules, and compute the quantum dimensions of the superselection sectors. We exhibit a field algebra which is quantum group covariant and acts in the Hilbert space of physical states. It obeys local braid relations in an appropriate weak sense.
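
    (For the c = 1/2 theory, the quantum dimensions referred to can be checked against the Ising fusion rule σ × σ = 1 + ε. The lines below are a standard consistency check, not a reproduction of the paper's operator-algebraic derivation.)

        \[
          d_1 = d_\varepsilon = 1, \qquad
          d_\sigma^2 = d_1 + d_\varepsilon = 2 \;\Rightarrow\; d_\sigma = \sqrt{2}, \qquad
          D = \sqrt{d_1^2 + d_\varepsilon^2 + d_\sigma^2} = 2 .
        \]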

  11. Dynamic Key Management Schemes for Secure Group Access Control Using Hierarchical Clustering in Mobile Ad Hoc Networks

    NASA Astrophysics Data System (ADS)

    Tsaur, Woei-Jiunn; Pai, Haw-Tyng

    2008-11-01

    The applications of group computing and communication motivate the requirement to provide group access control in mobile ad hoc networks (MANETs). Group operation in MANETs is performed in a decentralized manner and membership changes dynamically. Moreover, due to the lack of centralized control, groups in MANETs are inherently insecure and vulnerable to attacks from both within and outside the groups. Such features make access control more challenging in MANETs. Recently, several researchers have proposed group access control mechanisms in MANETs based on a variety of threshold signatures. However, these mechanisms cannot actually satisfy MANETs' dynamic environments, because threshold-based mechanisms cannot operate when the number of members is below the threshold value. Hence, by combining an efficient elliptic curve cryptosystem, a self-certified public key cryptosystem and a secure filter technique, we construct dynamic key management schemes based on hierarchical clustering for securing group access control in MANETs. Specifically, the proposed schemes can maintain secure group access control simply by renewing the secure filters of a few cluster heads when a cluster head joins or leaves a cross-cluster. In this way, the proposed group access control scheme can be very effective for securing practical applications in MANETs.
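
    (Only the elliptic-curve building block is easy to illustrate generically: a cluster head and a joining member can agree on a pairwise key with ECDH and derive a working key from it. The sketch below uses the Python cryptography package and is not the authors' self-certified or secure-filter construction.)

        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import ec
        from cryptography.hazmat.primitives.kdf.hkdf import HKDF

        head = ec.generate_private_key(ec.SECP256R1())      # cluster head key pair
        member = ec.generate_private_key(ec.SECP256R1())    # joining member key pair

        # each side combines its private key with the other's public key
        s1 = head.exchange(ec.ECDH(), member.public_key())
        s2 = member.exchange(ec.ECDH(), head.public_key())
        assert s1 == s2

        key = HKDF(algorithm=hashes.SHA256(), length=32,
                   salt=None, info=b"cluster-key").derive(s1)
        print(key.hex())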

  12. Studies in Mathematics, Volume 22. Studies in Computer Science.

    ERIC Educational Resources Information Center

    Pollack, Seymour V., Ed.

    The nine articles in this collection were selected because they represent concerns central to computer science, emphasize topics of particular interest to mathematicians, and underscore the wide range of areas deeply and continually affected by computer science. The contents consist of: "Introduction" (S. V. Pollack), "The…

  13. A magnetic resonance imaging finding in children with cerebral palsy: Symmetrical central tegmental tract hyperintensity.

    PubMed

    Derinkuyu, Betul Emine; Ozmen, Evrim; Akmaz-Unlu, Havva; Altinbas, Namik Kemal; Gurkas, Esra; Boyunaga, Oznur

    2017-03-01

    The central tegmental tract is an extrapyramidal tract between the red nucleus and the inferior olivary nucleus, located bilaterally and symmetrically in the tegmentum pontis. The etiology of central tegmental tract hyperintensity on MRI is unclear. In this study, our aim was to evaluate the frequency of central tegmental tract lesions in patients with cerebral palsy and in a control group, as well as to determine whether there is an association between central tegmental tract lesions and cerebral palsy types. Clinical and MRI data of 200 patients with cerebral palsy in the study group (87 female, 113 male; mean age, 5.81 years; range, 0-16 years) and 258 patients in the control group (114 female, 144 male; mean age, 6.28 years; range, 0-16 years) were independently evaluated by two readers for the presence of central tegmental tract hyperintensity and other associated abnormalities. Central tegmental tract hyperintensities on T2WI were detected in 19% of the study group (38/200) and 3.5% of the control group (9/258) (p<0.0001). Among the 38 central tegmental tract lesions in the study group, the frequency of central tegmental tract hyperintensity was 16% (24/150) in spastic cerebral palsy and 35% (14/40) in dyskinetic cerebral palsy (p=0.0131). The prevalence of central tegmental tract hyperintensity is higher in patients with cerebral palsy, particularly the dyskinetic type. We suggest that there is an increased association of tegmental lesions with dyskinetic CP. Patients with cerebral palsy and ischemic changes were more likely to have central tegmental tract lesions. According to our results, we advocate that an ischemic process may have a role in the etiopathogenesis.

  14. Computer code for controller partitioning with IFPC application: A user's manual

    NASA Technical Reports Server (NTRS)

    Schmidt, Phillip H.; Yarkhan, Asim

    1994-01-01

    A user's manual for the computer code for partitioning a centralized controller into decentralized subcontrollers with applicability to Integrated Flight/Propulsion Control (IFPC) is presented. Partitioning of a centralized controller into two subcontrollers is described, and the algorithm on which the code is based is discussed. The algorithm uses parameter optimization of a cost function, which is described. The major data structures and functions are described. Specific instructions are given. The user is led through an example of an IFPC application.

  15. Grace: A cross-platform micromagnetic simulator on graphics processing units

    NASA Astrophysics Data System (ADS)

    Zhu, Ru

    2015-12-01

    A micromagnetic simulator running on graphics processing units (GPUs) is presented. Unlike the GPU implementations of other research groups, which predominantly run on NVidia's CUDA platform, this simulator is developed with C++ Accelerated Massive Parallelism (C++ AMP) and is hardware-platform independent. It runs on GPUs from vendors including NVidia, AMD and Intel, and achieves a significant performance boost compared to previous central processing unit (CPU) simulators, up to two orders of magnitude. The simulator paves the way for running large micromagnetic simulations on both high-end workstations with dedicated graphics cards and low-end personal computers with integrated graphics cards, and is freely available to download.

  16. TrajGraph: A Graph-Based Visual Analytics Approach to Studying Urban Network Centralities Using Taxi Trajectory Data.

    PubMed

    Huang, Xiaoke; Zhao, Ye; Yang, Jing; Zhang, Chong; Ma, Chao; Ye, Xinyue

    2016-01-01

    We propose TrajGraph, a new visual analytics method, for studying urban mobility patterns by integrating graph modeling and visual analysis with taxi trajectory data. A special graph is created to store and manifest real traffic information recorded by taxi trajectories over city streets. It conveys urban transportation dynamics which can be discovered by applying graph analysis algorithms. To support interactive, multiscale visual analytics, a graph partitioning algorithm is applied to create region-level graphs which have smaller size than the original street-level graph. Graph centralities, including Pagerank and betweenness, are computed to characterize the time-varying importance of different urban regions. The centralities are visualized by three coordinated views including a node-link graph view, a map view and a temporal information view. Users can interactively examine the importance of streets to discover and assess city traffic patterns. We have implemented a fully working prototype of this approach and evaluated it using massive taxi trajectories of Shenzhen, China. TrajGraph's capability in revealing the importance of city streets was evaluated by comparing the calculated centralities with the subjective evaluations from a group of drivers in Shenzhen. Feedback from a domain expert was collected. The effectiveness of the visual interface was evaluated through a formal user study. We also present several examples and a case study to demonstrate the usefulness of TrajGraph in urban transportation analysis.
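
    The two centralities named in the abstract are standard graph quantities; as a minimal illustration (not the authors' implementation), the sketch below computes PageRank and weighted betweenness with networkx on a toy directed street graph whose nodes and edge weights (average travel times) are assumed for the example.

```python
import networkx as nx

# Minimal illustration (not the authors' implementation) of the two
# centralities used by TrajGraph, on a toy directed street graph.
# Edge weights here stand in for average travel times along street segments.
G = nx.DiGraph()
G.add_weighted_edges_from([
    ("A", "B", 5.0), ("B", "C", 3.0), ("C", "A", 2.0),
    ("B", "D", 4.0), ("D", "E", 1.0), ("E", "B", 6.0),
])

# PageRank: importance of a node from the structure of incoming connections.
pagerank = nx.pagerank(G)

# Betweenness: how often a node lies on shortest (fastest) paths; networkx
# interprets the weight attribute as a distance (travel time) here.
betweenness = nx.betweenness_centrality(G, weight="weight", normalized=True)

for node in sorted(G):
    print(f"{node}: pagerank={pagerank[node]:.3f}  betweenness={betweenness[node]:.3f}")
```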

  17. Doubling over ten years of central obesity in Hong Kong Chinese working men.

    PubMed

    Ko, Tin-choi Gary; Chan, Juliana; Chan, Amy; Wong, Patrick; Hui, Stanley; Chow, Ferrie; Tong, Spencer; Chan, Cecilia

    2007-07-05

    Obesity is now an epidemic in most parts of the world. In this cross-sectional study, we report the most recent data on obesity in the Hong Kong Chinese working population and compare the changes over 10 years. Between July 2000 and March 2002, 5882 adult subjects from the working class in Hong Kong were recruited (2716 men (46.2%) and 3166 women (53.8%)). They were randomly selected using computer-generated codes according to the distribution of occupational groups. Results of this study were compared with the data collected from a prevalence survey for cardiovascular risk factors in a Hong Kong Chinese working population conducted in 1990 (1513 subjects, 910 men (60.1%) and 603 women (39.9%)). Standardized percentages of overweight, obesity, and central obesity in the Hong Kong Chinese working population were 59.7%, 35.0%, and 26.7% in men and 32.0%, 21.7%, and 26.7% in women. Compared to the data collected in 1990, the percentage of obesity increased by 5% in men and reduced by 6% in women. The percentage of central obesity doubled in men (from 12.2% to 26.7%) but remained stable in women. There is a doubling of the percentage of central obesity in Hong Kong Chinese working men over the previous decade. Education and proper lifestyle modification programs to tackle this social health issue are urgently indicated.

  18. Satisfaction with Life in Orofacial Pain Disorders: Associations and Theoretical Implications.

    PubMed

    Boggero, Ian A; Rojas-Ramirez, Marcia V; de Leeuw, Reny; Carlson, Charles R

    2016-01-01

    To test if patients with masticatory myofascial pain, local myalgia, centrally mediated myalgia, disc displacement, capsulitis/synovitis, or continuous neuropathic pain differed in self-reported satisfaction with life. The study also tested if satisfaction with life was similarly predicted by measures of physical, emotional, and social functioning across disorders. Satisfaction with life, fatigue, affective distress, social support, and pain data were extracted from the medical records of 343 patients seeking treatment for chronic orofacial pain. Patients were grouped by primary diagnosis assigned following their initial appointment. Satisfaction with life was compared between disorders, with and without pain intensity entered as a covariate. Disorder-specific linear regression models using physical, emotional, and social predictors of satisfaction with life were computed. Patients with centrally mediated myalgia reported significantly lower satisfaction with life than did patients with any of the other five disorders. Inclusion of pain intensity as a covariate weakened but did not eliminate the effect. Satisfaction with life was predicted by measures of physical, emotional, and social functioning, but these associations were not consistent across disorders. Results suggest that reduced satisfaction with life in patients with centrally mediated myalgia is not due only to pain intensity. There may be other factors that predispose people to both reduced satisfaction with life and centrally mediated myalgia. Furthermore, the results suggest that satisfaction with life is differentially influenced by physical, emotional, and social functioning in different orofacial pain disorders.

  19. Real-time ultrasound-guided catheterisation of the internal jugular vein: a prospective comparison with the landmark technique in critical care patients

    PubMed Central

    Karakitsos, Dimitrios; Labropoulos, Nicolaos; De Groot, Eric; Patrianakos, Alexandros P; Kouraklis, Gregorios; Poularas, John; Samonis, George; Tsoutsos, Dimosthenis A; Konstadoulakis, Manousos M; Karabinis, Andreas

    2006-01-01

    Introduction Central venous cannulation is crucial in the management of the critical care patient. This study was designed to evaluate whether real-time ultrasound-guided cannulation of the internal jugular vein is superior to the standard landmark method. Methods In this randomised study, 450 critical care patients who underwent real-time ultrasound-guided cannulation of the internal jugular vein were prospectively compared with 450 critical care patients in whom the landmark technique was used. Randomisation was performed by means of a computer-generated random-numbers table, and patients were stratified with regard to age, gender, and body mass index. Results There were no significant differences in gender, age, body mass index, or side of cannulation (left or right) or in the presence of risk factors for difficult venous cannulation such as prior catheterisation, limited sites for access attempts, previous difficulties during catheterisation, previous mechanical complication, known vascular abnormality, untreated coagulopathy, skeletal deformity, and cannulation during cardiac arrest between the two groups of patients. Furthermore, the physicians who performed the procedures had comparable experience in the placement of central venous catheters (p = non-significant). Cannulation of the internal jugular vein was achieved in all patients by using ultrasound and in 425 of the patients (94.4%) by using the landmark technique (p < 0.001). Average access time (skin to vein) and number of attempts were significantly reduced in the ultrasound group of patients compared with the landmark group (p < 0.001). In the landmark group, puncture of the carotid artery occurred in 10.6% of patients, haematoma in 8.4%, haemothorax in 1.7%, pneumothorax in 2.4%, and central venous catheter-associated blood stream infection in 16%, which were all significantly increased compared with the ultrasound group (p < 0.001). Conclusion The present data suggest that ultrasound-guided catheterisation of the internal jugular vein in critical care patients is superior to the landmark technique and therefore should be the method of choice in these patients. PMID:17112371

  20. A spacecraft computer repairable via command.

    NASA Technical Reports Server (NTRS)

    Fimmel, R. O.; Baker, T. E.

    1971-01-01

    The MULTIPAC is a central data system developed for deep-space probes with the distinctive feature that it may be repaired during flight via command and telemetry links by reprogramming around the failed unit. The computer organization uses pools of identical modules which the program organizes into one or more computers called processors. The interaction of these modules is dynamically controlled by the program rather than hardware. In the event of a failure, new programs are entered which reorganize the central data system with a somewhat reduced total processing capability aboard the spacecraft. Emphasis is placed on the evolution of the system architecture and the final overall system design rather than the specific logic design.

  1. Design of a modular digital computer system, CDRL no. D001, final design plan

    NASA Technical Reports Server (NTRS)

    Easton, R. A.

    1975-01-01

    The engineering breadboard implementation of the CDRL no. D001 modular digital computer system, developed during design of the logic system, is documented. This effort followed the architecture study completed and documented previously, and was intended to verify the concepts of a fault tolerant, automatically reconfigurable, modular version of the computer system conceived during the architecture study. The system has a microprogrammed 32 bit word length, general register architecture and an instruction set consisting of a subset of the IBM System 360 instruction set plus additional fault tolerance firmware. The following areas were covered: breadboard packaging, central control element, central processing element, memory, input/output processor, and maintenance/status panel and electronics.

  2. Bipartite Graphs as Models of Population Structures in Evolutionary Multiplayer Games

    PubMed Central

    Peña, Jorge; Rochat, Yannick

    2012-01-01

    By combining evolutionary game theory and graph theory, “games on graphs” study the evolutionary dynamics of frequency-dependent selection in population structures modeled as geographical or social networks. Networks are usually represented by means of unipartite graphs, and social interactions by two-person games such as the famous prisoner’s dilemma. Unipartite graphs have also been used for modeling interactions going beyond pairwise interactions. In this paper, we argue that bipartite graphs are a better alternative to unipartite graphs for describing population structures in evolutionary multiplayer games. To illustrate this point, we make use of bipartite graphs to investigate, by means of computer simulations, the evolution of cooperation under the conventional and the distributed N-person prisoner’s dilemma. We show that several implicit assumptions arising from the standard approach based on unipartite graphs (such as the definition of replacement neighborhoods, the intertwining of individual and group diversity, and the large overlap of interaction neighborhoods) can have a large impact on the resulting evolutionary dynamics. Our work provides a clear example of the importance of construction procedures in games on graphs, of the suitability of bigraphs and hypergraphs for computational modeling, and of the importance of concepts from social network analysis such as centrality, centralization and bipartite clustering for the understanding of dynamical processes occurring on networked population structures. PMID:22970237
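
    As a concrete, purely illustrative version of the bipartite representation the paper advocates, the sketch below builds a small player/group bipartite graph with networkx and scores each player in a public-goods-style N-person dilemma; the membership structure, synergy factor, cost and strategies are assumed values, not taken from the paper.

```python
import networkx as nx

# Toy bipartite population structure: one node set for players, one for the
# group interactions (games) they take part in.  All values below are
# assumptions for illustration only.
B = nx.Graph()
players = [f"p{i}" for i in range(6)]
games = ["g0", "g1", "g2"]
B.add_nodes_from(players, bipartite=0)
B.add_nodes_from(games, bipartite=1)
B.add_edges_from([("p0", "g0"), ("p1", "g0"), ("p2", "g0"),
                  ("p2", "g1"), ("p3", "g1"), ("p4", "g1"),
                  ("p4", "g2"), ("p5", "g2"), ("p0", "g2")])

cooperates = {p: (i % 2 == 0) for i, p in enumerate(players)}  # True = cooperate
r, cost = 3.0, 1.0        # synergy factor and contribution per game (assumed)

def payoff(player):
    """Public-goods payoff summed over every game the player belongs to."""
    total = 0.0
    for game in B[player]:                  # games adjacent to this player
        members = list(B[game])             # players adjacent to this game
        pot = r * cost * sum(cooperates[m] for m in members)
        total += pot / len(members) - (cost if cooperates[player] else 0.0)
    return total

for p in players:
    print(p, "C" if cooperates[p] else "D", round(payoff(p), 2))
```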

  3. Monitoring benthic algal communities: A comparison of targeted and coefficient sampling methods

    USGS Publications Warehouse

    Edwards, Matthew S.; Tinker, M. Tim

    2009-01-01

    Choosing an appropriate sample unit is a fundamental decision in the design of ecological studies. While numerous methods have been developed to estimate organism abundance, they differ in cost, accuracy and precision. Using both field data and computer simulation modeling, we evaluated the costs and benefits associated with two methods commonly used to sample benthic organisms in temperate kelp forests. One of these methods, the Targeted Sampling method, relies on different sample units, each "targeted" for a specific species or group of species, while the other method relies on coefficients that represent ranges of bottom cover obtained from visual estimates within standardized sample units. Both the field data and the computer simulations suggest that both methods yield remarkably similar estimates of organism abundance and among-site variability, although the Coefficient method slightly underestimates variability among sample units when abundances are low. In contrast, the two methods differ considerably in the effort needed to sample these communities; the Targeted Sampling requires more time and twice the personnel to complete. We conclude that the Coefficient Sampling method may be better for environmental monitoring programs where changes in mean abundance are of central concern and resources are limiting, but that the Targeted Sampling method may be better for ecological studies where quantitative relationships among species and small-scale variability in abundance are of central concern.

  4. Everything You Always Wanted to Know about Computers but Were Afraid to Ask.

    ERIC Educational Resources Information Center

    DiSpezio, Michael A.

    1989-01-01

    An overview of the basics of computers is presented. Definitions and discussions of processing, programs, memory, DOS, anatomy and design, central processing unit (CPU), disk drives, floppy disks, and peripherals are included. This article was designed to help teachers to understand basic computer terminology. (CW)

  5. Digital Data Transmission Via CATV.

    ERIC Educational Resources Information Center

    Stifle, Jack; And Others

    A low cost communications network has been designed for use in the PLATO IV computer-assisted instruction system. Over 1,000 remote computer graphic terminals each requiring a 1200 bps channel are to be connected to one centrally located computer. Digital data are distributed to these terminals using standard commercial cable television (CATV)…

  6. Design of a modular digital computer system

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A Central Control Element (CCE) module which controls the Automatically Reconfigurable Modular System (ARMS) and allows both redundant processing and multi-computing in the same computer with real time mode switching, is discussed. The same hardware is used for either reliability enhancement, speed enhancement, or for a combination of both.

  7. Helminth parasitic infections of the central nervous system: a diagnostic approach.

    PubMed

    Othman, Ahmad A; Bruschi, Fabrizio; Ganna, Ahmed A

    2014-04-01

    Helminth parasitic infections of the central nervous system (CNS) occur worldwide with high prevalence in tropical and subtropical countries. Clinical evaluation of patients is mandatory, and it is convenient to group the clinical manifestations into syndromes: for example, space-occupying lesions, meningitis, and encephalitis. The history should focus on residence or travel to endemic areas, diet, activities, intercurrent medical conditions, and associated clinical clues. Direct parasitological diagnosis can be reached by cerebrospinal fluid and cerebral tissue examination either by microscopy, culture, or immunological techniques. Immunodiagnosis by detection of parasite antibodies or antigens in serum could provide indirect evidence of parasitic infections. In addition, various imaging and radiological techniques, e.g., computed tomography (CT) and magnetic resonance imaging (MRI), complement the diagnostic work-up of CNS diseases. Finally, the helminthic CNS infections of global impact, such as schistosomiasis, neurotoxocariasis, Strongyloides infection, neurotrichinosis, neurocysticercosis, and echinococcosis, will be briefly discussed as regards the principal clinical and diagnostic features.

  8. Link failure detection in a parallel computer

    DOEpatents

    Archer, Charles J.; Blocksome, Michael A.; Megerian, Mark G.; Smith, Brian E.

    2010-11-09

    Methods, apparatus, and products are disclosed for link failure detection in a parallel computer including compute nodes connected in a rectangular mesh network, each pair of adjacent compute nodes in the rectangular mesh network connected together using a pair of links, that includes: assigning each compute node to either a first group or a second group such that adjacent compute nodes in the rectangular mesh network are assigned to different groups; sending, by each of the compute nodes assigned to the first group, a first test message to each adjacent compute node assigned to the second group; determining, by each of the compute nodes assigned to the second group, whether the first test message was received from each adjacent compute node assigned to the first group; and notifying a user, by each of the compute nodes assigned to the second group, whether the first test message was received.
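
    A much-simplified sketch of the idea (details assumed, not quoted from the patent): checkerboard-assign the nodes of a small 2D mesh to two groups, let first-group nodes send a test message over each adjacent link, and have second-group nodes report any adjacent link on which no message arrived.

```python
# Simplified simulation of the disclosed scheme on a small 2D mesh.
ROWS, COLS = 4, 4
failed_links = {((1, 1), (1, 2))}           # pretend this one link is broken

def group(node):
    r, c = node
    return 1 if (r + c) % 2 == 0 else 2     # adjacent nodes land in different groups

def neighbors(node):
    r, c = node
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < ROWS and 0 <= nc < COLS:
            yield (nr, nc)

def link_ok(a, b):
    return (a, b) not in failed_links and (b, a) not in failed_links

# Group-1 nodes send a test message on each adjacent link; group-2 nodes
# record which neighbours they heard from.
received = {}
for node in [(r, c) for r in range(ROWS) for c in range(COLS) if group((r, c)) == 1]:
    for nb in neighbors(node):
        if link_ok(node, nb):
            received.setdefault(nb, set()).add(node)

# Group-2 nodes notify the user about links that carried no test message.
for node in [(r, c) for r in range(ROWS) for c in range(COLS) if group((r, c)) == 2]:
    for nb in neighbors(node):               # every neighbour here is in group 1
        if nb not in received.get(node, set()):
            print(f"possible link failure between {nb} and {node}")
```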

  9. Brief Survey of TSC Computing Facilities

    DOT National Transportation Integrated Search

    1972-05-01

    The Transportation Systems Center (TSC) has four, essentially separate, in-house computing facilities. We shall call them the Honeywell Facility, the Hybrid Facility, the Multimode Simulation Facility, and the Central Facility. In addition to these four,...

  10. Path-Integration Computation of the Transport Properties of Polymers Nanoparticles and Complex Biological Structures

    NASA Astrophysics Data System (ADS)

    Douglas, Jack

    2014-03-01

    One of the things that puzzled me when I was a PhD student working under Karl Freed was the curious unity between the theoretical descriptions of excluded volume interactions in polymers, the hydrodynamic properties of polymers in solution, and the critical properties of fluid mixtures, gases and diverse other materials (magnets, superfluids, etc.) when these problems were formally expressed in terms of Wiener path integration and the interactions treated through a combination of epsilon expansion and renormalization group (RG) theory. It seemed that only the interaction labels changed from one problem to the other. What do these problems have in common? Essential clues to these interrelations became apparent when Karl Freed, myself and Shi-Qing Wang together began to study polymers interacting with hyper-surfaces of continuously variable dimension, where the Feynman perturbation expansions could be performed through infinite order so that we could really understand what the RG theory was doing. It is evidently simply a particular method for resumming perturbation theory, and former ambiguities no longer existed. An integral equation extension of this type of exact calculation to "surfaces" of arbitrary fixed shape finally revealed the central mathematical object that links these diverse physical models: the capacity of polymer chains, whose value vanishes at the critical dimension of 4 and whose magnitude is linked to the friction coefficient of polymer chains, the virial coefficient of polymers, and the 4-point function of the phi-4 field theory. Once this central object was recognized, it then became possible to solve diverse problems in materials science through the calculation of capacity, and related "virial" properties, through Monte Carlo sampling of random-walk paths. The essential ideas of this computational method are discussed and some applications given to non-trivial problems: nanotubes treated as either rigid rods or ensembles of worm-like chains having finite cross-section, DNA, nanoparticles with grafted chain layers, and knotted polymers. The path-integration method, which grew up from research in Karl Freed's group, is evidently a powerful tool for computing basic transport properties of complex-shaped objects and should find increasing application in polymer science, nanotechnological applications and biology.
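
    As a toy, hedged illustration of the "capacity from random-walk paths" idea (not the speaker's code, and using an assumed geometry), the sketch below estimates the electrostatic capacity of a sphere of radius a by launching walk-on-spheres trajectories from a concentric launch sphere of radius R and counting the fraction that hit the object before escaping to infinity; for this concentric special case the estimator C ≈ R × p_hit is essentially exact and the true answer is C = a.

```python
import numpy as np

# Capacity of a sphere of radius a via hitting probabilities of random walks.
rng = np.random.default_rng(1)
a, R = 1.0, 3.0          # object radius and launch-sphere radius (assumed)
eps = 1e-4               # capture tolerance at the object surface
n_walkers = 5000

def random_unit_vector():
    v = rng.normal(size=3)
    return v / np.linalg.norm(v)

hits = 0
for _ in range(n_walkers):
    x = R * random_unit_vector()               # start on the launch sphere
    while True:
        r = np.linalg.norm(x)
        if r - a < eps:                        # captured by the object
            hits += 1
            break
        if r > R:
            # Return to the launch sphere with probability R/r, otherwise the
            # walker escapes to infinity (exact for this concentric geometry).
            if rng.random() < R / r:
                x = R * random_unit_vector()
            else:
                break
        else:
            # Walk on spheres: jump to the largest sphere avoiding the object.
            x = x + (r - a) * random_unit_vector()

print("estimated capacity:", R * hits / n_walkers, " (exact value: 1.0)")
```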

  11. Characterizing Crowd Participation and Productivity of Foldit Through Web Scraping

    DTIC Science & Technology

    2016-03-01

    Acronyms defined in the excerpt: BOINC, Berkeley Open Infrastructure for Network Computing; CDF, Cumulative Distribution Function; CPU, Central Processing Unit; CSSG, Crowdsourced Serious Game. ... computers at once can create a similar capacity. According to Anderson [6], principal investigator for the Berkeley Open Infrastructure for Network ... extraterrestrial life. From this project, a software-based distributed computing platform called the Berkeley Open Infrastructure for Network Computing ...

  12. Disaster recovery plan for HANDI 2000 business management system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, D.E.

    The BMS production implementation will be complete by October 1, 1998, and the server environment will consist of two types of platforms. The PassPort Supply and the PeopleSoft Financials will reside on UNIX servers, and the PeopleSoft Human Resources and Payroll will reside on Microsoft NT servers. Because of the wide scope and the requirements of the COTS products to run in various environments, backup and recovery responsibilities are divided between two groups in Technical Operations. The Central Computer Systems Management group provides support for the UNIX/NT Backup Data Center, and the Network Infrastructure Systems group provides support for the NT Application Server Backup outside the Data Center. The disaster recovery process is dependent on a good backup and recovery process. Information and integrated system data for determining the disaster recovery process is identified from the Fluor Daniel Hanford (FDH) Risk Assessment Plan, Contingency Plan, Backup and Recovery Plan, and Backup Form for HANDI 2000 BMS.

  13. Solving multiconstraint assignment problems using learning automata.

    PubMed

    Horn, Geir; Oommen, B John

    2010-02-01

    This paper considers the NP-hard problem of object assignment with respect to multiple constraints: assigning a set of elements (or objects) into mutually exclusive classes (or groups), where the elements which are "similar" to each other are hopefully located in the same class. The literature reports solutions in which the similarity constraint consists of a single index that is inappropriate for the type of multiconstraint problems considered here and where the constraints could simultaneously be contradictory. This feature, where we permit possibly contradictory constraints, distinguishes this paper from the state of the art. Indeed, we are aware of no learning automata (or other heuristic) solutions which solve this problem in its most general setting. Such a scenario is illustrated with the static mapping problem, which consists of distributing the processes of a parallel application onto a set of computing nodes. This is a classical and yet very important problem within the areas of parallel computing, grid computing, and cloud computing. We have developed four learning-automata (LA)-based algorithms to solve this problem: First, a fixed-structure stochastic automata algorithm is presented, where the processes try to form pairs to go onto the same node. This algorithm solves the problem, although it requires some centralized coordination. As it is desirable to avoid centralized control, we subsequently present three different variable-structure stochastic automata (VSSA) algorithms, which have superior partitioning properties in certain settings, although they forfeit some of the scalability features of the fixed-structure algorithm. All three VSSA algorithms model the processes as automata having first the hosting nodes as possible actions; second, the processes as possible actions; and, third, attempting to estimate the process communication digraph prior to probabilistically mapping the processes. This paper, which, we believe, comprehensively reports the pioneering LA solutions to this problem, unequivocally demonstrates that LA can play an important role in solving complex combinatorial and integer optimization problems.
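
    The four LA algorithms themselves are not reproduced in the abstract; as a generic building block of the kind referred to, the sketch below implements a textbook linear reward-inaction (L_RI) variable-structure learning automaton and lets it converge on the action that is rewarded most often. The environment and all parameters are assumptions for illustration.

```python
import random

class LinearRewardInaction:
    """Textbook variable-structure learning automaton with the L_RI update:
    on reward, probability mass moves toward the chosen action; on penalty,
    the probabilities are left unchanged."""
    def __init__(self, n_actions, learning_rate=0.1):
        self.p = [1.0 / n_actions] * n_actions
        self.lr = learning_rate

    def choose(self):
        return random.choices(range(len(self.p)), weights=self.p)[0]

    def update(self, action, rewarded):
        if not rewarded:
            return                            # inaction on penalty
        for i in range(len(self.p)):
            if i == action:
                self.p[i] += self.lr * (1.0 - self.p[i])
            else:
                self.p[i] *= (1.0 - self.lr)

if __name__ == "__main__":
    # Toy environment: action 2 is rewarded 80% of the time, the others 20%.
    def env(a):
        return random.random() < (0.8 if a == 2 else 0.2)

    la = LinearRewardInaction(n_actions=4)
    for _ in range(2000):
        a = la.choose()
        la.update(a, env(a))
    print([round(x, 3) for x in la.p])        # mass should concentrate on action 2
```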

  14. Non-invasive detection of infection in acute pancreatic and acute necrotic collections with diffusion-weighted magnetic resonance imaging: preliminary findings.

    PubMed

    Islim, Filiz; Salik, Aysun Erbahceci; Bayramoglu, Sibel; Guven, Koray; Alis, Halil; Turhan, Ahmet Nuray

    2014-06-01

    The purpose of this study was to evaluate the contribution of diffusion-weighted magnetic resonance imaging (DW-MRI) to the detection of infection in acute pancreatitis-related collections. A total of 21 DW-MRI and computed tomography (CT) examinations were performed on 20 patients diagnosed with acute pancreatitis with acute peri-pancreatic fluid or necrotic collections. Collections were classified as infected or sterile according to the culture and follow-up results. Collections with gas bubbles on CT images were considered to be infected. Collections with peripheral bright signals on DW-MRI images were considered to be positive, whereas those without such signals were considered to be negative. Apparent diffusion coefficient (ADC) values of the peripheral and central parts of the collections were measured. Student's t test was used to compare the mean ADC values of independent groups. Apart from one false positive result, the presence of infection was detected by DW-MRI with 95.2% accuracy. The sensitivity and accuracy of DW-MRI were higher than those of CT for the detection of infection. The ADC values in the central parts of the collections were significantly different between the infected and sterile groups. DW-MRI can be used as a non-invasive technique for the detection of infection in acute pancreatitis-associated collections.

  15. Multigrid Methods for the Computation of Propagators in Gauge Fields

    NASA Astrophysics Data System (ADS)

    Kalkreuter, Thomas

    Multigrid methods were invented for the solution of discretized partial differential equations in order to overcome the slowness of traditional algorithms by updates on various length scales. In the present work generalizations of multigrid methods for propagators in gauge fields are investigated. Gauge fields are incorporated in algorithms in a covariant way. The kernel C of the restriction operator which averages from one grid to the next coarser grid is defined by projection on the ground-state of a local Hamiltonian. The idea behind this definition is that the appropriate notion of smoothness depends on the dynamics. The ground-state projection choice of C can be used in arbitrary dimension and for arbitrary gauge group. We discuss proper averaging operations for bosons and for staggered fermions. The kernels C can also be used in multigrid Monte Carlo simulations, and for the definition of block spins and blocked gauge fields in Monte Carlo renormalization group studies. Actual numerical computations are performed in four-dimensional SU(2) gauge fields. We prove that our proposals for block spins are “good”, using renormalization group arguments. A central result is that the multigrid method works in arbitrarily disordered gauge fields, in principle. It is proved that computations of propagators in gauge fields without critical slowing down are possible when one uses an ideal interpolation kernel. Unfortunately, the idealized algorithm is not practical, but it was important to answer questions of principle. Practical methods are able to outperform the conjugate gradient algorithm in case of bosons. The case of staggered fermions is harder. Multigrid methods give considerable speed-ups compared to conventional relaxation algorithms, but on lattices up to 18^4, conjugate gradient is superior.

  16. 3-dimensional structure of the Indian Ocean inferred from long period surface waves

    NASA Astrophysics Data System (ADS)

    Montagner, Jean-Paul

    1986-04-01

    To improve the lateral resolution of the first global 3-dimensional models of seismic wave velocities, regional studies have to be undertaken. The dispersion of Rayleigh waves along 86 paths across the Indian Ocean and surrounding regions is investigated in the period range 40-300 s. The regionalization of group velocity according to the age of the sea floor shows an increase of velocity with age up to 150 s only, similar to the results in the Pacific Ocean. But here, this relationship vanishes more quickly at long period. Therefore the correlation of the deep structure with surface tectonics seems to be shallower in the Indian Ocean than in the Pacific Ocean. A tomographic method is applied to compute the geographical distributions of group velocity and azimuthal anisotropy and then the 3-D structure of S-wave velocity. Horizontal wavelengths of 2000 km for velocity and 3000 km for azimuthal anisotropy distribution can be resolved. Except for the central part of the South East Indian ridge which displays high velocities at all depths, the inversion corroborates a good correlation between lithospheric structure down to 120 km and surface tectonics: low velocities along the central and southeast Indian ridges, velocity increasing with the age of the sea floor, high velocities under African, Indian and Australian shields. At greater depths, the low velocity zones under the Gulf of Aden and the western part of the Southeast Indian ridges hold but the low velocity anomaly of the Central Indian ridge is offset eastward. The low velocity anomalies suggest uprising material and complex plate boundary.

  17. Galaxy-galaxy lensing in EAGLE: comparison with data from 180 deg2 of the KiDS and GAMA surveys

    NASA Astrophysics Data System (ADS)

    Velliscig, Marco; Cacciato, Marcello; Hoekstra, Henk; Schaye, Joop; Heymans, Catherine; Hildebrandt, Hendrik; Loveday, Jon; Norberg, Peder; Sifón, Cristóbal; Schneider, Peter; van Uitert, Edo; Viola, Massimo; Brough, Sarah; Erben, Thomas; Holwerda, Benne W.; Hopkins, Andrew M.; Kuijken, Konrad

    2017-11-01

    We present predictions for the galaxy-galaxy lensing (GGL) profile from the EAGLE hydrodynamical cosmological simulation at redshift z = 0.18, in the spatial range 0.02 < R/(h^-1 Mpc) < 2, and for five logarithmically equispaced stellar mass bins in the range 10.3 < log10(Mstar/M⊙) < 11.8. We compare these excess surface density profiles to the observed signal from background galaxies imaged by the Kilo Degree Survey around spectroscopically confirmed foreground galaxies from the Galaxy And Mass Assembly (GAMA) survey. Exploiting the GAMA galaxy group catalogue, the profiles of central and satellite galaxies are computed separately for groups with at least five members to minimize contamination. EAGLE predictions are in broad agreement with the observed profiles for both central and satellite galaxies, although the signal is underestimated at R ≈ 0.5-2 h^-1 Mpc for the highest stellar mass bins. When central and satellite galaxies are considered simultaneously, agreement is found only when the selection function of lens galaxies is taken into account in detail. Specifically, in the case of GAMA galaxies, it is crucial to account for the variation of the fraction of satellite galaxies in bins of stellar mass induced by the flux-limited nature of the survey. We report the inferred stellar-to-halo mass relation and we find good agreement with recent published results. We note how the precision of the GGL profiles in the simulation holds the potential to constrain fine-grained aspects of the galaxy-dark matter connection.

  18. Genetic Divergence Disclosing a Rapid Prehistorical Dispersion of Native Americans in Central and South America

    PubMed Central

    He, Yungang; Wang, Wei R.; Li, Ran; Wang, Sijia; Jin, Li

    2012-01-01

    An accurate estimate of the divergence time between Native Americans is important for understanding the initial entry and early dispersion of human beings in the New World. Current methods for estimating the genetic divergence time of populations could seriously depart from a linear relationship with the true divergence for multiple populations of different population sizes and with significant population expansion. Here, to address this problem, we propose a novel measure to estimate the genetic divergence time of populations. Computer simulation revealed that the new measure maintained an excellent linear correlation with the population divergence time in complicated multi-population scenarios with population expansion. Utilizing the new measure and microsatellite data of 21 Native American populations, we investigated the genetic divergences of the Native American populations. The results indicated that genetic divergences between North American populations are greater than those between Central and South American populations. None of the divergences, however, were large enough to constitute convincing evidence supporting the two-wave or multi-wave migration model for the initial entry of human beings into America. The genetic affinity of the Native American populations was further explored using Neighbor-Net and the genetic divergences suggested that these populations could be categorized into four genetic groups living in four different ecologic zones. The divergence of the population groups suggests that the early dispersion of human beings in America was a multi-step procedure. Further, the divergences suggest the rapid dispersion of Native Americans in Central and South America after a long standstill period in North America. PMID:22970308

  19. Social structure of a semi-free ranging group of mandrills (Mandrillus sphinx): a social network analysis.

    PubMed

    Bret, Céline; Sueur, Cédric; Ngoubangoye, Barthélémy; Verrier, Delphine; Deneubourg, Jean-Louis; Petit, Odile

    2013-01-01

    The difficulty involved in following mandrills in the wild means that very little is known about social structure in this species. Most studies initially considered mandrill groups to be an aggregation of one-male/multifemale units, with males occupying central positions in a structure similar to those observed in the majority of baboon species. However, a recent study hypothesized that mandrills form stable groups with only two or three permanent males, and that females occupy more central positions than males within these groups. We used social network analysis methods to examine how a semi-free ranging group of 19 mandrills is structured. We recorded all dyads of individuals that were in contact as a measure of association. The betweenness and the eigenvector centrality for each individual were calculated and correlated to kinship, age and dominance. Finally, we performed a resilience analysis by simulating the removal of individuals displaying the highest betweenness and eigenvector centrality values. We found that related dyads were more frequently associated than unrelated dyads. Moreover, our results showed that the cumulative distribution of individual betweenness and eigenvector centrality followed a power function, which is characteristic of scale-free networks. This property showed that some group members, mostly females, occupied a highly central position. Finally, the resilience analysis showed that the removal of the two most central females split the network into small subgroups and increased the network diameter. Critically, this study confirms that females appear to occupy more central positions than males in mandrill groups. Consequently, these females appear to be crucial for group cohesion and probably play a pivotal role in this species.
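
    The analysis pipeline described here (betweenness and eigenvector centrality followed by a removal-based resilience test) can be reproduced on any association network; the sketch below runs it with networkx on a stand-in graph rather than the study's mandrill data.

```python
import networkx as nx

# Stand-in association network (networkx's built-in karate club graph), used
# only to illustrate the pipeline: centralities, then a resilience test that
# removes the most central individuals.
G = nx.karate_club_graph()

betweenness = nx.betweenness_centrality(G)
eigenvector = nx.eigenvector_centrality(G, max_iter=1000)

top = sorted(G.nodes, key=lambda n: betweenness[n], reverse=True)[:2]
print("most central individuals:", top)

# Resilience test: remove the two most central nodes and inspect cohesion.
H = G.copy()
H.remove_nodes_from(top)
components = list(nx.connected_components(H))
print("components after removal:", len(components))
largest = H.subgraph(max(components, key=len))
print("diameter of largest remaining component:", nx.diameter(largest))
```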

  20. Group quenching and galactic conformity at low redshift

    NASA Astrophysics Data System (ADS)

    Treyer, M.; Kraljic, K.; Arnouts, S.; de la Torre, S.; Pichon, C.; Dubois, Y.; Vibert, D.; Milliard, B.; Laigle, C.; Seibert, M.; Brown, M. J. I.; Grootes, M. W.; Wright, A. H.; Liske, J.; Lara-Lopez, M. A.; Bland-Hawthorn, J.

    2018-06-01

    We quantify the quenching impact of the group environment using the spectroscopic survey Galaxy and Mass Assembly to z ~ 0.2. The fraction of red (quiescent) galaxies, whether in groups or isolated, increases with both stellar mass and large-scale (5 Mpc) density. At fixed stellar mass, the red fraction is on average higher for satellites of red centrals than of blue (star-forming) centrals, a galactic conformity effect that increases with density. Most of the signal originates from groups that have the highest stellar mass, reside in the densest environments, and have massive, red only centrals. Assuming a colour-dependent halo-to-stellar-mass ratio, whereby red central galaxies inhabit significantly more massive haloes than blue ones of the same stellar mass, two regimes emerge more distinctly: at log (Mhalo/M⊙) ≲ 13, central quenching is still ongoing, conformity is no longer existent, and satellites and group centrals exhibit the same quenching excess over field galaxies at all mass and density, in agreement with the concept of `group quenching'; at log (Mh/M⊙) ≳ 13, a cut-off that sets apart massive (log (M⋆/M⊙) > 11), fully quenched group centrals, conformity is meaningless, and satellites undergo significantly more quenching than their counterparts in smaller haloes. The latter effect strongly increases with density, giving rise to the density-dependent conformity signal when both regimes are mixed. The star formation of blue satellites in massive haloes is also suppressed compared to blue field galaxies, while blue group centrals and the majority of blue satellites, which reside in low-mass haloes, show no deviation from the colour-stellar mass relation of blue field galaxies.

  1. Social Structure of a Semi-Free Ranging Group of Mandrills (Mandrillus sphinx): A Social Network Analysis

    PubMed Central

    Bret, Céline; Sueur, Cédric; Ngoubangoye, Barthélémy; Verrier, Delphine; Deneubourg, Jean-Louis; Petit, Odile

    2013-01-01

    The difficulty involved in following mandrills in the wild means that very little is known about social structure in this species. Most studies initially considered mandrill groups to be an aggregation of one-male/multifemale units, with males occupying central positions in a structure similar to those observed in the majority of baboon species. However, a recent study hypothesized that mandrills form stable groups with only two or three permanent males, and that females occupy more central positions than males within these groups. We used social network analysis methods to examine how a semi-free ranging group of 19 mandrills is structured. We recorded all dyads of individuals that were in contact as a measure of association. The betweenness and the eigenvector centrality for each individual were calculated and correlated to kinship, age and dominance. Finally, we performed a resilience analysis by simulating the removal of individuals displaying the highest betweenness and eigenvector centrality values. We found that related dyads were more frequently associated than unrelated dyads. Moreover, our results showed that the cumulative distribution of individual betweenness and eigenvector centrality followed a power function, which is characteristic of scale-free networks. This property showed that some group members, mostly females, occupied a highly central position. Finally, the resilience analysis showed that the removal of the two most central females split the network into small subgroups and increased the network diameter. Critically, this study confirms that females appear to occupy more central positions than males in mandrill groups. Consequently, these females appear to be crucial for group cohesion and probably play a pivotal role in this species. PMID:24340074

  2. The Relation Between Injury of the Spinothalamocortical Tract and Central Pain in Chronic Patients With Mild Traumatic Brain Injury.

    PubMed

    Kim, Jin Hyun; Ahn, Sang Ho; Cho, Yoon Woo; Kim, Seong Ho; Jang, Sung Ho

    2015-01-01

    Little is known about the pathogenetic etiology of central pain in patients with traumatic brain injury (TBI). We investigated the relation between injury of the spinothalamocortical tract (STT) and chronic central pain in patients with mild TBI. Retrospective survey. We recruited 40 consecutive chronic patients with mild TBI and 21 normal control subjects: 8 patients were excluded by the inclusion criteria and the remaining 32 patients were finally recruited. The patients were classified according to 2 groups based on the presence of central pain: the pain group (22 patients) and the nonpain group (10 patients). Diffusion tensor tractography for the STT was performed using the Functional Magnetic Resonance Imaging of the Brain Software Library. Values of fractional anisotropy (FA), mean diffusivity (MD), and tract volume of each STT were measured. Lower FA value and tract volume were observed in the pain group than in the nonpain group and the control group (P < .05). By contrast, higher MD value was observed in the pain group than in the nonpain group and the control group (P < .05). However, no significant differences in all diffusion tensor imaging parameters were observed between the nonpain group and the control group (P > .05). Decreased FA and tract volume and increased MD of the STTs in the pain group appeared to indicate injury of the STT. As a result, we found that injury of the STT is related to the occurrence of central pain in patients with mild TBI. We believe that injury of the STT is a pathogenetic etiology of central pain following mild TBI.

  3. Guide to computing at ANL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peavler, J.

    1979-06-01

    This publication gives details about hardware, software, procedures, and services of the Central Computing Facility, as well as information about how to become an authorized user. Languages, compilers, libraries, and applications packages available are described. 17 tables. (RWR)

  4. All biology is computational biology.

    PubMed

    Markowetz, Florian

    2017-03-01

    Here, I argue that computational thinking and techniques are so central to the quest of understanding life that today all biology is computational biology. Computational biology brings order into our understanding of life, it makes biological concepts rigorous and testable, and it provides a reference map that holds together individual insights. The next modern synthesis in biology will be driven by mathematical, statistical, and computational methods being absorbed into mainstream biological training, turning biology into a quantitative science.

  5. Observations of environmental quenching in groups in the 11 Gyr since z = 2.5: Different quenching for central and satellite galaxies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tal, Tomer; Illingworth, Garth D.; Magee, Daniel

    2014-07-10

    We present direct observational evidence for star formation quenching in galaxy groups in the redshift range 0 < z < 2.5. We utilize a large sample of nearly 6000 groups, selected by fixed cumulative number density from three photometric catalogs, to follow the evolving quiescent fractions of central and satellite galaxies over roughly 11 Gyr. At z ∼ 0, central galaxies in our sample range in stellar mass from Milky Way/M31 analogs (M*/M⊙ = 6.5 × 10^10) to nearby massive ellipticals (M*/M⊙ = 1.5 × 10^11). Satellite galaxies in the same groups reach masses as low as twice that of the Large Magellanic Cloud (M*/M⊙ = 6.5 × 10^9). Using statistical background subtraction, we measure the average rest-frame colors of galaxies in our groups and calculate the evolving quiescent fractions of centrals and satellites over seven redshift bins. Our analysis shows clear evidence for star formation quenching in group halos, with a different quenching onset for centrals and their satellite galaxies. Using halo mass estimates for our central galaxies, we find that star formation shuts off in centrals when typical halo masses reach between 10^12 and 10^13 M⊙, consistent with predictions from the halo quenching model. In contrast, satellite galaxies in the same groups most likely undergo quenching by environmental processes, whose onset is delayed with respect to their central galaxy. Although star formation is suppressed in all galaxies over time, the processes that govern quenching are different for centrals and satellites. While mass plays an important role in determining the star formation activity of central galaxies, quenching in satellite galaxies is dominated by the environment in which they reside.

  6. Observations of Environmental Quenching in Groups in the 11 GYR Since z = 2.5: Different Quenching For Central and Satellite Galaxies

    NASA Technical Reports Server (NTRS)

    Tal, Tomer; Dekel, Avishai; Marchesini, Danilo; Momcheva, Ivelina; Nelson, Erica J.; Patel, Shannon G.; Quadri, Ryan F.; Rix, Hans-Walter; Skelton, Rosalind E.; Wake, David A.

    2014-01-01

    We present direct observational evidence for star formation quenching in galaxy groups in the redshift range 0 < z < 2.5. We utilize a large sample of nearly 6000 groups, selected by fixed cumulative number density from three photometric catalogs, to follow the evolving quiescent fractions of central and satellite galaxies over roughly 11 Gyr. At z ≈ 0, central galaxies in our sample range in stellar mass from Milky Way/M31 analogs (M = 6.5 × 10^10 solar masses) to nearby massive ellipticals (M = 1.5 × 10^11 solar masses). Satellite galaxies in the same groups reach masses as low as twice that of the Large Magellanic Cloud (M = 6.5 × 10^9 solar masses). Using statistical background subtraction, we measure the average rest-frame colors of galaxies in our groups and calculate the evolving quiescent fractions of centrals and satellites over seven redshift bins. Our analysis shows clear evidence for star formation quenching in group halos, with a different quenching onset for centrals and their satellite galaxies. Using halo mass estimates for our central galaxies, we find that star formation shuts off in centrals when typical halo masses reach between 10^12 and 10^13 solar masses, consistent with predictions from the halo quenching model. In contrast, satellite galaxies in the same groups most likely undergo quenching by environmental processes, whose onset is delayed with respect to their central galaxy. Although star formation is suppressed in all galaxies over time, the processes that govern quenching are different for centrals and satellites. While mass plays an important role in determining the star formation activity of central galaxies, quenching in satellite galaxies is dominated by the environment in which they reside.

  7. Computer-based medical education in Benha University, Egypt: knowledge, attitude, limitations, and suggestions.

    PubMed

    Bayomy, Hanaa; El Awadi, Mona; El Araby, Eman; Abed, Hala A

    2016-12-01

    Computer-assisted medical education has been developed to enhance learning and enable high-quality medical care. This study aimed to assess computer knowledge and attitude toward the inclusion of computers in medical education among second-year medical students in Benha Faculty of Medicine, Egypt, to identify limitations, and obtain suggestions for successful computer-based learning. This was a one-group pre-post-test study, which was carried out on second-year students in Benha Faculty of Medicine. A structured self-administered questionnaire was used to compare students' knowledge, attitude, limitations, and suggestions toward computer usage in medical education before and after the computer course to evaluate the change in students' responses. The majority of students were familiar with use of the mouse and keyboard, basic word processing, internet and web searching, and e-mail both before and after the computer course. The proportion of students who were familiar with software programs other than word processing and with troubleshooting software/hardware was significantly higher after the course (P<0.001). There was a significant increase in the proportion of students who agreed on owning a computer (P=0.008), the inclusion of a computer skills course in medical education, downloading lecture handouts, and computer-based exams (P<0.001) after the course. After the course, there was a significant increase in the proportion of students who agreed that the lack of central computers limited the inclusion of computers in medical education (P<0.001). Although the lack of computer labs, lack of Information Technology staff mentoring, the large number of students, unclear course outlines, and lack of internet access were more frequently reported before the course (P<0.001), the majority of students suggested the provision of computer labs, inviting Information Technology staff to support computer teaching, and the availability of free Wi-Fi internet access covering several areas in the university campus; all would support computer-assisted medical education. Medical students in Benha University are computer literate, which allows for computer-based medical education. Staff training, provision of computer labs, and internet access are essential requirements for enhancing computer usage in medical education in the university.

  8. A Series of Molecular Dynamics and Homology Modeling Computer Labs for an Undergraduate Molecular Modeling Course

    ERIC Educational Resources Information Center

    Elmore, Donald E.; Guayasamin, Ryann C.; Kieffer, Madeleine E.

    2010-01-01

    As computational modeling plays an increasingly central role in biochemical research, it is important to provide students with exposure to common modeling methods in their undergraduate curriculum. This article describes a series of computer labs designed to introduce undergraduate students to energy minimization, molecular dynamics simulations,…

  9. All about Reading and Technology.

    ERIC Educational Resources Information Center

    Karbal, Harold, Ed.

    1985-01-01

    The central theme in this journal issue is the use of the computer in teaching reading. The following articles are included: "The Use of Computers in the Reading Program: A District Approach" by Nora Forester; "Reading and Computers: A Partnership" by Dr. Martha Irwin; "Rom, Ram and Reason" by Candice Carlile; "Word Processing: Practical Ideas and…

  10. Long Range Planning for Computer Use--A Task Force Model.

    ERIC Educational Resources Information Center

    Raucher, S. M.; Koehler, T. J.

    A Management Operations Review and Evaluation (MORE) study of the Department of Management Information and Computer Services, which was completed in the fall of 1980, strongly recommended that the Montgomery County Public Schools (MCPS) develop a long-range plan to meet the computer needs of schools and central offices. In response to this…

  11. French Plans for Fifth Generation Computer Systems.

    DTIC Science & Technology

    1984-12-07

    ... a centrally managed project in France that covers all facets of the ... French industry in electronics, computers, software, and services, and to make the ... Centre National de Recherche Scientifique (CNRS) Cooperative Research ... of Japan's Fifth Generation Project, the French scientific and industrial com- ... systems, man-computer interaction, novel computer structures, knowledge-based computer systems ... The National Projects ... The French Ministry of Research and ...

  12. A Distributed Computing Framework for Real-Time Detection of Stress and of Its Propagation in a Team.

    PubMed

    Pandey, Parul; Lee, Eun Kyung; Pompili, Dario

    2016-11-01

    Stress is one of the key factors that impact the quality of our daily life: from productivity and efficiency in production processes to the ability of (civilian and military) individuals to make rational decisions. Also, stress can propagate from one individual to others working in close proximity or toward a common goal, e.g., in a military operation or workforce. Real-time assessment of the stress of individuals alone is, however, not sufficient, as understanding its source and the direction in which it propagates in a group of people is equally, if not more, important. A continuous, near real-time, in situ personal stress monitoring system to quantify the stress level of individuals and its direction of propagation in a team is envisioned. However, stress monitoring of an individual via his/her mobile device may not always be possible for extended periods of time due to the limited battery capacity of these devices. To overcome this challenge, a novel distributed mobile computing framework is proposed to organize the resources in the vicinity and form a mobile device cloud that enables offloading of computation tasks in the stress detection algorithm from resource-constrained devices (low residual battery, limited CPU cycles) to resource-rich devices. Our framework also supports computation parallelization and workflows, defining how data and tasks are divided and assigned among the entities of the framework. The direction of propagation and magnitude of influence of stress in a group of individuals are studied by applying real-time, in situ analysis of Granger causality. Tangible benefits (in terms of energy expenditure and execution time) of the proposed framework in comparison to a centralized framework are presented via thorough simulations and real experiments.
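
    The abstract does not give the estimation details, so the following is only an illustrative sketch: two synthetic stress series in which person A's stress leads person B's by one step, followed by a Granger-causality test of the direction A to B using statsmodels. All coefficients and series lengths are assumptions for the example.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

# Synthetic example only: A's stress is an AR(1) process and B's stress
# depends on A's stress at the previous step, so A should be found to
# "Granger-cause" B.
rng = np.random.default_rng(0)
n = 400
stress_a = np.zeros(n)
stress_b = np.zeros(n)
for t in range(1, n):
    stress_a[t] = 0.5 * stress_a[t - 1] + rng.normal()
    stress_b[t] = 0.4 * stress_b[t - 1] + 0.8 * stress_a[t - 1] + rng.normal()

# grangercausalitytests expects an (n, 2) array and tests whether the SECOND
# column Granger-causes the FIRST.
data = np.column_stack([stress_b, stress_a])
results = grangercausalitytests(data, maxlag=2, verbose=False)
f_stat, p_value, _, _ = results[1][0]["ssr_ftest"]   # lag-1 F test
print(f"A -> B: F = {f_stat:.1f}, p = {p_value:.2g}")
```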

  13. High order filtering methods for approximating hyperbolic systems of conservation laws

    NASA Technical Reports Server (NTRS)

    Lafon, F.; Osher, S.

    1991-01-01

    The essentially nonoscillatory (ENO) schemes, while potentially useful in the computation of discontinuous solutions of hyperbolic conservation-law systems, are computationally costly relative to simple central-difference methods. A filtering technique is presented which employs central differencing of arbitrarily high-order accuracy except where a local test detects the presence of spurious oscillations and calls upon the full ENO apparatus to remove them. A factor-of-three speedup is thus obtained over the full-ENO method for a wide range of problems, with high-order accuracy in regions of smooth flow.
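
    The filtering idea lends itself to a short sketch: a fourth-order central difference is used for the spatial derivative everywhere, and a crude local oscillation detector switches to a first-order upwind difference where it fires. The detector, threshold, and low-order fallback below are illustrative stand-ins for the full ENO correction, not the authors' scheme.

        # Sketch of high-order central differencing with a local low-order fallback.
        import numpy as np

        def filtered_derivative(u, dx, a=1.0, tol=1.0):
            """Approximate du/dx on a periodic grid, switching stencils near oscillations."""
            up1, up2 = np.roll(u, -1), np.roll(u, -2)
            um1, um2 = np.roll(u, 1), np.roll(u, 2)

            central4 = (-up2 + 8.0 * up1 - 8.0 * um1 + um2) / (12.0 * dx)   # smooth regions
            upwind1 = (u - um1) / dx if a > 0 else (up1 - u) / dx           # flagged points

            # Crude detector: consecutive slopes change sign and differ by more than tol*dx.
            d_minus, d_plus = u - um1, up1 - u
            oscillatory = (d_minus * d_plus < 0) & (np.abs(d_plus - d_minus) > tol * dx)
            return np.where(oscillatory, upwind1, central4)

        # Example: a smooth wave plus a step; the detector should trip only near the jump.
        x = np.linspace(0.0, 1.0, 200, endpoint=False)
        u = np.sin(2 * np.pi * x) + (x > 0.5)
        dudx = filtered_derivative(u, dx=x[1] - x[0])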

  14. Computer Security for Commercial Nuclear Power Plants - Literature Review for Korea Hydro Nuclear Power Central Research Institute

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duran, Felicia Angelica; Waymire, Russell L.

    2013-10-01

    Sandia National Laboratories (SNL) is providing training and consultation activities on security planning and design for the Korea Hydro and Nuclear Power Central Research Institute (KHNP-CRI). As part of this effort, SNL performed a literature review on computer security requirements, guidance and best practices that are applicable to an advanced nuclear power plant. This report documents the review of reports generated by SNL and other organizations [U.S. Nuclear Regulatory Commission, Nuclear Energy Institute, and International Atomic Energy Agency] related to protection of information technology resources, primarily digital controls and computer resources and their data networks. Copies of the key documents have also been provided to KHNP-CRI.

  15. An efficient approach to CI: General matrix element formulas for spin-coupled particle-hole excitations

    NASA Astrophysics Data System (ADS)

    Tavan, Paul; Schulten, Klaus

    1980-03-01

    A new, efficient algorithm for the evaluation of the matrix elements of the CI Hamiltonian in the basis of spin-coupled ν-fold excitations (over orthonormal orbitals) is developed for even electron systems. For this purpose we construct an orthonormal, spin-adapted CI basis in the framework of second quantization. As a prerequisite, spin and space parts of the fermion operators have to be separated; this makes it possible to introduce the representation theory of the permutation group. The ν-fold excitation operators are Serber spin-coupled products of particle-hole excitations. This construction is also designed for CI calculations from multireference (open-shell) states. The 2N-electron Hamiltonian is expanded in terms of spin-coupled particle-hole operators which map any ν-fold excitation onto ν-, ν±1-, and ν±2-fold excitations. For the calculation of the CI matrix this leaves one with only the evaluation of overlap matrix elements between spin-coupled excitations. This leads to a set of ten general matrix element formulas which contain Serber representation matrices of the permutation group Sν×Sν as parameters. Because of the Serber structure of the CI basis these group-theoretical parameters are kept to a minimum such that they can be stored readily in the central memory of a computer for ν ≤ 4 and even for higher excitations. As the computational effort required to obtain the CI matrix elements from the general formulas is very small, the algorithm presented appears to constitute for even electron systems a promising alternative to existing CI methods for multiply excited configurations, e.g., the unitary group approach. Our method makes possible the adaptation of spatial symmetries and the selection of any subset of configurations. The algorithm has been implemented in a computer program and tested extensively for ν ≤ 4 and singlet ground and excited states.

  16. A parallel computational model for GATE simulations.

    PubMed

    Rannou, F R; Vega-Acevedo, N; El Bitar, Z

    2013-12-01

    GATE/Geant4 Monte Carlo simulations are computationally demanding applications, requiring thousands of processor hours to produce realistic results. The classical strategy of distributing the simulation of individual events does not apply efficiently for Positron Emission Tomography (PET) experiments, because it requires a centralized coincidence processing and large communication overheads. We propose a parallel computational model for GATE that handles event generation and coincidence processing in a simple and efficient way by decentralizing event generation and processing but maintaining a centralized event and time coordinator. The model is implemented with the inclusion of a new set of factory classes that can run the same executable in sequential or parallel mode. A Mann-Whitney test shows that the output produced by this parallel model in terms of number of tallies is equivalent (but not equal) to its sequential counterpart. Computational performance evaluation shows that the software is scalable and well balanced. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
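
    The equivalence check can be sketched as below, assuming per-crystal tally counts are available from a sequential and a parallel run; the Poisson counts are invented for illustration and are not GATE output.

        # Compare tallies from two runs with a Mann-Whitney U test (illustrative data).
        import numpy as np
        from scipy.stats import mannwhitneyu

        rng = np.random.default_rng(1)
        tallies_sequential = rng.poisson(lam=1000, size=64)   # hypothetical counts per detector element
        tallies_parallel = rng.poisson(lam=1000, size=64)

        stat, p_value = mannwhitneyu(tallies_sequential, tallies_parallel, alternative="two-sided")
        print(f"U = {stat:.0f}, p = {p_value:.3f}")
        # A large p-value is consistent with the two outputs being statistically equivalent,
        # as reported for the parallel model versus its sequential counterpart.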

  17. Computing at H1 - Experience and Future

    NASA Astrophysics Data System (ADS)

    Eckerlin, G.; Gerhards, R.; Kleinwort, C.; Krüner-Marquis, U.; Egli, S.; Niebergall, F.

    The H1 experiment has now been successfully operating at the electron proton collider HERA at DESY for three years. During this time the computing environment has gradually shifted from a mainframe oriented environment to the distributed server/client Unix world. This transition is now almost complete. Computing needs are largely determined by the present amount of 1.5 TB of reconstructed data per year (1994), corresponding to 1.2 × 10^7 accepted events. All data are centrally available at DESY. In addition to data analysis, which is done in all collaborating institutes, most of the centrally organized Monte Carlo production is performed outside of DESY. New software tools to cope with offline computing needs include CENTIPEDE, a tool for the use of distributed batch and interactive resources for Monte Carlo production, and H1 UNIX, a software package for automatic updates of H1 software on all UNIX platforms.

  18. Do individuals with autism process words in context? Evidence from language-mediated eye-movements.

    PubMed

    Brock, Jon; Norbury, Courtenay; Einav, Shiri; Nation, Kate

    2008-09-01

    It is widely argued that people with autism have difficulty processing ambiguous linguistic information in context. To investigate this claim, we recorded the eye-movements of 24 adolescents with autism spectrum disorder and 24 language-matched peers as they monitored spoken sentences for words corresponding to objects on a computer display. Following a target word, participants looked more at a competitor object sharing the same onset than at phonologically unrelated objects. This effect was, however, mediated by the sentence context such that participants looked less at the phonological competitor if it was semantically incongruous with the preceding verb. Contrary to predictions, the two groups evidenced similar effects of context on eye-movements. Instead, across both groups, the effect of sentence context was reduced in individuals with relatively poor language skills. Implications for the weak central coherence account of autism are discussed.

  19. Statistical aspects of solar flares

    NASA Technical Reports Server (NTRS)

    Wilson, Robert M.

    1987-01-01

    A survey of the statistical properties of 850 H alpha solar flares during 1975 is presented. Comparison of the results found here with those reported elsewhere for different epochs is accomplished. Distributions of rise time, decay time, and duration are given, as are the mean, mode, median, and 90th percentile values. Proportions by selected groupings are also determined. For flares in general, mean values for rise time, decay time, and duration are 5.2 ± 0.4 min and 18.1 ± 1.1 min, respectively. Subflares, accounting for nearly 90 percent of the flares, had mean values lower than those found for flares of H alpha importance greater than 1, and the differences are statistically significant. Likewise, flares of bright and normal relative brightness have mean values of decay time and duration that are significantly longer than those computed for faint flares, and mass-motion related flares are significantly longer than non-mass-motion related flares. Seventy-three percent of the mass-motion related flares are categorized as being a two-ribbon flare and/or being accompanied by a high-speed dark filament. Slow rise time flares (rise time greater than 5 min) have a mean value for duration that is significantly longer than that computed for fast rise time flares, and long-lived duration flares (duration greater than 18 min) have a mean value for rise time that is significantly longer than that computed for short-lived duration flares, suggesting a positive linear relationship between rise time and duration for flares. Monthly occurrence rates for flares in general and by group are found to be linearly related in a positive sense to monthly sunspot number. Statistical testing reveals the association between sunspot number and numbers of flares to be significant at the 95 percent level of confidence, and the t statistic for slope is significant at greater than the 99 percent level of confidence. Dependent upon the specific fit, between 58 percent and 94 percent of the variation can be accounted for with the linear fits. A statistically significant Northern Hemisphere flare excess (P less than 1 percent) was found, as was a Western Hemisphere excess (P approx 3 percent). Subflares were more prolific within 45 deg of central meridian (P less than 1 percent), while flares of H alpha importance ≥ 1 were more prolific near the limbs (greater than 45 deg from central meridian; P approx 2 percent). Two-ribbon flares were more frequent within 45 deg of central meridian (P less than 1 percent). Slow rise time flares occurred more frequently in the western hemisphere (P approx 2 percent), as did short-lived duration flares (P approx 9 percent), but fast rise time flares were not preferentially distributed (in terms of east-west or limb-disk). Long-lived duration flares occurred more often within 45 deg of central meridian (P approx 7 percent). Mean durations for subflares and flares of H alpha importance ≥ 1, found within 45 deg of central meridian, are 14 percent and 70 percent, respectively, longer than those found for flares closer to the limb. As compared to flares occurring near cycle maximum, the flares of 1975 (near solar minimum) have mean values of rise time, decay time, and duration that are significantly shorter. A flare near solar maximum, on average, is about 1.6 times longer than one occurring near solar minimum.
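
    The kind of linear fit and slope test described above can be sketched as follows; the monthly sunspot numbers and flare counts are invented stand-ins for the 1975 data.

        # Illustrative regression of monthly flare counts on monthly sunspot number.
        import numpy as np
        from scipy.stats import linregress

        sunspot_number = np.array([15, 20, 12, 18, 25, 30, 22, 17, 10, 14, 19, 23])   # hypothetical
        flare_count = np.array([55, 70, 40, 62, 88, 101, 75, 60, 35, 50, 66, 80])     # hypothetical

        fit = linregress(sunspot_number, flare_count)
        print(f"slope = {fit.slope:.2f} (p = {fit.pvalue:.4f}), r^2 = {fit.rvalue**2:.2f}")
        # fit.pvalue is the two-sided p-value of the t test that the slope is zero, and
        # r^2 is the fraction of variance explained by the linear fit.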

  20. 75 FR 60415 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-30

    ... computer systems and networks. This information collection is required to obtain the necessary data... card reflecting those benefits and privileges, and to maintain a centralized database of the eligible...

  1. Comparison of marginal and internal fit of 3-unit ceramic fixed dental prostheses made with either a conventional or digital impression.

    PubMed

    Su, Ting-Shu; Sun, Jian

    2016-09-01

    For 20 years, the intraoral digital impression technique has been applied to the fabrication of computer aided design and computer aided manufacturing (CAD-CAM) fixed dental prostheses (FDPs). Clinical fit is one of the main determinants of the success of an FDP. Studies of the clinical fit of 3-unit ceramic FDPs made by means of a conventional impression versus a digital impression technology are limited. The purpose of this in vitro study was to evaluate and compare the internal fit and marginal fit of CAD-CAM, 3-unit ceramic FDP frameworks fabricated from an intraoral digital impression and a conventional impression. A standard model was designed for a prepared maxillary left canine and second premolar and missing first premolar. The model was scanned with an intraoral digital scanner, exporting stereolithography (STL) files as the experimental group (digital group). The model was used to fabricate 10 stone casts that were scanned with an extraoral scanner, exporting STL files to a computer connected to the scanner as the control group (conventional group). The STL files were used to produce zirconia FDP frameworks with CAD-CAM. These frameworks were seated on the standard model and evaluated for marginal and internal fit. Each framework was segmented into 4 sections per abutment tooth, resulting in 8 sections per framework, and was observed using optical microscopy with ×50 magnification. Four measurement points were selected on each section as marginal discrepancy (P1), mid-axial wall (P2), axio-occlusal edge (P3), and central-occlusal point (P4). Mean marginal fit values of the digital group (64 ±16 μm) were significantly smaller than those of the conventional group (76 ±18 μm) (P<.05). The mean internal fit values of the digital group (111 ±34 μm) were significantly smaller than those of the conventional group (132 ±44 μm) (P<.05). CAD-CAM 3-unit zirconia FDP frameworks fabricated from intraoral digital and conventional impressions showed clinically acceptable marginal and internal fit. The marginal and internal fit of frameworks fabricated from the intraoral digital impression system were better than those fabricated from conventional impressions. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  2. Association between central obesity and circadian parameters of blood pressure from the korean ambulatory blood pressure monitoring registry: Kor-ABP registry.

    PubMed

    Kang, In Sook; Pyun, Wook Bum; Shin, Jinho; Kim, Ju Han; Kim, Soon Gil; Shin, Gil Ja

    2013-10-01

    Central obesity has been reported as a risk for atherosclerosis and metabolic syndrome. The influence of central obesity on diurnal blood pressure (BP) has not been established. In this study, we investigated the influence of central obesity on the circadian parameters of BP by 24 hr ambulatory BP monitoring. A total of 1,290 subjects were enrolled from the Korean Ambulatory BP registry. Central obesity was defined as a waist circumference ≥90 cm in males and ≥85 cm in females. The central-obese group had higher daytime systolic BP (SBP), nighttime SBP and diastolic BP (DBP) than the non-obese group (all, P<0.001). There were no differences in nocturnal dipping (ND) patterns between the groups. With respect to central obesity, female participants showed a higher BP mean difference (MD) than male participants (daytime SBP MD 5.28 vs 4.27, nighttime SBP MD 6.48 vs 2.72) and a wider pulse pressure (PP). Central obesity in the elderly (≥65 yr) was also associated with a higher BP MD than in the younger group (daytime SBP MD 8.23 vs 3.87, daytime DBP 4.10 vs 1.59). In conclusion, central obesity has no influence on nocturnal dipping patterns. However, higher SBP and wider PP are associated with central obesity, which is accentuated in women.

  3. SNS programming environment user's guide

    NASA Technical Reports Server (NTRS)

    Tennille, Geoffrey M.; Howser, Lona M.; Humes, D. Creig; Cronin, Catherine K.; Bowen, John T.; Drozdowski, Joseph M.; Utley, Judith A.; Flynn, Theresa M.; Austin, Brenda A.

    1992-01-01

    The computing environment is briefly described for the Supercomputing Network Subsystem (SNS) of the Central Scientific Computing Complex of NASA Langley. The major SNS computers are a CRAY-2, a CRAY Y-MP, a CONVEX C-210, and a CONVEX C-220. The software is described that is common to all of these computers, including: the UNIX operating system, computer graphics, networking utilities, mass storage, and mathematical libraries. Also described is file management, validation, SNS configuration, documentation, and customer services.

  4. Observations of Adolescent Peer Group Interactions as a Function of Within- and Between-Group Centrality Status

    ERIC Educational Resources Information Center

    Ellis, Wendy E.; Dumas, Tara M.; Mahdy, Jasmine C.; Wolfe, David A.

    2012-01-01

    Observations of adolescent (n = 258; M age = 15.45) peer group triads (n = 86) were analyzed to identify conversation and interaction styles as a function of within-group and between-group centrality status. Group members' discussions about hypothetical dilemmas were coded for agreements, disagreements, commands, and opinions. Interactions during…

  5. Space Tug Avionics Definition Study. Volume 5: Cost and Programmatics

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The baseline avionics system features a central digital computer that integrates the functions of all the space tug subsystems by means of a redundant digital data bus. The central computer consists of dual central processor units, dual input/output processors, and a fault tolerant memory, utilizing internal redundancy and error checking. Three electronically steerable phased arrays provide downlink transmission from any tug attitude directly to ground or via TDRS. Six laser gyros and six accelerometers in a dodecahedron configuration make up the inertial measurement unit. Both a scanning laser radar and a TV system, employing strobe lamps, are required as acquisition and docking sensors. Primary dc power at a nominal 28 volts is supplied from dual lightweight, thermally integrated fuel cells which operate from propellant grade reactants out of the main tanks.

  6. Sensitivity and specificity of auditory steady‐state response testing

    PubMed Central

    Rabelo, Camila Maia; Schochat, Eliane

    2011-01-01

    INTRODUCTION: The ASSR test is an electrophysiological test that evaluates, among other aspects, neural synchrony, based on the frequency or amplitude modulation of tones. OBJECTIVE: The aim of this study was to determine the sensitivity and specificity of auditory steady‐state response testing in detecting lesions and dysfunctions of the central auditory nervous system. METHODS: Seventy volunteers were divided into three groups: those with normal hearing; those with mesial temporal sclerosis; and those with central auditory processing disorder. All subjects underwent auditory steady‐state response testing of both ears at 500 Hz and 2000 Hz (frequency modulation, 46 Hz). The difference between auditory steady‐state response‐estimated thresholds and behavioral thresholds (audiometric evaluation) was calculated. RESULTS: Estimated thresholds were significantly higher in the mesial temporal sclerosis group than in the normal and central auditory processing disorder groups. In addition, the difference between auditory steady‐state response‐estimated and behavioral thresholds was greatest in the mesial temporal sclerosis group when compared to the normal group than in the central auditory processing disorder group compared to the normal group. DISCUSSION: Research focusing on central auditory nervous system (CANS) lesions has shown that individuals with CANS lesions present a greater difference between ASSR‐estimated thresholds and actual behavioral thresholds; ASSR‐estimated thresholds being significantly worse than behavioral thresholds in subjects with CANS insults. This is most likely because the disorder prevents the transmission of the sound stimulus from being in phase with the received stimulus, resulting in asynchronous transmitter release. Another possible cause of the greater difference between the ASSR‐estimated thresholds and the behavioral thresholds is impaired temporal resolution. CONCLUSIONS: The overall sensitivity of auditory steady‐state response testing was lower than its overall specificity. Although the overall specificity was high, it was lower in the central auditory processing disorder group than in the mesial temporal sclerosis group. Overall sensitivity was also lower in the central auditory processing disorder group than in the mesial temporal sclerosis group. PMID:21437442

  7. Social Network Theory in Engineering Education

    NASA Astrophysics Data System (ADS)

    Simon, Peter A.

    Collaborative groups are important both in the learning environment of engineering education and, in the real world, the business of engineering design. Selecting appropriate individuals to form an effective group and monitoring a group's progress are important aspects of successful task performance. This exploratory study looked at using the concepts of cognitive social structures, structural balance, and centrality from social network analysis as well as the measures of emotional intelligence. The concepts were used to analyze potential team members to examine whether an individual's ability to perceive emotion in others and the self, and to use, understand, and manage those emotions, is a factor in a group's performance. The students from a capstone design course in computer engineering were used as volunteer subjects. They were formed into groups and assigned a design exercise to determine whether and which of the above-mentioned tools would be effective in both selecting teams and predicting the quality of the resultant design. The results were inconclusive with the exception of an individual's ability to accurately perceive emotions. The instruments that were successful were the Self-Monitoring scale and the accuracy scores derived from cognitive social structures and Level IV of network levels of analysis.

  8. Learning ion solid interactions hands-on: An activity based, inquiry oriented, graduate course

    NASA Astrophysics Data System (ADS)

    Braunstein, Gabriel

    2005-12-01

    Experimental work, using state of the art instrumentation, is integrated with lectures in a "real life", learning by discovery approach, in the Ion-Solid Interactions graduate/undergraduate course offered by the Department of Physics of the University of Central Florida. The lecture component of the course covers the underlying physical principles, and related scientific and technological applications, associated with the interaction of energetic ions with matter. In the experimental section the students form small groups and perform a variety of projects, experimental and computational, as part of a participative, inquiry oriented, learning process. In the most recent offering of the class, the students deposited a compound semiconductor thin film by dual-gun sputtering deposition, where each group aimed at a different stoichiometry of the same compound (Zn1-xSxOy). Then they analyzed the composition using Rutherford backscattering spectrometry, measured electrical transport properties using Hall effect and conductivity measurements, and determined the band gap using spectrophotometry. Finally the groups shared their results and each wrote a 'journal-like' technical article describing the entire work. In a different assignment, each group also developed a Monte Carlo computer program ('TRIM-like') to simulate the penetration of ions into a solid, in ion implantation, calculating the stopping cross-sections with approximate models, taught in class, which can be analytically solved. The combination of classroom/laboratory activities is very well received by the students. They gain real life experience operating state of the art equipment, and working in teams, while performing research-like projects, and simultaneously they learn the theoretical foundations of the discipline.
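
    In the spirit of the students' 'TRIM-like' assignment, a toy Monte Carlo sketch of ion penetration is shown below: each ion loses energy in random discrete collisions until it stops, and the projected range distribution is collected. The stopping model and every parameter are invented for illustration and are not the course's program.

        # Toy Monte Carlo sketch of ion implantation range (illustrative, not TRIM).
        import numpy as np

        def simulate_ion_ranges(n_ions=5000, e0_keV=100.0, mean_free_path_nm=2.0,
                                mean_loss_keV=1.5, straggle=0.3, seed=0):
            """Return projected ranges (nm) of ions slowed by random discrete collisions."""
            rng = np.random.default_rng(seed)
            ranges = np.zeros(n_ions)
            for i in range(n_ions):
                energy, depth, angle = e0_keV, 0.0, 0.0
                while energy > 0.0:
                    step = rng.exponential(mean_free_path_nm)          # distance to next collision
                    depth += np.cos(angle) * step                      # advance along the surface normal
                    energy -= max(rng.normal(mean_loss_keV, straggle * mean_loss_keV), 0.0)
                    angle += rng.normal(0.0, 0.2)                      # small random deflection per collision
                ranges[i] = depth
            return ranges

        ranges = simulate_ion_ranges()
        print(f"mean projected range: {ranges.mean():.1f} nm, straggling: {ranges.std():.1f} nm")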

  9. Providing nearest neighbor point-to-point communications among compute nodes of an operational group in a global combining network of a parallel computer

    DOEpatents

    Archer, Charles J.; Faraj, Ahmad A.; Inglett, Todd A.; Ratterman, Joseph D.

    2012-10-23

    Methods, apparatus, and products are disclosed for providing nearest neighbor point-to-point communications among compute nodes of an operational group in a global combining network of a parallel computer, each compute node connected to each adjacent compute node in the global combining network through a link, that include: identifying each link in the global combining network for each compute node of the operational group; designating one of a plurality of point-to-point class routing identifiers for each link such that no compute node in the operational group is connected to two adjacent compute nodes in the operational group with links designated for the same class routing identifiers; and configuring each compute node of the operational group for point-to-point communications with each adjacent compute node in the global combining network through the link between that compute node and that adjacent compute node using that link's designated class routing identifier.
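
    The stated constraint (no compute node has two incident links designated with the same class routing identifier) can be met with a simple greedy assignment, sketched below on a made-up tree-shaped network; this illustrates the constraint, not the patented procedure.

        # Greedy assignment of class routing identifiers so that identifiers never repeat
        # on two links incident to the same compute node (illustrative sketch).
        def assign_class_routing_ids(links):
            """links: list of (node_a, node_b) pairs; returns {link: identifier}."""
            used_at = {}                 # node -> identifiers already used on its links
            assignment = {}
            for a, b in links:
                taken = used_at.setdefault(a, set()) | used_at.setdefault(b, set())
                ident = 0
                while ident in taken:    # smallest identifier free at both endpoints
                    ident += 1
                assignment[(a, b)] = ident
                used_at[a].add(ident)
                used_at[b].add(ident)
            return assignment

        # Example: a small binary-tree-like combining network of 7 compute nodes.
        tree_links = [(0, 1), (0, 2), (1, 3), (1, 4), (2, 5), (2, 6)]
        print(assign_class_routing_ids(tree_links))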

  10. The Perseus computational platform for comprehensive analysis of (prote)omics data.

    PubMed

    Tyanova, Stefka; Temu, Tikira; Sinitcyn, Pavel; Carlson, Arthur; Hein, Marco Y; Geiger, Tamar; Mann, Matthias; Cox, Jürgen

    2016-09-01

    A main bottleneck in proteomics is the downstream biological analysis of highly multivariate quantitative protein abundance data generated using mass-spectrometry-based analysis. We developed the Perseus software platform (http://www.perseus-framework.org) to support biological and biomedical researchers in interpreting protein quantification, interaction and post-translational modification data. Perseus contains a comprehensive portfolio of statistical tools for high-dimensional omics data analysis covering normalization, pattern recognition, time-series analysis, cross-omics comparisons and multiple-hypothesis testing. A machine learning module supports the classification and validation of patient groups for diagnosis and prognosis, and it also detects predictive protein signatures. Central to Perseus is a user-friendly, interactive workflow environment that provides complete documentation of computational methods used in a publication. All activities in Perseus are realized as plugins, and users can extend the software by programming their own, which can be shared through a plugin store. We anticipate that Perseus's arsenal of algorithms and its intuitive usability will empower interdisciplinary analysis of complex large data sets.

  11. Faster, Better, Cheaper: A Decade of PC Progress.

    ERIC Educational Resources Information Center

    Crawford, Walt

    1997-01-01

    Reviews the development of personal computers and how computer components have changed in price and value. Highlights include disk drives; keyboards; displays; memory; color graphics; modems; CPU (central processing unit); storage; direct mail vendors; and future possibilities. (LRW)

  12. Analysis of Selected Enhancements to the En Route Central Computing Complex

    DOT National Transportation Integrated Search

    1981-09-01

    This report analyzes selected hardware enhancements that could improve the performance of the 9020 computer systems, which are used to provide en route air traffic control services. These enhancements could be implemented quickly, would be relatively...

  13. A Computational Model of Reasoning from the Clinical Literature

    PubMed Central

    Rennels, Glenn D.

    1986-01-01

    This paper explores the premise that a formalized representation of empirical studies can play a central role in computer-based decision support. The specific motivations underlying this research include the following propositions: 1. Reasoning from experimental evidence contained in the clinical literature is central to the decisions physicians make in patient care. 2. A computational model, based upon a declarative representation for published reports of clinical studies, can drive a computer program that selectively tailors knowledge of the clinical literature as it is applied to a particular case. 3. The development of such a computational model is an important first step toward filling a void in computer-based decision support systems. Furthermore, the model may help us better understand the general principles of reasoning from experimental evidence both in medicine and other domains. Roundsman is a developmental computer system which draws upon structured representations of the clinical literature in order to critique plans for the management of primary breast cancer. Roundsman is able to produce patient-specific analyses of breast cancer management options based on the 24 clinical studies currently encoded in its knowledge base. The Roundsman system is a first step in exploring how the computer can help to bring a critical analysis of the relevant literature to the physician, structured around a particular patient and treatment decision.

  14. Healthcare Policy Statement on the Utility of Coronary Computed Tomography for Evaluation of Cardiovascular Conditions and Preventive Healthcare: From the Health Policy Working Group of the Society of Cardiovascular Computed Tomography.

    PubMed

    Slim, Ahmad M; Jerome, Scott; Blankstein, Ron; Weigold, Wm Guy; Patel, Amit R; Kalra, Dinesh K; Miller, Ryan; Branch, Kelley; Rabbat, Mark G; Hecht, Harvey; Nicol, Edward D; Villines, Todd C; Shaw, Leslee J

    The rising cost of healthcare is prompting numerous policy and advocacy discussions regarding strategies for constraining growth and creating a more efficient and effective healthcare system. Cardiovascular imaging is central to the care of patients at risk of, and living with, heart disease. Estimates are that utilization of cardiovascular imaging exceeds 20 million studies per year. The Society of Cardiovascular CT (SCCT), alongside Rush University Medical Center, and in collaboration with government agencies, regional payers, and industry healthcare experts met in November 2016 in Chicago, IL to evaluate obstacles and hurdles facing the cardiovascular imaging community and how they can contribute to efficacy while maintaining or even improving outcomes and quality. The summit incorporated inputs from payers, providers, and patients' perspectives, providing a platform for all voices to be heard, allowing for a constructive dialogue with potential solutions moving forward. This article outlines the proceedings from the summit, with a detailed review of past hurdles, current status, and potential solutions as we move forward in an ever-changing healthcare landscape. Copyright © 2017 Society of Cardiovascular Computed Tomography. All rights reserved.

  15. A Common Variant of NGEF Is Associated with Abdominal Visceral Fat in Korean Men.

    PubMed

    Kim, Hyun-Jin; Park, Jin-Ho; Lee, Seungbok; Son, Ho-Young; Hwang, Jinha; Chae, Jeesoo; Yun, Jae Moon; Kwon, Hyuktae; Kim, Jong-Il; Cho, Belong

    2015-01-01

    Central adiposity, rather than body mass index (BMI), is a key pathophysiological feature of the development of obesity-related diseases. Although genetic studies using anthropometric measures such as waist circumference have been widely conducted, genetic studies of abdominal fat deposition measured by computed tomography (CT) have rarely been performed. A total of 1,243 participants who were recruited from two health check-up centers were included in this study. We selected four and three single-nucleotide polymorphisms (SNPs) in NGEF and RGS6, respectively, and analyzed the associations between the seven SNPs and central adiposity measured by CT using an additive, dominant, or recessive model. The participants were generally healthy middle-aged men (50.7 ± 5.3 years). In the additive model, the rs11678490 A allele of NGEF was significantly associated with total adipose tissue, visceral adipose tissue (VAT), and subcutaneous adipose tissue (all P < 0.05). The AA genotype of this SNP in the recessive model showed a more significant association with all adiposity traits, and its association with VAT remained significant even after adjustment for BMI (P = 0.005). In the overall or visceral obesity group analysis, the AA genotype of rs11678490 showed no association with overall obesity (P = 0.148), whereas it was significantly associated with visceral obesity both before (P = 0.010) and after (P = 0.029) adjustment for BMI. In particular, an AA genotype effect was conspicuous between lower and upper groups with 5% extreme VAT phenotypes (OR = 9.59, 95% CI = 1.50-61.31). However, we found no significant association between SNPs of RGS6 and central adiposity. We identified a visceral-fat-associated SNP, rs11678490 of NGEF, in Korean men. This study suggests that the genetic backgrounds of central adiposity and BMI are different, and that additional efforts should be made to find the unique genetic architecture of intra-abdominal fat accumulation.
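
    As a worked illustration of the extreme-phenotype comparison above, an odds ratio with a Woolf (log-scale) confidence interval can be computed from a 2×2 genotype-by-phenotype table; the counts below are invented and are not the study's data.

        # Odds ratio and 95% Woolf confidence interval from a 2x2 table (hypothetical counts).
        import math

        a, b = 12, 4    # upper-extreme VAT group: AA carriers, non-AA carriers
        c, d = 3, 10    # lower-extreme VAT group: AA carriers, non-AA carriers

        odds_ratio = (a * d) / (b * c)
        se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lower = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
        upper = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
        print(f"OR = {odds_ratio:.2f}, 95% CI = {lower:.2f}-{upper:.2f}")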

  16. Self-Efficacy Beliefs and Their Sources in Undergraduate Computing Disciplines: An Examination of Gender and Persistence

    ERIC Educational Resources Information Center

    Lin, Guan-Yu

    2016-01-01

    This study has two central purposes: First, it examines not only the roles of gender and persistence in undergraduate computing majors' learning self-efficacy, computer self-efficacy, and programming self-efficacy but also Bandura's hypothesized sources of self-efficacy; second, it examines the influence of sources of efficacy on the three…

  17. Choosing a Computer Language for Institutional Research. The AIR Professional File No. 6.

    ERIC Educational Resources Information Center

    Strenglein, Denise

    1980-01-01

    It is suggested that much thought should be given to choosing an appropriate computer language for an institutional research office, considering the sophistication of the staff, types of planned application, size and type of computer, and availability of central programming support in the institution. For offices that prepare straight reports and…

  18. Do All Roads Lead to Rome? ("or" Reductions for Dummy Travelers)

    ERIC Educational Resources Information Center

    Kilpelainen, Pekka

    2010-01-01

    Reduction is a central ingredient of computational thinking, and an important tool in algorithm design, in computability theory, and in complexity theory. Reduction has been recognized to be a difficult topic for students to learn. Previous studies on teaching reduction have concentrated on its use in special courses on the theory of computing. As…

  19. Pedagogy Matters: Engaging Diverse Students as Community Researchers in Three Computer Science Classrooms

    ERIC Educational Resources Information Center

    Ryoo, Jean Jinsun

    2013-01-01

    Computing occupations are among the fastest growing in the U.S. and technological innovations are central to solving world problems. Yet only our most privileged students are learning to use technology for creative purposes through rigorous computer science education opportunities. In order to increase access for diverse students and females who…

  20. Cognitive Computational Neuroscience: A New Conference for an Emerging Discipline.

    PubMed

    Naselaris, Thomas; Bassett, Danielle S; Fletcher, Alyson K; Kording, Konrad; Kriegeskorte, Nikolaus; Nienborg, Hendrikje; Poldrack, Russell A; Shohamy, Daphna; Kay, Kendrick

    2018-05-01

    Understanding the computational principles that underlie complex behavior is a central goal in cognitive science, artificial intelligence, and neuroscience. In an attempt to unify these disconnected communities, we created a new conference called Cognitive Computational Neuroscience (CCN). The inaugural meeting revealed considerable enthusiasm but significant obstacles remain. Copyright © 2018 Elsevier Ltd. All rights reserved.

  1. On Topological Indices of Certain Dendrimer Structures

    NASA Astrophysics Data System (ADS)

    Aslam, Adnan; Bashir, Yasir; Ahmad, Safyan; Gao, Wei

    2017-05-01

    A topological index can be considered as a transformation of a chemical structure into a real number. In QSAR/QSPR studies, physicochemical properties and topological indices such as the Randić, Zagreb, atom-bond connectivity (ABC), and geometric-arithmetic (GA) indices are used to predict the bioactivity of chemical compounds. Dendrimers are highly branched, star-shaped macromolecules with nanometer-scale dimensions. Dendrimers are defined by three components: a central core, an interior dendritic structure (the branches), and an exterior surface with functional surface groups. In this paper we determine the generalised Randić, general Zagreb, and general sum-connectivity indices of poly(propyl) ether imine, porphyrin, and zinc-porphyrin dendrimers. We also compute the ABC and GA indices of these families of dendrimers.
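
    A minimal sketch of how such degree-based indices are evaluated from a molecular graph stored as an adjacency list is given below; the tiny star-shaped graph is a made-up example, not one of the dendrimers treated in the paper.

        # Degree-based topological indices of a small graph (illustrative example).
        from math import sqrt

        def topological_indices(adj):
            deg = {v: len(nbrs) for v, nbrs in adj.items()}
            edges = {tuple(sorted((u, v))) for u in adj for v in adj[u]}
            randic = sum(1.0 / sqrt(deg[u] * deg[v]) for u, v in edges)
            zagreb1 = sum(d * d for d in deg.values())
            abc = sum(sqrt((deg[u] + deg[v] - 2) / (deg[u] * deg[v])) for u, v in edges)
            ga = sum(2.0 * sqrt(deg[u] * deg[v]) / (deg[u] + deg[v]) for u, v in edges)
            return {"Randic": randic, "Zagreb1": zagreb1, "ABC": abc, "GA": ga}

        # A central atom 0 bonded to three two-atom branches (1-2, 3-4, 5-6).
        adj = {0: [1, 3, 5], 1: [0, 2], 2: [1], 3: [0, 4], 4: [3], 5: [0, 6], 6: [5]}
        print(topological_indices(adj))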

  2. Challenges of Moving IPG into Production

    NASA Technical Reports Server (NTRS)

    Schulbach, Cathy

    2004-01-01

    Over the past 5-6 years, NASA has been developing the Information Power Grid and has a persistent testbed currently based on GT2.4.2. This presentation will begin with an overview of IPG status and services, discuss key milestones in IPG development, and present early as well as expected applications. The presentation will discuss some of the issues encountered in developing a grid including the tension between providing centralized and distributed computing. These issues also affect how the grid is moved into full production. Finally, the presentation will provide current plans for moving IPG into full production, including gaining broad user input, developing acceptance criteria from the production operations group, planning upgrades, and training users.

  3. Stochastic correlative firing for figure-ground segregation.

    PubMed

    Chen, Zhe

    2005-03-01

    Segregation of sensory inputs into separate objects is a central aspect of perception and arises in all sensory modalities. The figure-ground segregation problem requires identifying an object of interest in a complex scene, in many cases given binaural auditory or binocular visual observations. The computations required for visual and auditory figure-ground segregation share many common features and can be cast within a unified framework. Sensory perception can be viewed as a problem of optimizing information transmission. Here we suggest a stochastic correlative firing mechanism and an associative learning rule for figure-ground segregation in several classic sensory perception tasks, including the cocktail party problem in binaural hearing, binocular fusion of stereo images, and Gestalt grouping in motion perception.

  4. MDCT diagnosis of post-traumatic hepatic arterio-portal fistulas.

    PubMed

    Nguyen, Cuong T; Saksobhavivat, Nitima; Shanmuganathan, Kathirkamanathan; Steenburg, Scott D; Moeslein, Fred M; Mirvis, Stuart E; Chiu, William

    2013-06-01

    The purpose of this study is to evaluate the performance of multidetector computed tomography (MDCT) in diagnosing arterioportal fistulas (APF) in high-grade liver injury. A retrospective analysis of catheter-based hepatic angiograms performed for major penetrating and blunt liver injuries identified 11 patients with APFs. Using the trauma registry, two additional demographically matched groups with and without liver injury were formed. A randomized qualitative consensus review of 33 MDCTs was performed by three trauma radiologists for the following MDCT findings of APF: transient hepatic parenchymal attenuation differences (THPAD), early increased attenuation of a peripheral or central portal vein compared with the main portal vein, and the "double-barrel" or "rail tract" signs. THPAD was the most sensitive finding and also had a high specificity for diagnosing APF. Both the early increased attenuation of a peripheral or central portal vein compared with the main portal vein and the double-barrel or rail tract signs had a 100% specificity and a sensitivity of 64% and 36%, respectively. Measurement of differences in attenuation values between the APF and the contralateral central portal vein was most sensitive and specific in diagnosing APF. Traumatic APF of the liver can be optimally diagnosed with arterial phase imaging of solid organs using MDCT.

  5. Analysis of underlying causes of inter-expert disagreement in retinopathy of prematurity diagnosis. Application of machine learning principles.

    PubMed

    Ataer-Cansizoglu, E; Kalpathy-Cramer, J; You, S; Keck, K; Erdogmus, D; Chiang, M F

    2015-01-01

    Inter-expert variability in image-based clinical diagnosis has been demonstrated in many diseases including retinopathy of prematurity (ROP), which is a disease affecting low birth weight infants and is a major cause of childhood blindness. In order to better understand the underlying causes of variability among experts, we propose a method to quantify the variability of expert decisions and analyze the relationship between expert diagnoses and features computed from the images. Identification of these features is relevant for development of computer-based decision support systems and educational systems in ROP, and these methods may be applicable to other diseases where inter-expert variability is observed. The experiments were carried out on a dataset of 34 retinal images, each with diagnoses provided independently by 22 experts. Analysis was performed using concepts of Mutual Information (MI) and Kernel Density Estimation. A large set of structural features (a total of 66) were extracted from retinal images. Feature selection was utilized to identify the most important features that correlated to actual clinical decisions by the 22 study experts. The best three features for each observer were selected by an exhaustive search on all possible feature subsets, considering joint MI as a relevance criterion. We also compared our results with the results of Cohen's Kappa [36] as an inter-rater reliability measure. The results demonstrate that a group of observers (17 of the 22) decide consistently with each other. The mean and second central moment of arteriolar tortuosity are among the reasons for disagreement between this group and the rest of the observers, meaning that the group of experts considers the amount of tortuosity as well as the variation of tortuosity in the image. Given a set of image-based features, the proposed analysis method can identify critical image-based features that lead to expert agreement and disagreement in diagnosis of ROP. Although tree-based features and various statistics such as the central moment are not popular in the literature, our results suggest that they are important for diagnosis.
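
    A simplified sketch of the feature-relevance step is shown below: image features are ranked by mutual information with one expert's labels. The study searches three-feature subsets by joint MI with kernel density estimation; this single-feature ranking with scikit-learn, on invented data of the same shape, is only an illustrative stand-in.

        # Rank features by mutual information with an expert's diagnosis labels (illustrative).
        import numpy as np
        from sklearn.feature_selection import mutual_info_classif

        rng = np.random.default_rng(2)
        X = rng.normal(size=(34, 66))        # 34 images x 66 structural features (shape from the abstract)
        y = rng.integers(0, 3, size=34)      # hypothetical per-image diagnosis by one expert

        mi = mutual_info_classif(X, y, random_state=0)
        top3 = np.argsort(mi)[::-1][:3]
        print("top features by MI:", top3, mi[top3])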

  6. Analysis of the width ratio and wear rate of maxillary anterior teeth in the Korean population.

    PubMed

    Oh, Yeon-Ah; Yang, Hong-So; Park, Sang-Won; Lim, Hyun-Pil; Yun, Kwi-Dug; Park, Chan

    2017-04-01

    The purpose of this study was to compare the width ratio of maxillary anterior teeth according to age in the Korean population and to evaluate the maxillary central incisor width-to-length (W/L) ratio, given differences in age and gender. Ninety-three Korean adults were divided into 3 groups (n = 31) by age. Group I was 20 - 39 years old, Group II was 40 - 59 years old, and Group III was over 60 years of age. After taking an impression and a cast model of the maxillary arch, the anterior teeth width ratio and central incisor W/L ratio were calculated from standard digital images of the cast models using graph paper and a digital single-lens reflex (DSLR) camera. The calculated ratios were compared among all groups, and the central incisor W/L ratios were analyzed according to age and gender. All comparative data were statistically analyzed with one-sample t-tests, one-way ANOVAs with Tukey tests, and independent t-tests. No significant differences in maxillary anterior teeth ratios were found among the age groups. The maxillary central incisor W/L ratios in Group III were the greatest and were significantly higher than those in the other groups. The central incisor W/L ratio of men was higher than that of women in Group II. Maxillary anterior teeth width ratios were similar in all age groups in the Korean population. Worn maxillary central incisors were observed in the group over 60 years of age, and a significant difference between genders was found in 40 to 50 year olds.
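
    The reported statistical pipeline can be sketched on invented W/L ratios: a one-way ANOVA across the three age groups followed by Tukey pairwise comparisons; the group means and spreads below are assumptions for illustration.

        # One-way ANOVA plus Tukey post-hoc comparisons on hypothetical W/L ratios.
        import numpy as np
        from scipy.stats import f_oneway
        from statsmodels.stats.multicomp import pairwise_tukeyhsd

        rng = np.random.default_rng(3)
        group1 = rng.normal(0.85, 0.05, 31)   # 20-39 yr (hypothetical ratios)
        group2 = rng.normal(0.86, 0.05, 31)   # 40-59 yr
        group3 = rng.normal(0.92, 0.05, 31)   # over 60 yr; worn incisors give larger ratios

        print(f_oneway(group1, group2, group3))            # overall group effect
        ratios = np.concatenate([group1, group2, group3])
        labels = ["I"] * 31 + ["II"] * 31 + ["III"] * 31
        print(pairwise_tukeyhsd(ratios, labels))           # which pairs of groups differ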

  7. Bi-Exact Groups, Strongly Ergodic Actions and Group Measure Space Type III Factors with No Central Sequence

    NASA Astrophysics Data System (ADS)

    Houdayer, Cyril; Isono, Yusuke

    2016-12-01

    We investigate the asymptotic structure of (possibly type III) crossed product von Neumann algebras $M = B \rtimes \Gamma$ arising from arbitrary actions $\Gamma \curvearrowright B$ of bi-exact discrete groups (e.g. free groups) on amenable von Neumann algebras. We prove a spectral gap rigidity result for the central sequence algebra $N' \cap M^\omega$ of any nonamenable von Neumann subalgebra with normal expectation $N \subset M$. We use this result to show that for any strongly ergodic essentially free nonsingular action $\Gamma \curvearrowright (X, \mu)$ of any bi-exact countable discrete group on a standard probability space, the corresponding group measure space factor $L^\infty(X) \rtimes \Gamma$ has no nontrivial central sequence. Using recent results of Boutonnet et al. (Local spectral gap in simple Lie groups and applications, 2015), we construct, for every $0 < \lambda \leq 1$, a type $\mathrm{III}_\lambda$ strongly ergodic essentially free nonsingular action $F_\infty \curvearrowright (X_\lambda, \mu_\lambda)$ of the free group $F_\infty$ on a standard probability space so that the corresponding group measure space type $\mathrm{III}_\lambda$ factor $L^\infty(X_\lambda, \mu_\lambda) \rtimes F_\infty$ has no nontrivial central sequence by our main result. In particular, we obtain the first examples of group measure space type III factors with no nontrivial central sequence.

  8. [Dynamic observation of clinical course in patients with subacute 1, 2-dichloroethane poisoning].

    PubMed

    Liu, Weiwei; Chen, Yuquan; Pan, Jing; Yang, Zhiqian; Liu, Yimin

    2015-03-01

    To observe the clinical characteristics and regular patterns of subacute 1,2-dichloroethane poisoning in order to provide evidence for its diagnosis, treatment and prognosis, 51 cases of subacute 1,2-dichloroethane poisoning were analyzed. They were divided into 3 groups according to their main clinical manifestation: group A mainly with intracranial hypertension (n = 25), group B with limb tremor (n = 18), and group C with mental and behavioral disorder (n = 8). All cases' clinical symptoms, cranial computed tomography (CT) and cerebrospinal pressure (group A) were observed, and the durations of onset, deterioration, improvement, recovery and the whole course of the disease were compared between groups and within each group. In all 51 cases, only the difference between the deterioration duration on cranial CT and that of the symptoms was significant (t = 2.555, P<0.05), indicating that symptom deterioration occurred earlier than the radiological change. Symptom deterioration in group C was faster than in groups A and B (P<0.00). As to the change in symptom duration, group B's improvement, recovery and whole course were the longest compared with groups A and C (P<0.05). As to the change in cranial CT duration, group B's recovery duration was the shortest and group A's was the longest (P<0.01); group B's whole course was also the shortest and group A's the longest (P<0.05). The clinical course of symptoms, cranial CT and cerebrospinal pressure (group A) was compared within each group. In group A, the duration of improvement and the whole course of the cranial CT and cerebrospinal pressure changes were longer than those of the symptom changes (P<0.01), indicating that group A had longer asymptomatic intracranial hypertension and that their cranial radiography recovered slowly. In group B, symptom deterioration (3.94 ± 4.31 days) occurred earlier than the cranial CT changes (P<0.05), and the recovery (92.39 ± 55.04 days) and whole course of the symptoms were longer than those of the cranial CT changes (all P<0.01). In group C, symptom deterioration was earlier than CT deterioration (P<0.05). The clinical characteristic of subacute 1,2-dichloroethane poisoning is central nervous system damage, which differs according to the stage of the course and the regions and severity of the pathological lesions.

  9. Design of the central region for axial injection in the VINCY cyclotron

    NASA Astrophysics Data System (ADS)

    Milinković, Ljiljana; Toprek, Dragan

    1996-02-01

    This paper describes the design of the central region for the h = 1, h = 2 and h = 4 modes of acceleration in the VINCY cyclotron. A result worth reporting is that the central region is unique and compatible with the three above-mentioned harmonic modes of operation. Only one spiral-type inflector will be used. The central region is designed to operate with two external ion sources: (a) an ECR ion source with a maximum extraction voltage of 25 kV for heavy ions, and (b) a multicusp ion source with a maximum extraction voltage of 30 kV for H- and D- ions. Heavy ions will be accelerated by the second and fourth harmonics, D- ions by the second harmonic and H- ions by the first harmonic of the RF field. The central region is equipped with an axial injection system. The electric field distribution in the inflector and in the four acceleration gaps has been numerically calculated from an electric potential map produced by the program RELAX3D. The geometry of the central region has been tested with computations of orbits carried out by means of the computer code CYCLONE. The optical properties of the spiral inflector and the central region were studied using the programs CASINO and CYCLONE, respectively. We have also made an effort to minimize the inflector fringe field using the RELAX3D program.

  10. Satisfaction with Life in Orofacial Pain Disorders: Associations and Theoretical Implications

    PubMed Central

    Boggero, Ian A.; Rojas-Ramirez, Marcia V.; de Leeuw, Reny; Carlson, Charles R.

    2016-01-01

    Aims: To test if patients with masticatory myofascial pain, local myalgia, centrally mediated myalgia, disc displacement, capsulitis/synovitis, or continuous neuropathic pain differed in self-reported satisfaction with life. The study also tested if satisfaction with life was similarly predicted by measures of physical, emotional, and social functioning across disorders. Methods: Satisfaction with life, fatigue, affective distress, social support, and pain data were extracted from the medical records of 343 patients seeking treatment for chronic orofacial pain. Patients were grouped by primary diagnosis assigned following their initial appointment. Satisfaction with life was compared between disorders, with and without pain intensity entered as a covariate. Disorder-specific linear regression models using physical, emotional, and social predictors of satisfaction with life were computed. Results: Patients with centrally mediated myalgia reported significantly lower satisfaction with life than did patients with any of the other five disorders. Inclusion of pain intensity as a covariate weakened but did not eliminate the effect. Satisfaction with life was predicted by measures of physical, emotional, and social functioning, but these associations were not consistent across disorders. Conclusions: Results suggest that reduced satisfaction with life in patients with centrally mediated myalgia is not due only to pain intensity. There may be other factors that predispose people to both reduced satisfaction with life and centrally mediated myalgia. Furthermore, the results suggest that satisfaction with life is differentially influenced by physical, emotional, and social functioning in different orofacial pain disorders. PMID:27128473

  11. Local Data Integration in East Central Florida

    NASA Technical Reports Server (NTRS)

    Case, Jonathan L.; Manobianco, John T.

    1999-01-01

    The Applied Meteorology Unit has configured a Local Data Integration System (LDIS) for east central Florida which assimilates in-situ and remotely-sensed observational data into a series of high-resolution gridded analyses. The ultimate goal for running LDIS is to generate products that may enhance weather nowcasts and short-range (less than 6 h) forecasts issued in support of the 45th Weather Squadron (45 WS), Spaceflight Meteorology Group (SMG), and the Melbourne National Weather Service (NWS MLB) operational requirements. LDIS has the potential to provide added value for nowcasts and short-term forecasts for two reasons. First, it incorporates all data operationally available in east central Florida. Second, it is run at finer spatial and temporal resolutions than current national-scale operational models such as the Rapid Update Cycle and Eta models. LDIS combines all available data to produce grid analyses of primary variables (wind, temperature, etc.) at specified temporal and spatial resolutions. These analyses of primary variables can be used to compute diagnostic quantities such as vorticity and divergence. This paper demonstrates the utility of LDIS over east central Florida for a warm season case study. The evolution of a significant thunderstorm outflow boundary is depicted through horizontal and vertical cross section plots of wind speed, divergence, and circulation. In combination with a suitable visualization tool, LDIS may provide users with a more complete and comprehensive understanding of evolving mesoscale weather than could be developed by individually examining the disparate data sets over the same area and time.

  12. Computers and neurosurgery.

    PubMed

    Shaikhouni, Ammar; Elder, J Bradley

    2012-11-01

    At the turn of the twentieth century, the only computational device used in neurosurgical procedures was the brain of the surgeon. Today, most neurosurgical procedures rely at least in part on the use of a computer to help perform surgeries accurately and safely. The techniques that revolutionized neurosurgery were mostly developed after the 1950s. Just before that era, the transistor was invented in the late 1940s, and the integrated circuit was invented in the late 1950s. During this time, the first automated, programmable computational machines were introduced. The rapid progress in the field of neurosurgery not only occurred hand in hand with the development of modern computers, but one also can state that modern neurosurgery would not exist without computers. The focus of this article is the impact modern computers have had on the practice of neurosurgery. Neuroimaging, neuronavigation, and neuromodulation are examples of tools in the armamentarium of the modern neurosurgeon that owe each step in their evolution to progress made in computer technology. Advances in computer technology central to innovations in these fields are highlighted, with particular attention to neuroimaging. Developments over the last 10 years in areas of sensors and robotics that promise to transform the practice of neurosurgery further are discussed. Potential impacts of advances in computers related to neurosurgery in developing countries and underserved regions are also discussed. As this article illustrates, the computer, with its underlying and related technologies, is central to advances in neurosurgery over the last half century. Copyright © 2012 Elsevier Inc. All rights reserved.

  13. Embracing the quantum limit in silicon computing.

    PubMed

    Morton, John J L; McCamey, Dane R; Eriksson, Mark A; Lyon, Stephen A

    2011-11-16

    Quantum computers hold the promise of massive performance enhancements across a range of applications, from cryptography and databases to revolutionary scientific simulation tools. Such computers would make use of the same quantum mechanical phenomena that pose limitations on the continued shrinking of conventional information processing devices. Many of the key requirements for quantum computing differ markedly from those of conventional computers. However, silicon, which plays a central part in conventional information processing, has many properties that make it a superb platform around which to build a quantum computer. © 2011 Macmillan Publishers Limited. All rights reserved

  14. Jargon that Computes: Today's PC Terminology.

    ERIC Educational Resources Information Center

    Crawford, Walt

    1997-01-01

    Discusses PC (personal computer) and telecommunications terminology in context: Integrated Services Digital Network (ISDN); Asymmetric Digital Subscriber Line (ADSL); cable modems; satellite downloads; T1 and T3 lines; magnitudes ("giga-,""nano-"); Central Processing Unit (CPU); Random Access Memory (RAM); Universal Serial Bus…

  15. Nanotube Heterojunctions and Endo-Fullerenes for Nanoelectronics

    NASA Technical Reports Server (NTRS)

    Srivastava, Deepak; Menon, M.; Andriotis, Antonis; Cho, K.; Park, Jun; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    Topics discussed include: (1) Light-Weight Multi-Functional Materials: Nanomechanics; Nanotubes and Composites; Thermal/Chemical/Electrical Characterization; (2) Biomimetic/Revolutionary Concepts: Evolutionary Computing and Sensing; Self-Heating Materials; (3) Central Computing System: Molecular Electronics; Materials for Quantum Bits; and (4) Molecular Machines.

  16. Neural computational modeling reveals a major role of corticospinal gating of central oscillations in the generation of essential tremor.

    PubMed

    Qu, Hong-En; Niu, Chuanxin M; Li, Si; Hao, Man-Zhao; Hu, Zi-Xiang; Xie, Qing; Lan, Ning

    2017-12-01

    Essential tremor, also referred to as familial tremor, is an autosomal dominant genetic disease and the most common movement disorder. It typically involves a postural and motor tremor of the hands, head or other part of the body. Essential tremor is driven by a central oscillation signal in the brain. However, the corticospinal mechanisms involved in the generation of essential tremor are unclear. Therefore, in this study, we used a neural computational model that includes both monosynaptic and multisynaptic corticospinal pathways interacting with a propriospinal neuronal network. A virtual arm model is driven by the central oscillation signal to simulate tremor activity behavior. Cortical descending commands are classified as alpha or gamma through monosynaptic or multisynaptic corticospinal pathways, which converge respectively on alpha or gamma motoneurons in the spinal cord. Several scenarios are evaluated based on the central oscillation signal passing down to the spinal motoneurons via each descending pathway. The simulated behaviors are compared with clinical essential tremor characteristics to identify the corticospinal pathways responsible for transmitting the central oscillation signal. A propriospinal neuron with strong cortical inhibition performs a gating function in the generation of essential tremor. Our results indicate that the propriospinal neuronal network is essential for relaying the central oscillation signal and the production of essential tremor.
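
    A deliberately crude sketch of the gating idea described above, assuming a sinusoidal central oscillation and a single multiplicative gate standing in for the propriospinal neuron. The pathway gains, tremor frequency, and the reduction of the virtual arm to an oscillation amplitude are illustrative assumptions, not the authors' model.

        import numpy as np

        def motoneuron_drive(t, gate_inhibition, f_tremor=6.0,
                             mono_gain=1.0, multi_gain=1.0):
            """Oscillatory drive reaching spinal motoneurons when a central
            oscillation is routed through a direct (monosynaptic) path and a
            propriospinal (multisynaptic) path whose transmission is reduced
            by cortical inhibition of the gate.
            """
            oscillation = np.sin(2 * np.pi * f_tremor * t)   # central oscillation signal
            gate = max(0.0, 1.0 - gate_inhibition)           # propriospinal transmission
            return mono_gain * oscillation + multi_gain * gate * oscillation

        t = np.linspace(0.0, 1.0, 1000)
        for inhibition in (0.0, 0.5, 1.0):
            drive = motoneuron_drive(t, inhibition)
            # Stronger cortical inhibition of the gate reduces the tremor-band drive.
            print(f"inhibition={inhibition:.1f}  drive amplitude={np.ptp(drive) / 2:.2f}")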

  17. Neurons from the adult human dentate nucleus: neural networks in the neuron classification.

    PubMed

    Grbatinić, Ivan; Marić, Dušica L; Milošević, Nebojša T

    2015-04-07

    The aim was the topological (central vs. border neuron type) and morphological classification of adult human dentate nucleus neurons according to their quantified histomorphological properties, using neural networks on real and virtual neuron samples. In the real sample, 53.1% and 14.1% of central and border neurons, respectively, are classified correctly, with a total of 32.8% of neurons misclassified. The most important result is that 62.2% of border neurons are misclassified, which is even greater than the proportion of correctly classified neurons (37.8%) in that group, showing an obvious failure of the network to classify these neurons correctly based on the computational parameters used in our study. On the virtual sample, 97.3% of border neurons are misclassified, far more than the 2.7% classified correctly in that group, again confirming the network's failure to classify these neurons correctly. Statistical analysis shows no statistically significant difference between central and border neurons for any measured parameter (p>0.05). A total of 96.74% of neurons are morphologically classified correctly by neural networks, and each belongs to one of four histomorphological types: (a) neurons with small soma and short dendrites, (b) neurons with small soma and long dendrites, (c) neurons with large soma and short dendrites, (d) neurons with large soma and long dendrites. Statistical analysis supports these results (p<0.05). Human dentate nucleus neurons can thus be classified into four neuron types according to their quantitative histomorphological properties. These types comprise two sets, small and large with respect to their perikarya, with subtypes differing in dendrite length, i.e., neurons with short vs. long dendrites. Besides confirming the classification into small and large neurons already reported in the literature, we found two new subtypes, i.e., neurons with small soma and long dendrites and neurons with large soma and short dendrites. These neurons are most probably distributed evenly throughout the dentate nucleus, as no significant difference in their topological distribution is observed. Copyright © 2015 Elsevier Ltd. All rights reserved.
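
    For readers unfamiliar with this kind of morphology-based classification, the sketch below trains a small feedforward network to separate neurons into four soma-size/dendrite-length types like those described above. The feature values, network size, and use of scikit-learn are illustrative assumptions and do not reproduce the authors' network or data.

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(0)

        # Synthetic morphometry: [soma area (um^2), total dendrite length (um)]
        # for four illustrative types: small/large soma x short/long dendrites.
        centers = np.array([[150, 400], [150, 1200], [400, 400], [400, 1200]], float)
        X = np.vstack([c + rng.normal(scale=[30, 120], size=(50, 2)) for c in centers])
        y = np.repeat(np.arange(4), 50)

        clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
        clf.fit(X, y)
        print("training accuracy:", clf.score(X, y))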

  18. Computers in anesthesia and intensive care: lack of evidence that the central unit serves as reservoir of pathogens.

    PubMed

    Quinzio, Lorenzo; Blazek, Michael; Hartmann, Bernd; Röhrig, Rainer; Wille, Burkhard; Junger, Axel; Hempelmann, Gunter

    2005-01-01

    Computers are becoming increasingly visible in operating rooms (OR) and intensive care units (ICU) for use in bedside documentation. Recently, they have been suspected as possibly acting as reservoirs for microorganisms and vehicles for the transfer of pathogens to patients, causing nosocomial infections. The purpose of this study was to examine the microbiological (bacteriological and mycological) contamination of the central unit of computers used in an OR, a surgical and a pediatric ICU of a tertiary teaching hospital. Sterile swab samples were taken from five sites in each of 13 computers stationed at the two ICUs and 12 computers at the OR. Sample sites within the chassis housing of the computer processing unit (CPU) included the CPU fan, ventilator, and metal casing. External sites were the ventilator and the bottom of the computer tower. Quantitative and qualitative microbiological analyses were performed according to commonly used methods. One hundred and ninety sites were cultured for bacteria and fungi. Analyses of swabs taken at five equivalent sites inside and outside the computer chassis did not find any significant number of potentially pathogenic bacteria or fungi. This can probably be attributed to either the absence or the low number of pathogens detected on the surfaces. Microbial contamination in the CPU of OR and ICU computers is too low for designating them as a reservoir for microorganisms.

  19. Fundamental Fortran for Social Scientists.

    ERIC Educational Resources Information Center

    Veldman, Donald J.

    An introduction to Fortran programming specifically for social science statistical and routine data processing is provided. The first two sections of the manual describe the components of computer hardware and software. Topics include input, output, and mass storage devices; central memory; central processing unit; internal storage of data; and…

  20. Computing shifts to monitor ATLAS distributed computing infrastructure and operations

    NASA Astrophysics Data System (ADS)

    Adam, C.; Barberis, D.; Crépé-Renaudin, S.; De, K.; Fassi, F.; Stradling, A.; Svatos, M.; Vartapetian, A.; Wolters, H.

    2017-10-01

    The ATLAS Distributed Computing (ADC) group established a new Computing Run Coordinator (CRC) shift at the start of LHC Run 2 in 2015. The main goal was to rely on a person with a good overview of the ADC activities to ease the ADC experts’ workload. The CRC shifter keeps track of ADC tasks related to their fields of expertise and responsibility. At the same time, the shifter maintains a global view of the day-to-day operations of the ADC system. During Run 1, this task was accomplished by a person of the expert team called the ADC Manager on Duty (AMOD), a position that was removed during the shutdown period due to the reduced number and availability of ADC experts foreseen for Run 2. The CRC position was proposed to cover some of the AMOD's former functions, while allowing more people involved in computing to participate. In this way, CRC shifters help with the training of future ADC experts. The CRC shifters coordinate daily ADC shift operations, including tracking open issues, reporting, and representing ADC in relevant meetings. The CRC also facilitates communication between the ADC expert team and the other ADC shifters. These include the Distributed Analysis Support Team (DAST), which is the first point of contact for addressing all distributed analysis questions, and the ATLAS Distributed Computing Shifters (ADCoS), which check and report problems in central services, sites, Tier-0 export, data transfers and production tasks. Finally, the CRC looks at the level of ADC activities on a weekly or monthly timescale to ensure that ADC resources are used efficiently.

  1. Radiation-driven winds of hot stars. V - Wind models for central stars of planetary nebulae

    NASA Technical Reports Server (NTRS)

    Pauldrach, A.; Puls, J.; Kudritzki, R. P.; Mendez, R. H.; Heap, S. R.

    1988-01-01

    Wind models using the recent improvements of radiation driven wind theory by Pauldrach et al. (1986) and Pauldrach (1987) are presented for central stars of planetary nebulae. The models are computed along evolutionary tracks evolving with different stellar mass from the Asymptotic Giant Branch. We show that the calculated terminal wind velocities are in agreement with the observations and allow in principle an independent determination of stellar masses and radii. The computed mass-loss rates are in qualitative agreement with the occurrence of spectroscopic stellar wind features as a function of stellar effective temperature and gravity.

  2. Better Decomposition Heuristics for the Maximum-Weight Connected Graph Problem Using Betweenness Centrality

    NASA Astrophysics Data System (ADS)

    Yamamoto, Takanori; Bannai, Hideo; Nagasaki, Masao; Miyano, Satoru

    We present new decomposition heuristics for finding the optimal solution for the maximum-weight connected graph problem, which is known to be NP-hard. Previous optimal algorithms for solving the problem decompose the input graph into subgraphs using heuristics based on node degree. We propose new heuristics based on betweenness centrality measures, and show through computational experiments that our new heuristics tend to reduce the number of subgraphs in the decomposition, and therefore could lead to the reduction in computational time for finding the optimal solution. The method is further applied to analysis of biological pathway data.
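
    A minimal sketch of the idea, assuming NetworkX and a toy graph: rank nodes by betweenness centrality and remove the highest-ranked ones so that the graph splits into smaller subgraphs. The authors' exact decomposition scheme and optimal solver for the maximum-weight connected graph problem are not reproduced here.

        import networkx as nx

        def decompose_by_betweenness(G, n_remove=1):
            """Split G into connected subgraphs by removing the most central nodes."""
            bc = nx.betweenness_centrality(G)
            cut_nodes = sorted(bc, key=bc.get, reverse=True)[:n_remove]
            H = G.copy()
            H.remove_nodes_from(cut_nodes)
            parts = [G.subgraph(c).copy() for c in nx.connected_components(H)]
            return cut_nodes, parts

        # Toy example: two cliques joined by a bridge node that scores highest,
        # so removing it yields two much smaller subgraphs.
        G = nx.barbell_graph(5, 1)
        cut, parts = decompose_by_betweenness(G, n_remove=1)
        print(cut, [sorted(p.nodes) for p in parts])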

  3. Bronchovascular versus bronchial sleeve resection for central lung tumors.

    PubMed

    Lausberg, Henning F; Graeter, Thomas P; Tscholl, Dietmar; Wendler, Olaf; Schäfers, Hans-Joachim

    2005-04-01

    Pneumonectomy has traditionally been the treatment of choice for central lung tumors. Bronchial sleeve resections are increasingly considered as a reasonable alternative. For tumor involvement of both central airways and pulmonary artery, bronchovascular sleeve resections are possible, but considered to be technically demanding and associated with a higher perioperative risk. In addition, their role as adequate oncologic treatment for lung cancer is unclear. We have compared the early and long-term results of bronchovascular sleeve resection with those of bronchial sleeve resection and pneumonectomy. We retrospectively analyzed all patients who underwent bronchial sleeve resection (group I, n = 104), bronchovascular sleeve resection (group II, n = 67), and pneumonectomy (group III, n = 63) for central lung cancer in our institution. The groups were comparable regarding demographics and tumor, node, and metastasis (TNM) stage. Early mortality was 1.9% in group I, 1.5% in group II, and 6.3% in group III (p = 0.19). The rate of bronchial complications was 0.96% in group I, 0% in group II, and 7.9% in group III (p = 0.006). Five-year survival was 46.1% in group I, 42.9% in group II, and 30.4% in group III (p = 0.16). Freedom from local recurrence of disease (5 years) was 83.8% in group I, 84.2% in group II, and 88.7% in group III (p = 0.56). Bronchovascular sleeve resections are as safe as bronchial sleeve resections for the treatment of central lung cancer. Both procedures have comparable early and long-term results, which are similar to those of pneumonectomy. It appears reasonable to apply bronchovascular sleeve resections more liberally.

  4. Perception of smile esthetics by laypeople of different ages.

    PubMed

    Sriphadungporn, Chompunuch; Chamnannidiadha, Niramol

    2017-12-01

    Age is a factor affecting smile esthetics. Three variables of smile esthetics associated with the maxillary anterior teeth and age-related changes have recently received considerable attention: (i) the incisal edge position of the maxillary central incisors, (ii) the maxillary gingival display, and (iii) the presence of a black triangle between the maxillary central incisors. The aim of this study was to evaluate the influence of age on smile esthetic perception based on these three variables in a group of Thai laypeople. The smiles were constructed from a photograph of a female smile. Smile photographs were altered in various increments using three variables: the incisal edge position of the maxillary incisors, gingival display, and a black triangle between the maxillary central incisors. The photographs were shown to a group of 240 Thai laypeople. The subjects were divided into two groups: a younger group, 15-29 years old (n = 120) and an older group, 36-52 years old (n = 120). Each subject was asked to score the attractiveness of each smile separately using a visual analog scale. Smile attractiveness scores concerning the incisal edge positions of the maxillary central incisors were similar between the two groups. However, upper lip coverage was rated as unattractive by the younger group. A gingival display of 0 and 2 mm was rated as most attractive by the younger group. Upper lip coverage and gingival display of 0 and 2 mm were considered attractive by the older group. Excessive gingival display (6 mm) was scored as unattractive by both groups. A black triangle ranging from 1 to 2.5 mm between the maxillary central incisors was scored differently between the two groups. The older group was more tolerant of the black triangle size. Age impacts smile perception based on maxillary gingival display and the presence of a black triangle between the maxillary central incisors, but not on the incisal edge position of the maxillary central incisors. Because esthetic perception varies between individuals, collaboration between orthodontists and patients in decision-making and treatment planning is crucial for successful results.

  5. I148M variant in PNPLA3 reduces central adiposity and metabolic disease risks while increasing nonalcoholic fatty liver disease.

    PubMed

    Park, Jin-Ho; Cho, BeLong; Kwon, Hyuktae; Prilutsky, Daria; Yun, Jae Moon; Choi, Ho Chun; Hwang, Kyu-Baek; Lee, In-Hee; Kim, Jong-Il; Kong, Sek Won

    2015-12-01

    The I148M variant, caused by a C-to-G substitution in PNPLA3 (rs738409), is associated with an increased risk of nonalcoholic fatty liver disease (NAFLD). In the liver, the I148M variant reduces the hydrolytic function of PNPLA3, which results in hepatic steatosis; however, its association with other clinical phenotypes such as adiposity and metabolic diseases is not well established. To identify the impact of the I148M variant on clinical risk factors of NAFLD, we recruited 1363 generally healthy Korean males after excluding alcoholic and secondary causes of hepatic steatosis. Central adiposity was assessed by computed tomography, and hepatic steatosis was evaluated by abdominal ultrasonography. The participants were predominantly middle-aged (49.0 ± 7.1 years; range 30-60 years), and the frequency of NAFLD was 44.2%. The rs738409-G allele carriers had a 1.19-fold increased risk for NAFLD (minor allele frequency 0.43; allelic odds ratio 1.38; P = 4.3 × 10⁻⁵). Interestingly, the rs738409 GG carriers showed significantly lower levels of visceral and subcutaneous adiposity (P < 0.001 and P = 0.015, respectively), BMI (P < 0.001), triglycerides (P < 0.001) and insulin resistance (P = 0.002) compared to CC carriers. These negative associations between clinical risk factors and rs738409-G dosage were more prominent in the non-NAFLD group than in the NAFLD group. The I148M variant, although increasing the risk of NAFLD, was associated with reduced levels of central adiposity, BMI, serum triglycerides and insulin resistance, suggesting differential roles in fat storage and distribution according to cell types and metabolic status. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  6. Effect of computer radiation on weight and oxidant-antioxidant status of mice.

    PubMed

    Pei, Xuexian; Gu, Qijun; Ye, Dongdong; Wang, Yang; Zou, Xu; He, Lianping; Jin, Yuelong; Yao, Yingshui

    2014-10-20

    The aim was to explore the effects of computer radiation on the weight and oxidant-antioxidant status of mice, and further to confirm whether vitamin C protects against computer radiation. Sixty male adult ICR mice were randomly divided into six groups, each given a different treatment as follows: group A was the control; group B was given vitamin C; group C was exposed to computer radiation 8 h/day; group D was given vitamin C and exposed to computer radiation 8 h/day; group E was exposed to computer radiation 16 h/day; and group F was given vitamin C and exposed to computer radiation 16 h/day. After seven weeks, the mice were sacrificed to collect blood samples, and total antioxidant capacity (T-AOC) and alkaline phosphatase (ALP) content in serum or liver tissue were determined by ELISA. No difference in weight change was found among the six groups at any week. In groups C, D and F, the liver tissue T-AOC levels were higher than in group A. In groups B, C and E, the serum ALP levels were lower than in group A (P<0.05). The study indicates that computer radiation may have an adverse effect on the T-AOC and ALP levels of mice, and that vitamin C has a protective effect against computer radiation. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.

  7. A Multidimensional Approach to Determinants of Computer Use in Primary Education: Teacher and School Characteristics

    ERIC Educational Resources Information Center

    Tondeur, J.; Valcke, M.; van Braak, J.

    2008-01-01

    The central aim of this study was to test a model that integrates determinants of educational computer use. In particular, the article examines teacher and school characteristics that are associated with different types of computer use by primary school teachers. A survey was set up, involving 527 teachers from 68 primary schools in Flanders. A…

  8. Alignment between Satellite and Central Galaxies in the SDSS DR7: Dependence on Large-scale Environment

    NASA Astrophysics Data System (ADS)

    Wang, Peng; Luo, Yu; Kang, Xi; Libeskind, Noam I.; Wang, Lei; Zhang, Youcai; Tempel, Elmo; Guo, Quan

    2018-06-01

    The alignment between satellites and central galaxies has been studied in detail both in observational and theoretical works. The widely accepted fact is that satellites preferentially reside along the major axis of their central galaxy. However, the origin and large-scale environmental dependence of this alignment are still unknown. In an attempt to determine these variables, we use data constructed from Sloan Digital Sky Survey DR7 to investigate the large-scale environmental dependence of this alignment with emphasis on examining the alignment’s dependence on the color of the central galaxy. We find a very strong large-scale environmental dependence of the satellite–central alignment (SCA) in groups with blue centrals. Satellites of blue centrals in knots are preferentially located perpendicular to the major axes of the centrals, and the alignment angle decreases with environment, namely, when going from knots to voids. The alignment angle strongly depends on the ^{0.1}(g-r) color of centrals. We suggest that the SCA is the result of a competition between satellite accretion within large-scale structure (LSS) and galaxy evolution inside host halos. For groups containing red central galaxies, the SCA is mainly determined by the evolution effect, while for blue central dominated groups, the effect of the LSS plays a more important role, especially in knots. Our results provide an explanation for how the SCA forms within different large-scale environments. The perpendicular case in groups and knots with blue centrals may also provide insight into understanding similar polar arrangements, such as the formation of the Milky Way and Centaurus A’s satellite system.
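
    To make the alignment statistic concrete, the sketch below computes the satellite-central alignment angle (the angle between a central galaxy's projected major axis and the direction to each satellite) for a toy catalogue. The fold into 0-90 degrees and the isotropic expectation of a mean near 45 degrees are standard; the array layout and coordinate convention are illustrative assumptions.

        import numpy as np

        def sca_angles(central_xy, major_axis_pa, sat_xy):
            """Satellite-central alignment angles in degrees (0 = along the major axis).

            central_xy    : (2,) projected position of the central galaxy
            major_axis_pa : position angle of the central's major axis, in degrees
            sat_xy        : (N, 2) projected positions of its satellites
            """
            dx, dy = (sat_xy - central_xy).T
            sat_pa = np.degrees(np.arctan2(dy, dx))
            ang = np.abs(sat_pa - major_axis_pa) % 180.0
            return np.minimum(ang, 180.0 - ang)   # fold into [0, 90] degrees

        # Isotropically placed satellites give a mean alignment angle close to 45 degrees.
        rng = np.random.default_rng(1)
        sats = rng.normal(size=(5000, 2))
        print(sca_angles(np.zeros(2), 30.0, sats).mean())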

  9. [Ophthalmodynamometry in the diagnostics of Grave's ophthalmopathy].

    PubMed

    Harder, B; Jonas, J B

    2007-11-01

    Since endocrine orbitopathy is characterised by exophthalmos and increased orbital tissue pressure which may lead to a compression of and damage to the optic nerve, it was the purpose of this study to evaluate whether the increased orbital tissue pressure in endocrine orbitopathy is associated with an elevated central retinal vein pressure as estimated by ophthalmodynamometry, and whether the central retinal vein pressure changes in the course of the disease. The prospective clinical study included 7 patients (13 eyes) with endocrine orbitopathy. They were screened for the prevalence of a spontaneous pulsation of the central retinal vein. In case of a missing spontaneous pulse, the collapse pressure of the central retinal vein was estimated by a modified ophthalmodynamometry using a corneal contact lens associated ophthalmodynamometric device. A group of 122 patients (156 eyes) without orbital or retinal diseases served as control group. The frequency of a spontaneous pulse of the central retinal vein was significantly lower in the study group (1/13 or 8%) than in the control group (121/156 or 78%; p<0.001; odds ratio: 41.5). The central retinal vein collapse pressure as determined by ophthalmodynamometry was significantly higher in the study group (22.7+/-19.5 arbitrary units) than in the control group (4.7+/-12.8 arbitrary units) (p=0.002). For one patient with 7 examinations during a follow-up of 16 months, the central retinal vein pressure increased from 17 arbitrary units to 56 units, and decreased to 14 to 19 arbitrary units after initiation of a systemic therapy and regression of the exophthalmos. Three years later a spontaneous pulsation of the central retinal vein was detectable. Ophthalmodynamometry may be a useful examination for the indirect assessment of the orbital tissue pressure in patients with endocrine orbitopathy.

  10. Comparison of the Effect of Aliskiren Versus Negative Controls on Aortic Stiffness in Patients With Marfan Syndrome Under Treatment With Atenolol.

    PubMed

    Hwang, Ji-Won; Kim, Eun Kyoung; Jang, Shin Yi; Chung, Tae-Young; Ki, Chang-Seok; Sung, Kiick; Kim, Sung Mok; Ahn, Joonghyun; Carriere, Keumhee; Choe, Yeon Hyeon; Chang, Sung-A; Kim, Duk-Kyung

    2017-11-29

    The aim of this study was to evaluate the effect of aliskiren on aortic stiffness in patients with Marfan syndrome (MS). Twenty-eight MS patients (mean age ± standard deviation: 32.6 ± 10.6 years) were recruited from November 2009 to October 2014. All patients were receiving atenolol as standard beta-blocker therapy. A prospective randomization process was performed to assign participants to either aliskiren treatment (150-300 mg orally per day) or no aliskiren treatment (negative control) in an open-label design. Central aortic distensibility and central pulse wave velocity (PWV) by magnetic resonance imaging (MRI), peripheral PWV, central aortic blood pressure and augmentation index by peripheral tonometry, and aortic dilatation by echocardiography were examined initially and after 24 weeks. The primary endpoint was central aortic distensibility by MRI. In analyses of differences between baseline and 24 weeks for the aliskiren treatment group vs the negative control group, central distensibility (overall; P = .26) and central PWV (0.2 ± 0.9 vs 0.03 ± 0.7 [m/s]; P = .79) by MRI were not significantly different. Central systolic aortic blood pressure tended to be lower by 14 mmHg in patients in the aliskiren treatment group than in the control group (P = .09). A significant decrease in peripheral PWV (brachial-ankle PWV) in the aliskiren treatment group (-1.6 m/s) compared with the control group (+0.28 m/s) was noted (P = .005). Among patients with MS, the addition of aliskiren to beta-blocker treatment did not significantly improve central aortic stiffness during a 24-week period. Copyright © 2017 Sociedad Española de Cardiología. Published by Elsevier España, S.L.U. All rights reserved.

  11. Client-Server: What Is It and Are We There Yet?

    ERIC Educational Resources Information Center

    Gershenfeld, Nancy

    1995-01-01

    Discusses client-server architecture in dumb terminals, personal computers, local area networks, and graphical user interfaces. Focuses on functions offered by client personal computers: individualized environments; flexibility in running operating systems; advanced operating system features; multiuser environments; and centralized data…

  12. Great Expectations: Distributed Financial Computing at Cornell.

    ERIC Educational Resources Information Center

    Schulden, Louise; Sidle, Clint

    1988-01-01

    The Cornell University Distributed Accounting (CUDA) system is an attempt to provide departments a software tool for better managing their finances, creating microcomputer standards, creating a vehicle for better administrative microcomputer support, and ensuring local systems are consistent with central computer systems. (Author/MLW)

  13. 7 CFR 3203.3 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... described in paragraph (1) of this definition. Computers or other technical equipment means central... 7 Agriculture 15 2013-01-01 2013-01-01 false Definitions. 3203.3 Section 3203.3 Agriculture..., DEPARTMENT OF AGRICULTURE GUIDELINES FOR THE TRANSFER OF EXCESS COMPUTERS OR OTHER TECHNICAL EQUIPMENT...

  14. Effects of computer monitor-emitted radiation on oxidant/antioxidant balance in cornea and lens from rats

    PubMed Central

    Namuslu, Mehmet; Devrim, Erdinç; Durak, İlker

    2009-01-01

    Purpose This study aims to investigate the possible effects of computer monitor-emitted radiation on the oxidant/antioxidant balance in corneal and lens tissues and to observe any protective effects of vitamin C (vit C). Methods Four groups (PC monitor, PC monitor plus vitamin C, vitamin C, and control) each consisting of ten Wistar rats were studied. The study lasted for three weeks. Vitamin C was administered in oral doses of 250 mg/kg/day. The computer and computer plus vitamin C groups were exposed to computer monitors while the other groups were not. Malondialdehyde (MDA) levels and superoxide dismutase (SOD), glutathione peroxidase (GSH-Px), and catalase (CAT) activities were measured in corneal and lens tissues of the rats. Results In corneal tissue, MDA levels and CAT activity were found to increase in the computer group compared with the control group. In the computer plus vitamin C group, MDA level, SOD, and GSH-Px activities were higher and CAT activity lower than those in the computer and control groups. Regarding lens tissue, in the computer group, MDA levels and GSH-Px activity were found to increase, as compared to the control and computer plus vitamin C groups, and SOD activity was higher than that of the control group. In the computer plus vitamin C group, SOD activity was found to be higher and CAT activity to be lower than those in the control group. Conclusion The results of this study suggest that computer-monitor radiation leads to oxidative stress in the corneal and lens tissues, and that vitamin C may prevent oxidative effects in the lens. PMID:19960068

  15. SPH Simulations of Spherical Bondi Accretion: First Step of Implementing AGN Feedback in Galaxy Formation

    NASA Astrophysics Data System (ADS)

    Barai, Paramita; Proga, D.; Nagamine, K.

    2011-01-01

    Our motivation is to numerically test the assumption of Black Hole (BH) accretion (that the central massive BH of a galaxy accretes mass at the Bondi-Hoyle accretion rate, with ad-hoc choice of parameters), made in many previous galaxy formation studies including AGN feedback. We perform simulations of a spherical distribution of gas, within the radius range 0.1 - 200 pc, accreting onto a central supermassive black hole (the Bondi problem), using the 3D Smoothed Particle Hydrodynamics code Gadget. In our simulations we study the radial distribution of various gas properties (density, velocity, temperature, Mach number). We compute the central mass inflow rate at the inner boundary (0.1 pc), and investigate how different gas properties (initial density and velocity profiles) and computational parameters (simulation outer boundary, particle number) affect the central inflow. Radiative processes (namely heating by a central X-ray corona and gas cooling) have been included in our simulations. We study the thermal history of accreting gas, and identify the contribution of radiative and adiabatic terms in shaping the gas properties. We find that the current implementation of artificial viscosity in the Gadget code causes unwanted extra heating near the inner radius.
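
    For reference, the Bondi-Hoyle prescription being tested is commonly written as Mdot = 4*pi*G^2*M_BH^2*rho / (c_s^2 + v^2)^(3/2). The sketch below evaluates it for illustrative gas parameters; the boost factor alpha used in many galaxy-formation codes is set to 1 here, and all numerical inputs are assumptions chosen only for illustration.

        import numpy as np

        G = 6.674e-8        # gravitational constant, cgs (cm^3 g^-1 s^-2)
        M_SUN = 1.989e33    # solar mass in grams

        def bondi_hoyle_rate(m_bh_msun, rho, c_s, v_rel=0.0, alpha=1.0):
            """Bondi-Hoyle accretion rate in g/s (cgs inputs; BH mass in solar masses)."""
            m_bh = m_bh_msun * M_SUN
            return alpha * 4.0 * np.pi * G**2 * m_bh**2 * rho / (c_s**2 + v_rel**2) ** 1.5

        # Illustrative numbers: 1e8 Msun BH in gas of 1e-24 g/cm^3 with a 10 km/s sound speed.
        mdot = bondi_hoyle_rate(1e8, rho=1e-24, c_s=1e6)
        print(mdot / M_SUN * 3.15e7, "Msun/yr")   # convert g/s to Msun per year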

  16. The efficacy of computer-enabled discharge communication interventions: a systematic review.

    PubMed

    Motamedi, Soror Mona; Posadas-Calleja, Juan; Straus, Sharon; Bates, David W; Lorenzetti, Diane L; Baylis, Barry; Gilmour, Janet; Kimpton, Shandra; Ghali, William A

    2011-05-01

    Traditional manual/dictated discharge summaries are inaccurate, inconsistent and untimely. Computer-enabled discharge communications may improve information transfer by providing a standardised document that immediately links acute and community healthcare providers. To conduct a systematic review evaluating the efficacy of computer-enabled discharge communication compared with traditional communication for patients discharged from acute care hospitals. MEDLINE, EMBASE, Cochrane CENTRAL Register of Controlled Trials and MEDLINE In-Process. Keywords from three themes were combined: discharge communication, electronic/online/web-based and controlled interventional studies. Study types included: clinical trials, quasi-experimental studies with concurrent controls and controlled before-after studies. Interventions included: (1) automatic population of a discharge document by computer database(s); (2) transmission of discharge information via computer technology; or (3) computer technology providing a 'platform' for dynamic discharge communication. Controls included: no intervention or traditional manual/dictated discharge summaries. Primary outcomes included: mortality, readmission and adverse events/near misses. Secondary outcomes included: timeliness, accuracy, quality/completeness and physician/patient satisfaction. Description of interventions and study outcomes were extracted by two independent reviewers. 12 unique studies were identified: eight randomised controlled trials and four quasi-experimental studies. Pooling/meta-analysis was not possible, given the heterogeneity of measures and outcomes reported. The primary outcomes of mortality and readmission were inconsistently reported. There was no significant difference in mortality, and one study reported reduced long-term readmission. Intervention groups experienced reductions in perceived medical errors/adverse events, and improvements in timeliness and physician/patient satisfaction. Computer-enabled discharge communications appear beneficial with respect to a number of important secondary outcomes. Primary outcomes of mortality and readmission are less commonly reported in this literature and require further study.

  17. Probability of survival during accidental immersion in cold water.

    PubMed

    Wissler, Eugene H

    2003-01-01

    Estimating the probability of survival during accidental immersion in cold water presents formidable challenges for both theoreticians and empiricists. A number of theoretical models have been developed assuming that death occurs when the central body temperature, computed using a mathematical model, falls to a certain level. This paper describes a different theoretical approach to estimating the probability of survival. The human thermal model developed by Wissler is used to compute the central temperature during immersion in cold water. Simultaneously, a survival probability function is computed by solving a differential equation that defines how the probability of survival decreases with increasing time. The survival equation assumes that the probability of occurrence of a fatal event increases as the victim's central temperature decreases. Generally accepted views of the medical consequences of hypothermia and published reports of various accidents provide information useful for defining a "fatality function" that increases exponentially with decreasing central temperature. The particular function suggested in this paper yields a relationship between immersion time for 10% probability of survival and water temperature that agrees very well with Molnar's empirical observations based on World War II data. The method presented in this paper circumvents a serious difficulty with most previous models: that one's ability to survive immersion in cold water is determined almost exclusively by the ability to maintain a high level of shivering metabolism.
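
    A minimal sketch of the survival-probability idea described above, assuming an exponential fatality function and a linear decline of central temperature. The parameter values and cooling profile are illustrative only and are not those of Wissler's thermal model or the paper's fitted fatality function.

        import numpy as np

        def survival_probability(t_hours, T_central, a=1e-4, b=0.8, T_ref=37.0):
            """Integrate dS/dt = -h(Tc) * S with h(Tc) = a * exp(b * (T_ref - Tc)).

            t_hours   : increasing time grid (h)
            T_central : central body temperature (deg C) at each time point
            """
            h = a * np.exp(b * (T_ref - T_central))   # fatality rate, 1/h
            # cumulative hazard via the trapezoidal rule, then S(t) = exp(-H(t))
            H = np.concatenate(([0.0], np.cumsum(0.5 * (h[1:] + h[:-1]) * np.diff(t_hours))))
            return np.exp(-H)

        # Toy cooling profile: central temperature falling 3 deg C per hour in cold water.
        t = np.linspace(0.0, 4.0, 200)
        S = survival_probability(t, 37.0 - 3.0 * t)
        print("P(survival) after 2 h:", np.interp(2.0, t, S))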

  18. NASA CORE (Central Operation of Resources for Educators) Educational Materials Catalog

    NASA Technical Reports Server (NTRS)

    1998-01-01

    This educational materials catalog presents NASA CORE (Central Operation of Resources for Educators). The topics include: 1) Videocassettes (Aeronautics, Earth Resources, Weather, Space Exploration/Satellites, Life Sciences, Careers); 2) Slide Programs; 3) Computer Materials; 4) NASA Memorabilia/Miscellaneous; 5) NASA Educator Resource Centers; 6) and NASA Resources.

  19. Informatic parcellation of the network involved in the computation of subjective value

    PubMed Central

    Rangel, Antonio

    2014-01-01

    Understanding how the brain computes value is a basic question in neuroscience. Although individual studies have driven this progress, meta-analyses provide an opportunity to test hypotheses that require large collections of data. We carry out a meta-analysis of a large set of functional magnetic resonance imaging studies of value computation to address several key questions. First, what is the full set of brain areas that reliably correlate with stimulus values when they need to be computed? Second, is this set of areas organized into dissociable functional networks? Third, is a distinct network of regions involved in the computation of stimulus values at decision and outcome? Finally, are different brain areas involved in the computation of stimulus values for different reward modalities? Our results demonstrate the centrality of ventromedial prefrontal cortex (VMPFC), ventral striatum and posterior cingulate cortex (PCC) in the computation of value across tasks, reward modalities and stages of the decision-making process. We also find evidence of distinct subnetworks of co-activation within VMPFC, one involving central VMPFC and dorsal PCC and another involving more anterior VMPFC, left angular gyrus and ventral PCC. Finally, we identify a posterior-to-anterior gradient of value representations corresponding to concrete-to-abstract rewards. PMID:23887811

  20. Astrophysical reaction rates from a symmetry-informed first-principles perspective

    NASA Astrophysics Data System (ADS)

    Dreyfuss, Alison; Launey, Kristina; Baker, Robert; Draayer, Jerry; Dytrych, Tomas

    2017-01-01

    With a view toward a new unified formalism for studying bound and continuum states in nuclei, to understand stellar nucleosynthesis from a fully ab initio perspective, we studied the nature of surface α-clustering in ²⁰Ne by considering the overlap of symplectic states with cluster-like states. We compute the spectroscopic amplitudes and factors, α-decay width, and absolute resonance strength - characterizing major contributions to the astrophysical reaction rate through a low-lying 1⁻ resonant state in ²⁰Ne. As a next step, we consider a fully microscopic treatment for the n+⁴He system, based on the successful first-principles No-Core Shell Model/Resonating Group Method (NCSM/RGM) for light nuclei, but with the capability to reach intermediate-mass nuclei. The new model takes advantage of the symmetry-based concept central to the Symmetry-Adapted No-Core Shell Model (SA-NCSM) to reduce computational complexity in a physically informed and methodical way, with sights toward first-principles calculations of rates for important astrophysical reactions, such as the ²³Al(p,γ)²⁴Si reaction, believed to have a strong influence on X-ray burst light curves. Supported by the U.S. NSF (OCI-0904874, ACI-1516338) and the U.S. DOE (DE-SC0005248), and benefitted from computing resources provided by Blue Waters and the LSU Center for Computation & Technology.

  1. Computer use, internet access, and online health searching among Harlem adults.

    PubMed

    Cohall, Alwyn T; Nye, Andrea; Moon-Howard, Joyce; Kukafka, Rita; Dye, Bonnie; Vaughan, Roger D; Northridge, Mary E

    2011-01-01

    Computer use, Internet access, and online searching for health information were assessed toward enhancing Internet use for health promotion. Cross-sectional random digit dial landline phone survey. Eight zip codes that comprised Central Harlem/Hamilton Heights and East Harlem in New York City. Adults 18 years and older (N=646). Demographic characteristics, computer use, Internet access, and online searching for health information. Frequencies for categorical variables and means and standard deviations for continuous variables were calculated and compared with analogous findings reported in national surveys from similar time periods. Among Harlem adults, ever computer use and current Internet use were 77% and 52%, respectively. High-speed home Internet connections were somewhat lower for Harlem adults than for U.S. adults overall (43% vs. 68%). Current Internet users in Harlem were more likely to be younger, white vs. black or Hispanic, better educated, and in better self-reported health than non-current users (p<.01). Of those who reported searching online for health information, 74% sought information on medical problems and thought that information found on the Internet affected the way they eat (47%) or exercise (44%). Many Harlem adults currently use the Internet to search for health information. High-speed connections and culturally relevant materials may facilitate health information searching for underserved groups. Copyright © 2011 by American Journal of Health Promotion, Inc.

  2. On the sample complexity of learning for networks of spiking neurons with nonlinear synaptic interactions.

    PubMed

    Schmitt, Michael

    2004-09-01

    We study networks of spiking neurons that use the timing of pulses to encode information. Nonlinear interactions model the spatial groupings of synapses on the neural dendrites and describe the computations performed at local branches. Within a theoretical framework of learning we analyze the question of how many training examples these networks must receive to be able to generalize well. Bounds for this sample complexity of learning can be obtained in terms of a combinatorial parameter known as the pseudodimension. This dimension characterizes the computational richness of a neural network and is given in terms of the number of network parameters. Two types of feedforward architectures are considered: constant-depth networks and networks of unconstrained depth. We derive asymptotically tight bounds for each of these network types. Constant depth networks are shown to have an almost linear pseudodimension, whereas the pseudodimension of general networks is quadratic. Networks of spiking neurons that use temporal coding are becoming increasingly more important in practical tasks such as computer vision, speech recognition, and motor control. The question of how well these networks generalize from a given set of training examples is a central issue for their successful application as adaptive systems. The results show that, although coding and computation in these networks is quite different and in many cases more powerful, their generalization capabilities are at least as good as those of traditional neural network models.

  3. Benidipine has effects similar to losartan on the central blood pressure and arterial stiffness in mild to moderate essential hypertension.

    PubMed

    Ihm, Sang-Hyun; Jeon, Hui-Kyung; Chae, Shung Chull; Lim, Do-Sun; Kim, Kee-Sik; Choi, Dong-Ju; Ha, Jong-Won; Kim, Dong-Soo; Kim, Kye Hun; Cho, Myeong-Chan; Baek, Sang Hong

    2013-01-01

    Central blood pressure (BP) is pathophysiologically more important than peripheral BP for the pathogenesis of cardiovascular disease. Arterial stiffness is also a good predictor of cardiovascular morbidity and mortality. The effects of benidipine, a unique dual L-/T-type calcium channel blocker, on central BP have not been reported. This study aimed to compare the effects of benidipine and losartan on central BP and arterial stiffness in mild to moderate essential hypertensives. This 24-week, multi-center, open-label, randomized, active-drug-comparative, parallel-group study was designed as a non-inferiority study. The eligible patients (n = 200) were randomly assigned to receive benidipine (n = 101) or losartan (n = 99). Radial artery applanation tonometry and pulse wave analysis were used to measure the central BP, pulse wave velocity (PWV) and augmentation index (AIx). We also measured the metabolic and inflammatory markers. After 24 weeks, the central BP decreased significantly from baseline by (16.8 ± 14.0/10.5 ± 9.2) mmHg (1 mmHg = 0.133 kPa) (systolic/diastolic BP; P < 0.001) in the benidipine group and (18.9 ± 14.7/12.1 ± 10.2) mmHg (P < 0.001) in the losartan group, respectively. Both the benidipine and losartan groups showed significant lowering of peripheral BP (P < 0.001) and AIx (P < 0.05), but there were no significant differences between the two groups. The mean aortic, brachial and femoral PWV did not change in either group after the 24-week treatment. There were no significant changes in blood metabolic and inflammatory biomarkers in either group. Benidipine is as effective as losartan in lowering the central and peripheral BP, and improving arterial stiffness.

  4. An In Vitro Study on the Effects of Post-Core Design and Ferrule on the Fracture Resistance of Endodontically Treated Maxillary Central Incisors.

    PubMed

    Sreedevi, S; Sanjeev, R; Raghavan, Rekha; Abraham, Anna; Rajamani, T; Govind, Girish Kumar

    2015-08-01

    Endodontically treated teeth have significantly different physical and mechanical properties compared to vital teeth and are more prone to fracture. The study aims to compare the fracture resistance of endodontically treated teeth with and without post reinforcement, custom cast post-core and prefabricated post with glass ionomer core and to evaluate the ferrule effect on endodontically treated teeth restored with custom cast post-core. A total of 40 human maxillary central incisors with similar dimensions devoid of any root caries, restorations, previous endodontic treatment or cracks were selected from a collection of stored extracted teeth. An initial silicone index of each tooth was made. They were treated endodontically and divided into four groups of ten specimens each. Their apical seal was maintained with 4 mm of gutta-percha. Root canal preparation was done and then post core fabrication was done. The prepared specimens were subjected to load testing using a computer coordinated UTM. The fracture load results were then statistically analyzed. One-way ANOVA was followed by paired t-test. 1. Reinforcement of endodontically treated maxillary central incisors with post and core, improved their fracture resistance to be at par with that of endodontically treated maxillary central incisor, with natural crown. 2. The fracture resistance of endodontically treated maxillary central incisors is significantly increased when restored with custom cast post-core and 2 mm ferrule. With 2 mm ferrule, teeth restored with custom cast post-core had a significantly higher fracture resistance than teeth restored with custom cast post-core or prefabricated post and glass ionomer core without ferrule.

  5. Paleocene coal deposits of the Wilcox group, central Texas

    USGS Publications Warehouse

    Hook, Robert W.; Warwick, Peter D.; SanFilipo, John R.; Schultz, Adam C.; Nichols, Douglas J.; Swanson, Sharon M.; Warwick, Peter D.; Karlsen, Alexander K.; Merrill, Matthew D.; Valentine, Brett J.

    2011-01-01

    Coal deposits in the Wilcox Group of central Texas have been regarded as the richest coal resources in the Gulf Coastal Plain. Although minable coal beds appear to be less numerous and generally higher in sulfur content (1 percent average, as-received basis; table 1) than Wilcox coal deposits in the Northeast Texas and Louisiana Sabine assessment areas (0.5 and 0.6 percent sulfur, respectively; table 1), net coal thickness in coal zones in central Texas is up to 32 ft, and the coal zones are more persistent along strike (up to 15 mi) at or near the surface than coals of any other Gulf Coast assessment area. The rank of the coal beds in central Texas is generally lignite (table 1), but some coal ranks as great as subbituminous C have been reported (Mukhopadhyay, 1989). The outcrop of the Wilcox Group in central Texas strikes northeast, extends for approximately 140 mi between the Trinity and Colorado Rivers, and covers parts of Bastrop, Falls, Freestone, Lee, Leon, Limestone, Milam, Navarro, Robertson, and Williamson Counties (Figure 1). Three formations, in ascending order, the Hooper, Simsboro, and Calvert Bluff, are recognized in central Texas (Figure 2). The Wilcox Group is underlain conformably by the Midway Group, a mudstone-dominated marine sequence, and is overlain and scoured locally by the Carrizo Sand, a fluvial unit at the base of the Claiborne Group.

  6. A study of tapping by the unaffected finger of patients presenting with central and peripheral nerve damage.

    PubMed

    Zhang, Lingli; Han, Xiuying; Li, Peihong; Liu, Yang; Zhu, Yulian; Zou, Jun; Yu, Zhusheng

    2015-01-01

    Whether the function of the unaffected hand is altered in patients presenting with nerve injury remains inconclusive. We aimed to evaluate whether finger tapping differs following central or peripheral nerve injury, comparing the unaffected hand of patients with the corresponding hand of healthy subjects. Thirty right-brain stroke patients with hemiplegia, 30 left-arm peripheral nerve injury cases, and 60 healthy people were selected. We tested finger tapping of the right hands, and each subject performed the test twice. Finger tapping of the unaffected hand following peripheral nerve injury and of the dominant hand of healthy subjects was markedly higher than following central nerve injury (P < 0.05). In males, finger tapping of the peripheral group's unaffected hand and of the control group's dominant hand was significantly higher than that of the central group (P < 0.001). However, in females, finger tapping of the control group's dominant hand was significantly higher than that of both the central group's unaffected hand (P < 0.01, P = 0.002) and the peripheral group's unaffected hand (P < 0.05, P = 0.034). The function of the unaffected hand of patients with central or peripheral nerve injury thus differed from that of the corresponding hand of healthy individuals. Rehabilitation therapists should intensify the patient's practice of fine upper-limb activities and coordination.

  7. Better Safe than Sorry - Socio-Spatial Group Structure Emerges from Individual Variation in Fleeing, Avoidance or Velocity in an Agent-Based Model

    PubMed Central

    Evers, Ellen; de Vries, Han; Spruijt, Berry M.; Sterck, Elisabeth H. M.

    2011-01-01

    In group-living animals, such as primates, the average spatial group structure often reflects the dominance hierarchy, with central dominants and peripheral subordinates. This central-peripheral group structure can arise by self-organization as a result of subordinates fleeing from dominants after losing a fight. However, in real primates, subordinates often avoid interactions with potentially aggressive group members, thereby preventing aggression and subsequent fleeing. Using agent-based modeling, we investigated which spatial and encounter structures emerge when subordinates also avoid known potential aggressors at a distance as compared with the model which only included fleeing after losing a fight (fleeing model). A central-peripheral group structure emerged in most conditions. When avoidance was employed at small or intermediate distances, centrality of dominants emerged similar to the fleeing model, but in a more pronounced way. This result was also found when fleeing after a fight was made independent of dominance rank, i.e. occurred randomly. Employing avoidance at larger distances yielded more spread out groups. This provides a possible explanation of larger group spread in more aggressive species. With avoidance at very large distances, spatially and socially distinct subgroups emerged. We also investigated how encounters were distributed amongst group members. In the fleeing model all individuals encountered all group members equally often, whereas in the avoidance model encounters occurred mostly among similar-ranking individuals. Finally, we also identified a very general and simple mechanism causing a central-peripheral group structure: when individuals merely differed in velocity, faster individuals automatically ended up at the periphery. In summary, a central-peripheral group pattern can easily emerge from individual variation in different movement properties in general, such as fleeing, avoidance or velocity. Moreover, avoidance behavior also affects the encounter structure and can lead to subgroup formation. PMID:22125595
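
    A highly simplified sketch of the fleeing rule in such a model, assuming a fixed dominance hierarchy, deterministic fight outcomes by rank, random drift, and a weak grouping tendency. It illustrates how subordinates fleeing after lost encounters can leave dominants near the group centre, but it omits the avoidance-at-a-distance rule, dominance updating, and other details of the published model.

        import numpy as np

        rng = np.random.default_rng(2)
        N, STEPS = 20, 20000
        FIGHT_RADIUS, FLEE_STEP, WALK_STEP, COHESION = 1.0, 1.0, 0.05, 0.01

        pos = rng.uniform(-3, 3, size=(N, 2))   # agent positions
        rank = np.arange(N)                     # 0 = most dominant individual

        for _ in range(STEPS):
            pos += COHESION * (pos.mean(axis=0) - pos)         # weak grouping tendency
            pos += rng.normal(scale=WALK_STEP, size=(N, 2))    # random drift
            i, j = rng.choice(N, size=2, replace=False)
            if np.linalg.norm(pos[j] - pos[i]) < FIGHT_RADIUS:  # encounter
                winner, loser = (i, j) if rank[i] < rank[j] else (j, i)
                away = pos[loser] - pos[winner]
                pos[loser] += FLEE_STEP * away / (np.linalg.norm(away) + 1e-9)

        dist = np.linalg.norm(pos - pos.mean(axis=0), axis=1)
        # If fleeing produces a central-peripheral structure, rank and distance
        # from the group centre tend to be positively correlated.
        print(np.corrcoef(rank, dist)[0, 1])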

  8. Color vision and neuroretinal function in diabetes.

    PubMed

    Wolff, B E; Bearse, M A; Schneck, M E; Dhamdhere, K; Harrison, W W; Barez, S; Adams, A J

    2015-04-01

    We investigate how type 2 diabetes (T2DM) and diabetic retinopathy (DR) affect color vision (CV) and mfERG implicit time (IT), whether CV and IT are correlated, and whether CV and IT abnormality classifications agree. Adams desaturated D-15 color test, mfERG, and fundus photographs were examined in 37 controls, 22 T2DM patients without DR (NoRet group), and 25 T2DM patients with DR (Ret group). Color confusion score (CCS) was calculated. ITs were averaged within the central 7 hexagons (central IT; ≤4.5°) and outside this area (peripheral IT; ≥4.5°). DR was within (DRIN) or outside (DROUT) of the central 7 hexagons. Group differences, percentages of abnormalities, correlations, and agreement were determined. CCS was greater in the NoRet (P = 0.002) and Ret (P < 0.0001) groups than in control group. CCS was abnormal in 3, 41, and 48 % of eyes in the control, NoRet, and Ret groups, respectively. Ret group CV abnormalities were more frequent in DRIN than in DROUT subgroups (71 vs. 18 %, respectively; P < 0.0001). CCS and IT were correlated only in the Ret group, in both retinal zones (P ≤ 0.028). Only in the Ret group did CCS and peripheral IT abnormality classifications agree (72 %; P < 0.05). CV is affected in patients with T2DM, even without DR. Central DR increases the likelihood of a CV deficit compared with non-central DR. mfERG IT averaged across central or peripheral retinal locations is less frequently abnormal than CV in the absence of DR, and these two measures are correlated only when DR is present.

  9. Color vision and neuroretinal function in diabetes

    PubMed Central

    Bearse, M. A.; Schneck, M. E.; Dhamdhere, K.; Harrison, W. W.; Barez, S.; Adams, A. J.

    2015-01-01

    Purpose We investigate how type 2 diabetes (T2DM) and diabetic retinopathy (DR) affect color vision (CV) and mfERG implicit time (IT), whether CV and IT are correlated, and whether CV and IT abnormality classifications agree. Methods Adams desaturated D-15 color test, mfERG, and fundus photographs were examined in 37 controls, 22 T2DM patients without DR (NoRet group), and 25 T2DM patients with DR (Ret group). Color confusion score (CCS) was calculated. ITs were averaged within the central 7 hexagons (central IT; ≤4.5°) and outside this area (peripheral IT; ≥4.5°). DR was within (DRIN) or outside (DROUT) of the central 7 hexagons. Group differences, percentages of abnormalities, correlations, and agreement were determined. Results CCS was greater in the NoRet (P = 0.002) and Ret (P < 0.0001) groups than in control group. CCS was abnormal in 3, 41, and 48 % of eyes in the control, NoRet, and Ret groups, respectively. Ret group CV abnormalities were more frequent in DRIN than in DROUT subgroups (71 vs. 18 %, respectively; P < 0.0001). CCS and IT were correlated only in the Ret group, in both retinal zones (P ≤ 0.028). Only in the Ret group did CCS and peripheral IT abnormality classifications agree (72 %; P < 0.05). Conclusion CV is affected in patients with T2DM, even without DR. Central DR increases the likelihood of a CV deficit compared with non-central DR. mfERG IT averaged across central or peripheral retinal locations is less frequently abnormal than CV in the absence of DR, and these two measures are correlated only when DR is present. PMID:25516428

  10. Solvation structures of water in trihexyltetradecylphosphonium-orthoborate ionic liquids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yong-Lei, E-mail: wangyonl@gmail.com; System and Component Design, Department of Machine Design, KTH Royal Institute of Technology, SE-100 44 Stockholm; Sarman, Sten

    2016-08-14

    Atomistic molecular dynamics simulations have been performed to investigate effective interactions of isolated water molecules dispersed in trihexyltetradecylphosphonium-orthoborate ionic liquids (ILs). The intrinsic free energy changes in solvating one water molecule from gas phase into bulk IL matrices were estimated as a function of temperature, and thereafter, the calculations of potential of mean force between two dispersed water molecules within different IL matrices were performed using umbrella sampling simulations. The systematic analyses of local ionic microstructures, orientational preferences, probability and spatial distributions of dispersed water molecules around neighboring ionic species indicate their preferential coordinations to central polar segments in orthoborate anions. The effective interactions between two dispersed water molecules are partially or totally screened as their separation distance increases due to interference of ionic species in between. These computational results connect microscopic anionic structures with macroscopically and experimentally observed difficulty in completely removing water from synthesized IL samples and suggest that the introduction of hydrophobic groups to central polar segments and the formation of conjugated ionic structures in orthoborate anions can effectively reduce residual water content in the corresponding IL samples.

  11. 5 CFR 532.315 - Additional survey jobs.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... data obtained in special industries WG-10 Communications Telephone Installer-Repairer WG-9 Central... Repairer WG-11 Electronic Computer Mechanic WG-11 Television Station Mechanic WG-11 Guided missiles Electronic Computer Mechanic WG-11 Guided Missile Mechanical Repairer WG-11 Heavy duty equipment Heavy Mobile...

  12. 36 CFR 200.1 - Central organization.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., engineering, lands, aviation, and computer systems. The National Forest System includes: 155 Proclaimed or... other environmental concerns, forest insects and disease, forest fire and atmospheric science. Plans and...-wide management of systems and computer applications. [41 FR 24350, June 16, 1976, as amended at 42 FR...

  13. 36 CFR 200.1 - Central organization.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., engineering, lands, aviation, and computer systems. The National Forest System includes: 155 Proclaimed or... other environmental concerns, forest insects and disease, forest fire and atmospheric science. Plans and...-wide management of systems and computer applications. [41 FR 24350, June 16, 1976, as amended at 42 FR...

  14. 36 CFR 200.1 - Central organization.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., engineering, lands, aviation, and computer systems. The National Forest System includes: 155 Proclaimed or... other environmental concerns, forest insects and disease, forest fire and atmospheric science. Plans and...-wide management of systems and computer applications. [41 FR 24350, June 16, 1976, as amended at 42 FR...

  15. 36 CFR 200.1 - Central organization.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., engineering, lands, aviation, and computer systems. The National Forest System includes: 155 Proclaimed or... other environmental concerns, forest insects and disease, forest fire and atmospheric science. Plans and...-wide management of systems and computer applications. [41 FR 24350, June 16, 1976, as amended at 42 FR...

  16. A Computer Program Functional Design of the Simulation Subsystem of an Automated Central Flow Control System

    DOT National Transportation Integrated Search

    1976-08-01

    This report contains a functional design for the simulation of a future automation concept in support of the ATC Systems Command Center. The simulation subsystem performs airport airborne arrival delay predictions and computes flow control tables for...

  17. Microcomputers in Education.

    ERIC Educational Resources Information Center

    Anderson, Cheryl A.

    Designed to answer basic questions educators have about microcomputer hardware and software and their applications in teaching, this paper describes the revolution in computer technology that has resulted from the development of the microchip processor and provides information on the major computer components, i.e., input, central processing unit,…

  18. COMPUTER PROGRAM FOR CALCULATING THE COST OF DRINKING WATER TREATMENT SYSTEMS

    EPA Science Inventory

    This FORTRAN computer program calculates the construction and operation/maintenance costs for 45 centralized unit treatment processes for water supply. The calculated costs are based on various design parameters and raw water quality. These cost data are applicable to small size ...

  19. Concept of operations for the use of connected vehicle data in road weather applications.

    DOT National Transportation Integrated Search

    2006-01-30

    The Computer Aided Dispatch (CAD) computer system went into live operation January 2002. System design involved creating a distributed network, which involved setting up a central main server at the Idaho State Police (ISP) headquarters located in Me...

  20. Thermodynamics of quasideterministic digital computers

    NASA Astrophysics Data System (ADS)

    Chu, Dominique

    2018-02-01

    A central result of stochastic thermodynamics is that irreversible state transitions of Markovian systems entail a cost in terms of an infinite entropy production. A corollary of this is that strictly deterministic computation is not possible. Using a thermodynamically consistent model, we show that quasideterministic computation can be achieved at finite, and indeed modest, cost with accuracies that are indistinguishable from deterministic behavior for all practical purposes. Concretely, we consider the entropy production of stochastic (Markovian) systems that behave like AND and NOT gates. Combinations of these gates can implement any logical function. We require that these gates return the correct result with a probability that is very close to 1, and additionally, that they do so within finite time. The central component of the model is a machine that can read and write binary tapes. We find that the error probability of the computation of these gates falls with a power of the system size, whereas the cost only increases linearly with the system size.

  1. Uniformity testing: assessment of a centralized web-based uniformity analysis system.

    PubMed

    Klempa, Meaghan C

    2011-06-01

    Uniformity testing is performed daily to ensure adequate camera performance before clinical use. The aim of this study is to assess the reliability of Beth Israel Deaconess Medical Center's locally built, centralized, Web-based uniformity analysis system by examining the differences between manufacturer and Web-based National Electrical Manufacturers Association integral uniformity calculations measured in the useful field of view (FOV) and the central FOV. Manufacturer and Web-based integral uniformity calculations measured in the useful FOV and the central FOV were recorded over a 30-d period for 4 cameras from 3 different manufacturers. These data were then statistically analyzed. The differences between the uniformity calculations were computed, in addition to the means and the SDs of these differences for each head of each camera. There was a correlation between the manufacturer and Web-based integral uniformity calculations in the useful FOV and the central FOV over the 30-d period. The average differences between the manufacturer and Web-based useful FOV calculations ranged from -0.30 to 0.099, with SD ranging from 0.092 to 0.32. For the central FOV calculations, the average differences ranged from -0.163 to 0.055, with SD ranging from 0.074 to 0.24. Most of the uniformity calculations computed by this centralized Web-based uniformity analysis system are comparable to the manufacturers' calculations, suggesting that this system is reasonably reliable and effective. This finding is important because centralized Web-based uniformity analysis systems are advantageous in that they test camera performance in the same manner regardless of the manufacturer.
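
    The integral uniformity figure discussed above is, in essence, a min/max contrast over a region of the flood image. The sketch below is a simplified illustration of that calculation and not the Web-based system evaluated in the study; the array layout, the FOV fractions, and the omission of NEMA's required smoothing and masking steps are all assumptions made for brevity.

    import numpy as np

    def integral_uniformity(flood, fov_fraction=1.0):
        # NEMA-style integral uniformity (%) over a centered square region:
        # 100 * (max - min) / (max + min) of the pixel counts in that region.
        ny, nx = flood.shape
        hy, hx = int(ny * fov_fraction) // 2, int(nx * fov_fraction) // 2
        cy, cx = ny // 2, nx // 2
        roi = flood[cy - hy:cy + hy, cx - hx:cx + hx].astype(float)
        return 100.0 * (roi.max() - roi.min()) / (roi.max() + roi.min())

    # Toy flood acquisition; compare useful-FOV vs. central-FOV uniformity.
    flood = np.random.poisson(lam=1000, size=(64, 64))
    print(integral_uniformity(flood, 1.0), integral_uniformity(flood, 0.75))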

  2. A review of bioinformatics training applied to research in molecular medicine, agriculture and biodiversity in Costa Rica and Central America.

    PubMed

    Orozco, Allan; Morera, Jessica; Jiménez, Sergio; Boza, Ricardo

    2013-09-01

    Today, Bioinformatics has become a scientific discipline with great relevance for the Molecular Biosciences and for the Omics sciences in general. Although developed countries have made large strides in Bioinformatics education and research, in other regions, such as Central America, the advances have occurred in a gradual way and with little support from academia, either at the undergraduate or graduate level. To address this problem, the University of Costa Rica's Medical School, a regional leader in Bioinformatics in Central America, has been conducting a series of Bioinformatics workshops, seminars and courses, leading to the creation of the region's first Bioinformatics Master's Degree. The recent creation of the Central American Bioinformatics Network (BioCANET), together with the deployment of a supporting computational infrastructure (HPC Cluster) devoted to providing computing support for Molecular Biology in the region, is laying a foundation for the development of Bioinformatics in the area. Central American bioinformaticians participated in the creation of, and co-founded, the Iberoamerican Bioinformatics Society (SOIBIO). In this article, we review the most recent activities in education and research in Bioinformatics from several regional institutions. These activities have resulted in further advances for Molecular Medicine, Agriculture and Biodiversity research in Costa Rica and the rest of the Central American countries. Finally, we provide summary information on the first Central America Bioinformatics International Congress, as well as the creation of the first Bioinformatics company (Indromics Bioinformatics) to be spun off from academia in Central America and the Caribbean.

  3. The effect of introducing computers into an introductory physics problem-solving laboratory

    NASA Astrophysics Data System (ADS)

    McCullough, Laura Ellen

    2000-10-01

    Computers are appearing in every type of classroom across the country. Yet they often appear without benefit of studying their effects. The research that is available on computer use in classrooms has found mixed results, and often ignores the theoretical and instructional contexts of the computer in the classroom. The University of Minnesota's physics department employs a cooperative-group problem solving pedagogy, based on a cognitive apprenticeship instructional model, in its calculus-based introductory physics course. This study was designed to determine possible negative effects of introducing a computerized data-acquisition and analysis tool into this pedagogy as a problem-solving tool for students to use in laboratory. To determine the effects of the computer tool, two quasi-experimental treatment groups were selected. The computer-tool group (N = 170) used a tool, designed for this study (VideoTool), to collect and analyze motion data in the laboratory. The control group (N = 170) used traditional non-computer equipment (spark tapes and Polaroid(TM) film). The curriculum was kept as similar as possible for the two groups. During the ten week academic quarter, groups were examined for effects on performance on conceptual tests and grades, attitudes towards the laboratory and the laboratory tools, and behaviors within cooperative groups. Possible interactions with gender were also examined. Few differences were found between the control and computer-tool groups. The control group received slightly higher scores on one conceptual test, but this difference was not educationally significant. The computer-tool group had slightly more positive attitudes towards using the computer tool than their counterparts had towards the traditional tools. The computer-tool group also perceived that they spoke more frequently about physics misunderstandings, while the control group felt that they discussed equipment difficulties more often. This perceptual difference interacted with gender, with the men in the control group more likely to discuss equipment difficulties than any other group. Overall, the differences between the control and quasi-experimental groups were minimal. It was concluded that carefully replacing traditional data collection and analysis tools with a computer tool had no negative effects on achievement, attitude, group behavior, and did not interact with gender.

  4. An Integrated Model of the Cardiovascular and Central Nervous Systems for Analysis of Microgravity Induced Fluid Redistribution

    NASA Technical Reports Server (NTRS)

    Price, R.; Gady, S.; Heinemann, K.; Nelson, E. S.; Mulugeta, L.; Ethier, C. R.; Samuels, B. C.; Feola, A.; Vera, J.; Myers, J. G.

    2015-01-01

    A recognized side effect of prolonged microgravity exposure is visual impairment and intracranial pressure (VIIP) syndrome. The medical understanding of this phenomenon is at present preliminary, although it is hypothesized that the headward shift of bodily fluids in microgravity may be a contributor. Computational models can be used to provide insight into the origins of VIIP. In order to further investigate this phenomenon, NASAs Digital Astronaut Project (DAP) is developing an integrated computational model of the human body which is divided into the eye, the cerebrovascular system, and the cardiovascular system. This presentation will focus on the development and testing of the computational model of an integrated model of the cardiovascular system (CVS) and central nervous system (CNS) that simulates the behavior of pressures, volumes, and flows within these two physiological systems.

  5. Automating the Analytical Laboratories Section, Lewis Research Center, National Aeronautics and Space Administration: A feasibility study

    NASA Technical Reports Server (NTRS)

    Boyle, W. G.; Barton, G. W.

    1979-01-01

    The feasibility of computerized automation of the Analytical Laboratories Section at NASA's Lewis Research Center was considered. Since that laboratory's duties are not routine, the automation goals were set with that in mind. Four instruments were selected as the most likely automation candidates: an atomic absorption spectrophotometer, an emission spectrometer, an X-ray fluorescence spectrometer, and an X-ray diffraction unit. Two options for computer automation were described: a time-shared central computer and a system with microcomputers for each instrument connected to a central computer. A third option, presented for future planning, expands the microcomputer version. Costs and benefits for each option were considered. It was concluded that the microcomputer version best fits the goals and duties of the laboratory and that such an automated system is needed to meet the laboratory's future requirements.

  6. Computational Insights into the Central Role of Nonbonding Interactions in Modern Covalent Organocatalysis

    DOE PAGES

    Walden, Daniel; Ogba, O. Maduka; Johnston, Ryne C.; ...

    2016-06-06

    The flexibility, complexity, and size of contemporary organocatalytic transformations pose interesting and powerful opportunities to computational and experimental chemists alike. In this Account, we disclose our recent computational investigations of three branches of organocatalysis in which nonbonding interactions, such as C–H···O/N interactions, play a crucial role in the organization of transition states, catalysis, and selectivity.

  7. Multiple-User, Multitasking, Virtual-Memory Computer System

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.; Roth, Don J.; Stang, David B.

    1993-01-01

    Computer system designed and programmed to serve multiple users in research laboratory. Provides for computer control and monitoring of laboratory instruments, acquisition and analysis of data from those instruments, and interaction with users via remote terminals. System provides fast access to shared central processing units and associated large (from megabytes to gigabytes) memories. Underlying concept of system also applicable to monitoring and control of industrial processes.

  8. Nurturing a growing field: Computers & Geosciences

    NASA Astrophysics Data System (ADS)

    Mariethoz, Gregoire; Pebesma, Edzer

    2017-10-01

    Computational issues are becoming increasingly critical for virtually all fields of geoscience. This includes the development of improved algorithms and models, strategies for implementing high-performance computing, or the management and visualization of the large datasets provided by an ever-growing number of environmental sensors. Such issues are central to scientific fields as diverse as geological modeling, Earth observation, geophysics or climatology, to name just a few. Related computational advances, across a range of geoscience disciplines, are the core focus of Computers & Geosciences, which is thus a truly multidisciplinary journal.

  9. THE ZURICH ENVIRONMENTAL STUDY OF GALAXIES IN GROUPS ALONG THE COSMIC WEB. I. WHICH ENVIRONMENT AFFECTS GALAXY EVOLUTION?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carollo, C. Marcella; Cibinel, Anna; Lilly, Simon J.

    2013-10-20

    The Zurich Environmental Study (ZENS) is based on a sample of ∼1500 galaxy members of 141 groups in the mass range ∼10^(12.5-14.5) M_☉ within the narrow redshift range 0.05 < z < 0.0585. ZENS adopts novel approaches, described here, to quantify four different galactic environments, namely: (1) the mass of the host group halo; (2) the projected halo-centric distance; (3) the rank of galaxies as central or satellites within their group halos; and (4) the filamentary large-scale structure density. No self-consistent identification of a central galaxy is found in ∼40% of <10^13.5 M_☉ groups, from which we estimate that ∼15% of groups at these masses are dynamically unrelaxed systems. Central galaxies in relaxed and unrelaxed groups generally have similar properties, suggesting that centrals are regulated by their mass and not by their environment. Centrals in relaxed groups have, however, ∼30% larger sizes than in unrelaxed groups, possibly due to accretion of small satellites in virialized group halos. At M > 10^10 M_☉, satellite galaxies in relaxed and unrelaxed groups have similar size, color, and (specific) star formation rate distributions; at lower galaxy masses, satellites are marginally redder in relaxed relative to unrelaxed groups, suggesting quenching of star formation in low-mass satellites by physical processes active in relaxed halos. Overall, relaxed and unrelaxed groups show similar stellar mass populations, likely indicating similar stellar mass conversion efficiencies. In the enclosed ZENS catalog, we publish all environmental diagnostics as well as the galaxy structural and photometric measurements described in companion ZENS papers II and III.

  10. Doing Your Science While You're in Orbit

    NASA Astrophysics Data System (ADS)

    Green, Mark L.; Miller, Stephen D.; Vazhkudai, Sudharshan S.; Trater, James R.

    2010-11-01

    Large-scale neutron facilities such as the Spallation Neutron Source (SNS) located at Oak Ridge National Laboratory need easy-to-use access to Department of Energy Leadership Computing Facilities and experiment repository data. The Orbiter thick- and thin-client and its supporting Service Oriented Architecture (SOA) based services (available at https://orbiter.sns.gov) consist of standards-based components that are reusable and extensible for accessing high performance computing, data and computational grid infrastructure, and cluster-based resources easily from a user configurable interface. The primary Orbiter system goals consist of (1) developing infrastructure for the creation and automation of virtual instrumentation experiment optimization, (2) developing user interfaces for thin- and thick-client access, (3) provide a prototype incorporating major instrument simulation packages, and (4) facilitate neutron science community access and collaboration. The secure Orbiter SOA authentication and authorization is achieved through the developed Virtual File System (VFS) services, which use Role-Based Access Control (RBAC) for data repository file access, thin-and thick-client functionality and application access, and computational job workflow management. The VFS Relational Database Management System (RDMS) consists of approximately 45 database tables describing 498 user accounts with 495 groups over 432,000 directories with 904,077 repository files. Over 59 million NeXus file metadata records are associated to the 12,800 unique NeXus file field/class names generated from the 52,824 repository NeXus files. Services that enable (a) summary dashboards of data repository status with Quality of Service (QoS) metrics, (b) data repository NeXus file field/class name full text search capabilities within a Google like interface, (c) fully functional RBAC browser for the read-only data repository and shared areas, (d) user/group defined and shared metadata for data repository files, (e) user, group, repository, and web 2.0 based global positioning with additional service capabilities are currently available. The SNS based Orbiter SOA integration progress with the Distributed Data Analysis for Neutron Scattering Experiments (DANSE) software development project is summarized with an emphasis on DANSE Central Services and the Virtual Neutron Facility (VNF). Additionally, the DANSE utilization of the Orbiter SOA authentication, authorization, and data transfer services best practice implementations are presented.

  11. Multicentric evaluation of the impact of central tumour location when comparing rates of N1 upstaging in patients undergoing video-assisted and open surgery for clinical Stage I non-small-cell lung cancer.

    PubMed

    Decaluwé, Herbert; Petersen, René Horsleben; Brunelli, Alex; Pompili, Cecilia; Seguin-Givelet, Agathe; Gust, Lucile; Aigner, Clemens; Falcoz, Pierre-Emmanuel; Rinieri, Philippe; Augustin, Florian; Sokolow, Youri; Verhagen, Ad; Depypere, Lieven; Papagiannopoulos, Kostas; Gossot, Dominique; D'Journo, Xavier Benoit; Guerrera, Francesco; Baste, Jean-Marc; Schmid, Thomas; Stanzi, Alessia; Van Raemdonck, Dirk; Bardet, Jeremy; Thomas, Pascal-Alexandre; Massard, Gilbert; Fieuws, Steffen; Moons, Johnny; Dooms, Christophe; De Leyn, Paul; Hansen, Henrik Jessen

    2017-09-27

    Large retrospective series have indicated lower rates of cN0 to pN1 nodal upstaging after video-assisted thoracic surgery (VATS) compared with open resections for Stage I non-small-cell lung cancer (NSCLC). The objective of our multicentre study was to investigate whether the presumed lower rate of N1 upstaging after VATS disappears after correction for central tumour location in a multivariable analysis. Consecutive patients operated for PET-CT based clinical Stage I NSCLC were selected from prospectively managed surgical databases in 11 European centres. Central tumour location was defined as contact with bronchovascular structures on computer tomography and/or visibility on standard bronchoscopy. Eight hundred and ninety-five patients underwent pulmonary resection by VATS (n = 699, 9% conversions) or an open technique (n = 196) in 2014. Incidence of nodal pN1 and pN2 upstaging was 8% and 7% after VATS and 15% and 6% after open surgery, respectively. pN1 was found in 27% of patients with central tumours. Less central tumours were operated on by VATS compared with the open technique (12% vs 28%, P < 0.001). Logistic regression analysis showed that only tumour location had a significant impact on N1 upstaging (OR 6.2, confidence interval 3.6-10.8; P < 0.001) and that the effect of surgical technique (VATS versus open surgery) was no longer significant when accounting for tumour location. A quarter of patients with central clinical Stage I NSCLC was upstaged to pN1 at resection. Central tumour location was the only independent factor associated with N1 upstaging, undermining the evidence for lower N1 upstaging after VATS resections. Studies investigating N1 upstaging after VATS compared with open surgery should be interpreted with caution due to possible selection bias, i.e. relatively more central tumours in the open group with a higher chance of N1 upstaging. © The Author 2017. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  12. 40 CFR 63.1416 - Recordkeeping requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Recordkeeping requirements. (a) Data retention. Unless otherwise specified in this subpart, each owner or... shall be accessible from a central location by computer or other means that provides access within 2... may be maintained in hard copy or computer-readable form including, but not limited to, on paper...

  13. 40 CFR 63.1416 - Recordkeeping requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... § 63.1416 Recordkeeping requirements. (a) Data retention. Unless otherwise specified in this subpart... site or shall be accessible from a central location by computer or other means that provides access.... Records may be maintained in hard copy or computer-readable form including, but not limited to, on paper...

  14. 40 CFR 63.1416 - Recordkeeping requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... § 63.1416 Recordkeeping requirements. (a) Data retention. Unless otherwise specified in this subpart... site or shall be accessible from a central location by computer or other means that provides access.... Records may be maintained in hard copy or computer-readable form including, but not limited to, on paper...

  15. 40 CFR 63.1416 - Recordkeeping requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Recordkeeping requirements. (a) Data retention. Unless otherwise specified in this subpart, each owner or... shall be accessible from a central location by computer or other means that provides access within 2... may be maintained in hard copy or computer-readable form including, but not limited to, on paper...

  16. 41 CFR 105-56.024 - Purpose and scope.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... offset computer matching, identify Federal employees who owe delinquent non-tax debt to the United States. Centralized salary offset computer matching is the computerized comparison of delinquent debt records with...) administrative offset program, to collect delinquent debts owed to the Federal Government. This process is known...

  17. Little Package, Big Deal.

    ERIC Educational Resources Information Center

    Campbell, Joseph K.

    1979-01-01

    Describes New York State's extension experience in using the programmable calculator, a portable pocket-size computer, to solve many of the problems that central computers now handle. Subscription services to programs written for the Texas Instruments TI-59 programmable calculator are provided by both Cornell and Iowa State Universities. (MF)

  18. ACToR Chemical Structure processing using Open Source ChemInformatics Libraries (FutureToxII)

    EPA Science Inventory

    ACToR (Aggregated Computational Toxicology Resource) is a centralized database repository developed by the National Center for Computational Toxicology (NCCT) at the U.S. Environmental Protection Agency (EPA). Free and open source tools were used to compile toxicity data from ove...
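
    The record above is truncated, but the task it describes, processing chemical structures with free and open-source cheminformatics libraries, typically reduces to standardizing structures so that records from different sources can be matched. A minimal sketch using RDKit (one widely used open-source toolkit) is given below; the particular choice of canonical SMILES plus InChIKey as registry keys is an assumption for illustration, not a description of the EPA pipeline.

    from rdkit import Chem

    def standardize(smiles):
        # Parse a SMILES string and return (canonical SMILES, InChIKey),
        # or None if the structure cannot be parsed.
        mol = Chem.MolFromSmiles(smiles)
        if mol is None:
            return None
        return Chem.MolToSmiles(mol, canonical=True), Chem.MolToInchiKey(mol)

    print(standardize("C1=CC=CC=C1O"))  # phenol written in a non-canonical form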

  19. Teaching Molecular Biology with Microcomputers.

    ERIC Educational Resources Information Center

    Reiss, Rebecca; Jameson, David

    1984-01-01

    Describes a series of computer programs that use simulation and gaming techniques to present the basic principles of the central dogma of molecular genetics, mutation, and the genetic code. A history of discoveries in molecular biology is presented and the evolution of these computer assisted instructional programs is described. (MBR)

  20. PLATO--AN AUTOMATED TEACHING DEVICE.

    ERIC Educational Resources Information Center

    BITZER, D.; AND OTHERS

    PLATO (PROGRAMED LOGIC FOR AUTOMATIC TEACHING OPERATION) IS A DEVICE FOR TEACHING A NUMBER OF STUDENTS INDIVIDUALLY BY MEANS OF A SINGLE, CENTRAL PURPOSE, DIGITAL COMPUTER. THE GENERAL ORGANIZATION OF EQUIPMENT CONSISTS OF A KEYSET FOR STUDENT RESPONSES, THE COMPUTER, STORAGE DEVICE (ELECTRIC BLACKBOARD), SLIDE SELECTOR (ELECTRICAL BOOK), AND TV…

  1. CDL description of the CDC 6600 stunt box

    NASA Technical Reports Server (NTRS)

    Hertzog, J. B.

    1971-01-01

    The CDC 6600 central memory control (stunt box) is described utilizing CDL (Computer Design Language), block diagrams, and text. The stunt box is a clearing house for all central memory references from the 6600 central and peripheral processors. Since memory requests can be issued simultaneously, the stunt box must be capable of assigning priorities to requests, of labeling requests so that the data will be distributed correctly, and of remembering rejected addresses due to memory conflicts.
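
    The three duties listed above (assigning priorities, tagging requests so that returning data is routed correctly, and remembering addresses rejected because of memory conflicts) map naturally onto a queue-with-retry model. The sketch below is an illustrative software analogue only, not a description of the actual 6600 hardware; the class name, the 32-way bank interleave, and the one-request-per-pass behavior are all invented for the example.

    from collections import deque
    from dataclasses import dataclass

    @dataclass
    class MemoryRequest:
        source: str     # e.g. "CPU" or "PPU-3"; the tag used to route data back
        address: int
        priority: int   # lower value = higher priority

    def arbitrate(requests, busy_banks):
        # Toy central-memory arbiter: accept requests in priority order,
        # requeue ("remember") any request whose memory bank is busy.
        pending = deque(sorted(requests, key=lambda r: r.priority))
        accepted = []
        while pending:
            req = pending.popleft()
            bank = req.address % 32          # assumed interleave factor
            if bank in busy_banks:
                busy_banks.discard(bank)     # pretend the bank frees up next pass
                pending.append(req)          # rejected address is retried later
            else:
                accepted.append(req)         # data will be returned to req.source
        return accepted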

  2. Emergent Adaptive Noise Reduction from Communal Cooperation of Sensor Grid

    NASA Technical Reports Server (NTRS)

    Jones, Kennie H.; Jones, Michael G.; Nark, Douglas M.; Lodding, Kenneth N.

    2010-01-01

    In the last decade, the realization of small, inexpensive, and powerful devices with sensors, computers, and wireless communication has promised the development of massively sized sensor networks with dense deployments over large areas capable of high fidelity situational assessments. However, most management models have been based on centralized control and research has concentrated on methods for passing data from sensor devices to the central controller. Most implementations have been small but, as it is not scalable, this methodology is insufficient for massive deployments. Here, a specific application of a large sensor network for adaptive noise reduction demonstrates a new paradigm where communities of sensor/computer devices assess local conditions and make local decisions from which emerges a global behaviour. This approach obviates many of the problems of centralized control as it is not prone to a single point of failure and is more scalable, efficient, robust, and fault tolerant.
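
    As a rough illustration of the "local decisions, emergent global behaviour" idea described above, and not the authors' actual noise-reduction algorithm, the sketch below has each node repeatedly nudge its estimate toward the mean of its immediate neighbours; the chain topology, weight, and round count are assumptions.

    import numpy as np

    def local_consensus(readings, neighbours, rounds=20, weight=0.5):
        # readings:   one noisy measurement per node
        # neighbours: dict mapping node index -> list of neighbouring node indices
        est = np.asarray(readings, dtype=float).copy()
        for _ in range(rounds):
            new = est.copy()
            for i, nbrs in neighbours.items():
                if nbrs:
                    new[i] = (1 - weight) * est[i] + weight * est[nbrs].mean()
            est = new
        return est  # local averaging lets a smooth global estimate emerge

    # Five nodes in a chain, purely for illustration.
    chain = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
    print(local_consensus([1.0, 0.2, 1.4, 0.9, 1.1], chain))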

  3. Managing drought risk with a computer model of the Raritan River Basin water-supply system in central New Jersey

    USGS Publications Warehouse

    Dunne, Paul; Tasker, Gary

    1996-01-01

    The reservoirs and pumping stations that comprise the Raritan River Basin water-supply system and its interconnections to the Delaware-Raritan Canal water-supply system, operated by the New Jersey Water Supply Authority (NJWSA), provide potable water to central New Jersey communities. The water reserve of this combined system can easily be depleted by an extended period of below-normal precipitation. Efficient operation of the combined system is vital to meeting the water-supply needs of central New Jersey. In an effort to improve the efficiency of the system operation, the U.S. Geological Survey (USGS), in cooperation with the NJWSA, has developed a computer model that provides a technical basis for evaluating the effects of alternative patterns of operation of the Raritan River Basin water-supply system. This fact sheet describes the model, its technical basis, and its operation.
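
    The fact sheet above does not spell out the model's internals, but the core bookkeeping in any reservoir-operations simulator is a water balance carried forward in time. The sketch below is a deliberately minimal, assumed version of that bookkeeping (single reservoir, daily steps, no transfers or canal diversions), not the USGS/NJWSA model itself.

    def simulate_reservoir(storage, capacity, inflows, demands):
        # storage_{t+1} = storage_t + inflow_t - release_t, capped at capacity (spill).
        trace = []
        for inflow, demand in zip(inflows, demands):
            release = min(demand, storage + inflow)   # cannot release more than is on hand
            storage = min(storage + inflow - release, capacity)
            trace.append((round(storage, 2), round(release, 2)))
        return trace

    # Four days of operation under a drawdown scenario (units arbitrary).
    print(simulate_reservoir(storage=50.0, capacity=100.0,
                             inflows=[5, 0, 2, 30], demands=[8, 8, 8, 8]))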

  4. Forest vegetation simulation tools and forest health assessment

    Treesearch

    Richard M. Teck; Melody Steele

    1995-01-01

    A Stand Hazard Rating System for Central ldaho forests has been incorporated into the Central ldaho Prognosis variant of the Forest Vegetation Simulator to evaluate how insects, disease and fire hazards within the Deadwood River Drainage change over time. A custom interface, BOISE.COMPUTE.PR, has been developed so hazard ratings can be electronically downloaded...

  5. Electronic Mail Is One High-Tech Management Tool that Really Delivers.

    ERIC Educational Resources Information Center

    Parker, Donald C.

    1987-01-01

    Describes an electronic mail system used by the Horseheads (New York) Central School Distict's eight schools and central office that saves time and enhances productivity. This software calls up information from the district's computer network and sends it to other users' special files--electronic "mailboxes" set aside for messages and…

  6. Growth-simulation model for lodgepole pine in central Oregon.

    Treesearch

    Walter G. Dahms

    1983-01-01

    A growth-simulation model for central Oregon lodgepole pine (Pinus contorta Dougl.) has been constructed by combining data from temporary and permanent sample plots. The model is similar to a conventional yield table with the added capacity for dealing with the stand-density variable. The simulator runs on a desk-top computer.

  7. Computation and brain processes, with special reference to neuroendocrine systems.

    PubMed

    Toni, Roberto; Spaletta, Giulia; Casa, Claudia Della; Ravera, Simone; Sandri, Giorgio

    2007-01-01

    The development of neural networks and brain automata has made neuroscientists aware that the performance limits of these brain-like devices lie, at least in part, in their computational power. The computational basis of a standard cybernetic design, in fact, refers to that of a discrete and finite state machine or Turing Machine (TM). In contrast, it has been suggested that a number of human cerebral activities, from feedback controls up to mental processes, rely on a mixing of both finitary, digital-like and infinitary, continuous-like procedures. Therefore, the central nervous system (CNS) of man would exploit a form of computation going beyond that of a TM. This "non conventional" computation has been called hybrid computation. Some basic structures for hybrid brain computation are believed to be the brain computational maps, in which both Turing-like (digital) computation and continuous (analog) forms of calculus might occur. The cerebral cortex and brain stem appear to be primary candidates for this processing. However, neuroendocrine structures like the hypothalamus are also believed to exhibit hybrid computational processes, and might give rise to computational maps. Current theories on neural activity, including wiring and volume transmission, neuronal group selection and dynamic evolving models of brain automata, lend support to the existence of natural hybrid computation, stressing a cooperation between discrete and continuous forms of communication in the CNS. In addition, the recent advent of neuromorphic chips, like those to restore activity in damaged retina and visual cortex, suggests that assumption of a discrete-continuum polarity in designing biocompatible neural circuitries is crucial for their ensuing performance. In these bionic structures, in fact, a correspondence exists between the original anatomical architecture and synthetic wiring of the chip, resulting in a correspondence between natural and cybernetic neural activity. Thus, chip "form" provides a continuum essential to chip "function". We conclude that it is reasonable to predict the existence of hybrid computational processes in the course of many human brain-integrating activities, urging development of cybernetic approaches based on this modelling for adequate reproduction of a variety of cerebral performances.

  8. Misdiagnosis of acute peripheral vestibulopathy in central nervous ischemic infarction.

    PubMed

    Braun, Eva Maria; Tomazic, Peter Valentin; Ropposch, Thorsten; Nemetz, Ulrike; Lackner, Andreas; Walch, Christian

    2011-12-01

    Vertigo is a very common symptom at otorhinolaryngology (ENT), neurological, and emergency units, but often, it is difficult to distinguish between vertigo of peripheral and central origin. We conducted a retrospective analysis of a hospital database, including all patients admitted to the ENT University Hospital Graz after neurological examination, with a diagnosis of peripheral vestibular vertigo and subsequent diagnosis of central nervous infarction as the actual cause for the vertigo. Twelve patients were included in this study. All patients with acute spinning vertigo after a thorough neurological examination and with uneventful computed tomographic scans were referred to our ENT department. Nine of them presented with horizontal nystagmus. Only 1 woman experienced additional hearing loss. The mean diagnostic delay to the definite diagnosis of a central infarction through magnetic resonance imaging was 4 days (SD, 2.3 d). A careful otologic and neurological examination, including the head impulse test and caloric testing, is mandatory. Because ischemic events cannot be diagnosed in computed tomographic scans at an early stage, we strongly recommend to perform cranial magnetic resonance imaging within 48 hours from admission if vertigo has not improved under conservative treatment.

  9. A Novel Centrality Measure for Network-wide Cyber Vulnerability Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sathanur, Arun V.; Haglin, David J.

    In this work we propose a novel formulation that models the attack and compromise on a cyber network as a combination of two parts - direct compromise of a host and the compromise occurring through the spread of the attack on the network from a compromised host. The model parameters for the nodes are a concise representation of the host profiles that can include the risky behaviors of the associated human users, while the model parameters for the edges are based on the existence of vulnerabilities between each pair of connected hosts. The edge models relate to the summary representations of the corresponding attack-graphs. This results in a formulation based on Random Walk with Restart (RWR), and the resulting centrality metric can be solved for in an efficient manner through the use of sparse linear solvers. Thus the formulation goes beyond mere topological considerations in centrality computations by summarizing the host profiles and the attack graphs into the model parameters. The computational efficiency of the method also allows us to quantify the uncertainty in the centrality measure through Monte Carlo analysis.
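
    The Random Walk with Restart centrality described above can indeed be obtained from a single sparse linear solve. The sketch below shows the generic computation; the restart probability alpha, and the use of a plain row-normalized adjacency matrix in place of the paper's host-profile and attack-graph parameters, are assumptions for illustration.

    import numpy as np
    from scipy.sparse import csr_matrix, diags, identity
    from scipy.sparse.linalg import spsolve

    def rwr_centrality(weights, restart, alpha=0.15):
        # Solve (I - (1 - alpha) * P^T) x = alpha * r, where P is the row-stochastic
        # transition matrix and r the normalized restart (direct-compromise) vector.
        W = csr_matrix(weights, dtype=float)
        row_sums = np.asarray(W.sum(axis=1)).ravel()
        row_sums[row_sums == 0] = 1.0                 # guard against sink nodes
        P = diags(1.0 / row_sums) @ W
        r = np.asarray(restart, dtype=float)
        r = r / r.sum()
        A = identity(W.shape[0], format="csc") - (1 - alpha) * P.T
        return spsolve(A, alpha * r)

    # Tiny 3-host example with one riskier host (higher restart weight).
    print(rwr_centrality([[0, 1, 1], [1, 0, 0], [1, 0, 0]], restart=[0.6, 0.2, 0.2]))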

  10. Central Data Processing System (CDPS) user's manual: Solar heating and cooling program

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The software and data base management system required to assess the performance of solar heating and cooling systems installed at multiple sites is presented. The instrumentation data associated with these systems is collected, processed, and presented in a form which supported continuity of performance evaluation across all applications. The CDPS consisted of three major elements: communication interface computer, central data processing computer, and performance evaluation data base. Users of the performance data base were identified, and procedures for operation, and guidelines for software maintenance were outlined. The manual also defined the output capabilities of the CDPS in support of external users of the system.

  11. Making automated computer program documentation a feature of total system design

    NASA Technical Reports Server (NTRS)

    Wolf, A. W.

    1970-01-01

    It is pointed out that in large-scale computer software systems, program documents are too often fraught with errors, out of date, poorly written, and sometimes nonexistent in whole or in part. The means are described by which many of these typical system documentation problems were overcome in a large and dynamic software project. A systems approach was employed which encompassed such items as: (1) configuration management; (2) standards and conventions; (3) collection of program information into central data banks; (4) interaction among executive, compiler, central data banks, and configuration management; and (5) automatic documentation. A complete description of the overall system is given.

  12. Unifying model of carpal mechanics based on computationally derived isometric constraints and rules-based motion - the stable central column theory.

    PubMed

    Sandow, M J; Fisher, T J; Howard, C Q; Papas, S

    2014-05-01

    This study was part of a larger project to develop a (kinetic) theory of carpal motion based on computationally derived isometric constraints. Three-dimensional models were created from computed tomography scans of the wrists of ten normal subjects and carpal spatial relationships at physiological motion extremes were assessed. Specific points on the surface of the various carpal bones and the radius that remained isometric through range of movement were identified. Analysis of the isometric constraints and intercarpal motion suggests that the carpus functions as a stable central column (lunate-capitate-hamate-trapezoid-trapezium) with a supporting lateral column (scaphoid), which behaves as a 'two gear four bar linkage'. The triquetrum functions as an ulnar translation restraint, as well as controlling lunate flexion. The 'trapezoid'-shaped trapezoid places the trapezium anterior to the transverse plane of the radius and ulna, and thus rotates the principal axis of the central column to correspond to that used in the 'dart thrower's motion'. This study presents a forward kinematic analysis of the carpus that provides the basis for the development of a unifying kinetic theory of wrist motion based on isometric constraints and rules-based motion.
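
    The "computationally derived isometric constraints" above amount to finding pairs of bone-surface points whose mutual distance stays essentially constant across all scanned wrist positions. A generic screening step of that kind is sketched below; the array shapes and the millimetre tolerance are assumptions, since the abstract does not specify the authors' actual procedure.

    import numpy as np

    def isometric_pairs(points_a, points_b, tolerance=0.5):
        # points_a, points_b: arrays of shape (n_poses, n_points, 3) holding the 3-D
        # positions of candidate points on two carpal bones in every scanned pose.
        # Returns (i, j) index pairs whose separation varies by less than `tolerance`.
        pairs = []
        for i in range(points_a.shape[1]):
            for j in range(points_b.shape[1]):
                d = np.linalg.norm(points_a[:, i, :] - points_b[:, j, :], axis=1)
                if d.max() - d.min() < tolerance:
                    pairs.append((i, j))
        return pairs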

  13. Relationship between crown-root angulation (collum angle) of maxillary central incisors in Class II, division 2 malocclusion and lower lip line.

    PubMed

    Srinivasan, Bhadrinath; Kailasam, Vignesh; Chitharanjan, Arun; Ramalingam, Arthi

    2013-01-01

    The present study aimed to measure the magnitude of the collum angle (crown-root angulation) of maxillary central incisors present in Class II, division 2 malocclusion and to relate the changes in its magnitude with variations in the lower lip line. A set of 120 conventional lateral cephalograms were selected and divided into three groups of 40 each based on the type of malocclusion presented: Class II, division 2 (group 1); Class II, division 1 (group 2); and Class I (group 3). The collum angle of the maxillary central incisor was measured, and the lower lip line was recorded. Analysis of variance (ANOVA) revealed that the mean collum angle was statistically significantly different in the three groups. The mean collum angle was greatest in Class II, division 2 malocclusion (group 1). The mean collum angles were 3.24 ± 4.69 degrees, 0.95 ± 1.06 degrees, and 1.05 ± 1.50 degrees in groups 1, 2, and 3 respectively. In χ ² test comparison of the location of the lower lip line (incisal, middle, or apical third of the central incisor) among the three groups, the lower lip line was found to contact the middle third of the central incisor most frequently in Class II, division 2 malocclusion. ANOVA followed by Tukey honestly significant difference (HSD) test showed that the mean collum angle is significantly increased when the lower lip is in the middle third (P < .05) of the central incisor. Variations in magnitude of the collum angle with the change in the lower lip line suggest a probable etiologic role of the lower lip line in the development of the collum angle.

  14. Prophylactic central neck lymphadenectomy in high risk patients with T1 or T2 papillary thyroid carcinoma: is it useful?

    PubMed

    Delogu, Daniele; Pisano, Ilia Patrizia; Pala, Carlo; Pulighe, Fabio; Denti, Salvatore; Cossu, Antonio; Trignano, Mario

    2014-01-01

    The aim of this study was to evaluate the role of prophylactic central neck lymph node dissection in high risk patients with T1 or T2 papillary thyroid cancer. Seventy-three patients who had undergone total thyroidectomy and prophylactic central neck lymph node dissection for papillary thyroid cancer smaller than 4 cm, without cervical lymphadenopathy, were included. Patients were divided into two groups: low risk patients (group A) and high risk patients (group B). High risk patients were considered those with at least one of the following: male sex, age ≥ 45 years, and extracapsular or extrathyroid disease. Statistically significant differences in persistent disease, recurrence and complication rates between the two groups were studied. Persistence of the disease was observed in one case in group A (5.9%) and in three cases in group B (5.4%), while thyroid cancer recurrence was registered in zero and two (3.6%) cases respectively. One single case (5.9%) of transitory recurrent laryngeal nerve damage was reported in group A and none in group B, while transitory hypoparathyroidism was observed in 2 (3.6%) patients in group A, and 1 (1.8%) patient in group B. Permanent recurrent laryngeal nerve damage was observed in one patient in group A, while permanent hypoparathyroidism was registered in one case in group B. Logistic regression evidenced that multifocality was the only risk factor significantly related to persistence of disease and recurrence. Our results suggest that prophylactic central neck lymph node dissection can be safely avoided in patients with T1 or T2 papillary thyroid cancer, except in those with multifocal disease. Key words: Cancer, Central neck, Cervical, Lymphadenectomy, Lymph nodes, Papillary carcinoma, Thyroid.

  15. The Comparison between Torsional and Conventional Mode Phacoemulsification in Moderate and Hard Cataracts

    PubMed Central

    Kim, Dong-Hyun; Wee, Won-Ryang; Lee, Jin-Hak

    2010-01-01

    Purpose To compare the intraoperative performances and postoperative outcomes of cataract surgery performed with longitudinal phacoemulsification and torsional phacoemulsification in moderate and hard cataracts. Methods Of 85 patients who had senile cataracts, 102 eyes were operated on using the Infiniti Vision System. Preoperative examinations (slit lamp examination, mean central corneal thickness, and central endothelial cell counts) were performed for each patient. Cataracts were subdivided into moderate and hard, according to the Lens Opacities Classification System III grading of nucleus opalescence (NO). Eyes in each cataract group were randomly assigned to conventional and torsional phaco-mode. Intraoperative parameters, including ultrasound time (UST), cumulative dissipated energy (CDE), and the balanced salt solution plus (BSSP) volume utilized were evaluated. Best corrected visual acuity (BCVA) was checked on postoperative day 30; mean central corneal thickness and central endothelial cell counts were investigated on postoperative days 7 and 30. Results Preoperative BCVA and mean grading of NO showed no difference in both groups. Preoperative endothelial cell count and central corneal thickness also showed no significant difference in both groups. In the moderate cataract group, the CDE, UST, and BSSP volume were significantly lower in the torsional mode than the longitudinal mode, but they did not show any difference in the hard cataract group. Torsional group showed less endothelial cell loss and central corneal thickening at postoperative day seven in moderate cataracts but showed no significant differences, as compared with the longitudinal group, by postoperative day 30. Conclusions Torsional phacoemulsification showed superior efficiency for moderate cataracts, as compared with longitudinal phacoemulsification, in the early postoperative stage. PMID:21165231

  16. The comparison between torsional and conventional mode phacoemulsification in moderate and hard cataracts.

    PubMed

    Kim, Dong-Hyun; Wee, Won-Ryang; Lee, Jin-Hak; Kim, Mee-Kum

    2010-12-01

    To compare the intraoperative performances and postoperative outcomes of cataract surgery performed with longitudinal phacoemulsification and torsional phacoemulsification in moderate and hard cataracts. Of 85 patients who had senile cataracts, 102 eyes were operated on using the Infiniti Vision System. Preoperative examinations (slit lamp examination, mean central corneal thickness, and central endothelial cell counts) were performed for each patient. Cataracts were subdivided into moderate and hard, according to the Lens Opacities Classification System III grading of nucleus opalescence (NO). Eyes in each cataract group were randomly assigned to conventional and torsional phaco-mode. Intraoperative parameters, including ultrasound time (UST), cumulative dissipated energy (CDE), and the balanced salt solution plus (BSSP) volume utilized were evaluated. Best corrected visual acuity (BCVA) was checked on postoperative day 30; mean central corneal thickness and central endothelial cell counts were investigated on postoperative days 7 and 30. Preoperative BCVA and mean grading of NO showed no difference in both groups. Preoperative endothelial cell count and central corneal thickness also showed no significant difference in both groups. In the moderate cataract group, the CDE, UST, and BSSP volume were significantly lower in the torsional mode than the longitudinal mode, but they did not show any difference in the hard cataract group. Torsional group showed less endothelial cell loss and central corneal thickening at postoperative day seven in moderate cataracts but showed no significant differences, as compared with the longitudinal group, by postoperative day 30. Torsional phacoemulsification showed superior efficiency for moderate cataracts, as compared with longitudinal phacoemulsification, in the early postoperative stage.

  17. Connecting the virtual world of computers to the real world of medicinal chemistry.

    PubMed

    Glen, Robert C

    2011-03-01

    Drug discovery involves the simultaneous optimization of chemical and biological properties, usually in a single small molecule, which modulates one of nature's most complex systems: the balance between human health and disease. The increased use of computer-aided methods is having a significant impact on all aspects of the drug-discovery and development process and with improved methods and ever faster computers, computer-aided molecular design will be ever more central to the discovery process.

  18. The influence of phylogeny, social style, and sociodemographic factors on macaque social network structure.

    PubMed

    Balasubramaniam, Krishna N; Beisner, Brianne A; Berman, Carol M; De Marco, Arianna; Duboscq, Julie; Koirala, Sabina; Majolo, Bonaventura; MacIntosh, Andrew J; McFarland, Richard; Molesti, Sandra; Ogawa, Hideshi; Petit, Odile; Schino, Gabriele; Sosa, Sebastian; Sueur, Cédric; Thierry, Bernard; de Waal, Frans B M; McCowan, Brenda

    2018-01-01

    Among nonhuman primates, the evolutionary underpinnings of variation in social structure remain debated, with both ancestral relationships and adaptation to current conditions hypothesized to play determining roles. Here we assess whether interspecific variation in higher-order aspects of female macaque (genus: Macaca) dominance and grooming social structure show phylogenetic signals, that is, greater similarity among more closely-related species. We use a social network approach to describe higher-order characteristics of social structure, based on both direct interactions and secondary pathways that connect group members. We also ask whether network traits covary with each other, with species-typical social style grades, and/or with sociodemographic characteristics, specifically group size, sex-ratio, and current living condition (captive vs. free-living). We assembled 34-38 datasets of female-female dyadic aggression and allogrooming among captive and free-living macaques representing 10 species. We calculated dominance (transitivity, certainty), and grooming (centrality coefficient, Newman's modularity, clustering coefficient) network traits as aspects of social structure. Computations of K statistics and randomization tests on multiple phylogenies revealed moderate-strong phylogenetic signals in dominance traits, but moderate-weak signals in grooming traits. GLMMs showed that grooming traits did not covary with dominance traits and/or social style grade. Rather, modularity and clustering coefficient, but not centrality coefficient, were strongly predicted by group size and current living condition. Specifically, larger groups showed more modular networks with sparsely-connected clusters than smaller groups. Further, this effect was independent of variation in living condition, and/or sampling effort. In summary, our results reveal that female dominance networks were more phylogenetically conserved across macaque species than grooming networks, which were more labile to sociodemographic factors. Such findings narrow down the processes that influence interspecific variation in two core aspects of macaque social structure. Future directions should include using phylogeographic approaches, and addressing challenges in examining the effects of socioecological factors on primate social structure. © 2017 Wiley Periodicals, Inc.
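
    The grooming-network traits named above (centrality coefficient, Newman's modularity, clustering coefficient) are standard graph statistics, so a generic computation is easy to sketch with networkx; the toy edge list and the choice of eigenvector centrality as the "centrality coefficient" are assumptions, not the authors' exact definitions.

    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities, modularity

    def grooming_network_traits(weighted_edges):
        # weighted_edges: iterable of (groomer, groomee, grooming_rate) tuples.
        G = nx.Graph()
        G.add_weighted_edges_from(weighted_edges)
        communities = greedy_modularity_communities(G, weight="weight")
        return {
            "clustering_coefficient": nx.average_clustering(G, weight="weight"),
            "newman_modularity": modularity(G, communities, weight="weight"),
            "centrality_coefficient": nx.eigenvector_centrality_numpy(G, weight="weight"),
        }

    print(grooming_network_traits([("a", "b", 3), ("b", "c", 1), ("a", "c", 2), ("c", "d", 1)]))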

  19. Pulmonary cryptococcosis in rheumatoid arthritis (RA) patients: comparison of imaging characteristics among RA, acquired immunodeficiency syndrome, and immunocompetent patients.

    PubMed

    Yanagawa, Noriyo; Sakai, Fumikazu; Takemura, Tamiko; Ishikawa, Satoru; Takaki, Yasunobu; Hishima, Tsunekazu; Kamata, Noriko

    2013-11-01

    The imaging characteristics of cryptococcosis in rheumatoid arthritis (RA) patients were analyzed by comparing them with those of acquired immunodeficiency syndrome (AIDS) and immunocompetent patients, and the imaging findings were correlated with pathological findings. Two radiologists retrospectively compared the computed tomographic (CT) findings of 35 episodes of pulmonary cryptococcosis in 31 patients with 3 kinds of underlying states (10 RA, 12 AIDS, 13 immunocompetent), focusing on the nature, number, and distribution of lesions. The pathological findings of 18 patients (8 RA, 2 AIDS, 8 immunocompetent) were analyzed by two pathologists, and then correlated with imaging findings. The frequencies of consolidation and ground glass attenuation (GGA) were significantly higher, and the frequency of peripheral distribution was significantly lower in the RA group than in the immunocompetent group. Peripheral distribution was less common and generalized distribution was more frequent in the RA group than in the AIDS group. The pathological findings of the AIDS and immunocompetent groups reflected their immune status: There was lack of a granuloma reaction in the AIDS group, and a complete granuloma reaction in the immunocompetent group, while the findings of the RA group varied, including a complete granuloma reaction, a loose granuloma reaction and a hyper-immune reaction. Cases with the last two pathologic findings were symptomatic and showed generalized or central distribution on CT. Cryptococcosis in the RA group showed characteristic radiological and pathological findings compared with the other 2 groups. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  20. When Elementary Students Change Peer Groups: Intragroup Centrality, Intergroup Centrality, and Self-Perceptions of Popularity

    ERIC Educational Resources Information Center

    Jones, Martin H.; Estell, David B.

    2010-01-01

    The current study follows two cohorts of fourth and fifth graders across 1 school year to better understand why some students change peer groups. The study focuses on popularity and intragroup social status. We examined whether differences between individuals' and group members' self-perceptions of popularity were related to changing peer groups.…

  1. The effects of a computer skill training programme adopting social comparison and self-efficacy enhancement strategies on self-concept and skill outcome in trainees with physical disabilities.

    PubMed

    Tam, S F

    2000-10-15

    The aim of this controlled, quasi-experimental study was to evaluate the effects of both self-efficacy enhancement and social comparison training strategy on computer skills learning and self-concept outcome of trainees with physical disabilities. The self-efficacy enhancement group comprised 16 trainees, the tutorial training group comprised 15 trainees, and there were 25 subjects in the control group. Both the self-efficacy enhancement group and the tutorial training group received a 15 week computer skills training course, including generic Chinese computer operation, Chinese word processing and Chinese desktop publishing skills. The self-efficacy enhancement group received training with tutorial instructions that incorporated self-efficacy enhancement strategies and experienced self-enhancing social comparisons. The tutorial training group received behavioural learning-based tutorials only, and the control group did not receive any training. The following measurements were employed to evaluate the outcomes: the Self-Concept Questionnaire for the Physically Disabled Hong Kong Chinese (SCQPD), the computer self-efficacy rating scale and the computer performance rating scale. The self-efficacy enhancement group showed significantly better computer skills learning outcome, total self-concept, and social self-concept than the tutorial training group. The self-efficacy enhancement group did not show significant changes in their computer self-efficacy: however, the tutorial training group showed a significant lowering of their computer self-efficacy. The training strategy that incorporated self-efficacy enhancement and positive social comparison experiences maintained the computer self-efficacy of trainees with physical disabilities. This strategy was more effective in improving the learning outcome (p = 0.01) and self-concept (p = 0.05) of the trainees than the conventional tutorial-based training strategy.

  2. Configuring compute nodes of a parallel computer in an operational group into a plurality of independent non-overlapping collective networks

    DOEpatents

    Archer, Charles J.; Inglett, Todd A.; Ratterman, Joseph D.; Smith, Brian E.

    2010-03-02

    Methods, apparatus, and products are disclosed for configuring compute nodes of a parallel computer in an operational group into a plurality of independent non-overlapping collective networks, the compute nodes in the operational group connected together for data communications through a global combining network, that include: partitioning the compute nodes in the operational group into a plurality of non-overlapping subgroups; designating one compute node from each of the non-overlapping subgroups as a master node; and assigning, to the compute nodes in each of the non-overlapping subgroups, class routing instructions that organize the compute nodes in that non-overlapping subgroup as a collective network such that the master node is a physical root.
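
    A rough analogue of the partitioning described above can be written with standard MPI (here via mpi4py): split the operational communicator into non-overlapping subcommunicators and treat rank 0 of each as that subgroup's master node, i.e. the root of its collective operations. This only illustrates the general idea; the subgroup count is arbitrary and the patented class-routing mechanism is not modeled.

    from mpi4py import MPI

    world = MPI.COMM_WORLD
    num_subgroups = 4                                # assumed number of non-overlapping subgroups

    # Partition the operational group into non-overlapping subgroups.
    color = world.Get_rank() % num_subgroups
    sub = world.Split(color=color, key=world.Get_rank())

    # Rank 0 of each subcommunicator plays the "master node" (physical root) role.
    local_sum = sub.reduce(world.Get_rank(), op=MPI.SUM, root=0)
    if sub.Get_rank() == 0:
        print(f"subgroup {color}: master collected sum {local_sum}")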

  3. Computers and the Learning of Biological Concepts: Attitudes and Achievement of Nigerian Students.

    ERIC Educational Resources Information Center

    Jegede, Olugbemiro J.

    1991-01-01

    Compared attitudes toward computer use and achievement in biology for three groups of Nigerian students (n=64): (1) working alone with a computer; (2) working in groups of three on the computer; and (3) a control group that received normal instruction (lecture). Students in the second group had the highest scores on attitude. No significant…

  4. Reversal of diet-induced obesity increases insulin transport into cerebrospinal fluid and restores sensitivity to the anorexic action of central insulin in male rats.

    PubMed

    Begg, Denovan P; Mul, Joram D; Liu, Min; Reedy, Brianne M; D'Alessio, David A; Seeley, Randy J; Woods, Stephen C

    2013-03-01

    Diet-induced obesity (DIO) reduces the ability of centrally administered insulin to reduce feeding behavior and also reduces the transport of insulin from the periphery to the central nervous system (CNS). The current study was designed to determine whether reversal of high-fat DIO restores the anorexic efficacy of central insulin and whether this is accompanied by restoration of the compromised insulin transport. Adult male Long-Evans rats were initially maintained on either a low-fat chow diet (LFD) or a high-fat diet (HFD). After 22 weeks, half of the animals on the HFD were changed to the LFD, whereas the other half continued on the HFD for an additional 8 weeks, such that there were 3 groups: 1) a LFD control group (Con; n = 18), 2) a HFD-fed, DIO group (n = 17), and 3) a HFD to LFD, DIO-reversal group (DIO-rev; n = 18). The DIO reversal resulted in a significant reduction of body weight and epididymal fat weight relative to the DIO group. Acute central insulin administration (8 mU) reduced food intake and caused weight loss in Con and DIO-rev but not DIO rats. Fasting cerebrospinal fluid insulin was higher in DIO than Con animals. However, after a peripheral bolus injection of insulin, cerebrospinal fluid insulin increased in Con and DIO-rev rats but not in the DIO group. These data provide support for previous reports that DIO inhibits both the central effects of insulin and insulin's transport to the CNS. Importantly, DIO-rev restored sensitivity to the effects of central insulin on food intake and insulin transport into the CNS.

  5. Decision Making about Computer Acquisition and Use in American Schools.

    ERIC Educational Resources Information Center

    Becker, Henry Jay

    1993-01-01

    Discusses the centralization and decentralization of decision making about computer use in elementary and secondary schools based on results of a 1989 national survey. Results unexpectedly indicate that more successful programs resulted from districtwide planning than from individual teacher or school-level decision making. (LRW)

  6. 40 CFR 63.506 - General recordkeeping and reporting provisions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... General recordkeeping and reporting provisions. (a) Data retention. Unless otherwise specified in this... retained on site or shall be accessible from a central location by computer or other means that provide... offsite. Records may be maintained in hard copy or computer-readable form including, but not limited to...

  7. 40 CFR 63.506 - General recordkeeping and reporting provisions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... General recordkeeping and reporting provisions. (a) Data retention. Unless otherwise specified in this... retained on site or shall be accessible from a central location by computer or other means that provide... offsite. Records may be maintained in hard copy or computer-readable form including, but not limited to...

  8. 40 CFR 63.506 - General recordkeeping and reporting provisions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... General recordkeeping and reporting provisions. (a) Data retention. Unless otherwise specified in this... retained on site or shall be accessible from a central location by computer or other means that provide... offsite. Records may be maintained in hard copy or computer-readable form including, but not limited to...

  9. Elliptic Curve Cryptography with Java

    ERIC Educational Resources Information Center

    Klima, Richard E.; Sigmon, Neil P.

    2005-01-01

    The use of the computer, and specifically the mathematics software package Maple, has played a central role in the authors' abstract algebra course because it provides their students with a way to see realistic examples of the topics they discuss without having to struggle with extensive computations. However, Maple does not provide the computer…
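
    A classroom-sized illustration of the arithmetic such a course covers is point addition on an elliptic curve over a prime field. The sketch below is the generic textbook formula written in Python (the article itself uses Java and Maple); the curve and points are arbitrary small examples, not taken from the article.

      def ec_add(P, Q, a, p):
          # Point addition on y^2 = x^3 + a*x + b over GF(p); None is the point at infinity.
          if P is None:
              return Q
          if Q is None:
              return P
          (x1, y1), (x2, y2) = P, Q
          if x1 == x2 and (y1 + y2) % p == 0:
              return None                                        # P + (-P) = point at infinity
          if P == Q:
              lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
          else:
              lam = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
          x3 = (lam * lam - x1 - x2) % p
          y3 = (lam * (x1 - x3) - y1) % p
          return (x3, y3)

      # Doubling a point on the small curve y^2 = x^3 + 2x + 2 over GF(17).
      print(ec_add((5, 1), (5, 1), a=2, p=17))   # -> (6, 3)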

  10. Making Construals as a New Digital Skill for Learning

    ERIC Educational Resources Information Center

    Beynon, Meurig; Boyatt, Russell; Foss, Jonathan; Hall, Chris; Hudnott, Elizabeth; Russ, Steve; Sutinen, Erkki; Macleod, Hamish; Kommers, Piet

    2015-01-01

    Making construals is a practical approach to computing that was originally developed for and by computer science undergraduates. It is the central theme of an EU project aimed at disseminating the relevant principles to a broader audience. This involves bringing together technical experts in making construals and international experts in…

  11. New Directions in Statewide Computer Planning and Cooperation.

    ERIC Educational Resources Information Center

    Norris, Donald M.; St. John, Edward P.

    1981-01-01

    In the 1960s and early 1970s, statewide planning efforts usually resulted in plans for centralized hardware networks. The focus of statewide planning has shifted to the issue of improved computer financing, information sharing, and enhanced utilization in instruction and administration. A "facilitating network" concept and Missouri efforts…

  12. Web Based Parallel Programming Workshop for Undergraduate Education.

    ERIC Educational Resources Information Center

    Marcus, Robert L.; Robertson, Douglass

    Central State University (Ohio), under a contract with Nichols Research Corporation, has developed a World Wide Web-based workshop on high performance computing entitled "IBM SP2 Parallel Programming Workshop." The research is part of the DoD (Department of Defense) High Performance Computing Modernization Program. The research…

  13. Teaching Pronunciation with Computer Assisted Pronunciation Instruction in a Technological University

    ERIC Educational Resources Information Center

    Liu, Sze-Chu; Hung, Po-Yi

    2016-01-01

    The purpose of this study is to evaluate the effectiveness of computer assisted pronunciation instruction in English pronunciation for students in vocational colleges and universities in Taiwan. The participants were fifty-one first-year undergraduate students from a technological university located in central Taiwan. The participants received an…

  14. 12 CFR 403.9 - Fees.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... SECURITY INFORMATION § 403.9 Fees. The following specific fees shall be applicable with respect to services... records, per hour or fraction thereof: (i) Professional $11.00 (ii) Clerical 6.00 (b) Computer service charges per second for actual use of computer central processing unit .25 (c) Copies made by photostat or...

  15. 12 CFR 403.9 - Fees.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... SECURITY INFORMATION § 403.9 Fees. The following specific fees shall be applicable with respect to services... records, per hour or fraction thereof: (i) Professional $11.00 (ii) Clerical 6.00 (b) Computer service charges per second for actual use of computer central processing unit .25 (c) Copies made by photostat or...

  16. 12 CFR 403.9 - Fees.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... SECURITY INFORMATION § 403.9 Fees. The following specific fees shall be applicable with respect to services... records, per hour or fraction thereof: (i) Professional $11.00 (ii) Clerical 6.00 (b) Computer service charges per second for actual use of computer central processing unit .25 (c) Copies made by photostat or...

  17. 12 CFR 403.9 - Fees.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... SECURITY INFORMATION § 403.9 Fees. The following specific fees shall be applicable with respect to services... records, per hour or fraction thereof: (i) Professional $11.00 (ii) Clerical 6.00 (b) Computer service charges per second for actual use of computer central processing unit .25 (c) Copies made by photostat or...

  18. 12 CFR 403.9 - Fees.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... SECURITY INFORMATION § 403.9 Fees. The following specific fees shall be applicable with respect to services... records, per hour or fraction thereof: (i) Professional $11.00 (ii) Clerical 6.00 (b) Computer service charges per second for actual use of computer central processing unit .25 (c) Copies made by photostat or...

  19. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bremer, Peer-Timo; Mohr, Bernd; Schulz, Martin

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  20. Inertial subsystem functional and design requirements for the orbiter (Phase B extension baseline)

    NASA Technical Reports Server (NTRS)

    Flanders, J. H.; Green, J. P., Jr.

    1972-01-01

    The design requirements use the Phase B extension baseline system definition. This means that a GNC computer is specified for all command control functions instead of a central computer communicating with the ISS through a databus. Forced air cooling is used instead of cold plate cooling.

  1. Bioinformatics and Astrophysics Cluster (BinAc)

    NASA Astrophysics Data System (ADS)

    Krüger, Jens; Lutz, Volker; Bartusch, Felix; Dilling, Werner; Gorska, Anna; Schäfer, Christoph; Walter, Thomas

    2017-09-01

    BinAC provides central high performance computing capacities for bioinformaticians and astrophysicists from the state of Baden-Württemberg. The bwForCluster BinAC is part of the implementation concept for scientific computing for the universities in Baden-Württemberg. Community specific support is offered through the bwHPC-C5 project.

  2. EFFECTS OF INTRAVITREAL RANIBIZUMAB AND BEVACIZUMAB ON THE RETINAL VESSEL SIZE IN DIABETIC MACULAR EDEMA.

    PubMed

    Kurt, Muhammed Mustafa; Çekiç, Osman; Akpolat, Çetin; Elçioglu, Mustafa

    2018-06-01

    The goal of this study was to assess the effects of a single injection of intravitreal ranibizumab (RAN) or bevacizumab (BEV) on the retinal vessel size in eyes with diabetic macular edema. In total, 32 patients were enrolled in the RAN group, and 30 patients were included in the BEV group. Each of these groups was also subdivided into two other groups: a study group and a control group. The study groups were composed of the injected eyes, whereas the noninjected fellow eyes served as the control groups. The patients underwent complete ophthalmic examinations, including optical coherence tomography and fundus fluorescein angiography, and the primary outcome measures included the central retinal artery equivalent, central retinal vein equivalent, and artery-to-vein ratio. In the RAN study group (n = 32), the preinjection mean central retinal artery equivalent (175.42 μm) decreased to 169.01 μm after 1 week, and to 167.47 μm after 1 month (P < 0.001), whereas the baseline central retinal vein equivalent (235.29 μm) decreased initially to 219.90 μm after 1 week, and to 218.36 μm after 1 month (P < 0.001). In the BEV study group (n = 30), the preinjection central retinal artery equivalent (150.21 μm) decreased to 146.25 μm after 1 week, and to 145.89 μm after 1 month (P < 0.001), whereas the baseline central retinal vein equivalent (211.87 μm) decreased initially to 204.59 μm after 1 week and was 205.24 μm after 1 month (P < 0.001). The preinjection artery-to-vein ratio values changed significantly (P = 0.001) after 1 week and after 1 month in the RAN group, but no significant alteration in the artery-to-vein ratio was observed in the BEV group (P = 0.433). In both the RAN (n = 32) and BEV (n = 30) control groups, none of the 3 parameters changed throughout the study period, when compared with the baseline. The results of this study showed that both RAN and BEV injections significantly constricted the retinal blood vessel diameters.

  3. [Application of locomotor activity test to evaluate functional injury after global cerebral ischemia in C57BL/6 mice].

    PubMed

    Zhang, Li-quan; Xu, Jia-ni; Wang, Zhen-zhen; Zeng, Li-jun; Ye, Yi-lu; Zhang, Wei-ping; Wei, Er-qing; Zhang, Qi

    2014-05-01

    To evaluate the application of the locomotor activity test to functional injury after global cerebral ischemia (GCI) in C57BL/6 mice. GCI was induced by bilateral carotid artery occlusion for 30 min in C57BL/6 mice. Mice were divided into a sham group, a GCI group and a minocycline group. Saline or minocycline (45 mg/kg) was injected i.p. once daily for 6 d after ischemia. At Day 6 after ischemia, locomotor activity was recorded for 1 h in an open field test. Total distance, central distance, central distance ratio, periphery distance, periphery distance ratio, central time and periphery time were used to evaluate the behavioral characteristics of locomotor activity in C57BL/6 mice after ischemia. Surviving neuron density was detected by Nissl staining in the hippocampus, cortex and striatum. Compared with the sham group, total distance, central distance and central time increased and periphery time decreased in C57BL/6 mice after GCI (Ps<0.05). However, minocycline significantly reduced the central distance and central time and increased the periphery time (Ps<0.05). Neurons were damaged in the hippocampus, cortex and striatum after GCI, manifested by decreased neuron numbers, with the most serious damage in the hippocampal CA1 region. Minocycline significantly improved neuron appearance and increased neuron numbers in the hippocampus and striatum (P<0.001 or P<0.05). Locomotor activity in the open field test can objectively evaluate behavioral injury after GCI in mice. Central distance and central time can be used as indexes of quantitative assessment.

  4. CORRELATION OF CLINICAL AND STRUCTURAL PROGRESSION WITH VISUAL ACUITY LOSS IN MACULAR TELANGIECTASIA TYPE 2: MacTel Project Report No. 6-The MacTel Research Group.

    PubMed

    Peto, Tunde; Heeren, Tjebo F C; Clemons, Traci E; Sallo, Ferenc B; Leung, Irene; Chew, Emily Y; Bird, Alan C

    2018-01-01

    To evaluate progression of macular telangiectasia Type 2 lesions and their correlation with visual acuity. An international multicenter prospective study with annual examinations including best-corrected visual acuity (BCVA), fundus photography, fluorescein angiography, and optical coherence tomography images graded centrally. Mixed models were used to estimate progression rates, and a generalized linear model to compute the relative risk of BCVA loss, loss of ellipsoid zone (EZ) reflectivity, development of pigment plaques, or neovascularization. One thousand and fourteen eyes of 507 participants were followed for 4.2 ± 1.6 years. Best-corrected visual acuity decreased 1.07 ± 0.05 letters (mean ± SE) per year. Of all eyes, 15% lost ≥15 letters after 5 years. Of the eyes without EZ loss, 76% developed a noncentral loss. Of the eyes with noncentral loss, 45% progressed to central EZ loss. The rate of BCVA loss in eyes with noncentral EZ loss at baseline was similar to eyes without EZ loss. The rate of BCVA loss was significantly higher in eyes with central EZ loss at baseline (-1.40 ± 0.14 letters, P < 0.001). Ellipsoid zone loss is frequently found in macular telangiectasia Type 2 and is an important structural component reflecting visual function. Its presence in the fovea significantly correlates with worse visual prognosis.

  5. A computer program for anisotropic shallow-shell finite elements using symbolic integration

    NASA Technical Reports Server (NTRS)

    Andersen, C. M.; Bowen, J. T.

    1976-01-01

    A FORTRAN computer program for anisotropic shallow-shell finite elements with variable curvature is described. A listing of the program is presented together with printed output for a sample case. Computation times and central memory requirements are given for several different elements. The program is based on a stiffness (displacement) finite-element model in which the fundamental unknowns consist of both the displacement and the rotation components of the reference surface of the shell. Two triangular and four quadrilateral elements are implemented in the program. The triangular elements have 6 or 10 nodes, and the quadrilateral elements have 4 or 8 nodes. Two of the quadrilateral elements have internal degrees of freedom associated with displacement modes which vanish along the edges of the elements (bubble modes). The triangular elements and the remaining two quadrilateral elements do not have bubble modes. The output from the program consists of arrays corresponding to the stiffness, the geometric stiffness, the consistent mass, and the consistent load matrices for individual elements. The integrals required for the generation of these arrays are evaluated by using symbolic (or analytic) integration in conjunction with certain group-theoretic techniques. The analytic expressions for the integrals are exact and were developed using a symbolic and algebraic manipulation language.
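
    The symbolic-integration idea is easy to demonstrate on a much simpler element than the shallow-shell elements of the report. The sketch below uses sympy as an assumed stand-in for the symbolic manipulation system and forms the exact stiffness matrix of a one-dimensional two-node bar element by integrating products of shape-function derivatives analytically; it only illustrates the technique.

      import sympy as sp

      x, L, E, A = sp.symbols("x L E A", positive=True)
      N = [1 - x / L, x / L]                     # linear shape functions of a 2-node bar
      B = [sp.diff(Ni, x) for Ni in N]           # shape-function derivatives
      K = sp.Matrix(2, 2, lambda i, j: sp.integrate(E * A * B[i] * B[j], (x, 0, L)))
      print(K)                                   # Matrix([[A*E/L, -A*E/L], [-A*E/L, A*E/L]])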

  6. The impact of goal-oriented task design on neurofeedback learning for brain-computer interface control.

    PubMed

    McWhinney, S R; Tremblay, A; Boe, S G; Bardouille, T

    2018-02-01

    Neurofeedback training teaches individuals to modulate brain activity by providing real-time feedback and can be used for brain-computer interface control. The present study aimed to optimize training by maximizing engagement through goal-oriented task design. Participants were shown either a visual display or a robot, where each was manipulated using motor imagery (MI)-related electroencephalography signals. Those with the robot were instructed to quickly navigate grid spaces, as the potential for goal-oriented design to strengthen learning was central to our investigation. Both groups were hypothesized to show increased magnitude of these signals across 10 sessions, with the greatest gains being seen in those navigating the robot due to increased engagement. Participants demonstrated the predicted increase in magnitude, with no differentiation between hemispheres. Participants navigating the robot showed stronger left-hand MI increases than those with the computer display. This is likely due to success being reliant on maintaining strong MI-related signals. While older participants showed stronger signals in early sessions, this trend later reversed, suggesting greater natural proficiency but reduced flexibility. These results demonstrate capacity for modulating neurofeedback using MI over a series of training sessions, using tasks of varied design. Importantly, the more goal-oriented robot control task resulted in greater improvements.

  7. High energy X-ray CT study on the central void formations and the fuel pin deformations of FBR fuel assemblies

    NASA Astrophysics Data System (ADS)

    Katsuyama, Kozo; Nagamine, Tsuyoshi; Matsumoto, Shin-ichiro; Sato, Seichi

    2007-02-01

    The central void formations and deformations of fuel pins were investigated in fuel assemblies irradiated to high burn-up, using a non-destructive X-ray CT (computer tomography) technique. In this X-ray CT, the effect of strong gamma ray activity could be reduced to a negligible degree by using the pulse of a high energy X-ray source and detecting the intensity of the transmitted X-rays in synchronization with the generated X-rays. Clear cross-sectional images of fuel assemblies irradiated to high burn-up in a fast breeder reactor were successively obtained, in which the wrapping wires, cladding, pellets and central voids could be distinctly seen. The diameter of a typical central void measured by X-ray CT agreed with the one obtained by ceramography within an error of 0.1 mm. Based on this result, the dependence of the central void diameter on the linear heating rate was analyzed. In addition, the deformation behavior of a fuel pin along its axial direction could be analyzed from 20 stepwise X-ray cross-sectional images obtained in a small interval, and the results obtained showed a good agreement with the predictions calculated by two computer codes.

  8. Low erythrocyte Na/K-pump activity and number in northeast Thailand adults: evidence suggesting an acquired disorder.

    PubMed

    Tosukhowong, P; Tungsanga, K; Kittinantavorakoon, C; Chaitachawong, C; Pansin, P; Sriboonlue, P; Sitprija, V

    1996-07-01

    Healthy northeastern Thais have a higher erythrocyte sodium concentration and a lower erythrocyte membrane Na,K-adenosine triphosphatase (ATPase) activity than central Thais. To elucidate whether the defect is hereditary or acquired, we studied plasma sodium and potassium and erythrocyte sodium, potassium, Na,K-ATPase activity, and ouabain-binding sites (OBS) in the following groups: healthy newborns of ethnic central Thais (group 1), healthy newborns of ethnic northeast Thais (group 2), healthy adults of central Thailand ethnicity who lived in the rural central region (group 3) or in Bangkok (group 4), healthy adults of northeast Thailand ethnicity who lived in the rural northeast region (group 5) or who migrated to work in Bangkok for at least 1 year (group 6). Erythrocyte Na was higher in group 2 than in group 1. Group 3 had lower erythrocyte Na,K-ATPase activity than group 4, and it was lower in group 5 than in group 6. Among all adult groups, group 5 had the highest erythrocyte Na (11.6 mmol/L, F < 0.0001) and the lowest Na,K-ATPase activity (63 mmol Pi/mg x h, F < 0.0001) and erythrocyte OBS (397 sites per cell, F < 0.05). There was a positive correlation between erythrocyte Na,K-ATPase and erythrocyte OBS (r = .416, P < .0001). Multiple regression analysis demonstrated a correlation between erythrocyte Na as a dependent variable and erythrocyte OBS, plasma potassium, erythrocyte potassium, and erythrocyte Na,K-ATPase (r = .517, P < .0001). The erythrocyte Na,K-ATPase/OBS ratio, an expression of Na,K-ATPase activity equalized for the number of Na,K-pump units, was lowest among rural adults of the central region (group 3) and the northeast region (group 5) (F < 0.0002). Our data suggest that rural dwellers in Thailand tend to have lower erythrocyte Na,K-ATPase activity than urban dwellers and that this is probably acquired after birth. It was more severe among those from the northeast versus the central region, and was less severe among those who migrated to an urban area. This defect in northeast rural dwellers was probably associated with low numbers of Na,K-pump units and a defect of the pump to express activity, whereas in central rural dwellers it was probably associated with the latter condition. We postulate that there might be circulating Na,K-pump inhibitors and metabolic disturbances that cause attenuation of Na,K-ATPase function and synthesis in the northeast Thailand rural population, and that such substances may have an environmental origin. There may be a relationship between these abnormalities and sudden unexpected deaths.

  9. Myometrial contractility influences oxytocin receptor (OXTR) expression in term trophoblast cells obtained from the maternal surface of the human placenta.

    PubMed

    Szukiewicz, Dariusz; Bilska, Anna; Mittal, Tarun Kumar; Stangret, Aleksandra; Wejman, Jaroslaw; Szewczyk, Grzegorz; Pyzlak, Michal; Zamlynski, Jacek

    2015-09-16

    Oxytocin (OXT) acts through its specific receptor (OXTR), and increased density of OXTR and/or augmented sensitivity to OXT were postulated as prerequisites of the normal onset of labor. Expression of OXTR in placental term trophoblast cells has not yet been analyzed in the context of contractile activity of the uterus. Here we comparatively examine OXT content in the placental tissue adjacent to the uterine wall and expression of OXTR in this tissue and corresponding isolated placental trophoblast cells. Twenty-eight placentae after normal labors at term (group I, N = 14) and after cesarean sections performed without uterine contractile activity (group II, N = 14) have been collected. Tissue excised from the maternal surface of each examined placenta was used for OXT concentration measurement, cytotrophoblast cell culture preparation and immunohistochemistry of OXTR. Concentration of OXT was estimated in the tissue homogenates by an enzyme immunoassay with colorimetric detection. Cytotrophoblast cells were isolated using Kliman's method based on trypsin, DNase, and a 5-70% Percoll gradient centrifugation. The cultures were incubated for 5 days in normoxia. Both placental specimens and terminated cytotrophoblast cultures were fixed and embedded in paraffin before being immunostained for OXTR. Using light microscopy with computed morphometry for quantitative analysis, OXTR expressions were estimated in calibrated areas of the paraffin sections. There were no significant differences between the groups with respect to the mean OXT concentration. However, in both groups the median value of OXT concentration was significantly (p < 0.05) higher in the tissue obtained from the peripheral regions of the maternal surface of the placenta, compared to the samples from the central region of this surface. In placental tissue the mean expression of OXTR in group I was significantly (p < 0.05) increased by approximately 3.2-fold and 3.45-fold (the samples collected from central and peripheral regions, respectively) compared to the values obtained in group II. In the isolated primary trophoblast cultures the differences were even more evident (p < 0.02), and the mean change in OXTR expression in group I comprised an approximately 6.9-fold increase and a 6.5-fold increase (the samples collected from central and peripheral regions, respectively) compared to the values obtained in group II. Upregulation of OXTR within placental trophoblast cells localized close to or adherent to the uterine wall may play a crucial role in labor with efficient contractile activity (vaginal delivery). Further studies may disclose if this local OXT/OXTR signaling is utilized in the third stage of labor to elicit placental detachment or contribute in a more versatile way throughout the labor period.

  10. Isolated central diabetes insipidus in a newborn with congenital toxoplasmosis.

    PubMed

    Karadag, Ahmet; Erdeve, Omer; Atasay, Begum; Arsan, Saadet; Deda, Gulhis; Ince, Erdal; Ocal, Gonul; Berberoglu, Merih

    2006-02-01

    We present a 5-day-old male newborn with isolated central diabetes insipidus due to congenital toxoplasmosis. This patient was referred to us for hydrocephalus. As we investigated the aetiology of the hydrocephalus, the patient's serum and cerebrospinal fluid tested positive for toxoplasmosis via ELISA and polymerase chain reaction. Computed tomography showed obstructive hydrocephalus and disseminated cranial calcifications. Central diabetes insipidus developed on the 10th day, apparently as a result of the toxoplasmosis infection, and was treated successfully with oral desmopressin.

  11. Nonequilibrium scheme for computing the flux of the convection-diffusion equation in the framework of the lattice Boltzmann method.

    PubMed

    Chai, Zhenhua; Zhao, T S

    2014-07-01

    In this paper, we propose a local nonequilibrium scheme for computing the flux of the convection-diffusion equation with a source term in the framework of the multiple-relaxation-time (MRT) lattice Boltzmann method (LBM). Both the Chapman-Enskog analysis and the numerical results show that, at the diffusive scaling, the present nonequilibrium scheme has a second-order convergence rate in space. A comparison between the nonequilibrium scheme and the conventional second-order central-difference scheme indicates that, although both schemes have a second-order convergence rate in space, the present nonequilibrium scheme is more accurate than the central-difference scheme. In addition, the flux computation rendered by the present scheme also preserves the parallel computation feature of the LBM, making the scheme more efficient than conventional finite-difference schemes in the study of large-scale problems. Finally, a comparison between the single-relaxation-time model and the MRT model is also conducted, and the results show that the MRT model is more accurate than the single-relaxation-time model, both in solving the convection-diffusion equation and in computing the flux.
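
    The conventional baseline mentioned in the comparison, a second-order central-difference approximation of the diffusive flux, is simple to sketch; the paper's local nonequilibrium LBM scheme itself requires a full lattice Boltzmann implementation and is not reproduced here. The grid, boundary handling, and test function below are illustrative choices only.

      import numpy as np

      def central_difference_flux(C, dx, D):
          q = np.empty_like(C)
          q[1:-1] = -D * (C[2:] - C[:-2]) / (2.0 * dx)   # interior points, second order
          q[0] = -D * (C[1] - C[0]) / dx                 # one-sided differences at the ends
          q[-1] = -D * (C[-1] - C[-2]) / dx
          return q

      x = np.linspace(0.0, 1.0, 101)
      C = np.sin(np.pi * x)
      q = central_difference_flux(C, dx=x[1] - x[0], D=1.0)
      print(q[:3])   # compare with the exact flux -pi*cos(pi*x)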

  12. Comparative historical biogeography of three groups of Nearctic freshwater fishes across central Mexico.

    PubMed

    Pérez-Rodríguez, R; Domínguez-Domínguez, O; Doadrio, I; Cuevas-García, E; Pérez-Ponce de León, G

    2015-03-01

    Biogeographic patterns of the three main Nearctic groups of continental fishes inhabiting river drainages in central Mexico (livebearing goodeids, southern Mexican notropins and species of Algansea, the last two representing independent lineages of cyprinids) were obtained and compared by following two approaches: an estimate of divergence times and using a well-defined biogeographic method. Three concordant biogeographic events were identified among the three groups, showing some evidence of a partially congruent evolutionary history. The analysed groups show at least three independent colonization events into central Mexico: two western routes, followed by the Goodeinae and members of Algansea, and an early Plateau route followed by southern notropins. The most recent common ancestor (MRCA) of each of the three freshwater fish groups diversified in central Mexico in the Late Miocene. The lack of a strong congruence in their biogeographic patterns, and the differences in species richness among the three clades might be evidence for distinct patterns of diversification. © 2015 The Fisheries Society of the British Isles.

  13. Federated Tensor Factorization for Computational Phenotyping

    PubMed Central

    Kim, Yejin; Sun, Jimeng; Yu, Hwanjo; Jiang, Xiaoqian

    2017-01-01

    Tensor factorization models offer an effective approach to convert massive electronic health records into meaningful clinical concepts (phenotypes) for data analysis. These models need a large amount of diverse samples to avoid population bias. An open challenge is how to derive phenotypes jointly across multiple hospitals, in which direct patient-level data sharing is not possible (e.g., due to institutional policies). In this paper, we developed a novel solution to enable federated tensor factorization for computational phenotyping without sharing patient-level data. We developed secure data harmonization and federated computation procedures based on alternating direction method of multipliers (ADMM). Using this method, the multiple hospitals iteratively update tensors and transfer secure summarized information to a central server, and the server aggregates the information to generate phenotypes. We demonstrated with real medical datasets that our method resembles the centralized training model (based on combined datasets) in terms of accuracy and phenotypes discovery while respecting privacy. PMID:29071165
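
    The communication pattern described here, sites sending only summarized factors to a central server that aggregates them, can be sketched in a few lines. The toy below shows only the consensus-averaging step; a faithful ADMM-based method also carries dual variables and alternating tensor updates, and the array shapes are arbitrary assumptions.

      import numpy as np

      def server_aggregate(local_factors):
          # Consensus step: average the summarized factor matrices from all sites.
          return np.mean(np.stack(local_factors), axis=0)

      rng = np.random.default_rng(0)
      hospitals = [rng.random((20, 5)) for _ in range(3)]   # 3 sites, 20 features, 5 phenotypes
      global_factor = server_aggregate(hospitals)
      print(global_factor.shape)                            # (20, 5)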

  14. Laboratory and software applications for clinical trials: the global laboratory environment.

    PubMed

    Briscoe, Chad

    2011-11-01

    The Applied Pharmaceutical Software Meeting is held annually. It is sponsored by The Boston Society, a not-for-profit organization that coordinates a series of meetings within the global pharmaceutical industry. The meeting generally focuses on laboratory applications, but in recent years has expanded to include some software applications for clinical trials. The 2011 meeting emphasized the global laboratory environment. Global clinical trials generate massive amounts of data in many locations that must be centralized and processed for efficient analysis. Thus, the meeting had a strong focus on establishing networks and systems for dealing with the computer infrastructure to support such environments. In addition to the globally installed laboratory information management system, electronic laboratory notebook and other traditional laboratory applications, cloud computing is quickly becoming the answer to provide efficient, inexpensive options for managing the large volumes of data and computing power, and thus it served as a central theme for the meeting.

  15. Scalable software-defined optical networking with high-performance routing and wavelength assignment algorithms.

    PubMed

    Lee, Chankyun; Cao, Xiaoyuan; Yoshikane, Noboru; Tsuritani, Takehiro; Rhee, June-Koo Kevin

    2015-10-19

    The feasibility of software-defined optical networking (SDON) for practical applications critically depends on the scalability of centralized control performance. In this paper, highly scalable routing and wavelength assignment (RWA) algorithms are investigated on an OpenFlow-based SDON testbed for proof-of-concept demonstration. Efficient RWA algorithms are proposed to achieve high network capacity with reduced computation cost, which is a significant attribute in a scalable centralized-control SDON. The proposed heuristic RWA algorithms differ in the order in which requests are processed and in the routing table update procedures. Combined with a shortest-path-based routing algorithm, a hottest-request-first processing policy that considers demand intensity and end-to-end distance information offers both the highest network throughput and acceptable computation scalability. We further investigate the trade-off between network throughput and computation complexity in the routing table update procedure by a simulation study.
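
    A hedged sketch of such a heuristic: order requests by a "hotness" score (here, demand intensity times hop distance, one plausible reading of the policy named above), route each on a shortest path with networkx, and assign the first wavelength that is free on every link of that path. The names and toy topology are illustrative, not the paper's implementation.

      import networkx as nx

      def rwa(G, requests, num_wavelengths):
          used = {}                                   # link -> set of busy wavelengths
          link = lambda u, v: tuple(sorted((u, v)))
          # "Hottest" first: larger intensity * hop distance is served earlier.
          ranked = sorted(requests, key=lambda r: r["intensity"] *
                          nx.shortest_path_length(G, r["src"], r["dst"]), reverse=True)
          assignment = {}
          for r in ranked:
              path = nx.shortest_path(G, r["src"], r["dst"])
              links = [link(u, v) for u, v in zip(path, path[1:])]
              for w in range(num_wavelengths):        # first-fit wavelength assignment
                  if all(w not in used.get(l, set()) for l in links):
                      for l in links:
                          used.setdefault(l, set()).add(w)
                      assignment[r["id"]] = (path, w)
                      break
          return assignment

      G = nx.cycle_graph(6)
      reqs = [{"id": i, "src": i, "dst": (i + 2) % 6, "intensity": i + 1} for i in range(4)]
      print(rwa(G, reqs, num_wavelengths=2))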

  16. Conjunctival impression cytology in computer users.

    PubMed

    Kumar, S; Bansal, R; Khare, A; Malik, K P S; Malik, V K; Jain, K; Jain, C

    2013-01-01

    It is known that computer users develop features of dry eye. To study the cytological changes in the conjunctiva using conjunctival impression cytology in computer users and a control group. Fifteen eyes of computer users who had used computers for more than one year and ten eyes of an age- and sex-matched control group (those who had not used computers) were studied by conjunctival impression cytology. Conjunctival impression cytology (CIC) results in the control group were of stage 0 and stage I, while the computer user group showed CIC results between stage II and stage IV. Among the computer users, the majority (>90%) showed stage III and stage IV changes. We found that those who used computers daily for long hours developed more CIC changes than those who worked at the computer for a shorter daily duration. © NEPjOPH.

  17. Treatment of central venous in-stent restenosis with repeat stent deployment in hemodialysis patients.

    PubMed

    Ronald, James; Davis, Bradley; Guevara, Carlos J; Pabon-Ramos, Waleska M; Smith, Tony P; Kim, Charles Y

    2017-05-15

    To report patency rates for stent deployment for treatment of in-stent stenosis of the central veins of the chest in hemodialysis patients. A retrospective analysis was performed on 29 patients who underwent 35 secondary percutaneous transluminal stent (PTS) deployments for in-stent stenosis within the central veins that were refractory to angioplasty and ipsilateral to a functioning hemodialysis access (in-stent PTS group). For comparison, patency data were acquired for 47 patients who underwent 78 successful percutaneous transluminal angioplasty (PTA) procedures for in-stent stenosis (in-stent PTA group) and 55 patients who underwent 55 stent deployments within native central vein stenosis refractory to angioplasty (native vein PTS group). The 3-, 6-, and 12-month primary lesion patency for the in-stent PTS group was 73%, 57%, and 32%, respectively. The 3-, 6-, and 12-month primary patency for the in-stent PTA group was 70%, 38%, and 17% and for the native vein PTS group was 78%, 57%, and 26%, which were similar to the in-stent PTS group (p = 0.20 and 0.41, respectively). The 3-, 6-, and 12-month secondary access patency was 91%, 73%, and 65% for the in-stent PTS group. Sub-analysis of the in-stent PTS group revealed no difference in primary (p = 0.93) or secondary patency rates (p = 0.27) of bare metal stents (n = 23) compared with stent grafts (n = 12). Stent deployment for central vein in-stent stenosis refractory to angioplasty was associated with reasonable patency rates, which were similar to in-stent PTA and native vein PTS.

  18. NMR Crystallography of a Carbanionic Intermediate in Tryptophan Synthase: Chemical Structure, Tautomerization, and Reaction Specificity.

    PubMed

    Caulkins, Bethany G; Young, Robert P; Kudla, Ryan A; Yang, Chen; Bittbauer, Thomas J; Bastin, Baback; Hilario, Eduardo; Fan, Li; Marsella, Michael J; Dunn, Michael F; Mueller, Leonard J

    2016-11-23

    Carbanionic intermediates play a central role in the catalytic transformations of amino acids performed by pyridoxal-5'-phosphate (PLP)-dependent enzymes. Here, we make use of NMR crystallography-the synergistic combination of solid-state nuclear magnetic resonance, X-ray crystallography, and computational chemistry-to interrogate a carbanionic/quinonoid intermediate analogue in the β-subunit active site of the PLP-requiring enzyme tryptophan synthase. The solid-state NMR chemical shifts of the PLP pyridine ring nitrogen and additional sites, coupled with first-principles computational models, allow a detailed model of protonation states for ionizable groups on the cofactor, substrates, and nearby catalytic residues to be established. Most significantly, we find that a deprotonated pyridine nitrogen on PLP precludes formation of a true quinonoid species and that there is an equilibrium between the phenolic and protonated Schiff base tautomeric forms of this intermediate. Natural bond orbital analysis indicates that the latter builds up negative charge at the substrate Cα and positive charge at C4' of the cofactor, consistent with its role as the catalytic tautomer. These findings support the hypothesis that the specificity for β-elimination/replacement versus transamination is dictated in part by the protonation states of ionizable groups on PLP and the reacting substrates and underscore the essential role that NMR crystallography can play in characterizing both chemical structure and dynamics within functioning enzyme active sites.

  19. NMR Crystallography of a Carbanionic Intermediate in Tryptophan Synthase: Chemical Structure, Tautomerization, and Reaction Specificity

    PubMed Central

    2016-01-01

    Carbanionic intermediates play a central role in the catalytic transformations of amino acids performed by pyridoxal-5′-phosphate (PLP)-dependent enzymes. Here, we make use of NMR crystallography—the synergistic combination of solid-state nuclear magnetic resonance, X-ray crystallography, and computational chemistry—to interrogate a carbanionic/quinonoid intermediate analogue in the β-subunit active site of the PLP-requiring enzyme tryptophan synthase. The solid-state NMR chemical shifts of the PLP pyridine ring nitrogen and additional sites, coupled with first-principles computational models, allow a detailed model of protonation states for ionizable groups on the cofactor, substrates, and nearby catalytic residues to be established. Most significantly, we find that a deprotonated pyridine nitrogen on PLP precludes formation of a true quinonoid species and that there is an equilibrium between the phenolic and protonated Schiff base tautomeric forms of this intermediate. Natural bond orbital analysis indicates that the latter builds up negative charge at the substrate Cα and positive charge at C4′ of the cofactor, consistent with its role as the catalytic tautomer. These findings support the hypothesis that the specificity for β-elimination/replacement versus transamination is dictated in part by the protonation states of ionizable groups on PLP and the reacting substrates and underscore the essential role that NMR crystallography can play in characterizing both chemical structure and dynamics within functioning enzyme active sites. PMID:27779384

  20. Lessons learned from a regional strategy for resource allocation.

    PubMed

    Edwards, Janine C; Stapley, Jonathan; Akins, Ralitsa; Silenas, Rasa; Williams, Josie R

    2005-01-01

    Two qualitative case studies focus on the allocation of CDC funds distributed during 2002 for bioterrorism preparedness in two Texas public health regions (each as populous and complex as many states). Lessons learned are presented for public health officials and others who work to build essential public health services and security for our nation. The first lesson is that personal relationships are the cornerstone of preparedness. A major lesson is that a regional strategy to manage funds may be more effective than allocating funds on a per capita basis. One regional director required every local department to complete a strategic plan as a basis for proportional allocation of the funds. Control of communicable diseases was a central component of the planning. Some funds were kept at the regional level to provide epidemiology services, computer software, equipment, and training for the entire region. Confirmation of the value of this regional strategy was expressed by local public health and emergency management officials in a focus group 1 year after the strategy had been implemented. The group members also pointed out the need to streamline the planning process, provide up-to-date computer networks, and receive more than minimal communication. This regional strategy can be viewed from the perspective of adaptive leadership, defined as activities to bring about constructive change, which also can be used to analyze other difficult areas of preparedness.

  1. Toward a framework for computer-mediated collaborative design in medical informatics.

    PubMed

    Patel, V L; Kaufman, D R; Allen, V G; Shortliffe, E H; Cimino, J J; Greenes, R A

    1999-09-01

    The development and implementation of enabling tools and methods that provide ready access to knowledge and information are among the central goals of medical informatics. The need for multi-institutional collaboration in the development of such tools and methods is increasingly being recognized. Collaboration involves communication, which typically involves individuals who work together at the same location. With the evolution of electronic modalities for communication, we seek to understand the role that such technologies can play in supporting collaboration, especially when the participants are geographically separated. Using the InterMed Collaboratory as a subject of study, we have analyzed their activities as an exercise in computer- and network-mediated collaborative design. We report on the cognitive, sociocultural, and logistical issues encountered when scientists from diverse organizations and backgrounds use communications technologies while designing and implementing shared products. Results demonstrate that it is important to match carefully the content with the mode of communication, identifying, for example, suitable uses of E-mail, conference calls, and face-to-face meetings. The special role of leaders in guiding and facilitating the group activities can also be seen, regardless of the communication setting in which the interactions occur. Most important is the proper use of technology to support the evolution of a shared vision of group goals and methods, an element that is clearly necessary before successful collaborative designs can proceed.

  2. Scalable and massively parallel Monte Carlo photon transport simulations for heterogeneous computing platforms

    NASA Astrophysics Data System (ADS)

    Yu, Leiming; Nina-Paravecino, Fanny; Kaeli, David; Fang, Qianqian

    2018-01-01

    We present a highly scalable Monte Carlo (MC) three-dimensional photon transport simulation platform designed for heterogeneous computing systems. Through the development of a massively parallel MC algorithm using the Open Computing Language framework, this research extends our existing graphics processing unit (GPU)-accelerated MC technique to a highly scalable vendor-independent heterogeneous computing environment, achieving significantly improved performance and software portability. A number of parallel computing techniques are investigated to achieve portable performance over a wide range of computing hardware. Furthermore, multiple thread-level and device-level load-balancing strategies are developed to obtain efficient simulations using multiple central processing units and GPUs.
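
    At the core of such codes is a per-photon random walk that can be sketched on the CPU in a few lines: sample an exponential step length from the scattering coefficient and attenuate the photon weight by absorption along the step. The sketch below is illustrative only; the platform described here launches many such walks in parallel through OpenCL on CPUs and GPUs, and boundary handling and scattering directions are omitted.

      import numpy as np

      def run_photons(n_photons, mu_a=0.1, mu_s=10.0, max_steps=1000, seed=0):
          rng = np.random.default_rng(seed)
          weights = np.ones(n_photons)                      # photon packet weights
          path = np.zeros(n_photons)                        # accumulated path length
          for _ in range(max_steps):
              step = rng.exponential(1.0 / mu_s, n_photons) # free path between scattering events
              path += step
              weights *= np.exp(-mu_a * step)               # absorption along the step
              if weights.max() < 1e-4:                      # crude termination criterion
                  break
          return path, weights

      path, weights = run_photons(10_000)
      print(path.mean(), weights.mean())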

  3. Contextuality as a Resource for Models of Quantum Computation with Qubits

    NASA Astrophysics Data System (ADS)

    Bermejo-Vega, Juan; Delfosse, Nicolas; Browne, Dan E.; Okay, Cihan; Raussendorf, Robert

    2017-09-01

    A central question in quantum computation is to identify the resources that are responsible for quantum speed-up. Quantum contextuality has been recently shown to be a resource for quantum computation with magic states for odd-prime dimensional qudits and two-dimensional systems with real wave functions. The phenomenon of state-independent contextuality poses a priori an obstruction to characterizing the case of regular qubits, the fundamental building block of quantum computation. Here, we establish contextuality of magic states as a necessary resource for a large class of quantum computation schemes on qubits. We illustrate our result with a concrete scheme related to measurement-based quantum computation.

  4. Thyroid Hormone Indices in Computer Workers with Emphasis on the Role of Zinc Supplementation.

    PubMed

    Amin, Ahmed Ibrahim; Hegazy, Noha Mohamed; Ibrahim, Khadiga Salah; Mahdy-Abdallah, Heba; Hammouda, Hamdy A A; Shaban, Eman Essam

    2016-06-15

    This study aimed to investigate the effects of computer monitor-emitted radiation on thyroid hormones and the possible protective role of zinc supplementation. The study included three groups. The first group (group B) consisted of 42 computer workers. This group was given zinc supplementation in the form of one tablet daily for eight weeks. The second group (group A) comprised the same 42 computer workers after zinc supplementation. A group of 63 subjects whose job does not entail computer use was recruited as a control group (group C). All participants completed a questionnaire including detailed medical and occupational histories. They were subjected to a full clinical examination. Thyroid stimulating hormone (TSH), free triiodothyronine (FT3), free thyroxine (FT4) and zinc levels were measured in all participants. TSH, FT3, FT4 and zinc concentrations were decreased significantly in group B relative to group C. In group A, all tested parameters were improved when compared with group B. The obtained results revealed that radiation emitted from computers led to changes in TSH and thyroid hormones (FT3 and FT4) in the workers. Improvement after supplementation suggests that zinc can ameliorate the hazards of such radiation on thyroid hormone indices.

  5. Comparison of Marginal and Internal Adaptation of CAD/CAM and Conventional Cement Retained Implant-Supported Single Crowns.

    PubMed

    Nejatidanesh, Farahnaz; Shakibamehr, Amir Hossein; Savabi, Omid

    2016-02-01

    To evaluate the accuracy of marginal and internal adaptation of 2 computer-aided design/computer-aided manufacturing (CAD/CAM) and 2 conventionally made cement-retained implant-supported restorations. An abutment and its corresponding fixture analog (Astra Tech) were inserted in the left central incisor area of a maxillary cast. Four types of implant-supported single restorations were fabricated on the abutment (n = 10): e.max CAD (Cerec AC system), zirconia-based (Cercon system), IPS e.max Press, and metal-ceramic restorations. The internal and marginal gaps of the studied groups were measured by the replica method with a stereomicroscope. Data were subjected to 1-way ANOVA and Scheffe post hoc tests (α = 0.05). Mean internal gaps of the Cercon (59.48 ± 16.49 μm) and e.max Press (75.62 ± 26.92 μm) groups were significantly different from the e.max CAD (120.29 ± 16.74 μm) group, but there was no significant difference between metal-ceramic restorations (89.65 ± 47.84 μm) and e.max CAD. The marginal gaps of e.max CAD (32.02 ± 10.38 μm) and Cercon restorations (34.26 ± 11.41 μm) were significantly superior to those of the metal-ceramic (59.19 ± 17.81 μm) and e.max Press (74.99 ± 24.51 μm) restorations. Within the limitations of this study, it can be concluded that although the marginal and internal gaps of the studied implant-supported restorations were in the clinically acceptable range, single crowns made with CAD/CAM technology provide better marginal fit.

  6. Visiting the Digital Divide: Women Entrepreneurs in Central America

    ERIC Educational Resources Information Center

    Tapper, Helena

    2006-01-01

    Micro and small enterprises comprise approximately 60-70% of enterprises in South and Central America. Most of these enterprises, particularly micro enterprises, are managed and owned by women. These women for the most part lack both skills and training in the use of computers and the Internet, and access to the use of information and…

  7. VIEW LOOKING SOUTHEAST AT BUILDING 121. THE BUILDING HOUSES OFFICES, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    VIEW LOOKING SOUTHEAST AT BUILDING 121. THE BUILDING HOUSES OFFICES, THE ROCKY FLATS PLANT CENTRAL ALARM STATION, ALARM CONSOLES, THE ARMORY, THE LOCK AND KEY SECTION, A COMPUTER ROOM, A UTILITY ROOM, AND LOCKER ROOMS WITH SHOWERS. (1/98) - Rocky Flats Plant, Security & Armory, West of Third Street, south of Central Avenue, Golden, Jefferson County, CO

  8. Transfer of classical eyeblink conditioning with electrical stimulation of mPFC or tone as conditioned stimulus in guinea pigs.

    PubMed

    Yao, Juan; Wu, Guang-Yan; Liu, Guo-Long; Liu, Shu-Lei; Yang, Yi; Wu, Bing; Li, Xuan; Feng, Hua; Sui, Jian-Feng

    2014-11-01

    Learning with a stimulus from one sensory modality can facilitate subsequent learning with a new stimulus from a different sensory modality. To date, the characteristics and mechanism of this phenomenon, termed the transfer effect, remain ambiguous. Our previous work showed that electrical stimulation of the medial prefrontal cortex (mPFC) as a conditioned stimulus (CS) could successfully establish classical eyeblink conditioning (EBC). The present study aimed to (1) observe whether transfer of EBC learning would occur when the CS shifted between a central CS (mPFC electrical stimulation, mPFC-CS) and a peripheral CS (tone CS); and (2) compare the difference in transfer effect between the two paradigms, delay EBC (DEBC) and trace EBC (TEBC). A total of 8 groups of guinea pigs were tested in the study, including 4 experimental groups and 4 control groups. First, the experimental groups received a central (or peripheral) CS paired with a corneal airpuff unconditioned stimulus (US); the CS was then shifted to the peripheral (or central) modality and paired with the US. The control groups received the corresponding central (or peripheral) CS pseudo-paired with the US, and the CS was then shifted from central (or peripheral) to peripheral (or central) and paired with the US. The results showed that the acquisition rates of EBC were higher in the experimental groups than in the control groups after the CS switched from central to peripheral or vice versa, and the CR acquisition rate was markedly higher in DEBC than in TEBC for both transfer directions. The results indicate that EBC transfer can occur between learning established with the mPFC-CS and the tone CS. Memory of the CS-US association in the delay paradigm was less disturbed by the sudden switch of CS than in the trace paradigm. This study provides new insight into the neural mechanisms underlying conditioned reflexes as well as the role of the mPFC. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Randomized trial comparing the effects of a low-dose combination of nifedipine GITS and valsartan versus high-dose monotherapy on central hemodynamics in patients with inadequately controlled hypertension: FOCUS study.

    PubMed

    Park, Jeong Bae; Ha, Jong-Won; Jung, Hae-Ok; Rhee, Moo-Yong

    2014-10-01

    Measurement of central blood pressure provides prognostic information beyond conventional peripheral blood pressure (BP). However, few studies have directly compared the effects of antihypertensives on central hemodynamics. This study investigated the effects of a low-dose combination of nifedipine Gastrointestinal Therapeutic System (GITS) and valsartan versus high-dose monotherapy with either agent in reducing central BP in essential hypertension inadequately controlled by low-dose monotherapy. In this prospective, open-label, randomized, active-controlled, multicenter 8-week study, patients not meeting the target BP after 4 weeks of treatment with low-dose monotherapy were randomized to receive nifedipine GITS 30 mg plus valsartan 80 mg (N30+V80), nifedipine GITS 60 mg (N60), or valsartan 160 mg (V160) for a further 4 weeks. Central hemodynamics were measured by applanation tonometry. A total of 391 patients were enrolled. Reduction in central systolic BP from baseline to week 8, the primary efficacy variable, was significantly greater in the N30+V80 group (-27.2±14.7 mmHg) and the N60 group (-27.1±16.5 mmHg) compared with V160 group (-14.4±16.6 mmHg). Decrease in the augmentation index in the N60 group was significantly greater compared with V160 alone, without differences between combination therapy and either high-dose monotherapy. Decreases in brachial systolic BP were significantly greater in the N30+V80 and N60 groups than in the V160 group. By multiple regression analysis, most differences in drug effects on central hemodynamics disappeared after controlling for changes in peripheral BP. A low rate of adverse events occurred in all treatment groups. A low-dose combination of nifedipine GITS plus valsartan or high-dose nifedipine was more effective in improving central hemodynamics than high-dose valsartan in patients with hypertension, mostly because of the improvement in peripheral (brachial) hemodynamics.

  10. Guidelines for Computing Longitudinal Dynamic Stability Characteristics of a Subsonic Transport

    NASA Technical Reports Server (NTRS)

    Thompson, Joseph R.; Frank, Neal T.; Murphy, Patrick C.

    2010-01-01

    A systematic study is presented to guide the selection of a numerical solution strategy for URANS computation of a subsonic transport configuration undergoing simulated forced oscillation about its pitch axis. Forced oscillation is central to the prevalent wind tunnel methodology for quantifying aircraft dynamic stability derivatives from force and moment coefficients, which is the ultimate goal of the computational simulations. Extensive computations are performed that lead to key insights into the critical numerical parameters affecting solution convergence. A preliminary linear harmonic analysis is included to demonstrate the potential of extracting dynamic stability derivatives from computational solutions.
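
    The linear harmonic analysis mentioned above reduces a forced-oscillation time history to stability derivatives by projecting the moment coefficient onto the sine and cosine of the imposed pitch motion: under the standard one-term model the in-phase component gives the static derivative, and the out-of-phase component, scaled by the reduced frequency, gives the combined damping derivative. The sketch below uses these textbook formulas as an assumption; it is not the report's actual data-reduction code.

      import numpy as np

      def harmonic_derivatives(t, Cm, omega, alpha_amp, k):
          # Project the response onto sin/cos over an integer number of cycles.
          in_phase = 2.0 * np.mean(Cm * np.sin(omega * t))
          out_phase = 2.0 * np.mean(Cm * np.cos(omega * t))
          return in_phase / alpha_amp, out_phase / (k * alpha_amp)

      # Synthetic check with known derivatives: Cm_alpha = -0.8, Cm_q + Cm_alphadot = -12.
      omega, alpha_amp, k = 2.0, 0.02, 0.05
      t = np.linspace(0.0, 10 * 2 * np.pi / omega, 4000)
      Cm = -0.8 * alpha_amp * np.sin(omega * t) - 12.0 * k * alpha_amp * np.cos(omega * t)
      print(harmonic_derivatives(t, Cm, omega, alpha_amp, k))   # approximately (-0.8, -12.0)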

  11. Computing Nash equilibria through computational intelligence methods

    NASA Astrophysics Data System (ADS)

    Pavlidis, N. G.; Parsopoulos, K. E.; Vrahatis, M. N.

    2005-03-01

    Nash equilibrium constitutes a central solution concept in game theory. The task of detecting the Nash equilibria of a finite strategic game remains a challenging problem to date. This paper investigates the effectiveness of three computational intelligence techniques, namely covariance matrix adaptation evolution strategies, particle swarm optimization, and differential evolution, to compute Nash equilibria of finite strategic games, as global minima of a real-valued, nonnegative function. An issue of particular interest is to detect more than one Nash equilibrium of a game. The performance of the considered computational intelligence methods on this problem is investigated using multistart and deflection.
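
    The formulation named in the abstract, equilibria as global minima of a nonnegative function, can be sketched for a two-player bimatrix game: the total regret (each player's gain from a best deviation) is zero exactly at a Nash equilibrium. The sketch below minimizes it with scipy's differential evolution, one of the three methods the paper considers; the game, the simplex encoding, and the solver settings are illustrative assumptions.

      import numpy as np
      from scipy.optimize import differential_evolution

      A = np.array([[3.0, 0.0], [5.0, 1.0]])   # row player's payoffs (Prisoner's Dilemma)
      B = np.array([[3.0, 5.0], [0.0, 1.0]])   # column player's payoffs

      def regret(z):
          # Normalize raw variables onto the probability simplex (illustrative encoding).
          x = z[:2] / z[:2].sum() if z[:2].sum() > 0 else np.array([0.5, 0.5])
          y = z[2:] / z[2:].sum() if z[2:].sum() > 0 else np.array([0.5, 0.5])
          r1 = np.max(A @ y) - x @ A @ y       # row player's gain from a best deviation
          r2 = np.max(x @ B) - x @ B @ y       # column player's gain from a best deviation
          return r1 + r2                       # zero exactly at a Nash equilibrium

      res = differential_evolution(regret, bounds=[(0.0, 1.0)] * 4, seed=1, tol=1e-10)
      x, y = res.x[:2] / res.x[:2].sum(), res.x[2:] / res.x[2:].sum()
      print(x, y, res.fun)   # expect x and y near (0, 1): mutual defection, regret near 0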

  12. Posterior Displacement of Supraspinatus Central Tendon Observed on Magnetic Resonance Imaging: A Useful Preoperative Indicator of Rotator Cuff Tear Characteristics.

    PubMed

    Updegrove, Gary F; Armstrong, April D; Mosher, Timothy J; Kim, H Mike

    2015-11-01

    To characterize the orientation of the normal supraspinatus central tendon and describe the displacement patterns of the central tendon in rotator cuff tears using a magnetic resonance imaging (MRI)-based method. We performed a retrospective MRI and chart review of 183 patients with a rotator cuff tear (cuff tear group), 52 with a labral tear but no rotator cuff tear (labral tear group), and 74 with a normal shoulder (normal group). The orientation of the supraspinatus central tendon relative to the bicipital groove was evaluated based on axial MRI and was numerically represented by the shortest distance from the lateral extension line of the central tendon to the bicipital groove. Tear size, fatty degeneration, and involvement of the anterior supraspinatus were evaluated to identify the factors associated with orientation changes. The mean distance from the bicipital groove to the central tendon line was 0.7 mm and 1.3 mm in the normal group and labral tear group, respectively. Full-thickness cuff tears involving the anterior supraspinatus showed a significantly greater distance (17.7 mm) than those sparing the anterior supraspinatus (4.9 mm, P = .001). Fatty degeneration of the supraspinatus was significantly correlated with the distance (P = .006). Disruption of the anterior supraspinatus and fatty degeneration of the supraspinatus were independent predictors of posterior displacement. The supraspinatus central tendon has a constant orientation toward the bicipital groove in normal shoulders, and the central tendon is frequently displaced posteriorly in full-thickness rotator cuff tears involving the anterior leading edge of the supraspinatus. The degree of posterior displacement is proportional to tear size and severity of fatty degeneration of the supraspinatus muscle. A simple and quick assessment of the central tendon orientation on preoperative MRI can be a useful indicator of tear characteristics, potentially providing insight into the intraoperative repair strategy. Level IV, diagnostic case-control study. Copyright © 2015 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  13. Increased thalamic centrality and putamen-thalamic connectivity in patients with parkinsonian resting tremor.

    PubMed

    Gu, Quanquan; Cao, Hengyi; Xuan, Min; Luo, Wei; Guan, Xiaojun; Xu, Jingjing; Huang, Peiyu; Zhang, Minming; Xu, Xiaojun

    2017-01-01

    Evidence has indicated a strong association between hyperactivity in the cerebello-thalamo-motor cortical loop and resting tremor in Parkinson's disease (PD). Within this loop, the thalamus serves as a central hub based on its structural centrality in the generation of resting tremor. To study whether this thalamic abnormality leads to an alteration at the whole-brain level, our study investigated the role of the thalamus in patients with parkinsonian resting tremor in a large-scale brain network context. Forty-one patients with PD (22 with resting tremor, TP, and 19 without resting tremor, NTP) and 45 healthy controls (HC) were included in this resting-state functional MRI study. Graph theory-based network analysis was performed to examine the centrality measures of bilateral thalami across the three groups. To provide further evidence for the central role of the thalamus in parkinsonian resting tremor, seed-based functional connectivity analysis was then used to quantify the functional interactions between the basal ganglia and the thalamus. Compared with the HC group, patients in the TP group exhibited increased degree centrality (p < .04), betweenness centrality (p < .01), and participation coefficient (p < .01) in the bilateral thalami. Two of these alterations (degree centrality and participation coefficient) were significantly correlated with tremor severity, especially in the left hemisphere (p < .02). The modular analysis showed that the TP group had more intermodular connections between the thalamus and the regions within the cerebello-thalamo-motor cortical loop. Furthermore, the data revealed significantly enhanced functional connectivity between the putamen and the thalamus in the TP group (p = .027, corrected for family-wise error). These findings suggest increased thalamic centrality as a potential tremor-specific imaging measure for PD, and provide evidence for the altered putamen-thalamic interaction in patients with resting tremor.
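
    As a rough illustration of the graph-theoretic measures named above (and not the study's actual pipeline), the snippet below computes degree and betweenness centrality for a single node of a thresholded connectivity graph using NetworkX; the connectivity matrix, the threshold, and the node index standing in for the thalamus are synthetic placeholders.

    ```python
    import numpy as np
    import networkx as nx

    # Synthetic "functional connectivity" matrix for illustration only;
    # a real analysis would use region-by-region fMRI correlations.
    rng = np.random.default_rng(0)
    n_regions = 20
    corr = rng.uniform(0.0, 1.0, size=(n_regions, n_regions))
    corr = (corr + corr.T) / 2.0
    np.fill_diagonal(corr, 0.0)

    # Binarize at an assumed threshold to obtain an undirected graph.
    threshold = 0.6
    adjacency = (corr > threshold).astype(int)
    G = nx.from_numpy_array(adjacency)

    thalamus = 0  # hypothetical index of the thalamic node
    degree_c = nx.degree_centrality(G)[thalamus]
    betweenness_c = nx.betweenness_centrality(G)[thalamus]
    print(f"degree centrality: {degree_c:.3f}, betweenness centrality: {betweenness_c:.3f}")
    ```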

  14. Efficient Computation of Anharmonic Force Constants via q-space, with Application to Graphene

    NASA Astrophysics Data System (ADS)

    Kornbluth, Mordechai; Marianetti, Chris

    We present a new approach for extracting anharmonic force constants from a sparse sampling of the anharmonic dynamical tensor. We calculate the derivative of the energy with respect to q-space displacements (phonons) and strain, which guarantees the absence of supercell image errors. Central finite differences provide a well-converged quadratic error tail for each derivative, separating the contribution of each anharmonic order. These derivatives populate the anharmonic dynamical tensor in a sparse mesh that bounds the Brillouin Zone, which ensures comprehensive sampling of q-space while exploiting small-cell calculations for efficient, high-throughput computation. This produces a well-converged and precisely-defined dataset, suitable for big-data approaches. We transform this sparsely-sampled anharmonic dynamical tensor to real-space anharmonic force constants that obey full space-group symmetries by construction. Machine-learning techniques identify the range of real-space interactions. We show the entire process executed for graphene, up to and including the fifth-order anharmonic force constants. This method successfully calculates strain-based phonon renormalization in graphene, even under large strains, which solves a major shortcoming of previous potentials.
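
    A minimal sketch of the central-difference behavior the abstract relies on, using a simple analytic stand-in for the energy rather than any first-principles calculation: halving the step size should reduce the derivative error by roughly a factor of four, which is the quadratic error tail being exploited.

    ```python
    import numpy as np

    # Stand-in "energy" as a function of a single displacement amplitude u;
    # in the real workflow this would be a first-principles total energy.
    def energy(u):
        return np.cos(u) + 0.1 * u ** 3

    def d_energy_exact(u):
        return -np.sin(u) + 0.3 * u ** 2

    u0 = 0.4
    for h in (0.1, 0.05, 0.025, 0.0125):
        central = (energy(u0 + h) - energy(u0 - h)) / (2.0 * h)   # central finite difference
        err = abs(central - d_energy_exact(u0))
        print(f"h = {h:7.4f}  central-difference error = {err:.3e}")
    # Each halving of h cuts the error by about a factor of four: O(h^2) convergence.
    ```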

  15. Cone-beam computed tomography evaluation of the association of cortical plate proximity and apical root resorption after orthodontic treatment.

    PubMed

    Nakada, Tomoo; Motoyoshi, Mitsuru; Horinuki, Eri; Shimizu, Noriyoshi

    2016-01-01

    We investigated the effects of proximity of the root apex to the maxillary labial cortical plate, palatal cortical plate, and incisive canal cortical plate on apical root resorption. Cone-beam computed tomography was used to measure the amount of root resorption and root apex movement around maxillary right and left central incisors in 30 adults who underwent four-bicuspid extraction followed by treatment with multibracket appliances. The patients were divided into three groups on the basis of the direction of root apex movement, after which the correlation between the amount of root resorption and root apex movement was determined. Mean apical root resorption was 1.80 ± 0.82 mm (range, 0.18-3.96 mm). The amount of root apex movement was positively correlated with the amount of root resorption on the side of pressure. Root apex proximity to the maxillary labial cortical plate, palatal cortical plate, and incisive canal cortical plate was associated with apical root resorption. Orthodontic treatment plans should carefully consider root proximity to the maxillary cortical plate. (J Oral Sci 58, 231-236, 2016).

  16. In vitro and in vivo evaluations of three computer-aided shade matching instruments.

    PubMed

    Yuan, Kun; Sun, Xiang; Wang, Fu; Wang, Hui; Chen, Ji-hua

    2012-01-01

    This study evaluated the accuracy and reliability of three computer-aided shade matching instruments (Shadepilot, VITA Easyshade, and ShadeEye NCC) using both in vitro and in vivo models. The in vitro model included the measurement of five VITA Classical shade guides. The in vivo model utilized three instruments to measure the central region of the labial surface of maxillary right central incisors of 85 people. The accuracy and reliability of the three instruments in these two evaluating models were calculated. Significant differences were observed in the accuracy of instruments both in vitro and in vivo. No significant differences were found in the reliability of instruments between and within the in vitro and the in vivo groups. VITA Easyshade was significantly different in accuracy between in vitro and in vivo models, while no significant difference was found for the other two instruments. Shadepilot was the only instrument tested in the present study that showed high accuracy and reliability both in vitro and in vivo. Significant differences were observed in the L*a*b* values of the 85 natural teeth measured using three instruments in the in vivo assessment. The pair-agreement rates of shade matching among the three instruments ranged from 37.7% to 48.2%, and the incidence of identical shade results shared by all three instruments was 25.9%. As different L*a*b* values and shade matching results were reported for the same tooth, a combination of the evaluated shade matching instruments and visual shade confirmation is recommended for clinical use.

  17. Central Sensitization-Based Classification for Temporomandibular Disorders: A Pathogenetic Hypothesis

    PubMed Central

    Cattaneo, Ruggero; Marci, Maria Chiara; Pietropaoli, Davide; Ortu, Eleonora

    2017-01-01

    Dysregulation of Autonomic Nervous System (ANS) and central pain pathways in temporomandibular disorders (TMD) is a growing evidence. Authors include some forms of TMD among central sensitization syndromes (CSS), a group of pathologies characterized by central morphofunctional alterations. Central Sensitization Inventory (CSI) is useful for clinical diagnosis. Clinical examination and CSI cannot identify the central site(s) affected in these diseases. Ultralow frequency transcutaneous electrical nerve stimulation (ULFTENS) is extensively used in TMD and in dental clinical practice, because of its effects on descending pain modulation pathways. The Diagnostic Criteria for TMD (DC/TMD) are the most accurate tool for diagnosis and classification of TMD. However, it includes CSI to investigate central aspects of TMD. Preliminary data on sensory ULFTENS show it is a reliable tool for the study of central and autonomic pathways in TMD. An alternative classification based on the presence of Central Sensitization and on individual response to sensory ULFTENS is proposed. TMD may be classified into 4 groups: (a) TMD with Central Sensitization ULFTENS Responders; (b) TMD with Central Sensitization ULFTENS Nonresponders; (c) TMD without Central Sensitization ULFTENS Responders; (d) TMD without Central Sensitization ULFTENS Nonresponders. This pathogenic classification of TMD may help to differentiate therapy and aetiology. PMID:28932132

  18. ALGORITHMS AND PROGRAMS FOR STRONG GRAVITATIONAL LENSING IN KERR SPACE-TIME INCLUDING POLARIZATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Bin; Maddumage, Prasad; Kantowski, Ronald

    2015-05-15

    Active galactic nuclei (AGNs) and quasars are important astrophysical objects to understand. Recently, microlensing observations have constrained the size of the quasar X-ray emission region to be of the order of 10 gravitational radii of the central supermassive black hole. For distances within a few gravitational radii, light paths are strongly bent by the strong gravity field of the central black hole. If the central black hole has nonzero angular momentum (spin), then a photon’s polarization plane will be rotated by the gravitational Faraday effect. The observed X-ray flux and polarization will then be influenced significantly by the strong gravity field near the source. Consequently, linear gravitational lensing theory is inadequate for such extreme circumstances. We present simple algorithms computing the strong lensing effects of Kerr black holes, including the effects on polarization. Our algorithms are realized in a program “KERTAP” in two versions: MATLAB and Python. The key ingredients of KERTAP are a graphic user interface, a backward ray-tracing algorithm, a polarization propagator dealing with gravitational Faraday rotation, and algorithms computing observables such as flux magnification and polarization angles. Our algorithms can be easily realized in other programming languages such as FORTRAN, C, and C++. The MATLAB version of KERTAP is parallelized using the MATLAB Parallel Computing Toolbox and the Distributed Computing Server. The Python code was sped up using Cython and supports full implementation of MPI using the “mpi4py” package. As an example, we investigate the inclination angle dependence of the observed polarization and the strong lensing magnification of AGN X-ray emission. We conclude that it is possible to perform complex numerical-relativity related computations using interpreted languages such as MATLAB and Python.

  19. Second-Order Sensitivity Analysis of Uncollided Particle Contributions to Radiation Detector Responses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cacuci, Dan G.; Favorite, Jeffrey A.

    This work presents an application of Cacuci’s Second-Order Adjoint Sensitivity Analysis Methodology (2nd-ASAM) to the simplified Boltzmann equation that models the transport of uncollided particles through a medium to compute efficiently and exactly all of the first- and second-order derivatives (sensitivities) of a detector’s response with respect to the system’s isotopic number densities, microscopic cross sections, source emission rates, and detector response function. The off-the-shelf PARTISN multigroup discrete ordinates code is employed to solve the equations underlying the 2nd-ASAM. The accuracy of the results produced using PARTISN is verified by using the results of three test configurations: (1) a homogeneous sphere, for which the response is the exactly known total uncollided leakage, (2) a multiregion two-dimensional (r-z) cylinder, and (3) a two-region sphere for which the response is a reaction rate. For the homogeneous sphere, results for the total leakage as well as for the respective first- and second-order sensitivities are in excellent agreement with the exact benchmark values. For the nonanalytic problems, the results obtained by applying the 2nd-ASAM to compute sensitivities are in excellent agreement with central-difference estimates. The efficiency of the 2nd-ASAM is underscored by the fact that, for the cylinder, only 12 adjoint PARTISN computations were required by the 2nd-ASAM to compute all of the benchmark’s 18 first-order sensitivities and 224 second-order sensitivities, in contrast to the 877 PARTISN calculations needed to compute the respective sensitivities using central finite differences, and this number does not include the additional calculations that were required to find appropriate values of the perturbations to use for the central differences.

  20. Second-Order Sensitivity Analysis of Uncollided Particle Contributions to Radiation Detector Responses

    DOE PAGES

    Cacuci, Dan G.; Favorite, Jeffrey A.

    2018-04-06

    This work presents an application of Cacuci’s Second-Order Adjoint Sensitivity Analysis Methodology (2nd-ASAM) to the simplified Boltzmann equation that models the transport of uncollided particles through a medium to compute efficiently and exactly all of the first- and second-order derivatives (sensitivities) of a detector’s response with respect to the system’s isotopic number densities, microscopic cross sections, source emission rates, and detector response function. The off-the-shelf PARTISN multigroup discrete ordinates code is employed to solve the equations underlying the 2nd-ASAM. The accuracy of the results produced using PARTISN is verified by using the results of three test configurations: (1) a homogeneous sphere, for which the response is the exactly known total uncollided leakage, (2) a multiregion two-dimensional (r-z) cylinder, and (3) a two-region sphere for which the response is a reaction rate. For the homogeneous sphere, results for the total leakage as well as for the respective first- and second-order sensitivities are in excellent agreement with the exact benchmark values. For the nonanalytic problems, the results obtained by applying the 2nd-ASAM to compute sensitivities are in excellent agreement with central-difference estimates. The efficiency of the 2nd-ASAM is underscored by the fact that, for the cylinder, only 12 adjoint PARTISN computations were required by the 2nd-ASAM to compute all of the benchmark’s 18 first-order sensitivities and 224 second-order sensitivities, in contrast to the 877 PARTISN calculations needed to compute the respective sensitivities using central finite differences, and this number does not include the additional calculations that were required to find appropriate values of the perturbations to use for the central differences.
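
    To make the cost comparison concrete, the sketch below counts the forward transport solves that would be needed if every first- and second-order sensitivity with respect to N parameters were estimated by central finite differences. The counting rule is a generic assumption for illustration only; it is not intended to reproduce the benchmark's exact figure of 877 calculations, which depends on which parameter combinations actually matter.

    ```python
    # Rough operation count for estimating all first- and second-order sensitivities
    # of a scalar response by central finite differences, assuming one forward
    # transport solve per perturbed parameter set (illustrative scaling only).
    def central_difference_runs(n_params: int) -> int:
        base = 1                              # unperturbed reference solution
        first_order = 2 * n_params            # f(x +/- h_i) for each parameter
        diagonal_second = 0                   # d2f/dx_i^2 reuses the same +/-h_i points
        mixed_second = 4 * n_params * (n_params - 1) // 2   # f(x +/- h_i +/- h_j)
        return base + first_order + diagonal_second + mixed_second

    for n in (5, 10, 20):
        print(n, "parameters ->", central_difference_runs(n), "forward solves")
    # The adjoint-based 2nd-ASAM cost, by contrast, grows with the number of
    # responses rather than with the number of parameters.
    ```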

  1. Algorithms and Programs for Strong Gravitational Lensing In Kerr Space-time Including Polarization

    NASA Astrophysics Data System (ADS)

    Chen, Bin; Kantowski, Ronald; Dai, Xinyu; Baron, Eddie; Maddumage, Prasad

    2015-05-01

    Active galactic nuclei (AGNs) and quasars are important astrophysical objects to understand. Recently, microlensing observations have constrained the size of the quasar X-ray emission region to be of the order of 10 gravitational radii of the central supermassive black hole. For distances within a few gravitational radii, light paths are strongly bent by the strong gravity field of the central black hole. If the central black hole has nonzero angular momentum (spin), then a photon’s polarization plane will be rotated by the gravitational Faraday effect. The observed X-ray flux and polarization will then be influenced significantly by the strong gravity field near the source. Consequently, linear gravitational lensing theory is inadequate for such extreme circumstances. We present simple algorithms computing the strong lensing effects of Kerr black holes, including the effects on polarization. Our algorithms are realized in a program “KERTAP” in two versions: MATLAB and Python. The key ingredients of KERTAP are a graphic user interface, a backward ray-tracing algorithm, a polarization propagator dealing with gravitational Faraday rotation, and algorithms computing observables such as flux magnification and polarization angles. Our algorithms can be easily realized in other programming languages such as FORTRAN, C, and C++. The MATLAB version of KERTAP is parallelized using the MATLAB Parallel Computing Toolbox and the Distributed Computing Server. The Python code was sped up using Cython and supports full implementation of MPI using the “mpi4py” package. As an example, we investigate the inclination angle dependence of the observed polarization and the strong lensing magnification of AGN X-ray emission. We conclude that it is possible to perform complex numerical-relativity related computations using interpreted languages such as MATLAB and Python.
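
    The abstract notes that the Python version of KERTAP uses mpi4py. Because each backward-traced ray is independent, a natural pattern is to distribute pixels of the observer's image plane across MPI ranks and combine the results at the end. The sketch below shows only that pattern under stated assumptions; trace_ray is a hypothetical placeholder, not KERTAP's actual geodesic integrator.

    ```python
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    n_pixels = 256 * 256   # image-plane resolution (illustrative)

    def trace_ray(pixel_index):
        """Hypothetical placeholder for one backward-traced ray; in KERTAP this
        would integrate a null geodesic in Kerr space-time and propagate the
        polarization through the gravitational Faraday rotation."""
        return float(pixel_index % 7)   # dummy "flux" value

    # Each rank handles an interleaved slice of pixels; rays are independent,
    # so no communication is needed until the final reduction.
    local_indices = range(rank, n_pixels, size)
    local_flux = np.array([trace_ray(i) for i in local_indices])

    # Combine per-rank partial sums of the magnified flux on rank 0.
    total_flux = comm.reduce(local_flux.sum(), op=MPI.SUM, root=0)
    if rank == 0:
        print("total flux over image plane:", total_flux)
    ```

    Run with, for example, `mpiexec -n 4 python kertap_sketch.py` (a hypothetical file name); per-rank work shrinks roughly linearly with the number of ranks because the tracing loop requires no communication.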

  2. Geodatabase design and characteristics of geologic information for a geodatabase of selected wells penetrating the Austin Group in central Bexar County, Texas, 2010

    USGS Publications Warehouse

    Pedraza, Diana E.; Shah, Sachin D.

    2010-01-01

    The U.S. Geological Survey, in cooperation with the San Antonio Water System, developed a geodatabase of geologic and hydrogeologic information for selected wells penetrating the Austin Group in central Bexar County, Texas. The Austin Group functions as an upper confining unit to the Edwards aquifer and is the thickest and most permeable of the Edwards aquifer confining units. The geologic and hydrogeologic information pertains to a 377-square-mile study area that encompasses central Bexar County. Data were compiled primarily from drillers' and borehole geophysical logs from federal, State, and local agencies and published reports. Austin Group characteristics compiled for 523 unique wells are documented (if known), including year drilled, well depth, altitude of top and base of the Austin Group, and thickness of the Austin Group.

  3. Adolescents' perception of peer groups: Psychological, behavioral, and relational determinants.

    PubMed

    Lee, Seungyoon; Foote, Jeremy; Wittrock, Zachary; Xu, Siyu; Niu, Li; French, Doran C

    2017-07-01

    Adolescents' social cognitive understanding of their social world is often inaccurate and biased. Focusing on peer groups, this study examines how adolescents' psychological, behavioral, and relational characteristics influence the extent to which they accurately identify their own and others' peer groups. Analyses were conducted with a sample of 1481 seventh- and tenth-grade Chinese students who were embedded within 346 peer groups. Overall, females and older students had more accurate perceptions. In addition, lower self-esteem, higher indegree centrality, and lower betweenness centrality in the friendship network predicted more accurate perception of one's own groups, whereas higher academic performance and lower betweenness centrality in the friendship network predicted more accurate perception of others' groups. Implications for understanding the connection between adolescents' psychological and behavioral traits, social relationships, and social cognition are discussed. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. Relative peripheral refraction in children: twelve-month changes in eyes with different ametropias.

    PubMed

    Lee, Tsui-Tsui; Cho, Pauline

    2013-05-01

    To determine the peripheral refraction of children with different types of ametropias and to evaluate the relationship between central refractive changes, baseline relative peripheral refraction (RPR) and changes in RPR over a 12-month monitoring period. Cycloplegic central and peripheral refraction measurements were performed biannually on the right eyes of children aged 6-9 years over 12 months, using an open-view autorefractor. Peripheral refraction was measured at 10°, 20° and 30° from central fixation in both nasal and temporal fields. Refractive data were transposed into M, J0 and J45 vectors for analyses. RPR was determined by subtracting the central measurement from each peripheral measurement. Hyperopic eyes showed relative peripheral myopia while myopic eyes had relative hyperopia across the central 60° horizontal field at baseline. Emmetropic eyes had relative myopia within but showed relative hyperopia beyond the central 30° field. However, there was no significant correlation between central refractive changes and baseline RPR or between changes in central refraction and RPR over twelve months in any refractive group. Correlations between changes in peripheral refraction and central myopic shift were found mainly in the nasal field in different groups. In the subgroup analysis on the initially emmetropic and the initially myopic groups, the subgroups with faster myopic progression did not have significantly different RPR from the subgroups with slower progression. The RPR pattern of the initially emmetropic and the initially myopic groups became more asymmetric at the end of the study period with a larger increase in relative hyperopia in the temporal field. RPR patterns were different among hyperopic, emmetropic and myopic eyes. However, baseline RPR and changes in RPR cannot predict changes in central refraction over time. Our results did not provide evidence to support the hypothesis of RPR as a causative factor for myopic central refractive changes in children. Ophthalmic & Physiological Optics © 2013 The College of Optometrists.

  5. A computer software system for the generation of global ocean tides including self-gravitation and crustal loading effects

    NASA Technical Reports Server (NTRS)

    Estes, R. H.

    1977-01-01

    A computer software system is described which computes global numerical solutions of the integro-differential Laplace tidal equations, including dissipation terms and ocean loading and self-gravitation effects, for arbitrary diurnal and semidiurnal tidal constituents. The integration algorithm features a successive approximation scheme for the integro-differential system, using forward differences for time stepping and central differences in the spatial variables.
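
    A toy example of the differencing pattern described above (forward differences in time, central differences in space), applied here to a 1-D diffusion equation rather than to the integro-differential Laplace tidal equations; the grid size, diffusivity, and time step are illustrative assumptions chosen to satisfy the explicit-scheme stability limit.

    ```python
    import numpy as np

    # Toy 1-D diffusion equation u_t = nu * u_xx discretized with a forward
    # difference in time and a central difference in space (FTCS); this only
    # illustrates the differencing pattern, not the tidal solver itself.
    nu, L, nx = 0.01, 1.0, 200
    dx = L / nx
    dt = 0.4 * dx ** 2 / nu          # within the FTCS stability limit dt <= dx^2 / (2*nu)
    x = np.linspace(0.0, L, nx, endpoint=False)
    u = np.exp(-200.0 * (x - 0.5) ** 2)   # initial bump, periodic domain

    for _ in range(500):
        u_xx = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx ** 2   # central difference in space
        u = u + dt * nu * u_xx                                        # forward (explicit) time step

    print("peak amplitude after 500 steps:", float(u.max()))
    ```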

  6. Mid-term survival analysis of closed wedge high tibial osteotomy: A comparative study of computer-assisted and conventional techniques.

    PubMed

    Bae, Dae Kyung; Song, Sang Jun; Kim, Kang Il; Hur, Dong; Jeong, Ho Yeon

    2016-03-01

    The purpose of the present study was to compare the clinical and radiographic results and survival rates between computer-assisted and conventional closing wedge high tibial osteotomies (HTOs). Data from a consecutive cohort comprised of 75 computer-assisted HTOs and 75 conventional HTOs were retrospectively reviewed. The Knee Society knee and function scores, Hospital for Special Surgery (HSS) score and femorotibial angle (FTA) were compared between the two groups. Survival rates were also compared with procedure failure. The knee and function scores at one year postoperatively were slightly better in the computer-assisted group than in the conventional group (90.1 vs. 86.1 and 82.0 vs. 76.0, respectively). The HSS scores at one year postoperatively were slightly better for the computer-assisted HTOs than those of conventional HTOs (89.5 vs. 81.8). The inlier of the postoperative FTA was wider in the computer-assisted group than that in the conventional HTO group (88.0% vs. 58.7%), and mean postoperative FTA was greater in the computer-assisted group than in the conventional HTO group (valgus 9.0° vs. valgus 7.6°, p<0.001). The five- and 10-year survival rates were 97.1% and 89.6%, respectively. No difference was detected in nine-year survival rates (p=0.369) between the two groups, although the clinical and radiographic results were better in the computer-assisted group than in the conventional HTO group. Mid-term survival rates did not differ between computer-assisted and conventional HTOs. A comparative analysis of longer-term survival rate is required to demonstrate the long-term benefit of computer-assisted HTO. III. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. The Value Proposition in Institutional Repositories

    ERIC Educational Resources Information Center

    Blythe, Erv; Chachra, Vinod

    2005-01-01

    In the education and research arena of the late 1970s and early 1980s, a struggle developed between those who advocated centralized, mainframe-based computing and those who advocated distributed computing. Ultimately, the debate reduced to whether economies of scale or economies of scope are more important to the effectiveness and efficiency of…

  8. How Learning Logic Programming Affects Recursion Comprehension

    ERIC Educational Resources Information Center

    Haberman, Bruria

    2004-01-01

    Recursion is a central concept in computer science, yet it is difficult for beginners to comprehend. Israeli high-school students learn recursion in the framework of a special modular program in computer science (Gal-Ezer & Harel, 1999). Some of them are introduced to the concept of recursion in two different paradigms: the procedural…

  9. Information Commons to Go

    ERIC Educational Resources Information Center

    Bayer, Marc Dewey

    2008-01-01

    Since 2004, Buffalo State College's E. H. Butler Library has used the Information Commons (IC) model to assist its 8,500 students with library research and computer applications. Campus Technology Services (CTS) plays a very active role in its IC, with a centrally located Computer Help Desk and a newly created Application Support Desk right in the…

  10. A Computational Account of Children's Analogical Reasoning: Balancing Inhibitory Control in Working Memory and Relational Representation

    ERIC Educational Resources Information Center

    Morrison, Robert G.; Doumas, Leonidas A. A.; Richland, Lindsey E.

    2011-01-01

    Theories accounting for the development of analogical reasoning tend to emphasize either the centrality of relational knowledge accretion or changes in information processing capability. Simulations in LISA (Hummel & Holyoak, 1997, 2003), a neurally inspired computer model of analogical reasoning, allow us to explore how these factors may…

  11. Scaling Up and Zooming In: Big Data and Personalization in Language Learning

    ERIC Educational Resources Information Center

    Godwin-Jones, Robert

    2017-01-01

    From its earliest days, practitioners of computer-assisted language learning (CALL) have collected data from computer-mediated learning environments. Indeed, that has been a central aspect of the field from the beginning. Usage logs provided valuable insights into how systems were used and how effective they were for language learning. That…

  12. Computer System Resource Requirements of Novice Programming Students.

    ERIC Educational Resources Information Center

    Nutt, Gary J.

    The characteristics of jobs that constitute the mix for lower division FORTRAN classes in a university were investigated. Samples of these programs were also benchmarked on a larger central site computer and two minicomputer systems. It was concluded that a carefully chosen minicomputer system could offer service at least the equivalent of the…

  13. The Mathematics and Computer Science Learning Center (MLC).

    ERIC Educational Resources Information Center

    Abraham, Solomon T.

    The Mathematics and Computer Science Learning Center (MLC) was established in the Department of Mathematics at North Carolina Central University during the fall semester of the 1982-83 academic year. The initial operations of the MLC were supported by grants to the University from the Burroughs-Wellcome Company and the Kenan Charitable Trust Fund.…

  14. A Low Cost Microcomputer Laboratory for Investigating Computer Architecture.

    ERIC Educational Resources Information Center

    Mitchell, Eugene E., Ed.

    1980-01-01

    Described is a microcomputer laboratory at the United States Military Academy at West Point, New York, which provides easy access to non-volatile memory and a single input/output file system for 16 microcomputer laboratory positions. A microcomputer network that has a centralized data base is implemented using the concepts of computer network…

  15. NRL Fact Book 1992-1993

    DTIC Science & Technology

    1993-06-01

    administering contractual support for lab-wide or multiple buys of ADP systems, software, and services. Computer systems located in the Central Computing Facility...

  16. Baroreflex regulation of blood pressure during dynamic exercise

    NASA Technical Reports Server (NTRS)

    Raven, P. B.; Potts, J. T.; Shi, X.; Blomqvist, C. G. (Principal Investigator)

    1997-01-01

    From the work of Potts et al., Papelier et al., and Shi et al., it is readily apparent that the arterial (aortic and carotid) baroreflexes are reset to function at the prevailing ABP of exercise. The blood pressure of exercise is the result of the hemodynamic (cardiac output and TPR) responses, which appear to be regulated by two redundant neural control systems, "Central Command" and the "exercise pressor reflex". Central Command is a feed-forward neural control system that operates in parallel with the neural regulation of the locomotor system and appears to establish the hemodynamic response to exercise. Within the central nervous system it appears that the HLR may be the operational site for Central Command. Specific neural sites within the HLR have been demonstrated in animals to be active during exercise. With the advent of positron emission tomography (PET) and single-photon emission computed tomography (SPECT), the anatomical areas of the human brain related to Central Command are being mapped. It also appears that the Nucleus Tractus Solitarius and the ventrolateral medulla may serve as an integrating site as they receive neural information from the working muscles via the group III/IV muscle afferents as well as from higher brain centers. This anatomical site within the CNS is now the focus of many investigations in which arterial baroreflex function, Central Command and the "exercise pressor reflex" appear to demonstrate inhibitory or facilitatory interaction. The concept of whether Central Command is the prime mover in the resetting of the arterial baroreceptors to function at the exercising ABP or whether the resetting is an integration of the "exercise pressor reflex" information with that of Central Command is now under intense investigation. However, it would be justified to conclude, from the data of Bevegard and Shepherd, Dicarlo and Bishop, Potts et al., and Papelier et al., that the act of exercise results in the resetting of the arterial baroreflex. In addition, if, as we have proposed, the cardiopulmonary baroreceptors primarily monitor and reflexly regulate cardiac filling volume, it would seem from the data of Mack et al. and Potts et al. that the cardiopulmonary baroreceptor is also reset at the beginning of exercise. Therefore, investigations of the neural mechanisms of regulation involving Central Command and cardiopulmonary afferents, similar to those being undertaken for the arterial baroreflex, need to be established.

  17. Effects of Computer Course on Computer Self-Efficacy, Computer Attitudes and Achievements of Young Individuals in Siirt, Turkey

    ERIC Educational Resources Information Center

    Çelik, Halil Coskun

    2015-01-01

    The purpose of this study is to investigate the effects of computer courses on young individuals' computer self-efficacy, attitudes and achievement. The study group of this research included 60 unemployed young individuals (ages 18-25) in total; 30 in the experimental group and 30 in the control group. An experimental research model with pretest…

  18. Computers in medicine: patients' attitudes

    PubMed Central

    Cruickshank, P. J.

    1984-01-01

    Data are presented from two surveys where a 26-item questionnaire was used to measure patients' attitudes to diagnostic computers and to medical computers in general. The first group of respondents were 229 patients who had been given outpatient appointments at a hospital general medical clinic specializing in gastrointestinal problems, where some had experienced a diagnostic computer in use. The second group of respondents were 416 patients attending a group general practice where there was no computer. Patients who had experience of the diagnostic computer or a personal computer had more favourable attitudes to computers in medicine as did younger people and males. The two samples of patients showed broadly similar attitudes, and a notable finding was that over half of each group believed that, with a computer around, the personal touch of the doctor would be lost. PMID:6471021

  19. Embedding medical student computer tutorials into a busy emergency department.

    PubMed

    Pusic, Martin V; Pachev, George S; MacDonald, Wendy A

    2007-02-01

    To explore medical students' use of computer tutorials embedded in a busy clinical setting; to demonstrate that such tutorials can increase knowledge gain over and above that attributable to the clinical rotation itself. Six tutorials were installed on a computer placed in a central area in an emergency department. Each tutorial was made up of between 33 and 85 screens of information that include text, graphics, animations, and questions. They were designed to be brief (10 minutes), focused, interactive, and immediately relevant. The authors evaluated the intervention using quantitative research methods, including usage tracking, surveys of faculty and students, and a randomized pretest-posttest study. Over 46 weeks, 95 medical students used the tutorials 544 times, for an overall average of 1.7 times a day. The median time spent on completed tutorials was 11 minutes (average [SD], 14 [+/-12] minutes). Seventy-four students completed the randomized study. They completed 65% of the assigned tutorials, resulting in improved examination scores compared with the control (effect size, 0.39; 95% confidence interval = 0.15 to 0.62). Students were positively disposed to the tutorials, ranking them as "valuable." Fifty-four percent preferred the tutorials to small group teaching sessions with a preceptor. The faculty was also positive about the tutorials, although they did not appear to integrate the tutorials directly into their teaching. Medical students on rotation in a busy clinical setting can and will use appropriately presented computer tutorials. The tutorials are effective in raising examination scores.

  20. Centralization of symptoms and lumbar range of motion in patients with low back pain.

    PubMed

    Bybee, Ronald F; Olsen, Denise L; Cantu-Boncser, Gloria; Allen, Heather Condie; Byars, Allyn

    2009-05-01

    This quasi-experimental repeated measures study examined the relationship between centralization of symptoms and lumbar flexion and extension range of motion (ROM) in patients with low back pain. Rapid and lasting changes in lumbar ROM have been noted with centralization of symptoms. However, no study has objectively measured the changes in lumbar ROM occurring with centralization. Forty-two adult subjects (mean age, 45.68 years; SD=15.76 years) with low back pain and associated lower extremity symptoms were followed by McKenzie trained physical therapists. Subjects' lumbar ROM was measured at the beginning and end of each patient visit by using double inclinometers, and pain location was documented. Subjects were grouped as 1) centralized, 2) centralizing, or 3) noncentralized for comparisons of symptom and ROM changes. Data were analyzed by using multivariate analysis of variance and one-way analysis of variance. Significance was set at 0.05. A significant difference was found between initial and final mean extension ROM in the centralized and centralizing groups (p=0.003). No significant difference was found in the noncentralized group (p>0.05). Subjects (n=23) who demonstrated a change in pain location during the initial visit also showed a significant (p<0.001) change in extension ROM, whereas patients with no change in pain location (n=19) did not (p=0.848). Lumbar extension ROM increased as centralization occurred.

  1. Interim 18F-FDG PET/CT may not predict the outcome in primary central nervous system lymphoma patients treated with sequential treatment with methotrexate and cytarabine.

    PubMed

    Jo, Jae-Cheol; Yoon, Dok Hyun; Kim, Shin; Lee, Kyoungmin; Kang, Eun Hee; Park, Jung Sun; Ryu, Jin-Sook; Huh, Jooryung; Park, Chan-Sik; Kim, Jong Hoon; Lee, Sang Wook; Suh, Cheolwon

    2017-09-01

    18F-fluoro-2-deoxy-D-glucose positron emission tomography (PET)/computed tomography (CT) is a useful imaging technique for monitoring the treatment response in lymphoma cases. We investigated the value of interim brain PET/CT (I-PET/CT) for monitoring the response to intensive methotrexate-based chemotherapy in primary central nervous system lymphoma (PCNSL) patients with diffuse large B cell lymphoma (DLBCL). Of the 76 PCNSL patients treated with intensive methotrexate and cytarabine chemotherapy between September 2006 and December 2012, 66 patients with DLBCL were included in this study. The patient cohort of 66 individuals comprised 43 men and 23 women with a median age of 59 years (range, 17-75 years). During chemotherapy, 36 patients (54.5%) showed a negative metabolism on I-PET/CT, and 47 (71.2%) were negative on final (F) PET/CT. The baseline characteristics were similar between I-PET/CT-negative (n = 36) and I-PET/CT-positive patients (n = 30) except ECOG performance status. After a median follow-up of 27.5 months, there was no difference in the progression-free survival (PFS; P = 0.701) or overall survival (OS; P = 0.620) between the I-PET/CT-negative and I-PET/CT-positive groups. However, PFS in the F-PET/CT-negative group was significantly longer than that in the F-PET/CT-positive group (P < 0.001) without a significant difference in OS (P = 0.892). I-PET/CT may not predict the survival outcome of PCNSL patients with DLBCL treated with intensive methotrexate and cytarabine chemotherapy. Prospective trials are required to fully evaluate the role of I-PET/CT.

  2. Individual versus group decision making: Jurors’ reliance on central and peripheral information to evaluate expert testimony

    PubMed Central

    Bottoms, Bette L.; Peter-Hagene, Liana C.

    2017-01-01

    To investigate dual-process persuasion theories in the context of group decision making, we studied low and high need-for-cognition (NFC) participants within a mock trial study. Participants considered plaintiff and defense expert scientific testimony that varied in argument strength. All participants heard a cross-examination of the experts focusing on peripheral information (e.g., credentials) about the expert, but half were randomly assigned to also hear central information highlighting flaws in the expert’s message (e.g., quality of the research presented by the expert). Participants rendered pre- and post-group-deliberation verdicts, which were considered “scientifically accurate” if the verdicts reflected the strong (versus weak) expert message, and “scientifically inaccurate” if they reflected the weak (versus strong) expert message. For individual participants, we replicated studies testing classic persuasion theories: Factors promoting reliance on central information (i.e., central cross-examination, high NFC) improved verdict accuracy because they sensitized individual participants to the quality discrepancy between the experts’ messages. Interestingly, however, at the group level, the more that scientifically accurate mock jurors discussed peripheral (versus central) information about the experts, the more likely their group was to reach the scientifically accurate verdict. When participants were arguing for the scientifically accurate verdict consistent with the strong expert message, peripheral comments increased their persuasiveness, which made the group more likely to reach the more scientifically accurate verdict. PMID:28931011

  3. Individual versus group decision making: Jurors' reliance on central and peripheral information to evaluate expert testimony.

    PubMed

    Salerno, Jessica M; Bottoms, Bette L; Peter-Hagene, Liana C

    2017-01-01

    To investigate dual-process persuasion theories in the context of group decision making, we studied low and high need-for-cognition (NFC) participants within a mock trial study. Participants considered plaintiff and defense expert scientific testimony that varied in argument strength. All participants heard a cross-examination of the experts focusing on peripheral information (e.g., credentials) about the expert, but half were randomly assigned to also hear central information highlighting flaws in the expert's message (e.g., quality of the research presented by the expert). Participants rendered pre- and post-group-deliberation verdicts, which were considered "scientifically accurate" if the verdicts reflected the strong (versus weak) expert message, and "scientifically inaccurate" if they reflected the weak (versus strong) expert message. For individual participants, we replicated studies testing classic persuasion theories: Factors promoting reliance on central information (i.e., central cross-examination, high NFC) improved verdict accuracy because they sensitized individual participants to the quality discrepancy between the experts' messages. Interestingly, however, at the group level, the more that scientifically accurate mock jurors discussed peripheral (versus central) information about the experts, the more likely their group was to reach the scientifically accurate verdict. When participants were arguing for the scientifically accurate verdict consistent with the strong expert message, peripheral comments increased their persuasiveness, which made the group more likely to reach the more scientifically accurate verdict.

  4. The analysis of delays in simulator digital computing systems. Volume 1: Formulation of an analysis approach using a central example simulator model

    NASA Technical Reports Server (NTRS)

    Heffley, R. K.; Jewell, W. F.; Whitbeck, R. F.; Schulman, T. M.

    1980-01-01

    The effects of spurious delays in real time digital computing systems are examined. Various sources of spurious delays are defined and analyzed using an extant simulator system as an example. A specific analysis procedure is set forth and four cases are viewed in terms of their time and frequency domain characteristics. Numerical solutions are obtained for three single rate one- and two-computer examples, and the analysis problem is formulated for a two-rate, two-computer example.

  5. Computer Assisted Thermography And Its Application In Ovulation Detection

    NASA Astrophysics Data System (ADS)

    Rao, K. H.; Shah, A. V.

    1984-08-01

    Hardware and software of a computer-assisted image analyzing system used for infrared images in medical applications are discussed. The application of computer-assisted thermography (CAT) as a complementary diagnostic tool in centralized diagnostic management is proposed. The authors adopted 'Computer Assisted Thermography' to study physiological changes in the breasts related to the hormones characterizing the menstrual cycle of a woman. Based on clinical experiments followed by thermal image analysis, they suggest that 'differential skin temperature' (DST) be measured to detect the fertility interval in the menstrual cycle of a woman.

  6. Cenozoic biogeography and evolution in direct-developing frogs of Central America (Leptodactylidae: Eleutherodactylus) as inferred from a phylogenetic analysis of nuclear and mitochondrial genes.

    PubMed

    Crawford, Andrew J; Smith, Eric N

    2005-06-01

    We report the first phylogenetic analysis of DNA sequence data for the Central American component of the genus Eleutherodactylus (Anura: Leptodactylidae: Eleutherodactylinae), one of the most ubiquitous, diverse, and abundant components of the Neotropical amphibian fauna. We obtained DNA sequence data from 55 specimens representing 45 species. Sampling was focused on Central America, but also included Bolivia, Brazil, Jamaica, and the USA. We sequenced 1460 contiguous base pairs (bp) of the mitochondrial genome containing ND2 and five neighboring tRNA genes, plus 1300 bp of the c-myc nuclear gene. The resulting phylogenetic inferences were broadly concordant between data sets and among analytical methods. The subgenus Craugastor is monophyletic and its initial radiation was potentially rapid and adaptive. Within Craugastor, the earliest splits separate three northern Central American species groups, milesi, augusti, and alfredi, from a clade comprising the rest of Craugastor. Within the latter clade, the rhodopis group as formerly recognized comprises three deeply divergent clades that do not form a monophyletic group; we therefore restrict the content of the rhodopis group to one of two northern clades, and use new names for the other northern (mexicanus group) and one southern clade (bransfordii group). The new rhodopis and bransfordii groups together form the sister taxon to a clade comprising the biporcatus, fitzingeri, mexicanus, and rugulosus groups. We used a Bayesian MCMC approach together with geological and biogeographic assumptions to estimate divergence times from the combined DNA sequence data. Our results corroborated three independent dispersal events for the origins of Central American Eleutherodactylus: (1) an ancestor of Craugastor entered northern Central America from South America in the early Paleocene, (2) an ancestor of the subgenus Syrrhophus entered northern Central America from the Caribbean at the end of the Eocene, and (3) a wave of independent dispersal events from South America coincided with formation of the Isthmus of Panama during the Pliocene. We elevate the subgenus Craugastor to the genus rank.

  7. Crustal thickness variations in the Zagros continental collision zone (Iran) from joint inversion of receiver functions and surface wave dispersion

    NASA Astrophysics Data System (ADS)

    Tatar, M.; Nasrabadi, A.

    2013-10-01

    Variations in crustal thickness in the Zagros were determined by joint inversion of P wave receiver functions (RFs) and Rayleigh wave group and phase velocity dispersion. The time domain iterative deconvolution procedure was employed to compute RFs from teleseismic recordings at seven broadband stations of the INSN network. Rayleigh wave phase velocity dispersion curves were estimated employing the two-station method. Fundamental mode Rayleigh wave group velocities for each station are taken from a regional scale surface wave tomographic imaging. The main variations in crustal thickness that we observe are between stations located in the Zagros fold and thrust belt and those located in the Sanandaj-Sirjan zone (SSZ) and Urumieh-Dokhtar magmatic assemblage (UDMA). Our results indicate that the average crustal thickness beneath the Zagros Mountain Range varies from ˜46 km in Western and Central Zagros beneath SHGR and GHIR up to ˜50 km beneath BNDS, located in the easternmost Zagros. Toward the NE, we observe an increase in Moho depth where it reaches ˜58 km beneath SNGE, located in the SSZ. Average crustal thickness also varies beneath the UDMA from ˜50 km in western parts below ASAO to ˜58 km in central parts below NASN. The observed variation along the SSZ and UDMA may be associated with ongoing slab steepening or break-off in the NW Zagros, compared with underthrusting of the Arabian plate beneath Central Zagros. The results show that in Central Iran, the crustal thickness decreases again to ˜47 km below KRBR. There is not a significant crustal thickness difference along the Zagros fold and thrust belt. We found the same ˜34-km-thick crystalline crust beneath the different parts of the Zagros fold and thrust belt. The similarity of crustal structure suggests that the crust of the Zagros fold and thrust belt was uniform before subsidence and deposition of the sediments. Our results confirm that the shortening of the western and eastern parts of the Zagros basement is small and has only started recently.

  8. The differential effects of ecstasy/polydrug use on executive components: shifting, inhibition, updating and access to semantic memory.

    PubMed

    Montgomery, Catharine; Fisk, John E; Newcombe, Russell; Murphy, Phillip N

    2005-10-01

    Recent theoretical models suggest that the central executive may not be a unified structure. The present study explored the nature of central executive deficits in ecstasy users. In study 1, 27 ecstasy users and 34 non-users were assessed using tasks to tap memory updating (computation span; letter updating) and access to long-term memory (a semantic fluency test and the Chicago Word Fluency Test). In study 2, 51 ecstasy users and 42 non-users completed tasks that assess mental set switching (number/letter and plus/minus) and inhibition (random letter generation). MANOVA revealed that ecstasy users performed worse on both tasks used to assess memory updating and on tasks to assess access to long-term memory (C- and S-letter fluency). However, notwithstanding the significant ecstasy group-related effects, indices of cocaine and cannabis use were also significantly correlated with most of the executive measures. Unexpectedly, in study 2, ecstasy users performed significantly better on the inhibition task, producing more letters than non-users. No group differences were observed on the switching tasks. Correlations between indices of ecstasy use and number of letters produced were significant. The present study provides further support for ecstasy/polydrug-related deficits in memory updating and in access to long-term memory. The surplus evident on the inhibition task should be treated with some caution, as this was limited to a single measure and has not been supported by our previous work.

  9. North Central IPM Center

    Science.gov Websites

    The North Central IPM Center provides integrated pest management (IPM) solutions for the North Central region. Its website describes the Center's grants program and other IPM-related funding opportunities, along with working groups (including Invasive Plants in Trade, Tribal IPM, and Urban Ag IPM), partners in IPM, critical issues, and projects.

  10. Central diabetes insipidus: clinical profile and factors indicating organic etiology in children.

    PubMed

    Bajpai, Anurag; Kabra, Madhulika; Menon, P S N

    2008-06-01

    To evaluate the profile of children with central diabetes insipidus (DI) and identify factors indicating organic etiology. Retrospective chart review. Tertiary referral hospital. Fifty-nine children with central DI (40 boys, 19 girls). Features of organic and idiopathic central DI were compared using Student's t test and the chi-square test. Odds ratios were calculated for factors indicating organic etiology. Diagnosis included post-operative central DI (13, 22%), central nervous system (CNS) malformations (5, 8.6%; holoprosencephaly 4 and hydrocephalus 1), histiocytosis (11, 18.6%), CNS pathology (11, 18.6%; craniopharyngioma 3, empty sella 2, germinoma 2, neuro-tuberculosis 2, arachnoid cyst 1 and glioma 1) and idiopathic central DI (19, 32.2%). Children with organic central DI were diagnosed later (7.8+/-3.1 years against 5.3+/-2.4 years, P=0.03) and had lower height standard deviation score (-2.7+/-1.0 versus -1.0+/-1.0, P<0.001) compared with the idiopathic group. A greater proportion of children with organic central DI had short stature (81.8% against 10.5%, P<0.001, odds ratio 38.25), neurological features (45.5% against 0%, P=0.009) and anterior pituitary hormone deficiency (81.8% against 5.3%, P<0.001, odds ratio 81) compared with the idiopathic group. A combination of short stature and onset after five years of age discriminated organic central DI from the idiopathic group in all cases. Organic central DI should be suspected in children presenting after the age of five years with growth retardation and features of anterior pituitary deficiency.

  11. Morphologic characteristics of central pulmonary thromboemboli predict haemodynamic response in massive pulmonary embolism.

    PubMed

    Podbregar, Matej; Voga, Gorazd; Krivec, Bojan

    2004-08-01

    On hospital admission, the morphology of the central pulmonary artery thromboemboli is an independent predictor of 30-day mortality in patients with massive pulmonary embolism (MPE). This may be due to the differential susceptibility of thromboemboli to thrombolysis. The aim of this study was to assess haemodynamic response to treatment in patients with MPE and morphologically different thromboemboli. Prospective observational study. An 11-bed closed medical ICU at an 860-bed community general hospital. Twelve consecutive patients with shock or hypotension due to MPE and central pulmonary thromboemboli detected by transesophageal echocardiography who were treated with thrombolysis from January 2000 through April 2002. Patients were divided into two groups according to the characteristics of detected central pulmonary thromboemboli: group 1, thrombi with one or more long, mobile parts; and group 2, immobile thrombi. Urokinase infusion was terminated when mixed venous oxygen saturation was stabilized above 60% for 15 min. At 2 h, the total pulmonary vascular resistance index was reduced more in group 1 than in group 2 [from 27+/-12 mmHg/(l.min.m(2)) to 14+/-6 mmHg/(l.min.m(2)) (-52%) vs 27+/-8 mmHg/(l.min.m(2)) to 23+/-10 mmHg/(l.min.m(2)) (-15%), respectively, P=0.04]. In group 1 thrombolysis was terminated earlier than in group 2 (89+/-40 min vs 210+/-62 min, respectively, P=0.0024). The cumulative dose of urokinase used in group 1 was lower than in group 2 (1.7+/-0.3 M i.u. vs 2.7+/-0.5 M i.u., respectively, P=0.023). Haemodynamic stabilization is achieved faster in patients with mobile central thromboemboli detected by transesophageal echocardiography during MPE.

  12. Computer-Mediated Collaborative Projects: Processes for Enhancing Group Development

    ERIC Educational Resources Information Center

    Dupin-Bryant, Pamela A.

    2008-01-01

    Groups are a fundamental part of the business world. Yet, as companies continue to expand internationally, a major challenge lies in promoting effective communication among employees who work in varying time zones. Global expansion often requires group collaboration through computer systems. Computer-mediated groups lead to different communicative…

  13. Interactive access to forest inventory data for the South Central United States

    Treesearch

    William H. McWilliams

    1990-01-01

    On-line access to USDA Forest Service successive forest inventory data for the South Central United States is provided by two computer systems. The Easy Access to Forest Inventory and Analysis Tables program (EZTAB) produces a set of tables for specific geographic areas. The Interactive Graphics and Retrieval System (INGRES) is a database management system that...

  14. Using Centrality of Concept Maps as a Measure of Problem Space States in Computer-Supported Collaborative Problem Solving

    ERIC Educational Resources Information Center

    Clariana, Roy B.; Engelmann, Tanja; Yu, Wu

    2013-01-01

    Problem solving likely involves at least two broad stages, problem space representation and then problem solution (Newell and Simon, Human problem solving, 1972). The metric centrality that Freeman ("Social Networks" 1:215-239, 1978) implemented in social network analysis is offered here as a potential measure of both. This development research…
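
    For illustration only, a minimal Python sketch of Freeman degree centrality computed on a toy concept-map graph with networkx; the node names are invented and do not come from the paper.

      # Hedged sketch: Freeman degree centrality on a toy concept map,
      # one plausible way to quantify how central a concept is.
      import networkx as nx

      concept_map = nx.Graph()
      concept_map.add_edges_from([
          ("problem", "goal"),
          ("problem", "constraints"),
          ("goal", "solution"),
          ("constraints", "solution"),
          ("solution", "evaluation"),
      ])

      # Degree centrality: fraction of the other nodes each concept is linked to
      centrality = nx.degree_centrality(concept_map)
      for concept, value in sorted(centrality.items(), key=lambda kv: -kv[1]):
          print(f"{concept}: {value:.2f}")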

  15. BIO-Plex Information System Concept

    NASA Technical Reports Server (NTRS)

    Jones, Harry; Boulanger, Richard; Arnold, James O. (Technical Monitor)

    1999-01-01

    This paper describes a suggested design for an integrated information system for the proposed BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex) at Johnson Space Center (JSC), including distributed control systems, central control, networks, database servers, personal computers and workstations, applications software, and external communications. The system will have an open commercial computing and networking architecture. The network will provide automatic real-time transfer of information to database server computers which perform data collection and validation. This information system will support integrated, data-sharing applications for everything from system alarms to management summaries. Most existing complex process control systems have information gaps between the different real-time subsystems, between these subsystems and the central controller, between the central controller and system-level planning and analysis application software, and between the system-level applications and management overview reporting. An integrated information system is vitally necessary as the basis for the integration of planning, scheduling, modeling, monitoring, and control, which will allow improved monitoring and control based on timely, accurate and complete data. Data describing the system configuration and the real-time processes can be collected, checked and reconciled, analyzed, and stored in database servers that can be accessed by all applications. The required technology is available. The only opportunity to design a distributed, nonredundant, integrated system is before it is built; retrofit is extremely difficult and costly.
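
    As a rough illustration of the collect, validate, and store pattern described above, the following Python sketch uses invented field names, limits, and an in-memory stand-in for a database server; it is not the actual BIO-Plex design.

      # Hedged sketch of a collect -> validate -> store pipeline.
      # Field names, limits, and the in-memory "database" are illustrative assumptions.
      from dataclasses import dataclass
      from datetime import datetime, timezone

      @dataclass
      class Reading:
          subsystem: str      # e.g. a crop chamber or air revitalization loop (assumed)
          sensor: str         # e.g. "co2_ppm" (assumed)
          value: float
          timestamp: datetime

      VALID_RANGES = {"co2_ppm": (0.0, 10000.0), "temp_c": (0.0, 50.0)}  # assumed limits

      database = []  # stand-in for a central database server

      def collect(reading):
          """Validate a reading against configured limits, then store it centrally."""
          low, high = VALID_RANGES.get(reading.sensor, (float("-inf"), float("inf")))
          if not (low <= reading.value <= high):
              print(f"ALARM: {reading.subsystem}/{reading.sensor} out of range: {reading.value}")
              return False
          database.append(reading)
          return True

      collect(Reading("crop_chamber_1", "co2_ppm", 1200.0, datetime.now(timezone.utc)))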

  16. Extraction and visualization of the central chest lymph-node stations

    NASA Astrophysics Data System (ADS)

    Lu, Kongkuo; Merritt, Scott A.; Higgins, William E.

    2008-03-01

    Lung cancer remains the leading cause of cancer death in the United States and is expected to account for nearly 30% of all cancer deaths in 2007. Central to the lung-cancer diagnosis and staging process is the assessment of the central chest lymph nodes. This assessment typically requires two major stages: (1) location of the lymph nodes in a three-dimensional (3D) high-resolution volumetric multi-detector computed-tomography (MDCT) image of the chest; (2) subsequent nodal sampling using transbronchial needle aspiration (TBNA). We describe a computer-based system for automatically locating the central chest lymph-node stations in a 3D MDCT image. Automated analysis methods are first run that extract the airway tree, airway-tree centerlines, aorta, pulmonary artery, lungs, key skeletal structures, and major-airway labels. This information provides geometrical and anatomical cues for localizing the major nodal stations. Our system demarcates these stations, conforming to criteria outlined for the Mountain and Wang standard classification systems. Visualization tools within the system then enable the user to interact with these stations to locate visible lymph nodes. Results derived from a set of human 3D MDCT chest images illustrate the usage and efficacy of the system.

  17. Working and strategic memory deficits in schizophrenia

    NASA Technical Reports Server (NTRS)

    Stone, M.; Gabrieli, J. D.; Stebbins, G. T.; Sullivan, E. V.

    1998-01-01

    Working memory and its contribution to performance on strategic memory tests in schizophrenia were studied. Patients (n = 18) and control participants (n = 15), all men, received tests of immediate memory (forward digit span), working memory (listening, computation, and backward digit span), and long-term strategic (free recall, temporal order, and self-ordered pointing) and nonstrategic (recognition) memory. Schizophrenia patients performed worse on all tests. Education, verbal intelligence, and immediate memory capacity did not account for deficits in working memory in schizophrenia patients. Reduced working memory capacity accounted for group differences in strategic memory but not in recognition memory. Working memory impairment may be central to the profile of impaired cognitive performance in schizophrenia and is consistent with hypothesized frontal lobe dysfunction associated with this disease. Additional medial-temporal dysfunction may account for the recognition memory deficit.

  18. Landslide and Flood Warning System Prototypes based on Wireless Sensor Networks

    NASA Astrophysics Data System (ADS)

    Hloupis, George; Stavrakas, Ilias; Triantis, Dimos

    2010-05-01

    Wireless sensor networks (WSNs) are one of the emerging areas that have received great attention during the last few years. This is mainly because WSNs have provided scientists with the capability of developing real-time monitoring systems equipped with sensors based on Micro-Electro-Mechanical Systems (MEMS). WSNs have great potential for many applications in environmental monitoring since the sensor nodes they are composed of can host several MEMS sensors (such as temperature, humidity, inertial, pressure, strain-gauge) and transducers (such as position, velocity, acceleration, vibration). The resulting devices are small and inexpensive but have limited memory and computing resources. Each sensor node contains a sensing module along with an RF transceiver. The communication is broadcast-based since the network topology can change rapidly due to node failures [1]. Sensor nodes can transmit their measurements to central servers through gateway nodes without any processing, or they can make preliminary calculations locally and send the results to central servers [2]. Based on the above characteristics, two prototypes using WSNs are presented in this paper: a landslide detection system and a flood warning system. Both systems send their data to a central processing server, which hosts the core processing routines. Transmission uses the ZigBee and IEEE 802.11b protocols, but VSAT communication can also be used. The landslide detection system uses a structured network topology. Each measuring node comprises a columnar module that is half-buried in the area under investigation. Each sensing module contains a geophone, an inclinometer and a set of strain gauges. Data are transmitted to the central processing server, where possible landslide evolution is monitored. The flood detection system uses an unstructured network topology since the failure rate of sensor nodes is expected to be higher. Each sensing module contains a custom water level sensor (based on plastic optical fiber). Data are transmitted directly to the server, where the early warning algorithms monitor water level variations in real time. Both types of sensor node use power harvesting techniques to extend their battery life as much as possible. [1] Yick, J.; Mukherjee, B.; Ghosal, D. Wireless sensor network survey. Comput. Netw. 2008, 52, 2292-2330. [2] Garcia, M.; Bri, D.; Boronat, F.; Lloret, J. A new neighbor selection strategy for group-based wireless sensor networks. In The Fourth International Conference on Networking and Services (ICNS 2008), Gosier, Guadeloupe, March 16-21, 2008.
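
    For illustration only, a minimal Python sketch of a threshold-based water-level check of the kind the flood prototype's server-side early-warning routines might perform; the window size, alarm threshold, and node identifier are assumptions, not values from the paper.

      # Hedged sketch: server-side early-warning check on water-level samples
      # arriving from sensor nodes. Thresholds and node IDs are assumed.
      from collections import deque
      from statistics import mean

      WINDOW = 10            # number of recent samples kept per node (assumed)
      RISE_ALARM_CM = 15.0   # warn if the latest level exceeds the window mean by this much (assumed)

      recent_levels = {}     # node_id -> deque of recent water levels

      def on_measurement(node_id, water_level_cm):
          """Store a new water-level sample and raise a warning on a rapid rise."""
          window = recent_levels.setdefault(node_id, deque(maxlen=WINDOW))
          window.append(water_level_cm)
          if len(window) == WINDOW and window[-1] - mean(window) > RISE_ALARM_CM:
              print(f"FLOOD WARNING from node {node_id}: level {water_level_cm:.1f} cm")

      for level in [20, 21, 20, 22, 23, 25, 30, 38, 47, 58]:
          on_measurement("river_node_3", float(level))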

  19. The revolution in data gathering systems

    NASA Technical Reports Server (NTRS)

    Cambra, J. M.; Trover, W. F.

    1975-01-01

    Data acquisition systems used in NASA's wind tunnels from the 1950's through the present time are summarized as a baseline for assessing the impact of minicomputers and microcomputers on data acquisition and data processing. Emphasis is placed on the cyclic evolution in computer technology which transformed the central computer system, and finally the distributed computer system. Other developments discussed include: medium scale integration, large scale integration, combining the functions of data acquisition and control, and micro and minicomputers.

  20. A New Volumetric Radiologic Method to Assess Indirect Decompression After Extreme Lateral Interbody Fusion Using High-Resolution Intraoperative Computed Tomography.

    PubMed

    Navarro-Ramirez, Rodrigo; Berlin, Connor; Lang, Gernot; Hussain, Ibrahim; Janssen, Insa; Sloan, Stephen; Askin, Gulce; Avila, Mauricio J; Zubkov, Micaella; Härtl, Roger

    2018-01-01

    Two-dimensional radiographic methods have been proposed to evaluate the radiographic outcome after indirect decompression through extreme lateral interbody fusion (XLIF). However, the assessment of neural decompression in a single plane may underestimate the effect of indirect decompression on central canal and foraminal volumes. The present study aimed to assess the reliability and consistency of a novel 3-dimensional radiographic method that assesses neural decompression by volumetric analysis, using a new generation of intraoperative fan-beam computed tomography scanner, in patients undergoing XLIF. Prospectively collected data from 7 patients (9 levels) undergoing XLIF were retrospectively analyzed. Three independent, blinded raters using imaging analysis software performed volumetric measurements pre- and postoperatively to determine central canal and foraminal volumes. Intrarater and interrater reliability tests were performed to assess the reliability of this novel volumetric method. The interrater reliability between the three raters ranged from 0.800 to 0.952, P < 0.0001. The test-retest analysis on a randomly selected subset of three patients showed good to excellent internal reliability (range 0.78-1.00) for all 3 raters. There was a significant increase of approximately 20% in mean volume for the right foramen, left foramen, and central canal postoperatively (P = 0.0472, P = 0.0066, and P = 0.0003, respectively). Here we demonstrate a new volumetric analysis technique that is feasible, reliable, and reproducible among independent raters for central canal and foraminal volumes in the lumbar spine using an intraoperative computed tomography scanner. Copyright © 2017. Published by Elsevier Inc.
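
    As an illustrative aside (not the study's analysis code), a minimal Python sketch of a pre/post volumetric comparison: percent change per level and a paired t test, using hypothetical central canal volumes rather than the study's measurements.

      # Hedged sketch: pre/post volume comparison with percent change and a paired t test.
      # Volumes (cm^3) are hypothetical values for 9 levels, not study data.
      import numpy as np
      from scipy import stats

      pre_volume = np.array([1.8, 2.1, 1.6, 2.0, 1.9, 2.2, 1.7, 2.0, 1.8])   # assumed
      post_volume = np.array([2.2, 2.5, 1.9, 2.4, 2.3, 2.6, 2.1, 2.4, 2.2])  # assumed

      percent_increase = 100.0 * (post_volume - pre_volume) / pre_volume
      t_stat, p_value = stats.ttest_rel(post_volume, pre_volume)

      print(f"Mean increase: {percent_increase.mean():.1f}%")
      print(f"Paired t test: t = {t_stat:.2f}, P = {p_value:.4f}")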
