20. VIEW OF THE RECORDS STORAGE AREA LOCATED ON THE FIRST FLOOR MEZZANINE. (1/83) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
23. VIEW OF THE FIRST FLOOR PLAN. THE FIRST FLOOR HOUSED ADMINISTRATIVE OFFICES, THE CENTRAL COMPUTING, UTILITY SYSTEMS, ANALYTICAL LABORATORIES, AND MAINTENANCE SHOPS. THE ORIGINAL DRAWING HAS BEEN ARCHIVED ON MICROFILM. THE DRAWING WAS REPRODUCED AT THE BEST QUALITY POSSIBLE. LETTERS AND NUMBERS IN THE CIRCLES INDICATE FOOTER AND/OR COLUMN LOCATIONS. - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
19. VIEW OF THE GENERAL CHEMISTRY LABORATORY IN BUILDING 881. (4/12/62) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
12. VIEW OF THE NON-DESTRUCTIVE TESTING EQUIPMENT BEING USED TO DETECT FLAWS IN FABRICATED COMPONENTS. (6/76) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
7. VIEW OF MACHINE SHOP IN BUILDING 881. WORKERS IN THE MACHINE SHOP FORMED ENRICHED URANIUM COMPONENTS INTO THEIR FINAL SHAPES. (12/12/56) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
17. VIEW OF HYDRIDING SYSTEM IN BUILDING 881. THE HYDRIDING SYSTEM WAS PART OF THE FAST ENRICHED URANIUM RECOVERY PROCESS. (11/11/59) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
5. VIEW OF THE FOUNDRY. IN THE FOUNDRY, ENRICHED URANIUM WAS CAST INTO SLABS OR INGOTS FROM WHICH WEAPONS COMPONENTS WERE FABRICATED. (4/4/66) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
4. VIEW OF THE FOUNDRY. IN THE FOUNDRY, ENRICHED URANIUM WAS CAST INTO SLABS OR INGOTS FROM WHICH WEAPONS COMPONENTS WERE FABRICATED. (5/17/62). - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
Using old technology to implement modern computer-aided decision support for primary diabetes care.
Hunt, D. L.; Haynes, R. B.; Morgan, D.
2001-01-01
BACKGROUND: Implementation rates of interventions known to be beneficial for people with diabetes mellitus are often suboptimal. Computer-aided decision support systems (CDSSs) can improve these rates. The complexity of establishing a fully integrated electronic medical record that provides decision support, however, often prevents their use. OBJECTIVE: To develop a CDSS for diabetes care that can be easily introduced into primary care settings and diabetes clinics. THE SYSTEM: The CDSS uses fax-machine-based optical character recognition software for acquiring patient information. Simple, 1-page paper forms, completed by patients or health practitioners, are faxed to a central location. The information is interpreted and recorded in a database. This initiates a routine that matches the information against a knowledge base so that patient-specific recommendations can be generated. These are formatted and faxed back within 4-5 minutes. IMPLEMENTATION: The system is being introduced into 2 diabetes clinics. We are collecting information on frequency of use of the system, as well as satisfaction with the information provided. CONCLUSION: Computer-aided decision support can be provided in any setting with a fax machine, without the need for integrated electronic medical records or computerized data-collection devices. PMID:11825194
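The fax-in/fax-out pipeline described in this abstract (paper form, OCR, database, knowledge-base match, faxed recommendations) centers on a simple rule-matching step. A minimal sketch of that step follows; the field names, threshold values, and recommendation texts are invented for illustration and are not the authors' actual knowledge base.

```python
# Minimal sketch of the rule-matching step in a fax-based diabetes CDSS.
# Field names, thresholds, and recommendation texts are illustrative
# assumptions, not the knowledge base described in the paper.

def generate_recommendations(patient):
    """Match one patient record against a small rule base."""
    rules = [
        (lambda p: p.get("hba1c", 0) > 7.0,
         "HbA1c above target: review glycemic management."),
        (lambda p: p.get("ldl", 0) > 2.5,
         "LDL above target: consider lipid-lowering therapy."),
        (lambda p: not p.get("foot_exam_done", False),
         "No foot exam recorded: schedule foot examination."),
    ]
    return [advice for condition, advice in rules if condition(patient)]

# A faxed form, after OCR, becomes a flat record like this:
record = {"hba1c": 8.2, "ldl": 2.0, "foot_exam_done": False}
print(generate_recommendations(record))
```

The point of the design is that the rule base runs centrally, so the only equipment a clinic needs is a fax machine.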
Incorrect support and missing center tolerances of phasing algorithms
Huang, Xiaojing; Nelson, Johanna; Steinbrener, Jan; ...
2010-01-01
In x-ray diffraction microscopy, iterative algorithms retrieve reciprocal space phase information, and a real space image, from an object's coherent diffraction intensities through the use of a priori information such as a finite support constraint. In many experiments, the object's shape or support is not well known, and the diffraction pattern is incompletely measured. We describe here computer simulations to look at the effects of both of these possible errors when using several common reconstruction algorithms. Overly tight object supports prevent successful convergence; however, we show that this can often be recognized through pathological behavior of the phase retrieval transfer function. Dynamic range limitations often make it difficult to record the central speckles of the diffraction pattern. We show that this leads to increasing artifacts in the image when the number of missing central speckles exceeds about 10, and that the removal of unconstrained modes from the reconstructed image is helpful only when the number of missing central speckles is less than about 50. In conclusion, this simulation study helps in judging the reconstructability of experimentally recorded coherent diffraction patterns.
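The class of algorithm this abstract studies can be illustrated with the simplest member, error reduction: alternately impose the measured Fourier magnitudes and the real-space support constraint. The toy object, the (deliberately loose) support, and the iteration count below are illustrative choices, not the paper's simulation setup.

```python
# Sketch of support-constrained iterative phase retrieval (error reduction).
# The object, support, and iteration count are toy choices for illustration.
import numpy as np

rng = np.random.default_rng(0)

n = 32
obj = np.zeros((n, n))
obj[12:20, 12:20] = rng.random((8, 8))       # unknown object
support = np.zeros((n, n), dtype=bool)
support[10:22, 10:22] = True                  # a priori (loose) support

measured = np.abs(np.fft.fft2(obj))           # measured diffraction magnitudes

x = rng.random((n, n)) * support              # random start inside support
for _ in range(200):
    X = np.fft.fft2(x)
    X = measured * np.exp(1j * np.angle(X))   # impose measured magnitudes
    x = np.fft.ifft2(X).real
    x[~support] = 0                           # impose support constraint
    x[x < 0] = 0                              # positivity

err = np.linalg.norm(np.abs(np.fft.fft2(x)) - measured) / np.linalg.norm(measured)
print(f"relative Fourier-magnitude error: {err:.3f}")
```

Tightening `support` below the object's true extent, or zeroing the central pixels of `measured`, reproduces in miniature the two failure modes the paper quantifies.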
Computing at DESY — current setup, trends and strategic directions
NASA Astrophysics Data System (ADS)
Ernst, Michael
1998-05-01
Since the HERA experiments H1 and ZEUS started data taking in '92, the computing environment at DESY has changed dramatically. After running mainframe-centred computing for more than 20 years, DESY switched to a heterogeneous, fully distributed computing environment within only about two years, in almost every corner where computing has its applications. The computing strategy was highly influenced by the needs of the user community. The collaborations are usually limited by current technology, and their ever increasing demands are the driving force for central computing to always move close to the technology edge. While DESY's central computing has multidecade experience in running Central Data Recording/Central Data Processing for HEP experiments, the most challenging task today is to provide clear and homogeneous concepts in the desktop area. Given that lowest-level commodity hardware draws more and more attention, combined with the financial constraints we are facing already today, we quickly need concepts for integrated support of a versatile device which has the potential to move into basically any computing area in HEP. Though commercial solutions, especially addressing the PC management/support issues, are expected to come to market in the next 2-3 years, we need to provide suitable solutions now. Buying PCs at DESY currently at a rate of about 30/month will otherwise absorb any available manpower in central computing and still leave hundreds of unhappy people alone. Though certainly not the only region, the desktop issue is one of the most important ones where we need HEP-wide collaboration to a large extent, and right now. Taking into account that there is traditionally no room for R&D at DESY, collaboration, meaning sharing experience and development resources within the HEP community, is a predominant factor for us.
21. VIEW OF THE ENTRANCE TO THE TUNNEL CONNECTING BUILDINGS 881 AND 883. THE TUNNEL WAS CONSTRUCTED IN 1957 TO TRANSPORT ENRICHED URANIUM COMPONENTS BETWEEN THE BUILDINGS. (1/98) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
15. DETAILED VIEW OF ENRICHED URANIUM STORAGE TANK. THE ADDITION OF THE GLASS RINGS SHOWN AT THE TOP OF THE TANK HELPS PREVENT THE URANIUM FROM REACHING CRITICALITY LIMITS. (4/12/62) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
16. VIEW OF THE ENRICHED URANIUM RECOVERY SYSTEM. ENRICHED URANIUM RECOVERY PROCESSED RELATIVELY PURE MATERIALS AND SOLUTIONS AND SOLID RESIDUES WITH RELATIVELY LOW URANIUM CONTENT. URANIUM RECOVERY INVOLVED BOTH SLOW AND FAST PROCESSES. (4/4/66) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
6. VIEW OF THE BRIQUETTING PRESS AND CHIP CLEANING HOOD. SCRAPS OF ENRICHED URANIUM FROM MACHINING OPERATIONS WERE CLEANED IN A SOLVENT BATH, THEN PRESSED INTO BRIQUETTES. THE BRIQUETTES WERE USED AS FEED MATERIAL FOR THE FOUNDRY. (4/4/66) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
18. VIEW OF THE GENERAL CHEMISTRY LAB. THE LABORATORY PROVIDED GENERAL ANALYTICAL AND STANDARDS CALIBRATION, AS WELL AS DEVELOPMENT OPERATIONS INCLUDING WASTE TECHNOLOGY DEVELOPMENT AND DEVELOPMENT AND TESTING OF MECHANICAL SYSTEMS FOR WEAPONS SYSTEMS. (4/4/66) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
21 CFR 1305.24 - Central processing of orders.
Code of Federal Regulations, 2013 CFR
2013-04-01
... or more registered locations and maintains a central processing computer system in which orders are... order with all linked records on the central computer system. (b) A company that has central processing... the company owns and operates. ...
21 CFR 1305.24 - Central processing of orders.
Code of Federal Regulations, 2012 CFR
2012-04-01
... or more registered locations and maintains a central processing computer system in which orders are... order with all linked records on the central computer system. (b) A company that has central processing... the company owns and operates. ...
21 CFR 1305.24 - Central processing of orders.
Code of Federal Regulations, 2011 CFR
2011-04-01
... or more registered locations and maintains a central processing computer system in which orders are... order with all linked records on the central computer system. (b) A company that has central processing... the company owns and operates. ...
21 CFR 1305.24 - Central processing of orders.
Code of Federal Regulations, 2010 CFR
2010-04-01
... or more registered locations and maintains a central processing computer system in which orders are... order with all linked records on the central computer system. (b) A company that has central processing... the company owns and operates. ...
21 CFR 1305.24 - Central processing of orders.
Code of Federal Regulations, 2014 CFR
2014-04-01
... or more registered locations and maintains a central processing computer system in which orders are... order with all linked records on the central computer system. (b) A company that has central processing... the company owns and operates. ...
9. VIEW OF MILLING AND LATHE MACHINES. MILLING AND LATHE MACHINES WERE USED TO FORM COMPONENTS INTO THEIR FINAL SHAPE. IN THE FOUNDRY, ENRICHED URANIUM WAS CAST INTO SPHERICAL SHAPES OR INGOTS FROM WHICH WEAPONS COMPONENTS WERE FABRICATED. (4/4/66) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
22. VIEW OF THE BASEMENT FLOOR PLAN. THE BASEMENT TUNNELS WERE DESIGNED AS FALLOUT SHELTERS AND USED FOR STORAGE. THE ORIGINAL DRAWING HAS BEEN ARCHIVED ON MICROFILM. THE DRAWING WAS REPRODUCED AT THE BEST QUALITY POSSIBLE. LETTERS AND NUMBERS IN THE CIRCLES INDICATE FOOTER AND/OR COLUMN LOCATIONS. - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
The snow system: A decentralized medical data processing system.
Bellika, Johan Gustav; Henriksen, Torje Starbo; Yigzaw, Kassaye Yitbarek
2015-01-01
Systems for large-scale reuse of electronic health record data are claimed to have the potential to transform the current health care delivery system. In principle, three alternative solutions for reuse exist: centralized, data warehouse, and decentralized solutions. This chapter focuses on the decentralized alternative. Decentralized systems may be categorized into approaches that move data to a central point to enable computations, or approaches that move computations to where the data are located. We describe a system that moves computations to where the data are located. Only this kind of decentralized solution can become an ideal system for reuse, as it enables computation and reuse of electronic health record data without moving or exposing the information to outsiders. This chapter describes the Snow system, a decentralized medical data processing system, its components and how it has been used. It also describes the requirements this kind of system needs to support to become sustainable and successful in recruiting voluntary participation from health institutions.
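The "move computations to the data" pattern the chapter describes can be shown with a toy federated aggregate: each institution runs the same local computation and releases only summary statistics, never patient records. The site datasets and the statistic (a mean HbA1c) are invented for illustration; they are not part of the Snow system itself.

```python
# Toy illustration of moving computation to the data: each site computes a
# local (sum, count), and only these summaries leave the institution; the
# coordinator combines them into a global mean without seeing any record.
# The site datasets are invented for illustration.

def local_summary(records):
    """Runs inside an institution; raw records never leave."""
    values = [r["hba1c"] for r in records]
    return sum(values), len(values)

def combine(summaries):
    """Runs at the coordinator; sees only aggregates."""
    total = sum(s for s, _ in summaries)
    count = sum(c for _, c in summaries)
    return total / count

site_a = [{"hba1c": 7.0}, {"hba1c": 8.0}]
site_b = [{"hba1c": 6.5}, {"hba1c": 7.5}, {"hba1c": 9.0}]

summaries = [local_summary(site_a), local_summary(site_b)]
print(combine(summaries))  # global mean across both sites
```

The same shape generalizes to any decomposable statistic (counts, sums, histograms), which is what makes the decentralized alternative viable without a central data store.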
Radar Detection Models in Computer Supported Naval War Games
1979-06-08
revealed a requirement for the effective centralized management of computer supported war game development and employment in the U.S. Navy. A...considerations and supports the requirement for centralized management of computerized war game development. Therefore it is recommended that a central...managerial and fiscal authority be established for computerized tactical war game development. This central authority should ensure that new games
41 CFR 105-56.017 - Centralized salary offset computer match.
Code of Federal Regulations, 2013 CFR
2013-07-01
... offset computer match. 105-56.017 Section 105-56.017 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...
41 CFR 105-56.017 - Centralized salary offset computer match.
Code of Federal Regulations, 2012 CFR
2012-01-01
... offset computer match. 105-56.017 Section 105-56.017 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...
41 CFR 105-56.027 - Centralized salary offset computer match.
Code of Federal Regulations, 2014 CFR
2014-01-01
... offset computer match. 105-56.027 Section 105-56.027 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...
41 CFR 105-56.017 - Centralized salary offset computer match.
Code of Federal Regulations, 2014 CFR
2014-01-01
... offset computer match. 105-56.017 Section 105-56.017 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...
41 CFR 105-56.027 - Centralized salary offset computer match.
Code of Federal Regulations, 2012 CFR
2012-01-01
... offset computer match. 105-56.027 Section 105-56.027 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...
41 CFR 105-56.027 - Centralized salary offset computer match.
Code of Federal Regulations, 2013 CFR
2013-07-01
... offset computer match. 105-56.027 Section 105-56.027 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...
41 CFR 105-56.027 - Centralized salary offset computer match.
Code of Federal Regulations, 2011 CFR
2011-01-01
... offset computer match. 105-56.027 Section 105-56.027 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...
41 CFR 105-56.027 - Centralized salary offset computer match.
Code of Federal Regulations, 2010 CFR
2010-07-01
... offset computer match. 105-56.027 Section 105-56.027 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...
41 CFR 105-56.017 - Centralized salary offset computer match.
Code of Federal Regulations, 2010 CFR
2010-07-01
... offset computer match. 105-56.017 Section 105-56.017 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...
41 CFR 105-56.017 - Centralized salary offset computer match.
Code of Federal Regulations, 2011 CFR
2011-01-01
... offset computer match. 105-56.017 Section 105-56.017 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...
24. VIEW OF THE SECOND FLOOR PLAN. ENRICHED URANIUM AND STAINLESS STEEL WEAPONS COMPONENT PRODUCTION-RELATED ACTIVITIES OCCURRED PRIMARILY ON THE SECOND FLOOR. THE ORIGINAL DRAWING HAS BEEN ARCHIVED ON MICROFILM. THE DRAWING WAS REPRODUCED AT THE BEST QUALITY POSSIBLE. LETTERS AND NUMBERS IN THE CIRCLES INDICATE FOOTER AND/OR COLUMN LOCATIONS. - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
25. VIEW OF THE MACHINE TOOL LAYOUT IN ROOMS 244 AND 296. MACHINES WERE USED FOR STAINLESS STEEL FABRICATION (THE J-LINE). THE ORIGINAL DRAWING HAS BEEN ARCHIVED ON MICROFILM. THE DRAWING WAS REPRODUCED AT THE BEST QUALITY POSSIBLE. LETTERS AND NUMBERS IN THE CIRCLES INDICATE FOOTER AND/OR COLUMN LOCATIONS. - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
[Problem list in computer-based patient records].
Ludwig, C A
1997-01-14
Computer-based clinical information systems are capable of effectively processing even large amounts of patient-related data. However, physicians depend on rapid access to summarized, clearly laid out data on the computer screen to inform themselves about a patient's current clinical situation. In introducing a clinical workplace system, we therefore transformed the problem list, which for decades has been successfully used in clinical information management, into an electronic equivalent and integrated it into the medical record. The table contains a concise overview of diagnoses and problems as well as related findings. Graphical information can also be integrated into the table, and an additional space is provided for a summary of planned examinations or interventions. The digital form of the problem list makes it possible to use the entire list or selected text elements for generating medical documents. Diagnostic terms for medical reports are transferred automatically to corresponding documents. Computer technology has immense potential for the further development of problem list concepts. With multimedia applications, sound and images will be included in the problem list. Through hyperlinks, the problem list could become a central information board and table of contents of the medical record, thus serving as the starting point for database searches and supporting the user in navigating through the medical record.
Distributed Computing with Centralized Support Works at Brigham Young.
ERIC Educational Resources Information Center
McDonald, Kelly; Stone, Brad
1992-01-01
Brigham Young University (Utah) has addressed the need for maintenance and support of distributed computing systems on campus by implementing a program patterned after a national business franchise, providing the support and training of a centralized administration but allowing each unit to operate much as an independent small business.…
Hand-held computer operating system program for collection of resident experience data.
Malan, T K; Haffner, W H; Armstrong, A Y; Satin, A J
2000-11-01
To describe a system for recording resident experience involving hand-held computers with the Palm Operating System (3Com, Inc., Santa Clara, CA). Hand-held personal computers (PCs) are popular, easy to use, inexpensive, portable, and can share data among other operating systems. Residents in our program carry individual hand-held database computers to record Residency Review Committee (RRC) reportable patient encounters. Each resident's data are transferred to a single central relational database compatible with Microsoft Access (Microsoft Corporation, Redmond, WA). Patient data entry and subsequent transfer to a central database are accomplished with commercially available software that requires minimal computer expertise to implement and maintain. The central database can then be used for statistical analysis or to create required RRC resident experience reports. As a result, the data collection and transfer process takes less time for residents and program director alike than paper-based or central computer-based systems. The system of collecting resident encounter data using hand-held computers with the Palm Operating System is easy to use, relatively inexpensive, accurate, and secure. The user-friendly system provides prompt, complete, and accurate data, enhancing the education of residents while facilitating the job of the program director.
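The device-to-central-database workflow above amounts to periodically merging per-resident logs into one relational store and reporting by aggregate query. A minimal sketch using SQLite follows; the table layout, field names, and sample encounters are illustrative assumptions, not the program's actual schema.

```python
# Sketch of merging per-resident encounter logs into one central relational
# database, as in the hand-held workflow above. Table and field names are
# illustrative assumptions, not the program's actual schema.
import sqlite3

central = sqlite3.connect(":memory:")
central.execute(
    "CREATE TABLE encounters (resident TEXT, date TEXT, procedure TEXT)")

def sync(device_records):
    """Transfer one device's records into the central database."""
    central.executemany(
        "INSERT INTO encounters VALUES (?, ?, ?)", device_records)
    central.commit()

sync([("resident_a", "2000-10-01", "cesarean delivery")])
sync([("resident_b", "2000-10-02", "vaginal delivery"),
      ("resident_b", "2000-10-03", "cesarean delivery")])

# An RRC-style experience report is then a simple aggregate query:
rows = central.execute(
    "SELECT resident, COUNT(*) FROM encounters GROUP BY resident").fetchall()
print(rows)
```

Once the data live in one relational store, any required report reduces to a query like the one above, which is the time saving the abstract claims over paper-based collection.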
10 CFR 600.342 - Retention and access requirements for records.
Code of Federal Regulations, 2013 CFR
2013-01-01
..., reliability, and security of the original computer data. Recipients must also maintain an audit trail... records, supporting documents, statistical records, and all other records pertinent to an award must be... related to computer usage chargeback rates), along with their supporting records, must be retained for a 3...
10 CFR 600.342 - Retention and access requirements for records.
Code of Federal Regulations, 2014 CFR
2014-01-01
..., reliability, and security of the original computer data. Recipients must also maintain an audit trail... records, supporting documents, statistical records, and all other records pertinent to an award must be... related to computer usage chargeback rates), along with their supporting records, must be retained for a 3...
10 CFR 600.342 - Retention and access requirements for records.
Code of Federal Regulations, 2012 CFR
2012-01-01
..., reliability, and security of the original computer data. Recipients must also maintain an audit trail... records, supporting documents, statistical records, and all other records pertinent to an award must be... related to computer usage chargeback rates), along with their supporting records, must be retained for a 3...
LaRC local area networks to support distributed computing
NASA Technical Reports Server (NTRS)
Riddle, E. P.
1984-01-01
The Langley Research Center's (LaRC) Local Area Network (LAN) effort is discussed. LaRC initiated the development of a LAN to support a growing distributed computing environment at the Center. The purpose of the network is to provide an improved capability (over interactive and RJE terminal access) for sharing multivendor computer resources. Specifically, the network will provide a data highway for the transfer of files between mainframe computers, minicomputers, work stations, and personal computers. An important influence on the overall network design was the vital need of LaRC researchers to efficiently utilize the large CDC mainframe computers in the central scientific computing facility. Although there has been a steady migration from a centralized to a distributed computing environment at LaRC in recent years, the work load on the central resources has increased. Major emphasis in the network design was on communication with the central resources within the distributed environment. The network to be implemented will allow researchers to utilize the central resources, distributed minicomputers, work stations, and personal computers to obtain the proper level of computing power to efficiently perform their jobs.
13. VIEW OF A B-BOX, WHICH WAS USED IN THE FAST RECOVERY PROCESS. URANIUM OXIDE WAS TRANSFERRED FOR DISSOLUTION IN A ROOM WHICH HOUSED 3 ROWS OF B-BOXES. B-BOXES ARE CONTROLLED HOODS, SIMILAR TO LAB HOODS THAT OPERATED WITH HIGH AIR VELOCITIES AT THEIR OPENINGS TO ENSURE THAT THE VAPORS WERE CONTAINED WITHIN THE HOOD. (2/14/79) - Rocky Flats Plant, General Manufacturing, Support, Records-Central Computing, Southern portion of Plant, Golden, Jefferson County, CO
Attendance fingerprint identification system using arduino and single board computer
NASA Astrophysics Data System (ADS)
Muchtar, M. A.; Seniman; Arisandi, D.; Hasanah, S.
2018-03-01
The fingerprint is one of the most distinctive parts of the human body, distinguishing one person from others, and is easily accessed. This uniqueness is supported by technology that can automatically identify or recognize a person, called a fingerprint sensor. Yet existing fingerprint sensors can only perform fingerprint identification on one machine. For this reason, we need a method to recognize each user on a different fingerprint sensor. The purpose of this research is to build a fingerprint sensor system with centralized fingerprint data management, so that identification can be done at each fingerprint sensor. The result of this research shows that by using Arduino and Raspberry Pi, data processing can be centralized so that fingerprint identification can be done at each fingerprint sensor with a 98.5% success rate of centralized server recording.
ERIC Educational Resources Information Center
Enriquez, Judith Guevarra
2010-01-01
In this article, centrality is explored as a measure of computer-mediated communication (CMC) in networked learning. Centrality measure is quite common in performing social network analysis (SNA) and in analysing social cohesion, strength of ties and influence in CMC, and computer-supported collaborative learning research. It argues that measuring…
20 CFR 404.1588 - Your responsibility to tell us of events that may change your disability status.
Code of Federal Regulations, 2013 CFR
2013-04-01
... may change your disability status. 404.1588 Section 404.1588 Employees' Benefits SOCIAL SECURITY... issue a receipt to you or your representative at least until a centralized computer file that records... centralized computer file is in place, we will continue to issue receipts to you or your representative if you...
20 CFR 404.1588 - Your responsibility to tell us of events that may change your disability status.
Code of Federal Regulations, 2012 CFR
2012-04-01
... may change your disability status. 404.1588 Section 404.1588 Employees' Benefits SOCIAL SECURITY... issue a receipt to you or your representative at least until a centralized computer file that records... centralized computer file is in place, we will continue to issue receipts to you or your representative if you...
20 CFR 404.1588 - Your responsibility to tell us of events that may change your disability status.
Code of Federal Regulations, 2011 CFR
2011-04-01
... may change your disability status. 404.1588 Section 404.1588 Employees' Benefits SOCIAL SECURITY... issue a receipt to you or your representative at least until a centralized computer file that records... centralized computer file is in place, we will continue to issue receipts to you or your representative if you...
20 CFR 404.1588 - Your responsibility to tell us of events that may change your disability status.
Code of Federal Regulations, 2014 CFR
2014-04-01
... may change your disability status. 404.1588 Section 404.1588 Employees' Benefits SOCIAL SECURITY... issue a receipt to you or your representative at least until a centralized computer file that records... centralized computer file is in place, we will continue to issue receipts to you or your representative if you...
Central Computational Facility CCF communications subsystem options
NASA Technical Reports Server (NTRS)
Hennigan, K. B.
1979-01-01
A MITRE study which investigated the communication options available to support both the remaining Central Computational Facility (CCF) computer systems and the proposed U1108 replacements is presented. The facilities utilized to link the remote user terminals with the CCF were analyzed and guidelines to provide more efficient communications were established.
Farmer, Andrew; Toms, Christy; Hardinge, Maxine; Williams, Veronika; Rutter, Heather; Tarassenko, Lionel
2014-01-08
The potential for telehealth-based interventions to provide remote support, education and improve self-management for long-term conditions is increasingly recognised. This trial aims to determine whether an intervention delivered through an easy-to-use tablet computer can improve the quality of life of patients with chronic obstructive pulmonary disease (COPD) by providing personalised self-management information and education. The EDGE (sElf management anD support proGrammE) for COPD is a multicentre, randomised controlled trial designed to assess the efficacy of an Internet-linked tablet computer-based intervention (the EDGE platform) in improving quality of life in patients with moderate to very severe COPD compared with usual care. Eligible patients are randomly allocated to receive the tablet computer-based intervention or usual care in a 2:1 ratio using a web-based randomisation system. Participants are recruited from respiratory outpatient clinics and pulmonary rehabilitation courses as well as from those recently discharged from hospital with a COPD-related admission and from primary care clinics. Participants allocated to the tablet computer-based intervention complete a daily symptom diary and record clinical symptoms using a Bluetooth-linked pulse oximeter. Participants allocated to receive usual care are provided with all the information given to those allocated to the intervention but without the use of the tablet computer or the facility to monitor their symptoms or physiological variables. The primary outcome of quality of life is measured using the St George's Respiratory Questionnaire for COPD patients (SGRQ-C) baseline, 6 and 12 months. Secondary outcome measures are recorded at these intervals in addition to 3 months. The Research Ethics Committee for Berkshire-South Central has provided ethical approval for the conduct of the study in the recruiting regions. 
The results of the study will be disseminated through peer review publications and conference presentations. Current controlled trials ISRCTN40367841.
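The 2:1 web-based allocation described in the EDGE record above can be illustrated with a minimal block-randomisation sketch. This is a hypothetical illustration of the general technique, not the trial's actual randomisation system; the function name and block size are assumptions.

```python
import random

def block_randomise(n_participants, seed=None):
    """Allocate participants 2:1 (intervention : usual care) using shuffled
    blocks of 6, so the ratio is exact after every complete block."""
    rng = random.Random(seed)
    allocations = []
    while len(allocations) < n_participants:
        # Each block of 6 holds 4 intervention and 2 usual-care slots.
        block = ["intervention"] * 4 + ["usual care"] * 2
        rng.shuffle(block)
        allocations.extend(block)
    return allocations[:n_participants]

groups = block_randomise(90, seed=1)
print(groups.count("intervention"), groups.count("usual care"))  # 60 30
```

Because 90 is a multiple of the block size, the 2:1 ratio comes out exactly; with other sample sizes the final partial block keeps it approximately 2:1.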
77 FR 60401 - Privacy Act of 1974; Systems of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-03
... computer password protection.'' * * * * * System manager(s) and address: Delete entry and replace with...; Systems of Records AGENCY: National Security Agency/Central Security Service, DoD. ACTION: Notice to amend a system of records. SUMMARY: The National Security Agency (NSA) is proposing to amend a system of...
Report Central: quality reporting tool in an electronic health record.
Jung, Eunice; Li, Qi; Mangalampalli, Anil; Greim, Julie; Eskin, Michael S; Housman, Dan; Isikoff, Jeremy; Abend, Aaron H; Middleton, Blackford; Einbinder, Jonathan S
2006-01-01
Quality reporting tools, integrated with ambulatory electronic health records, can help clinicians and administrators understand performance, manage populations, and improve quality. Report Central is a secure web report delivery tool built on Crystal Reports XI™ and ASP.NET technologies. Pilot evaluation of Report Central indicates that clinicians prefer a quality reporting tool that is integrated with our home-grown EHR to support clinical workflow.
Changes in corticostriatal connectivity during reinforcement learning in humans.
Horga, Guillermo; Maia, Tiago V; Marsh, Rachel; Hao, Xuejun; Xu, Dongrong; Duan, Yunsuo; Tau, Gregory Z; Graniello, Barbara; Wang, Zhishun; Kangarlu, Alayar; Martinez, Diana; Packard, Mark G; Peterson, Bradley S
2015-02-01
Many computational models assume that reinforcement learning relies on changes in synaptic efficacy between cortical regions representing stimuli and striatal regions involved in response selection, but this assumption has thus far lacked empirical support in humans. We recorded hemodynamic signals with fMRI while participants navigated a virtual maze to find hidden rewards. We fitted a reinforcement-learning algorithm to participants' choice behavior and evaluated the neural activity and the changes in functional connectivity related to trial-by-trial learning variables. Activity in the posterior putamen during choice periods increased progressively during learning. Furthermore, the functional connections between the sensorimotor cortex and the posterior putamen strengthened progressively as participants learned the task. These changes in corticostriatal connectivity differentiated participants who learned the task from those who did not. These findings provide a direct link between changes in corticostriatal connectivity and learning, thereby supporting a central assumption common to several computational models of reinforcement learning. © 2014 Wiley Periodicals, Inc.
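The trial-by-trial learning variables mentioned in the abstract above typically come from a delta-rule value update of the kind common to the computational models it cites. The sketch below shows that standard update rule for illustration; it is not the authors' exact fitted algorithm.

```python
def q_update(q, action, reward, alpha=0.1):
    """Delta rule: move the chosen action's value toward the received reward
    by learning rate alpha (the prediction error is reward - q[action])."""
    q = dict(q)  # leave the input value table unmodified
    q[action] += alpha * (reward - q[action])
    return q

q = {"left": 0.0, "right": 0.0}
for _ in range(50):          # 'right' is rewarded on every trial
    q = q_update(q, "right", 1.0)
print(round(q["right"], 3))  # 0.995: value converges toward the reward
```

Fitting such a model to choice data yields per-trial prediction errors and values, which can then be regressed against hemodynamic signals as the study describes.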
32 CFR 34.42 - Retention and access requirements for records.
Code of Federal Regulations, 2014 CFR
2014-07-01
... procedures shall maintain the integrity, reliability, and security of the original computer data. Recipients... (such as documents related to computer usage chargeback rates), along with their supporting records... this section is maintained on a computer, recipients shall retain the computer data on a reliable...
32 CFR 34.42 - Retention and access requirements for records.
Code of Federal Regulations, 2013 CFR
2013-07-01
... procedures shall maintain the integrity, reliability, and security of the original computer data. Recipients... (such as documents related to computer usage chargeback rates), along with their supporting records... this section is maintained on a computer, recipients shall retain the computer data on a reliable...
32 CFR 34.42 - Retention and access requirements for records.
Code of Federal Regulations, 2012 CFR
2012-07-01
... procedures shall maintain the integrity, reliability, and security of the original computer data. Recipients... (such as documents related to computer usage chargeback rates), along with their supporting records... this section is maintained on a computer, recipients shall retain the computer data on a reliable...
Report Central: Quality Reporting Tool in an Electronic Health Record
Jung, Eunice; Li, Qi; Mangalampalli, Anil; Greim, Julie; Eskin, Michael S.; Housman, Dan; Isikoff, Jeremy; Abend, Aaron H.; Middleton, Blackford; Einbinder, Jonathan S.
2006-01-01
Quality reporting tools, integrated with ambulatory electronic health records, can help clinicians and administrators understand performance, manage populations, and improve quality. Report Central is a secure web report delivery tool built on Crystal Reports XI™ and ASP.NET technologies. Pilot evaluation of Report Central indicates that clinicians prefer a quality reporting tool that is integrated with our home-grown EHR to support clinical workflow. PMID:17238590
21 CFR 1304.04 - Maintenance of records and inventories.
Code of Federal Regulations, 2011 CFR
2011-04-01
... manual, or computer readable, form. (2) A registered retail pharmacy that possesses additional... this part for those additional registered sites at the retail pharmacy or other approved central...) Each registered pharmacy shall maintain the inventories and records of controlled substances as follows...
21 CFR 1304.04 - Maintenance of records and inventories.
Code of Federal Regulations, 2013 CFR
2013-04-01
... manual, or computer readable, form. (2) A registered retail pharmacy that possesses additional... this part for those additional registered sites at the retail pharmacy or other approved central...) Each registered pharmacy shall maintain the inventories and records of controlled substances as follows...
21 CFR 1304.04 - Maintenance of records and inventories.
Code of Federal Regulations, 2014 CFR
2014-04-01
... manual, or computer readable, form. (2) A registered retail pharmacy that possesses additional... this part for those additional registered sites at the retail pharmacy or other approved central...) Each registered pharmacy shall maintain the inventories and records of controlled substances as follows...
21 CFR 1304.04 - Maintenance of records and inventories.
Code of Federal Regulations, 2012 CFR
2012-04-01
... manual, or computer readable, form. (2) A registered retail pharmacy that possesses additional... this part for those additional registered sites at the retail pharmacy or other approved central...) Each registered pharmacy shall maintain the inventories and records of controlled substances as follows...
Grid site availability evaluation and monitoring at CMS
Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe; ...
2017-10-01
The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute resources, from a hundred to well over ten thousand computing cores and storage from tens of TBytes to tens of PBytes. In such a large computing setup, scheduled and unscheduled outages occur continually and are not allowed to significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC the site evaluation and monitoring system is being overhauled to enable faster detection/reaction to failures and a more dynamic handling of computing resources. Furthermore, enhancements to better distinguish site from central service issues and to make evaluations more transparent and informative to site support staff are planned.
Grid site availability evaluation and monitoring at CMS
NASA Astrophysics Data System (ADS)
Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe; Lammel, Stephan; Sciabà, Andrea
2017-10-01
The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute with resources from hundred to well over ten thousand computing cores and storage from tens of TBytes to tens of PBytes. In such a large computing setup scheduled and unscheduled outages occur continually and are not allowed to significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC the site evaluation and monitoring system is being overhauled to enable faster detection/reaction to failures and a more dynamic handling of computing resources. Enhancements to better distinguish site from central service issues and to make evaluations more transparent and informative to site support staff are planned.
44 CFR 13.42 - Retention and access requirements for records.
Code of Federal Regulations, 2014 CFR
2014-10-01
... AGENCY, DEPARTMENT OF HOMELAND SECURITY GENERAL UNIFORM ADMINISTRATIVE REQUIREMENTS FOR GRANTS AND... to all financial and programmatic records, supporting documents, statistical records, and other... computations of the rate at which a particular group of costs is chargeable (such as computer usage chargeback...
44 CFR 13.42 - Retention and access requirements for records.
Code of Federal Regulations, 2011 CFR
2011-10-01
... AGENCY, DEPARTMENT OF HOMELAND SECURITY GENERAL UNIFORM ADMINISTRATIVE REQUIREMENTS FOR GRANTS AND... to all financial and programmatic records, supporting documents, statistical records, and other... computations of the rate at which a particular group of costs is chargeable (such as computer usage chargeback...
44 CFR 13.42 - Retention and access requirements for records.
Code of Federal Regulations, 2013 CFR
2013-10-01
... AGENCY, DEPARTMENT OF HOMELAND SECURITY GENERAL UNIFORM ADMINISTRATIVE REQUIREMENTS FOR GRANTS AND... to all financial and programmatic records, supporting documents, statistical records, and other... computations of the rate at which a particular group of costs is chargeable (such as computer usage chargeback...
44 CFR 13.42 - Retention and access requirements for records.
Code of Federal Regulations, 2010 CFR
2010-10-01
... AGENCY, DEPARTMENT OF HOMELAND SECURITY GENERAL UNIFORM ADMINISTRATIVE REQUIREMENTS FOR GRANTS AND... to all financial and programmatic records, supporting documents, statistical records, and other... computations of the rate at which a particular group of costs is chargeable (such as computer usage chargeback...
44 CFR 13.42 - Retention and access requirements for records.
Code of Federal Regulations, 2012 CFR
2012-10-01
... AGENCY, DEPARTMENT OF HOMELAND SECURITY GENERAL UNIFORM ADMINISTRATIVE REQUIREMENTS FOR GRANTS AND... to all financial and programmatic records, supporting documents, statistical records, and other... computations of the rate at which a particular group of costs is chargeable (such as computer usage chargeback...
10 CFR 600.342 - Retention and access requirements for records.
Code of Federal Regulations, 2011 CFR
2011-01-01
..., reliability, and security of the original computer data. Recipients must also maintain an audit trail... related to computer usage chargeback rates), along with their supporting records, must be retained for a 3... maintained on a computer, recipients must retain the computer data on a reliable medium for the time periods...
10 CFR 600.342 - Retention and access requirements for records.
Code of Federal Regulations, 2010 CFR
2010-01-01
..., reliability, and security of the original computer data. Recipients must also maintain an audit trail... related to computer usage chargeback rates), along with their supporting records, must be retained for a 3... maintained on a computer, recipients must retain the computer data on a reliable medium for the time periods...
Yamamoto, K; Ogura, H; Furutani, H; Kitazoe, Y; Takeda, Y; Hirakawa, M
1986-01-01
A computer system for managing surgical operations is introduced; it has been in use since October 1981 at Kochi Medical School as one of the integral sub-systems of the total hospital information system called IMIS. The system was designed from the beginning with the main purpose of improving the management of operations, and detailed medical records are kept for before, during and after each operation. Almost all operations except emergencies were managed using the computer system rather than the paper system. After presenting some of the results from the accumulated records, we discuss the reasons for this high frequency of use of the computer system.
40 CFR 63.1416 - Recordkeeping requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... § 63.1416 Recordkeeping requirements. (a) Data retention. Unless otherwise specified in this subpart... site or shall be accessible from a central location by computer or other means that provides access.... Records may be maintained in hard copy or computer-readable form including, but not limited to, on paper...
40 CFR 63.1416 - Recordkeeping requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... § 63.1416 Recordkeeping requirements. (a) Data retention. Unless otherwise specified in this subpart... site or shall be accessible from a central location by computer or other means that provides access.... Records may be maintained in hard copy or computer-readable form including, but not limited to, on paper...
41 CFR 105-56.024 - Purpose and scope.
Code of Federal Regulations, 2010 CFR
2010-07-01
... offset computer matching, identify Federal employees who owe delinquent non-tax debt to the United States. Centralized salary offset computer matching is the computerized comparison of delinquent debt records with...) administrative offset program, to collect delinquent debts owed to the Federal Government. This process is known...
34 CFR 80.42 - Retention and access requirements for records.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., and any similar accounting computations of the rate at which a particular group of costs is chargeable... plan, or computation and its supporting records starts from end of the fiscal year (or other accounting... to any pertinent books, documents, papers, or other records of grantees and subgrantees which are...
A Patient Record-Filing System for Family Practice
Levitt, Cheryl
1988-01-01
The efficient storage and easy retrieval of quality records are a central concern of good family practice. Many physicians starting out in practice have difficulty choosing a practical and lasting system for storing their records. Some who have established practices are installing computers in their offices and finding that their filing systems are worn, outdated, and incompatible with computerized systems. This article describes a new filing system installed simultaneously with a new computer system in a family-practice teaching centre. The approach adopted solved all identifiable problems and is applicable in family practices of all sizes.
Commodity Tracker: Mobile Application for Food Security Monitoring in Haiti
NASA Astrophysics Data System (ADS)
Chiu, M. T.; Huang, X.; Baird, J.; Gourley, J. R.; Morelli, R.; de Lanerolle, T. R.; Haiti Food Security Monitoring Mobile App Team
2011-12-01
Price data for Haiti commodities such as rice and potatoes have traditionally been recorded by hand on paper forms for many years. The information is then entered into a computer manually, making the process a long and arduous one. With the development of the Haiti Commodity Tracker mobile app, we are able to make this commodity price data recording process more efficient. Officials may use this information to make inferences about differences in commodity prices and for food distribution during critical times after natural disasters. This information can also be utilized by governments and aid agencies in their food assistance programs. Agronomists record the item prices from several sample sites in a marketplace and compare those results with those from other markets across the region. Due to limited connectivity in rural areas, data is first saved to the phone's database and then retransmitted to a central server via SMS messaging. The mobile app is currently being field tested by an international NGO providing agricultural aid and support in rural Haiti.
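The store-and-forward behaviour described above (save records locally, retransmit when SMS connectivity returns) can be sketched as a small queue. The class and method names here are hypothetical illustrations, not the app's actual API.

```python
from collections import deque

class PriceQueue:
    """Store-and-forward buffer: price records stay queued locally and are
    flushed in order once transmission (e.g. via SMS) succeeds."""

    def __init__(self, send):
        self.pending = deque()
        self.send = send  # callable returning True on successful transmission

    def record(self, market, commodity, price):
        self.pending.append((market, commodity, price))

    def flush(self):
        """Try to send queued records oldest-first; stop at the first failure."""
        sent = 0
        while self.pending:
            if not self.send(self.pending[0]):
                break  # still offline: keep remaining records for later
            self.pending.popleft()
            sent += 1
        return sent

q = PriceQueue(send=lambda rec: False)   # offline: nothing leaves the queue
q.record("Port-au-Prince", "rice", 35.0)
q.record("Jacmel", "potatoes", 22.5)
print(q.flush(), len(q.pending))         # 0 2: records retained while offline
q.send = lambda rec: True                # connectivity restored
print(q.flush(), len(q.pending))         # 2 0: queue drained in order
```

Flushing oldest-first preserves the recording order on the central server even after long offline periods.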
1965-01-01
The surface-water records for the 1965 water year for gaging stations, partial-record stations, and miscellaneous sites within California are given in this report. For convenience, also included are records for a few pertinent gaging stations in bordering States. The records were collected and computed by the Water Resources Division of the U.S. Geological Survey, under the direction of Walter Hofmann, district chief, Menlo Park, Calif.
An Implemented Strategy for Campus Connectivity and Cooperative Computing.
ERIC Educational Resources Information Center
Halaris, Antony S.; Sloan, Lynda W.
1989-01-01
ConnectPac, a software package developed at Iona College to allow a computer user to access all services from a single personal computer, is described. ConnectPac uses mainframe computing to support a campus computing network, integrating personal and centralized computing into a menu-driven user environment. (Author/MLW)
40 CFR 63.506 - General recordkeeping and reporting provisions.
Code of Federal Regulations, 2013 CFR
2013-07-01
... General recordkeeping and reporting provisions. (a) Data retention. Unless otherwise specified in this... retained on site or shall be accessible from a central location by computer or other means that provide... offsite. Records may be maintained in hard copy or computer-readable form including, but not limited to...
40 CFR 63.506 - General recordkeeping and reporting provisions.
Code of Federal Regulations, 2014 CFR
2014-07-01
... General recordkeeping and reporting provisions. (a) Data retention. Unless otherwise specified in this... retained on site or shall be accessible from a central location by computer or other means that provide... offsite. Records may be maintained in hard copy or computer-readable form including, but not limited to...
40 CFR 63.506 - General recordkeeping and reporting provisions.
Code of Federal Regulations, 2012 CFR
2012-07-01
... General recordkeeping and reporting provisions. (a) Data retention. Unless otherwise specified in this... retained on site or shall be accessible from a central location by computer or other means that provide... offsite. Records may be maintained in hard copy or computer-readable form including, but not limited to...
Code of Federal Regulations, 2010 CFR
2010-01-01
... SECURITY INFORMATION § 403.9 Fees. The following specific fees shall be applicable with respect to services... records, per hour or fraction thereof: (i) Professional $11.00 (ii) Clerical 6.00 (b) Computer service charges per second for actual use of computer central processing unit .25 (c) Copies made by photostat or...
Code of Federal Regulations, 2012 CFR
2012-01-01
... SECURITY INFORMATION § 403.9 Fees. The following specific fees shall be applicable with respect to services... records, per hour or fraction thereof: (i) Professional $11.00 (ii) Clerical 6.00 (b) Computer service charges per second for actual use of computer central processing unit .25 (c) Copies made by photostat or...
Code of Federal Regulations, 2011 CFR
2011-01-01
... SECURITY INFORMATION § 403.9 Fees. The following specific fees shall be applicable with respect to services... records, per hour or fraction thereof: (i) Professional $11.00 (ii) Clerical 6.00 (b) Computer service charges per second for actual use of computer central processing unit .25 (c) Copies made by photostat or...
Code of Federal Regulations, 2013 CFR
2013-01-01
... SECURITY INFORMATION § 403.9 Fees. The following specific fees shall be applicable with respect to services... records, per hour or fraction thereof: (i) Professional $11.00 (ii) Clerical 6.00 (b) Computer service charges per second for actual use of computer central processing unit .25 (c) Copies made by photostat or...
Code of Federal Regulations, 2014 CFR
2014-01-01
... SECURITY INFORMATION § 403.9 Fees. The following specific fees shall be applicable with respect to services... records, per hour or fraction thereof: (i) Professional $11.00 (ii) Clerical 6.00 (b) Computer service charges per second for actual use of computer central processing unit .25 (c) Copies made by photostat or...
Computer graphics and the graphic artist
NASA Technical Reports Server (NTRS)
Taylor, N. L.; Fedors, E. G.; Pinelli, T. E.
1985-01-01
A centralized computer graphics system is being developed at the NASA Langley Research Center. This system was required to satisfy multiuser needs, ranging from presentation quality graphics prepared by a graphic artist to 16-mm movie simulations generated by engineers and scientists. While the major thrust of the central graphics system was directed toward engineering and scientific applications, hardware and software capabilities to support the graphic artists were integrated into the design. This paper briefly discusses the importance of computer graphics in research; the central graphics system in terms of systems, software, and hardware requirements; the application of computer graphics to graphic arts, discussed in terms of the requirements for a graphic arts workstation; and the problems encountered in applying computer graphics to the graphic arts. The paper concludes by presenting the status of the central graphics system.
Data from clinical notes: a perspective on the tension between structure and flexible documentation
Denny, Joshua C; Xu, Hua; Lorenzi, Nancy; Stead, William W; Johnson, Kevin B
2011-01-01
Clinical documentation is central to patient care. The success of electronic health record system adoption may depend on how well such systems support clinical documentation. A major goal of integrating clinical documentation into electronic health record systems is to generate reusable data. As a result, there has been an emphasis on deploying computer-based documentation systems that prioritize direct structured documentation. Research has demonstrated that healthcare providers value different factors when writing clinical notes, such as narrative expressivity, amenability to the existing workflow, and usability. The authors explore the tension between expressivity and structured clinical documentation, review methods for obtaining reusable data from clinical notes, and recommend that healthcare providers be able to choose how to document patient care based on workflow and note content needs. When reusable data are needed from notes, providers can use structured documentation or rely on post-hoc text processing to produce structured data, as appropriate. PMID:21233086
1968-01-01
The surface-water records for the 1967 water year for gaging stations, partial-record stations, and miscellaneous sites within California are given in this report. For convenience, also included are records for a few pertinent gaging stations in bordering States. The records were collected and computed by the Water Resources Division of the U.S. Geological Survey, under the direction of R. Stanley Lord, district chief, Menlo Park, Calif.
1969-01-01
The surface-water records for the 1968 water year for gaging stations, partial-record stations, and miscellaneous sites within California are given in this report. For convenience, also included are records for a few pertinent gaging stations in bordering States. The records were collected and computed by the Water Resources Division of the U.S. Geological Survey, under the direction of R. Stanley Lord, district chief, Menlo Park, Calif.
1967-01-01
The surface-water records for the 1966 water year for gaging stations, partial-record stations, and miscellaneous sites within California are given in this report. For convenience, also included are records for a few pertinent gaging stations in bordering States. The records were collected and computed by the Water Resources Division of the U.S. Geological Survey, under the direction of Walter Hofmann and R. Stanley Lord, successive district chiefs, Menlo Park, Calif.
1965-01-01
The surface-water records for the 1964 water year for gaging stations, partial-record stations, and miscellaneous sites within the State of California are given in this report. For convenience there are also included records for a few pertinent gaging stations in bordering States. The records were collected and computed by the Water Resources Division of the U.S. Geological Survey, under the direction of Walter Hofmann, district engineer, Surface Water Branch.
A perioperative echocardiographic reporting and recording system.
Pybus, David A
2004-11-01
Advances in video capture, compression, and streaming technology, coupled with improvements in central processing unit design and the inclusion of a database engine in the Windows operating system, have simplified the task of implementing a digital echocardiographic recording system. I describe an application that uses these technologies and runs on a notebook computer.
Integrating Micro-computers with a Centralized DBMS: ORACLE, SEED AND INGRES
NASA Technical Reports Server (NTRS)
Hoerger, J.
1984-01-01
Users of ADABAS, a relational-like data base management system, and its data base programming language (NATURAL) are acquiring microcomputers with hopes of solving their individual word processing, office automation, decision support, and simple data processing problems. As processor speeds, memory sizes, and disk storage capacities increase, individual departments begin to maintain "their own" data base on "their own" micro-computer. This situation can adversely affect several of the primary goals set for implementing a centralized DBMS. In order to avoid this potential problem, these micro-computers must be integrated with the centralized DBMS. An easy-to-use and flexible means for transferring logical data base files between the central data base machine and micro-computers must be provided. Some of the problems encountered in an effort to accomplish this integration, and possible solutions, are discussed.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-02
... Standards and Technology's (NIST) Computer Security Division maintains a Computer Security Resource Center... Regarding Driver History Record Information Security, Continuity of Operation Planning, and Disaster... (SDLAs) to support their efforts at maintaining the security of information contained in the driver...
The color-vision approach to emotional space: cortical evoked potential data.
Boucsein, W; Schaefer, F; Sokolov, E N; Schröder, C; Furedy, J J
2001-01-01
A framework for accounting for emotional phenomena proposed by Sokolov and Boucsein (2000) employs conceptual dimensions that parallel those of hue, brightness, and saturation in color vision. The approach, which employs the concepts of emotional quality, intensity, and saturation, has been supported by psychophysical emotional scaling data gathered from a few trained observers. We report cortical evoked potential data obtained during the change between different emotions expressed in schematic faces. Twenty-five subjects (13 male, 12 female) were presented with a positive, a negative, and a neutral computer-generated face with random interstimulus intervals in a within-subjects design, together with four meaningful and four meaningless control stimuli made up from the same elements. Frontal, central, parietal, and temporal ERPs were recorded from each hemisphere. Statistically significant outcomes in the P300 and N200 range support the potential fruitfulness of the proposed color-vision-model-based approach to human emotional space.
Computer systems for automatic earthquake detection
Stewart, S.W.
1974-01-01
U.S. Geological Survey seismologists in Menlo Park, California, are utilizing the speed, reliability, and efficiency of minicomputers to monitor seismograph stations and to automatically detect earthquakes. An earthquake detection computer system, believed to be the only one of its kind in operation, automatically reports about 90 percent of all local earthquakes recorded by a network of over 100 central California seismograph stations. The system also monitors the stations for signs of malfunction or abnormal operation. Before the automatic system was put in operation, all of the earthquakes recorded had to be detected by manually searching the records, a time-consuming process. With the automatic detection system, the stations are monitored continuously and efficiently.
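Automatic detectors of this era commonly used a short-term-average/long-term-average (STA/LTA) trigger. The sketch below illustrates that classic technique as an assumption for context; it is not necessarily the exact algorithm of the USGS system described above.

```python
def sta_lta_trigger(samples, sta_n=5, lta_n=20, threshold=3.0):
    """Return indices where the short-term average amplitude exceeds
    `threshold` times the long-term average (classic STA/LTA detector)."""
    triggers = []
    for i in range(lta_n, len(samples)):
        sta = sum(abs(s) for s in samples[i - sta_n:i]) / sta_n
        lta = sum(abs(s) for s in samples[i - lta_n:i]) / lta_n
        if lta > 0 and sta / lta >= threshold:
            triggers.append(i)
    return triggers

# Quiet background, then a sudden high-amplitude arrival at sample 30.
trace = [1.0] * 30 + [10.0] * 10
print(sta_lta_trigger(trace))  # flags the ratio peak shortly after onset
```

The short window reacts quickly to an arrival while the long window tracks background noise, so the ratio spikes only at genuine amplitude onsets rather than at slow drifts.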
De Rosis, Sabina; Seghieri, Chiara
2015-08-22
There is general consensus that appropriate development and use of information and communication technologies (ICT) are crucial to the delivery of effective primary care (PC). Several countries are defining policies to support and promote structural change of the health care system through the introduction of ICT. This study analyses the state of development of basic ICT in the PC systems of 31 European countries, with the aim of describing the extent of, and main purposes for, computer use by General Practitioners (GPs) across Europe. Additionally, trends over time were analysed. Descriptive statistical analysis was performed on data from the QUALICOPC (Quality and Costs of Primary Care in Europe) survey to describe geographic differences in the general use of computers and in specific computerized clinical functions for different health-related purposes, such as prescribing, medication checking, generating health records and searching for medical information on the Internet. While all the countries have achieved near-universal adoption of computers in their primary care practices, with only a few countries near or under the 90 % threshold, the computerisation of primary care clinical functions shows wide variability of adoption within and among countries and, in several cases (such as in southern and central-eastern Europe), considerable room for improvement. At the European level, more could be done to support southern and central-eastern Europe in closing the gap in the adoption and use of ICT in PC. In particular, more attention seems to be needed regarding current uses of the computer in PC, focusing policies and actions on improving the appropriate uses that can affect the quality and costs of PC and can facilitate an interconnected health care system. However, policies and investments seem necessary but not sufficient to achieve these goals. Organizational, behavioural and also networking aspects should be taken into consideration.
Computer model of Raritan River Basin water-supply system in central New Jersey
Dunne, Paul; Tasker, Gary D.
1996-01-01
This report describes a computer model of the Raritan River Basin water-supply system in central New Jersey. The computer model provides a technical basis for evaluating the effects of alternative patterns of operation of the Raritan River Basin water-supply system during extended periods of below-average precipitation. The computer model is a continuity-accounting model consisting of a series of interconnected nodes. At each node, the inflow volume, outflow volume, and change in storage are determined and recorded for each month. The model runs with a given set of operating rules and water-use requirements including releases, pumpages, and diversions. The model can be used to assess the hypothetical performance of the Raritan River Basin water-supply system in past years under alternative sets of operating rules. It also can be used to forecast the likelihood of specified outcomes, such as the depletion of reservoir contents below a specified threshold or of streamflows below statutory minimum passing flows, for a period of up to 12 months. The model was constructed on the basis of current reservoir capacities and the natural, unregulated monthly runoff values recorded at U.S. Geological Survey streamflow-gaging stations in the basin.
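The continuity accounting described above can be sketched as a single node update; the variable names, units, and volumes here are illustrative assumptions, not the report's actual model terms.

```python
# Illustrative sketch of one continuity-accounting node; names and
# units are assumptions, not the USGS report's actual variables.

def step_node(storage, capacity, inflow, release, diversion):
    """Advance one node by one month; return (new_storage, spill).

    Continuity: change in storage = inflow - outflows, bounded above
    by reservoir capacity (excess leaves as spill) and below by zero.
    """
    storage += inflow - release - diversion
    spill = max(0.0, storage - capacity)
    storage = min(max(storage, 0.0), capacity)
    return storage, spill

# One month at a hypothetical node, volumes in arbitrary units.
s, spill = step_node(storage=500.0, capacity=600.0,
                     inflow=250.0, release=80.0, diversion=40.0)
```

Chaining such nodes, with one node's release and spill feeding the next node's inflow, gives the interconnected-network structure the abstract describes.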
Central Data Processing System (CDPS) user's manual: Solar heating and cooling program
NASA Technical Reports Server (NTRS)
1976-01-01
The software and data base management system required to assess the performance of solar heating and cooling systems installed at multiple sites are presented. The instrumentation data associated with these systems were collected, processed, and presented in a form that supported continuity of performance evaluation across all applications. The CDPS consisted of three major elements: a communication interface computer, a central data processing computer, and a performance evaluation data base. Users of the performance data base were identified, and procedures for operation and guidelines for software maintenance were outlined. The manual also defined the output capabilities of the CDPS in support of external users of the system.
Federated learning of predictive models from federated Electronic Health Records.
Brisimi, Theodora S; Chen, Ruidi; Mela, Theofanie; Olshevsky, Alex; Paschalidis, Ioannis Ch; Shi, Wei
2018-04-01
In an era of "big data," computationally efficient and privacy-aware solutions for large-scale machine learning problems become crucial, especially in the healthcare domain, where large amounts of data are stored in different locations and owned by different entities. Past research has focused on centralized algorithms, which assume the existence of a central data repository (database) that stores and can process the data from all participants. Such an architecture, however, can be impractical when data are not centrally located; it does not scale well to very large datasets, and it introduces single-point-of-failure risks which could compromise the integrity and privacy of the data. Given data widely spread across hospitals/individuals, a decentralized, computationally scalable methodology is very much in need. We aim at solving a binary supervised classification problem to predict hospitalizations for cardiac events using a distributed algorithm. We seek to develop a general decentralized optimization framework enabling multiple data holders to collaborate and converge to a common predictive model, without explicitly exchanging raw data. We focus on the soft-margin l1-regularized sparse Support Vector Machine (sSVM) classifier. We develop an iterative cluster Primal Dual Splitting (cPDS) algorithm for solving the large-scale sSVM problem in a decentralized fashion. Such a distributed learning scheme is relevant for multi-institutional collaborations or peer-to-peer applications, allowing the data holders to collaborate while keeping every participant's data private. We test cPDS on the problem of predicting hospitalizations due to heart diseases within a calendar year based on information in the patients' Electronic Health Records prior to that year. cPDS converges faster than centralized methods at the cost of some communication between agents. It also converges faster and with less communication overhead compared to an alternative distributed algorithm. In both cases, it achieves similar prediction accuracy measured by the Area Under the Receiver Operating Characteristic Curve (AUC) of the classifier. We extract important features discovered by the algorithm that are predictive of future hospitalizations, thus providing a way to interpret the classification results and inform prevention efforts.
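cPDS itself is not reproduced here; the sketch below substitutes a much simpler consensus-subgradient scheme to illustrate the core idea of the abstract: each data holder trains an l1-regularized soft-margin SVM on its own records and shares only model parameters, never raw data. The data, topology (a fully connected graph), and step sizes are all hypothetical.

```python
# Simplified stand-in for decentralized SVM training: each "hospital"
# keeps its raw data local, takes a hinge-loss subgradient step on the
# l1-regularized objective, and averages model parameters with its
# peers. This is NOT the paper's cPDS algorithm, only a minimal
# consensus-subgradient sketch with invented data.

agents = [  # (features, label) pairs held privately by each agent
    [((2.0, 2.0), 1), ((3.0, 1.0), 1)],
    [((-2.0, -2.0), -1), ((-1.0, -3.0), -1)],
    [((2.5, 2.5), 1), ((-2.5, -1.5), -1)],
]
models = [[0.0, 0.0, 0.0] for _ in agents]  # w1, w2, bias per agent
eta, lam = 0.05, 0.01                        # step size, l1 weight

for _ in range(300):
    # Consensus step: every agent averages with all peers (full graph).
    avg = [sum(m[j] for m in models) / len(models) for j in range(3)]
    new_models = []
    for data in agents:
        w1, w2, b = avg
        g1 = g2 = gb = 0.0
        for (x1, x2), y in data:  # hinge subgradient on local data only
            if y * (w1 * x1 + w2 * x2 + b) < 1:
                g1 -= y * x1
                g2 -= y * x2
                gb -= y
        g1 = g1 / len(data) + lam * (1 if w1 > 0 else -1 if w1 < 0 else 0)
        g2 = g2 / len(data) + lam * (1 if w2 > 0 else -1 if w2 < 0 else 0)
        new_models.append([w1 - eta * g1, w2 - eta * g2,
                           b - eta * gb / len(data)])
    models = new_models

w1, w2, b = models[0]
correct = sum(
    1 for data in agents for (x1, x2), y in data
    if (1 if w1 * x1 + w2 * x2 + b > 0 else -1) == y
)
```

After the consensus and local steps alternate for a while, every agent holds nearly the same separating model even though no raw record ever left its owner.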
ERIC Educational Resources Information Center
Gonczi, Amanda L.; Chiu, Jennifer L.; Maeng, Jennifer L.; Bell, Randy L.
2016-01-01
This investigation sought to identify patterns in elementary science teachers' computer simulation use, particularly implementation structures and instructional supports commonly employed by teachers. Data included video-recorded science lessons of 96 elementary teachers who used computer simulations in one or more science lessons. Results…
Floods of May 1981 in west-central Montana
Parrett, Charles; Omang, R.J.; Hull, J.A.; Fassler, John W.
1982-01-01
Extensive flooding occurred in west-central Montana during May 22-23, 1981, as a result of a series of rainstorms. Flooding was particularly severe in the communities of East Helena, Belt, and Deer Lodge. Although no lives were lost, total flood damages were estimated by the Montana Disaster Emergency Services Division to be in excess of $30 million. Peak discharges were determined at 75 sites in the flooded area. At 25 sites the May 1981 peak discharge exceeded the computed 100-year frequency flood, and at 29 sites, where previous flow records are available, the May 1981 peak discharge exceeded the previous peak of record. (USGS)
Automated technical validation--a real time expert system for decision support.
de Graeve, J S; Cambus, J P; Gruson, A; Valdiguié, P M
1996-04-15
Dealing daily with various machines and various control specimens generates more data than can be processed manually. To support decision-making, we wrote specific software that copes with traditional QC, with patient data (mean of normals, delta check) and with criteria related to the analytical equipment (flags and alarms). Four machines (3 Ektachem 700 and 1 Hitachi 911) analysing 25 common chemical tests are controlled. Every day, three different control specimens, plus a fourth once a week (regional survey), are run on the various pieces of equipment. The data are collected on a 486 microcomputer connected to the central computer. For every parameter the standard deviation is compared with published acceptable limits and the Westgard rules are evaluated. The mean of normals is continuously monitored. The final decision triggers either an alarm sound and a print-out of the cause of rejection or, if no alarm occurs, the daily print-out of recorded data, with or without Levey-Jennings graphs.
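As an illustration of the kind of rule checking such software performs, here is a minimal sketch of two common Westgard rules (1-3s and 2-2s); the actual program applies a fuller rule set, and the control values, mean, and SD below are invented.

```python
# Sketch of two common Westgard QC rules; the paper's software applies
# the full multirule set, which is not detailed in the abstract.

def westgard_flags(values, mean, sd):
    """Return (index, rule) violations for a run of control values."""
    z = [(v - mean) / sd for v in values]
    flags = []
    for i, zi in enumerate(z):
        if abs(zi) > 3:
            flags.append((i, "1-3s"))      # one control beyond 3 SD
        if i > 0 and abs(zi) > 2 and abs(z[i - 1]) > 2 and zi * z[i - 1] > 0:
            flags.append((i, "2-2s"))      # two in a row beyond 2 SD, same side
    return flags

# Hypothetical control run against an assumed target mean 100, SD 2.
flags = westgard_flags([100.5, 99.0, 104.5, 104.8, 107.0],
                       mean=100.0, sd=2.0)
```

The drifting tail of the run trips the 2-2s rule first and then the 1-3s rule, which is the pattern a slowly developing systematic error typically produces.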
ERIC Educational Resources Information Center
Grajek, Susan
2014-01-01
For three days in January 2014, more than a hundred thought leaders met in Tempe, Arizona, to discuss the present and future challenges and opportunities for IT's support of research. Recommendations to improve institutions' support of scientific and humanities research included shaping central IT's role as an aggregator; ensuring that central IT…
Pikwer, Andreas; Acosta, Stefan; Kölbel, Tilo; Åkeson, Jonas
2010-01-01
This study was designed to assess endovascular intervention for central venous cannulation in patients with vascular occlusion after previous catheterization. Patients referred for endovascular management of central venous occlusion during a 42-month period were identified from a regional endovascular database, providing prospective information on techniques and clinical outcome. Corresponding patient records, angiograms, and radiographic reports were analyzed retrospectively. Sixteen patients aged 48 years (range 0.5-76), 11 of them female, were studied. All patients but 1 had had multiple central venous catheters with a median total indwelling time of 37 months. Eleven patients cannulated for hemodialysis had had significantly fewer individual catheters inserted compared with 5 patients cannulated for nutritional support (mean 3.6 vs. 10.2, p<0.001) before endovascular intervention. Preoperative imaging by magnetic resonance tomography (MRT) in 8 patients, computed tomography (CT) venography in 3, conventional angiography in 6, and/or ultrasonography in 8, verified 15 brachiocephalic, 13 internal jugular, 3 superior caval, and/or 3 subclavian venous occlusions. Patients were subjected to recanalization (n=2), recanalization and percutaneous transluminal angioplasty (n=5), or stenting for vena cava superior syndrome (n=1) prior to catheter insertion. The remaining 8 patients were cannulated by avoiding the occluded route. Central venous occlusion occurs particularly in patients under hemodialysis and with a history of multiple central venous catheterizations with large-diameter catheters and/or long total indwelling time periods. Patients with central venous occlusion verified by CT or MRT venography and need for central venous access should be referred for endovascular intervention.
Reconciliation of the cloud computing model with US federal electronic health record regulations.
Schweitzer, Eugene J
2011-01-01
Cloud computing refers to subscription-based, fee-for-service utilization of computer hardware and software over the Internet. The model is gaining acceptance for business information technology (IT) applications because it allows capacity and functionality to increase on the fly without major investment in infrastructure, personnel or licensing fees. Large IT investments can be converted to a series of smaller operating expenses. Cloud architectures could potentially be superior to traditional electronic health record (EHR) designs in terms of economy, efficiency and utility. A central issue for EHR developers in the US is that these systems are constrained by federal regulatory legislation and oversight. These laws focus on security and privacy, which are well-recognized challenges for cloud computing systems in general. EHRs built with the cloud computing model can achieve acceptable privacy and security through business associate contracts with cloud providers that specify compliance requirements, performance metrics and liability sharing. PMID:21727204
A direct-to-drive neural data acquisition system.
Kinney, Justin P; Bernstein, Jacob G; Meyer, Andrew J; Barber, Jessica B; Bolivar, Marti; Newbold, Bryan; Scholvin, Jorg; Moore-Kochlacs, Caroline; Wentz, Christian T; Kopell, Nancy J; Boyden, Edward S
2015-01-01
Driven by the increasing channel count of neural probes, there is much effort being directed to creating increasingly scalable electrophysiology data acquisition (DAQ) systems. However, all such systems still rely on personal computers for data storage, and thus are limited by the bandwidth and cost of the computers, especially as the scale of recording increases. Here we present a novel architecture in which a digital processor receives data from an analog-to-digital converter, and writes that data directly to hard drives, without the need for a personal computer to serve as an intermediary in the DAQ process. This minimalist architecture may support exceptionally high data throughput, without incurring costs to support unnecessary hardware and overhead associated with personal computers, thus facilitating scaling of electrophysiological recording in the future. PMID:26388740
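The paper's hardware pipeline cannot be reproduced in a few lines, but the direct-to-disk idea can be illustrated with a toy frame recorder that appends fixed-size sample frames straight to a file without an intermediate in-memory store; the frame layout below is a hypothetical stand-in, not the system's actual format.

```python
import os
import struct
import tempfile

# Toy illustration of streaming fixed-size sample frames directly to
# disk; the (index, 16-bit reading) layout is invented for this sketch.
FRAME = struct.Struct("<IH")

def record_frames(path, readings):
    """Append one packed frame per ADC reading, unbuffered."""
    with open(path, "ab", buffering=0) as f:
        for i, r in enumerate(readings):
            f.write(FRAME.pack(i, r))

def read_frames(path):
    """Read the file back into (index, reading) tuples."""
    out = []
    with open(path, "rb") as f:
        while chunk := f.read(FRAME.size):
            out.append(FRAME.unpack(chunk))
    return out

path = os.path.join(tempfile.mkdtemp(), "daq.bin")
record_frames(path, [512, 513, 1023])
frames = read_frames(path)
```

Because every frame has a fixed size, the writer needs no index structure and readback is a simple sequential scan, which is the property that makes direct-to-drive streaming cheap.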
Yoder, J W; Schultz, D F; Williams, B T
1998-10-01
The solution to many of the problems of the computer-based recording of the medical record has been elusive, largely due to difficulties in the capture of those data elements that comprise the records of the Present Illness and of the Physical Findings. Reliable input of data has proven to be more complex than originally envisioned by early work in the field. This has led to more research and development into better data collection protocols and easy-to-use human-computer interfaces as support tools. The Medical Examination Direct Iconic and Graphic Augmented Text Entry System (MEDIGATE System) is a computer-enhanced interactive graphic and textual record of the findings from physical examinations, designed to provide ease of user input and to support organization and processing of the data characterizing these findings. The primary design objective of the MEDIGATE System is to develop and evaluate different interface designs for recording observations from the physical examination in an attempt to overcome some of the deficiencies in this major component of the individual record of health and illness.
12 CFR 790.2 - Central and regional office organization.
Code of Federal Regulations, 2010 CFR
2010-01-01
... development, maintenance, operation, and support of information systems which directly support the Agency's... operating fees from federal credit unions, for maintaining NCUA's accounting system and accounting records... for the information and use of agency staff, credit union officials, state credit union supervisory...
Great Expectations: Distributed Financial Computing at Cornell.
ERIC Educational Resources Information Center
Schulden, Louise; Sidle, Clint
1988-01-01
The Cornell University Distributed Accounting (CUDA) system is an attempt to provide departments a software tool for better managing their finances, creating microcomputer standards, creating a vehicle for better administrative microcomputer support, and ensuring local systems are consistent with central computer systems. (Author/MLW)
Code of Federal Regulations, 2014 CFR
2014-10-01
.... Computer software does not include computer data bases or computer software documentation. Litigation... includes technical data and computer software, but does not include information that is lawfully, publicly available without restriction. Technical data means recorded information, regardless of the form or method...
Computerized Fleet Maintenance.
ERIC Educational Resources Information Center
Cataldo, John J.
The computerization of school bus maintenance records by the Niskayuna (New York) Central School District enabled the district's transportation department to engage in management practices resulting in significant savings. The district obtains computer analyses of the work performed on all vehicles, including time spent, parts, labor, costs,…
Orozco, Allan; Morera, Jessica; Jiménez, Sergio; Boza, Ricardo
2013-09-01
Today, Bioinformatics has become a scientific discipline of great relevance to the Molecular Biosciences and the Omics sciences in general. Although developed countries have made large strides in Bioinformatics education and research, in other regions, such as Central America, advances have occurred gradually and with little support from Academia, at either the undergraduate or graduate level. To address this problem, the University of Costa Rica's Medical School, a regional leader in Bioinformatics in Central America, has been conducting a series of Bioinformatics workshops, seminars and courses, leading to the creation of the region's first Bioinformatics Master's Degree. The recent creation of the Central American Bioinformatics Network (BioCANET), together with the deployment of a supporting computational infrastructure (HPC cluster) devoted to providing computing support for Molecular Biology in the region, is providing a foundation stone for the development of Bioinformatics in the area. Central American bioinformaticians also co-founded the Iberoamerican Bioinformatics Society (SOIBIO). In this article, we review the most recent activities in Bioinformatics education and research at several regional institutions. These activities have resulted in further advances in Molecular Medicine, Agriculture and Biodiversity research in Costa Rica and the rest of the Central American countries. Finally, we provide summary information on the first Central American Bioinformatics International Congress, as well as the creation of the first Bioinformatics company (Indromics Bioinformatics) to spin off from Academia in Central America and the Caribbean.
Protocols for Handling Messages Between Simulation Computers
NASA Technical Reports Server (NTRS)
Balcerowski, John P.; Dunnam, Milton
2006-01-01
Practical Simulator Network (PSimNet) is a set of data-communication protocols designed especially for use in handling messages between computers that are engaging cooperatively in real-time or nearly-real-time training simulations. In a typical application, computers that provide individualized training at widely dispersed locations would communicate, by use of PSimNet, with a central host computer that would provide a common computational- simulation environment and common data. Originally intended for use in supporting interfaces between training computers and computers that simulate the responses of spacecraft scientific payloads, PSimNet could be especially well suited for a variety of other applications -- for example, group automobile-driver training in a classroom. Another potential application might lie in networking of automobile-diagnostic computers at repair facilities to a central computer that would compile the expertise of numerous technicians and engineers and act as an expert consulting technician.
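PSimNet's wire format is not described in the abstract; a generic length-prefixed framing scheme of the kind such simulation protocols typically use can be sketched as follows, with a hypothetical one-byte message-type tag.

```python
import struct

# Generic length-prefixed message framing; the layout (4-byte big-endian
# length, 1-byte type tag, payload) is an assumption for illustration,
# not PSimNet's actual wire format.

def frame(msg_type, payload):
    """Prefix each message with its length and a one-byte type tag."""
    body = bytes([msg_type]) + payload
    return struct.pack(">I", len(body)) + body

def unframe(stream):
    """Split a byte stream back into (type, payload) messages."""
    msgs, off = [], 0
    while off < len(stream):
        (length,) = struct.unpack_from(">I", stream, off)
        body = stream[off + 4: off + 4 + length]
        msgs.append((body[0], body[1:]))
        off += 4 + length
    return msgs

wire = frame(1, b"state-update") + frame(2, b"ack")
msgs = unframe(wire)
```

Length prefixes let a central host demultiplex many trainees' messages from one TCP stream without ambiguity, which is the basic requirement for the host-and-spokes topology the abstract describes.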
Koh, Jansen; Cheung, Jeffrey J H; Mackinnon, Kim; Brett, Clare; Kapralos, Bill; Dubrowski, Adam
2013-01-01
There is a lack of evidence for the use of Web-based Learning (WBL) and Computer Supported Collaborative Learning (CSCL) for acquiring psychomotor skills in medical education. In this study, we surveyed medical undergraduate students attending a simulation-based training session on central line insertion about their perspectives on, and use of, WBL and CSCL for acquisition of a complex psychomotor skill.
Choosing a Computer Language for Institutional Research. The AIR Professional File No. 6.
ERIC Educational Resources Information Center
Strenglein, Denise
1980-01-01
It is suggested that much thought should be given to choosing an appropriate computer language for an institutional research office, considering the sophistication of the staff, types of planned application, size and type of computer, and availability of central programming support in the institution. For offices that prepare straight reports and…
DOT National Transportation Integrated Search
1976-08-01
This report contains a functional design for the simulation of a future automation concept in support of the ATC Systems Command Center. The simulation subsystem performs airport airborne arrival delay predictions and computes flow control tables for...
DOT National Transportation Integrated Search
1988-10-01
An analysis of the current environment within the Acquisition stage of the Weapon System Life Cycle Pertaining to the Logistics Support Analysis (LSA) process, the Logistics Support Analysis Record (LSAR), and other Logistics Support data was underta...
ERA 1103 UNIVAC 2 Calculating Machine
1955-09-21
The new 10-by 10-Foot Supersonic Wind Tunnel at the Lewis Flight Propulsion Laboratory included high tech data acquisition and analysis systems. The reliable gathering of pressure, speed, temperature, and other data from test runs in the facilities was critical to the research process. Throughout the 1940s and early 1950s female employees, known as computers, recorded all test data and performed initial calculations by hand. The introduction of punch card computers in the late 1940s gradually reduced the number of hands-on calculations. In the mid-1950s new computational machines were installed in the office building of the 10-by 10-Foot tunnel. The new systems included this UNIVAC 1103 vacuum tube computer, the lab's first centralized computer system. The programming was done on paper tape and fed into the machine. The 10-by 10 computer center also included the Lewis-designed Central Automatic Digital Data Encoder (CADDE) and Digital Automated Multiple Pressure Recorder (DAMPR) systems, which converted test data to binary-coded decimal numbers and recorded test pressures automatically, respectively. The systems primarily served the 10-by 10, but were also applied to the other large facilities. Engineering Research Associates (ERA) developed the initial UNIVAC computer for the Navy in the late 1940s. In 1952 the company designed a commercial version, the UNIVAC 1103. The 1103 was the first computer designed by Seymour Cray and the first commercially successful computer.
Authentication of digital video evidence
NASA Astrophysics Data System (ADS)
Beser, Nicholas D.; Duerr, Thomas E.; Staisiunas, Gregory P.
2003-11-01
In response to a requirement from the United States Postal Inspection Service, the Technical Support Working Group tasked The Johns Hopkins University Applied Physics Laboratory (JHU/APL) to develop a technique that will ensure the authenticity, or integrity, of digital video (DV). Verifiable integrity is needed if DV evidence is to withstand a challenge to its admissibility in court on the grounds that it can be easily edited. Specifically, the verification technique must detect additions, deletions, or modifications to DV and satisfy the two-part criteria pertaining to scientific evidence as articulated in Daubert et al. v. Merrell Dow Pharmaceuticals Inc., 43 F3d (9th Circuit, 1995). JHU/APL has developed a prototype digital video authenticator (DVA) that generates digital signatures based on public key cryptography at the frame level of the DV. Signature generation and recording is accomplished at the same time as DV is recorded by the camcorder. Throughput supports the consumer-grade camcorder data rate of 25 Mbps. The DVA software is implemented on a commercial laptop computer, which is connected to a commercial digital camcorder via the IEEE-1394 serial interface. A security token provides agent identification and the interface to the public key infrastructure (PKI) that is needed for management of the public keys central to DV integrity verification.
Data Recording Room in the 10-by 10-Foot Supersonic Wind Tunnel
1973-04-21
The test data recording equipment located in the office building of the 10-by 10-Foot Supersonic Wind Tunnel at the NASA Lewis Research Center. The data system was the state of the art when the facility began operating in 1955 and was upgraded over time. NASA engineers used solenoid valves to measure pressures from different locations within the test section. Up to 48 measurements could be fed into a single transducer. The 10-by 10 data recorders could handle up to 200 data channels at once. The Central Automatic Digital Data Encoder (CADDE) converted this direct current raw data from the test section into digital format on magnetic tape. The digital information was sent to the Lewis Central Computer Facility for additional processing. It could also be displayed in the control room via strip charts or oscillographs. The 16-by 56-foot long ERA 1103 UNIVAC mainframe computer processed most of the digital data. The paper tape with the raw data was fed into the ERA 1103 which performed the needed calculations. The information was then sent back to the control room. There was a lag of several minutes before the computed information was available, but it was exponentially faster than the hand calculations performed by the female computers. The 10- by 10-foot tunnel, which had its official opening in May 1956, was built under the Congressional Unitary Plan Act which coordinated wind tunnel construction at the NACA, Air Force, industry, and universities. The 10- by 10 was the largest of the three NACA tunnels built under the act.
Blind source computer device identification from recorded VoIP calls for forensic investigation.
Jahanirad, Mehdi; Anuar, Nor Badrul; Wahab, Ainuddin Wahid Abdul
2017-03-01
The VoIP services provide fertile ground for criminal activity, thus identifying the transmitting computer devices from a recorded VoIP call may help the forensic investigator to reveal useful information. It also proves the authenticity of the call recording submitted to the court as evidence. This paper extended the previous study on the use of recorded VoIP calls for blind source computer device identification. Although initial results were promising, the theoretical reasoning for them had yet to be established. The study suggested computing the entropy of mel-frequency cepstrum coefficients (entropy-MFCC) from near-silent segments as an intrinsic feature set that captures the device response function due to the tolerances in the electronic components of individual computer devices. By applying the supervised learning techniques of naïve Bayesian, linear logistic regression, neural networks and support vector machines to the entropy-MFCC features, state-of-the-art identification accuracy of near 99.9% has been achieved on different sets of computer devices for both call recording and microphone recording scenarios. Furthermore, unsupervised learning techniques, including simple k-means, expectation-maximization and density-based spatial clustering of applications with noise (DBSCAN), provided promising results for the call recording dataset by assigning the majority of instances to their correct clusters.
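Computing MFCCs is beyond a short sketch, but given per-frame cepstral coefficients (assumed precomputed), the entropy part of the entropy-MFCC feature can be illustrated as the Shannon entropy of one coefficient's histogram; the bin count and data below are illustrative assumptions.

```python
import math

# Sketch of the entropy half of the entropy-MFCC feature: the Shannon
# entropy (in bits) of one cepstral coefficient's histogram across
# frames. The MFCC extraction itself is assumed done elsewhere.

def histogram_entropy(values, bins=8):
    """Shannon entropy (bits) of a histogram over the values' range."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0      # degenerate range -> one bin
    counts = [0] * bins
    for v in values:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

# A constant coefficient carries no information; a spread-out one does.
flat = histogram_entropy([0.5] * 64)
spread = histogram_entropy([i / 10 for i in range(64)])
```

The intuition behind the feature is that component tolerances leave a device-specific imprint on how these coefficients spread, which the entropy summarizes in a single number per coefficient.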
NASA Astrophysics Data System (ADS)
Liu, Margaret; Loo, Jerry; Ma, Kevin; Liu, Brent
2011-03-01
Multiple sclerosis (MS) is a debilitating autoimmune disease of the central nervous system that damages axonal pathways through inflammation and demyelination. In order to address the need for a centralized application to manage and study MS patients, the MS e-Folder, a web-based, disease-specific electronic medical record system, was developed. The e-Folder has a PHP and MySQL based graphical user interface (GUI) that can serve as both a tool for clinician decision support and a data mining tool for researchers. This web-based GUI gives the e-Folder a user-friendly interface that can be securely accessed through the Internet and requires minimal software installation on the client side. The e-Folder GUI displays and queries patient medical records, including demographic data, social history, past medical history, and past MS history. In addition, DICOM format imaging data and computer aided detection (CAD) results from a lesion load algorithm are also displayed. The GUI is dynamic and allows manipulation of the DICOM images, such as zoom, pan, and scrolling, and the ability to rotate 3D images. Given the complexity of clinical management and the need to bolster research in MS, the MS e-Folder system will improve patient care and provide MS researchers with a function-rich patient data hub.
Quality user support: Supporting quality users
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woolley, T.C.
1994-12-31
During the past decade, fundamental changes have occurred in technical computing in the oil industry. Technical computing systems have moved from local, fragmented quantity to global, integrated quality. The compute power available to the average geoscientist at his desktop has grown exponentially. Technical computing applications have increased in integration and complexity. At the same time, there has been a significant change in the work force due to the pressures of restructuring, and the increased focus on international opportunities. The profile of the user of technical computing resources has changed. Users are generally more mature, knowledgeable, and team oriented than their predecessors. In the 1990s, computer literacy is a requirement. This paper describes the steps taken by Oryx Energy Company to address the problems and opportunities created by the explosive growth in computing power and needs, coupled with the contraction of the business. A successful user support strategy will be described. Characteristics of the program include: (1) Client driven support; (2) Empowerment of highly skilled professionals to fill the support role; (3) Routine and ongoing modification to the support plan; (4) Utilization of the support assignment to create highly trained advocates on the line; (5) Integration of the support role to the reservoir management team. Results of the plan include a highly trained work force, stakeholder teams that include support personnel, and global support from a centralized support organization.
A Computational Model of Reasoning from the Clinical Literature
Rennels, Glenn D.
1986-01-01
This paper explores the premise that a formalized representation of empirical studies can play a central role in computer-based decision support. The specific motivations underlying this research include the following propositions: 1. Reasoning from experimental evidence contained in the clinical literature is central to the decisions physicians make in patient care. 2. A computational model, based upon a declarative representation for published reports of clinical studies, can drive a computer program that selectively tailors knowledge of the clinical literature as it is applied to a particular case. 3. The development of such a computational model is an important first step toward filling a void in computer-based decision support systems. Furthermore, the model may help us better understand the general principles of reasoning from experimental evidence both in medicine and other domains. Roundsman is a developmental computer system which draws upon structured representations of the clinical literature in order to critique plans for the management of primary breast cancer. Roundsman is able to produce patient-specific analyses of breast cancer management options based on the 24 clinical studies currently encoded in its knowledge base. The Roundsman system is a first step in exploring how the computer can help to bring a critical analysis of the relevant literature to the physician, structured around a particular patient and treatment decision.
Huang, Xiaoke; Zhao, Ye; Yang, Jing; Zhang, Chong; Ma, Chao; Ye, Xinyue
2016-01-01
We propose TrajGraph, a new visual analytics method, for studying urban mobility patterns by integrating graph modeling and visual analysis with taxi trajectory data. A special graph is created to store and manifest real traffic information recorded by taxi trajectories over city streets. It conveys urban transportation dynamics which can be discovered by applying graph analysis algorithms. To support interactive, multiscale visual analytics, a graph partitioning algorithm is applied to create region-level graphs which have smaller size than the original street-level graph. Graph centralities, including Pagerank and betweenness, are computed to characterize the time-varying importance of different urban regions. The centralities are visualized by three coordinated views including a node-link graph view, a map view and a temporal information view. Users can interactively examine the importance of streets to discover and assess city traffic patterns. We have implemented a fully working prototype of this approach and evaluated it using massive taxi trajectories of Shenzhen, China. TrajGraph's capability in revealing the importance of city streets was evaluated by comparing the calculated centralities with the subjective evaluations from a group of drivers in Shenzhen. Feedback from a domain expert was collected. The effectiveness of the visual interface was evaluated through a formal user study. We also present several examples and a case study to demonstrate the usefulness of TrajGraph in urban transportation analysis.
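The graph centralities mentioned above can be illustrated with a minimal PageRank computed by power iteration on a toy four-node graph. This is a generic sketch of the metric, not TrajGraph's street-level implementation (which also uses betweenness and graph partitioning).

```python
import numpy as np

def pagerank(adj, damping=0.85, tol=1e-10, max_iter=200):
    """PageRank by power iteration on a dense adjacency matrix.

    adj[i, j] = 1 if there is an edge i -> j. Nodes with no outgoing
    edges are treated as linking to every node uniformly.
    """
    n = adj.shape[0]
    out_deg = adj.sum(axis=1)
    # Build a column-stochastic transition matrix.
    M = np.where(out_deg[:, None] > 0,
                 adj / np.maximum(out_deg, 1)[:, None],
                 1.0 / n).T
    r = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        r_next = damping * M @ r + (1 - damping) / n
        if np.abs(r_next - r).sum() < tol:
            break
        r = r_next
    return r

# Toy "street" graph: node 0 receives links from all other nodes.
adj = np.array([[0, 1, 1, 0],
                [1, 0, 0, 0],
                [1, 0, 0, 1],
                [1, 0, 0, 0]], dtype=float)
ranks = pagerank(adj)
print(ranks.argmax())  # node 0 has the highest rank
```

In the paper's setting, nodes would be street segments (or partitioned regions) and high-rank nodes would flag important parts of the road network over time.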
Satisfaction with Life in Orofacial Pain Disorders: Associations and Theoretical Implications.
Boggero, Ian A; Rojas-Ramirez, Marcia V; de Leeuw, Reny; Carlson, Charles R
2016-01-01
To test if patients with masticatory myofascial pain, local myalgia, centrally mediated myalgia, disc displacement, capsulitis/synovitis, or continuous neuropathic pain differed in self-reported satisfaction with life. The study also tested if satisfaction with life was similarly predicted by measures of physical, emotional, and social functioning across disorders. Satisfaction with life, fatigue, affective distress, social support, and pain data were extracted from the medical records of 343 patients seeking treatment for chronic orofacial pain. Patients were grouped by primary diagnosis assigned following their initial appointment. Satisfaction with life was compared between disorders, with and without pain intensity entered as a covariate. Disorder-specific linear regression models using physical, emotional, and social predictors of satisfaction with life were computed. Patients with centrally mediated myalgia reported significantly lower satisfaction with life than did patients with any of the other five disorders. Inclusion of pain intensity as a covariate weakened but did not eliminate the effect. Satisfaction with life was predicted by measures of physical, emotional, and social functioning, but these associations were not consistent across disorders. Results suggest that reduced satisfaction with life in patients with centrally mediated myalgia is not due only to pain intensity. There may be other factors that predispose people to both reduced satisfaction with life and centrally mediated myalgia. Furthermore, the results suggest that satisfaction with life is differentially influenced by physical, emotional, and social functioning in different orofacial pain disorders.
29 CFR 97.42 - Retention and access requirements for records.
Code of Federal Regulations, 2010 CFR
2010-07-01
...: indirect cost rate computations or proposals, cost allocation plans, and any similar accounting... supporting records starts from end of the fiscal year (or other accounting period) covered by the proposal..., papers, or other records of grantees and subgrantees which are pertinent to the grant, in order to make...
49 CFR 18.42 - Retention and access requirements for records.
Code of Federal Regulations, 2010 CFR
2010-10-01
... similar accounting computations of the rate at which a particular group of costs is chargeable (such as... and its supporting records starts from the end of the fiscal year (or other accounting period) covered... pertinent books, documents, papers, or other records of grantees and subgrantees which are pertinent to the...
Maclean, Donald; Younes, Hakim Ben; Forrest, Margaret; Towers, Hazel K
2012-03-01
Accurate and timely clinical data are required for clinical and organisational purposes and are especially important for patient management, audit of surgical performance and the electronic health record. The recent introduction of computerised theatre management systems has enabled real-time (point-of-care) operative procedure coding by clinical staff. However, the accuracy of these data is unknown. The aim of this Scottish study was to compare the accuracy of theatre nurses' real-time coding on the local theatre management system with the central Scottish Morbidity Record (SMR01). Paired procedural codes were recorded, qualitatively graded for precision and compared (n = 1038). In this study, real-time, point-of-care coding by theatre nurses resulted in significant coding errors compared with the central SMR01 database. Improved collaboration between full-time coders and clinical staff using computerised decision support systems is suggested.
Bioinformatics and Astrophysics Cluster (BinAc)
NASA Astrophysics Data System (ADS)
Krüger, Jens; Lutz, Volker; Bartusch, Felix; Dilling, Werner; Gorska, Anna; Schäfer, Christoph; Walter, Thomas
2017-09-01
BinAC provides central high performance computing capacities for bioinformaticians and astrophysicists from the state of Baden-Württemberg. The bwForCluster BinAC is part of the implementation concept for scientific computing for the universities in Baden-Württemberg. Community specific support is offered through the bwHPC-C5 project.
NASA Astrophysics Data System (ADS)
Guimaraes, Cayley; Antunes, Diego R.; de F. Guilhermino Trindade, Daniela; da Silva, Rafaella A. Lopes; Garcia, Laura Sanchez
This work presents a computational model (XML) of the Brazilian Sign Language (Libras), based on its phonology. The model was used to create a sample of representative signs to aid the recording of a video base intended to support the development of tools for the genuine social inclusion of the deaf.
Improving Target Detection in Visual Search Through the Augmenting Multi-Sensory Cues
2013-01-01
target detection, visual search. James Merlo, Joseph E. Mercado, Jan B.F. Van Erp, Peter A. Hancock, University of Central Florida, 12201 Research Parkway...were controlled by a purpose-created, LabVIEW-based computer program that synchronised the respective displays and recorded response times and
System Description and Status Report: California Education Information System.
ERIC Educational Resources Information Center
California State Dept. of Education, Sacramento.
The California Education Information System (CEIS) consists of two subsystems of computer programs designed to process business and pupil data for local school districts. Creating and maintaining records concerning the students in the schools, the pupil subsystem provides for a central repository of school district identification information and a…
Adoption of a Nationwide Shared Medical Record in France: Lessons Learnt after 5 Years of Deployment
Séroussi, Brigitte; Bouaud, Jacques
2016-01-01
Information sharing among health practitioners, either for coordinated or unscheduled care, is necessary to guarantee care quality and patient safety. In most countries, nationwide programs have provided tools to support information sharing, from centralized care records to health information exchange between electronic health records (EHRs). The French personal medical record (DMP) is a centralized patient-controlled record, created according to the opt-in consent model. It contains the documents health practitioners voluntarily push into the DMP from their EHRs. Five years after the launching of the program in December 2010, there were nearly 570,000 DMPs covering only 1.5% of the target population in December 2015. Reasons for this poor level of adoption are discussed in the perspective of other countries’ initiatives. The new French governmental strategy for the DMP deployment in 2016 is outlined, with the implementation of measures similar to the US Meaningful Use. PMID:28269907
Shope, William G.; ,
1991-01-01
The U.S. Geological Survey is acquiring a new generation of field computers and communications software to support hydrologic data-collection at field locations. The new computer hardware and software mark the beginning of the Survey's transition from the use of electromechanical devices and paper tapes to electronic microprocessor-based instrumentation. Software is being developed for these microprocessors to facilitate the collection, conversion, and entry of data into the Survey's National Water Information System. The new automated data-collection process features several microprocessor-controlled sensors connected to a serial digital multidrop line operated by an electronic data recorder. Data are acquired from the sensors in response to instructions programmed into the data recorder by the user through small portable lap-top or hand-held computers. The portable computers, called personal field computers, also are used to extract data from the electronic recorders for transport by courier to the office computers. The Survey's alternative to manual or courier retrieval is the use of microprocessor-based remote telemetry stations. Plans have been developed to enhance the Survey's use of the Geostationary Operational Environmental Satellite telemetry by replacing the present network of direct-readout ground stations with less expensive units. Plans also provide for computer software that will support other forms of telemetry such as telephone or land-based radio.
Platform links clinical data with electronic health records
To make data gathered from patients in clinical trials available for use in standard care, NCI has created a new computer tool to support interoperability between clinical research and electronic health record systems. This new software represents an inno
Gunst, S; Del Chicca, F; Fürst, A E; Kuemmerle, J M
2016-09-01
There are no reports on the configuration of equine central tarsal bone fractures based on cross-sectional imaging, or on the clinical and radiographic long-term outcome after internal fixation. To report clinical, radiographic and computed tomographic findings of equine central tarsal bone fractures and to evaluate the long-term outcome of internal fixation. Retrospective case series. All horses diagnosed with a central tarsal bone fracture at our institution in 2009-2013 were included. Computed tomography and internal fixation using a lag screw technique were performed in all patients. Medical records and diagnostic images were reviewed retrospectively. A clinical and radiographic follow-up examination was performed at least 1 year post operatively. A central tarsal bone fracture was diagnosed in 6 horses. Five were Warmbloods used for showjumping and one was a Quarter Horse used for reining. All horses had sagittal slab fractures that began dorsally, ran in a plantar or plantaromedial direction and exited the plantar cortex at the plantar or plantaromedial indentation of the central tarsal bone. Marked sclerosis of the central tarsal bone was diagnosed in all patients. At long-term follow-up, 5/6 horses were sound and used as intended, although mild osteophyte formation at the distal intertarsal joint was commonly observed. Central tarsal bone fractures in nonracehorses had a distinct configuration, but radiographically subtle additional fracture lines can occur. A chronic stress-related aetiology seems likely. Internal fixation of these fractures based on an accurate diagnosis of the individual fracture configuration resulted in a very good prognosis. © 2015 EVJ Ltd.
41 CFR 105-71.142 - Retention and access requirements for records.
Code of Federal Regulations, 2010 CFR
2010-07-01
... similar accounting computations of the rates at which a particular group of costs is chargeable (such as... and its supporting records starts from end of the fiscal year (or other accounting period) covered by..., documents, papers, or other records of grantees and subgrantees which are pertinent to the grant, in order...
14 CFR 1274.601 - Retention and access requirements for records.
Code of Federal Regulations, 2010 CFR
2010-01-01
... access to any books, documents, papers, or other records of Recipients that are pertinent to the awards... or proposals, cost allocation plans, and any similar accounting computations of the rate at which a... supporting records starts at the end of the fiscal year (or other accounting period) covered by the proposal...
Central Satellite Data Repository Supporting Research and Development
NASA Astrophysics Data System (ADS)
Han, W.; Brust, J.
2015-12-01
Near real-time satellite data is critical to many research and development activities of atmosphere, land, and ocean processes. Acquiring and managing huge volumes of satellite data with little or no latency in an organization is always a challenge in the big data age. An organization-level data repository is a practical solution to this challenge. The STAR (Center for Satellite Applications and Research of NOAA) Central Data Repository (SCDR) is a scalable, stable, and reliable repository to acquire, manipulate, and disseminate various types of satellite data in an effective and efficient manner. SCDR collects more than 200 data products, which are commonly used by multiple groups in STAR, from NOAA, GOES, Metop, Suomi NPP, Sentinel, Himawari, and other satellites. The processes of acquisition, recording, retrieval, organization, and dissemination are performed in parallel. Multiple data access interfaces, including FTP, FTPS, HTTP, HTTPS, and RESTful web services, are supported in the SCDR to obtain satellite data from their providers through high-speed internet. The original satellite data in various raster formats can be parsed in the respective adapter to retrieve data information. The data information is ingested into the corresponding partitioned tables in the central database. All files are distributed equally on the Network File System (NFS) disks to balance the disk load. SCDR provides consistent interfaces (including a Perl utility, portal, and RESTful Web service) to locate files of interest easily and quickly and access them directly from over 200 compute servers via NFS. SCDR greatly improves collection and integration of near real-time satellite data, addresses satellite data requirements of scientists and researchers, and facilitates their primary research and development activities.
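The partitioned tables and load-balanced NFS layout described above can be illustrated with a small, entirely hypothetical path-mapping sketch. The root names, product name, and file-naming scheme below are invented for illustration and are not the SCDR's actual conventions.

```python
import zlib
from datetime import datetime

# Hypothetical NFS roots; the real SCDR layout is not documented here.
NFS_ROOTS = ["/nfs/scdr0", "/nfs/scdr1", "/nfs/scdr2"]

def partition_path(product: str, ts: datetime) -> str:
    """Map a product and observation time to a partitioned NFS path.

    A stable hash spreads products across NFS roots to balance disk
    load; within a root, files are partitioned by product and by day.
    """
    root = NFS_ROOTS[zlib.crc32(product.encode()) % len(NFS_ROOTS)]
    return f"{root}/{product}/{ts:%Y/%m/%d}/{product}_{ts:%Y%m%d%H%M}.nc"

p = partition_path("viirs_sst", datetime(2015, 7, 1, 12, 30))
print(p)
```

Because the mapping is deterministic, any of the compute servers can reconstruct a file's location from its metadata alone, without querying the central database.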
ERIC Educational Resources Information Center
Bayer, Marc Dewey
2008-01-01
Since 2004, Buffalo State College's E. H. Butler Library has used the Information Commons (IC) model to assist its 8,500 students with library research and computer applications. Campus Technology Services (CTS) plays a very active role in its IC, with a centrally located Computer Help Desk and a newly created Application Support Desk right in the…
The Mathematics and Computer Science Learning Center (MLC).
ERIC Educational Resources Information Center
Abraham, Solomon T.
The Mathematics and Computer Science Learning Center (MLC) was established in the Department of Mathematics at North Carolina Central University during the fall semester of the 1982-83 academic year. The initial operations of the MLC were supported by grants to the University from the Burroughs-Wellcome Company and the Kenan Charitable Trust Fund.…
1993-06-01
administering contractual support for lab-wide or multiple buys of ADP systems, software, and services. Computer systems located in the Central Computing Facility...
Federated Tensor Factorization for Computational Phenotyping
Kim, Yejin; Sun, Jimeng; Yu, Hwanjo; Jiang, Xiaoqian
2017-01-01
Tensor factorization models offer an effective approach to convert massive electronic health records into meaningful clinical concepts (phenotypes) for data analysis. These models need a large amount of diverse samples to avoid population bias. An open challenge is how to derive phenotypes jointly across multiple hospitals, in which direct patient-level data sharing is not possible (e.g., due to institutional policies). In this paper, we developed a novel solution to enable federated tensor factorization for computational phenotyping without sharing patient-level data. We developed secure data harmonization and federated computation procedures based on the alternating direction method of multipliers (ADMM). Using this method, multiple hospitals iteratively update tensors and transfer secure summarized information to a central server, and the server aggregates the information to generate phenotypes. We demonstrated with real medical datasets that our method resembles the centralized training model (based on combined datasets) in terms of accuracy and phenotype discovery while respecting privacy. PMID:29071165
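A stripped-down sketch of the federated idea: each hospital updates factors locally, and only summarized factor matrices travel to the server. For brevity this uses matrix (not tensor) factorization and a plain average in place of the paper's ADMM consensus update with dual variables; all data and dimensions are toy values.

```python
import numpy as np

rng = np.random.default_rng(1)
k = 4  # number of latent phenotypes

# Three "hospitals", each holding its own patients-by-features matrix.
sites = [np.abs(rng.normal(size=(30, 10))) for _ in range(3)]

V = np.abs(rng.normal(size=(10, k)))  # shared feature factor (consensus)
for _ in range(20):
    local_Vs = []
    for X in sites:
        # Local step: alternating least squares on this site's data;
        # the raw matrix X never leaves the hospital.
        U = X @ V @ np.linalg.pinv(V.T @ V)
        local_Vs.append(X.T @ U @ np.linalg.pinv(U.T @ U))
    # Server step: aggregate only the summarized factors (the paper
    # uses an ADMM consensus update here instead of a plain mean).
    V = np.mean(local_Vs, axis=0)

print(V.shape)  # the consensus phenotype-definition factor
```

The columns of the final consensus factor play the role of phenotype definitions shared across all sites, while each site keeps its patient-level factor private.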
Taylor, David; Valenza, John A; Spence, James M; Baber, Randolph H
2007-10-11
Simulation has been used for many years in dental education, but the educational context is typically a laboratory divorced from the clinical setting, which impairs the transfer of learning. Here we report on a true simulation clinic with multimedia communication from a central teaching station. Each of the 43 fully-functioning student operatories includes a thin-client networked computer with access to an Electronic Patient Record (EPR).
Virtual reality and planetary exploration
NASA Technical Reports Server (NTRS)
Mcgreevy, Michael W.
1992-01-01
Exploring planetary environments is central to NASA's missions and goals. A new computing technology called Virtual Reality has much to offer in support of planetary exploration. This technology augments and extends human presence within computer-generated and remote spatial environments. Historically, NASA has been a leader in many of the fundamental concepts and technologies that comprise Virtual Reality. Indeed, Ames Research Center has a central role in the development of this rapidly emerging approach to using computers. This ground breaking work has inspired researchers in academia, industry, and the military. Further, NASA's leadership in this technology has spun off new businesses, has caught the attention of the international business community, and has generated several years of positive international media coverage. In the future, Virtual Reality technology will enable greatly improved human-machine interactions for more productive planetary surface exploration. Perhaps more importantly, Virtual Reality technology will democratize the experience of planetary exploration and thereby broaden understanding of, and support for, this historic enterprise.
Tsunami Modeling to Validate Slip Models of the 2007 Mw 8.0 Pisco Earthquake, Central Peru
NASA Astrophysics Data System (ADS)
Ioualalen, M.; Perfettini, H.; Condo, S. Yauri; Jimenez, C.; Tavera, H.
2013-03-01
Following the 2007, August 15th, Mw 8.0 Pisco earthquake in central Peru, Sladen et al. (J Geophys Res 115: B02405, 2010) derived several slip models of this event. They inverted teleseismic data together with geodetic (InSAR) measurements to find the co-seismic slip distribution on the fault plane, considering those data sets separately or jointly. But how close to the real slip distribution are those inverted slip models? To answer this crucial question, the authors generated tsunami records based on their slip models and compared them to DART buoy tsunami records and available runup data. Such an approach requires a robust and accurate tsunami model (non-linear, dispersive, with accurate bathymetry and topography, etc.); otherwise the differences between the data and the model may be attributed to the slip models themselves even though they arise from an incomplete tsunami simulation. The accuracy of a numerical tsunami simulation depends strongly, among other factors, on two important constraints: (i) a fine computational grid (and thus the bathymetry and topography data sets used), which unfortunately is not always available, and (ii) a realistic tsunami propagation model including dispersion. Here, we extend Sladen's work using newly available data, namely a tide gauge record at Callao (Lima harbor) and the Chilean DART buoy record, while considering a complete set of runup data along with a more realistic tsunami numerical model that accounts for dispersion, and a fine-resolution computational grid, which is essential. Through these accurate numerical simulations we infer that the InSAR-based model is in better agreement with the tsunami data; for the Pisco earthquake this indicates that geodetic data seem essential to recover the final co-seismic slip distribution on the rupture plane. Slip models based on teleseismic data are unable to describe the observed tsunami, suggesting that a significant amount of co-seismic slip may have been aseismic. Finally, we compute the runup distribution along the central part of the Peruvian coast to better understand the wave amplification/attenuation processes of the tsunami generated by the Pisco earthquake.
Palmer, Rebecca; Enderby, Pam
2016-10-01
The speech-language pathology profession has explored a number of approaches to support efficient delivery of interventions for people with stroke-induced aphasia. This study aimed to explore the role of volunteers in supporting self-managed practice of computerised language exercises. A qualitative interview study of the volunteer support role was carried out alongside a pilot randomised controlled trial of computer aphasia therapy. Patients with aphasia practised computer exercises tailored for them by a speech-language pathologist at home regularly for 5 months. Eight of the volunteers who supported the intervention took part in semi-structured interviews. Interviews were audio-recorded, transcribed verbatim and analysed thematically. Emergent themes included: training and support requirements; perception of the volunteer role; and challenges facing the volunteer, both in general and specifically related to supporting computer therapy exercises. The authors concluded that volunteers helped to motivate patients to practise their computer therapy exercises and also provided support to the carers. Training and ongoing structured support of therapy activity and conduct are required from a trained speech-language pathologist to ensure the successful involvement of volunteers supporting impairment-based computer exercises in patients' own homes.
ERIC Educational Resources Information Center
Clariana, Roy B.; Engelmann, Tanja; Yu, Wu
2013-01-01
Problem solving likely involves at least two broad stages, problem space representation and then problem solution (Newell and Simon, Human problem solving, 1972). The metric centrality that Freeman ("Social Networks" 1:215-239, 1978) implemented in social network analysis is offered here as a potential measure of both. This development research…
The development of a disease oriented eFolder for multiple sclerosis decision support
NASA Astrophysics Data System (ADS)
Ma, Kevin; Jacobs, Colin; Fernandez, James; Amezcua, Lilyana; Liu, Brent
2010-03-01
Multiple sclerosis (MS) is a demyelinating disease of the central nervous system. The chronic nature of MS necessitates multiple MRI studies to track disease progression. Currently, MRI assessment of multiple sclerosis requires manual lesion measurement and yields an estimate of lesion volume and change that is highly variable and user-dependent. In the setting of a longitudinal study, disease trends and changes become difficult to extrapolate from the lesions. In addition, it is difficult to establish a correlation between these imaged lesions and clinical factors such as treatment course. To address these clinical needs, an MS specific e-Folder for decision support in the evaluation and assessment of MS has been developed. An e-Folder is a disease-centric electronic medical record in contrast to a patient-centric electronic health record. Along with an MS lesion computer aided detection (CAD) package for lesion load, location, and volume, clinical parameters such as patient demographics, disease history, clinical course, and treatment history are incorporated to make the e-Folder comprehensive. With the integration of MRI studies together with related clinical data and informatics tools designed for monitoring multiple sclerosis, it provides a platform to improve the detection of treatment response in patients with MS. The design and deployment of MS e-Folder aims to standardize MS lesion data and disease progression to aid in decision making and MS-related research.
Use of computers and Internet among people with severe mental illnesses at peer support centers.
Brunette, Mary F; Aschbrenner, Kelly A; Ferron, Joelle C; Ustinich, Lee; Kelly, Michael; Grinley, Thomas
2017-12-01
Peer support centers are an ideal setting where people with severe mental illnesses can access the Internet via computers for online health education, peer support, and behavioral treatments. The purpose of this study was to assess computer use and Internet access in peer support agencies. A peer-assisted survey assessed the frequency with which consumers in all 13 New Hampshire peer support centers (n = 702) used computers to access Internet resources. During the 30-day survey period, 200 of the 702 peer support consumers (28%) responded to the survey. More than three-quarters (78.5%) of respondents had gone online to seek information in the past year. About half (49%) of respondents were interested in learning about online forums that would provide information and peer support for mental health issues. Peer support centers may be a useful venue for Web-based approaches to education, peer support, and intervention. Future research should assess facilitators and barriers to the use of Web-based resources among people with severe mental illness in peer support centers. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
"Dear Fresher …"--How Online Questionnaires Can Improve Learning and Teaching Statistics
ERIC Educational Resources Information Center
Bebermeier, Sarah; Nussbeck, Fridtjof W.; Ontrup, Greta
2015-01-01
Lecturers teaching statistics are faced with several challenges supporting students' learning in appropriate ways. A variety of methods and tools exist to facilitate students' learning on statistics courses. The online questionnaires presented in this report are a new, slightly different computer-based tool: the central aim was to support students…
Readerbench: Automated Evaluation of Collaboration Based on Cohesion and Dialogism
ERIC Educational Resources Information Center
Dascalu, Mihai; Trausan-Matu, Stefan; McNamara, Danielle S.; Dessus, Philippe
2015-01-01
As Computer-Supported Collaborative Learning (CSCL) gains a broader usage, the need for automated tools capable of supporting tutors in the time-consuming process of analyzing conversations becomes more pressing. Moreover, collaboration, which presumes the intertwining of ideas or points of view among participants, is a central element of dialogue…
Angulo, Arturo; López-Sánchez, Myrna I
2017-02-23
New records of occurrence for four species of lampriform fishes (Teleostei: Lampriformes; Desmodema polystictum, Regalecus russelii, Trachipterus fukuzakii and Zu cristatus) poorly known or previously unknown for the Pacific coast of lower Central America (Costa Rica-Panama) are herein reported. Museum specimens supporting such records are characterized and described. Comparative morphometric and meristic data on other collections and species of lampriforms, as well as distributional information, are provided and discussed. Diversity, taxonomy and distribution of the eastern Pacific species of the order also are briefly discussed. Finally, a key to the eastern Pacific species of the Lampriformes, based on our research and data available in the literature, is presented.
Mercury Toolset for Spatiotemporal Metadata
NASA Technical Reports Server (NTRS)
Wilson, Bruce E.; Palanisamy, Giri; Devarakonda, Ranjeet; Rhyne, B. Timothy; Lindsley, Chris; Green, James
2010-01-01
Mercury (http://mercury.ornl.gov) is a set of tools for federated harvesting, searching, and retrieving metadata, particularly spatiotemporal metadata. Version 3.0 of the Mercury toolset provides orders of magnitude improvements in search speed, support for additional metadata formats, integration with Google Maps for spatial queries, facetted type search, support for RSS (Really Simple Syndication) delivery of search results, and enhanced customization to meet the needs of the multiple projects that use Mercury. It provides a single portal to very quickly search for data and information contained in disparate data management systems, each of which may use different metadata formats. Mercury harvests metadata and key data from contributing project servers distributed around the world and builds a centralized index. The search interfaces then allow the users to perform a variety of fielded, spatial, and temporal searches across these metadata sources. This centralized repository of metadata with distributed data sources provides extremely fast search results to the user, while allowing data providers to advertise the availability of their data and maintain complete control and ownership of that data. Mercury periodically (typically daily) harvests metadata sources through a collection of interfaces and re-indexes these metadata to provide extremely rapid search capabilities, even over collections with tens of millions of metadata records. A number of both graphical and application interfaces have been constructed within Mercury, to enable both human users and other computer programs to perform queries. Mercury was also designed to support multiple different projects, so that the particular fields that can be queried and used with search filters are easy to configure for each different project.
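The harvest-then-index workflow the Mercury abstract describes (providers keep their records; a central index is rebuilt periodically and answers searches) can be sketched as follows. This is an illustrative toy, not Mercury's actual code: the provider dictionaries, record fields, and function names are all hypothetical.

```python
# Hypothetical sketch of federated metadata harvesting: each provider
# retains ownership of its records, while a central keyword index is
# periodically rebuilt and serves fast searches across all providers.

def harvest(providers):
    """Pull metadata from every provider into one centralized index."""
    index = {}
    for provider in providers:
        for record in provider["records"]:
            # Map each title keyword -> set of (provider, record id) pairs.
            for word in record["title"].lower().split():
                index.setdefault(word, set()).add((provider["name"], record["id"]))
    return index

def search(index, term):
    """Keyword search against the centralized index."""
    return sorted(index.get(term.lower(), set()))

# Two hypothetical data providers with their own metadata records.
providers = [
    {"name": "daac-a", "records": [{"id": 1, "title": "Soil Moisture Grids"}]},
    {"name": "daac-b", "records": [{"id": 7, "title": "Ocean Surface Moisture"}]},
]
index = harvest(providers)
print(search(index, "moisture"))  # hits from both providers
```

Re-running `harvest` on a schedule (Mercury does so roughly daily) keeps the central index fresh while the source data never leaves the providers.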
Mercury Toolset for Spatiotemporal Metadata
NASA Astrophysics Data System (ADS)
Devarakonda, Ranjeet; Palanisamy, Giri; Green, James; Wilson, Bruce; Rhyne, B. Timothy; Lindsley, Chris
2010-06-01
Mercury (http://mercury.ornl.gov) is a set of tools for federated harvesting, searching, and retrieving metadata, particularly spatiotemporal metadata. Version 3.0 of the Mercury toolset provides orders of magnitude improvements in search speed, support for additional metadata formats, integration with Google Maps for spatial queries, facetted type search, support for RSS (Really Simple Syndication) delivery of search results, and enhanced customization to meet the needs of the multiple projects that use Mercury. It provides a single portal to very quickly search for data and information contained in disparate data management systems, each of which may use different metadata formats. Mercury harvests metadata and key data from contributing project servers distributed around the world and builds a centralized index. The search interfaces then allow the users to perform a variety of fielded, spatial, and temporal searches across these metadata sources. This centralized repository of metadata with distributed data sources provides extremely fast search results to the user, while allowing data providers to advertise the availability of their data and maintain complete control and ownership of that data. Mercury periodically (typically daily) harvests metadata sources through a collection of interfaces and re-indexes these metadata to provide extremely rapid search capabilities, even over collections with tens of millions of metadata records. A number of both graphical and application interfaces have been constructed within Mercury, to enable both human users and other computer programs to perform queries. Mercury was also designed to support multiple different projects, so that the particular fields that can be queried and used with search filters are easy to configure for each different project.
Better informed in clinical practice - a brief overview of dental informatics.
Reynolds, P A; Harper, J; Dunne, S
2008-03-22
Uptake of dental informatics has been hampered by technical and user issues. Innovative systems have been developed, but usability issues have affected many. Advances in technology and artificial intelligence are now producing clinically useful systems, although issues still remain with adapting computer interfaces to the dental practice working environment. A dental electronic health record has become a priority in many countries, including the UK. However, experience shows that any dental electronic health record (EHR) system cannot be subordinate to, or a subset of, a medical record. Such a future dental EHR is likely to incorporate integrated care pathways. Future best dental practice will increasingly depend on computer-based support tools, although disagreement remains about the effectiveness of current support tools. Over the longer term, future dental informatics tools will incorporate dynamic, online evidence-based medicine (EBM) tools, and promise more adaptive, patient-focused and efficient dental care with educational advantages in training.
A decision-supported outpatient practice system.
Barrows, R. C.; Allen, B. A.; Smith, K. C.; Arni, V. V.; Sherman, E.
1996-01-01
We describe a Decision-supported Outpatient Practice (DOP) system developed and now in use at the Columbia-Presbyterian Medical Center. DOP is an automated ambulatory medical record system that integrates in-patient and ambulatory care data, and incorporates active and passive decision support mechanisms with a view towards improving the quality of primary care. Active decision support occurs in the form of event-driven reminders created within a remote clinical information system with its central data repository and decision support system (DSS). Novel features of DOP include patient-specific health maintenance task lists calculated by the remote DSS, use of a semantically structured controlled medical vocabulary to support clinical results review and provider data entry, and exploitation of an underlying ambulatory data model that provides for an explicit record of the evolution of insight regarding patient management. Benefits, challenges, and plans are discussed. PMID:8947774
BIO-Plex Information System Concept
NASA Technical Reports Server (NTRS)
Jones, Harry; Boulanger, Richard; Arnold, James O. (Technical Monitor)
1999-01-01
This paper describes a suggested design for an integrated information system for the proposed BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex) at Johnson Space Center (JSC), including distributed control systems, central control, networks, database servers, personal computers and workstations, applications software, and external communications. The system will have an open commercial computing and networking architecture. The network will provide automatic real-time transfer of information to database server computers which perform data collection and validation. This information system will support integrated, data-sharing applications for everything from system alarms to management summaries. Most existing complex process control systems have information gaps between the different real-time subsystems, between these subsystems and the central controller, between the central controller and system-level planning and analysis application software, and between the system-level applications and management overview reporting. An integrated information system is vitally necessary as the basis for the integration of planning, scheduling, modeling, monitoring, and control, which will allow improved monitoring and control based on timely, accurate and complete data. Data describing the system configuration and the real-time processes can be collected, checked and reconciled, analyzed and stored in database servers that can be accessed by all applications. The required technology is available. The only opportunity to design a distributed, nonredundant, integrated system is before it is built. Retrofit is extremely difficult and costly.
Evaluating Implementations of Service Oriented Architecture for Sensor Network via Simulation
2011-04-01
Subject: COMPUTER SCIENCE. Approved: Boleslaw Szymanski, Thesis Adviser. Rensselaer Polytechnic Institute, Troy, New York, April 2011 (For Graduation May 2011). …The first simulation supports distributed and centralized composition with a type hierarchy and multiple-service, statically-located nodes in a 2-dimensional space. The second simulation…
Translating eHealth Visions from Strategy to Practice - A Benefit Management Approach.
Villumsen, Sidsel; Nøhr, Christian; Faxvaag, Arild
2018-01-01
The municipalities and the Regional Health Authorities in Central Norway have been assigned a mandate to implement a shared electronic health record, Helseplattformen, reflecting the visions set out in the national eHealth white paper 'One Citizen - One Record'. This study identifies and describes anticipated benefit streams of clinical decision support in 'One Citizen - One Record' and the user requirement specification documents of Helseplattformen. This study found that the benefit stream of clinical decision support translates from the health policy visions stated in 'One Citizen - One Record' into Helseplattformen. However, business changes, although a critical element of achieving benefits, were not emphasised in either. This calls for the programme of Helseplattformen to pay careful attention to how the information system and information technology requirements must be accompanied by enabling changes as well as business changes in order to achieve the identified benefits of 'One Citizen - One Record' and Helseplattformen.
Program Description: EDIT Program and Vendor Master Update, SWRL Financial System.
ERIC Educational Resources Information Center
Ikeda, Masumi
Computer routines to edit input data for the Southwest Regional Laboratory's (SWRL) Financial System are described. The program is responsible for validating input records, generating records for further system processing, and updating the Vendor Master File--a file containing the information necessary to support the accounts payable and…
NASA Astrophysics Data System (ADS)
Gonczi, Amanda L.; Chiu, Jennifer L.; Maeng, Jennifer L.; Bell, Randy L.
2016-07-01
This investigation sought to identify patterns in elementary science teachers' computer simulation use, particularly implementation structures and instructional supports commonly employed by teachers. Data included video-recorded science lessons of 96 elementary teachers who used computer simulations in one or more science lessons. Results indicated teachers used a one-to-one student-to-computer ratio most often either during class-wide individual computer use or during a rotating station structure. Worksheets, general support, and peer collaboration were the most common forms of instructional support. The least common instructional support forms included lesson pacing, initial play, and a closure discussion. Students' simulation use was supported in the fewest ways during a rotating station structure. Results suggest that simulation professional development with elementary teachers needs to explicitly focus on implementation structures and instructional support to enhance participants' pedagogical knowledge and improve instructional simulation use. In addition, research is needed to provide theoretical explanations for the observed patterns that should subsequently be addressed in supporting teachers' instructional simulation use during professional development or in teacher preparation programs.
NASA Astrophysics Data System (ADS)
Yin, J.-J.; Yuan, D.-X.; Li, H.-C.; Cheng, H.; Li, T.-Y.; Edwards, R. L.; Lin, Y.-S.; Qin, J.-M.; Tang, W.; Zhao, Z.-Y.; Mii, H.-S.
2014-10-01
This paper focuses on the climate variability in central China since AD 1300, involving: (1) a well-dated, 1.5-year resolution stalagmite δ18O record from Lianhua Cave, central China; (2) links of the δ18O record with regional dry-wet conditions, monsoon intensity, and temperature over eastern China; and (3) correlations among drought events in the Lianhua record, solar irradiation, and ENSO (El Niño-Southern Oscillation) variation. We present a highly precise, 230Th/U-dated, 1.5-year resolution δ18O record of an aragonite stalagmite (LHD1) collected from Lianhua Cave in the Wuling Mountain area of central China. The comparison of the δ18O record with the local instrumental record and historical documents indicates that (1) the stalagmite δ18O record reveals variations in the summer monsoon intensity and dry-wet conditions in the Wuling Mountain area; (2) a stronger East Asian summer monsoon (EASM) enhances the tropical monsoon trough controlled by the ITCZ (Intertropical Convergence Zone), which produces higher spring-quarter rainfall and isotopically light monsoonal moisture in central China; and (3) the summer-quarter/spring-quarter rainfall ratio in central China can be a potential indicator of EASM strength: a lower ratio corresponds to a stronger EASM and higher spring rainfall. The ratio changed from <1 to >1 after 1950, reflecting that the summer-quarter rainfall of the study area became dominant under the stronger influence of the Northwestern Pacific High. Eastern China temperatures varied with solar activity, showing higher temperatures under stronger solar irradiation, which produced stronger summer monsoons. During the Maunder, Dalton and 1900 sunspot minima, more severe drought events occurred, indicating a weakening of the summer monsoon when solar activity decreased on decadal timescales. On an interannual timescale, dry conditions in the study area prevailed under El Niño conditions, which is also supported by the spectrum analysis.
Hence, our record illustrates the linkage of Asian summer monsoon precipitation to solar irradiation and ENSO: wetter conditions in the study area under a stronger summer monsoon during warm periods, and vice versa. During cold periods, the Walker Circulation shifts toward the central Pacific under El Niño conditions, resulting in a further weakening of the Asian summer monsoon.
Integrating all medical records to an enterprise viewer.
Li, Haomin; Duan, Huilong; Lu, Xudong; Zhao, Chenhui; An, Jiye
2005-01-01
The idea behind hospital information systems is to make all of a patient's medical reports, lab results, and images electronically available to clinicians, instantaneously, wherever they are. But the higgledy-piggledy evolution of most hospital computer systems makes it hard to integrate all these clinical records. Although several integration standards have been proposed to meet this challenge, none of them fits Chinese hospitals. In this paper, we introduce our work implementing an enterprise viewer with a three-tiered architecture in Huzhou Central Hospital to integrate all existing medical information systems using limited resources.
Application of mobile computers in a measuring system supporting examination of posture diseases
NASA Astrophysics Data System (ADS)
Piekarski, Jacek; Klimiec, Ewa; Zaraska, Wiesław
2013-07-01
A measuring system designed and manufactured by the authors, based on mobile computers (smartphones and tablets) working as data recorders, has been developed to support the diagnosis of orthopedic diseases, especially of the feet. The basic idea is to examine a patient in his natural environment, during usual activities (such as walking or running). The paper describes the proposed system with sensors manufactured from piezoelectric film (PVDF film) and placed in the shoe insole. The mechanical reliability of PVDF film is excellent, though elimination of the pyroelectric effect is required. A possible solution to the problem and the test results are presented in the paper. Data recording is based on wireless transmission to a mobile device used as a data logger.
Interdisciplinary investigations in support of project DI-MOD
NASA Technical Reports Server (NTRS)
Starks, Scott A.
1991-01-01
Interdisciplinary investigations in support of project DI-MOD are discussed. The following subject areas were covered: (1) potential extensions of Project DI-MOD to additional sites in Central America; (2) human migration patterns and their impact on malaria transmission; and (3) an investigation into possible computer-based approaches to the analysis of remotely sensed multispectral data.
ERIC Educational Resources Information Center
Farmer, Thomas A.; Cargill, Sarah A.; Hindy, Nicholas C.; Dale, Rick; Spivey, Michael J.
2007-01-01
Although several theories of online syntactic processing assume the parallel activation of multiple syntactic representations, evidence supporting simultaneous activation has been inconclusive. Here, the continuous and non-ballistic properties of computer mouse movements are exploited, by recording their streaming x, y coordinates to procure…
Chiarelli, Antonio Maria; Croce, Pierpaolo; Merla, Arcangelo; Zappasodi, Filippo
2018-06-01
Brain-computer interface (BCI) refers to procedures that link the central nervous system to a device. BCI was historically performed using electroencephalography (EEG). In recent years, encouraging results have been obtained by combining EEG with other neuroimaging technologies, such as functional near infrared spectroscopy (fNIRS). A crucial step of BCI is brain state classification from recorded signal features. Deep artificial neural networks (DNNs) recently reached unprecedented complex classification outcomes. These performances were achieved through increased computational power, efficient learning algorithms, valuable activation functions, and restricted or back-fed neuron connections. Expecting significant overall BCI performance gains, we investigated the capabilities of combining EEG and fNIRS recordings with state-of-the-art deep learning procedures. We performed a guided left and right hand motor imagery task on 15 subjects with a fixed classification response time of 1 s and an overall experiment length of 10 min. Left versus right classification accuracy of a DNN in the multi-modal recording modality was estimated and compared to standalone EEG and fNIRS and other classifiers. At a group level we obtained a significant increase in performance when considering multi-modal recordings and the DNN classifier, with a synergistic effect. BCI performance can be significantly improved by employing multi-modal recordings that provide electrical and hemodynamic brain activity information, in combination with advanced non-linear deep learning classification procedures.
NASA Astrophysics Data System (ADS)
Chiarelli, Antonio Maria; Croce, Pierpaolo; Merla, Arcangelo; Zappasodi, Filippo
2018-06-01
Objective. Brain–computer interface (BCI) refers to procedures that link the central nervous system to a device. BCI was historically performed using electroencephalography (EEG). In recent years, encouraging results have been obtained by combining EEG with other neuroimaging technologies, such as functional near infrared spectroscopy (fNIRS). A crucial step of BCI is brain state classification from recorded signal features. Deep artificial neural networks (DNNs) recently reached unprecedented complex classification outcomes. These performances were achieved through increased computational power, efficient learning algorithms, valuable activation functions, and restricted or back-fed neuron connections. Expecting significant overall BCI performance gains, we investigated the capabilities of combining EEG and fNIRS recordings with state-of-the-art deep learning procedures. Approach. We performed a guided left and right hand motor imagery task on 15 subjects with a fixed classification response time of 1 s and an overall experiment length of 10 min. Left versus right classification accuracy of a DNN in the multi-modal recording modality was estimated and compared to standalone EEG and fNIRS and other classifiers. Main results. At a group level we obtained a significant increase in performance when considering multi-modal recordings and the DNN classifier, with a synergistic effect. Significance. BCI performance can be significantly improved by employing multi-modal recordings that provide electrical and hemodynamic brain activity information, in combination with advanced non-linear deep learning classification procedures.
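The multi-modal fusion step these two abstracts describe (combining EEG and fNIRS features for one trial, then classifying left vs right motor imagery) can be illustrated with a toy sketch. This is not the paper's DNN: the data, the concatenation-based fusion, and the nearest-centroid rule here are all simplified, hypothetical stand-ins.

```python
# Illustrative sketch (assumed data, not the paper's method): fuse EEG
# and fNIRS feature vectors by concatenation, then classify left vs
# right motor imagery with a simple nearest-centroid rule.

def concat_features(eeg, fnirs):
    """Multi-modal fusion: one joint feature vector per trial."""
    return eeg + fnirs

def centroid(vectors):
    """Component-wise mean of a class's training vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def classify(trial, centroids):
    """Assign the trial to the class whose centroid is nearest (squared Euclidean)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(trial, centroids[label]))

# Tiny synthetic training set: two trials per class, 2 EEG + 1 fNIRS feature.
left = [concat_features([0.9, 0.1], [0.8]), concat_features([1.1, 0.0], [0.7])]
right = [concat_features([0.1, 0.9], [0.2]), concat_features([0.0, 1.2], [0.1])]
centroids = {"left": centroid(left), "right": centroid(right)}
print(classify(concat_features([1.0, 0.1], [0.75]), centroids))  # left
```

The synergy the authors report comes from the joint vector carrying both electrical (fast) and hemodynamic (slow) information; any downstream classifier, from this toy rule to a DNN, sees both at once.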
NASA Technical Reports Server (NTRS)
Low, M. D.; Baker, M.; Ferguson, R.; Frost, J. D., Jr.
1972-01-01
This paper describes a complete electroencephalographic acquisition and transmission system, designed to meet the needs of a large hospital with multiple critical care patient monitoring units. The system provides rapid and prolonged access to a centralized recording and computing area from remote locations within the hospital complex, and from locations in other hospitals and other cities. The system includes quick-on electrode caps, amplifier units and cable transmission for access from within the hospital, and EEG digitization and telephone transmission for access from other hospitals or cities.
Floods in Central Texas, September 7-14, 2010
Winters, Karl E.
2012-01-01
Severe flooding occurred near the Austin metropolitan area in central Texas September 7–14, 2010, because of heavy rainfall associated with Tropical Storm Hermine. The U.S. Geological Survey, in cooperation with the Upper Brushy Creek Water Control and Improvement District, determined rainfall amounts and annual exceedance probabilities for rainfall resulting in flooding in Bell, Williamson, and Travis counties in central Texas during September 2010. We documented peak streamflows and the annual exceedance probabilities for peak streamflows recorded at several streamflow-gaging stations in the study area. The 24-hour rainfall total exceeded 12 inches at some locations, with one report of 14.57 inches at Lake Georgetown. Rainfall probabilities were estimated using previously published depth-duration frequency maps for Texas. At 4 sites in Williamson County, the 24-hour rainfall had an annual exceedance probability of 0.002. Streamflow measurement data and flood-peak data from U.S. Geological Survey surface-water monitoring stations (streamflow and reservoir gaging stations) are presented, along with a comparison of September 2010 flood peaks to previous known maximums in the periods of record. Annual exceedance probabilities for peak streamflow were computed for 20 streamflow-gaging stations based on an analysis of streamflow-gaging station records. The annual exceedance probability was 0.03 for the September 2010 peak streamflow at the Geological Survey's streamflow-gaging stations 08104700 North Fork San Gabriel River near Georgetown, Texas, and 08154700 Bull Creek at Loop 360 near Austin, Texas. The annual exceedance probability was 0.02 for the peak streamflow for Geological Survey's streamflow-gaging station 08104500 Little River near Little River, Texas. 
The lack of similarity in the annual exceedance probabilities computed for precipitation and streamflow might be attributed to the small areal extent of the heaviest rainfall over these and the other gaged watersheds.
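The annual exceedance probabilities the flood report cites (e.g. 0.03 for a September 2010 peak) come from frequency analysis of a station's annual peak record. A minimal empirical version, using the Weibull plotting position P = m/(n+1) rather than the USGS's full fitted-distribution procedure, can be sketched as follows; the streamflow values below are hypothetical, not data from the report.

```python
# Illustrative sketch (not the USGS procedure): empirical annual
# exceedance probability of a flood peak from a gaging-station record,
# using the Weibull plotting position P = m / (n + 1), where m is the
# rank of the peak (largest = 1) and n is the record length in years.

def exceedance_probability(annual_peaks, peak):
    """Empirical probability that a year's peak equals or exceeds `peak`."""
    ranked = sorted(annual_peaks, reverse=True)  # largest first
    n = len(ranked)
    # Rank of the first recorded peak not exceeding `peak`; if `peak` is
    # below every recorded value, essentially every year exceeds it.
    m = next((i for i, q in enumerate(ranked, start=1) if q <= peak), n + 1)
    return m / (n + 1)

# Hypothetical 19-year record of annual peak streamflows (cfs).
peaks = [3200, 1800, 2500, 900, 4100, 1500, 2200, 1100, 2800, 1700,
         1300, 3600, 950, 2000, 1250, 1600, 2400, 1050, 1900]
p = exceedance_probability(peaks, 4100)
print(round(p, 2))  # 0.05: the largest peak in a 19-year record
```

An exceedance probability of 0.05 corresponds to a 20-year recurrence interval; the 0.03 and 0.02 values in the report likewise correspond to roughly 33- and 50-year floods.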
ANL statement of site strategy for computing workstations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fenske, K.R.; Boxberger, L.M.; Amiot, L.W.
1991-11-01
This Statement of Site Strategy describes the procedure at Argonne National Laboratory for defining, acquiring, using, and evaluating scientific and office workstations and related equipment and software in accord with DOE Order 1360.1A (5-30-85), and Laboratory policy. It is Laboratory policy to promote the installation and use of computing workstations to improve productivity and communications for both programmatic and support personnel, to ensure that computing workstation acquisitions meet the expressed need in a cost-effective manner, and to ensure that acquisitions of computing workstations are in accord with Laboratory and DOE policies. The overall computing site strategy at ANL is to develop a hierarchy of integrated computing system resources to address the current and future computing needs of the laboratory. The major system components of this hierarchical strategy are: supercomputers, parallel computers, centralized general purpose computers, distributed multipurpose minicomputers, and computing workstations and office automation support systems. Computing workstations include personal computers, scientific and engineering workstations, computer terminals, microcomputers, word processing and office automation electronic workstations, and associated software and peripheral devices costing less than $25,000 per item.
Mobile healthcare information management utilizing Cloud Computing and Android OS.
Doukas, Charalampos; Pliakas, Thomas; Maglogiannis, Ilias
2010-01-01
Cloud Computing provides functionality for managing information data in a distributed, ubiquitous and pervasive manner supporting several platforms, systems and applications. This work presents the implementation of a mobile system that enables electronic healthcare data storage, update and retrieval using Cloud Computing. The mobile application is developed using Google's Android operating system and provides management of patient health records and medical images (supporting DICOM format and JPEG2000 coding). The developed system has been evaluated using the Amazon's S3 cloud service. This article summarizes the implementation details and presents initial results of the system in practice.
Timpka, Toomas; Olvander, Christina; Hallberg, Niklas
2008-09-01
The international Safe Community programme was used as the setting for a case study to explore the need for information system support in health promotion programmes. The 14 Safe Communities active in Sweden during 2002 were invited to participate and 13 accepted. A questionnaire on computer usage and a critical incident technique instrument were distributed. Sharing of management information, creating social capital for safety promotion, and injury data recording were found to be key areas that need to be further supported by computer-based information systems. Most respondents reported having access to a personal computer workstation with standard office software. Interest in using more advanced computer applications was low, and there was considerable need for technical user support. Areas where information systems can be used to make health promotion practice more efficient were identified, and patterns of computers usage were described.
NASA Astrophysics Data System (ADS)
Yin, J.-J.; Yuan, D.-X.; Li, H.-C.; Cheng, H.; Li, T.-Y.; Edwards, R. L.; Lin, Y.-S.; Qin, J.-M.; Tang, W.; Zhao, Z.-Y.; Mii, H.-S.
2014-04-01
Highlights: this paper focuses on the climate variability in central China since AD 1300, involving: (1) a well-dated, 1.5-year resolution stalagmite δ18O record from Lianhua Cave, central China; (2) links of the δ18O record with regional dry-wet conditions, monsoon intensity, and temperature over eastern China; and (3) correlations among drought events in the Lianhua record, solar irradiation, and the ENSO index. We present a highly precise, 230Th/U-dated, 1.5-year resolution δ18O record of an aragonite stalagmite (LHD1) collected from Lianhua Cave in the Wuling Mountain area of central China. The comparison of the δ18O record with the local instrumental record and historical documents exhibits at least 15 drought events in the Wuling Mountain and adjacent areas during the Little Ice Age, some of which corresponded to megadrought events in the broad Asian monsoonal region of China. Thus, the stalagmite δ18O record reveals variations in summer monsoon precipitation and dry-wet conditions in the Wuling Mountain area. Eastern China temperature varied with solar activity, showing higher temperature under stronger solar irradiation, which produces a stronger summer monsoon. During the Maunder, Dalton and 1900 sunspot minima, more severe drought events occurred, indicating a weakening of the summer monsoon when solar activity decreased on decadal timescales. On interannual timescales, dry conditions in the study area prevailed under El Niño conditions, which is also supported by the spectrum analysis. Hence, our record illustrates the linkage of Asian summer monsoon precipitation to solar irradiation and ENSO: wetter conditions under a stronger summer monsoon during warm periods, and vice versa; during cold periods, the Walker Circulation shifts toward the central Pacific under El Niño conditions, resulting in further weakening of the Asian summer monsoon.
However, the δ18O of the LHD1 record is positively correlated with temperature after ~1940 AD, which is opposite to the δ18O-temperature relationship in earlier times. This anomalous relationship might be caused by greenhouse-gas forcing.
SSCR Automated Manager (SAM) release 1. 1 reference manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1988-10-01
This manual provides instructions for using the SSCR Automated Manager (SAM) to manage System Software Change Records (SSCRs) online. SSCRs are forms required to document all system software changes for the Martin Marietta Energy Systems, Inc., Central computer systems. SAM, a program developed at Energy Systems, is accessed through IDMS/R (Integrated Database Management System) on an IBM system.
A Directory of Sources of Information and Data Bases on Education and Training.
1980-09-01
ACAD007 National Opinion Research Center (NORC); ACAD008 U of California Union Catalog Supp. (1963-1967); … Records (RSR) ARMY030; Union Central Registry System (UCRSYS) ARMY032; Training Control Card Report … research. Your query directs a computer search of the Comprehensive Dissertation Database. The search produces a list of all titles matching your…
Genomic inferences of domestication events are corroborated by written records in Brassica rapa.
Qi, Xinshuai; An, Hong; Ragsdale, Aaron P; Hall, Tara E; Gutenkunst, Ryan N; Chris Pires, J; Barker, Michael S
2017-07-01
Demographic modelling is often used with population genomic data to infer the relationships and ages among populations. However, relatively few analyses are able to validate these inferences with independent data. Here, we leverage written records that describe distinct Brassica rapa crops to corroborate demographic models of domestication. Brassica rapa crops are renowned for their outstanding morphological diversity, but the relationships and order of domestication remain unclear. We generated genomewide SNPs from 126 accessions collected globally using high-throughput transcriptome data. Analyses of more than 31,000 SNPs across the B. rapa genome revealed evidence for five distinct genetic groups and supported a European-Central Asian origin of B. rapa crops. Our results supported the traditionally recognized South Asian and East Asian B. rapa groups with evidence that pak choi, Chinese cabbage and yellow sarson are likely monophyletic groups. In contrast, the oil-type B. rapa subsp. oleifera and brown sarson were polyphyletic. We also found no evidence to support the contention that rapini is the wild type or the earliest domesticated subspecies of B. rapa. Demographic analyses suggested that B. rapa was introduced to Asia 2,400-4,100 years ago, and that Chinese cabbage originated 1,200-2,100 years ago via admixture of pak choi and European-Central Asian B. rapa. We also inferred significantly different levels of founder effect among the B. rapa subspecies. Written records from antiquity that document these crops are consistent with these inferences. The concordance between our age estimates of domestication events with historical records provides unique support for our demographic inferences. © 2017 John Wiley & Sons Ltd.
49 CFR 19.53 - Retention and access requirements for records.
Code of Federal Regulations, 2010 CFR
2010-10-01
... authorized representatives, have the right of timely and unrestricted access to any books, documents, papers... allocation plans, and any similar accounting computations of the rate at which a particular group of costs is... supporting records starts at the end of the fiscal year (or other accounting period) covered by the proposal...
2 CFR 215.53 - Retention and access requirements for records.
Code of Federal Regulations, 2010 CFR
2010-01-01
... representatives, have the right of timely and unrestricted access to any books, documents, papers, or other... any similar accounting computations of the rate at which a particular group of costs is chargeable... supporting records starts at the end of the fiscal year (or other accounting period) covered by the proposal...
7 CFR 3019.53 - Retention and access requirements for records.
Code of Federal Regulations, 2010 CFR
2010-01-01
... representatives, have the right of timely and unrestricted access to any books, documents, papers, or other... any similar accounting computations of the rate at which a particular group of costs is chargeable... supporting records starts at the end of the fiscal year (or other accounting period) covered by the proposal...
PATSTAGS: PATRAN-To-STAGSC-1 Translator
NASA Technical Reports Server (NTRS)
Otte, Neil
1993-01-01
PATSTAGS computer program translates data from PATRAN finite-element mathematical model into STAGS input records used for engineering analysis. Reads data from PATRAN neutral file and writes STAGS input records into STAGS input file and UPRESS data file. Supports translations of nodal constraints, and of nodal, element, force, and pressure data. Written in FORTRAN 77.
NASA Astrophysics Data System (ADS)
Allam, A. A.; Lin, F. C.; Share, P. E.; Ben-Zion, Y.; Vernon, F.; Schuster, G. T.; Karplus, M. S.
2016-12-01
We present earthquake data and statistical analyses from a month-long deployment of a linear array of 134 Fairfield three-component 5 Hz seismometers along the Clark strand of the San Jacinto fault zone in Southern California. With a total aperture of 2.4km and mean station spacing of 20m, the array locally spans the entire fault zone from the most intensely fractured core to relatively undamaged host rock on the outer edges. We recorded 36 days of continuous seismic data at 1000Hz sampling rate, capturing waveforms from 751 local events with Mw>0.5 and 43 teleseismic events with M>5.5, including two 600km deep M7.5 events along the Andean subduction zone. For any single local event on the San Jacinto fault, the central stations of the array recorded both higher amplitude and longer duration waveforms, which we interpret as the result of damage-related low-velocity structure acting as a broad waveguide. Using 271 San Jacinto events, we compute the distributions of three quantities for each station: maximum amplitude, mean amplitude, and total energy (the integral of the envelope). All three values become statistically lower with increasing distance from the fault, but in addition show a nonrandom zigzag pattern which we interpret as normal mode oscillations. This interpretation is supported by polarization analysis which demonstrates that the high-amplitude late-arriving energy is strongly vertically polarized in the central part of the array, consistent with Love-type trapped waves. These results, comprising nearly 30,000 separate coseismic waveforms, support the consistent interpretation of a 450m wide asymmetric damage zone, with the lowest velocities offset to the northeast of the mapped surface trace by 100m. This asymmetric damage zone has important implications for the earthquake dynamics of the San Jacinto and especially its ability to generate damaging multi-segment ruptures.
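The three per-station statistics above (maximum amplitude, mean amplitude, and total energy as the integral of the envelope) are straightforward to compute from a discretely sampled waveform. The sketch below is a minimal, hypothetical illustration rather than the authors' code; it uses the rectified signal as a crude stand-in for the envelope, where a real analysis would typically use a Hilbert-transform envelope:

```python
def waveform_stats(samples, dt):
    """Per-station statistics from one coseismic waveform.

    samples: list of ground-motion samples; dt: sample interval in seconds.
    Uses |x(t)| as a crude envelope (a Hilbert envelope would be smoother).
    """
    env = [abs(s) for s in samples]          # rectified signal as envelope proxy
    max_amp = max(env)                       # maximum amplitude
    mean_amp = sum(env) / len(env)           # mean amplitude
    total_energy = sum(env) * dt             # Riemann-sum integral of the envelope
    return max_amp, mean_amp, total_energy
```

Comparing these three quantities station by station across the array is what reveals the amplitude decay with fault-normal distance described in the abstract.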
CFD in Support of Wind Tunnel Testing for Aircraft/Weapons Integration
2004-06-01
... Warming flux vector splitting scheme. Viscous fluxes (computed using spatial central differencing) ... factors to eliminate them from the current computation ... performed. The grid system consisted of 18 x 10^6 points ... These newly i-blanked grid ... 273-295. 14. van Leer, B., "Towards the Ultimate Conservative Difference Scheme V." 18. Suhs, N.E., and R.W. Tramel, "PEGSUS 4.0 Users Manual."
Quade, G; Novotny, J; Burde, B; May, F; Beck, L E; Goldschmidt, A
1999-01-01
A distributed multimedia electronic patient record (EPR) is a central component of a medicine-telematics application that supports physicians working in rural areas of South America, and offers medical services to scientists in Antarctica. A Hyperwave server is used to maintain the patient record. Unlike common web servers, Hyperwave, a second-generation web server, provides the capability of holding documents in a distributed web space without the problem of broken links. This enables physicians to browse through a patient's record using a standard browser even if the record is distributed over several servers. The patient record is implemented on the "Good European Health Record" (GEHR) architecture.
NASA Technical Reports Server (NTRS)
1978-01-01
In the photo, employees of the UAB Bank, Knoxville, Tennessee, are using Teller Transaction Terminals manufactured by SCI Systems, Inc., Huntsville, Alabama, an electronics firm which has worked on a number of space projects under contract with NASA. The terminals are part of an advanced, computerized financial transaction system that offers high efficiency in bank operations. The key to the system's efficiency is a "multiplexing" technique developed for NASA's Space Shuttle. Multiplexing is simultaneous transmission of large amounts of data over a single transmission link at very high rates of speed. In the banking application, a small multiplex "data bus" interconnects all the terminals and a central computer which stores information on clients' accounts. The data bus replaces the maze of wiring that would be needed to connect each terminal separately and it affords greater speed in recording transactions. The SCI system offers banks real-time data management through constant updating of the central computer. For example, a check is immediately cancelled at the teller's terminal and the computer is simultaneously advised of the transaction; under other methods, the check would be cancelled and the transaction recorded at the close of business. Teller checkout at the end of the day, conventionally a time-consuming matter of processing paper, can be accomplished in minutes by calling up a summary of the day's transactions. SCI manufactures other types of terminals for use in the system, such as an administrative terminal that provides an immediate printout of a client's account, and another for printing and recording savings account deposits and withdrawals. SCI systems have been installed in several banks in Tennessee, Arizona, and Oregon and additional installations are scheduled this year.
Virtual Record Keeping: Should Teachers Keep Online Grade Books?
ERIC Educational Resources Information Center
Lacina, Jan
2006-01-01
Teaching and learning have radically changed with advances in technology. Research shows that the computer can be an effective tool in both teaching and learning, and for that reason, school districts throughout the United States support schools by purchasing computers and software for individual classrooms. As a result, many school districts are using…
Techniques for Soundscape Retrieval and Synthesis
NASA Astrophysics Data System (ADS)
Mechtley, Brandon Michael
The study of acoustic ecology is concerned with the manner in which life interacts with its environment as mediated through sound. As such, a central focus is that of the soundscape: the acoustic environment as perceived by a listener. This dissertation examines the application of several computational tools in the realms of digital signal processing, multimedia information retrieval, and computer music synthesis to the analysis of the soundscape. Namely, these tools include a) an open source software library, Sirens, which can be used for the segmentation of long environmental field recordings into individual sonic events and compare these events in terms of acoustic content, b) a graph-based retrieval system that can use these measures of acoustic similarity and measures of semantic similarity using the lexical database WordNet to perform both text-based retrieval and automatic annotation of environmental sounds, and c) new techniques for the dynamic, realtime parametric morphing of multiple field recordings, informed by the geographic paths along which they were recorded.
Solving large mixed linear models using preconditioned conjugate gradient iteration.
Strandén, I; Lidauer, M
1999-12-01
Continuous evaluation of dairy cattle with a random regression test-day model requires a fast solving method and algorithm. A new computing technique feasible in Jacobi and conjugate gradient based iterative methods using iteration on data is presented. In the new computing technique, the calculations in multiplying a vector by a matrix were reorganized into three steps instead of the commonly used two steps. The three-step method was implemented in a general mixed linear model program that used preconditioned conjugate gradient iteration. Performance of this program in comparison to other general solving programs was assessed via estimation of breeding values using univariate, multivariate, and random regression test-day models. Central processing unit time per iteration with the new three-step technique was, at best, one-third that needed with the old technique. Performance was best with the test-day model, which was the largest and most complex model used. The new program did well in comparison to other general software. Programs keeping the mixed model equations in random access memory required at least 20 and 435% more time to solve the univariate and multivariate animal models, respectively. Computations of the second-best iteration-on-data program took approximately three and five times longer for the animal and test-day models, respectively, than did the new program. Good performance was due to fast computing time per iteration and quick convergence to the final solutions. Use of preconditioned conjugate gradient based methods in solving large breeding value problems is supported by our findings.
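For context on the solver family used above, here is a minimal Jacobi-preconditioned conjugate gradient (PCG) sketch for a symmetric positive definite system. This is a generic textbook PCG, not the paper's three-step iteration-on-data implementation; all names and parameters are illustrative:

```python
def pcg(A, b, tol=1e-10, max_iter=100):
    """Solve A x = b by preconditioned conjugate gradients.

    A: dense SPD matrix as list of lists; b: right-hand side.
    Uses the Jacobi preconditioner M = diag(A), i.e. z = r / a_ii.
    """
    n = len(b)
    def matvec(M, v):
        return [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]

    x = [0.0] * n
    r = b[:]                                  # residual r = b - A x with x = 0
    z = [r[i] / A[i][i] for i in range(n)]    # apply Jacobi preconditioner
    p = z[:]
    rz = sum(r[i] * z[i] for i in range(n))
    for _ in range(max_iter):
        Ap = matvec(A, p)
        alpha = rz / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        if sum(ri * ri for ri in r) ** 0.5 < tol:
            break
        z = [r[i] / A[i][i] for i in range(n)]
        rz_new = sum(r[i] * z[i] for i in range(n))
        beta = rz_new / rz                    # update search direction
        p = [z[i] + beta * p[i] for i in range(n)]
        rz = rz_new
    return x
```

In the paper's setting the matrix-vector product is never formed from an explicit matrix; it is assembled record by record ("iteration on data"), which is where the two-step versus three-step reorganization pays off.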
NASA Astrophysics Data System (ADS)
Kim, Sung-Phil; Simeral, John D.; Hochberg, Leigh R.; Donoghue, John P.; Black, Michael J.
2008-12-01
Computer-mediated connections between human motor cortical neurons and assistive devices promise to improve or restore lost function in people with paralysis. Recently, a pilot clinical study of an intracortical neural interface system demonstrated that a tetraplegic human was able to obtain continuous two-dimensional control of a computer cursor using neural activity recorded from his motor cortex. This control, however, was not sufficiently accurate for reliable use in many common computer control tasks. Here, we studied several central design choices for such a system including the kinematic representation for cursor movement, the decoding method that translates neuronal ensemble spiking activity into a control signal and the cursor control task used during training for optimizing the parameters of the decoding method. In two tetraplegic participants, we found that controlling a cursor's velocity resulted in more accurate closed-loop control than controlling its position directly and that cursor velocity control was achieved more rapidly than position control. Control quality was further improved over conventional linear filters by using a probabilistic method, the Kalman filter, to decode human motor cortical activity. Performance assessment based on standard metrics used for the evaluation of a wide range of pointing devices demonstrated significantly improved cursor control with velocity rather than position decoding. Disclosure. JPD is the Chief Scientific Officer and a director of Cyberkinetics Neurotechnology Systems (CYKN); he holds stock and receives compensation. JDS has been a consultant for CYKN. LRH receives clinical trial support from CYKN.
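The decoding step described above can be illustrated with a toy one-dimensional Kalman filter. This is a generic textbook filter with made-up noise parameters, not the neural decoder or the fitted model from the study; it simply shows the predict/update recursion that the Kalman filter applies to each new observation:

```python
def kalman_1d(zs, A=1.0, H=1.0, Q=1e-3, R=0.1, x0=0.0, P0=1.0):
    """Scalar Kalman filter (hypothetical parameters).

    zs: sequence of noisy observations of a hidden state (e.g. cursor velocity).
    A: state transition, H: observation model, Q/R: process/observation noise.
    Returns the filtered state estimate after each observation.
    """
    x, P = x0, P0
    estimates = []
    for z in zs:
        # Predict: propagate state estimate and its variance forward.
        x = A * x
        P = A * P * A + Q
        # Update: blend prediction with the observation via the Kalman gain.
        K = P * H / (H * P * H + R)
        x = x + K * (z - H * x)
        P = (1 - K * H) * P
        estimates.append(x)
    return estimates
```

In the study the state is the cursor kinematics (velocity worked better than position) and the observation is the neuronal ensemble spiking activity, with matrix-valued A, H, Q, and R fit from training data.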
Speed-Discussion: Engaging Students in Class Discussions
ERIC Educational Resources Information Center
Kornfield, Sarah; Noack, Kristen
2017-01-01
Courses: Communication Criticism, Rhetorical Criticism, Family and Communication, Gender and Communication, Popular Communication, and theory-based courses. Objectives: This activity engages students in dynamic, supportive, social discussion groups; helps them to identify and review the central ideas from the reading; and creates a record of their…
NASA Astrophysics Data System (ADS)
Mujal, Eudald; Fortuny, Josep; Marmi, Josep; Dinarès-Turell, Jaume; Bolet, Arnau; Oms, Oriol
2018-01-01
The Carboniferous-Permian terrestrial successions record a global climatic shift from icehouse to hothouse conditions. Our multidisciplinary study documents an aridification trend throughout the 1000 m thick composite terrestrial succession of the western Catalan Pyrenees (NE Iberian Peninsula), representing this time period. The detailed stratigraphic framework integrates sedimentology, paleopedology, biochronology (plant fossils and tetrapod footprints) and geochronology (paleomagnetism). Additional absolute age correlation is also carried out. The new and reviewed data show that the late Carboniferous wet environments (with short drought periods) progressively changed to a strong seasonal semi-arid and arid climate (with short humid periods) through the early Permian. This paleoclimatic trend supports the previously suggested aridification of the Pangean pan-tropical belt, and supports the hypothesis of the influence of the recurrent climatic fluctuations in Central Pangea, being tentatively correlated to the Southern Gondwanan glaciation-deglaciation periods. Therefore, the Carboniferous-Permian terrestrial succession from the Catalan Pyrenees emerges as a continuous record that can help to constrain late Paleozoic paleoenvironmental events.
Sailer, Irena; Benic, Goran I; Fehmer, Vincent; Hämmerle, Christoph H F; Mühlemann, Sven
2017-07-01
Clinical studies are needed to evaluate the entire digital and conventional workflows in prosthetic dentistry. The purpose of the second part of this clinical study was to compare the laboratory production time for tooth-supported single crowns made with 4 different digital workflows and 1 conventional workflow and to compare these crowns clinically. For each of 10 participants, a monolithic crown was fabricated in lithium disilicate-reinforced glass ceramic (IPS e.max CAD). The computer-aided design and computer-aided manufacturing (CAD-CAM) systems were Lava C.O.S. CAD software and centralized CAM (group L), Cares CAD software and centralized CAM (group iT), Cerec Connect CAD software and lab side CAM (group CiL), and Cerec Connect CAD software with centralized CAM (group CiD). The conventional fabrication (group K) included a wax pattern of the crown and heat pressing according to the lost-wax technique (IPS e.max Press). The time for the fabrication of the casts and the crowns was recorded. Subsequently, the crowns were clinically evaluated and the corresponding treatment times were recorded. The Paired Wilcoxon test with the Bonferroni correction was applied to detect differences among treatment groups (α=.05). The total mean (±standard deviation) active working time for the dental technician was 88 ±6 minutes in group L, 74 ±12 minutes in group iT, 74 ±5 minutes in group CiL, 92 ±8 minutes in group CiD, and 148 ±11 minutes in group K. The dental technician spent significantly more working time for the conventional workflow than for the digital workflows (P<.001). No statistically significant differences were found between group L and group CiD or between group iT and group CiL. No statistical differences in time for the clinical evaluation were found among groups, indicating similar outcomes (P>.05). 
Irrespective of the CAD-CAM system, the overall laboratory working time for a digital workflow was significantly shorter than for the conventional workflow, since the dental technician needed less active working time. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
DOT National Transportation Integrated Search
2004-10-01
Communications in current railroad operations rely heavily on voice communications. Radio congestion impairs roadway workers' ability to communicate effectively with dispatchers at the Central Traffic Control Center and has adverse consequences for...
DOT National Transportation Integrated Search
2004-10-31
Communications in current railroad operations rely heavily on voice communications. Radio congestion impairs roadway workers' ability to communicate effectively with dispatchers at the Central Traffic Control Center and has adverse consequences for...
NASA Astrophysics Data System (ADS)
Mehta, Neville; Kompalli, Suryaprakash; Chaudhary, Vipin
Teleradiology is the electronic transmission of radiological patient images, such as x-rays, CT, or MR across multiple locations. The goal could be interpretation, consultation, or medical records keeping. Information technology solutions have enabled electronic records and their associated benefits are evident in health care today. However, salient aspects of collaborative interfaces, and computer assisted diagnostic (CAD) tools are yet to be integrated into workflow designs. The Computer Assisted Diagnostics and Interventions (CADI) group at the University at Buffalo has developed an architecture that facilitates web-enabled use of CAD tools, along with the novel concept of synchronized collaboration. The architecture can support multiple teleradiology applications and case studies are presented here.
Data acquisition and analysis in the DOE/NASA Wind Energy Program
NASA Technical Reports Server (NTRS)
Neustadter, H. E.
1980-01-01
Four categories of data systems, each responding to a distinct information need are presented. The categories are: control, technology, engineering and performance. The focus is on the technology data system which consists of the following elements: sensors which measure critical parameters such as wind speed and direction, output power, blade loads and strains, and tower vibrations; remote multiplexing units (RMU) mounted on each wind turbine which frequency modulate, multiplex and transmit sensor outputs; the instrumentation available to record, process and display these signals; and centralized computer analysis of data. The RMU characteristics and multiplexing techniques are presented. Data processing is illustrated by following a typical signal through instruments such as the analog tape recorder, analog to digital converter, data compressor, digital tape recorder, video (CRT) display, and strip chart recorder.
NIMH Prototype Management Information System for Community Mental Health Centers
Wurster, Cecil R.; Goodman, John D.
1980-01-01
Various approaches to centralized support of computer applications in health care are described. The NIMH project to develop a prototype Management Information System (MIS) for community mental health centers is presented and discussed as a centralized development of an automated data processing system for multiple user organizations. The NIMH program is summarized, the prototype MIS is characterized, and steps taken to provide for the differing needs of the mental health centers are highlighted.
Secure Dynamic access control scheme of PHR in cloud computing.
Chen, Tzer-Shyong; Liu, Chia-Hui; Chen, Tzer-Long; Chen, Chin-Sheng; Bau, Jian-Guo; Lin, Tzu-Ching
2012-12-01
With the development of information technology and medical technology, medical information has evolved from traditional paper records into electronic medical records, which are now widely applied. A new style of medical information exchange system, the personal health record (PHR), is gradually being developed. A PHR is a health record maintained and recorded by the individual. An ideal personal health record integrates personal medical information from different sources and provides a complete and correct personal health and medical summary through the Internet or portable media under the requirements of security and privacy. Many personal health records are now in use. The patient-centered PHR information exchange system allows the public to autonomously maintain and manage their personal health records. Such management is convenient for storing, accessing, and sharing personal medical records. With the emergence of Cloud computing, PHR services have moved to storing data on Cloud servers, so that resources can be flexibly utilized and operating costs reduced. Nevertheless, patients face privacy problems when storing PHR data in the Cloud. Moreover, storing PHRs on a Cloud server requires a secure protection scheme to encrypt the medical records of each patient. In the encryption process, it is a challenge to achieve accurate access to medical records while maintaining flexibility and efficiency. A new PHR access control scheme for Cloud computing environments is proposed in this study. Using Lagrange interpolation polynomials to establish a secure and effective PHR information access scheme, it allows accurate and secure access to PHRs and is suitable for very large numbers of users. Moreover, the scheme dynamically supports multiple users in Cloud computing environments with personal privacy and grants legal authorities access to PHRs.
Security and effectiveness analyses show that the proposed PHR access scheme for Cloud computing environments is flexible and secure, and effectively supports real-time appending and deletion of user access authorizations and appending and revision of PHR records.
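The Lagrange-interpolation mechanism mentioned above is the core of threshold (Shamir-style) secret sharing: a secret key is embedded as the constant term of a random polynomial of degree k-1 over a prime field, n shares are points on that polynomial, and any k of them reconstruct the secret by Lagrange interpolation at x = 0. The sketch below is a generic illustration of that primitive under these assumptions, not the paper's full PHR access control scheme:

```python
import random

P = 2**61 - 1  # a Mersenne prime; all arithmetic is in the field GF(P)

def make_shares(secret, k, n):
    """Split `secret` into n shares; any k of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):  # Horner evaluation of the polynomial mod P
        acc = 0
        for c in reversed(coeffs):
            acc = (acc * x + c) % P
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P         # numerator:   prod (0 - xj)
                den = den * (xi - xj) % P     # denominator: prod (xi - xj)
        # pow(den, P-2, P) is the modular inverse, since P is prime
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret
```

With fewer than k shares, every candidate secret remains equally likely, which is what makes the threshold property useful for gating access to an encrypted record.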
Digital communication support and Alzheimer's disease.
Ekström, Anna; Ferm, Ulrika; Samuelsson, Christina
2017-08-01
Communication is one of the areas where people with dementia and their caregivers experience most challenges. The purpose of this study is to contribute to the understanding of possibilities and pitfalls of using personalized communication applications installed on tablet computers to support communication for people with dementia and their conversational partners. The study is based on video recordings of a woman, 52 years old, with Alzheimer's disease interacting with her husband in their home. The couple was recorded interacting with and without a tablet computer including a personalized communication application. The results from the present study reveal both significant possibilities and potential difficulties in introducing a digital communication device to people with dementia and their conversational partners. For the woman in the present study, the amount of interactive actions and the number of communicative actions seem to increase with the use of the communication application. The results also indicate that problems associated with dementia are foregrounded in interaction where the tablet computer is used.
ERIC Educational Resources Information Center
Padilla Mercado, Jeralyne B.; Coombs, Eri M.; De Jesus, Jenny P.; Bretz, Stacey Lowery; Danielson, Neil D.
2018-01-01
Multifunctional chemical analysis (MCA) systems provide a viable alternative for large scale instruction while supporting a hands-on approach to more advanced instrumentation. These systems are robust and typically use student stations connected to a remote central computer for data collection, minimizing the need for computers at every student…
Mensing, Scott A.; Sharpe, Saxon E.; Tunno, Irene; Sada, Don W.; Thomas, Jim M.; Starratt, Scott W.; Smith, Jeremy
2013-01-01
Evidence of a multi-centennial scale dry period between ∼2800 and 1850 cal yr BP is documented by pollen, mollusks, diatoms, and sediment in spring sediments from Stonehouse Meadow in Spring Valley, eastern central Nevada, U.S. We refer to this period as the Late Holocene Dry Period. Based on sediment recovered, Stonehouse Meadow was either absent or severely restricted in size at ∼8000 cal yr BP. Beginning ∼7500 cal yr BP, the meadow became established and persisted to ∼3000 cal yr BP, when it began to dry. Comparison of the timing of this late Holocene drought record with multiple records extending from the eastern Sierra Nevada across the central Great Basin to the Great Salt Lake supports the interpretation that this dry period was regional. The beginning and ending dates vary among sites, but all sites record multiple centuries of dry climate between 2500 and 1900 cal yr BP. This duration makes it the longest persistent dry period within the late Holocene. In contrast, sites in the northern Great Basin record either no clear evidence of drought, or wetter-than-average climate during this period, suggesting that the northern boundary between wet and dry climates may have been between about 40° and 42° N latitude. This pattern of dry conditions in the southwestern and wet conditions in the northwestern Great Basin is supported by large-scale spatial climate pattern hypotheses involving ENSO, PDO, AMO, and the position of the Aleutian Low and North Pacific High, particularly during winter.
A Web-based home welfare and care services support system using a pen type image sensor.
Ogawa, Hidekuni; Yonezawa, Yoshiharu; Maki, Hiromichi; Sato, Haruhiko; Hahn, Allen W; Caldwell, W Morton
2003-01-01
A long-term care insurance law for elderly persons was put into force two years ago in Japan. The Home Helpers, who are employed by hospitals, care companies or the welfare office, provide home welfare and care services for the elderly, such as cooking, bathing, washing, cleaning, shopping, etc. We developed a web-based home welfare and care services support system using wireless Internet mobile phones and Internet client computers, which employs a pen type image sensor. The pen type image sensor is used by the elderly people as the entry device for their care requests. The client computer sends the requests to the server computer in the Home Helper central office, and then the server computer automatically transfers them to the Home Helper's mobile phone. This newly-developed home welfare and care services support system is easily operated by elderly persons and enables Home Helpers to save a significant amount of time and extra travel.
Khipu accounting in ancient Peru.
Urton, Gary; Brezine, Carrie J
2005-08-12
Khipu are knotted-string devices that were used for bureaucratic recording and communication in the Inka Empire. We recently undertook a computer analysis of 21 khipu from the Inka administrative center of Puruchuco, on the central coast of Peru. Results indicate that this khipu archive exemplifies the way in which census and tribute data were synthesized, manipulated, and transferred between different accounting levels in the Inka administrative system.
NASA Astrophysics Data System (ADS)
Fornace, Kyrstin L.; Hughen, Konrad A.; Shanahan, Timothy M.; Fritz, Sherilyn C.; Baker, Paul A.; Sylva, Sean P.
2014-12-01
A record of the hydrogen isotopic composition of terrestrial leaf waxes (δDwax) in sediment cores from Lake Titicaca provides new insight into the precipitation history of the Central Andes and controls of South American Summer Monsoon (SASM) variability since the last glacial period. Comparison of the δDwax record with a 19-kyr δD record from the nearby Illimani ice core supports the interpretation that precipitation δD is the primary control on δDwax with a lesser but significant role for local evapotranspiration and other secondary influences on δDwax. The Titicaca δDwax record confirms overall wetter conditions in the Central Andes during the last glacial period relative to a drier Holocene. During the last deglaciation, abrupt δDwax shifts correspond to millennial-scale events observed in the high-latitude North Atlantic, with dry conditions corresponding to the Bølling-Allerød and early Holocene periods and wetter conditions during late glacial and Younger Dryas intervals. We observe a trend of increasing monsoonal precipitation from the early to the late Holocene, consistent with summer insolation forcing of the SASM, but similar hydrologic variability on precessional timescales is not apparent during the last glacial period. Overall, this study demonstrates the relative importance of high-latitude versus tropical forcing as a dominant control on glacial SASM precipitation variability.
NASA Technical Reports Server (NTRS)
Felberg, F. H.
1984-01-01
The Jet Propulsion Laboratory, a research and development organization with about 5,000 employees, presents a complicated set of requirements for an institutional system of computing and informational services. The approach taken by JPL in meeting this challenge is one of controlled flexibility. A central communications network is provided, together with selected computing facilities for common use. At the same time, staff members are given considerable discretion in choosing the mini- and microcomputers that they believe will best serve their needs. Consultation services, computer education, and other support functions are also provided.
76 FR 21373 - Privacy Act of 1974; Report of a New System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-15
... Information Security Management Act of 2002; the Computer Fraud and Abuse Act of 1986; the Health Insurance... 1974; the Federal Information Security Management Act of 2002; the Computer Fraud and Abuse Act of 1986... established by State law; (3) support litigation involving the Agency; (4) combat fraud, waste, and abuse in...
Love, Erika; Butzin, Diane; Robinson, Robert E.; Lee, Soo
1971-01-01
A project to recatalog and reclassify the book collection of the Bowman Gray School of Medicine Library utilizing the Magnetic Tape/Selectric Typewriter system for simultaneous catalog card production and computer-stored data acquisition marks the beginning of eventual computerization of all library operations. A keyboard optical display system will be added by late 1970. Major input operations requiring the creation of “hard copy” will continue via the MTST system. Updating, editing, and retrieval operations, as well as input without hard copy production, will be done through the “on-line” keyboard optical display system. Once the library's first data bank, the book catalog, has been established, the computer may be consulted directly for library holdings from any optical display terminal throughout the medical center. Three basic information retrieval operations may be carried out through “on-line” optical display terminals. Output options include the reproduction of part or all of a given document, or the generation of statistical data derived from two Acquisition Code lines. The creation of a central bibliographic record of Bowman Gray faculty publications, patterned after the cataloging program, is presently under way. The cataloging and computer storage of serial holdings records will begin after completion of the reclassification project. All acquisitions added to the collection since October 1967 are computer-stored and fully retrievable. Reclassification of older titles will be completed in early 1971. PMID:5542915
"Tree Investigators": Supporting Families' Scientific Talk in an Arboretum with Mobile Computers
ERIC Educational Resources Information Center
Zimmerman, Heather Toomey; Land, Susan M.; McClain, Lucy R.; Mohney, Michael R.; Choi, Gi Woong; Salman, Fariha H.
2015-01-01
This research examines the "Tree Investigators" project to support science learning with mobile devices during family public programmes in an arboretum. Using a case study methodology, researchers analysed video records of 10 families (25 people) using mobile technologies with naturalists at an arboretum to understand how mobile devices…
Uniformity testing: assessment of a centralized web-based uniformity analysis system.
Klempa, Meaghan C
2011-06-01
Uniformity testing is performed daily to ensure adequate camera performance before clinical use. The aim of this study is to assess the reliability of Beth Israel Deaconess Medical Center's locally built, centralized, Web-based uniformity analysis system by examining the differences between manufacturer and Web-based National Electrical Manufacturers Association integral uniformity calculations measured in the useful field of view (FOV) and the central FOV. Manufacturer and Web-based integral uniformity calculations measured in the useful FOV and the central FOV were recorded over a 30-d period for 4 cameras from 3 different manufacturers. These data were then statistically analyzed. The differences between the uniformity calculations were computed, in addition to the means and the SDs of these differences for each head of each camera. There was a correlation between the manufacturer and Web-based integral uniformity calculations in the useful FOV and the central FOV over the 30-d period. The average differences between the manufacturer and Web-based useful FOV calculations ranged from -0.30 to 0.099, with SD ranging from 0.092 to 0.32. For the central FOV calculations, the average differences ranged from -0.163 to 0.055, with SD ranging from 0.074 to 0.24. Most of the uniformity calculations computed by this centralized Web-based uniformity analysis system are comparable to the manufacturers' calculations, suggesting that this system is reasonably reliable and effective. This finding is important because centralized Web-based uniformity analysis systems are advantageous in that they test camera performance in the same manner regardless of the manufacturer.
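The integral uniformity figures being compared above follow the standard NEMA definition, 100 × (max − min)/(max + min), computed over the pixel counts in a field of view. A minimal numpy sketch of that calculation (illustrative only; the 75% central-FOV fraction is an assumed convention, not the vendors' or the Web-based system's actual code):

```python
import numpy as np

def integral_uniformity(counts: np.ndarray) -> float:
    """NEMA integral uniformity: 100 * (max - min) / (max + min)
    over the pixel counts within a field of view."""
    cmax, cmin = counts.max(), counts.min()
    return 100.0 * (cmax - cmin) / (cmax + cmin)

def central_fov(counts: np.ndarray, fraction: float = 0.75) -> np.ndarray:
    """Crop to the central FOV (assumed here to be the inner 75%
    of the useful FOV, a common convention)."""
    h, w = counts.shape
    dh, dw = int(h * (1 - fraction) / 2), int(w * (1 - fraction) / 2)
    return counts[dh:h - dh, dw:w - dw]

# Synthetic flood image: flat counts with one hot pixel
flood = np.full((64, 64), 1000.0)
flood[32, 32] = 1100.0
print(round(integral_uniformity(flood), 2))  # → 4.76
```

Running the same definition on every camera's flood image, regardless of manufacturer, is what lets a centralized system compare heads on equal terms.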
2013-03-01
Allen 1974, 1978; Bridge and Leeder 1979; Mackey and Bridge 1992) that computes synthetic stratigraphy for a floodplain cross section. The model...typical of that used to record and communicate geologic information for engineering applications. The computed stratigraphy differentiates between...belt dimensions measured for two well-studied river systems: (A) the Linge River within the Rhine-Meuse Delta , Netherlands, and (B) the Lower
Elasmobranch bycatch in the Italian Adriatic pelagic trawl fishery
Bonanomi, Sara; Pulcinella, Jacopo; Fortuna, Caterina Maria; Moro, Fabrizio; Sala, Antonello
2018-01-01
Elasmobranchs are among the most threatened long-lived marine species worldwide, and incidental capture is a major source of mortality. The northern central Adriatic Sea, though one of the most overfished basins of the Mediterranean Sea, supports a very valuable marine biodiversity, including elasmobranchs. This study assesses the impact of the northern central Adriatic pelagic trawl fishery on common smooth-hound (Mustelus mustelus), spiny dogfish (Squalus acanthias), common eagle ray (Myliobatis aquila), and pelagic stingray (Pteroplatytrygon violacea) by examining incidental catches recorded between 2006 and 2015. The distribution of bycatch events was evaluated using geo-referenced data. Generalized Linear Models were computed to standardize the catch of the four species and to predict the relative abundance of bycatch events. Data analysis shows that most bycatch events involving all four species occurred in the northern Adriatic Sea. The models predicted significant, distinct temporal patterns of standardized catches in line with previous investigations. Water depth, season, and fishing region were the best predictors to explain bycatch events. The present data suggest that the northern Adriatic may be an important nursery area for several elasmobranchs. They also highlight the urgent need for a better understanding of the interactions between elasmobranchs and fisheries to develop and apply suitable, ad hoc management measures. PMID:29377920
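The catch-standardization step can be sketched on synthetic data. This is a numpy-only Poisson log-link GLM fitted by iteratively reweighted least squares; the covariates, sample sizes, and model family are illustrative stand-ins, not the study's actual specification:

```python
import numpy as np

def poisson_irls(X, y, n_iter=25):
    """Fit a Poisson GLM (log link) by iteratively reweighted least squares."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)              # fitted means under current beta
        W = mu                             # Poisson working weights
        z = X @ beta + (y - mu) / mu       # working response
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

rng = np.random.default_rng(1)
n = 2000
depth = rng.uniform(10, 80, n)             # water depth, a key predictor above
region_north = rng.integers(0, 2, n)       # dummy for fishing region
X = np.column_stack([np.ones(n), depth / 100, region_north])
true_beta = np.array([0.5, -1.0, 0.8])
y = rng.poisson(np.exp(X @ true_beta))     # simulated bycatch counts

beta_hat = poisson_irls(X, y)
print(np.round(beta_hat, 1))
```

The fitted coefficients recover the simulated effects of depth and region, which is the sense in which a GLM "standardizes" catch across unevenly distributed fishing effort.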
Colligan, Lacey; Potts, Henry W W; Finn, Chelsea T; Sinkin, Robert A
2015-07-01
Healthcare institutions worldwide are moving to electronic health records (EHRs). These transitions are particularly numerous in the US where healthcare systems are purchasing and implementing commercial EHRs to fulfill federal requirements. Despite the central role of EHRs to workflow, the cognitive impact of these transitions on the workforce has not been widely studied. This study assesses the changes in cognitive workload among pediatric nurses during data entry and retrieval tasks during transition from a hybrid electronic and paper information system to a commercial EHR. Baseline demographics and computer attitude and skills scores were obtained from 74 pediatric nurses in two wards. They also completed an established and validated instrument, the NASA-TLX, that is designed to measure cognitive workload; this instrument was used to evaluate cognitive workload of data entry and retrieval. The NASA-TLX was administered at baseline (pre-implementation), 1, 5 and 10 shifts and 4 months post-implementation of the new EHR. Most nurse participants experienced significant increases of cognitive workload at 1 and 5 shifts after "go-live". These increases abated at differing rates predicted by participants' computer attitudes scores (p = 0.01). There is substantially increased cognitive workload for nurses during the early phases (1-5 shifts) of EHR transitions. Health systems should anticipate variability across workers adapting to "meaningful use" EHRs. "One-size-fits-all" training strategies may not be suitable and longer periods of technical support may be necessary for some workers. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Venkata, Manjunath Gorentla; Aderholdt, William F
Pre-exascale systems are expected to have a significant amount of hierarchical and heterogeneous on-node memory, and this trend in system architecture is expected to continue into the exascale era. Along with hierarchical-heterogeneous memory, such a system typically has a high-performing network and a compute accelerator. This system architecture is effective not only for running traditional High Performance Computing (HPC) applications (Big-Compute), but also for running data-intensive HPC applications and Big-Data applications. As a consequence, there is a growing desire to have a single system serve the needs of both Big-Compute and Big-Data applications. Though the system architecture supports the convergence of Big-Compute and Big-Data, the programming models and software layer have yet to evolve to support either hierarchical-heterogeneous memory systems or this convergence. We present a programming abstraction to address this problem. The programming abstraction is implemented as a software library and runs on pre-exascale and exascale systems supporting current and emerging system architectures. Using distributed data structures as a central concept, it provides (1) a simple, usable, and portable abstraction for hierarchical-heterogeneous memory and (2) a unified programming abstraction for Big-Compute and Big-Data applications.
Information Security and the Internet.
ERIC Educational Resources Information Center
Doddrell, Gregory R.
1996-01-01
As business relies less on "fortress" style central computers and more on distributed systems, the risk of disruption increases because of inadequate physical security, support services, and site monitoring. This article discusses information security and why protection is required on the Internet, presents a best practice firewall, and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
1996-05-01
The Network Information System (NWIS) was initially implemented in May 1996 as a system in which computing devices could be recorded so that unique names could be generated for each device. Since then the system has grown into an enterprise-wide information system which is integrated with other systems to provide the seamless flow of data through the enterprise. The system tracks data for two main entities, people and computing devices, performing the following functions for each:

People
- Provides source information to the enterprise person data repository for select contractors and visitors
- Generates and tracks unique usernames and Unix user IDs for every individual granted cyber access
- Tracks accounts for centrally managed computing resources, and monitors and controls the reauthorization of the accounts in accordance with the DOE-mandated interval

Computing Devices
- Generates unique names for all computing devices registered in the system
- Tracks the following information for each computing device: manufacturer, make, model, Sandia property number, vendor serial number, operating system and operating system version, owner, device location, amount of memory, amount of disk space, and level of support provided for the machine
- Tracks the hardware address for network cards
- Tracks the IP address registered to computing devices along with the canonical and alias names for each address
- Updates the Dynamic Domain Name Service (DDNS) for canonical and alias names
- Creates the configuration files for DHCP to control the DHCP ranges and allow access to only properly registered computers
- Tracks and monitors classified security plans for stand-alone computers
- Tracks the configuration requirements used to set up the machine
- Tracks the roles people have on machines (system administrator, administrative access, user, etc.)
- Allows system administrators to track changes made on the machine (both hardware and software)
- Generates an adjustment history of changes on selected fields
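One of the functions listed above, emitting DHCP configuration from the device registry so that only properly registered computers get addresses, can be sketched in a few lines. The field names and the ISC-dhcpd host-entry format here are illustrative assumptions, not NWIS's actual schema or output:

```python
# Hypothetical registry records; "name", "mac", and "ip" are invented fields.
devices = [
    {"name": "ws-0001", "mac": "00:1a:2b:3c:4d:5e", "ip": "10.0.0.11"},
    {"name": "ws-0002", "mac": "00:1a:2b:3c:4d:5f", "ip": "10.0.0.12"},
]

def dhcp_host_entry(dev: dict) -> str:
    """Render one registered device as an ISC-dhcpd host declaration."""
    return (f"host {dev['name']} {{\n"
            f"  hardware ethernet {dev['mac']};\n"
            f"  fixed-address {dev['ip']};\n"
            f"}}")

config = "\n".join(dhcp_host_entry(d) for d in devices)
print(config)
```

Because only registered MAC addresses appear in the generated file, an unregistered machine simply never receives a fixed address, which is the access-control effect the abstract describes.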
McCormack, Jane; Baker, Elise; Masso, Sarah; Crowe, Kathryn; McLeod, Sharynne; Wren, Yvonne; Roulstone, Sue
2017-06-01
Implementation fidelity refers to the degree to which an intervention or programme adheres to its original design. This paper examines implementation fidelity in the Sound Start Study, a clustered randomised controlled trial of computer-assisted support for children with speech sound disorders (SSD). Sixty-three children with SSD in 19 early childhood centres received computer-assisted support (Phoneme Factory Sound Sorter [PFSS] - Australian version). Educators facilitated the delivery of PFSS targeting phonological error patterns identified by a speech-language pathologist. Implementation data were gathered via (1) the computer software, which recorded when and how much intervention was completed over 9 weeks; (2) educators' records of practice sessions; and (3) scoring of fidelity (intervention procedure, competence and quality of delivery) from videos of intervention sessions. Less than one-third of children received the prescribed number of days of intervention, while approximately one-half participated in the prescribed number of intervention plays. Computer data differed from educators' data for total number of days and plays in which children participated; the degree of match was lower as data became more specific. Fidelity to intervention procedures, competency and quality of delivery was high. Implementation fidelity may impact intervention outcomes and so needs to be measured in intervention research; however, the way in which it is measured may impact on data.
Heat-assisted magnetic recording of bit-patterned media beyond 10 Tb/in2
NASA Astrophysics Data System (ADS)
Vogler, Christoph; Abert, Claas; Bruckner, Florian; Suess, Dieter; Praetorius, Dirk
2016-03-01
The limits of areal storage density that is achievable with heat-assisted magnetic recording are unknown. We addressed this central question and investigated the areal density of bit-patterned media. We analyzed the detailed switching behavior of a recording bit under various external conditions, allowing us to compute the bit error rate of a write process (shingled and conventional) for various grain spacings, write head positions, and write temperatures. Hence, we were able to optimize the areal density yielding values beyond 10 Tb/in2. Our model is based on the Landau-Lifshitz-Bloch equation and uses hard magnetic recording grains with a 5-nm diameter and 10-nm height. It assumes a realistic distribution of the Curie temperature of the underlying material, grain size, as well as grain and head position.
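The 10 Tb/in2 figure implies a bit pitch of roughly 8 nm, consistent with the 5-nm grains plus spacing mentioned above. Under the simplifying assumption of square bit cells, the conversion is plain arithmetic:

```python
NM_PER_INCH = 2.54e7  # nanometres per inch

def areal_density_tb_per_in2(pitch_nm: float) -> float:
    """Areal density for square bit cells of the given pitch (assumption:
    one bit per cell, no servo/format overhead)."""
    bits_per_in2 = (NM_PER_INCH / pitch_nm) ** 2
    return bits_per_in2 / 1e12

print(round(areal_density_tb_per_in2(8.0), 1))  # → 10.1
```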
de Barros, Alba Lúcia; Fakih, Flávio Trevisani; Michel, Jeanne Liliane
2002-01-01
This article reports the pathway used to build a prototype of a computerized nursing clinical decision-making support system, using the NANDA, NIC, and NOC classifications, as an auxiliary tool for the insertion of nursing data into the computerized patient record of Hospital São Paulo/UNIFESP.
Laboratory and software applications for clinical trials: the global laboratory environment.
Briscoe, Chad
2011-11-01
The Applied Pharmaceutical Software Meeting is held annually. It is sponsored by The Boston Society, a not-for-profit organization that coordinates a series of meetings within the global pharmaceutical industry. The meeting generally focuses on laboratory applications, but in recent years has expanded to include some software applications for clinical trials. The 2011 meeting emphasized the global laboratory environment. Global clinical trials generate massive amounts of data in many locations that must be centralized and processed for efficient analysis. Thus, the meeting had a strong focus on establishing networks and systems for dealing with the computer infrastructure to support such environments. In addition to the globally installed laboratory information management system, electronic laboratory notebook and other traditional laboratory applications, cloud computing is quickly becoming the answer to provide efficient, inexpensive options for managing the large volumes of data and computing power, and thus it served as a central theme for the meeting.
Imprecise results: Utilizing partial computations in real-time systems
NASA Technical Reports Server (NTRS)
Lin, Kwei-Jay; Natarajan, Swaminathan; Liu, Jane W.-S.
1987-01-01
In real-time systems, a computation may not have time to complete its execution because of deadline requirements. In such cases, no result except the approximate results produced by the computation up to that point will be available. It is desirable to utilize these imprecise results if possible. Two approaches are proposed to enable computations to return imprecise results when executions cannot be completed normally. The milestone approach records results periodically and, if a deadline is reached, returns the last recorded result. The sieve approach demarcates sections of code which can be skipped if the time available is insufficient. By using these approaches, the system is able to produce imprecise results when deadlines are reached. The design of the Concord project, which supports imprecise computations using these techniques, is described. Also presented is a general model of imprecise computations, as well as one which takes into account the influence of the environment, showing where the latter approach fits into this model.
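The milestone approach can be sketched directly: an iterative computation records each intermediate result and, if the deadline arrives before convergence, returns the latest milestone instead of nothing. This is an illustrative Newton iteration, not the Concord implementation:

```python
import time

def sqrt_newton(x: float, deadline: float) -> float:
    """Refine sqrt(x) by Newton's method; each iterate is a recorded
    milestone, returned as-is if the deadline is reached first."""
    milestone = x / 2 or 1.0          # initial (very imprecise) milestone
    while time.monotonic() < deadline:
        nxt = 0.5 * (milestone + x / milestone)
        if abs(nxt - milestone) < 1e-12:
            return nxt                # converged: precise result
        milestone = nxt               # record the improved milestone
    return milestone                  # deadline hit: best imprecise result

result = sqrt_newton(2.0, time.monotonic() + 0.01)
```

With a generous deadline the precise root is returned; with a deadline in the past, the caller still gets the initial milestone rather than no result at all, which is the point of the technique.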
Time and Space Partition Platform for Safe and Secure Flight Software
NASA Astrophysics Data System (ADS)
Esquinas, Angel; Zamorano, Juan; de la Puente, Juan A.; Masmano, Miguel; Crespo, Alfons
2012-08-01
There are a number of research and development activities that are exploring Time and Space Partition (TSP) to implement safe and secure flight software. This approach makes it possible to execute different real-time applications with different levels of criticality on the same computer board. In order to do that, flight applications must be isolated from each other in the temporal and spatial domains. This paper presents the first results of a partitioning platform based on the Open Ravenscar Kernel (ORK+) and the XtratuM hypervisor. ORK+ is a small, reliable real-time kernel supporting the Ada Ravenscar Computational model that is central to the ASSERT development process. XtratuM supports multiple virtual machines, i.e. partitions, on a single computer and is being used in the Integrated Modular Avionics for Space study. ORK+ executes in an XtratuM partition, enabling Ada applications to share the computer board with other applications.
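The temporal isolation behind TSP can be illustrated with a toy cyclic schedule: each partition owns an exclusive window inside a repeating major frame, so a misbehaving application cannot steal another's time. Partition names and durations are invented here, and a real hypervisor's configuration differs:

```python
# Fixed cyclic schedule: (partition name, window duration in ms).
schedule = [("ork_partition", 40), ("payload_partition", 30), ("spare", 30)]

def window_at(t_ms: int) -> str:
    """Return which partition owns the CPU at time t_ms."""
    major = sum(dur for _, dur in schedule)   # major frame length (100 ms)
    t = t_ms % major
    for name, dur in schedule:
        if t < dur:
            return name
        t -= dur

print(window_at(0), window_at(45), window_at(95))
```

Because the table is fixed offline, each partition's worst-case share of the board is known by construction, which is what criticality certification relies on.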
Shuttle Program Information Management System (SPIMS) data base
NASA Technical Reports Server (NTRS)
1983-01-01
The Shuttle Program Information Management System (SPIMS) is a computerized data base operations system. The central computer is the CDC 170-730 located at Johnson Space Center (JSC), Houston, Texas. Several applications have been developed and are supported by SPIMS. A brief description is given.
Rinkus, Susan M.; Chitwood, Ainsley
2002-01-01
The incorporation of electronic medical records into busy physician clinics has been a major development in the healthcare industry over the past decade. Documentation of key nursing activities, especially when interacting with patients who have chronic diseases, is often lacking or missing from the paper medical record. A case study of a patient with diabetes mellitus was created. Well established methods for the assessment of usability in the areas of human-computer interaction and computer supported cooperative work were employed to compare the nursing documentation of two tasks in a commercially available electronic medical record (eRecord) and in a paper medical record. Overall, the eRecord was found to improve the timeliness and quality of nursing documentation. With certain tasks, the number of steps to accomplish the same task was higher, which may result in the perception by the end user that the tool is more complex and therefore difficult to use. Recommendations for the eRecord were made to expand the documentation of patient teaching and adherence assessment and to incorporate web technology for patient access to medical records and healthcare information. PMID:12463905
Telestroke 10 years later--'telestroke 2.0'.
Switzer, Jeffrey A; Levine, Steven R; Hess, David C
2009-01-01
The lack of physicians with specialty stroke training represents a significant challenge to the future of stroke care. This deficit limits both quality stroke care and clinical research initiatives. The use of telemedicine for stroke ('telestroke') has been an attempt to overcome this shortage and extend stroke expertise to locations which lack coverage. However, the initial telestroke systems required a point-to-point connection for transmission and only provided videoconferencing, which limited their generalizability and usefulness. 'Telestroke 2.0' is the authors' vision of an integrative web-based telestroke system combining high-quality audiovideo transmission, the ability of consults and teleradiology to be carried out from any desktop or laptop computer with web access, decision and technical support, creation of billable physician documentation, and electronic medical record connectivity. These features will facilitate the development of statewide and regional telestroke call networks with an opportunity for physician supply companies to fill in coverage gaps. In addition, telestroke 2.0 may improve acute stroke research by increasing trial efficiency via the addition of non-academic recruitment sites, enhancing trial validity by centralizing neurologic examinations via recorded encounters, and generalizing clinical trial results to community hospital settings. Greater diffusion and long-term sustainability of telestroke systems will be dependent upon improvements in patient and hospital reimbursement for acute stroke and telestroke care. Copyright 2009 S. Karger AG, Basel.
Cloud Computing Applications in Support of Earth Science Activities at Marshall Space Flight Center
NASA Technical Reports Server (NTRS)
Molthan, Andrew L.; Limaye, Ashutosh S.; Srikishen, Jayanthi
2011-01-01
Currently, the NASA Nebula Cloud Computing Platform is available to Agency personnel in a pre-release status as the system undergoes a formal operational readiness review. Over the past year, two projects within the Earth Science Office at NASA Marshall Space Flight Center have been investigating the performance and value of Nebula's "Infrastructure as a Service", or "IaaS", concept and applying cloud computing concepts to advance their respective mission goals. The Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique NASA satellite observations and weather forecasting capabilities for use within the operational forecasting community through partnerships with NOAA's National Weather Service (NWS). SPoRT has evaluated the performance of the Weather Research and Forecasting (WRF) model on virtual machines deployed within Nebula and used Nebula instances to simulate local forecasts in support of regional forecast studies of interest to select NWS forecast offices. In addition to weather forecasting applications, rapidly deployable Nebula virtual machines have supported the processing of high-resolution NASA satellite imagery to support disaster assessment following the historic severe weather and tornado outbreak of April 27, 2011. Other modeling and satellite analysis activities are underway in support of NASA's SERVIR program, which integrates satellite observations, ground-based data and forecast models to monitor environmental change and improve disaster response in Central America, the Caribbean, Africa, and the Himalayas. Leveraging SPoRT's experience, SERVIR is working to establish a real-time weather forecasting model for Central America. Other modeling efforts include hydrologic forecasts for Kenya, driven by NASA satellite observations and reanalysis data sets provided by the broader meteorological community.
Forecast modeling efforts are supplemented by short-term forecasts of convective initiation, determined by geostationary satellite observations processed on virtual machines powered by Nebula.
NASA Astrophysics Data System (ADS)
Podesto, B.; Lapointe, A.; Larose, G.; Robichaud, Y.; Vaillancourt, C.
1981-03-01
The design and construction of a Real-Time Digital Data Acquisition System (RTDDAS) to be used in substations for on-site recording and preprocessing of load response data are described. The gathered data can be partially processed on site to compute the apparent, active, and reactive powers, voltage and current rms values, and instantaneous values of phase voltages and currents. On-site processing capability is provided for rapid monitoring of the field data to ensure that the test setup is suitable. Production analysis of field data is accomplished off-line on a central computer from data recorded on a dual-density (800/1600) magnetic tape which is IBM-compatible. Parallel channels of data can be recorded at a variable rate from 480 to 9000 samples per second per channel. The RTDDAS is housed in a 9.1 m (30-ft) trailer which is shielded from electromagnetic interference and protected by isolators from switching surges. Information pertaining to the installation, software operation, and maintenance is presented.
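The on-site power quantities named above reduce to a few lines over the sampled waveforms: active power is the mean of the instantaneous product, apparent power is the product of the rms values, and reactive power follows from the power triangle. The amplitudes, frequency, and phase lag below are invented for illustration:

```python
import numpy as np

fs = 9000                 # samples per second per channel (within 480 to 9000)
t = np.arange(fs) / fs    # exactly one second: 60 full cycles at 60 Hz
v = 170 * np.sin(2 * np.pi * 60 * t)             # phase voltage samples
i = 10 * np.sin(2 * np.pi * 60 * t - np.pi / 6)  # current, lagging 30 degrees

vrms = np.sqrt(np.mean(v ** 2))
irms = np.sqrt(np.mean(i ** 2))
p = np.mean(v * i)                  # active power (W)
s = vrms * irms                     # apparent power (VA)
q = np.sqrt(s ** 2 - p ** 2)        # reactive power (var)
print(round(p), round(s), round(q))  # → 736 850 425
```

Sampling over an integer number of cycles makes the discrete means exact, which is why field recorders lock the record length to the line frequency when they can.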
Raico Gallardo, Yolanda Natali; da Silva-Olivio, Isabela Rodrigues Teixeira; Mukai, Eduardo; Morimoto, Susana; Sesma, Newton; Cordaro, Luca
2017-05-01
To systematically assess the current dental literature comparing the accuracy of computer-aided implant surgery when using different supporting tissues (tooth, mucosa, or bone). Two reviewers searched PubMed (1972 to January 2015) and the Cochrane Central Register of Controlled Trials (Central) (2002 to January 2015). For the assessment of accuracy, studies were included with the following outcome measures: (i) angle deviation, (ii) deviation at the entry point, and (iii) deviation at the apex. Eight clinical studies from the 1602 articles initially identified met the inclusion criteria for the qualitative analysis. Four studies (n = 599 implants) were evaluated using meta-analysis. The bone-supported guides showed a statistically significant greater deviation in angle (P < 0.001), entry point (P = 0.01), and the apex (P = 0.001) when compared to the tooth-supported guides. Conversely, when only retrospective studies were analyzed, no significant differences were revealed in the deviation of the entry point and apex. The mucosa-supported guides indicated a statistically significant greater reduction in angle deviation (P = 0.02), deviation at the entry point (P = 0.002), and deviation at the apex (P = 0.04) when compared to the bone-supported guides. Between the mucosa- and tooth-supported guides, there were no statistically significant differences for any of the outcome measures. It can be concluded that the tissue of the guide support influences the accuracy of computer-aided implant surgery. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
VenomKB, a new knowledge base for facilitating the validation of putative venom therapies
Romano, Joseph D.; Tatonetti, Nicholas P.
2015-01-01
Animal venoms have been used for therapeutic purposes since the dawn of recorded history. Only a small fraction, however, have been tested for pharmaceutical utility. Modern computational methods enable the systematic exploration of novel therapeutic uses for venom compounds. Unfortunately, there is currently no comprehensive resource describing the clinical effects of venoms to support this computational analysis. We present VenomKB, a new publicly accessible knowledge base and website that aims to act as a repository for emerging and putative venom therapies. Presently, it consists of three database tables: (1) Manually curated records of putative venom therapies supported by scientific literature, (2) automatically parsed MEDLINE articles describing compounds that may be venom derived, and their effects on the human body, and (3) automatically retrieved records from the new Semantic Medline resource that describe the effects of venom compounds on mammalian anatomy. Data from VenomKB may be selectively retrieved in a variety of popular data formats, are open-source, and will be continually updated as venom therapies become better understood. PMID:26601758
Morris, Tommy J; Pajak, John; Havlik, Frank; Kenyon, Jessica; Calcagni, Dean
2006-08-01
This paper discusses the innovation process of the Battlefield Medical Information System-Tactical (BMIST), a point-of-care mobile computing solution for reducing medical errors and improving the quality of care provided to our military personnel in the field. In such remote environments, medical providers have traditionally had limited access to medical information, a situation quite analogous to that in remote areas of underdeveloped or developing countries. BMIST provides an all-in-one suite of mobile applications that empowers providers via access to critical medical information and powerful clinical decision support tools to accurately create an electronic health record (EHR). This record is synchronized with Department of Defense (DOD) joint health surveillance and medical information systems from the earliest echelons of care through chronic care provided by the Veterans Administration. Specific goals met in the initial phase were: integration of the PDA and wireless interface; development of the local application and user interface; development of a communications infrastructure; and development of a data storage and retrieval system. The system has been used extensively in the field to create an EHR far forward that supports a longitudinal medical record across time and across all elements of the Military Healthcare System.
NASA Technical Reports Server (NTRS)
Byrne, F. (Inventor)
1981-01-01
A high speed common data buffer system is described for providing an interface and communications medium between a plurality of computers utilized in a distributed computer complex forming part of a checkout, command and control system for space vehicles and associated ground support equipment. The system includes the capability for temporarily storing data to be transferred between computers, for transferring a plurality of interrupts between computers, for monitoring and recording these transfers, and for correcting errors incurred in these transfers. Validity checks are made on each transfer and appropriate error notification is given to the computer associated with that transfer.
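The validity-check idea described, verifying each inter-computer transfer and notifying the associated computer on error, can be sketched with a CRC-32 trailer on each frame. zlib's CRC stands in here for whatever checking the actual buffer hardware used; the framing is an illustrative assumption:

```python
import zlib

def frame(payload: bytes) -> bytes:
    """Append a CRC-32 trailer so the receiver can validate the transfer."""
    return payload + zlib.crc32(payload).to_bytes(4, "big")

def receive(data: bytes) -> bytes:
    """Validate the frame; raise so the sending computer can be notified."""
    payload, crc = data[:-4], int.from_bytes(data[-4:], "big")
    if zlib.crc32(payload) != crc:
        raise ValueError("transfer corrupted: notify source computer")
    return payload

msg = frame(b"telemetry block")
assert receive(msg) == b"telemetry block"          # clean transfer passes
corrupted = bytes([msg[0] ^ 0xFF]) + msg[1:]       # flip bits in first byte
```

A corrupted frame fails the check and raises, which models the "appropriate error notification" the patent abstract describes.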
Examination of District Technology Coordinators in South Central Texas
ERIC Educational Resources Information Center
Egeolu, Charity Nnenna
2013-01-01
The profusion of computers and educational technologies in schools has precipitated the need for staff with technological skill sets necessary for the integration and support of educational technology infrastructures across multiple platforms at schools and district levels. The purpose of the quantitative survey study was to explore technology…
Autonomously Organized and Funded IT Groups
ERIC Educational Resources Information Center
Nichol, Bruce
2004-01-01
Central IT organizations under stress often cannot offer a high level of service to groups with above-average support needs. An example of such a group would be a well-funded, research-oriented computer science department. Several factors contribute to the increased demand on IT organizations. Given the availability of relatively…
ERIC Educational Resources Information Center
Demski, Jennifer
2009-01-01
This article describes how centralized presentation control systems enable IT support staff to monitor equipment and assist end users more efficiently. At Temple University, 70 percent of the classrooms are equipped with an AMX touch panel, linked via a Netlink controller to an in-classroom computer, projector, DVD/VCR player, and speakers. The…
BigDebug: Debugging Primitives for Interactive Big Data Processing in Spark.
Gulzar, Muhammad Ali; Interlandi, Matteo; Yoo, Seunghyun; Tetali, Sai Deep; Condie, Tyson; Millstein, Todd; Kim, Miryung
2016-05-01
Developers use cloud computing platforms to process a large quantity of data in parallel when developing big data analytics. Debugging the massive parallel computations that run in today's data-centers is time consuming and error-prone. To address this challenge, we design a set of interactive, real-time debugging primitives for big data processing in Apache Spark, the next generation data-intensive scalable cloud computing platform. This requires re-thinking the notion of step-through debugging in a traditional debugger such as gdb, because pausing the entire computation across distributed worker nodes causes significant delay and naively inspecting millions of records using a watchpoint is too time consuming for an end user. First, BIGDEBUG's simulated breakpoints and on-demand watchpoints allow users to selectively examine distributed, intermediate data on the cloud with little overhead. Second, a user can also pinpoint a crash-inducing record and selectively resume relevant sub-computations after a quick fix. Third, a user can determine the root causes of errors (or delays) at the level of individual records through a fine-grained data provenance capability. Our evaluation shows that BIGDEBUG scales to terabytes and its record-level tracing incurs less than 25% overhead on average. It determines crash culprits orders of magnitude more accurately and provides up to 100% time saving compared to the baseline replay debugger. The results show that BIGDEBUG supports debugging at interactive speeds with minimal performance impact.
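The record-level provenance idea can be illustrated with a toy, single-machine sketch. This is not the BigDebug API: `tag` and `traced_map` are invented names, and real Spark jobs would propagate tags through distributed RDD transformations rather than a Python list.

```python
# Toy sketch of record-level provenance: every input record gets a tag that
# is propagated through transformations, so a crash-inducing output can be
# traced back to its source record instead of failing the whole job.
def tag(records):
    """Attach a provenance id (here, just the input position) to each record."""
    return [(i, r) for i, r in enumerate(records)]

def traced_map(f, tagged):
    out = []
    for prov, record in tagged:
        try:
            out.append((prov, f(record)))
        except Exception:
            # a crash culprit: record its provenance instead of crashing
            out.append((prov, None))
    return out

data = tag(["3", "seven", "10"])
result = traced_map(int, data)
culprits = [prov for prov, value in result if value is None]
assert culprits == [1]   # record 1 ("seven") is the crash-inducing input
```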
Description of data base management systems activities
NASA Technical Reports Server (NTRS)
1983-01-01
One of the major responsibilities of the JPL Computing and Information Services Office is to develop and maintain a JPL plan for providing computing services to the JPL management and administrative community that will lead to improved productivity. The CISO plan to accomplish this objective has been titled 'Management and Administrative Support Systems' (MASS). The MASS plan is based on the continued use of JPL's IBM 3032 computer system for administrative computing and for the MASS functions. The current candidate administrative data base management systems required to support the MASS include ADABASE, Cullinane IDMS, and TOTAL. Administrative data base systems have previously been applied to specific local functions rather than deployed in a centralized manner with elements common to the many user groups. Limited-capacity data base systems have been installed in microprocessor-based office automation systems in a few Project and Management Offices using Ashton-Tate dBASE II. These experiences, plus some other localized in-house DBMS uses, have provided an excellent background for developing user and system requirements for a single DBMS to support the MASS program.
Desiderata for computable representations of electronic health records-driven phenotype algorithms
Mo, Huan; Thompson, William K; Rasmussen, Luke V; Pacheco, Jennifer A; Jiang, Guoqian; Kiefer, Richard; Zhu, Qian; Xu, Jie; Montague, Enid; Carrell, David S; Lingren, Todd; Mentch, Frank D; Ni, Yizhao; Wehbe, Firas H; Peissig, Peggy L; Tromp, Gerard; Larson, Eric B; Chute, Christopher G; Pathak, Jyotishman; Speltz, Peter; Kho, Abel N; Jarvik, Gail P; Bejan, Cosmin A; Williams, Marc S; Borthwick, Kenneth; Kitchner, Terrie E; Roden, Dan M; Harris, Paul A
2015-01-01
Background Electronic health records (EHRs) are increasingly used for clinical and translational research through the creation of phenotype algorithms. Currently, phenotype algorithms are most commonly represented as noncomputable descriptive documents and knowledge artifacts that detail the protocols for querying diagnoses, symptoms, procedures, medications, and/or text-driven medical concepts, and are primarily meant for human comprehension. We present desiderata for developing a computable phenotype representation model (PheRM). Methods A team of clinicians and informaticians reviewed common features for multisite phenotype algorithms published in PheKB.org and existing phenotype representation platforms. We also evaluated well-known diagnostic criteria and clinical decision-making guidelines to encompass a broader category of algorithms. Results We propose 10 desired characteristics for a flexible, computable PheRM: (1) structure clinical data into queryable forms; (2) recommend use of a common data model, but also support customization for the variability and availability of EHR data among sites; (3) support both human-readable and computable representations of phenotype algorithms; (4) implement set operations and relational algebra for modeling phenotype algorithms; (5) represent phenotype criteria with structured rules; (6) support defining temporal relations between events; (7) use standardized terminologies and ontologies, and facilitate reuse of value sets; (8) define representations for text searching and natural language processing; (9) provide interfaces for external software algorithms; and (10) maintain backward compatibility. Conclusion A computable PheRM is needed for true phenotype portability and reliability across different EHR products and healthcare systems. These desiderata are a guide to inform the establishment and evolution of EHR phenotype algorithm authoring platforms and languages. PMID:26342218
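Desideratum (4), modeling phenotype algorithms with set operations, can be made concrete with a tiny sketch. The diagnosis sets, medication sets, and patient identifiers below are invented for illustration, not drawn from any real phenotype on PheKB.org.

```python
# Hedged sketch of desideratum (4): a phenotype case definition expressed
# as set operations over patients matching structured criteria.
# Patient ids and criteria are invented for illustration.
diabetes_dx = {"p1", "p2", "p4"}     # patients with a type 2 diabetes diagnosis code
metformin_rx = {"p2", "p3", "p4"}    # patients with a metformin prescription
type1_dx = {"p4"}                    # exclusion criterion: type 1 diabetes

# Case definition: diagnosis AND medication, MINUS the exclusion set.
cases = (diabetes_dx & metformin_rx) - type1_dx
assert cases == {"p2"}
```

Expressing criteria this way keeps the algorithm both human-readable and directly computable, which is the portability property the desiderata argue for.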
Root resorption during orthodontic treatment.
Walker, Sally
2010-01-01
Medline, Embase, LILACS, The Cochrane Library (Cochrane Database of Systematic Reviews, CENTRAL, and Cochrane Oral Health Group Trials Register), Web of Science, EBM Reviews, Computer Retrieval of Information on Scientific Projects (CRISP, www.crisp.cit.nih.gov), On-Line Computer Library Center (www.oclc.org), Google, Index to Scientific and Technical Proceedings, PAHO (www.paho.org), WHOLis (www.who.int/library/databases/en), BBO (Brazilian Bibliography of Dentistry), CEPS (Chinese Electronic Periodical Services), Conference materials (www.bl.uk/services/bsds/dsc/conference.html), ProQuest Dissertation Abstracts and Thesis database, TrialsCentral (www.trialscentral.org), National Research Register (www.controlled-trials.com), www.Clinicaltrials.gov and SIGLE (System for Information on Grey Literature in Europe). Randomised controlled trials, including split-mouth designs, recording the presence or absence of external apical root resorption (EARR) by treatment group at the end of the treatment period. Data were extracted independently by two reviewers using specially designed and piloted forms. Quality was also assessed independently by the same reviewers. After evaluating titles and abstracts, 144 full articles were obtained, of which 13 articles, describing 11 trials, fulfilled the criteria for inclusion. Differences in the methodological approaches and reporting of results made quantitative statistical comparisons impossible. Evidence suggests that comprehensive orthodontic treatment causes increased incidence and severity of root resorption, and heavy forces might be particularly harmful. Orthodontically induced inflammatory root resorption is unaffected by archwire sequencing, bracket prescription, and self-ligation. Previous trauma and tooth morphology are unlikely causative factors. There is some evidence that a two- to three-month pause in treatment decreases total root resorption.
The results were inconclusive regarding the clinical management of root resorption, but there is evidence to support the use of light forces, especially with incisor intrusion.
Use of a secure Internet Web site for collaborative medical research.
Marshall, W W; Haley, R W
2000-10-11
Researchers who collaborate on clinical research studies from diffuse locations need a convenient, inexpensive, secure way to record and manage data. The Internet, with its World Wide Web, provides a vast network that enables researchers with diverse types of computers and operating systems anywhere in the world to log data through a common interface. Development of a Web site for scientific data collection can be organized into 10 steps, including planning the scientific database, choosing a database management software system, setting up database tables for each collaborator's variables, developing the Web site's screen layout, choosing a middleware software system to tie the database software to the Web site interface, embedding data editing and calculation routines, setting up the database on the central server computer, obtaining a unique Internet address and name for the Web site, applying security measures to the site, and training staff who enter data. Ensuring the security of an Internet database requires limiting the number of people who have access to the server, setting up the server on a stand-alone computer, requiring user-name and password authentication for server and Web site access, installing a firewall computer to prevent break-ins and block bogus information from reaching the server, verifying the identity of the server and client computers with certification from a certificate authority, encrypting information sent between server and client computers to avoid eavesdropping, establishing audit trails to record all accesses into the Web site, and educating Web site users about security techniques. When these measures are carefully undertaken, in our experience, information for scientific studies can be collected and maintained on Internet databases more efficiently and securely than through conventional systems of paper records protected by filing cabinets and locked doors. JAMA. 2000;284:1843-1849.
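The user-name and password authentication step described above can be sketched with standard salted key stretching. This is a minimal stand-alone illustration of that one layer, under invented function names; a production site would add the other measures the article lists (firewall, server certificates, encryption in transit, audit trails).

```python
# Minimal sketch of salted password storage and verification for a research
# Web site. Function names are illustrative; the article's system predates
# these specific primitives.
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Store a salted, stretched digest rather than the password itself."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify(password, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

salt, stored = hash_password("correct horse battery staple")
assert verify("correct horse battery staple", salt, stored)
assert not verify("wrong password", salt, stored)
```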
NASA Astrophysics Data System (ADS)
Larour, Eric; Utke, Jean; Bovin, Anton; Morlighem, Mathieu; Perez, Gilberto
2016-11-01
Within the framework of sea-level rise projections, there is a strong need for hindcast validation of the evolution of polar ice sheets in a way that tightly matches observational records (from radar, gravity, and altimetry observations mainly). However, the computational requirements for making hindcast reconstructions possible are severe and rely mainly on the evaluation of the adjoint state of transient ice-flow models. Here, we look at the computation of adjoints in the context of the NASA/JPL/UCI Ice Sheet System Model (ISSM), written in C++ and designed for parallel execution with MPI. We present the adaptations required in the way the software is designed and written, but also generic adaptations in the tools facilitating the adjoint computations. We concentrate on the use of operator overloading coupled with the AdjoinableMPI library to achieve the adjoint computation of the ISSM. We present a comprehensive approach to (1) carry out type changing through the ISSM, hence facilitating operator overloading, (2) bind to external solvers such as MUMPS and GSL-LU, and (3) handle MPI-based parallelism to scale the capability. We demonstrate the success of the approach by computing sensitivities of hindcast metrics such as the misfit to observed records of surface altimetry on the northeastern Greenland Ice Stream, or the misfit to observed records of surface velocities on Upernavik Glacier, central West Greenland. We also provide metrics for the scalability of the approach, and the expected performance. This approach has the potential to enable a new generation of hindcast-validated projections that make full use of the wealth of datasets currently being collected, or already collected, in Greenland and Antarctica.
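The operator-overloading idea at the heart of this adjoint approach can be illustrated with a minimal reverse-mode sketch. This is a toy Python stand-in, not ISSM's C++/MPI machinery: each overloaded arithmetic operation records its local partial derivatives, and a backward sweep accumulates sensitivities through the recorded graph.

```python
# Toy reverse-mode automatic differentiation via operator overloading.
# Each Var records its parents and the local partial derivative of the
# operation that produced it; backward() propagates adjoints to the inputs.
class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # list of (parent Var, local partial)
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, seed=1.0):
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

x, y = Var(3.0), Var(2.0)
f = x * y + x           # f = x*y + x, so df/dx = y + 1 = 3, df/dy = x = 3
f.backward()
assert x.grad == 3.0 and y.grad == 3.0
```

In ISSM the same effect is obtained by changing the numeric type throughout the C++ code so that overloaded operators build the adjoint tape, with AdjoinableMPI handling the reversal of MPI communication.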
NASA Astrophysics Data System (ADS)
Perez, G. L.; Larour, E. Y.; Morlighem, M.
2016-12-01
Within the framework of sea-level rise projections, there is a strong need for hindcast validation of the evolution of polar ice sheets in a way that tightly matches observational records (from radar and altimetry observations mainly). However, the computational requirements for making hindcast reconstructions possible are severe and rely mainly on the evaluation of the adjoint state of transient ice-flow models. Here, we look at the computation of adjoints in the context of the NASA/JPL/UCI Ice Sheet System Model, written in C++ and designed for parallel execution with MPI. We present the adaptations required in the way the software is designed and written, but also generic adaptations in the tools facilitating the adjoint computations. We concentrate on the use of operator overloading coupled with the AdjoinableMPI library to achieve the adjoint computation of ISSM. We present a comprehensive approach to 1) carry out type changing through ISSM, hence facilitating operator overloading, 2) bind to external solvers such as MUMPS and GSL-LU, and 3) handle MPI-based parallelism to scale the capability. We demonstrate the success of the approach by computing sensitivities of hindcast metrics such as the misfit to observed records of surface altimetry on the North-East Greenland Ice Stream, or the misfit to observed records of surface velocities on Upernavik Glacier, Central West Greenland. We also provide metrics for the scalability of the approach, and the expected performance. This approach has the potential to enable a new generation of hindcast-validated projections that make full use of the wealth of datasets currently being collected, or already collected, in Greenland and Antarctica, such as surface altimetry, surface velocities, and/or gravity measurements.
Joint Refugee Information Clearing Office (JRICO)
1976-03-02
A. JRICO Documentation 42; B. DA Documentation 62; C. DOD Documentation 75; D. IATF Documentation 83; E. Operating Procedures 103; F. Data...Indochina (IATF) and maintain access to the IATF computer data files on refugees, as well as coordinating with OSD, refugee resettlement centers...providing facilities and administrative support as required and negotiating with IATF for on-line computer access to the refugee records in the
Angotzi, Gian Nicola; Boi, Fabio; Zordan, Stefano; Bonfanti, Andrea; Vato, Alessandro
2014-01-01
A portable 16-channels microcontroller-based wireless system for a bi-directional interaction with the central nervous system is presented in this work. The device is designed to be used with freely behaving small laboratory animals and allows recording of spontaneous and evoked neural activity wirelessly transmitted and stored on a personal computer. Biphasic current stimuli with programmable duration, frequency and amplitude may be triggered in real-time on the basis of the recorded neural activity as well as by the animal behavior within a specifically designed experimental setup. An intuitive graphical user interface was developed to configure and to monitor the whole system. The system was successfully tested through bench tests and in vivo measurements on behaving rats chronically implanted with multi-channels microwire arrays. PMID:25096831
Murdoch, Jamie; Barnes, Rebecca; Pooler, Jillian; Lattimer, Valerie; Fletcher, Emily; Campbell, John L
2015-02-01
Telephone triage represents one strategy for managing demand for face-to-face GP appointments in primary care. Although computer decision-support software (CDSS) is increasingly used by nurses to triage patients, little is understood about how interaction is organized in this setting, specifically which interactional dilemmas this computer-mediated setting invokes and how these may be consequential for communication with patients. Using conversation analytic methods, we undertook a multi-modal analysis of 22 audio-recorded telephone triage nurse-caller interactions from one GP practice in England, including 10 video-recordings of nurses' use of CDSS during triage. We draw on Goffman's theoretical notion of participation frameworks to make sense of these interactions, presenting 'telling cases' of interactional dilemmas nurses faced in meeting patients' needs and accurately documenting each patient's condition within the CDSS. Our findings highlight troubles in the 'interactional workability' of telephone triage, exposing difficulties in aligning the proximal and the wider distal contexts that structure CDSS-mediated interactions. Patients present with diverse symptoms, understandings of triage consultations, and communication skills, which nurses need to negotiate turn-by-turn alongside CDSS requirements. Nurses therefore need sophisticated communication, technological and clinical skills to ensure that patients' presenting problems are accurately captured within the CDSS to determine safe triage outcomes. Dilemmas around how nurses manage and record information, and the issues of professional accountability that may ensue, raise questions about the impact of CDSS and its use in supporting nurses to deliver safe and effective patient care. Copyright © 2014 Elsevier Ltd. All rights reserved.
Bertollo, David N; Alexander, Mary Jane; Shinn, Marybeth; Aybar, Jalila B
2007-06-01
This column describes the nonproprietary software Talker, used to adapt screening instruments to audio computer-assisted self-interviewing (ACASI) systems for low-literacy populations and other populations. Talker supports ease of programming, multiple languages, on-site scoring, and the ability to update a central research database. Key features include highly readable text display, audio presentation of questions and audio prompting of answers, and optional touch screen input. The scripting language for adapting instruments is briefly described as well as two studies in which respondents provided positive feedback on its use.
Deep learning aided decision support for pulmonary nodules diagnosing: a review.
Yang, Yixin; Feng, Xiaoyi; Chi, Wenhao; Li, Zhengyang; Duan, Wenzhe; Liu, Haiping; Liang, Wenhua; Wang, Wei; Chen, Ping; He, Jianxing; Liu, Bo
2018-04-01
Deep learning techniques have recently emerged as promising decision-support approaches for automatically analyzing medical images for different clinical diagnostic purposes. Computer-assisted diagnosis of pulmonary nodules has received considerable theoretical, computational, and empirical research attention, and numerous methods have been developed over the past five decades for the detection and classification of pulmonary nodules on different image formats, including chest radiographs, computed tomography (CT), and positron emission tomography. The remarkable recent progress in deep learning for pulmonary nodules, achieved in both academia and industry, has demonstrated that deep learning techniques are promising alternative decision-support schemes for effectively tackling the central issues in pulmonary nodule diagnosis, including feature extraction, nodule detection, false-positive reduction, and benign-malignant classification for the huge volume of chest scan data. The main goal of this investigation is to provide a comprehensive, state-of-the-art review of deep learning aided decision support for pulmonary nodule diagnosis. As far as the authors know, this is the first review devoted exclusively to deep learning techniques for pulmonary nodule diagnosis.
An automated system for the study of ionospheric spatial structures
NASA Astrophysics Data System (ADS)
Belinskaya, I. V.; Boitman, O. N.; Vugmeister, B. O.; Vyborova, V. M.; Zakharov, V. N.; Laptev, V. A.; Mamchenko, M. S.; Potemkin, A. A.; Radionov, V. V.
The system is designed for the study of the vertical distribution of electron density and the parameters of medium-scale ionospheric irregularities over the sounding site, as well as the reconstruction of the spatial distribution of electron density within a range of up to 300 km from the sounding location. The system comprises an active central station and passive companion stations. The central station is equipped with the digital ionosonde "Basis", the measuring-and-computing complex IVK-2, and the receiver-recorder PRK-3M. The companion stations are equipped with PRK-3 receiver-recorders. The automated complex software system includes 14 subsystems; data transfer between them is effected using magnetic disk data sets. The system is operated in both ionogram mode and Doppler shift and angle-of-arrival mode. Using data obtained in these two modes, the spatial distribution of electron density in the region is reconstructed, and the reconstruction is checked for accuracy against data from the companion stations.
NASA Astrophysics Data System (ADS)
Chalmers, Alex
2004-09-01
To increase the security and throughput of ISO container traffic through international terminals, more technology must be applied to the problem. A transnational central archive of inspection records is discussed that can be accessed by national agencies as ISO containers approach their borders, with the intent of improving the throughput and security of the cargo inspection process. A review of currently available digital media archiving technologies is presented, along with their possible application to the tracking of international ISO container shipments. Specific image formats employed by current x-ray inspection systems are discussed, and sample x-ray data from systems in use today are shown that could be entered into such a system. Data from other inspection technologies are shown to be easily integrated, as is the creation of database records suitable for interfacing with other computer systems. Overall system performance requirements are discussed in terms of security, response time, and capacity. Suggestions are also made for pilot projects based on existing border inspection processes.
High Temporal Resolution Mapping of Seismic Noise Sources Using Heterogeneous Supercomputers
NASA Astrophysics Data System (ADS)
Paitz, P.; Gokhberg, A.; Ermert, L. A.; Fichtner, A.
2017-12-01
The time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems like earthquake fault zones, volcanoes, geothermal and hydrocarbon reservoirs. We present results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service providing seismic noise source maps for Central Europe with high temporal resolution. We use source imaging methods based on the cross-correlation of seismic noise records from all seismic stations available in the region of interest. The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept to provide the interested researchers worldwide with regular access to the noise source maps. The solution architecture includes the following sub-systems: (1) data acquisition responsible for collecting, on a periodic basis, raw seismic records from the European seismic networks, (2) high-performance noise source mapping application responsible for the generation of source maps using cross-correlation of seismic records, (3) back-end infrastructure for the coordination of various tasks and computations, (4) front-end Web interface providing the service to the end-users and (5) data repository. The noise source mapping itself rests on the measurement of logarithmic amplitude ratios in suitably pre-processed noise correlations, and the use of simplified sensitivity kernels. 
During the implementation we addressed various challenges, in particular, selection of data sources and transfer protocols, automation and monitoring of daily data downloads, ensuring the required data processing performance, design of a general service-oriented architecture for coordination of various sub-systems, and engineering an appropriate data storage solution. The present pilot version of the service implements noise source maps for Switzerland. Extension of the solution to Central Europe is planned for the next project phase.
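The cross-correlation step underlying the source mapping can be shown with a toy NumPy example. The noise series, delay, and sampling rate below are synthetic, and real processing would add the pre-processing and logarithmic amplitude-ratio measurements the abstract describes.

```python
# Toy sketch of noise cross-correlation: the correlation of ambient-noise
# records from two stations peaks at the inter-station propagation delay,
# which is the basic measurement source-imaging methods build on.
import numpy as np

fs = 100.0                          # sampling rate, Hz (synthetic)
rng = np.random.default_rng(0)
noise = rng.standard_normal(2048)   # a common noise wavefield
delay = 25                          # propagation delay, station A -> B, in samples

rec_a = noise
rec_b = np.roll(noise, delay)       # station B records the same field later

corr = np.correlate(rec_b, rec_a, mode="full")
lags = np.arange(-len(rec_a) + 1, len(rec_a))
best_lag = lags[np.argmax(corr)]
assert best_lag == delay            # recovered delay: 25 samples = 0.25 s at fs
```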
NASA Astrophysics Data System (ADS)
Moschetti, M. P.; Rennolet, S.; Thompson, E.; Yeck, W.; McNamara, D. E.; Herrmann, R. B.; Powers, P.; Hoover, S. M.
2016-12-01
Recent efforts to characterize the seismic hazard resulting from increased seismicity rates in Oklahoma and Kansas highlight the need for a regionalized ground motion characterization. To support these efforts, we measure and compile strong ground motion records and compare the average ground motion intensity measures (IMs) with existing ground motion prediction equations (GMPEs). IMs are computed for available broadband and strong-motion records from M≥3 earthquakes occurring January 2009-April 2016, using standard strong motion processing guidelines. We verified our methods by comparing results from specific earthquakes with other standard procedures such as the USGS ShakeMap system. The large number of records required an automated processing scheme, which was complicated by the extremely high rate of small-magnitude earthquakes during 2014-2016. Orientation-independent IMs include peak ground motions (acceleration and velocity) and pseudo-spectral accelerations (5 percent damping, 0.1-10 s period). Metadata for the records include relocated event hypocenters. The database includes more than 160,000 records from about 3200 earthquakes. Estimates of the mean and standard deviation of the IMs are computed by distance binning at intervals of 2 km. Mean IMs exhibit a clear break in geometrical attenuation at epicentral distances of about 50-70 km, which is consistent with previous studies in the CEUS. Comparisons of these ground motions with modern GMPEs provide some insight into the IMs of induced earthquakes in Oklahoma and Kansas relative to those in the western U.S. and the central and eastern U.S. The site response for these stations is uncertain because very little is known about shallow seismic velocity in the region, and we make no attempt to correct observed IMs to reference site conditions. At close distances, the observed IMs are lower than the predictions of the seed GMPEs of the NGA-East project (and about consistent with NGA-West-2 ground motions).
This ground motion database may be used to inform future seismic hazard forecast models and in the development of regionally appropriate GMPEs.
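Two of the steps above, computing an orientation-independent peak IM and binning mean IMs by epicentral distance at 2 km intervals, can be sketched as follows. The accelerograms are synthetic, and the geometric-mean peak is just one simple orientation-independent measure, not necessarily the one used in the study.

```python
# Hedged sketch: a simple orientation-independent peak ground acceleration
# (geometric mean of the two horizontal component peaks) and distance
# binning of IMs at 2 km intervals. All numbers are synthetic.
import numpy as np

def pga_geometric_mean(acc_ns, acc_ew):
    """Geometric mean of the two horizontal peak absolute accelerations."""
    return np.sqrt(np.max(np.abs(acc_ns)) * np.max(np.abs(acc_ew)))

def binned_means(distances_km, ims, bin_km=2.0):
    """Mean IM per epicentral-distance bin of width bin_km."""
    edges = np.arange(0, distances_km.max() + bin_km, bin_km)
    idx = np.digitize(distances_km, edges)
    return {edges[i - 1]: ims[idx == i].mean() for i in np.unique(idx)}

pga = pga_geometric_mean(np.array([0.1, -0.4, 0.2]), np.array([-0.1, 0.1, 0.0]))
assert abs(pga - np.sqrt(0.4 * 0.1)) < 1e-12

d = np.array([1.0, 1.5, 3.0])      # epicentral distances, km
v = np.array([10.0, 20.0, 30.0])   # some IM values
means = binned_means(d, v)
assert means[0.0] == 15.0 and means[2.0] == 30.0
```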
EON: a component-based approach to automation of protocol-directed therapy.
Musen, M A; Tu, S W; Das, A K; Shahar, Y
1996-01-01
Provision of automated support for planning protocol-directed therapy requires a computer program to take as input clinical data stored in an electronic patient-record system and to generate as output recommendations for therapeutic interventions and laboratory testing that are defined by applicable protocols. This paper presents a synthesis of research carried out at Stanford University to model the therapy-planning task and to demonstrate a component-based architecture for building protocol-based decision-support systems. We have constructed general-purpose software components that (1) interpret abstract protocol specifications to construct appropriate patient-specific treatment plans; (2) infer from time-stamped patient data higher-level, interval-based, abstract concepts; (3) perform time-oriented queries on a time-oriented patient database; and (4) allow acquisition and maintenance of protocol knowledge in a manner that facilitates efficient processing both by humans and by computers. We have implemented these components in a computer system known as EON. Each of the components has been developed, evaluated, and reported independently. We have evaluated the integration of the components as a composite architecture by implementing T-HELPER, a computer-based patient-record system that uses EON to offer advice regarding the management of patients who are following clinical trial protocols for AIDS or HIV infection. A test of the reuse of the software components in a different clinical domain demonstrated rapid development of a prototype application to support protocol-based care of patients who have breast cancer. PMID:8930854
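Component (2), inferring interval-based abstractions from time-stamped patient data, can be illustrated with a small sketch. The function name, threshold, and lab values are invented; EON's temporal-abstraction component is far more general.

```python
# Toy temporal abstraction: turn time-stamped values into higher-level,
# interval-based concepts by joining contiguous runs of qualifying values.
# Threshold and data are invented for illustration.
def abstract_intervals(samples, threshold=200):
    """samples: list of (day, value); returns ("LOW", start_day, end_day)
    intervals covering contiguous runs of values below threshold."""
    intervals = []
    start = prev = None
    for day, value in samples:
        if value < threshold:
            if start is None:
                start = day
            prev = day
        elif start is not None:
            intervals.append(("LOW", start, prev))
            start = None
    if start is not None:
        intervals.append(("LOW", start, prev))
    return intervals

# e.g. weekly CD4 counts abstracted into LOW intervals
cd4 = [(1, 350), (8, 180), (15, 150), (22, 250), (29, 190)]
assert abstract_intervals(cd4) == [("LOW", 8, 15), ("LOW", 29, 29)]
```

A protocol rule can then query the abstraction ("CD4 LOW for more than one week") instead of the raw time-stamped points, which is the style of time-oriented query the architecture supports.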
77 FR 4025 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-26
...; System of Records AGENCY: U.S. Central Command, DoD. ACTION: Notice to Amend a System of Records. SUMMARY: The U.S. Central Command is amending a system of records notice in its existing inventory of record... INFORMATION: The U.S. Central Command systems of records notices subject to the Privacy Act of 1974 (5 U.S.C...
Custovic, Adnan; Ainsworth, John; Arshad, Hasan; Bishop, Christopher; Buchan, Iain; Cullinan, Paul; Devereux, Graham; Henderson, John; Holloway, John; Roberts, Graham; Turner, Steve; Woodcock, Ashley; Simpson, Angela
2015-01-01
We created Asthma e-Lab, a secure web-based research environment to support consistent recording, description and sharing of data, computational/statistical methods and emerging findings across the five UK birth cohorts. The e-Lab serves as a data repository for our unified dataset and provides the computational resources and a scientific social network to support collaborative research. All activities are transparent, and emerging findings are shared via the e-Lab, linked to explanations of analytical methods, thus enabling knowledge transfer. eLab facilitates the iterative interdisciplinary dialogue between clinicians, statisticians, computer scientists, mathematicians, geneticists and basic scientists, capturing collective thought behind the interpretations of findings. PMID:25805205
UNC Collaboratory Project: Overview
1990-11-01
technical, and other expository documents. Crucial to our success has been the selection of driving problems whose solutions have been of significance not...systems, and with the growing necessity for "team science", we believe the time is right to select a new driving problem -- support for multiple...the WE computer system. The WE system includes sensors embedded within it that record each user's actions. These records include each menu selection
Sandow, M J; Fisher, T J; Howard, C Q; Papas, S
2014-05-01
This study was part of a larger project to develop a (kinetic) theory of carpal motion based on computationally derived isometric constraints. Three-dimensional models were created from computed tomography scans of the wrists of ten normal subjects and carpal spatial relationships at physiological motion extremes were assessed. Specific points on the surface of the various carpal bones and the radius that remained isometric through range of movement were identified. Analysis of the isometric constraints and intercarpal motion suggests that the carpus functions as a stable central column (lunate-capitate-hamate-trapezoid-trapezium) with a supporting lateral column (scaphoid), which behaves as a 'two gear four bar linkage'. The triquetrum functions as an ulnar translation restraint, as well as controlling lunate flexion. The 'trapezoid'-shaped trapezoid places the trapezium anterior to the transverse plane of the radius and ulna, and thus rotates the principal axis of the central column to correspond to that used in the 'dart thrower's motion'. This study presents a forward kinematic analysis of the carpus that provides the basis for the development of a unifying kinetic theory of wrist motion based on isometric constraints and rules-based motion.
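The isometric-constraint test at the core of this analysis can be sketched numerically: a pair of surface points is "isometric" if the distance between them stays nearly constant across all recorded poses. The coordinates and tolerance below are synthetic, not the study's CT-derived data.

```python
# Illustrative isometry test over synthetic point trajectories: a point
# pair qualifies if its inter-point distance varies little across poses.
import numpy as np

def is_isometric(pts_a, pts_b, tol=0.05):
    """pts_a, pts_b: (n_poses, 3) trajectories of two surface points.
    Returns True if the fractional spread of distances is below tol."""
    d = np.linalg.norm(pts_a - pts_b, axis=1)
    return (d.max() - d.min()) / d.mean() < tol

poses = 4
rigid_a = np.zeros((poses, 3))
rigid_b = np.tile([10.0, 0.0, 0.0], (poses, 1))   # constant 10 mm apart
moving_b = np.array([[10.0, 0, 0], [12.0, 0, 0], [9.0, 0, 0], [11.0, 0, 0]])

assert is_isometric(rigid_a, rigid_b)        # candidate isometric constraint
assert not is_isometric(rigid_a, moving_b)   # distance varies: rejected
```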
Indico central - events organisation, ergonomics and collaboration tools integration
NASA Astrophysics Data System (ADS)
Benito González López, José; Ferreira, José Pedro; Baron, Thomas
2010-04-01
While the remote collaboration services at CERN slowly aggregate around the Indico event management software, its new version, the result of a careful maturation process, includes improvements that will set a new reference in its domain. The presentation focuses on the new features of the tool and on the user feedback process, which resulted in a new level of usability. We also describe the interactions with the worldwide community of users and server administrators, the impact these have had on our development process, and the tools set in place to streamline the work between the different collaborating sites. A last part is dedicated to the use of Indico as a central hub for operating other local services around event organisation (registration and e-payment, audiovisual recording, webcast, room booking, and videoconference support).
A Bayesian and Physics-Based Ground Motion Parameters Map Generation System
NASA Astrophysics Data System (ADS)
Ramirez-Guzman, L.; Quiroz, A.; Sandoval, H.; Perez-Yanez, C.; Ruiz, A. L.; Delgado, R.; Macias, M. A.; Alcántara, L.
2014-12-01
We present the Ground Motion Parameters Map Generation (GMPMG) system developed by the Institute of Engineering at the National Autonomous University of Mexico (UNAM). The system delivers estimates of information associated with the social impact of earthquakes, engineering ground motion parameters (gmp), and macroseismic intensity maps. The gmp calculated are peak ground acceleration (pga), peak ground velocity (pgv), and response spectral acceleration (SA). The GMPMG relies on real-time data received from strong ground motion stations belonging to UNAM's networks throughout Mexico. Data are gathered via satellite and internet service providers and managed with the data acquisition software Earthworm. The system is self-contained and can perform all calculations required for estimating gmp and intensity maps due to earthquakes, automatically or manually. An initial processing step baseline-corrects the records and removes those containing glitches or a low signal-to-noise ratio. The system then assigns a hypocentral location using first arrivals and a simplified 3D model, followed by a moment tensor inversion performed with a pre-calculated Receiver Green's Tensor (RGT) database for a realistic 3D model of Mexico. A backup system to compute epicentral location and magnitude is in place. Bayesian kriging is employed to combine recorded values with grids of computed gmp. The latter are obtained by using appropriate ground motion prediction equations (for pgv, pga, and SA with T = 0.3, 0.5, 1, and 1.5 s) and numerical simulations performed in real time using the aforementioned RGT database (for SA with T = 2, 2.5, and 3 s). Estimated intensity maps are then computed using SA(T = 2 s) to Modified Mercalli Intensity correlations derived for central Mexico. The maps are made available to the institutions in charge of the disaster prevention systems.
In order to analyze the accuracy of the maps, we compare them against observations not considered in the computations, and present some examples of recent earthquakes. We conclude that the system provides information with a fair goodness-of-fit against observations. This project is partially supported by DGAPA-PAPIIT (UNAM) project TB100313-RR170313.
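The paper does not publish its kriging implementation; as an illustration of the core step, combining recorded station values with a modeled grid, here is a minimal 1-D residual-kriging sketch. The exponential covariance, its parameters, and all station and model values below are invented for the example:

```python
import math

def cov(d, sill=1.0, rng=30.0):
    """Exponential covariance model (sill and range are assumed values)."""
    return sill * math.exp(-d / rng)

def solve(A, b):
    """Gaussian elimination with partial pivoting (small dense systems)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def residual_krige(stations, residuals, grid):
    """Simple kriging of station residuals onto grid points (1-D positions)."""
    n = len(stations)
    A = [[cov(abs(stations[i] - stations[j])) for j in range(n)] for i in range(n)]
    return [sum(w * res for w, res in
                zip(solve(A, [cov(abs(g - s)) for s in stations]), residuals))
            for g in grid]

# Modeled pga corrected by observed residuals at two stations.
stations, observed = [10.0, 40.0], [0.25, 0.12]
model_at_stations = [0.20, 0.10]
residuals = [o - m for o, m in zip(observed, model_at_stations)]
grid, model_on_grid = [10.0, 25.0, 40.0], [0.20, 0.15, 0.10]
corrected = [m + k for m, k in zip(model_on_grid, residual_krige(stations, residuals, grid))]
print([round(v, 3) for v in corrected])  # station points reproduce the observations
```

Because simple kriging is an exact interpolator (with no nugget), the corrected map honors the recorded values at the station locations while reverting to the modeled grid away from them.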
Franz, Berkeley; Murphy, John W
2015-01-01
Electronic medical records are regarded as an important tool in primary health-care settings. Because these records are thought to standardize medical information, facilitate provider communication, and improve office efficiency, many practices are transitioning to these systems. However, much of the concern with improving the practice of record keeping has related to technological innovations and human-computer interaction. Drawing on the philosophical reflection raised in Jacques Ellul's work, this article questions the technological imperative that may be supporting medical record keeping. Furthermore, given the growing emphasis on community-based care, this article discusses important non-technological aspects of electronic medical records that might bring the use of these records in line with participatory primary-care medicine.
1989-07-11
applicable because this implementation does not support temporary files with names. ag . EE2401D is inapplicable because this implementation does not...buffer. No spanned records with ASCII.NUL are output. A line terminator followed by a page terminator may be represented as: ASCII.CR ASCII.FF ASCII.CR if
Patterns of Debate in Tertiary Level Asynchronous Text-Based Conferencing
ERIC Educational Resources Information Center
Coffin, Caroline; Painter, Clare; Hewings, Ann
2005-01-01
Argumentation can be defined at different levels and serve different purposes, but its role in knowledge understanding and construction has given it a central place in education, particularly at tertiary level. The advent of computer-supported text-based conferences has created new sites where such educational dialogues can take place, but the…
Evaluating open-source cloud computing solutions for geosciences
NASA Astrophysics Data System (ADS)
Huang, Qunying; Yang, Chaowei; Liu, Kai; Xia, Jizhe; Xu, Chen; Li, Jing; Gui, Zhipeng; Sun, Min; Li, Zhenglong
2013-09-01
Many organizations are starting to adopt cloud computing to better utilize computing resources, taking advantage of its scalability, cost reduction, and ease of access. Many private or community cloud computing platforms are being built using open-source cloud solutions. However, little has been done to systematically compare and evaluate the features and performance of open-source solutions in supporting geosciences. This paper provides a comprehensive study of three open-source cloud solutions: OpenNebula, Eucalyptus, and CloudStack. We compared a variety of features, capabilities, technologies, and performance aspects, including: (1) general features and supported services for cloud resource creation and management, (2) advanced capabilities for networking and security, and (3) the performance of the cloud solutions in provisioning and operating cloud resources, as well as the performance of virtual machines initiated and managed by the cloud solutions in supporting selected geoscience applications. Our study found that: (1) there are no significant performance differences in central processing unit (CPU), memory, or I/O between virtual machines created and managed by the different solutions; (2) OpenNebula has the fastest internal network, while both Eucalyptus and CloudStack have better virtual machine isolation and security strategies; (3) CloudStack has the fastest operations in handling virtual machines, images, snapshots, volumes, and networking, followed by OpenNebula; and (4) the selected cloud computing solutions are capable of supporting concurrent intensive web applications, computing-intensive applications, and small-scale model simulations without intensive data communication.
Eccher, C; Berloffa, F; Demichelis, F; Larcher, B; Galvagni, M; Sboner, A; Graiff, A; Forti, S
1999-01-01
Introduction This study describes a tele-consultation system (TCS) developed to provide a computing environment over a Wide Area Network (WAN) in northern Italy (Province of Trento) that can be used by two or more physicians to share medical data and to work co-operatively on medical records. A pilot study has been carried out in oncology to assess the effectiveness of the system. The aim of this project is to facilitate the management of oncology patients by improving communication among the specialists of central and district hospitals. Methods and Results The TCS is an Intranet-based solution. The Intranet is based on a PC WAN with Windows NT Server, Microsoft SQL Server, and Internet Information Server. TCS is composed of native and custom applications developed in the Microsoft Windows (9x and NT) environment. The basic component of the system is the multimedia digital medical record, structured as a collection of HTML and ASP pages. A distributed relational database allows users to store and retrieve medical records, accessed by a dedicated Web browser via the Web server. The medical data to be stored and the presentation architecture of the clinical record were determined in close collaboration with the clinicians involved in the project. TCS allows multi-point tele-consultation (TC) among two or more participants on remote computers, providing synchronized browsing through the clinical report. A set of collaborative and personal tools, a whiteboard with drawing tools, point-to-point digital audio-conferencing, chat, a local notepad, and an e-mail service, are integrated into the system to provide a user-friendly environment. TCS has been developed as a client-server architecture. The client part of the system is based on the Microsoft Web Browser control and provides the user interface and the tools described above.
The server part, running continuously on a dedicated computer, accepts connection requests and manages the connections among the participants in a TC, allowing multiple TCs to run simultaneously. TCS has been developed in the Visual C++ environment using the MFC library and COM technology; ActiveX controls have been written in Visual Basic to perform dedicated tasks from inside the HTML clinical report. Before deployment in the hospital departments involved in the project, TCS was tested in our laboratory by the clinicians involved, to evaluate its usability. Discussion TCS has the potential to support a "multi-disciplinary distributed virtual oncological meeting". The specialists of different departments and different hospitals can attend "virtual meetings" and interactively discuss medical data. An expected benefit of the "virtual meeting" is the possibility of providing expert remote advice from oncologists to peripheral cancer units in formulating treatment plans, conducting follow-up sessions, and supporting clinical research.
Large-Scale Data Collection Metadata Management at the National Computation Infrastructure
NASA Astrophysics Data System (ADS)
Wang, J.; Evans, B. J. K.; Bastrakova, I.; Ryder, G.; Martin, J.; Duursma, D.; Gohar, K.; Mackey, T.; Paget, M.; Siddeswara, G.
2014-12-01
Data collection management has become an essential activity at the National Computation Infrastructure (NCI) in Australia. NCI's partners (CSIRO, Bureau of Meteorology, Australian National University, and Geoscience Australia), supported by the Australian Government and Research Data Storage Infrastructure (RDSI), have established a national data resource that is co-located with high-performance computing. This paper addresses the metadata management of these data assets over their lifetime. NCI manages 36 data collections (10+ PB) categorised as earth system sciences, climate and weather model data assets and products, earth and marine observations and products, geosciences, terrestrial ecosystem, water management and hydrology, astronomy, social science, and biosciences. The data is largely sourced from NCI partners, the custodians of many of the national scientific records, and major research community organisations. The data is made available in an HPC and data-intensive environment: a ~56,000-core supercomputer, virtual labs on a 3,000-core cloud system, and data services. By assembling these large national assets, new opportunities have arisen to harmonise the data collections, making a powerful cross-disciplinary resource. To support the overall management, a Data Management Plan (DMP) has been developed to record the workflows, procedures, key contacts, and responsibilities. The DMP has fields that can be exported to the ISO 19115 schema and to the collection-level catalogue of GeoNetwork. The subset- or file-level metadata catalogues are linked with the collection level through a parent-child relationship defined using UUIDs. A number of tools have been developed that support interactive metadata management, bulk loading of data, and computational workflows or data pipelines. NCI creates persistent identifiers for each of the assets.
The data collection is tracked over its lifetime, and the recognition of the data providers, data owners, data generators, and data aggregators is kept up to date. A Digital Object Identifier (DOI) is assigned using the Australian National Data Service (ANDS): once the data has been quality assured, a DOI is minted and the metadata record updated. NCI's data citation policy establishes the relationship between research outcomes, data providers, and the data.
Flute ``breath support'' perception and its acoustical correlates
NASA Astrophysics Data System (ADS)
Cossette, Isabelle A.; Sabourin, Patrick
2004-05-01
Music educators and performers commonly refer to "breath support" in flute playing, yet the term "support" is neither well defined nor consistently used. Different breathing strategies used by professional flautists who were instructed to play with and without support were previously identified by the authors. In the current study, 14 musical excerpts played with and without support were recorded by five professional flautists. Eleven professional flautists listened to the recordings in random order and rated (1 to 6) how strongly they judged each of the following sound qualities to be present in each example: support, intonation, control, and musical expressiveness. Answers to the test showed that musical expressiveness was associated even more closely with the supported excerpts than the answers about support itself. The ratings for the four sound qualities were highly intercorrelated. Acoustical parameters (frequency and centroid variation within each note) were analyzed and compared with the results of the perception test in order to better understand how the acoustical and psychological variables were related. The acoustical analysis of the central part of the notes did not show an evident correlation with the answers of the perception test. [Work funded by the Social Sciences and Humanities Research Council of Canada.]
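The spectral centroid mentioned above, the amplitude-weighted mean frequency of a note, can be computed directly from a DFT. A minimal sketch on a synthetic two-partial tone, not the study's flute recordings; the partial frequencies are chosen to fall on exact DFT bins so spectral leakage does not blur the result:

```python
import math

def spectral_centroid(samples, sr):
    """Amplitude-weighted mean frequency over the first half of a naive DFT."""
    n = len(samples)
    num = den = 0.0
    for k in range(1, n // 2):
        re = sum(s * math.cos(2 * math.pi * k * t / n) for t, s in enumerate(samples))
        im = sum(-s * math.sin(2 * math.pi * k * t / n) for t, s in enumerate(samples))
        mag = math.hypot(re, im)
        num += (k * sr / n) * mag
        den += mag
    return num / den

sr = 8192                      # chosen so 440 and 880 Hz sit on exact DFT bins
tone = [math.sin(2 * math.pi * 440 * t / sr) + 0.3 * math.sin(2 * math.pi * 880 * t / sr)
        for t in range(1024)]
centroid = spectral_centroid(tone, sr)
print(round(centroid))         # amplitude-weighted mean of the two partials
```

Tracking this value over the central, steady part of each note gives the "centroid variation" the study analyzed; in practice an FFT would replace the O(n²) DFT used here for clarity.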
NASA Technical Reports Server (NTRS)
1998-01-01
SYMED, Inc., developed a unique electronic medical records and information management system. The S2000 Medical Interactive Care System (MICS) incorporates both a comprehensive, interactive medical care support capability and an extensive array of digital medical reference materials in text or high-resolution graphic form. The system was designed, in cooperation with NASA, to improve the effectiveness and efficiency of physician practices. The S2000 is a Microsoft Windows-based software product which combines electronic forms, medical documents, and records management, and features a comprehensive medical information system for medical diagnostic support and treatment. SYMED, Inc. offers access to its medical systems to all companies seeking competitive advantages.
BigDebug: Debugging Primitives for Interactive Big Data Processing in Spark
Gulzar, Muhammad Ali; Interlandi, Matteo; Yoo, Seunghyun; Tetali, Sai Deep; Condie, Tyson; Millstein, Todd; Kim, Miryung
2016-01-01
Developers use cloud computing platforms to process a large quantity of data in parallel when developing big data analytics. Debugging the massively parallel computations that run in today's datacenters is time-consuming and error-prone. To address this challenge, we design a set of interactive, real-time debugging primitives for big data processing in Apache Spark, the next-generation data-intensive scalable cloud computing platform. This requires rethinking the notion of step-through debugging in a traditional debugger such as gdb, because pausing the entire computation across distributed worker nodes causes significant delay, and naively inspecting millions of records using a watchpoint is too time-consuming for an end user. First, BIGDEBUG's simulated breakpoints and on-demand watchpoints allow users to selectively examine distributed intermediate data on the cloud with little overhead. Second, a user can also pinpoint a crash-inducing record and selectively resume relevant sub-computations after a quick fix. Third, a user can determine the root causes of errors (or delays) at the level of individual records through a fine-grained data provenance capability. Our evaluation shows that BIGDEBUG scales to terabytes and its record-level tracing incurs less than 25% overhead on average. It determines crash culprits orders of magnitude more accurately and provides up to 100% time saving compared to the baseline replay debugger. The results show that BIGDEBUG supports debugging at interactive speeds with minimal performance impact. PMID:27390389
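BigDebug's crash-culprit determination is built into Spark itself; the underlying idea, isolating a crash-inducing input record without replaying the whole job on every candidate, can be illustrated in plain Python with a binary search over the input. The records and the parsing step below are hypothetical:

```python
def find_culprit(records, process):
    """Binary search for a crash-inducing record: recurse into whichever
    half of the input makes `process` fail (assumes one bad record)."""
    if len(records) == 1:
        return records[0]
    mid = len(records) // 2
    for half in (records[:mid], records[mid:]):
        try:
            process(half)
        except Exception:
            return find_culprit(half, process)
    raise ValueError("no single failing record found")

def parse_ages(batch):
    # hypothetical parsing step; raises ValueError on the malformed row
    return [int(row.split(",")[1]) for row in batch]

data = ["alice,34", "bob,29", "carol,not-a-number", "dave,41"]
print(find_culprit(data, parse_ages))  # → carol,not-a-number
```

BigDebug avoids even this logarithmic replay cost by tracing record-level provenance during the original run, but the bisection sketch shows why naive whole-job replay is the expensive baseline.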
Desiderata for computable representations of electronic health records-driven phenotype algorithms.
Mo, Huan; Thompson, William K; Rasmussen, Luke V; Pacheco, Jennifer A; Jiang, Guoqian; Kiefer, Richard; Zhu, Qian; Xu, Jie; Montague, Enid; Carrell, David S; Lingren, Todd; Mentch, Frank D; Ni, Yizhao; Wehbe, Firas H; Peissig, Peggy L; Tromp, Gerard; Larson, Eric B; Chute, Christopher G; Pathak, Jyotishman; Denny, Joshua C; Speltz, Peter; Kho, Abel N; Jarvik, Gail P; Bejan, Cosmin A; Williams, Marc S; Borthwick, Kenneth; Kitchner, Terrie E; Roden, Dan M; Harris, Paul A
2015-11-01
Electronic health records (EHRs) are increasingly used for clinical and translational research through the creation of phenotype algorithms. Currently, phenotype algorithms are most commonly represented as noncomputable descriptive documents and knowledge artifacts that detail the protocols for querying diagnoses, symptoms, procedures, medications, and/or text-driven medical concepts, and are primarily meant for human comprehension. We present desiderata for developing a computable phenotype representation model (PheRM). A team of clinicians and informaticians reviewed common features for multisite phenotype algorithms published in PheKB.org and existing phenotype representation platforms. We also evaluated well-known diagnostic criteria and clinical decision-making guidelines to encompass a broader category of algorithms. We propose 10 desired characteristics for a flexible, computable PheRM: (1) structure clinical data into queryable forms; (2) recommend use of a common data model, but also support customization for the variability and availability of EHR data among sites; (3) support both human-readable and computable representations of phenotype algorithms; (4) implement set operations and relational algebra for modeling phenotype algorithms; (5) represent phenotype criteria with structured rules; (6) support defining temporal relations between events; (7) use standardized terminologies and ontologies, and facilitate reuse of value sets; (8) define representations for text searching and natural language processing; (9) provide interfaces for external software algorithms; and (10) maintain backward compatibility. A computable PheRM is needed for true phenotype portability and reliability across different EHR products and healthcare systems. These desiderata are a guide to inform the establishment and evolution of EHR phenotype algorithm authoring platforms and languages. © The Author 2015. 
Published by Oxford University Press on behalf of the American Medical Informatics Association.
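Desideratum (4), set operations and relational algebra over structured clinical data, together with (6), temporal relations between events, can be sketched in a few lines. The tables, codes, and the 90-day rule below are invented for illustration and are not taken from PheKB:

```python
from datetime import date

# Toy EHR tables: patient id -> list of (code, date) events (hypothetical codes).
diagnoses = {
    1: [("E11.9", date(2014, 3, 1))],    # type 2 diabetes
    2: [("E11.9", date(2014, 6, 10))],
    3: [("I10", date(2014, 1, 5))],      # hypertension only
}
medications = {
    1: [("metformin", date(2014, 3, 20))],
    3: [("metformin", date(2014, 2, 1))],
}

def has_code(table, pid, codes):
    return any(c in codes for c, _ in table.get(pid, []))

def med_within_days(pid, drug, dx_codes, days):
    """Temporal relation: drug ordered within `days` after a qualifying dx."""
    dxs = [d for c, d in diagnoses.get(pid, []) if c in dx_codes]
    meds = [d for c, d in medications.get(pid, []) if c == drug]
    return any(0 <= (m - d).days <= days for d in dxs for m in meds)

patients = set(diagnoses) | set(medications)
# Case definition = diagnosis code AND metformin within 90 days of it
# (an intersection of two patient sets).
cases = {p for p in patients if has_code(diagnoses, p, {"E11.9"})} \
      & {p for p in patients if med_within_days(p, "metformin", {"E11.9"}, 90)}
print(sorted(cases))  # → [1]
```

A real PheRM would express the same logic against a common data model with standardized terminologies, but the queryable structure, set algebra, and temporal predicate are the ingredients the desiderata call for.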
Development of a PC-based ground support system for a small satellite instrument
NASA Astrophysics Data System (ADS)
Deschambault, Robert L.; Gregory, Philip R.; Spenler, Stephen; Whalen, Brian A.
1993-11-01
The importance of effective ground support for the remote control and data retrieval of a satellite instrument cannot be overstated. Problems with ground support may include the need to base personnel at a ground tracking station for extended periods, and the delay between the instrument observation and the processing of the data by the science team. Flexible solutions to such problems in the case of small satellite systems are provided by using low-cost, powerful personal computers and off-the-shelf software for data acquisition and processing, and by using the Internet as a communication pathway to enable scientists to view and manipulate satellite data in real time at any ground location. The personal-computer-based ground support system is illustrated for the case of the cold plasma analyzer flown on the Freja satellite. Commercial software was used as building blocks for writing the ground support equipment software. Several levels of hardware support, including unit tests and development, functional tests, and integration, were provided by portable and desktop personal computers. Satellite stations in Saskatchewan and Sweden were linked to the science team via phone lines and the Internet, which provided remote control through a central point. These successful strategies will be used on future small satellite space programs.
Wang, Zhi-Long; Zhou, Zhi-Guo; Chen, Ying; Li, Xiao-Ting; Sun, Ying-Shi
The aim of this study was to diagnose lymph node metastasis of esophageal cancer with a support vector machine model based on computed tomography. A total of 131 esophageal cancer patients with preoperative chemotherapy and radical surgery were included. Various indicators on CT images before and after neoadjuvant chemotherapy (tumor thickness, tumor length, tumor CT value, total number of lymph nodes, and long-axis and short-axis sizes of the largest lymph node) were recorded. A support vector machine model based on these CT indicators was built to predict lymph node metastasis. The support vector machine model diagnosed lymph node metastasis better than the preoperative short-axis size of the largest lymph node on CT: the areas under the receiver operating characteristic curves were 0.887 and 0.705, respectively. The support vector machine model of CT images can help diagnose lymph node metastasis in esophageal cancer with preoperative chemotherapy.
Ursenbacher, Sylvain; Guillon, Michaël; Cubizolle, Hervé; Dupoué, Andréaz; Blouin-Demers, Gabriel; Lourdais, Olivier
2015-07-01
Understanding the impact of postglacial recolonization on genetic diversity is essential in explaining current patterns of genetic variation. The central-marginal hypothesis (CMH) predicts a reduction in genetic diversity from the core of the distribution to peripheral populations, as well as reduced connectivity between peripheral populations. While the CMH has received considerable empirical support, its broad applicability is still debated and alternative hypotheses predict different spatial patterns of genetic diversity. Using microsatellite markers, we analysed the genetic diversity of the adder (Vipera berus) in western Europe to reconstruct postglacial recolonization. Approximate Bayesian Computation (ABC) analyses suggested a postglacial recolonization from two routes: a western route from the Atlantic Coast up to Belgium and a central route from the Massif Central to the Alps. This cold-adapted species likely used two isolated glacial refugia in southern France, in permafrost-free areas during the last glacial maximum. Adder populations further from putative glacial refugia had lower genetic diversity and reduced connectivity; therefore, our results support the predictions of the CMH. Our study also illustrates the utility of highly variable nuclear markers, such as microsatellites, and ABC to test competing recolonization hypotheses. © 2015 John Wiley & Sons Ltd.
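Approximate Bayesian Computation, as used above, compares simulated summary statistics to observed ones; its simplest form, rejection ABC, is easy to sketch. The toy problem here (inferring the mean of a Gaussian with a uniform prior, using the sample mean as summary statistic) is an assumption for illustration and is unrelated to the microsatellite data:

```python
import random
import statistics

def abc_rejection(observed, simulate, prior, n_draws=5000, eps=0.1):
    """Rejection ABC: keep prior draws whose simulated summary statistic
    lands within eps of the observed one."""
    return [theta for theta in (prior() for _ in range(n_draws))
            if abs(simulate(theta) - observed) < eps]

random.seed(42)
data = [random.gauss(1.5, 1.0) for _ in range(200)]   # "observed" sample
obs = statistics.fmean(data)                          # summary statistic

prior = lambda: random.uniform(-5, 5)
simulate = lambda mu: statistics.fmean([random.gauss(mu, 1.0) for _ in range(200)])

post = abc_rejection(obs, simulate, prior)
print(round(statistics.fmean(post), 2), len(post))    # posterior mean near 1.5
```

Comparing recolonization hypotheses, as in the study, works the same way: each hypothesis defines a different `simulate`, and the one whose accepted draws are most numerous (or closest) is favored.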
Variability and trends in runoff efficiency in the conterminous United States
McCabe, Gregory J.; Wolock, David M.
2016-01-01
Variability and trends in water-year runoff efficiency (RE) — computed as the ratio of water-year runoff (streamflow per unit area) to water-year precipitation — in the conterminous United States (CONUS) are examined for the 1951 through 2012 period. Changes in RE are analyzed using runoff and precipitation data aggregated to United States Geological Survey 8-digit hydrologic cataloging units (HUs). Results indicate increases in RE for some regions in the north-central CONUS and large decreases in RE for the south-central CONUS. The increases in RE in the north-central CONUS are explained by trends in climate, whereas the large decreases in RE in the south-central CONUS likely are related to groundwater withdrawals from the Ogallala aquifer to support irrigated agriculture.
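Runoff efficiency as defined above is a simple ratio, and a trend in it can be checked with an ordinary least-squares slope. The precipitation and runoff series below are invented to mimic the south-central pattern of declining RE under roughly steady precipitation:

```python
def runoff_efficiency(runoff, precip):
    """Water-year runoff efficiency: runoff depth per unit precipitation."""
    return [r / p for r, p in zip(runoff, precip)]

def trend_slope(series):
    """Ordinary least-squares slope per time step; its sign gives the trend."""
    n = len(series)
    t_mean = (n - 1) / 2
    s_mean = sum(series) / n
    num = sum((t - t_mean) * (s - s_mean) for t, s in enumerate(series))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

# Hypothetical hydrologic unit: steady precipitation, declining runoff.
precip = [800, 820, 790, 810, 805, 795]   # mm per water year
runoff = [240, 230, 205, 195, 180, 160]   # mm per water year
eff = runoff_efficiency(runoff, precip)
print(round(trend_slope(eff), 4))         # negative: RE is declining
```

A declining RE with no matching precipitation trend is the signature the authors attribute to groundwater withdrawals rather than climate.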
Eurogrid: a new glideinWMS based portal for CDF data analysis
NASA Astrophysics Data System (ADS)
Amerio, S.; Benjamin, D.; Dost, J.; Compostella, G.; Lucchesi, D.; Sfiligoi, I.
2012-12-01
The CDF experiment at Fermilab ended its Run-II phase in September 2011 after 11 years of operations and 10 fb-1 of collected data. The CDF computing model is based on a Central Analysis Farm (CAF) consisting of local computing and storage resources, supported by OSG and LCG resources accessed through dedicated portals. At the beginning of 2011 a new portal, Eurogrid, was developed to effectively exploit computing and disk resources in Europe: a dedicated farm and storage area at the TIER-1 CNAF computing center in Italy, and additional LCG computing resources at different TIER-2 sites in Italy, Spain, Germany, and France, are accessed through a common interface. The goal of this project is to develop a portal that is easy to integrate into the existing CDF computing model, completely transparent to the user, and requiring a minimum amount of maintenance support by the CDF collaboration. In this paper we review the implementation of this new portal and its performance in the first months of usage. Eurogrid is based on the glideinWMS software, a glidein-based Workload Management System (WMS) that works on top of Condor. As the CDF CAF is based on Condor, the choice of the glideinWMS software was natural and the implementation seamless. Thanks to the pilot jobs, user-specific requirements and site resources are matched in a very efficient way, completely transparent to the users. Official since June 2011, Eurogrid effectively complements and supports CDF computing resources, offering an optimal solution for the future in terms of the manpower required for administration, support, and development.
Exact computation of the maximum-entropy potential of spiking neural-network models.
Cofré, R; Cessac, B
2014-05-01
Understanding how stimuli and synaptic connectivity influence the statistics of spike patterns in neural networks is a central question in computational neuroscience. The maximum-entropy approach has been successfully used to characterize the statistical response of simultaneously recorded spiking neurons responding to stimuli. However, in spite of good performance in terms of prediction, the fitting parameters do not explain the underlying mechanistic causes of the observed correlations. On the other hand, mathematical models of spiking neurons (neuromimetic models) provide a probabilistic mapping between the stimulus, network architecture, and spike patterns in terms of conditional probabilities. In this paper we build an exact analytical mapping between neuromimetic and maximum-entropy models.
Digital Recording and Documentation of Endoscopic Procedures: Do Patients and Doctors Think Alike?
Willner, Nadav; Peled-Raz, Maya; Shteinberg, Dan; Shteinberg, Michal; Keren, Dean; Rainis, Tova
2016-01-01
Aims and Methods. We conducted a survey study of a large number of patients and gastroenterologists to identify relevant predictors of interest in digital recording and documentation (DRD) of endoscopic procedures. Outpatients presenting to the endoscopy unit at our institution for an endoscopy examination were anonymously surveyed regarding their views and opinions on a possible recording of the procedure. A parallel survey of gastroenterologists was conducted. Results. 417 patients and 62 gastroenterologists participated in the two parallel surveys regarding DRD of endoscopic procedures. 66.4% of the patients expressed interest in digital documentation of their endoscopic procedure, with 90.5% of them requesting a copy. 43.6% of the physicians supported digital recording while 27.4% opposed it, with 48.4% opposing making a copy of the recording available to the patient. No sociodemographic or background factors predicted patients' interest in DRD. 66% of the physicians reported having recording facilities in their institutions, but only 43.6% of them stated that they perform recording. Having institutional guidelines for DRD was found to be the only significant predictor of routine recording. Conclusions. Our study reveals patients' positive views of digital recording and documentation of endoscopic procedures. In contrast, physicians appear to be much more reluctant towards DRD and are centrally motivated by legal concerns when opposing DRD, as well as when supporting it.
Feaster, Toby D.; Shelton, John M.; Robbins, Jeanne C.
2015-10-20
Heavy rainfall occurred across South Carolina during October 1–5, 2015, as a result of an upper atmospheric low-pressure system that funneled tropical moisture from Hurricane Joaquin into the State. The storm caused major flooding from the central to the coastal areas of South Carolina. Almost 27 inches of rain fell near Mount Pleasant in Charleston County during this period. U.S. Geological Survey streamgages recorded peaks of record at 17 locations, and 15 other locations had peaks that ranked in the top 5 for the period of record. During the October 2015 flood event, U.S. Geological Survey personnel made about 140 streamflow measurements at 86 locations to verify, update, or extend existing rating curves, which are used to compute streamflow from monitored river stage.
Clinical Decision Support for a Multicenter Trial of Pediatric Head Trauma
Swietlik, Marguerite; Deakyne, Sara; Hoffman, Jeffrey M.; Grundmeier, Robert W.; Paterno, Marilyn D.; Rocha, Beatriz H.; Schaeffer, Molly H; Pabbathi, Deepika; Alessandrini, Evaline; Ballard, Dustin; Goldberg, Howard S.; Kuppermann, Nathan; Dayan, Peter S.
2016-01-01
Introduction For children who present to emergency departments (EDs) due to blunt head trauma, ED clinicians must decide who requires computed tomography (CT) scanning to evaluate for traumatic brain injury (TBI). The Pediatric Emergency Care Applied Research Network (PECARN) derived and validated two age-based prediction rules to identify children at very low risk of clinically important traumatic brain injuries (ciTBIs) who do not typically require CT scans. In this case report, we describe the strategy used to implement the PECARN TBI prediction rules via electronic health record (EHR) clinical decision support (CDS) as the intervention in a multicenter clinical trial. Methods Thirteen EDs participated in this trial. The 10 sites receiving the CDS intervention used the Epic® EHR. All sites implementing EHR-based CDS built the rules using the vendor's CDS engine. Based on a sociotechnical analysis, we designed the CDS so that recommendations could be displayed immediately after any provider entered prediction rule data. One central site developed and tested the intervention package to be exported to the other sites. The intervention package included a clinical trial alert, an electronic data collection form, the CDS rules, and the format for recommendations. Results The original PECARN head trauma prediction rules were derived from physician documentation, whereas this pragmatic trial led each site to customize its workflow and allowed multiple different providers to complete the head trauma assessments. These differences in workflows led to varying completion rates across sites, as well as differences in the types of providers completing the electronic data form. Site variation in internal change management processes made it challenging to maintain the same rigor across all sites, which led to downstream effects when data reports were developed.
Conclusions The process of a centralized build and export of a CDS system in one commercial EHR system successfully supported a multicenter clinical trial. PMID:27437059
Altered cardiorespiratory coupling in young male adults with excessive online gaming.
Chang, Jae Seung; Kim, Eun Young; Jung, Dooyoung; Jeong, Seong Hoon; Kim, Yeni; Roh, Myoung-Sun; Ahn, Yong Min; Hahm, Bong-Jin
2015-09-01
This study aimed to investigate changes in heart rate variability and cardiorespiratory coupling in male college students with problematic Internet use (PIU) excessive gaming type during action video game play to assess the relationship between PIU tendency and central autonomic regulation. Electrocardiograms and respiration were simultaneously recorded from 22 male participants with excessive online gaming and 22 controls during action video game play. Sample entropy (SampEn) was computed to assess autonomic regularity, and cross-SampEn was calculated to quantify autonomic coordination. During video game play, reduced cardiorespiratory coupling (CRC) was observed in individuals with PIU excessive gaming type compared with controls, implicating central autonomic dysregulation. The PIU tendency was associated with the severity of autonomic dysregulation. These findings indicate impaired CRC in PIU excessive gaming type, which may reflect alterations of central inhibitory control over autonomic responses to pleasurable online stimuli. Copyright © 2015 Elsevier B.V. All rights reserved.
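Sample entropy, the regularity measure used in the study above, counts near-matching templates of length m and m+1 in a series. A minimal illustrative sketch follows; the function name and the common defaults m = 2 and r = 0.2·SD are assumptions, not the authors' exact implementation (cross-SampEn extends the same idea to templates drawn from two different signals):

```python
import math

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r) of a 1-D series: -ln(A/B), where
    B counts pairs of length-m templates within Chebyshev distance r
    (self-matches excluded) and A counts the same for length m+1.
    Lower values indicate a more regular signal."""
    n = len(x)
    if r is None:
        # Common convention: tolerance r = 0.2 * population SD.
        mean = sum(x) / n
        r = 0.2 * math.sqrt(sum((v - mean) ** 2 for v in x) / n)

    def count_matches(mm):
        templates = [x[i:i + mm] for i in range(n - mm)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) < r:
                    count += 1
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -math.log(a / b) if a and b else float("inf")
```

A strictly periodic series yields SampEn near zero, while white noise yields a substantially larger value, which is the contrast exploited when comparing autonomic regularity between groups.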
Gobbato, Luca; Paniz, Gianluca; Mazzocco, Fabio; Chierico, Andrea; Tsukiyama, Teppei; Levi, Paul A; Weisgold, Arnold S
2013-05-01
When utilizing a single implant-supported crown to replace a central incisor, understanding the final shape of the implant restoration is an important factor in achieving a successful esthetic outcome. In today's dentistry, tooth shape is a critical factor when dental implant prostheses are considered in the esthetic zone. The major esthetic goal for this type of restoration is to achieve the closest possible symmetry with the adjacent tooth, at both the soft and the hard tissue levels. The goal of this study was to objectively analyze the significance of natural crown shape when replacing a central incisor with a single implant-supported crown. In this study, we investigated the shape of the crowns of maxillary central incisors in 60 individuals who presented to our clinics with an untreatable central incisor. The presence or absence of a dental diastema, a "black triangle," gingival symmetry, and dental symmetry was recorded in the pre- and postoperative photographs. Of the 60 patients, 33.3% had triangular-shaped crowns, 16.6% square/tapered, and 50% square-shaped crowns. After treatment was rendered, 65% of the triangular group, 40% of the square/tapered group, and 13.3% of the square group required an additional restoration on the adjacent central incisor in order to fulfill the esthetic needs of the patients. Data analysis revealed that if there is a "black triangle," a diastema, or dental or gingival asymmetry, an additional restoration on the adjacent central incisor is often required in order to fulfill esthetic goals. The additional restoration is highly recommended in situations with a triangular crown shape, while it is suggested in cases of square/tapered and square tooth shapes in the presence of a dental diastema.
NASA Astrophysics Data System (ADS)
Camp, Henry N.
1996-02-01
Challenges in implementing a computer-based patient record (CPR)--such as absolute data integrity, high availability, permanent on-line storage of very large complex records, rapid search times, ease of use, commercial viability, and portability to other hospitals and doctors' offices--are given along with their significance, the solutions, and their successes. The THERESA CPR has been used since 1983 in direct patient care by a public hospital that is the primary care provider to 350,000 people. It has 1000 beds with 45,000 admissions and 750,000 outpatient visits annually. The system supports direct provider entry, including by physicians, of complete medical 'documents'. Its demonstration site currently contains 1.1 billion data items on 1 million patients. It is also a clinical decision-aiding tool used for quality assurance and cost containment; for teaching, as faculty and students can easily find and 'thumb through' all cases similar to a particular study; and for research, with over a billion medical items that can be searched and analyzed on-line within context and with continuity. The same software can also run on a desktop microcomputer managing a private practice physician's office.
Arens-Volland, Andreas G; Spassova, Lübomira; Bohn, Torsten
2015-12-01
The aim of this review was to analyze computer-based tools for dietary management (including web-based and mobile devices) from both scientific and applied perspectives, presenting advantages and disadvantages as well as the state of validation. For this cross-sectional analysis, scientific results from 41 articles retrieved via a MEDLINE search as well as 29 applications from online markets were identified and analyzed. Results show that many approaches computerize well-established existing nutritional concepts for dietary assessment, e.g., food frequency questionnaires (FFQ) or dietary recalls (DR). Both food records and barcode scanning are less prominent in research but are frequently offered by commercial applications. Integration with a personal health record (PHR) or a health care workflow is suggested in the literature but is rarely found in mobile applications. It is expected that food records will be increasingly used for dietary assessment in research settings as simpler interfaces, e.g., barcode scanning techniques, and comprehensive food databases are applied, which can also support user adherence to dietary interventions and follow-up phases of nutritional studies. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Bernstam, Elmer V.; Hersh, William R.; Johnson, Stephen B.; Chute, Christopher G.; Nguyen, Hien; Sim, Ida; Nahm, Meredith; Weiner, Mark; Miller, Perry; DiLaura, Robert P.; Overcash, Marc; Lehmann, Harold P.; Eichmann, David; Athey, Brian D.; Scheuermann, Richard H.; Anderson, Nick; Starren, Justin B.; Harris, Paul A.; Smith, Jack W.; Barbour, Ed; Silverstein, Jonathan C.; Krusch, David A.; Nagarajan, Rakesh; Becich, Michael J.
2010-01-01
Clinical and translational research increasingly requires computation. Projects may involve multiple computationally-oriented groups including information technology (IT) professionals, computer scientists and biomedical informaticians. However, many biomedical researchers are not aware of the distinctions among these complementary groups, leading to confusion, delays and sub-optimal results. Although written from the perspective of clinical and translational science award (CTSA) programs within academic medical centers, the paper addresses issues that extend beyond clinical and translational research. The authors describe the complementary but distinct roles of operational IT, research IT, computer science and biomedical informatics using a clinical data warehouse as a running example. In general, IT professionals focus on technology. The authors distinguish between two types of IT groups within academic medical centers: central or administrative IT (supporting the administrative computing needs of large organizations) and research IT (supporting the computing needs of researchers). Computer scientists focus on general issues of computation such as designing faster computers or more efficient algorithms, rather than specific applications. In contrast, informaticians are concerned with data, information and knowledge. Biomedical informaticians draw on a variety of tools, including but not limited to computers, to solve information problems in health care and biomedicine. The paper concludes with recommendations regarding administrative structures that can help to maximize the benefit of computation to biomedical research within academic health centers. PMID:19550198
NASA Astrophysics Data System (ADS)
Peter, Daniel; Videau, Brice; Pouget, Kevin; Komatitsch, Dimitri
2015-04-01
Improving the resolution of tomographic images is crucial to answer important questions on the nature of Earth's subsurface structure and internal processes. Seismic tomography is the most prominent approach, in which seismic signals from ground-motion records are used to infer physical properties of internal structures such as compressional- and shear-wave speeds, anisotropy, and attenuation. Recent advances in regional- and global-scale seismic inversions move towards full-waveform inversions, which require accurate simulations of seismic wave propagation in complex 3D media, providing access to the full 3D seismic wavefields. However, these numerical simulations are computationally very expensive and need high-performance computing (HPC) facilities for further improving the current state of knowledge. During recent years, many-core architectures such as graphics processing units (GPUs) have been added to available large HPC systems. Such GPU-accelerated computing, together with advances in multi-core central processing units (CPUs), can greatly accelerate scientific applications. There are two main choices of language support for GPU cards: the CUDA programming environment and the OpenCL language standard. CUDA software development targets NVIDIA graphics cards, while OpenCL has been adopted mainly for AMD graphics cards. In order to employ such hardware accelerators for seismic wave propagation simulations, we incorporated the code generation tool BOAST into the existing spectral-element code package SPECFEM3D_GLOBE. This allows us to use meta-programming of computational kernels and generate optimized source code for both CUDA and OpenCL, running simulations on either CUDA or OpenCL hardware accelerators. We show here applications of forward and adjoint seismic wave propagation on CUDA/OpenCL GPUs, validating results and comparing performances for different simulations and hardware usages.
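The core idea of generating both CUDA and OpenCL kernels from one abstract description can be illustrated with a toy template. This sketch is only in the spirit of BOAST; the template, dialect table, and the `axpy` kernel are hypothetical and do not reflect BOAST's actual API:

```python
# One abstract kernel description rendered to two GPU dialects.
# The dialect-specific pieces are the kernel qualifier, the address-space
# qualifier for pointer arguments, and the global thread-index expression.

KERNEL_TEMPLATE = """{qualifier} void axpy({global_q} float *x,
                     {global_q} const float *y,
                     const float a, const int n) {{
    int i = {index_expr};
    if (i < n) {{
        x[i] += a * y[i];  /* y is read-only; x accumulates */
    }}
}}"""

DIALECTS = {
    "cuda": {
        "qualifier": "__global__",
        "global_q": "",
        "index_expr": "blockIdx.x * blockDim.x + threadIdx.x",
    },
    "opencl": {
        "qualifier": "__kernel",
        "global_q": "__global",
        "index_expr": "get_global_id(0)",
    },
}

def generate(dialect):
    """Render the abstract kernel for the requested dialect."""
    return KERNEL_TEMPLATE.format(**DIALECTS[dialect])
```

Keeping the numerical kernel body identical across dialects is what makes it practical to validate CUDA and OpenCL runs against each other, as described above.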
2009-05-01
Information Literacy – Oral Communication – Written Communication – Critical Thinking – Decision Making – Stamina – Courage – Discipline... Supporting "information literacy" - provide the right type of help near the "point of... plays a central role in supporting "information literacy." Columbia University • Computer Lab • Center for New Media in Teaching and Learning
Closely Spaced Independent Parallel Runway Simulation.
1984-10-01
facility consists of the Central Computer Facility, the Controller Laboratory, and the Simulator Pilot Complex. CENTRAL COMPUTER FACILITY. The Central Computer Facility consists of a group of mainframes, minicomputers, and associated peripherals which host the operational and data acquisition... in the Controller Laboratory and convert their verbal directives into a keyboard entry which is transmitted to the Central Computer Complex, where
Pérez-Santonja, T; Gómez-Paredes, L; Álvarez-Montero, S; Cabello-Ballesteros, L; Mombiela-Muruzabal, M T
2017-04-01
The introduction of electronic medical records and computer media into clinics has influenced the physician-patient relationship. These changes have many advantages, but there is concern that the computer has become too important, going from a working tool to the centre of our attention during the clinical interview and decreasing the doctor's interaction with the patient. The objective of the study was to estimate the percentage of time that family physicians spend on computer media compared with interpersonal communication with the patient, and whether this time varies with factors such as the doctor's age or the reason for the consultation. An observational, descriptive study was conducted over 10 weeks at 2 healthcare centres. The researchers attended all doctor-patient interviews, recording the time each patient entered and left the consultation. Each time the doctor fixed their gaze on the computer, the time was clocked. A total of 436 consultations were collected. The doctors looked at the computer for a median of 38.33% of the total duration of an interview. Doctors aged 45 years and older spent more time with their eyes fixed on the computer (P<.05). Family physicians spent almost 40% of the consultation time looking at computer media, and this proportion depended on the age of the physician, the number of queries, and the number of medical appointments. Copyright © 2016 Sociedad Española de Médicos de Atención Primaria (SEMERGEN). Publicado por Elsevier España, S.L.U. All rights reserved.
Social disadvantage and borderline personality disorder: A study of social networks.
Beeney, Joseph E; Hallquist, Michael N; Clifton, Allan D; Lazarus, Sophie A; Pilkonis, Paul A
2018-01-01
Examining differences in social integration, social support, and relationship characteristics in social networks may be critical for understanding the character and costs of the social difficulties experienced in borderline personality disorder (BPD). We conducted an ego-based (self-reported, individual) social network analysis of 142 participants recruited from clinical and community sources. Each participant listed the 30 most significant people (called alters) in their social network, then rated each alter in terms of amount of contact, social support, attachment strength, and negative interactions. In addition, measures of social integration were derived from participants' reports of the connections between people in their networks. BPD was associated with poorer social support, more frequent negative interactions, and less social integration. Examination of alter-by-BPD interactions indicated that whereas participants with low BPD symptoms had close relationships with people with high centrality within their networks, participants with high BPD symptoms had their closest relationships with people less central to their networks. The results suggest that individuals with BPD are at a social disadvantage: those with whom they are most closely linked (including romantic partners) are less socially connected (i.e., less central) within their social network. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Cloud Computing Applications in Support of Earth Science Activities at Marshall Space Flight Center
NASA Astrophysics Data System (ADS)
Molthan, A.; Limaye, A. S.
2011-12-01
Currently, the NASA Nebula Cloud Computing Platform is available to Agency personnel in a pre-release status as the system undergoes a formal operational readiness review. Over the past year, two projects within the Earth Science Office at NASA Marshall Space Flight Center have been investigating the performance and value of Nebula's "Infrastructure as a Service", or "IaaS" concept and applying cloud computing concepts to advance their respective mission goals. The Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique NASA satellite observations and weather forecasting capabilities for use within the operational forecasting community through partnerships with NOAA's National Weather Service (NWS). SPoRT has evaluated the performance of the Weather Research and Forecasting (WRF) model on virtual machines deployed within Nebula and used Nebula instances to simulate local forecasts in support of regional forecast studies of interest to select NWS forecast offices. In addition to weather forecasting applications, rapidly deployable Nebula virtual machines have supported the processing of high resolution NASA satellite imagery to support disaster assessment following the historic severe weather and tornado outbreak of April 27, 2011. Other modeling and satellite analysis activities are underway in support of NASA's SERVIR program, which integrates satellite observations, ground-based data and forecast models to monitor environmental change and improve disaster response in Central America, the Caribbean, Africa, and the Himalayas. Leveraging SPoRT's experience, SERVIR is working to establish a real-time weather forecasting model for Central America. Other modeling efforts include hydrologic forecasts for Kenya, driven by NASA satellite observations and reanalysis data sets provided by the broader meteorological community. 
Forecast modeling efforts are supplemented by short-term forecasts of convective initiation, determined by geostationary satellite observations processed on virtual machines powered by Nebula. This presentation will provide an overview of these activities from a scientific and cloud computing applications perspective, identifying the strengths and weaknesses for deploying each project within an IaaS environment, and ways to collaborate with the Nebula or other cloud-user communities to collaborate on projects as they go forward.
Ge, Hong-You; Vangsgaard, Steffen; Omland, Øyvind; Madeleine, Pascal; Arendt-Nielsen, Lars
2014-12-06
Musculoskeletal pain in the upper extremity and shoulder region is commonly reported by computer users. However, the functional status of central pain mechanisms, i.e., central sensitization and conditioned pain modulation (CPM), has not been investigated in this population. The aim was to evaluate sensitization and CPM in computer users with and without chronic musculoskeletal pain. Pressure pain threshold (PPT) mapping in the neck-shoulder region (15 points) and the elbow (12 points) was assessed, together with PPT measurement at the mid-point of the tibialis anterior (TA) muscle, in 47 computer users with chronic pain in the upper extremity and/or neck-shoulder region (pain group) and 17 pain-free computer users (control group). Induced pain intensities and their profiles over time were recorded using a 0-10 cm electronic visual analogue scale (VAS) in response to different levels of pressure stimuli on the forearm, using a new technique of dynamic pressure algometry. The efficiency of CPM was assessed using cuff-induced pain as the conditioning stimulus and PPT at TA as the test stimulus. Demographics, job seniority, and the number of working hours per week using a computer were similar between groups. The PPTs measured at all 15 points in the neck-shoulder region were not significantly different between groups. There were no significant differences between groups in either PPTs or the pain intensity induced by dynamic pressure algometry, and no significant group difference in PPT at TA. During CPM, a significant increase in PPT at TA was observed in both groups (P < 0.05), with no significant difference between groups. For the chronic pain group, higher clinical pain intensity, lower PPT values in the neck-shoulder region, and higher pain intensity evoked by the roller were all correlated with less efficient descending pain modulation (P < 0.05).
This suggests that the excitability of the central pain system is normal in a large group of computer users with low pain intensity chronic upper extremity and/or neck-shoulder pain and that increased excitability of the pain system cannot explain the reported pain. However, computer users with higher pain intensity and lower PPTs were found to have decreased efficiency in descending pain modulation.
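CPM efficiency, as assessed above, is commonly quantified as the change in the test-stimulus PPT during conditioning relative to baseline. The following sketch and its percentage convention are illustrative assumptions, not the study's exact formula (sign conventions also vary between groups):

```python
def cpm_effect(ppt_baseline, ppt_conditioning, relative=True):
    """Conditioned pain modulation (CPM) effect from pressure pain
    thresholds (e.g., in kPa) at the test site before and during the
    conditioning stimulus. A positive value means a higher threshold
    during conditioning, i.e., inhibitory (efficient) descending
    modulation; values near zero indicate little modulation."""
    delta = ppt_conditioning - ppt_baseline
    return 100.0 * delta / ppt_baseline if relative else delta
```

For example, a PPT rising from 400 to 460 kPa during cuff conditioning corresponds to a +15% (or +60 kPa absolute) CPM effect.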
Deakyne, S J; Bajaj, L; Hoffman, J; Alessandrini, E; Ballard, D W; Norris, R; Tzimenatos, L; Swietlik, M; Tham, E; Grundmeier, R W; Kuppermann, N; Dayan, P S
2015-01-01
Overuse of cranial computed tomography scans in children with blunt head trauma unnecessarily exposes them to radiation. The Pediatric Emergency Care Applied Research Network (PECARN) blunt head trauma prediction rules identify children who do not require a computed tomography scan. Electronic health record (EHR) based clinical decision support (CDS) may effectively implement these rules but must only be provided for appropriate patients in order to minimize excessive alerts. To develop, implement and evaluate site-specific groupings of chief complaints (CC) that accurately identify children with head trauma, in order to activate data collection in an EHR. As part of a 13 site clinical trial comparing cranial computed tomography use before and after implementation of CDS, four PECARN sites centrally developed and locally implemented CC groupings to trigger a clinical trial alert (CTA) to facilitate the completion of an emergency department head trauma data collection template. We tested and chose CC groupings to attain high sensitivity while maintaining at least moderate specificity. Due to variability in CCs available, identical groupings across sites were not possible. We noted substantial variability in the sensitivity and specificity of seemingly similar CC groupings between sites. The implemented CC groupings had sensitivities greater than 90% with specificities between 75-89%. During the trial, formal testing and provider feedback led to tailoring of the CC groupings at some sites. CC groupings can be successfully developed and implemented across multiple sites to accurately identify patients who should have a CTA triggered to facilitate EHR data collection. However, CC groupings will necessarily vary in order to attain high sensitivity and moderate-to-high specificity. 
In future trials, the balance between sensitivity and specificity should be considered based on the nature of the clinical condition, including prevalence and morbidity, in addition to the goals of the intervention being considered.
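The sensitivity and specificity figures reported above follow the standard screening definitions: the fraction of true head-trauma visits that a chief-complaint grouping flags, and the fraction of non-head-trauma visits it correctly leaves unflagged. A minimal sketch, with hypothetical counts rather than trial data:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity of a chief-complaint (CC) grouping.
    tp/fn: head-trauma visits the grouping did / did not flag;
    tn/fp: non-head-trauma visits it correctly ignored / wrongly flagged."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical example: 100 head-trauma visits (92 flagged) and
# 1000 other visits (150 wrongly flagged) gives 92% sensitivity
# and 85% specificity, within the ranges reported above.
```

Tuning a grouping trades these quantities off: broader CC lists raise sensitivity (fewer missed patients) at the cost of more false alerts, which is exactly the balance the trial had to strike per site.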
Requirements for the structured recording of surgical device data in the digital operating room.
Rockstroh, Max; Franke, Stefan; Neumuth, Thomas
2014-01-01
Due to the increasing complexity of the surgical working environment, technical solutions are increasingly required to help relieve the surgeon. This objective is supported by a structured storage concept for all relevant device data. In this work, we present a concept and prototype development of a storage system for intraoperative medical data. The requirements of such a system are described, and solutions for data transfer, processing, and storage are presented. In a subsequent study, a prototype based on the presented concept is tested for correct and complete data transmission and storage, and for the ability to record a complete neurosurgical intervention with low processing latencies. In the final section, several applications for the presented data recorder are shown. The developed system is able to store the generated data correctly, completely, and quickly enough, even if much more data than expected are sent during a surgical intervention. The Surgical Data Recorder supports automatic recognition of the interventional situation by providing a centralized data storage and access interface to the OR communication bus. In the future, further data acquisition technologies should be integrated, and therefore additional interfaces must be developed. The data generated by these devices and technologies should also be stored in or referenced by the Surgical Data Recorder to support the analysis of the OR situation.
32 CFR Appendix F to Part 651 - Glossary
Code of Federal Regulations, 2013 CFR
2013-07-01
.... ASA(AL&T) Assistant Secretary of the Army (Acquisition, Logistics, and Technology). ASA(FM) Assistant.../Cost Analysis. EICS Environmental Impact Computer System. EIFS Economic Impact Forecast System. EIS... Record of Non-Applicability. RSC Regional Support Command. S&T Science and Technology. SA Secretary of...
32 CFR Appendix F to Part 651 - Glossary
Code of Federal Regulations, 2012 CFR
2012-07-01
.... ASA(AL&T) Assistant Secretary of the Army (Acquisition, Logistics, and Technology). ASA(FM) Assistant.../Cost Analysis. EICS Environmental Impact Computer System. EIFS Economic Impact Forecast System. EIS... Record of Non-Applicability. RSC Regional Support Command. S&T Science and Technology. SA Secretary of...
32 CFR Appendix F to Part 651 - Glossary
Code of Federal Regulations, 2014 CFR
2014-07-01
.... ASA(AL&T) Assistant Secretary of the Army (Acquisition, Logistics, and Technology). ASA(FM) Assistant.../Cost Analysis. EICS Environmental Impact Computer System. EIFS Economic Impact Forecast System. EIS... Record of Non-Applicability. RSC Regional Support Command. S&T Science and Technology. SA Secretary of...
NASA Astrophysics Data System (ADS)
Cárdenas, Jhon; Orjuela-Cañón, Alvaro D.; Cerquera, Alexander; Ravelo, Antonio
2017-11-01
Different studies have used Transfer Entropy (TE) and Granger Causality (GC) computation to quantify the interconnection between physiological systems. These methods have disadvantages in parametrization and in the availability of analytic formulas to evaluate the significance of the results. Another inconvenience relates to the assumptions about the distribution of the models generated from the data. In this document, the authors present a way to measure the causality connecting the Central Nervous System (CNS) and the Cardiac System (CS) in people diagnosed with obstructive sleep apnea syndrome (OSA) before and during treatment with continuous positive airway pressure (CPAP). For this purpose, artificial neural networks were used to obtain models for GC computation, based on time series of normalized powers calculated from electrocardiography (EKG) and electroencephalography (EEG) signals recorded in polysomnography (PSG) studies.
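The study above obtains its predictive models from neural networks; as a hedged illustration of the underlying GC idea only, the classical linear formulation compares the residual variance of an autoregression on the target series alone against one that also includes lagged values of the candidate driver:

```python
import numpy as np

def granger_index(y, x, p=2):
    """Linear Granger causality x -> y with lag order p:
    ln(RSS of AR(p) on y alone / RSS of AR(p) on y plus lagged x).
    Positive values mean that the past of x improves prediction of y.
    Illustrative sketch only; the cited work uses neural-network models."""
    y = np.asarray(y, dtype=float)
    x = np.asarray(x, dtype=float)
    n = len(y)
    Y = y[p:]
    # Lag matrices: column k holds the series delayed by k samples.
    lag_y = np.column_stack([y[p - k:n - k] for k in range(1, p + 1)])
    lag_x = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])
    ones = np.ones((n - p, 1))

    def rss(design):
        beta, *_ = np.linalg.lstsq(design, Y, rcond=None)
        resid = Y - design @ beta
        return float(resid @ resid)

    rss_restricted = rss(np.hstack([ones, lag_y]))          # y's past only
    rss_full = rss(np.hstack([ones, lag_y, lag_x]))         # plus x's past
    return float(np.log(rss_restricted / rss_full))
```

On synthetic data where x drives y with one sample of delay, the index is large in the x -> y direction and near zero in the reverse direction, which is the asymmetry GC exploits.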
The Relational Model Distilled to Support Data Modeling in IS 2002
ERIC Educational Resources Information Center
Waguespack, Leslie J., Jr.
2010-01-01
No individual subject area in IS 2002 impacts more aspects of computing theory or professional preparation than data modeling. For more than four decades the bedrock of data modeling has been the relational data model. There are numerous extensions, variations and implementations of this theory but its core remains the central anchor in the…
ERIC Educational Resources Information Center
Sander, Elisabeth; Heiß, Andrea
2014-01-01
Three different versions of a learning program on trigonometry were compared, a program controlled, non-interactive version (CG), an interactive, conflict inducing version (EG 1), and an interactive one which was supposed to reduce the occurrence of a cognitive conflict regarding the central problem solution (EG 2). Pupils (N = 101) of a…
X-ray investigation of cross-breed silk in cocoon, yarn and fabric forms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Radhalakshmi, Y. C.; Kariappa,; Siddaraju, G. N.
2012-06-05
Recently, the Central Sericulture Research and Training Institute, Mysore, developed many improved cross breeds and bivoltine hybrids. The newly developed cross breeds recorded fibre characteristics that are significantly superior to those of the existing control hybrids. This aspect has been investigated using the X-ray diffraction technique. We employed line profile analysis to compute microstructural parameters. These parameters are compared with the physical parameters of the newly developed cross-breed silk fibers for a better understanding of the structure-property relation in these samples.
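Line profile analysis extracts microstructural parameters from diffraction peak shapes. As a simplified illustration only (the study's actual analysis is more involved, and the numbers below are assumptions, not its data), the Scherrer equation estimates crystallite size from the broadening of a single peak:

```python
import math

def scherrer_size(two_theta_deg, fwhm_deg, wavelength_nm=0.15406, K=0.9):
    """Crystallite size (nm) from the Scherrer equation
    D = K * lambda / (beta * cos(theta)), where beta is the peak's
    full width at half maximum in radians and theta is half the
    2-theta diffraction angle. Defaults assume Cu K-alpha radiation
    (0.15406 nm) and the common shape factor K = 0.9."""
    theta = math.radians(two_theta_deg / 2.0)
    beta = math.radians(fwhm_deg)
    return K * wavelength_nm / (beta * math.cos(theta))
```

Broader peaks (larger FWHM) thus correspond to smaller coherently diffracting domains, one of the microstructural contrasts compared between the cross-breed and control silks.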
Lightning Detection and Ranging system LDAR system description and performance objectives
NASA Technical Reports Server (NTRS)
Poehler, H. A.; Lennon, C. L.
1979-01-01
The instruments used at the six remote stations to measure both the time-of-arrival of the envelope of the pulsed 60 MHz to 80 MHz portion of the RF signal emitted by lightning, and the electric field waveforms are described as well as the two methods of transmitting the signal to the central station. Other topics discussed include data processing, recording, and reduction techniques and the software used for the 2100S, 2114, and 2116 computers.
The Development of Administrative Measures as Indicators of Soldier Effectiveness
1987-08-01
Table 13. Frequency distributions for selected variables: MPRJ/EMF comparison (n = 650 soldiers). ...information were examined: (a) the Enlisted Master File (EMF), a central computer record of selected personnel actions; (b) the Official Military Personnel
Do GPs record the occupation of their patients?
Richards-Taylor, A; Keay, J; Thorley, K
2013-03-01
General practitioners (GPs) have a central role in providing advice about fitness for work, yet there are concerns about their understanding of the relationship between work and health. To assess whether GPs in one Cornish practice record the occupation of patients of working age, and to quantify how important GPs in Cornwall consider it to record occupation in working-age patients. An audit of the notes of 300 working-age patients in one practice, a search of the computer records at a different practice, and a questionnaire survey of 202 GPs in practices in Cornwall. Occupation was recorded in 50 (17%) of the 300 patient notes audited. The questionnaire response rate was 31%. Few (8%) respondents reported training in occupational medicine. Most (65%) GPs recorded their patients' occupation some of the time. A third (32%) of GPs did not consider it important to record patients' occupations. GPs in two Cornish practices recorded the occupation of working-age patients infrequently, but over two-thirds of GPs in Cornwall believe it is important to do so. If these results reflect the practice of UK GPs, the new 'e-fit note' may be of limited value in monitoring and analysing sickness absence.
Lindberg, D A; Humphreys, B L
1995-01-01
The High-Performance Computing and Communications (HPCC) program is a multiagency federal effort to advance the state of computing and communications and to provide the technologic platform on which the National Information Infrastructure (NII) can be built. The HPCC program supports the development of high-speed computers, high-speed telecommunications, related software and algorithms, education and training, and information infrastructure technology and applications. The vision of the NII is to extend access to high-performance computing and communications to virtually every U.S. citizen so that the technology can be used to improve the civil infrastructure, lifelong learning, energy management, health care, etc. Development of the NII will require resolution of complex economic and social issues, including information privacy. Health-related applications supported under the HPCC program and NII initiatives include connection of health care institutions to the Internet; enhanced access to gene sequence data; the "Visible Human" Project; and test-bed projects in telemedicine, electronic patient records, shared informatics tool development, and image systems. PMID:7614116
The Recovery of a Clinical Database Management System after Destruction by Fire *
Covvey, H.D.; McAlister, N.H.; Greene, J.; Wigle, E.D.
1981-01-01
In August 1980 a fire in the Cardiovascular Unit at Toronto General Hospital severely damaged the physical plant and rendered all on-site equipment unrecoverable. Among the hardware items in the fire was the computer which supports our cardiovascular database system. Within hours after the fire it was determined that the computer was no longer serviceable. Beyond off-site back-up tapes, there was the possibility that recent records on the computer had suffered a similar fate. Immediate procedures were instituted to obtain a replacement computer system and to clean media to permit data recovery. Within 2 months a partial system was supporting all users, and all data was recovered and being used. The destructive potential of a fire is rarely seriously considered relative to computer equipment in our clinical environments. Full-replacement value insurance; an excellent equipment supplier with the capacity to respond to an emergency; backup and recovery procedures with off-site storage; and dedicated staff are key hedges against disaster.
NASA Astrophysics Data System (ADS)
Zhang, H.; Griffiths, M. L.; Wu, S.; Kong, W.; Chiang, J. C. H.; Atwood, A. R.; Cheng, H.; Huang, J.; Xie, S.
2017-12-01
Chinese speleothem δ18Oc records have revealed that the Asian summer monsoon underwent pronounced millennial-scale variability during the last deglaciation, yet there is still debate as to what the δ18Oc signals represent. Traditionally, these δ18Oc records were interpreted as a proxy for regional rainfall variability via the East Asian Summer Monsoon (EASM); however, recent isotope-enabled model simulations have suggested that precipitation δ18O over central China is more a reflection of rainfall in the upstream region of the Indian monsoon. Therefore, despite the increased number of speleothem records emerging from the EASM region, we still lack a robust understanding of how local monsoon rainfall variability fluctuated in central China during the last deglaciation. To address this, here we present two new multiproxy speleothem records from Haozhu Cave (HZ), central China, during the deglaciation. HZ δ18Oc time series largely parallel those from other distal cave sites in China and India, suggesting that the oxygen isotopes are indeed dominated by upstream rainout. To inspect the local hydrology, we also examined Sr-Mg-Ba/Ca ratios and δ13C. Interestingly, results show that during Heinrich Stadial 1 and the Younger Dryas, the δ13C and trace elements decrease significantly, which we interpret to reflect higher cave recharge. Thus, despite a weakened Indian monsoon during these cooling events (inferred from the δ18Oc), our results suggest that central China was in fact wetter. To test this hypothesis, we examined past rainfall variability in China using CESM1.0.5 imposed with 1 Sv of North Atlantic (NA) fresh water forcing. Similar to the proxies, results from these simulations demonstrate that south-central China was wetter following NA cooling, whilst northern China was drier. This 'dipole' pattern can best be explained by a seasonally-lagged onset of the mei-yu stage of monsoon evolution.
A later transition from the mei-yu to the midsummer stage during NA cooling would have resulted in a shorter midsummer stage, leaving south-central China wet at the expense of dry conditions to the north. Our proxy and model results thus support a recent hypothesis that paleoclimate changes over East Asia reflect the timing and duration of the monsoon's intraseasonal stages, modulated by the position of the westerlies relative to the Tibetan Plateau.
A knowledge-based patient assessment system: conceptual and technical design.
Reilly, C. A.; Zielstorff, R. D.; Fox, R. L.; O'Connell, E. M.; Carroll, D. L.; Conley, K. A.; Fitzgerald, P.; Eng, T. K.; Martin, A.; Zidik, C. M.; Segal, M.
2000-01-01
This paper describes the design of an inpatient patient assessment application that captures nursing assessment data using a wireless laptop computer. The primary aim of this system is to capture structured information for facilitating decision support and quality monitoring. The system also aims to improve efficiency of recording patient assessments, reduce costs, and improve discharge planning and early identification of patient learning needs. Object-oriented methods were used to elicit functional requirements and to model the proposed system. A tools-based development approach is being used to facilitate rapid development and easy modification of assessment items and rules for decision support. Criteria for evaluation include perceived utility by clinician users, validity of decision support rules, time spent recording assessments, and perceived utility of aggregate reports for quality monitoring. PMID:11079970
What the Toadfish Ear Tells the Toadfish Brain About Sound.
Edds-Walton, Peggy L
2016-01-01
Of the three paired otolithic endorgans in the ear of teleost fishes, the saccule is the one most often demonstrated to have a major role in encoding frequencies of biologically relevant sounds. The toadfish saccule also encodes sound level and sound source direction in the phase-locked activity conveyed via auditory afferents to nuclei of the ipsilateral octaval column in the medulla. Although paired auditory receptors are present in teleost fishes, binaural processes were believed to be unimportant due to the speed of sound in water and the acoustic transparency of the tissues in water. In contrast, there are behavioral and anatomical data that support binaural processing in fishes. Studies in the toadfish combined anatomical tract-tracing and physiological recordings from identified sites along the ascending auditory pathway to document response characteristics at each level. Binaural computations in the medulla and midbrain sharpen the directional information provided by the saccule. Furthermore, physiological studies in the central nervous system indicated that encoding frequency, sound level, temporal pattern, and sound source direction are important components of what the toadfish ear tells the toadfish brain about sound.
Short-term memory and long-term memory are still different.
Norris, Dennis
2017-09-01
A commonly expressed view is that short-term memory (STM) is nothing more than activated long-term memory. If true, this would overturn a central tenet of cognitive psychology: the idea that there are functionally and neurobiologically distinct short- and long-term stores. Here I present an updated case for a separation between short- and long-term stores, focusing on the computational demands placed on any STM system. STM must support memory for previously unencountered information, the storage of multiple tokens of the same type, and variable binding. None of these can be achieved simply by activating long-term memory. For example, even a simple sequence of digits such as "1, 3, 1" where there are 2 tokens of the digit "1" cannot be stored in the correct order simply by activating the representations of the digits "1" and "3" in LTM. I also review recent neuroimaging data that have been presented as evidence that STM is activated LTM and show that these data are exactly what one would expect to see based on a conventional 2-store view. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
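The "1, 3, 1" argument above can be made concrete in code. This is an illustrative sketch, not anything from the paper: a pure "activated LTM" store is modeled as a mapping from item types to activation, while a token-based STM is modeled as a list of position-bound tokens.

```python
def activate(sequence):
    """Model STM as a set of activated LTM *type* representations.

    Order and repetition are necessarily lost: each type appears once.
    """
    return {item: True for item in sequence}

def tokenize(sequence):
    """Model STM as an ordered list of *tokens*, each bound to a position."""
    return list(enumerate(sequence))

seq = ["1", "3", "1"]
print(activate(seq))   # {'1': True, '3': True} -- two tokens of '1' collapse to one type
print(tokenize(seq))   # [(0, '1'), (1, '3'), (2, '1')] -- order and multiplicity kept
```

The activation model cannot distinguish "1, 3, 1" from "3, 1" or "1, 1, 3", which is the paper's point about token storage and binding.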
Real-time data reduction capabilities at the Langley 7 by 10 foot high speed tunnel
NASA Technical Reports Server (NTRS)
Fox, C. H., Jr.
1980-01-01
The 7 by 10 foot high speed tunnel performs a wide range of tests employing a variety of model installation methods. To support the reduction of static data from this facility, a generalized wind tunnel data reduction program had been developed for use on the Langley central computer complex. The capabilities of a version of this generalized program adapted for real time use on a dedicated on-site computer are discussed. The input specifications, instructions for the console operator, and full descriptions of the algorithms are included.
Ground Motion in Central Mexico: A Comprehensive Analysis
NASA Astrophysics Data System (ADS)
Ramirez-Guzman, L.; Juarez, A.; Rábade, S.; Aguirre, J.; Bielak, J.
2015-12-01
This study presents a detailed analysis of the ground motion in Central Mexico based on numerical simulations, as well as broadband and strong ground motion records. We describe and evaluate a velocity model for Central Mexico derived from noise and regional earthquake cross-correlations, which is used throughout this research to estimate the ground motion in the region. The 3D crustal model includes a geotechnical structure of the Valley of Mexico (VM), subduction zone geometry, and 3D velocity distributions. The latter are based on more than 200 low magnitude (Mw < 4.5) earthquakes and two years of noise recordings. We emphasize the analysis on the ground motion in the Valley of Mexico originating from intra-slab deep events and temblors located along the Pacific coast. Also, we quantify the effects of the Trans-Mexican Volcanic Belt (TMVB) and the low-velocity deposits on the ground motion. The 3D octree-based finite element wave propagation computations, valid up to 1 Hz, reveal that the inclusion of a basin with a structure as complex as the Valley of Mexico dramatically enhances the regional effects induced by the TMVB. Moreover, the basin not only produces ground motion amplification and anomalous duration, but it also favors the energy focusing into zones of Mexico City where structures typically undergo high levels of damage.
Deep learning aided decision support for pulmonary nodules diagnosing: a review
Yang, Yixin; Feng, Xiaoyi; Chi, Wenhao; Li, Zhengyang; Duan, Wenzhe; Liu, Haiping; Liang, Wenhua; Wang, Wei; Chen, Ping
2018-01-01
Deep learning techniques have recently emerged as promising decision-support approaches for automatically analyzing medical images for different clinical diagnostic purposes. Computer-assisted diagnosis of pulmonary nodules has received considerable theoretical, computational, and empirical research attention over the past five decades, and numerous methods have been developed for the detection and classification of pulmonary nodules on different image formats, including chest radiographs, computed tomography (CT), and positron emission tomography. The recent remarkable progress in deep learning for pulmonary nodules, achieved in both academia and industry, has demonstrated that deep learning techniques are promising alternative decision-support schemes for tackling the central issues in pulmonary nodule diagnosis, including feature extraction, nodule detection, false-positive reduction, and benign-malignant classification for the huge volume of chest scan data. The main goal of this investigation is to provide a comprehensive state-of-the-art review of deep learning aided decision support for pulmonary nodule diagnosis. As far as the authors know, this is the first review devoted exclusively to deep learning techniques for pulmonary nodule diagnosis. PMID:29780633
76 FR 43993 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-22
...; System of Records AGENCY: National Security Agency/Central Security Service, Department of Defense. ACTION: Notice to Delete a System of Records. SUMMARY: The National Security Agency/Central Security.... FOR FURTHER INFORMATION CONTACT: Ms. Anne Hill, National Security Agency/Central Security Service...
Tilt changes of short duration
McHugh, Stuart
1976-01-01
Section I of this report contains a classification scheme for short period tilt data. For convenience, all fluctuations in the local tilt field of less than 24 hours duration will be designated SP (i.e., short period) tilt events. Three basic categories of waveshape appearance are defined, and the rules for naming the waveforms are outlined. Examples from tilt observations at four central California sites are provided. Section II contains some coseismic tilt data. Fourteen earthquakes in central California, ranging in magnitude from 2.9 to 5.2, were chosen for study on four tiltmeters within 10 source dimensions of the epicenters. The raw records from each of the four tiltmeters at the times of the earthquakes were photographed and are presented in this section. Section III contains documentation of computer programs used in the analysis of the short period tilt data. Program VECTOR computes the difference vector of a tilt event and displays the sequence of events as a head-to-tail vector plot. Program ONSTSP 1) requires two component digitized tilt data as input, 2) scales and plots the data, and 3) computes and displays the amplitude, azimuth, and normalized derivative of the tilt amplitude. Program SHARPS computes the onset sharpness, (i.e., the normalized derivative of the tilt amplitude at the onset of the tilt event) as a function of source-station distance from a model of creep-related tilt changes. Program DSPLAY plots the digitized data.
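Program ONSTSP, as described above, computes the amplitude and azimuth of a tilt event from two-component digitized data. A minimal sketch of that computation follows; the sign and azimuth conventions (clockwise from north) are assumptions for illustration, not taken from the report.

```python
import math

def tilt_amplitude_azimuth(north, east):
    """Convert two-component tilt data into (amplitude, azimuth).

    Amplitude is the vector magnitude of the two components; azimuth is
    measured in degrees clockwise from north (an assumed convention).
    """
    amplitude = math.hypot(north, east)
    azimuth = math.degrees(math.atan2(east, north)) % 360.0
    return amplitude, azimuth

amp, az = tilt_amplitude_azimuth(3.0, 4.0)
print(round(amp, 3), round(az, 1))  # 5.0 53.1
```

The same per-sample amplitude series could then be differenced and normalized to approximate the "normalized derivative of the tilt amplitude" that ONSTSP and SHARPS use to characterize onset sharpness.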
Vertebrobasilar system computed tomographic angiography in central vertigo
Paşaoğlu, Lale
2017-01-01
Abstract The incidence of vertigo in the population is 20% to 30% and one-fourth of the cases are related to central causes. The aim of this study was to evaluate computed tomography angiography (CTA) findings of the vertebrobasilar system in central vertigo without stroke. CTA and magnetic resonance images of patients with vertigo were retrospectively evaluated. One hundred twenty-nine patients suspected of having central vertigo according to history, physical examination, and otological and neurological tests without signs of infarction on diffusion-weighted magnetic resonance imaging were included in the study. The control group included 120 patients with similar vascular disease risk factors but without vertigo. Vertebral and basilar artery diameters, hypoplasias, exit-site variations of vertebral artery, vertebrobasilar tortuosity, and stenosis of ≥50% detected on CTA were recorded for all patients. Independent-samples t test was used in variables with normal distribution, and Mann–Whitney U test in non-normal distribution. The difference of categorical variable distribution according to groups was analyzed with χ2 and/or Fisher exact test. Vertebral artery hypoplasia and ≥50% stenosis were seen more often in the vertigo group (P = 0.000, <0.001). Overall 78 (60.5%) vertigo patients had ≥50% stenosis, 54 (69.2%) had stenosis at V1 segment, 9 (11.5%) at V2 segment, 2 (2.5%) at V3 segment, and 13 (16.6%) at V4 segment. Both vertigo and control groups had similar basilar artery hypoplasia and ≥50% stenosis rates (P = 0.800, >0.05). CTA may be helpful to clarify the association between abnormal CTA findings of vertebral arteries and central vertigo. This article reveals the opportunity to diagnose posterior circulation abnormalities causing central vertigo with a feasible method such as CTA. PMID:28328808
A novel clinical decision support algorithm for constructing complete medication histories.
Long, Ju; Yuan, Michael Juntao
2017-07-01
A patient's complete medication history is a crucial element for physicians to develop a full understanding of the patient's medical conditions and treatment options. However, due to the fragmented nature of medical data, this process can be very time-consuming and often impossible for physicians to construct a complete medication history for complex patients. In this paper, we describe an accurate, computationally efficient and scalable algorithm to construct a medication history timeline. The algorithm is developed and validated based on 1 million random prescription records from a large national prescription data aggregator. Our evaluation shows that the algorithm can be scaled horizontally on-demand, making it suitable for future delivery in a cloud-computing environment. We also propose that this cloud-based medication history computation algorithm could be integrated into Electronic Medical Records, enabling informed clinical decision-making at the point of care. Copyright © 2017 Elsevier B.V. All rights reserved.
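The paper does not spell out its algorithm here, but the core of any medication-history timeline is merging per-drug prescription fill records into coverage intervals. The sketch below is an illustrative assumption of that step, not the authors' method; drug names and dates are invented.

```python
from datetime import date, timedelta
from collections import defaultdict

def medication_timeline(fills, gap_days=0):
    """Merge prescription fills into per-drug coverage intervals.

    Each fill is (drug, fill_date, days_supply). Intervals that overlap,
    or fall within `gap_days` of each other, are merged into one span.
    """
    by_drug = defaultdict(list)
    for drug, start, days in fills:
        by_drug[drug].append((start, start + timedelta(days=days)))
    timeline = {}
    for drug, spans in by_drug.items():
        spans.sort()
        merged = [spans[0]]
        for start, end in spans[1:]:
            last_start, last_end = merged[-1]
            if start <= last_end + timedelta(days=gap_days):
                merged[-1] = (last_start, max(last_end, end))  # extend current span
            else:
                merged.append((start, end))                    # new gap in therapy
        timeline[drug] = merged
    return timeline

fills = [("metformin", date(2017, 1, 1), 30),
         ("metformin", date(2017, 1, 25), 30),
         ("lisinopril", date(2017, 3, 1), 90)]
print(medication_timeline(fills))
```

Because each drug's fills are processed independently after one sort, this kind of merge parallelizes naturally across patients and drugs, consistent with the paper's claim of horizontal scalability.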
Experience using radio frequency laptops to access the electronic medical record in exam rooms.
Dworkin, L. A.; Krall, M.; Chin, H.; Robertson, N.; Harris, J.; Hughes, J.
1999-01-01
Kaiser Permanente, Northwest, evaluated the use of laptop computers to access our existing comprehensive Electronic Medical Record in exam rooms via a wireless radiofrequency (RF) network. Eleven of 22 clinicians who were offered the laptops successfully adopted their use in the exam room. These clinicians were able to increase their exam room time with the patient by almost 4 minutes (25%), apparently without lengthening their overall work day. Patient response to exam room computing was overwhelmingly positive. The RF network response time was similar to the hardwired network. Problems cited by some laptop users and many of the eleven non-adopters included battery issues, different equipment layout and function, and inadequate training. IT support needs for the RF laptops were two to four times greater than for hardwired desktops. Addressing the reliability and training issues should increase clinician acceptance, making a successful general roll-out for exam room computing more likely. PMID:10566458
Curriculum Connection. Take Technology Outdoors.
ERIC Educational Resources Information Center
Dean, Bruce Robert
1992-01-01
Technology can support hands-on science as elementary students use computers to formulate field guides to nature surrounding their school. Students examine other field guides; open databases for recording information; collect, draw, and identify plants, insects, and animals; enter data into the database; then generate a computerized field guide.…
Automated Computer Access Request System
NASA Technical Reports Server (NTRS)
Snook, Bryan E.
2010-01-01
The Automated Computer Access Request (AutoCAR) system is a Web-based account provisioning application that replaces the time-consuming paper-based computer-access request process at Johnson Space Center (JSC). AutoCAR combines rules-based and role-based functionality in one application to provide a centralized system that is easily and widely accessible. The system features a work-flow engine that facilitates request routing, a user registration directory containing contact information and user metadata, an access request submission and tracking process, and a system administrator account management component. This provides full, end-to-end disposition approval chain accountability from the moment a request is submitted. By blending both rules-based and role-based functionality, AutoCAR has the flexibility to route requests based on a user's nationality, JSC affiliation status, and other export-control requirements, while ensuring a user's request is addressed by either a primary or backup approver. All user accounts that are tracked in AutoCAR are recorded and mapped to the native operating system schema on the target platform where user accounts reside. This allows for future extensibility for supporting creation, deletion, and account management directly on the target platforms by way of AutoCAR. The system's directory-based lookup and day-to-day change analysis of directory information determines personnel moves, deletions, and additions, and automatically notifies a user via e-mail to revalidate his/her account access as a result of such changes. AutoCAR is a Microsoft classic active server page (ASP) application hosted on a Microsoft Internet Information Server (IIS).
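The blend of rules-based routing (driven by user attributes such as nationality) with role-based approver assignment (primary or backup) can be sketched as follows. The attribute names, queue names, and routing rules are illustrative assumptions, not NASA's actual policy or AutoCAR's implementation.

```python
def route_request(user, approvers):
    """Pick an approver for an access request.

    Rules-based step: choose a queue from user attributes.
    Role-based step: within the queue, use the primary approver unless
    unavailable, then fall back to the backup.
    """
    if user["nationality"] != "US":
        queue = "export_control"          # export-control review required
    elif user["jsc_affiliation"] == "contractor":
        queue = "contractor"
    else:
        queue = "civil_servant"
    primary, backup = approvers[queue]
    return primary if primary.get("available", True) else backup

approvers = {
    "export_control": ({"name": "A", "available": False}, {"name": "B"}),
    "contractor":     ({"name": "C"}, {"name": "D"}),
    "civil_servant":  ({"name": "E"}, {"name": "F"}),
}
req = {"nationality": "FR", "jsc_affiliation": "contractor"}
print(route_request(req, approvers)["name"])  # B (primary unavailable, backup used)
```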
34 CFR 6.4 - Central records; confidentiality.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Education Office of the Secretary, Department of Education INVENTIONS AND PATENTS (GENERAL) § 6.4 Central records; confidentiality. Central files and records shall be maintained of all inventions, patents, and... Department under such patents. Invention reports required from employees or others for the purpose of...
Keylogger Application to Monitoring Users Activity with Exact String Matching Algorithm
NASA Astrophysics Data System (ADS)
Rahim, Robbi; Nurdiyanto, Heri; Saleh A, Ansari; Abdullah, Dahlan; Hartama, Dedy; Napitupulu, Darmawan
2018-01-01
Technology is developing very quickly, especially Internet technology, which is continuously undergoing significant change; this development is also supported by the growing capability of human resources. The keylogger is among the most widely developed of such tools because these applications are very rarely recognized as malicious programs by antivirus software. A keylogger records all activity related to keystrokes; in this work, the recording process is accomplished using a string matching method. Applying an exact string matching method to the recorded keyboard data helps the administrator know what the user accessed on the computer.
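The exact string matching step named in the title can be illustrated with the naive sliding-window algorithm below. This is a minimal sketch of the matching idea, not the paper's implementation; the sample log text and search pattern are invented.

```python
def exact_match(text, pattern):
    """Naive exact string matching: return every index where `pattern`
    occurs in `text`, by sliding the pattern one position at a time."""
    hits = []
    n, m = len(text), len(pattern)
    for i in range(n - m + 1):
        if text[i:i + m] == pattern:
            hits.append(i)
    return hits

keystroke_log = "user typed: facebook.com password123 facebook"
print(exact_match(keystroke_log, "facebook"))  # [12, 37]
```

In a monitoring tool, the same scan would run over the captured keystroke buffer for each pattern of interest, flagging which sites or commands a user accessed.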
Reduction and analysis of data collected during the electromagnetic tornado experiment
NASA Technical Reports Server (NTRS)
Davisson, L. D.
1976-01-01
Techniques for data processing and analysis are described to support tornado detection by analysis of radio frequency interference in various frequency bands, and sea state determination from short pulse radar measurements. Activities include: strip chart recording of tornado data; the development and implementation of computer programs for digitalization and analysis of the data; data reduction techniques for short pulse radar data, and the simulation of radar returns from the sea surface by computer models.
Efficient Privacy-Aware Record Integration.
Kuzu, Mehmet; Kantarcioglu, Murat; Inan, Ali; Bertino, Elisa; Durham, Elizabeth; Malin, Bradley
2013-01-01
The integration of information dispersed among multiple repositories is a crucial step for accurate data analysis in various domains. In support of this goal, it is critical to devise procedures for identifying similar records across distinct data sources. At the same time, to adhere to privacy regulations and policies, such procedures should protect the confidentiality of the individuals to whom the information corresponds. Various private record linkage (PRL) protocols have been proposed to achieve this goal, involving secure multi-party computation (SMC) and similarity preserving data transformation techniques. SMC methods provide secure and accurate solutions to the PRL problem, but are prohibitively expensive in practice, mainly due to excessive computational requirements. Data transformation techniques offer more practical solutions, but incur the cost of information leakage and false matches. In this paper, we introduce a novel model for practical PRL, which 1) affords controlled and limited information leakage, 2) avoids false matches resulting from data transformation. Initially, we partition the data sources into blocks to eliminate comparisons for records that are unlikely to match. Then, to identify matches, we apply an efficient SMC technique between the candidate record pairs. To enable efficiency and privacy, our model leaks a controlled amount of obfuscated data prior to the secure computations. Applied obfuscation relies on differential privacy which provides strong privacy guarantees against adversaries with arbitrary background knowledge. In addition, we illustrate the practical nature of our approach through an empirical analysis with data derived from public voter records.
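The blocking step described above, which partitions records so that expensive secure comparisons run only within plausibly matching blocks, can be sketched as follows. The blocking key used here (first letter of surname plus birth year) is an illustrative assumption, not the paper's scheme; in the actual protocol the within-block comparison would be an SMC computation rather than a plain pairing.

```python
from collections import defaultdict

def block_records(records, block_key):
    """Partition records into blocks keyed by a cheap, shared blocking key."""
    blocks = defaultdict(list)
    for rec in records:
        blocks[block_key(rec)].append(rec)
    return blocks

def candidate_pairs(blocks_a, blocks_b):
    """Yield cross-source record pairs that share a block; only these
    pairs would be passed to the expensive secure comparison."""
    for key in blocks_a.keys() & blocks_b.keys():
        for a in blocks_a[key]:
            for b in blocks_b[key]:
                yield a, b

key = lambda r: (r["surname"][0], r["birth_year"])
src_a = [{"surname": "Smith", "birth_year": 1970},
         {"surname": "Jones", "birth_year": 1982}]
src_b = [{"surname": "Smyth", "birth_year": 1970},
         {"surname": "Brown", "birth_year": 1990}]
pairs = list(candidate_pairs(block_records(src_a, key), block_records(src_b, key)))
print(len(pairs))  # 1: only Smith/Smyth share a block
```

Note that in the paper's model even the block assignments leak information, which is why they are obfuscated under differential privacy before being exchanged.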
Musser, Jonathan W.; Watson, Kara M.; Painter, Jaime A.; Gotvald, Anthony J.
2016-02-22
Heavy rainfall occurred across South Carolina during October 1–5, 2015, as a result of an upper atmospheric low-pressure system that funneled tropical moisture from Hurricane Joaquin into the State. The storm caused major flooding in the central and coastal parts of South Carolina. Almost 27 inches of rain fell near Mount Pleasant in Charleston County during this period. U.S. Geological Survey (USGS) streamgages recorded peaks of record at 17 locations, and 15 other locations had peaks that ranked in the top 5 for the period of record. During the October 2015 flood event, USGS personnel made about 140 streamflow measurements at 86 locations to verify, update, or extend existing rating curves (which are used to compute streamflow from monitored river stage). Immediately after the storm event, USGS personnel documented 602 high-water marks, noting the location and height of the water above land surface. Later in October, 50 additional high-water marks were documented near bridges for South Carolina Department of Transportation. Using a subset of these high-water marks, 20 flood-inundation maps of 12 communities were created. Digital datasets of the inundation area, modeling boundary, and water depth rasters are all available for download.
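The rating curves mentioned above relate monitored river stage to streamflow. A common functional form for such a stage-discharge relation is the power law Q = c(h - h0)^b; the sketch below uses that standard form with purely illustrative coefficients, since the report does not give site-specific values.

```python
def rating_curve_discharge(stage, c, h0, b):
    """Power-law rating curve: discharge Q = c * (stage - h0) ** b.

    `h0` is the stage of zero flow; below it, discharge is zero.
    Coefficients here are illustrative assumptions for one hypothetical site.
    """
    if stage <= h0:
        return 0.0
    return c * (stage - h0) ** b

# Hypothetical site: c = 150, stage of zero flow h0 = 2.0 ft, exponent b = 1.5
print(rating_curve_discharge(6.0, 150.0, 2.0, 1.5))  # 1200.0
```

The streamflow measurements made during the flood serve exactly to verify or extend such curves at high stages, where extrapolation beyond previously measured flows is least reliable.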
Wright, Adam; Sittig, Dean F; Ash, Joan S; Erickson, Jessica L; Hickman, Trang T; Paterno, Marilyn; Gebhardt, Eric; McMullen, Carmit; Tsurikova, Ruslana; Dixon, Brian E; Fraser, Greg; Simonaitis, Linas; Sonnenberg, Frank A; Middleton, Blackford
2015-11-01
To identify challenges, lessons learned and best practices for service-oriented clinical decision support, based on the results of the Clinical Decision Support Consortium, a multi-site study which developed, implemented and evaluated clinical decision support services in a diverse range of electronic health records. Ethnographic investigation using the rapid assessment process, a procedure for agile qualitative data collection and analysis, including clinical observation, system demonstrations and analysis and 91 interviews. We identified challenges and lessons learned in eight dimensions: (1) hardware and software computing infrastructure, (2) clinical content, (3) human-computer interface, (4) people, (5) workflow and communication, (6) internal organizational policies, procedures, environment and culture, (7) external rules, regulations, and pressures and (8) system measurement and monitoring. Key challenges included performance issues (particularly related to data retrieval), differences in terminologies used across sites, workflow variability and the need for a legal framework. Based on the challenges and lessons learned, we identified eight best practices for developers and implementers of service-oriented clinical decision support: (1) optimize performance, or make asynchronous calls, (2) be liberal in what you accept (particularly for terminology), (3) foster clinical transparency, (4) develop a legal framework, (5) support a flexible front-end, (6) dedicate human resources, (7) support peer-to-peer communication, (8) improve standards. The Clinical Decision Support Consortium successfully developed a clinical decision support service and implemented it in four different electronic health records and four diverse clinical sites; however, the process was arduous. The lessons identified by the Consortium may be useful for other developers and implementers of clinical decision support services. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Virtualized Networks and Virtualized Optical Line Terminal (vOLT)
NASA Astrophysics Data System (ADS)
Ma, Jonathan; Israel, Stephen
2017-03-01
The success of the Internet and the proliferation of the Internet of Things (IoT) devices is forcing telecommunications carriers to re-architect the central office as a datacenter (CORD) so as to bring datacenter economics and cloud agility to the central office (CO). The Open Network Operating System (ONOS) is the first open-source software-defined network (SDN) operating system which is capable of managing and controlling network, computing, and storage resources to support CORD infrastructure and network virtualization. The virtualized Optical Line Termination (vOLT) is one of the key components in such virtualized networks.
A shared computer-based problem-oriented patient record for the primary care team.
Linnarsson, R; Nordgren, K
1995-01-01
1. INTRODUCTION. A computer-based patient record (CPR) system, Swedestar, has been developed for use in primary health care. The principal aim of the system is to support continuous quality improvement through improved information handling, improved decision-making, and improved procedures for quality assurance. The Swedestar system has evolved during a ten-year period beginning in 1984. 2. SYSTEM DESIGN. The design philosophy is based on the following key factors: a shared, problem-oriented patient record; structured data entry based on an extensive controlled vocabulary; advanced search and query functions, where the query language has the most important role; integrated decision support for drug prescribing and care protocols and guidelines; integrated procedures for quality assurance. 3. A SHARED PROBLEM-ORIENTED PATIENT RECORD. The core of the CPR system is the problem-oriented patient record. All problems of one patient, recorded by different members of the care team, are displayed on the problem list. Starting from this list, a problem follow-up can be made, one problem at a time or for several problems simultaneously. Thus, it is possible to get an integrated view, across provider categories, of those problems of one patient that belong together. This shared problem-oriented patient record provides an important basis for the primary care team work. 4. INTEGRATED DECISION SUPPORT. The decision support of the system includes a drug prescribing module and a care protocol module. The drug prescribing module is integrated with the patient records and includes an on-line check of the patient's medication list for potential interactions and data-driven reminders concerning major drug problems. Care protocols have been developed for the most common chronic diseases, such as asthma, diabetes, and hypertension. The patient records can be automatically checked according to the care protocols. 5. PRACTICAL EXPERIENCE. 
The Swedestar system has been implemented in a primary care area with 30,000 inhabitants. It is being used by all the primary care team members: 15 general practitioners, 25 district nurses, and 10 physiotherapists. Several years of practical experience of the CPR system shows that it has a positive impact on quality of care on four levels: 1) improved clinical follow-up of individual patients; 2) facilitated follow-up of aggregated data such as practice activity analysis, annual reports, and clinical indicators; 3) automated medical audit; and 4) concurrent audit. Within that primary care area, quality of care has improved substantially in several aspects due to the use of the CPR system [1].
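The on-line interaction check described in section 4, screening the patient's medication list when a new drug is prescribed, can be sketched as below. The drug names and interaction pairs are illustrative assumptions, not Swedestar's actual knowledge base.

```python
# Illustrative interaction table: unordered drug pairs mapped to a warning.
INTERACTIONS = {
    frozenset({"warfarin", "aspirin"}): "increased bleeding risk",
    frozenset({"simvastatin", "clarithromycin"}): "myopathy risk",
}

def check_new_prescription(medication_list, new_drug):
    """Return (current_drug, warning) reminders triggered by the new drug
    against the patient's existing medication list."""
    warnings = []
    for current in medication_list:
        note = INTERACTIONS.get(frozenset({current, new_drug}))
        if note:
            warnings.append((current, note))
    return warnings

print(check_new_prescription(["warfarin", "metformin"], "aspirin"))
# [('warfarin', 'increased bleeding risk')]
```

Using an unordered pair (`frozenset`) as the table key means the check works regardless of which drug of the pair was prescribed first, one plausible design for this kind of data-driven reminder.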
When Informationists Get Involved: the CHICA-GIS Project.
Whipple, Elizabeth C; Odell, Jere D; Ralston, Rick K; Liu, Gilbert C
2013-01-01
Child Health Improvement through Computer Automation (CHICA) is a computer decision support system (CDSS) that interfaces with existing electronic medical record systems (EMRS), delivers "just-in-time" patient-relevant guidelines to physicians during the clinical encounter, and accurately captures structured data from all who interact with the system. "Delivering Geospatial Intelligence to Health Care Professionals (CHICA-GIS)" (1R01LM010923-01) expands the medical application of Geographic Information Systems (GIS) by integrating a geographic information system with CHICA. To provide knowledge management support for CHICA-GIS, three informationists at the Indiana University School of Medicine were awarded a supplement from the National Library of Medicine. The informationists will enhance CHICA-GIS by: improving the accuracy and accessibility of information, managing and mapping the knowledge which undergirds the CHICA-GIS decision support tool, supporting community engagement and consumer health information outreach, and facilitating the dissemination of new CHICA-GIS research results and services.
75 FR 56079 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-15
... to the National Security Agency/Central Security Service, Freedom of Information Act/Privacy Act...; System of Records AGENCY: National Security Agency/Central Security Service, DoD. ACTION: Notice to amend a system of records. SUMMARY: The National Security Agency/Central Security Service is proposing to...
Intravascular lymphoma involving the central and peripheral nervous systems in a dog.
Bush, William W; Throop, Juliene L; McManus, Patricia M; Kapatkin, Amy S; Vite, Charles H; Van Winkle, Tom J
2003-01-01
A 5-year-old, castrated male mixed-breed dog was presented for paraparesis, ataxia, hyperesthesia, and thrombocytopenia of 5 months' duration and recurrent seizures during the preceding 2 weeks. Multifocal neurological, ophthalmological, pulmonary, and cardiac diseases were identified. Magnetic resonance imaging and cerebrospinal fluid analysis supported a tentative diagnosis of neoplastic or inflammatory disease. A computed tomography-guided biopsy provided both cytopathological and histopathological evidence of intravascular lymphoma. The disease progressed despite chemotherapy with prednisone, L-asparaginase, and vincristine. Postmortem histopathological examinations suggested intravascular lymphoma in the central and peripheral nervous systems as well as in multiple other organ systems. This is the first description of an antemortem diagnosis and treatment of intravascular lymphoma involving the central nervous system of a dog.
Evolution in Clinical Knowledge Management Strategy at Intermountain Healthcare
Hulse, Nathan C.; Galland, Joel; Borsato, Emerson P.
2012-01-01
In this manuscript, we present an overview of the clinical knowledge management strategy at Intermountain Healthcare in support of our electronic medical record systems. Intermountain first initiated efforts in developing a centralized enterprise knowledge repository in 2001. Applications developed, areas of emphasis served, and key areas of focus are presented. We also detail historical and current areas of emphasis, in response to business needs. PMID:23304309
[Sleep and respiratory disorders in myotonic dystrophy of Steinert].
López-Esteban, P; Peraita-Adrados, R
2000-03-01
It has been hypothesized that hypersomnia and sleep-related respiratory impairment are both central in origin in myotonic dystrophy. Our aim was to describe, by means of video-polysomnographic recordings, the central origin of these sleep-related respiratory disorders. We studied 11 patients, 6 men and 5 women (mean age 42.7 years), with myotonic dystrophy. A moderate to severe ventilatory impairment of a primarily restrictive type was seen in all patients, three of them after a first episode of respiratory insufficiency. The patients were evaluated to determine their body mass index and the presence of sleep-related complaints. Video-polysomnographic recordings (EEG, EOG, EKG, submental and tibialis anterior EMGs, respiration, and SaO2) and pulmonary function tests were performed in each patient. Identical recordings were repeated in six cases that were to undergo non-invasive bi-level positive airway pressure ventilation (BiPAP), in order to adjust the inspiratory and expiratory pressures and the machine mode. We found slight hypopnea and apnea, predominantly of a central type, in stage 1 and REM sleep, and alveolar hypoventilation in all patients. Sleep was disrupted and the efficiency index was very low. In three patients, HLA typing showed a positive DQ6 haplotype. Six patients were treated with nasal BiPAP (n-BiPAP). Nasal BiPAP should be considered as an alternative for ventilatory support during sleep in these patients, and video-polysomnography as a valid method of determining the ideal time to start treatment.
The Tale of Flooding over the Central United States: Not Bigger but More Frequent
NASA Astrophysics Data System (ADS)
Mallakpour, I.; Villarini, G.
2014-12-01
Flooding over the central United States is responsible for large societal and economic impacts, quantifiable in tens of fatalities and billions of dollars in damage. Because of these large repercussions, it is of paramount importance to examine whether the magnitude and/or frequency of flood events have been changing over the most recent decades. Here we address this research question using annual and seasonal maximum daily streamflow records from 774 U.S. Geological Survey (USGS) stations over the central United States (the study area includes North Dakota, South Dakota, Nebraska, Kansas, Missouri, Iowa, Minnesota, Wisconsin, Illinois, West Virginia, Kentucky, Ohio, Indiana, and Michigan). The focus is on "long" records (i.e., at least 50 years of data) ending no earlier than 2011. Analyses are performed using block-maximum and peak-over-threshold approaches. We find limited evidence suggesting increasing or decreasing trends in the magnitude of flood peaks over this area. On the other hand, there is much stronger evidence of increasing frequency of flood events. Therefore, our results support the notion that it is not so much that the largest flood peaks are getting larger, but rather that we have been experiencing a larger number of flood events every year. By examining the rainfall records, we are able to link these increasing trends to similar patterns in heavy rainfall over the region.
García-Muñoz Rodrigo, Fermín; Urquía Martí, Lourdes; Galán Henríquez, Gloria; Rivero Rodríguez, Sonia; Hernández Gómez, Alberto
2018-06-18
To characterize the neural breathing pattern in preterm infants supported with non-invasive neurally adjusted ventilatory assist (NIV-NAVA). Single-center prospective observational study. The electrical activity of the diaphragm (EAdi) was periodically recorded in 30-second series with the Edi catheter and the Servo-n software (Maquet, Solna, Sweden) in preterm infants supported with NIV-NAVA. The peak, minimum, tonic, and phasic EAdi values (EAdiPeak, EAdiMin, EAdiTonic, EAdiPhasic), the neural inspiratory and expiratory times (nTi and nTe), and the neural respiratory rate (nRR) were calculated. EAdi curves were generated in Excel for visual examination and classified according to the predominant pattern. 291 observations were analyzed in 19 patients with a mean gestational age of 27.3 weeks (range 24-36 weeks), birth weight 1028 g (510-2945 g), and a median (IQR) postnatal age of 18 days (4-27 days). The distribution of respiratory patterns was: phasic without tonic activity, 61.9%; phasic with basal tonic activity, 18.6%; tonic burst, 3.8%; central apnea, 7.9%; and mixed pattern, 7.9%. In addition, 12% of the records showed apneas of >10 seconds, and 50.2% showed one or more "sighs", defined as breaths with an EAdiPeak and/or nTi greater than twice the average EAdiPeak and/or nTi of the recording. Neural times were measurable in 252 observations. The nTi was, median (IQR), 279 ms (253-285 ms); the nTe, 764 ms (642-925 ms); and the nRR, 63 bpm (51-70), with great intra- and inter-subject variability. The neural breathing patterns in preterm infants supported with NIV-NAVA are quite variable and are characterized by the presence of significant tonic activity. Central apneas and sighs are common in this group of patients. The nTi seems to be shorter than the mechanical Ti commonly used in assisted ventilation.
76 FR 58786 - Privacy Act of 1974; Systems of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-22
... National Security Agency/Central Security System systems of records notices subject to the Privacy Act of... inquiries to the National Security Agency/Central Security Service, Freedom of Information Act/Privacy Act...; Systems of Records AGENCY: National Security Agency/Central Security Service, Department of Defense (DoD...
77 FR 56628 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-13
... to the National Security Agency/Central Security Service, Freedom of Information Act/Privacy Act...; System of Records AGENCY: National Security Agency/Central Security Service, DoD. ACTION: Notice to add a system of records. SUMMARY: The National Security Agency/Central Security Service proposes to add a new...
78 FR 45913 - Privacy Act of 1974; Systems of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-30
... National Security Agency/Central Security Service systems of records subject to the Privacy Act of 1974 (5... National Security Agency/Central Security Service, Freedom of Information Act/Privacy Act Office, 9800...; Systems of Records AGENCY: National Security Agency/Central Security Service, DoD. ACTION: Notice to alter...
Hari, Pradip; Ko, Kevin; Koukoumidis, Emmanouil; Kremer, Ulrich; Martonosi, Margaret; Ottoni, Desiree; Peh, Li-Shiuan; Zhang, Pei
2008-10-28
Increasingly, spatial awareness plays a central role in many distributed and mobile computing applications. Spatially aware applications rely on information about the geographical position of compute devices and their supported services in order to support novel functionality. While many spatial application drivers already exist in mobile and distributed computing, very little systems research has explored how best to program these applications, to express their spatial and temporal constraints, and to allow efficient implementations on highly dynamic real-world platforms. This paper proposes the SARANA system architecture, which includes language and run-time system support for spatially aware and resource-aware applications. SARANA allows users to express spatial regions of interest, as well as trade-offs between quality of result (QoR), latency, and cost. The goal is to produce applications that use resources efficiently and that can be run on diverse resource-constrained platforms ranging from laptops to personal digital assistants and smart phones. SARANA's run-time system manages QoR and cost trade-offs dynamically by tracking resource availability and locations, brokering usage/pricing agreements, and migrating programs to nodes accordingly. A resource cost model permeates the SARANA system layers, permitting users to express their resource needs and QoR expectations in units that make sense to them. Although we are still early in the system development, initial versions have been demonstrated on a nine-node system prototype.
SPIRES Tailored to a Special Library: A Mainframe Answer for a Small Online Catalog.
ERIC Educational Resources Information Center
Newton, Mary
1989-01-01
Describes the design and functions of a technical library database maintained on a mainframe computer and supported by the SPIRES database management system. The topics covered include record structures, vocabulary control, input procedures, searching features, time considerations, and cost effectiveness. (three references) (CLB)
Gradiency and Visual Context in Syntactic Garden-Paths
ERIC Educational Resources Information Center
Farmer, Thomas A.; Anderson, Sarah E.; Spivey, Michael J.
2007-01-01
Through recording the streaming x- and y-coordinates of computer-mouse movements, we report evidence that visual context provides an immediate constraint on the resolution of syntactic ambiguity in the visual-world paradigm. This finding converges with previous eye-tracking results that support a constraint-based account of sentence processing, in…
IMS Version 3 Student Data Base Maintenance Program.
ERIC Educational Resources Information Center
Brown, John R.
Computer routines that update the Instructional Management System (IMS) Version 3 student data base which supports the Southwest Regional Laboratory's (SWRL) student monitoring system are described. Written in IBM System 360 FORTRAN IV, the program updates the data base by adding, changing and deleting records, as well as adding and deleting…
ERIC Educational Resources Information Center
Peters, Randall D.
2004-01-01
In these studies, a vegetable can containing fluid was swung as a pendulum by supporting its end-lips with a pair of knife edges. The motion was measured with a capacitive sensor and the logarithmic decrement in free decay was estimated from computer-collected records. Measurements performed with nine different homogeneous liquids, distributed…
Recording Images Observed Using Ripple Tanks
ERIC Educational Resources Information Center
Auty, Geoff
2018-01-01
Diagrams and photographs (or computer simulations) should not replace effective observations of the wave properties that can be illustrated using a ripple tank, but they can provide support when discussing and revising what has been observed. This article explains and illustrates a route towards successful photography, which is much easier with…
ERIC Educational Resources Information Center
Quixal, Martí; Meurers, Detmar
2016-01-01
The paper tackles a central question in the field of Intelligent Computer-Assisted Language Learning (ICALL): How can language learning tasks be conceptualized and made explicit in a way that supports the pedagogical goals of current Foreign Language Teaching and Learning and at the same time provides an explicit characterization of the Natural…
DeCourcy, Kelly; Hostnik, Eric T; Lorbach, Josh; Knoblaugh, Sue
2016-12-01
An adult leopard gecko (Eublepharis macularius) presented for lethargy, hyporexia, weight loss, decreased passage of waste, and a palpable caudal coelomic mass. Computed tomography showed a heterogeneous hyperattenuating (∼143 Hounsfield units) structure within the right caudal coelom. The distal colon-coprodeum lumen or urinary bladder was hypothesized as the most likely location for the heterogeneous structure. Medical support consisted of a warm-water and lubricant enema, as well as a heated environment. Medical intervention aided the passage of a plug composed centrally of cholesterol and urates with peripheral stratified layers of fibrin, macrophages, heterophils, and bacteria. Within 24 hr, a follow-up computed tomography scan showed resolution of the pelvic canal plug.
NASA Astrophysics Data System (ADS)
Andersen, Nils; Lauterbach, Stefan; Erlenkeuser, Helmut; Danielopol, Dan L.; Namiotko, Tadeusz; Hüls, Matthias; Belmecheri, Soumaya; Dulski, Peter; Nantke, Carla; Meyer, Hanno; Chapligin, Bernhard; von Grafenstein, Ulrich; Brauer, Achim
2017-09-01
The so-called 8.2 ka event represents one of the most prominent cold climate anomalies during the Holocene warm period. Accordingly, several studies have addressed its trigger mechanisms, absolute dating and regional characteristics so far. However, knowledge about subsequent climate recovery is still limited although this might be essential for the understanding of rapid climatic changes. Here we present a new sub-decadally resolved and precisely dated oxygen isotope (δ18O) record for the interval between 7.7 and 8.7 ka BP (10³ calendar years before AD 1950), derived from the calcareous valves of benthic ostracods preserved in the varved lake sediments of pre-Alpine Mondsee (Austria). Besides a clear reflection of the 8.2 ka event, showing good agreement in timing, duration and magnitude with other regional stable isotope records, the high-resolution Mondsee lake sediment record provides evidence for a 75-year-long interval of higher-than-average δ18O values directly after the 8.2 ka event, possibly reflecting increased air temperatures in Central Europe. This observation is consistent with evidence from other proxy records in the North Atlantic realm, thus most probably reflecting a hemispheric-scale climate signal rather than a local phenomenon. As a possible trigger we suggest an enhanced resumption of the Atlantic meridional overturning circulation (AMOC), supporting assumptions from climate model simulations.
Lawrence, J. F.; Cochran, E.S.; Chung, A.; Kaiser, A.; Christensen, C. M.; Allen, R.; Baker, J.W.; Fry, B.; Heaton, T.; Kilb, Debi; Kohler, M.D.; Taufer, M.
2014-01-01
We test the feasibility of rapidly detecting and characterizing earthquakes with the Quake‐Catcher Network (QCN) that connects low‐cost microelectromechanical systems accelerometers to a network of volunteer‐owned, Internet‐connected computers. Following the 3 September 2010 M 7.2 Darfield, New Zealand, earthquake we installed over 180 QCN sensors in the Christchurch region to record the aftershock sequence. The sensors are monitored continuously by the host computer and send trigger reports to the central server. The central server correlates incoming triggers to detect when an earthquake has occurred. The location and magnitude are then rapidly estimated from a minimal set of received ground‐motion parameters. Full seismic time series are typically not retrieved for tens of minutes or even hours after an event. We benchmark the QCN real‐time detection performance against the GNS Science GeoNet earthquake catalog. Under normal network operations, QCN detects and characterizes earthquakes within 9.1 s of the earthquake rupture and determines the magnitude within 1 magnitude unit of that reported in the GNS catalog for 90% of the detections.
Strang, Adam J; Berg, William P; Hieronymus, Mathias
2009-08-01
Muscle fatigue has been shown to result in early onset of anticipatory postural adjustments (APAs) relative to those produced in a non-fatigued state. This adaptation is thought to reflect an attempt to preserve postural stability during a focal movement performed in a fatigued state. It remains unclear, however, whether this adaptation is of central (e.g., central nervous system motor command) or peripheral (e.g., muscle contractile properties) origin. One way to confirm that this adaptation is centrally driven is to identify fatigue-induced early APA onsets in non-fatigued muscles. In this study, APAs were obtained using a rapid bilateral reaching maneuver and recorded via surface electromyography before and after conditions of rest (n = 25) or fatigue (n = 25). Fatigue was generated using isokinetic exercise of the right leg. Results showed that fatigue-induced early APA onsets occurred in both fatigued and non-fatigued muscles, confirming that fatigue-induced early APA onset is a centrally mediated adaptation.
Crowd-Sourcing Seismic Data: Lessons Learned from the Quake-Catcher Network
NASA Astrophysics Data System (ADS)
Cochran, E. S.; Sumy, D. F.; DeGroot, R. M.; Clayton, R. W.
2015-12-01
The Quake Catcher Network (QCN; qcn.caltech.edu) uses low cost micro-electro-mechanical system (MEMS) sensors hosted by volunteers to collect seismic data. Volunteers use accelerometers internal to laptop computers, phones, tablets or small (the size of a matchbox) MEMS sensors plugged into desktop computers using a USB connector to collect scientifically useful data. Data are collected and sent to a central server using the Berkeley Open Infrastructure for Network Computing (BOINC) distributed computing software. Since 2008, when the first citizen scientists joined the QCN project, sensors installed in museums, schools, offices, and residences have collected thousands of earthquake records. We present and describe the rapid installations of very dense sensor networks that have been undertaken following several large earthquakes including the 2010 M8.8 Maule Chile, the 2010 M7.1 Darfield, New Zealand, and the 2015 M7.8 Gorkha, Nepal earthquake. These large data sets allowed seismologists to develop new rapid earthquake detection capabilities and closely examine source, path, and site properties that impact ground shaking at a site. We show how QCN has engaged a wide sector of the public in scientific data collection, providing the public with insights into how seismic data are collected and used. Furthermore, we describe how students use data recorded by QCN sensors installed in their classrooms to explore and investigate earthquakes that they felt, as part of 'teachable moment' exercises.
Noninvasive ventilation in a child affected by achondroplasia respiratory difficulty syndrome.
Ottonello, Giancarlo; Villa, Giovanna; Moscatelli, Andrea; Diana, Maria Cristina; Pavanello, Marco
2007-01-01
Achondroplasia can result in respiratory difficulty in early infancy, from anatomical abnormalities such as mid-facial hypoplasia and/or adenotonsillar hypertrophy, leading to obstructive apnea, or to pathophysiological changes occurring in nasopharyngeal or glossal muscle tone, related to neurological abnormalities (foramen magnum and/or hypoglossal canal problems, hydrocephalus), leading to central apnea. More often, the two respiratory components (central and obstructive) are both evident in mixed apnea. Polysomnographic recording should be used during preoperative and postoperative assessment of achondroplastic children and in the subsequent follow-up to assess the adequacy of continuing home respiratory support, including supplemental oxygen, bilevel positive airway pressure, or assisted ventilation.
Wilson, J Adam; Williams, Justin C
2009-01-01
The clock speeds of modern computer processors have nearly plateaued in the past 5 years. Consequently, neural prosthetic systems that rely on processing large quantities of data in a short period of time face a bottleneck, in that it may not be possible to process all of the data recorded from an electrode array with high channel counts and bandwidth, such as electrocorticographic grids or other implantable systems. Therefore, in this study a method of using the processing capabilities of a graphics card [graphics processing unit (GPU)] was developed for real-time neural signal processing of a brain-computer interface (BCI). The NVIDIA CUDA system was used to offload processing to the GPU, which is capable of running many operations in parallel, potentially greatly increasing the speed of existing algorithms. The BCI system records many channels of data, which are processed and translated into a control signal, such as the movement of a computer cursor. This signal processing chain involves computing a matrix-matrix multiplication (i.e., a spatial filter), followed by calculating the power spectral density on every channel using an auto-regressive method, and finally classifying appropriate features for control. In this study, the first two computationally intensive steps were implemented on the GPU, and the speed was compared to both the current implementation and a central processing unit-based implementation that uses multi-threading. Significant performance gains were obtained with GPU processing: the current implementation processed 1000 channels of 250 ms in 933 ms, while the new GPU method took only 27 ms, an improvement of nearly 35 times.
Domingo, Dorothy L.; Aster, R.; Grand, S.; Ni, J.; Baldridge, W. S.; Wilson, David C. (USGS)
2010-01-01
After maintaining elevations near sea level for over 500 million years, the Colorado Plateau (CP) has a present average elevation of 2 km. We compute new receiver function images from the first dense seismic transect to cross the plateau that reveal a central CP crustal thickness of 42–50 km thinning to 30–35 km at the CP margins. Isostatic calculations show that only approximately 20% of central CP elevations can be explained by thickened crust alone, with the CP edges requiring nearly total mantle compensation. We calculate an uplift budget showing that CP buoyancy arises from a combination of crustal thickening, heating and alteration of the lithospheric root, dynamic support from mantle upwelling, and significant buoyant edge effects produced by small-scale convecting asthenosphere at its margins.
Removing the center from computing: biology's new mode of digital knowledge production.
November, Joseph
2011-06-01
This article shows how the USA's National Institutes of Health (NIH) helped to bring about a major shift in the way computers are used to produce knowledge and in the design of computers themselves as a consequence of its early 1960s efforts to introduce information technology to biologists. Starting in 1960 the NIH sought to reform the life sciences by encouraging researchers to make use of digital electronic computers, but despite generous federal support biologists generally did not embrace the new technology. Initially the blame fell on biologists' lack of appropriate (i.e. digital) data for computers to process. However, when the NIH consulted MIT computer architect Wesley Clark about this problem, he argued that the computer's centralized nature posed an even greater challenge to potential biologist users than did the computer's need for digital data. Clark convinced the NIH that if the agency hoped to effectively computerize biology, it would need to satisfy biologists' experimental and institutional needs by providing them the means to use a computer without going to a computing center. With NIH support, Clark developed the 1963 Laboratory Instrument Computer (LINC), a small, real-time interactive computer intended to be used inside the laboratory and controlled entirely by its biologist users. Once built, the LINC provided a viable alternative to the 1960s norm of large computers housed in computing centers. As such, the LINC not only became popular among biologists, but also served in later decades as an important precursor of today's computing norm in the sciences and far beyond: the personal computer.
A Medical Decision Support System for the Space Station Health Maintenance Facility
Ostler, David V.; Gardner, Reed M.; Logan, James S.
1988-01-01
NASA is developing a Health Maintenance Facility (HMF) to provide the equipment and supplies necessary to deliver medical care in the Space Station. An essential part of the Health Maintenance Facility is a computerized Medical Decision Support System (MDSS) that will enhance the ability of the medical officer ("paramedic" or "physician") to maintain the crew's health and to provide emergency medical care. The computer system has four major functions: 1) collect and integrate medical information into an electronic medical record from Space Station medical officers, HMF instrumentation, and exercise equipment; 2) provide an integrated medical record and medical reference information management system; 3) manage inventory for logistical support of supplies and secure pharmaceuticals; 4) supply audio and electronic mail communications between the medical officer and ground-based flight surgeons. Images: Figure 1.
System and Method for Monitoring Distributed Asset Data
NASA Technical Reports Server (NTRS)
Gorinevsky, Dimitry (Inventor)
2015-01-01
A computer-based monitoring system, and a monitoring method implemented in computer software, for detecting, estimating, and reporting the condition states, their changes, and anomalies for many assets. The assets are of the same type, are operated over a period of time, and are outfitted with data collection systems. The proposed monitoring method accounts for the variability of working conditions for each asset by using a regression model that characterizes asset performance. The assets are of the same type but not identical. The proposed monitoring method accounts for asset-to-asset variability; it also accounts for drifts and trends in the asset condition and data. The proposed monitoring system can perform distributed processing of massive amounts of historical data without discarding any useful information, where moving all the asset data into one central computing system might be infeasible. The overall processing includes distributed preprocessing of data records from each asset to produce compressed data.
NASA Astrophysics Data System (ADS)
Papers are presented on ISDN, mobile radio systems and techniques for digital connectivity, centralized and distributed algorithms in computer networks, communications networks, quality assurance and impact on cost, adaptive filters in communications, the spread spectrum, signal processing, video communication techniques, and digital satellite services. Topics discussed include performance evaluation issues for integrated protocols, packet network operations, the computer network theory and multiple-access, microwave single sideband systems, switching architectures, fiber optic systems, wireless local communications, modulation, coding, and synchronization, remote switching, software quality, transmission, and expert systems in network operations. Consideration is given to wide area networks, image and speech processing, office communications application protocols, multimedia systems, customer-controlled network operations, digital radio systems, channel modeling and signal processing in digital communications, earth station/on-board modems, computer communications system performance evaluation, source encoding, compression, and quantization, and adaptive communications systems.
Vascular surgical data registries for small computers.
Kaufman, J L; Rosenberg, N
1984-08-01
Recent designs for computer-based vascular surgical registries and clinical databases have employed large centralized systems with formal programming and mass storage. Small computers, of the types created for office use or for word processing, now contain sufficient speed and memory storage capacity to allow construction of decentralized office-based registries. Using a standardized dictionary of terms and a method of data organization adapted to word processing, we have created a new vascular surgery data registry, "VASREG." Data files are organized without programming, and a limited number of powerful logical statements in English are used for sorting. The capacity is 25,000 records with current inexpensive memory technology. VASREG is adaptable to computers made by a variety of manufacturers, and interface programs are available for converting the word-processor-formatted registry data into forms suitable for analysis by programs written in a standard programming language. This is a low-cost clinical data registry available to any physician. With a standardized dictionary, preparation of regional and national statistical summaries may be facilitated.
Raper, J E
1977-01-01
Since February 1976, The Medical Library Center of New York, with the assistance of the SUNY/OCLC Network, has offered, on a subscription basis, a centralized automated cataloging service to health science libraries in the greater metropolitan New York area. By using workforms and prints of OCLC record (amended by the subscribing participants), technical services personnel at the center have fed cataloging data, via a CRT terminal, into the OCLC system, which provides (1) catalog cards, received in computer filing order; (2) book card, spine, and pocket labels; (3) accessions lists; and (4) data for eventual production of book catalogs and union catalogs. The experience of the center in the development, implementation, operation, and budgeting of its shared cataloging service is discussed. PMID:843650
1991-09-01
constant data into the gaining base’s computer records. Among the data elements to be loaded, the 1XT434 image contains the level detail effective date...the mission support effective date, and the PBR override (19:19-203). In conjunction with the 1XT434, the Mission Change Parameter Image (Constant...the gaining base (19:19-208). The level detail effective date establishes the date the MCDDFR and MCDDR "are considered by the requirements computation
CARDS: A blueprint and environment for domain-specific software reuse
NASA Technical Reports Server (NTRS)
Wallnau, Kurt C.; Solderitsch, Anne Costa; Smotherman, Catherine
1992-01-01
CARDS (Central Archive for Reusable Defense Software) exploits advances in domain analysis and domain modeling to identify, specify, develop, archive, retrieve, understand, and reuse domain-specific software components. An important element of CARDS is to provide visibility into the domain model artifacts produced by, and services provided by, commercial computer-aided software engineering (CASE) technology. The use of commercial CASE technology is important to provide rich, robust support for the varied roles involved in a reuse process. We refer to this kind of use of knowledge representation systems as supporting 'knowledge-based integration.'
BUCKO- A BUCKLING ANALYSIS FOR RECTANGULAR PLATES WITH CENTRALLY LOCATED CUTOUTS
NASA Technical Reports Server (NTRS)
Nemeth, M. P.
1994-01-01
BUCKO is a computer program developed to predict the buckling load of a rectangular compression-loaded orthotropic plate with a centrally located cutout. The plate is assumed to be a balanced, symmetric laminate of uniform thickness. The cutout shape can be elliptical, circular, rectangular, or square. The BUCKO package includes sample data that demonstrates the essence of the program and its ease of use. BUCKO uses an approximate one-dimensional formulation of the classical two-dimensional buckling problem following the Kantorovich method. The boundary conditions are considered to be simply supported unloaded edges and either clamped or simply supported loaded edges. The plate is loaded in uniaxial compression by either uniformly displacing or uniformly stressing two opposite edges of the plate. The BUCKO analysis consists of two parts: calculation of the in-plane stress distribution prior to buckling, and calculation of the plate axial load and displacement at buckling. User input includes plate planform and cutout geometry, plate membrane and bending stiffnesses, finite difference parameters, boundary condition data, and loading data. Results generated by BUCKO are the prebuckling strain energy, in-plane stress resultants, buckling mode shape, critical end shortening, and average axial and transverse strains at buckling. BUCKO is written in FORTRAN V for batch execution and has been implemented on a CDC CYBER 170 series computer operating under NOS with a central memory requirement of approximately 343K of 60-bit words. This program was developed in 1984 and was last updated in 1990.
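For orientation, the classical two-dimensional problem that BUCKO reduces to one dimension via the Kantorovich method is the standard orthotropic plate buckling equation, shown here in its textbook form for a balanced, symmetric laminate under axial compression (this is background material, not a reproduction of the BUCKO documentation):

```latex
D_{11}\,\frac{\partial^4 w}{\partial x^4}
  + 2\,(D_{12} + 2D_{66})\,\frac{\partial^4 w}{\partial x^2\,\partial y^2}
  + D_{22}\,\frac{\partial^4 w}{\partial y^4}
  + N_x\,\frac{\partial^2 w}{\partial x^2} = 0
```

where w is the out-of-plane deflection, the D_ij are the laminate bending stiffnesses, and N_x is the axial stress resultant, taken positive in compression. The Kantorovich method assumes a separated form w(x, y) ≈ φ(x) f(y) with φ chosen a priori, reducing the partial differential equation to an ordinary differential eigenvalue problem in f(y) that can then be solved by finite differences.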
Centrality-based Selection of Semantic Resources for Geosciences
NASA Astrophysics Data System (ADS)
Cerba, Otakar; Jedlicka, Karel
2017-04-01
Semantic questions arise in almost all disciplines dealing with geographic data and information, because relevant semantics is crucial for communication and interaction among humans as well as among machines. However, the existence of a large number of different semantic resources (such as thesauri, controlled vocabularies, knowledge bases, and ontologies) makes implementing semantics more difficult, because in many cases users are not able to find the most suitable resource for their purposes. The research presented in this paper introduces a methodology for finding a suitable resource of semantic information, based on an analysis of identity relations in the Linked Data space, which covers a majority of semantic resources. Identity links interconnect representations of an object or a concept across semantic resources. This type of relation is therefore considered crucial from the Linked Data point of view, because such links provide additional information, including different views of one concept shaped by cultural or regional aspects (the so-called social role of Linked Data). For these reasons, one reasonable criterion for selecting a suitable semantic resource, for almost all domains including the geosciences, is its position in the network of interconnected semantic resources and its level of linking to other knowledge bases and similar products. The presented methodology searches for mutual connections between the various instances of one concept using the "follow your nose" approach. The extracted data on interconnections between semantic resources are arranged into directed graphs and processed with metrics based on centrality measures (degree, closeness, or betweenness centrality). 
Semantic resources recommended by the research could be used for providing semantically described keywords for metadata records or as names of items in data models. Such an approach enables much more efficient data harmonization, integration, sharing and exploitation. * * * * This publication was supported by the project LO1506 of the Czech Ministry of Education, Youth and Sports. This publication was supported by project Data-Driven Bioeconomy (DataBio) from the ICT-15-2016-2017, Big Data PPP call.
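The centrality computation described above can be illustrated with a toy example. The resource names and identity links below are hypothetical, and only degree and closeness centrality are shown (betweenness works analogously); a minimal Python sketch:

```python
from collections import deque

# Hypothetical link graph between semantic resources: an edge A -> B means
# a concept in resource A carries an identity link (e.g. owl:sameAs) to B.
links = {
    "DBpedia":  ["Wikidata", "GeoNames", "YAGO"],
    "Wikidata": ["DBpedia", "GeoNames"],
    "GeoNames": ["DBpedia"],
    "YAGO":     ["DBpedia", "Wikidata"],
}
nodes = sorted(links)

def degree_centrality(graph):
    """In-degree plus out-degree, normalised by the number of other nodes."""
    n = len(graph)
    indeg = {v: 0 for v in graph}
    for src, targets in graph.items():
        for t in targets:
            indeg[t] += 1
    return {v: (len(graph[v]) + indeg[v]) / (n - 1) for v in graph}

def closeness_centrality(graph, source):
    """Closeness = reachable nodes / sum of BFS shortest-path distances."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    reachable = len(dist) - 1
    return reachable / sum(dist.values()) if reachable else 0.0

deg = degree_centrality(links)
scores = {v: (deg[v], closeness_centrality(links, v)) for v in nodes}
best = max(nodes, key=lambda v: scores[v])  # most central resource
```

In this toy graph the hub resource ("DBpedia") ranks highest on both metrics, which is exactly the property the paper proposes as a selection criterion.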
Giordano, Mauro; Ausiello, Pietro; Martorelli, Massimo; Sorrentino, Roberto
2012-09-01
To evaluate the reliability and accuracy of computer-designed surgical guides in osseointegrated oral implant rehabilitation. Six implant rehabilitations, with a total of 17 implants, were completed with computer-designed surgical guides, using master models developed from muco-compressive and muco-static impressions. In the first case, the surgical guide had exclusively mucosal support; in the second case, exclusively dental support. For all six cases, computer-aided surgical planning was performed by virtual analyses with 3D models obtained from dental-scan DICOM data. The accuracy and stability of implant osseointegration over two years post surgery were then evaluated with clinical and radiographic examinations. Radiographic examination, performed with digital acquisitions (RVG, radiovisiography) and parallel techniques, allowed two-dimensional feedback with a margin of linear error of 10%. Implant osseointegration was recorded for all the examined rehabilitations. During the clinical and radiographic post-surgical assessments over the following two years, the peri-implant bone level was found to be stable, without the appearance of any complications. The margin of error recorded between the pre-operative positions assigned by virtual analysis and the post-surgical digital radiographic observations was as low as 0.2 mm. Computer-guided implant surgery can be very effective in oral rehabilitations, providing an opportunity for the surgeon (a) to avoid the necessity of muco-periosteal detachments and thus (b) to perform minimally invasive interventions, whenever appropriate, with a flapless approach. Copyright © 2012 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
The Asian Monsoon Moisture Transportation Revealed by Two Cave Sites in Myanmar
NASA Astrophysics Data System (ADS)
Liu, G.; Wang, X.; Chiang, H. W.; Maung Maung, P.; Jiang, X.; Aung, L. T.; Tun, S. T.
2014-12-01
Here we present two well-resolved calcite δ18O records from Myanmar speleothems. The samples were collected from a coastal site in southeastern Myanmar and a plateau site in central Myanmar, respectively. Chronologically constrained by high-precision U/Th dating techniques, both records span a large portion of the past 40,000 years. The two records show similar millennial-scale oscillations during the last glacial period, which are also in phase with speleothem records from Chinese cave sites located downstream along the Indian Monsoon trajectories. The δ18O values of the two profiles are virtually the same, ~ -7.5‰, during the late Holocene, consistent with the values in modern rainfall at the two sites. However, in glacial time the δ18O of the central Myanmar record shifts from -6.5‰ to -8‰, approximately 2‰ lower than that of the coastal dataset, which varies from -4.5‰ to -6‰. We interpret the similarly low δ18O values in both records during the Holocene as a result of strong monsoonal rainfall and water recycling, particularly through forest transpiration. In glacial time, however, with a possibly drier and less forested landscape, water recycling was weaker. Rainfall δ18O, and consequently speleothem δ18O, therefore exhibits a stronger geographical gradient, possibly dominated by the continental rainout effect. Our interpretation is supported by the speleothem δ13C records from the two sites. Calcite δ13C from the coastal site varies only slightly, from ~-7‰ in the last glacial to ~-9‰ in the Holocene. Although it shares a similar value with the coastal record during the Holocene, the δ13C profile from the plateau site shows a much higher value, up to -0.7‰, during glacial time. This suggests that the mountainous region in central Myanmar was likely dominated by C4 plants (e.g., grasses) during glacial time, whereas the same region is covered by forests today.
Such changes in vegetation type and coverage may influence the δ18O of recycled moisture transported further inland.
X-ray emission spectroscopy evidences a central carbon in the nitrogenase iron-molybdenum cofactor.
Lancaster, Kyle M; Roemelt, Michael; Ettenhuber, Patrick; Hu, Yilin; Ribbe, Markus W; Neese, Frank; Bergmann, Uwe; DeBeer, Serena
2011-11-18
Nitrogenase is a complex enzyme that catalyzes the reduction of dinitrogen to ammonia. Despite insight from structural and biochemical studies, its structure and mechanism await full characterization. An iron-molybdenum cofactor (FeMoco) is thought to be the site of dinitrogen reduction, but the identity of a central atom in this cofactor remains unknown. Fe Kβ x-ray emission spectroscopy (XES) of intact nitrogenase MoFe protein, isolated FeMoco, and the FeMoco-deficient nifB protein indicates that among the candidate atoms oxygen, nitrogen, and carbon, it is carbon that best fits the XES data. The experimental XES is supported by computational efforts, which show that oxidation and spin states do not affect the assignment of the central atom to C(4-). Identification of the central atom will drive further studies on its role in catalysis.
Development of a forestry government agency enterprise GIS system: a disconnected editing approach
NASA Astrophysics Data System (ADS)
Zhu, Jin; Barber, Brad L.
2008-10-01
The Texas Forest Service (TFS) has developed a geographic information system (GIS) for use by agency personnel in central Texas for managing oak wilt suppression and other landowner assistance programs. This enterprise GIS system was designed to support multiple concurrent users accessing shared information resources. The disconnected editing approach was adopted in this system to avoid the overhead of maintaining an active connection between TFS central Texas field offices and headquarters, since most field offices operate with commercially provided Internet service. The GIS system entails maintaining a personal geodatabase on each local field office computer. Spatial data from the field is periodically uploaded into a central master geodatabase stored in a Microsoft SQL Server at the TFS headquarters in College Station through the ESRI Spatial Database Engine (SDE). This GIS allows users to work offline when editing data and requires connecting to the central geodatabase only when needed.
NASA Technical Reports Server (NTRS)
Eckhardt, D. E., Jr.
1979-01-01
A model of a central processor (CPU) that services background applications in the presence of time-critical activity is presented. The CPU is viewed as an M/M/1 queueing system subject to periodic interrupts by a deterministic, time-critical process. The Laplace transform of the distribution of service times for the background applications is developed. The use of state-of-the-art queueing models for studying the background processing capability of time-critical computer systems is discussed, and the results of a model validation study that support this application of queueing models are presented.
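The queueing setup can be made concrete with a simplified sketch. The paper's actual analysis derives the Laplace transform of the background service-time distribution; the Python below shows only the cruder "degraded capacity" view, in which a deterministic task that runs for `burst` seconds every `period` seconds leaves the remaining fraction of CPU to an ordinary M/M/1 background queue (all parameter values are illustrative):

```python
def mm1_response_time(lam, mu):
    """Mean time in system for a stable M/M/1 queue: T = 1 / (mu - lam)."""
    assert lam < mu, "arrival rate must be below service rate"
    return 1.0 / (mu - lam)

def background_response_time(lam, mu, period, burst):
    """Approximate background response time when a deterministic,
    time-critical task preempts the CPU for `burst` s every `period` s:
    the background queue sees only the fraction (1 - burst/period)
    of the nominal service rate `mu`."""
    duty = burst / period
    return mm1_response_time(lam, mu * (1.0 - duty))

# Illustrative numbers: 2 background jobs/s, nominal 10 jobs/s capacity,
# and a 50 ms time-critical burst every 100 ms (50% duty cycle).
baseline = mm1_response_time(2.0, 10.0)                     # 0.125 s
with_interrupts = background_response_time(2.0, 10.0, 0.1, 0.05)
```

Even a 50% duty cycle more than doubles the mean background response time in this example (0.125 s to ~0.33 s), which is the qualitative effect the queueing model quantifies exactly.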
Balloon Support Systems Performance for the Cosmic Rays Energetics and Mass Mission
NASA Technical Reports Server (NTRS)
Tompson, Linda D.; Stuchlik, David W.
2006-01-01
The Ballooncraft Support Systems were developed by NASA Wallops Flight Facility for use on ULDB-class balloon missions. The support systems have now flown two missions supporting the Cosmic Rays Energetics and Mass (CREAM) experiment. The first, CREAM I, flown in December 2004, lasted a record-breaking 41 days, 21 hours, and the second, flown in December 2005, lasted 28 days, 9 hours. These support systems provide CREAM with power, telecommunications, command and data handling including flight computers, mechanical structures, thermal management, and attitude control to help ensure a successful scientific mission. This paper will address the performance and success of these support systems over the two missions.
1991-03-01
test cases are gathered, studied, and evaluated; industry and other national European programs are studied; and experience is gained. This evolution ...application callable layer. The CGM Generator can be used to record device-independent picture descriptions, conceptually in parallel with the...contributors: Organization: Peter R. Bono Associates, Inc. Secretarial Support: Susan Bonde, Diane Bono, Elaine Bono, Brenda Carson, Gillian Hall
Innovative Designs for the Smart ICU.
Halpern, Neil A
2014-03-01
Successfully designing a new ICU requires clarity of vision and purpose and the recognition that the patient room is the core of the ICU experience for patients, staff, and visitors. The ICU can be conceptualized into three components: the patient room, central areas, and universal support services. Each patient room should be designed for single patient use and be similarly configured and equipped. The design of the room should focus upon functionality, ease of use, healing, safety, infection control, communications, and connectivity. All aspects of the room, including its infrastructure; zones for work, care, and visiting; environment, medical devices, and approaches to privacy; logistics; and waste management, are important elements in the design process. Since most medical devices used at the ICU bedside are really sophisticated computers, the ICU needs to be capable of supporting the full scope of medical informatics. The patient rooms, the central ICU areas (central stations, corridors, supply rooms, pharmacy, laboratory, staff lounge, visitor waiting room, on-call suite, conference rooms, and offices), and the universal support services (infection prevention, finishings and flooring, staff communications, signage and wayfinding, security, and fire and safety) work best when fully interwoven. This coordination helps establish efficient and safe patient throughput and care and fosters physical and social cohesiveness within the ICU. A balanced approach to centralized and decentralized monitoring and logistics also offers great flexibility. Synchronization of the universal support services in the ICU with the hospital's existing systems maintains unity of purpose and continuity across the enterprise and avoids unnecessary duplication of efforts. Copyright © 2014 The American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.
Tobitt, Simon; Percival, Robert
2017-07-04
UK society is undergoing a technological revolution, including meeting health needs through technology. Government policy is shifting towards a "digital by default" position. Studies have trialled health technology interventions for those experiencing psychosis and shown them to be useful. To gauge levels of engagement with mobile phones (Internet-enabled or cell phone), computers and the Internet in the specific population of community mental health rehabilitation. Two surveys were conducted: with service-users on use/non-use of technologies, and interest in technology interventions and support; and with placements on facilities and support available to service-users. Levels of engagement in this population were substantially less than those recorded in the general UK and other clinical populations: 40.2% regularly use mobiles, 17.5% computers, and 14.4% the Internet. Users of all three technologies were significantly younger than non-users. Users of mobiles and computers were significantly more likely to live in lower support/higher independence placements. Of surveyed placements, 35.5% provide a communal computer and 38.7% IT skills sessions. Community mental health rehabilitation service-users risk finding themselves excluded by a "digital divide". Action is needed to ensure equal access to online opportunities, including healthcare innovations. Clinical and policy implications are discussed.
Zborowsky, Terri; Bunker-Hellmich, Lou; Morelli, Agneta; O'Neill, Mike
2010-01-01
Evidence-based findings of the effects of nursing station design on nurses' work environment and work behavior are essential to improve conditions and increase retention among these fundamental members of the healthcare delivery team. The purpose of this exploratory study was to investigate how nursing station design (i.e., centralized and decentralized nursing station layouts) affected nurses' use of space, patient visibility, noise levels, and perceptions of the work environment. Advances in information technology have enabled nurses to move away from traditional centralized paper-charting stations to smaller decentralized work stations and charting substations located closer to, or inside of, patient rooms. Improved understanding of the trade-offs presented by centralized and decentralized nursing station design has the potential to provide useful information for future nursing station layouts. This information will be critical for understanding the nurse environment "fit." The study used an exploratory design with both qualitative and quantitative methods. Qualitative data regarding the effects of nursing station design on nurses' health and work environment were gathered by means of focus group interviews. Quantitative data-gathering techniques included place- and person-centered space use observations, patient visibility assessments, sound level measurements, and an online questionnaire regarding perceptions of the work environment. Nurses on all units were observed most frequently performing telephone, computer, and administrative duties. Time spent using telephones, computers, and performing other administrative duties was significantly higher in the centralized nursing stations. Consultations with medical staff and social interactions were significantly less frequent in decentralized nursing stations. There were no indications that either centralized or decentralized nursing station designs resulted in superior visibility. 
Sound levels measured in all nursing stations exceeded recommended levels during all shifts. No significant differences were identified in nurses' perceptions of work control-demand-support in centralized and decentralized nursing station designs. The "hybrid" nursing design model in which decentralized nursing stations are coupled with centralized meeting rooms for consultation between staff members may strike a balance between the increase in computer duties and the ongoing need for communication and consultation that addresses the conflicting demands of technology and direct patient care.
A toolbox and record for scientific models
NASA Technical Reports Server (NTRS)
Ellman, Thomas
1994-01-01
Computational science presents a host of challenges for the field of knowledge-based software design. Scientific computation models are difficult to construct. Models constructed by one scientist are easily misapplied by other scientists to problems for which they are not well-suited. Finally, models constructed by one scientist are difficult for others to modify or extend to handle new types of problems. Construction of scientific models actually involves much more than the mechanics of building a single computational model. In the course of developing a model, a scientist will often test a candidate model against experimental data or against a priori expectations. Test results often lead to revisions of the model and a consequent need for additional testing. During a single model development session, a scientist typically examines a whole series of alternative models, each using different simplifying assumptions or modeling techniques. A useful scientific software design tool must support these aspects of the model development process as well. In particular, it should propose and carry out tests of candidate models. It should analyze test results and identify models and parts of models that must be changed. It should determine what types of changes can potentially cure a given negative test result. It should organize candidate models, test data, and test results into a coherent record of the development process. Finally, it should exploit the development record for two purposes: (1) automatically determining the applicability of a scientific model to a given problem; (2) supporting revision of a scientific model to handle a new type of problem. Existing knowledge-based software design tools must be extended in order to provide these facilities.
A multi-user real time inventorying system for radioactive materials: a networking approach.
Mehta, S; Bandyopadhyay, D; Hoory, S
1998-01-01
A computerized system for radioisotope management and real time inventory coordinated across a large organization is reported. It handles hundreds of individual users and their separate inventory records. Use of highly efficient computer network and database technologies makes it possible to accept, maintain, and furnish all records related to receipt, usage, and disposal of the radioactive materials for the users separately and collectively. The system's central processor is an HP-9000/800 G60 RISC server and users from across the organization use their personal computers to login to this server using the TCP/IP networking protocol, which makes distributed use of the system possible. Radioisotope decay is automatically calculated by the program, so that it can make the up-to-date radioisotope inventory data of an entire institution available immediately. The system is specifically designed to allow use by large numbers of users (about 300) and accommodates high volumes of data input and retrieval without compromising simplicity and accuracy. Overall, it is an example of a true multi-user, on-line, relational database information system that makes the functioning of a radiation safety department efficient.
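The automatic decay correction the system performs follows the standard radioactive decay law. A minimal sketch (the isotope table and function names are illustrative additions, not details from the paper, though the half-lives are published values):

```python
# Published half-lives, in days, for a few isotopes common in research labs.
HALF_LIVES_DAYS = {"P-32": 14.3, "I-125": 59.4, "S-35": 87.4}

def current_activity(initial_activity, isotope, days_elapsed):
    """Decay-correct an inventory entry: A = A0 * 2**(-t / t_half).
    `initial_activity` can be in any unit (e.g. microcuries or MBq)."""
    t_half = HALF_LIVES_DAYS[isotope]
    return initial_activity * 2.0 ** (-days_elapsed / t_half)
```

With this correction applied at query time, a stored receipt record never needs updating; the up-to-date institutional inventory is simply the decay-corrected sum over all open records.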
Bandala, Victor M; Montoya, Leticia; Horak, Egon
2006-01-01
Two species of Crepidotus are recorded from cloud forest in the central region of Veracruz State (eastern Mexico): Crepidotus rubrovinosus sp. nov. and Crepidotus septicoides. The latter species was known previously only from the type locality in Brazil and from one record in tropical rain forest in southern Veracruz (as C. longicystis s. str. Singer). Descriptions, illustrations and discussions for both taxa are provided. A type study of C. fusisporus var. longicystis from USA is included, and it is concluded that the collection supporting this variety belongs to C. luteolus.
Health Information Technology as a Universal Donor to Bioethics Education.
Goodman, Kenneth W
2017-04-01
Health information technology, sometimes called biomedical informatics, is the use of computers and networks in the health professions. This technology has become widespread, from electronic health records to decision support tools to patient access through personal health records. These computational and information-based tools have engendered their own ethics literature and now present an opportunity to shape the standard medical and nursing ethics curricula. It is suggested that each of four core components in the professional education of clinicians-privacy, end-of-life care, access to healthcare and valid consent, and clinician-patient communication-offers an opportunity to leverage health information technology for curricular improvement. Using informatics in ethics education freshens ethics pedagogy and increases its utility, and does so without additional demands on overburdened curricula.
ALGORITHMS AND PROGRAMS FOR STRONG GRAVITATIONAL LENSING IN KERR SPACE-TIME INCLUDING POLARIZATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Bin; Maddumage, Prasad; Kantowski, Ronald
2015-05-15
Active galactic nuclei (AGNs) and quasars are important astrophysical objects to understand. Recently, microlensing observations have constrained the size of the quasar X-ray emission region to be of the order of 10 gravitational radii of the central supermassive black hole. For distances within a few gravitational radii, light paths are strongly bent by the strong gravity field of the central black hole. If the central black hole has nonzero angular momentum (spin), then a photon’s polarization plane will be rotated by the gravitational Faraday effect. The observed X-ray flux and polarization will then be influenced significantly by the strong gravity field near the source. Consequently, linear gravitational lensing theory is inadequate for such extreme circumstances. We present simple algorithms computing the strong lensing effects of Kerr black holes, including the effects on polarization. Our algorithms are realized in a program “KERTAP” in two versions: MATLAB and Python. The key ingredients of KERTAP are a graphic user interface, a backward ray-tracing algorithm, a polarization propagator dealing with gravitational Faraday rotation, and algorithms computing observables such as flux magnification and polarization angles. Our algorithms can be easily realized in other programming languages such as FORTRAN, C, and C++. The MATLAB version of KERTAP is parallelized using the MATLAB Parallel Computing Toolbox and the Distributed Computing Server. The Python code was sped up using Cython and supports full implementation of MPI using the “mpi4py” package. As an example, we investigate the inclination angle dependence of the observed polarization and the strong lensing magnification of AGN X-ray emission. We conclude that it is possible to perform complex numerical-relativity related computations using interpreted languages such as MATLAB and Python.
Algorithms and Programs for Strong Gravitational Lensing In Kerr Space-time Including Polarization
NASA Astrophysics Data System (ADS)
Chen, Bin; Kantowski, Ronald; Dai, Xinyu; Baron, Eddie; Maddumage, Prasad
2015-05-01
Active galactic nuclei (AGNs) and quasars are important astrophysical objects to understand. Recently, microlensing observations have constrained the size of the quasar X-ray emission region to be of the order of 10 gravitational radii of the central supermassive black hole. For distances within a few gravitational radii, light paths are strongly bent by the strong gravity field of the central black hole. If the central black hole has nonzero angular momentum (spin), then a photon’s polarization plane will be rotated by the gravitational Faraday effect. The observed X-ray flux and polarization will then be influenced significantly by the strong gravity field near the source. Consequently, linear gravitational lensing theory is inadequate for such extreme circumstances. We present simple algorithms computing the strong lensing effects of Kerr black holes, including the effects on polarization. Our algorithms are realized in a program “KERTAP” in two versions: MATLAB and Python. The key ingredients of KERTAP are a graphic user interface, a backward ray-tracing algorithm, a polarization propagator dealing with gravitational Faraday rotation, and algorithms computing observables such as flux magnification and polarization angles. Our algorithms can be easily realized in other programming languages such as FORTRAN, C, and C++. The MATLAB version of KERTAP is parallelized using the MATLAB Parallel Computing Toolbox and the Distributed Computing Server. The Python code was sped up using Cython and supports full implementation of MPI using the “mpi4py” package. As an example, we investigate the inclination angle dependence of the observed polarization and the strong lensing magnification of AGN X-ray emission. We conclude that it is possible to perform complex numerical-relativity related computations using interpreted languages such as MATLAB and Python.
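The "10 gravitational radii" scale quoted above is easy to put into physical units. A short sketch (the 10^9 solar-mass figure is an illustrative choice for a quasar black hole, not a value from the paper):

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg
AU = 1.496e11      # astronomical unit, m

def gravitational_radius_m(mass_in_solar_masses):
    """r_g = G M / c^2, the length scale of the X-ray emitting region."""
    return G * mass_in_solar_masses * M_SUN / C**2

# For a 1e9 solar-mass black hole, the ~10 r_g emission region in AU:
region_au = 10.0 * gravitational_radius_m(1e9) / AU
```

For a 10^9 solar-mass black hole the 10 r_g region is roughly 100 AU across, far below any telescope's resolving power at quasar distances, which is why microlensing is the tool of choice for constraining it.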
NASA Astrophysics Data System (ADS)
Li, Yue; Song, Yougui; Fitzsimmons, Kathryn E.; Chang, Hong; Orozbaev, Rustam; Li, Xinxin
2018-03-01
The extensive loess deposits of the Eurasian mid-latitudes provide important terrestrial archives of Quaternary climatic change. As yet, however, loess records in Central Asia are poorly understood. Here we investigate the grain size and magnetic characteristics of loess from the Nilka (NLK) section in the Ili Basin of eastern Central Asia. Weak pedogenesis, as indicated by frequency-dependent magnetic susceptibility (χfd%), together with magnetic susceptibility (MS) peaks in primary loess, suggests that MS is more strongly influenced by allogenic magnetic minerals than by pedogenesis, and may therefore be used to indicate wind strength. This is supported by the close correlation between variations in MS and proportions of the sand-sized fraction. To further explore the temporal variability in dust transport patterns, we identified three grain size end-members (EM1, mode size 47.5 µm; EM2, 33.6 µm; EM3, 18.9 µm) which represent distinct aerodynamic environments. EM1 and EM2 are inferred to represent grain size fractions transported from proximal sources in short-term, near-surface suspension during dust outbreaks. EM3 appears to represent a continuous background dust fraction under non-dust-storm conditions. Of the three end-members, EM1 is likely the most sensitive recorder of wind strength. We compare our EM1 proportions with mean grain size from the Jingyuan section in the Chinese Loess Plateau, and assess these in the context of modern and Holocene climate data. Our research suggests that the Siberian High pressure system is the dominant influence on the wind dynamics that result in loess deposition in the eastern Ili Basin. Six millennial-scale cooling (Heinrich) events can be identified in the NLK loess records. Our grain size data support the hypothesis that the Siberian High acts as a teleconnection between the climatic systems of the North Atlantic and East Asia in the high northern latitudes, but not for the mid-latitude westerlies.
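The abstract does not say which unmixing algorithm produced the three end-members; one common choice for grain-size end-member modelling is non-negative matrix factorisation. The sketch below builds synthetic distributions with modes near the reported sizes (18.9, 33.6, 47.5 µm) and recovers a rank-3 factorisation with Lee-Seung multiplicative updates (all data here are synthetic, not from the NLK section):

```python
import numpy as np

rng = np.random.default_rng(0)
sizes = np.linspace(1.0, 80.0, 120)          # grain size axis, um

def bump(mode, width=8.0):
    """A unit-sum Gaussian bump standing in for an end-member curve."""
    d = np.exp(-0.5 * ((sizes - mode) / width) ** 2)
    return d / d.sum()

# Synthetic "measured" samples: random mixtures of three end-members.
true_em = np.vstack([bump(18.9), bump(33.6), bump(47.5)])   # (3, 120)
weights = rng.dirichlet(np.ones(3), size=40)                # (40, 3)
samples = weights @ true_em                                 # (40, 120)

# Rank-3 NMF via multiplicative updates: samples ~= W @ H with W, H >= 0.
k = 3
W = rng.random((40, k))
H = rng.random((k, 120))
for _ in range(1000):
    H *= (W.T @ samples) / (W.T @ W @ H + 1e-12)
    W *= (samples @ H.T) / (W @ H @ H.T + 1e-12)

rel_err = np.linalg.norm(samples - W @ H) / np.linalg.norm(samples)
```

The rows of H play the role of end-member distributions and the rows of W the per-sample mixing proportions; on real data one would also normalise the rows of H to unit sum before interpretation.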
Computer-Mediated Social Support for Physical Activity: A Content Analysis
ERIC Educational Resources Information Center
Stragier, Jeroen; Mechant, Peter; De Marez, Lieven; Cardon, Greet
2018-01-01
Purpose: Online fitness communities are a recent phenomenon experiencing growing user bases. They can be considered as online social networks in which recording, monitoring, and sharing of physical activity (PA) are the most prevalent practices. They have added a new dimension to the social experience of PA in which online peers function as…
The Power of Digital Storytelling to Support Teaching and Learning
ERIC Educational Resources Information Center
Robin, Bernard R.
2016-01-01
Although the term "digital storytelling" may not be familiar to all readers, over the last twenty years, an increasing number of educators, students and others around the world have created short movies by combining computer-based images, text, recorded audio narration, video clips and music in order to present information on various…
The Role of Context in a Collaborative Problem-Solving Task during Professional Development
ERIC Educational Resources Information Center
Ritella, Giuseppe; Ligorio, Maria Beatrice; Hakkarainen, Kai
2016-01-01
This article analyses how a group of teachers managed the resources available while performing computer-supported collaborative problem-solving tasks in the context of professional development. The authors video-recorded and analysed collaborative sessions during which the group of teachers used a digital environment to prepare a pedagogical…
Radiated Seismic Energy of Earthquakes in the South-Central Region of the Gulf of California, Mexico
NASA Astrophysics Data System (ADS)
Castro, Raúl R.; Mendoza-Camberos, Antonio; Pérez-Vertti, Arturo
2018-05-01
We estimated the radiated seismic energy (ES) of 65 earthquakes located in the south-central region of the Gulf of California. Most of these events occurred along active transform faults that define the Pacific-North America plate boundary and have magnitudes between M3.3 and M5.9. We corrected the spectral records for attenuation using nonparametric S-wave attenuation functions determined with the whole data set. The path effects were isolated from the seismic source using a spectral inversion. We computed the radiated seismic energy of the earthquakes by integrating the squared velocity source spectrum and estimated their apparent stresses. We found that most events have apparent stress between 3 × 10⁻⁴ and 3 MPa. Model-independent estimates of the ratio between seismic energy and moment (ES/M0) indicate that this ratio is independent of earthquake size. We conclude that in general the apparent stress is low (σa < 3 MPa) in the south-central and southern Gulf of California.
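The two quantities named in the abstract above can be sketched numerically: radiated energy is proportional to the frequency integral of the squared source velocity spectrum, and apparent stress is σa = μ·ES/M0. This is a minimal illustration, not the authors' workflow: the scaling constant `k`, the corner frequency, the spectral level, and the Brune-type spectral shape are all assumed values.

```python
import numpy as np

def radiated_energy(freq, vel_spec, k=1.0):
    """k * integral of |V(f)|^2 over frequency (trapezoid rule).

    k is a placeholder bundling density, shear velocity, and
    radiation-pattern terms (an assumption, not the paper's constant)."""
    y = vel_spec ** 2
    return k * float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(freq)))

def apparent_stress(E_s, M_0, mu=3.0e10):
    """sigma_a = mu * E_s / M_0, with rigidity mu in Pa (assumed 30 GPa)."""
    return mu * E_s / M_0

freq = np.linspace(0.1, 50.0, 500)        # Hz
f_c = 2.0                                 # assumed corner frequency
omega0 = 1.0e15                           # assumed long-period level
disp = omega0 / (1.0 + (freq / f_c) ** 2) # Brune-type displacement spectrum
vel = 2.0 * np.pi * freq * disp           # velocity source spectrum

E_s = radiated_energy(freq, vel, k=1.0e-30)
sigma_a = apparent_stress(E_s, M_0=1.0e16)
```

The ES/M0 ratio being size-independent is what makes apparent stress a convenient single-number summary per event.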
Network modeling of PM10 concentration in Malaysia
NASA Astrophysics Data System (ADS)
Supian, Muhammad Nazirul Aiman Abu; Bakar, Sakhinah Abu; Razak, Fatimah Abdul
2017-08-01
Air pollution is not a new phenomenon in Malaysia. The Department of Environment (DOE) monitors the country's ambient air quality through a network of 51 stations. The air quality is measured using the Air Pollution Index (API), which is mainly recorded based on the concentration of particulate matter, PM10 readings. The Continuous Air Quality Monitoring (CAQM) stations are located in various places across the country. In this study, a network model of air quality based on PM10 concentration for selected CAQM stations in Malaysia has been developed. The model is built using a graph formulation, G = (V, E), where the vertex set V is the set of CAQM stations and the edge set E is the set of correlation values for each pair of vertices. The network measurements such as degree distributions, closeness centrality, and betweenness centrality are computed to analyse the behaviour of the network. As a result, a rank of CAQM stations has been produced based on their centrality characteristics.
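The graph formulation G = (V, E) described above can be sketched in a few lines: stations are vertices, and an edge joins two stations when the correlation of their PM10 series exceeds a threshold. The station names, series, and the 0.5 threshold below are invented for illustration, not taken from the DOE data.

```python
import numpy as np

rng = np.random.default_rng(0)
base = rng.normal(size=365)                    # shared regional PM10 signal
series = {
    "Klang":     base + 0.2 * rng.normal(size=365),  # hypothetical stations
    "Shah Alam": base + 0.2 * rng.normal(size=365),
    "Kuching":   rng.normal(size=365),               # independent station
}

stations = list(series)
edges = set()
for i, a in enumerate(stations):
    for b in stations[i + 1:]:
        r = np.corrcoef(series[a], series[b])[0, 1]
        if r > 0.5:                            # assumed correlation threshold
            edges.add(frozenset((a, b)))

# simplest centrality measure: degree = number of incident edges
degree = {s: sum(s in e for e in edges) for s in stations}
```

Closeness and betweenness centrality would then be computed on the same edge set (e.g. via breadth-first shortest paths) to rank the stations.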
NASA Astrophysics Data System (ADS)
Michaelis, A.; Wang, W.; Melton, F. S.; Votava, P.; Milesi, C.; Hashimoto, H.; Nemani, R. R.; Hiatt, S. H.
2009-12-01
As the length and diversity of the global earth observation data records grow, modeling and analyses of biospheric conditions increasingly require multiple terabytes of data from a diversity of models and sensors. With network bandwidth beginning to flatten, transmission of these data from centralized data archives presents an increasing challenge, and costs associated with local storage and management of data and compute resources are often significant for individual research and application development efforts. Sharing community-valued intermediary data sets, results and codes from individual efforts with others that are not in direct funded collaboration can also be a challenge with respect to time, cost and expertise. We propose a modeling, data and knowledge center that houses NASA satellite data, climate data and ancillary data where a focused community may come together to share modeling and analysis codes, scientific results, knowledge and expertise on a centralized platform, named the Ecosystem Modeling Center (EMC). With the recent development of new technologies for secure hardware virtualization, an opportunity exists to create specific modeling, analysis and compute environments that are customizable, “archivable” and transferable. Allowing users to instantiate such environments on large compute infrastructures that are directly connected to large data archives may significantly reduce the costs and time associated with scientific efforts by freeing users from redundantly retrieving and integrating data sets and building modeling and analysis codes. The EMC platform also makes it possible for users to receive indirect expert assistance through prefabricated compute environments, potentially reducing study “ramp up” times.
Research on the Application of Fuzzy Logic to Systems Analysis and Control
NASA Technical Reports Server (NTRS)
1998-01-01
Research conducted with the support of NASA Grant NCC2-275 has been focused in the main on the development of fuzzy logic and soft computing methodologies and their applications to systems analysis and control, with emphasis on problem areas which are of relevance to NASA's missions. One of the principal results of our research has been the development of a new methodology called Computing with Words (CW). Basically, in CW words drawn from a natural language are employed in place of numbers for computing and reasoning. There are two major imperatives for computing with words. First, computing with words is a necessity when the available information is too imprecise to justify the use of numbers, and second, when there is a tolerance for imprecision which can be exploited to achieve tractability, robustness, low solution cost, and better rapport with reality. Exploitation of the tolerance for imprecision is an issue of central importance in CW.
A modular architecture for transparent computation in recurrent neural networks.
Carmantini, Giovanni S; Beim Graben, Peter; Desroches, Mathieu; Rodrigues, Serafim
2017-01-01
Computation is classically studied in terms of automata, formal languages and algorithms; yet, the relation between neural dynamics and symbolic representations and operations is still unclear in traditional eliminative connectionism. Therefore, we suggest a unique perspective on this central issue, to which we would like to refer as transparent connectionism, by proposing accounts of how symbolic computation can be implemented in neural substrates. In this study we first introduce a new model of dynamics on a symbolic space, the versatile shift, showing that it supports the real-time simulation of a range of automata. We then show that the Gödelization of versatile shifts defines nonlinear dynamical automata, dynamical systems evolving on a vectorial space. Finally, we present a mapping between nonlinear dynamical automata and recurrent artificial neural networks. The mapping defines an architecture characterized by its granular modularity, where data, symbolic operations and their control are not only distinguishable in activation space, but also spatially localizable in the network itself, while maintaining a distributed encoding of symbolic representations. The resulting networks simulate automata in real-time and are programmed directly, in the absence of network training. To discuss the unique characteristics of the architecture and their consequences, we present two examples: (i) the design of a Central Pattern Generator from a finite-state locomotive controller, and (ii) the creation of a network simulating a system of interactive automata that supports the parsing of garden-path sentences as investigated in psycholinguistics experiments. Copyright © 2016 Elsevier Ltd. All rights reserved.
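The Gödelization step described in the abstract above has a simple core idea that can be sketched directly: a one-sided symbol sequence over an alphabet of size b is encoded as a number in [0, 1), and the symbolic left shift becomes the map x ↦ b·x mod 1 on that encoding. The alphabet size and example word below are invented for illustration; the paper's versatile shift and the neural-network mapping are much richer than this.

```python
def godelize(seq, b):
    """Encode a symbol sequence (ints in 0..b-1) as sum_i s_i * b**-(i+1)."""
    return sum(s * b ** -(i + 1) for i, s in enumerate(seq))

def shift_map(x, b):
    """Dynamical realization of the left shift on encoded sequences."""
    return (b * x) % 1.0

seq = [2, 0, 1, 1]          # a word over a 3-letter alphabet
x = godelize(seq, 3)
# shifting the encoding equals encoding the shifted word (up to rounding)
x_shifted = shift_map(x, 3)
```

This commuting relation (encode-then-shift equals shift-then-encode) is exactly what lets a dynamical system on a vector space simulate an automaton's symbolic moves.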
He, Xin; Hao, Man-Zhao; Wei, Ming; Xiao, Qin; Lan, Ning
2015-12-01
Involuntary central oscillations at single and double tremor frequencies drive the peripheral neuromechanical system of muscles and joints to cause tremor in Parkinson's disease (PD). The central signal of double tremor frequency was found to correlate more directly to individual muscle EMGs (Timmermann et al. 2003). This study is aimed at investigating what central components of oscillation contribute to inter-muscular synchronization in a group of upper extremity muscles during tremor in PD patients. 11 idiopathic, tremor dominant PD subjects participated in this study. Joint kinematics during tremor in the upper extremity was recorded along with EMGs of six upper arm muscles using a novel experimental apparatus. The apparatus provided support for the upper extremity on a horizontal surface with reduced friction, so that resting tremor in the arm can be recorded with a MotionMonitor II system. In each subject, the frequencies of rhythmic firings in upper arm muscles were determined using spectral analysis. Paired and pool-averaged coherence analyses of EMGs for the group of muscles were performed to correlate the level of inter-muscular synchronization to tremor amplitudes at shoulder and elbow. The phase shift between synchronized antagonistic muscle pairs was calculated to aid coherence analysis in the muscle pool. Recorded EMG revealed that rhythmic firings were present in most recorded muscles, which were either synchronized to form phase-locked bursting cycles at a subject specific frequency, or unsynchronized with a random phase distribution. Paired coherence showed a stronger synchronization among a subset of recorded arm muscles at tremor frequency than that at double tremor frequency. Furthermore, the number of synchronized muscles in the arm was positively correlated to tremor amplitudes at elbow and shoulder. 
Pool-averaged coherence at tremor frequency also showed a better correlation with the amplitude of resting tremor than that of double tremor frequency, indicating that the neuromechanical coupling in peripheral neuromuscular system was stronger at tremor frequency. Both paired and pool-averaged coherences are more consistent indexes to correlate to tremor intensity in a group of upper extremity muscles of PD patients. The central drive at tremor frequency contributes mainly to synchronize peripheral muscles in the modulation of tremor intensity.
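The coherence analysis described in the study above can be illustrated with synthetic data (not patient EMG): two "muscles" driven by a common oscillation plus independent noise show a magnitude-squared coherence peak at the shared frequency. The 5 Hz tremor frequency, sampling rate, and noise level below are assumptions for the sketch.

```python
import numpy as np
from scipy.signal import coherence

fs = 1000.0                            # sampling rate, Hz (assumed)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)

drive = np.sin(2 * np.pi * 5.0 * t)    # common central drive at 5 Hz
emg1 = drive + 0.5 * rng.normal(size=t.size)   # synthetic "muscle" channels
emg2 = drive + 0.5 * rng.normal(size=t.size)

# Welch-averaged magnitude-squared coherence between the two channels
f, Cxy = coherence(emg1, emg2, fs=fs, nperseg=1024)
peak_freq = f[np.argmax(Cxy)]          # should sit near the 5 Hz drive
```

Pairwise coherences like this, averaged over a muscle pool, give the kind of synchronization index the study correlates with tremor amplitude.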
Miao, Yinbin; Ma, Jianfeng; Liu, Ximeng; Wei, Fushan; Liu, Zhiquan; Wang, Xu An
2016-11-01
Online personal health record (PHR) systems increasingly shift data storage and search operations to the cloud server so as to enjoy elastic resources and lessen the computational burden of cloud storage. As multiple patients' data are always stored in the cloud server simultaneously, it is a challenge to guarantee the confidentiality of PHR data and allow data users to search encrypted data in an efficient and privacy-preserving way. To this end, we design a secure cryptographic primitive called attribute-based multi-keyword search over encrypted personal health records in a multi-owner setting, which supports both fine-grained access control and multi-keyword search via Ciphertext-Policy Attribute-Based Encryption. Formal security analysis proves our scheme is selectively secure against chosen-keyword attack. As a further contribution, we conduct empirical experiments over a real-world dataset to show its feasibility and practicality in a broad range of actual scenarios without incurring additional computational burden.
Walking robot: A design project for undergraduate students
NASA Technical Reports Server (NTRS)
1990-01-01
The design and construction of the University of Maryland walking machine was completed during the 1989 to 1990 academic year. It was required that the machine be capable of completing a number of tasks including walking a straight line, turning to change direction, and maneuvering over an obstacle such as a set of stairs. The machine consists of two sets of four telescoping legs that alternately support the entire structure. A gear box and crank arm assembly is connected to the leg sets to provide the power required for the translational motion of the machine. By retracting all eight legs, the robot comes to rest on a central Bigfoot support. Turning is accomplished by rotating the machine about this support. The machine can be controlled by using either a user-operated remote tether or the onboard computer for the execution of control commands. Absolute encoders are attached to all motors to provide the control computer with information regarding the status of the motors. Long and short range infrared sensors provide the computer with feedback information regarding the machine's position relative to a series of stripes and reflectors. These infrared sensors simulate how the robot might sense and gain information about the environment of Mars.
University of Maryland walking robot: A design project for undergraduate students
NASA Technical Reports Server (NTRS)
Olsen, Bob; Bielec, Jim; Hartsig, Dave; Oliva, Mani; Grotheer, Phil; Hekmat, Morad; Russell, David; Tavakoli, Hossein; Young, Gary; Nave, Tom
1990-01-01
The design and construction required that the walking robot machine be capable of completing a number of tasks including walking in a straight line, turning to change direction, and maneuvering over an obstacle such as a set of stairs. The machine consists of two sets of four telescoping legs that alternately support the entire structure. A gear-box and crank-arm assembly is connected to the leg sets to provide the power required for the translational motion of the machine. By retracting all eight legs, the robot comes to rest on a central Bigfoot support. Turning is accomplished by rotating the machine about this support. The machine can be controlled by using either a user-operated remote tether or the on-board computer for the execution of control commands. Absolute encoders are attached to all motors (leg, main drive, and Bigfoot) to provide the control computer with information regarding the status of the motors (up-down motion, forward or reverse rotation). Long and short range infrared sensors provide the computer with feedback information regarding the machine's relative position to a series of stripes and reflectors. These infrared sensors simulate how the robot might sense and gain information about the environment of Mars.
Ground Support Software for Spaceborne Instrumentation
NASA Technical Reports Server (NTRS)
Anicich, Vincent; Thorpe, Rob; Fletcher, Greg; Waite, Hunter; Xu, Hykua; Walter, Erin; Frick, Kristie; Farris, Greg; Gell, Dave; Furman, Judy
2004-01-01
ION is a system of ground support software for the ion and neutral mass spectrometer (INMS) instrument aboard the Cassini spacecraft. By incorporating commercial off-the-shelf database, Web server, and Java application components, ION offers considerably more ground-support-service capability than was available previously. A member of the team that operates the INMS or a scientist who uses the data collected by the INMS can gain access to most of the services provided by ION via a standard point-and-click hyperlink interface generated by almost any Web-browser program running in almost any operating system on almost any computer. Data are stored in one central location in a relational database in a non-proprietary format, are accessible in many combinations and formats, and can be combined with data from other instruments and spacecraft. The use of the Java programming language as a system-interface language offers numerous capabilities for object-oriented programming and for making the database accessible to participants using a variety of computer hardware and software.
Measuring the Resilience of Advanced Life Support Systems
NASA Technical Reports Server (NTRS)
Bell, Ann Maria; Dearden, Richard; Levri, Julie A.
2002-01-01
Despite the central importance of crew safety in designing and operating a life support system, the metric commonly used to evaluate alternative Advanced Life Support (ALS) technologies does not currently provide explicit techniques for measuring safety. The resilience of a system, or the system's ability to meet performance requirements and recover from component-level faults, is fundamentally a dynamic property. This paper motivates the use of computer models as a tool to understand and improve system resilience throughout the design process. Extensive simulation of a hybrid computational model of a water revitalization subsystem (WRS) with probabilistic, component-level faults provides data about off-nominal behavior of the system. The data can then be used to test alternative measures of resilience as predictors of the system's ability to recover from component-level faults. A novel approach to measuring system resilience using a Markov chain model of performance data is also developed. Results emphasize that resilience depends on the complex interaction of faults, controls, and system dynamics, rather than on simple fault probabilities.
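The Markov-chain idea in the abstract above can be sketched as follows: discretize the logged performance data into states, estimate a transition matrix from the observed state sequence, and read off how much long-run time the system spends in the nominal state. The three-state labeling and the state sequence below are invented, not the WRS simulation output.

```python
import numpy as np

# 0 = nominal, 1 = degraded, 2 = failed (hypothetical discretization)
states = [0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 2, 1, 0, 0, 0]

n = 3
counts = np.zeros((n, n))
for a, b in zip(states[:-1], states[1:]):
    counts[a, b] += 1
P = counts / counts.sum(axis=1, keepdims=True)  # row-stochastic estimate

# stationary distribution by power iteration; pi[0] is the long-run
# fraction of time spent nominal, one candidate resilience number
pi = np.full(n, 1.0 / n)
for _ in range(500):
    pi = pi @ P
resilience = pi[0]
```

Other readouts of the same chain (e.g. expected recovery time from the failed state back to nominal) capture the dynamic, fault-recovery flavor of resilience the paper emphasizes.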
Logistics Support Analysis Techniques Guide
1985-03-15
LANGUAGE (DATA RECORDS): FORTRAN, CDC 6600. REMARKS: The program consists of approximately 4000 lines of coding; applications include Safeguard and AN/FSC... The model consists of approximately 367 lines of FORTRAN IV coding; applications include SINCGARS and PERSHING II. LSA TASK INTERFACE... The system is supported by Computer Systems Command. The current version of LADEN is coded totally in FORTRAN for a virtual-memory operating system.
77 FR 26259 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-03
.... SUPPLEMENTARY INFORMATION: The National Security Agency systems of records notice subject to the Privacy Act of... of Records AGENCY: National Security Agency/Central Security Service. ACTION: Notice to Delete a System of Records. SUMMARY: The National Security Agency/Central Security Service is deleting a system of...
75 FR 67697 - Privacy Act of 1974; Systems of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-03
... National Security Agency's record system notices for records systems subject to the Privacy Act of 1974 (5... National Security Agency/Central Security Service, Freedom of Information Act (FOIA)/Privacy Act Office...; Systems of Records AGENCY: National Security Agency/Central Security Service, DoD. ACTION: Notice to add a...
75 FR 43494 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-26
... National Security Agency's record system notices for records systems subject to the Privacy Act of 1974 (5... National Security Agency/Central Security Service, Freedom of Information Act and Privacy Act Office, 9800...; System of Records AGENCY: National Security Agency/Central Security Service, DoD. ACTION: Notice to...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nikolic, R J
This month's issue has the following articles: (1) Dawn of a New Era of Scientific Discovery - Commentary by Edward I. Moses; (2) At the Frontiers of Fundamental Science Research - Collaborators from national laboratories, universities, and international organizations are using the National Ignition Facility to probe key fundamental science questions; (3) Livermore Responds to Crisis in Post-Earthquake Japan - More than 70 Laboratory scientists provided round-the-clock expertise in radionuclide analysis and atmospheric dispersion modeling as part of the nation's support to Japan following the March 2011 earthquake and nuclear accident; (4) A Comprehensive Resource for Modeling, Simulation, and Experiments - A new Web-based resource called MIDAS is a central repository for material properties, experimental data, and computer models; and (5) Finding Data Needles in Gigabit Haystacks - Livermore computer scientists have developed a novel computer architecture based on 'persistent' memory to ease data-intensive computations.
Selected low-flow frequency statistics for continuous-record streamgages in Georgia, 2013
Gotvald, Anthony J.
2016-04-13
This report presents the annual and monthly minimum 1- and 7-day average streamflows with the 10-year recurrence interval (1Q10 and 7Q10) for 197 continuous-record streamgages in Georgia. Streamgages used in the study included active and discontinued stations having a minimum of 10 complete climatic years of record as of September 30, 2013. The 1Q10 and 7Q10 flow statistics were computed for 85 streamgages on unregulated streams with minimal diversions upstream, 43 streamgages on regulated streams, and 69 streamgages known, or considered, to be affected by varying degrees of diversions upstream. Descriptive information for each of these streamgages, including the U.S. Geological Survey (USGS) station number, station name, latitude, longitude, county, drainage area, and period of record analyzed, also is presented. Kendall’s tau nonparametric test was used to determine the statistical significance of trends in annual and monthly minimum 1-day and 7-day average flows for the 197 streamgages. Significant negative trends in the minimum annual 1-day and 7-day average streamflow were indicated for 77 of the 197 streamgages. Many of these significant negative trends are due to the period of record ending during one of the recent droughts in Georgia, particularly those streamgages with record through the 2013 water year. Long-term unregulated streamgages with 70 or more years of record indicate significant negative trends in the annual minimum 7-day average flow for central and southern Georgia. Watersheds for some of these streamgages have experienced minimal human impact, thus indicating that the significant negative trends observed in flows at the long-term streamgages may be influenced by changing climatological conditions. A Kendall-tau trend analysis of the annual air temperature and precipitation totals for Georgia indicated no significant trends.
A comprehensive analysis of causes of the trends in annual and monthly minimum 1-day and 7-day average flows in central and southern Georgia is outside the scope of this study. Further study is needed to determine some of the causes, including both climatological and human impacts, of the significant negative trends in annual minimum 1-day and 7-day average flows in central and southern Georgia. To assess the changes in the annual 1Q10 and 7Q10 statistics over time for long-term continuous streamgages with significant trends in record, the annual 1Q10 and 7Q10 statistics were computed on a decadal accumulated basis for 39 streamgages having 40 or more years of record that indicated a significant trend. Records from most of the streamgages showed a decline in 7Q10 statistics for the decades of 1980–89, 1990–99, and 2000–09 because of the recent droughts in Georgia. Twenty-four of the 39 streamgages had complete records from 1980 to 2010, and records from 23 of these gages exhibited a decline in the 7Q10 statistics during this period, ranging from –6.3 to –76.2 percent with a mean of –27.3 percent. No attempts were made during this study to adjust streamflow records or statistical analyses on the basis of trends. The monthly and annual 1Q10 and 7Q10 flow statistics for the entire period of record analyzed in the study are incorporated into the USGS StreamStatsDB, which is a database accessible to users through the recently released USGS StreamStats application for Georgia. StreamStats is a Web-based geographic information system that provides users with access to an assortment of analytical tools that are useful for water-resources planning and management, and for engineering design applications, such as the design of bridges. StreamStats allows users to easily obtain streamflow statistics, basin characteristics, and other information for user-selected streamgages.
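The trend test used in the report above can be sketched directly: Kendall's tau between water year and the annual minimum 7-day average flow, with p < 0.05 read as a significant trend. The flow series below is synthetic, with an imposed decline plus noise; it is not USGS data.

```python
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(2)
years = np.arange(1950, 2014)                       # hypothetical water years
# synthetic annual minimum 7-day average flow with a declining trend
min7day = 100.0 - 0.5 * (years - 1950) + rng.normal(0, 3, size=years.size)

tau, p = kendalltau(years, min7day)
trend = "significant negative" if (p < 0.05 and tau < 0) else "none detected"
```

Being rank-based, Kendall's tau makes no distributional assumption about the flows, which is why it is the standard choice for hydrologic trend screening.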
Accuracy and Calibration of Computational Approaches for Inpatient Mortality Predictive Modeling.
Nakas, Christos T; Schütz, Narayan; Werners, Marcus; Leichtle, Alexander B
2016-01-01
Electronic Health Record (EHR) data can be a key resource for decision-making support in clinical practice in the "big data" era. The complete database from early 2012 to late 2015 involving hospital admissions to Inselspital Bern, the largest Swiss University Hospital, was used in this study, involving over 100,000 admissions. Age, sex, and initial laboratory test results were the features/variables of interest for each admission, the outcome being inpatient mortality. Computational decision support systems were utilized for the calculation of the risk of inpatient mortality. We assessed the recently proposed Acute Laboratory Risk of Mortality Score (ALaRMS) model, and further built generalized linear models, generalized estimating equations, artificial neural networks, and decision tree systems for the predictive modeling of the risk of inpatient mortality. The Area Under the ROC Curve (AUC) for ALaRMS marginally corresponded to the anticipated accuracy (AUC = 0.858). Penalized logistic regression methodology provided a better result (AUC = 0.872). Decision tree and neural network-based methodology provided even higher predictive performance (up to AUC = 0.912 and 0.906, respectively). Additionally, decision tree-based methods can efficiently handle Electronic Health Record (EHR) data that have a significant amount of missing records (in up to >50% of the studied features) eliminating the need for imputation in order to have complete data. In conclusion, we show that statistical learning methodology can provide superior predictive performance in comparison to existing methods and can also be production ready. Statistical modeling procedures provided unbiased, well-calibrated models that can be efficient decision support tools for predicting inpatient mortality and assigning preventive measures.
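The AUC figures quoted in the study above have a concrete probabilistic reading: the AUC equals the probability that a randomly chosen positive (mortality) case receives a higher risk score than a randomly chosen negative case (the Mann-Whitney formulation). The sketch below computes it from scratch on synthetic scores and labels, not the Inselspital data.

```python
import numpy as np

def auc(scores, labels):
    """Mann-Whitney AUC: fraction of positive/negative pairs ranked correctly."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()  # ties count as half
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

rng = np.random.default_rng(3)
labels = rng.integers(0, 2, size=200)           # synthetic outcomes
scores = labels + rng.normal(0, 1, size=200)    # an informative risk score
result = auc(scores, labels)                    # well above the 0.5 chance level
```

Any of the models compared in the study (ALaRMS, penalized logistic regression, trees, neural networks) can be ranked on exactly this quantity, which is what the reported 0.858-0.912 range reflects.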
The future of medical diagnostics: large digitized databases.
Kerr, Wesley T; Lau, Edward P; Owens, Gwen E; Trefler, Aaron
2012-09-01
The electronic health record mandate within the American Recovery and Reinvestment Act of 2009 will have a far-reaching effect on medicine. In this article, we provide an in-depth analysis of how this mandate is expected to stimulate the production of large-scale, digitized databases of patient information. There is evidence to suggest that millions of patients and the National Institutes of Health will fully support the mining of such databases to better understand the process of diagnosing patients. This data mining likely will reaffirm and quantify known risk factors for many diagnoses. This quantification may be leveraged to further develop computer-aided diagnostic tools that weigh risk factors and provide decision support for health care providers. We expect that creation of these databases will stimulate the development of computer-aided diagnostic support tools that will become an integral part of modern medicine.
Sensor Control And Film Annotation For Long Range, Standoff Reconnaissance
NASA Astrophysics Data System (ADS)
Schmidt, Thomas G.; Peters, Owen L.; Post, Lawrence H.
1984-12-01
This paper describes a Reconnaissance Data Annotation System that incorporates off-the-shelf technology and system designs providing a high degree of adaptability and interoperability to satisfy future reconnaissance data requirements. The history of data annotation for reconnaissance is reviewed in order to provide the base from which future developments can be assessed and technical risks minimized. The system described will accommodate new developments in recording head assemblies and the incorporation of advanced cameras of both the film and electro-optical type. Use of microprocessor control and digital bus interface form the central design philosophy. For long range, high altitude, standoff missions, the Data Annotation System computes the projected latitude and longitude of central target position from aircraft position and attitude. This complements the use of longer ranges and high altitudes for reconnaissance missions.
Nurses and computers. An international perspective on nurses' requirements.
Bond, Carol S
2007-01-01
This paper reports the findings from a Florence Nightingale Foundation Travel Scholarship undertaken by the author in the spring of 2006. The aim of the visit was to explore nurses' attitudes towards, and experiences of, using computers in their practice, and the requirements that they have to encourage, promote and support them in using ICT. Nurses were found to be using computers mainly for carrying out administrative tasks, such as updating records, rather than as information tools to support evidence based practice, or patient information needs. Nurses discussed the systems they used, the equipment provided, and their skills, or more often their lack of skills. The need for support was a frequent comment, most nurses feeling that it was essential that help was available at the point of need, and that it was provided by someone, preferably a nurse, who understood the work context. Three groups of nurses were identified: Engagers, Worried Willing, and Resisters. The report concludes that pre-registration education has a responsibility to seek to ensure that newly qualified nurses enter practice as engagers.
Loess record of the Pleistocene-Holocene transition on the northern and central Great Plains, USA
Mason, J.A.; Miao, X.; Hanson, P.R.; Johnson, W.C.; Jacobs, P.M.; Goble, R.J.
2008-01-01
Various lines of evidence support conflicting interpretations of the timing, abruptness, and nature of climate change in the Great Plains during the Pleistocene-Holocene transition. Loess deposits and paleosols on both the central and northern Great Plains provide a valuable record that can help address these issues. A synthesis of new and previously reported optical and radiocarbon ages indicates that the Brady Soil, which marks the boundary between late Pleistocene Peoria Loess and Holocene Bignell Loess, began forming after a reduction in the rate of Peoria Loess accumulation that most likely occurred between 13.5 and 15 cal ka. Brady Soil formation spanned all or part of the Bølling-Allerød episode (approximately 14.7-12.9 cal ka) and all of the Younger Dryas episode (12.9-11.5 cal ka) and extended at least 1000 years beyond the end of the Younger Dryas. The Brady Soil was buried by Bignell Loess sedimentation beginning around 10.5-9 cal ka, and continuing episodically through the Holocene. Evidence for a brief increase in loess influx during the Younger Dryas is noteworthy but very limited. Most late Quaternary loess accumulation in the central Great Plains was nonglacigenic and was under relatively direct climatic control. Thus, Brady Soil formation records climatic conditions that minimized eolian activity and allowed effective pedogenesis, probably through relatively high effective moisture. Optical dating of loess in North Dakota supports correlation of the Leonard Paleosol on the northern Great Plains with the Brady Soil. Thick loess in North Dakota was primarily derived from the Missouri River floodplain; thus, its stratigraphy may in part reflect glacial influence on the Missouri River. Nonetheless, the persistence of minimal loess accumulation and soil formation until 10 cal ka at our North Dakota study site is best explained by a prolonged interval of high effective moisture correlative with the conditions that favored Brady Soil formation.
Burial of both the Brady Soil and the Leonard Paleosol by renewed loess influx probably represents eolian system response that occurred when gradual change toward a drier climate eventually crossed the threshold for eolian activity. Overall, the loess-paleosol sequences of the central and northern Great Plains record a broad peak of high effective moisture across the late Pleistocene to Holocene boundary, rather than well-defined climatic episodes corresponding to the Bølling-Allerød and Younger Dryas episodes in the North Atlantic region. © 2008 Elsevier Ltd. All rights reserved.
Fracture behavior of large-scale thin-sheet aluminum alloy
NASA Technical Reports Server (NTRS)
Dewit, Roland; Fields, Richard J.; Mordfin, Leonard; Low, Samuel R.; Harne, Donald
1994-01-01
A series of fracture tests on large-scale, pre-cracked, aluminum alloy panels is being carried out to examine and to characterize the process by which cracks propagate and link up in this material. Extended grips and test fixtures were specially designed to enable the panel specimens to be loaded in tension, in a 1780-kN-capacity universal testing machine. Twelve panel specimens, each consisting of a single sheet of bare 2024-T3 aluminum alloy, 3988 mm high, 2286 mm wide, and 1.016 mm thick, are being fabricated with simulated through-cracks oriented horizontally at mid-height. Using existing information, a test matrix has been set up that explores regions of failure that are controlled by fracture mechanics, with additional tests near the boundary between plastic collapse and fracture. In addition, a variety of multiple site damage (MSD) configurations have been included to distinguish between various proposed linkage mechanisms. All tests but one use anti-buckling guides. At this writing seven specimens have been tested. Three were fabricated with a single central crack, three others had multiple cracks on each side of the central crack, and one had a single crack but no anti-buckling guides. Each fracture event was recorded on film, video, computer, and magnetic tape, and occasionally by optical microscopy. The visual record showed the crack tip with a load meter in the field of view, using motion picture film for one tip and SVHS video tape for the other. The computer recorded the output of the testing machine load cell, the stroke, and twelve strain gages at 1.5 second intervals. A wideband FM magnetic tape recorder was used to record data from the same sources. The data were analyzed by two different procedures: (1) the plastic zone model based on the residual strength diagram; and (2) the R-curve. The first three tests were used to determine the basic material properties, and these results were then used in the analysis of the two subsequent tests with MSD cracks.
There is good agreement between measured values and results obtained from the model.
NASA Astrophysics Data System (ADS)
Suganuma, Yusuke; Haneda, Yuki; Kameo, Koji; Kubota, Yoshimi; Hayashi, Hiroki; Itaki, Takuya; Okuda, Masaaki; Head, Martin J.; Sugaya, Manami; Nakazato, Hiroomi; Igarashi, Atsuo; Shikoku, Kizuku; Hongo, Misao; Watanabe, Masami; Satoguchi, Yasufumi; Takeshita, Yoshihiro; Nishida, Naohisa; Izumi, Kentaro; Kawamura, Kenji; Kawamata, Moto; Okuno, Jun'ichi; Yoshida, Takeshi; Ogitsu, Itaru; Yabusaki, Hisashi; Okada, Makoto
2018-07-01
Marine Isotope Stage (MIS) 19 is an important analogue for the present interglacial because of its similar orbital configuration, especially the phasing of the obliquity maximum to precession minimum. However, sedimentary records suitable for capturing both terrestrial and marine environmental changes are limited, and thus the climatic forcing mechanisms for MIS 19 are still largely unknown. The Chiba composite section, east-central Japanese archipelago, is a continuous and expanded marine sedimentary succession well suited to capture terrestrial and marine environmental changes through MIS 19. In this study, a detailed oxygen isotope chronology is established from late MIS 20 to early MIS 18, supported by a U-Pb zircon age and the presence of the Matuyama-Brunhes boundary. New pollen, marine microfossil, and planktonic foraminiferal δ18O and Mg/Ca paleotemperature records reveal the complex interplay of climatic influences. Our pollen data suggest that the duration of full interglacial conditions during MIS 19 extends from 785.0 to 775.1 ka (9.9 kyr), which offers an important natural baseline in predicting the duration of the present interglacial. A Younger Dryas-type cooling event is present during Termination IX, suggesting that such events are linked to this orbital configuration. Millennial- to multi-millennial-scale variations in our δ18O and Mg/Ca records imply that the Subarctic Front fluctuated in the northwestern Pacific Ocean during late MIS 19, probably in response to East Asian winter monsoon variability. The climatic setting at this time appears to be related to less severe summer insolation minima at 65˚N and/or high winter insolation at 50˚N. Our records do not support a recently hypothesized direct coupling between variations in the geomagnetic field intensity and global/regional climate change. 
Our highly resolved paleoclimatic and paleoceanographic records, coupled with a well-defined Matuyama-Brunhes boundary (772.9 ka; duration 1.9 kyr), establish the Chiba composite section as an exceptional climatic and chronological reference section for the Early-Middle Pleistocene boundary.
Robertson, Ann; Cresswell, Kathrin; Takian, Amirhossein; Petrakaki, Dimitra; Crowe, Sarah; Cornford, Tony; Barber, Nicholas; Avery, Anthony; Fernando, Bernard; Jacklin, Ann; Prescott, Robin; Klecun, Ela; Paton, James; Lichtner, Valentina; Quinn, Casey; Ali, Maryam; Morrison, Zoe; Jani, Yogini; Waring, Justin; Marsden, Kate
2010-01-01
Objectives To describe and evaluate the implementation and adoption of detailed electronic health records in secondary care in England and thereby provide early feedback for the ongoing local and national rollout of the NHS Care Records Service. Design A mixed methods, longitudinal, multisite, socio-technical case study. Setting Five NHS acute hospital and mental health trusts that have been the focus of early implementation efforts and at which interim data collection and analysis are complete. Data sources and analysis Dataset for the evaluation consists of semi-structured interviews, documents and field notes, observations, and quantitative data. Qualitative data were analysed thematically with a socio-technical coding matrix, combined with additional themes that emerged from the data. Main results Hospital electronic health record applications are being developed and implemented far more slowly than was originally envisioned; the top-down, standardised approach has needed to evolve to admit more variation and greater local choice, which hospital trusts want in order to support local activity. Despite considerable delays and frustrations, support for electronic health records remains strong, including from NHS clinicians. Political and financial factors are now perceived to threaten nationwide implementation of electronic health records. Interviewees identified a range of consequences of long term, centrally negotiated contracts to deliver the NHS Care Records Service in secondary care, particularly as NHS trusts themselves are not party to these contracts. These include convoluted communication channels between different stakeholders, unrealistic deployment timelines, delays, and applications that could not quickly respond to changing national and local NHS priorities. 
Our data suggest support for a “middle-out” approach to implementing hospital electronic health records, combining government direction with increased local autonomy, and for restricting detailed electronic health record sharing to local health communities. Conclusions Experiences from the early implementation sites, which have received considerable attention, financial investment and support, indicate that delivering improved healthcare through nationwide electronic health records will be a long, complex, and iterative process requiring flexibility and local adaptability both with respect to the systems and the implementation strategy. The more tailored, responsive approach that is emerging is becoming better aligned with NHS organisations’ perceived needs and is, if pursued, likely to deliver clinically useful electronic health record systems. PMID:20813822
Overview of NASA communications infrastructure
NASA Technical Reports Server (NTRS)
Arnold, Ray J.; Fuechsel, Charles
1991-01-01
The infrastructure of NASA communications systems for effecting coordination across NASA offices and with the national and international research and technological communities is discussed. The offices and networks of the communication system include the Office of Space Science and Applications (OSSA), which manages all NASA missions, and the Office of Space Operations, which furnishes communication support through the NASCOM, the mission critical communications support network, and the Program Support Communications network. The NASA Science Internet was established by OSSA to centrally manage, develop, and operate an integrated computer network service dedicated to NASA's space science and application research. Planned for the future is the National Research and Education Network, which will provide communications infrastructure to enhance science resources at a national level.
Hydricity-promoted [1,5]-H shifts in acetalic ketenimines and carbodiimides.
Alajarín, Mateo; Bonillo, Baltasar; Ortín, María-Mar; Sánchez-Andrada, Pilar; Vidal, Angel
2006-11-23
2-monosubstituted 1,3-dioxolanes and dithiolanes act as hydride-releasing fragments, transferring intramolecularly their acetalic H atom to the central carbon of ketenimine functions. The presumed products of these migrations, o-quinomethanimines, undergo in situ 6pi-electrocyclization. A computational study supports this mechanism and the hydride-shift character of the first step. Carbodiimides were also suitable substrates, although less reactive. [reaction: see text].
NASA Astrophysics Data System (ADS)
Czymzik, M.; Muscheler, R.; Brauer, A.
2015-10-01
Solar influences on climate variability are one of the most controversially discussed topics in climate research. We analyze solar forcing of flood frequency in Central Europe on inter-annual to millennial time-scales using daily discharge data of River Ammer (southern Germany) back to AD 1926 and revisiting the 5500 year flood layer time-series from varved sediments of the downstream Lake Ammersee. Flood frequency in the discharge record is significantly correlated to changes in solar activity during solar cycles 16-23 (r = -0.47, p < 0.0001, n = 73). Flood layer frequency (n = 1501) in the sediment record depicts distinct multi-decadal variability and significant correlations to 10Be fluxes from a Greenland ice core (r = 0.45, p < 0.0001) and 14C production rates (r = 0.36, p < 0.0001), both proxy records of solar activity. Flood frequency is higher when solar activity is reduced. These correlations between flood frequency and solar activity might provide empirical support for the solar top-down mechanism that model studies expect to modify the mid-latitude storm tracks over Europe. A lag of flood frequency responses in the Ammer discharge record to changes in solar activity of about one to three years could be explained by a modelled ocean-atmosphere feedback delaying the atmospheric reaction to solar activity variations by up to a few years.
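The correlations quoted above are ordinary Pearson coefficients between a solar-activity proxy and flood counts. As a minimal sketch of that computation (the series below are invented toy numbers, not the Ammer discharge or 10Be data):

```python
# Sketch of the Pearson correlation used in the abstract. The two
# series are invented toy data chosen to mimic the reported negative
# flood-vs-solar-activity relationship; they are not the study's data.
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Toy series: flood counts tend to rise when the solar proxy falls,
# qualitatively matching the reported r = -0.47 for cycles 16-23.
solar_proxy = [80, 120, 60, 150, 90, 40, 130, 70]
flood_count = [5, 2, 6, 1, 4, 7, 2, 5]
r = pearson_r(solar_proxy, flood_count)  # negative for this toy data
```

The quoted p-values would additionally require a significance test (e.g. a t test on r with n - 2 degrees of freedom, or a permutation test); only the coefficient itself is sketched here.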
Real-Time Transliteration of Speech into Print for Hearing-Impaired Students in Regular Classes.
ERIC Educational Resources Information Center
Stuckless, E. Ross
1983-01-01
A system is described whereby a stenotypist records the classroom instructor's and students' speech which a computer then translates to words on the screen for hearing impaired postsecondary students. Initial results include a high degree of verbatim accuracy, support for real-time operation, and several technical problems including lack of…
ERIC Educational Resources Information Center
Lee, Victor R.; DuMont, Maneksha
2010-01-01
There is a great potential opportunity to use portable physical activity monitoring devices as data collection tools for educational purposes. Using one such device, we designed and implemented a weeklong workshop with high school students to test the utility of such technology. During that intervention, students performed data investigations of…
Central and peripheral components of short latency vestibular responses in the chicken
NASA Technical Reports Server (NTRS)
Nazareth, A. M.; Jones, T. A.
1998-01-01
Far-field recordings of short latency vestibular responses to pulsed cranial translation are composed of a series of positive and negative peaks occurring within 10 ms following stimulus onset. In the bird, these vestibular evoked potentials (VsEPs) can be recorded noninvasively and have been shown in the chicken and quail to depend strictly upon the activation of the vestibular component of the eighth nerve. The utility of the VsEP in the study of vestibular systems is dependent upon a clear understanding of the neural sources of response components. The primary aim of the current research in the chicken was to critically test the hypotheses that 1) responses are generated by both peripheral and central neurons and 2) peaks P1 and N1 originate from first order vestibular neurons, whereas later waves primarily depend on activity in higher order neurons. The principal strategy used here was to surgically isolate the eighth nerve as it enters the brainstem. Interruption of primary afferents of the eighth nerve in the brainstem substantially reduced or eliminated peaks beyond P2, whereas P1 and N1 were generally spared. Surgical sections that spared vestibular pathways had little effect on responses. The degree of change in response components beyond N1 was correlated with the extent of damage to central vestibular relays. These findings support the conclusion that responses are produced by both peripheral and central elements of the vestibular system. Further, response peaks later than N1 appear to be dependent upon central relays, whereas P1 and N1 reflect activity of the peripheral nerve. These findings clarify the roles of peripheral and central neurons in the generation of vestibular evoked potentials and provide the basis for a more useful and detailed interpretation of data from vestibular response testing.
Age-related changes in the anticipatory coarticulation in the speech of young children
NASA Astrophysics Data System (ADS)
Parson, Mathew; Lloyd, Amanda; Stoddard, Kelly; Nissen, Shawn L.
2003-10-01
This paper investigates the possible patterns of anticipatory coarticulation in the speech of young children. Speech samples were elicited from three groups of children between 3 and 6 years of age and one comparison group of adults. The utterances were recorded online in a quiet room environment using high quality microphones and direct analog-to-digital conversion to computer disk. Formant frequency measures (F1, F2, and F3) were extracted from a centralized and unstressed vowel (schwa) spoken prior to two different sets of productions. The first set of productions consisted of the target vowel followed by a series of real words containing an initial CV(C) syllable (voiceless obstruent-monophthongal vowel) in a range of phonetic contexts, while the second set consisted of a series of nonword productions with a relatively constrained phonetic context. An analysis of variance was utilized to determine if the formant frequencies varied systematically as a function of age, gender, and phonetic context. Results will also be discussed in association with spectral moment measures extracted from the obstruent segment immediately following the target vowel. [Work supported by research funding from Brigham Young University.]
Pre-Hispanic agricultural decline prior to the Spanish Conquest in southern Central America
NASA Astrophysics Data System (ADS)
Taylor, Zachary P.; Horn, Sally P.; Finkelstein, David B.
2013-08-01
Archeological and paleoenvironmental records from southern Central America attribute population collapse to the Spanish Conquest about 500 years ago. Paleoclimate records from the circum-Caribbean have shown evidence of severe, regional droughts that contributed to the collapse of the Mayan Civilization, but there are few records of these droughts in southern Central America and no records of their effects on prehistoric populations in the region. Here we present a high-resolution lake sediment record of prehistoric agricultural activities using bulk sediment stable carbon isotopes from Laguna Zoncho, Costa Rica. We find isotopic evidence that agriculture was nearly absent from the watershed approximately 220 years prior to the Spanish arrival in Costa Rica and identify two distinct periods of agricultural decline, 1150-970 and 860-640 cal yr BP, which correspond to severe droughts in central Mexico. We attribute decreases in agriculture to a weakened Central American monsoon, which would have shortened the growing season at Laguna Zoncho, reduced crop yields, and negatively affected prehistoric populations.
Contour changes in human alveolar bone following tooth extraction of the maxillary central incisor.
Li, Bei; Wang, Yao
2014-12-01
The purpose of this study was to apply cone-beam computed tomography (CBCT) to observe contour changes in human alveolar bone after tooth extraction of the maxillary central incisor and to provide original morphological evidence for aesthetic implant treatment in the maxillary anterior area. Forty patients were recruited into the study. Each patient had two CBCT scans (CBCT I and CBCT II), one taken before and one taken three months after tooth extraction of the maxillary central incisor (test tooth T). A fixed anatomic reference point was used to orient the starting axial slice of the two scans. On three CBCT I axial slices, which represented the deep, middle, and shallow layers of the socket, labial and palatal alveolar bone widths of T were measured. The number of sagittal slices from the start point to the pulp centre of T was recorded. On three CBCT II axial slices, the pulp centres of extracted T were oriented according to the number of moved sagittal slices recorded in CBCT I. Labial and palatal alveolar bone widths at the oriented sites were measured. On the CBCT I axial slice which represented the middle layer of the socket, sagittal slices were reconstructed. Relevant distances of T on the sagittal slice were measured, as were the alveolar bone width and tooth length of the opposite central incisor. On the CBCT II axial slice, which represented the middle layer of the socket, relevant distances recorded in CBCT I were transferred on the sagittal slice. The height reduction of alveolar bone on labial and palatal sides was measured, as were the alveolar bone width and tooth length of the opposite central incisor at the oriented site. Intraobserver reliability assessed by intraclass correlation coefficients (ICCs) was high. Paired sample t-tests were performed. The alveolar bone width and tooth length of the opposite central incisor showed no statistical differences (P>0.05).
The labial alveolar bone widths of T at the deep, middle, and shallow layers all showed statistical differences. However, no palatal alveolar bone widths showed any statistical differences. The width reduction of alveolar bone was 1.2, 1.6, and 2.7 mm at the deep, middle, and shallow layers, respectively. The height reductions of alveolar bone on the labial and palatal sides of T both showed statistical differences, at 1.9 and 1.1 mm, respectively.
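The comparisons above rely on paired-sample t-tests of the same sites measured before and after extraction. A minimal sketch of that statistic, using invented widths rather than the study's measurements:

```python
# Sketch of the paired-sample t statistic used for the before/after
# CBCT comparisons. The widths below are invented for illustration.
from math import sqrt

def paired_t(before, after):
    """Paired t statistic: mean of the differences over its standard error."""
    diffs = [b - a for b, a in zip(before, after)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    return mean_d / sqrt(var_d / n)

# Invented labial widths (mm) at the shallow layer for five patients,
# roughly mimicking the ~2.7 mm reduction reported above.
width_before = [8.1, 7.9, 8.4, 8.0, 8.2]
width_after = [5.5, 5.2, 5.8, 5.3, 5.4]
t = paired_t(width_before, width_after)
```

The statistic is then compared against a t distribution with n - 1 degrees of freedom to obtain the P value.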
18 CFR 3b.204 - Safeguarding information in manual and computer-based record systems.
Code of Federal Regulations, 2014 CFR
2014-04-01
... information in manual and computer-based record systems. 3b.204 Section 3b.204 Conservation of Power and Water... Collection of Records § 3b.204 Safeguarding information in manual and computer-based record systems. (a) The administrative and physical controls to protect the information in the manual and computer-based record systems...
18 CFR 3b.204 - Safeguarding information in manual and computer-based record systems.
Code of Federal Regulations, 2011 CFR
2011-04-01
... information in manual and computer-based record systems. 3b.204 Section 3b.204 Conservation of Power and Water... Collection of Records § 3b.204 Safeguarding information in manual and computer-based record systems. (a) The administrative and physical controls to protect the information in the manual and computer-based record systems...
18 CFR 3b.204 - Safeguarding information in manual and computer-based record systems.
Code of Federal Regulations, 2012 CFR
2012-04-01
... information in manual and computer-based record systems. 3b.204 Section 3b.204 Conservation of Power and Water... Collection of Records § 3b.204 Safeguarding information in manual and computer-based record systems. (a) The administrative and physical controls to protect the information in the manual and computer-based record systems...
18 CFR 3b.204 - Safeguarding information in manual and computer-based record systems.
Code of Federal Regulations, 2013 CFR
2013-04-01
... information in manual and computer-based record systems. 3b.204 Section 3b.204 Conservation of Power and Water... Collection of Records § 3b.204 Safeguarding information in manual and computer-based record systems. (a) The administrative and physical controls to protect the information in the manual and computer-based record systems...
Lee, Jae-In; Lee, Yoon; Kim, Yu-Lee; Cho, Hye-Won
2016-02-01
The 4-, 3- or even 2-implant-supported partial fixed dental prosthesis (PFDP) designs have been used to rehabilitate the anterior edentulous maxilla. The purpose of this in vitro study was to compare the stress distribution in the supporting tissues surrounding implants placed in the anterior maxilla with 5 PFDP designs. A photoelastic model of the human maxilla with an anterior edentulous region was made with photoelastic resin (PL-2; Vishay Micro-Measurements), and 6 straight implants (OsseoSpeed; Astra Tech AB) were placed in the 6 anterior tooth positions. The 5 design concepts based on implant location were as follows: model 6I: 6 implants; model 2C2CI: 4 implants (2 canines and 2 central incisors); model 2C2LI: 4 implants (2 canines and 2 lateral incisors); model 2C1CI: 3 implants (2 canines and 1 central incisor); and model 2C: 2 canines. A load of 127.4 N was applied on the cingulum of 3 teeth at a 30-degree angle to the long axis of the implant. Stresses that developed in the supporting structure were recorded photographically. The 6-implant-supported PFDP exhibited the most even and lowest distribution of stresses in all loading conditions. When the canine was loaded, the 2- or 3-implant-supported PFDP showed higher stresses around the implant at the canine position than did the 4- or 6-implant-supported PFDP. When the central incisor or lateral incisor was loaded, the two 4-implant-supported PFDPs exhibited similar levels of stresses around the implants and showed lower stresses than did the 2- or 3-implant-supported PFDP. Implant number and distribution influenced stress distribution around the implants in the anterior maxilla. With a decrease in implant number, the stresses around the implants increased. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
1994-01-01
In the mid-1980s, Kinetic Systems and Langley Research Center determined that high speed CAMAC (Computer Automated Measurement and Control) data acquisition systems could significantly improve Langley's ARTS (Advanced Real Time Simulation) system. The ARTS system supports flight simulation R&D, and the CAMAC equipment allowed 32 high performance simulators to be controlled by centrally located host computers. This technology broadened Kinetic Systems' capabilities and led to several commercial applications. One of them is General Atomics' fusion research program. Kinetic Systems equipment allows tokamak data to be acquired four to 15 times more rapidly. Ford Motor Company uses the same technology to control and monitor transmission testing facilities.
Guiffant, Gérard; Durussel, Jean Jacques; Flaud, Patrice; Vigier, Jean Pierre; Merckx, Jacques
2012-01-01
Totally implantable venous access devices were developed as medical devices allowing mid- and long-term, frequent, repeated, or continuous injection of therapeutic products by vascular, cavitary, or perineural access. The effective flushing of these devices is a central element to assure long-lasting use. Our experimental work demonstrates that directing the Huber point needle opening in the diametrically opposite direction of the implantable port exit channel increases the flushing efficiency. These results are consolidated by numerical computations, which support recommendations not only for their maintenance, but also for their use. PMID:23166455
NASA Astrophysics Data System (ADS)
Barron, J. A.; Metcalfe, S. E.; Davies, S. J.
2014-12-01
We evaluate proxy reconstructions of Holocene precipitation in the North American Monsoon region (SW US and northern Mexico) and regions to the south (southern Mexico, Central America, and the Caribbean). Seventy-seven precipitation records are tabulated at 2-3 kyr increments for the past 12 kyr, with results displayed mainly on maps. Sites currently dominated by summer precipitation, coupled with proxy records that distinguish summer vs. winter vegetation, are used to estimate summer precipitation. Resulting patterns of precipitation variability are evaluated against SST reconstructions from surrounding tropical seas: the eastern tropical Pacific, Gulf of California (GoC), Caribbean, and Gulf of Mexico (GoM), which are source areas for summer precipitation. During the Younger Dryas, ca. 12 ka, widespread drying in southern regions contrasted with evidence for wetter conditions in multiple records from the SW US. By 9 ka wetter conditions had spread to the southern regions, likely reflecting an increased Caribbean low-level jet associated with an enhanced Bermuda High. Pacific westerlies contributed significant winter precipitation to the southwestern US and northernmost Mexico at 9 ka. The modern geographical pattern of summer precipitation was established by 6 ka, as the Bermuda High moved northward following the demise of the Laurentide Ice Sheet. SSTs in the GoC and GoM increased, and the NAM strengthened. Increased regional precipitation differences are apparent by 4 ka, likely reflecting enhanced ENSO variability. Most of the southern region experienced increased precipitation during the Medieval Climate Anomaly (MCA), whereas winter drought dominated in the north. In contrast, much of the Little Ice Age (LIA) was characterized by generally drier conditions in Central America and Mexico, with wetter conditions in the SW US. Results are broadly supportive of enhanced La Niña-like conditions during the MCA vs. increased ENSO variability during the LIA.
Kotoda, Atsushi; Akimoto, Tetsu; Kato, Maki; Kanazawa, Hidenori; Nakata, Manabu; Sugase, Taro; Ogura, Manabu; Ito, Chiharu; Sugimoto, Hideharu; Muto, Shigeaki; Kusano, Eiji
2011-01-01
It is widely assumed that central venous stenosis (CVS) is most commonly associated with previous central venous catheterization among chronic hemodialysis (HD) patients. We evaluated the validity of this assumption in this retrospective study. The clinical records from 2,856 consecutive HD patients with vascular access failure during a 5-year period were reviewed, and a total of 26 patients with symptomatic CVS were identified. Combined with radiological findings, their clinical characteristics were examined. Only seven patients had a history of internal jugular dialysis catheterization. Diagnostic multidetector row computed tomography angiography showed that 7 of the 19 patients with no history of catheterization had left innominate vein stenosis due to extrinsic compression between the sternum and arch vessels. These patients had a shorter period from the time of creation of the vascular access to the initial referral (9.2 ± 7.6 months) than the rest of the patients (35.5 ± 18.6 months, p = 0.0017). Our findings suggest that cases without a history of central venous catheterization may not be rare among HD patients with symptomatic CVS. However, this still needs to be confirmed by larger prospective studies of chronic HD patients with symptomatic CVS.
Apply creative thinking of decision support in electrical nursing record.
Hao, Angelica Te-Hui; Hsu, Chien-Yeh; Li-Fang, Huang; Jian, Wen-Shan; Wu, Li-Bin; Kao, Ching-Chiu; Lu, Mei-Show; Chang, Her-Kung
2006-01-01
The nursing process consists of five interrelated steps: assessment, diagnosis, planning, intervention, and evaluation. In the nursing process, the nurse collects a great deal of data and information. The amount of data and information may exceed the amount the nurse can process efficiently and correctly. Thus, the nurse needs assistance to become proficient in the planning of nursing care, due to the difficulty of simultaneously processing a large set of information. Computer systems are viewed as tools to expand the capabilities of the nurse's mind. Using computer technology to support clinicians' decision making may provide high-quality, patient-centered, and efficient healthcare. Although some existing nursing information systems aid in the nursing process, they only provide the most fundamental decision support--i.e., standard care plans associated with common nursing diagnoses. Such a computerized decision support system helps the nurse develop a care plan step by step, but it does not assist the nurse in the decision-making process. The decision process of how to generate nursing diagnoses from data and how to individualize the care plans still remains with the nurse. The purpose of this study is to develop a pilot structure for an electronic nursing record system integrated with international nursing standards for improving the proficiency and accuracy of the plan of care in the clinical pathway process. The proposed pilot system assists not only student nurses and nurses who are novices in nursing practice, but also experts who need to work in a practice area with which they are not familiar.
Competence with Fractions Predicts Gains in Mathematics Achievement
Bailey, Drew H.; Hoard, Mary K.; Nugent, Lara; Geary, David C.
2012-01-01
Competence with fractions predicts later mathematics achievement, but the co-developmental pattern between fractions knowledge and mathematics achievement is not well understood. We assessed this co-development through examination of the cross-lagged relation between a measure of conceptual knowledge of fractions and mathematics achievement in sixth and seventh grade (n = 212). The cross-lagged effects indicated that performance on the sixth grade fractions concepts measure predicted one year gains in mathematics achievement (β = .14, p<.01), controlling for the central executive component of working memory and intelligence, but sixth grade mathematics achievement did not predict gains on the fractions concepts measure (β = .03, p>.50). In a follow-up assessment, we demonstrated that measures of fluency with computational fractions significantly predicted seventh grade mathematics achievement above and beyond the influence of fluency in computational whole number arithmetic, performance on number fluency and number line tasks, and central executive span and intelligence. Results provide empirical support for the hypothesis that competence with fractions underlies, in part, subsequent gains in mathematics achievement. PMID:22832199
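The cross-lagged effects described above are standardized regression coefficients for one earlier measure predicting a later outcome while controlling for the outcome's own earlier value. A minimal sketch under that reading, using the two-predictor partial-regression formula on invented scores (the study's full model also controls for working memory and intelligence, which this sketch omits):

```python
# Sketch of a cross-lagged (standardized partial) coefficient of the
# kind reported above: grade-7 achievement regressed on grade-6
# fractions knowledge while controlling for grade-6 achievement.
# All scores are invented; only the regression structure is illustrated.
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def cross_lagged_beta(outcome, predictor, control):
    """Standardized coefficient of predictor on outcome, controlling
    for control (two-predictor regression on standardized variables)."""
    r_yx = pearson_r(outcome, predictor)
    r_yz = pearson_r(outcome, control)
    r_xz = pearson_r(predictor, control)
    return (r_yx - r_yz * r_xz) / (1 - r_xz ** 2)

# Invented scores for eight students: grade-6 fractions concepts,
# grade-6 math achievement, grade-7 math achievement.
frac6 = [3, 5, 2, 6, 4, 7, 1, 5]
math6 = [62, 68, 58, 71, 66, 77, 55, 64]
math7 = [64, 76, 57, 82, 69, 88, 52, 74]
beta = cross_lagged_beta(math7, frac6, math6)
```

Adding further covariates, as the study does, would require full multiple regression rather than the closed-form two-predictor formula.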
A Hybrid Cloud Computing Service for Earth Sciences
NASA Astrophysics Data System (ADS)
Yang, C. P.
2016-12-01
Cloud computing is becoming the norm for providing computing capabilities for advancing the Earth sciences, including big Earth data management, processing, analytics, model simulations, and many other aspects. A hybrid spatiotemporal cloud computing service has been built at the George Mason NSF spatiotemporal innovation center to meet these demands. This paper reports on several aspects of the service: 1) the hardware includes 500 computing servers and close to 2 PB of storage, as well as connections to XSEDE Jetstream and the Caltech experimental cloud computing environment for resource sharing; 2) the cloud service is geographically distributed across the east coast, west coast, and central region; 3) the cloud includes private clouds managed using OpenStack and Eucalyptus, with DC2 used to bridge these and the public AWS cloud for interoperability and for sharing computing resources when high demand surges; 4) the cloud service supports the NSF EarthCube program through the ECITE project, and ESIP through the ESIP cloud computing cluster, the semantics testbed cluster, and other clusters; 5) the cloud service is also available to the Earth science communities for conducting geoscience research. A brief introduction on how to use the cloud service is included.
Synchronous computer mediated group discussion.
Gallagher, Peter
2005-01-01
Over the past 20 years, focus groups have become increasingly popular with nursing researchers as a data collection method, as has the use of computer-based technologies to support all forms of nursing research. This article describes the conduct of a series of focus groups in which the participants were in the same room as part of a "real-time" discussion during which they also used personal computers as an interface between each other and the moderator. Synchronous Computer Mediated Group Discussion differed from other forms of focus group discussion in that participants used personal computers rather than verbal expressions to respond to specific questions, engage in communication with other participants, and record their thoughts. This form of focus group maintained many of the features of spoken exchanges, a cornerstone of the focus group, while capturing the advantages of online discussion.
The interdependence of lake ice and climate in central North America
NASA Technical Reports Server (NTRS)
Jelacic, A. J. (Principal Investigator)
1972-01-01
There are no author-identified significant results in this report. The aim of this investigation is to identify any correlations between the freeze/thaw cycles of lakes and regional weather variations. ERTS-1 imagery of central Canada and the north-central United States is examined on a seasonal basis. The ice conditions of certain major study lakes are noted and recorded on magnetic tape, from which the movement of a freeze/thaw transition zone may be deduced. Weather maps and tables are used to establish any obvious correlations. The process of selecting major study lakes is discussed, and a complete lake directory is presented. Various routines of the software support library are described, accompanied by output samples. Procedures used for ERTS imagery processing are presented along with the data analysis plan. Application of these procedures to selected ERTS imagery has demonstrated their utility. Preliminary results show that the freeze/thaw transition zone can be monitored from ERTS.
Managing the equipment service life in rendering engineering support to NPP operation
NASA Astrophysics Data System (ADS)
Ryasnyy, S. I.
2015-05-01
Apart from subjecting metal to nondestructive testing and determining its actual state, which are the traditional methods used for managing the service life of NPP equipment during operation, other approaches closely linked with engineering support of NPP operation have emerged in recent decades, although they have been covered in publications to a lesser extent. Service life management occupies the central place in the structure of engineering support measures. Applying the concept of repairing NPP equipment based on an assessment of its technical state and risk of failure makes it possible to achieve significantly lower maintenance and repair costs and to produce more electricity owing to shorter planned outages. Decreasing the occurrence probability of a process-related abnormality through prediction is a further development of techniques for monitoring the technical state of equipment and systems. The proposed and implemented procedure for predicting process-related deviations from normal NPP operation makes it possible to record, online, trends in process parameters that are likely to lead to equipment malfunctions, and to reduce the probability of power unit unloading when an abnormal equipment state arises and develops, by recording changes in state at an early stage and taking timely corrective measures. The article presents the structure of interconnections between the objectives and conditions of adjustment and commissioning tests, in which the management of equipment service life (saving and optimizing the service life) occupies the central place. Special attention is paid to differences between resource saving and optimization measures.
Gareen, Ilana F; Sicks, JoRean D; Jain, Amanda Adams; Moline, Denise; Coffman-Kadish, Nancy
2013-01-01
In clinical trials and epidemiologic studies, information on medical care utilization and health outcomes is often obtained from medical records. For multi-center studies, this information may be gathered by personnel at individual sites or by staff at a central coordinating center. We describe the process used to develop a HIPAA-compliant centralized process to collect medical record information for a large multi-center cancer screening trial. The framework used to select, request, and track medical records incorporated a participant questionnaire with unique identifiers for each medical provider. De-identified information from the questionnaires was sent to the coordinating center indexed by these identifiers. The central coordinating center selected specific medical providers for abstraction and notified sites using these identifiers. The site personnel then linked the identifiers with medical provider information. Staff at the sites collected medical records and provided them for central abstraction. Medical records were successfully obtained and abstracted to ascertain information on outcomes and health care utilization in a study with over 18,000 study participants. Collection of records required for outcomes related to positive screening examinations and lung cancer diagnosis exceeded 90%. Collection of records for all aims was 87.32%. We designed a successful centralized medical record abstraction process that may be generalized to other research settings, including observational studies. The coordinating center received no identifying data. The process satisfied requirements imposed by the Health Insurance Portability and Accountability Act and concerns of site institutional review boards with respect to protected health information. Copyright © 2012 Elsevier Inc. All rights reserved.
A millennium of metallurgy recorded by lake sediments from Morococha, Peruvian Andes.
Cooke, Colin A; Abbott, Mark B; Wolfe, Alexander P; Kittleson, John L
2007-05-15
To date, information concerning pre-Colonial metallurgy in South America has largely been limited to the archaeological record of artifacts. Here, we reconstruct a millennium of smelting activity in the Peruvian Andes using the lake-sediment stratigraphy of atmospherically derived metals (Pb, Zn, Cu, Ag, Sb, Bi, and Ti) and lead isotopic ratios (206Pb/207Pb) associated with smelting from the Morococha mining region in the central Peruvian Andes. The earliest evidence for metallurgy occurs ca. 1000 A.D., coinciding with the fall of the Wari Empire and decentralization of local populations. Smelting during this interval appears to have been aimed at copper and copper alloys, because of large increases in Zn and Cu relative to Pb. A subsequent switch to silver metallurgy under Inca control (ca. 1450 to conquest, 1533 A.D.) is indicated by increases in Pb, Sb, and Bi, a conclusion supported by further increases of these metals during Colonial mining, which targeted silver extraction. Rapid development of the central Andes during the 20th century raised metal burdens by an order of magnitude above previous levels. Our results represent the first evidence for pre-Colonial smelting in the central Peruvian Andes, and corroborate the sensitivity of lake sediments to pre-Colonial metallurgical activity suggested by earlier findings from Bolivia.
NASA Technical Reports Server (NTRS)
1974-01-01
The specifications and functions of the Central Data Processing Facility (CDPF), which supports the Earth Observatory Satellite (EOS), are discussed. The CDPF will receive the EOS sensor data and spacecraft data through the Spaceflight Tracking and Data Network (STDN) and the Operations Control Center (OCC). The CDPF will process the data and produce high density digital tapes, computer compatible tapes, film and paper print images, and other data products. The specific aspects of data inputs and data processing are identified. A block diagram of the CDPF is provided to show the data flow and interfaces of the subsystems.
Introduction to the LaRC central scientific computing complex
NASA Technical Reports Server (NTRS)
Shoosmith, John N.
1993-01-01
The computers and associated equipment that make up the Central Scientific Computing Complex of the Langley Research Center are briefly described. The electronic networks that provide access to the various components of the complex, and a number of areas that can be used by Langley and contractor staff for special applications (scientific visualization, image processing, software engineering, and grid generation), are also described. Flight simulation facilities that use the central computers are described. Management of the complex, procedures for its use, and available services and resources are discussed. This document is intended for new users of the complex, for current users who wish to keep apprised of changes, and for visitors who need to understand the role of central scientific computers at Langley.
Use of electronic medical record data for quality improvement in schizophrenia treatment.
Owen, Richard R; Thrush, Carol R; Cannon, Dale; Sloan, Kevin L; Curran, Geoff; Hudson, Teresa; Austen, Mark; Ritchie, Mona
2004-01-01
An understanding of the strengths and limitations of automated data is valuable when using administrative or clinical databases to monitor and improve the quality of health care. This study discusses the feasibility and validity of using data electronically extracted from the Veterans Health Administration (VHA) computer database (VistA) to monitor guideline performance for inpatient and outpatient treatment of schizophrenia. The authors also discuss preliminary results and their experience in applying these methods to monitor antipsychotic prescribing using the South Central VA Healthcare Network (SCVAHCN) Data Warehouse as a tool for quality improvement.
Flame Speeds and Energy Considerations for Explosions in a Spherical Bomb
NASA Technical Reports Server (NTRS)
Fiock, Ernest F.; Marvin, Charles F., Jr.; Caldwell, Frank R.; Roeder, Carl H.
1940-01-01
Simultaneous measurements were made of the speed of flame and the rise in pressure during explosions of mixtures of carbon monoxide, normal heptane, iso-octane, and benzene in a 10-inch spherical bomb with central ignition. From these records, fundamental properties of the explosive mixtures, which are independent of the apparatus, were computed. The transformation velocity, or speed at which flame advances into and transforms the explosive mixture, increases with both the temperature and the pressure of the unburned gas. The rise in pressure was correlated with the mass of charge inflamed to show the course of the energy developed.
A multi-channel coronal spectrophotometer.
NASA Technical Reports Server (NTRS)
Landman, D. A.; Orrall, F. Q.; Zane, R.
1973-01-01
We describe a new multi-channel coronal spectrophotometer system, presently being installed at Mees Solar Observatory, Mount Haleakala, Maui. The apparatus is designed to record and interpret intensities from many sections of the visible and near-visible spectral regions simultaneously, with relatively high spatial and temporal resolution. The detector, a thermoelectrically cooled silicon vidicon camera tube, has its central target area divided into a rectangular array of about 100,000 pixels and is read out in a slow-scan (about 2 sec/frame) mode. Instrument functioning is entirely under PDP 11/45 computer control, and interfacing is via the CAMAC system.
Uehara, Takashi; Sartori, Matteo; Tanaka, Toshihisa; Fiori, Simone
2017-06-01
The estimation of covariance matrices is of prime importance to analyze the distribution of multivariate signals. In motor imagery-based brain-computer interfaces (MI-BCI), covariance matrices play a central role in the extraction of features from recorded electroencephalograms (EEGs); therefore, correctly estimating covariance is crucial for EEG classification. This letter discusses algorithms to average sample covariance matrices (SCMs) for the selection of the reference matrix in tangent space mapping (TSM)-based MI-BCI. Tangent space mapping is a powerful method of feature extraction and strongly depends on the selection of a reference covariance matrix. In general, the observed signals may include outliers; therefore, taking the geometric mean of SCMs as the reference matrix may not be the best choice. In order to deal with the effects of outliers, robust estimators have to be used. In particular, we discuss and test the use of geometric medians and trimmed averages (defined on the basis of several metrics) as robust estimators. The main idea behind trimmed averages is to eliminate data that exhibit the largest distance from the average covariance calculated on the basis of all available data. The results of the experiments show that while the geometric medians show little differences from conventional methods in terms of classification accuracy in the classification of electroencephalographic recordings, the trimmed averages show significant improvement for all subjects.
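The trimmed-average idea above can be illustrated with plain NumPy: compute the sample covariance matrices, discard the few that lie farthest from the average, and re-average. This sketch uses the Euclidean (Frobenius) metric for simplicity; the letter also considers Riemannian metrics and geometric medians, which are not shown here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sample covariance matrices (SCMs) from EEG trials:
# 20 ordinary trials (4 channels, 100 samples) plus 2 outlier trials
# with strongly inflated variance, e.g. artifact-contaminated epochs.
trials = [rng.standard_normal((4, 100)) for _ in range(20)]
trials += [5.0 * rng.standard_normal((4, 100)) for _ in range(2)]
scms = np.array([t @ t.T / t.shape[1] for t in trials])

def trimmed_mean_scm(scms, trim=2):
    """Euclidean trimmed average: drop the `trim` SCMs farthest
    (in Frobenius distance) from the plain average of all SCMs,
    then re-average the remaining ones."""
    mean = scms.mean(axis=0)
    dists = np.linalg.norm(scms - mean, axis=(1, 2))
    keep = np.argsort(dists)[: len(scms) - trim]
    return scms[keep].mean(axis=0)

ref = trimmed_mean_scm(scms, trim=2)
# Outliers inflate the plain mean; trimming pulls the reference back.
print(np.trace(scms.mean(axis=0)), np.trace(ref))
```

In a TSM pipeline, `ref` would serve as the reference matrix at which the tangent space is taken.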
Comparing New Zealand's 'Middle Out' health information technology strategy with other OECD nations.
Bowden, Tom; Coiera, Enrico
2013-05-01
Implementation of efficient, universally applied, computer-to-computer communications is a high priority for many national health systems. As a consequence, much effort has been channelled into finding ways in which a patient's previous medical history can be made accessible when needed. A number of countries have attempted to share patients' records, with varying degrees of success. While most efforts to create record-sharing architectures have relied upon government-provided strategy and funding, New Zealand has taken a different approach. Like most British Commonwealth nations, New Zealand has a 'hybrid' publicly/privately funded health system. However, its information technology infrastructure and automation has largely been developed by the private sector, working closely with regional and central government agencies. Currently, the sector is focused on finding ways in which patient records can be shared amongst providers across three different regions. New Zealand's healthcare IT model combines government-contributed funding, core infrastructure, facilitation and leadership with private sector investment and skills, and is being delivered via a set of controlled experiments. The net result is a 'Middle Out' approach to healthcare automation. 'Middle Out' relies upon having a clear, well-articulated health-reform strategy and a determination by both public and private sector organisations to implement useful healthcare IT solutions by working closely together. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Issues central to a useful image understanding environment
NASA Astrophysics Data System (ADS)
Beveridge, J. Ross; Draper, Bruce A.; Hanson, Allen R.; Riseman, Edward M.
1992-04-01
A recent DARPA initiative has sparked interest in software environments for computer vision. The goal is a single environment to support both basic research and technology transfer. This paper lays out six fundamental attributes such a system must possess: (1) support for both C and Lisp, (2) extensibility, (3) data sharing, (4) data query facilities tailored to vision, (5) graphics, and (6) code sharing. The first three attributes fundamentally constrain the system design. Support for both C and Lisp demands some form of database or data-store for passing data between languages. Extensibility demands that system support facilities, such as spatial retrieval of data, be readily extended to new user-defined datatypes. Finally, data sharing demands that data saved by one user, including data of a user-defined type, must be readable by another user.
Computer-mediated mobile messaging as collaboration support for nurses.
Karpati, Peter; Toussaint, Pieter Jelle; Nytrø, Oystein
2009-01-01
Collaboration in hospitals is coordinated mainly by communication, which currently happens via face-to-face meetings, phone calls, pagers, notes, and the electronic patient record. These habits raise problems, e.g., delayed notifications and unnecessary interruptions. Dealing with these problems could save time and improve care. We therefore designed and prototyped a mobile messaging solution based on two specific scenarios drawn from observations at a cardiology department of a Norwegian hospital. The main focus was on supporting the work of nurses. One prototype supported patient management while another dealt with messages related to medication planning. The evaluation of the prototypes suggested that messaging-based collaboration support is worth exploring, and also yielded ideas for improvement.
Medhanyie, Araya Abrha; Spigt, Mark; Yebyo, Henock; Little, Alex; Tadesse, Kidane; Dinant, Geert-Jan; Blanco, Roman
2017-05-01
Mobile phone-based applications are considered by many as potentially useful for addressing challenges and improving the quality of data collection in developing countries. Yet very little evidence is available supporting or refuting the potential and widely perceived benefits of using electronic forms on smartphones for routine patient data collection by health workers at primary health care facilities. A facility-based cross-sectional study using a structured paper checklist was conducted to assess the completeness and accuracy of 408 electronic records completed and submitted to a central database server using electronic forms on smartphones by 25 health workers. The 408 electronic records were selected randomly out of a total of 1772 maternal health records submitted by the health workers to the central database over a period of six months. Descriptive frequencies and percentages of data completeness and error rates were calculated. When compared to paper records, the use of electronic forms significantly improved data completeness by 209 (8%) entries. Of a total of 2622 entries checked for completeness, 2602 (99.2%) electronic record entries were complete, while 2393 (91.3%) paper record entries were complete. A very small percentage of errors, which were easily identifiable, occurred in both electronic and paper forms, although the error rate in the electronic records was more than double that of the paper records (2.8% vs. 1.1%). More than half of the entry errors in the electronic records related to entering a text value. With minimal training, supervision, and no incentives, health care workers were able to use electronic forms for patient assessment and routine data collection appropriately and accurately, with a very small error rate. Minimising the number of questions requiring text responses in electronic forms would be helpful in minimising data errors. Copyright © 2017 Elsevier B.V. All rights reserved.
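The reported completeness figures follow directly from the counts in the abstract, as a quick check:

```python
# Reproducing the reported completeness figures from the counts
# given in the abstract above.
total_entries = 2622
electronic_complete = 2602
paper_complete = 2393

e_pct = round(100 * electronic_complete / total_entries, 1)
p_pct = round(100 * paper_complete / total_entries, 1)
print(e_pct, p_pct)                          # 99.2 91.3
print(electronic_complete - paper_complete)  # 209 more complete entries
```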
Tephrostratigraphy the DEEP site record, Lake Ohrid
NASA Astrophysics Data System (ADS)
Leicher, N.; Zanchetta, G.; Sulpizio, R.; Giaccio, B.; Wagner, B.; Francke, A.
2016-12-01
In the central Mediterranean region, tephrostratigraphy has proved to be a suitable and powerful tool for dating and correlating marine and terrestrial records. However, for periods older than 200 ka, the tephrostratigraphy is incomplete and restricted to some Italian continental basins (e.g. Sulmona, Acerno, Mercure), and continuous records downwind of the Italian volcanoes are rare. Lake Ohrid (Macedonia/Albania) in the eastern Mediterranean region meets this requirement and is assumed to be the oldest continuously existing lake in Europe. A continuous record (DEEP) was recovered within the scope of the ICDP deep-drilling campaign SCOPSCO (Scientific Collaboration on Past Speciation Conditions in Lake Ohrid). In the uppermost 450 meters of the record, covering more than 1.2 Myrs of Italian volcanism, 54 tephra layers were identified during core opening and description. A first tephrostratigraphic record was established for the uppermost 248 m (~637 ka). Major element analyses (EDS/WDS) were carried out on juvenile glass fragments, and 15 out of 35 tephra layers have been identified and correlated with known and dated eruptions of Italian volcanoes. Existing 40Ar/39Ar ages were re-calculated using the same flux standard and used as first-order tie points to develop a robust chronology for the DEEP site succession. Between 248 and 450 m of the DEEP site record, another 19 tephra horizons were identified and are the subject of ongoing work. These deposits, once correlated with known and dated tephras, should enable dating of this part of the succession, likely supported by major paleomagnetic events such as the Brunhes-Matuyama boundary or the Cobb Mountain and Jaramillo excursions. This makes the Lake Ohrid record a unique, continuous distal record of Italian volcanic activity, a candidate to become the template for central Mediterranean tephrostratigraphy, especially for the hitherto poorly known and explored lower Middle Pleistocene period.
A square root ensemble Kalman filter application to a motor-imagery brain-computer interface.
Kamrunnahar, M; Schiff, S J
2011-01-01
We investigated a non-linear sigma-point Kalman filter (SPKF) application to a motor imagery brain-computer interface (BCI). A square root central difference Kalman filter (SR-CDKF) was used as an approach for brain state estimation in motor imagery task performance, using scalp electroencephalography (EEG) signals. Healthy human subjects imagined left vs. right hand movements and tongue vs. bilateral toe movements while scalp EEG signals were recorded. Offline data analysis was conducted for training the model as well as for decoding the imagery movements. Preliminary results indicate the feasibility of this approach, with a decoding accuracy of 78%-90% for the hand movements and 70%-90% for the tongue-toes movements. Ongoing research includes online BCI applications of this approach as well as combined state and parameter estimation using this algorithm with different system dynamic models.
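The central-difference family of sigma-point filters propagates a small, deterministically chosen set of points through the system dynamics. A minimal sketch of the prediction step, assuming standard CDKF sigma points and mean weights (an illustration, not the authors' implementation; the covariance update and square-root propagation are omitted):

```python
import numpy as np

def cd_sigma_points(x, S, h=np.sqrt(3.0)):
    """Central-difference sigma points from a square-root factor S
    (P = S S^T): the mean plus points offset by +/- h along each
    column of S."""
    n = len(x)
    pts = [x]
    for i in range(n):
        pts.append(x + h * S[:, i])
        pts.append(x - h * S[:, i])
    return np.array(pts)

def cdkf_predict(x, S, f, h=np.sqrt(3.0)):
    """Propagate sigma points through dynamics f and form the weighted
    mean with CDKF weights w0 = (h^2 - n)/h^2, wi = 1/(2 h^2)."""
    n = len(x)
    pts = cd_sigma_points(x, S, h)
    prop = np.array([f(p) for p in pts])
    w0 = (h**2 - n) / h**2
    wi = 1.0 / (2 * h**2)
    return w0 * prop[0] + wi * prop[1:].sum(axis=0)

# Toy example: with linear dynamics the prediction matches A @ x exactly,
# because the +/- sigma-point offsets cancel in the weighted mean.
A = np.array([[1.0, 0.1], [0.0, 1.0]])
x = np.array([1.0, 0.5])
S = np.linalg.cholesky(np.eye(2) * 0.01)
x_pred = cdkf_predict(x, S, lambda p: A @ p)
print(x_pred)  # A @ x = [1.05, 0.5]
```

The payoff of the sigma-point construction is for non-linear `f`, where the same machinery approximates the predicted mean without linearization.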
High temporal resolution mapping of seismic noise sources using heterogeneous supercomputers
NASA Astrophysics Data System (ADS)
Gokhberg, Alexey; Ermert, Laura; Paitz, Patrick; Fichtner, Andreas
2017-04-01
Time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems. Significant interest in seismic noise source maps with high temporal resolution (days) is expected to come from a number of domains, including natural resources exploration, analysis of active earthquake fault zones and volcanoes, as well as geothermal and hydrocarbon reservoir monitoring. Currently, knowledge of noise sources is insufficient for high-resolution subsurface monitoring applications. Near-real-time seismic data, as well as advanced imaging methods to constrain seismic noise sources, have recently become available. These methods are based on the massive cross-correlation of seismic noise records from all available seismic stations in the region of interest and are therefore very computationally intensive. Heterogeneous massively parallel supercomputing systems introduced in recent years combine conventional multi-core CPUs with GPU accelerators and provide an opportunity for a manifold increase in computing performance. Therefore, these systems represent an efficient platform for implementation of a noise source mapping solution. We present the first results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service that provides seismic noise source maps for Central Europe with high temporal resolution (days to a few weeks depending on frequency and data availability). The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept in order to provide interested external researchers regular access to the noise source maps.
The solution architecture includes the following sub-systems: (1) data acquisition responsible for collecting, on a periodic basis, raw seismic records from the European seismic networks, (2) high-performance noise source mapping application responsible for generation of source maps using cross-correlation of seismic records, (3) back-end infrastructure for the coordination of various tasks and computations, (4) front-end Web interface providing the service to the end-users and (5) data repository. The noise mapping application is composed of four principal modules: (1) pre-processing of raw data, (2) massive cross-correlation, (3) post-processing of correlation data based on computation of logarithmic energy ratio and (4) generation of source maps from post-processed data. Implementation of the solution posed various challenges, in particular, selection of data sources and transfer protocols, automation and monitoring of daily data downloads, ensuring the required data processing performance, design of a general service oriented architecture for coordination of various sub-systems, and engineering an appropriate data storage solution. The present pilot version of the service implements noise source maps for Switzerland. Extension of the solution to Central Europe is planned for the next project phase.
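The computational core of the mapping application, module (2), is massive cross-correlation of noise records. A toy FFT-based sketch for a single station pair (synthetic data; a real pipeline adds spectral whitening, windowing, and long-term stacking):

```python
import numpy as np

rng = np.random.default_rng(2)

# Two hypothetical noise records: station B sees the same noise field
# as station A delayed by 25 samples, plus incoherent local noise.
n, lag_true = 2048, 25
src = rng.standard_normal(n + lag_true)
rec_a = src[lag_true:] + 0.5 * rng.standard_normal(n)
rec_b = src[:n] + 0.5 * rng.standard_normal(n)

def xcorr_fft(a, b):
    """Cross-correlation via FFT, zero-padded to avoid circular
    wrap-around; output is shifted so zero lag sits at the center."""
    nfft = 2 * len(a)
    A = np.fft.rfft(a, nfft)
    B = np.fft.rfft(b, nfft)
    cc = np.fft.irfft(A * np.conj(B), nfft)
    return np.fft.fftshift(cc)

cc = xcorr_fft(rec_a, rec_b)
lags = np.arange(-len(rec_a), len(rec_a))
print("peak lag:", lags[np.argmax(cc)])  # magnitude recovers the 25-sample delay
```

Repeating this over all station pairs and days, then post-processing the correlations (module 3's logarithmic energy ratio), is what makes the problem supercomputer-scale.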
Distributed information system (water fact sheet)
Harbaugh, A.W.
1986-01-01
During 1982-85, the Water Resources Division (WRD) of the U.S. Geological Survey (USGS) installed over 70 large minicomputers in offices across the country to support its mission in the science of hydrology. These computers are connected by a communications network that allows information to be shared among computers in each office. The computers and network together are known as the Distributed Information System (DIS). The computers are accessed through the use of more than 1500 terminals and minicomputers. The WRD has three fundamentally different needs for computing: data management, hydrologic analysis, and administration. Data management accounts for 50% of the computational workload of WRD because hydrologic data are collected in all 50 states, Puerto Rico, and the Pacific trust territories. Hydrologic analysis accounts for 40% of the computational workload of WRD. Cost accounting, payroll, personnel records, and planning for WRD programs occupy an estimated 10% of the computer workload. The DIS communications network is shown on a map. (Lantz-PTT)
Poder, Thomas G; Godbout, Sylvie T; Bellemare, Christian
This paper describes a comparative study of clinical coding by Archivists (also known as Clinical Coders in some other countries) using single and dual computer monitors. In the present context, processing a record corresponds to checking the available information; searching for the missing physician information; and finally, performing clinical coding. We collected data for each Archivist during her use of the single monitor for 40 hours and during her use of the dual monitor for 20 hours. During the experimental periods, Archivists did not perform other related duties, so we were able to measure the real-time processing of records. To control for the type of records and their impact on the process time required, we categorised the cases as major or minor, based on whether acute care or day surgery was involved. Overall results show that 1,234 records were processed using a single monitor and 647 records using a dual monitor. The time required to process a record was significantly higher (p = .071) with a single monitor compared to a dual monitor (19.83 vs. 18.73 minutes). However, the percentage of major cases was significantly higher (p = .000) in the single monitor group compared to the dual monitor group (78% vs. 69%). As a consequence, we adjusted our results, which reduced the difference in time required to process a record between the two systems from 1.1 to 0.61 minutes. Thus, the net real-time difference was only 37 seconds in favour of the dual monitor system. Extrapolated over a 5-year period, this would represent a time savings of 3.1% and generate a net cost savings of $7,729 CAD (Canadian dollars) for each workstation that devoted 35 hours per week to the processing of records.
The implementation of a dual-monitor system in a hospital archiving department is an efficient option in the context of scarce human resources and has the strong support of Archivists.
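The adjustment arithmetic reported in the abstract can be checked directly (a minimal Python sketch; the figures are the ones reported above, nothing else is assumed):

```python
# Mean processing times per record, in minutes (reported in the abstract)
single, dual = 19.83, 18.73

raw_diff = single - dual  # unadjusted difference: 1.1 minutes
adj_diff = 0.61           # difference after case-mix adjustment (reported)

# Net real-time difference in seconds, and the relative time saving
print(round(adj_diff * 60))               # → 37
print(round(100 * adj_diff / single, 1))  # → 3.1
```

Rounded, this reproduces the abstract's 37-second and 3.1% figures.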
NASA Astrophysics Data System (ADS)
Morales-Molino, César; García-Antón, Mercedes; Postigo-Mijarra, José M.; Morla, Carlos
2013-01-01
A new palaeoecological sequence from the western Iberian Central Range significantly contributes to the knowledge on the Holocene vegetation dynamics in central Iberia. This sequence supports the existence of time-transgressive changes in the vegetation cover during the beginning of the Holocene over these central Iberian mountains, specifically the replacement of boreal birch-pine forests with Mediterranean communities. Anthracological analyses also indicate the replacement of boreal pines (Pinus sylvestris) with Mediterranean ones (Pinus pinaster) during the early Holocene. The observed vegetation changes were generally synchronous with climatic phases previously reconstructed for the western Mediterranean region, and they suggest that the climatic trends were most similar to those recorded in the northern Mediterranean region and central Europe. Several cycles of secondary succession after fire ending with the recovery of mature forest have been identified, which demonstrates that the vegetation of western Iberia was highly resilient to fire disturbance. However, when the recurrence of fire crossed a certain threshold, the original forests were not able to completely recover, and shrublands and grasslands became dominant; this occurred approximately 5800-5400 cal yr BP. Afterwards, heathlands became established as the dominant vegetation, maintained by frequent and severe wildfires most likely associated with human activities in a climatic framework that was less suitable for temperate trees. Finally, our palaeoecological record provides guidelines on how to manage protected areas in Mediterranean mountains of southwestern Europe, especially regarding the conservation and restoration of temperate communities, such as birch stands, that are threatened there.
The Classification and Evaluation of Computer-Aided Software Engineering Tools
1990-09-01
International Business Machines Corporation. Customizer is a registered trademark of Index Technology Corporation. Data Analyst is a registered trademark of...years, a rapid series of new approaches has been adopted, including: information engineering, entity-relationship modeling, automatic code generation...support true information sharing among tools and automated consistency checking. Moreover, the repository must record and manage the relationships and
Modified timing module for Loran-C receiver
NASA Technical Reports Server (NTRS)
Lilley, R. W.
1983-01-01
Full hardware documentation is provided for the circuit card implementing the Loran-C timing loop, and the receiver event-mark and re-track functions. This documentation is to be combined with overall receiver drawings to form the as-built record for this device. Computer software to support this module is integrated with the remainder of the receiver software, in the LORPROM program.
AlJarullah, Asma; El-Masri, Samir
2013-08-01
The goal of a national electronic health records integration system is to aggregate electronic health records concerning a particular patient at different healthcare providers' systems to provide a complete medical history of the patient. It holds the promise to address the two most crucial challenges to the healthcare systems: improving healthcare quality and controlling costs. Typical approaches for the national integration of electronic health records are a centralized architecture and a distributed architecture. This paper proposes a new approach for the national integration of electronic health records, the semi-centralized approach, an intermediate solution between the centralized architecture and the distributed architecture that has the benefits of both approaches. The semi-centralized approach is provided with a clearly defined architecture. The main data elements needed by the system are defined and the main system modules that are necessary to achieve an effective and efficient functionality of the system are designed. Best practices and essential requirements are central to the evolution of the proposed architecture. The proposed architecture will provide the basis for designing the simplest and the most effective systems to integrate electronic health records on a nation-wide basis that maintain integrity and consistency across locations, time and systems, and that meet the challenges of interoperability, security, privacy, maintainability, mobility, availability, scalability, and load balancing.
Charre-Medellín, Juan Felipe; Monterrubio-Rico, Tiberio Cesar; Guido-Lemus, Daniel; Mendoza, Eduardo
2015-09-01
The state of Michoacán is characterized by important environmental heterogeneity in terms of climate, topography and types of vegetation, which includes the globally endangered tropical dry forest. There have been reports of the presence in this region of all six felid species occurring in Mexico; however, the evidence supporting these reports is scant, and filling this information gap is particularly critical in the case of threatened species or habitats. The aim of this study was to systematize and analyze the distribution patterns of felids in the state of Michoacán, in Central-Western Mexico. We conducted a review of literature and databases to compile species presence records in the study region. Moreover, we analyzed data obtained from ten years of field work conducted in the region, in which complementary methods (detection of direct and indirect evidence of species occurrence along transects, camera-trapping and interviews with local people) were applied to detect the presence of felid species. We compiled a total of 29 presence records of felids in the region from our review. Additionally, field work, which accumulated 1,107.5 km of walked transects and 8,699 camera-trap days, produced 672 records of species presence. Lynx rufus was the species with the lowest number of records and the most restricted distribution. In contrast, the species with the greatest number of records was Leopardus pardalis (n = 343). In general, 89% of felid records occurred below 1,000 masl. The overall mean annual temperature of presence records was 24 °C and the mean annual precipitation was 1,040 mm. The species whose presence records showed the most distinctive pattern, in terms of associated temperature and precipitation, was L. rufus (15.8 ± 1.3 °C and 941 ± 171 mm).
Results of a cluster analysis showed that areas supporting different combinations of eco-regions and types of vegetation could be grouped in five clusters having different assemblages of felid species and camera-trapping records. This study provides a more comprehensive view of the distribution patterns of felids in a region with important environmental contrasts that is subject to increasing human pressure. Moreover, it provides insights that further our understanding of the relationship between environmental variables and felid distribution patterns, which may inform conservation and management strategies at the local and regional levels.
Space Spurred Computer Graphics
NASA Technical Reports Server (NTRS)
1983-01-01
Dicomed Corporation was asked by NASA in the early 1970s to develop processing capabilities for recording images sent from Mars by Viking spacecraft. The company produced a film recorder which increased the intensity levels and the capability for color recording. This development led to a strong technology base resulting in sophisticated computer graphics equipment. Dicomed systems are used to record CAD (computer aided design) and CAM (computer aided manufacturing) equipment, to update maps and produce computer generated animation.
Analysis of bioelectric records and fabrication of prototype sleep analysis equipment
NASA Technical Reports Server (NTRS)
Kellaway, P.
1972-01-01
A computer-analysis technique was used to evaluate the changes that occurred in the waking EEGs of 5 normal subjects during the oral administration of flurazepam hydrochloride (Dalmane). While the subjects were receiving the drug, there was an increase in the amount of beta (14-38 c/sec) activity in fronto-central EEG leads in all 5 subjects. This increase in beta activity was characterized by a highly consistent increase in the number of waves that occurred during an EEG recording interval of fixed duration and by a less consistent increase in average wave amplitude. There was no detectable change in mean EEG wavelength (frequency) within the beta frequency range. The EEG patterns reverted to their baseline condition during 2-3 weeks after withdrawal of the drug. Analysis of the alpha, theta and delta components of the EEG indicated no changes during or following administration of the drug. This study clearly illustrates the usefulness of specific computer-analysis techniques in the characterization and quantification of the effects of sleep-promoting drugs upon the EEG of normal young adults in the waking state. Two preamplifiers and 150 EEG monitoring caps with electrodes were delivered to MSC.
Geoid modeling in Mexico and the collaboration with Central America and the Caribbean.
NASA Astrophysics Data System (ADS)
Avalos, D.; Gomez, R.
2012-12-01
The model of geoidal heights for Mexico, named GGM10, is presented as a geodetic tool to support vertical positioning in the context of regional height system unification. It is a purely gravimetric solution computed by the Stokes-Helmert technique at a resolution of 2.5 arc minutes. This product from the Instituto Nacional de Estadistica y Geografia (INEGI) is released together with a series of 10 gravimetric models which add to the improvements in the description of the gravity field. In recent years, INEGI joined the initiative of the U.S. National Geodetic Survey and Canada's Geodetic Survey Division to promote regional height system unification. In an effort to further improve the compatibility among national geoid models in the region, INEGI has begun to champion a network of specialists that includes national representatives from Central America and the Caribbean. Through the opening of opportunities for training and more direct access to international agreements and discussions, the tropical region is gaining participation. A significantly increased number of countries is now pushing for a future North and Central American geoid-based vertical datum in support of height system unification. Geoidal height in Mexico is mapped from the model GGM10.
A technique for transferring a patient's smile line to a cone beam computed tomography (CBCT) image.
Bidra, Avinash S
2014-08-01
Fixed implant-supported prosthodontic treatment for patients requiring a gingival prosthesis often demands that bone and implant levels be apical to the patient's maximum smile line. This is to avoid the display of the prosthesis-tissue junction (the junction between the gingival prosthesis and natural soft tissues) and prevent esthetic failures. Recording a patient's lip position during maximum smile is invaluable for the treatment planning process. This article presents a simple technique for clinically recording and transferring the patient's maximum smile line to cone beam computed tomography (CBCT) images for analysis. The technique can help clinicians accurately determine the need for and amount of bone reduction required with respect to the maximum smile line and place implants in optimal positions. Copyright © 2014 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
Electronic health records: postadoption physician satisfaction and continued use.
Wright, Edward; Marvel, Jon
2012-01-01
One goal of public-policy makers in general and health care managers in particular is the adoption and efficient utilization of electronic health record (EHR) systems throughout the health care industry. Consequently, this investigation focused on the effects of known antecedents of technology adoption on physician satisfaction with EHR technology and the continued use of such systems. The American Academy of Family Physicians provided support in the survey of 453 physicians regarding their satisfaction with their EHR use experience. A conceptual model merging technology adoption and computer user satisfaction models was tested using structural equation modeling. Results indicate that effort expectancy (ease of use) has the most substantive effect on physician satisfaction and the continued use of EHR systems. As such, health care managers should be especially sensitive to the user and computer interface of prospective EHR systems to avoid costly and disruptive system selection mistakes.
Urbano, A; Babiloni, C; Onorati, P; Babiloni, F
1998-06-01
Between-electrode cross-covariances of delta (0-3 Hz)- and theta (4-7 Hz)-filtered high resolution EEG potentials related to preparation, initiation, and execution of human unilateral internally triggered one-digit movements were computed to investigate statistical dynamic coupling between these potentials. Significant (P < 0.05, Bonferroni-corrected) cross-covariances were calculated between electrodes of lateral and median scalp regions. For both delta- and theta-bandpassed potentials, covariance modeling indicated a shifting functional coupling between contralateral and ipsilateral frontal-central-parietal scalp regions and between these two regions and the median frontal-central scalp region from the preparation to the execution of the movement (P < 0.05). A maximum inward functional coupling of the contralateral with the ipsilateral frontal-central-parietal scalp region was modeled during the preparation and initiation of the movement, and a maximum outward functional coupling during the movement execution. Furthermore, for theta-bandpassed potentials, rapidly oscillating inward and outward relationships were modeled between the contralateral frontal-central-parietal scalp region and the median frontal-central scalp region across the preparation, initiation, and execution of the movement. We speculate that these cross-covariance relationships might reflect an oscillating dynamic functional coupling of primary sensorimotor and supplementary motor areas during the planning, starting, and performance of unilateral movement. The involvement of these cortical areas is supported by the observation that averaged spatially enhanced delta- and theta-bandpassed potentials were computed from the scalp regions where task-related electrical activation of primary sensorimotor areas and supplementary motor area was roughly represented.
Meeker, Daniella; Jiang, Xiaoqian; Matheny, Michael E; Farcas, Claudiu; D'Arcy, Michel; Pearlman, Laura; Nookala, Lavanya; Day, Michele E; Kim, Katherine K; Kim, Hyeoneui; Boxwala, Aziz; El-Kareh, Robert; Kuo, Grace M; Resnic, Frederic S; Kesselman, Carl; Ohno-Machado, Lucila
2015-11-01
Centralized and federated models for sharing data in research networks currently exist. To build multivariate data analysis for centralized networks, transfer of patient-level data to a central computation resource is necessary. The authors implemented distributed multivariate models for federated networks in which patient-level data is kept at each site and data exchange policies are managed in a study-centric manner. The objective was to implement infrastructure that supports the functionality of some existing research networks (e.g., cohort discovery, workflow management, and estimation of multivariate analytic models on centralized data) while adding additional important new features, such as algorithms for distributed iterative multivariate models, a graphical interface for multivariate model specification, synchronous and asynchronous response to network queries, investigator-initiated studies, and study-based control of staff, protocols, and data sharing policies. Based on the requirements gathered from statisticians, administrators, and investigators from multiple institutions, the authors developed infrastructure and tools to support multisite comparative effectiveness studies using web services for multivariate statistical estimation in the SCANNER federated network. The authors implemented massively parallel (map-reduce) computation methods and a new policy management system to enable each study initiated by network participants to define the ways in which data may be processed, managed, queried, and shared. The authors illustrated the use of these systems among institutions with highly different policies and operating under different state laws. Federated research networks need not limit distributed query functionality to count queries, cohort discovery, or independently estimated analytic models. 
Multivariate analyses can be efficiently and securely conducted without patient-level data transport, allowing institutions with strict local data storage requirements to participate in sophisticated analyses based on federated research networks. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
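The distributed iterative estimation described above can be illustrated with a toy sketch (Python; this is an illustration of the general federated-gradient idea, not the SCANNER implementation, and all data and names here are invented):

```python
import math

# Toy federated estimation step: each site computes a gradient on its own
# patient-level rows and shares only the summed gradient with a coordinator.

def local_gradient(X, y, w):
    """Logistic-regression gradient computed entirely at one site."""
    g = [0.0] * len(w)
    for xi, yi in zip(X, y):
        z = sum(wj * xj for wj, xj in zip(w, xi))
        p = 1.0 / (1.0 + math.exp(-z))
        for j, xj in enumerate(xi):
            g[j] += (p - yi) * xj
    return g

def federated_step(sites, w, lr=0.5):
    """Coordinator sums site-level gradients; no rows leave any site."""
    total = [0.0] * len(w)
    n = 0
    for X, y in sites:
        g = local_gradient(X, y, w)
        total = [t + gi for t, gi in zip(total, g)]
        n += len(y)
    return [wj - lr * tj / n for wj, tj in zip(w, total)]

# Two hypothetical "sites", each holding its own rows: [intercept, feature]
site_a = ([[1.0, 0.0], [1.0, 1.0]], [0, 1])
site_b = ([[1.0, 0.2], [1.0, 0.9]], [0, 1])

w = [0.0, 0.0]
for _ in range(500):
    w = federated_step([site_a, site_b], w)
# The pooled model separates the classes by the second feature (w[1] > 0)
```

The iteration converges to the same estimate that pooling all rows centrally would give, which is the point of the federated design.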
Nonbreeding duck use at Central Flyway National Wildlife Refuges
Andersson, Kent; Davis, Craig A.; Harris, Grant; Haukos, David A.
2018-01-01
Within the U.S. portion of the Central Flyway, the U.S. Fish and Wildlife Service manages waterfowl on numerous individual units (i.e., Refuges) within the National Wildlife Refuge System. Presently, the extent of waterfowl use that Refuges receive and the contribution of Refuges to waterfowl populations (i.e., the proportion of the Central Flyway population registered at each Refuge) remain unassessed. Such an evaluation would help determine to what extent Refuges support waterfowl relative to stated targets, aid in identifying species requiring management attention, inform management targets, and improve fiscal efficiencies. Using historic monitoring data (1954–2008), we performed this assessment for 23 Refuges in Texas, New Mexico, Oklahoma, Kansas, and Nebraska during migration and wintering months (October–March). We examined six dabbling ducks and two diving ducks, plus all dabbling ducks and all diving ducks across two periods (long-term [all data] and short-term [last 10 October–March periods]). Individual Refuge use was represented by the sum of monthly duck count averages for October–March. We used two indices of Refuge contribution: peak contribution and January contribution. Peak contribution was the highest monthly count average for each October–March period divided by the indexed population total for the Central Flyway in the corresponding year; January contribution used the January count average divided by the corresponding population index. Generally, Refuges in Kansas, Nebraska, and New Mexico recorded most use and contribution for mallards Anas platyrhynchos. Refuges along the Texas Gulf Coast recorded most use and contribution for other dabbling ducks, with Laguna Atascosa and Aransas (including Matagorda Island) recording most use for diving ducks. 
The long-term total January contribution of the assessed Refuges to ducks wintering in the Central Flyway was greatest for green-winged teal Anas crecca with 35%; 12–15% for American wigeon Mareca americana, gadwall Mareca strepera, and northern pintail Anas acuta; and 7–8% for mallard and mottled duck Anas fulvigula. Results indicated that the reliance on the National Wildlife Refuge System decreased for these ducks, with evidence suggesting that, for several species, the assessed Refuges may be operating at carrying capacity. Future analyses could be more detailed and informative were Refuges to implement a single consistent survey methodology that incorporated estimations of detection bias in the survey process, while concomitantly recording habitat metrics on and neighboring each Refuge.
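The use and contribution indices defined in the abstract are simple ratios, sketched below (Python; the counts and population index are invented for illustration, only the index definitions come from the abstract):

```python
# Hypothetical monthly duck count averages at one Refuge (October-March)
monthly_avgs = {"Oct": 1200, "Nov": 5400, "Dec": 8100, "Jan": 7600,
                "Feb": 4300, "Mar": 900}
flyway_index = 250000  # hypothetical Central Flyway population index

# Refuge use: sum of monthly count averages for October-March
refuge_use = sum(monthly_avgs.values())

# Peak contribution: highest monthly average / flyway population index
peak_contribution = max(monthly_avgs.values()) / flyway_index

# January contribution: January average / flyway population index
january_contribution = monthly_avgs["Jan"] / flyway_index
```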
Nonbreeding Duck Use and Management Contribution Trends for Central Flyway Refuges
Andersson, Kent; Davis, Craig A.; Harris, Grant; Haukos, David A.
2018-01-01
Schroeder, Dixie; Schwei, Kelsey; Chyou, Po-Huang
2017-01-01
This study sought to re-characterize trends and factors affecting electronic dental record (EDR) and technologies adoption by dental practices and the impact of the Health Information Technology for Economic and Clinical Health (HITECH) act on adoption rates through 2012. A 39-question survey was disseminated nationally over 3 months using a novel, statistically-modeled approach informed by early response rates to achieve a predetermined sample. EDR adoption rate for clinical support was 52%. Adoption rates were higher among: (1) younger dentists; (2) dentists ≤ 15 years in practice; (3) females; and (4) group practices. Top barriers to adoption were EDR cost/expense, cost-benefit ratio, electronic format conversion, and poor EDR usability. Awareness of the Federal HITECH incentive program was low. The rate of chairside computer implementation was 72%. Adoption of EDR in dental offices in the United States was higher in 2012 than electronic health record adoption rates in medical offices and was not driven by the HITECH program. Patient portal adoption among dental practices in the United States remained low. PMID:29229631
Martin, Carol Lynn; Ruble, Diane N; Szkrybalo, Joel
2004-09-01
Most of the critique in the A. Bandura and K. Bussey (see record 2004-18097-001) commentary is a misunderstanding or misrepresentation of the points made by C. L. Martin, D. N. Ruble, and J. Szkrybalo in their 2002 Psychological Bulletin article (see record 2002-18663-003). First, Martin et al. never intended to present a comprehensive theory; instead, it was a review of 2 different cognitive approaches to gender development. Second, there is no time line test that has been failed; instead, gender cognitions may occur earlier than initially believed. Third, Bandura and Bussey dismissed central gender cognitions (gender identity and gender stereotype knowledge) despite considerable evidence in their support. Fourth, Bandura and Bussey never addressed the gaps and ambiguities inherent in their theory that Martin et al. questioned in their earlier article. Finally, Bandura and Bussey's misunderstandings of cognitive theorists' views on socialization agents, sociocultural influences, agency, and motivation created theoretical rifts where none exist. ((c) 2004 APA, all rights reserved)
Techniques for animation of CFD results. [computational fluid dynamics
NASA Technical Reports Server (NTRS)
Horowitz, Jay; Hanson, Jeffery C.
1992-01-01
Video animation is becoming increasingly vital to the computational fluid dynamics researcher, not just for presentation, but for recording and comparing dynamic visualizations that are beyond the current capabilities of even the most powerful graphic workstation. To meet these needs, Lewis Research Center has recently established a facility to provide users with easy access to advanced video animation capabilities. However, producing animation that is both visually effective and scientifically accurate involves various technological and aesthetic considerations that must be understood both by the researcher and those supporting the visualization process. These considerations include: scan conversion, color conversion, and spatial ambiguities.
FastMag: Fast micromagnetic simulator for complex magnetic structures (invited)
NASA Astrophysics Data System (ADS)
Chang, R.; Li, S.; Lubarda, M. V.; Livshitz, B.; Lomakin, V.
2011-04-01
A fast micromagnetic simulator (FastMag) for general problems is presented. FastMag solves the Landau-Lifshitz-Gilbert equation and can handle multiscale problems with a high computational efficiency. The simulator derives its high performance from efficient methods for evaluating the effective field and from implementations on massively parallel graphics processing unit (GPU) architectures. FastMag discretizes the computational domain into tetrahedral elements and therefore is highly flexible for general problems. The magnetostatic field is computed via the superposition principle for both volume and surface parts of the computational domain. This is accomplished by implementing efficient quadrature rules and analytical integration for overlapping elements in which the integral kernel is singular. Thus, discretized superposition integrals are computed using a nonuniform grid interpolation method, which evaluates the field from N sources at N collocated observers in O(N) operations. This approach allows handling objects of arbitrary shape, allows easy calculation of the field outside the magnetized domains, does not require solving a linear system of equations, and requires little memory. FastMag is implemented on GPUs, with GPU-to-central-processing-unit speed-ups of 2 orders of magnitude. Simulations are shown of a large array of magnetic dots and a recording head fully discretized down to the exchange length, with over a hundred million tetrahedral elements on an inexpensive desktop computer.
Computer-Mediated Social Support for Physical Activity: A Content Analysis.
Stragier, Jeroen; Mechant, Peter; De Marez, Lieven; Cardon, Greet
2018-02-01
Online fitness communities are a recent phenomenon experiencing growing user bases. They can be considered as online social networks in which recording, monitoring, and sharing of physical activity (PA) are the most prevalent practices. They have added a new dimension to the social experience of PA in which online peers function as virtual PA partners or supporters. However, research into seeking and receiving computer-mediated social support for PA is scarce. Our aim was to study to what extent using online fitness communities and sharing physical activities with online social networks results in receiving various types of online social support. Two databases, one containing physical activities logged with Strava and one containing physical activities logged with RunKeeper and shared on Twitter, were investigated for occurrence and type of social support, by means of a deductive content analysis. Results indicate that social support delivered through Twitter is not particularly extensive. On Strava, social support is significantly more prevalent. Especially esteem support, expressed as compliments for the accomplishment of an activity, is provided on both Strava and Twitter. The results demonstrate that social media have potential as a platform used for providing social support for PA, but differences among various social network sites can be substantial. Especially esteem support can be expected, in contrast to online health communities, where information support is more common.
An overview of the NASA electronic components information management system
NASA Technical Reports Server (NTRS)
Kramer, G.; Waterbury, S.
1991-01-01
The NASA Parts Project Office (NPPO) comprehensive data system to support all NASA Electric, Electronic, and Electromechanical (EEE) parts management and technical data requirements is described. A phase delivery approach is adopted, comprising four principal phases. Phases 1 and 2 support Space Station Freedom (SSF) and use a centralized architecture with all data and processing kept on a mainframe computer. Phases 3 and 4 support all NASA centers and projects and implement a distributed system architecture, in which data and processing are shared among networked database servers. The Phase 1 system, which became operational in February of 1990, implements a core set of functions. Phase 2, scheduled for release in 1991, adds functions to the Phase 1 system. Phase 3, to be prototyped beginning in 1991 and delivered in 1992, introduces a distributed system, separate from the Phase 1 and 2 system, with a refined semantic data model. Phase 4 extends the data model and functionality of the Phase 3 system to provide support for the NASA design community, including integration with Computer Aided Design (CAD) environments. Phase 4 is scheduled for prototyping in 1992 to 93 and delivery in 1994.
Seismpol: a Visual Basic computer program for interactive and automatic earthquake waveform analysis
NASA Astrophysics Data System (ADS)
Patanè, Domenico; Ferrari, Ferruccio
1997-11-01
A Microsoft Visual Basic computer program for waveform analysis of seismic signals is presented. The program combines interactive and automatic processing of digital signals using data recorded by three-component seismic stations. The analysis procedure can be used in either an interactive earthquake analysis or an automatic on-line processing of seismic recordings. The algorithm works in the time domain using the Covariance Matrix Decomposition method (CMD), so that polarization characteristics may be computed continuously in real time and seismic phases can be identified and discriminated. Visual inspection of the particle motion in orthogonal planes of projection (hodograms) reduces the danger of misinterpretation derived from the application of the polarization filter. The choice of time window and frequency intervals improves the quality of the extracted polarization information. In fact, the program uses a band-pass Butterworth filter to process the signals in the frequency domain by analyzing a selected signal window in a series of narrow frequency bands. Significant results supported by well-defined polarizations and source azimuth estimates for P and S phases are also obtained for short-period seismic events (local microearthquakes).
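The core of the covariance-matrix approach to polarization analysis can be sketched in a few lines (an illustrative Python reimplementation of the general technique, not the program's actual code): form the 3x3 covariance matrix of a three-component window, then take its dominant eigenvector as the polarization direction.

```python
import math

def covariance3(z, n, e):
    """3x3 covariance matrix of a three-component (Z, N, E) window."""
    comps = [z, n, e]
    means = [sum(c) / len(c) for c in comps]
    m = len(z)
    return [[sum((comps[i][k] - means[i]) * (comps[j][k] - means[j])
                 for k in range(m)) / m
             for j in range(3)] for i in range(3)]

def dominant_direction(C, iters=100):
    """Power iteration for the eigenvector of the largest eigenvalue."""
    v = [1.0, 1.0, 1.0]
    for _ in range(iters):
        w = [sum(C[i][j] * v[j] for j in range(3)) for i in range(3)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

# Synthetic, rectilinearly polarized window: motion mostly along N
z = [0.1 * math.sin(0.3 * k) for k in range(200)]
n = [1.0 * math.sin(0.3 * k) for k in range(200)]
e = [0.2 * math.sin(0.3 * k) for k in range(200)]
v = dominant_direction(covariance3(z, n, e))
# v points mostly along the N component, recovering the polarization
```

The ratio of the largest eigenvalue to the others gives a rectilinearity measure, and the eigenvector's horizontal components give a source azimuth estimate; both are the quantities this class of program tracks.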
Office workers' computer use patterns are associated with workplace stressors.
Eijckelhof, Belinda H W; Huysmans, Maaike A; Blatter, Birgitte M; Leider, Priscilla C; Johnson, Peter W; van Dieën, Jaap H; Dennerlein, Jack T; van der Beek, Allard J
2014-11-01
This field study examined associations between workplace stressors and office workers' computer use patterns. We collected keyboard and mouse activities of 93 office workers (68F, 25M) for approximately two work weeks. Linear regression analyses examined the associations between self-reported effort, reward, overcommitment, and perceived stress and software-recorded computer use duration, number of short and long computer breaks, and pace of input device usage. Daily duration of computer use was, on average, 30 min longer for workers with high compared to low levels of overcommitment and perceived stress. The number of short computer breaks (30 s-5 min long) was approximately 20% lower for those with high compared to low effort and for those with low compared to high reward. These outcomes support the hypothesis that office workers' computer use patterns vary across individuals with different levels of workplace stressors. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.
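The study's "short computer break" measure (a 30 s to 5 min pause in input activity) can be sketched as a simple gap count over timestamped keyboard/mouse events. This is an illustrative assumption about the operationalization, not the authors' actual software; the function name and event representation are invented.

```python
from datetime import datetime, timedelta

def count_short_breaks(event_times, lo_s=30.0, hi_s=300.0):
    """Count gaps between consecutive input events that fall in the
    short-break range (30 s <= gap < 5 min, per the study's definition).
    event_times: chronologically sorted list of datetime objects."""
    count = 0
    for a, b in zip(event_times, event_times[1:]):
        gap = (b - a).total_seconds()
        if lo_s <= gap < hi_s:
            count += 1
    return count
```

Longer breaks would be counted the same way with a higher threshold range.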
Brain-computer interface technology: a review of the Second International Meeting.
Vaughan, Theresa M; Heetderks, William J; Trejo, Leonard J; Rymer, William Z; Weinrich, Michael; Moore, Melody M; Kübler, Andrea; Dobkin, Bruce H; Birbaumer, Niels; Donchin, Emanuel; Wolpaw, Elizabeth Winter; Wolpaw, Jonathan R
2003-06-01
This paper summarizes the Brain-Computer Interfaces for Communication and Control, The Second International Meeting, held in Rensselaerville, NY, in June 2002. Sponsored by the National Institutes of Health and organized by the Wadsworth Center of the New York State Department of Health, the meeting addressed current work and future plans in brain-computer interface (BCI) research. Ninety-two researchers representing 38 different research groups from the United States, Canada, Europe, and China participated. The BCIs discussed at the meeting use electroencephalographic activity recorded from the scalp or single-neuron activity recorded within cortex to control cursor movement, select letters or icons, or operate neuroprostheses. The central element in each BCI is a translation algorithm that converts electrophysiological input from the user into output that controls external devices. BCI operation depends on effective interaction between two adaptive controllers, the user who encodes his or her commands in the electrophysiological input provided to the BCI, and the BCI that recognizes the commands contained in the input and expresses them in device control. Current BCIs have maximum information transfer rates of up to 25 b/min. Achievement of greater speed and accuracy requires improvements in signal acquisition and processing, in translation algorithms, and in user training. These improvements depend on interdisciplinary cooperation among neuroscientists, engineers, computer programmers, psychologists, and rehabilitation specialists, and on adoption and widespread application of objective criteria for evaluating alternative methods. The practical use of BCI technology will be determined by the development of appropriate applications and identification of appropriate user groups, and will require careful attention to the needs and desires of individual users.
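Information transfer rates like the 25 b/min cited above are conventionally computed in the BCI literature with the Wolpaw formula (the abstract does not state which formula was used, so this is an assumption): bits per selection B = log2 N + P log2 P + (1 - P) log2((1 - P)/(N - 1)), scaled by the selection rate.

```python
from math import log2

def itr_bits_per_min(n_targets, accuracy, selections_per_min):
    """Wolpaw information transfer rate for an N-target selection task.
    accuracy: probability of a correct selection (0 < accuracy <= 1)."""
    n, p = n_targets, accuracy
    if p >= 1.0:
        bits = log2(n)                 # perfect accuracy: log2(N) bits/selection
    else:
        bits = log2(n) + p * log2(p) + (1 - p) * log2((1 - p) / (n - 1))
    return bits * selections_per_min
```

For example, a perfectly accurate binary selection made 25 times per minute yields 25 b/min; any accuracy below 100% reduces the rate.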
Augmenting SCA project management and automation framework
NASA Astrophysics Data System (ADS)
Iyapparaja, M.; Sharma, Bhanupriya
2017-11-01
In daily life we need to keep records of things in order to manage them more efficiently. Our company manufactures semiconductor chips and sells them to buyers. Sometimes it manufactures the entire product, sometimes only part of it, and sometimes it sells an intermediate product obtained during manufacturing, so better management of the entire process requires keeping a track record of every entity involved. Materials and Methods: A framework was therefore developed for project maintenance and for automation testing. The project management framework provides an architecture that supports managing the project by maintaining records of all requirements, the test cases created for testing each unit of the software, and defects raised in past years; through this, the quality of the project can be maintained. Results: The automation framework provides an architecture that supports the development and implementation of automated test scripts for the software testing process. Conclusion: To implement the project management framework, HP Application Lifecycle Management is used, which provides a central repository for maintaining the project.
2006-01-01
unbiased genetic distances and characterized using the unweighted pair group method with arithmetic means (UPGMA). ... Table 1. Gene frequencies for... 0.428, and between An. halophylus and the morphological variant was 0.145, supporting the observations seen with allozymes. A UPGMA dendrogram based... and R. C. Wilkerson. 2005. Anopheles triannulatus (Neiva and Pinto): a new Anopheles record. Fig. 2. UPGMA dendrogram constructed based on RAPD
The Space-Time Scales of Variability in Oceanic Thermal Structure Off the Central California Coast.
1983-12-01
SST and sea-surface salinity (SSS) boundaries extracted from the shipboard (2 m) thermosalinograph (T/S) records (Figs. 23, 24, and 25). For these... extracted for comparison. At 175 m the density gradient is sufficient to support vigorous internal wave activity in this region. As a result, the predominant... VB2 (VB squared) profiles were calculated from density profiles taken from each phase at a common location (Fig. 149). The location is approximately
9. NORTHEAST FROM SOUTH ENTRANCE ACROSS RECEIVING AREA OF FACTORY ...
9. NORTHEAST FROM SOUTH ENTRANCE ACROSS RECEIVING AREA OF FACTORY PAST THE GLASS-ENCLOSED OFFICE TOWARD SHOP AREA. BESIDE THE VERTICAL POST ROOF SUPPORT IN THE LEFT FOREGROUND IS A SCALE AND DRAFTING TABLE. BESIDE THE OFFICE WALL ON THE RIGHT IS A SMALL SHOP REPAIR BENCH, WHILE ABOVE THE OFFICE WINDOWS ARE BOXES OF COMPANY MANUSCRIPT BUSINESS RECORDS. THE WELDED METAL PIPE RACK IS A MODERN INTRUSION. - Kregel Windmill Company Factory, 1416 Central Avenue, Nebraska City, Otoe County, NE
Does Cosmological Scale Expansion Explain the Universe?
NASA Astrophysics Data System (ADS)
Masreliez, C. J.
2009-12-01
The idea of the creation of the world has been central in Western civilization since the earliest recorded history some 6000 years ago and it still prevails, supported by religious dogma. If the creation idea is wrong and the universe is eternal we might wonder why science has not yet revealed this fundamental truth. To understand why, we have to review how the Big Bang theory came to be the dominant cosmological paradigm in spite of many clear indications that the theory might be fundamentally flawed.
Wujec, Monika; Kędzierska, Ewa; Kuśmierz, Edyta; Plech, Tomasz; Wróbel, Andrzej; Paneth, Agata; Orzelska, Jolanta; Fidecka, Sylwia; Paneth, Piotr
2014-04-16
This article describes the synthesis of six 4-aryl-(thio)semicarbazides (series a and b) linked with a diphenylacetyl moiety, along with their pharmacological evaluation on the central nervous system in mice and computational studies, including conformational analysis and electrostatic properties. All thiosemicarbazides (series b) were found to exhibit strong antinociceptive activity in the behavioural model. Among them, compound 1-diphenylacetyl-4-(4-methylphenyl)thiosemicarbazide 1b was found to be the most potent analgesic agent, whose activity is connected with the opioid system. For compounds from series a, a significant anti-serotonergic effect was observed, especially for compound 1-diphenylacetyl-4-(4-methoxyphenyl)semicarbazide 2b. The computational studies strongly support the obtained results.
Data Understanding Applied to Optimization
NASA Technical Reports Server (NTRS)
Buntine, Wray; Shilman, Michael
1998-01-01
The goal of this research is to explore and develop software for supporting visualization and data analysis of search and optimization. Optimization is an ever-present problem in science. The theory of NP-completeness implies that the problems can only be resolved by increasingly smarter problem specific knowledge, possibly for use in some general purpose algorithms. Visualization and data analysis offers an opportunity to accelerate our understanding of key computational bottlenecks in optimization and to automatically tune aspects of the computation for specific problems. We will prototype systems to demonstrate how data understanding can be successfully applied to problems characteristic of NASA's key science optimization tasks, such as central tasks for parallel processing, spacecraft scheduling, and data transmission from a remote satellite.
Rotich, Joseph K; Hannan, Terry J; Smith, Faye E; Bii, John; Odero, Wilson W; Vu, Nguyen; Mamlin, Burke W; Mamlin, Joseph J; Einterz, Robert M; Tierney, William M
2003-01-01
The authors implemented an electronic medical record system in a rural Kenyan health center. Visit data are recorded on a paper encounter form, eliminating duplicate documentation in multiple clinic logbooks. Data are entered into an MS-Access database supported by redundant power systems. The system was initiated in February 2001, and 10,000 visit records were entered for 6,190 patients in six months. The authors present a summary of the clinics visited, diagnoses made, drugs prescribed, and tests performed. After system implementation, patient visits were 22% shorter. They spent 58% less time with providers (p < 0.001) and 38% less time waiting (p = 0.06). Clinic personnel spent 50% less time interacting with patients, two thirds less time interacting with each other, and more time in personal activities. This simple electronic medical record system has bridged the "digital divide." Financial and technical sustainability by Kenyans will be key to its future use and development.
Xiao, Bo; Imel, Zac E; Georgiou, Panayiotis G; Atkins, David C; Narayanan, Shrikanth S
2015-01-01
The technology for evaluating patient-provider interactions in psychotherapy-observational coding-has not changed in 70 years. It is labor-intensive, error prone, and expensive, limiting its use in evaluating psychotherapy in the real world. Engineering solutions from speech and language processing provide new methods for the automatic evaluation of provider ratings from session recordings. The primary data are 200 Motivational Interviewing (MI) sessions from a study on MI training methods with observer ratings of counselor empathy. Automatic Speech Recognition (ASR) was used to transcribe sessions, and the resulting words were used in a text-based predictive model of empathy. Two supporting datasets trained the speech processing tasks including ASR (1200 transcripts from heterogeneous psychotherapy sessions and 153 transcripts and session recordings from 5 MI clinical trials). The accuracy of computationally-derived empathy ratings were evaluated against human ratings for each provider. Computationally-derived empathy scores and classifications (high vs. low) were highly accurate against human-based codes and classifications, with a correlation of 0.65 and F-score (a weighted average of sensitivity and specificity) of 0.86, respectively. Empathy prediction using human transcription as input (as opposed to ASR) resulted in a slight increase in prediction accuracies, suggesting that the fully automatic system with ASR is relatively robust. Using speech and language processing methods, it is possible to generate accurate predictions of provider performance in psychotherapy from audio recordings alone. This technology can support large-scale evaluation of psychotherapy for dissemination and process studies.