Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-05
...] Dipping and Coating Operations (Dip Tanks) Standard; Extension of the Office of Management and Budget's... Standard on Dipping and Coating Operations (Dip Tanks) (29 CFR 1910.126(g)(4)). DATES: Comments must be... of efforts in obtaining information (29 U.S.C. 657). The Standard on Dipping and Coating Operations...
Through a Regional Applied Research Effort grant to the United States Geological Survey, Region 9 collaborated with ORD on this project to develop a standard operating procedure for collection of water and sediment samples for pyrethroid analysis.
Solar PV O&M Standards and Best Practices – Existing Gaps and Improvement Efforts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klise, Geoffrey Taylor; Balfour, John R.; Keating, T. J.
2014-11-01
As greater numbers of photovoltaic (PV) systems are being installed, operations & maintenance (O&M) activities will need to be performed to ensure the PV system is operating as designed over its useful lifetime. To mitigate risks to PV system availability and performance, standardized procedures for O&M activities are needed to ensure high reliability and long-term system bankability. Efforts are just getting underway to address the need for standard O&M procedures as PV gains a larger share of U.S. generation capacity. Due to the existing landscape of how and where PV is installed, including distributed generation from small and medium PV systems, as well as large, centralized utility-scale PV, O&M activities will require different levels of expertise and reporting, making standards even more important. This report summarizes recent efforts made by solar industry stakeholders to identify the existing standards and best practices applied to solar PV O&M activities, and to determine the gaps that have yet to be, or are currently being, addressed by industry.
Modular Software for Spacecraft Navigation Using the Global Positioning System (GPS)
NASA Technical Reports Server (NTRS)
Truong, S. H.; Hartman, K. R.; Weidow, D. A.; Berry, D. L.; Oza, D. H.; Long, A. C.; Joyce, E.; Steger, W. L.
1996-01-01
The Goddard Space Flight Center Flight Dynamics and Mission Operations Divisions have jointly investigated the feasibility of engineering modular Global Positioning System (GPS) navigation software to support both real-time flight and ground postprocessing configurations. The goals of this effort are to define standard GPS data interfaces and to engineer standard, reusable navigation software components that can be used to build a broad range of GPS navigation support applications. The paper discusses the GPS modular software (GMOD) system and operations concepts, major requirements, candidate software architecture, feasibility assessment, and recommended software interface standards. In addition, ongoing efforts to broaden the scope of the initial study and to develop modular software to support autonomous navigation using GPS are addressed.
Improving operating room safety
2009-01-01
Despite the introduction of the Universal Protocol, patient safety in surgery remains a daily challenge in the operating room. This study describes one community health system's efforts to improve operating room safety through human factors training and, ultimately, the development of a surgical checklist. Using a combination of formal training, local studies documenting operating room safety issues, and peer-to-peer mentoring, we were able to substantially change the culture of our operating room. Our efforts have prepared us for successfully implementing a standardized checklist to improve operating room safety throughout our entire system. Based on these findings we recommend a multimodal approach to improving operating room safety. PMID:19930577
14 CFR 25.683 - Operation tests.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 14 Aeronautics and Space 1 2013-01-01 2013-01-01 false Operation tests. 25.683 Section 25.683... STANDARDS: TRANSPORT CATEGORY AIRPLANES Design and Construction Control Systems § 25.683 Operation tests. It must be shown by operation tests that when portions of the control system subject to pilot effort loads...
14 CFR 25.683 - Operation tests.
Code of Federal Regulations, 2010 CFR
2010-01-01
... STANDARDS: TRANSPORT CATEGORY AIRPLANES Design and Construction Control Systems § 25.683 Operation tests. It must be shown by operation tests that when portions of the control system subject to pilot effort loads... control system are loaded to the maximum load expected in normal operation, the system is free from— (a...
14 CFR 25.683 - Operation tests.
Code of Federal Regulations, 2011 CFR
2011-01-01
... STANDARDS: TRANSPORT CATEGORY AIRPLANES Design and Construction Control Systems § 25.683 Operation tests. It must be shown by operation tests that when portions of the control system subject to pilot effort loads... control system are loaded to the maximum load expected in normal operation, the system is free from— (a...
14 CFR 25.683 - Operation tests.
Code of Federal Regulations, 2014 CFR
2014-01-01
... STANDARDS: TRANSPORT CATEGORY AIRPLANES Design and Construction Control Systems § 25.683 Operation tests. It must be shown by operation tests that when portions of the control system subject to pilot effort loads... control system are loaded to the maximum load expected in normal operation, the system is free from— (a...
14 CFR 25.683 - Operation tests.
Code of Federal Regulations, 2012 CFR
2012-01-01
... STANDARDS: TRANSPORT CATEGORY AIRPLANES Design and Construction Control Systems § 25.683 Operation tests. It must be shown by operation tests that when portions of the control system subject to pilot effort loads... control system are loaded to the maximum load expected in normal operation, the system is free from— (a...
Ground System Harmonization Efforts at NASA's Goddard Space Flight Center
NASA Technical Reports Server (NTRS)
Smith, Dan
2011-01-01
This slide presentation reviews the efforts made at Goddard Space Flight Center in harmonizing its ground systems to assist in collaboration in space ventures. The key elements of this effort are: (1) moving to a common framework, (2) use of Consultative Committee for Space Data Systems (CCSDS) standards, (3) collaboration across NASA centers, and (4) collaboration across industry and other space organizations. These efforts work to bring the GSFC systems into harmony with CCSDS standards to allow for common software, use of commercial off-the-shelf software, and low-risk development and operations, and also to work toward harmonization with other NASA centers.
Development of consistent hazard controls for DOE transuranic waste operations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woody, W.J.
2007-07-01
This paper describes the results of a re-engineering initiative undertaken with the Department of Energy's (DOE) Office of Environmental Management (EM) in order to standardize hazard analysis assumptions and methods, and the resulting safety controls, applied to multiple transuranic (TRU) waste operations located across the United States. A wide range of safety controls are historically applied to transuranic waste operations, in spite of the fact that these operations have similar operational characteristics and hazard/accident potential. The re-engineering effort supported the development of a DOE technical standard with specific safety controls designated for accidents postulated during waste container retrieval, staging/storage, venting, onsite movements, and characterization activities. Controls cover preventive and mitigative measures; include both hardware and specific administrative controls; and provide protection to the facility worker, onsite co-located workers, and the general public located outside of facility boundaries. The Standard development involved participation from all major DOE sites conducting TRU waste operations. Both safety analysts and operations personnel contributed to the re-engineering effort. Acknowledgment is given in particular to the following individuals who formed a core working group: Brenda Hawks (DOE Oak Ridge Office), Patrice McEahern (CWI-Idaho), Jofu Mishima (Consultant), Louis Restrepo (Omicron), Jay Mullis (DOE-ORO), Mike Hitchler (WSMS), John Menna (WSMS), Jackie East (WSMS), Terry Foppe (CTAC), Carla Mewhinney (WIPP-SNL), Stephie Jennings (WIPP-LANL), Michael Mikolanis (DOE-SRS), Kraig Wendt (BBWI-Idaho), Lee Roberts (Fluor Hanford), and Jim Blankenhorn (WSRC). Additional acknowledgment is given to Dae Chung (EM) and Ines Triay (EM) for leadership and management of the re-engineering effort. (authors)
Explosive detection systems data collection final report
2016-10-01
Institute of Standards and Technology (NIST) project to develop standards for bomb squad operators. Under this effort, ARA was tasked with developing and... Explosive Detection Systems Data Collection (EDSDC)... DHS/NIST Support (Bomb Squad Robotic Training Standards Development)... Figure 3. Layout of the Bomb Squad Test...
Current Issues in the Design and Information Content of Instrument Approach Charts
DOT National Transportation Integrated Search
1995-03-01
This report documents an analysis and interview effort conducted to identify common operational errors made using current Instrument Approach Plates (IAP), Standard Terminal Arrival Route (STAR) charts, and Standard Instrument Departure (SID) charts,...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1979-05-17
The U.S. Materials Transportation Bureau (MTB) withdraws an advanced notice of proposed rulemaking (ANPR) which requested advice, recommendations, and information relating to the issuance of additional occupational safety and health standards for the protection of employees engaged in the construction, operation, and maintenance of pipeline systems and facilities used in the transportation of hazardous materials. Comments submitted in response to the ANPR indicated that the issuance of additional occupational safety and health standards by the MTB would be a duplication of the U.S. Occupational Safety and Health Administration's efforts and would increase the possibility of jurisdictional disputes. Since the MTB's present standards development efforts are primarily directed at public safety (as opposed to occupational safety) by regulating pipeline design, construction, operation, and maintenance activities, the MTB withdraws the ANPR.
Lessons Learned and Technical Standards: A Logical Marriage for Future Space Systems Design
NASA Technical Reports Server (NTRS)
Gill, Paul S.; Garcia, Danny; Vaughan, William W.; Parker, Nelson C. (Technical Monitor)
2002-01-01
A comprehensive database of engineering lessons learned that corresponds with relevant technical standards will be a valuable asset to those engaged in studies on future space vehicle developments, especially for structures, materials, propulsion, control, operations and associated elements. In addition, this will enable the capturing of technology developments applicable to the design, development, and operation of future space vehicles as planned in the Space Launch Initiative. Using the time-honored tradition of passing on lessons learned while utilizing the newest information technology, NASA has launched an intensive effort to link lessons learned acquired through various Internet databases with applicable technical standards. This paper will discuss the importance of lessons learned, the difficulty in finding relevant lessons learned while engaged in a space vehicle development, and the new NASA effort to relate them to technical standards that can help alleviate this difficulty.
14 CFR 23.683 - Operation tests.
Code of Federal Regulations, 2013 CFR
2013-01-01
... STANDARDS: NORMAL, UTILITY, ACROBATIC, AND COMMUTER CATEGORY AIRPLANES Design and Construction Control Systems § 23.683 Operation tests. (a) It must be shown by operation tests that, when the controls are... controls, loads not less than those corresponding to the maximum pilot effort established under § 23.405...
14 CFR 23.683 - Operation tests.
Code of Federal Regulations, 2010 CFR
2010-01-01
... STANDARDS: NORMAL, UTILITY, ACROBATIC, AND COMMUTER CATEGORY AIRPLANES Design and Construction Control Systems § 23.683 Operation tests. (a) It must be shown by operation tests that, when the controls are... controls, loads not less than those corresponding to the maximum pilot effort established under § 23.405...
14 CFR 23.683 - Operation tests.
Code of Federal Regulations, 2014 CFR
2014-01-01
... STANDARDS: NORMAL, UTILITY, ACROBATIC, AND COMMUTER CATEGORY AIRPLANES Design and Construction Control Systems § 23.683 Operation tests. (a) It must be shown by operation tests that, when the controls are... controls, loads not less than those corresponding to the maximum pilot effort established under § 23.405...
14 CFR 23.683 - Operation tests.
Code of Federal Regulations, 2011 CFR
2011-01-01
... STANDARDS: NORMAL, UTILITY, ACROBATIC, AND COMMUTER CATEGORY AIRPLANES Design and Construction Control Systems § 23.683 Operation tests. (a) It must be shown by operation tests that, when the controls are... controls, loads not less than those corresponding to the maximum pilot effort established under § 23.405...
14 CFR 23.683 - Operation tests.
Code of Federal Regulations, 2012 CFR
2012-01-01
... STANDARDS: NORMAL, UTILITY, ACROBATIC, AND COMMUTER CATEGORY AIRPLANES Design and Construction Control Systems § 23.683 Operation tests. (a) It must be shown by operation tests that, when the controls are... controls, loads not less than those corresponding to the maximum pilot effort established under § 23.405...
ERIC Educational Resources Information Center
Claycomb, Gregory D.; Venable, Frances A.
2015-01-01
In an effort to broaden the selection of research opportunities available to a student registered in a one-semester, upper-level independent study course at a primarily undergraduate institution (PUI), a highly motivated student was asked to select, evaluate, and modify a standard operating procedure (SOP). The student gained valuable experience…
Stevenson, Timothy H; Chevalier, Nicole A; Scher, Gregory R; Burke, Ronald L
2016-01-01
Effective multilateral military operations such as those conducted by the North Atlantic Treaty Organization (NATO) require close cooperation and standardization between member nations to ensure interoperability. Failure to standardize policies, procedures, and doctrine prior to the commencement of military operations will result in critical interoperability gaps, which jeopardize the health of NATO forces and mission success. To prevent these gaps from occurring, US forces must be actively involved with NATO standardization efforts such as the Committee of the Chiefs of Medical Services to ensure US interests are properly represented when NATO standards are developed and US doctrine and procedures will meet the established NATO requirements.
A human factors evaluation of the operational demonstration flight inspection aircraft.
DOT National Transportation Integrated Search
1995-05-01
These reports describe the data collection and analysis efforts performed by the Civil Aerospace Medical Institute's Human Factors Research Laboratory to assist the Office of Aviation System Standards (AVN) in the human factors evaluation of the Oper...
ASME Code Efforts Supporting HTGRs
DOE Office of Scientific and Technical Information (OSTI.GOV)
D.K. Morton
2010-09-01
In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.
ASME Code Efforts Supporting HTGRs
DOE Office of Scientific and Technical Information (OSTI.GOV)
D.K. Morton
2011-09-01
In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.
ASME Code Efforts Supporting HTGRs
DOE Office of Scientific and Technical Information (OSTI.GOV)
D.K. Morton
2012-09-01
In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.
NASA Astrophysics Data System (ADS)
Riggs, William R.
1994-05-01
SHARP is a Navy-wide logistics technology development effort aimed at reducing the acquisition costs, support costs, and risks of military electronic weapon systems while increasing the performance capability, reliability, maintainability, and readiness of these systems. Lower life cycle costs for electronic hardware are achieved through technology transition, standardization, and reliability enhancement to improve system affordability and availability as well as to enhance fleet modernization. Advanced technology is transferred into the fleet through hardware specifications for weapon system building blocks of standard electronic modules, standard power systems, and standard electronic systems. The product lines are all defined with respect to their size, weight, I/O, environmental performance, and operational performance. This method of defining the standard is very conducive to inserting new technologies into systems using the standard hardware. This is the approach taken thus far in inserting photonic technologies into SHARP hardware. All of the efforts have been related to module packaging, i.e., interconnects, component packaging, and module developments. Fiber optic interconnects are discussed in this paper.
Telemedicine standardization in the NATO environment.
Lam, David M; Poropatich, Ronald K; Gilbert, Gary R
2004-01-01
As the North Atlantic Treaty Organization (NATO) has evolved its doctrine from that of strictly national medical support during operations to that of multinational medical support, the importance of, and the need for, telemedicine standardization has become apparent. This article describes the efforts made by NATO in recent years to begin the process of telemedicine (TMED) standardization within the Alliance.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-02
... Standards and Technology's (NIST) Computer Security Division maintains a Computer Security Resource Center... Regarding Driver History Record Information Security, Continuity of Operation Planning, and Disaster... (SDLAs) to support their efforts at maintaining the security of information contained in the driver...
Effective monitoring and evaluation of military humanitarian medical operations.
Waller, Stephen G; Powell, Clydette; Ward, Jane B; Riley, Kevin
2011-01-01
Non-military government agencies and non-governmental organizations (NGOs) have made great strides in the evaluation of humanitarian medical work, and have learned valuable lessons regarding monitoring and evaluation (M&E) that may be equally valuable to military medical personnel. We reviewed the recent literature by the worldwide humanitarian community regarding the art and science of M&E, with a focus toward military applications. The successes and failures of past humanitarian efforts have resulted in prolific analyses. Alliances of NGOs set the standard for humanitarian quality and M&E standards. Military medical personnel can apply some of these standards to military humanitarian M&E in complex and stability operations. The authors believe that the NGO community's M&E standards should be applied to improve evaluation of U.S. military medical humanitarian operations.
Increasing the Automation and Autonomy for Spacecraft Operations with Criteria Action Table
NASA Technical Reports Server (NTRS)
Li, Zhen-Ping; Savki, Cetin
2005-01-01
The Criteria Action Table (CAT) is an automation tool developed for monitoring real-time system messages for specific events and processes in order to take user-defined actions based on a set of user-defined rules. CAT was developed by Lockheed Martin Space Operations as part of a larger NASA effort at the Goddard Space Flight Center (GSFC) to create a component-based, middleware-based, and standards-based general purpose ground system architecture referred to as GMSEC - the GSFC Mission Services Evolution Center. CAT has been integrated into the upgraded ground systems for the Tropical Rainfall Measuring Mission (TRMM) and Small Explorer (SMEX) satellites, and it plays the central role in their automation effort to reduce the cost and increase the reliability of spacecraft operations. The GMSEC architecture provides a standard communication interface and protocol for components to publish and subscribe to messages on an information bus. It also provides a standard message definition so components can send and receive messages through the bus interface rather than to each other, thus reducing component-to-component coupling, interfaces, protocols, and link (socket) management. With the GMSEC architecture, components can publish standard event messages to the bus for all nominal, significant, and surprising events in regard to satellite, celestial, ground system, or any other activity. In addition to sending standard event messages, each GMSEC-compliant component is required to accept and process GMSEC directive request messages.
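The criteria-action pattern just described (components publishing event messages to a bus, and a rule table mapping matched events to operator-defined actions) can be illustrated with a small sketch. The Python below is a hypothetical, self-contained illustration only; the class names, message fields, and in-memory bus are assumptions and do not represent the actual GMSEC or CAT interfaces.

from dataclasses import dataclass, field
from typing import Callable, Dict, List

Message = Dict[str, str]  # e.g. {"subject": "GMSEC.TRMM.EVENT", "severity": "RED"}

@dataclass
class Rule:
    criteria: Callable[[Message], bool]   # condition that must match the event
    action: Callable[[Message], None]     # operator-defined response

@dataclass
class CriteriaActionTable:
    rules: List[Rule] = field(default_factory=list)

    def on_message(self, msg: Message) -> None:
        # Evaluate every rule against each incoming event message.
        for rule in self.rules:
            if rule.criteria(msg):
                rule.action(msg)

class InMemoryBus:
    # Stand-in for message-oriented middleware: publish fans out to all subscribers.
    def __init__(self) -> None:
        self.handlers: List[Callable[[Message], None]] = []

    def subscribe(self, handler: Callable[[Message], None]) -> None:
        self.handlers.append(handler)

    def publish(self, msg: Message) -> None:
        for handler in self.handlers:
            handler(msg)

if __name__ == "__main__":
    table = CriteriaActionTable()
    table.rules.append(Rule(
        criteria=lambda m: m.get("severity") == "RED",
        action=lambda m: print("Notify on-call operator:", m["subject"]),
    ))
    bus = InMemoryBus()
    bus.subscribe(table.on_message)
    bus.publish({"subject": "GMSEC.TRMM.EVENT", "severity": "RED"})

Keeping the rules decoupled from the components that emit events is what allows new automation to be added without modifying existing ground-system software.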
DOT National Transportation Integrated Search
1996-12-02
The purpose of this document is to provide information on the planning for the development and deployment of EDI standards for Commercial Vehicle Information Systems and Networks (CVISN). The status, priorities, and schedules for this effort are cont...
Operating manual for coaxial injection combustion model. [for the space shuttle main engine
NASA Technical Reports Server (NTRS)
Sutton, R. D.; Schuman, M. D.; Chadwick, W. D.
1974-01-01
An operating manual for the coaxial injection combustion model (CICM) is presented as the final report for an eleven-month effort designed to improve, verify, and document the comprehensive computer program for analyzing the performance of thrust chamber operation with gas/liquid coaxial jet injection. The effort culminated in delivery of an operational FORTRAN IV computer program and associated documentation pertaining to the combustion conditions in the space shuttle main engine. The computer program is structured for compatibility with the standardized Joint Army-Navy-NASA-Air Force (JANNAF) performance evaluation procedure. Use of the CICM in conjunction with the JANNAF procedure allows the analysis of engine systems using coaxial gas/liquid injection.
Electronic freight management (EFM) standards strategy
DOT National Transportation Integrated Search
2006-04-01
The EFM initiative is a U.S. Department of Transportation (DOT)-sponsored research effort aimed at improving the operating efficiency, safety, and security of freight movement. The initiative involves conducting a deployment test using Web services t...
Operations Brigade S3 Replaced by Operations Battalion
2012-12-06
Prisoner abuse exposed at Abu Ghraib prison in Iraq between October and December 2003 highlighted the need for modifications in detainee protocols and... Abu Ghraib Torture and Prisoner Abuse,” www.martinfrost.ws/htmlfiles/ abughraib2.html (accessed 19 June 2012). 2 Many Arabs and Muslims associated...operations in an effort to further the standardization process initially set in motion after the Abu Ghraib investigations. 7 LTG Miller’s directives
ERIC Educational Resources Information Center
Rhim, Lauren Morando
2005-01-01
Restructuring is a process initiated to substantively change the governance, operation and instruction of public schools or districts identified as failing. There are multiple definitions of restructuring, but the common thread binding all restructuring models is a substantive change of the standard operating procedures of a school or an entire…
Future pension accounting changes: implications for hospitals.
Weld, Tim; Klein, Gina
2011-05-01
Proposed rules in accounting for defined benefit plans may affect hospitals' statement of operations and affect the time, effort, and cost to comply with periodic financial reporting requirements. The new standard would require immediate recognition of the full amount of plan amendments in determining operating income. Hospitals should consider the role of pension plans in their compensation programs.
This document may be of assistance in applying the Title V air operating permit regulations. This document is part of the Title V Policy and Guidance Database available at www2.epa.gov/title-v-operating-permits/title-v-operating-permit-policy-and-guidance-document-index. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
Standards and Trade in the 1990s
1993-01-01
in the 1990s ... This Technical Committee decided to work on ENs and ENVs for, among ... efforts made by EWOS (European Workshop for Open Systems) ... EMC work ... cannot be performed in isolation by one ... the results of our global partner, the International Electrotechnical Commission ... the evolution of standards and conformity assessment in the United States depends on efficient operation and sound management ...
M. T. Kiefer; S. Zhong; W. E. Heilman; J. J. Charney; X. Bian
2013-01-01
Efforts to develop a canopy flow modeling system based on the Advanced Regional Prediction System (ARPS) model are discussed. The standard version of ARPS is modified to account for the effect of drag forces on mean and turbulent flow through a vegetation canopy, via production and sink terms in the momentum and subgrid-scale turbulent kinetic energy (TKE) equations....
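For context, canopy drag is commonly parameterized as a velocity-dependent momentum sink with a matching adjustment to the subgrid-scale TKE budget. The expressions below show one widely used form of such terms; they are included only as an illustration of the approach and are not necessarily the exact formulation implemented in the modified ARPS.

\[
F_{d,i} = -\,C_d\, a(z)\, |\mathbf{V}|\, u_i, \qquad
\left(\frac{\partial e}{\partial t}\right)_{\mathrm{canopy}} = C_d\, a(z)\,\bigl(\beta_p\, |\mathbf{V}|^3 - \beta_d\, |\mathbf{V}|\, e\bigr),
\]

where \(C_d\) is the canopy drag coefficient, \(a(z)\) is the plant area density profile, \(|\mathbf{V}|\) is the wind speed, \(u_i\) are the velocity components, \(e\) is the subgrid-scale TKE, and \(\beta_p\) and \(\beta_d\) are wake-production and enhanced-dissipation coefficients.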
NASA Astrophysics Data System (ADS)
Yamazaki, Towako
In order to stabilize and improve the quality of its information retrieval service, the information retrieval team of Daicel Corporation has put effort into standard operating procedures, an interview sheet for information retrieval requests, a structured format for search reports, and search expressions for some of Daicel's technological fields. These activities and efforts will also lead to skill sharing and skill transfer between searchers. In addition, skill improvement is needed not only for each searcher individually, but also for the information retrieval team as a whole as searchers take on new roles.
44 CFR 300.3 - Financial assistance.
Code of Federal Regulations, 2013 CFR
2013-10-01
... and exercise procedures for State efforts in disaster response, including provision of individual and public assistance; (6) Standard operating procedures for individual State agencies to execute disaster... reduce vulnerability to natural hazards. (11) Plans or procedures for dealing with disasters not...
44 CFR 300.3 - Financial assistance.
Code of Federal Regulations, 2014 CFR
2014-10-01
... and exercise procedures for State efforts in disaster response, including provision of individual and public assistance; (6) Standard operating procedures for individual State agencies to execute disaster... reduce vulnerability to natural hazards. (11) Plans or procedures for dealing with disasters not...
44 CFR 300.3 - Financial assistance.
Code of Federal Regulations, 2011 CFR
2011-10-01
... and exercise procedures for State efforts in disaster response, including provision of individual and public assistance; (6) Standard operating procedures for individual State agencies to execute disaster... reduce vulnerability to natural hazards. (11) Plans or procedures for dealing with disasters not...
44 CFR 300.3 - Financial assistance.
Code of Federal Regulations, 2012 CFR
2012-10-01
... and exercise procedures for State efforts in disaster response, including provision of individual and public assistance; (6) Standard operating procedures for individual State agencies to execute disaster... reduce vulnerability to natural hazards. (11) Plans or procedures for dealing with disasters not...
78 FR 14717 - Energy Conservation Standards for Set-Top Boxes: Availability of Initial Analysis
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-07
.... Despite the participants' best efforts to negotiate a non-regulatory agreement, these talks ultimately... consumption of baseline products in on and sleep modes of operation by system-level components (e.g., tuners...
Reliability of the individual components of the Canadian Armed Forces Physical Employment Standard.
Stockbrugger, Barry G; Reilly, Tara J; Blacklock, Rachel E; Gagnon, Patrick J
2018-01-29
This investigation recruited 24 participants from both the Canadian Armed Forces (CAF) and civilian populations to complete 4 separate trials at "best effort" of each of the 4 components in the CAF Physical Employment Standard, named the FORCE Evaluation: Fitness for Operational Requirements of CAF Employment. Analyses were performed to examine the level of variability and reliability within each component. The results demonstrate that candidates should be provided with at least 1 retest if they have recently completed at least 2 previous best-effort attempts as per the protocol. In addition, the minimal detectable difference is given for each of the 4 components in seconds, which identifies the threshold for subsequent action, either retest or remedial training, for those unable to meet the minimum standard. These results will inform the delivery of this employment standard, function as a method of accommodation, and provide direction for physical training programs.
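For reference, the minimal detectable difference reported in test-retest studies is conventionally derived from the standard error of measurement. The short Python sketch below shows that textbook calculation with assumed example numbers; it is illustrative only and is not necessarily the exact procedure or the values used in this study.

import math

def minimal_detectable_difference(sd: float, icc: float, z: float = 1.96) -> float:
    """Textbook MDD/MDC: SEM = SD * sqrt(1 - ICC); MDD = z * sqrt(2) * SEM."""
    sem = sd * math.sqrt(1.0 - icc)
    return z * math.sqrt(2.0) * sem

# Hypothetical component: between-trial SD of 6.0 s with a test-retest ICC of 0.90
print(round(minimal_detectable_difference(sd=6.0, icc=0.90), 1), "seconds")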
Some Impacts of Risk-Centric Certification Requirements for UAS
NASA Technical Reports Server (NTRS)
Neogi, Natasha A.; Hayhurst, Kelly J.; Maddalon, Jeffrey M.; Verstynen, Harry A.
2016-01-01
This paper discusses results from a recent study that investigates certification requirements for an unmanned rotorcraft performing agricultural application operations. The process of determining appropriate requirements using a risk-centric approach revealed a number of challenges that could impact larger UAS standardization efforts. Fundamental challenges include selecting the correct level of abstraction for requirements to permit design flexibility, transforming human-centric operational requirements to aircraft airworthiness requirements, and assessing all hazards associated with the operation.
DOT National Transportation Integrated Search
1986-09-01
The analysis work presented in this report is part of an ongoing effort by the Federal Aviation Administration (FAA) to develop improved rotorcraft separation standards. The subject of this report, Analysis and Recommendation of Separation Requiremen...
Savannah River Site Environmental Report for 1999
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arnett, M.
2000-06-30
The purpose of this report is to present summary environmental data that characterize site environmental management performance, confirm compliance with environmental standards and requirements, highlight significant programs and efforts, and assess the impact of SRS operations on the public and the environment.
Wool, Jute and Flax Industry Training Board
ERIC Educational Resources Information Center
Industrial Training International, 1974
1974-01-01
Early achievement in the textile industry training program focused on operative training, followed by emphasis on management development. Recruitment efforts have been increased. An Assessment of Training scheme provides standards, assistance, and recognition for individual companies in maintaining adequate training programs. (MW)
Joint CPT and N resonance in compact atomic time standards
NASA Astrophysics Data System (ADS)
Crescimanno, Michael; Hohensee, Michael; Xiao, Yanhong; Phillips, David; Walsworth, Ron
2008-05-01
Currently, development efforts toward small, low-power atomic time standards use current-modulated VCSELs to generate phase-coherent optical sidebands that interrogate the hyperfine structure of alkali atoms such as rubidium. We describe and use a modified four-level quantum optics model to study the optimal operating regime of the joint CPT- and N-resonance clock. Resonant and non-resonant light shifts, as well as modulation comb detuning effects, play a key role in determining the optimal operating point of such clocks. We further show that our model is in good agreement with experimental tests performed using Rb-87 vapor cells.
Network operability of ground-based microwave radiometers: Calibration and standardization efforts
NASA Astrophysics Data System (ADS)
Pospichal, Bernhard; Löhnert, Ulrich; Küchler, Nils; Czekala, Harald
2017-04-01
Ground-based microwave radiometers (MWR) are already widely used by national weather services and research institutions all around the world. Most of the instruments operate continuously and are beginning to be implemented into data assimilation for atmospheric models. In particular, their potential for continuously observing boundary-layer temperature profiles as well as integrated water vapor and cloud liquid water path makes them valuable for improving short-term weather forecasts. Until now, however, most MWR have been operated as stand-alone instruments. In order to benefit from a network of these instruments, standardization of calibration, operation, and data format is necessary. Within the framework of TOPROF (COST Action ES1303), several efforts have been undertaken, such as uncertainty and bias assessment and calibration intercomparison campaigns. The goal was to establish protocols for providing quality-controlled (QC) MWR data and their uncertainties. To this end, standardized calibration procedures for MWR have been developed and recommendations for radiometer users compiled. Based on the results of the TOPROF campaigns, a new, high-accuracy liquid-nitrogen calibration load has been introduced for MWR manufactured by Radiometer Physics GmbH (RPG). The new load improves the accuracy of the measurements considerably and will lead to even more reliable atmospheric observations. In addition to the recommendations for set-up, calibration, and operation of ground-based MWR within a future network, we will present homogenized methods to determine the accuracy of a running calibration as well as means for automatic data quality control. This sets the stage for the planned microwave calibration center at JOYCE (Jülich Observatory for Cloud Evolution), which will be introduced briefly.
NASA Technical Reports Server (NTRS)
Briggs, Maxwell H.; Schifer, Nicholas A.
2012-01-01
The U.S. Department of Energy (DOE) and Lockheed Martin Space Systems Company (LMSSC) have been developing the Advanced Stirling Radioisotope Generator (ASRG) for use as a power system for space science missions. This generator would use two high-efficiency Advanced Stirling Convertors (ASCs), developed by Sunpower Inc. and NASA Glenn Research Center (GRC). The ASCs convert thermal energy from a radioisotope heat source into electricity. As part of ground testing of these ASCs, different operating conditions are used to simulate expected mission conditions. These conditions require achieving a particular operating frequency, hot end and cold end temperatures, and specified electrical power output for a given net heat input. In an effort to improve net heat input predictions, numerous tasks have been performed which provided a more accurate value for net heat input into the ASCs, including testing validation hardware, known as the Thermal Standard, to provide a direct comparison to numerical and empirical models used to predict convertor net heat input. This validation hardware provided a comparison for scrutinizing and improving empirical correlations and numerical models of ASC-E2 net heat input. This hardware simulated the characteristics of an ASC-E2 convertor in both an operating and non-operating mode. This paper describes the Thermal Standard testing and the conclusions of the validation effort applied to the empirical correlation methods used by the Radioisotope Power System (RPS) team at NASA Glenn.
NASA Astrophysics Data System (ADS)
Loftus, Pete; Giudice, Seb
2014-08-01
Measurements underpin the engineering decisions that allow products to be designed, manufactured, operated, and maintained. Therefore, the quality of measured data needs to be systematically assured to allow decision makers to proceed with confidence. The use of standards is one way of achieving this. This paper explores the relevance of international documentary standards to the assessment of measurement system capability in the High Value Manufacturing (HVM) industry. An internal measurement standard is presented which supplements these standards, and recommendations are made for a cohesive effort to develop the international standards to provide consistency in such industrial applications.
Cost-effectiveness of the stream-gaging program in North Carolina
Mason, R.R.; Jackson, N.M.
1985-01-01
This report documents the results of a study of the cost-effectiveness of the stream-gaging program in North Carolina. Data uses and funding sources are identified for the 146 gaging stations currently operated in North Carolina with a budget of $777,600 (1984). As a result of the study, eleven stations are nominated for discontinuance and five for conversion from recording to partial-record status. Large parts of North Carolina's Coastal Plain are identified as having sparse streamflow data. This sparsity should be remedied as funds become available. Efforts should also be directed toward defining the effects of drainage improvements on local hydrology and streamflow characteristics. The average standard error of streamflow records in North Carolina is 18.6 percent. This level of accuracy could be improved without increasing cost by increasing the frequency of field visits and streamflow measurements at stations with high standard errors and reducing the frequency at stations with low standard errors. A minimum budget of $762,000 is required to operate the 146-gage program. A budget less than this does not permit proper service and maintenance of the gages and recorders. At the minimum budget, and with the optimum allocation of field visits, the average standard error is 17.6 percent.
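The reallocation idea in the abstract (shifting visits from low-error to high-error stations at constant total cost) can be sketched with a toy model. The Python below assumes, purely for illustration, that a station's standard error scales as 1/sqrt(number of visits) and that every visit costs the same; the station names and error values are hypothetical, and the actual USGS cost-effectiveness analysis uses a more elaborate methodology.

import heapq
import math

def allocate_visits(stations, total_visits):
    """Greedy allocation of field visits to minimize the mean standard error,
    assuming SE_i(n) = base_se_i / sqrt(n). Toy model only."""
    visits = {name: 1 for name, _ in stations}  # every station gets at least one visit

    def marginal_gain(name, base_se):
        n = visits[name]
        return base_se / math.sqrt(n) - base_se / math.sqrt(n + 1)

    heap = [(-marginal_gain(name, base_se), name, base_se) for name, base_se in stations]
    heapq.heapify(heap)
    remaining = total_visits - len(stations)
    while remaining > 0:
        _, name, base_se = heapq.heappop(heap)
        visits[name] += 1            # give the next visit to the largest error reduction
        remaining -= 1
        heapq.heappush(heap, (-marginal_gain(name, base_se), name, base_se))
    return visits

# Hypothetical stations: (name, standard error in percent with a single annual visit)
stations = [("A", 30.0), ("B", 22.0), ("C", 12.0), ("D", 8.0)]
print(allocate_visits(stations, total_visits=20))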
Automated cellular sample preparation using a Centrifuge-on-a-Chip.
Mach, Albert J; Kim, Jae Hyun; Arshi, Armin; Hur, Soojung Claire; Di Carlo, Dino
2011-09-07
The standard centrifuge is a laboratory instrument widely used by biologists and medical technicians for preparing cell samples. Efforts to automate the operations of concentration, cell separation, and solution exchange that a centrifuge performs in a simpler and smaller platform have had limited success. Here, we present a microfluidic chip that replicates the functions of a centrifuge without moving parts or external forces. The device operates using a purely fluid dynamic phenomenon in which cells selectively enter and are maintained in microscale vortices. Continuous and sequential operation allows enrichment of cancer cells from spiked blood samples at the mL/min scale, followed by fluorescent labeling of intra- and extra-cellular antigens on the cells without the need for manual pipetting and washing steps. A versatile centrifuge-analogue may open opportunities in automated, low-cost and high-throughput sample preparation as an alternative to the standard benchtop centrifuge in standardized clinical diagnostics or resource-poor settings.
Performance standards for urban search and rescue robots
NASA Astrophysics Data System (ADS)
Messina, Elena; Jacoff, Adam
2006-05-01
In this paper, we describe work in performance standards for urban search and rescue (USAR) robots begun in 2004 by the Department of Homeland Security. This program is being coordinated by the National Institute of Standards and Technology and will result in consensus standards developed through ASTM International, under the Operational Equipment Subcommittee of their Homeland Security Committee. The first phase of the program involved definition of requirements by subject matter experts. Responders participated in a series of workshops to identify deployment categories for robots, performance categories, and ranges of acceptable or target performance in the various categories. Over one hundred individual requirements were identified, within main categories such as Human-System Interaction, Logistics, Operating Environment, and System (which includes Chassis, Communications, Mobility, Payload, Power, and Sensing). To ensure that the robot developers and eventual end users work closely together, "responders meet robots" events at situationally relevant sites are being held to refine and extend the performance requirements and develop standard test methods. The results of these standard performance tests will be captured in a compendium of existing and developmental robots with classifications and descriptors to differentiate particular robotic capabilities. This, along with ongoing efforts to categorize situational USAR constraints such as building collapse types or the presence of hazardous materials, will help responders match particular robotic capabilities to response needs. In general, these efforts will enable responders to effectively use robotic tools to enhance their effectiveness while reducing risk to personnel during disasters.
NASA Technical Reports Server (NTRS)
Runnels, R. L.
1973-01-01
The standards and procedures for the generation of operational display formats to be used in the Mission Control Center (MCC) display control system are presented. The required effort, forms, and fundamentals for the design, specification, and production of display formats are identified. The principles of display design and the system constraints controlling the creation of optimum operational displays for mission control are explained. The two basic types of MCC display systems for presenting information are described.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Althouse, P E; Bertoldo, N A; Brown, R A
2005-09-28
The Lawrence Livermore National Laboratory (LLNL) annual Environmental Report, prepared for the Department of Energy (DOE) and made available to the public, presents summary environmental data that characterizes site environmental management performance, summarizes environmental occurrences and responses reported during the calendar year, confirms compliance with environmental standards and requirements, and highlights significant programs and efforts. By explaining the results of effluent and environmental monitoring, mentioning environmental performance indicators and performance measure programs, and assessing the impact of Laboratory operations on the environment and the public, the report also demonstrates LLNL's continuing commitment to minimize any potentially adverse impact of its operations. The combination of environmental and effluent monitoring, source characterization, and dose assessment showed that radiological doses to the public caused by LLNL operations in 2004 were less than 0.26% of regulatory standards and more than 11,000 times smaller than the dose from natural background. Analytical results and evaluations generally showed continuing low levels of most contaminants; remediation efforts further reduced the concentrations of contaminants of concern in groundwater and soil vapor. In addition, LLNL's extensive environmental compliance activities related to water, air, endangered species, waste, wastewater, and waste reduction controlled or reduced LLNL's effects on the environment. LLNL's environmental program clearly demonstrates a commitment to protecting the environment from operational impacts.
JSC Design and Procedural Standards, JSC-STD-8080
NASA Technical Reports Server (NTRS)
Punch, Danny T.
2011-01-01
This document provides design and procedural requirements appropriate for inclusion in specifications for any human spaceflight program, project, spacecraft, system, or end item. The term "spacecraft" as used in the standards includes launch vehicles, orbital vehicles, non-terrestrial surface vehicles, and modules. The standards are developed and maintained as directed by Johnson Space Center (JSC) Policy Directive JPD 8080.2, JSC Design and Procedural Standards for Human Space Flight Equipment. The Design and Procedural Standards contained in this manual represent human spacecraft design and operational knowledge applicable to a wide range of spaceflight activities. These standards are imposed on JSC human spaceflight equipment through JPD 8080.2. Designers shall comply with all design standards applicable to their design effort.
Nóbrega, M F; Kinas, P G; Lessa, R; Ferrandis, E
2015-02-01
The sampling of fish from the artisanal fleet operating with surface lines off north-eastern Brazil was carried out between 1998 and 2000. Generalized linear models (GLMs) were used to standardize mean abundance indices using catch and fishing effort data on dolphinfish Coryphaena hippurus and to identify abundance trends in time and space, using 1215 surface line deployments. A standard relative abundance index (catch per unit effort, CPUE) was estimated for the most frequent vessels used in the sets, employing factors and coefficients generated in the GLMs. According to the models, C. hippurus catches are affected by the operating characteristics and power of different fishing vessels. These differences highlight the need for standardization of catch and effort data for artisanal fisheries. The highest mean abundance values for C. hippurus were off the state of Rio Grande do Norte, with an increasing tendency in areas with greater depths and more distant from the coast, reaching maximal values in areas whose depths range from 200 to 500 m. The highest mean abundance values occurred between April and June. The higher estimated abundance of C. hippurus in this period off the state of Rio Grande do Norte and within the 200-500 m depth range may be related to a migration pattern of food sources, as its main prey, the flying fish Hirundichthys affinis, uses floating algae as refuge and to deposit its pelagic eggs. © 2015 The Fisheries Society of the British Isles.
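The standardization step described above (a GLM that removes vessel and operational effects from raw catch-per-unit-effort so that temporal and spatial effects can be read as a relative abundance index) can be sketched as follows. This Python example uses synthetic data and hypothetical column names, with a Poisson GLM and log(effort) offset as one plausible model choice; the paper's actual model structure, error distribution, and covariates may differ.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic set-level records standing in for the surface-line data.
rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "vessel": rng.choice(["V1", "V2", "V3"], n),          # proxy for fishing power
    "month": rng.integers(1, 13, n),
    "depth_zone": rng.choice(["<200 m", "200-500 m"], n),
    "effort": rng.uniform(50.0, 400.0, n),                # e.g. hooks per set (assumed unit)
})
df["catch"] = rng.poisson(0.02 * df["effort"])            # synthetic dolphinfish catches

# Poisson GLM with log(effort) as an offset: expected catch is proportional to effort,
# so exponentiated coefficients act as multiplicative effects on CPUE.
model = smf.glm(
    "catch ~ C(vessel) + C(month) + C(depth_zone)",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["effort"]),
)
result = model.fit()
print(result.summary())
# A standardized index is then read from the fitted factor effects
# (e.g. month or depth-zone levels) with the other factors held at reference levels.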
Multilingual Crews: Communication and the Operations of Ships.
ERIC Educational Resources Information Center
Sampson, Helen; Zhao, Minghua
2003-01-01
Focuses on efforts to improve standards of English among seafarers, including a top-down approach to language learning utilized by industry regulators and training establishments. Considers the effectiveness of top-down approaches to language development, drawing on ethnographic research conducted aboard vessels with multilingual crews.…
DOT National Transportation Integrated Search
1997-11-03
In this report to Congress, the General Accounting Office (GAO) examines the efforts by the Office of Motor Carriers and the states to (1) reduce serious accidents by conducting roadside inspections and compliance reviews, (2) better target motor car...
Integrating Mission Type Orders into Operational Level Intelligence Collection
2011-05-27
the planning and direction step of the intelligence process and turning them into collection tasks. The companion effort to CRM is COM. COM is the...the differences between MTOs and standard collection processes observing that “an MTO is asking a chef for their best soup, whereas the standard...collection deck is handing the chef a recipe calling for specific ingredients.”17 Theater collection lacks synergy from the perspective that it is
Transforming Our SMEX Organization by Way of Innovation, Standardization, and Automation
NASA Technical Reports Server (NTRS)
Madden, Maureen; Crouse, Pat; Carry, Everett; Esposito, timothy; Parker, Jeffrey; Bradley, David
2006-01-01
NASA's Small Explorer (SMEX) Flight Operations Team (FOT) is currently tackling the challenge of supporting ground operations for several satellites that have surpassed their designed lifetime and have a dwindling budget. At Goddard Space Flight Center (GSFC), these missions are presently being reengineered into a fleet-oriented ground system. When complete, this ground system will provide command and control of four SMEX missions, and will demonstrate fleet automation and control concepts as a pathfinder for additional mission integrations. A goal of this reengineering effort is to demonstrate new ground-system technologies that show promise of supporting longer mission lifecycles and simplifying component integration. In pursuit of this goal, the SMEX organization has had to examine standardization, innovation, and automation. A core technology being demonstrated in this effort is the GSFC Mission Services Evolution Center (GMSEC) architecture. The GMSEC architecture focuses on providing standard interfaces for ground system applications to promote application interoperability. Building around commercial message-oriented middleware and providing a common messaging standard allows GMSEC to provide the capabilities necessary to support integration of new software components into existing missions and increase the level of interaction within the system. For SMEX, GMSEC has become the technology platform to transform flight operations with the innovation and automation necessary to reduce operational costs. The automation technologies supported in SMEX are built upon capabilities provided by the GMSEC architecture that allow the FOT to further reduce the involvement of the console operator. Initially, SMEX is automating only routine operations, such as safety and health monitoring, basic commanding, and system recovery. The operational concepts being developed here will reduce the need for staffed passes and are a necessity for future fleet management. As this project continues to evolve, additional innovations beyond GMSEC and automation have been, and will continue to be, developed. The team developed techniques for migrating ground systems of existing on-orbit assets. The tools necessary to monitor and control software failures were integrated and tailored for operational environments. All this was done with a focus on extending fleet operations to missions beyond SMEX. The result of this work is the foundation for a broader fleet-capable ground system that will include several missions supported by the Space Science Mission Operations Project.
European Patient Summary Guideline: Focus on Greece.
Berler, Alexander; Tagaris, Anastassios; Chronaki, Catherine
2016-01-01
The European Patient Summary (PS) guideline specifies a minimal dataset of essential and important information for unplanned or emergency care, initially defined in the epSOS project with the aim of improving patient safety and quality of care. The eHealth Network of European Union (EU) Member State (MS) representatives, established under Article 14 of EU Directive 2011/24 on patient rights in cross-border healthcare, adopted the PS guideline in November 2013, and since then the guideline has been part of MS strategic eHealth implementation plans, standardization efforts, and concrete regional, national, European, and international projects. This paper reviews efforts toward an operational patient summary service in Greece, drawing on challenges and lessons learned for sustainable standards-based large-scale eHealth deployment in Europe and abroad, as well as the reuse of best practices from international standards and integration profiles.
Lightweight Radiator for in Space Nuclear Electric Propulsion
NASA Technical Reports Server (NTRS)
Craven, Paul; Tomboulian, Briana; SanSoucie, Michael
2014-01-01
Nuclear electric propulsion (NEP) is a promising option for high-speed in-space travel due to the high energy density of nuclear fission power sources and efficient electric thrusters. Advanced power conversion technologies may require high operating temperatures and would benefit from lightweight radiator materials. Radiator performance dictates power output for nuclear electric propulsion systems. Game-changing propulsion systems are often enabled by novel designs using advanced materials. Pitch-based carbon fiber materials have the potential to offer significant improvements in operating temperature, thermal conductivity, and mass. These properties combine to allow advances in operational efficiency and high temperature feasibility. An effort at the NASA Marshall Space Flight Center to show that woven high thermal conductivity carbon fiber mats can be used to replace standard metal and composite radiator fins to dissipate waste heat from NEP systems is ongoing. The goals of this effort are to demonstrate a proof of concept, to show that a significant improvement of specific power (power/mass) can be achieved, and to develop a thermal model with predictive capabilities making use of constrained input parameter space. A description of this effort is presented.
2013-07-01
applications introduced by third-party developers to connect to the Android operating system through an open software interface. This allows customers...Definition Multimedia Interface have been developed to address the need for standards for high-definition televisions and computer monitors. Perhaps
India's growing participation in global clinical trials.
Gupta, Yogendra K; Padhy, Biswa M
2011-06-01
Lower operational costs, recent regulatory reforms and several logistic advantages make India an attractive destination for conducting clinical trials. Efforts for maintaining stringent ethical standards and the launch of Pharmacovigilance Program of India are expected to maximize the potential of the country for clinical research. Copyright © 2011. Published by Elsevier Ltd.
The Habitat Demonstration Unit System Integration
NASA Technical Reports Server (NTRS)
Gill, Tracy R.; Kennedy, Kriss J.; Tri, Terry O.; Howe, Alan S.
2010-01-01
The Lunar Surface System Habitat Demonstration Unit (HDU) requires a project team to integrate a variety of contributions from National Aeronautics and Space Administration (NASA) centers and potential outside collaborators, which poses a challenge in integrating these disparate efforts into a cohesive architecture. To accomplish the development of the first version of the HDU, the Pressurized Excursion Module (PEM), from conception in June 2009 to rollout for operations in July 2010, the HDU project team is using several strategies to mitigate risks and bring the separate efforts together. First, a set of design standards is being developed to define the interfaces between the various systems of the PEM and to the payloads, such as the Geology Laboratory, that those systems will support. Scheduled activities such as early fit-checks and the utilization of a habitat avionics test bed prior to equipment installation into the HDU PEM are planned to facilitate the integration process. A coordinated effort to establish simplified Computer Aided Design (CAD) standards and the utilization of modeling and simulation systems will aid in design and integration concept development. Finally, decision processes on the shell development, including the assembly sequence and transportation, have been fleshed out early in the HDU design to maximize the efficiency of both integration and field operations.
USDI DCS technical support: Mississippi Test Facility
NASA Technical Reports Server (NTRS)
Preble, D. M.
1975-01-01
The objective of the technical support effort is to provide hardware and data processing support to DCS users so that application of the system may be simply and effectively implemented. Technical support at Mississippi Test Facility (MTF) is concerned primarily with on-site hardware. The first objective of the DCP hardware support was to assure that standard measuring apparatus and techniques used by the USGS could be adapted to the DCS. The second objective was to try to standardize the miscellaneous variety of parameters into a standard instrument set. The third objective was to provide the necessary accessories to simplify the use and complement the capabilities of the DCP. The standard USGS sites have been interfaced and are presently operating. These sites are stream gauge, ground water level, and line-operated quality-of-water. Evapotranspiration, meteorological, and battery-operated quality-of-water sites are planned for near-future DCP operation. Three accessories under test or development are the Chu antenna, solar power supply, and add-on memory. The DCP has proven to be relatively easy to interface with many monitors. The large antenna is awkward to install and transport. The DCS has met the original requirements well; it has proven, and continues to prove, that an operational, satellite-based data collection system is feasible.
Use of a Microprocessor to Implement an ADCCP Protocol (Federal Standard 1003).
1980-07-01
results of other studies, to evaluate the operational and economic impact of incorporating various options in Federal Standard 1003. The effort...the LSI interface and the microprocessor; the LSI chip deposits bytes in its buffer as the producer, and the MPU reads this data as the consumer...on the interface between the MPU and the LSI protocol chip. This requires two main processes to be running at the same time--transmit and receive. The
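The producer/consumer relationship described in the snippet above, with the LSI protocol chip filling a buffer while the MPU drains it, can be modeled as a bounded queue. The sketch below is a generic illustration of that pattern, not the 1980 microprocessor implementation; the frame contents are made up.

```python
import threading
import queue

buffer = queue.Queue(maxsize=64)  # bounded buffer between the "LSI chip" and the "MPU"

def lsi_receiver(frame_bytes):
    """Producer: the protocol chip deposits received bytes into its buffer."""
    for b in frame_bytes:
        buffer.put(b)  # blocks if the consumer has fallen behind

def mpu_consumer(n):
    """Consumer: the MPU reads bytes out of the buffer for processing."""
    return bytes(buffer.get() for _ in range(n))

# Illustrative frame only (HDLC-style flag bytes around two data bytes).
t = threading.Thread(target=lsi_receiver, args=(b"\x7e\x01\x02\x7e",))
t.start()
print(mpu_consumer(4))
t.join()
```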
Lunar Regolith Simulant Materials: Recommendations for Standardization, Production, and Usage
NASA Technical Reports Server (NTRS)
Sibille, L.; Carpenter, P.; Schlagheck, R.; French, R. A.
2006-01-01
Experience gained during the Apollo program demonstrated the need for extensive testing of surface systems in relevant environments, including regolith materials similar to those encountered on the lunar surface. As NASA embarks on a return to the Moon, it is clear that the current lunar sample inventory is not only insufficient to support lunar surface technology and system development, but its scientific value is too great to be consumed by destructive studies. Every effort must be made to utilize standard simulant materials, which will allow developers to reduce the cost, development, and operational risks to surface systems. The Lunar Regolith Simulant Materials Workshop held in Huntsville, AL, on January 24-26, 2005, identified the need for widely accepted standard reference lunar simulant materials to perform research and development of technologies required for lunar operations. The workshop also established a need for a common, traceable, and repeatable process regarding the standardization, characterization, and distribution of lunar simulants. This document presents recommendations for the standardization, production and usage of lunar regolith simulant materials.
Telecommunications administration standard
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gustwiller, K.D.
1996-05-01
The administration of telecommunications is critical to proper maintenance and operation. The intent is to be able to properly support telecommunications for the distribution of all information within a building/campus. This standard will provide a uniform administration scheme that is independent of applications, and will establish guidelines for owners, installers, designers and contractors. This standard will accommodate existing building wiring, new building wiring and outside plant wiring. Existing buildings may not readily adapt to all applications of this standard, but the requirement for telecommunications administration is applicable to all buildings. Administration of the telecommunications infrastructure includes documentation (labels, records, drawings, reports, and work orders) of cables, termination hardware, patching and cross-connect facilities, telecommunications rooms, and other telecommunications spaces (conduits, grounding, and cable pathways are documented by Facilities Engineering). The investment in properly documenting telecommunications is a worthwhile effort. It is necessary to adhere to these standards to ensure quality and efficiency for the operation and maintenance of the telecommunications infrastructure for Sandia National Laboratories.
MARK-AGE standard operating procedures (SOPs): A successful effort.
Moreno-Villanueva, María; Capri, Miriam; Breusing, Nicolle; Siepelmeyer, Anne; Sevini, Federica; Ghezzo, Alessandro; de Craen, Anton J M; Hervonen, Antti; Hurme, Mikko; Schön, Christiane; Grune, Tilman; Franceschi, Claudio; Bürkle, Alexander
2015-11-01
Within the MARK-AGE project, a population study (3337 subjects) was conducted to identify a set of biomarkers of ageing which, as a combination of parameters with appropriate weighting, would measure biological age better than any single marker. The MARK-AGE project involves 14 European countries and a total of 26 research centres. In such a study, standard operating procedures (SOPs) are essential and are binding for all MARK-AGE Beneficiaries. The SOPs cover all aspects of subject recruitment and the collection, shipment and distribution of biological samples (blood and its components, buccal mucosa cells or BMC, and urine), as well as the anthropometric measurements and questionnaires. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Soldier-Warfighter Operationally Responsive Deployer for Space
NASA Technical Reports Server (NTRS)
Davis, Benny; Huebner, Larry; Kuhns, Richard
2015-01-01
The Soldier-Warfighter Operationally Responsive Deployer for Space (SWORDS) project was a joint project between the U.S. Army Space & Missile Defense Command (SMDC) and NASA. The effort, led by SMDC, was intended to develop a three-stage, liquid-bipropellant (liquid oxygen/liquid methane), pressure-fed launch vehicle capable of inserting a payload of at least 25 kg into a 750-km circular orbit. The vehicle design was driven by low cost rather than high performance. SWORDS leveraged commercial industry standards to utilize standard hardware and technologies over customized, unique aerospace designs. SWORDS identified broadly based global industries that have achieved adequate levels of quality control and reliability in their products and then designed around their expertise and business motivations.
Land use and land cover digital data
Fegeas, Robin G.; Claire, Robert W.; Guptill, Stephen C.; Anderson, K. Eric; Hallam, Cheryl A.
1983-01-01
The discipline of cartography is undergoing a number of profound changes that center on the emerging influence of digital manipulation and analysis of data for the preparation of cartographic materials and for use in geographic information systems. Operational requirements have led to the development by the USGS National Mapping Division of several documents that establish in-house digital cartographic standards. In an effort to fulfill lead agency requirements for promulgation of Federal standards in the earth sciences, the documents have been edited and assembled with explanatory text into a USGS Circular. This Circular describes some of the pertinent issues relative to digital cartographic data standards, documents the digital cartographic data standards currently in use within the USGS, and details the efforts of the USGS related to the definition of national digital cartographic data standards. It consists of several chapters; the first is a general overview, and each succeeding chapter is made up from documents that establish in-house standards for one of the various types of digital cartographic data currently produced. This chapter, 895-E, describes the Geographic Information Retrieval and Analysis System that is used in conjunction with the USGS land use and land cover classification system to encode, edit, manipulate, and analyze land use and land cover digital data.
Wallert, Mark A; Provost, Joseph J
2014-01-01
To enhance the preparedness of graduates from the Biochemistry and Biotechnology (BCBT) Major at Minnesota State University Moorhead for employment in the bioscience industry we have developed a new Industry certificate program. The BCBT Industry Certificate was developed to address specific skill sets that local, regional, and national industry experts identified as lacking in new B.S. and B.A. biochemistry graduates. The industry certificate addresses concerns related to working in a regulated industry such as Good Laboratory Practices, Good Manufacturing Practices, and working in a Quality System. In this article we specifically describe how we developed a validation course that uses Standard Operating Procedures to describe grading policy and laboratory notebook requirements in an effort to better prepare students to transition into industry careers. © 2013 by The International Union of Biochemistry and Molecular Biology.
NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR HHID AND IRN ASSIGNMENT (UA-F-1.0)
The purpose of this SOP is to outline HHID and IRN assignment during the Arizona NHEXAS project and the "Border" study. Keywords: field; records; HHID and IRN Assignment.
The National Human Exposure Assessment Survey (NHEXAS) is a federal interagency research effort coordinated...
Turning Lightning into Electricity: Organizing Parents for Education Reform
ERIC Educational Resources Information Center
Kelly, Andrew P.
2014-01-01
Families are the primary clients of public schools, but they are one of many constituencies who have a say in how schools actually operate. In all the technocratic fervor around "education reform"--the broad effort to implement standards and accountability, reform teacher tenure and evaluation, and increase parental choice--it is easy to…
Areas with close proximity to oil and natural gas operations in rural Utah have experienced winter ozone levels that exceed EPA’s National Ambient Air Quality Standards (NAAQS). Through a collaborative effort, EPA Region 8 – Air Program, ORD, and OAQPS used the Commun...
The effect of environmental initiatives on NASA specifications and standards activities
NASA Technical Reports Server (NTRS)
Griffin, Dennis; Webb, David; Cook, Beth
1995-01-01
The NASA Operational Environment Team (NOET) has conducted a survey of NASA centers' specifications and standards that require the use of Ozone Depleting Substances (ODSs): chlorofluorocarbons (CFCs), halons, and chlorinated solvents. The results of this survey are presented here, along with a pathfinder approach utilized at Marshall Space Flight Center (MSFC) to eliminate the use of ODSs in targeted specifications and standards. Also presented are the lessons learned from a pathfinder effort to replace CFC-113 in a significant MSFC specification for cleaning and cleanliness verification methods for oxygen, fuel and pneumatic service, including Shuttle propulsion elements.
NASA Technical Reports Server (NTRS)
Ghatas, Rania W.; Jack, Devin P.; Tsakpinis, Dimitrios; Sturdy, James L.; Vincent, Michael J.; Hoffler, Keith D.; Myer, Robert R.; DeHaven, Anna M.
2017-01-01
As Unmanned Aircraft Systems (UAS) make their way to mainstream aviation operations within the National Airspace System (NAS), research efforts are underway to develop a safe and effective environment for their integration into the NAS. Detect and Avoid (DAA) systems are required to account for the lack of "eyes in the sky" due to having no human on-board the aircraft. The technique, results, and lessons learned from a detailed End-to-End Verification and Validation (E2-V2) simulation study of a DAA system representative of RTCA SC-228's proposed Phase I DAA Minimum Operational Performance Standards (MOPS), based on specific test vectors and encounter cases, will be presented in this paper.
Payload Instrument Design Rules for Safe and Efficient Flight Operations
NASA Astrophysics Data System (ADS)
Montagnon, E.; Ferri, P.
2004-04-01
Payload operations are often neglected in favour of optimising the scientific performance of the instrument design. This has major drawbacks in terms of cost, safety, efficiency of operations and, finally, science return. By taking operational aspects into account in the early phases of the instrument design, with additional effort that is more cultural than financial or technological, many problems can be avoided or minimized, with significant benefits to be gained in the mission execution phases. This paper presents possible improvements based on the use of the telemetry and telecommand packet standard, proper sharing of autonomy functions between instrument and platform, and enhanced interface documents.
Lightweight Damage Tolerant Radiators for In-Space Nuclear Electric Power and Propulsion
NASA Technical Reports Server (NTRS)
Craven, Paul; SanSoucie, Michael P.; Tomboulian, Briana; Rogers, Jan; Hyers, Robert
2014-01-01
Nuclear electric propulsion (NEP) is a promising option for high-speed in-space travel due to the high energy density of nuclear power sources and efficient electric thrusters. Advanced power conversion technologies for converting thermal energy from the reactor to electrical energy at high operating temperatures would benefit from lightweight, high temperature radiator materials. Radiator performance dictates power output for nuclear electric propulsion systems. Pitch-based carbon fiber materials have the potential to offer significant improvements in operating temperature and mass. An effort at the NASA Marshall Space Flight Center to show that woven high thermal conductivity carbon fiber mats can be used to replace standard metal and composite radiator fins to dissipate waste heat from NEP systems is ongoing. The goals of this effort are to demonstrate a proof of concept, to show that a significant improvement of specific power (power/mass) can be achieved, and to develop a thermal model with predictive capabilities. A description of this effort is presented.
Willmer, D R; Haas, E J
2016-01-01
As national and international health and safety management system (HSMS) standards are voluntarily accepted or regulated into practice, organizations are making an effort to modify and integrate strategic elements of a connected management system into their daily risk management practices. In high-risk industries such as mining, that effort takes on added importance. The mining industry has long recognized the importance of a more integrated approach to recognizing and responding to site-specific risks, encouraging the adoption of a risk-based management framework. Recently, the U.S. National Mining Association led the development of an industry-specific HSMS built on the strategic frameworks of ANSI Z10, OHSAS 18001, the American Chemistry Council's Responsible Care, and ILO-OSH 2001. All of these standards provide strategic guidance and focus on how to incorporate a plan-do-check-act cycle into the identification, management and evaluation of worksite risks. This paper details an exploratory study into whether practices associated with executing a risk-based management framework are visible through the actions of an organization's site-level management of health and safety risks. The results of this study show ways that site-level leaders manage day-to-day risk at their operations that can be characterized according to practices associated with a risk-based management framework. Having tangible operational examples of day-to-day risk management can serve as a starting point for evaluating field-level risk assessment efforts and their alignment to overall company efforts at effective risk mitigation through an HSMS or other processes.
Rural areas with close proximity to oil and natural gas operations in Utah have experienced winter ozone levels that exceed EPA’s National Ambient Air Quality Standards (NAAQS). Through a collaborative effort, EPA Region 8 – Air Program, ORD, and OAQPS used the Commun...
Air and Space Power Journal. Volume 25, Number 4, Winter 2011
2011-01-01
to the sustained pace in Operation Enduring Freedom. Finally, the siege of An Loc in 1972 led to a sustained effort from 15 April until 31...An_Approach_to_Netcentric_Ops_Rich_Byrne.pdf; Elizabeth Harding, Leo Obrst, and Arnon Rosenthal, “Creating Standards for Multiway Data Sharing,” Edge: MITRE’s Advanced
NASA Astrophysics Data System (ADS)
Wibawa, Teja A.; Lehodey, Patrick; Senina, Inna
2017-02-01
Geo-referenced catch and fishing effort data of the bigeye tuna fisheries in the Indian Ocean over 1952-2014 were analyzed and standardized to facilitate population dynamics modeling studies. During this 62-year historical period of exploitation, many changes occurred both in the fishing techniques and the monitoring of activity. This study includes a series of processing steps used for standardization of spatial resolution, conversion and standardization of catch and effort units, raising of geo-referenced catch into nominal catch level, screening and correction of outliers, and detection of major catchability changes over long time series of fishing data, i.e., the Japanese longline fleet operating in the tropical Indian Ocean. A total of 30 fisheries were finally determined from longline, purse seine and other-gears data sets, from which 10 longline and 4 purse seine fisheries represented 96 % of the whole historical geo-referenced catch. Nevertheless, one-third of total nominal catch is still not included due to a total lack of geo-referenced information and would need to be processed separately, accordingly to the requirements of the study. The geo-referenced records of catch, fishing effort and associated length frequency samples of all fisheries are available at doi:10.1594/PANGAEA.864154.
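One of the processing steps listed above, raising geo-referenced catch to the nominal (total reported) catch level, amounts to scaling each spatial stratum by the ratio of nominal to geo-referenced totals within a fleet-year. A minimal sketch of that step, with hypothetical column names and made-up data rather than the study's actual fisheries records:

```python
import pandas as pd

# Hypothetical geo-referenced records: catch by 5-degree cell, fleet, and year.
geo = pd.DataFrame({
    "fleet": ["LL_JPN"] * 3, "year": [1990] * 3,
    "cell": ["05N060E", "00N065E", "05S070E"],
    "catch_t": [120.0, 80.0, 50.0],
})
nominal = {("LL_JPN", 1990): 400.0}  # hypothetical nominal (total) catch, tonnes

# Raising factor = nominal total / geo-referenced total for each fleet-year stratum.
geo_totals = geo.groupby(["fleet", "year"])["catch_t"].transform("sum")
factors = geo.apply(lambda r: nominal[(r["fleet"], r["year"])], axis=1) / geo_totals
geo["catch_raised_t"] = geo["catch_t"] * factors
print(geo)  # raised catch now sums to the nominal total for the stratum
```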
Perskvist, Nasrin; Norlin, Loreana; Dillner, Joakim
2015-04-01
This article addresses the important issue of the standardization of the biobank process. It reports on i) the implementation of standard operating procedures for the processing of liquid-based cervical cells, ii) the standardization of storage conditions, and iii) the ultimate establishment of nationwide standardized biorepositories for cervical specimens. Given the differences in the infrastructure and healthcare systems of various county councils in Sweden, these efforts were designed to develop standardized methods of biobanking across the nation. The standardization of cervical sample processing and biobanking is an important and widely acknowledged issue. Efforts to address these concerns will facilitate better patient care and improve research based on retrospective and prospective collections of patient samples and cohorts. The successful nationalization of the Cervical Cytology Biobank in Sweden is based on three vital issues: i) the flexibility of the system to adapt to other regional systems, ii) the development of the system based on national collaboration between the university and the county councils, and iii) stable governmental financing by the provider, the Biobanking and Molecular Resource Infrastructure of Sweden (BBMRI.se). We will share our experiences with biorepository communities to promote understanding of and advances in opportunities to establish a nationalized biobank which covers the healthcare of the entire nation.
Bauer, Daniel R; Otter, Michael; Chafin, David R
2018-01-01
Studying and developing preanalytical tools and technologies for the purpose of obtaining high-quality samples for histological assays is a growing field. Currently, there does not exist a standard practice for collecting, fixing, and monitoring these precious samples. There has been some advancement in standardizing collection for the highest profile tumor types, such as breast, where HER2 testing drives therapeutic decisions. This review examines the area of tissue collection, transport, and monitoring of formalin diffusion and details a prototype system that could be used to help standardize tissue collection efforts. We have surveyed recent primary literature sources and conducted several site visits to understand the most error-prone processes in histology laboratories. This effort identified errors that resulted from sample collection techniques and subsequent transport delays from the operating room (OR) to the histology laboratories. We have therefore devised a prototype sample collection and transport concept. The system consists of a custom data logger and cold transport box and takes advantage of a novel cold + warm (named 2 + 2) fixation method. This review highlights the beneficial aspects of standardizing tissue collection, fixation, and monitoring. In addition, a prototype system is introduced that could help standardize these processes and is compatible with use directly in the OR and from remote sites.
Evaluation and Strategic Planning for the GLOBE Program
NASA Astrophysics Data System (ADS)
Geary, E. E.; Williams, V. L.
2010-12-01
The Global Learning and Observations to Benefit the Environment (GLOBE) Program is an international environmental education program. It unites educators, students and scientists worldwide to collaborate on inquiry based investigations of the environment and Earth system science. Evaluation of the GLOBE program has been challenging because of its broad reach, diffuse models of implementation, and multiple stakeholders. In an effort to guide current evaluation efforts, a logic model was developed that provides a visual display of how the GLOBE program operates. Using standard elements of inputs, activities, outputs, customers and outcomes, this model describes how the program operates to achieve its goals. The template used to develop this particular logic model aligns the GLOBE program operations with its program strategy, thus ensuring that what the program is doing supports the achievement of long-term, intermediate and annual goals. It also provides a foundation for the development of key programmatic metrics that can be used to gauge progress toward the achievement of strategic goals.
2011-01-01
Background The practice and research of medicine generates considerable quantities of data and model resources (DMRs). Although in principle biomedical resources are re-usable, in practice few can currently be shared. In particular, the clinical communities in physiology and pharmacology research, as well as medical education, (i.e. PPME communities) are facing considerable operational and technical obstacles in sharing data and models. Findings We outline the efforts of the PPME communities to achieve automated semantic interoperability for clinical resource documentation in collaboration with the RICORDO project. Current community practices in resource documentation and knowledge management are overviewed. Furthermore, requirements and improvements sought by the PPME communities to current documentation practices are discussed. The RICORDO plan and effort in creating a representational framework and associated open software toolkit for the automated management of PPME metadata resources is also described. Conclusions RICORDO is providing the PPME community with tools to effect, share and reason over clinical resource annotations. This work is contributing to the semantic interoperability of DMRs through ontology-based annotation by (i) supporting more effective navigation and re-use of clinical DMRs, as well as (ii) sustaining interoperability operations based on the criterion of biological similarity. Operations facilitated by RICORDO will range from automated dataset matching to model merging and managing complex simulation workflows. In effect, RICORDO is contributing to community standards for resource sharing and interoperability. PMID:21878109
Final Overview of ACES Simulation for Evaluation SARP Well-Clear Definitions
NASA Technical Reports Server (NTRS)
Santiago, Confesor; Johnson, Marcus A.; Isaacson, Doug; Hershey, David
2014-01-01
The UAS in the NAS project is studying the minimum operational performance standards for the detect-and-avoid (DAA) systems that unmanned aircraft systems (UAS) need in order to operate in the National Airspace System. The DoD's Science and Research Panel (SARP) Well-Clear Workshop is investigating the time and spatial boundary at which a UAS violates well clear. NASA is supporting this effort through use of its Airspace Concept Evaluation System (ACES) simulation platform. This briefing presents the final results to the SARP, which will be used to judge the three candidate well-clear definitions and to select the most operationally suitable option.
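A well-clear definition can be thought of as a boundary test on horizontal range, vertical separation, and time to closest approach. The sketch below illustrates that idea only; the threshold values and the test logic are placeholders, not the SARP candidate definitions or the values later adopted in the Phase I MOPS.

```python
import math

def violates_well_clear(dx_m, dy_m, dz_m, vx_mps, vy_mps,
                        r_thresh_m=1400.0, h_thresh_m=140.0, tau_thresh_s=35.0):
    """Illustrative well-clear test: horizontal range, vertical offset, and
    time to closest approach compared against placeholder thresholds."""
    r = math.hypot(dx_m, dy_m)                        # horizontal range
    r_dot = (dx_m * vx_mps + dy_m * vy_mps) / r if r > 0 else 0.0
    tau = -r / r_dot if r_dot < 0 else float("inf")   # time to closest approach
    horizontal_violation = r < r_thresh_m or tau < tau_thresh_s
    vertical_violation = abs(dz_m) < h_thresh_m
    return horizontal_violation and vertical_violation

# Example encounter: intruder 1.2 km away, 50 m above, closing at 60 m/s.
print(violates_well_clear(dx_m=1200.0, dy_m=0.0, dz_m=50.0, vx_mps=-60.0, vy_mps=0.0))
```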
This document may be of assistance in applying the Title V air operating permit regulations. This document is part of the Title V Policy and Guidance Database available at www2.epa.gov/title-v-operating-permits/title-v-operating-permit-policy-and-guidance-document-index. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
NASA Technical Reports Server (NTRS)
Houbec, Keith; Tillman, Barry; Connolly, Janis
2010-01-01
For decades, Space Life Sciences and NASA as an Agency have considered NASA-STD-3000, Man-Systems Integration Standards, a significant contribution to human spaceflight programs and to human-systems integration in general. The document has been referenced in numerous design standards both within NASA and by organizations throughout the world. With research program and project results being realized, and with advances in technology and new information in a variety of topic areas now available, the time arrived to update this extensive suite of requirements and design information. During the past several years, a multi-NASA center effort has been underway to write the update to NASA-STD-3000 with standards and design guidance that would be applicable to all future human spaceflight programs. NASA-STD-3001, Volumes 1 and 2, and the Human Integration Design Handbook (HIDH) were created. Volume 1, Crew Health, establishes NASA's spaceflight crew health standards for the pre-flight, in-flight, and post-flight phases of human spaceflight. Volume 2, Human Factors, Habitability and Environmental Health, focuses on the requirements of human-system integration and how the human crew interacts with other systems, and how the human and the system function together to accomplish the tasks for mission success. The HIDH is a compendium of human spaceflight history and knowledge, and provides useful background information and research findings. As the HIDH is a stand-alone companion to the Standards, the maintenance of the document has been streamlined. This unique and flexible approach ensures that the content is current and addresses the fundamental advances of human performance and human capabilities and constraints research. Current work focuses on the development of new sections of Volume 2 and collecting updates to the HIDH. The new sections in development expand the scope of the standard and address mission operations and support operations. This effort is again a collaboration with representatives from the Johnson Space Center Mission Operations and Space Life Sciences Directorates and the Engineering Directorate from Kennedy Space Center, as well as discipline experts from across the Agency.
Test Protocols for Advanced Inverter Interoperability Functions – Main Document
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Jay Dean; Gonzalez, Sigifredo; Ralph, Mark E.
2013-11-01
Distributed energy resources (DER) such as photovoltaic (PV) systems, when deployed in a large scale, are capable of influencing significantly the operation of power systems. Looking to the future, stakeholders are working on standards to make it possible to manage the potentially complex interactions between DER and the power system. In 2009, the Electric Power Research Institute (EPRI), Sandia National Laboratories (SNL) with the U.S. Department of Energy (DOE), and the Solar Electric Power Association (SEPA) initiated a large industry collaborative to identify and standardize definitions for a set of DER grid support functions. While the initial effort concentrated on grid-tied PV inverters and energy storage systems, the concepts have applicability to all DER. A partial product of this on-going effort is a reference definitions document (IEC TR 61850-90-7, Object models for power converters in distributed energy resources (DER) systems) that has become a basis for expansion of related International Electrotechnical Commission (IEC) standards, and is supported by US National Institute of Standards and Technology (NIST) Smart Grid Interoperability Panel (SGIP). Some industry-led organizations advancing communications protocols have also embraced this work. As standards continue to evolve, it is necessary to develop test protocols to independently verify that the inverters are properly executing the advanced functions. Interoperability is assured by establishing common definitions for the functions and a method to test compliance with operational requirements. This document describes test protocols developed by SNL to evaluate the electrical performance and operational capabilities of PV inverters and energy storage, as described in IEC TR 61850-90-7. While many of these functions are not currently required by existing grid codes or may not be widely available commercially, the industry is rapidly moving in that direction. Interoperability issues are already apparent as some of these inverter capabilities are being incorporated in large demonstration and commercial projects. The test protocols are intended to be used to verify acceptable performance of inverters within the standard framework described in IEC TR 61850-90-7. These test protocols, as they are refined and validated over time, can become precursors for future certification test procedures for DER advanced grid support functions.
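One of the grid support functions modeled in IEC TR 61850-90-7 is volt-var control, in which the inverter injects or absorbs reactive power as a piecewise-linear function of terminal voltage. The sketch below shows only the general shape of such a curve and how a test might sweep it; the breakpoints are hypothetical, not values taken from the standard or from the SNL test protocols.

```python
import numpy as np

# Hypothetical volt-var curve: voltage (per unit) vs. reactive power (per unit).
# Positive Q = injection (capacitive), negative Q = absorption (inductive).
CURVE_V = np.array([0.95, 0.98, 1.02, 1.05])
CURVE_Q = np.array([0.44, 0.00, 0.00, -0.44])

def volt_var_setpoint(v_pu):
    """Piecewise-linear interpolation of the reactive power command."""
    return float(np.interp(v_pu, CURVE_V, CURVE_Q))

# Sweep a few voltage points, as a verification test might do, and print the
# expected reactive power command at each point.
for v in (0.93, 0.97, 1.00, 1.04, 1.06):
    print(f"V = {v:.2f} pu -> Q = {volt_var_setpoint(v):+.3f} pu")
```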
Test Protocols for Advanced Inverter Interoperability Functions - Appendices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Jay Dean; Gonzalez, Sigifredo; Ralph, Mark E.
2013-11-01
Distributed energy resources (DER) such as photovoltaic (PV) systems, when deployed in a large scale, are capable of influencing significantly the operation of power systems. Looking to the future, stakeholders are working on standards to make it possible to manage the potentially complex interactions between DER and the power system. In 2009, the Electric Power Research Institute (EPRI), Sandia National Laboratories (SNL) with the U.S. Department of Energy (DOE), and the Solar Electric Power Association (SEPA) initiated a large industry collaborative to identify and standardize definitions for a set of DER grid support functions. While the initial effort concentrated on grid-tied PV inverters and energy storage systems, the concepts have applicability to all DER. A partial product of this on-going effort is a reference definitions document (IEC TR 61850-90-7, Object models for power converters in distributed energy resources (DER) systems) that has become a basis for expansion of related International Electrotechnical Commission (IEC) standards, and is supported by US National Institute of Standards and Technology (NIST) Smart Grid Interoperability Panel (SGIP). Some industry-led organizations advancing communications protocols have also embraced this work. As standards continue to evolve, it is necessary to develop test protocols to independently verify that the inverters are properly executing the advanced functions. Interoperability is assured by establishing common definitions for the functions and a method to test compliance with operational requirements. This document describes test protocols developed by SNL to evaluate the electrical performance and operational capabilities of PV inverters and energy storage, as described in IEC TR 61850-90-7. While many of these functions are not now required by existing grid codes or may not be widely available commercially, the industry is rapidly moving in that direction. Interoperability issues are already apparent as some of these inverter capabilities are being incorporated in large demonstration and commercial projects. The test protocols are intended to be used to verify acceptable performance of inverters within the standard framework described in IEC TR 61850-90-7. These test protocols, as they are refined and validated over time, can become precursors for future certification test procedures for DER advanced grid support functions.
Improving Collaboration by Standardization Efforts in Systems Biology
Dräger, Andreas; Palsson, Bernhard Ø.
2014-01-01
Collaborative genome-scale reconstruction endeavors of metabolic networks would not be possible without a common, standardized formal representation of these systems. The ability to precisely define biological building blocks together with their dynamic behavior has even been considered a prerequisite for upcoming synthetic biology approaches. Driven by the requirements of such ambitious research goals, standardization itself has become an active field of research on nearly all levels of granularity in biology. In addition to the originally envisaged exchange of computational models and tool interoperability, new standards have been suggested for an unambiguous graphical display of biological phenomena, to annotate, archive, as well as to rank models, and to describe execution and the outcomes of simulation experiments. The spectrum now even covers the interaction of entire neurons in the brain, three-dimensional motions, and the description of pharmacometric studies. Thereby, the mathematical description of systems and approaches for their (repeated) simulation are clearly separated from each other and also from their graphical representation. Minimum information definitions constitute guidelines and common operation protocols in order to ensure reproducibility of findings and a unified knowledge representation. Central database infrastructures have been established that provide the scientific community with persistent links from model annotations to online resources. A rich variety of open-source software tools thrives for all data formats, often supporting a multitude of programming languages. Regular meetings and workshops of developers and users lead to continuous improvement and ongoing development of these standardization efforts. This article gives a brief overview about the current state of the growing number of operation protocols, mark-up languages, graphical descriptions, and fundamental software support with relevance to systems biology. PMID:25538939
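As one concrete example of the model-exchange standards this overview refers to, the Systems Biology Markup Language (SBML) can be read with the open-source libSBML bindings. This is a minimal sketch assuming the python-libsbml package is installed and that a local file named model.xml exists; both are assumptions, not artifacts of the article.

```python
import libsbml  # assumes the python-libsbml package is installed

doc = libsbml.readSBML("model.xml")   # hypothetical local SBML file
if doc.getNumErrors() > 0:
    doc.printErrors()                 # report parsing/validation problems
else:
    model = doc.getModel()
    print(model.getId(),
          model.getNumSpecies(), "species,",
          model.getNumReactions(), "reactions")
```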
NASA Technical Reports Server (NTRS)
Smith, Jeffrey H.; Drews, Michael
1990-01-01
The results are described of an effort to establish commonality and standardization of generic crew extravehicular (crew-EVA) and telerobotic task analysis primitives used for the study of spaceborne operations. Although direct crew-EVA plans are the most visible output of spaceborne operations, significant ongoing efforts by a wide variety of projects and organizations also require tools for estimation of crew-EVA and telerobotic times. Task analysis tools provide estimates for input to technical and cost tradeoff studies. A workshop was convened to identify the issues and needs to establish a common language and syntax for task analysis primitives. In addition, the importance of such a syntax was shown to have precedence over the level to which such a syntax is applied. The syntax, lists of crew-EVA and telerobotic primitives, and the data base in diskette form are presented.
Interference Analysis for an Aeronautical Mobile Airport Communications System
NASA Technical Reports Server (NTRS)
Wilson, Jeffrey D.; Kerczewski, Robert J.
2010-01-01
The next generation of aeronautical communications for airport surface applications has been identified through a NASA research program and an international collaborative future communications study. The result, endorsed by both the United States and European regulatory agencies is called AeroMACS (Aeronautical Mobile Airport Communications System) and is based upon the IEEE 802.16e mobile wireless standard. Coordinated efforts to develop appropriate aviation standards for the AeroMACS system are now underway within RTCA (United States) and Eurocae (Europe). AeroMACS will be implemented in a recently allocated frequency band, 5091-5150 MHz. As this band is also occupied by fixed satellite service uplinks, AeroMACS must be designed to avoid interference with this incumbent service. The aspects of AeroMACS operation that present potential interference to the fixed satellite service are under analysis in order to enable the definition of standards that assure that such interference will be avoided. The NASA Glenn Research Center has been involved in this analysis, and the first results of modeling and simulation efforts directed at this analysis are the subject of this paper.
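A first-order way to reason about the interference scenario described above is a free-space link budget from an AeroMACS transmitter toward a fixed satellite service receiver. The sketch below uses the standard free-space path loss formula; the transmit power, antenna gains, slant range, and frequency are illustrative assumptions, not values from the NASA Glenn analysis.

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB (distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

def received_interference_dbm(tx_dbm, tx_gain_dbi, rx_gain_dbi, distance_km, freq_mhz):
    """Interference power arriving at the victim receiver, in dBm."""
    return tx_dbm + tx_gain_dbi + rx_gain_dbi - fspl_db(distance_km, freq_mhz)

# Illustrative assumptions only: 1 W airport transmitter, modest antenna gains,
# geostationary slant range, and a mid-band frequency in the 5091-5150 MHz allocation.
print(received_interference_dbm(tx_dbm=30.0, tx_gain_dbi=6.0, rx_gain_dbi=20.0,
                                distance_km=38000.0, freq_mhz=5120.0))
```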
Interference Analysis for an Aeronautical Mobile Airport Communications System
NASA Technical Reports Server (NTRS)
Wilson, Jeffrey D.; Kerczewski, Robert J.
2011-01-01
The next generation of aeronautical communications for airport surface applications has been identified through a NASA research program and an international collaborative future communications study. The result, endorsed by both the United States and European regulatory agencies is called AeroMACS (Aeronautical Mobile Airport Communications System) and is based upon the IEEE 802.16e mobile wireless standard. Coordinated efforts to develop appropriate aviation standards for the AeroMACS system are now underway within RTCA (United States) and Eurocae (Europe). AeroMACS will be implemented in a recently allocated frequency band, 5091-5150 MHz. As this band is also occupied by fixed satellite service uplinks, AeroMACS must be designed to avoid interference with this incumbent service. The aspects of AeroMACS operation that present potential interference to the fixed satellite service are under analysis in order to enable the definition of standards that assure that such interference will be avoided. The NASA Glenn Research Center has been involved in this analysis, and the first results of modeling and simulation efforts directed at this analysis are the subject of this presentation.
Standards-based sensor interoperability and networking SensorWeb: an overview
NASA Astrophysics Data System (ADS)
Bolling, Sam
2012-06-01
The warfighter lacks a unified Intelligence, Surveillance, and Reconnaissance (ISR) environment to conduct mission planning, command and control (C2), tasking, collection, exploitation, processing, and data discovery of disparate sensor data across the ISR Enterprise. Legacy sensors and applications are not standardized or integrated for assured, universal access. Existing tasking and collection capabilities are not unified across the enterprise, inhibiting robust C2 of ISR including near-real-time, cross-cueing operations. To address these critical needs, the National Measurement and Signature Intelligence (MASINT) Office (NMO) and partnering Combatant Commands and Intelligence Agencies are developing SensorWeb, an architecture that harmonizes heterogeneous sensor data to a common standard for users to discover, access, observe, subscribe to and task sensors. The SensorWeb initiative's long-term goal is to establish an open, commercial standards-based, service-oriented framework to facilitate plug-and-play sensors. The current development effort will produce non-proprietary deliverables, intended as a Government off the Shelf (GOTS) solution to address the U.S. and Coalition nations' inability to quickly and reliably detect, identify, map, track, and fully understand security threats and operational activities.
Integrated multi-sensor package (IMSP) for unmanned vehicle operations
NASA Astrophysics Data System (ADS)
Crow, Eddie C.; Reichard, Karl; Rogan, Chris; Callen, Jeff; Seifert, Elwood
2007-10-01
This paper describes recent efforts to develop integrated multi-sensor payloads for small robotic platforms for improved operator situational awareness and ultimately for greater robot autonomy. The focus is on enhancements to perception through integration of electro-optic, acoustic, and other sensors for navigation and inspection. The goals are to provide easier control and operation of the robot through fusion of multiple sensor outputs, to improve interoperability of the sensor payload package across multiple platforms through the use of open standards and architectures, and to reduce integration costs by embedded sensor data processing and fusion within the sensor payload package. The solutions investigated in this project to be discussed include: improved capture, processing and display of sensor data from multiple, non-commensurate sensors; an extensible architecture to support plug and play of integrated sensor packages; built-in health, power and system status monitoring using embedded diagnostics/prognostics; sensor payload integration into standard product forms for optimized size, weight and power; and the use of the open Joint Architecture for Unmanned Systems (JAUS)/ Society of Automotive Engineers (SAE) AS-4 interoperability standard. This project is in its first of three years. This paper will discuss the applicability of each of the solutions in terms of its projected impact to reducing operational time for the robot and teleoperator.
Flying Unmanned Aircraft: A Pilot's Perspective
NASA Technical Reports Server (NTRS)
Pestana, Mark E.
2011-01-01
The National Aeronautics and Space Administration (NASA) is pioneering various Unmanned Aircraft System (UAS) technologies and procedures which may enable routine access to the National Airspace System (NAS), with an aim for Next Gen NAS. These tools will aid in the development of technologies and integrated capabilities that will enable high value missions for science, security, and defense, and open the door to low-cost, extreme-duration, stratospheric flight. A century of aviation evolution has resulted in accepted standards and best practices in the design of human-machine interfaces, the displays and controls of which serve to optimize safe and efficient flight operations and situational awareness. The current proliferation of non-standard, aircraft-specific flight crew interfaces in UAS, coupled with the inherent limitations of operating UAS without in-situ sensory input and feedback (aural, visual, and vestibular cues), has increased the risk of mishaps associated with the design of the "cockpit." The examples of current non- or sub-standard design features range from "annoying" and "inefficient," to those that are difficult to manipulate or interpret in a timely manner, to those that are "burdensome" and "unsafe." A concerted effort is required to establish best practices and standards for the human-machine interfaces, for the pilot as well as the air traffic controller. In addition, the terms "pilot" and "air traffic controller," and the associated roles, responsibilities, knowledge, and skill sets, may need to be redefined with respect to operating UAS, especially in the Next-Gen NAS. The knowledge, skill sets, training, and qualification standards for UAS operations must be established, and reflect the aircraft-specific human-machine interfaces and control methods. NASA's recent experiences flying its MQ-9 Ikhana in the NAS for extended durations have enabled both NASA and the FAA to realize the full potential for UAS, as well as understand the implications of current limitations. Ikhana is a Predator-B/Reaper UAS, built by General Atomics, Aeronautical Systems, Inc., and modified for research. Since 2007, the aircraft has been flown seasonally with a wing-mounted pod containing an infrared scanner, utilized to provide real-time wildfire geo-location data to various fire-fighting agencies in the western U.S. The multi-agency effort included an extensive process to obtain flight clearance from the FAA to operate under special provisions, given that UAS in general do not fully comply with current airspace regulations (e.g. sense-and-avoid requirements).
HFE Process Guidance and Standards for potential application to updating NRC guidance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacques Hugo; J. J. Persensky
2012-07-01
The U.S. Nuclear Regulatory Commission (NRC) reviews and evaluates the human factors engineering (HFE) programs of applicants for nuclear power plant construction permits, operating licenses, standard design certifications, and combined operating licenses. The purpose of these safety reviews is to help ensure that personnel performance and reliability are appropriately supported. Detailed design review procedures and guidance for the evaluations are provided in three key documents: the Standard Review Plan (NUREG-0800), the HFE Program Review Model (NUREG-0711), and the Human-System Interface Design Review Guidelines (NUREG-0700). These documents were last revised in 2007, 2004 and 2002, respectively. The NRC is committed to the periodic update and improvement of these guidance documents to ensure that they remain state-of-the-art design evaluation tools. Thus, the NRC has initiated a project with BNL to update the NRC guidance to remain current with recent research on human performance, advances in HFE methods and tools, and new technology. INL supported Brookhaven National Lab (BNL) to update the detailed HFE review criteria contained in NUREG-0711 and NUREG-0700 based on (1) feedback obtained from end users, (2) the results of NRC research and development efforts supporting the NRC staff’s HFE safety reviews, and (3) other material the project staff identify as applicable to the update effort. INL submitted comments on development plans and sections of NUREGs 0800, 0711, and 0700. The contractor prepared the report attached here as the deliverable for this work.
ISO 9000 quality standards: a model for blood banking?
Nevalainen, D E; Lloyd, H L
1995-06-01
The recent American Association of Blood Banks publications Quality Program and Quality Systems in the Blood Bank and Laboratory Environment, the FDA's draft guidelines, and recent changes in the GMP regulations all discuss the benefits of implementing quality systems in blood center and/or manufacturing operations. While the medical device GMPs in the United States have been rewritten to accommodate a quality system approach similar to ISO 9000, the Center for Biologics Evaluation and Research of the FDA is also beginning to make moves toward adopting "quality systems audits" as an inspection process rather than using the historical approach of record reviews. The approach is one of prevention of errors rather than detection after the fact (Tourault MA, oral communication, November 1994). The ISO 9000 series of standards is a quality system that has worldwide scope and can be applied in any industry or service. The use of such international standards in blood banking should raise the level of quality within an organization, among organizations on a regional level, within a country, and among nations on a worldwide basis. Whether an organization wishes to become registered to a voluntary standard or not, the use of such standards to become ISO 9000-compliant would be a move in the right direction and would be a positive sign to the regulatory authorities and the public that blood banking is making a visible effort to implement world-class quality systems in its operations. Implementation of quality system standards such as the ISO 9000 series will provide an organized approach for blood banks and blood bank testing operations. With the continued trend toward consolidation and mergers, resulting in larger operational units with more complexity, quality systems will become even more important as the industry moves into the future.(ABSTRACT TRUNCATED AT 250 WORDS)
Tiltrotor Acoustic Flight Test: Terminal Area Operations
NASA Technical Reports Server (NTRS)
SantaMaria, O. L.; Wellman, J. B.; Conner, D. A.; Rutledge, C. K.
1991-01-01
This paper provides a comprehensive description of an acoustic flight test of the XV-15 Tiltrotor Aircraft with Advanced Technology Blades (ATB) conducted in August and September 1991 at Crows Landing, California. The purpose of this cooperative research effort of the NASA Langley and Ames Research Centers was to obtain a preliminary, high quality database of far-field acoustics for terminal area operations of the XV-15 at a takeoff gross weight of approximately 14,000 lbs for various glide slopes, airspeeds, rotor tip speeds, and nacelle tilt angles. The test also was used to assess the suitability of the Crows Landing complex for full scale far-field acoustic testing. This was the first acoustic flight test of the XV-15 aircraft equipped with ATB involving approach and level flyover operations. The test involved coordination of numerous personnel, facilities and equipment. Considerable effort was made to minimize potential extraneous noise sources unique to the region during the test. Acoustic data from the level flyovers were analyzed, then compared with data from a previous test of the XV-15 equipped with Standard Metal Blades.
Space shuttle low cost/risk avionics study
NASA Technical Reports Server (NTRS)
1971-01-01
All work breakdown structure elements containing any avionics related effort were examined for pricing the life cycle costs. The analytical, testing, and integration efforts are included for the basic onboard avionics and electrical power systems. The design and procurement of special test equipment and maintenance and repair equipment are considered. Program management associated with these efforts is described. Flight test spares and labor and materials associated with the operations and maintenance of the avionics systems throughout the horizontal flight test are examined. It was determined that cost savings can be achieved by using existing hardware, maximizing orbiter-booster commonality, specifying new equipments to MIL quality standards, basing redundancy on cost effective analysis, minimizing software complexity and reducing cross strapping and computer-managed functions, utilizing compilers and floating point computers, and evolving the design as dictated by the horizontal flight test schedules.
AF Cr(VI) Minimize Roadmap: Phase 1 Results
2010-12-01
Environmental Technology Technical Symposium & Workshop, 30 Nov - 2 Dec 2010, Washington, DC. Sponsored by SERDP and ESTCP. ABSTRACT: Hexavalent chromium ...Minimizing Hexavalent Chromium Use in DoD...Operations, Technical Session No. 2B, C-40: Current State of Air Force Hexavalent Chromium Reduction Efforts, Mr. Carl Perazzola, Air Force Corrosion
Not the Drones You’re Looking for: Civil Drones Invading Stability Operations
2016-06-01
APPROVAL The undersigned certify that this thesis meets master’s-level standards of research, argumentation, and expression...Naval War College in 2015. His career as a developmental engineer and acquisition officer has included assignments at the Air Force Research ...culmination of not only an extensive research effort, but of an academic and intellectual journey two years in the making and I would be remiss to end
Civil use of night vision goggles within the National Airspace System
NASA Astrophysics Data System (ADS)
Winkel, James G.; Faber, Lorelei
2001-08-01
When properly employed, Night Vision Goggles (NVGs) improve a pilot's ability to see during periods of darkness. The resultant enhancement in situational awareness achieved when using NVGs increases flight safety during night VFR operations. The FAA is constrained by a lack of the requisite regulatory and guidance infrastructure to adequately facilitate civil requests for use of NVGs within the National Airspace System (NAS). A special committee, SC-196, addressing NVG appliances and equipment, was formed and tasked to develop: an operational concept and operational requirements for NVG implementation into the NAS, minimum operational performance standards for NVGs, and training guidelines and considerations for NVG operations. This paper provides a historical perspective on use of NVGs within the NAS, the status of SC-196 work in progress, FAA integration of SC-196 committee products, and the harmonization effort between EUROCAE's NVG committee and SC-196.
NASA Astrophysics Data System (ADS)
Johnson, P. D.; Ferrini, V. L.; Jerram, K.
2016-12-01
In 2015 the National Science Foundation funded the University of New Hampshire's Center for Coastal and Ocean Mapping and Lamont-Doherty Earth Observatory, for the second time, to coordinate the effort of standardizing the quality of multibeam echosounder (MBES) data across the U.S. academic fleet. This effort supports 9 different ship operating institutions who manage a total of 12 multibeam-equipped ships carrying 6 different MBES systems, manufactured by two different companies. These MBES are designed to operate over a very wide range of depths and operational modes. The complexity of this endeavor led to the creation of the Multibeam Advisory Committee (MAC), a team of academic and industry experts whose mission is to support the needs of the U.S academic fleet's multibeam echo sounders through all of the phases of the "life" of a MBES system and its data, from initial acceptance of the system, to recommendations on at-sea acquisition of data, to validation of already installed systems, and finally to the post-survey data evaluation. The main activities of the MAC include 1.) standardizing both the Shipboard Acceptance Testing of all new systems and Quality Assurance Testing of already installed systems, 2.) working with both the ship operators/technicians and the manufacturers of the multibeam systems to guarantee that each MBES is working at its peak performance level, 3.) developing tools that aid in the collection of data, assessment of the MBES hardware, and evaluation of the quality of the MBES data, 4.) creating "best practices" documentation concerning data acquisition and workflow, and 5.) providing a website, http://mac.unols.org, to host technical information, tools, reports, and a "help desk" for operators of the systems to ask questions concerning issues that they see with their systems.
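A simple sanity check in the spirit of the data-quality tools mentioned above is predicting achievable swath coverage from water depth and angular coverage under a flat-seafloor assumption. The sketch below is purely illustrative; the MAC's actual tools and reports account for refraction, attenuation, and system-specific performance, and the numbers here are placeholders.

```python
import math

def flat_bottom_swath_width(depth_m, total_angular_coverage_deg):
    """Swath width on a flat seafloor for a given total angular coverage (degrees)."""
    half_angle = math.radians(total_angular_coverage_deg / 2.0)
    return 2.0 * depth_m * math.tan(half_angle)

# Illustrative values: 3000 m water depth with 120 degrees of angular coverage,
# giving a swath of roughly 3.5 times the water depth.
print(f"{flat_bottom_swath_width(3000.0, 120.0):.0f} m")
```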
Applying Corporate Climate Principles to Dental School Operations.
Robinson, Michelle A; Reddy, Michael S
2016-12-01
Decades of research have shown that organizational climate has the potential to form the basis of workplace operations and impact an organization's performance. Culture is related to climate but is not the same. "Culture" is the broader term, defining how things are done in an organization, while "climate" is a component of culture that describes how people perceive their environment. Climate can be changed but requires substantial effort over time by management and the workforce. Interest has recently grown in culture and climate in dental education due to the humanistic culture accreditation standard. The aim of this study was to use corporate climate principles to examine how organizational culture and, subsequently, workplace operations can be improved through specific strategic efforts in a U.S. dental school. The school's parent institution initiated a climate survey that the dental school used with qualitative culture data to drive strategic planning and change in the school. Administration of the same survey to faculty and staff members three times over a six-year period showed significant changes to the school's climate occurred as a new strategic plan was implemented that focused on reforming areas of weakness. Concentrated efforts in key areas in the strategic plan resulted in measurable improvements in climate perception. The study discovered that culture was an area previously overlooked but explicitly linked to the success of the organization.
Standard setting: the crucial issues. A case study of accounting & auditing.
Nowakowski, J R
1982-01-01
A study of standard-setting efforts in accounting and auditing is reported. The study reveals four major areas of concern in a professional standard-setting effort: (1) issues related to the rationale for setting standards, (2) issues related to the standard-setting board and its support structure, (3) issues related to the content of standards and rules for generating them, and (4) issues that deal with how standards are put to use. Principles derived from the study of accounting and auditing are provided to illuminate and assess standard-setting efforts in evaluation.
Moore, Lee J; Wilson, Mark R; McGrath, John S; Waine, Elizabeth; Masters, Rich S W; Vine, Samuel J
2015-09-01
Research has demonstrated the benefits of robotic surgery for the patient; however, research examining the benefits of robotic technology for the surgeon is limited. This study aimed to adopt validated measures of workload, mental effort, and gaze control to assess the benefits of robotic surgery for the surgeon. We predicted that the performance of surgical training tasks on a surgical robot would require lower investments of workload and mental effort, and would be accompanied by superior gaze control and better performance, when compared to conventional laparoscopy. Thirty-two surgeons performed two trials on a ball pick-and-drop task and a rope-threading task on both robotic and laparoscopic systems. Measures of workload (the surgery task load index), mental effort (subjective: rating scale for mental effort; objective: standard deviation of beat-to-beat intervals), gaze control (using a mobile eye movement recorder), and task performance (completion time and number of errors) were recorded. As expected, surgeons performed both tasks more quickly and accurately (with fewer errors) on the robotic system. Self-reported measures of workload and mental effort were significantly lower on the robotic system compared to the laparoscopic system. Similarly, an objective cardiovascular measure of mental effort revealed lower investment of mental effort when using the robotic platform relative to the laparoscopic platform. Gaze control distinguished the robotic from the laparoscopic systems, but not in the predicted fashion, with the robotic system associated with poorer (more novice-like) gaze control. The findings highlight the benefits of robotic technology for surgical operators. Specifically, they suggest that tasks can be performed more proficiently, at a lower workload, and with the investment of less mental effort, which may allow surgeons greater cognitive resources for dealing with other demands such as communication, decision-making, or periods of increased complexity in the operating room.
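The "standard deviation of beat-to-beat intervals" used here as the objective index of mental effort is straightforward to compute from a series of R-R intervals; the snippet below is a generic sketch of that calculation (commonly called SDNN), not the authors' analysis code, and the example numbers are invented.

    import statistics

    def sdnn(rr_intervals_ms):
        """Standard deviation of beat-to-beat (R-R) intervals in milliseconds.

        Lower values during a task are commonly interpreted as higher mental
        effort; preprocessing and interpretation thresholds are study-specific.
        """
        return statistics.stdev(rr_intervals_ms)

    # Example with made-up intervals recorded during a surgical training trial
    print(sdnn([812, 798, 805, 790, 820, 801]))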
Mullett, L.B.; Loach, B.G.; Adams, G.L.
1958-06-24
Loaded waveguides are described for the propagation of electromagnetic waves with reduced phase velocities. A rectangular waveguide is dimensioned so as to cut off the simple H01 mode at the operating frequency. The waveguide is capacitance loaded, so as to reduce the phase velocity of the transmitted wave, by connecting an electrical conductor between directly opposite points in the major median plane on the narrower pair of waveguide walls. This conductor may take a corrugated shape or be an apertured member, the important factor being that the electrical length of the conductor is greater than one-half wavelength at the operating frequency. Prepared for the Second U.N. International Conference. The importance of nuclear standards is discussed. A brief review of the international collaboration in this field is given. The proposal is made to let the International Organization for Standardization (ISO) coordinate the efforts from other groups. (W.D.M.)
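For context on why loading is needed at all, the phase velocity of a propagating mode in a smooth (unloaded) rectangular guide is given by the standard relation

    v_p = \frac{c}{\sqrt{1 - (f_c/f)^2}}, \qquad f > f_c,

which always exceeds c. This is the textbook unloaded-guide expression, not the design equation from the patent; it is included only to show why capacitive loading is required to obtain the reduced phase velocities described.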
Lessons Learned from Engineering a Multi-Mission Satellite Operations Center
NASA Technical Reports Server (NTRS)
Madden, Maureen; Cary, Everett, Jr.; Esposito, Timothy; Parker, Jeffrey; Bradley, David
2006-01-01
NASA's Small Explorers (SMEX) satellites have surpassed their designed science-lifetimes and their flight operations teams are now facing the challenge of continuing operations with reduced funding. At present, these missions are being reengineered into a fleet-oriented ground system at Goddard Space Flight Center (GSFC). When completed, this ground system will provide command and control of four SMEX missions and will demonstrate fleet automation and control concepts. As a path-finder for future mission consolidation efforts, this ground system will also demonstrate new ground-based technologies that show promise of supporting longer mission lifecycles and simplifying component integration. One of the core technologies being demonstrated in the SMEX Mission Operations Center is the GSFC Mission Services Evolution Center (GMSEC) architecture. The GMSEC architecture uses commercial Message Oriented Middleware with a common messaging standard to realize a higher level of component interoperability, allowing for interchangeable components in ground systems. Moreover, automation technologies utilizing the GMSEC architecture are being evaluated and implemented to provide extended lights-out operations. This mode of operation will provide routine monitoring and control of the heterogeneous spacecraft fleet. The operational concepts being developed will reduce the need for staffed contacts and are seen as a necessity for fleet management. This paper will describe the experiences of the integration team throughout the reengineering effort of the SMEX ground system. Additionally, lessons learned will be presented based on the team's experiences with integrating multiple missions into a fleet-based automated ground system.
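The interoperability benefit of a message-oriented architecture such as GMSEC comes from components exchanging messages on named subjects through a bus rather than calling each other directly, so components can be swapped without touching their peers. The sketch below illustrates only the publish/subscribe pattern with a toy in-memory bus and made-up subject names; it is not the GMSEC API.

    from collections import defaultdict

    class MessageBus:
        """Toy in-memory stand-in for a message-oriented middleware broker."""
        def __init__(self):
            self._subscribers = defaultdict(list)

        def subscribe(self, subject, callback):
            self._subscribers[subject].append(callback)

        def publish(self, subject, message):
            for callback in self._subscribers[subject]:
                callback(message)

    bus = MessageBus()
    # A telemetry archiver and a limit checker both receive housekeeping data
    bus.subscribe("FLEET.TLM.HK", lambda msg: print("archive:", msg))
    bus.subscribe("FLEET.TLM.HK", lambda msg: print("limit check:", msg))
    bus.publish("FLEET.TLM.HK", {"battery_v": 28.1})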
Assessment of Offshore Wind System Design, Safety, and Operation Standards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sirnivas, Senu; Musial, Walt; Bailey, Bruce
This report is a deliverable for a project sponsored by the U.S. Department of Energy (DOE) entitled National Offshore Wind Energy Resource and Design Data Campaign -- Analysis and Collaboration (contract number DE-EE0005372; prime contractor -- AWS Truepower). The project objective is to supplement, facilitate, and enhance ongoing multiagency efforts to develop an integrated national offshore wind energy data network. The results of this initiative are intended to 1) produce a comprehensive definition of relevant met-ocean resource assets and needs and design standards, and 2) provide a basis for recommendations for meeting offshore wind energy industry data and design certification requirements.
Clinical pathway for thoracic surgery in the United States
Wei, Benjamin
2016-01-01
The paradigm for postoperative care for thoracic surgical patients in the United States has shifted with efforts to reduce hospital length of stay and improve quality of life. The increasing usage of minimally invasive techniques in thoracic surgery has been an important part of this. In this review we will examine our standard practices as well as the evidence behind both general contemporary postoperative care principles and those specific to certain operations. PMID:26941967
Utilizing Robot Operating System (ROS) in Robot Vision and Control
2015-09-01
actually feel more comfortable with the black screen and white letters now. I would also like to thank James Calusdian for his tireless efforts in...originally designed by Willow Garage and currently maintained by the Open Source Robotics Foundation, is a powerful tool because it utilizes object...Visualization The Rviz package, developed by Willow Garage, comes standard with ROS and is a powerful visualization tool that allows users to visualize
2011-10-01
working memory),(Vanderploeg et al., 2005) Wechsler Adult Intelligence Scale III (WAIS-III) items: Digit Symbol Coding, Digit Span, Letter-Number...EFFORT William C. Walker, MD, Principal Investigator September, 2008 20 (see: VCU sub-award) David X. Cifu, MD Co-Investigator September, 2008...group for our purposes. The neuropsychological battery will consist of the following standardized, validated, tests of proven reliability: Wechsler Test
1978 Army Library Institute, 22-26 May 1978. Fort Bliss, Texas. A report of the Proceedings
1978-10-01
functions of management, librarianship and information science. - To encourage greater self-appraisal and self-development efforts. - To explore some...series of manuals describing the process in detail. This study impacts heavily on Army libraries, which use ALA standards of service as well as...STUDY OF ARMY LIBRARIES: Today Where Do We Stand. MAJ Paul Tracy Girard, The Office of the Adjutant General Plans and Operations Directorate
Ho, David M; Huo, Michael H
2007-07-01
Total knee replacement (TKR) operation is one of the most effective procedures, both clinically and in terms of cost. Because of increased volume and cost for this procedure during the past 3 decades, TKRs are often targeted for cost reduction. The purpose of this study was to evaluate the efficacy of two cost-reducing methodologies: establishment of critical clinical pathways and standardization of implant costs. Ninety patients (90 knees) were randomly selected from a population undergoing primary TKR during a 2-year period at a tertiary teaching hospital. Patients were assigned to three groups that corresponded to different strategies implemented during the evolution of the joint-replacement program. Medical records were reviewed for type of anesthesia, operative time, length of stay, and any perioperative complications. Financial information for each patient was compared among the three groups. Data analysis demonstrated that the institution of a critical pathway significantly shortened length of hospital stay and was effective in reducing the hospital costs by 18% (p < 0.05). In addition, standardization of surgical techniques under the care of a single surgeon substantially reduced the operative time. Selection of implants from a single vendor did not have any substantial effect in additionally reducing the costs. Standardized postoperative management protocols and critical clinical pathways can reduce costs and operative time. Future efforts must focus on lowering the costs of the prostheses, particularly with competitive bidding or capitation of prostheses costs. Although a single-vendor approach was not effective in this study, it is possible that a cost reduction could have been realized if more TKRs were performed, because the pricing contract was based on projected volume of TKRs to be done by the hospital.
Clinical utility of cerebrospinal fluid biomarkers in the diagnosis of early Alzheimer’s disease
Blennow, Kaj; Dubois, Bruno; Fagan, Anne M.; Lewczuk, Piotr; de Leon, Mony J.; Hampel, Harald
2015-01-01
Several potential disease-modifying drugs for Alzheimer’s disease (AD) have failed to show any effect on disease progression in clinical trials, conceivably because the AD subjects are already too advanced to derive clinical benefit from treatment and because diagnosis based on clinical criteria alone introduces a high misdiagnosis rate. Thus, well-validated biomarkers for early detection and accurate diagnosis are crucial. Low cerebrospinal fluid (CSF) concentrations of the amyloid-β (Aβ1-42) peptide, in combination with high total tau and phosphorylated tau, are sensitive and specific biomarkers highly predictive of progression to AD dementia in patients with mild cognitive impairment. However, interlaboratory variations in the results seen with currently available immunoassays are of concern. Recent worldwide standardization efforts and quality control programs include standard operating procedures for both preanalytical (e.g., lumbar puncture and sample handling) and analytical (e.g., preparation of calibration curve) procedures. Efforts are also ongoing to develop highly reproducible assays on fully automated instruments. These global standardization and harmonization measures will provide the basis for the generalized international application of CSF bio-markers for both clinical trials and routine clinical diagnosis of AD. PMID:24795085
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-05-01
This technical memorandum presents Preliminary Remediation Goals (PRGs) for use in human health risk assessment efforts under the United States Department of Energy, Oak Ridge Operations Office Environmental Restoration (ER) Division. This document provides the ER Division with standardized PRGs which are integral to the Remedial Investigation/Feasibility Study process. They are used during project scoping (Data Quality Objectives development), in screening-level risk assessments to support early action or No Further Investigation decisions, and in the baseline risk assessment where they are employed in the selection of chemicals of potential concern. The primary objective of this document is to standardize these values and eliminate any duplication of effort by providing PRGs to all contractors involved in risk activities. In addition, by managing the assumptions and systems used in PRG derivation, the ER Risk Assessment Program will be able to control the level of quality assurance associated with these risk-based guideline values.
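For orientation only, risk-based PRGs of this kind are generally back-calculated from a target risk level and a set of exposure assumptions. The generic carcinogenic form below is a simplification shown to illustrate the idea, not the equations or parameter values adopted in the ER guidance itself:

    \mathrm{Risk} = \mathrm{CDI} \times SF, \qquad \mathrm{CDI} = C \times \mathrm{IF}
    \;\;\Rightarrow\;\; \mathrm{PRG} = \frac{TR}{SF \times \mathrm{IF}},

where TR is the target excess cancer risk (for example 10^{-6}), SF the chemical's slope factor, C the medium concentration, and IF the chemical intake per unit medium concentration implied by the exposure assumptions.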
Genetics-based control of a mimo boiler-turbine plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dimeo, R.M.; Lee, K.Y.
1994-12-31
A genetic algorithm is used to develop an optimal controller for a non-linear, multi-input/multi-output boiler-turbine plant. The algorithm is used to train a control system for the plant over a wide operating range in an effort to obtain better performance. The results of the genetic algorithm's controller are compared with those of a controller designed from the linearized plant model at a nominal operating point. Because the genetic algorithm is well suited to solving traditionally difficult optimization problems, it is found that the algorithm is capable of developing the controller based on input/output information only. This controller achieves a performance comparable to the standard linear quadratic regulator.
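As an illustration of the general approach (not the authors' implementation), a genetic algorithm can tune controller gains directly from simulated input/output behavior by treating a quadratic tracking cost as the fitness function. The plant model, gain structure, and GA settings below are all assumptions.

    import random

    def simulate_cost(gains, steps=200):
        """Quadratic tracking cost for a toy first-order plant under proportional control."""
        kp = gains[0]
        x, cost, setpoint = 0.0, 0.0, 1.0
        for _ in range(steps):
            u = kp * (setpoint - x)
            x = 0.9 * x + 0.1 * u          # assumed linear plant dynamics
            cost += (setpoint - x) ** 2 + 0.01 * u ** 2
        return cost

    def genetic_search(pop_size=30, generations=50):
        population = [[random.uniform(0.0, 10.0)] for _ in range(pop_size)]
        for _ in range(generations):
            population.sort(key=simulate_cost)           # rank by fitness (lower cost is better)
            survivors = population[: pop_size // 2]
            children = []
            while len(survivors) + len(children) < pop_size:
                a, b = random.sample(survivors, 2)
                child = [(a[0] + b[0]) / 2 + random.gauss(0.0, 0.2)]  # crossover plus mutation
                children.append(child)
            population = survivors + children
        return min(population, key=simulate_cost)

    print("best gain:", genetic_search())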
Preliminary Recommendations for the Collection, Storage, and Analysis of UAS Safety Data
NASA Technical Reports Server (NTRS)
Enomoto, Francis; Bushnell, David; Denney, Ewen; Pai, Ganesh; Schumann, Johann
2013-01-01
Although the use of UASs in military and public service operations is proliferating, civilian use of UASs remains limited in the United States today. With efforts underway to accommodate and integrate UASs into the NAS, a proactive understanding of safety issues, i.e., the unique hazards and the corresponding risks that UASs pose not only through their operations for commercial purposes, but also to existing operations in the NAS, is especially important so as to (a) support the development of a sound regulatory basis, (b) regulate, design and properly equip UASs, and (c) effectively mitigate the risks posed. Data, especially about system and component failures, incidents, and accidents, provides valuable insight into how performance and operational capabilities/limitations contribute to hazards. Since the majority of UAS operations today take place in a context that is significantly different from the norm in civil aviation, i.e., with different operational goals and standards, identifying that which constitutes useful and sufficient data on UASs and their operations is a substantial research challenge.
Human Factors Guidelines for UAS in the National Airspace System
NASA Technical Reports Server (NTRS)
Hobbs, Alan; Shively, R. Jay
2013-01-01
The ground control stations (GCS) of some UAS have been characterized by less-than-adequate human-system interfaces. In some cases this may reflect a failure to apply an existing regulation or human factors standard. In other cases, the problem may indicate a lack of suitable guidance material. NASA is leading a community effort to develop recommendations for human factors guidelines for GCS to support routine beyond-line-of-sight UAS operations in the national airspace system (NAS). In contrast to regulations, guidelines are not mandatory requirements. However, by encapsulating solutions to identified problems or areas of risk, guidelines can provide assistance to system developers, users and regulatory agencies. To be effective, guidelines must be relevant to a wide range of systems, must not be overly prescriptive, and must not impose premature standardization on evolving technologies. By assuming that a pilot will be responsible for each UAS operating in the NAS, and that the aircraft will be required to operate in a manner comparable to conventionally piloted aircraft, it is possible to identify a generic set of pilot tasks and the information, control and communication requirements needed to support these tasks. Areas where guidelines will be useful can then be identified, utilizing information from simulations, operational experience and the human factors literature. In developing guidelines, we recognize that existing regulatory and guidance material will, at times, provide adequate coverage of an area. In other cases suitable guidelines may be found in existing military or industry human factors standards. In cases where appropriate existing standards cannot be identified, original guidelines will be proposed.
MUNICIPAL WASTE COMBUSTION ASSESSMENT ...
The report defines and characterizes types of medical waste, discusses the impacts of burning medical waste on combustor emissions, and outlines important handling and operating considerations. Facility-specific design, handling, and operating practices are also discussed for municipal waste combustors (MWCs) that reportedly accept medical waste in the U.S., Europe, and Canada. Only very limited data are available on the emission impacts associated with the combustion of medical waste in MWCs. Especially lacking is information needed to fully evaluate the impacts on acid gas, dioxin, and metals emissions, as well as the design and operating requirements for complete destruction of solvents, cytotoxic chemicals, and pathogens. The EPA's Office of Air Quality Planning and Standards is developing emission standards and guidelines for new and existing MWCs under Sections 111(b) and 111(d) of the Clean Air Act. In support of these regulatory development efforts, the Air and Energy Engineering Research Laboratory in EPA's Office of Research and Development has conducted an assessment to examine the incineration of medical waste in MWCs from an emission standpoint. Potential worker safety and health problems associated with handling of medical wastes and residues were also identified.
Palazzetti, A; Sanchez-Salas, R; Capogrosso, P; Barret, E; Cathala, N; Mombet, A; Prapotnich, D; Galiano, M; Rozet, F; Cathelineau, X
2017-09-01
Radical cystectomy and regional lymph node dissection is the standard treatment for localized muscle-invasive and for high-risk non-muscle-invasive bladder cancer, and represents one of the main surgical urologic procedures. The open surgical approach is still widely adopted, even if in the last two decades efforts have been made to evaluate whether minimally invasive procedures, either laparoscopic or robot-assisted, might show a benefit compared to the standard technique. Open radical cystectomy is associated with a high complication rate, but data from the laparoscopic and robotic surgical series failed to demonstrate a clear reduction in post-operative complication rates compared to the open surgical series. Laparoscopic and robotic radical cystectomy show a reduction in blood loss, in-hospital stay and transfusion rates but a longer operative time, while open radical cystectomy is typically associated with a shorter operative time but with a longer in-hospital admission and possibly a higher rate of high-grade complications. Copyright © 2016. Published by Elsevier España, S.L.U.
NASA Astrophysics Data System (ADS)
Le Bras, R. J.; Arora, N. S.; Kushida, N.; Kebede, F.; Feitio, P.; Tomuta, E.
2017-12-01
The International Monitoring System of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has reached out to the broader scientific community through a series of conferences, the latest of which took place in June 2017 in Vienna, Austria. Stemming from this outreach effort, and following the inception of research and development efforts in 2009, the NET-VISA software, based on a Bayesian modelling approach, has been developed to improve on the key step of automatic association of joint seismic, hydro-acoustic, and infrasound detections. When compared with the current operational system, it has been consistently shown in off-line tests to improve the overlap with the analyst-reviewed Reviewed Event Bulletin (REB) by ten percent, for an average of 85% overlap, while the inconsistency rate is essentially the same at about 50%. Testing by analysts in realistic conditions on a few days of data has also demonstrated the software's performance in finding additional events which qualify for publication in the REB. Starting in August 2017, the automatic events produced by the software will be reviewed by analysts at the CTBTO, and we report on the initial evaluation of this introduction into operations.
NASA Astrophysics Data System (ADS)
Friedman, Gary; Schwuttke, Ursula M.; Burliegh, Scott; Chow, Sanguan; Parlier, Randy; Lee, Lorrine; Castro, Henry; Gersbach, Jim
1993-03-01
In the early days of JPL's solar system exploration, each spacecraft mission required its own dedicated data system with all software applications written in the mainframe's native assembly language. Although these early telemetry processing systems were a triumph of engineering in their day, since that time the computer industry has advanced to the point where it is now advantageous to replace these systems with more modern technology. The Space Flight Operations Center (SFOC) Prototype group was established in 1985 as a workstation and software laboratory. The charter of the lab was to determine if it was possible to construct a multimission telemetry processing system using commercial, off-the-shelf computers that communicated via networks. The staff of the lab mirrored that of a typical skunk works operation -- a small, multi-disciplinary team with a great deal of autonomy that could get complex tasks done quickly. In an effort to determine which approaches would be useful, the prototype group experimented with all types of operating systems, inter-process communication mechanisms, network protocols, packet size parameters. Out of that pioneering work came the confidence that a multi-mission telemetry processing system could be built using high-level languages running in a heterogeneous, networked workstation environment. Experience revealed that the operating systems on all nodes should be similar (i.e., all VMS or all PC-DOS or all UNIX), and that a unique Data Transport Subsystem tool needed to be built to address the incompatibilities of network standards, byte ordering, and socket buffering. The advantages of building a telemetry processing system based on emerging industry standards were numerous: by employing these standards, we would no longer be locked into a single vendor. When new technology came to market which offered ten times the performance at one eighth the cost, it would be possible to attach the new machine to the network, re-compile the application code, and run. In addition, we would no longer be plagued with lack of manufacturer support when we encountered obscure bugs. And maybe, hopefully, the eternal elusive goal of software portability across different vendors' platforms would finally be available. Some highlights of our prototyping efforts are described.
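One concrete incompatibility the Data Transport Subsystem had to absorb is byte ordering between heterogeneous hosts. A common remedy, sketched below with a hypothetical packet layout (the field names and widths are assumptions), is to serialize every field in an agreed network byte order so that each node converts only at the boundary.

    import struct

    # Hypothetical telemetry sample: 16-bit APID, 32-bit timestamp, 32-bit float value.
    # "!" forces big-endian (network) byte order regardless of the host CPU.
    PACKET_FORMAT = "!HIf"

    def pack_sample(apid, timestamp, value):
        return struct.pack(PACKET_FORMAT, apid, timestamp, value)

    def unpack_sample(buffer):
        return struct.unpack(PACKET_FORMAT, buffer)

    wire = pack_sample(0x1AB, 760000000, 3.14)
    print(unpack_sample(wire))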
1983-03-01
USAAVRADCOM-TR-82-D-37, COMBAT MAINTENANCE CONCEPTS AND REPAIR TECHNIQUES USING SHAPE MEMORY...APPLIED TECHNOLOGY LABORATORY POSITION STATEMENT The results of this effort determined the feasibility of using the full-ring shape memory alloy...
Software Development Projects: Estimation of Cost and Effort (A Manager’s Digest).
1982-12-01
it later when changes need to be made to it. Various levels of difficulty are experienced due to the skill level of the programmer, poor program...impurities that if eliminated reduce the level of complexity of the program. They are as follows: 1. Complementary Operations: unreduced expressions 2...greater quality than that to support standard business applications. Remus defines quality as "...the number of program defects normalized by size over
2014-12-01
along the impermeable frozen soil layer. Soil freeze–thaw action disrupts soil structures, displaces soils particles , and creates voids both in...discharging along the bank face). This was caused by excess pore water pressure. Soil piping removes soil particles from their in-situ position, leaving...soil particles ERDC/CRREL SR-14-3 30 c. Maintain original side and bed slopes during the clearing efforts. d. Shape the channels to minimize
Martin, Thomas E.; Riordan, Margaret M.; Repin, Rimi; Mouton, James C.; Blake, William M.
2017-01-01
Aim: Adult survival is central to theories explaining latitudinal gradients in life history strategies. Life history theory predicts higher adult survival in tropical than north temperate regions given lower fecundity and parental effort. Early studies were consistent with this prediction, but standard-effort netting studies in recent decades suggested that apparent survival rates in temperate and tropical regions strongly overlap. Such results do not fit with life history theory. Targeted marking and resighting of breeding adults yielded higher survival estimates in the tropics, but this approach is thought to overestimate survival because it does not sample social and age classes with lower survival. We compared the effect of field methods on tropical survival estimates and their relationships with life history traits. Location: Sabah, Malaysian Borneo. Time period: 2008–2016. Major taxon: Passeriformes. Methods: We used standard-effort netting and resighted individuals of all social and age classes of 18 tropical songbird species over 8 years. We compared apparent survival estimates between these two field methods with differing analytical approaches. Results: Estimated detection and apparent survival probabilities from standard-effort netting were similar to those from other tropical studies that used standard-effort netting. Resighting data verified that a high proportion of individuals that were never recaptured in standard-effort netting remained in the study area, and many were observed breeding. Across all analytical approaches, addition of resighting yielded substantially higher survival estimates than did standard-effort netting alone. These apparent survival estimates were higher than for temperate zone species, consistent with latitudinal differences in life histories. Moreover, apparent survival estimates from addition of resighting, but not from standard-effort netting alone, were correlated with parental effort as measured by egg temperature across species. Main conclusions: Inclusion of resighting showed that standard-effort netting alone can negatively bias apparent survival estimates and obscure life history relationships across latitudes and among tropical species.
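The core of the methodological comparison is that a bird never recaptured in the nets may still be known alive from resightings. As a simplified, hypothetical illustration (not the authors' mark-recapture models), the sketch below shows how merging resighting records into the encounter histories raises a naive year-to-year return rate; the encounter data are invented.

    def return_rate(encounters):
        """Fraction of individuals marked in year 1 that were detected again in a later year.

        encounters: dict mapping bird ID to a set of years in which it was detected.
        This naive return rate is a lower bound on true survival because it
        confounds mortality with non-detection.
        """
        marked = [bird for bird, years in encounters.items() if 1 in years]
        returned = [bird for bird in marked if any(y > 1 for y in encounters[bird])]
        return len(returned) / len(marked)

    netting_only = {"A": {1}, "B": {1, 2}, "C": {1}, "D": {1, 3}}
    with_resighting = {"A": {1, 2}, "B": {1, 2}, "C": {1}, "D": {1, 3}}  # A resighted but never re-netted
    print(return_rate(netting_only), return_rate(with_resighting))       # 0.5 vs 0.75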
ASTM and ASME-BPE Standards--Complying with the Needs of the Pharmaceutical Industry.
Huitt, William M
2011-01-01
Designing and building a pharmaceutical facility requires the owner, engineer of record, and constructor to be knowledgeable with regard to the industry codes and standards that apply to this effort. Up until 1997 there were no industry standards directed at the needs and requirements of the pharmaceutical industry. Prior to that time it was a patchwork effort at resourcing and adopting nonpharmaceutical-related codes and standards and then modifying them in order to meet the more stringent requirements of the Food and Drug Administration (FDA). In 1997 the American Society of Mechanical Engineers (ASME) published the first Bioprocessing Equipment (BPE) Standard. Through harmonization efforts this relatively new standard has brought together, scrutinized, and refined industry-accepted methodologies together with FDA compliance requirements, and has established an American National Standard that provides a comprehensive set of standards that are integral to the pharmaceutical industry. This article describes various American National Standards, including those developed and published by the American Society for Testing and Materials (ASTM), and how they apply to the pharmaceutical industry. It goes on to discuss the harmonization effort that takes place between the various standards developers in an attempt to prevent conflicts and omissions between the many standards. Also included are examples of tables and figures taken from the ASME-BPE Standard; these examples provide the reader with insight into the relevant content of that standard. Throughout the Standard's initial development and ongoing maintenance, ASME works with other American National Standards developers to harmonize the many standards associated with the design, engineering, and construction of bioprocessing facilities. This harmonization effort has established a comprehensive set of standards for the betterment of the pharmaceutical industry at large, and it is, and will remain, very important as technology, along with new and improved products and processes, evolves into the future.
NREL-Led Effort Results in Groundbreaking New ASTM High-Octane Fuel
April 7, 2017. ASTM International recently announced the release of a new high-octane fuel standard.
ERIC Educational Resources Information Center
Fallan, Lars; Opstad, Leiv
2012-01-01
The purpose of this paper is to explore how gender and personality preferences affect student attitudes towards effort response to higher grading standards. Data collected from 150 economics and business students at a Scandinavian business school reveals that higher grading standards enhance effort and time devoted to learning to a higher degree…
NASA Technical Reports Server (NTRS)
Stensrud, Kjell C.; Hamm, Dustin
2007-01-01
NASA's Johnson Space Center (JSC) / Flight Design and Dynamics Division (DM) has prototyped the use of Open Source middleware technology for building its next generation spacecraft mission support system. This is part of a larger initiative to use open standards and open source software as building blocks for future mission and safety critical systems. JSC is hoping to leverage standardized enterprise architectures, such as Java EE, so that its internal software development efforts can be focused on the core aspects of their problem domain. This presentation will outline the design and implementation of the Trajectory system and the lessons learned during the exercise.
The NASA Mission Operations and Control Architecture Program
NASA Technical Reports Server (NTRS)
Ondrus, Paul J.; Carper, Richard D.; Jeffries, Alan J.
1994-01-01
The conflict between increases in space mission complexity and rapidly declining space mission budgets has created strong pressures to radically reduce the costs of designing and operating spacecraft. A key approach to achieving such reductions is through reducing the development and operations costs of the supporting mission operations systems. One of the efforts which the Communications and Data Systems Division at NASA Headquarters is using to meet this challenge is the Mission Operations Control Architecture (MOCA) project. Technical direction of this effort has been delegated to the Mission Operations Division (MOD) of the Goddard Space Flight Center (GSFC). MOCA is to develop a mission control and data acquisition architecture, and supporting standards, to guide the development of future spacecraft and mission control facilities at GSFC. The architecture will reduce the need for around-the-clock operations staffing, obtain a high level of reuse of flight and ground software elements from mission to mission, and increase overall system flexibility by enabling the migration of appropriate functions from the ground to the spacecraft. The end results are to be an established way of designing the spacecraft-ground system interface for GSFC's in-house developed spacecraft, and a specification of the end to end spacecraft control process, including data structures, interfaces, and protocols, suitable for inclusion in solicitation documents for future flight spacecraft. A flight software kernel may be developed and maintained in a condition that it can be offered as Government Furnished Equipment in solicitations. This paper describes the MOCA project, its current status, and the results to date.
Improved high operating temperature MCT MWIR modules
NASA Astrophysics Data System (ADS)
Lutz, H.; Breiter, R.; Figgemeier, H.; Schallenberg, T.; Schirmacher, W.; Wollrab, R.
2014-06-01
High operating temperature (HOT) IR-detectors are a key factor in size, weight and power (SWaP) reduced IR-systems. Such systems are essential to provide infantrymen with low-weight handheld systems with increased battery lifetimes, or most compact clip-on weapon sights, in combination with the high electro-optical performance offered by cooled IR-technology. AIM's MCT standard n-on-p technology with vacancy doping has been optimized over many years, resulting in MWIR-detectors with excellent electro-optical performance up to operating temperatures of ~120K. In the last years the effort has been intensified to improve this standard technology by introducing extrinsic doping with gold as an acceptor. As a consequence the dark current could be considerably suppressed, allowing for operation at ~140K with good electro-optical performance. More detailed investigations showed that the limitation for operation above 140K is explained by the consequences of rising dark current rather than by the defective pixel level. Recently, several crucial parameters were identified showing great promise for further optimization of HOT performance. Among those, the p-type concentration could successfully be reduced from the mid-10^16 cm^-3 to the low-10^15 cm^-3 range. Since AIM is one of the leading manufacturers of split linear cryocoolers, an increase in operating temperature will directly lead to IR-modules with improved SWaP characteristics by making use of the miniature members of its SX cooler family with single piston and balancer technology. The paper will present recent progress in the development of HOT MWIR-detector arrays at AIM and show electro-optical performance data in comparison to focal plane arrays produced in the standard technology.
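For context, the strong payoff from suppressing dark current at higher operating temperatures follows from the diffusion-limited dark current scaling with the square of the intrinsic carrier concentration. The generic relation below is standard semiconductor physics, not AIM's device model:

    I_{\mathrm{dark,\,diff}} \propto n_i^{2} \propto T^{3} \exp\!\left(-\frac{E_g}{k_B T}\right),

so a modest drop in dark current at a given temperature translates directly into a higher temperature at which the same dark-current level, and hence comparable electro-optical performance, is reached.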
Standards Development Activities at White Sands Test Facility
NASA Technical Reports Server (NTRS)
Baker, D. L.; Beeson, H. D.; Saulsberry, R. L.; Julien, H. L.; Woods, S. S.
2003-01-01
The development of standards and standard activities at the JSC White Sands Test Facility (WSTF) has been expanded to include the transfer of technology and standards to voluntary consensus organizations in five technical areas of importance to NASA. This effort is in direct response to the National Technology Transfer Act designed to accelerate transfer of technology to industry and promote government-industry partnerships. Technology transfer is especially important for WSTF, whose longterm mission has been to develop and provide vital propellant safety and hazards information to aerospace designers, operations personnel, and safety personnel. Meeting this mission is being accomplished through the preparation of consensus guidelines and standards, propellant hazards analysis protocols, and safety courses for the propellant use of hydrogen, oxygen, and hypergols, as well as the design and inspection of spacecraft pressure vessels and the use of pyrovalves in spacecraft propulsion systems. The overall WSTF technology transfer program is described and the current status of technology transfer activities are summarized.
Cost effectiveness of the stream-gaging program in Nevada
Arteaga, F.E.
1990-01-01
The stream-gaging network in Nevada was evaluated as part of a nationwide effort by the U.S. Geological Survey to define and document the most cost-effective means of furnishing streamflow information. Specifically, the study dealt with 79 streamflow gages and 2 canal-flow gages that were under the direct operation of Nevada personnel as of 1983. Cost-effective allocations of resources, including budget and operational criteria, were studied using statistical procedures known as Kalman-filtering techniques. The possibility of developing streamflow data at ungaged sites was evaluated using flow-routing and statistical regression analyses. Neither of these methods provided sufficiently accurate results to warrant their use in place of stream gaging. The 81 gaging stations were being operated in 1983 with a budget of $465,500. As a result of this study, all existing stations were determined to be necessary components of the program for the foreseeable future. At the 1983 funding level, the average standard error of streamflow records was nearly 28%. This same overall level of accuracy could have been maintained with a budget of approximately $445,000 if the funds were redistributed more equitably among the gages. The maximum budget analyzed, $1,164,000, would have resulted in an average standard error of 11%. The study indicates that a major source of error is lost data. If perfectly operating equipment were available, the standard error for the 1983 program and budget could have been reduced to 21%. (Thacker-USGS, WRD)
Operator interface for vehicles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bissontz, Jay E
2015-03-10
A control interface for drivetrain braking provided by a regenerative brake and a non-regenerative brake is implemented using a combination of switches and graphic interface elements. The control interface comprises a control system for allocating drivetrain braking effort between the regenerative brake and the non-regenerative brake, a first operator actuated control for enabling operation of the drivetrain braking, and a second operator actuated control for selecting a target braking effort for drivetrain braking. A graphic display displays to an operator the selected target braking effort and can be used to further display actual braking effort achieved by drivetrain braking.
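A minimal sketch of the allocation logic such an interface sits on top of is shown below: the operator-selected target braking effort is met first from the regenerative brake up to its current capability, with the remainder sent to the friction (non-regenerative) brake. The limits, units, and example values are assumptions, not figures from the patent.

    def allocate_braking(target_effort, regen_capability):
        """Split a requested braking effort between regenerative and friction brakes.

        target_effort    : operator-selected braking effort (e.g., kW or percent)
        regen_capability : maximum effort the regenerative brake can absorb right now
                           (depends on vehicle speed and battery state of charge)
        """
        regen = min(target_effort, regen_capability)
        friction = target_effort - regen
        return regen, friction

    print(allocate_braking(80.0, 50.0))   # -> (50.0, 30.0)
    print(allocate_braking(30.0, 50.0))   # -> (30.0, 0.0)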
Aircraft LTO emissions regulations and implementations at European airports
NASA Astrophysics Data System (ADS)
Yunos, Siti Nur Mariani Mohd; Ghafir, Mohammad Fahmi Abdul; Wahab, Abas Ab
2017-04-01
Aviation affects the environment via the emission of pollutants from aircraft, impacting human health and ecosystems. The impacts of aircraft operations near ground level on local air quality have been recognized. Consequently, various standards and regulations have been introduced to address the related emissions. This paper discusses these environmental regulations, focusing on the implementation of LTO emission charges, an incentive-based measure introduced in Europe as an effort to fill the gap in addressing the environmental issues related to aviation.
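Emission-related landing charges of this kind are typically computed from an engine's certified emissions over the standard LTO cycle multiplied by a unit rate set by the airport; the figures below are invented purely to show the arithmetic and do not reflect any specific airport's scheme.

    def lto_emission_charge(nox_kg_per_lto, unit_rate_eur_per_kg, movements):
        """Charge = certified NOx mass per LTO cycle x unit rate x number of LTO cycles."""
        return nox_kg_per_lto * unit_rate_eur_per_kg * movements

    # Hypothetical example: 12 kg NOx per LTO, 3 EUR/kg, 400 LTO cycles in a month
    print(lto_emission_charge(12.0, 3.0, 400))   # 14400 EUR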
EPA/NASA/USAF Depainting Effort Concludes
NASA Technical Reports Server (NTRS)
Finckenor, Miria M.; Clark-Ingram, Marceia
2000-01-01
The final report contains strip rate data from all of the methods, lessons learned during processing, metallurgical evaluations of the panels, and summaries of corrosion and hydrogen embrittlement studies. Any changes in surface roughness, fatigue and tensile properties, and crack detectability are noted in the report. No process was singled out above the others, as companies should consider equipment and operational costs when complying with the Aerospace NESHAP (National Emission Standards for Hazardous Air Pollutants) and new OSHA (Occupational Safety and Health Administration) regulations.
CrossTalk, The Journal of Defense Software Engineering. Volume 26, Number 2. March/April 2013
2013-04-01
standards and best practices. "Software and hardware are at risk of being tampered with even before they are linked together in an operational system...because of their role in national and global security and the variety of valuable lessons learned and best practices they can provide because they are...Management. GAO said DoD's efforts to implement SCRM can be a learning tool for others in the Federal government. DoD is currently imple...
Commercialization and Standardization Progress Towards an Optical Communications Earth Relay
NASA Technical Reports Server (NTRS)
Edwards, Bernard L.; Israel, David J.
2015-01-01
NASA is planning to launch the next generation of a space based Earth relay in 2025 to join the current Space Network, consisting of Tracking and Data Relay Satellites in space and the corresponding infrastructure on Earth. While the requirements and architecture for that relay satellite are unknown at this time, NASA is investing in communications technologies that could be deployed to provide new communications services. One of those new technologies is optical communications. The Laser Communications Relay Demonstration (LCRD) project, scheduled for launch in 2018 as a hosted payload on a commercial communications satellite, is a critical pathfinder towards NASA providing optical communications services on the next generation space based relay. This paper will describe NASA efforts in the on-going commercialization of optical communications and the development of inter-operability standards. Both are seen as critical to making optical communications a reality on future NASA science and exploration missions. Commercialization is important because NASA would like to eventually be able to simply purchase an entire optical communications terminal from a commercial provider. Inter-operability standards are needed to ensure that optical communications terminals developed by one vendor are compatible with the terminals of another. International standards in optical communications would also allow the space missions of one nation to use the infrastructure of another.
NASA Astrophysics Data System (ADS)
Tallapragada, V.
2017-12-01
NOAA's Next Generation Global Prediction System (NGGPS) has provided the unique opportunity to develop and implement a non-hydrostatic global model based on Geophysical Fluid Dynamics Laboratory (GFDL) Finite Volume Cubed Sphere (FV3) Dynamic Core at National Centers for Environmental Prediction (NCEP), making a leap-step advancement in seamless prediction capabilities across all spatial and temporal scales. Model development efforts are centralized with unified model development in the NOAA Environmental Modeling System (NEMS) infrastructure based on Earth System Modeling Framework (ESMF). A more sophisticated coupling among various earth system components is being enabled within NEMS following National Unified Operational Prediction Capability (NUOPC) standards. The eventual goal of unifying global and regional models will enable operational global models operating at convective resolving scales. Apart from the advanced non-hydrostatic dynamic core and coupling to various earth system components, advanced physics and data assimilation techniques are essential for improved forecast skill. NGGPS is spearheading ambitious physics and data assimilation strategies, concentrating on creation of a Common Community Physics Package (CCPP) and Joint Effort for Data Assimilation Integration (JEDI). Both initiatives are expected to be community developed, with emphasis on research transitioning to operations (R2O). The unified modeling system is being built to support the needs of both operations and research. Different layers of community partners are also established with specific roles/responsibilities for researchers, core development partners, trusted super-users, and operations. Stakeholders are engaged at all stages to help drive the direction of development, resources allocations and prioritization. This talk presents the current and future plans of unified model development at NCEP for weather, sub-seasonal, and seasonal climate prediction applications with special emphasis on implementation of NCEP FV3 Global Forecast System (GFS) and Global Ensemble Forecast System (GEFS) into operations by 2019.
The Virtual Astronomical Observatory: Re-engineering access to astronomical data
NASA Astrophysics Data System (ADS)
Hanisch, R. J.; Berriman, G. B.; Lazio, T. J. W.; Emery Bunn, S.; Evans, J.; McGlynn, T. A.; Plante, R.
2015-06-01
The US Virtual Astronomical Observatory was a software infrastructure and development project designed both to begin the establishment of an operational Virtual Observatory (VO) and to provide the US coordination with the international VO effort. The concept of the VO is to provide the means by which an astronomer is able to discover, access, and process data seamlessly, regardless of its physical location. This paper describes the origins of the VAO, including the predecessor efforts within the US National Virtual Observatory, and summarizes its main accomplishments. These accomplishments include the development of both scripting toolkits that allow scientists to incorporate VO data directly into their reduction and analysis environments and high-level science applications for data discovery, integration, analysis, and catalog cross-comparison. Working with the international community, and based on the experience from the software development, the VAO was a major contributor to international standards within the International Virtual Observatory Alliance. The VAO also demonstrated how an operational virtual observatory could be deployed, providing a robust operational environment in which VO services worldwide were routinely checked for aliveness and compliance with international standards. Finally, the VAO engaged in community outreach, developing a comprehensive web site with on-line tutorials, announcements, links to both US and internationally developed tools and services, and exhibits and hands-on training at annual meetings of the American Astronomical Society and through summer schools and community days. All digital products of the VAO Project, including software, documentation, and tutorials, are stored in a repository for community access. The enduring legacy of the VAO is an increasing expectation that new telescopes and facilities incorporate VO capabilities during the design of their data management systems.
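The operational monitoring described, in which VO services worldwide are routinely checked for aliveness and compliance with international standards, amounts to periodically issuing a standard query and validating the response. The sketch below shows the general pattern only, with a placeholder service URL and a basic HTTP/content check rather than a full IVOA validator.

    import urllib.request

    def check_service_alive(base_url, timeout=15):
        """Issue a minimal cone-search-style request and report basic service health."""
        query = base_url + "?RA=180.0&DEC=0.0&SR=0.1"   # placeholder query parameters
        try:
            with urllib.request.urlopen(query, timeout=timeout) as response:
                body = response.read(2048).decode("utf-8", errors="replace")
                return response.status == 200 and "VOTABLE" in body.upper()
        except OSError:
            return False

    print(check_service_alive("https://example.org/vo/conesearch"))  # placeholder URL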
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kesner, A; Poli, G; Beykan, S
Purpose: As the field of Nuclear Medicine moves forward with efforts to integrate radiation dosimetry into clinical practice we can identify the challenge posed by the lack of standardized dose calculation methods and protocols. All personalized internal dosimetry is derived by projecting biodistribution measurements into dosimetry calculations. In an effort to standardize organization of data and its reporting, we have developed, as a sequel to the EANM recommendation of "Good Dosimetry Reporting", a freely available biodistribution template, which can be used to create a common point of reference for dosimetry data. It can be disseminated, interpreted, and used for method development widely across the field. Methods: A generalized biodistribution template was built in a comma-delimited format (.csv) to be completed by users performing biodistribution measurements. The template is available for free download. The download site includes instructions and other usage details on the template. Results: This is a new resource developed for the community. It is our hope that users will consider integrating it into their dosimetry operations. Having biodistribution data available and easily accessible for all patients processed is a strategy for organizing large amounts of information. It may enable users to create their own databases that can be analyzed for multiple aspects of dosimetry operations. Furthermore, it enables population data to easily be reprocessed using different dosimetry methodologies. With respect to dosimetry-related research and publications, the biodistribution template can be included as supplementary material, and will allow others in the community to better compare calculations and results achieved. Conclusion: As dosimetry in nuclear medicine becomes more routinely applied in clinical applications, we, as a field, need to develop the infrastructure for handling large amounts of data. Our organ-level biodistribution template can be used as a standard format for data collection and organization, as well as for dosimetry research and software development.
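Because the template is a plain comma-delimited file, downstream tools can parse it with standard CSV handling. The column names in this sketch are invented for illustration and do not reproduce the actual template fields.

    import csv

    def load_biodistribution(path):
        """Read a biodistribution table of per-organ activity measurements over time.

        Assumed (hypothetical) columns: organ, time_h, activity_MBq.
        Returns {organ: [(time_h, activity_MBq), ...]} sorted by time.
        """
        series = {}
        with open(path, newline="") as handle:
            for row in csv.DictReader(handle):
                series.setdefault(row["organ"], []).append(
                    (float(row["time_h"]), float(row["activity_MBq"]))
                )
        for measurements in series.values():
            measurements.sort()
        return series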
Witt, Adam; Magee, Timothy; Stewart, Kevin; ...
2017-08-10
Managing energy, water, and environmental priorities and constraints within a cascade hydropower system is a challenging multiobjective optimization effort that requires advanced modeling and forecasting tools. Within the mid-Columbia River system, there is currently a lack of specific solutions for predicting how coordinated operational decisions can mitigate the impacts of total dissolved gas (TDG) supersaturation while satisfying multiple additional policy and hydropower generation objectives. In this study, a reduced-order TDG uptake equation is developed that predicts tailrace TDG at seven hydropower facilities on the mid-Columbia River. The equation is incorporated into a general multiobjective river, reservoir, and hydropower optimization tool as a prioritized operating goal within a broader set of system-level objectives and constraints. A test case is presented to assess the response of TDG and hydropower generation when TDG supersaturation is optimized to remain under state water-quality standards. Satisfaction of TDG as an operating goal is highly dependent on whether constraints that limit TDG uptake are implemented at a higher priority than generation requests. According to the model, an opportunity exists to reduce TDG supersaturation and meet hydropower generation requirements by shifting spillway flows to different time periods. In conclusion, a coordinated effort between all project owners is required to implement systemwide optimized solutions that satisfy the operating policies of all stakeholders.
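Reduced-order TDG uptake relations of the general kind described are usually driven by the share of flow passed over the spillway. The linear form and coefficients below are placeholders to show how such an equation plugs into an operations screening check, not the equation developed in this study.

    def tailrace_tdg(forebay_tdg_pct, spill_fraction, coeff_a=5.0, coeff_b=15.0):
        """Placeholder reduced-order estimate of tailrace TDG saturation (percent).

        forebay_tdg_pct : incoming TDG saturation upstream of the dam
        spill_fraction  : spillway discharge as a fraction of total project outflow
        coeff_a/b       : site-specific fitted coefficients (values here are made up)
        """
        return forebay_tdg_pct + coeff_a + coeff_b * spill_fraction

    # Check a candidate operation against a 110% state water-quality standard
    tdg = tailrace_tdg(forebay_tdg_pct=102.0, spill_fraction=0.3)
    print(tdg, tdg <= 110.0)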
Sensor Open System Architecture (SOSA) evolution for collaborative standards development
NASA Astrophysics Data System (ADS)
Collier, Charles Patrick; Lipkin, Ilya; Davidson, Steven A.; Baldwin, Rusty; Orlovsky, Michael C.; Ibrahim, Tim
2017-04-01
The Sensor Open System Architecture (SOSA) is a C4ISR-focused technical and economic collaborative effort between the Air Force, Navy, Army, the Department of Defense (DoD), Industry, and other Governmental agencies to develop (and incorporate) a technical Open Systems Architecture standard in order to maximize C4ISR sub-system, system, and platform affordability, re-configurability, and hardware/software/firmware re-use. The SOSA effort will effectively create an operational and technical framework for the integration of disparate payloads into C4ISR systems, with a focus on the development of a modular decomposition (defining functions and behaviors) and associated key interfaces (physical and logical) for a common multi-purpose architecture for radar, EO/IR, SIGINT, EW, and communications. SOSA addresses hardware, software, and mechanical/electrical interfaces. The modular decomposition will produce a set of re-usable components, interfaces, and sub-systems that engender reusable capabilities. This, in effect, creates a realistic and affordable ecosystem enabling mission effectiveness through systematic re-use of all available re-composed hardware, software, and electrical/mechanical base components and interfaces. To this end, SOSA will leverage existing standards as much as possible and evolve the SOSA architecture through modification, reuse, and enhancements to achieve C4ISR goals. This paper will present accomplishments over the first year of the SOSA initiative.
Process' standardization and change management in higher education. The case of TEI of Athens
NASA Astrophysics Data System (ADS)
Chalaris, Ioannis; Chalaris, Manolis; Gritzalis, Stefanos; Belsis, Petros
2015-02-01
Establishing mature operational procedures, and standardizing and certifying those procedures, is a particularly arduous and demanding task. It requires strong management commitment to the stated objectives, administrative stability and continuity, availability of resources, an adequate implementation team with support from all stakeholders, and of course great tolerance until tangible results of the investment are shown. Ensuring these conditions, particularly in times of economic crisis, is extremely difficult for large organizations such as TEI of Athens, where heterogeneity in personnel and changes in the administrative hierarchy give rise to a plethora of additional difficulties and require effective change management. In this work we trace the path of standardization and certification of the administrative functions of TEI of Athens, with emphasis on the difficulties encountered and how they were addressed, in particular issues of change management and the culture surrounding this effort. The infrastructure needed to sustain these processes, together with tools for process and strategic management, is incorporated so that mechanisms can evolve for continuous process improvement and for storing and recovering the resulting knowledge. The work concludes with a general design of a roadmap for internal audit and continuous improvement processes in a large institution of higher education.
A systems approach to accident causation in mining: an application of the HFACS method.
Lenné, Michael G; Salmon, Paul M; Liu, Charles C; Trotter, Margaret
2012-09-01
This project aimed to provide a greater understanding of the systemic factors involved in mining accidents, and to examine those organisational and supervisory failures that are predictive of sub-standard performance at operator level. A sample of 263 significant mining incidents in Australia across 2007-2008 was analysed using the Human Factors Analysis and Classification System (HFACS). Two human factors specialists independently undertook the analysis. Incidents occurred more frequently in operations concerning the use of surface mobile equipment (38%) and working at heights (21%); however, injury was more frequently associated with electrical operations and vehicles and machinery. Several HFACS categories appeared frequently: skill-based errors (64%) and violations (57%), issues with the physical environment (56%), and organisational processes (65%). Focussing on the overall system, several factors were found to predict the presence of failures in other parts of the system, including planned inappropriate operations and team resource management; inadequate supervision and team resource management; and organisational climate and inadequate supervision. It is recommended that these associations deserve greater attention in future attempts to develop accident countermeasures, although other significant associations should not be ignored. In accordance with findings from previous HFACS-based analyses of aviation and medical incidents, efforts to reduce the frequency of unsafe acts or operations should be directed to a few critical HFACS categories at the higher levels: organisational climate, planned inappropriate operations, and inadequate supervision. While remedial strategies are proposed, it is important that future efforts evaluate the utility of the measures proposed in studies of system safety. Copyright © 2011. Published by Elsevier Ltd.
Performance Measurement of Advanced Stirling Convertors (ASC-E3)
NASA Technical Reports Server (NTRS)
Oriti, Salvatore M.
2013-01-01
NASA Glenn Research Center (GRC) has been supporting development of the Advanced Stirling Radioisotope Generator (ASRG) since 2006. A key element of the ASRG project is providing life, reliability, and performance testing data of the Advanced Stirling Convertor (ASC). The latest version of the ASC (ASC-E3, to represent the third cycle of engineering model test hardware) is of a design identical to the forthcoming flight convertors. For this generation of hardware, a joint Sunpower and GRC effort was initiated to improve and standardize the test support hardware. After this effort was completed, the first pair of ASC-E3 units was produced by Sunpower and then delivered to GRC in December 2012. GRC has begun operation of these units. This process included performance verification, which examined the data from various tests to validate the convertor performance to the product specification. Other tests included detailed performance mapping that encompassed the wide range of operating conditions that will exist during a mission. These convertors were then transferred to Lockheed Martin for controller checkout testing. The results of this latest convertor performance verification activity are summarized here.
Development of NASA's Models and Simulations Standard
NASA Technical Reports Server (NTRS)
Bertch, William J.; Zang, Thomas A.; Steele, Martin J.
2008-01-01
From the Space Shuttle Columbia Accident Investigation, there were several NASA-wide actions that were initiated. One of these actions was to develop a standard for development, documentation, and operation of Models and Simulations. Over the course of two-and-a-half years, a team of NASA engineers, representing nine of the ten NASA Centers developed a Models and Simulation Standard to address this action. The standard consists of two parts. The first is the traditional requirements section addressing programmatics, development, documentation, verification, validation, and the reporting of results from both the M&S analysis and the examination of compliance with this standard. The second part is a scale for evaluating the credibility of model and simulation results using levels of merit associated with 8 key factors. This paper provides an historical account of the challenges faced by and the processes used in this committee-based development effort. This account provides insights into how other agencies might approach similar developments. Furthermore, we discuss some specific applications of models and simulations used to assess the impact of this standard on future model and simulation activities.
Cataloging Practices in India: Efforts for Standardization.
ERIC Educational Resources Information Center
Tikku, Upinder Kumar
1984-01-01
Surveys current cataloging practices in Indian libraries and discusses standardization in cataloging, types of catalogs, cataloging codes (Anglo-American and Ranganathan), subject headings, descriptive cataloging, and standardization efforts (international, United States, USSR, Great Britain, India). Footnotes are included. (EJS)
NASA Astrophysics Data System (ADS)
Quigley, Stephen
The Space Vehicles Directorate of the Air Force Research Laboratory (AFRL/RVBX) and the Space Environment Branch of the Space and Missile Systems Center (SMC SLG/WMLE) have combined efforts to design, develop, test, implement, and validate numerical and graphical products for Air Force Space Command's (AFSPC) Space Environmental Effects Fusion System (SEEFS). These products are generated to analyze, specify, and forecast the effects of the near-earth space environment on Department of Defense weapons, navigation, communications, and surveillance systems. Jointly developed projects that have been completed as prototypes and are undergoing development for real-time operations include a SEEFS architecture and database, five system-impact products, and a high-level decision aid product. This first round of SEEFS products includes the Solar Radio Burst Effects (SoRBE) on radar and satellite communications, Radar Auroral Clutter (RAC), Scintillation Effects on radar and satellite communications (RadScint and SatScint), and Satellite Surface and Deep Charge/Discharge (Char/D) products. This presentation will provide overviews of the current system impact products, along with plans and potentials for future products expected for the SEEFS program. The overviews will include information on applicable research-to-operations (R2O) issues, to include input data coverage and quality control, output confidence levels, modeling standards, and validation efforts.
Low Resolution Picture Transmission (LRPT) Demonstration System. Phase II; 1.0
NASA Technical Reports Server (NTRS)
Fong, Wai; Yeh, Pen-Shu; Duran, Steve; Sank, Victor; Nyugen, Xuan; Xia, Wei; Day, John H. (Technical Monitor)
2002-01-01
Low-Resolution Picture Transmission (LRPT) is a proposed standard for direct broadcast transmission of satellite weather images. This standard is a joint effort by the European Organization for the Exploitation of Meteorological Satellites (EUMETSAT) and NOAA. As a digital transmission scheme, its purpose is to replace the current analog Automatic Picture Transmission (APT) system for use in the Meteorological Operational (METOP) satellites. GSFC has been tasked to build an LRPT Demonstration System (LDS). Its main objective is to develop or demonstrate the feasibility of a low-cost receiver utilizing a PC as the primary processing component and determine the performance of the protocol in the simulated Radio Frequency (RF) environment. The approach would consist of two phases.
ERIC Educational Resources Information Center
Rusk, Harriet J.
1991-01-01
Considered are standardization efforts of the American National Standards Institute's Accredited Standards Committee X12 concerned with electronic data interchange (EDI). These efforts will affect dentistry first in the transmission of academic records, and then in communications between dental offices and businesses, including major insurance…
The Network Operations Control Center upgrade task: Lessons learned
NASA Technical Reports Server (NTRS)
Sherif, J. S.; Tran, T.-L.; Lee, S.
1994-01-01
This article synthesizes and describes the lessons learned from the Network Operations Control Center (NOCC) upgrade project, from the requirements phase through development and test and transfer. At the outset, the NOCC upgrade was being performed simultaneously with two other interfacing and dependent upgrades at the Signal Processing Center (SPC) and Ground Communications Facility (GCF), thereby adding a significant measure of complexity to the management and overall coordination of the development and transfer-to-operations (DTO) effort. Like other success stories, this project carried with it the traditional elements of top management support and exceptional dedication of cognizant personnel. Additionally, there were several NOCC-specific reasons for success, such as end-to-end system engineering, adoption of open-system architecture, thorough requirements management, and use of appropriate off-the-shelf technologies. On the other hand, there were several difficulties, such as ill-defined external interfaces, transition issues caused by new communications protocols, ambivalent use of two sets of policies and standards, and mistailoring of the new JPL management standard (due to the lack of practical guidelines). This article highlights the key lessons learned, as a means of constructive suggestions for the benefit of future projects.
Occupational carbon monoxide violations in the State of Washington, 1994-1999.
Lofgren, Don J
2002-07-01
Occupational exposure to carbon monoxide continues to cause a number of injuries and deaths. This study reviewed the State of Washington OSHA inspection records for occupational safety or health violations related to carbon monoxide for the time period 1994-1999 to assess the agency's efforts and further identify and characterize causative factors. Inspection data were also compared with carbon monoxide claims data from a companion study to determine if the agency was visiting the most at-risk work operations. Inspections were identified by searching computerized violation texts for "carbon monoxide" or "CO." The study found 142 inspections with one or more carbon monoxide violations. Inspections were spread over 84 different 4-digit Standard Industrial Classification codes. Most inspections were initiated as a result of a complaint or other informant. Inspections were predominantly in construction and manufacturing, whereas carbon monoxide claims were more evenly distributed between the major industries. Inspections also may have failed to find violations for some types of equipment responsible for carbon monoxide claims. Forklifts were the source of carbon monoxide most often associated with a violation, followed by compressors for respirators, auto/truck/bus, and temporary heating devices. Inspections in response to poisonings found common factors associated with lack of recognition and failure to use or maintain equipment and ventilation. Some work sites with one or more poisonings were not being inspected. Only 10 of the 51 incidents with industrial insurance claim reports of carboxyhemoglobin at or above 20 percent were inspected. Further, it was found that more preventive efforts should be targeted at cold storage operations and certain warehouse and construction activities. It is proposed that more specific standards, both consensus and regulatory, would provide additional risk reduction. Reliance upon safe work practices as a primary method of control in the use of fuel-powered equipment in cold storage or other enclosed and unventilated environments needs to be prohibited. The study further demonstrates how inspection and industrial insurance records can assist with preventive efforts and better focus an agency's efforts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sayan Ghosh, Jeff Hammond
OpenSHMEM is a community effort to unify and standardize the SHMEM programming model. MPI (Message Passing Interface) is a well-known community standard for parallel programming using distributed memory. The most recent release of MPI, version 3.0, was designed in part to support programming models like SHMEM. OSHMPI is an implementation of the OpenSHMEM standard using MPI-3 for the Linux operating system. It is the first implementation of SHMEM over MPI one-sided communication and has the potential to be widely adopted due to the portability and wide availability of Linux and MPI-3. OSHMPI has been tested on a variety of systems and implementations of MPI-3, including InfiniBand clusters using MVAPICH2 and SGI shared-memory supercomputers using MPICH. Current support is limited to Linux but may be extended to Apple OSX if there is sufficient interest. The code is open source via https://github.com/jeffhammond/oshmpi
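The abstract does not show how a SHMEM-style put maps onto MPI-3; the sketch below is a simplified illustration of that general idea using mpi4py with fence synchronization, not OSHMPI's actual implementation (which manages its own symmetric-heap windows and uses passive-target synchronization).

```python
# Run with: mpiexec -n 2 python shmem_put_sketch.py
# Illustrative only: a SHMEM-like "put" built on MPI-3 one-sided communication.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# Analogue of the SHMEM symmetric heap: every rank allocates an identical window.
n = 4
win = MPI.Win.Allocate(n * MPI.DOUBLE.Get_size(), comm=comm)
local = np.frombuffer(win.tomemory(), dtype='d')
local[:] = 0.0

src = np.full(n, float(rank), dtype='d')
target = (rank + 1) % comm.Get_size()

win.Fence()                          # open an access epoch on all ranks
win.Put([src, MPI.DOUBLE], target)   # one-sided write into the neighbor's window
win.Fence()                          # close the epoch; data is now visible at the target

print(f"rank {rank} window contents: {local}")
win.Free()
```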
Compliance with the Healthy Eating Standards in YMCA afterschool programs
Beets, Michael W.; Weaver, R. Glenn; Turner-McGrievy, Brie; Beighle, Aaron; Moore, Justin B.; Webster, Collin; Khan, Mahmud; Saunders, Ruth
2016-01-01
Objective In 2011, the YMCA of the USA adopted Healthy Eating (HE) Standards for all their afterschool programs (ASPs). The extent to which YMCA-ASPs comply with the standards is unknown. Methods Twenty ASPs from all YMCA-ASPs across SC (N=102) were invited to participate. Direct observations of the foods/beverages served and staff behaviors were collected on four non-consecutive days per ASP. Results One ASP did not serve a snack. Of the remaining, a total of 26% of ASPs served a fruit/vegetable and 32% served water every day; 26% served sugar-sweetened beverages, 47% served sugar-added foods, and only 11% served whole grains, when grains were served. Staff were sitting with the children (65%) or verbally promoting healthy eating (15%) on at least one observation day. Staff were consuming non-approved drinks (25%) or foods (45%) on at least one observation day. No ASPs served snacks family-style every day. Conclusions/Implications Additional efforts are required to assist YMCA-operated ASPs in achieving these important nutrition standards. PMID:27372234
Remote Operations of Laser Guide Star Systems: Gemini Observatory.
NASA Astrophysics Data System (ADS)
Oram, Richard J.; Fesquet, Vincent; Wyman, Robert; D'Orgeville, Celine
2011-03-01
The Gemini North telescope, equipped with a 14W laser, has been providing Laser Guide Star Adaptive Optics (LGS AO) regular science queue observations for worldwide astronomers since February 2007. The new 55W laser system for MCAO was installed on the Gemini South telescope in May 2010. In this paper, we comment on how Gemini Observatory developed regular remote operation of the Laser Guide Star Facility and high-power solid-state laser as routine normal operations. Fully remote operation of the LGSF from the Hilo Base Facility (HBF) was initially trialed, then optimized, and became the standard operating procedure (SOP) for LGS operation in December 2008. From an engineering perspective, remote operation demands stable, well-characterized, and base-lined equipment sets. In the effort to produce consistent, stable, and controlled laser parameters (power, wavelength, and beam quality), we completed a failure mode and effects analysis of the laser system and subsystems that initiated a campaign of hardware upgrades and procedural improvements to the routine maintenance operations. Finally, we provide an overview of normal operation procedures during LGS runs and present a snapshot of data accumulated over several years that describes the overall LGS AO observing efficiency at the Gemini North telescope.
NASA Technical Reports Server (NTRS)
Bryant, Larry W.; Fragoso, Ruth S.
2007-01-01
In 2003 we proposed an effort to develop a core program of standardized training and verification practices and standards against which the implementation of these practices could be measured. The purpose was to provide another means of risk reduction for deep space missions to preclude the likelihood of a repeat of the tragedies of the 1998 Mars missions. We identified six areas where the application of standards and standardization would benefit the overall readiness process for flight projects at JPL. These are Individual Training, Team Training, Interface and Procedure Development, Personnel Certification, Interface and procedure Verification, and Operations Readiness Testing. In this paper we will discuss the progress that has been made in the tasks of developing the proposed infrastructure in each of these areas. Specifically we will address the Position Training and Certification Standards that are now available for each operational position found on our Flight Operations Teams (FOT). We will also discuss the MGSS Baseline Flight Operations Team Training Plan which can be tailored for each new flight project at JPL. As these tasks have been progressing, the climate and emphasis for Training and for V and V at JPL has changed, and we have learned about the expansion, growth, and limitations in the roles of traditional positions at JPL such as the Project's Training Engineer, V and V Engineer, and Operations Engineer. The need to keep a tight rein on budgets has led to a merging and/or reduction in these positions which pose challenges to individual capacities and capabilities. We examine the evolution of these processes and the roles involved while taking a look at the impact or potential impact of our proposed training related infrastructure tasks. As we conclude our examination of the changes taking place for new flight projects, we see that the importance of proceeding with our proposed tasks and adapting them to the changing climate remains an important element in reducing the risk in the challenging business of space exploration.
Private and social costs of surface mine reforestation performance criteria.
Sullivan, Jay; Amacher, Gregory S
2010-02-01
We study the potentially unnecessary costs imposed by strict performance standards for forest restoration of surface coal mines in the Appalachian region under the Surface Mining Control and Reclamation Act of 1977 (SMCRA) that can vary widely across states. Both the unnecessary private costs to the mine operator and costs to society (social costs) are reported for two performance standards, a ground cover requirement, and a seedling survival target. These standards are examined using numerical analyses under a range of site productivity class and market conditions. We show that a strict (90%) ground cover standard may produce an unnecessary private cost of more than $700/ha and a social cost ranging from $428/ha to $710/ha, as compared with a 70% standard. A strict tree survival standard of 1235 trees/ha, as compared with the more typical 1087 trees/ha standard, may produce an unnecessary private cost of approximately $200/ha, and a social cost in the range of $120 to $208/ha. We conclude that strict performance standards may impose substantial unnecessary private costs and social costs, that strict performance standards may be discouraging the choice of forestry as a post-mining land use, and that opportunities exist for reform of reforestation performance standards. Our study provides a basis for evaluating tradeoffs between regulatory efficiency and optimal reforestation effort.
Living systematic reviews: 2. Combining human and machine effort.
Thomas, James; Noel-Storr, Anna; Marshall, Iain; Wallace, Byron; McDonald, Steven; Mavergames, Chris; Glasziou, Paul; Shemilt, Ian; Synnot, Anneliese; Turner, Tari; Elliott, Julian
2017-11-01
New approaches to evidence synthesis, which use human effort and machine automation in mutually reinforcing ways, can enhance the feasibility and sustainability of living systematic reviews. Human effort is a scarce and valuable resource, required when automation is impossible or undesirable, and includes contributions from online communities ("crowds") as well as more conventional contributions from review authors and information specialists. Automation can assist with some systematic review tasks, including searching, eligibility assessment, identification and retrieval of full-text reports, extraction of data, and risk of bias assessment. Workflows can be developed in which human effort and machine automation can each enable the other to operate in more effective and efficient ways, offering substantial enhancement to the productivity of systematic reviews. This paper describes and discusses the potential-and limitations-of new ways of undertaking specific tasks in living systematic reviews, identifying areas where these human/machine "technologies" are already in use, and where further research and development is needed. While the context is living systematic reviews, many of these enabling technologies apply equally to standard approaches to systematic reviewing. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Relationship between fluid bed aerosol generator operation and the aerosol produced
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carpenter, R.L.; Yerkes, K.
1980-12-01
The relationships between bed operation in a fluid bed aerosol generator and aerosol output were studied. A two-inch diameter fluid bed aerosol generator (FBG) was constructed using stainless steel powder as a fluidizing medium. Fly ash from coal combustion was aerosolized, and the influence of FBG operating parameters on aerosol mass median aerodynamic diameter (MMAD), geometric standard deviation (σg), and concentration was examined. In an effort to extend observations on large fluid beds to small beds using fine bed particles, minimum fluidizing velocities and elutriation constants were computed. Although the FBG minimum fluidizing velocity agreed well with calculations, the FBG elutriation constant did not. The results of this study show that the properties of aerosols produced by a FBG depend on fluid bed height and air flow through the bed after the minimum fluidizing velocity is exceeded.
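As background on the two aerosol statistics reported in the abstract (not the authors' analysis code): MMAD is conventionally the diameter at which the cumulative mass distribution reaches 50%, and the geometric standard deviation of a lognormal distribution follows from the 84th and 16th percentile diameters. The sketch below computes both from a hypothetical set of impactor-style size bins; the diameters and fractions are illustrative values only.

```python
import numpy as np

# Hypothetical cumulative mass distribution from a cascade-impactor-style measurement:
# aerodynamic cut diameters (micrometers) and cumulative mass fraction below each diameter.
diam_um = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
cum_mass_frac = np.array([0.05, 0.20, 0.55, 0.85, 0.98])

def percentile_diameter(p):
    """Interpolate the diameter at cumulative mass fraction p, linear in log-diameter."""
    return np.exp(np.interp(p, cum_mass_frac, np.log(diam_um)))

d16, d50, d84 = (percentile_diameter(p) for p in (0.159, 0.50, 0.841))

mmad = d50                       # mass median aerodynamic diameter
sigma_g = np.sqrt(d84 / d16)     # geometric standard deviation of the lognormal fit

print(f"MMAD ~ {mmad:.2f} um, sigma_g ~ {sigma_g:.2f}")
```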
Surgical management of early pregnancy failure: history, politics, and safe, cost-effective care.
Harris, Lisa H; Dalton, Vanessa K; Johnson, Timothy R B
2007-05-01
Early pregnancy failure and induced abortion are often managed differently, even though safe uterine evacuation is the goal in both. Early pregnancy failure is commonly treated by curettage in operating room settings in anesthetized patients. Induced abortion is most commonly managed by office vacuum aspiration in awake or sedated patients. Medical evidence does not support routine operating room management of early pregnancy failure. This commentary reviews historical origins of these different care standards, explores political factors responsible for their perpetuation, and uses experience at University of Michigan to dramatize the ways in which history, politics, and biomedicine intersect to produce patient care. The University of Michigan initiated office uterine evacuations for early pregnancy failure treatment. Patients previously went to the operating room. These changes required faculty, staff, and resident education. Our efforts blurred the lines between spontaneous and induced abortion management, improved patient care and better utilized hospital resources.
A Note on the Purposes, Development, and Applicability of the Joint Committee Evaluation Standards
ERIC Educational Resources Information Center
Stufflebeam, Daniel L.
2004-01-01
The past few years have seen efforts in several countries and a wide range of disciplines to adopt and apply existing professional standards for guiding and judging evaluation services and/or develop new standards. Some of the efforts have drawn from the work and products of the North American Joint Committee on Standards for Educational…
Optimizing Resources for Trustworthiness and Scientific Impact of Domain Repositories
NASA Astrophysics Data System (ADS)
Lehnert, K.
2017-12-01
Domain repositories, i.e. data archives tied to specific scientific communities, are widely recognized and trusted by their user communities for ensuring a high level of data quality, enhancing data value, access, and reuse through a unique combination of disciplinary and digital curation expertise. Their data services are guided by the practices and values of the specific community they serve and designed to support the advancement of their science. Domain repositories need to meet user expectations for scientific utility in order to be successful, but they also need to fulfill the requirements for trustworthy repository services to be acknowledged by scientists, funders, and publishers as a reliable facility that curates and preserves data following international standards. Domain repositories therefore need to carefully plan and balance investments to optimize the scientific impact of their data services and user satisfaction on the one hand, while maintaining a reliable and robust operation of the repository infrastructure on the other hand. Staying abreast of evolving repository standards to certify as a trustworthy repository and conducting a regular self-assessment and certification alone requires resources that compete with the demands for improving data holdings or usability of systems. The Interdisciplinary Earth Data Alliance (IEDA), a data facility funded by the US National Science Foundation, operates repositories for geochemical, marine Geoscience, and Antarctic research data, while also maintaining data products (global syntheses) and data visualization and analysis tools that are of high value for the science community and have demonstrated considerable scientific impact. Balancing the investments in the growth and utility of the syntheses with resources required for certification of IEDA's repository services has been challenging, and a major self-assessment effort has been difficult to accommodate. IEDA is exploring a partnership model to share generic repository functions (e.g. metadata registration, long-term archiving) with other repositories. This could substantially reduce the effort of certification and allow effort to focus on the domain-specific data curation and value-added services.
ERIC Educational Resources Information Center
Ujifusa, Andrew
2013-01-01
Opponents of the Common Core State Standards are ramping up legislative pressure and public relations efforts aimed at getting states to scale back--or even abandon--the high-profile initiative, even as implementation proceeds and tests aligned with the standards loom. Critics of the common core have focused recent lobbying and media efforts on…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-03
...; college- and career- ready standards and assessments; school reform efforts to improve student achievement...- and career- ready standards and assessments; school reforms to improve student achievement and... standards and assessments; school reform efforts to improve student achievement and increase graduation...
2017-05-25
Research Question: What lessons can the contemporary Marine Corps learn from its transition from the post-Cold War and Operation Desert Shield and... United States Marine Corps Post-Cold War Evolutionary Efforts: Implications for a Post-Operation Enduring Freedom/Operation Iraqi Freedom...
Linear-drive cryocoolers for the Department of Defense standard advanced dewar assembly (SADA)
NASA Astrophysics Data System (ADS)
Tate, Garin S.
2005-05-01
The Standard Advanced Dewar Assembly (SADA) is the critical module in the Department of Defense (DoD) standardization of scanning second-generation thermal imaging systems. The DoD has established a family of SADAs to fulfill a range of performance requirements for various platforms. The SADA consists of the Infrared Focal Plane Array (IRFPA), Dewar, Command & Control Electronics (C&CE), and the cryogenic cooler, and is used in platforms such as the Apache helicopter, the M1A2 Abrams main battle tank, the M2 Bradley Infantry Fighting Vehicle, and the Javelin Command Launch Unit (CLU). In support of the family of SADAs, the DoD defined a complementary family of tactical linear drive cryocoolers. The Stirling cycle linear drive cryocoolers are utilized to cool the Infrared Focal Plane Arrays (IRFPAs) in the SADAs. These coolers are required to have low input power, a quick cool-down time, low vibration output, low audible noise, and a higher reliability than currently fielded rotary coolers. These coolers must also operate in a military environment with its inherent high vibration level and temperature extremes. This paper will (1) outline the characteristics of each cryocooler, (2) present the status and results of qualification tests, (3) present the status of production efforts, and (4) present the status of efforts to increase linear drive cooler reliability.
NASA Astrophysics Data System (ADS)
Salazar, William
2003-01-01
The Standard Advanced Dewar Assembly (SADA) is the critical module in the Department of Defense (DoD) standardization effort of scanning second-generation thermal imaging systems. DoD has established a family of SADA's to address requirements for high performance (SADA I), mid-to-high performance (SADA II), and compact class (SADA III) systems. SADA's consist of the Infrared Focal Plane Array (IRFPA), Dewar, Command and Control Electronics (C&CE), and the cryogenic cooler. SADA's are used in weapons systems such as Comanche and Apache helicopters, the M1 Abrams Tank, the M2 Bradley Fighting Vehicle, the Line of Sight Antitank (LOSAT) system, the Improved Target Acquisition System (ITAS), and Javelin's Command Launch Unit (CLU). DOD has defined a family of tactical linear drive coolers in support of the family of SADA's. The Stirling linear drive cryo-coolers are utilized to cool the SADA's Infrared Focal Plane Arrays (IRFPAs) to their operating cryogenic temperatures. These linear drive coolers are required to meet strict cool-down time requirements along with lower vibration output, lower audible noise, and higher reliability than currently fielded rotary coolers. This paper will (1) outline the characteristics of each cooler, (2) present the status and results of qualification tests, and (3) present the status and test results of efforts to increase linear drive cooler reliability.
NASA Astrophysics Data System (ADS)
Fergason, R. L.; Laura, J.; Hare, T. M.; Otero, R.; Edgar, L. A.
2017-12-01
A Spatial Data Infrastructure (SDI) is a robust framework for data and data products, metadata, data access mechanisms, standards, policy, and a user community that helps to define and standardize the data necessary to meet some specified goal. The primary objective of an SDI is to improve communication, to enhance data access, and to aid in identifying gaps in knowledge. We are developing an SDI that describes the foundational data sets and accuracy requirements to evaluate landing site safety, facilitate the successful operation of Terrain Relative Navigation (TRN), and assist in the operation of the rover once it has successfully landed on Mars. Thru current development efforts, an implicit SDI exists for the Mars 2020 mission. An explicit SDI will allow us to identify any potential gaps in knowledge, facilitate communication between the different institutions involved in landing site evaluation and TRN development, and help ensure a smooth transition from landing to surface operations. This SDI is currently relevant to the Mars 2020 rover mission, but can also serve as a means to document current requirements for foundational data products and standards for future landed missions to Mars and other planetary bodies. To generate a Mars 2020-specific SDI, we must first document and rationalize data set and accuracy requirements for evaluating landing sites, performing surface operations, and inventorying Mars 2020 mission needs in terms of an SDI framework. This step will allow us to 1) evaluate and define what is needed for the acquisition of data and the generation and validation of data products, 2) articulate the accuracy and co-registration requirements, and 3) identify needs for data access (and eventual archiving). This SDI document will serve as a means to communicate the existing foundational products, standards that were followed in producing these products, and where and how these products can be accessed by the planetary community. This SDI will also facilitate discussions between the landing and surface operations groups to communicate the available data and identify unique needs to surface operations. Our goal is to continually review and update this SDI throughout the Mars 2020 landing site evaluation and operations, so that it remains relevant and effective as data availability and needs evolve.
NASA Astrophysics Data System (ADS)
Gullett, Brian; Touati, Abderrahmane; Oudejans, Lukas
Emissions of aromatic air toxics from aircraft ground equipment (AGE) were measured with a resonance enhanced multiphoton ionization-time of flight mass spectrometry (REMPI-TOFMS) system consisting of a pulsed solid state laser for photoionization and a TOFMS for mass discrimination. This instrument was capable of characterizing turbine emissions and the effect of varying load operations on pollutant production. REMPI-TOFMS is capable of high selectivity and low detection limits (part per trillion to part per billion) in real time (1 s resolution). Hazardous air pollutants and criteria pollutants were measured during startups and idle and full load operations. Measurements of compounds such as benzene, toluene, ethylbenzene, xylenes, styrene, and polycyclic aromatic hydrocarbons compared well with standard methods. Startup emissions from the AGE data showed persistent concentrations of pollutants, unlike those from a diesel generator, where a sharp spike in emissions rapidly declined to steady state levels. The time-resolved responses of air toxics concentrations varied significantly by source, complicating efforts to minimize these emissions with common operating prescriptions. The time-resolved measurements showed that pollutant concentrations decline (up to 5×) in a species-specific manner over the course of multiple hours of operation, complicating determination of accurate and precise emission factors via standard extractive sampling. Correlations of air toxic concentrations with more commonly measured pollutants such as CO or PM were poor due to the relatively greater changes in the measured toxics' concentrations.
In the Face of Cybersecurity: How the Common Information Model Can Be Used
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skare, Paul; Falk, Herbert; Rice, Mark
2016-01-01
Efforts are underway to combine smart grid information, devices, networking, and emergency response information to create messages that are not dependent on specific standards development organizations (SDOs). This supports a future-proof approach of allowing changes in the canonical data models (CDMs) going forward without having to perform forklift replacements of solutions that use the messages. This also allows end users (electric utilities) to upgrade individual components of a larger system while keeping the message payload definitions intact. The goal is to enable public and private information sharing securely in a standards-based approach that can be integrated into existing operations. We provide an example architecture that could benefit from this multi-SDO, secure message approach. This article also describes how to improve message security.
Toward the Development of Reporting Standards for Evaluations
ERIC Educational Resources Information Center
Montrosse-Moorhead, Bianca; Griffith, James C.
2017-01-01
This article first makes a case for the need to establish evaluation reporting standards, support for which is rooted in the growing demand for professionalization, in the growing meta evaluation literature, and in growing efforts to develop reporting standards for inquiry efforts. Then, a case is made for a particular set of such standards…
Above reproach: developing a comprehensive ethics and compliance program.
Yuspeh, A; Whalen, K; Cecelic, J; Clifton, S; Cobb, L; Eddy, M; Fainter, J; Packard, J; Postal, S; Steakley, J; Waddey, P
1999-01-01
How can a healthcare organization improve the public's confidence in the conduct of its business operations? What can it do to ensure that it can thrive despite being the subject of public and governmental scrutiny and doubt? Healthcare providers must establish standards of conduct that are above reproach and ensure that those standards are clearly articulated and strictly adhered to. This article describes the merits of a comprehensive ethics and compliance program, suggests five basic elements of such a program--organizational support/structure, setting standards, creating awareness, establishing a mechanism for reporting exceptions, and monitoring and auditing--and then demonstrates how those elements should be applied in several high-risk areas. Fundamentally, an ethics and compliance program has two purposes: to ensure that all individuals in an organization observe pertinent laws and regulations in their work; and to articulate a broader set of aspirational ethical standards that are well-understood within the organization and become a practical guideline for organization members making decisions that raise ethical concerns. Every ethics and compliance program should contain certain fundamental aspects. First, the effort must have the active support of the most senior management in the organization. To instill a commitment to ethics and compliance absent a clear and outspoken commitment to such purposes by organization leaders is simply impossible. Second, an ethics and compliance program is fundamentally about organizational culture--about instilling a commitment to observe the law and, more generally, to do the right thing. Third, ethics and compliance are responsibilities of operating management (sometimes called line management). Although staff such as compliance officers are obligated to provide the necessary resources for a successful program and to design the program, such staff officers cannot achieve implementation and execution. Only operating managers can do that. Fourth, an ethics and compliance effort should be about the conduct of individuals, not about "checking the boxes" in a model plan or generating attractive written or educational materials. Such an effort is about individuals on a day-to-day basis knowing what is expected of them and doing it and about never compromising integrity, regardless of pressures faced. A great deal of progress has been made in healthcare organizations in the development of increasingly sophisticated ethics and compliance programs. A particularly energetic focus has been placed on these programs since formal government guidance regarding compliance programs was first issued in the laboratory area about two years ago and as more sophisticated automated monitoring tools have been developed. As ethics and compliance programs have become more sophisticated, certain best practices have been established. This discussion will set forth approaches to ethics and compliance in the context of what are believed to be illustrative best practices. Much of what is described here is descriptive of the efforts of Columbia/HCA Healthcare Corporation from October 1997 to the present; however, this article has been presented not as a mere descriptive piece but rather as a set of normative guidelines. We hope that other healthcare providers will find this to be of practical use. 
Provider settings pose certain unique challenges that are specifically addressed in this discussion; however, many of the issues raised can be adapted to other healthcare organizations. For simplicity's sake, because the authors of this article all work on a daily basis primarily with hospitals, the article is written from a hospital perspective.
Completeness of breast cancer operative reports in a community care setting.
Eng, Jordan Lang; Baliski, Christopher Ronald; McGahan, Colleen; Cai, Eric
2017-10-01
The narrative operative report represents the traditional means by which breast cancer surgery has been documented. Previous work has established that omissions occur in narrative operative reports produced in an academic setting. The goal of this study was to determine the completeness of breast cancer narrative operative reports produced in a community care setting and to explore the effect of a surgeon's case volume and years in practice on the completeness of these reports. A standardized retrospective review of operative reports produced over a consecutive 2 year period was performed using a set of procedure-specific elements identified through a review of the relevant literature and work done locally. 772 operative reports were reviewed. 45% of all elements were completely documented. A small positive trend was observed between case volume and completeness while a small negative trend was observed between years in practice and completeness. The dictated narrative report inadequately documents breast cancer surgery irrespective of the recording surgeon's volume or experience. An intervention, such as the implementation of synoptic reporting, should be considered in an effort to maximize the utility of the breast cancer operative report. Copyright © 2017. Published by Elsevier Ltd.
Using standardized fishery data to inform rehabilitation efforts
Spurgeon, Jonathan J.; Stewart, Nathaniel T.; Pegg, Mark A.; Pope, Kevin L.; Porath, Mark T.
2016-01-01
Lakes and reservoirs progress through an aging process often accelerated by human activities, resulting in degradation or loss of ecosystem services. Resource managers thus attempt to slow or reverse the negative effects of aging using a myriad of rehabilitation strategies. Sustained monitoring programs to assess the efficacy of rehabilitation strategies are often limited; however, long-term standardized fishery surveys may be a valuable data source from which to begin evaluation. We present 3 case studies using standardized fishery survey data to assess rehabilitation efforts stemming from the Nebraska Aquatic Habitat Plan, a large-scale program with the mission to rehabilitate waterbodies within the state. The case studies highlight that biotic responses to rehabilitation efforts can be assessed, to an extent, using standardized fishery data; however, there were specific areas where minor increases in effort would clarify the effectiveness of rehabilitation techniques. Management of lakes and reservoirs can be streamlined by maximizing the utility of such datasets to work smarter, not harder. To facilitate such efforts, we stress collecting both biotic (e.g., fish lengths and weight) and abiotic (e.g., dissolved oxygen, pH, and turbidity) data during standardized fishery surveys and designing rehabilitation actions with an appropriate experimental design.
Advanced telemetry systems for payloads. Technology needs, objectives and issues
NASA Technical Reports Server (NTRS)
1990-01-01
The current trends in advanced payload telemetry are the new developments in advanced modulation/coding, the applications of intelligent techniques, data distribution processing, and advanced signal processing methodologies. Concerted efforts will be required to design ultra-reliable man-rated software to cope with these applications. The intelligence embedded and distributed throughout various segments of the telemetry system will need to be overridden by an operator in case of life-threatening situations, making it a real-time integration issue. Suitable MIL standards on physical interfaces and protocols will be adopted to suit the payload telemetry system. New technologies and techniques will be developed for fast retrieval of mass data. Currently, these technology issues are being addressed to provide more efficient, reliable, and reconfigurable systems. There is a need, however, to change the operation culture. The current role of NASA as a leader in developing all the new innovative hardware should be altered to save both time and money. We should use all the available hardware/software developed by the industry and use the existing standards rather than inventing our own.
The Rubidium-Crystal Oscillator Hybrid Development Program
NASA Technical Reports Server (NTRS)
Vig, J. R.; Rosati, V. J.
1984-01-01
The rubidium-crystal oscillator hybrid (RbXO) will make precise time available to systems that lack the power required by atomic frequency standards. The RbXO consists of two subassemblies in separate enclosures. One contains a small rubidium frequency standard (RFS) without its internal oven-controlled crystal oscillator (OCXO), plus interface circuits. The second contains a low-power OCXO, and additional interface circuits. The OCXO is on continuously. Periodically, e.g., once a week, the user system applies power to the RFS. After the few minutes necessary for the warmup of the RFS, the interface circuits adjust the frequency of the OCXO to the RFS reference, then shut off the RFS. The OCXO enclosure is separable from the RFS enclosure so that manpacks will be able to operate with minimum size, weight, and power consumption, while having the accuracy of the RFS for the duration of a mission. A prototype RbXO's RFS has operated successfully for 4200 on-off cycles. Parallel efforts on a Phase 2 RbXO development are in progress. Two sources for the RbXO are scheduled to be available during 1986.
Nickel cadmium battery operations and performance
NASA Technical Reports Server (NTRS)
Rao, Gopalakrishna; Prettyman-Lukoschek, Jill; Calvin, Richard; Berry, Thomas; Bote, Robert; Toft, Mark
1994-01-01
The Earth Radiation Budget Satellite (ERBS), Compton Gamma Ray Observatory (CGRO), Upper Atmosphere Research Satellite (UARS), and Extreme Ultraviolet Explorer (EUVE) spacecraft are operated from NASA's Goddard Space Flight Center (GSFC) in Greenbelt, Maryland. On-board power subsystems for each satellite employ NASA Standard 50 Ampere-hour (Ah) nickel-cadmium batteries in a parallel configuration. To date, these batteries have exhibited varying degrees of degradation: over several months (anomalous behavior, UARS and CGRO MPS-1), over several years (old age, normal behavior, ERBS), or little if any (EUVE). Since the onset of degraded performance, each mission's Flight Operations Team (FOT), under the direction of its cognizant GSFC project personnel and Space Power Application Branch engineers, has closely monitored the battery performance and implemented several charge control schemes in an effort to extend battery life. Various software and hardware solutions have been developed to minimize battery overcharge. Each of the four sections of this paper covers a brief overview of each mission's operational battery management and its associated spacecraft battery performance. Also included are new operational procedures developed on-orbit that may be of special interest to future mission definition and development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sjoberg, Carl Magnus Goran; Vuilleumier, David
Ever tighter fuel economy standards and concerns about energy security motivate efforts to improve engine efficiency and to develop alternative fuels. This project contributes to the science base needed by industry to develop highly efficient direct injection spark ignition (DISI) engines that also beneficially exploit the different properties of alternative fuels. Here, the emphasis is on lean operation, which can provide higher efficiencies than traditional non-dilute stoichiometric operation. Since lean operation can lead to issues with ignition stability, slow flame propagation, and low combustion efficiency, the focus is on techniques that can overcome these challenges. Specifically, fuel stratification is used to ensure ignition and completeness of combustion, but this technique has soot and NOx emissions challenges. For ultra-lean well-mixed operation, turbulent deflagration can be combined with controlled end-gas autoignition to render mixed-mode combustion for sufficiently fast heat release. However, such mixed-mode combustion requires very stable inflammation, motivating studies on the effects of near-spark flow and turbulence, and the use of small amounts of fuel stratification near the spark plug.
Challenges for Transitioning Science Knowledge to an Operational Environment for Space Weather
NASA Technical Reports Server (NTRS)
Spann, James
2012-01-01
Effectively transitioning science knowledge to an operational environment relevant to space weather is critical to meeting civilian and defense needs, especially considering how technologies are advancing and present evolving susceptibilities to space weather impacts. The effort to transition scientific knowledge to a useful application is neither a research task nor an operational activity, but an effort that bridges the two. Successful transitioning must be an intentional effort that has a clear goal for all parties and a measurable outcome and deliverable. This talk will present proven methodologies that have been demonstrated to be effective for terrestrial weather and disaster relief efforts, and how those methodologies can be applied to space weather transition efforts.
Spirometry Testing Standards in Spinal Cord Injury
Kelley, Alyson; Garshick, Eric; Gross, Erica R.; Lieberman, Steven L.; Tun, Carlos G.; Brown, Robert
2007-01-01
Study objectives Because muscle paralysis makes it uncertain whether subjects with spinal cord injury (SCI) can perform spirometry in accordance with American Thoracic Society (ATS) standards, determinants of test failure were examined. Design Cross-sectional study. Setting Veterans Affairs (VA) medical center. Participants Veterans with SCI at VA Boston Healthcare System and nonveterans recruited by mail and advertisement. Measurements and results Two hundred thirty of 278 subjects (83%) were able to produce three expiratory efforts lasting ≥ 6 s and without excessive back-extrapolated volume (EBEV). In 217 of 230 subjects (94%), FVC and FEV1 were each reproducible in accordance with 1994 ATS standards. In the remaining 48 subjects, efforts with smooth and continuous volume-time tracings and acceptable flow-volume loops were identified. These subjects had a lower percentage of predicted FVC, FEV1, and maximum expiratory and inspiratory pressures compared to the others, and a greater proportion had neurologically complete cervical injury (42% compared to 16%). In 19 subjects (40%), some expiratory efforts were not sustained maximally for ≥ 6 s but had at least a 0.5-s plateau at residual volume (short efforts). In eight subjects (17%), some efforts were not short but had EBEV. In the remaining 21 subjects (44%), some efforts were short, some had EBEV, and some had both. If these efforts were not rejected, 262 of 278 subjects (94%) would have produced three acceptable efforts, and in 257 subjects (92%), the efforts were reproducible. Conclusions Subjects with SCI with the most impaired respiratory muscles and abnormal pulmonary function are able to perform spirometry reproducibly despite not meeting usual ATS acceptability standards. Exclusion of these subjects would lead to bias in studies of respiratory function in SCI. The modification of spirometry testing standards to include efforts with EBEV and with a 0.5-s plateau if < 6 s would reduce the potential for bias. PMID:12628869
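As a concrete reading of the acceptability rules discussed above, the sketch below classifies a single expiratory effort under the usual ATS-style criteria and under the modified criteria the study proposes (accepting efforts with excessive back-extrapolated volume, or with only a 0.5-s end-of-test plateau when the effort is shorter than 6 s). The specific EBEV threshold shown (5% of FVC or 0.15 L, whichever is greater) is the commonly cited ATS value and is an assumption here, not a number taken from this abstract.

```python
def ebev_excessive(ebev_l, fvc_l):
    """Back-extrapolated volume check (assumed ATS-style threshold: 5% of FVC or 0.15 L)."""
    return ebev_l > max(0.05 * fvc_l, 0.15)

def acceptable_standard(expiration_s, ebev_l, fvc_l):
    """Usual acceptability: expiration of at least 6 s and no excessive EBEV."""
    return expiration_s >= 6.0 and not ebev_excessive(ebev_l, fvc_l)

def acceptable_modified(expiration_s, plateau_s, ebev_l, fvc_l):
    """Modified criteria suggested by the study: also accept short efforts with a
    plateau of at least 0.5 s at residual volume, and do not reject for EBEV."""
    return expiration_s >= 6.0 or plateau_s >= 0.5  # EBEV no longer grounds for rejection

# Example: a 4.8 s effort with a 0.7 s end-of-test plateau and modest EBEV.
print(acceptable_standard(4.8, ebev_l=0.10, fvc_l=3.2))                      # False (too short)
print(acceptable_modified(4.8, plateau_s=0.7, ebev_l=0.10, fvc_l=3.2))       # True
```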
Analysis of UAS DAA Alerting in Fast-Time Simulations without DAA Mitigation
NASA Technical Reports Server (NTRS)
Thipphavong, David P.; Santiago, Confesor; Isaacson, Douglas R.; Lee, Seung Man; Park, Chunki; Refai, Mohamad Said; Snow, James
2015-01-01
Realization of the expected proliferation of Unmanned Aircraft System (UAS) operations in the National Airspace System (NAS) depends on the development and validation of performance standards for UAS Detect and Avoid (DAA) Systems. The RTCA Special Committee 228 is charged with leading the development of draft Minimum Operational Performance Standards (MOPS) for UAS DAA Systems. NASA, as a participating member of RTCA SC-228, is committed to supporting the development and validation of draft requirements for DAA alerting system performance. A recent study conducted using NASA's ACES (Airspace Concept Evaluation System) simulation capability begins to address questions surrounding the development of draft MOPS for DAA alerting systems. ACES simulations were conducted to study the performance of alerting systems proposed by the SC-228 DAA Alerting sub-group. Analysis included but was not limited to: 1) correct alert (and timeliness), 2) false alert (and severity and duration), 3) missed alert, and 4) probability of an alert type at the time of loss of well clear. The performance of DAA alerting systems when using intent vs. dead-reckoning for UAS ownship trajectories was also compared. The results will be used by SC-228 to inform decisions about the surveillance standards of UAS DAA systems and future requirements development and validation efforts.
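As a rough illustration of the alerting metrics listed in the abstract above, the sketch below tallies correct, false, and missed alerts from simulated encounter outcomes. The record fields and the simplified metric definitions are assumptions made for illustration; they are not the SC-228 MOPS definitions.

```python
# Minimal illustration of tallying alerting-performance metrics from simulated
# encounters. Field names and the simplified metric definitions are assumptions
# for illustration only; they are not the RTCA SC-228 MOPS definitions.
from dataclasses import dataclass

@dataclass
class Encounter:
    alerted: bool            # DAA system issued an alert
    lost_well_clear: bool    # encounter actually progressed to loss of well clear

def alerting_metrics(encounters):
    correct = sum(1 for e in encounters if e.alerted and e.lost_well_clear)
    false_ = sum(1 for e in encounters if e.alerted and not e.lost_well_clear)
    missed = sum(1 for e in encounters if not e.alerted and e.lost_well_clear)
    n_lwc = sum(1 for e in encounters if e.lost_well_clear)
    return {
        "correct_alerts": correct,
        "false_alerts": false_,
        "missed_alerts": missed,
        # probability that a loss of well clear was preceded by an alert
        "p_alert_given_lwc": correct / n_lwc if n_lwc else float("nan"),
    }

if __name__ == "__main__":
    sample = [Encounter(True, True), Encounter(True, False),
              Encounter(False, True), Encounter(False, False)]
    print(alerting_metrics(sample))
```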
Analysis of UAS DAA Surveillance in Fast-Time Simulations without DAA Mitigation
NASA Technical Reports Server (NTRS)
Thipphavong, David P.; Santiago, Confesor; Isaacson, David R.; Lee, Seung Man; Refai, Mohamad Said; Snow, James William
2015-01-01
Realization of the expected proliferation of Unmanned Aircraft System (UAS) operations in the National Airspace System (NAS) depends on the development and validation of performance standards for UAS Detect and Avoid (DAA) Systems. The RTCA Special Committee 228 is charged with leading the development of draft Minimum Operational Performance Standards (MOPS) for UAS DAA Systems. NASA, as a participating member of RTCA SC-228, is committed to supporting the development and validation of draft requirements for DAA surveillance system performance. A recent study conducted using NASA's ACES (Airspace Concept Evaluation System) simulation capability begins to address questions surrounding the development of draft MOPS for DAA surveillance systems. ACES simulations were conducted to study the performance of sensor systems proposed by the SC-228 DAA Surveillance sub-group. Analysis included but was not limited to: 1) number of intruders (both IFR and VFR) detected by all sensors as a function of UAS flight time, 2) number of intruders (both IFR and VFR) detected by radar alone as a function of UAS flight time, and 3) number of VFR intruders detected by all sensors as a function of UAS flight time. The results will be used by SC-228 to inform decisions about the surveillance standards of UAS DAA systems and future requirements development and validation efforts.
OʼHara, Susan
2014-01-01
Nurses have increasingly been regarded as critical members of the planning team as architects recognize their knowledge and value. The nurses' role as knowledge experts can be expanded further by leading efforts to integrate clinical, operational, and architectural expertise through simulation modeling. Simulation modeling allows multifactorial data to be merged to understand the current state of the intensive care unit and predict future states. Nurses can champion the simulation modeling process and reap the benefits of a cost-effective way to test new designs, processes, staffing models, and future programming trends prior to implementation. Simulation modeling is an evidence-based planning standard for integrating the sciences with real client data to offer solutions for improving patient care.
Michel-Sendis, F.; Gauld, I.; Martinez, J. S.; ...
2017-08-02
SFCOMPO-2.0 is the new release of the Organisation for Economic Co-operation and Development (OECD) Nuclear Energy Agency (NEA) database of experimental assay measurements. These measurements are isotopic concentrations from destructive radiochemical analyses of spent nuclear fuel (SNF) samples. We supplement the measurements with design information for the fuel assembly and fuel rod from which each sample was taken, as well as with relevant information on operating conditions and characteristics of the host reactors. These data are necessary for modeling and simulation of the isotopic evolution of the fuel during irradiation. SFCOMPO-2.0 has been developed and is maintained by the OECD NEA under the guidance of the Expert Group on Assay Data of Spent Nuclear Fuel (EGADSNF), which is part of the NEA Working Party on Nuclear Criticality Safety (WPNCS). Significant efforts aimed at establishing a thorough, reliable, publicly available resource for code validation and safety applications have led to the capture and standardization of experimental data from 750 SNF samples from more than 40 reactors. These efforts have resulted in the creation of the SFCOMPO-2.0 database, which is publicly available from the NEA Data Bank. Our paper describes the new database, and applications of SFCOMPO-2.0 for computer code validation, integral nuclear data benchmarking, and uncertainty analysis in nuclear waste package analysis are briefly illustrated.
NASA Technical Reports Server (NTRS)
Pease, R. Adam
1995-01-01
MIDAS is a set of tools that allows a designer to specify the physical and functional characteristics of a complex system such as an aircraft cockpit and analyze the system with regard to human performance. MIDAS supports a number of static analyses, such as military-standard reach and fit analysis, display legibility analysis, and vision polars. It also supports dynamic simulation of mission segments with 3D visualization. MIDAS development has incorporated several models of human planning behavior. The CaseMIDAS effort provides a simplified and unified approach to modeling task selection behavior. Except for highly practiced, routine procedures, a human operator expends cognitive effort in determining what step to take next in accomplishing mission tasks; current versions of MIDAS do not model this effort in a consistent and inclusive manner, and CaseMIDAS also attempts to address this issue. The CaseMIDAS project has yielded an easy-to-use software module for case creation and execution that is integrated with existing MIDAS simulation components.
Design and Implementation of a Metadata-rich File System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ames, S; Gokhale, M B; Maltzahn, C
2010-01-19
Despite continual improvements in the performance and reliability of large scale file systems, the management of user-defined file system metadata has changed little in the past decade. The mismatch between the size and complexity of large scale data stores and their ability to organize and query their metadata has led to a de facto standard in which raw data is stored in traditional file systems, while related, application-specific metadata is stored in relational databases. This separation of data and semantic metadata requires considerable effort to maintain consistency and can result in complex, slow, and inflexible system operation. To address these problems, we have developed the Quasar File System (QFS), a metadata-rich file system in which files, user-defined attributes, and file relationships are all first class objects. In contrast to hierarchical file systems and relational databases, QFS defines a graph data model composed of files and their relationships. QFS incorporates Quasar, an XPATH-extended query language for searching the file system. Results from our QFS prototype show the effectiveness of this approach. Compared to the de facto standard, the QFS prototype shows superior ingest performance and comparable query performance on user metadata-intensive operations and superior performance on normal file metadata operations.
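A minimal sketch of the kind of graph data model the QFS abstract describes, with files as nodes carrying user-defined attributes and typed relationships between them. The class and method names below are illustrative assumptions, not the actual QFS or Quasar API, and the query helper only mimics the spirit of an XPath-style attribute search.

```python
# Toy graph data model in the spirit of a metadata-rich file system: files are
# nodes with user-defined attributes, and directed, typed links connect them.
# Class and method names are illustrative assumptions, not the QFS/Quasar API.
from collections import defaultdict

class FileNode:
    def __init__(self, path, **attrs):
        self.path = path
        self.attrs = dict(attrs)          # user-defined metadata, first-class

class MetadataGraph:
    def __init__(self):
        self.nodes = {}                   # path -> FileNode
        self.edges = defaultdict(list)    # path -> [(relation, target path)]

    def add_file(self, path, **attrs):
        self.nodes[path] = FileNode(path, **attrs)

    def relate(self, src, relation, dst):
        self.edges[src].append((relation, dst))

    def query(self, **criteria):
        """Return paths whose attributes match all criteria (a stand-in for a
        Quasar/XPath-style attribute query)."""
        return [p for p, n in self.nodes.items()
                if all(n.attrs.get(k) == v for k, v in criteria.items())]

g = MetadataGraph()
g.add_file("/data/run42.h5", experiment="laser", status="verified")
g.add_file("/data/run42.log", experiment="laser")
g.relate("/data/run42.log", "describes", "/data/run42.h5")
print(g.query(experiment="laser", status="verified"))   # ['/data/run42.h5']
```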
Announcing a Community Effort to Create an Information Model for Research Software Archives
NASA Astrophysics Data System (ADS)
Million, C.; Brazier, A.; King, T.; Hayes, A.
2018-04-01
An effort has started to create recommendations and standards for the archiving of planetary science research software. The primary goal is to define an information model that is consistent with OAIS standards.
Feasibility of voluntary menu labeling among locally owned restaurants.
Britt, John W; Frandsen, Kirsten; Leng, Kirsten; Evans, Diane; Pulos, Elizabeth
2011-01-01
In 2007, Tacoma-Pierce County Health Department launched a restaurant menu labeling project called SmartMenu. The objective was to recruit locally owned restaurants to voluntarily post basic nutrition information on their menus or menu boards. Participating restaurants submitted recipes to an independent contractor for nutritional analysis and agreed to post calorie, fat, carbohydrate, and sodium values on new menus within 90 days of receiving results. Vigorous recruitment efforts by the Health Department between June 2007 and September 2008 included free advertising, consultation with a Registered Dietitian, and free nutritional analysis. By the end of 2008, a total of 24 restaurants participated in the program. Significant barriers to participation included infrequent use of standardized recipes, perceived business risk of labeling, and low perceived customer demand for nutrition information. Key program elements, recruitment strategies, and costs are discussed. Results have important implications for future efforts to increase the adoption of menu labeling by locally owned and operated restaurants.
ERIC Educational Resources Information Center
Anderson, Kimberly; Mire, Mary Elizabeth
2016-01-01
This report presents a multi-year study of how states are implementing their state college- and career-readiness standards. In this report, the Southern Regional Education Board's (SREB's) Benchmarking State Implementation of College- and Career-Readiness Standards project studied state efforts in 2014-15 and 2015-16 to foster effective…
Contrast in low-cost operational concepts for orbiting satellites
NASA Astrophysics Data System (ADS)
Walyus, Keith D.; Reis, James; Bradley, Arthur J.
2002-12-01
Older spacecraft missions, especially those in low Earth orbit with telemetry intensive requirements, required round-the-clock control center staffing. The state of technology relied on control center personnel to continually examine data, make decisions, resolve anomalies, and file reports. Hubble Space Telescope (HST) is a prime example of this description. Technological advancements in hardware and software over the last decade have yielded increases in productivity and operational efficiency, which result in lower cost. The re-engineering effort of HST, which has recently concluded, utilized emerging technology to reduce cost and increase productivity. New missions, of which NASA's Transition Region and Coronal Explorer Satellite (TRACE) is an example, have benefited from recent technological advancements and are more cost-effective than when HST was first launched. During its launch (1998) and early orbit phase, the TRACE Flight Operations Team (FOT) employed continually staffed operations. Yet once the mission entered its nominal phase, the FOT reduced their staffing to standard weekday business hours. Operations were still conducted at night and during the weekends, but these operations occurred autonomously without compromising their high standards for data collections. For the HST, which launched in 1990, reduced cost operations will employ a different operational concept, when the spacecraft enters its low-cost phase after its final servicing mission in 2004. Primarily due to the spacecraft's design, the HST Project has determined that single-shift operations will introduce unacceptable risks for the amount of dollars saved. More importantly, significant cost-savings can still be achieved by changing the operational concept for the FOT, while still maintaining round-the-clock staffing. It's important to note that the low-cost solutions obtained for one satellite may not be applicable for other satellites. This paper will contrast the differences between low-cost operational concepts for a satellite launched in 1998 versus a satellite launched in 1990.
NASA Astrophysics Data System (ADS)
Collier, Charles Patrick
2017-04-01
The Next Generation Space Interconnect Standard (NGSIS) effort is a Government-Industry collaboration to define a set of standards for interconnects between space system components with the goal of cost effectively removing bandwidth as a constraint for future space systems. The NGSIS team has selected the ANSI/VITA 65 OpenVPX™ standard family for the physical baseline. The RapidIO protocol has been selected as the basis for the digital data transport. The NGSIS standards are developed to provide sufficient flexibility to enable users to implement a variety of system configurations, while meeting goals for interoperability and robustness for space. The NGSIS approach and effort represents a radical departure from past approaches to achieve a Modular Open System Architecture (MOSA) for space systems and serves as an exemplar for the civil, commercial, and military Space communities as well as a broader high reliability terrestrial market.
Ethernet for Space Flight Applications
NASA Technical Reports Server (NTRS)
Webb, Evan; Day, John H. (Technical Monitor)
2002-01-01
NASA's Goddard Space Flight Center (GSFC) is adapting current data networking technologies to fly on future spaceflight missions. The benefits of using commercially based networking standards and protocols have been widely discussed and are expected to include reduction in overall mission cost, shortened integration and test (I&T) schedules, increased operations flexibility, and hardware and software upgradeability/scalability with developments ongoing in the commercial world. The networking effort is a comprehensive one encompassing missions ranging from small University Explorer (UNEX) class spacecraft to large observatories such as the Next Generation Space Telescope (NGST). Mission aspects such as flight hardware and software, ground station hardware and software, operations, RF communications, and security (physical and electronic) are all being addressed to ensure a complete end-to-end system solution. One of the current networking development efforts at GSFC is the SpaceLAN (Spacecraft Local Area Network) project, development of a space-qualifiable Ethernet network. To this end we have purchased an IEEE 802.3-compatible 10/100/1000 Media Access Control (MAC) layer Intellectual Property (IP) core and are designing a network node interface (NNI) and associated network components such as a switch. These systems will ultimately allow the replacement of the typical MIL-STD-1553/1773 and custom interfaces that inhabit most spacecraft. In this paper we will describe our current Ethernet NNI development along with a novel new space qualified physical layer that will be used in place of the standard interfaces. We will outline our plans for development of space qualified network components that will allow future spacecraft to operate in significant radiation environments while using a single onboard network for reliable commanding and data transfer. There will be a brief discussion of some issues surrounding system implications of a flight Ethernet. Finally, we will show an onboard network architecture for a proposed new mission using Ethernet for science data transport.
DOT National Transportation Integrated Search
2003-10-29
The Beta Test and Baseline Data Collection efforts ensured that the test technologies would successfully operate during the field operational test (FOT) in the designed scenario configurations. These efforts also ensured that FOT systems would succes...
Standardized Low-Power Wireless Communication Technologies for Distributed Sensing Applications
Vilajosana, Xavier; Tuset-Peiro, Pere; Vazquez-Gallego, Francisco; Alonso-Zarate, Jesus; Alonso, Luis
2014-01-01
Recent standardization efforts on low-power wireless communication technologies, including time-slotted channel hopping (TSCH) and DASH7 Alliance Mode (D7AM), are starting to change industrial sensing applications, enabling networks to scale up to thousands of nodes whilst achieving high reliability. Past technologies, such as ZigBee, rooted in IEEE 802.15.4, and ISO 18000-7, rooted in frame-slotted ALOHA (FSA), are based on contention medium access control (MAC) layers and have very poor performance in dense networks, thus preventing the Internet of Things (IoT) paradigm from really taking off. Industrial sensing applications, such as those being deployed in oil refineries, have stringent requirements on data reliability and are being built using new standards. Despite the benefits of these new technologies, industrial shifts are not happening due to the enormous technology development and adoption costs and the fact that new standards are not well-known and completely understood. In this article, we provide a deep analysis of TSCH and D7AM, outlining operational and implementation details with the aim of facilitating the adoption of these technologies to sensor application developers. PMID:24518893
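The time-slotted channel hopping mentioned above can be sketched in a few lines: in TSCH (IEEE 802.15.4e), the transmit channel for a scheduled cell is drawn from a shared hopping sequence indexed by the absolute slot number (ASN) plus the cell's channel offset. The hopping-sequence values and helper name below are placeholders for illustration.

```python
# Sketch of the TSCH channel-hopping rule: a cell is addressed by its absolute
# slot number (ASN) and channel offset, and the transmit channel is drawn from
# a shared hopping sequence. The sequence values below are placeholders.
HOPPING_SEQUENCE = [11, 16, 21, 26, 12, 17, 22, 25]   # example IEEE 802.15.4 channels

def tsch_channel(asn: int, channel_offset: int) -> int:
    """Channel for a given slot, per the TSCH rule
    channel = sequence[(ASN + channelOffset) mod sequence_length]."""
    return HOPPING_SEQUENCE[(asn + channel_offset) % len(HOPPING_SEQUENCE)]

# Two nodes sharing a schedule agree on the channel for each slot, while the
# same (slot, offset) cell hops across channels as the ASN advances.
for asn in range(5):
    print(asn, tsch_channel(asn, channel_offset=2))
```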
The RAI DBS experiment with Olympus
NASA Astrophysics Data System (ADS)
Castelli, Enzo
The Italian broadcasting network (RAI) has studied the development of a national DBS service in an effort to outline a proposal for a space segment configuration compatible with the development of new services, including HDTV. Proposals considered so far feature the integration of RAI's channel on Olympus into a future operational system after extensive experimental use. The contents of the experimental program are discussed, and the need for a broadcasting standard that considers the projected introduction of HDTV is noted. The debate between RAI and consumer electronic industries on the use of broadcasting standards is outlined. The position of RAI in the context of HDTV and DBS is defined and the issue of determining the most effective transmission standard during the experimental stage is raised. It is pointed out that, in the absence of new production facilities for HDTV, the maximum quality which MAC will yield will be that of PAL since programs must be produced in PAL and then converted into MAC. Two alternatives for strategy on the use of broadcasting standards for DBS are offered. Finally, technical experiments and a market survey are discussed.
Compliance With the Healthy Eating Standards in YMCA After-School Programs.
Beets, Michael W; Weaver, R Glenn; Turner-McGrievy, Gabrielle; Beighle, Aaron; Moore, Justin B; Webster, Collin; Khan, Mahmud; Saunders, Ruth
2016-09-01
In 2011, the YMCA of the US adopted Healthy Eating standards for all of their after-school programs (ASPs). The extent to which YMCA ASPs comply with the standards is unknown. Twenty ASPs from all YMCA ASPs across South Carolina (N = 102) were invited to participate. Direct observations of the food and beverages served and staff behaviors were collected on 4 nonconsecutive days per ASP. One ASP did not serve a snack. Of the remaining ASPs, a total of 26% served a fruit or vegetable and 32% served water every day; 26% served sugar-sweetened beverages, 47% served sugar-added foods, and only 11% served whole grains when grains were served. Staff members sat with the children (65%) or verbally promoted healthy eating (15%) on at least 1 observation day. Staff consumed non-approved drinks (25%) or foods (45%) on at least 1 observation day. No ASPs served snacks family-style every day. Additional efforts are required to assist YMCA-operated ASPs in achieving these important nutrition standards.
1991-09-01
Secretary of Defense as a method to achieve savings in the $9 billion spent annually on information technology in DoD and to promote interoperability and ... commander. Without standard conventions for terminology, tactics or operations, the different units would have to expend much more time and effort to
NSSDC activities with 12-inch optical disk drives
NASA Technical Reports Server (NTRS)
Lowrey, Barbara E.; Lopez-Swafford, Brian
1986-01-01
The development status of optical-disk data transfer and storage technology at the National Space Science Data Center (NSSDC) is surveyed. The aim of the R&D program is to facilitate the exchange of large volumes of data. Current efforts focus on a 12-inch 1-Gbyte write-once/read-many disk and a disk drive which interfaces with VAX/VMS computer systems. The history of disk development at NSSDC is traced; the results of integration and performance tests are summarized; the operating principles of the 12-inch system are explained and illustrated with diagrams; and the need for greater standardization is indicated.
NASA Astrophysics Data System (ADS)
Mirvis, E.; Iredell, M.
2015-12-01
The operational (OPS) NOAA National Centers for Environmental Prediction (NCEP) suite traditionally consists of a large set of multi-scale HPC models, workflows, scripts, tools and utilities, which depend heavily on a variety of additional components. Namely, this suite utilizes a unique collection of more than 20 in-house developed shared libraries (NCEPLIBS), specific versions of third-party libraries (like netcdf, HDF, ESMF, jasper, xml, etc.), and an HPC workflow tool within a dedicated (sometimes even vendor-customized) homogeneous HPC system environment. This domain- and site-specific setup, combined with NCEP's product-driven, large-scale real-time data operations, complicates NCEP collaborative development tremendously by reducing the chances of replicating this OPS environment anywhere else. The mission of the NOAA/NCEP Environmental Modeling Center (EMC) is to develop and improve numerical weather, climate, hydrological and ocean prediction through partnership with the research community. Recognizing these difficulties, EMC has lately taken an innovative approach to improve the flexibility of the HPC environment by building the elements and a foundation for an NCEP OPS functionally equivalent environment (FEE), which can also be used to ease external interface constructs. Aiming to reduce the turnaround time of community code enhancements via the Research-to-Operations (R2O) cycle, EMC has developed and deployed several project sub-set standards that have already paved the road to NCEP OPS implementation standards. In this topic we will discuss the EMC FEE for O2R requirements and approaches in collaborative standardization, including NCEPLIBS FEE and model code version control paired with the models' derived customized HPC modules and FEE footprints. We will share NCEP/EMC experience and potential in the refactoring of EMC development processes and legacy codes and in securing model source code quality standards by using a combination of the Eclipse IDE integrated with reverse engineering tools/APIs. We will also report on collaborative efforts in the restructuring of the NOAA Environmental Modeling System (NEMS) - the multi-model and coupling framework - and on transitioning the FEE verification methodology.
Future Standardization of Space Telecommunications Radio System with Core Flight System
NASA Technical Reports Server (NTRS)
Briones, Janette C.; Hickey, Joseph P.; Roche, Rigoberto; Handler, Louis M.; Hall, Charles S.
2016-01-01
NASA Glenn Research Center (GRC) is integrating the NASA Space Telecommunications Radio System (STRS) Standard with the Core Flight System (cFS), an avionics software operating environment. The STRS standard provides a common, consistent framework to develop, qualify, operate and maintain complex, reconfigurable and reprogrammable radio systems. The cFS is a flexible, open architecture that features a plug-and-play software executive called the Core Flight Executive (cFE), a reusable library of software components for flight and space missions and an integrated tool suite. Together, STRS and cFS create a development environment that allows for STRS compliant applications to reference the STRS application programmer interfaces (APIs) that use the cFS infrastructure. These APIs are used to standardize the communication protocols on NASA's space SDRs. The cFS-STRS Operating Environment (OE) is a portable cFS library, which adds the ability to run STRS applications on existing cFS platforms. The purpose of this paper is to discuss the cFS-STRS OE prototype, preliminary experimental results performed using the Advanced Space Radio Platform (ASRP), the GRC S-band Ground Station and the SCaN (Space Communication and Navigation) Testbed currently flying onboard the International Space Station (ISS). Additionally, this paper presents a demonstration of the Consultative Committee for Space Data Systems (CCSDS) Spacecraft Onboard Interface Services (SOIS) using electronic data sheets (EDS) inside cFE. This configuration allows for the data sheets to specify binary formats for data exchange between STRS applications. The integration of STRS with cFS leverages mission-proven platform functions and mitigates barriers to integration with future missions. This reduces flight software development time and the costs of software-defined radio (SDR) platforms. Furthermore, the combined benefits of STRS standardization with the flexibility of cFS provide an effective, reliable and modular framework to minimize software development efforts for spaceflight missions.
NASA Technical Reports Server (NTRS)
Hammock, William R., Jr.; Cota, Phillip E., Jr.; Rosenbaum, Bernard J.; Barrett, Michael J.
1991-01-01
Standard leak detection methods at ambient temperature have been developed in order to prevent excessive leakage from the Space Shuttle liquid oxygen and liquid hydrogen Main Propulsion System. Unacceptable hydrogen leakage was encountered on the Columbia and Atlantis flight vehicles in the summer of 1990 after the standard leak check requirements had been satisfied. The leakage was only detectable when the fuel system was exposed to subcooled liquid hydrogen during External Tank loading operations. Special instrumentation and analytical tools were utilized during a series of propellant tanking tests in order to identify the sources of the hydrogen leakage. After the leaks were located and corrected, the physical characteristics of the leak sources were analyzed in an effort to understand how the discrepancies were introduced and why the leakage had evaded the standard leak detection methods. As a result of the post-leak analysis, corrective actions and leak detection improvements have been implemented in order to preclude a similar incident.
Standardizing electrofishing power for boat electrofishing: chapter 14
Miranda, L.E. (Steve); Bonar, Scott A.; Hubert, Wayne A.; Willis, David W.
2009-01-01
Standardizing boat electrofishing entails achieving an accepted level of collection consistency by managing various factors, including (1) the temporal and spatial distribution of sampling effort, (2) boat operation, (3) equipment configuration, (4) characteristics of the waveform and energized field, and (5) power transferred to fish. This chapter focuses exclusively on factor 5; factors 1-4 have been addressed in earlier chapters. Additionally, while the concepts covered in this chapter address boat electrofishing in general, the power settings discussed were developed from tests with primarily warmwater fish communities. Others (see Chapter 9) recommend lower power settings for communities consisting primarily of coldwater fishes. For reviews of basic concepts of electricity, electrofishing theory and systems, fish behavior relative to diverse waveforms, and injury matters, the reader is referred to Novotny (1990), Reynolds (1996), and Snyder (2003).
NASA Technical Reports Server (NTRS)
Hemsch, Michael J.
1996-01-01
As part of a continuing effort to re-engineer the wind tunnel testing process, a comprehensive data quality assurance program is being established at NASA Langley Research Center (LaRC). The ultimate goal of the program is routine provision of tunnel-to-tunnel reproducibility with total uncertainty levels acceptable for test and evaluation of civilian transports. The operational elements for reaching such levels of reproducibility are: (1) statistical control, which provides long term measurement uncertainty predictability and a base for continuous improvement, (2) measurement uncertainty prediction, which provides test designs that can meet data quality expectations with the system's predictable variation, and (3) national standards, which provide a means for resolving tunnel-to-tunnel differences. The paper presents the LaRC design for the program and discusses the process of implementation.
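Element (2), measurement uncertainty prediction, can be illustrated with a generic root-sum-square combination of elemental standard uncertainties checked against a data-quality requirement. The budget entries and tolerance below are invented numbers, not LaRC values.

```python
# Minimal root-sum-square (RSS) combination of elemental standard uncertainties,
# as one simple way to predict whether a test design meets a data-quality goal.
# The elemental values and the requirement are invented for illustration only.
import math

elemental = {
    "balance_calibration": 0.0010,   # standard uncertainties in coefficient units
    "model_attitude":      0.0007,
    "dynamic_pressure":    0.0008,
    "repeatability":       0.0012,
}

u_combined = math.sqrt(sum(u**2 for u in elemental.values()))
U_expanded = 2.0 * u_combined        # coverage factor k = 2 (roughly 95 %)

requirement = 0.0040                 # hypothetical data-quality requirement
print(f"expanded uncertainty = {U_expanded:.4f}, "
      f"meets requirement: {U_expanded <= requirement}")
```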
The philosophy of benchmark testing a standards-based picture archiving and communications system.
Richardson, N E; Thomas, J A; Lyche, D K; Romlein, J; Norton, G S; Dolecek, Q E
1999-05-01
The Department of Defense issued its requirements for a Digital Imaging Network-Picture Archiving and Communications System (DIN-PACS) in a Request for Proposals (RFP) to industry in January 1997, with subsequent contracts being awarded in November 1997 to the Agfa Division of Bayer and IBM Global Government Industry. The Government's technical evaluation process consisted of evaluating a written technical proposal as well as conducting a benchmark test of each proposed system at the vendor's test facility. The purpose of benchmark testing was to evaluate the performance of the fully integrated system in a simulated operational environment. The benchmark test procedures and test equipment were developed through a joint effort between the Government, academic institutions, and private consultants. Herein the authors discuss the resources required and the methods used to benchmark test a standards-based PACS.
Cultural intelligence support for military operations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guthormsen, Amy M.; MacKerrow, Edward P; Merritt, Terence M
It has long been recognized that military success relies on knowledge of the enemy. In the context of standard warfare, adequate knowledge of the enemy may be gained by analyzing observable, measurable data. In the context of modern counterinsurgency operations and the global war on terror, the task of predicting the behavior of the enemy is vastly more complex and difficult. Without an understanding of the ways individuals in the host nation interpret and react to events, no amount of objective information can provide the insight required to accurately predict behavior. US military doctrine has begun to recognize the importance of the many ways that local culture can affect operation success. Increasingly military decision makers use cultural information in the service of operation planning, and troops are provided with pre-deployment cultural training. However, no amount of training can cover the breadth and depth of potentially useful cultural information, and no amount of careful planning can avoid the need to adapt as situations develop. Therefore, a critical challenge is to provide useful tools to US personnel in their efforts to collect, analyze, and utilize cultural information. Essential functions for cultural support tools include the following: (1) to narrow down a broad range of available data and focus the user's attention on context-relevant information, (2) to present cultural information in an easily understood form, (3) to prompt the user to seek relevant information in the environment, (4) to synthesize information, and (5) to predict outcomes based on possible courses of operation. In this paper, we begin by reviewing the ways in which military operations can benefit from cultural intelligence. We then discuss frameworks for analyzing cultural information in the context of a military operation. We conclude with a demonstration of our current efforts to develop a tool that meets the aforementioned functional challenges.
1996 DOE technical standards program workshop: Proceedings
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-07-01
The workshop theme is "The Strategic Standardization Initiative - A Technology Exchange and Global Competitiveness Challenge for DOE." The workshop goal is to inform the DOE technical standards community of strategic standardization activities taking place in the Department, other Government agencies, standards developing organizations, and industry. Individuals working on technical standards will be challenged to improve cooperation and communications with the involved organizations in response to the initiative. Workshop sessions include presentations by representatives from various Government agencies that focus on coordination among and participation of Government personnel in the voluntary standards process; reports by standards organizations, industry, and DOE representatives on current technology exchange programs; and how the road ahead appears for "information superhighway" standardization. Another session highlights successful standardization case studies selected from several sites across the DOE complex. The workshop concludes with a panel discussion on the goals and objectives of the DOE Technical Standards Program as envisioned by senior DOE management. The annual workshop on technical standards has proven to be an effective medium for communicating information related to standards throughout the DOE community. Technical standards are used to transfer technology and standardize work processes to produce consistent, acceptable results. They provide a practical solution to the Department's challenge to protect the environment and the health and safety of the public and workers during all facility operations. Through standards, the technologies of industries and governments worldwide are available to DOE. The DOE Technical Standards Program, a Department-wide effort that crosscuts all organizations and disciplines, links the Department to those technologies.
Rolling Deck to Repository I: Designing a Database Infrastructure
NASA Astrophysics Data System (ADS)
Arko, R. A.; Miller, S. P.; Chandler, C. L.; Ferrini, V. L.; O'Hara, S. H.
2008-12-01
The NSF-supported academic research fleet collectively produces a large and diverse volume of scientific data, which are increasingly being shared across disciplines and contributed to regional and global syntheses. As both Internet connectivity and storage technology improve, it becomes practical for ships to routinely deliver data and documentation for a standard suite of underway instruments to a central shoreside repository. Routine delivery will facilitate data discovery and integration, quality assessment, cruise planning, compliance with funding agency and clearance requirements, and long-term data preservation. We are working collaboratively with ship operators and data managers to develop a prototype "data discovery system" for NSF-supported research vessels. Our goal is to establish infrastructure for a central shoreside repository, and to develop and test procedures for the routine delivery of standard data products and documentation to the repository. Related efforts are underway to identify tools and criteria for quality control of standard data products, and to develop standard interfaces and procedures for maintaining an underway event log. Development of a shoreside repository infrastructure will include: 1. Deployment and testing of a central catalog that holds cruise summaries and vessel profiles. A cruise summary will capture the essential details of a research expedition (operating institution, ports/dates, personnel, data inventory, etc.), as well as related documentation such as event logs and technical reports. A vessel profile will capture the essential details of a ship's installed instruments (manufacturer, model, serial number, reference location, etc.), with version control as the profile changes through time. The catalog's relational database schema will be based on the UNOLS Data Best Practices Committee's recommendations, and published as a formal XML specification. 2. Deployment and testing of a central repository that holds navigation and routine underway data. Based on discussion with ship operators and data managers at a workgroup meeting in September 2008, we anticipate that a subset of underway data could be delivered from ships to the central repository in near-realtime - enabling the integrated display of ship tracks at a public Web portal, for example - and a full data package could be delivered post-cruise by network transfer or disk shipment. Once ashore, data sets could be distributed to assembly centers such as the Shipboard Automated Meteorological and Oceanographic System (SAMOS) for routine processing, quality assessment, and synthesis efforts - as well as transmitted to national data centers such as NODC and NGDC for permanent archival. 3. Deployment and testing of a basic suite of Web services to make cruise summaries, vessel profiles, event logs, and navigation data easily available. A standard set of catalog records, maps, and navigation features will be published via the Open Archives Initiative (OAI) and Open Geospatial Consortium (OGC) protocols, which can then be harvested by partner data centers and/or embedded in client applications.
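A hypothetical sketch of the two record types described in item 1 above, a cruise summary and a versioned vessel profile. Field names are assumptions based on the description in the abstract, not the actual UNOLS/R2R schema.

```python
# Hypothetical record structures for the catalog described above: a cruise
# summary capturing the essentials of an expedition and a versioned vessel
# profile of installed instruments. Field names are illustrative assumptions,
# not the UNOLS/R2R schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Instrument:
    manufacturer: str
    model: str
    serial_number: str
    reference_location: str           # position of the sensor on the ship

@dataclass
class VesselProfile:
    vessel_name: str
    profile_version: int              # incremented when instrumentation changes
    instruments: List[Instrument] = field(default_factory=list)

@dataclass
class CruiseSummary:
    cruise_id: str
    operator: str
    port_start: str
    port_end: str
    dates: str
    personnel: List[str] = field(default_factory=list)
    data_inventory: List[str] = field(default_factory=list)   # datasets delivered

profile = VesselProfile("R/V Example", 3,
                        [Instrument("Kongsberg", "EM122", "SN-001", "hull, centerline")])
cruise = CruiseSummary("EX-2008-07", "Example Institution", "Honolulu", "Guam",
                       "2008-09-01 to 2008-09-20",
                       personnel=["Chief Scientist A."],
                       data_inventory=["multibeam", "navigation", "met"])
print(cruise.cruise_id, len(profile.instruments))
```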
A comparison of measured wind park load histories with the WISPER and WISPERX load spectra
NASA Astrophysics Data System (ADS)
Kelley, N. D.
1995-01-01
The blade-loading histories from two adjacent Micon 65/13 wind turbines are compared with the variable-amplitude test-loading histories known as the WISPER and WISPERX spectra. These standardized loading sequences were developed from blade flapwise load histories taken from nine different horizontal-axis wind turbines operating under a wide range of conditions in Europe. The subject turbines covered a broad spectrum of rotor diameters, materials, and operating environments. The final loading sequences were developed as a joint effort of thirteen different European organizations. The goal was to develop a meaningful loading standard for horizontal-axis wind turbine blades that represents common interaction effects seen in service. In 1990, NREL made extensive load measurements on two adjacent Micon 65/13 wind turbines in simultaneous operation in the very turbulent environment of a large wind park. Further, before and during the collection of the loads data, comprehensive measurements of the statistics of the turbulent environment were obtained at both the turbines under test and at two other locations within the park. The trend to larger but lighter wind turbine structures has made an understanding of the expected lifetime loading history of paramount importance. Experience in the US has shown that the turbulence-induced loads associated with multi-row wind parks in general are much more severe than for turbines operating individually or within widely spaced environments. Multi-row wind parks are much more common in the US than in Europe. In this paper we report on our results in applying the methodology utilized to develop the WISPER and WISPERX standardized loading sequences using the available data from the Micon turbines. While the intended purpose of the WISPER sequences was not to represent a specific operating environment, we believe the exercise is useful, especially when a turbine design is likely to be installed in a multi-row wind park.
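One early step in building a variable-amplitude loading sequence of this kind can be sketched as reducing a sampled flapwise load history to its turning points (local peaks and valleys). The actual WISPER/WISPERX methodology involves further filtering, discretization, and concatenation across turbines, so the function below is only a simplified illustration.

```python
# Minimal sketch of one early step in building a variable-amplitude loading
# sequence: reducing a sampled flapwise load history to its turning points
# (local peaks and valleys). The actual WISPER/WISPERX procedure involves
# additional filtering, discretization, and concatenation across turbines.
def turning_points(history):
    """Return the local extrema of a load time history, endpoints included."""
    if len(history) < 3:
        return list(history)
    tp = [history[0]]
    for prev, cur, nxt in zip(history, history[1:], history[2:]):
        if (cur - prev) * (nxt - cur) < 0:      # slope changes sign -> extremum
            tp.append(cur)
    tp.append(history[-1])
    return tp

sample = [0.0, 0.4, 0.9, 0.5, 0.2, 0.7, 1.3, 1.1, 0.3, 0.8]
print(turning_points(sample))   # peaks and valleys only
```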
2011-05-19
Operation Octopus: Having established credibility, the EOU assigned its analysts to branches of the US air and ... synchronize efforts. The EOU members coined this effort Operation Octopus. This operation was critical for the EOU as it allowed EOU members to
Compacted graphite iron: Cast iron makes a comeback
NASA Astrophysics Data System (ADS)
Dawson, S.
1994-08-01
Although compacted graphite iron has been known for more than four decades, the absence of a reliable mass-production technique has resulted in relatively little effort to exploit its operational benefits. However, a proven on-line process control technology developed by SinterCast allows for series production of complex components in high-quality CGI. The improved mechanical properties of compacted graphite iron relative to conventional gray iron allow for substantial weight reduction in gasoline and diesel engines or substantial increases in horsepower, or an optimal combination of both. Concurrent with these primary benefits, CGI also provides significant emissions and fuel efficiency benefits allowing automakers to meet legislated performance standards. The operational and environmental benefits of compacted graphite iron together with its low cost and recyclability reinforce cast iron as a prime engineering material for the future.
Comparison of icing cloud instruments for 1982-1983 icing season flight program
NASA Technical Reports Server (NTRS)
Ide, R. F.; Richter, G. P.
1984-01-01
A number of modern and old-style liquid water content (LWC) and droplet sizing instruments were mounted on a DeHavilland DHC-6 Twin Otter and operated in natural icing clouds in order to determine their comparative operating characteristics and their limitations over a broad range of conditions. The evaluation period occurred during the 1982-1983 icing season from January to March 1983. Time histories of all instrument outputs were plotted and analyzed to assess instrument repeatability and reliability. Scatter plots were also generated for comparison of instruments. The measured LWC from four instruments differed by as much as 20 percent. The measured droplet size from two instruments differed by an average of three microns. The overall effort demonstrated the need for additional data, and for some means of calibrating these instruments to known standards.
Perspectives on three issues facing the transportation manager in the nineties. Research report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, C.A.; Marzette, D.; McCoy, B.
1999-03-01
The nineties have been a period of tremendous change for the transportation industry. The Intermodal Surface Transportation Efficiency Act, Clean Air Act Amendments, Americans with Disabilities Act, and increasing gender and ethnic diversity have caused agencies to reassess their standard operating procedures. Greater knowledge has been sought by senior level transportation officials in an effort to prepare agencies for the changing policy, including seminars and workshops, revisions to policy manuals, and strengthened procedures regarding how issues will be resolved. This research examines the level and nature of direct impacts on the transportation organization. Major legislative changes and mandates have imposed the need for changes in how transportation systems operate. Transportation professionals continue to be challenged to develop plans and implement services that respond to mandates within the framework of the legislation.
Cassini's Test Methodology for Flight Software Verification and Operations
NASA Technical Reports Server (NTRS)
Wang, Eric; Brown, Jay
2007-01-01
The Cassini spacecraft was launched on 15 October 1997 on a Titan IV-B launch vehicle. The spacecraft is comprised of various subsystems, including the Attitude and Articulation Control Subsystem (AACS). The AACS Flight Software (FSW) and its development has been an ongoing effort, from the design, development and finally operations. As planned, major modifications to certain FSW functions were designed, tested, verified and uploaded during the cruise phase of the mission. Each flight software upload involved extensive verification testing. A standardized FSW testing methodology was used to verify the integrity of the flight software. This paper summarizes the flight software testing methodology used for verifying FSW from pre-launch through the prime mission, with an emphasis on flight experience testing during the first 2.5 years of the prime mission (July 2004 through January 2007).
20 °C—A Short History of the Standard Reference Temperature for Industrial Dimensional Measurements
Doiron, Ted
2007-01-01
One of the basic principles of dimensional metrology is that a part dimension changes with temperature because of thermal expansion. Since 1931 industrial lengths have been defined as the size at 20 °C. This paper discusses the variety of standard temperatures that were in use before that date, the efforts of C.E. Johansson to meet these variations, and the effort by the National Bureau of Standards to bring the United States to the eventual world standard. PMID:27110451
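The role of the 20 °C reference can be illustrated with the standard linear thermal-expansion correction, which reduces a length measured at temperature T to its size at 20 °C. The expansion coefficient below is a typical value for steel and the measured values are invented for illustration.

```python
# Reducing a length measured at temperature T to its size at the 20 degC
# reference temperature using the linear expansion relation
#   L_20 = L_T / (1 + alpha * (T - 20)).
# The expansion coefficient is a typical value for steel; the numbers are
# invented for illustration.
ALPHA_STEEL = 11.5e-6      # 1/degC, approximate

def length_at_20C(measured_length_mm: float, temp_C: float,
                  alpha: float = ALPHA_STEEL) -> float:
    return measured_length_mm / (1.0 + alpha * (temp_C - 20.0))

# A 100 mm steel gauge block measured at 23 degC reads roughly 0.0035 mm longer
# than its 20 degC size, so the correction recovers a value near 100.0000 mm.
print(f"{length_at_20C(100.0034, 23.0):.4f} mm")
```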
Quality Assurance through ISO 9000.
ERIC Educational Resources Information Center
Zuckerman, Amy
2000-01-01
Created in 1987 by the International Organization for Standardization, in Geneva, Switzerland, ISO 9000 is attempting to develop a world standard to help companies and other institutions measure and monitor their quality-control efforts. This article describes four school districts' successful efforts to secure ISO 9000 certification. (MLH)
Morgan, Lauren; New, Steve; Robertson, Eleanor; Collins, Gary; Rivero-Arias, Oliver; Catchpole, Ken; Pickering, Sharon P; Hadi, Mohammed; Griffin, Damian; McCulloch, Peter
2015-02-01
Standard operating procedures (SOPs) should improve safety in the operating theatre, but controlled studies evaluating the effect of staff-led implementation are needed. In a controlled interrupted time series, we evaluated three team process measures (compliance with WHO surgical safety checklist, non-technical skills and technical performance) and three clinical outcome measures (length of hospital stay, complications and readmissions) before and after a 3-month staff-led development of SOPs. Process measures were evaluated by direct observation, using Oxford Non-Technical Skills II for non-technical skills and the 'glitch count' for technical performance. All staff in two orthopaedic operating theatres were trained in the principles of SOPs and then assisted to develop standardised procedures. Staff in a control operating theatre underwent the same observations but received no training. The change in difference between active and control groups was compared before and after the intervention using repeated measures analysis of variance. We observed 50 operations before and 55 after the intervention and analysed clinical data on 1022 and 861 operations, respectively. The staff chose to structure their efforts around revising the 'whiteboard' which documented and prompted tasks, rather than directly addressing specific task problems. Although staff preferred and sustained the new system, we found no significant differences in process or outcome measures before/after intervention in the active versus the control group. There was a secular trend towards worse outcomes in the postintervention period, seen in both active and control theatres. SOPs when developed and introduced by frontline staff do not necessarily improve operative processes or outcomes. The inherent tension in improvement work between giving staff ownership of improvement and maintaining control of direction needs to be managed, to ensure staff are engaged but invest energy in appropriate change.
Meter shop equipment, techniques, and operation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, R.F.
1995-12-01
The development of new equipment inevitably results in new techniques and operational procedures to be implemented. Innovative techniques can result in operational changes and the development of new equipment. An operational modification opens the door for the development of new ideas, new equipment, and new techniques. This constant cycle of change promotes a continuous series of economic evaluations and decisions. Equipment and technological changes in measuring natural gas have resulted in modifications from past practices. The changes underway today will influence the way we plan and budget for the future, and, most notably, will change the way we repair gas meters. The meter shop of today will need to look at the way that these changes will become available and how they will help efforts in reducing costs and increasing productivity. For instance, the standard three- and four-chambered diaphragm gas meter design has not been significantly changed in the past few decades aside from having heavy, cast iron bodies changed to lightweight aluminum bodies and utilizing a few plastic parts. But changes have been made, and more are on the threshold.
Canik, John M.; Briesemeister, Alexis R.; McLean, Adam G.; ...
2017-05-10
Recent experiments in DIII-D helium plasmas are examined to resolve the role of atomic and molecular physics in major discrepancies between experiment and modeling of dissipative divertor operation. Helium operation removes the complicated molecular processes of deuterium plasmas that are a prime candidate for the inability of standard fluid models to reproduce dissipative divertor operation, primarily the consistent under-prediction of radiated power. Modeling of these experiments shows that the full divertor radiation can be accounted for, but only if measures are taken to ensure that the model reproduces the measured divertor density. Relying on upstream measurements instead results in a lower divertor density and radiation than is measured, indicating a need for improved modeling of the connection between the divertor and the upstream scrape-off layer. Furthermore, these results show that fluid models are able to quantitatively describe the divertor-region plasma, including radiative losses, and indicate that efforts to improve the fidelity of the molecular deuterium models are likely to help resolve the discrepancy in radiation for deuterium plasmas.
Operational Support for Instrument Stability through ODI-PPA Metadata Visualization and Analysis
NASA Astrophysics Data System (ADS)
Young, M. D.; Hayashi, S.; Gopu, A.; Kotulla, R.; Harbeck, D.; Liu, W.
2015-09-01
Over long time scales, quality assurance metrics taken from calibration and calibrated data products can aid observatory operations in quantifying the performance and stability of the instrument, and identify potential areas of concern or guide troubleshooting and engineering efforts. Such methods traditionally require manual SQL entries, assuming the requisite metadata has even been ingested into a database. With the ODI-PPA system, QA metadata has been harvested and indexed for all data products produced over the life of the instrument. In this paper we will describe how, utilizing the industry standard Highcharts Javascript charting package with a customized AngularJS-driven user interface, we have made the process of visualizing the long-term behavior of these QA metadata simple and easily replicated. Operators can easily craft a custom query using the powerful and flexible ODI-PPA search interface and visualize the associated metadata in a variety of ways. These customized visualizations can be bookmarked, shared, or embedded externally, and will be dynamically updated as new data products enter the system, enabling operators to monitor the long-term health of their instrument with ease.
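A generic analog of the metadata-trending idea described above: compute control limits for a harvested QA metric from a baseline period and flag later epochs that drift outside them. ODI-PPA itself renders such series with Highcharts in an AngularJS interface; the metric name and numbers here are invented for illustration.

```python
# Generic analog of trending a harvested QA metric over time: compute control
# limits from a baseline period and flag later epochs that drift outside them.
# The metric name and values are invented for illustration.
import statistics

# (date, median zeropoint) pairs as they might be harvested from calibrations
series = [("2014-01", 25.21), ("2014-02", 25.22), ("2014-03", 25.20),
          ("2014-04", 25.21), ("2014-05", 25.23), ("2014-06", 25.12)]

baseline = [v for _, v in series[:4]]
mean = statistics.mean(baseline)
sigma = statistics.stdev(baseline)
lo, hi = mean - 3 * sigma, mean + 3 * sigma

for date, value in series:
    flag = "" if lo <= value <= hi else "  <-- investigate"
    print(f"{date}: {value:.2f}{flag}")
```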
Addressable configurations of DNA nanostructures for rewritable memory
Levchenko, Oksana; Patel, Dhruv S.; MacIsaac, Molly
2017-01-01
DNA serves as nature's information storage molecule, and has been the primary focus of engineered systems for biological computing and data storage. Here we combine recent efforts in DNA self-assembly and toehold-mediated strand displacement to develop a rewritable multi-bit DNA memory system. The system operates by encoding information in distinct and reversible conformations of a DNA nanoswitch and decoding by gel electrophoresis. We demonstrate a 5-bit system capable of writing, erasing, and rewriting binary representations of alphanumeric symbols, as well as compatibility with ‘OR’ and ‘AND’ logic operations. Our strategy is simple to implement, requiring only a single mixing step at room temperature for each operation and standard gel electrophoresis to read the data. We envision such systems could find use in covert product labeling and barcoding, as well as secure messaging and authentication when combined with previously developed encryption strategies. Ultimately, this type of memory has exciting potential in biomedical sciences as data storage can be coupled to sensing of biological molecules. PMID:28977499
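The 5-bit encoding and logic compatibility described above can be illustrated abstractly. In the actual system each bit is a reversible nanoswitch conformation read out on a gel; the symbol mapping below (A = 1 through Z = 26, which fits in five bits) is an arbitrary assumption made for illustration.

```python
# Abstract illustration of a 5-bit rewritable register and bitwise OR/AND,
# mirroring the write/erase/read cycle described above. In the actual system
# each bit is a reversible nanoswitch conformation read out on a gel; the
# symbol mapping here is an arbitrary assumption (A=1 ... Z=26 fits in 5 bits).
def write(symbol: str) -> int:
    return ord(symbol.upper()) - ord("A") + 1      # 'A' -> 0b00001

def read(bits: int) -> str:
    return chr(bits - 1 + ord("A")) if 1 <= bits <= 26 else "?"

def erase() -> int:
    return 0b00000

reg = write("K")                 # write
print(f"{reg:05b} -> {read(reg)}")
print(f"OR : {write('A') | write('B'):05b}")   # bitwise OR of two stored words
print(f"AND: {write('C') & write('G'):05b}")   # bitwise AND
reg = erase()                    # erase, ready for rewrite
```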
Pant, Saumya; Weiner, Russell; Marton, Matthew J.
2014-01-01
Over the past decade, next-generation sequencing (NGS) technology has experienced meteoric growth in the aspects of platform, technology, and supporting bioinformatics development allowing its widespread and rapid uptake in research settings. More recently, NGS-based genomic data have been exploited to better understand disease development and patient characteristics that influence response to a given therapeutic intervention. Cancer, as a disease characterized by and driven by the tumor genetic landscape, is particularly amenable to NGS-based diagnostic (Dx) approaches. NGS-based technologies are particularly well suited to studying cancer disease development, progression and emergence of resistance, all key factors in the development of next-generation cancer Dxs. Yet, to achieve the promise of NGS-based patient treatment, drug developers will need to overcome a number of operational, technical, regulatory, and strategic challenges. Here, we provide a succinct overview of the state of the clinical NGS field in terms of the available clinically targeted platforms and sequencing technologies. We discuss the various operational and practical aspects of clinical NGS testing that will facilitate or limit the uptake of such assays in routine clinical care. We examine the current strategies for analytical validation and Food and Drug Administration (FDA)-approval of NGS-based assays and ongoing efforts to standardize clinical NGS and build quality control standards for the same. The rapidly evolving companion diagnostic (CDx) landscape for NGS-based assays will be reviewed, highlighting the key areas of concern and suggesting strategies to mitigate risk. The review will conclude with a series of strategic questions that face drug developers and a discussion of the likely future course of NGS-based CDx development efforts. PMID:24860780
Unmanned Aircraft Systems (UAS) Integration in the National Airspace System (NAS) Project
NASA Technical Reports Server (NTRS)
Shively, Jay
2017-01-01
Over the past 5 years, the UAS Integration in the NAS project has worked to reduce technical barriers to integration. A major focus of this work has been in support of RTCA SC-228, which has recently published the first UAS integration minimum operational performance standards (MOPS). This work has spanned detect and avoid (DAA) as well as command and control communication datalinks. I will discuss the DAA efforts with a focus on the human-systems work, including how automation was addressed within this context. ICAO stood up a remotely piloted aircraft systems (RPAS) panel in 2014; the panel has developed an RPAS manual and is now working to revise existing annexes and standards and recommended practices. The Human In The System (HITS) group has worked to infuse human factors guidelines into those documents. I will discuss that effort, as well as how ICAO has defined and addressed autonomy. There is a great deal of interest in the control of multiple vehicles by a single operator, and the UAS EXCOM Science and Research Panel (SARP) is holding a workshop on this topic in late June. I will discuss research performed on this topic when I worked for the Army, along with ongoing work within the division and a NATO working group on Human-Autonomy Teaming.
Biological standards for the Knowledge-Based BioEconomy: What is at stake.
de Lorenzo, Víctor; Schmidt, Markus
2018-01-25
The contribution of life sciences to the Knowledge-Based Bioeconomy (KBBE) calls for the transition of contemporary, gene-based biotechnology from a trial-and-error endeavour to an authentic branch of engineering. One requisite to this end is standards to accurately measure and represent biological functions, along with languages for data description and exchange. However, the inherent complexity of biological systems and the lack of a quantitative tradition in the field have largely curbed this enterprise. Fortunately, the onset of systems and synthetic biology has emphasized the need for standards not only to manage omics data, but also to increase reproducibility and provide the means of engineering living systems in earnest. Some domains of biotechnology can be easily standardized (e.g. physical composition of DNA sequences, tools for genome editing, languages to encode workflows), others might be standardized with some dedicated research (e.g. biological metrology, operative systems for bio-programming cells), and others will require considerable effort, e.g. defining the rules that allow functional composition of biological activities. Despite the difficulties, these are worthy attempts, as the history of technology shows that those who set or adopt standards gain a competitive advantage over those who do not. Copyright © 2017 Elsevier B.V. All rights reserved.
2015-07-01
steps to identify and mitigate potential challenges; (2) extent the services’ efforts to validate gender-neutral occupational standards are...to address statutory and Joint Staff requirements for validating gender-neutral occupational standards. GAO identified five elements required for...SOCOM Have Studies Underway to Validate Gender-Neutral Occupational Standards 21 DOD Is Providing Oversight of Integration Efforts, but Has Not
Effectiveness of the Department of Defense Information Assurance Accreditation Process
2013-03-01
meeting the requirements of ISO 27001, Information Security Management System. ISO 27002 provides “security techniques” or best practices that can be...efforts to the next level and implement a recognized standard such as the International Organization for Standardization (ISO) 27000 Series of standards...implemented by an organization as part of their certification effort.15 Most likely, the main motivation a company would have for achieving an ISO
A Critical Path for Data Integration in the U.S. Earth Sciences
NASA Astrophysics Data System (ADS)
Gallagher, K. T.; Allison, M. L.
2011-12-01
Development efforts for the U.S. Geoscience Information Network (US GIN) have crystallized around the Community for Data Integration (CDI) at the USGS and the 50-state AASG State Geothermal Data project. The next step in developing a USGS-AASG community is to bring these two efforts into closer alignment through greater participation in CDI activities by geoinformatics practitioners from state geological surveys, and implementation of test bed activities by the US GIN partners. Test bed activities in the geological survey community will define a scope and provide a foundation to promote the use of specifications developed by the larger geoinformatics community. Adoption of some of these specifications as 'standards' by USGS and AASG for use by those organizations will lend authority and motivate wider adoption. The arc from use case to test bed to production deployments to agreement on 'standard' specifications for data discovery and access must be propelled by active interest from the user communities who have a stake in the outcome. The specifications developed will benefit the organizations involved in development, testing and deployment, which motivates participation -- a model that has worked successfully for standards organizations such as OGC, ISO and OASIS. The governance structure to support such a community process should promote grass-roots nucleation of the interest groups that are the core of development efforts. Some mechanism for community agreement on priorities is desirable because geological survey agencies will need to allocate resources to support development. Loosely knit organizations such as ESIP and the current CDI provide models for this kind of structure. Because many geological surveys have data archive and dissemination functions as part of their portfolio, some support for the system can be built into operating expenses and overhead. Sharing of resources and reuse of components can reduce the cost. Wide adoption of similar software, protocols and practices increases the number of stakeholders with an interest in supporting the system.
NASA Astrophysics Data System (ADS)
Northup, E. A.; Beach, A. L., III; Early, A. B.; Kusterer, J.; Quam, B.; Wang, D.; Chen, G.
2015-12-01
The current data management practices for NASA airborne field projects have successfully served science team data needs over the past 30 years to achieve project science objectives; however, users have discovered a number of issues in terms of data reporting and format. The ICARTT format, a NASA standard since 2010, is currently the most popular among the airborne measurement community. Although easy for humans to use, the format standard is not sufficiently rigorous to be machine-readable, and there is no standard naming convention among the many airborne measurement variables. This makes data use and management tedious and resource intensive, and also creates problems in Distributed Active Archive Center (DAAC) data ingest procedures and distribution. Further, most DAACs use metadata models that concentrate on satellite data observations, making them less prepared to deal with airborne data. There is also a substantial amount of airborne data distributed by websites designed for science team use that are less friendly to users unfamiliar with the operations of airborne field studies. A number of efforts are underway to help overcome these issues with airborne data discovery and distribution. The ICARTT Refresh Earth Science Data Systems Working Group (ESDSWG) was established to provide a platform for atmospheric science data providers, users, and data managers to collaborate on developing new criteria for the file format in an effort to enhance airborne data usability. In addition, the NASA Langley Research Center Atmospheric Science Data Center (ASDC) has developed the Toolsets for Airborne Data (TAD) to provide web-based tools and centralized access to airborne in situ measurements of atmospheric composition. This presentation will discuss the aforementioned challenges and attempted solutions in an effort to demonstrate how airborne data management can be improved to streamline data ingest and discoverability to a broader user community.
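As a rough illustration of why the format is friendly to humans but awkward for machines, the sketch below parses a minimal ICARTT-like text block. It assumes only the widely used convention that the first line gives the number of header lines and a file format index, with data records following the header; real ICARTT files carry a much richer header, and this is not a validating parser.

```python
# Hedged sketch of parsing a minimal ICARTT-like text file.
# Assumes line 1 = "<number_of_header_lines>, <file_format_index>" and that
# the last header line names the data columns; real files have many more header records.
import csv
import io

SAMPLE = """\
4, 1001
PI_NAME; ORGANIZATION
Example airborne data (illustrative, not a real ICARTT file)
Time_Start, CO_ppbv, O3_ppbv
0, 112.5, 48.2
60, 110.1, 49.0
120, 108.7, 50.3
"""

def read_icartt_like(text: str):
    lines = text.splitlines()
    n_header, ffi = (int(x) for x in lines[0].split(","))
    header = lines[:n_header]                                  # header block
    names = [c.strip() for c in header[-1].split(",")]         # column names
    rows = [dict(zip(names, map(float, row)))
            for row in csv.reader(io.StringIO("\n".join(lines[n_header:])))]
    return ffi, names, rows

ffi, names, rows = read_icartt_like(SAMPLE)
print(ffi, names, rows[0])
```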
Non-Deterministic, Non-Traditional Methods (NDNTM)
NASA Technical Reports Server (NTRS)
Cruse, Thomas A.; Chamis, Christos C. (Technical Monitor)
2001-01-01
The review effort identified research opportunities related to the use of nondeterministic, nontraditional methods to support aerospace design. The scope of the study was restricted to structural design rather than other areas such as control system design; the observations and conclusions are thus limited by that scope. The review identified a number of key results, including the potential for NASA/AF collaboration in the area of a design environment for advanced space access vehicles. The following key points set the context and delineate the key results. The Principal Investigator's (PI's) context for this study derived from participation as a Panel Member in the Air Force Scientific Advisory Board (AF/SAB) Summer Study Panel on 'Whither Hypersonics?' A key message from the Summer Study effort was a perceived need for a national program for a space access vehicle whose operating characteristics of cost, availability, deployability, and reliability most closely match those of the NASA 3rd Generation Reusable Launch Vehicle (RLV). The Panel urged the AF to make a significant joint commitment to such a program as soon as the AF defined specific requirements for space access consistent with the AF Aerospace Vision 2020. The review brought home a concurrent need for a national vehicle design environment. Engineering design system technology is at a point from which a revolution as significant as that brought about by the finite element method is possible, one focused on information integration on a scale that far surpasses current design environments. The study therefore fully supported the concept, if not all of the details, of the Intelligent Synthesis Environment (ISE). It became abundantly clear during this study that the government (AF, NASA) and industry are not moving in the same direction in this regard; in fact, each is moving in its own direction. NASA/ISE is not yet in an effective leadership position in this regard. However, NASA does have complementary software interoperability efforts that should be a part of any major ISE program. Software standards that assure interoperability of data systems and modeling representations are enabling for the research advocated herein and should be a major element in the ISE initiative. The international standard for data interchange is known by the acronym 'STEP.' The NASA participation and lead for that effort is at the Goddard Space Flight Center. NASA/GRC is leading an effort to define CAD geometry standards through the Object Management Group (OMG). Enabling the design environment so necessary to the above national vision for a unique space vehicle will require an integrating software environment with interoperability standards that allow the development and widespread deployment of tools and toolsets, rather than the traditional "shrink-wrapped" software used by engineers today.
DANTi: Detect and Avoid iN The Cockpit
NASA Technical Reports Server (NTRS)
Chamberlain, James; Consiglio, Maria; Munoz, Cesar
2017-01-01
Mid-air collision risk continues to be a concern for manned aircraft operations, especially near busy non-towered airports. The use of Detect and Avoid (DAA) technologies and draft standards developed for unmanned aircraft systems (UAS), either alone or in combination with other collision avoidance technologies, may be useful in mitigating this collision risk for manned aircraft. This paper describes a NASA research effort known as DANTi (DAA iN The Cockpit), including the initial development of the concept of use, a software prototype, and results from initial flight tests conducted with this prototype. The prototype used a single Automatic Dependent Surveillance - Broadcast (ADS-B) traffic sensor and the own aircraft's position, track, heading and air data information, along with NASA-developed DAA software to display traffic alerts and maneuver guidance to manned aircraft pilots on a portable tablet device. Initial flight tests with the prototype showed a successful DANTi proof-of-concept, but also demonstrated that the traffic separation parameter set specified in the RTCA SC-228 Phase I DAA MOPS may generate excessive false alerts during traffic pattern operations. Several parameter sets with smaller separation values were also tested in flight, one of which yielded more timely alerts for the maneuvers tested. Results from this study may further inform future DANTi efforts as well as Phase II DAA MOPS development.
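For intuition about how a DAA alerting function decides that traffic warrants an alert, here is a heavily simplified sketch. The horizontal-distance and time-to-closest-approach thresholds are illustrative placeholders, not the RTCA SC-228 DAA well-clear definition or any of the parameter sets flown in DANTi.

```python
# Simplified, illustrative DAA-style alerting check (not the MOPS definition).
# State vectors are 2-D east/north positions (m) and velocities (m/s).
import math

def horizontal_alert(own_pos, own_vel, traf_pos, traf_vel,
                     dist_threshold_m=2000.0, tau_threshold_s=60.0):
    """Alert if the intruder is inside a distance threshold, or is converging
    and predicted to reach closest approach within a time threshold.
    Thresholds are placeholders chosen for illustration only."""
    rx, ry = traf_pos[0] - own_pos[0], traf_pos[1] - own_pos[1]
    vx, vy = traf_vel[0] - own_vel[0], traf_vel[1] - own_vel[1]
    dist = math.hypot(rx, ry)
    closing = -(rx * vx + ry * vy)          # > 0 when range is decreasing
    rel_speed_sq = vx * vx + vy * vy
    t_cpa = closing / rel_speed_sq if rel_speed_sq > 0 else float("inf")
    return dist < dist_threshold_m or (closing > 0 and t_cpa < tau_threshold_s)

# Example: converging traffic 3 km ahead, head-on at 100 m/s relative speed.
print(horizontal_alert((0, 0), (50, 0), (3000, 0), (-50, 0)))  # True (t_cpa = 30 s)
```

Tightening or loosening these two thresholds trades alert timeliness against nuisance alerts, which is the same trade the flight tests exercised with the smaller separation parameter sets.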
Growing controversy over "wise international water governance".
Trondalen, J M
2004-01-01
This article takes the perspective that when political relationships are strained, there seem to be few examples of wise international water resources governance. The Middle East is a striking example. Much effort has been put into policy development and the design of international principles, but very little into translating those into concrete and lasting governance. One of the theses of the article is that politics--whether domestic or international--in most cases overrides these principles and standards. Moreover, ready-made regional co-operation models for water management are not directly applicable to every geographical, political, economic and social setting. Certain factors are often under-estimated in international water negotiations, such as: the complexity of any hydro-political negotiations, and the need to develop commonly accepted standards; the difficulty of translating policy--either politically or legally--into an operational and realistic negotiation strategy; the format of the procedures and meetings; and recognition that third parties should have a long-term perspective on any conflict they get involved in. With reservations, the lessons learned indicate that the following factors have an impact on gridlocked situations: new substantive information; new trade-offs between the parties; and a changed political climate or relationship with external power-brokers.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-23
... specifications and operating instructions, if available, or standard operating procedures must be developed by... manufacturer's specifications and operating instructions, if available, or standard operating procedures must... operating specifications or standard operating procedures developed by the prepared feeds manufacturer be...
Evaluation of the advanced operating system of the Ann Arbor Transit Authority
DOT National Transportation Integrated Search
1999-10-01
These reports constitute an evaluation of the intelligent transportation system deployment efforts of the Ann Arbor Transportation Authority. These efforts, collectively termed "Advanced Operating System" (AOS), represent a vision of an integrated ad...
Total Force Fitness in units part 1: military demand-resource model.
Bates, Mark J; Fallesen, Jon J; Huey, Wesley S; Packard, Gary A; Ryan, Diane M; Burke, C Shawn; Smith, David G; Watola, Daniel J; Pinder, Evette D; Yosick, Todd M; Estrada, Armando X; Crepeau, Loring; Bowles, Stephen V
2013-11-01
The military unit is a critical center of gravity in the military's efforts to enhance resilience and the health of the force. The purpose of this article is to augment the military's Total Force Fitness (TFF) guidance with a framework of TFF in units. The framework is based on a Military Demand-Resource model that highlights the dynamic interactions across demands, resources, and outcomes. A joint team of subject-matter experts identified key variables representing unit fitness demands, resources, and outcomes. The resulting framework informs and supports leaders, support agencies, and enterprise efforts to strengthen TFF in units by (1) identifying TFF unit variables aligned with current evidence and operational practices, (2) standardizing communication about TFF in units across the Department of Defense enterprise in a variety of military organizational contexts, (3) improving current resources, including evidence-based actions for leaders, (4) identifying and addressing gaps, and (5) directing future research for enhancing TFF in units. These goals are intended to inform and enhance Service efforts to develop Service-specific TFF models, as well as provide the conceptual foundation for a follow-on article about TFF metrics for units. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.
Low-Cost SIRTF Flight Operations
NASA Astrophysics Data System (ADS)
Deutsch, M.-J.; Ebersole, M.; Nichols, J.
1997-12-01
The Space Infrared Telescope Facility (SIRTF), the fourth of the Great Observatories, will be placed in a unique solar orbit trailing the Earth in 2001. SIRTF will acquire both imaging and spectral data using large infrared detector arrays covering 3.5 μm to 160 μm. The primary science objectives are (1) search for and study of brown dwarfs and super planets, (2) discovery and study of protoplanetary debris disks, (3) study of ultraluminous galaxies and active galactic nuclei, and (4) study of the early Universe. Driven by the limited cryogenic lifetime of 2.5 years, with a goal of 5 years, and the severely cost-capped development, a Mission Planning and Operations system is being designed that will result in high on-board efficiency (>90%) and low-cost operation, yet will accommodate rapid-response science requirements. SIRTF is designing an architecture for an operations system that will be shared between science and flight operations. Crucial to this effort is the philosophy of an integrated science and engineering plan, co-location, cross-training of teams and common planning tools. The common tool set will enable the automatic generation of an integrated and conflict-free planned schedule accommodating 20,000 observations and engineering activities a year. The shared tool set will help generate standard observations and (sometimes non-standard) engineering activities, and manage the ground and flight resources and constraints appropriately. The ground software will allow the development from the ground of robust event-driven sequences. Flexibility will be provided to incorporate newly discovered science opportunities or health issues late in the process and via quick links. This shared science and flight operations process, if used from observation selection through sequence and command generation, will provide a low-cost operations system. Though SIRTF is a 'Great Observatory', its annual mission operations costs will more closely resemble those of an Explorer-class mission.
Future Standardization of Space Telecommunications Radio System with Core Flight System
NASA Technical Reports Server (NTRS)
Hickey, Joseph P.; Briones, Janette C.; Roche, Rigoberto; Handler, Louis M.; Hall, Steven
2016-01-01
NASA Glenn Research Center (GRC) is integrating the NASA Space Telecommunications Radio System (STRS) Standard with the Core Flight System (cFS). The STRS standard provides a common, consistent framework to develop, qualify, operate and maintain complex, reconfigurable and reprogrammable radio systems. The cFS is a flexible, open architecture that features a plug-and-play software executive called the Core Flight Executive (cFE), a reusable library of software components for flight and space missions and an integrated tool suite. Together, STRS and cFS create a development environment that allows STRS-compliant applications to reference the STRS APIs through the cFS infrastructure. These APIs are used to standardize the communication protocols on NASA's space SDRs. The cFE-STRS Operating Environment (OE) is a portable cFS library, which adds the ability to run STRS applications on existing cFS platforms. The purpose of this paper is to discuss the cFE-STRS OE prototype and preliminary experimental results obtained using the Advanced Space Radio Platform (ASRP), the GRC S-band Ground Station and the SCaN (Space Communication and Navigation) Testbed currently flying onboard the International Space Station. Additionally, this paper presents a demonstration of the Consultative Committee for Space Data Systems (CCSDS) Spacecraft Onboard Interface Services (SOIS) using electronic data sheets inside cFE. This configuration allows the data sheets to specify binary formats for data exchange between STRS applications. The integration of STRS with cFS leverages mission-proven platform functions and mitigates barriers to integration with future missions. This reduces flight software development time and the costs of software-defined radio (SDR) platforms. Furthermore, the combined benefits of STRS standardization with the flexibility of cFS provide an effective, reliable and modular framework to minimize software development efforts for spaceflight missions.
Kent County Health Department: Using an Agency Strategic Plan to Drive Improvement.
Saari, Chelsey K
The Kent County Health Department (KCHD) was accredited by the Public Health Accreditation Board (PHAB) in September 2014. Although Michigan has had a state-level accreditation process for local health departments since the late 1990s, the PHAB accreditation process presented a unique opportunity for KCHD to build on successes achieved through state accreditation and enhance performance in all areas of KCHD programs, services, and operations. PHAB's standards, measures, and peer-review process provided a standardized and structured way to identify meaningful opportunities for improvement and to plan and implement strategies for enhanced performance and established a platform for being recognized nationally as a high-performing local health department. The current case report highlights the way in which KCHD has developed and implemented its strategic plan to guide efforts aimed at addressing gaps identified through the accreditation process and to drive overall improvement within our agency.
NASA Astrophysics Data System (ADS)
Hong, Seok Hoon; Kwon, Yong-Chan; Jewett, Michael
2014-06-01
Incorporating non-standard amino acids (NSAAs) into proteins enables new chemical properties, new structures, and new functions. In recent years, improvements in cell-free protein synthesis (CFPS) systems have opened the way to accurate and efficient incorporation of NSAAs into proteins. The driving force behind this development has been three-fold. First, a technical renaissance has enabled high-yielding (>1 g/L) and long-lasting (>10 h in batch operation) CFPS in systems derived from Escherichia coli. Second, the efficiency of orthogonal translation systems has improved. Third, the open nature of the CFPS platform has brought about an unprecedented level of control and freedom of design. Here, we review recent developments in CFPS platforms designed to precisely incorporate NSAAs. In the coming years, we anticipate that CFPS systems will impact efforts to elucidate structure/function relationships of proteins and to make biomaterials and sequence-defined biopolymers for medical and industrial applications.
NASA Technical Reports Server (NTRS)
Dorsey, John T.; Jones, Thomas C.; Doggett, W. R.; Brady, Jeffrey S.; Berry, Felecia C.; Ganoe, George G.; Anderson, Eric; King, Bruce D.; Mercer, David C.
2011-01-01
The first generation of a versatile high-performance device for performing payload handling and assembly operations on planetary surfaces, the Lightweight Surface Manipulation System (LSMS), has been designed and built. Over the course of its development, conventional crane-type payload handling configurations and operations have been successfully demonstrated, and the range of motion, types of operations and versatility have been greatly expanded. This enhanced set of 1st generation LSMS hardware is now serving as a laboratory test-bed allowing the continuing development of end effectors, operational techniques and remotely controlled and automated operations. This paper describes the most recent LSMS and test-bed development activities, which have focused on two major efforts. The first effort was to complete a preliminary design of the 2nd generation LSMS, which has the capability for limited mobility and can reposition itself between lander decks, mobility chassis, and fixed base locations. A major portion of this effort involved conducting a study to establish the feasibility of, and define the specifications for, a lightweight cable-drive waist joint. The second effort was to continue expanding the versatility and autonomy of large planetary surface manipulators using the 1st generation LSMS as a test-bed. This has been accomplished by increasing manipulator capabilities and efficiencies through both design changes and tool and end effector development. A software development effort has expanded the operational capabilities of the LSMS test-bed to include: autonomous operations based on stored paths, use of a vision system for target acquisition and tracking, and remote command and control over a communications bridge.
DICOM version 3.0 demonstration at InfoRAD 1992
NASA Astrophysics Data System (ADS)
Jost, R. Gilbert
1993-09-01
Over the past 10 years, a large number of devoted individuals have worked on a project to develop a standard for the storage and exchange of medical images. Known as the ACR/NEMA standardization effort, this project is jointly sponsored by the American College of Radiology (ACR) and the National Electrical Manufacturers Association (NEMA). Over the years, radiologists and industry representatives alike have supported the standardization effort, but there has been little evidence of actual exchanges of medical images among medical equipment vendors who use the ACR/NEMA standard.
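DICOM version 3.0 is the direct ancestor of today's DICOM standard, and the image exchange the demonstration aimed for is now routine. As a modern, hedged illustration only (not part of the 1992 InfoRAD demonstration), the sketch below reads a DICOM file with the open-source pydicom library and prints a few standard attributes; it assumes pydicom's bundled sample data is available.

```python
# Hedged modern illustration: reading a DICOM file with pydicom.
# Uses a sample file bundled with pydicom so the snippet is self-contained.
import pydicom
from pydicom.data import get_testdata_file

path = get_testdata_file("CT_small.dcm")   # small sample CT image shipped with pydicom
ds = pydicom.dcmread(path)

# A DICOM object is a set of standardized, tagged attributes.
print("Modality:       ", ds.Modality)
print("Patient name:   ", ds.PatientName)
print("Image size:     ", ds.Rows, "x", ds.Columns)
print("Transfer syntax:", ds.file_meta.TransferSyntaxUID)
```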
NASA Astrophysics Data System (ADS)
Heller, Johann; Flisgen, Thomas; van Rienen, Ursula
The computation of electromagnetic fields, and of parameters derived from them, for lossless radio frequency (RF) structures filled with isotropic media is an important task in the design and operation of particle accelerators. Unfortunately, these computations are often highly demanding with regard to computational effort. The overall computational demand of the problem can be reduced using decomposition schemes so that the field problems can be solved on standard workstations. This paper presents one of the first detailed comparisons between the recently proposed state-space concatenation approach (SSC) and a direct computation for an accelerator cavity with coupler elements that break the rotational symmetry.
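For orientation, a decomposition of this kind represents each segment of the structure as a reduced state-space model and couples the segments through their shared interfaces. The block below sketches that generic form only; the indices and coupling relations are illustrative and are not the authors' specific SSC formulation.

```latex
% Generic sketch: two subsystem state-space models coupled at a common interface.
% Illustrative only; not the SSC formulation used in the paper.
\begin{align*}
\dot{\mathbf{x}}_k &= \mathbf{A}_k \mathbf{x}_k + \mathbf{B}_k \mathbf{u}_k, &
\mathbf{y}_k &= \mathbf{C}_k \mathbf{x}_k + \mathbf{D}_k \mathbf{u}_k, \qquad k = 1,2,\\
\mathbf{u}_2 &= \mathbf{y}_1, &
\mathbf{u}_1 &= \mathbf{y}_2 \quad\text{(continuity of interface quantities)}.
\end{align*}
```

The concatenated system then acts on the stacked state built from the subsystem states, with a block system matrix assembled from the individual A, B, C, D matrices; solving many small subsystem problems and coupling them is what keeps the workload within reach of a standard workstation.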
NASA Technical Reports Server (NTRS)
Werner, C. R.; Humphreys, B. T.; Mulugeta, L.
2014-01-01
The Advanced Resistive Exercise Device (ARED) is the resistive exercise device used by astronauts on the International Space Station (ISS) to mitigate bone loss and muscle atrophy due to extended exposure to microgravity (micro-g). The Digital Astronaut Project (DAP) has developed a multi-body dynamics model of the ARED for use with biomechanics models in spaceflight exercise physiology research and operations. In an effort to advance the maturity and credibility of the ARED model, the DAP performed a verification, validation and credibility (VV and C) assessment of the model and its analyses in accordance with NASA-STD-7009, 'Standards for Models and Simulations'.
Dohrenbusch, R
2009-06-01
Chronic pain accompanied by disability and handicap is a frequent symptom necessitating medical assessment. Current guidelines for the assessment of malingering suggest discriminating between explanatory demonstration, aggravation, and simulation. However, this distinction has not been clearly operationalized or validated. The necessity of assessment strategies based on general principles of psychological assessment and testing is emphasized. Standardized and normalized psychological assessment methods and symptom validation techniques should be used in the assessment of subjects with chronic pain problems. An adaptive procedure for assessing the validity of complaints is suggested to minimize effort and costs.
40 CFR 160.81 - Standard operating procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 25 2013-07-01 2013-07-01 false Standard operating procedures. 160.81... GOOD LABORATORY PRACTICE STANDARDS Testing Facilities Operation § 160.81 Standard operating procedures. (a) A testing facility shall have standard operating procedures in writing setting forth study...
40 CFR 160.81 - Standard operating procedures.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 24 2014-07-01 2014-07-01 false Standard operating procedures. 160.81... GOOD LABORATORY PRACTICE STANDARDS Testing Facilities Operation § 160.81 Standard operating procedures. (a) A testing facility shall have standard operating procedures in writing setting forth study...
40 CFR 160.81 - Standard operating procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 25 2012-07-01 2012-07-01 false Standard operating procedures. 160.81... GOOD LABORATORY PRACTICE STANDARDS Testing Facilities Operation § 160.81 Standard operating procedures. (a) A testing facility shall have standard operating procedures in writing setting forth study...
An Analysis for an Internet Grid to Support Space Based Operations
NASA Technical Reports Server (NTRS)
Bradford, Robert; McNair, Ann R. (Technical Monitor)
2002-01-01
Currently, and in the past, dedicated communication circuits and "network services" with very stringent performance requirements have been used to support manned and unmanned mission critical ground operations at GSFC, JSC, MSFC, KSC and other NASA facilities. Because of the evolution of network technology, it is time to investigate other approaches to providing mission services for space ground and flight operations. In various scientific disciplines, effort is under way to develop network/computing grids. These grids, consisting of networks and computing equipment, are enabling lower cost science. Specifically, earthquake research is headed in this direction. With a standard for network and computing interfaces using a grid, a researcher would not be required to develop and engineer NASA/DoD-specific interfaces with the attendant increased cost. Use of the Internet Protocol (IP), the CCSDS packet specification, Reed-Solomon coding for satellite error correction, etc., can be adopted and standardized to provide these interfaces. Generally, most interfaces are developed at least to some degree end to end. This study would investigate the feasibility of using existing standards and protocols to implement a SpaceOps Grid. New interface definitions, or adoption and modification of existing ones, are required for the various space operational services: voice (both space-based and ground), video, telemetry, commanding and planning may all play a role to some as-yet-undefined level. Security will be a separate focus of the study, since security is such a large issue in using public networks. This SpaceOps Grid would be transparent to users. It would be analogous to the Ethernet protocol's ease of use in that a researcher would plug in their experiment or instrument at one end and would be connected to the appropriate host or server without further intervention. Free flyers would be in this category as well: they would be launched and would transmit without any further intervention by the researcher or ground ops personnel. The payback from developing these new approaches in support of manned and unmanned operations is lower cost, and it will enable direct participation by more people in organizations and educational institutions in space-based science. By lowering the high cost of space-based operations and networking, more resources will be available to the science community for science. With a specific grid in place, experiment development and operations would be much less costly through the use of standardized network interfaces. Because of the extensive connectivity on a global basis, significant numbers of people who otherwise would not be able to participate would take part in science.
How HRP Research Results Contribute to Human Space Exploration Risk Mitigation
NASA Technical Reports Server (NTRS)
Lumpkins, S. B.; Mindock, J. A.
2014-01-01
In addition to the scientific value of publications derived from research, results from Human Research Program (HRP) research also support HRP’s goals of mitigating crew health and performance risks in space flight. Research results are used to build the evidence base characterizing crew health and performance risks, to support risk research plan development, to inform crew health and performance standards, and to provide technologies to programs for meeting those standards and optimizing crew health and performance in space. This talk will describe examples of how research results support these efforts. For example, HRP research results are used to revise or even create new standards for human space flight, which have been established to protect crew health and performance during flight, and prevent negative long-term health consequences due to space flight. These standards are based on the best available clinical and scientific evidence, as well as operational experience from previous space flight missions, and are reviewed as new evidence emerges. Research results are also used to update the HRP evidence base, which is comprised of a set of reports that provide a current record of the state of knowledge from research and operations for each of the defined human health and performance risks for future NASA exploration missions. A discussion of the role of evidence within the HRP architecture will also be presented. The scope of HRP research results extends well beyond publications, as they are used in several capacities to support HRP deliverables and, ultimately, the advancement of human space exploration beyond low-Earth orbit.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-22
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2013-N-0001] Synergizing Efforts in Standards Development for Cellular Therapies and Regenerative Medicine Products; Public Workshop AGENCY: Food and Drug Administration, HHS. ACTION: Notice of public workshop. The Food and Drug Administration (FDA), Center for...
NASA Earth Observation Systems and Applications for Health and Air Quality
NASA Technical Reports Server (NTRS)
Omar, Ali H.
2015-01-01
There is a growing body of evidence that the environment can affect human health in ways that are both complex and global in scope. To address some of these complexities, NASA maintains a diverse constellation of Earth observing research satellites and sponsors research in developing satellite data applications across a wide spectrum of areas. These include environmental health; infectious disease; air quality standards, policies, and regulations; and the impact of climate change on health and air quality, in a number of interrelated efforts. The Health and Air Quality Applications program fosters the use of observations, modeling systems, forecast development, application integration, and the research-to-operations transition process to address environmental health effects. NASA has been a primary partner with Federal operational agencies over the past nine years in these areas. This talk presents the background of the Health and Air Quality Applications program, recent accomplishments, and a plan for the future.
Evaluating safety and operations of high-speed signalized intersections.
DOT National Transportation Integrated Search
2010-03-01
This Final Report reviews a research effort to evaluate the safety and operations of high-speed intersections in the State of : Oregon. In particular, this research effort focuses on four-leg, signalized intersections with speed limits of 45 mph or :...
Evaluating safety and operation of high-speed intersections.
DOT National Transportation Integrated Search
2010-03-01
This Final Report reviews a research effort to evaluate the safety and operations of high-speed intersections in the State of : Oregon. In particular, this research effort focuses on four-leg, signalized intersections with speed limits of 45 mph or :...
2010-10-27
central to the enemy's operational design, defense and industry efforts to counter the IED with technology have been aggressive. The US has spent...and industry efforts to counter the IED with technology have been aggressive. The US has spent billions of dollars to mitigate the effects of IEDs...2001 has claimed the lives of over 1,700 United States and Coalition Force (CF) service men and women; over six times that many have been wounded.1
Early detection of sporadic pancreatic cancer: strategic map for innovation--a white paper.
Kenner, Barbara J; Chari, Suresh T; Cleeter, Deborah F; Go, Vay Liang W
2015-07-01
Innovation leading to significant advances in research and subsequent translation to clinical practice is urgently necessary in early detection of sporadic pancreatic cancer. Addressing this need, the Early Detection of Sporadic Pancreatic Cancer Summit Conference was conducted by Kenner Family Research Fund in conjunction with the 2014 American Pancreatic Association and Japan Pancreas Society Meeting. International interdisciplinary scientific representatives engaged in strategic facilitated conversations based on distinct areas of inquiry: Case for Early Detection: Definitions, Detection, Survival, and Challenges; Biomarkers for Early Detection; Imaging; and Collaborative Studies. Ideas generated from the summit have led to the development of a Strategic Map for Innovation built upon 3 components: formation of an international collaborative effort, design of an actionable strategic plan, and implementation of operational standards, research priorities, and first-phase initiatives. Through invested and committed efforts of leading researchers and institutions, philanthropic partners, government agencies, and supportive business entities, this endeavor will change the future of the field and consequently the survival rate of those diagnosed with pancreatic cancer.
Early Detection of Sporadic Pancreatic Cancer
Kenner, Barbara J.; Chari, Suresh T.; Cleeter, Deborah F.; Go, Vay Liang W.
2015-01-01
Abstract Innovation leading to significant advances in research and subsequent translation to clinical practice is urgently necessary in early detection of sporadic pancreatic cancer. Addressing this need, the Early Detection of Sporadic Pancreatic Cancer Summit Conference was conducted by Kenner Family Research Fund in conjunction with the 2014 American Pancreatic Association and Japan Pancreas Society Meeting. International interdisciplinary scientific representatives engaged in strategic facilitated conversations based on distinct areas of inquiry: Case for Early Detection: Definitions, Detection, Survival, and Challenges; Biomarkers for Early Detection; Imaging; and Collaborative Studies. Ideas generated from the summit have led to the development of a Strategic Map for Innovation built upon 3 components: formation of an international collaborative effort, design of an actionable strategic plan, and implementation of operational standards, research priorities, and first-phase initiatives. Through invested and committed efforts of leading researchers and institutions, philanthropic partners, government agencies, and supportive business entities, this endeavor will change the future of the field and consequently the survival rate of those diagnosed with pancreatic cancer. PMID:25938853
Efforts to improve international migration statistics: a historical perspective.
Kraly, E P; Gnanasekaran, K S
1987-01-01
During the past decade, the international statistical community has made several efforts to develop standards for the definition, collection and publication of statistics on international migration. This article surveys the history of official initiatives to standardize international migration statistics by reviewing the recommendations of the International Statistical Institute, the International Labor Organization, and the UN, and reports a recently proposed agenda for moving toward comparability among national statistical systems. Heightening awareness of the benefits of exchange and creating motivation to implement international standards requires a three-pronged effort from the international statistical community. First, it is essential to continue discussion about the significance of improvement, specifically standardization, of international migration statistics. The move from theory to practice in this area requires ongoing focus by migration statisticians so that conformity to international standards itself becomes a criterion by which national statistical practices are examined and assessed. Second, countries should be provided with technical documentation to support and facilitate the implementation of the recommended statistical systems. Documentation should be developed with an understanding that conformity to international standards for migration and travel statistics must be achieved within existing national statistical programs. Third, the call for statistical research in this area requires more effort from the community of migration statisticians, beginning with the mobilization of bilateral and multilateral resources to undertake the preceding list of activities.
Space Synthetic Biology Project
NASA Technical Reports Server (NTRS)
Howard, David; Roman, Monsi; Mansell, James (Matt)
2015-01-01
Synthetic biology is an effort to make genetic engineering more useful by standardizing sections of genetic code. By standardizing genetic components, biological engineering will become much more similar to traditional fields of engineering, in which well-defined components and subsystems are readily available in markets. Specifications of the behavior of those components and subsystems can be used to model a system which incorporates them. Then, the behavior of the novel system can be simulated and optimized. Finally, the components and subsystems can be purchased and assembled to create the optimized system, which most often will exhibit behavior similar to that indicated by the model. The Space Synthetic Biology project began in 2012 as a multi-Center effort. The purpose of this project was to harness synthetic biology principles to enable NASA's missions. A central target for application was Environmental Control & Life Support (ECLS). Engineers from NASA Marshall Space Flight Center's (MSFC's) ECLS Systems Development Branch (ES62) were brought into the project to contribute expertise in operational ECLS systems. Project lead scientists chose to pursue the development of bioelectrochemical technologies for spacecraft life support. Therefore, the ECLS element of the project became essentially an effort to develop a bioelectrochemical ECLS subsystem. Bioelectrochemical systems exploit the ability of many microorganisms to drive their metabolisms by direct or indirect utilization of electrical potential gradients. Whereas many microorganisms are capable of deriving the energy required for the processes of interest (such as carbon dioxide (CO2) fixation) from sunlight, it is believed that subsystems utilizing electrotrophs will exhibit smaller mass, volume, and power requirements than those that derive their energy from sunlight. In the first 2 years of the project, MSFC personnel conducted modeling, simulation, and conceptual design efforts to assist the project in selecting the best approaches to the application of bioelectrochemical technologies to ECLS. Figure 1 shows results of simulation of charge transport in an experimental system. Figure 2 shows one of five conceptual designs for ECLS subsystems based on bioelectrochemical reactors. Also during the first 2 years, some work was undertaken to gather fundamental data (conductivities, overpotentials) relevant to the modeling efforts.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-23
... operating instructions, if available, or standard operating procedures must be developed by the facility... operating instructions, if available, or standard operating procedures must be developed by the facility... standard operating procedures developed by the prepared feeds manufacturer be required as part of the...
Ravid, Rivka
2008-09-01
The use of human biological specimens in scientific research is the focus of current international public and professional concern and a major issue in bioethics in general. Brain/Tissue/Bio banks (BTB-banks) are a rapidly developing sector; each of these banks acts locally as a steering unit for the establishment of the local Standard Operating Procedures (SOPs) and the legal regulations and ethical guidelines to be followed in the procurement and dissemination of research specimens. An appropriate Code of Conduct is crucial to the successful operation of the banks and the research applications they handle. What are we still missing? (1) Adequate funding for research BTB-banks. (2) Standard evaluation protocols for auditing BTB-bank performance. (3) Internationally accepted SOPs which will facilitate exchange and sharing of specimens and data with the scientific community. (4) An internationally accepted Code of Conduct. In the present paper we review the most pressing organizational, methodological, medico-legal and ethical issues involved in BTB-banking: funding, auditing, procurement, management/handling, dissemination and sharing of specimens, confidentiality and data protection, genetic testing, "financial gain" and safety measures. Taking into consideration the huge variety of the specimens stored in different repositories and the enormous differences in medico-legal systems and ethics regulations in different countries, it is strongly recommended that the health-care systems and institutions that host BTB-banks put more effort into securing adequate funding for the infrastructure and daily activities. The BTB-banks should define evaluation protocols, SOPs and their Code of Conduct. This in turn will enable the banks to share the collected specimens and data with the largest possible number of researchers and aim at a maximal scientific spin-off and advance in public health research.
Nolte, Kurt B; Hanzlick, Randy L; Payne, Daniel C; Kroger, Andrew T; Oliver, William R; Baker, Andrew M; McGowan, Dennis E; DeJong, Joyce L; Bell, Michael R; Guarner, Jeannette; Shieh, Wun-Ju; Zaki, Sherif R
2004-06-11
Medical examiners and coroners (ME/Cs) are essential public health partners for terrorism preparedness and response. These medicolegal investigators support both public health and public safety functions and investigate deaths that are sudden, suspicious, violent, unattended, and unexplained. Medicolegal autopsies are essential for making organism-specific diagnoses in deaths caused by biologic terrorism. This report has been created to 1) help public health officials understand the role of ME/Cs in biologic terrorism surveillance and response efforts and 2) provide ME/Cs with the detailed information required to build capacity for biologic terrorism preparedness in a public health context. This report provides background information regarding biologic terrorism, possible biologic agents, and the consequent clinicopathologic diseases, autopsy procedures, and diagnostic tests as well as a description of biosafety risks and standards for autopsy precautions. ME/Cs' vital role in terrorism surveillance requires consistent standards for collecting, analyzing, and disseminating data. Familiarity with the operational, jurisdictional, and evidentiary concerns involving biologic terrorism-related death investigation is critical to both ME/Cs and public health authorities. Managing terrorism-associated fatalities can be expensive and can overwhelm the existing capacity of ME/Cs. This report describes federal resources for funding and reimbursement for ME/C preparedness and response activities and the limited support capacity of the federal Disaster Mortuary Operational Response Team. Standards for communication are critical in responding to any emergency situation. This report, which is a joint collaboration between CDC and the National Association of Medical Examiners (NAME), describes the relationship between ME/Cs and public health departments, emergency management agencies, emergency operations centers, and the Incident Command System.
Towards a World Catalogue of Standards
ERIC Educational Resources Information Center
Kuiper, Barteld E.
1973-01-01
Efforts by the International Organization for Standardization (ISO) to develop a uniform catalog of integrated standard indexes from around the world are described. The purpose is to facilitate the search for standards. (SM)
NASA Astrophysics Data System (ADS)
Tyczka, Dale R.; Wright, Robert; Janiszewski, Brian; Chatten, Martha Jane; Bowen, Thomas A.; Skibba, Brian
2012-06-01
Nearly all explosive ordnance disposal robots in use today employ monoscopic standard-definition video cameras to relay live imagery from the robot to the operator. With this approach, operators must rely on shadows and other monoscopic depth cues in order to judge distances and object depths. Alternatively, they can contact an object with the robot's manipulator to determine its position, but that approach carries with it the risk of detonation from unintentionally disturbing the target or nearby objects. We recently completed a study in which high-definition (HD) and stereoscopic video cameras were used in addition to conventional standard-definition (SD) cameras in order to determine if higher resolutions and/or stereoscopic depth cues improve operators' overall performance of various unmanned ground vehicle (UGV) tasks. We also studied the effect that the different vision modes had on operator comfort. A total of six different head-aimed vision modes were used including normal-separation HD stereo, SD stereo, "micro" (reduced separation) SD stereo, HD mono, and SD mono (two types). In general, the study results support the expectation that higher resolution and stereoscopic vision aid UGV teleoperation, but the degree of improvement was found to depend on the specific task being performed; certain tasks derived notably more benefit from improved depth perception than others. This effort was sponsored by the Joint Ground Robotics Enterprise under Robotics Technology Consortium Agreement #69-200902 T01. Technical management was provided by the U.S. Air Force Research Laboratory's Robotics Research and Development Group at Tyndall AFB, Florida.
ATOS-1: Designing the infrastructure for an advanced spacecraft operations system
NASA Technical Reports Server (NTRS)
Poulter, K. J.; Smith, H. N.
1993-01-01
The space industry has identified the need to use artificial intelligence and knowledge-based system techniques as integrated, central, symbolic processing components of future mission design, support and operations systems. Various practical and commercial constraints require that off-the-shelf applications, and their knowledge bases, are reused where appropriate and that different mission contractors, potentially using different KBS technologies, can provide application and knowledge sub-modules of an overall integrated system. In order to achieve this integration, which we call knowledge sharing and distributed reasoning, there needs to be agreement on knowledge representations, knowledge interchange formats, knowledge-level communication protocols, and ontology. Research indicates that the latter is most important, providing the applications with a common conceptualization of the domain, in our case spacecraft operations, mission design, and planning. Agreement on ontology permits applications that employ different knowledge representations to interwork through mediators which we refer to as knowledge agents. This creates the illusion of a shared model without the constraints, both technical and commercial, that occur in centralized or uniform architectures. This paper explains how these matters are being addressed within the ATOS program at ESOC, using techniques which draw upon ideas and standards emerging from the DARPA Knowledge Sharing Effort. In particular, we explain how the project is developing an electronic Ontology of Spacecraft Operations and how this can be used as an enabling component within space support systems that employ advanced software engineering. We indicate our hope and expectation that the core ontology developed in ATOS will permit the full development of standards for such systems throughout the space industry.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rodriguez, J.R.; Ahrens, J.S.; Lowe, D.L.
Throughout the years, Sandia National Laboratories (SNL) has performed various laboratory evaluations of entry control devices, including biometric identity verifiers. The reports which resulted from this testing have been very well received by the physical security community. This same community now requires equally informative field study data. To meet this need we have conducted a field study in an effort to develop the tools and methods which our customers can use to translate laboratory data into operational field performance. The field testing described in this report was based on the Recognition Systems, Inc. (RSI) model ID3D HandKey biometric verifier. This device was selected because it is referenced in DOE documents such as the Guide for Implementation of the DOE Standard Badge and is the de facto biometric standard for the DOE. The ID3D HandKey is currently being used at several DOE sites such as Hanford, Rocky Flats, Pantex, Savannah River, and the Idaho National Engineering Laboratory. The ID3D HandKey was laboratory tested at SNL. It performed very well during this test, exhibiting an equal error point of 0.2 percent. The goals of the field test were to identify operational characteristics and design guidelines to help system engineers translate laboratory data into field performance. A secondary goal was to develop tools which could be used by others to evaluate system effectiveness or improve the performance of their systems. Operational characteristics were determined by installing a working system and studying its operation over a five month period. Throughout this test we developed tools which could be used by others to similarly gauge system effectiveness.
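For readers unfamiliar with the "equal error point" (the operating point at which the false-accept and false-reject rates are equal), the sketch below estimates it from synthetic genuine and impostor score distributions. The score distributions and threshold sweep are illustrative only and have nothing to do with the actual HandKey laboratory or field data.

```python
# Hedged sketch: estimating an equal error point from synthetic match scores.
# Higher score = better match. Data are made up for illustration only.
import numpy as np

rng = np.random.default_rng(0)
genuine = rng.normal(loc=80, scale=8, size=2000)    # scores from enrolled users
impostor = rng.normal(loc=40, scale=10, size=2000)  # scores from impostors

best = None
for threshold in np.linspace(0, 100, 1001):
    frr = np.mean(genuine < threshold)    # false reject rate at this threshold
    far = np.mean(impostor >= threshold)  # false accept rate at this threshold
    gap = abs(far - frr)
    if best is None or gap < best[0]:
        best = (gap, threshold, far, frr)

_, threshold, far, frr = best
print(f"~equal error at threshold {threshold:.1f}: FAR={far:.3%}, FRR={frr:.3%}")
```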
NASA Technical Reports Server (NTRS)
Ghatas, Rania W.; Jack, Devin P.; Tsakpinis, Dimitrios; Vincent, Michael J.; Sturdy, James L.; Munoz, Cesar A.; Hoffler, Keith D.; Dutle, Aaron M.; Myer, Robert R.; Dehaven, Anna M.;
2017-01-01
As Unmanned Aircraft Systems (UAS) make their way to mainstream aviation operations within the National Airspace System (NAS), research efforts are underway to develop a safe and effective environment for their integration into the NAS. Detect and Avoid (DAA) systems are required to account for the lack of "eyes in the sky" due to having no human on-board the aircraft. The current NAS relies on pilots' vigilance and judgement to remain Well Clear (14 CFR 91.113) of other aircraft. RTCA SC-228 has defined DAA Well Clear (DAAWC) to provide a quantified Well Clear volume to allow systems to be designed and measured against. Extended research efforts have been conducted to understand and quantify system requirements needed to support a UAS pilot's ability to remain well clear of other aircraft. The efforts have included developing and testing sensor, algorithm, alerting, and display requirements. More recently, sensor uncertainty and uncertainty mitigation strategies have been evaluated. This paper discusses results and lessons learned from an End-to-End Verification and Validation (E2-V2) simulation study of a DAA system representative of RTCA SC-228's proposed Phase I DAA Minimum Operational Performance Standards (MOPS). NASA Langley Research Center (LaRC) was called upon to develop a system that evaluates a specific set of encounters, in a variety of geometries, with end-to-end DAA functionality including the use of sensor and tracker models, a sensor uncertainty mitigation model, DAA algorithmic guidance in both vertical and horizontal maneuvering, and a pilot model which maneuvers the ownship aircraft to remain well clear from intruder aircraft, having received collective input from the previous modules of the system. LaRC developed a functioning batch simulation and added a sensor/tracker model from the Federal Aviation Administration (FAA) William J. Hughes Technical Center, an in-house developed sensor uncertainty mitigation strategy, and implemented a pilot model similar to one from the Massachusetts Institute of Technology's Lincoln Laboratory (MIT/LL). The resulting simulation provides the following key parameters, among others, to evaluate the effectiveness of the MOPS DAA system: severity of loss of well clear (SLoWC), alert scoring, and number of increasing alerts (alert jitter). The technique, results, and lessons learned from a detailed examination of DAA system performance over specific test vectors and encounter cases during the simulation experiment will be presented in this paper.
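For readers unfamiliar with the well-clear concept, the sketch below is a greatly simplified, hypothetical check using distance thresholds only. The actual SC-228 DAA Well Clear definition also includes a time-based (modified tau) criterion, and the threshold values here are illustrative rather than the MOPS values.

```python
# Greatly simplified sketch of a well-clear check, for illustration only. The
# actual SC-228 DAA Well Clear definition combines horizontal proximity, a
# time-based (modified tau) criterion, and vertical separation; here only
# distance thresholds are used, with made-up values.
import math

HORIZONTAL_THRESHOLD_FT = 4000.0   # illustrative, not the MOPS value
VERTICAL_THRESHOLD_FT = 450.0      # illustrative, not the MOPS value


def loss_of_well_clear(own, intruder):
    """own/intruder are (x_ft, y_ft, alt_ft) positions; returns True on a violation."""
    horiz = math.hypot(own[0] - intruder[0], own[1] - intruder[1])
    vert = abs(own[2] - intruder[2])
    return horiz < HORIZONTAL_THRESHOLD_FT and vert < VERTICAL_THRESHOLD_FT


if __name__ == "__main__":
    print(loss_of_well_clear((0, 0, 5000), (3000, 1000, 5200)))  # True
    print(loss_of_well_clear((0, 0, 5000), (8000, 0, 5000)))     # False
```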
Kim, Nancy; Boone, Kyle B; Victor, Tara; Lu, Po; Keatinge, Carolyn; Mitchell, Cary
2010-08-01
Recently published practice standards recommend that multiple effort indicators be interspersed throughout neuropsychological evaluations to assess for response bias, which is most efficiently accomplished through use of effort indicators from standard cognitive tests already included in test batteries. The present study examined the utility of a timed recognition trial added to standard administration of the WAIS-III Digit Symbol subtest in a large sample of "real world" noncredible patients (n=82) as compared with credible neuropsychology clinic patients (n=89). Scores from the recognition trial were more sensitive in identifying poor effort than were standard Digit Symbol scores, and use of an equation incorporating Digit Symbol Age-Corrected Scaled Scores plus accuracy and time scores from the recognition trial was associated with nearly 80% sensitivity at 88.7% specificity. Thus, inclusion of a brief recognition trial to Digit Symbol administration has the potential to provide accurate assessment of response bias.
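The study's actual equation is not reproduced in the abstract; the following sketch, with invented scores, simply illustrates how sensitivity and specificity follow from applying a cutoff to a combined score.

```python
# Illustrative sketch only: the abstract reports ~80% sensitivity at 88.7%
# specificity for an equation combining Digit Symbol scores, but the actual
# equation is not given here. This shows how sensitivity and specificity are
# computed once a combined score and a cutoff are chosen (all data invented).

def sensitivity_specificity(noncredible_scores, credible_scores, cutoff):
    """Scores below the cutoff are flagged as suspect effort."""
    true_pos = sum(s < cutoff for s in noncredible_scores)
    true_neg = sum(s >= cutoff for s in credible_scores)
    sensitivity = true_pos / len(noncredible_scores)
    specificity = true_neg / len(credible_scores)
    return sensitivity, specificity


if __name__ == "__main__":
    noncredible = [3.1, 4.0, 4.8, 5.5, 6.9]   # invented combined scores
    credible = [6.2, 7.4, 8.1, 8.8, 9.5]
    sens, spec = sensitivity_specificity(noncredible, credible, cutoff=6.0)
    print(f"sensitivity={sens:.0%}, specificity={spec:.0%}")
```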
Quality control and assurance for validation of DOS/I measurements
NASA Astrophysics Data System (ADS)
Cerussi, Albert; Durkin, Amanda; Kwong, Richard; Quang, Timothy; Hill, Brian; Tromberg, Bruce J.; MacKinnon, Nick; Mantulin, William W.
2010-02-01
Ongoing multi-center clinical trials are crucial for Biophotonics to gain acceptance in medical imaging. In these trials, quality control (QC) and assurance (QA) are key to success and provide "data insurance". Quality control and assurance deal with standardization, validation, and compliance of procedures, materials and instrumentation. Specifically, QC/QA involves systematic assessment of testing materials, instrumentation performance, standard operating procedures, data logging, analysis, and reporting. QC and QA are important for FDA accreditation and acceptance by the clinical community. Our Biophotonics research in the Network for Translational Research in Optical Imaging (NTROI) program for breast cancer characterization focuses on QA/QC issues primarily related to the broadband Diffuse Optical Spectroscopy and Imaging (DOS/I) instrumentation, because this is an emerging technology with limited standardized QC/QA in place. In the multi-center trial environment, we implement QA/QC procedures: 1. Standardize and validate calibration standards and procedures. (DOS/I technology requires both frequency domain and spectral calibration procedures using tissue simulating phantoms and reflectance standards, respectively.) 2. Standardize and validate data acquisition, processing and visualization (optimize instrument software-EZDOS; centralize data processing) 3. Monitor, catalog and maintain instrument performance (document performance; modularize maintenance; integrate new technology) 4. Standardize and coordinate trial data entry (from individual sites) into centralized database 5. Monitor, audit and communicate all research procedures (database, teleconferences, training sessions) between participants ensuring "calibration". This manuscript describes our ongoing efforts, successes and challenges implementing these strategies.
NASA's SDR Standard: Space Telecommunications Radio System
NASA Technical Reports Server (NTRS)
Reinhart, Richard C.; Johnson, Sandra K.
2007-01-01
A software defined radio (SDR) architecture used in space-based platforms proposes to standardize certain aspects of radio development such as interface definitions, functional control and execution, and application software and firmware development. NASA has chartered a team to develop an open software defined radio hardware and software architecture to support NASA missions and determine the viability of an Agency-wide Standard. A draft concept of the proposed standard has been released and discussed among organizations in the SDR community. Appropriate leveraging of the JTRS SCA, OMG's SWRadio Architecture and other aspects are considered. A standard radio architecture offers potential value by employing common waveform software instantiation, operation, testing and software maintenance. While software defined radios offer greater flexibility, they also pose challenges to radio development for the space environment in terms of size, mass and power consumption and available technology. An SDR architecture for space must recognize and address the constraints of space flight hardware and systems, along with flight heritage and culture. NASA is actively participating in the development of technology and standards related to software defined radios. As NASA considers a standard radio architecture for space communications, input and coordination from government agencies, the industry, academia, and standards bodies is key to a successful architecture. The unique aspects of space require thorough investigation of relevant terrestrial technologies properly adapted to space. The talk will describe NASA's current effort to investigate SDR applications to space missions and a brief overview of a candidate architecture under consideration for space based platforms.
Comparison of national space debris mitigation standards
NASA Astrophysics Data System (ADS)
Kato, A.
2001-01-01
Several national organizations of the spacefaring nations have established Space Debris Mitigation Standards or Handbooks to promote efforts to deal with the space debris issue. This paper introduces the characteristics of each document and compares the structure, items and level of requirements. The contents of these standards may be slightly different from each other but the fundamental principles are almost the same; they are (1) prevention of on-orbit breakups, (2) removal of mission-terminated spacecraft from the useful orbit regions, and (3) limiting the objects released during normal operations. The Inter-Agency Space Debris Coordination Committee has contributed considerably to this trend. The Committee also found through a recent survey that some commercial companies have begun to adopt debris mitigation measures for their projects. However, the number of organizations that have initiated this kind of self-control is still limited, so the next challenge of the Committee is to promote the Space Debris Mitigation Guidelines world-wide. IADC initiated this project in October 1999 and a draft is being circulated among the member agencies.
Program Standards and Expectations: Providing Clarity, Consistency, and Focus
ERIC Educational Resources Information Center
Diem, Keith G.
2016-01-01
The effort described in this article resulted from requests for clarity and consistency from new and existing Extension/4-H educators as well as from recommendations by university auditors. The primary purpose of the effort was to clarify standards for effective county-based 4-H youth development programs and to help focus the roles of 4-H…
ERIC Educational Resources Information Center
Garet, Michael S.; Birman, Beatrice F.; Porter, Andrew C.; Desimone, Laura; Herman, Rebecca
The professional development of teachers is a crucial element of the nation's efforts to improve education. In recent years, these efforts have sought to foster high standards for teaching and learning for all of the nation's children, and almost all states have met federal requirements for developing challenging statewide content standards. Such…
Pollution in Higher Education. Efforts of the U.S. Office of Education in Relation to Degree Mills.
ERIC Educational Resources Information Center
Bureau of Postsecondary Education (DHEW/OE), Washington, DC. Accreditation and Institutional Eligibility Staff.
These papers concern the efforts of the U.S. Office of Education to eradicate "degree mills", that is, organizations that award degrees without requiring their students to meet educational standards for such degrees, standards that have been established and traditionally followed by reputable educational institutions. The rapid growth in…
On Logic and Standards for Structuring Documents
NASA Astrophysics Data System (ADS)
Eyers, David M.; Jones, Andrew J. I.; Kimbrough, Steven O.
The advent of XML has been widely seized upon as an opportunity to develop document representation standards that lend themselves to automated processing. This is a welcome development and much good has come of it. That said, present standardization efforts may be criticized on a number of counts. We explore two issues associated with document XML standardization efforts. We label them (i) the dynamic point and (ii) the logical point. Our dynamic point is that in many cases experience has shown that the search for a final, or even reasonably permanent, document representation standard is futile. The case is especially strong for electronic data interchange (EDI). Our logical point is that formalization into symbolic logic is materially helpful for understanding and designing dynamic document standards.
What public officials need to know about connected vehicles.
DOT National Transportation Integrated Search
1996-06-01
The Standards Development Plan identifies potential standards areas, reviews existing standards efforts, describes a general process to assist standards development, and suggests beneficial actions to support and encourage ITS deployment. This docume...
Cabaleiro, Joe
2007-01-01
A key component of qualifying for accreditation with the Pharmacy Compounding Accreditation Board is having a set of comprehensive standard operating procedures that are being used by the pharmacy staff. The three criteria the Pharmacy Compounding Accreditation Board looks for in standard operating procedures are: (1) written standard operating procedures; (2) standard operating procedures that reflect what the organization actually does; and (3) whether the written standard operating procedures are implemented. Following specified steps in the preparation of standard operating procedures will result in procedures that meet Pharmacy Compounding Accreditation Board requirements, thereby placing pharmacies one step closer to qualifying for accreditation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Archer, Daniel E.; Hornback, Donald Eric; Johnson, Jeffrey O.
This report summarizes the findings of a two year effort to systematically assess neutron and gamma backgrounds relevant to operational modeling and detection technology implementation. The first year effort focused on reviewing the origins of background sources and their impact on measured rates in operational scenarios of interest. The second year has focused on the assessment of detector and algorithm performance as they pertain to operational requirements against the various background sources and background levels.
The Human Variome Project (HVP) 2009 Forum "Towards Establishing Standards".
Howard, Heather J; Horaitis, Ourania; Cotton, Richard G H; Vihinen, Mauno; Dalgleish, Raymond; Robinson, Peter; Brookes, Anthony J; Axton, Myles; Hoffmann, Robert; Tuffery-Giraud, Sylvie
2010-03-01
The May 2009 Human Variome Project (HVP) Forum "Towards Establishing Standards" was a round table discussion attended by delegates from groups representing international efforts aimed at standardizing several aspects of the HVP: mutation nomenclature, description and annotation, clinical ontology, means to better characterize unclassified variants (UVs), and methods to capture mutations from diagnostic laboratories for broader distribution to the medical genetics research community. Methods for researchers to receive credit for their effort at mutation detection were also discussed. (c) 2010 Wiley-Liss, Inc.
Evaluating safety and operation of high-speed signalized intersections : final report, March 2010.
DOT National Transportation Integrated Search
2010-03-01
This Final Report reviews a research effort to evaluate the safety and operations of high-speed intersections in the State of : Oregon. In particular, this research effort focuses on four-leg, signalized intersections with speed limits of 45 mph or :...
(abstract) Mission Operations and Control Assurance: Flight Operations Quality Improvements
NASA Technical Reports Server (NTRS)
Welz, Linda L.; Bruno, Kristin J.; Kazz, Sheri L.; Witkowski, Mona M.
1993-01-01
Mission Operations and Command Assurance (MO&CA), a recent addition to flight operations teams at JPL, provides a system-level function to instill quality in mission operations. MO&CA's primary goal at JPL is to help improve operational reliability for projects during flight. MO&CA tasks include early detection and correction of process design and procedural deficiencies within projects. Early detection and correction are essential during development of operational procedures and training of operational teams. MO&CA's effort focuses directly on reducing the probability of radiating incorrect commands to a spacecraft. Over the last seven years at JPL, MO&CA has become a valuable asset to JPL flight projects. JPL flight projects have benefited significantly from MO&CA's efforts to contain risk and to prevent errors rather than rework them. MO&CA's ability to provide direct transfer of knowledge allows new projects to benefit directly from previous and ongoing experience. Since MO&CA, like Total Quality Management (TQM), focuses on continuous improvement of processes and elimination of rework, we recommend that this effort be continued on NASA flight projects.
1997 DOE technical standards program workshop: Proceedings
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1997-10-01
The Department of Energy held its annual Technical Standards Program Workshop on July 8-10, 1997, at the Loews L'Enfant Plaza Hotel in Washington, DC. The workshop focused on aspects of implementation of the National Technology Transfer and Advancement Act of 1995 [Public Law (PL) 104-113] and the related revision (still pending) to OMB Circular A119 (OMB A119), Federal Participation in the Development and Use of Voluntary Standards. It also addressed DOE's efforts in transitioning to a standards-based operating culture, and, through this transition, to change from a developer of internal technical standards to a customer of external technical standards. The workshop was designed to provide a forum to better understand how the new law is affecting Department activities. Panel topics such as "Public Law 104-113 and Its Influence on Federal Agency Standards Activities" and "Update on Global Standards Issues" provided insight on both the internal and external effects of the new law. Keynote speaker Richard Meier of Meadowbrook International (and formerly the Deputy Assistant US Trade Representative) addressed the subject of international trade balance statistics. He pointed out that increases in US export figures do not necessarily indicate increases in employment. Rather, increased employment results from product growth. Mr. Meier also discussed issues such as the US migration to the use of the metric system, the impact of budget limitations on Government participation in voluntary standards organizations, international standards ISO 9000 and ISO 14000, and DOE's role in the worldwide transition from weapons production to cleanup.
Increasing the Operational Value of Event Messages
NASA Technical Reports Server (NTRS)
Li, Zhenping; Savkli, Cetin; Smith, Dan
2003-01-01
Assessing the health of a space mission has traditionally been performed using telemetry analysis tools. Parameter values are compared to known operational limits and are plotted over various time periods. This presentation begins with the notion that there is an incredible amount of untapped information contained within the mission's event message logs. Through creative advancements in message handling tools, the event message logs can be used to better assess spacecraft and ground system status and to highlight and report on conditions not readily apparent when messages are evaluated one-at-a-time during a real-time pass. Work in this area is being funded as part of a larger NASA effort at the Goddard Space Flight Center to create a component-based, middleware-based, standards-based, general-purpose ground system architecture referred to as GMSEC - the GSFC Mission Services Evolution Center. The new capabilities and operational concepts for event display, event data analyses and data mining are being developed by Lockheed Martin and the new subsystem has been named GREAT - the GMSEC Reusable Event Analysis Toolkit. Planned for use on existing and future missions, GREAT has the potential to increase operational efficiency in areas of problem detection and analysis, general status reporting, and real-time situational awareness.
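As a hedged illustration of the kind of aggregate event-message analysis described above (this is not GMSEC or GREAT code), the sketch below scans a log and flags messages that repeat more often than a chosen threshold, a pattern that is easy to miss when messages are reviewed one at a time.

```python
# Hedged sketch (not GMSEC/GREAT code): aggregating event messages over a pass
# so that patterns invisible in one-at-a-time review, such as a message that
# repeats unusually often, can be flagged. Formats and thresholds are invented.
from collections import Counter

REPEAT_THRESHOLD = 5  # illustrative


def flag_repeating_events(log_lines):
    """log_lines: iterable of 'TIMESTAMP SEVERITY SOURCE message' strings."""
    counts = Counter()
    for line in log_lines:
        _, severity, source, message = line.split(" ", 3)
        counts[(severity, source, message)] += 1
    return [(key, n) for key, n in counts.items() if n >= REPEAT_THRESHOLD]


if __name__ == "__main__":
    sample = ["2003-10-02T10:00:01 WARN ACS wheel 2 speed high"] * 6 + [
        "2003-10-02T10:00:07 INFO CDH frame sync locked"
    ]
    for key, n in flag_repeating_events(sample):
        print(n, key)
```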
Overview of the land analysis system (LAS)
Quirk, Bruce K.; Olseson, Lyndon R.
1987-01-01
The Land Analysis System (LAS) is a fully integrated digital analysis system designed to support remote sensing, image processing, and geographic information systems research. LAS is being developed through a cooperative effort between the National Aeronautics and Space Administration Goddard Space Flight Center and the U.S. Geological Survey Earth Resources Observation Systems (EROS) Data Center. LAS has over 275 analysis modules capable of performing input and output, radiometric correction, geometric registration, signal processing, logical operations, data transformation, classification, spatial analysis, nominal filtering, conversion between raster and vector data types, and display manipulation of image and ancillary data. LAS is currently implemented using the Transportable Applications Executive (TAE). While TAE was designed primarily to be transportable, it still provides the necessary components for a standard user interface, terminal handling, input and output services, display management, and intersystem communications. With TAE the analyst uses the same interface to the processing modules regardless of the host computer or operating system. LAS was originally implemented at EROS on a Digital Equipment Corporation computer system under the Virtual Memory System (VMS) operating system with DeAnza displays and is presently being converted to run on a Gould Power Node and Sun workstation under the Berkeley Software Distribution (BSD) UNIX operating system.
Downs, John W; Flood, Daniel T; Orr, Nicholas H; Constantineau, Jason A; Caviness, James W
2017-01-01
Sandfly fever, sometimes known as pappataci fever or Phlebotomus fever, is a vector transmitted viral illness with a history of affecting naïve military formations that travel through or fight in areas in which the infection is endemic. We present a series of 4 hospitalized cases of sandfly fever (2 presumptive, 2 laboratory confirmed) that were admitted to a Role 3 hospital in Afghanistan for evaluation and treatment following medical evacuation from a forward area for marked fevers and malaise. Laboratory evaluation of these cases was significant for leukopenia and thrombocytopenia, consistent with historical descriptions of sandfly fever. In the correct geographic and clinical setting, the finding of mild leukopenia among a cluster of febrile patients should prompt the clinician to at least consider a diagnosis of sandfly fever. A cluster investigation conducted by preventive medicine personnel identified numerous other presumed cases of sandfly fever in this forward special operations camp. Response efforts emphasized enforcement of standard vector-borne disease control measures by operational leadership in order to limit effect on tactical operations. We review historical instances of sandfly fever affecting military operations, and present a review of clinical presentation, transmission, management, and prevention.
Assessing performance of an Electronic Health Record (EHR) using Cognitive Task Analysis.
Saitwal, Himali; Feng, Xuan; Walji, Muhammad; Patel, Vimla; Zhang, Jiajie
2010-07-01
Many Electronic Health Record (EHR) systems fail to provide user-friendly interfaces due to the lack of systematic consideration of human-centered computing issues. Such interfaces can be improved to provide easy-to-use, easy-to-learn, and error-resistant EHR systems to the users. The objective of this study was to evaluate the usability of an EHR system and suggest areas of improvement in the user interface. The user interface of the AHLTA (Armed Forces Health Longitudinal Technology Application) was analyzed using the Cognitive Task Analysis (CTA) method called GOMS (Goals, Operators, Methods, and Selection rules) and an associated technique called KLM (Keystroke Level Model). The GOMS method was used to evaluate the AHLTA user interface by classifying each step of a given task into Mental (Internal) or Physical (External) operators. This analysis was performed by two analysts independently and the inter-rater reliability was computed to verify the reliability of the GOMS method. Further evaluation was performed using KLM to estimate the execution time required to perform the given task through application of its standard set of operators. The results are based on the analysis of 14 prototypical tasks performed by AHLTA users. The results show that on average a user needs to go through 106 steps to complete a task. To perform all 14 tasks, they would spend about 22 min (independent of system response time) for data entry, of which 11 min are spent on more effortful mental operators. The inter-rater reliability analysis performed for all 14 tasks was 0.8 (kappa), indicating good reliability of the method. This paper empirically reveals and identifies the following findings related to the performance of AHLTA: (1) large number of average total steps to complete common tasks, (2) high average execution time, and (3) large percentage of mental operators. The user interface can be improved by reducing (a) the total number of steps and (b) the percentage of mental effort required for the tasks. 2010 Elsevier Ireland Ltd. All rights reserved.
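To make the KLM step concrete, the sketch below sums per-operator times over an operator sequence. The times used are the commonly cited approximate KLM values from the literature, not figures derived from the AHLTA study, and the example task is invented.

```python
# Sketch of a Keystroke-Level Model (KLM) estimate: summing per-operator times
# over a task's operator sequence. Operator times are the commonly cited
# approximate values from the KLM literature, not figures from the AHLTA study.

OPERATOR_TIME_S = {
    "K": 0.28,  # keystroke
    "P": 1.10,  # point with mouse
    "B": 0.10,  # mouse button press/release
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}


def klm_estimate(sequence):
    """sequence: string of operator codes, e.g. 'MPBMKKKK'."""
    return sum(OPERATOR_TIME_S[op] for op in sequence)


if __name__ == "__main__":
    # Invented example: select a field (M P B), then type a 4-character entry (M KKKK)
    task = "MPB" + "M" + "KKKK"
    print(f"estimated execution time: {klm_estimate(task):.2f} s")
```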
NASA Astrophysics Data System (ADS)
Stegen, J.; Scheibe, T. D.; Chen, X.; Huang, M.; Arntzen, E.; Garayburu-Caruso, V. A.; Graham, E.; Johnson, T. C.; Strickland, C. E.
2017-12-01
The installation and operation of dams have myriad influences on ecosystems, from direct effects on hydrographs to indirect effects on marine biogeochemistry and terrestrial food webs. With >50,000 existing and >3,700 planned large dams worldwide, there is a pressing need for holistic understanding of dam impacts. Such understanding is likely to reveal unrecognized opportunities to modify dam operations towards beneficial outcomes. One of the most dramatic influences of daily dam operations is the creation of "artificial intertidal zones" that emerge from short-term increases and decreases in discharge due to hydroelectric power demands, a practice known as hydropeaking. There is a long history of studying the influences of hydropeaking on macrofauna such as fish and invertebrates, but only recently has significant attention been paid to the hydrobiogeochemical effects of hydropeaking. Our aim here is to develop an integrated conceptual model of the hydrobiogeochemical influences of hydropeaking. To do so we reviewed available literature focusing on hydrologic and/or biogeochemical influences of hydropeaking. Results from these studies were collated into a single conceptual model that integrates key physical (e.g., sediment transport, hydromorphology) and biological (e.g., timescale of microbiome response) processes. This conceptual model highlights non-intuitive impacts of hydropeaking, the presence of critical thresholds, and strong interactions among processes. When examined individually these features suggest context dependency, but when viewed through an integrated conceptual model, common themes emerge. We will further discuss a critical next step, which is the local to regional to global evaluation of this conceptual model, to enable multiscale understanding. We specifically propose a global "hydropeaking network" of researchers using common methods, data standards, and analysis techniques to quantify the hydrobiogeochemical effects of hydropeaking across biomes. We will conclude with a prospective discussion of key science questions that emerge from the conceptual model and that can only be answered through a global, synchronized effort. Such an effort has the potential to strongly influence dam operations towards improved health of river corridor ecosystems from local to global scales.
NASA's UAS [Unmanned Aircraft Systems] Related Activities
NASA Technical Reports Server (NTRS)
Bauer, Jeffrey
2012-01-01
NASA continues to operate all sizes of UAS in all classes of airspace both domestically and internationally. Missions range from highly complex operations in coordination with piloted aircraft, ground, and space systems in support of science objectives to single aircraft operations in support of aeronautics research. One such example is a scaled commercial transport aircraft being used to study recovery techniques due to large upsets. NASA's efforts to support routine UAS operations continued on several fronts last year. At the national level in the United States (U.S.), NASA continued its support of the UAS Executive Committee (ExCom), comprised of the Federal Aviation Administration (FAA), Department of Defense (DoD), Department of Homeland Security (DHS), and NASA. The committee was formed in recognition of the need of UAS operated by these agencies for access to the National Airspace System (NAS) to support operational, training, development and research requirements. Recommendations were received on how to operate both manned and unmanned aircraft in class D airspace and plans are being developed to validate and implement those recommendations. In addition, the UAS ExCom has begun developing recommendations for how to achieve routine operations in remote areas as well as for small UAS operations in class G airspace. As well as supporting the UAS ExCom, NASA is a participant in the recently formed Aviation Rule Making Committee for UAS. This committee, established by the FAA, is intended to propose regulatory guidance which would enable routine civil UAS operations. As that effort matures, NASA stands ready to supply the necessary technical expertise to help that committee achieve its objectives. By supporting both the UAS ExCom and UAS ARC, NASA is positioned to provide its technical expertise across the full spectrum of UAS airspace access related topic areas. The UAS NAS Access Project got underway this past year under the leadership of NASA's Aeronautics Research Mission Directorate. This project is focused on advancing the state of the art and providing research and analysis results in the areas of Separation Assurance, Communications (non-governmental spectrum allocation for UAS), Certification, and Human System Integration (ground control station design/pilot interfaces). The project is working in close coordination with the FAA and industry standards organizations (e.g. RTCA SC 203). More details on this project are provided in a separate article in this year's yearbook.
Operation Occupation: A College and Career Readiness Intervention for Elementary Students
ERIC Educational Resources Information Center
Mariani, Melissa; Berger, Carolyn; Koerner, Kathleen; Sandlin, Cassie
2017-01-01
This article describes efforts undertaken to design, deliver, and evaluate a college and career readiness (CCR) unit for fifth-grade students. Preliminary findings from the school counselor-developed and -delivered intervention, Operation Occupation, supported interdisciplinary efforts between counselors and classroom teachers. Pre- and…
Fox Valley Technical Institute Economic Development Plan.
ERIC Educational Resources Information Center
Fox Valley Technical Inst., Appleton, WI.
Designed as an operating blueprint for Fox Valley Technical Institute's (FVTI's) economic development efforts, this guide incorporates the necessary operation procedures, descriptions, and resources for those involved in FVTI's effort to assist existing businesses to expand and to attract new businesses to the area. Introductory material describes…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mckie, Jim
2012-01-09
This report documents the results of work done over a 6-year period under the FAST-OS programs. The first effort was called Right-Weight Kernels (RWK) and was concerned with improving measurements of OS noise so it could be treated quantitatively, and with evaluating the use of two operating systems, Linux and Plan 9, on HPC systems and determining how these operating systems needed to be extended or changed for HPC, while still retaining their general-purpose nature. The second program, HARE, explored the creation of alternative runtime models, building on RWK. All of the HARE work was done on Plan 9. The HARE researchers were mindful of the very good Linux and LWK work being done at other labs and saw no need to recreate it. Even given this limited funding, the two efforts had outsized impact:
- Helped Cray decide to use Linux, instead of a custom kernel, and provided the tools needed to make Linux perform well
- Created a successor operating system to Plan 9, NIX, which has been taken in by Bell Labs for further development
- Created a standard system measurement tool, Fixed Time Quantum (FTQ), which is widely used for measuring operating system impact on applications
- Spurred the use of the 9p protocol in several organizations, including IBM
- Built software in use at many companies, including IBM, Cray, and Google
- Spurred the creation of alternative runtimes for use on HPC systems
- Demonstrated that, with proper modifications, a general-purpose operating system can provide communications up to 3 times as effective as user-level libraries
Open source was a key part of this work. The code developed for this project is in wide use and available at many places. The core Blue Gene code is available at https://bitbucket.org/ericvh/hare. We describe details of these impacts in the following sections. The rest of this report is organized as follows: first, we describe commercial impact; next, we describe the FTQ benchmark and its impact in more detail; operating systems and runtime research follows; we discuss infrastructure software; and we close with a description of the new NIX operating system, future work, and conclusions.
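As a rough illustration of the FTQ idea mentioned above, the sketch below counts the work completed in fixed-length quanta; variation between quanta indicates interference from the operating system. It is a toy Python version, not the actual FTQ benchmark.

```python
# Simplified illustration of the Fixed Time Quantum (FTQ) idea: count how much
# work completes in each fixed-length quantum; variation between quanta reflects
# interference ("noise") from the OS. This is not the actual FTQ benchmark.
import time


def ftq_sample(quantum_s=0.01, n_quanta=100):
    counts = []
    for _ in range(n_quanta):
        deadline = time.perf_counter() + quantum_s
        work = 0
        while time.perf_counter() < deadline:
            work += 1          # fixed unit of work per iteration
        counts.append(work)
    return counts


if __name__ == "__main__":
    counts = ftq_sample()
    mean = sum(counts) / len(counts)
    spread = max(counts) - min(counts)
    print(f"mean work/quantum={mean:.0f}, spread={spread} (larger spread => more noise)")
```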
40 CFR 792.81 - Standard operating procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 33 2013-07-01 2013-07-01 false Standard operating procedures. 792.81... operating procedures. (a) A testing facility shall have standard operating procedures in writing, setting... data generated in the course of a study. All deviations in a study from standard operating procedures...
40 CFR 792.81 - Standard operating procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 33 2012-07-01 2012-07-01 false Standard operating procedures. 792.81... operating procedures. (a) A testing facility shall have standard operating procedures in writing, setting... data generated in the course of a study. All deviations in a study from standard operating procedures...
40 CFR 792.81 - Standard operating procedures.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 32 2014-07-01 2014-07-01 false Standard operating procedures. 792.81... operating procedures. (a) A testing facility shall have standard operating procedures in writing, setting... data generated in the course of a study. All deviations in a study from standard operating procedures...
ERIC Educational Resources Information Center
Anagnostopoulos, Dorothea; Sykes, Gary; McCrory, Raven; Cannata, Marisa; Frank, Kenneth
2010-01-01
The National Board for Professional Teaching Standards (NBPTS) is the most prominent contemporary effort to professionalize teaching. Along with identifying exceptional teachers, the NBPTS seeks to alter teachers' work by establishing a cadre of expert teachers capable of and obligated to leading school improvement efforts. This article reports…
ERIC Educational Resources Information Center
Askov, Eunice N.
This document describes two activities of the Literacy Leader Fellowship research project, which addressed the needs of adult educators for knowledge of job skills and of business and unions for information about adult literacy efforts. The first section describes the following efforts related to skill standards and other policy initiatives: (1)…
Health hazard evaluation report HETA 92-064-2222, Ohio Valley Litho Color, Inc. , Florence, Kentucky
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moss, C.E.; Burr, G.
1992-05-01
In response to a confidential request from a worker at Ohio Valley Litho Color, Inc. (SIC-2652) in Florence, Kentucky, an evaluation was made of possible exposure to electrical currents during the operation of an industrial papercutter. The company used industrial papercutters to prepare various types of advertising material for their clients. An operator of the papercutters had been diagnosed with a brain tumor. Measurements indicated that operators of the equipment had electrical currents of 9.7 to 160 microamperes passing through their bodies. There do not appear to be any standards related to exposure to currents of this magnitude passing through the body. The authors conclude that no electrocution or electrical burn hazard existed at the time of the survey. The lack of data to indicate the outcome of chronic exposure to low levels of currents passing through the body over a period of several years makes it impossible to rule out a connection between this exposure and chronic diseases such as brain tumors or cancers. The authors recommend that efforts be made to eliminate this exposure.
Addressable configurations of DNA nanostructures for rewritable memory.
Chandrasekaran, Arun Richard; Levchenko, Oksana; Patel, Dhruv S; MacIsaac, Molly; Halvorsen, Ken
2017-11-02
DNA serves as nature's information storage molecule, and has been the primary focus of engineered systems for biological computing and data storage. Here we combine recent efforts in DNA self-assembly and toehold-mediated strand displacement to develop a rewritable multi-bit DNA memory system. The system operates by encoding information in distinct and reversible conformations of a DNA nanoswitch and decoding by gel electrophoresis. We demonstrate a 5-bit system capable of writing, erasing, and rewriting binary representations of alphanumeric symbols, as well as compatibility with 'OR' and 'AND' logic operations. Our strategy is simple to implement, requiring only a single mixing step at room temperature for each operation and standard gel electrophoresis to read the data. We envision such systems could find use in covert product labeling and barcoding, as well as secure messaging and authentication when combined with previously developed encryption strategies. Ultimately, this type of memory has exciting potential in biomedical sciences as data storage can be coupled to sensing of biological molecules. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
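The sketch below illustrates only the information-encoding side of such a memory: alphanumeric symbols mapped to 5-bit values with bitwise OR and AND applied. The symbol mapping is an arbitrary example, not the encoding used in the paper.

```python
# Illustrative sketch of the information-encoding side only: representing
# alphanumeric symbols as 5-bit values and applying bitwise OR/AND, analogous
# to the logic operations the nanoswitch memory demonstrates. (A 5-bit code
# covers 32 symbols; the mapping below is arbitrary, not the paper's encoding.)

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ012345"  # 32 symbols -> 5 bits each


def encode(symbol):
    return ALPHABET.index(symbol)           # 0..31


def decode(bits):
    return ALPHABET[bits & 0b11111]


if __name__ == "__main__":
    a, b = encode("K"), encode("3")
    print(f"{a:05b} OR  {b:05b} = {a | b:05b} -> {decode(a | b)}")
    print(f"{a:05b} AND {b:05b} = {a & b:05b} -> {decode(a & b)}")
```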
Robotic single port cholecystectomy: current data and future perspectives.
Angelou, Anastasios; Skarmoutsos, Athanasios; Margonis, Georgios A; Moris, Demetrios; Tsigris, Christos; Pikoulis, Emmanouil
2017-04-01
Minimally invasive techniques are used more and more frequently. Since the conventional laparoscopic approach has been the gold standard, surgeons, in an effort to further reduce the invasiveness of conventional laparoscopic cholecystectomy, have adopted the single-incision approach. The widespread adoption of robotics has led to the inevitable hybridization of robotic technology with laparoendoscopic single-site surgery (LESS). As a result, employment of the da Vinci surgical system may allow greater surgical maneuverability, improving ergonomics. A review of the English literature was conducted to evaluate all robotic single port cholecystectomies performed to date. Demographic data, operative parameters, postoperative outcomes and materials used for the operation were collected and assessed. A total of 12 studies, including 501 patients, were analyzed. Demographics and clinical characteristics of the patients were heterogeneous, but in most studies a mean BMI <30 was recorded. Intraoperative metrics like operative time, estimated blood loss and conversion rate were comparable with those in multiport conventional laparoscopy. Robotic single port cholecystectomy is a safe and feasible alternative to the conventional multiport laparoscopic or manual robotic approach. However, current data do not suggest a superiority of robotic SILC over other established methods.
Operating Ferret on a patrol boat
NASA Astrophysics Data System (ADS)
Bédard, Jacques
2006-05-01
Ferret is an acoustic system that detects, recognizes and localizes the source and direction of small arms fire. The system comprises a small array of microphones and pressure sensors connected to a standard PC-104 computer that analyzes, displays, reports and logs the parameters of a recognized shot. The system operates by detecting and recognizing the ballistic shock waves created by the supersonic bullet, combined with the muzzle blast wave propagating from the weapon. The system was recently installed and tested on a patrol boat operated by the Royal Canadian Mounted Police (RCMP). An electronic compass with tilt compensation and a GPS was incorporated into the system. This allows the system to correct for the motion of the boat and provide the full coordinates of the shooter. The system also updates the azimuth to the shooter in real time as the boat turns. This paper presents the results of our test and evaluation based on a live firing experiment. Ferret is the result of a collaborative effort by Defence R&D Canada and MacDonald Dettwiler and Associates.
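As a hedged sketch of the geometry involved (this is not Ferret code), the example below combines a bearing relative to the boat with the compass heading and GPS position to produce an absolute azimuth and an approximate shooter location at an assumed range, using a flat-earth approximation.

```python
# Hedged sketch of the geometry only (not Ferret code): combining a detected
# bearing relative to the boat with the compass heading and GPS position to
# report an absolute azimuth and an approximate shooter location at an assumed
# range. A flat-earth approximation is used for short distances.
import math

EARTH_RADIUS_M = 6371000.0


def shooter_estimate(boat_lat, boat_lon, heading_deg, relative_bearing_deg, range_m):
    azimuth = (heading_deg + relative_bearing_deg) % 360.0
    d_north = range_m * math.cos(math.radians(azimuth))
    d_east = range_m * math.sin(math.radians(azimuth))
    dlat = math.degrees(d_north / EARTH_RADIUS_M)
    dlon = math.degrees(d_east / (EARTH_RADIUS_M * math.cos(math.radians(boat_lat))))
    return azimuth, boat_lat + dlat, boat_lon + dlon


if __name__ == "__main__":
    az, lat, lon = shooter_estimate(45.42, -75.70, heading_deg=80.0,
                                    relative_bearing_deg=30.0, range_m=400.0)
    print(f"azimuth={az:.1f} deg, shooter ~ ({lat:.5f}, {lon:.5f})")
```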
A Tale of Two Archives: PDS3/PDS4 Archiving and Distribution of Juno Mission Data
NASA Astrophysics Data System (ADS)
Stevenson, Zena; Neakrase, Lynn; Huber, Lyle; Chanover, Nancy J.; Beebe, Reta F.; Sweebe, Kathrine; Johnson, Joni J.
2017-10-01
The Juno mission to Jupiter, which was launched on 5 August 2011 and arrived at the Jovian system in July 2016, represents the last mission to be officially archived under the PDS3 archive standards. Modernization and availability of the newer PDS4 archive standard has prompted the PDS Atmospheres Node (ATM) to provide on-the-fly migration of Juno data from PDS3 to PDS4. Data distribution under both standards presents challenges in terms of how to present data to the end user in both standards, without sacrificing accessibility to the data or impacting the active PDS3 mission pipelines tasked with delivering the data on predetermined schedules. The PDS Atmospheres Node has leveraged its experience with prior active PDS4 missions (e.g., LADEE and MAVEN) and ongoing PDS3-to-PDS4 data migration efforts providing a seamless distribution of Juno data in both PDS3 and PDS4. When ATM receives a data delivery from the Juno Science Operations Center, the PDS3 labels are validated and then fed through PDS4 migration software built at ATM. Specifically, a collection of Python methods and scripts has been developed to make the migration process as automatic as possible, even when working with the more complex labels used by several of the Juno instruments. This is used to create all of the PDS4 data labels at once and build PDS4 archive bundles with minimal human effort. Resultant bundles are then validated against the PDS4 standard and released alongside the certified PDS3 versions of the same data. The newer design of the distribution pages provides access to both versions of the data, utilizing some of the enhanced capabilities of PDS4 to improve search and retrieval of Juno data. Webpages are designed with the intent of offering easy access to all documentation for Juno data as well as the data themselves in both standards for users of all experience levels. We discuss the structure and organization of the Juno archive and associated webpages as examples of joint PDS3/PDS4 data access for end users.
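As a hedged illustration of what an automated label-translation step can look like (this is not the ATM migration software, and the element names are invented rather than PDS4-schema-compliant), the sketch below parses simple KEY = VALUE pairs from a PDS3-style label and emits them as XML.

```python
# Hedged sketch (not the ATM migration software): parsing simple KEY = VALUE
# pairs from a PDS3-style label and emitting them as XML elements. Real PDS4
# labels must conform to the PDS4 Information Model and schemas; this only
# illustrates the flavor of an automated label translation step.
import xml.etree.ElementTree as ET


def parse_pds3(label_text):
    pairs = {}
    for line in label_text.splitlines():
        if "=" in line:
            key, value = (part.strip() for part in line.split("=", 1))
            pairs[key] = value.strip('"')
    return pairs


def to_xml(pairs):
    root = ET.Element("Product_Observational")   # element names illustrative only
    for key, value in pairs.items():
        child = ET.SubElement(root, key.lower())
        child.text = value
    return ET.tostring(root, encoding="unicode")


if __name__ == "__main__":
    label = 'INSTRUMENT_NAME = "JUNOCAM"\nTARGET_NAME = "JUPITER"\nEND'
    print(to_xml(parse_pds3(label)))
```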
Innovative Approach Enabled the Retirement of TDRS-1 Compliant with NASA Orbital Debris Requirements
NASA Technical Reports Server (NTRS)
Zaleski, Ronald; Mirczak, Walter; Staich, Stephen; Caverly, Richard; Smith, Eric; Teti, Nicholas; Vaught, W. Lynn; Olney, Dave
2011-01-01
The first Tracking and Data Relay Satellite (TDRS-1) was deactivated on June 27th 2010 following more than 26 years of operation. The end-of-mission (EOM) operations were developed to address the stringent requirements of NPR 8715.6: NASA Procedural Requirements for Limiting Orbital Debris, which consists of three key items: 1) removal from the geosynchronous arc; 2) depletion of the remaining propellant; and 3) passivation of all sources of energy storage or generation [1]. The EOM approach minimized risks while accomplishing these goals. Raising TDRS-1 over 350 km above geosynchronous was accomplished via proven station change operations. Depleting propellant was the most challenging task, requiring over 20 hours of thruster on-time accumulated within schedule, orbit, and spacecraft subsystem constraints. The attitude configuration and operational procedures, including the unique final passivation method, were thoroughly analyzed and simulated prior to the start of operations. The complete EOM campaign lasted 21 days. The TDRS-1 EOM campaign demonstrated that pre-NPR 8715.6 satellite designs can be made to comply and that lessons learned could be applied to other satellite designs. The significant TDRS-1 effort demonstrates a commitment by NASA to responsible orbital debris management in compliance with international standards.
Intelligence Level Performance Standards Research for Autonomous Vehicles
Bostelman, Roger B.; Hong, Tsai H.; Messina, Elena
2017-01-01
United States and European safety standards have evolved to protect workers near Automatic Guided Vehicles (AGVs). However, performance standards for AGVs and mobile robots have only recently begun development. Lessons can be learned from research and standards efforts for mobile robots applied to emergency response and military applications. Research challenges, tests and evaluations, and programs to develop higher intelligence levels for vehicles can also be used to guide industrial AGV developments towards more adaptable and intelligent systems. These other efforts also provide useful standards development criteria for AGV performance test methods. Current standards areas being considered for AGVs are for docking, navigation, obstacle avoidance, and the ground truth systems that measure performance. This paper provides a look to the future with standards developments in both the performance of vehicles and the dynamic perception systems that measure intelligent vehicle performance. PMID:28649189
Intelligence Level Performance Standards Research for Autonomous Vehicles.
Bostelman, Roger B; Hong, Tsai H; Messina, Elena
2015-01-01
United States and European safety standards have evolved to protect workers near Automatic Guided Vehicles (AGVs). However, performance standards for AGVs and mobile robots have only recently begun development. Lessons can be learned from research and standards efforts for mobile robots applied to emergency response and military applications. Research challenges, tests and evaluations, and programs to develop higher intelligence levels for vehicles can also be used to guide industrial AGV developments towards more adaptable and intelligent systems. These other efforts also provide useful standards development criteria for AGV performance test methods. Current standards areas being considered for AGVs are for docking, navigation, obstacle avoidance, and the ground truth systems that measure performance. This paper provides a look to the future with standards developments in both the performance of vehicles and the dynamic perception systems that measure intelligent vehicle performance.
40 CFR 201.15 - Standard for car coupling operations.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 25 2011-07-01 2011-07-01 false Standard for car coupling operations... Interstate Rail Carrier Operations Standards § 201.15 Standard for car coupling operations. Effective January 15, 1984, no carrier subject to this regulation shall conduct car coupling operations that exceed an...
40 CFR 201.15 - Standard for car coupling operations.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 26 2013-07-01 2013-07-01 false Standard for car coupling operations... Interstate Rail Carrier Operations Standards § 201.15 Standard for car coupling operations. Effective January 15, 1984, no carrier subject to this regulation shall conduct car coupling operations that exceed an...
40 CFR 201.15 - Standard for car coupling operations.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 25 2014-07-01 2014-07-01 false Standard for car coupling operations... Interstate Rail Carrier Operations Standards § 201.15 Standard for car coupling operations. Effective January 15, 1984, no carrier subject to this regulation shall conduct car coupling operations that exceed an...
40 CFR 201.15 - Standard for car coupling operations.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 26 2012-07-01 2011-07-01 true Standard for car coupling operations... Interstate Rail Carrier Operations Standards § 201.15 Standard for car coupling operations. Effective January 15, 1984, no carrier subject to this regulation shall conduct car coupling operations that exceed an...
40 CFR 201.15 - Standard for car coupling operations.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Standard for car coupling operations... Interstate Rail Carrier Operations Standards § 201.15 Standard for car coupling operations. Effective January 15, 1984, no carrier subject to this regulation shall conduct car coupling operations that exceed an...
One tool - one team: the marriage of test and operations in a low-budget spacecraft development
NASA Astrophysics Data System (ADS)
Finley, Charles J.
2006-05-01
The Air Force Research Laboratory's Space Vehicles Directorate (AFRL/VS) and the Department of Defense Space Test Program (STP) are two organizations that have partnered on more than 85 missions since 1968 to develop, launch, and operate Research and Development, Test and Evaluation space missions. As valuable as these missions have been to the follow-on generation of Operational systems, they are consistently under-funded and forced to execute on excessively ambitious development schedules. Due to these constraints, space mission development teams that serve the RDT&E community are faced with a number of unique technical and programmatic challenges. AFRL and STP have taken various approaches throughout the mission lifecycle to accelerate their development schedules, without sacrificing cost or system reliability. In the areas of test and operations, they currently employ one of two strategies. Historically, they have sought to avoid the added cost and complexity associated with coupled development schedules and segregated the spacecraft development and test effort from the ground operations system development and test effort. However, because these efforts have far more in common than they have differences, they have more recently attempted to pursue parallel I&T and Operations development and readiness efforts. This paper seeks to compare and contrast the "decoupled test and operations" approach, used by such missions as C/NOFS and Coriolis, with the "coupled test and operations" approach, adopted by the XSS-11 and TacSat-2 missions.
Space Telecommunications Radio Architecture (STRS)
NASA Technical Reports Server (NTRS)
Reinhart, Richard C.
2006-01-01
A software defined radio (SDR) architecture used in space-based platforms proposes to standardize certain aspects of radio development such as interface definitions, functional control and execution, and application software and firmware development. NASA has chartered a team to develop an open software defined radio hardware and software architecture to support NASA missions and determine the viability of an Agency-wide Standard. A draft concept of the proposed standard has been released and discussed among organizations in the SDR community. Appropriate leveraging of the JTRS SCA, OMG's SWRadio Architecture and other aspects are considered. A standard radio architecture offers potential value by employing common waveform software instantiation, operation, testing and software maintenance. While software defined radios offer greater flexibility, they also pose challenges to radio development for the space environment in terms of size, mass and power consumption and available technology. An SDR architecture for space must recognize and address the constraints of space flight hardware and systems, along with flight heritage and culture. NASA is actively participating in the development of technology and standards related to software defined radios. As NASA considers a standard radio architecture for space communications, input and coordination from government agencies, the industry, academia, and standards bodies is key to a successful architecture. The unique aspects of space require thorough investigation of relevant terrestrial technologies properly adapted to space. The talk will describe NASA's current effort to investigate SDR applications to space missions and a brief overview of a candidate architecture under consideration for space based platforms.
Space Telecommunications Radio Architecture (STRS): Technical Overview
NASA Technical Reports Server (NTRS)
Reinhart, Richard C.
2006-01-01
A software defined radio (SDR) architecture used in space-based platforms proposes to standardize certain aspects of radio development such as interface definitions, functional control and execution, and application software and firmware development. NASA has chartered a team to develop an open software defined radio hardware and software architecture to support NASA missions and determine the viability of an Agency-wide Standard. A draft concept of the proposed standard has been released and discussed among organizations in the SDR community. Appropriate leveraging of the JTRS SCA, OMG's SWRadio Architecture and other aspects are considered. A standard radio architecture offers potential value by employing common waveform software instantiation, operation, testing and software maintenance. While software defined radios offer greater flexibility, they also pose challenges to radio development for the space environment in terms of size, mass and power consumption and available technology. An SDR architecture for space must recognize and address the constraints of space flight hardware and systems, along with flight heritage and culture. NASA is actively participating in the development of technology and standards related to software defined radios. As NASA considers a standard radio architecture for space communications, input and coordination from government agencies, the industry, academia, and standards bodies is key to a successful architecture. The unique aspects of space require thorough investigation of relevant terrestrial technologies properly adapted to space. The talk will describe NASA's current effort to investigate SDR applications to space missions and a brief overview of a candidate architecture under consideration for space based platforms.
Framework for Human-Automation Collaboration: Conclusions from Four Studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oxstrand, Johanna; Le Blanc, Katya L.; O'Hara, John
The Human Automation Collaboration (HAC) research project is investigating how advanced technologies that are planned for Advanced Small Modular Reactors (AdvSMR) will affect the performance and the reliability of the plant from a human factors and human performance perspective. The HAC research effort investigates the consequences of allocating functions between the operators and automated systems. More specifically, the research team is addressing how to best design the collaboration between the operators and the automated systems in a manner that has the greatest positive impact on overall plant performance and reliability. Oxstrand et al. (2013 - March) describes the efforts conducted by the researchers to identify the research needs for HAC. The research team reviewed the literature on HAC, developed a model of HAC, and identified gaps in the existing knowledge of human-automation collaboration. As described in Oxstrand et al. (2013 - June), the team then prioritized the research topics identified based on the specific needs in the context of AdvSMR. The prioritization was based on two sources of input: 1) the preliminary functions and tasks, and 2) the model of HAC. As a result, three analytical studies were planned and conducted: 1) Models of Teamwork, 2) Standardized HAC Performance Measurement Battery, and 3) Initiators and Triggering Conditions for Adaptive Automation. Additionally, one field study was also conducted at Idaho Falls Power.
Space Suit Joint Torque Measurement Method Validation
NASA Technical Reports Server (NTRS)
Valish, Dana; Eversley, Karina
2012-01-01
In 2009 and early 2010, a test method was developed and performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits. This was done in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit design met the requirements. However, because the original test was set up and conducted by a single test operator there was some question as to whether this method was repeatable enough to be considered a standard verification method for Constellation or other future development programs. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resultant data was compared using graphical and statistical analysis; the results indicated a significant variance in values reported for a subset of the re-tested joints. Potential variables that could have affected the data were identified and a third round of testing was conducted in an attempt to eliminate and/or quantify the effects of these variables. The results of the third test effort will be used to determine whether or not the proposed joint torque methodology can be applied to future space suit development contracts.
The development of health care data warehouses to support data mining.
Lyman, Jason A; Scully, Kenneth; Harrison, James H
2008-03-01
Clinical data warehouses offer tremendous benefits as a foundation for data mining. By serving as a source for comprehensive clinical and demographic information on large patient populations, they streamline knowledge discovery efforts by providing standard and efficient mechanisms to replace time-consuming and expensive original data collection, organization, and processing. Building effective data warehouses requires knowledge of and attention to key issues in database design, data acquisition and processing, and data access and security. In this article, the authors provide an operational and technical definition of data warehouses, present examples of data mining projects enabled by existing data warehouses, and describe key issues and challenges related to warehouse development and implementation.
Dual Oculometer System for Aviation Crew Assessment
NASA Technical Reports Server (NTRS)
Latorella, Kara; Ellis, Kyle K.; Lynn, William A.; Frasca, Dennis; Burdette, Daniel W.; Feigh, Charles T.; Douglas, Alan L.
2010-01-01
Oculometers, or eye trackers, are a useful tool for ascertaining the manner in which pilots deploy visual attentional resources, and for assessing the degree to which stimuli capture attention exogenously. The aim of this effort was to obtain oculometer data comfortably, unobtrusively, reliably and with good spatial resolution over a standard B757-like flight deck for both individuals in a crew. We chose to implement two remote, 5-camera Smarteye systems, crafted for this purpose to operate harmoniously. We present here the results of validation exercises, lessons learned for improving data quality, and initial thoughts on the use of paired oculometer data to reflect crew workload, coordination, and situation awareness in the aggregate.
NASA/SPoRt: GOES-R Activities in Support of Product Development, Management, and Training
NASA Technical Reports Server (NTRS)
Fuell, Kevin; Jedlovec, Gary; Molthan, Andrew; Stano, Geoffrey
2012-01-01
SPoRT is using current capabilities of MODIS and VIIRS, combined with current GOES (i.e., Hybrid Imagery), to demonstrate mesoscale capabilities of the future ABI instrument. SPoRT is transitioning RGBs from EUMETSAT standard "recipes" to demonstrate a method to more efficiently handle the increased channels/frequency of ABI. Challenges for RGB production exist, including internal vs. external production, the bit depth needed, and adding quantitative information; SPoRT is forming a group to address these issues. SPoRT is leading efforts on the application of total lightning in operations and on educating users of this new capability. Training in many forms is used to support testbed activities and is a key part of the transition process.
EuroPhenome and EMPReSS: online mouse phenotyping resource
Mallon, Ann-Marie; Hancock, John M.
2008-01-01
EuroPhenome (http://www.europhenome.org) and EMPReSS (http://empress.har.mrc.ac.uk/) form an integrated resource to provide access to data and procedures for mouse phenotyping. EMPReSS describes 96 Standard Operating Procedures for mouse phenotyping. EuroPhenome contains data resulting from carrying out EMPReSS protocols on four inbred laboratory mouse strains. As well as web interfaces, both resources support web services to enable integration with other mouse phenotyping and functional genetics resources, and are committed to initiatives to improve integration of mouse phenotype databases. EuroPhenome will be the repository for a recently initiated effort to carry out large-scale phenotyping on a large number of knockout mouse lines (EUMODIC). PMID:17905814
EuroPhenome and EMPReSS: online mouse phenotyping resource.
Mallon, Ann-Marie; Blake, Andrew; Hancock, John M
2008-01-01
EuroPhenome (http://www.europhenome.org) and EMPReSS (http://empress.har.mrc.ac.uk/) form an integrated resource to provide access to data and procedures for mouse phenotyping. EMPReSS describes 96 Standard Operating Procedures for mouse phenotyping. EuroPhenome contains data resulting from carrying out EMPReSS protocols on four inbred laboratory mouse strains. As well as web interfaces, both resources support web services to enable integration with other mouse phenotyping and functional genetics resources, and are committed to initiatives to improve integration of mouse phenotype databases. EuroPhenome will be the repository for a recently initiated effort to carry out large-scale phenotyping on a large number of knockout mouse lines (EUMODIC).
Autonomous Spacecraft Navigation Using Above-the-Constellation GPS Signals
NASA Technical Reports Server (NTRS)
Winternitz, Luke
2017-01-01
GPS-based spacecraft navigation offers many performance and cost benefits, and GPS receivers are now standard GNC components for LEO missions. Recently, more and more high-altitude missions are taking advantage of the benefits of GPS navigation as well. High-altitude applications pose challenges, however, because receivers operating above the GPS constellations are subject to reduced signal strength and availability, and uncertain signal quality. This presentation will present the history and state-of-the-art in high-altitude GPS spacecraft navigation, including early experiments, current missions and receivers, and efforts to characterize and protect signals available to high-altitude users. Recent results from the very-high altitude MMS mission are also provided.
The Use of Technology to Advance HIV Prevention for Couples.
Mitchell, Jason W
2015-12-01
The majority of HIV prevention studies and programs have targeted individuals or operated at the community level. This has also been the standard approach when incorporating technology (e.g., web-based, smartphones) to help improve HIV prevention efforts. The tides have turned for both approaches: greater attention is now focusing on couple-based HIV prevention and using technology to help improve these efforts for maximizing reach and potential impact. To assess the extent that technology has been used to help advance HIV prevention with couples, a literature review was conducted using four databases and included studies that collected data from 2000 to early 2015. Results from this review suggest that technology has primarily been used to help advance HIV prevention with couples as a tool for (1) recruitment and data collection and (2) intervention development. Challenges and limitations of conducting research (e.g., validity of dyadic data) along with future directions for how technology (e.g., mHealth, wearable sensors) can be used to advance HIV prevention with couples are then discussed. Given the growing and near ubiquitous use of the Internet and smartphones, further efforts in the realm of mHealth (e.g., applications or "apps") and eHealth are needed to develop novel couple-focused HIV-preventive interventions.
40 CFR 201.13 - Standard for rail car operations.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Standard for rail car operations. 201... Interstate Rail Carrier Operations Standards § 201.13 Standard for rail car operations. Effective December 31, 1976, no carrier subject to this regulation shall operate any rail car or combination of rail cars...
40 CFR 201.13 - Standard for rail car operations.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 26 2013-07-01 2013-07-01 false Standard for rail car operations. 201... Interstate Rail Carrier Operations Standards § 201.13 Standard for rail car operations. Effective December 31, 1976, no carrier subject to this regulation shall operate any rail car or combination of rail cars...
40 CFR 201.13 - Standard for rail car operations.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 25 2014-07-01 2014-07-01 false Standard for rail car operations. 201... Interstate Rail Carrier Operations Standards § 201.13 Standard for rail car operations. Effective December 31, 1976, no carrier subject to this regulation shall operate any rail car or combination of rail cars...
40 CFR 201.13 - Standard for rail car operations.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 26 2012-07-01 2011-07-01 true Standard for rail car operations. 201... Interstate Rail Carrier Operations Standards § 201.13 Standard for rail car operations. Effective December 31, 1976, no carrier subject to this regulation shall operate any rail car or combination of rail cars...
40 CFR 201.13 - Standard for rail car operations.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 25 2011-07-01 2011-07-01 false Standard for rail car operations. 201... Interstate Rail Carrier Operations Standards § 201.13 Standard for rail car operations. Effective December 31, 1976, no carrier subject to this regulation shall operate any rail car or combination of rail cars...
A review of medical terminology standards and structured reporting.
Awaysheh, Abdullah; Wilcke, Jeffrey; Elvinger, François; Rees, Loren; Fan, Weiguo; Zimmerman, Kurt
2018-01-01
Much effort has been invested in standardizing medical terminology for representation of medical knowledge, storage in electronic medical records, retrieval, reuse for evidence-based decision making, and for efficient messaging between users. We only focus on those efforts related to the representation of clinical medical knowledge required for capturing diagnoses and findings from a wide range of general to specialty clinical perspectives (e.g., internists to pathologists). Standardized medical terminology and the usage of structured reporting have been shown to improve the usage of medical information in secondary activities, such as research, public health, and case studies. The impact of standardization and structured reporting is not limited to secondary activities; standardization has been shown to have a direct impact on patient healthcare.
Advanced Command Destruct System (ACDS) Enhanced Flight Termination System (EFTS)
NASA Technical Reports Server (NTRS)
Tow, David K.
2011-01-01
This presentation provides information on the development, integration, and operational usage of the Enhanced Flight Termination System (EFTS) at NASA Dryden Flight Research Center and Air Force Flight Test Center. The presentation will describe the efforts completed to certify the system and acquire approval for operational usage, the efforts to integrate the system into the NASA Dryden existing flight termination infrastructure, and the operational support of aircraft with EFTS at Edwards AFB.
Dynamics and control of detumbling a disabled spacecraft during rescue operations
NASA Technical Reports Server (NTRS)
Kaplan, M. H.
1973-01-01
Results of a two-year research effort on dynamics and control of detumbling a disabled spacecraft during rescue operations are summarized. Answers to several basic questions about associated techniques and hardware requirements were obtained. Specifically, efforts have included development of operational procedures, conceptual design of remotely controlled modules, feasibility of internal moving mass for stabilization, and optimal techniques for minimum-time detumbling. Results have been documented in several reports and publications.
2007-05-10
planners will also benefit from experiencing the regimented military decision-making process and working with experienced operational planners. This...picture of the disaster area for the senior decision-makers, duplication of efforts, gaps in addressing requests for assistance, and the inefficient...Guard Atlantic Area. Interview by author, 25 March 2007. Mr. Doane stated that the JTF operated “in a vacuum” and “outside the inter-agency decision
Operational development of small plant growth systems
NASA Technical Reports Server (NTRS)
Scheld, H. W.; Magnuson, J. W.; Sauer, R. L.
1986-01-01
The results of a study undertaken on the first phase of an empirical effort in the development of small plant growth chambers for production of salad-type vegetables on the space shuttle or space station are discussed. The overall effort is visualized as providing the underpinning of practical experience in handling plant systems in space, which will provide major support for future efforts in planning, design, and construction of plant-based (phytomechanical) systems for support of human habitation in space. The assumptions underlying the effort hold that large-scale phytomechanical habitability support systems for future space stations must evolve from the simple to the complex. The highly complex final systems will be developed from the accumulated experience and data gathered from repetitive tests and trials of fragments or subsystems of the whole in an operational mode. These developing system components will, meanwhile, serve a useful operational function in providing psychological support and diversion for the crews.
ERIC Educational Resources Information Center
Hursh, David
2008-01-01
This book examines the changes in educational policy in the U.S. and Britain over the last twenty-five years. The author contends that education in the States and Britain has been significantly transformed, through efforts to create curricular standards, increased emphasis on accountability measured by standardized tests, and efforts to introduce…
An Integrated Approach to the Teaching of Operations Management in a Business School
ERIC Educational Resources Information Center
Misra, Ram B.; Ravinder, Handanhal; Peterson, Richard L.
2016-01-01
The authors discuss a curriculum integration effort that a school of business piloted recently. This effort was aimed at integrating the core functions (finance, marketing, management, and operations) so that undergraduate students would better appreciate the full impact of functional decisions on each other and in achieving the corporation's…
40 CFR 265.1201 - Design and operating standards.
Code of Federal Regulations, 2012 CFR
2012-07-01
... be stored in accordance with a Standard Operating Procedure specifying procedures to ensure safety... 40 Protection of Environment 27 2012-07-01 2012-07-01 false Design and operating standards. 265... operating standards. (a) Hazardous waste munitions and explosives storage units must be designed and...
40 CFR 265.1201 - Design and operating standards.
Code of Federal Regulations, 2014 CFR
2014-07-01
... be stored in accordance with a Standard Operating Procedure specifying procedures to ensure safety... 40 Protection of Environment 26 2014-07-01 2014-07-01 false Design and operating standards. 265... operating standards. (a) Hazardous waste munitions and explosives storage units must be designed and...
40 CFR 265.1201 - Design and operating standards.
Code of Federal Regulations, 2013 CFR
2013-07-01
... be stored in accordance with a Standard Operating Procedure specifying procedures to ensure safety... 40 Protection of Environment 27 2013-07-01 2013-07-01 false Design and operating standards. 265... operating standards. (a) Hazardous waste munitions and explosives storage units must be designed and...
40 CFR 265.1201 - Design and operating standards.
Code of Federal Regulations, 2011 CFR
2011-07-01
... be stored in accordance with a Standard Operating Procedure specifying procedures to ensure safety... 40 Protection of Environment 26 2011-07-01 2011-07-01 false Design and operating standards. 265... operating standards. (a) Hazardous waste munitions and explosives storage units must be designed and...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-20
... Committee 147: Minimum Operational Performance Standards for Traffic Alert and Collision Avoidance Systems... Committee 147: Minimum Operational Performance Standards for Traffic Alert and Collision Avoidance Systems... RTCA Special Committee 147: Minimum Operational Performance Standards for Traffic Alert and Collision...
Developing policies and procedures for a picture archiving and communication system.
Gaytos, B
2001-06-01
Policies and procedures (P&P) constitute the mechanism for planning, standardizing, and documenting the provision of clinical services. Upon approval by hospital management, the P&P is an official statement of hospital rules and regulations. Each P&P establishes organizational responsibility for providing services. P&P are a mechanism for communicating standard operating procedures to hospital and medical staff. P&P serve as a reference document for unusual events, as well as routine procedures. P&P are often reviewed by inspection teams from the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) to determine whether the hospital has documented systematic practices. A picture archiving and communication system (PACS) provides a new vehicle for providing radiology services. P&P that were designed for conventional film-based imaging are often not appropriate for electronic imaging. Because PACS is new and not yet widespread, good examples of PACS P&P are not yet available. JCAHO has no official requirements for PACS: PACS is viewed only as a means for the hospital to accomplish its work. Successful P&P development is a team effort, drafted by personnel responsible for executing the procedure, assisted by staff proficient in PACS technology, and tested in the field. The P&P should be reviewed and approved by management personnel knowledgeable about hospital and imaging operations. P&P should be written in clear and concise language. Successful P&P development is an ongoing effort. P&P must be periodically reviewed and updated to reflect changes in PACS technology and changes in clinical operations. New P&P must be developed when a deficit is noted. PACS security is a good example of a topic worthy of P&P development, especially in the face of the Health Insurance Portability and Accountability Act (HIPAA) legislation of 1996. What are the provisions for access control? Does the system include a feature for automatic shut-off of the software? Are there "generic" passwords and log-ins shared by a community of users? How are passwords assigned and how frequently are they changed? What security measures are in place to assure passwords are given to the appropriate user? Who grants and denies access? Service calls are another topic for P&P. Who initiates a service call? What is the process for escalating a service call from the operator level to the vendor? What immediate actions are expected by the operator in order to restore PACS services? How are service events documented? Who is responsible for determining when "downtime" procedures should be initiated or suspended? When our hospital's total electrical system had to be shut down for an extended period, we found that a P&P was lacking for a task as mundane as shutting down and restarting our PACS components. What is the sequence for the shutdown? Who is responsible for shutting down and restarting? How long can the devices operate on uninterruptible power supplies (UPS)? What components are on emergency power? Should we expect the components to survive the switchover to generator power? Developing this P&P was worth the effort: it made the PACS more fault-tolerant and served as a reference document 3 years later when expansion of our physical plant required two more power outages.
Verster, Joris C; Volkerts, Edmund R; Verbaten, Marinus N
2002-08-01
Alprazolam is prescribed for the treatment of anxiety and panic disorder. Most users are presumably involved in daily activities such as driving. However, the effects of alprazolam on driving ability have never been investigated. This study was conducted to determine the effects of alprazolam (1 mg) on driving ability, memory and psychomotor performance. Twenty healthy volunteers participated in a randomized, double-blind, placebo-controlled crossover study. One hour after oral administration, subjects performed a standardized driving test on a primary highway during normal traffic. They were instructed to drive with a constant speed (90 km/h) while maintaining a steady lateral position within the right traffic lane. Primary performance measures were the Standard Deviation of Lateral Position (SDLP) and the Standard Deviation of Speed (SDS). After the driving test, subjective driving quality, mental effort, and mental activation during driving were assessed. A laboratory test battery was performed 2.5 h after treatment administration, comprising the Sternberg Memory Scanning Test, a Continuous Tracking Test, and a Divided Attention Test. Relative to placebo, alprazolam caused serious driving impairment, as expressed by a significantly increased SDLP (F(1,19) = 97.3, p <.0001) and SDS (F(1,19) = 30.4, p <.0001). This was confirmed by subjective assessments showing significantly impaired driving quality (F(1,19) = 16.4, p <.001), decreased alertness (F(1,19) = 43.4, p <.0001), decreased mental activation (F(1,19) = 5.7, p <.03) and increased mental effort during driving (F(1,19) = 26.4, p <.0001). Furthermore, alprazolam significantly impaired performance on the laboratory tests. In conclusion, alprazolam users must be warned not to drive an automobile or operate potentially dangerous machinery.
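For readers unfamiliar with the weaving measures used above, SDLP is simply the sample standard deviation of the car's lateral position recorded over the drive, and SDS the standard deviation of its speed. A minimal illustration in Python with hypothetical samples (this is not the study's actual analysis pipeline):

```python
import numpy as np

def sdlp(lateral_position_cm):
    """Standard Deviation of Lateral Position: spread of the car's position
    within the lane over the test drive (higher values mean more weaving)."""
    return np.std(lateral_position_cm, ddof=1)

def sds(speed_kmh):
    """Standard Deviation of Speed: spread around the instructed 90 km/h."""
    return np.std(speed_kmh, ddof=1)

# Hypothetical samples recorded at fixed intervals during a highway drive.
lateral = np.array([12.0, 15.5, 9.8, 22.1, 18.4, 11.2])  # cm from lane centre
speed = np.array([89.5, 91.2, 90.4, 88.7, 92.0, 90.1])   # km/h
print(sdlp(lateral), sds(speed))
```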
Hydrostatic Hyperbaric Chamber Ventilation System
NASA Technical Reports Server (NTRS)
Sarguisingh, Miriam J.
2012-01-01
The hydrostatic hyperbaric chamber (HHC) represents the merger of several technologies in development for NASA aerospace applications, harnessed to directly benefit global health. NASA has significant experience developing composite hyperbaric chambers for a variety of applications. NASA has also researched the application of water-filled vessels to increase tolerance of acceleration forces. The combination of these two applications has resulted in the hydrostatic chamber, which has been conceived as a safe, affordable means of making hyperbaric oxygen therapy (HBOT) available in the developing world for the treatment of a variety of medical conditions. Specifically, HBOT is highly desired as a possibly curative treatment for Buruli Ulcer, an infectious condition that afflicts children in sub-Saharan Africa. HBOT is simply too expensive and too dangerous to implement in the developing world using standard equipment. The HHC technology changes the paradigm. The HHC differs from standard hyperbaric chambers in that the majority of its volume is filled with water which is pressurized by oxygen being supplied in the portion of the chamber containing the patient's head. This greatly reduces the amount of oxygen required to sustain a hyperbaric atmosphere, thereby making the system safer and more economical to operate. An effort was undertaken to develop an HHC system to apply HBOT to children that is simple and robust enough to support transport, assembly, maintenance and operation in developing countries. This paper details the concept for an HHC ventilation and pressurization system to provide controlled pressurization and adequate washout of carbon dioxide while the subject is enclosed in the confined space during the administration of the medical treatment. The concept took into consideration operational complexity, safety to the patient and operating personnel, and physiological considerations. The simple schematic, composed of easily acquired commercial hardware, supports sustainability.
NASA Technical Reports Server (NTRS)
Salem, Jonathan A.; Jenkins, Michael G.
2003-01-01
Advanced aerospace systems occasionally require the use of very brittle materials such as sapphire and ultra-high temperature ceramics. Although great progress has been made in the development of methods and standards for machining, testing and design of components from these materials, additional development and dissemination of standard practices is needed. ASTM Committee C28 on Advanced Ceramics and ISO TC 206 have taken a lead role in the standardization of testing for ceramics, and recent efforts and needs in standards development by Committee C28 on Advanced Ceramics will be summarized. In some cases, the engineers involved are unaware of the latest developments, and traditional approaches applicable to other material systems are applied. Two examples of flight hardware failures that might have been prevented via education and standardization will be presented.
The proposed coding standard at GSFC
NASA Technical Reports Server (NTRS)
Morakis, J. C.; Helgert, H. J.
1977-01-01
As part of the continuing effort to introduce standardization of spacecraft and ground equipment in satellite systems, NASA's Goddard Space Flight Center and other NASA facilities have supported the development of a set of standards for the use of error control coding in telemetry subsystems. These standards are intended to ensure compatibility between spacecraft and ground encoding equipment, while allowing sufficient flexibility to meet all anticipated mission requirements. The standards which have been developed to date cover the application of block codes in error detection and error correction modes, as well as short and long constraint length convolutional codes decoded via the Viterbi and sequential decoding algorithms, respectively. Included are detailed specifications of the codes, and their implementation. Current effort is directed toward the development of standards covering channels with burst noise characteristics, channels with feedback, and code concatenation.
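To make the block-code error-detection and error-correction modes concrete, here is a minimal sketch of a textbook Hamming(7,4) encoder and single-error corrector in Python. This generic code is used purely for illustration; it is not one of the specific codes adopted in the GSFC standards described above.

```python
import numpy as np

def hamming74_encode(d):
    """Encode 4 data bits into a 7-bit Hamming codeword (positions 1..7,
    with parity bits at positions 1, 2 and 4)."""
    c = np.zeros(7, dtype=int)
    c[2], c[4], c[5], c[6] = d   # data bits at positions 3, 5, 6, 7
    c[0] = c[2] ^ c[4] ^ c[6]    # parity over positions 3, 5, 7
    c[1] = c[2] ^ c[5] ^ c[6]    # parity over positions 3, 6, 7
    c[3] = c[4] ^ c[5] ^ c[6]    # parity over positions 5, 6, 7
    return c

def hamming74_correct(r):
    """Detect and correct a single bit error; the syndrome equals the
    (1-based) position of the flipped bit, or 0 if the word checks out."""
    s1 = r[0] ^ r[2] ^ r[4] ^ r[6]
    s2 = r[1] ^ r[2] ^ r[5] ^ r[6]
    s4 = r[3] ^ r[4] ^ r[5] ^ r[6]
    syndrome = s1 + 2 * s2 + 4 * s4
    if syndrome:
        r = r.copy()
        r[syndrome - 1] ^= 1     # flip the erroneous bit back
    return r

word = hamming74_encode([1, 0, 1, 1])
noisy = word.copy()
noisy[5] ^= 1                    # inject a single-bit error
assert np.array_equal(hamming74_correct(noisy), word)
```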
Creative Analytics of Mission Ops Event Messages
NASA Technical Reports Server (NTRS)
Smith, Dan
2017-01-01
Historically, tremendous effort has been put into processing and displaying mission health and safety telemetry data, and relatively little attention has been paid to extracting information from missions' time-tagged event log messages. Today's missions may log tens of thousands of messages per day and the numbers are expected to dramatically increase as satellite fleets and constellations are launched, as security monitoring continues to evolve, and as the overall complexity of ground system operations increases. The logs may contain information about orbital events, scheduled and actual observations, device status and anomalies, when operators were logged on, when commands were resent, when there were data dropouts or system failures, and much, much more. When dealing with distributed space missions or operational fleets, it becomes even more important to systematically analyze this data. Several advanced information systems technologies make it appropriate to now develop analytic capabilities which can increase mission situational awareness, reduce mission risk, enable better event-driven automation and cross-mission collaborations, and lead to improved operations strategies: Industry Standard for Log Messages. The Object Management Group (OMG) Space Domain Task Force (SDTF) standards organization is in the process of creating a formal standard for industry for event log messages. The format is based on work at NASA GSFC. Open System Architectures. The DoD, NASA, and others are moving towards common open system architectures for mission ground data systems based on work at NASA GSFC with the full support of the commercial product industry and major integration contractors. Text Analytics. A specific area of data analytics which applies statistical, linguistic, and structural techniques to extract and classify information from textual sources. This presentation describes work now underway at NASA to increase situational awareness through the collection of non-telemetry mission operations information into a common log format and then providing display and analytics tools to provide in-depth assessment of the log contents. The work includes: common interface formats for acquiring time-tagged text messages; conversion of common files for schedules, orbital events, and stored commands to the common log format; innovative displays to depict thousands of messages on a single display; structured English text queries against the log message data store, extensible to a more mature natural language query capability; and the goal of speech-to-text and text-to-speech additions to create a personal mission operations assistant to aid on-console operations. A wide variety of planned uses identified by the mission operations teams will be discussed.
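As a rough sketch of the common-log-format-plus-query idea described above, the following Python fragment parses hypothetical time-tagged messages into a uniform record and filters them. The field layout and message syntax are invented for illustration and are not the OMG SDTF or GSFC format.

```python
import re
from datetime import datetime

# Hypothetical common record: timestamp, source, severity, free text.
LOG_LINE = re.compile(
    r"(?P<ts>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2})\s+"
    r"(?P<source>\S+)\s+(?P<severity>INFO|WARN|ERROR)\s+(?P<text>.*)")

def parse(lines):
    """Convert raw text lines into common-format event records."""
    for line in lines:
        m = LOG_LINE.match(line)
        if m:
            yield {**m.groupdict(), "ts": datetime.fromisoformat(m.group("ts"))}

def query(events, severity=None, contains=None, after=None):
    """A very small 'structured English'-style filter over parsed events."""
    for e in events:
        if severity and e["severity"] != severity:
            continue
        if contains and contains.lower() not in e["text"].lower():
            continue
        if after and e["ts"] < after:
            continue
        yield e

raw = [
    "2017-03-01T12:00:05 GS1 INFO operator logged on",
    "2017-03-01T12:04:41 SAT7 ERROR command retransmitted after dropout",
]
for event in query(parse(raw), severity="ERROR", contains="command"):
    print(event["ts"], event["source"], event["text"])
```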
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-03
... Committee 147: Minimum Operational Performance Standards for Traffic Alert and Collision Avoidance Systems... Committee 147 meeting: Minimum Operational Performance Standards for Traffic Alert and Collision Avoidance... RTCA Special Committee 147: Minimum Operational Performance Standards for Traffic Alert and Collision...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-26
... Committee 147: Minimum Operational Performance Standards for Traffic Alert and Collision Avoidance Systems... Committee 147: Minimum Operational Performance Standards for Traffic Alert and Collision Avoidance Systems... RTCA Special Committee 147: Minimum Operational Performance Standards for Traffic Alert and Collision...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-19
... Committee 147: Minimum Operational Performance Standards for Traffic Alert and Collision Avoidance Systems... Committee 147 meeting: Minimum Operational Performance Standards for Traffic Alert and Collision Avoidance... RTCA Special Committee 147: Minimum Operational Performance Standards for Traffic Alert and Collision...
NASA Astrophysics Data System (ADS)
Goring, S. J.; Richard, S. M.; Williams, J. W.; Dawson, A.
2017-12-01
A broad array of data resources, across disciplines, are needed to study Earth system processes operating at multiple spatial or temporal scales. Data friction frequently delays this integrative and interdisciplinary research, while sustainable solutions may be hampered as a result of academic incentives that penalize technical "tool building" at the expense of research publication. The paleogeosciences, in particular, often integrate data drawn from multiple sub-disciplines and from a range of long-tail and big data sources. Data friction can be lowered and the pace of scientific discovery accelerated through the development and adoption of data standards, both within the paleogeosciences and with allied disciplines. Using the PalEON Project (https://sites.nd.edu/paleonproject/) and the Neotoma Paleoecological Database (https://neotomadb.org) as focal case studies, we first illustrate the advances possible through data standardization. We then focus on new efforts in data standardization and building linkages among paleodata resources underway through the EarthCube-funded Throughput project. A first step underway is to analyze existing standards across paleo-data repositories and identify ways in which the adoption of common standards can promote connectivity, reducing barriers to interdisciplinary research, especially for early career researchers. Experience indicates that standards tend to emerge by necessity and from a mixture of bottom-up and top-down processes. A common pathway is when conventions developed to solve specific problems within a community are extended to address challenges that are more general. The Throughput project will identify, document, and promote such solutions to foster wider adoption of standards for data interchange and reduce data friction in the paleogeosciences.
Software Defined Radio Standard Architecture and its Application to NASA Space Missions
NASA Technical Reports Server (NTRS)
Andro, Monty; Reinhart, Richard C.
2006-01-01
A software defined radio (SDR) architecture used in space-based platforms proposes to standardize certain aspects of radio development such as interface definitions, functional control and execution, and application software and firmware development. NASA has chartered a team to develop an open software defined radio hardware and software architecture to support NASA missions and determine the viability of an Agency-wide Standard. A draft concept of the proposed standard has been released and discussed among organizations in the SDR community. Appropriate leveraging of the JTRS SCA, OMG's SWRadio Architecture and other aspects are considered. A standard radio architecture offers potential value by employing common waveform software instantiation, operation, testing and software maintenance. While software defined radios offer greater flexibility, they also pose challenges to radio development for the space environment in terms of size, mass, power consumption and available technology. An SDR architecture for space must recognize and address the constraints of space flight hardware and systems, along with flight heritage and culture. NASA is actively participating in the development of technology and standards related to software defined radios. As NASA considers a standard radio architecture for space communications, input and coordination from government agencies, industry, academia, and standards bodies are key to a successful architecture. The unique aspects of space require thorough investigation of relevant terrestrial technologies properly adapted to space. The talk will describe NASA's current effort to investigate SDR applications to space missions and give a brief overview of a candidate architecture under consideration for space-based platforms.
Acceleration of Linear Finite-Difference Poisson-Boltzmann Methods on Graphics Processing Units.
Qi, Ruxi; Botello-Smith, Wesley M; Luo, Ray
2017-07-11
Electrostatic interactions play crucial roles in biophysical processes such as protein folding and molecular recognition. Poisson-Boltzmann equation (PBE)-based models have emerged as widely used tools for modeling these important processes. Though great efforts have been put into developing efficient PBE numerical models, challenges still remain due to the high dimensionality of typical biomolecular systems. In this study, we implemented and analyzed commonly used linear PBE solvers for the ever-improving graphics processing units (GPU) for biomolecular simulations, including both standard and preconditioned conjugate gradient (CG) solvers with several alternative preconditioners. Our implementation utilizes the standard Nvidia CUDA libraries cuSPARSE, cuBLAS, and CUSP. Extensive tests show that good numerical accuracy can be achieved given that single precision is often used for numerical applications on GPU platforms. The optimal GPU performance was observed with the Jacobi-preconditioned CG solver, with a significant speedup over the standard CG solver on CPU in our diversified test cases. Our analysis further shows that different matrix storage formats also considerably affect the efficiency of different linear PBE solvers on GPU, with the diagonal format best suited for our standard finite-difference linear systems.
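To make the preconditioned conjugate-gradient approach concrete, here is a minimal CPU-side sketch of a Jacobi-preconditioned CG solve in Python/NumPy. It is not the paper's CUDA/cuSPARSE implementation, and the small tridiagonal test matrix below merely stands in for a finite-difference PBE system.

```python
import numpy as np

def jacobi_pcg(A, b, tol=1e-8, max_iter=1000):
    """Solve A x = b by conjugate gradients, using the inverse diagonal of A
    (Jacobi preconditioner). A must be symmetric positive definite."""
    x = np.zeros_like(b)
    r = b - A @ x
    Minv = 1.0 / np.diag(A)              # Jacobi preconditioner
    z = Minv * r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        z = Minv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Small SPD tridiagonal system standing in for a finite-difference grid.
n = 50
A = (np.diag(2.0 * np.ones(n))
     + np.diag(-1.0 * np.ones(n - 1), 1)
     + np.diag(-1.0 * np.ones(n - 1), -1))
b = np.ones(n)
x = jacobi_pcg(A, b)
```

On a GPU the same recurrence is expressed through sparse matrix-vector products and vector updates, which is where the choice of sparse storage format (for example, a diagonal layout for banded finite-difference matrices) matters for throughput.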
Increasing injuries as trampoline parks expand within Australia: a call for mandatory standards.
Sharwood, Lisa N; Adams, Susan; Blaszkow, Tracy; Eager, David
2018-04-01
To quantify an apparent increase in indoor trampoline park-related injuries in children and young people across Australia, and to understand the implications for current regulatory standards. Retrospective analyses of three state-based Injury Surveillance databases, identifying children and adolescents presenting to emergency departments between the years 2005 and 2017 who had sustained injuries during trampolining activity at an indoor trampoline park. Across the three datasets, 487 cases were identified. No cases were recorded prior to 2012, the year the first indoor trampoline park opened. At least half occurred among those aged 10-14 years. Males accounted for 58% of cases in Victoria, 52% in Queensland and 60% in Western Australia. Hospital admission rates in these states were 15%, 11.7% and 14.5%, respectively. The most frequent injury types were dislocations, sprains and strains, followed by fractures, with some head and spinal injuries. Across several states in Australia, the incidence of indoor trampoline park-related injuries is concerning, as these venues are increasing in number. Some injuries can be serious and result in lifelong disability for children or adolescents. Implications for public health: National safety standards that apply to indoor trampoline park operators are not currently mandatory; injury prevention efforts would be assisted if such standards were mandatory. © 2018 The Authors.
Human Factors Engineering as a System in the Vision for Exploration
NASA Technical Reports Server (NTRS)
Whitmore, Mihriban; Smith, Danielle; Holden, Kritina
2006-01-01
In order to accomplish NASA's Vision for Exploration, while assuring crew safety and productivity, human performance issues must be well integrated into system design from mission conception. To that end, a two-year Technology Development Project (TDP) was funded by NASA Headquarters to develop a systematic method for including the human as a system in NASA's Vision for Exploration. The specific goals of this project are to review current Human Systems Integration (HSI) standards (i.e., industry, military, NASA) and tailor them to selected NASA Exploration activities. Once the methods are proven in the selected domains, a plan will be developed to expand the effort to a wider scope of Exploration activities. The methods will be documented for inclusion in NASA-specific documents (such as the Human Systems Integration Standards, NASA-STD-3000) to be used in future space systems. The current project builds on a previous TDP dealing with Human Factors Engineering processes. That project identified the key phases of the current NASA design lifecycle, and outlined the recommended HFE activities that should be incorporated at each phase. The project also resulted in a prototype of a web-based HFE process tool that could be used to support an ideal HFE development process at NASA. This will help to augment the limited human factors resources available by providing a web-based tool that explains the importance of human factors, teaches a recommended process, and then provides the instructions, templates and examples to carry out the process steps. The HFE activities identified by the previous TDP are being tested in situ for the current effort through support to a specific NASA Exploration activity. Currently, HFE personnel are working with systems engineering personnel to identify HSI impacts for lunar exploration by facilitating the generation of system-level Concepts of Operations (ConOps). For example, medical operations scenarios have been generated for lunar habitation in order to identify HSI requirements for the lunar communications architecture. Throughout these ConOps exercises, HFE personnel are testing various tools and methodologies that have been identified in the literature. A key part of the effort is the identification of optimal processes, methods, and tools for these early development phase activities, such as ConOps, requirements development, and early conceptual design. An overview of the activities completed thus far, as well as the tools and methods investigated, will be presented.
Considerations for Improving the Capacity and Performance of AeroMACS
NASA Technical Reports Server (NTRS)
Kerczewski, Robert J.; Kamali, Behnam; Apaza, Rafael D.; Wilson, Jeffrey D.; Dimond, Robert P.
2014-01-01
The Aeronautical Mobile Airport Communications System (AeroMACS) has progressed from concept through prototype development, testing, and standards development and is now poised for the first operational deployments at nine US airports by the Federal Aviation Administration. These initial deployments will support fixed applications. Mobile applications providing connectivity to and from aircraft and ground-based vehicles on the airport surface will occur at some point in the future. Given that many fixed applications are possible for AeroMACS, it is necessary to now consider whether the existing capacity of AeroMACS will be reached even before the mobile applications are ready to be added, since AeroMACS is constrained by both available bandwidth and transmit power limitations. This paper describes some concepts that may be applied to improve the future capacity of AeroMACS, with a particular emphasis on gains that can be derived from the addition of IEEE 802.16j multihop relays to the AeroMACS standard, where a significant analysis effort has been undertaken.
Non-intubated uniportal left-lower lobe upper segmentectomy (S6)
Navarro-Martinez, Jose; Bolufer, Sergio; Sesma, Julio; Lirio, Francisco; Galiana, Maria; Rivera, Maria Jesus
2017-01-01
Widely accepted indications for anatomical segmentectomies worldwide are mainly early-stage primary adenocarcinomas, pulmonary metastases and benign conditions. Their performance through uniportal VATS has become more and more popular due to the lower invasiveness of the whole procedure under this approach. Recently, many efforts have focused on non-intubated, spontaneously breathing management of lobectomies and anatomical segmentectomies, although specific selection criteria and main advantages are not completely standardized. In a 62-year-old thin man with two residual pulmonary metastases from sigma adenocarcinoma, after chemotherapy plus antiangiogenic treatment, we indicated a single-incision video-assisted left-lower lobe (LLL) upper segmentectomy (S6) under spontaneous breathing and intercostal blockade. Total operation time was 240 minutes. The chest tube was removed at 24 hours and the patient was discharged on postoperative day 2 without any complication. Non-intubated uniportal VATS is a safe and reasonable approach for lung-sparing resections in selected patients, although more evidence is required for selecting which patients can benefit more compared with standard intubated procedures. PMID:29078611
Non-intubated uniportal left-lower lobe upper segmentectomy (S6).
Galvez, Carlos; Navarro-Martinez, Jose; Bolufer, Sergio; Sesma, Julio; Lirio, Francisco; Galiana, Maria; Rivera, Maria Jesus
2017-01-01
Widely accepted indications for anatomical segmentectomies worldwide are mainly early-stage primary adenocarcinomas, pulmonary metastases and benign conditions. Their performance through uniportal VATS has become more and more popular due to the lower invasiveness of the whole procedure under this approach. Recently, many efforts have focused on non-intubated, spontaneously breathing management of lobectomies and anatomical segmentectomies, although specific selection criteria and main advantages are not completely standardized. In a 62-year-old thin man with two residual pulmonary metastases from sigma adenocarcinoma, after chemotherapy plus antiangiogenic treatment, we indicated a single-incision video-assisted left-lower lobe (LLL) upper segmentectomy (S6) under spontaneous breathing and intercostal blockade. Total operation time was 240 minutes. The chest tube was removed at 24 hours and the patient was discharged on postoperative day 2 without any complication. Non-intubated uniportal VATS is a safe and reasonable approach for lung-sparing resections in selected patients, although more evidence is required for selecting which patients can benefit more compared with standard intubated procedures.
Quality Assurance Program Plan for SFR Metallic Fuel Data Qualification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benoit, Timothy; Hlotke, John Daniel; Yacout, Abdellatif
2017-07-05
This document contains an evaluation of the applicability of the current Quality Assurance Standards from the American Society of Mechanical Engineers Standard NQA-1 (NQA-1) criteria and identifies and describes the quality assurance process(es) by which attributes of historical, analytical, and other data associated with sodium-cooled fast reactor [SFR] metallic fuel and/or related reactor fuel designs and constituency will be evaluated. This process is being instituted to facilitate validation of data to the extent that such data may be used to support future licensing efforts associated with advanced reactor designs. The initial data to be evaluated under this program were generated during the US Integral Fast Reactor program between 1984 and 1994, where the data include, but are not limited to, research and development data and associated documents, test plans and associated protocols, operations and test data, technical reports, and information associated with past United States Nuclear Regulatory Commission reviews of SFR designs.
Operational experience with DICOM for the clinical specialties in the healthcare enterprise
NASA Astrophysics Data System (ADS)
Kuzmak, Peter M.; Dayhoff, Ruth E.
2004-04-01
A number of clinical specialties routinely use images in treating patients, for example ophthalmology, dentistry, cardiology, endoscopy, and surgery. These images are captured by a variety of commercial digital image acquisition systems. The US Department of Veterans Affairs has been working for several years on advancing the use of the Digital Imaging and Communications in Medicine (DICOM) Standard in these clinical specialties. This is an effort that has involved several facets: (1) working with the vendors to ensure that they satisfy existing DICOM requirements, (2) developing interface software to the VistA hospital information system (HIS), (3) field testing DICOM systems, (4) deploying these DICOM interfaces nation-wide to all VA medical centers, (5) working with the healthcare providers using the system, and (6) participating in the DICOM working groups to improve the standard. The VA is now beginning to develop clinical applications that make use of the DICOM interfaces in the clinical specialties. The first of these will be in ophthalmology to remotely screen patients for diabetic retinopathy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-12-01
Accurate measurements of radioactivity in soils contaminated with Strontium-90 (Sr-90) or Uranium-238 (U-238) are essential for many DOE site remediation programs. These crucial measurements determine if excavation and soil removal is necessary, where remediation efforts should be focused, and/or if a site has reached closure. Measuring soil contamination by standard EPA laboratory methods typically takes a week (accelerated analytical test turnaround) or a month (standard analytical test turnaround). The time delay extends to operations involving heavy excavation equipment and associated personnel, which are the main costs of remediation. This report describes an application of the BetaScint(TM) fiber-optic sensor that measures Sr-90 or U-238 contamination in soil samples on site in about 20 minutes, at a much lower cost than time-consuming laboratory methods, to greatly facilitate remediation. This report describes the technology, its performance, its uses, cost, regulatory and policy issues, and lessons learned.
Image manipulation software portable on different hardware platforms: what is the cost?
NASA Astrophysics Data System (ADS)
Ligier, Yves; Ratib, Osman M.; Funk, Matthieu; Perrier, Rene; Girard, Christian; Logean, Marianne
1992-07-01
A hospital-wide PACS project is currently under development at the University Hospital of Geneva. The visualization and manipulation of images provided by different imaging modalities constitutes one of the most challenging components of a PACS. Because there are different requirements depending on the clinical usage, it was necessary for such visualization software to be provided on different types of workstations in different sectors of the PACS. The user interface has to be the same independently of the underlying workstation. Besides a standard set of image manipulation and processing tools, there is a need for more specific clinical tools that should be easily adapted to specific medical requirements. To achieve this, the software has to run under different operating and windowing systems, the standard Unix/X-11/OSF-Motif based workstations and the Macintosh family, and should be easily portable to other systems. This paper describes the design of such a system and discusses the extra cost and efforts involved in the development of portable and easily expandable software.
NASA Astrophysics Data System (ADS)
Matahari, Rho Natta; Putra, Nandy; Ariantara, Bambang; Amin, Muhammad; Prawiro, Erwin
2017-02-01
The high number of preterm births is one of the issues in improving health standards. The effort to help premature babies is hampered by the high cost of NICU care in hospitals. In addition, the uneven distribution of electricity to remote areas makes it hard to operate incubators. Using the phase change material beeswax as the heating element of a non-electric incubator becomes an alternative option to save premature babies. The objective of this experiment was to investigate the most efficient mass of beeswax, according to the Indonesian National Standard, for achieving long running time and an ideal incubator temperature. The experiment was performed using a prototype incubator, which utilizes natural convection in the heating process. Fins are used to accelerate heat distribution in the incubator. The results showed that the most efficient mass of PCM is 3 kg, which provides 2.45 hours of running time while maintaining the incubator temperature in the range of 32-36 °C.
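To put the reported figures in rough perspective, one can estimate the energy 3 kg of beeswax releases while solidifying and the average heating power that implies over the 2.45-hour running time. This is a back-of-envelope sketch only; the latent-heat value is an assumed, illustrative number, not one taken from the study.

```python
# Back-of-envelope estimate with illustrative values only.
mass_kg = 3.0                  # most efficient PCM mass reported in the study
latent_heat_kj_per_kg = 150.0  # ASSUMED latent heat of fusion for beeswax
running_time_h = 2.45          # reported time within the 32-36 degC band

energy_kj = mass_kg * latent_heat_kj_per_kg              # ~450 kJ stored
avg_power_w = energy_kj * 1000 / (running_time_h * 3600)
print(f"{energy_kj:.0f} kJ stored, ~{avg_power_w:.0f} W average release")
```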
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-10
... to accounting standards in these rules to the Financial Accounting Standards Board Accounting... Financial Accounting Standards Board Accounting Standards Codification. In accordance with 12 U.S.C. 2252..., Loan Policies and Operations, and Funding Operations; Accounting and Reporting Requirements; Federal...
40 CFR 267.151 - Wording of the instruments.
Code of Federal Regulations, 2013 CFR
2013-07-01
... owner or operator of a facility with a standardized permit who uses a financial test to demonstrate... financial officer of an owner or operator of a facility with a standardized permit who use a financial test... (CONTINUED) STANDARDS FOR OWNERS AND OPERATORS OF HAZARDOUS WASTE FACILITIES OPERATING UNDER A STANDARDIZED...
40 CFR 267.151 - Wording of the instruments.
Code of Federal Regulations, 2011 CFR
2011-07-01
... owner or operator of a facility with a standardized permit who uses a financial test to demonstrate... financial officer of an owner or operator of a facility with a standardized permit who use a financial test... (CONTINUED) STANDARDS FOR OWNERS AND OPERATORS OF HAZARDOUS WASTE FACILITIES OPERATING UNDER A STANDARDIZED...
40 CFR 267.151 - Wording of the instruments.
Code of Federal Regulations, 2012 CFR
2012-07-01
... owner or operator of a facility with a standardized permit who uses a financial test to demonstrate... financial officer of an owner or operator of a facility with a standardized permit who use a financial test... (CONTINUED) STANDARDS FOR OWNERS AND OPERATORS OF HAZARDOUS WASTE FACILITIES OPERATING UNDER A STANDARDIZED...
40 CFR 267.151 - Wording of the instruments.
Code of Federal Regulations, 2014 CFR
2014-07-01
... owner or operator of a facility with a standardized permit who uses a financial test to demonstrate... financial officer of an owner or operator of a facility with a standardized permit who use a financial test... (CONTINUED) STANDARDS FOR OWNERS AND OPERATORS OF HAZARDOUS WASTE FACILITIES OPERATING UNDER A STANDARDIZED...
NASA Technical Reports Server (NTRS)
Iannicca, Dennis C.; McKim, James H.; Stewart, David H.; Thadhani, Suresh K.; Young, Daniel P.
2015-01-01
NASA Glenn Research Center, in cooperation with Rockwell Collins, is working to develop a prototype Control and Non-Payload Communications (CNPC) radio platform as part of NASA Integrated Systems Research Program's (ISRP) Unmanned Aircraft Systems (UAS) Integration in the National Airspace System (NAS) project. A primary focus of the project is to work with the FAA and industry standards bodies to build and demonstrate a safe, secure, and efficient CNPC architecture that can be used by industry to evaluate the feasibility of deploying a system using these technologies in an operational capacity. GRC has been working in conjunction with these groups to assess threats, identify security requirements, and to develop a system of standards-based security controls that can be applied to the current GRC prototype CNPC architecture as a demonstration platform. The security controls were integrated into a lab test bed mock-up of the Mobile IPv6 architecture currently being used for NASA flight testing, and a series of network tests were conducted to evaluate the security overhead of the controls compared to the baseline CNPC link without any security. The aim of testing was to evaluate the performance impact of the additional security control overhead when added to the Mobile IPv6 architecture in various modes of operation. The statistics collected included packet captures at points along the path to gauge packet size as the sample data traversed the CNPC network, round trip latency, jitter, and throughput. The effort involved a series of tests of the baseline link, a link with Robust Header Compression (ROHC) and without security controls, a link with security controls and without ROHC, and finally a link with both ROHC and security controls enabled. The effort demonstrated that ROHC is both desirable and necessary to offset the additional expected overhead of applying security controls to the CNPC link.
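As a generic illustration of how round-trip latency and jitter statistics of the kind collected in these tests can be gathered, the sketch below sends UDP probes to an echo endpoint and summarizes the results. It is not the project's actual test harness; the endpoint and probe parameters are hypothetical.

```python
import socket
import statistics
import time

def probe(host, port, count=20, payload=b"x" * 64, timeout=2.0):
    """Send small UDP probes to an echo service and report RTT statistics.
    Assumes something at host:port echoes datagrams back (e.g., a test rig)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    rtts = []
    for _ in range(count):
        start = time.perf_counter()
        sock.sendto(payload, (host, port))
        try:
            sock.recvfrom(4096)
        except socket.timeout:
            continue                     # count as a drop, not an RTT sample
        rtts.append((time.perf_counter() - start) * 1000.0)  # milliseconds
    sock.close()
    if not rtts:
        return {"loss": 1.0}
    return {"min_ms": min(rtts),
            "mean_ms": statistics.mean(rtts),
            "jitter_ms": statistics.pstdev(rtts),
            "loss": 1 - len(rtts) / count}

# Example against a hypothetical echo endpoint:
# print(probe("192.0.2.10", 7))
```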
Quality Indicators in Radiation Oncology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Albert, Jeffrey M.; Das, Prajnan, E-mail: prajdas@mdanderson.org
Oncologic specialty societies and multidisciplinary collaborative groups have dedicated considerable effort to developing evidence-based quality indicators (QIs) to facilitate quality improvement, accreditation, benchmarking, reimbursement, maintenance of certification, and regulatory reporting. In particular, the field of radiation oncology has a long history of organized quality assessment efforts and continues to work toward developing consensus quality standards in the face of continually evolving technologies and standards of care. This report provides a comprehensive review of the current state of quality assessment in radiation oncology. Specifically, this report highlights implications of the healthcare quality movement for radiation oncology and reviews existing efforts to define and measure quality in the field, with a focus on dimensions of quality specific to radiation oncology within the “big picture” of oncologic quality assessment efforts.
NASA Technical Reports Server (NTRS)
Wickens, Christopher; Sebok, Angelia; Keller, John; Peters, Steve; Small, Ronald; Hutchins, Shaun; Algarin, Liana; Gore, Brian Francis; Hooey, Becky Lee; Foyle, David C.
2013-01-01
NextGen operations are associated with a variety of changes to the national airspace system (NAS) including changes to the allocation of roles and responsibilities among operators and automation, the use of new technologies and automation, additional information presented on the flight deck, and the entire concept of operations (ConOps). In the transition to NextGen airspace, aviation and air operations designers need to consider the implications of design or system changes on human performance and the potential for error. To ensure continued safety of the NAS, it will be necessary for researchers to evaluate design concepts and potential NextGen scenarios well before implementation. One approach for such evaluations is through human performance modeling. Human performance models (HPMs) provide effective tools for predicting and evaluating operator performance in systems. HPMs offer significant advantages over empirical, human-in-the-loop testing in that (1) they allow detailed analyses of systems that have not yet been built, (2) they offer great flexibility for extensive data collection, (3) they do not require experimental participants, and thus can offer cost and time savings. HPMs differ in their ability to predict performance and safety with NextGen procedures, equipment and ConOps. Models also vary in terms of how they approach human performance (e.g., some focus on cognitive processing, others focus on discrete tasks performed by a human, while others consider perceptual processes), and in terms of their associated validation efforts. The objectives of this research effort were to support the Federal Aviation Administration (FAA) in identifying HPMs that are appropriate for predicting pilot performance in NextGen operations, to provide guidance on how to evaluate the quality of different models, and to identify gaps in pilot performance modeling research, that could guide future research opportunities. This research effort is intended to help the FAA evaluate pilot modeling efforts and select the appropriate tools for future modeling efforts to predict pilot performance in NextGen operations.
International Safety Regulation and Standards for Space Travel and Commerce
NASA Astrophysics Data System (ADS)
Pelton, J. N.; Jakhu, R.
The evolution of air travel has led to the adoption of the 1944 Chicago Convention that created the International Civil Aviation Organization (ICAO), headquartered in Montreal, Canada, and the propagation of aviation safety standards. Today, ICAO standardizes and harmonizes commercial air safety worldwide. Space travel and space safety are still at an early stage of development, and the adoption of international space safety standards and regulation still remains largely at the national level. This paper explores the international treaties and conventions that govern space travel, applications and exploration today and analyzes current efforts to create space safety standards and regulations at the national, regional and global level. Recent efforts to create a commercial space travel industry and to license commercial space ports are foreseen as means to hasten a space safety regulatory process.
The development of STS payload environmental engineering standards
NASA Technical Reports Server (NTRS)
Bangs, W. F.
1982-01-01
The presently reported effort to provide a single set of standards for the design, analysis and testing of Space Transportation System (STS) payloads throughout the NASA organization must be viewed as essentially experimental, since the concept of incorporating the diverse opinions and experiences of several separate field research centers may in retrospect be judged too ambitious or perhaps even naive. While each STS payload may have unique characteristics, and the project should formulate its own criteria for environmental design, testing and evaluation, a reference source document providing coordinated standards is expected to minimize the duplication of effort and limit random divergence of practices among the various NASA payload programs. These standards would provide useful information to all potential STS users, and offer a degree of standardization to STS users outside the NASA organization.
Digital optical tape: Technology and standardization issues
NASA Technical Reports Server (NTRS)
Podio, Fernando L.
1996-01-01
During the coming years, digital data storage technologies will continue their aggressive growth to satisfy the user's need for higher storage capacities, higher data transfer rates and long-term archival media properties. Digital optical tape is a promising technology to satisfy these users' needs. As with any emerging data storage technology, the industry faces many technological and standardization challenges. The technological challenges are great, but feasible to overcome. Although it is too early to consider formal industry standards, the optical tape industry has decided to work together by initiating prestandardization efforts that may lead in the future to formal voluntary industry standards. This paper will discuss current industry optical tape drive developments and the types of standards that will be required for the technology. The status of current industry prestandardization efforts will also be discussed.
Jack, Clifford R; Barkhof, Frederik; Bernstein, Matt A; Cantillon, Marc; Cole, Patricia E; DeCarli, Charles; Dubois, Bruno; Duchesne, Simon; Fox, Nick C; Frisoni, Giovanni B; Hampel, Harald; Hill, Derek LG; Johnson, Keith; Mangin, Jean-François; Scheltens, Philip; Schwarz, Adam J; Sperling, Reisa; Suhy, Joyce; Thompson, Paul M; Weiner, Michael; Foster, Norman L
2012-01-01
Background: The promise of Alzheimer’s disease (AD) biomarkers has led to their incorporation in new diagnostic criteria and in therapeutic trials; however, significant barriers exist to widespread use. Chief among these is the lack of internationally accepted standards for quantitative metrics. Hippocampal volumetry is the most widely studied quantitative magnetic resonance imaging (MRI) measure in AD and thus represents the most rational target for an initial effort at standardization. Methods and Results: The authors of this position paper propose a path toward this goal. The steps include: 1) Establish and empower an oversight board to manage and assess the effort, 2) Adopt the standardized definition of anatomic hippocampal boundaries on MRI arising from the EADC-ADNI hippocampal harmonization effort as a Reference Standard, 3) Establish a scientifically appropriate, publicly available Reference Standard Dataset based on manual delineation of the hippocampus in an appropriate sample of subjects (ADNI), and 4) Define minimum technical and prognostic performance metrics for validation of new measurement techniques using the Reference Standard Dataset as a benchmark. Conclusions: Although manual delineation of the hippocampus is the best available reference standard, practical application of hippocampal volumetry will require automated methods. Our intent is to establish a mechanism for credentialing automated software applications to achieve internationally recognized accuracy and prognostic performance standards that lead to the systematic evaluation and then widespread acceptance and use of hippocampal volumetry. The standardization and assay validation process outlined for hippocampal volumetry is envisioned as a template that could be applied to other imaging biomarkers. PMID:21784356
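Step 4 of the proposed path, validating new measurement techniques against the Reference Standard Dataset, implies agreed performance metrics. A minimal sketch of two commonly used candidates (Dice overlap and signed volume error) follows; the array-based interface and voxel-volume parameter are assumptions made for the example, and the position paper itself does not prescribe specific formulas.

```python
import numpy as np

def dice_coefficient(auto_mask: np.ndarray, ref_mask: np.ndarray) -> float:
    """Spatial overlap between an automated segmentation and the manual reference."""
    auto_mask = auto_mask.astype(bool)
    ref_mask = ref_mask.astype(bool)
    intersection = np.logical_and(auto_mask, ref_mask).sum()
    return 2.0 * intersection / (auto_mask.sum() + ref_mask.sum())

def volume_error_percent(auto_mask, ref_mask, voxel_volume_mm3: float) -> float:
    """Signed hippocampal volume error of the automated method, in percent."""
    v_auto = auto_mask.astype(bool).sum() * voxel_volume_mm3
    v_ref = ref_mask.astype(bool).sum() * voxel_volume_mm3
    return 100.0 * (v_auto - v_ref) / v_ref

# A credentialing harness would aggregate these per-subject scores across the
# Reference Standard Dataset and compare them against agreed thresholds.
```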
NASA Technical Reports Server (NTRS)
Beisert, Susan; Rodriggs, Michael; Moreno, Francisco; Korth, David; Gibson, Stephen; Lee, Young H.; Eagles, Donald E.
2013-01-01
Now that major assembly of the International Space Station (ISS) is complete, NASA's focus has turned to using this high fidelity in-space research testbed to not only advance fundamental science research, but also demonstrate and mature technologies and develop operational concepts that will enable future human exploration missions beyond low Earth orbit. The ISS as a Testbed for Analog Research (ISTAR) project was established to reduce risks for manned missions to exploration destinations by utilizing ISS as a high fidelity micro-g laboratory to demonstrate technologies, operations concepts, and techniques associated with crew autonomous operations. One of these focus areas is the development and execution of ISS Testbed for Analog Research (ISTAR) autonomous flight crew procedures intended to increase crew autonomy that will be required for long duration human exploration missions. Due to increasing communications delays and reduced logistics resupply, autonomous procedures are expected to help reduce crew reliance on the ground flight control team, increase crew performance, and enable the crew to become more subject-matter experts on both the exploration space vehicle systems and the scientific investigation operations that will be conducted on a long duration human space exploration mission. These tests make use of previous or ongoing projects tested in ground analogs such as Research and Technology Studies (RATS) and NASA Extreme Environment Mission Operations (NEEMO). Since the latter half of 2012, selected non-critical ISS systems crew procedures have been used to develop techniques for building ISTAR autonomous procedures, and ISS flight crews have successfully executed them without flight controller involvement. Although the main focus has been preparing for exploration, the ISS has been a beneficiary of this synergistic effort and is considering modifying additional standard ISS procedures that may increase crew efficiency, reduce operational costs, and raise the amount of crew time available for scientific research. The next phase of autonomous procedure development is expected to include payload science and human research investigations. Additionally, ISS International Partners have expressed interest in participating in this effort. The recently approved one-year crew expedition starting in 2015, consisting of one Russian and one U.S. Operating Segment (USOS) crewmember, will be used not only for long duration human research investigations but also for the testing of exploration operations concepts, including crew autonomy.
Fire service and first responder thermal imaging camera (TIC) advances and standards
NASA Astrophysics Data System (ADS)
Konsin, Lawrence S.; Nixdorff, Stuart
2007-04-01
Fire Service and First Responder Thermal Imaging Camera (TIC) applications are growing, saving lives and preventing injury and property damage. Firefighters face a wide range of serious hazards. TICs help mitigate the risks by protecting Firefighters and preventing injury, while reducing time spent fighting the fire and resources needed to do so. Most fire safety equipment is covered by performance standards. Fire TICs, however, are not covered by such standards and are also subject to inadequate operational performance and insufficient user training. Meanwhile, advancements in Fire TICs and lower costs are driving product demand. The need for a Fire TIC Standard was spurred in late 2004 through a Government sponsored Workshop where experts from the First Responder community, component manufacturers, firefighter training, and those doing research on TICs discussed strategies, technologies, procedures, best practices and R&D that could improve Fire TICs. The workshop identified pressing image quality, performance metrics, and standards issues. Durability and ruggedness metrics and standard testing methods were also seen as important, as was TIC training and certification of end-users. A progress report on several efforts in these areas and their impact on the IR sensor industry will be given. This paper is a follow up to the SPIE Orlando 2004 paper on Fire TIC usage (entitled Emergency Responders' Critical Infrared) which explored the technological development of this IR industry segment from the viewpoint of the end user, in light of the studies and reports that had established TICs as a mission critical tool for firefighters.
Scenario Development Process at the Vertical Motion Simulator
NASA Technical Reports Server (NTRS)
Reardon, Scott E.; Beard, Steven D.; Lewis, Emily
2017-01-01
There has been a significant effort within the simulation community to standardize many aspects of flight simulation. More recently, an effort has begun to develop a formal scenario definition language for aviation. A working group within the AIAA Modeling and Simulation Technical Committee has been created to develop a standard aviation scenario definition language, though much of the initial effort has been tailored to training simulators. Research and development (R&D) simulators, like the Vertical Motion Simulator (VMS), and training simulators have different missions and thus have different scenario requirements. The purpose of this paper is to highlight some of the unique tasks and scenario elements used at the VMS so they may be captured by scenario standardization efforts. The VMS most often performs handling qualities studies and transfer of training studies. Three representative handling qualities simulation studies and two transfer of training simulation studies are described in this paper. Unique scenario elements discussed in this paper included special out-the-window (OTW) targets and environmental conditions, motion system parameters, active inceptor parameters, and configurable vehicle math model parameters.
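To make the notion of R&D-specific scenario elements concrete, the following hypothetical description groups the element classes named above into a single structured record. None of the field names come from the draft AIAA scenario definition language; they are placeholders for illustration.

```python
# Hypothetical scenario description capturing the R&D-specific elements named
# above; every field name here is invented for illustration only.
vms_scenario = {
    "task": "precision_hover_handling_qualities",
    "out_the_window": {
        "targets": ["hover_board", "landing_pad_markings"],
        "environment": {"visibility_m": 8000, "wind_kts": 15, "turbulence": "moderate"},
    },
    "motion_system": {"translational_gain": 0.7, "washout_filter": "classical"},
    "active_inceptor": {"force_gradient_lb_per_in": 1.5, "breakout_lb": 0.5},
    "vehicle_math_model": {"type": "generic_rotorcraft", "gross_weight_lb": 16000},
}

def validate_scenario(scenario: dict) -> None:
    """Check that every element class an R&D study typically needs is present."""
    required = {"task", "out_the_window", "motion_system",
                "active_inceptor", "vehicle_math_model"}
    missing = required - scenario.keys()
    if missing:
        raise ValueError(f"scenario is missing elements: {sorted(missing)}")

validate_scenario(vms_scenario)
```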
Best, Rebecca R; Harris, Benjamin H L; Walsh, Jason L; Manfield, Timothy
2017-05-08
Drowning is one of the leading causes of death in children. Resuscitating a child following submersion is a high-pressure situation, and standard operating procedures can reduce error. Currently, the Resuscitation Council UK guidance does not include a standard operating procedure on pediatric drowning. The objective of this project was to design a standard operating procedure to improve outcomes of drowned children. A literature review on the management of pediatric drowning was conducted. Relevant publications were used to develop a standard operating procedure for management of pediatric drowning. A concise standard operating procedure was developed for resuscitation following pediatric submersion. Specific recommendations include the following: the Heimlich maneuver should not be used in this context; however, prolonged resuscitation and therapeutic hypothermia are recommended. This standard operating procedure is a potentially useful adjunct to the Resuscitation Council UK guidance and should be considered for incorporation into its next iteration.
Global strategies for cervical cancer prevention.
Pimple, Sharmila; Mishra, Gauravi; Shastri, Surendra
2016-02-01
Cervical cancer still remains the fourth most common cancer, affecting women worldwide with large geographic variations in cervical cancer incidence and mortality rates. There exist vast disparities in cervix cancer control and prevention efforts globally. The present review addresses the current developments in cervical cancer prevention and control across both high-income countries and low-middle income countries and attempts to identify new strategies that might help address the gaps in cervical cancer care disparities globally. Paradigms for cervix cancer screening are changing in high-resource settings from cytology-based screening to adoption of molecular screening and cotesting to achieve program effectiveness. Low-middle income countries with larger burden of cervical cancer continue to face financial and logistic limitations to make both cervix cancer screening and human papillomavirus vaccine available to their populations. Alternative low-cost screening technologies, operationally feasible implementation strategies, reduction of cost of procurement and delivery approaches for human papillomavirus vaccine need assessment to decrease cancer care disparities. Efforts directed toward cervix cancer prevention and early detection for improvements in cervical cancer outcomes of incidence and mortality have to be proportionately matched by access to acceptable standards of cancer care.
The Human Phenotype Ontology in 2017
Köhler, Sebastian; Vasilevsky, Nicole A.; Engelstad, Mark; Foster, Erin; McMurry, Julie; Aymé, Ségolène; Baynam, Gareth; Bello, Susan M.; Boerkoel, Cornelius F.; Boycott, Kym M.; Brudno, Michael; Buske, Orion J.; Chinnery, Patrick F.; Cipriani, Valentina; Connell, Laureen E.; Dawkins, Hugh J.S.; DeMare, Laura E.; Devereau, Andrew D.; de Vries, Bert B.A.; Firth, Helen V.; Freson, Kathleen; Greene, Daniel; Hamosh, Ada; Helbig, Ingo; Hum, Courtney; Jähn, Johanna A.; James, Roger; Krause, Roland; F. Laulederkind, Stanley J.; Lochmüller, Hanns; Lyon, Gholson J.; Ogishima, Soichi; Olry, Annie; Ouwehand, Willem H.; Pontikos, Nikolas; Rath, Ana; Schaefer, Franz; Scott, Richard H.; Segal, Michael; Sergouniotis, Panagiotis I.; Sever, Richard; Smith, Cynthia L.; Straub, Volker; Thompson, Rachel; Turner, Catherine; Turro, Ernest; Veltman, Marijcke W.M.; Vulliamy, Tom; Yu, Jing; von Ziegenweidt, Julie; Zankl, Andreas; Züchner, Stephan; Zemojtel, Tomasz; Jacobsen, Julius O.B.; Groza, Tudor; Smedley, Damian; Mungall, Christopher J.; Haendel, Melissa; Robinson, Peter N.
2017-01-01
Deep phenotyping has been defined as the precise and comprehensive analysis of phenotypic abnormalities in which the individual components of the phenotype are observed and described. The three components of the Human Phenotype Ontology (HPO; www.human-phenotype-ontology.org) project are the phenotype vocabulary, disease-phenotype annotations and the algorithms that operate on these. These components are being used for computational deep phenotyping and precision medicine as well as integration of clinical data into translational research. The HPO is being increasingly adopted as a standard for phenotypic abnormalities by diverse groups such as international rare disease organizations, registries, clinical labs, biomedical resources, and clinical software tools and will thereby contribute toward nascent efforts at global data exchange for identifying disease etiologies. This update article reviews the progress of the HPO project since the debut Nucleic Acids Research database article in 2014, including specific areas of expansion such as common (complex) disease, new algorithms for phenotype driven genomic discovery and diagnostics, integration of cross-species mapping efforts with the Mammalian Phenotype Ontology, an improved quality control pipeline, and the addition of patient-friendly terminology. PMID:27899602
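As a small illustration of the "algorithms that operate on these" components, the sketch below parses is_a relationships from an hp.obo release using only the standard library and collects a term's ancestors, the basic operation behind most phenotype-similarity methods. The file path and example term are assumptions; production tools would normally use dedicated ontology libraries rather than this minimal reader.

```python
from collections import defaultdict

def parse_obo_is_a(path: str) -> dict:
    """Very small OBO reader: map each HPO term id to its direct is_a parents."""
    parents = defaultdict(list)
    current = None
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            line = line.strip()
            if line == "[Term]":
                current = None
            elif line.startswith("id: HP:"):
                current = line[4:]
            elif line.startswith("is_a:") and current:
                parents[current].append(line.split()[1])  # "is_a: HP:0000118 ! name"
    return parents

def ancestors(term: str, parents: dict) -> set:
    """All superclasses of a phenotype term, following is_a edges transitively."""
    seen, stack = set(), [term]
    while stack:
        for p in parents.get(stack.pop(), []):
            if p not in seen:
                seen.add(p)
                stack.append(p)
    return seen

# Example usage (path and term are hypothetical inputs):
# parents = parse_obo_is_a("hp.obo")
# print(ancestors("HP:0001250", parents))  # a term's ancestor set supports similarity scoring
```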
The Human Phenotype Ontology in 2017
Köhler, Sebastian; Vasilevsky, Nicole A.; Engelstad, Mark; ...
2016-11-24
Deep phenotyping has been defined as the precise and comprehensive analysis of phenotypic abnormalities in which the individual components of the phenotype are observed and described. The three components of the Human Phenotype Ontology (HPO; www.human-phenotype-ontology.org) project are the phenotype vocabulary, disease-phenotype annotations and the algorithms that operate on these. These components are being used for computational deep phenotyping and precision medicine as well as integration of clinical data into translational research. The HPO is being increasingly adopted as a standard for phenotypic abnormalities by diverse groups such as international rare disease organizations, registries, clinical labs, biomedical resources, and clinical software tools and will thereby contribute toward nascent efforts at global data exchange for identifying disease etiologies. This update article reviews the progress of the HPO project since the debut Nucleic Acids Research database article in 2014, including specific areas of expansion such as common (complex) disease, new algorithms for phenotype driven genomic discovery and diagnostics, integration of cross-species mapping efforts with the Mammalian Phenotype Ontology, an improved quality control pipeline, and the addition of patient-friendly terminology.
Minimum impulse thruster valve design and development
NASA Technical Reports Server (NTRS)
Huftalen, Richard L.; Platt, Andrea L.; Parker, Morgan J.; Yankura, George A.
2003-01-01
The design and development of a minimum impulse thruster valve was conducted by Moog, under contract to NASA's Jet Propulsion Laboratory, California Institute of Technology, for deep space propulsion systems. The effort was focused on applying known solenoid design techniques scaled to provide a 1-millisecond response capability for monopropellant, hydrazine ACS thruster applications. The valve has an extended operating temperature range of 20°F to +350°F with a total mass of less than 25 grams and nominal power draw of 7 watts. The design solution resulted in providing a solenoid valve that is one-tenth the scale of the standard product line. The valve has the capability of providing a mass flow rate of 0.0009 pounds per second of hydrazine. The design life of 1,000,000 cycles was demonstrated both dry and wet. Not all design factors scaled as expected and proved to be the focus of the final development effort. These included the surface interactions, hydrodynamics and driver electronics. The resulting solution applied matured design approaches to minimize the program risk with innovative methods to address the impacts of scale.
NASA Battery Working Group - 2007-2008: Battery Task Summary Report
NASA Technical Reports Server (NTRS)
Manzo, Michelle
2008-01-01
This presentation provides a summary of the 2007-2008 NASA Battery Working Group efforts completed in support of the NASA Engineering Safety Center (NESC). The effort covered a series of pro-active tasks that address the following: Binding Procurements -- guidelines related to requirements for the battery system that should be considered at the time of contract award; Wet Life of Ni-H2 Batteries -- issues/strategies for effective storage and impact of long-term storage on performance and life; Generic Guidelines for Lithium-ion Safety, Handling and Qualification -- standardized approaches developed and risk assessments; Lithium-ion Performance Assessment -- survey of manufacturers and capabilities to meet mission needs, with a guidelines document generated; Conditions Required for Using Pouch Cells in Aerospace Missions -- focus on corrosion, thermal excursions and long-term performance issues, with a document defining requirements to maintain performance and life; High Voltage Risk Assessment -- focus on safety and abuse tolerance of battery module assemblies, with recommendations of features required for safe implementation; and Procedure for Determination of Safe Charge Rates -- evaluation of various cell chemistries and recommendation of safe operating regimes for specific cell designs.
Community for Data Integration 2014 annual report
Langseth, Madison L.; Chang, Michelle Y.; Carlino, Jennifer; Birch, Daniella D.; Bradley, Joshua; Bristol, R. Sky; Conzelmann, Craig; Diehl, Robert H.; Earle, Paul S.; Ellison, Laura E.; Everette, Anthony L.; Fuller, Pamela L.; Gordon, Janice M.; Govoni, David L.; Guy, Michelle R.; Henkel, Heather S.; Hutchison, Vivian B.; Kern, Tim; Lightsom, Frances L.; Long, Joseph W.; Longhenry, Ryan; Preston, Todd M.; Smith, Stan W.; Viger, Roland J.; Wesenberg, Katherine; Wood, Eric C.
2015-10-02
To achieve these goals, the CDI operates within four applied areas: monthly forums, annual workshop/webinar series, working groups, and projects. The monthly forums, also known as the Opportunity/Challenge of the Month, provide an open dialogue to share and learn about data integration efforts or to present problems that invite the community to offer solutions, advice, and support. Since 2010, the CDI has also sponsored annual workshops/webinar series to encourage the exchange of ideas, sharing of activities, presentations of current projects, and networking among members. Stemming from common interests, the working groups are focused on efforts to address data management and technical challenges including the development of standards and tools, improving interoperability and information infrastructure, and data preservation within USGS and its partners. The growing support for the activities of the working groups led to the CDI’s first formal request for proposals (RFP) process in 2013 to fund projects that produced tangible products. As of 2014, the CDI continues to hold an annual RFP that creates data management tools and practices, collaboration tools, and training in support of data integration and delivery.
DOT National Transportation Integrated Search
1999-08-01
This study examines certain airport design standards in an effort to understand the rationale behind their development. Researchers studied the standards to identify potential standards for relaxing. The focus is on smaller, less active airports wher...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-01
...] Standard on Commercial Diving Operations; Extension of the Office of Management and Budget's (OMB) Approval... Commercial Diving Operations Standard (29 CFR part 1910, subpart T). DATES: Comments must be submitted... existing Standard on Commercial Diving Operations (29 CFR part 1910, Subpart [[Page 67481
Terry Turbopump Analytical Modeling Efforts in Fiscal Year 2016 - Progress Report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Osborn, Douglas; Ross, Kyle; Cardoni, Jeffrey N
This document details the Fiscal Year 2016 modeling efforts to define the true operating limitations (margins) of the Terry turbopump systems used in the nuclear industry for Milestone 3 (full-scale component experiments) and Milestone 4 (Terry turbopump basic science experiments) experiments. The overall multinational-sponsored program creates the technical basis to: (1) reduce and defer additional utility costs, (2) simplify plant operations, and (3) provide a better understanding of the true margin which could reduce overall risk of operations.
Human factors in waste management - potential and reality
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thompson, J.S.
There is enormous potential for human factors contributions in the realm of waste management. The reality, however, is very different from the potential. This is particularly true for low-level and low-level mixed-waste management. The hazards are less severe; therefore, health and safety requirements (including human factors) are not as rigorous as for high-level waste. High-level waste management presents its own unique challenges and opportunities. Waste management is strongly driven by regulatory compliance. When regulations are flexible and open to interpretation and the environment is driven so strongly by regulatory compliance, standard practice is to drop "nice to have" features, like a human factors program, to save money for complying with other requirements. The challenge is to convince decision makers that human factors can help make operations efficient and cost-effective, as well as improving safety and complying with regulations. A human factors program should not be viewed as competing with compliance efforts; in fact, it should complement them and provide additional cost-effective means of achieving compliance with other regulations. Achieving this synergy of human factors with ongoing waste management operations requires educating program and facility managers and other technical specialists about human factors and demonstrating its value "through the back door" on existing efforts. This paper describes ongoing projects at Los Alamos National Laboratory (LANL) in support of their waste management groups. It includes lessons learned from hazard and risk analyses, safety analysis reports, job and task analyses, operating procedure development, personnel qualification/certification program development, and facility- and job-specific training program and course development.
Advanced Steels for Accident Tolerant Fuel Cladding in Current Light Water Reactors
NASA Astrophysics Data System (ADS)
Rebak, Raul B.
After the March 2011 Fukushima events, the U.S. Congress directed the Department of Energy (DOE) to focus efforts on the development of fuel cladding materials with enhanced accident tolerance. In comparison with the standard UO2-Zirconium based system, the new fuels need to tolerate loss of active cooling in the core for a considerably longer time period while maintaining or improving the fuel performance during normal operation conditions. Advanced steels such as iron-chromium-aluminum (FeCrAl) alloys are being investigated for degradation behavior both under normal operation conditions in high temperature water (e.g. 288°C) and under accident conditions for reaction with steam up to 1400°C. Commercial and experimental alloys were tested for several periods of time in 100% superheated steam from 800°C to 1475°C. Results show that FeCrAl alloys significantly outperform the current zirconium alloys in resistance to steam.
Science guides search and rescue after the 2006 Philippine landslide.
Lagmay, Alfredo Mahar A; Tengonciang, Arlene Mae P; Rodolfo, Raymond S; Soria, Janneli Lea A; Baliatan, Eden G; Paguican, Engielle R; Ong, John Burtkenley T; Lapus, Mark R; Fernandez, Dan Ferdinand D; Quimba, Zareth P; Uichanco, Christopher L
2008-09-01
A rockslide-debris avalanche destroyed the remote village of Guinsaugon in Southern Leyte, Philippines, on 17 February 2006. Although search and rescue procedures were implemented immediately, the scale of the landslide and a lack of information about its nature resulted in unfocused and imprecise efforts in the early days of the operation. Technical support was only introduced five days after the event, provided by a team of volunteer geologists, geophysicists, and meteorologists. By the time search and rescue operations were transferred to specific target sites, however, the chances of finding survivors trapped under the rubble had diminished. In such critical situations, speed, accuracy, and the maximum appropriation of resources are crucial. We emphasise here the need for a systematic and technically informed approach to search and rescue missions in large-scale landslide disaster contexts, and the formulation of better disaster management policies in general. Standard procedures must be developed and enforced to improve how civil authorities respond to natural calamities.
PaR-PaR Laboratory Automation Platform
DOE Office of Scientific and Technical Information (OSTI.GOV)
Linshiz, G; Stawski, N; Poust, S
2013-05-01
Labor-intensive multistep biological tasks, such as the construction and cloning of DNA molecules, are prime candidates for laboratory automation. Flexible and biology-friendly operation of robotic equipment is key to its successful integration in biological laboratories, and the efforts required to operate a robot must be much smaller than the alternative manual lab work. To achieve these goals, a simple high-level biology-friendly robot programming language is needed. We have developed and experimentally validated such a language: Programming a Robot (PaR-PaR). The syntax and compiler for the language are based on computer science principles and a deep understanding of biological workflows. PaR-PaR allows researchers to use liquid-handling robots effectively, enabling experiments that would not have been considered previously. After minimal training, a biologist can independently write complicated protocols for a robot within an hour. Adoption of PaR-PaR as a standard cross-platform language would enable hand-written or software-generated robotic protocols to be shared across laboratories.
NASA Technical Reports Server (NTRS)
Perry, J. L.
2016-01-01
As the Space Station Freedom program transitioned to become the International Space Station (ISS), uncertainty existed concerning the performance capabilities for U.S.- and Russian-provided trace contaminant control (TCC) equipment. In preparation for the first dialogue between NASA and Russian Space Agency personnel in Moscow, Russia, in late April 1994, an engineering analysis was conducted to serve as a basis for discussing TCC equipment engineering assumptions as well as relevant assumptions on equipment offgassing and cabin air quality standards. The analysis presented was conducted as part of the efforts to integrate Russia into the ISS program via the early ISS Multilateral Medical Operations Panel's Air Quality Subgroup deliberations. This analysis served as a basis for technical deliberations that established a framework for TCC system design and operations among the ISS program's international partners that has been instrumental in successfully managing the ISS common cabin environment.
Preliminary remediation goals for use at the U.S. Department of Energy Oak Ridge Operations Office
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-06-01
This report presents Preliminary Remediation Goals (PRGs) for use in human health risk assessment efforts under the United States Department of Energy, Oak Ridge Operations Office Environmental Restoration (ER) Division. Chemical-specific PRGs are concentration goals for individual chemicals for specific medium and land use combinations. The PRGs are referred to as risk-based because they have been calculated using risk assessment procedures. Risk-based calculations set concentration limits using either carcinogenic or noncarcinogenic toxicity values under specific exposure pathways. The PRG is a concentration that is derived from a specified excess cancer risk level or hazard quotient. This report provides the ER Division with standardized PRGs, which are integral to the Remedial Investigation/Feasibility Study process. By managing the assumptions and systems used in PRG derivation, the Environmental Restoration Risk Assessment Program will be able to control the level of quality assurance associated with these risk-based guideline values.
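The back-calculation described (a medium concentration derived from a target excess cancer risk or hazard quotient through an exposure pathway) has the generic form sketched below. The pathway, parameter names, and example values are illustrative assumptions for the sketch, not the defaults used in the report.

```python
def prg_carcinogen(target_risk, slope_factor, intake_rate, exposure_freq_d_per_yr,
                   exposure_duration_yr, body_weight_kg, averaging_time_d):
    """Generic risk-based PRG for a carcinogen: the medium concentration at which
    the estimated chronic daily intake times the slope factor equals the target risk."""
    intake_per_conc = (intake_rate * exposure_freq_d_per_yr * exposure_duration_yr) / (
        body_weight_kg * averaging_time_d)
    return target_risk / (slope_factor * intake_per_conc)

def prg_noncarcinogen(target_hq, reference_dose, intake_rate, exposure_freq_d_per_yr,
                      exposure_duration_yr, body_weight_kg, averaging_time_d):
    """Generic noncarcinogenic PRG: concentration at which the hazard quotient
    (intake divided by the reference dose) equals the target HQ, typically 1."""
    intake_per_conc = (intake_rate * exposure_freq_d_per_yr * exposure_duration_yr) / (
        body_weight_kg * averaging_time_d)
    return target_hq * reference_dose / intake_per_conc

# Illustrative soil-ingestion example (all numbers hypothetical): PRG in mg/kg for a
# 1e-6 target risk, ingestion of 100 mg soil/day by a 70 kg adult over 30 years.
example = prg_carcinogen(target_risk=1e-6, slope_factor=1.5,      # (mg/kg-day)^-1
                         intake_rate=100e-6,                      # kg soil/day
                         exposure_freq_d_per_yr=350, exposure_duration_yr=30,
                         body_weight_kg=70, averaging_time_d=70 * 365)
```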
PaR-PaR laboratory automation platform.
Linshiz, Gregory; Stawski, Nina; Poust, Sean; Bi, Changhao; Keasling, Jay D; Hillson, Nathan J
2013-05-17
Labor-intensive multistep biological tasks, such as the construction and cloning of DNA molecules, are prime candidates for laboratory automation. Flexible and biology-friendly operation of robotic equipment is key to its successful integration in biological laboratories, and the efforts required to operate a robot must be much smaller than the alternative manual lab work. To achieve these goals, a simple high-level biology-friendly robot programming language is needed. We have developed and experimentally validated such a language: Programming a Robot (PaR-PaR). The syntax and compiler for the language are based on computer science principles and a deep understanding of biological workflows. PaR-PaR allows researchers to use liquid-handling robots effectively, enabling experiments that would not have been considered previously. After minimal training, a biologist can independently write complicated protocols for a robot within an hour. Adoption of PaR-PaR as a standard cross-platform language would enable hand-written or software-generated robotic protocols to be shared across laboratories.
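The abstract does not reproduce PaR-PaR syntax, so the sketch below is only a generic illustration of the underlying idea: a hardware-independent, biology-level protocol description that a compiler translates into device-specific commands. Every name in it is hypothetical and none of it is drawn from PaR-PaR itself.

```python
# Hypothetical biology-level protocol description; not PaR-PaR syntax.
protocol = [
    ("transfer", {"source": "plasmid_backbone", "dest": "assembly_plate.A1", "volume_ul": 5}),
    ("transfer", {"source": "insert_fragment", "dest": "assembly_plate.A1", "volume_ul": 5}),
    ("incubate", {"plate": "assembly_plate", "temp_c": 50, "minutes": 60}),
]

def compile_to_robot(steps, backend):
    """Translate hardware-independent steps into backend-specific commands,
    the role a protocol compiler plays between biologist and liquid handler."""
    commands = []
    for op, args in steps:
        handler = getattr(backend, op, None)
        if handler is None:
            raise ValueError(f"backend cannot perform operation: {op}")
        commands.append(handler(**args))
    return commands

class LoggingBackend:
    """Stand-in for a real liquid-handler driver: it just records the calls."""
    def transfer(self, source, dest, volume_ul):
        return f"TRANSFER {volume_ul} uL {source} -> {dest}"
    def incubate(self, plate, temp_c, minutes):
        return f"INCUBATE {plate} at {temp_c} C for {minutes} min"

for cmd in compile_to_robot(protocol, LoggingBackend()):
    print(cmd)
```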
Artificial intelligent decision support for low-cost launch vehicle integrated mission operations
NASA Astrophysics Data System (ADS)
Szatkowski, Gerard P.; Schultz, Roger
1988-11-01
The feasibility, benefits, and risks associated with Artificial Intelligence (AI) Expert Systems applied to low cost space expendable launch vehicle systems are reviewed. This study is in support of the joint USAF/NASA effort to define the next generation of a heavy-lift Advanced Launch System (ALS) which will provide economical and routine access to space. The significant technical goals of the ALS program include: a 10 fold reduction in cost per pound to orbit, launch processing in under 3 weeks, and higher reliability and safety standards than current expendables. Knowledge-based system techniques are being explored for the purpose of automating decision support processes in onboard and ground systems for pre-launch checkout and in-flight operations. Issues such as: satisfying real-time requirements, providing safety validation, hardware and Data Base Management System (DBMS) interfacing, system synergistic effects, human interfaces, and ease of maintainability, have an effect on the viability of expert systems as a useful tool.
Artificial intelligent decision support for low-cost launch vehicle integrated mission operations
NASA Technical Reports Server (NTRS)
Szatkowski, Gerard P.; Schultz, Roger
1988-01-01
The feasibility, benefits, and risks associated with Artificial Intelligence (AI) Expert Systems applied to low cost space expendable launch vehicle systems are reviewed. This study is in support of the joint USAF/NASA effort to define the next generation of a heavy-lift Advanced Launch System (ALS) which will provide economical and routine access to space. The significant technical goals of the ALS program include: a 10 fold reduction in cost per pound to orbit, launch processing in under 3 weeks, and higher reliability and safety standards than current expendables. Knowledge-based system techniques are being explored for the purpose of automating decision support processes in onboard and ground systems for pre-launch checkout and in-flight operations. Issues such as: satisfying real-time requirements, providing safety validation, hardware and Data Base Management System (DBMS) interfacing, system synergistic effects, human interfaces, and ease of maintainability, have an effect on the viability of expert systems as a useful tool.
40 CFR 792.63 - Maintenance and calibration of equipment.
Code of Federal Regulations, 2012 CFR
2012-07-01
... standardized. (b) The written standard operating procedures required under § 792.81(b)(11) shall set forth in... maintenance operations were routine and followed the written standard operating procedures. Written records... operating procedures shall designate the person responsible for the performance of each operation. (c...
40 CFR 792.63 - Maintenance and calibration of equipment.
Code of Federal Regulations, 2011 CFR
2011-07-01
... standardized. (b) The written standard operating procedures required under § 792.81(b)(11) shall set forth in... maintenance operations were routine and followed the written standard operating procedures. Written records... operating procedures shall designate the person responsible for the performance of each operation. (c...
21 CFR 58.63 - Maintenance and calibration of equipment.
Code of Federal Regulations, 2013 CFR
2013-04-01
... standardized. (b) The written standard operating procedures required under § 58.81(b)(11) shall set forth in... maintenance operations were routine and followed the written standard operating procedures. Written records... operating procedures shall designate the person responsible for the performance of each operation. (c...
40 CFR 792.63 - Maintenance and calibration of equipment.
Code of Federal Regulations, 2010 CFR
2010-07-01
... standardized. (b) The written standard operating procedures required under § 792.81(b)(11) shall set forth in... maintenance operations were routine and followed the written standard operating procedures. Written records... operating procedures shall designate the person responsible for the performance of each operation. (c...
21 CFR 58.63 - Maintenance and calibration of equipment.
Code of Federal Regulations, 2011 CFR
2011-04-01
... standardized. (b) The written standard operating procedures required under § 58.81(b)(11) shall set forth in... maintenance operations were routine and followed the written standard operating procedures. Written records... operating procedures shall designate the person responsible for the performance of each operation. (c...
21 CFR 58.63 - Maintenance and calibration of equipment.
Code of Federal Regulations, 2012 CFR
2012-04-01
... standardized. (b) The written standard operating procedures required under § 58.81(b)(11) shall set forth in... maintenance operations were routine and followed the written standard operating procedures. Written records... operating procedures shall designate the person responsible for the performance of each operation. (c...
21 CFR 58.63 - Maintenance and calibration of equipment.
Code of Federal Regulations, 2014 CFR
2014-04-01
... standardized. (b) The written standard operating procedures required under § 58.81(b)(11) shall set forth in... maintenance operations were routine and followed the written standard operating procedures. Written records... operating procedures shall designate the person responsible for the performance of each operation. (c...
40 CFR 792.63 - Maintenance and calibration of equipment.
Code of Federal Regulations, 2013 CFR
2013-07-01
... standardized. (b) The written standard operating procedures required under § 792.81(b)(11) shall set forth in... maintenance operations were routine and followed the written standard operating procedures. Written records... operating procedures shall designate the person responsible for the performance of each operation. (c...
40 CFR 792.63 - Maintenance and calibration of equipment.
Code of Federal Regulations, 2014 CFR
2014-07-01
... standardized. (b) The written standard operating procedures required under § 792.81(b)(11) shall set forth in... maintenance operations were routine and followed the written standard operating procedures. Written records... operating procedures shall designate the person responsible for the performance of each operation. (c...
21 CFR 58.63 - Maintenance and calibration of equipment.
Code of Federal Regulations, 2010 CFR
2010-04-01
... standardized. (b) The written standard operating procedures required under § 58.81(b)(11) shall set forth in... maintenance operations were routine and followed the written standard operating procedures. Written records... operating procedures shall designate the person responsible for the performance of each operation. (c...
DOE's nation-wide system for access control can solve problems for the federal government
DOE Office of Scientific and Technical Information (OSTI.GOV)
Callahan, S.; Tomes, D.; Davis, G.
1996-07-01
The U.S. Department of Energy's (DOE's) ongoing efforts to improve its physical and personnel security systems while reducing its costs provide a model for federal government visitor processing. Through the careful use of standardized badges, computer databases, and networks of automated access control systems, the DOE is increasing the security associated with travel throughout the DOE complex, and at the same time, eliminating paperwork, special badging, and visitor delays. The DOE is also improving badge accountability, personnel identification assurance, and access authorization timeliness and accuracy. Like the federal government, the DOE has dozens of geographically dispersed locations run by many different contractors operating a wide range of security systems. The DOE has overcome these obstacles by providing data format standards, a complex-wide virtual network for security, the adoption of a standard high security system, and an open-systems-compatible link for any automated access control system. If the location's level of security requires it, positive visitor identification is accomplished by personal identification number (PIN) and/or by biometrics. At sites with automated access control systems, this positive identification is integrated into the portals.
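As an illustration of how badge authorization, PIN verification, and biometrics might be combined at a portal depending on a site's security level, the following hypothetical decision function is included; it is not drawn from the DOE system described above, and all names are placeholders.

```python
from dataclasses import dataclass

@dataclass
class BadgeRecord:
    badge_id: str
    pin_hash: str
    authorized_sites: set

def portal_decision(record: BadgeRecord, site: str, security_level: int,
                    pin_hash: str = None, biometric_ok: bool = None) -> bool:
    """Illustrative access decision: badge authorization is always required;
    PIN and/or biometric confirmation apply only where the site's level demands it."""
    if site not in record.authorized_sites:
        return False
    if security_level >= 2 and pin_hash != record.pin_hash:
        return False
    if security_level >= 3 and not biometric_ok:
        return False
    return True

# Example: a visitor badged for "site_a" passes a level-1 portal without a PIN check.
visitor = BadgeRecord("V-1001", "ab12...", {"site_a"})
print(portal_decision(visitor, "site_a", security_level=1))
```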
Avruscio, Giampiero; Tocco-Tussardi, Ilaria; Bordignon, Greta; Vindigni, Vincenzo
2017-01-01
Chronic vascular wounds have a significant economic and social impact on our society, calling for allocation of a great deal of attention and resources. Efforts should be oriented toward the achievement of the most effective and efficient clinical management. The Angiology Unit at the University Hospital of Padova, Italy, developed a performance improvement project to enhance the quality of practice for vascular ulcers. The project consisted of a multistep process comprising a critical revision of the previous clinical process management, staff education, tightening connections between operators and services, and creation of a position for a wound care nurse. The previous standard of practice was modified according to the results of the revision and the current evidence-based practice. The new standard of practice reached its full application in September 2015. The number of patients treated and the number of visits in 2015 remained almost unvaried from 2014. However, the total annual expenditure for treating vascular ulcers was reduced by ~60% from the previous year. Standardization of guidelines and practice is effective in creating efficient clinical management and in reducing the economic burden of vascular ulcers.
Robles, Brenda; Wood, Michelle; Kimmons, Joel; Kuo, Tony
2013-03-01
National, state, and local institutions that procure, distribute, sell, and/or serve food to employees, students, and the public are increasingly capitalizing on existing operational infrastructures to create healthier food environments. Integration of healthy nutrition standards and other recommended practices [e.g., energy (kilocalories) postings at point-of-purchase, portion size restrictions, product placement guidelines, and signage] into new or renewing food service and vending contracts codifies an institution's commitment to increasing the availability of healthful food options in their food service venues and vending machines. These procurement requirements, in turn, have the potential to positively influence consumers' food-purchasing behaviors. Although these strategies are becoming increasingly popular, much remains unknown about their context, the processes required to implement them effectively, and the factors that facilitate their sustainability, especially in such broad and diverse settings as schools, county government facilities, and cities. To contribute to this gap in information, we reviewed and compared nutrition standards and other best practices implemented recently in a large school district, in a large county government, and across 10 municipalities in Los Angeles County. We report lessons learned from these efforts.
Mission Systems Open Architecture Science and Technology (MOAST) program
NASA Astrophysics Data System (ADS)
Littlejohn, Kenneth; Rajabian-Schwart, Vahid; Kovach, Nicholas; Satterthwaite, Charles P.
2017-04-01
The Mission Systems Open Architecture Science and Technology (MOAST) program is an AFRL effort that is developing and demonstrating Open System Architecture (OSA) component prototypes, along with methods and tools, to strategically evolve current OSA standards and technical approaches, promote affordable capability evolution, reduce integration risk, and address emerging challenges [1]. Within the context of open architectures, the program is conducting advanced research and concept development in the following areas: (1) Evolution of standards; (2) Cyber-Resiliency; (3) Emerging Concepts and Technologies; (4) Risk Reduction Studies and Experimentation; and (5) Advanced Technology Demonstrations. Current research includes the development of methods, tools, and techniques to characterize the performance of OMS data interconnection methods for representative mission system applications. Of particular interest are the OMS Critical Abstraction Layer (CAL), the Avionics Service Bus (ASB), and the Bulk Data Transfer interconnects. The program also aims to develop and demonstrate cybersecurity countermeasure techniques to detect and mitigate cyberattacks against open-architecture-based mission systems and ensure continued mission operations. Focus is on cybersecurity techniques that augment traditional cybersecurity controls and those currently defined within the Open Mission System and UCI standards. AFRL is also developing code generation tools and simulation tools to support evaluation and experimentation with OSA-compliant implementations.
Robles, Brenda; Wood, Michelle; Kimmons, Joel; Kuo, Tony
2013-01-01
National, state, and local institutions that procure, distribute, sell, and/or serve food to employees, students, and the public are increasingly capitalizing on existing operational infrastructures to create healthier food environments. Integration of healthy nutrition standards and other recommended practices [e.g., energy (kilocalories) postings at point-of-purchase, portion size restrictions, product placement guidelines, and signage] into new or renewing food service and vending contracts codifies an institution’s commitment to increasing the availability of healthful food options in their food service venues and vending machines. These procurement requirements, in turn, have the potential to positively influence consumers’ food-purchasing behaviors. Although these strategies are becoming increasingly popular, much remains unknown about their context, the processes required to implement them effectively, and the factors that facilitate their sustainability, especially in such broad and diverse settings as schools, county government facilities, and cities. To contribute to this gap in information, we reviewed and compared nutrition standards and other best practices implemented recently in a large school district, in a large county government, and across 10 municipalities in Los Angeles County. We report lessons learned from these efforts. PMID:23493535
System 80+™ standard design: CESSAR design certification. Volume 5: Amendment I
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This report has been prepared in support of the industry effort to standardize nuclear plant designs. The documents in this series describe the Combustion Engineering, Inc. System 80+™ Standard Design.
NASA Astrophysics Data System (ADS)
Fortunato, Michael W. P.
2017-03-01
This essay is a response to a paper by Avery and Hains that raises questions about the often unintended effects of knowledge standardization in an educational setting. While many K-12 schools are implementing common core standards, and many institutions of higher education are implementing their own standardized educational practices, the question is raised about what is lost in this effort to ensure regularity and consistency in educational outcomes. One such casualty may be local knowledge, which in a rural context includes ancestral knowledge about land, society, and cultural meaning. This essay explores whether or not efforts to standardize crowd out such knowledge, and decrease the diversity of knowledge within our society's complex ecosystem—thus making the ecosystem weaker. Using antifragility as a useful idea for examining system complexity, the essay considers the impact of standardization on innovation, democracy, and the valuation of some forms of knowledge (and its bearers) above others.
MO-D-218-01: Overview of Methodology and Standards (QIBA, IEC, AIUM and AAPM).
Carson, P
2012-06-01
Ultrasound system standards and professional guidelines can facilitate efficient provision of medical physics services and growth of ultrasound imaging if the documents are well designed and are utilized. We too often develop our own phantoms and procedures and never converge to obtain a critical mass of data on system performance and value of such services. Standards can also produce unnecessary costs and limit innovation if not carefully developed, reviewed, and changed as needed. There are quite a few new initiatives that, if followed vigorously, could improve medical ultrasound and medical physicists' contributions thereto. This talk explains many of the existing standards and recommendations for ultrasound system quality control, performance evaluation, and safety, as well as current and suggested efforts in these areas. The primary standards body for medical ultrasound systems is now the International Electrotechnical Commission (IEC). Uniformity across the world is helpful to all if the documents are reasonably current. There still is a role for traditional bodies such as the AAPM with its valuable report series and the American Institute of Ultrasound in Medicine (AIUM) with its own standards and reports and its joint work with the Medical Imaging Technology Alliance (MITA). All three, with strong involvement of FDA scientists and with some efforts from the Acoustical Society of America, have historically provided the main standards affecting medical physicists. Now that the lengthy IEC process is moving more smoothly, our national bodies still can provide new developments and drafts that can be offered as needed for international standardization. The ACR in particular can provide meaningful incentives through ultrasound service accreditation. Without any regulatory or strong consumer push, reports and standards on ultrasound system performance have received only modest use in the USA. A consistent consumer or accreditation push might be justified now. A series of three standards on performance evaluation is well on its way to covering pulse echo ultrasound well, with IEC 61319-1 on spatial measurements, IEC 61319-2 on depth of penetration and local dynamic range, and one draft and one Technical Specification 62558 on small void imaging. A new effort has just been initiated to help drive more and better use of quantitative ultrasound imaging in human and surrogate studies and in clinical use. A shear wave speed ultrasound technical committee will carry out this effort in the Quantitative Imaging Biomarkers Alliance (QIBA) that is managed by the RSNA. Learning objectives: 1. Understand the coverage of the two current and third planned IEC medical ultrasound performance evaluation standards that could form a basis for stable performance evaluation tests. 2. Understand the coverage of the current AIUM and ACR QC documents and the drafting and support efforts in the IEC. 3. Understand the need for and partial availability of simplified software and instructions to improve and facilitate performance of these tests. 4. Understand how standards development can lead to improved understanding and performance of medical ultrasound imaging as is anticipated for the new QIBA effort. © 2012 American Association of Physicists in Medicine.
Tropospheric ozone observations - How well can we assess tropospheric ozone changes?
NASA Astrophysics Data System (ADS)
Tarasick, D. W.; Galbally, I. E.; Ancellet, G.; Leblanc, T.; Wallington, T. J.; Ziemke, J. R.; Steinbacher, M.; Stähelin, J.; Vigouroux, C.; Hannigan, J. W.; García, O. E.; Foret, G.; Zanis, P.; Liu, X.; Weatherhead, E. C.; Petropavlovskikh, I. V.; Worden, H. M.; Osman, M.; Liu, J.; Lin, M.; Cooper, O. R.; Schultz, M. G.; Granados-Muñoz, M. J.; Thompson, A. M.; Cuesta, J.; Dufour, G.; Thouret, V.; Hassler, B.; Trickl, T.
2017-12-01
Since the early 20th century, measurements of ozone in the free troposphere have evolved and changed. Data records have different uncertainties and biases, and differ with respect to coverage, information content, and representativeness. Almost all validation studies employ ECC ozonesondes. These have been compared to UV-absorption measurements in a number of intercomparison studies, and show a modest (1-5%) high bias in the troposphere, with an uncertainty of 5%, but no evidence of a change over time. Umkehr, lidar, FTIR, and commercial aircraft all show modest low biases relative to the ECCs, and so -- if the ECC biases are transferable -- all agree within 1σ with the modern UV standard. Relative to the UV standard, Brewer-Mast sondes show a 20% increase in sensitivity from 1970-1995, while Japanese KC sondes show an increase of 5-10%. Combined with the shift of the global ozonesonde network to ECCs, this can induce a false positive trend in analyses based on sonde data. Passive sounding methods -- Umkehr, FTIR and satellites -- have much lower vertical resolution than active methods, and this can limit the attribution of trends. Satellite biases are larger than those of other measurement systems, ranging between -10% and +20%, and standard deviations are large: about 10-30%, versus 5-10% for sondes, aircraft, lidar and ground-based FTIR. There is currently little information on measurement drift for satellite measurements of tropospheric ozone. This is an evident area of concern if satellite retrievals are used for trend studies. The importance of ECC sondes as a transfer standard for satellite validation means that efforts to homogenize existing records, by correcting for known changes and by adopting strict standard operating procedures, should continue, and additional research effort should be put into understanding and reducing sonde uncertainties. Representativeness is also a potential source of large errors, which are difficult to quantify. The global observation network is unevenly distributed, and so additional sites (or airports) would be of benefit. Objective methods of quantifying spatial representativeness can optimize future network design. International cooperation and data sharing will be of paramount importance, as the TOAR project has demonstrated.
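To make the instrument-change point concrete, the short synthetic example below shows how a sonde sensitivity that rises 20% over 1970-1995 imprints an apparent positive trend on a truly constant ozone record; the numbers are assumed purely for illustration.

```python
import numpy as np

years = np.arange(1970, 1996)
true_ozone = np.full(years.size, 40.0)              # constant "true" ozone value (arbitrary units)
sensitivity = np.linspace(1.00, 1.20, years.size)   # assumed linear 20% sensitivity increase
measured = true_ozone * sensitivity                 # what a drifting sonde record would report

slope, intercept = np.polyfit(years, measured, 1)
print(f"apparent trend: {slope:.3f} units/yr "
      f"({100 * slope * years.size / measured.mean():.1f}% over the record)")
# The apparent ~0.3 units/yr trend is entirely an artifact of the sensitivity drift,
# which is why homogenization and transfer to a common ECC/UV standard matter.
```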
A Free and Open Source Web-based Data Catalog Evaluation Tool
NASA Astrophysics Data System (ADS)
O'Brien, K.; Schweitzer, R.; Burger, E. F.
2015-12-01
For many years, the Unified Access Framework (UAF) project has worked to provide improved access to scientific data by leveraging widely used data standards and conventions. These standards include the Climate and Forecast (CF) metadata conventions, the Data Access Protocol (DAP) and various Open Geospatial Consortium (OGC) standards such as WMS and WCS. The UAF has also worked to create a unified access point for scientific data access through THREDDS and ERDDAP catalogs. A significant effort was made by the UAF project to build a catalog-crawling tool that was designed to crawl remote catalogs, analyze their content and then build a clean catalog that 1) represented only CF compliant data; 2) provided a uniform set of access services and 3) where possible, aggregated data in time. That catalog is available at http://ferret.pmel.noaa.gov/geoide/geoIDECleanCatalog.html. Although this tool has proved immensely valuable in allowing the UAF project to create a high quality data catalog, the need for a catalog evaluation service or tool to operate on a more local level also exists. Many programs that generate data of interest to the public are recognizing the utility and power of using the THREDDS data server (TDS) to serve that data. However, for some groups that lack the resources to maintain dedicated IT personnel, it can be difficult to set up a properly configured TDS. The TDS catalog evaluating service that is under development and will be discussed in this presentation is an effort, through the UAF project, to bridge that gap. Based upon the power of the original UAF catalog cleaner, the web evaluator will have the ability to scan and crawl a local TDS catalog, evaluate the contents for compliance with CF standards, analyze the services offered, and identify datasets where possible temporal aggregation would benefit data access. The results of the catalog evaluator will guide the configuration of the dataset in TDS to ensure that it meets the standards as promoted by the UAF framework.
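A minimal sketch of the kind of inspection such an evaluator begins with is given below: fetch a THREDDS catalog.xml with the standard library and list its declared services, directly accessible datasets, and child catalogs. The URL is a placeholder, and the real evaluator goes much further (CF compliance checks, identification of aggregation candidates).

```python
import urllib.request
import xml.etree.ElementTree as ET

def summarize_thredds_catalog(catalog_url: str) -> dict:
    """Fetch a THREDDS catalog.xml and report the services it declares, the datasets
    (with a urlPath) it directly contains, and references to child catalogs.
    XML namespaces are stripped so the sketch works across catalog versions."""
    with urllib.request.urlopen(catalog_url) as resp:
        root = ET.fromstring(resp.read())

    def local(tag):  # drop the XML namespace, e.g. '{...}dataset' -> 'dataset'
        return tag.split('}')[-1]

    services, datasets, catalog_refs = set(), [], []
    for elem in root.iter():
        tag = local(elem.tag)
        if tag == "service" and elem.get("serviceType"):
            services.add(elem.get("serviceType"))
        elif tag == "dataset" and elem.get("urlPath"):
            datasets.append(elem.get("name"))
        elif tag == "catalogRef":
            catalog_refs.append(elem.get("{http://www.w3.org/1999/xlink}href"))
    return {"services": sorted(services), "datasets": datasets,
            "child_catalogs": catalog_refs}

# Placeholder URL; a full evaluator would recurse into child_catalogs, open each
# dataset's OPeNDAP endpoint, and test its attributes against the CF conventions.
# print(summarize_thredds_catalog("https://example.org/thredds/catalog.xml"))
```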
Christian, Michael D; Joynt, Gavin M; Hick, John L; Colvin, John; Danis, Marion; Sprung, Charles L
2010-04-01
To provide recommendations and standard operating procedures for intensive care unit (ICU) and hospital preparations for an influenza pandemic or mass disaster, with a specific focus on critical care triage. Based on a literature review and expert opinion, a Delphi process was used to define the essential topics, including critical care triage. Key recommendations include: (1) establish an Incident Management System with Emergency Executive Control Groups at facility, local, regional/state or national levels to exercise authority and direction over resources; (2) developing fair and equitable policies may require restricting ICU services to patients most likely to benefit; (3) usual treatments and standards of practice may be impossible to deliver; (4) ICU care and treatments may have to be withheld from patients likely to die even with ICU care, and withdrawn after a trial from patients who do not improve or who deteriorate; (5) triage criteria should be objective, ethical, transparent, applied equitably and publicly disclosed; (6) trigger triage protocols for pandemic influenza only when critical care resources across a broad geographic area are or will be overwhelmed despite all reasonable efforts to extend resources or obtain additional resources; (7) triage of patients for ICU admission should be based either on likelihood of benefit or on a 'first come, first served' basis; (8) a triage officer should apply inclusion and exclusion criteria to determine patient qualification for ICU admission. Judicious planning and adoption of protocols for critical care triage are necessary to optimize outcomes during a pandemic.
Limitations and challenges towards an effective business continuity management in Nuklear Malaysia
NASA Astrophysics Data System (ADS)
Hamid, A. H. A.
2018-01-01
One of Nuklear Malaysia's top concerns is radiological and nuclear safety, as well as the security preparedness of its operational facility management, which is bound by Act 304, Directive 20 and International Atomic Energy Agency (IAEA) guidelines. In 2012, the Malaysian government initiated the Business Continuity Management System under the supervision of the Malaysian Administrative Modernization and Management Planning Unit (MAMPU), with reference to MAMPU.BPICT.700-4/2/11 (3), ISO 22301:2012 and the Business Continuity Good Practice Guidelines 2013 documentation. These standards are integral to implementing a resilient management programme that indicates an organisation's capability to prevent an accident from occurring and spreading its impact, including sufficient recovery action to return from a post-accident situation to a normal operational and managerial state. Unfortunately, certified Business Continuity Management remains rare among public sector agencies compared with local private sector organisations. Nuklear Malaysia has therefore been selected by MAMPU and CyberSecurity Malaysia as one of the pioneering agencies to be certified. This paper documents Nuklear Malaysia's current effort to plan, analyse, design, implement, review and validate the establishment of this standard. The project was implemented using a case study approach to complete the required certification activities. As a result, this paper proposes benchmarking selected literature reviews against the Nuklear Malaysia experience to determine best practices in implementing and managing Business Continuity effectively. It concludes that a resilient Business Continuity Management programme needs to be incorporated into Nuklear Malaysia's capabilities to ensure it can mitigate and survive any unexpected event and subsequently overcome future challenges.
Dagostino, Concetta; De Gregori, Manuela; Gieger, Christian; Manz, Judith; Gudelj, Ivan; Lauc, Gordan; Divizia, Laura; Wang, Wei; Sim, Moira; Pemberton, Iain K; MacDougall, Jane; Williams, Frances; Van Zundert, Jan; Primorac, Dragan; Aulchenko, Yurii; Kapural, Leonardo; Allegri, Massimo
2017-01-01
Chronic low back pain (CLBP) is one of the most common medical conditions, ranking as the greatest contributor to global disability and accounting for substantial societal costs based on the Global Burden of Disease 2010 study. Large genetic and -omics studies provide a promising avenue for the screening, development and validation of biomarkers useful for personalized diagnosis and treatment (precision medicine). Multicentre studies are needed for such an effort, and a standardized and homogeneous approach is vital for recruitment of large numbers of participants across different centres (clinical sites and laboratories) to obtain robust and reproducible results. To date, no validated standard operating procedures (SOPs) for genetic/-omics studies in chronic pain have been developed. In this study, we validated an SOP model that will be used in the multicentre (5 centres) retrospective "PainOmics" study, funded by the European Community in the 7th Framework Programme, which aims to develop new biomarkers for CLBP through three different -omics approaches: genomics, glycomics and activomics. The SOPs describe the specific procedures for (1) blood collection, (2) sample processing and storage, (3) shipping details and (4) cross-check testing and validation before assays that all the centres involved in the study have to follow. Multivariate analysis revealed the absolute specificity and homogeneity of the samples collected by the five centres for all genetics, glycomics and activomics analyses. The SOPs used in our multicentre study have been validated. Hence, they could represent an innovative tool for the correct management and collection of reliable samples in other large -omics-based multicentre studies.
Reduction and standardization of surgical instruments in pediatric inguinal hernia repair.
Koyle, Martin A; AlQarni, Naif; Odeh, Rakan; Butt, Hissan; Alkahtani, Mohammed M; Konstant, Louis; Pendergast, Lisa; Koyle, Leah C C; Baker, G Ross
2018-02-01
To standardize and reduce surgical instrumentation by >25% within a 9-month period for pediatric inguinal hernia repair (PIHR), using "improvement science" methodology. We prospectively evaluated instruments used for PIHR in 56 consecutive cases by individual surgeons across two separate subspecialties, pediatric surgery (S) and pediatric urology (U), to measure the actual number of instruments used compared with existing practice based on preference cards. Based on this evaluation, a single preference card was developed using only instruments that had been used in >50% of all cases. A subsequent series of 52 cases was analyzed to assess whether the new tray contained the ideal instrumentation. Cycle time (CT), to sterilize and package the instruments, and the weights of the trays were measured before and after the intervention. A survey of operating room (OR) nurses and U and S surgeons was conducted before and after the introduction of the standardized tray to assess the impact and perception of standardization. Prior to creating the standardized tray, a U PIHR tray contained 96 instruments with a weight of 13.5 lbs, while the S set contained 51, weighing 11.2 lbs. The final standardized set comprised 28 instruments and weighed 7.8 lbs. Of 52 PIHRs performed after standardization, in three (6%) instances additional instruments were requested. CT was reduced from 11 and 8 min (U and S, respectively) to <5 min for the single tray. Nurses and surgeons reported that quality, safety, and efficiency were improved, and that efforts should continue to standardize instrumentation for other common surgeries. Standardization of surgical equipment can be employed across disciplines with the potential to reduce costs and positively impact quality, safety, and efficiencies. Copyright © 2017 Journal of Pediatric Urology Company. Published by Elsevier Ltd. All rights reserved.
Water Quality Projects Summary for the Mid-Columbia and Cumberland River Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, Kevin M.; Witt, Adam M.; Hadjerioua, Boualem
Scheduling and operational control of hydropower systems is accompanied by a keen awareness of the management of water use, environmental effects, and policy, especially within the context of strict water rights policy and generation maximization. This is a multi-objective problem for many hydropower systems, including the Cumberland and Mid-Columbia river systems. Though these two systems have distinct operational philosophies, hydrologic characteristics, and system dynamics, they both share a responsibility to effectively manage hydropower and the environment, which requires state-of-the-art improvements in the approaches and applications for water quality modeling. The Department of Energy and Oak Ridge National Laboratory have developed tools for total dissolved gas (TDG) prediction on the Mid-Columbia River and a decision-support system used for hydropower generation and environmental optimization on the Cumberland River. In conjunction with IIHR - Hydroscience & Engineering at The University of Iowa and the University of Colorado's Center for Advanced Decision Support for Water and Environmental Systems (CADSWES), ORNL has managed the development of a TDG predictive methodology at seven dams along the Mid-Columbia River and has enabled the ability to utilize this methodology for optimization of operations at these projects with the commercially available software package RiverWare. ORNL has also managed the collaboration with Vanderbilt University and Lipscomb University to develop a state-of-the-art method for reducing high-fidelity water quality modeling results into surrogate models, which can be used effectively within the context of optimization efforts to maximize generation for a reservoir system based on environmental and policy constraints. The novel contribution of these efforts is the ability to predict water quality conditions with simplified methodologies at the same level of accuracy as more complex and resource-intensive computing methods. These efforts were designed to incorporate well into existing hydropower and reservoir system scheduling models, with runtimes that are comparable to existing software tools. In addition, the transferability of these tools to assess other systems is enhanced due to the use of simple and easily attainable input values, straightforward calibration of predictive equation coefficients, and standardized comparison of traditionally familiar outputs.
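As a rough illustration of the surrogate-modeling idea described above (a generic sketch with made-up numbers, not the ORNL/Vanderbilt method), one can fit a low-order polynomial to a handful of high-fidelity model runs relating project operations to a water quality metric such as total dissolved gas, then evaluate that inexpensive surrogate inside an optimization loop.

```python
# Generic surrogate-model sketch (synthetic, assumed numbers -- not the ORNL method):
# fit a quadratic relating spill fraction to predicted TDG saturation from a few
# "high-fidelity" runs, then use the cheap surrogate as an optimization constraint.
import numpy as np

# Pretend these came from a small set of expensive water-quality model runs
spill_fraction = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 0.5])
tdg_percent    = np.array([100.0, 103.0, 107.5, 112.0, 118.0, 125.0])

coeffs = np.polyfit(spill_fraction, tdg_percent, deg=2)   # quadratic surrogate
surrogate = np.poly1d(coeffs)

# Example use inside an optimization: find the largest spill fraction that keeps
# predicted TDG below a 110% water-quality limit (grid search for simplicity).
limit = 110.0
candidates = np.linspace(0.0, 0.5, 501)
feasible = candidates[surrogate(candidates) <= limit]
print(f"Max spill fraction meeting {limit}% TDG limit: {feasible.max():.3f}")
```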
American lifelines alliance efforts to improve electric power transmission reliability
Nishenko, S.P.; Savage, W.U.; Honegger, D.G.; McLane, T.R.; ,
2002-01-01
A study was performed on American Lifelines Alliance (ALA) efforts to improve electric power transmission reliability. ALA is a public-private partnership project with the goal of reducing risks to lifelines from natural hazards and human threat events. The mechanism used by ALA for developing national guidelines for lifeline systems depends upon using existing Standards Developing Organizations (SDOs) accredited by the American National Standards Institute (ANSI) as a means to achieve national consensus.
43 CFR 3930.11 - Performance standards for exploration and in situ operations.
Code of Federal Regulations, 2011 CFR
2011-10-01
... in situ operations. 3930.11 Section 3930.11 Public Lands: Interior Regulations Relating to Public....11 Performance standards for exploration and in situ operations. The operator/lessee must adhere to the following standards for all exploration and in situ drilling operations: (a) At the end of...
43 CFR 3930.11 - Performance standards for exploration and in situ operations.
Code of Federal Regulations, 2014 CFR
2014-10-01
... in situ operations. 3930.11 Section 3930.11 Public Lands: Interior Regulations Relating to Public....11 Performance standards for exploration and in situ operations. The operator/lessee must adhere to the following standards for all exploration and in situ drilling operations: (a) At the end of...
43 CFR 3930.11 - Performance standards for exploration and in situ operations.
Code of Federal Regulations, 2012 CFR
2012-10-01
... in situ operations. 3930.11 Section 3930.11 Public Lands: Interior Regulations Relating to Public....11 Performance standards for exploration and in situ operations. The operator/lessee must adhere to the following standards for all exploration and in situ drilling operations: (a) At the end of...
43 CFR 3930.11 - Performance standards for exploration and in situ operations.
Code of Federal Regulations, 2013 CFR
2013-10-01
... in situ operations. 3930.11 Section 3930.11 Public Lands: Interior Regulations Relating to Public....11 Performance standards for exploration and in situ operations. The operator/lessee must adhere to the following standards for all exploration and in situ drilling operations: (a) At the end of...
The IEEE Software Engineering Standards Process
Buckley, Fletcher J.
1984-01-01
Software Engineering has emerged as a field in recent years, and those involved increasingly recognize the need for standards. As a result, members of the Institute of Electrical and Electronics Engineers (IEEE) formed a subcommittee to develop these standards. This paper discusses the ongoing standards development and associated efforts.
MouthLab: A Tricorder Concept Optimized for Rapid Medical Assessment.
Fridman, Gene Y; Tang, Hai; Feller-Kopman, David; Hong, Yang
2015-09-01
The goal of rapid medical assessment (RMA) is to estimate the general health of a patient during an emergency room or doctor's office visit, or even while the patient is at home. Currently, the devices used during RMA are typically "all-in-one" vital signs monitors. They require time, effort and expertise to attach various sensors to the body. A device optimized for RMA should instead require little effort or expertise to operate and be able to rapidly obtain and consolidate as much information as possible. MouthLab is a battery-powered handheld device intended to acquire and evaluate many measurements, such as non-invasive blood sugar and saliva and respiratory biochemistry. Our initial prototype acquires standard vital signs: pulse rate (PR), breathing rate (BR), temperature (T), blood oxygen saturation (SpO2), blood pressure (BP), and a three-lead electrocardiogram. In our clinical study we tested the device performance against measurements obtained with a standard patient monitor. 52 people participated in the study. The measurement errors were as follows: PR: -1.7 ± 3.5 BPM; BR: 0.4 ± 2.4 BPM; T: -0.4 ± 1.24 °F; SpO2: -0.6 ± 1.7%; BP systolic: -1.8 ± 12 mmHg; BP diastolic: 0.6 ± 8 mmHg. We have shown that RMA can be easily performed non-invasively by patients with no prior training.
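The error figures above are reported as the mean ± standard deviation of the device-minus-reference differences across participants. A minimal sketch of that calculation is shown below; the paired readings are invented placeholders, not study data.

```python
# Bias (mean difference) and spread (SD of differences) of a device versus a
# reference monitor. The paired readings below are placeholders, not study data.
from statistics import mean, stdev

device_hr    = [72, 80, 65, 90, 77]   # hypothetical MouthLab-style pulse rates (BPM)
reference_hr = [74, 81, 66, 93, 78]   # hypothetical standard-monitor readings (BPM)

differences = [d - r for d, r in zip(device_hr, reference_hr)]
print(f"Pulse rate error: {mean(differences):+.1f} ± {stdev(differences):.1f} BPM")
```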
Moeller, Dade W
2009-11-01
The Yucca Mountain high-level radioactive waste repository is designed to contain spent nuclear fuel and vitrified fission products. Because it will be the first such facility constructed anywhere in the world, multiple organizations, most prominently the U.S. Congress, have exercised a role in it. In addition to selecting a site for the facility, Congress specified that the U.S. Environmental Protection Agency (U.S. EPA) promulgate the associated Standards, the U.S. Nuclear Regulatory Commission establish applicable Regulations to implement the Standards, and the U.S. Department of Energy (U.S. DOE) design, construct, and operate the repository. Congress also specified that U.S. EPA request that the National Academy of Sciences (NAS) provide guidance on the form and nature of the Standards. In so doing, Congress also stipulated that the Standards be expressed in terms of an "equivalent dose rate." As will be noted, this subsequently introduced serious complications. Due to the inputs of so many groups, and the fact that the NAS recommendations conflicted with the Congressional stipulation that the limits be expressed in terms of a dose rate, the outcome is a set of Standards that not only does not comply with the NAS recommendations, but also is neither integrated nor consistent. The initial goals of this paper are to provide an independent risk/dose analysis for each of the eight radionuclides that are to be regulated, and to evaluate them in terms of the Standards. These efforts reveal that the Standards are neither workable nor capable of being implemented. The concluding portions of the paper provide guidance that, if successfully implemented, would enable U.S. DOE to complete the construction of the repository and operate it in accordance with the recommendations of NAS while, at the same time, providing a better, more accurate understanding of its potential risks to the public. This facility is too important to the U.S. nuclear energy program to be impeded by inappropriate Standards and unnecessary regulatory restrictions. As will be noted, essentially all of the recommendations suggested in this paper were derived through application of the principles of good science and the benefits of "thinking outside the box."
21 CFR 58.81 - Standard operating procedures.
Code of Federal Regulations, 2010 CFR
2010-04-01
... LABORATORY PRACTICE FOR NONCLINICAL LABORATORY STUDIES Testing Facilities Operation § 58.81 Standard operating procedures. (a) A testing facility shall have standard operating procedures in writing setting... following: (1) Animal room preparation. (2) Animal care. (3) Receipt, identification, storage, handling...
21 CFR 58.81 - Standard operating procedures.
Code of Federal Regulations, 2013 CFR
2013-04-01
... LABORATORY PRACTICE FOR NONCLINICAL LABORATORY STUDIES Testing Facilities Operation § 58.81 Standard operating procedures. (a) A testing facility shall have standard operating procedures in writing setting... following: (1) Animal room preparation. (2) Animal care. (3) Receipt, identification, storage, handling...
21 CFR 58.81 - Standard operating procedures.
Code of Federal Regulations, 2012 CFR
2012-04-01
... LABORATORY PRACTICE FOR NONCLINICAL LABORATORY STUDIES Testing Facilities Operation § 58.81 Standard operating procedures. (a) A testing facility shall have standard operating procedures in writing setting... following: (1) Animal room preparation. (2) Animal care. (3) Receipt, identification, storage, handling...
21 CFR 58.81 - Standard operating procedures.
Code of Federal Regulations, 2014 CFR
2014-04-01
... LABORATORY PRACTICE FOR NONCLINICAL LABORATORY STUDIES Testing Facilities Operation § 58.81 Standard operating procedures. (a) A testing facility shall have standard operating procedures in writing setting... following: (1) Animal room preparation. (2) Animal care. (3) Receipt, identification, storage, handling...
Fatigue Management in Spaceflight Operations
NASA Technical Reports Server (NTRS)
Whitmire, Alexandra
2011-01-01
Sleep loss and fatigue remain an issue for crewmembers working on the International Space Station, and the ground crews who support them. Schedule shifts on the ISS are required for conducting mission operations. These shifts lead to tasks being performed during the biological night, and sleep scheduled during the biological day, for flight crews and the ground teams who support them. Other stressors have been recognized as hindering sleep in space; these include workload, thinking about upcoming tasks, environmental factors, and inadequate day/night cues. It is unknown if and how other factors such as microgravity, carbon dioxide levels, or increased radiation may also play a part. Efforts are underway to standardize and provide care for crewmembers, ground controllers and other support personnel. Through collaborations between research and operations, evidence-based clinical practice guidelines are being developed to equip flight surgeons with the tools and processes needed for treating circadian desynchrony (and subsequent sleep loss) caused by jet lag and shift work. The proper implementation of countermeasures such as schedules, lighting protocols, and cognitive behavioral education can hasten phase shifting, enhance sleep and optimize performance. This panel will focus on Fatigue Management in Spaceflight Operations. Speakers will present on research-based recommendations and technologies aimed at mitigating sleep loss, circadian desynchronization and fatigue on-orbit. Gaps in current mitigations and future recommendations will also be discussed.
Design-for-reliability (DfR) of aerospace electronics: Attributes and challenges
NASA Astrophysics Data System (ADS)
Bensoussan, A.; Suhir, E.
The next generation of multi-beam satellite systems that would be able to provide effective interactive communication services will have to operate within a highly flexible architecture. One option to develop such flexibility is to employ microwave and/or optoelectronic components and to make them reliable. The use of optoelectronic devices, equipment and systems will indeed result in significant improvement in the state of the art only if the new designs offer a novel and effective architecture that combines the merits of good functional performance, satisfactory mechanical (structural) reliability and high cost effectiveness. The obvious challenge is the ability to design and fabricate equipment based on EEE components that would be able to successfully withstand harsh space environments for the entire duration of the mission. It is imperative that the major players in the space industry, such as manufacturers, industrial users, and space agencies, understand the importance and the limits of the achievable quality and reliability of optoelectronic devices operated in harsh environments. It is equally imperative that the physics of possible failures is well understood and, where possible, the failures minimized, and that adequate Quality Standards are developed and employed. The space community has to identify and develop the strategic approach for validating optoelectronic products. This should be done with consideration of numerous intrinsic and extrinsic requirements for the systems' performance. When considering a particular next generation optoelectronic space system, the space community needs to address the following major issues: proof of concept for this system, proof of reliability and proof of performance. This should be done taking into account the specifics of the anticipated application. High operational reliability cannot be left to the prognostics and health monitoring/management (PHM) effort and stage, no matter how important and effective such an effort might be. Reliability should be pursued at all stages of the equipment's lifetime: design, product development, manufacturing, burn-in testing and, of course, subsequent PHM after the space apparatus is launched and operated.
NEON's eddy-covariance: interoperable flux data products, software and services for you, now
NASA Astrophysics Data System (ADS)
Metzger, S.; Desai, A. R.; Durden, D.; Hartmann, J.; Li, J.; Luo, H.; Durden, N. P.; Sachs, T.; Serafimovich, A.; Sturtevant, C.; Xu, K.
2017-12-01
Networks of eddy-covariance (EC) towers such as AmeriFlux, ICOS and NEON are vital for providing the necessary distributed observations to address interactions at the soil-vegetation-atmosphere interface. NEON, close to full operation with 47 tower sites, will represent the largest single-provider EC network globally. Its standardized observation and data processing suite is designed specifically for inter-site comparability and analysis of feedbacks across multiple spatial and temporal scales. Furthermore, NEON coordinates EC with rich contextual observations such as airborne remote sensing and in-situ sampling bouts. In January 2018 NEON enters its operational phase, and EC data products, software and services become fully available to the science community at large. These resources strive to incorporate lessons-learned through collaborations with AmeriFlux, ICOS, LTER and others, to suggest novel systemic solutions, and to synergize ongoing research efforts across science communities. Here, we present an overview of the ongoing product release, alongside efforts to integrate and collaborate with existing infrastructures, networks and communities. Near-real-time heat, water and carbon cycle observations in "basic" and "expanded", self-describing HDF5 formats become accessible from the NEON Data Portal, including an Application Program Interface. Subsequently, they are ingested into the AmeriFlux processing pipeline, together with inclusion in FLUXNET globally harmonized data releases. Software for reproducible, extensible and portable data analysis and science operations management also becomes available. This includes the eddy4R family of R-packages underlying the data product generation, together with the ability to directly participate in open development via GitHub version control and DockerHub image hosting. In addition, templates for science operations management include a web-based field maintenance application and a graphical user interface to simplify problem tracking and resolution along the entire data chain. We hope that this presentation can initiate further collaboration and synergies in challenge areas, and would appreciate input and discussion on continued development.
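As an illustration of what working with the self-describing HDF5 products might look like on the user side, the sketch below walks the group/dataset tree of a downloaded file with h5py and prints dataset shapes and attributes. The file name and internal layout are hypothetical placeholders; consult NEON's documentation and the eddy4R packages for the actual product structure and access services.

```python
# Hedged sketch: inspecting a downloaded NEON-style "basic" HDF5 flux file with
# h5py. The file name and internal group layout are hypothetical placeholders.
import h5py

def describe(name, obj):
    """Print each dataset's path, shape and any units attribute."""
    if isinstance(obj, h5py.Dataset):
        units = obj.attrs.get("units", "n/a")
        print(f"{name}: shape={obj.shape}, units={units}")

with h5py.File("NEON_site_flux_basic.h5", "r") as f:   # hypothetical file name
    f.visititems(describe)                              # walk groups and datasets
```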
Hubble Space Telescope: cost reduction by re-engineering telemetry processing and archiving
NASA Astrophysics Data System (ADS)
Miebach, Manfred P.
1998-05-01
The Hubble Space Telescope (HST), the first of NASA's Great Observatories, was launched on April 24, 1990. The HST was designed for a minimum fifteen-year mission with on-orbit servicing by the Space Shuttle System planned at approximately three-year intervals. Major changes to the HST ground system are planned to be in place for the third servicing mission in December 1999. The primary objectives of the ground system re-engineering effort, a project called 'Vision 2000 Control Center Systems (CCS)', are to reduce both development and operating costs significantly for the remaining years of HST's lifetime. Development costs will be reduced by providing a modern hardware and software architecture and utilizing commercial off-the-shelf (COTS) products wherever possible. Operating costs will be reduced by eliminating redundant legacy systems and processes and by providing an integrated ground system geared toward autonomous operation. Part of CCS is a Space Telescope Engineering Data Store, the design of which is based on current Data Warehouse technology. The purpose of this data store is to provide a common data source of telemetry data for all HST subsystems. This data store will become the engineering data archive and will include a queryable database for the user to analyze HST telemetry. Access to the engineering data in the Data Warehouse is platform-independent from an office environment using commercial standards. The latest internet technology is used to reach the HST engineering community. A Web-based user interface allows easy access to the data archives. This paper will provide a high-level overview of the CCS system and will illustrate some of the CCS telemetry capabilities. Samples of CCS user interface pages will be given. Vision 2000 is an ambitious project, but one that is well under way. It will allow the HST program to realize reduced operations costs for the Third Servicing Mission and beyond.
National High Frequency Radar Network (hfrnet) and Pacific Research Efforts
NASA Astrophysics Data System (ADS)
Hazard, L.; Terrill, E. J.; Cook, T.; de Paolo, T.; Otero, M. P.; Rogowski, P.; Schramek, T. A.
2016-12-01
The U.S. High Frequency Radar Network (HFRNet) has been in operation for over ten years with representation from 31 organizations spanning academic institutions, state and local government agencies, and private organizations. HFRNet currently holds a collection from over 130 radar installations totaling over 10 million records of surface ocean velocity measurements. HFRNet is a primary example of inter-agency and inter-institutional partnerships for improving oceanographic research and operations. HF radar derived surface currents have been used in several societal applications including coastal search and rescue, oil spill response, water quality monitoring and marine navigation. Central to the operational success of the large scale network is an efficient data management, storage, access, and delivery system. The networking of surface current mapping systems is characterized by a tiered structure that extends from the individual field installations to local regional operations maintaining multiple sites and on to centralized locations aggregating data from all regions. The data system development effort focuses on building robust data communications from remote field locations (sites) for ingestion into the data system via data on-ramps (Portals or Site Aggregators) to centralized data repositories (Nodes). Centralized surface current data enables the aggregation of national surface current grids and allows for ingestion into displays, management tools, and models. The Coastal Observing Research and Development Center has been involved in international relationships and research in the Philippines, Palau, and Vietnam. CORDC extends this IT architecture of surface current mapping data systems leveraging existing developments and furthering standardization of data services for seamless integration of higher level applications. Collaborations include the Philippine Atmospheric Geophysical and Astronomical Services Administration (PAGASA), The Coral Reef Research Foundation (CRRF), and the Center for Oceanography/Vietnamese Administration of Seas and Islands (CFO/VASI). These collaborations and data sharing improve our abilities to respond to regional, national, and global environmental and management issues.
NASA Astrophysics Data System (ADS)
Kiekebusch, Mario J.; Lucuix, Christian; Erm, Toomas M.; Chiozzi, Gianluca; Zamparelli, Michele; Kern, Lothar; Brast, Roland; Pirani, Werther; Reiss, Roland; Popovic, Dan; Knudstrup, Jens; Duchateau, Michel; Sandrock, Stefan; Di Lieto, Nicola
2014-07-01
ESO is currently in the final phase of the standardization process for PC-based Programmable Logic Controllers (PLCs) as the new platform for the development of control systems for future VLT/VLTI instruments. The standard solution used until now consists of a Local Control Unit (LCU), a VME-based system having a CPU and commercial and proprietary boards. This system includes several layers of software and many thousands of lines of code developed and maintained in house. LCUs have been used for several years as the interface to control instrument functions but now are being replaced by commercial off-the-shelf (COTS) systems based on BECKHOFF Embedded PCs and the EtherCAT fieldbus. ESO is working on the completion of the software framework that enables a seamless integration into the VLT control system in order to be ready to support upcoming instruments like ESPRESSO and ERIS, which will be the first fully VLT-compliant instruments using the new standard. The technology evaluation and standardization process has been a long and combined effort of various engineering disciplines like electronics, control and software, working together to define a solution that meets the requirements and minimizes the impact on observatory operations and maintenance. This paper presents the challenges of the standardization process and the steps involved in such a change. It provides a technical overview of how industrial standards like EtherCAT, OPC-UA, PLCopen MC and TwinCAT can be used to replace LCU features in various areas like software engineering and programming languages, motion control, time synchronization and astronomical tracking.
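To give a sense of how a higher-level application can talk to such a PLC over OPC UA, the sketch below uses one widely used open-source Python OPC UA client library. The endpoint URL and node identifier are hypothetical placeholders, not ESO's actual address space, and real VLT instrument software sits on ESO's own framework rather than a bare client like this.

```python
# Hedged sketch: reading a PLC variable over OPC UA with the open-source
# "python-opcua" client library. The endpoint URL and node id are hypothetical
# placeholders; real instrument software would use ESO's own framework on top.
from opcua import Client

client = Client("opc.tcp://plc.example.org:4840")  # hypothetical PLC endpoint
try:
    client.connect()
    # Hypothetical node id for a motor position variable exposed by the PLC program
    node = client.get_node("ns=4;s=MAIN.fbFocusMotor.lrActualPosition")
    position = node.get_value()
    print(f"Focus motor position: {position}")
finally:
    client.disconnect()
```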
ISS Interface Mechanisms and their Heritage
NASA Technical Reports Server (NTRS)
Cook, John G.; Aksamentov, Valery; Hoffman, Thomas; Bruner, Wes
2011-01-01
The International Space Station, by nurturing technological development of a variety of pressurized and unpressurized interface mechanisms, fosters "competition at the technology level". Such redundancy and diversity allow for the development and testing of mechanisms that might be used for future exploration efforts. The International Space Station, as a test-bed for exploration, has four types of pressurized interfaces between elements and six unpressurized attachment mechanisms. Lessons learned from the design, test and operations of these mechanisms will help inform the design for a new international standard pressurized docking mechanism for the NASA Docking System. This paper will examine the attachment mechanisms on the ISS and their attributes. It will also look ahead at the new NASA docking system and trace its lineage to heritage mechanisms.
Donato, David I.; Shapiro, Jason L.
2016-12-13
An effort to build a unified collection of geospatial data for use in land-change modeling (LCM) led to new insights into the requirements and challenges of building an LCM data infrastructure. A case study of data compilation and unification for the Richmond, Va., Metropolitan Statistical Area (MSA) delineated the problems of combining and unifying heterogeneous data from many independent localities such as counties and cities. The study also produced conclusions and recommendations for use by the national LCM community, emphasizing the critical need for simple, practical data standards and conventions for use by localities. This report contributes an uncopyrighted core glossary and a much-needed operational definition of data unification.
NASA Astrophysics Data System (ADS)
Lee, C. M.
2016-02-01
The NASA Applied Sciences Program plays a unique role in facilitating access to remote sensing-based water information derived from US federal assets towards the goal of improving science and evidence-based decision-making in water resources management. The Water Resources Application Area within NASA Applied Sciences works specifically to develop and improve water data products to support improved management of water resources, with partners who are faced with real-world constraints and conditions including cost and regulatory standards. This poster will highlight the efforts and collaborations enabled by this program that have resulted in integration of remote sensing-based information for water quality modeling and monitoring within an operational context.
NASA Astrophysics Data System (ADS)
Lee, C. M.
2016-12-01
The NASA Applied Sciences Program plays a unique role in facilitating access to remote sensing-based water information derived from US federal assets towards the goal of improving science and evidence-based decision-making in water resources management. The Water Resources Application Area within NASA Applied Sciences works specifically to develop and improve water data products to support improved management of water resources, with partners who are faced with real-world constraints and conditions including cost and regulatory standards. This poster will highlight the efforts and collaborations enabled by this program that have resulted in integration of remote sensing-based information for water quality modeling and monitoring within an operational context.
Control centers design for ergonomics and safety.
Quintana, Leonardo; Lizarazo, Cesar; Bernal, Oscar; Cordoba, Jorge; Arias, Claudia; Monroy, Magda; Cotrino, Carlos; Montoya, Olga
2012-01-01
This paper describes general design conditions for ergonomics and safety in control centers in the petrochemical process industry. Topics include guidelines for optimized workstation design, control room layout, building layout, and lighting, acoustical and environmental design. It also takes into account safety parameters in the design of control rooms and centers. The conditions and parameters presented come from standards and from the most recent publications reflecting global advances on this topic. The work was also supplemented by our team's field visits to control center operations at a petrochemical company and by technical literature searches. This guideline will be useful for increasing productivity and improving working conditions in control rooms.
Enhancing emotional-based target prediction
NASA Astrophysics Data System (ADS)
Gosnell, Michael; Woodley, Robert
2008-04-01
This work extends existing agent-based target movement prediction to include key ideas of behavioral inertia, steady states, and catastrophic change from existing psychological, sociological, and mathematical work. Existing target prediction work inherently assumes a single steady state for target behavior, and attempts to classify behavior based on a single emotional state set. The enhanced, emotional-based target prediction maintains up to three distinct steady states, or typical behaviors, based on a target's operating conditions and observed behaviors. Each steady state has an associated behavioral inertia, similar to the standard deviation of behaviors within that state. The enhanced prediction framework also allows steady state transitions through catastrophic change and individual steady states could be used in an offline analysis with additional modeling efforts to better predict anticipated target reactions.
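One way to read the steady-state and behavioral-inertia idea (this is an interpretive sketch, not the authors' implementation) is as a small set of per-state behavior distributions: an observation is assigned to the steady state whose typical behavior it is closest to in units of that state's inertia (standard deviation), and an observation far from every state is flagged as a candidate catastrophic change.

```python
# Interpretive sketch of multi-steady-state target behavior classification,
# not the authors' algorithm. Each steady state has a typical behavior value
# (mean) and a behavioral inertia (standard deviation); observations beyond a
# z-score threshold from every state suggest a catastrophic change.
STATES = {
    "patrol": {"mean": 10.0, "inertia": 2.0},   # assumed behavior-metric values
    "evade":  {"mean": 25.0, "inertia": 5.0},
    "attack": {"mean": 45.0, "inertia": 3.0},
}
CATASTROPHE_Z = 4.0   # assumed threshold, in units of state inertia

def classify(observation):
    z_scores = {
        name: abs(observation - s["mean"]) / s["inertia"] for name, s in STATES.items()
    }
    best_state = min(z_scores, key=z_scores.get)
    if z_scores[best_state] > CATASTROPHE_Z:
        return "catastrophic_change", z_scores[best_state]
    return best_state, z_scores[best_state]

for obs in (11.0, 27.5, 70.0):
    state, z = classify(obs)
    print(f"obs={obs}: state={state} (z={z:.1f})")
```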
Safety Issues with Hydrogen as a Vehicle Fuel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cadwallader, Lee Charles; Herring, James Stephen
1999-10-01
This report is an initial effort to identify and evaluate safety issues associated with the use of hydrogen as a vehicle fuel in automobiles. Several forms of hydrogen have been considered: gas, liquid, slush, and hydrides. The safety issues have been discussed, beginning with properties of hydrogen and the phenomenology of hydrogen combustion. Safety-related operating experiences with hydrogen vehicles have been summarized to identify concerns that must be addressed in future design activities and to support probabilistic risk assessment. Also, applicable codes, standards, and regulations pertaining to hydrogen usage and refueling have been identified and are briefly discussed. This report serves as a safety foundation for any future hydrogen safety work, such as a safety analysis or a probabilistic risk assessment.
Safety Issues with Hydrogen as a Vehicle Fuel
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. C. Cadwallader; J. S. Herring
1999-09-01
This report is an initial effort to identify and evaluate safety issues associated with the use of hydrogen as a vehicle fuel in automobiles. Several forms of hydrogen have been considered: gas, liquid, slush, and hydrides. The safety issues have been discussed, beginning with properties of hydrogen and the phenomenology of hydrogen combustion. Safety-related operating experiences with hydrogen vehicles have been summarized to identify concerns that must be addressed in future design activities and to support probabilistic risk assessment. Also, applicable codes, standards, and regulations pertaining to hydrogen usage and refueling have been identified and are briefly discussed. This report serves as a safety foundation for any future hydrogen safety work, such as a safety analysis or a probabilistic risk assessment.
Coordination of Ocean Management: A Perspective on the Gulf of Maine,
1982-11-01
...transportation modes. The U.S. Coast Guard operates two OMEGA stations; the remaining six are operated by host countries under international agreement. ... physical ocean, but that framework falls short in its efforts to guide and coordinate the activities of ocean users. Much of the remaining effort in this...
40 CFR 190.10 - Standards for normal operations.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 26 ... Standards for normal operations. 190.10 Section 190.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) RADIATION PROTECTION PROGRAMS ENVIRONMENTAL RADIATION PROTECTION STANDARDS FOR NUCLEAR POWER OPERATIONS Environmental Standards...
Strong, Vivian E.; Selby, Luke V.; Sovel, Mindy; Disa, Joseph J.; Hoskins, William; DeMatteo, Ronald; Scardino, Peter; Jaques, David P.
2015-01-01
Background: Studying surgical secondary events is an evolving effort with no current established system for database design, standard reporting, or definitions. Using the Clavien-Dindo classification as a guide, in 2001 we developed a Surgical Secondary Events database, based on grade of event and required intervention, to begin prospectively recording and analyzing all surgical secondary events (SSE). Study Design: Events are prospectively entered into the database by attending surgeons, house staff, and research staff. In 2008 we performed a blinded external audit of 1,498 randomly selected operations to examine the quality and reliability of the data. Results: 1,498 of 4,284 operations performed during the 3rd quarter of 2008 were audited. 79% (N=1,180) of the operations did not have a secondary event, while 21% (N=318) had an identified event. 91% (1,365) of operations were correctly entered into the SSE database. 97% (129/133) of missed secondary events were Grade I or II. Three Grade III (2%) and one Grade IV (1%) secondary events were missed. There were no missed Grade V secondary events. Conclusion: Grade III-IV events are more accurately collected than Grade I-II events. Robust and accurate secondary events data can be collected by clinicians and research staff, and these data can safely be used for quality improvement projects and research. PMID:25319579
NASA Technical Reports Server (NTRS)
Chan, Gordon C.; Turner, Horace Q.
1990-01-01
COSMIC/NASTRAN, as it is supported and maintained by COSMIC, runs on four mainframe computers: CDC, VAX, IBM and UNIVAC. COSMIC/NASTRAN on other computers, such as CRAY, AMDAHL, PRIME, CONVEX, etc., is available commercially from a number of third-party organizations. All these computers, with their own one-of-a-kind operating systems, make NASTRAN machine dependent. The job control language (JCL), the file management, and the program execution procedures of these computers are vastly different, although 95 percent of the NASTRAN source code was written in standard ANSI FORTRAN 77. The advantage of the UNIX operating system is that it has no machine boundary. UNIX is becoming widely used in many workstations, minis, super-PCs, and even some mainframe computers. NASTRAN for the UNIX operating system is definitely the way to go in the future, and it makes NASTRAN available to a host of computers, big and small. Since 1985, many NASTRAN improvements and enhancements were made to conform to the ANSI FORTRAN 77 standards. A major UNIX migration effort was incorporated into the COSMIC NASTRAN 1990 release. As pioneering work for the UNIX environment, a version of COSMIC 89 NASTRAN was officially released in October 1989 for the DEC ULTRIX VAXstation 3100 (with VMS extensions). A COSMIC 90 NASTRAN version for the DEC ULTRIX DECstation 3100 (with RISC) is planned for an April 1990 release. Both workstations are UNIX-based computers. The COSMIC 90 NASTRAN will be made available on a TK50 tape for the DEC ULTRIX workstations. Previously, in 1988, an 88 NASTRAN version was tested successfully on a Silicon Graphics workstation.
Intelligent Command and Control Systems for Satellite Ground Operations
NASA Technical Reports Server (NTRS)
Mitchell, Christine M.
1999-01-01
This grant, Intelligent Command and Control Systems for Satellite Ground Operations, funded by NASA Goddard Space Flight Center, has spanned almost a decade. During this time, it has supported a broad range of research addressing the changing needs of NASA operations. It is important to note that many of NASA's evolving needs, for example, use of automation to drastically reduce (e.g., 70%) operations costs, are similar to requirements in both the government and private sectors. Initially the research addressed the appropriate use of emerging and inexpensive computational technologies, such as X Windows, graphics, and color, together with COTS (commercial-off-the-shelf) hardware and software such as standard Unix workstations, to re-engineer satellite operations centers. The first phase of research supported by this grant explored the development of principled design methodologies to make effective use of emerging and inexpensive technologies. The ultimate performance measures for new designs were whether or not they increased system effectiveness while decreasing costs. GT-MOCA (The Georgia Tech Mission Operations Cooperative Associate) and GT-VITA (Georgia Tech Visual and Inspectable Tutor and Assistant), whose latter stages were supported by this research, explored model-based design of collaborative operations teams and the design of intelligent tutoring systems, respectively. Implemented in proof-of-concept form for satellite operations, empirical evaluations of both, using satellite operators for the former and personnel involved in satellite control operations for the latter, demonstrated unequivocally the feasibility and effectiveness of the proposed modeling and design strategy underlying both research efforts. The proof-of-concept implementation of GT-MOCA showed that the methodology could specify software requirements that enabled a human-computer operations team to perform without any significant performance differences from the standard two-person satellite operations team. GT-VITA, using the same underlying methodology, the operator function model (OFM), and its computational implementation, OFMspert, successfully taught satellite control knowledge required by flight operations team members. The tutor structured knowledge in three ways: declarative knowledge (e.g., What is this? What does it do?), procedural knowledge, and operational skill. Operational skill is essential in real-time operations. It combines the two former knowledge types, assisting a student to use them effectively in a dynamic, multi-tasking, real-time operations environment. A high-fidelity simulator of the operator interface to the ground control system, including an almost full replication of both the human-computer interface and human interaction with the dynamic system, was used in the GT-MOCA and GT-VITA evaluations. The GT-VITA empirical evaluation, conducted with a range of 'novices' that included GSFC operations management, GSFC operations software developers, and new flight operations team members, demonstrated that GT-VITA effectively taught a wide range of knowledge in a succinct and engaging manner.
Measured energy savings and performance of power-managed personal computers and monitors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nordman, B.; Piette, M.A.; Kinney, K.
1996-08-01
Personal computers and monitors are estimated to use 14 billion kWh/year of electricity, with power management potentially saving $600 million/year by the year 2000. The effort to capture these savings is led by the US Environmental Protection Agency's Energy Star program, which specifies a 30 W maximum demand for the computer and for the monitor when in a 'sleep' or idle mode. In this paper the authors discuss measured energy use and estimated savings for power-managed (Energy Star compliant) PCs and monitors. They collected electricity use measurements of six power-managed PCs and monitors in their office and five from two other research projects. The devices are diverse in machine type, use patterns, and context. The analysis method estimates the time spent in each system operating mode (off, low-, and full-power) and combines these with real power measurements to derive hours of use per mode, energy use, and energy savings. Three schedules are explored in the 'As-operated', 'Standardized', and 'Maximum' savings estimates. Energy savings are established by comparing the measurements to a baseline with power management disabled. As-operated energy savings for the eleven PCs and monitors ranged from zero to 75 kWh/year. Under the standard operating schedule (on 20% of nights and weekends), the savings are about 200 kWh/year. An audit of power management features and configurations for several dozen Energy Star machines found only 11% of CPUs fully enabled, while about two-thirds of monitors were successfully power managed. The highest priority for greater power management savings is to enable monitors, as opposed to CPUs, since they are generally easier to configure, less likely to interfere with system operation, and offer greater savings. The difficulty of properly configuring PCs and monitors is the largest current barrier to achieving the savings potential from power management.
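The mode-time analysis described above amounts to a simple weighted sum: hours spent in each power mode multiplied by that mode's measured draw, compared against a baseline with power management disabled. The sketch below illustrates the arithmetic with invented, round-number power draws and schedules; the figures are placeholders, not the paper's measurements.

```python
# Mode-time energy estimate (placeholder numbers, not the paper's measurements):
# annual kWh = sum over modes of (measured watts x hours per year in that mode).
WEEK_HOURS = 7 * 24

def annual_kwh(power_w, hours_per_week):
    """Annual energy given per-mode power draw (W) and weekly hours in each mode."""
    weeks_per_year = 52
    return sum(power_w[m] * hours_per_week[m] for m in power_w) * weeks_per_year / 1000.0

power_w = {"full": 70.0, "low": 15.0, "off": 2.0}    # assumed PC + monitor draws

# Power-managed schedule: 40 h/week at full power, low power on nights/weekends
managed  = {"full": 40.0, "low": 88.0, "off": WEEK_HOURS - 128.0}
# Baseline: same on-hours, but the machine never enters the low-power mode
baseline = {"full": 128.0, "low": 0.0, "off": WEEK_HOURS - 128.0}

savings = annual_kwh(power_w, baseline) - annual_kwh(power_w, managed)
print(f"Estimated savings: {savings:.0f} kWh/year")
```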
Rationale and operational plan to upgrade the U.S. gravity database
Hildenbrand, Thomas G.; Briesacher, Allen; Flanagan, Guy; Hinze, William J.; Hittelman, A.M.; Keller, Gordon R.; Kucks, R.P.; Plouff, Donald; Roest, Walter; Seeley, John; Stith, David A.; Webring, Mike
2002-01-01
A concerted effort is underway to prepare a substantially upgraded digital gravity anomaly database for the United States and to make this data set and associated usage tools available on the internet. This joint effort, spearheaded by the geophysics groups at the National Imagery and Mapping Agency (NIMA), University of Texas at El Paso (UTEP), U.S. Geological Survey (USGS), and National Oceanic and Atmospheric Administration (NOAA), is an outgrowth of the new geoscientific community initiative called Geoinformatics (www.geoinformaticsnetwork.org). This dominantly geospatial initiative reflects the realization by Earth scientists that existing information systems and techniques are inadequate to address the many complex scientific and societal issues. Currently, inadequate standardization and chaotic distribution of geoscience data, inadequate accompanying documentation, and the lack of easy-to-use access tools and computer codes for analysis are major obstacles for scientists, government agencies, and educators. An example of the type of activities envisioned, within the context of Geoinformatics, is the construction, maintenance, and growth of a public domain gravity database and development of the software tools needed to access, implement, and expand it. This product is far more than a high quality database; it is a complete data system for a specific type of geophysical measurement that includes, for example, tools to manipulate the data and tutorials to understand and properly utilize the data. On August 9, 2002, twenty-one scientists from the federal, private and academic sectors met at a workshop to discuss the rationale for upgrading both the United States and North American gravity databases (including offshore regions) and, more importantly, to begin developing an operational plan to effectively create a new gravity data system. We encourage anyone interested in contributing data or participating in this effort to contact G.R. Keller or T.G. Hildenbrand. This workshop was the first step in building a web-based data system for sharing quality gravity data and methodology, and it builds on existing collaborative efforts. This compilation effort will result in significant additions to and major refinement of the U.S. database that is currently released publicly by NOAA’s National Geophysical Data Center and will also include an additional objective to substantially upgrade the North American database, released over 15 years ago (Committee for the Gravity Anomaly Map of North America, 1987).
New Science Standards: A Readiness Assessment for State Boards of Education
ERIC Educational Resources Information Center
Center on Great Teachers and Leaders, 2015
2015-01-01
State implementation of new standards (revising, adapting, and adopting new standards) is a significant undertaking. Such an effort at the state level requires extensive support, planning, and resources. Implementing new state standards also requires the support of multiple education leaders--governors, legislators, state departments of education,…
Okeke, Claudia C; Allen, Loyd V
2009-01-01
The standard operating procedures suggested in this article are presented to compounding pharmacies to ensure the quality of the environment in which a CSP is prepared. Since United States Pharmacopeia Chapter 797 provides minimum standards, each facility should aim for a best-practice gold standard. The standard operating procedures should be tailored to meet the expectations and design of each facility. Compounding personnel are expected to know and understand each standard operating procedure to allow for complete execution of the procedures.
A helium-3/helium-4 dilution cryocooler for operation in zero gravity
NASA Technical Reports Server (NTRS)
Hendricks, John B.
1988-01-01
This research effort covered the development of He-3/He-4 dilution cryocooler cycles for use in zero gravity. The dilution cryocooler is currently the method of choice for producing temperatures below 0.3 Kelvin in the laboratory. However, current dilution cryocoolers depend on gravity for their operation, so some modification is required for zero-gravity operation. In this effort, we have demonstrated, by analysis, that the zero-gravity dilution cryocooler is feasible. We have developed a cycle that uses He-3 circulation, and an alternate cycle that uses superfluid He-4 circulation. The key elements of both cycles were demonstrated experimentally. The development of a true 'zero-gravity' dilution cryocooler is now possible, and should be undertaken in a follow-on effort.
Clearing a Path: The 16-Bit Operating System Jungle Offers Confusion, Not Standardization.
ERIC Educational Resources Information Center
Pournelle, Jerry
1984-01-01
Discusses the design and limited uses of the Pascal, MS-DOS, CP/M, and PC-DOS operating systems as standard operating systems for 16-bit microprocessors, especially with the more sophisticated microcomputers currently being developed. Advantages and disadvantages of Unix--a multitasking, multiuser operating system--as a standard operating system…
Hackley, Paul C.
2014-01-01
Vitrinite reflectance generally is considered the most robust thermal maturity parameter available for application to hydrocarbon exploration and petroleum system evaluation. However, until 2011 there was no standardized methodology available to provide guidelines for vitrinite reflectance measurements in shale. Efforts to correct this deficiency resulted in publication of ASTM D7708-11: Standard test method for microscopical determination of the reflectance of vitrinite dispersed in sedimentary rocks. In 2012-2013, an interlaboratory exercise was conducted to establish precision limits for the measurement technique. Six samples, representing a wide variety of shale, were tested in duplicate by 28 analysts in 22 laboratories from 14 countries. Samples ranged from immature to overmature (Ro 0.31-1.53%), from organic-rich to organic-lean (1-22 wt.% total organic carbon), and contained Type I (lacustrine), Type II (marine), and Type III (terrestrial) kerogens. Repeatability values (difference between repetitive results from same operator, same conditions) ranged from 0.03-0.11% absolute reflectance, whereas reproducibility values (difference between results obtained on same test material by different operators, different laboratories) ranged from 0.12-0.54% absolute reflectance. Repeatability and reproducibility degraded consistently with increasing maturity and decreasing organic content. However, samples with terrestrial kerogens (Type III) fell off this trend, showing improved levels of reproducibility due to higher vitrinite content and improved ease of identification. Operators did not consistently meet the reporting requirements of the test method, indicating that a common reporting template is required to improve data quality. The most difficult problem encountered was the petrographic distinction of solid bitumens and low-reflecting inert macerals from vitrinite when vitrinite occurred with reflectance ranges overlapping the other components. Discussion among participants suggested this problem could not be corrected via kerogen concentration or solvent extraction and is related to operator training and background. Poor reproducibility (0.54% absolute reflectance, related to increased anisotropy?) in the highest maturity sample (Ro 1.53%) suggests that vitrinite reflectance is not a highly reliable parameter in such rocks. Future work will investigate opportunities to improve reproducibility in similar high maturity, organic-lean shale varieties.
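For readers unfamiliar with how repeatability and reproducibility limits of this kind are derived, the sketch below follows the common ISO 5725 / ASTM E691 convention: pooled within-laboratory variance for repeatability, within- plus between-laboratory variance for reproducibility, each converted to a limit for the difference between two results via the factor 2.8. The duplicate reflectance values are invented placeholders, not data from the interlaboratory exercise.

```python
# Simplified repeatability (r) and reproducibility (R) limits from duplicate
# determinations per laboratory, following the usual ISO 5725 / ASTM E691 idea.
# The reflectance values below are placeholders, not the exercise's data.
from statistics import mean, variance

# lab -> two replicate vitrinite reflectance determinations (%Ro), hypothetical
duplicates = {
    "lab_A": (0.82, 0.85),
    "lab_B": (0.79, 0.81),
    "lab_C": (0.88, 0.84),
    "lab_D": (0.80, 0.83),
}

n_rep = 2
# Pooled within-lab (repeatability) variance: for duplicates, var = (x1 - x2)^2 / 2
s_r2 = mean([(a - b) ** 2 / 2 for a, b in duplicates.values()])
# Between-lab variance from the lab means, corrected for within-lab scatter
lab_means = [mean(pair) for pair in duplicates.values()]
s_L2 = max(variance(lab_means) - s_r2 / n_rep, 0.0)
s_R2 = s_r2 + s_L2           # reproducibility variance

r_limit = 2.8 * s_r2 ** 0.5  # repeatability limit (same operator, same lab)
R_limit = 2.8 * s_R2 ** 0.5  # reproducibility limit (different labs)
print(f"r = {r_limit:.3f} %Ro, R = {R_limit:.3f} %Ro")
```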
The Department of Defense Human factors standardization program.
Chaikin, G
1984-09-01
The Department of Defense (DoD) Human Factors Standardization Program is the most far-reaching standardization program in the USA. It is an integrated component of the overall DoD Standardization Program. While only ten major documents are contained in the human factors standardization area, their effects on human factors engineering programs are profound and wide-ranging. Preparation and updating of the human engineering standardization documents have grown out of the efforts of several military agencies, contractors, consultants, universities, and individuals. New documents, engineering practice studies, and revision efforts are continuously planned by the Tri-Service (Army, Navy, Air Force) Human Factors Standardization Steering Committee in collaboration with industry groups and technical societies. The present five-year plan and other standardization documents are readily available for review and input by anyone with relevant interests. Human factors specialists and other readers of this journal may therefore influence the direction of the human factors standardization program and the content of its military specifications, standards, and handbooks.
40 CFR 63.548 - Monitoring requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
...) You must prepare, and at all times operate according to, a standard operating procedures manual that...) You must submit the standard operating procedures manual for baghouses required by paragraph (a) of... that you specify in the standard operating procedures manual for inspections and routine maintenance...
40 CFR 63.548 - Monitoring requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
...) You must prepare, and at all times operate according to, a standard operating procedures manual that...) You must submit the standard operating procedures manual for baghouses required by paragraph (a) of... that you specify in the standard operating procedures manual for inspections and routine maintenance...
40 CFR 63.548 - Monitoring requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
...) You must prepare, and at all times operate according to, a standard operating procedures manual that...) You must submit the standard operating procedures manual for baghouses required by paragraph (a) of... that you specify in the standard operating procedures manual for inspections and routine maintenance...
Jayaram, Varalakshmi; Agrawal, Harshit; Welch, William A; Miller, J Wayne; Cocker, David R
2011-03-15
Emissions from harbor craft significantly affect air quality in populated regions near ports and inland waterways. This research measured regulated and unregulated emissions from an in-use EPA Tier 2 marine propulsion engine on a ferry operating in a bay, following standard methods. A special effort was made to continuously monitor both total particulate matter (PM) mass emissions and the real-time particle size distribution (PSD). The engine was operated following the loads in the ISO 8178-4 E3 cycle for comparison with the certification standards and across biodiesel blends. Real-time measurements were also made during a typical cruise in the bay. Results showed the in-use nitrogen oxide (NOx) and PM2.5 emission factors were within the not-to-exceed standard for Tier 2 marine engines. Comparing across fuels, we observed the following: a) no statistically significant change in NOx emissions with biodiesel blends (B20, B50); b) ~16% and ~25% reductions in PM2.5 mass emissions with B20 and B50, respectively; c) a larger organic carbon (OC) to elemental carbon (EC) ratio and organic mass (OM) to OC ratio with B50 compared with B20 and B0; d) a significant number of ultrafine nuclei and a smaller mass mean diameter with increasing biodiesel blend levels. The real-time monitoring of gaseous and particulate emissions during a typical cruise in the San Francisco Bay (in-use cycle) revealed important effects of ocean/bay currents on emissions: NOx and CO2 increased 3-fold, PM2.5 mass increased 6-fold, and ultrafine particles disappeared due to the effect of bay currents. This finding has implications for the use of certification values instead of actual in-use emission values when developing inventories. Emission factors for some volatile organic compounds (VOCs), carbonyls, and polycyclic aromatic hydrocarbons (PAHs) are reported as supplemental data.
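As a rough illustration of how a cycle-weighted emission factor is formed from modal measurements for comparison against a certification standard, the sketch below applies the ISO 8178-4 E3 weighting factors (0.20, 0.50, 0.15, 0.15 at 100/75/50/25% load) to hypothetical modal mass rates and powers; none of the numbers are from the study.

from dataclasses import dataclass

@dataclass
class Mode:
    power_kw: float       # engine power at the mode point
    mass_rate_g_h: float  # measured pollutant mass rate (e.g., NOx) at that mode
    weight: float         # ISO 8178-4 E3 weighting factor

modes = [
    Mode(power_kw=400.0, mass_rate_g_h=2800.0, weight=0.20),  # 100% load
    Mode(power_kw=300.0, mass_rate_g_h=2100.0, weight=0.50),  #  75% load
    Mode(power_kw=200.0, mass_rate_g_h=1500.0, weight=0.15),  #  50% load
    Mode(power_kw=100.0, mass_rate_g_h=900.0,  weight=0.15),  #  25% load
]

# Cycle-weighted emission factor in g/kWh: sum(w_i * m_i) / sum(w_i * P_i)
ef = sum(m.weight * m.mass_rate_g_h for m in modes) / sum(m.weight * m.power_kw for m in modes)
print(f"cycle-weighted emission factor: {ef:.2f} g/kWh")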
Global Standards for Enhancing Quality in Online Learning
ERIC Educational Resources Information Center
Martin, Florence; Polly, Drew; Jokiaho, Annika; May, Birgit
2017-01-01
The quality of online courses offered has been a topic of discussion in the recent years, and efforts have been taken to establish standards for developing online courses. In this study, the authors review 12 online learning standard documents and examine the standards included in each of these documents. The largest number of standards were in…
Thailand mental health country profile.
Siriwanarangsan, Porntep; Liknapichitkul, Dusit; Khandelwal, Sudhir K
2004-01-01
Thailand, a constitutional monarchy, has undergone a rapid shift in its demography and economy in the last two decades. This has put a great burden on the country's health services, including mental health care. The current emphasis of the Ministry of Public Health is to change its role from health care provider to policymaker and regulator of standards, and to provide technical support to health facilities under its jurisdiction as well as in the private sector. The Department of Mental Health, established in 1994, has laid down a mental health policy that aims to promote mental health care within the community with the help of people's participation in health programmes. Focus has been placed on developing suitable and efficient technology by seeking cooperation both within and outside the Ministry of Public Health. Consequently, the Department of Mental Health has been receiving increasing budgetary allocations. Since there is a paucity of trained manpower, emphasis is being laid on the use of general health care services for mental health care. Some of the specific interventions are community services, prison services, psychiatric rehabilitation, and the use of media in mental health operations. There have been active efforts towards international cooperation for developing technologies for specific programmes. Private and non-governmental organizations are supported and encouraged to provide mental health care to the marginalized sections of society. The Department of Mental Health has also made efforts to inspect and raise the efficiency of its operations so as to deliver quality services.
The Gap in Standards for Special Libraries.
ERIC Educational Resources Information Center
Dodd, James Beaupre
1982-01-01
The issue of standards for special libraries is discussed, highlighting surveys conducted concerning the diversity of special libraries and salaries of members of the Special Libraries Association (SLA). Efforts of SLA's Standards and Statistics Committee are noted. Twenty references are listed. (EJS)
Standardized reporting using CODES (Crash Outcome Data Evaluation System)
DOT National Transportation Integrated Search
1999-12-01
While CODES projects have expanded to 25 states, there is no standardized reporting of the outcome measures that are available with linked data. This paper describes our efforts to build a standard format for reporting these outcomes. This format is ...
Towards an Entropy Stable Spectral Element Framework for Computational Fluid Dynamics
NASA Technical Reports Server (NTRS)
Carpenter, Mark H.; Parsani, Matteo; Fisher, Travis C.; Nielsen, Eric J.
2016-01-01
Entropy stable (SS) discontinuous spectral collocation formulations of any order are developed for the compressible Navier-Stokes equations on hexahedral elements. Recent progress on two complementary efforts is presented. The first effort is a generalization of previous SS spectral collocation work to extend the applicable set of points from tensor-product Legendre-Gauss-Lobatto (LGL) to tensor-product Legendre-Gauss (LG) points. The LG and LGL point formulations are compared on a series of test problems. Although more costly to implement, the LG operators are shown to be significantly more accurate on comparable grids. Both the LGL and LG operators are of comparable efficiency and robustness, as is demonstrated using test problems for which conventional FEM techniques suffer instability. The second effort generalizes previous SS work to include the possibility of p-refinement at non-conforming interfaces. A generalization of existing entropy stability machinery is developed to accommodate the nuances of fully multi-dimensional summation-by-parts (SBP) operators. The entropy stability of the compressible Euler equations on non-conforming interfaces is demonstrated using the newly developed LG operators and multi-dimensional interface interpolation operators.
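To make the summation-by-parts machinery concrete, the sketch below (not from the paper) builds the Legendre-Gauss-Lobatto collocation nodes, quadrature weights, and nodal differentiation matrix on one element and verifies the diagonal-norm SBP identity H D + D^T H = B that entropy-stable collocation schemes rely on. Extending the check to Legendre-Gauss points requires the generalized SBP boundary operators discussed in the abstract and is not shown.

import numpy as np
from numpy.polynomial.legendre import Legendre

def lgl_nodes_weights(N):
    """LGL nodes (N+1 points on [-1, 1]) and quadrature weights."""
    PN = Legendre.basis(N)
    x = np.concatenate(([-1.0], np.sort(PN.deriv().roots().real), [1.0]))  # endpoints + roots of P_N'
    w = 2.0 / (N * (N + 1) * PN(x) ** 2)                                   # standard LGL weights
    return x, w

def lagrange_diff_matrix(x):
    """Differentiation matrix for the Lagrange basis at nodes x (barycentric form)."""
    n = len(x)
    b = np.array([1.0 / np.prod(x[j] - np.delete(x, j)) for j in range(n)])
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                D[i, j] = (b[j] / b[i]) / (x[i] - x[j])
        D[i, i] = -np.sum(D[i, :])        # each row of D annihilates constants
    return D

N = 7
x, w = lgl_nodes_weights(N)
D = lagrange_diff_matrix(x)
H = np.diag(w)                                                   # diagonal norm (mass) matrix
B = np.zeros((N + 1, N + 1)); B[0, 0] = -1.0; B[-1, -1] = 1.0    # boundary operator

# SBP property: discrete integration by parts holds exactly on LGL points.
assert np.allclose(H @ D + D.T @ H, B)
# D is also exact for polynomials up to degree N, e.g. d/dx of x^3:
assert np.allclose(D @ x**3, 3 * x**2)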
The National Science Education Standards.
ERIC Educational Resources Information Center
Bybee, Rodger W.; Champagne, Audrey B.
2000-01-01
Describes efforts under the sponsorship of the National Research Council (NRC) to improve science education. Provides an overview of the National Science Education Standards. First published in 1995. (YDS)
Naessens, James M; Van Such, Monica B; Nesse, Robert E; Dilling, James A; Swensen, Stephen J; Thompson, Kristine M; Orlowski, Janis M; Santrach, Paula J
2017-07-01
The majority of quality measures used to assess providers and hospitals are based on easily obtained data, focused on a few dimensions of quality, and developed mainly for primary/community care and population health. While this approach supports efforts focused on addressing the triple aim of health care, many current quality report cards and assessments do not reflect the breadth or complexity of many referral center practices. In this article, the authors highlight the differences between population health efforts and referral care and address issues related to value measurement and performance assessment. They discuss why measures may need to differ across the three levels of care (primary/community care, secondary care, complex care) and illustrate the need for further risk adjustment to eliminate referral bias. With continued movement toward value-based purchasing, performance measures and reimbursement schemes need to reflect the increased level of intensity required to provide complex care. The authors propose a framework to operationalize value measurement and payment for specialty care, and they make specific recommendations to improve performance measurement for complex patients. Implementing such a framework to differentiate performance measures by level of care involves coordinated efforts to change both policy and operational platforms. An essential component of this framework is a new model that defines the characteristics of patients who require complex care and standardizes metrics that incorporate those definitions.
36 CFR 9.41 - Operating standards.
Code of Federal Regulations, 2011 CFR
2011-07-01
... MINERALS MANAGEMENT Non-Federal Oil and Gas Rights § 9.41 Operating standards. The following standards... destroyed, obliterated, or damaged by such operations. (c) Whenever drilling or producing operations are... in a safe and workmanlike manner, having due regard for the preservation of the environment of the...
36 CFR 9.41 - Operating standards.
Code of Federal Regulations, 2013 CFR
2013-07-01
... MINERALS MANAGEMENT Non-Federal Oil and Gas Rights § 9.41 Operating standards. The following standards... destroyed, obliterated, or damaged by such operations. (c) Whenever drilling or producing operations are... in a safe and workmanlike manner, having due regard for the preservation of the environment of the...
40 CFR 264.1201 - Design and operating standards.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 26 2014-07-01 2014-07-01 false Design and operating standards. 264... FACILITIES Hazardous Waste Munitions and Explosives Storage § 264.1201 Design and operating standards. (a... Operating Procedure specifying procedures to ensure safety, security, and environmental protection. If these...
40 CFR 264.1201 - Design and operating standards.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 27 2012-07-01 2012-07-01 false Design and operating standards. 264... FACILITIES Hazardous Waste Munitions and Explosives Storage § 264.1201 Design and operating standards. (a... Operating Procedure specifying procedures to ensure safety, security, and environmental protection. If these...
40 CFR 264.1201 - Design and operating standards.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 27 2013-07-01 2013-07-01 false Design and operating standards. 264... FACILITIES Hazardous Waste Munitions and Explosives Storage § 264.1201 Design and operating standards. (a... Operating Procedure specifying procedures to ensure safety, security, and environmental protection. If these...
40 CFR 264.1201 - Design and operating standards.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 26 2011-07-01 2011-07-01 false Design and operating standards. 264... FACILITIES Hazardous Waste Munitions and Explosives Storage § 264.1201 Design and operating standards. (a... Operating Procedure specifying procedures to ensure safety, security, and environmental protection. If these...
ERIC Educational Resources Information Center
Morris, Amanda Sheffield; Age, Tolonda Ricard
2009-01-01
This study examined coping, effortful control, and mental health among 65 youth (ages 9-15) residing in families where at least one parent was serving in the United States military. Parents provided basic demographic and deployment information. Youth reported on their coping, effortful control, and adjustment using standardized self-report…
Research ethics committee auditing: the experience of a university hospital.
Marchetti, Daniela; Spagnolo, Angelico; Cicerone, Marina; Cascini, Fidelia; La Monaca, Giuseppe; Spagnolo, Antonio G
2013-09-01
The authors report the first Italian experience of a research ethics committee (REC) audit focused on evaluating the REC's compliance with standard operating procedures and with requirements on insurance coverage, informed consent, protection of privacy and confidentiality, predictable risks/harms, selection of subjects, withdrawal criteria, and other issues such as advertisement details and justification of placebo. The internal audit was conducted over a two-year period (March 2009-February 2011) divided into quarters, both to better evaluate the influence of the new insurance coverage regulation that came into effect in March 2010 (Ministerial Decree of 14 July 2009) and to extend the audit to the other requirements for safeguarding participants in clinical drug trials, including critical items such as information and consent and the risk-to-benefit ratio. Out of a total of 639 REC opinions and research studies, 316 were reviewed. Regarding the insurance policy requirements, Auditor/REC non-compliance occurred in only one case. The highest number of Auditor/REC non-compliances concerned information and consent, which should have led to a suspended decision rather than a favorable opinion. This internal audit shows the importance and the difficulty of the review process. For this reason, specific courses will be provided for members of the research ethics committee and for those who aspire to become auditors. Efforts may also be made to improve the standard operating procedures already in place.
Holding Area LINQ Trial (HALT).
Lee, John J; Weitz, Daniel; Anand, Rishi
Recent studies have shown that insertable cardiac monitors (ICMs) can be implanted outside the traditional hospital setting, and efforts are being made to explore the feasibility of implanting these devices in a specific standardized location other than the operating room or a cardiac catheterization/electrophysiology lab. This was a prospective, non-randomized, single-center post-market clinical trial designed to take place in the holding area of a hospital operating room or cardiac catheterization/electrophysiology laboratory. The Medtronic Reveal LINQ ICM was implanted and patients were followed for 90 days post implant. This study was designed to observe any procedure-related adverse events stemming from the holding-area implantation. Twenty patients were implanted at our hospital in a holding room not traditionally associated with the electrophysiology, cardiac catheterization, or operating labs. One patient was lost to the 90-day follow-up. In one case, ICM implantation led to a diagnosis requiring removal of the ICM before the 90-day follow-up and insertion of a biventricular implantable cardioverter defibrillator (ICD). In the remaining 18 patients, there were no complications such as minor skin infections, systemic infections, or procedure-related adverse events requiring device explant. When following a standardized protocol with attention to sterile technique, it is feasible to implant ICMs in a holding area with no procedure-related adverse events (AEs). Copyright © 2017 Indian Heart Rhythm Society. Production and hosting by Elsevier B.V. All rights reserved.