Quantum key distillation from Gaussian states by Gaussian operations.
Navascués, M; Bae, J; Cirac, J I; Lewenstein, M; Sanpera, A; Acín, A
2005-01-14
We study the secrecy properties of Gaussian states under Gaussian operations. Although such operations are useless for quantum distillation, we prove that it is possible to distill a secret key secure against any attack from sufficiently entangled Gaussian states with nonpositive partial transposition. Moreover, all such states allow for key distillation, when Eve is assumed to perform finite-size coherent attacks before the reconciliation process.
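The entanglement condition the abstract relies on, nonpositive partial transposition (NPT), can be checked in closed form for the simplest Gaussian example, the two-mode squeezed vacuum. A minimal numerical sketch follows (convention: vacuum covariance equals the identity; the squeezing values are illustrative and not from the paper):

```python
import math

def min_symplectic_eig_pt(r):
    """Smallest symplectic eigenvalue of the partially transposed
    covariance matrix of a two-mode squeezed vacuum with squeezing r.
    In closed form it equals cosh(2r) - sinh(2r) = exp(-2r); a value
    below 1 (the vacuum level) signals a nonpositive partial transpose."""
    return math.cosh(2 * r) - math.sinh(2 * r)

for r in (0.0, 0.5, 1.5):
    nu = min_symplectic_eig_pt(r)
    # NPT (hence entangled) exactly when nu drops below the vacuum level.
    print(f"r={r}: nu_tilde={nu:.4f}, NPT={nu < 1.0}")
```

Any nonzero squeezing already yields NPT here; the paper's question is how much entanglement suffices for key distillation with Gaussian operations only.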
Aerobic Digestion. Biological Treatment Process Control. Instructor's Guide.
ERIC Educational Resources Information Center
Klopping, Paul H.
This unit on aerobic sludge digestion covers the theory of the process, system components, factors that affect the process performance, standard operational concerns, indicators of steady-state operations, and operational problems. The instructor's guide includes: (1) an overview of the unit; (2) lesson plan; (3) lecture outline (keyed to a set of…
Commander’s Handbook for Strategic Communication and Communication Strategy
2010-06-24
designed to gather SC educators and key practitioners for thoughtful discussions on SC education and training issues. KLE is not about engaging key...operational design and early joint operation planning process to identify indicators that will enable us to detect when it is time to “reframe” the problem...integrating process across DOD, included in concept and doctrine development, strategy and plan design, execution, and assessment, and incorporated
NASA Technical Reports Server (NTRS)
Fatig, Michael
1993-01-01
Flight operations and the preparation for them have become increasingly complex as mission complexities increase. Further, the mission model dictates that a significant increase in flight operations activities is upon us. Finally, there is a need for process improvement and economy in the operations arena. It is therefore time that we recognize flight operations as a complex process requiring a defined, structured, life-cycle approach vitally linked to the space segment, ground segment, and science operations processes. With this recognition, an FOT Tool Kit was developed, consisting of six major components designed to provide tools that guide flight operations activities throughout the mission life cycle. The major components of the FOT Tool Kit and the concepts behind the flight operations life-cycle process as developed at NASA's GSFC for GSFC-based missions are addressed. The Tool Kit is intended to improve the productivity, quality, cost, and schedule performance of flight operations tasks through the use of documented, structured methodologies; knowledge of past lessons learned and upcoming new technology; and reuse and sharing of key products and special application programs made possible through the development of standardized key-product and special-program directories.
Quantum Watermarking Scheme Based on INEQR
NASA Astrophysics Data System (ADS)
Zhou, Ri-Gui; Zhou, Yang; Zhu, Changming; Wei, Lai; Zhang, Xiafen; Ian, Hou
2018-04-01
Quantum watermarking technology protects copyright by embedding an invisible quantum signal in quantum multimedia data. In this paper, a watermarking scheme based on INEQR is presented. Firstly, the watermark image is extended to meet the size requirement of the carrier image. Secondly, swap and XOR operations are applied to the processed pixels; since there is only one bit per pixel, the XOR operation achieves the effect of simple encryption. Thirdly, both the watermark embedding and extraction operations are described, using the key image, the swap operation, and the LSB algorithm. When the embedding is performed, the binary key image is changed, indicating that the watermark has been embedded. Conversely, to extract the watermark image, the key's state must first be detected: the extraction operation is carried out only when the key's state is |1>. Finally, to validate the proposed scheme, both the peak signal-to-noise ratio (PSNR) and the security of the scheme are analyzed.
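The XOR-plus-LSB embedding step described above has a direct classical analogue, which may help fix ideas. A minimal sketch follows; the pixel values, one-bit watermark, and key image below are illustrative inventions, not data from the paper, and the quantum (INEQR) representation is deliberately omitted:

```python
# Classical sketch of XOR/LSB watermarking: each watermark bit is
# XOR-encrypted with a key bit, then written into the carrier pixel's
# least significant bit (LSB).

def embed(carrier, watermark, key):
    """Embed (watermark XOR key) bits into the carrier pixels' LSBs."""
    return [(c & ~1) | (w ^ k) for c, w, k in zip(carrier, watermark, key)]

def extract(stego, key):
    """Recover the watermark: read each LSB and undo the XOR with the key."""
    return [(s & 1) ^ k for s, k in zip(stego, key)]

carrier   = [200, 13, 77, 254]   # 8-bit grayscale pixels (illustrative)
watermark = [1, 0, 1, 1]         # binary watermark image, one bit per pixel
key       = [0, 1, 1, 0]         # binary key image

stego = embed(carrier, watermark, key)
assert extract(stego, key) == watermark  # round-trip recovers the watermark
```

Because only the LSB changes, each pixel moves by at most 1 gray level, which is why such schemes report high PSNR.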
Particle Engineering in Pharmaceutical Solids Processing: Surface Energy Considerations
Williams, Daryl R.
2015-01-01
During the past 10 years, particle engineering in the pharmaceutical industry has become a topic of increasing importance. Engineers and pharmacists need to understand and control a range of key unit manufacturing operations such as milling, granulation, crystallisation, powder mixing and dry powder inhaled drugs, which can be very challenging. It has now become very clear that in many of these particle processing operations, the surface energy of the starting, intermediate or final products is a key factor in understanding the processing operation and/or the final product performance. This review will consider the surface energy and surface energy heterogeneity of crystalline solids, methods for the measurement of surface energy, effects of milling on powder surface energy, adhesion and cohesion in powder mixtures, crystal habits and surface energy, surface energy and powder granulation processes, performance of DPI systems and finally crystallisation conditions and surface energy. This review concludes that the importance of surface energy as a significant factor in understanding the performance of many particulate pharmaceutical products and processes has now been clearly established. It is, nevertheless, still a work in progress, both in terms of developing methods and establishing the limits for when surface energy is the key variable of relevance. PMID:25876912
Argo workstation: a key component of operational oceanography
NASA Astrophysics Data System (ADS)
Dong, Mingmei; Xu, Shanshan; Miao, Qingsheng; Yue, Xinyang; Lu, Jiawei; Yang, Yang
2018-02-01
Operational oceanography requires quantity, quality, and availability of data sets as well as timeliness and effectiveness of data products. Without a steady and strong operational system behind it, operational oceanography cannot proceed far. In this paper we describe an integrated platform named Argo Workstation. It operates as a data processing and management system capable of data collection, automatic data quality control, visualized data checking, statistical data search, and data service. Since it was set up, the Argo Workstation has provided global, high-quality Argo data to users every day in a timely and effective manner. It has not only played a key role in operational oceanography but also set an example for operational systems.
NASA Technical Reports Server (NTRS)
Watson, Michael D.; Kelley, Gary W.
2012-01-01
The Department of Defense (DoD) defined System Operational Effectiveness (SOE) model provides an exceptional framework for an affordable approach to the development and operation of space launch vehicles and their supporting infrastructure. The SOE model provides a focal point from which to direct and measure technical effectiveness and process efficiencies of space launch vehicles. The application of the SOE model to a space launch vehicle's development and operation effort leads to very specific approaches and measures that require consideration during the design phase. This paper provides a mapping of the SOE model to the development of space launch vehicles for human exploration by addressing the SOE model key points of measurement including System Performance, System Availability, Technical Effectiveness, Process Efficiency, System Effectiveness, Life Cycle Cost, and Affordable Operational Effectiveness. In addition, the application of the SOE model to the launch vehicle development process is defined providing the unique aspects of space launch vehicle production and operations in lieu of the traditional broader SOE context that examines large quantities of fielded systems. The tailoring and application of the SOE model to space launch vehicles provides some key insights into the operational design drivers, capability phasing, and operational support systems.
78 FR 32255 - HHS-Operated Risk Adjustment Data Validation Stakeholder Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-29
...-Operated Risk Adjustment Data Validation Stakeholder Meeting AGENCY: Centers for Medicare & Medicaid... Act HHS-operated risk adjustment data validation process. The purpose of this public meeting is to... interested parties about key HHS policy considerations pertaining to the HHS-operated risk adjustment data...
Improving a Dental School's Clinic Operations Using Lean Process Improvement.
Robinson, Fonda G; Cunningham, Larry L; Turner, Sharon P; Lindroth, John; Ray, Deborah; Khan, Talib; Yates, Audrey
2016-10-01
The term "lean production," also known as "Lean," describes a process of operations management pioneered at the Toyota Motor Company that contributed significantly to the success of the company. Although developed by Toyota, the Lean process has been implemented at many other organizations, including those in health care, and should be considered by dental schools in evaluating their clinical operations. Lean combines engineering principles with operations management and improvement tools to optimize business and operating processes. One of the core concepts is relentless elimination of waste (non-value-added components of a process). Another key concept is utilization of individuals closest to the actual work to analyze and improve the process. When the medical center of the University of Kentucky adopted the Lean process for improving clinical operations, members of the College of Dentistry trained in the process applied the techniques to improve inefficient operations at the Walk-In Dental Clinic. The purpose of this project was to reduce patients' average in-the-door-to-out-the-door time from over four hours to three hours within 90 days. Achievement of this goal was realized by streamlining patient flow and strategically relocating key phases of the process. This initiative resulted in patient benefits such as shortening average in-the-door-to-out-the-door time by over an hour, improving satisfaction by 21%, and reducing negative comments by 24%, as well as providing opportunity to implement the electronic health record, improving teamwork, and enhancing educational experiences for students. These benefits were achieved while maintaining high-quality patient care with zero adverse outcomes during and two years following the process improvement project.
Defining process design space for monoclonal antibody cell culture.
Abu-Absi, Susan Fugett; Yang, LiYing; Thompson, Patrick; Jiang, Canping; Kandula, Sunitha; Schilling, Bernhard; Shukla, Abhinav A
2010-08-15
The concept of design space has been taking root as a foundation of in-process control strategies for biopharmaceutical manufacturing processes. During mapping of the process design space, the multidimensional combination of operational variables is studied to quantify the impact on process performance in terms of productivity and product quality. An efficient methodology to map the design space for a monoclonal antibody cell culture process is described. A failure modes and effects analysis (FMEA) was used as the basis for the process characterization exercise. This was followed by an integrated study of the inoculum stage of the process which includes progressive shake flask and seed bioreactor steps. The operating conditions for the seed bioreactor were studied in an integrated fashion with the production bioreactor using a two stage design of experiments (DOE) methodology to enable optimization of operating conditions. A two level Resolution IV design was followed by a central composite design (CCD). These experiments enabled identification of the edge of failure and classification of the operational parameters as non-key, key or critical. In addition, the models generated from the data provide further insight into balancing productivity of the cell culture process with product quality considerations. Finally, process and product-related impurity clearance was evaluated by studies linking the upstream process with downstream purification. Production bioreactor parameters that directly influence antibody charge variants and glycosylation in CHO systems were identified.
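The two-stage DOE approach described above (a two-level screening design followed by a central composite design) can be sketched generically. The factor count, axial distance, and center-point count below are illustrative choices, not the paper's actual design:

```python
import itertools

def central_composite(n_factors, alpha=1.682, n_center=4):
    """Build a central composite design (CCD) in coded units:
    2^k factorial corners, 2k axial (star) points at +/-alpha on each
    axis, and replicated center points. alpha = (2**k)**0.25 gives a
    rotatable design (~1.682 for k = 3)."""
    corners = [list(p) for p in itertools.product((-1.0, 1.0), repeat=n_factors)]
    axial = []
    for i in range(n_factors):
        for a in (-alpha, alpha):
            pt = [0.0] * n_factors
            pt[i] = a
            axial.append(pt)
    center = [[0.0] * n_factors for _ in range(n_center)]
    return corners + axial + center

runs = central_composite(3)
print(len(runs))  # 8 corners + 6 axial + 4 center = 18 runs
```

The corner runs support first-order and interaction terms; the axial and center runs add the curvature information needed to fit the quadratic response surface from which an edge of failure can be estimated.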
Design of virtual simulation experiment based on key events
NASA Astrophysics Data System (ADS)
Zhong, Zheng; Zhou, Dongbo; Song, Lingxiu
2018-06-01
Considering the complex content of, and lack of guidance in, virtual simulation experiments, the key-event technology from VR narrative theory was introduced into virtual simulation experiments to enhance the fidelity and vividness of the process. Based on VR narrative technology, an event transition structure was designed to meet the needs of the experimental operation process, and an interactive event processing model was used to generate key events in the interactive scene. The experiment "margin value of bees foraging," based on biological morphology, was taken as an example, and many objects, behaviors, and other contents were reorganized. The result shows that this method can enhance the user's experience and ensure that the experimental process is complete and effective.
Hou, Xiang-Mei; Zhang, Lei; Yue, Hong-Shui; Ju, Ai-Chun; Ye, Zheng-Liang
2016-07-01
To study and establish a monitoring method for the macroporous resin column chromatography process of salvianolic acids by using near-infrared spectroscopy (NIR) as a process analytical technology (PAT), a multivariate statistical process control (MSPC) model was developed based on 7 normal operation batches, and 2 test batches (one normal and one abnormal) were used to verify the monitoring performance of this model. The results showed that the MSPC model had good monitoring ability for the column chromatography process. Meanwhile, an NIR quantitative calibration model was established for three key quality indexes (rosmarinic acid, lithospermic acid and salvianolic acid B) by using the partial least squares (PLS) algorithm. The verification results demonstrated that this model had satisfactory prediction performance. The combined application of the two models can effectively achieve real-time monitoring of the macroporous resin column chromatography process of salvianolic acids and can be used for on-line analysis of key quality indexes. The established process monitoring method could serve as a reference for the development of process analytical technology in traditional Chinese medicine manufacturing. Copyright© by the Chinese Pharmaceutical Association.
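A common statistic behind MSPC batch monitoring of this kind is Hotelling's T², which flags observations far from the normal-operation cloud. A minimal two-variable sketch follows; the reference mean, covariance, and sample values are invented for illustration and are not the study's data:

```python
def hotelling_t2(x, mean, cov):
    """T^2 = (x - mean)' S^-1 (x - mean) for two variables, with the
    2x2 covariance matrix inverted in closed form. Large T^2 means the
    observation is far from the normal-operation region."""
    dx = [x[0] - mean[0], x[1] - mean[1]]
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    y = [inv[0][0] * dx[0] + inv[0][1] * dx[1],
         inv[1][0] * dx[0] + inv[1][1] * dx[1]]
    return dx[0] * y[0] + dx[1] * y[1]

# Reference distribution estimated from normal batches (illustrative).
mean = [5.0, 2.0]
cov = [[0.4, 0.1], [0.1, 0.2]]

in_control = hotelling_t2([5.1, 2.1], mean, cov)   # small: near the mean
abnormal = hotelling_t2([7.0, 0.5], mean, cov)     # large: out of control
assert abnormal > in_control
```

In practice the monitored variables would be latent scores from a PLS or PCA model of the NIR spectra rather than raw measurements, and the control limit would come from an F-distribution.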
Davenport, Paul B; Carter, Kimberly F; Echternach, Jeffrey M; Tuck, Christopher R
2018-02-01
High-reliability organizations (HROs) demonstrate unique and consistent characteristics, including operational sensitivity and control, situational awareness, hyperacute use of technology and data, and actionable process transformation. System complexity and reliance on information-based processes challenge healthcare organizations to replicate HRO processes. This article describes a healthcare organization's 3-year journey to achieve key HRO features to deliver high-quality, patient-centric care via an operations center powered by the principles of high-reliability data and software to impact patient throughput and flow.
NASA Technical Reports Server (NTRS)
Dunbar, D. N.; Tunnah, B. G.
1979-01-01
Program predicts production volumes of petroleum refinery products, with particular emphasis on aircraft-turbine fuel blends and their key properties. It calculates capital and operating costs for refinery and its margin of profitability. Program also includes provisions for processing of synthetic crude oils from oil shale and coal liquefaction processes and contains highly-detailed blending computations for alternative jet-fuel blends of varying endpoint specifications.
An Efficient and Secure Arbitrary N-Party Quantum Key Agreement Protocol Using Bell States
NASA Astrophysics Data System (ADS)
Liu, Wen-Jie; Xu, Yong; Yang, Ching-Nung; Gao, Pei-Pei; Yu, Wen-Bin
2018-01-01
Two quantum key agreement protocols using Bell states and Bell measurement were recently proposed by Shukla et al. (Quantum Inf. Process. 13(11), 2391-2405, 2014). However, Zhu et al. pointed out that there are some security flaws and proposed an improved version (Quantum Inf. Process. 14(11), 4245-4254, 2015). In this study, we show that Zhu et al.'s improvement still has some security problems and that its efficiency is not high enough. To solve these problems, we utilize the four Pauli operations {I, Z, X, Y} to encode two bits instead of the original two operations {I, X} encoding one bit, and then propose an efficient and secure arbitrary N-party quantum key agreement protocol. In the protocol, channel checking with decoy single photons is introduced to avoid the eavesdropper's flip attack, and a post-measurement mechanism is used to prevent the collusion attack. The security analysis shows the present protocol can guarantee the correctness, security, privacy and fairness of quantum key agreement.
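The two-bit encoding used here is the standard dense-coding map: applied to one half of a Bell pair, each of the four Pauli operations produces a distinct, mutually orthogonal Bell state, so a Bell measurement can recover both bits. A small pure-Python numerical sketch (state vectors only; this is an illustration of the encoding, not the authors' implementation):

```python
import math

# Two classical bits -> one of the four Pauli operations {I, Z, X, Y}.
PAULI = {
    (0, 0): [[1, 0], [0, 1]],      # I
    (0, 1): [[1, 0], [0, -1]],     # Z
    (1, 0): [[0, 1], [1, 0]],      # X
    (1, 1): [[0, -1j], [1j, 0]],   # Y
}

def apply_on_first_qubit(op, state):
    """Apply a 1-qubit operator to qubit 0 of a 2-qubit state
    (amplitude order |00>, |01>, |10>, |11>)."""
    out = [0j] * 4
    for r in range(2):
        for c in range(2):
            for b in range(2):
                out[2 * r + b] += op[r][c] * state[2 * c + b]
    return out

bell = [1 / math.sqrt(2), 0, 0, 1 / math.sqrt(2)]  # |Phi+>
encoded = {bits: apply_on_first_qubit(op, bell) for bits, op in PAULI.items()}

def inner(u, v):
    return sum(a.conjugate() * b for a, b in zip(u, v))

# All four encoded states are mutually orthogonal, hence perfectly
# distinguishable by a Bell measurement: two bits per transmitted qubit.
keys = list(encoded)
for i in range(4):
    for j in range(i + 1, 4):
        assert abs(inner(encoded[keys[i]], encoded[keys[j]])) < 1e-9
```

Doubling the bits carried per qubit is exactly where the protocol's efficiency gain over the {I, X} version comes from.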
Resource Sharing in a Network of Personal Computers.
1982-12-01
magnetic card, or a more secure identifier such as a machine-read fingerprint or voiceprint. ... operations are invoked via messages, a program and its terminal can easily be located on separate machines. In Spice, an interface process called Canvas ... request of a process. In Canvas, a process can only subdivide windows that it already has. On the other hand, the window manager treats the screen as a
Review of a solution-processed vertical organic transistor as a solid-state vacuum tube
NASA Astrophysics Data System (ADS)
Lin, Hung-Cheng; Zan, Hsiao-Wen; Chao, Yu-Chiang; Chang, Ming-Yu; Meng, Hsin-Fei
2015-05-01
In this paper, we investigate the key issues in raising the on/off current ratio and increasing the output current. A 1 V operated inverter composed of an enhancement-mode space-charge-limited transistor (SCLT) and a depletion-mode SCLT is demonstrated using the self-assembled monolayer modulation process. With a bulk-conduction mechanism, good bias-stress reliability and good bending durability are obtained. Finally, key scaling-up processes, including nanoimprinting and blade-coated nanospheres, are demonstrated.
Systems engineering and integration processes involved with manned mission operations
NASA Technical Reports Server (NTRS)
Kranz, Eugene F.; Kraft, Christopher C.
1993-01-01
This paper will discuss three mission operations functions that are illustrative of the key principles of operations SE&I and of the processes and products involved. The flight systems process was selected to illustrate the role of the systems product line in developing the depth and cross disciplinary skills needed for SE&I and providing the foundation for dialogue between participating elements. FDDD was selected to illustrate the need for a structured process to assure that SE&I provides complete and accurate results that consistently support program needs. The flight director's role in mission operations was selected to illustrate the complexity of the risk/gain tradeoffs involved in the development of the flight techniques and flight rules process as well as the absolute importance of the leadership role in developing the technical, operational, and political trades.
NASA Astrophysics Data System (ADS)
Happonen, Ari; Stepanov, Alexander; Hirvimäki, Marika; Manninen, Matti; Dennisuk, William; Piili, Heidi; Salminen, Antti
This study is based on the observed motivation sources and collaboration elements of a living-lab-style co-operation project in which researchers of engineering science and an individual artist co-operated closely. The goal was to create an artwork made from corrugated board by utilizing laser cutting technology. In the context of this study, the scientists and the artist participated in the whole process, and the research was done in a living-lab-style arrangement. The research process integrated multiple experts from different scientific fields and from practical contexts to develop a new art design and art forming process utilizing laser cutting technology. The purpose of this study was to identify and discuss the key elements behind high motivation to work together and to report the best-practice findings of this co-operative development process. The elements were studied from three points of view: the artist's view, the collaboration motivation view, and the practical cutting point of view. They were analysed by means of an active documentation collection methodology applied throughout the whole process and a story-telling methodology. The documents were used to reflect facts and feelings from the co-operation, the work process, and the challenges encountered within the collaboration. This article contributes to the research methodology and best-practice context by revealing the key elements that build the motivation compelling participants (as personal inner motivation) to work outside office hours as well as on weekends. Furthermore, as artist-engineer co-operation is not frequently reported in the scientific literature, this study reveals valuable information for practitioners and co-operation researchers.
Advanced High-Level Waste Glass Research and Development Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peeler, David K.; Vienna, John D.; Schweiger, Michael J.
2015-07-01
The U.S. Department of Energy Office of River Protection (ORP) has implemented an integrated program to increase the loading of Hanford tank wastes in glass while meeting melter lifetime expectancies and process, regulatory, and product quality requirements. The integrated ORP program is focused on providing a technical, science-based foundation from which key decisions can be made regarding the successful operation of the Hanford Tank Waste Treatment and Immobilization Plant (WTP) facilities. The fundamental data stemming from this program will support development of advanced glass formulations, key process control models, and tactical processing strategies to ensure safe and successful operations for both the low-activity waste (LAW) and high-level waste (HLW) vitrification facilities with an appreciation toward reducing overall mission life. The purpose of this advanced HLW glass research and development plan is to identify the near-, mid-, and longer-term research and development activities required to develop and validate advanced HLW glasses and their associated models to support facility operations at WTP, including both direct feed and full pretreatment flowsheets. This plan also integrates technical support of facility operations and waste qualification activities to show the interdependence of these activities with the advanced waste glass (AWG) program to support the full WTP mission. Figure ES-1 shows these key ORP programmatic activities and their interfaces with both WTP facility operations and qualification needs. The plan is a living document that will be updated to reflect key advancements and mission strategy changes. The research outlined here is motivated by the potential for substantial economic benefits (e.g., significant increases in waste throughput and reductions in glass volumes) that will be realized when advancements in glass formulation continue and models supporting facility operations are implemented.
Developing and applying advanced glass formulations will reduce the cost of Hanford tank waste management by reducing the schedule for tank waste treatment and reducing the amount of HLW glass for storage, transportation, and disposal. Additional benefits will be realized if advanced glasses are developed that demonstrate more tolerance for key components in the waste (such as Al2O3, Cr2O3, SO3, and Na2O) above the currently defined WTP constraints. Tolerating these higher concentrations of key waste loading limiters may reduce the burden on (or even eliminate the need for) leaching to remove Cr and Al and washing to remove excess S and Na from the HLW fraction. Advanced glass formulations may also make direct vitrification of the HLW fraction without significant pretreatment more cost effective. Finally, the advanced glass formulation efforts seek not only to increase waste loading in glass, but also to increase glass production rate. When coupled with higher waste loading, ensuring that all of the advanced glass formulations are processable at or above the current contract processing rate leads to significant improvements in waste throughput (the amount of waste being processed per unit time), which could significantly reduce the overall WTP mission life. The integration of increased waste loading, reduced leaching/washing requirements, and improved melting rates provides a system-wide approach to improve the effectiveness of the WTP process.
Quantum cryptographic system with reduced data loss
Lo, H.K.; Chau, H.F.
1998-03-24
A secure method for distributing a random cryptographic key with reduced data loss is disclosed. Traditional quantum key distribution systems employ similar probabilities for the different communication modes and thus reject at least half of the transmitted data. The invention substantially reduces the amount of discarded data (those that are encoded and decoded in different communication modes e.g. using different operators) in quantum key distribution without compromising security by using significantly different probabilities for the different communication modes. Data is separated into various sets according to the actual operators used in the encoding and decoding process and the error rate for each set is determined individually. The invention increases the key distribution rate of the BB84 key distribution scheme proposed by Bennett and Brassard in 1984. Using the invention, the key distribution rate increases with the number of quantum signals transmitted and can be doubled asymptotically. 23 figs.
Quantum cryptographic system with reduced data loss
Lo, Hoi-Kwong; Chau, Hoi Fung
1998-01-01
A secure method for distributing a random cryptographic key with reduced data loss. Traditional quantum key distribution systems employ similar probabilities for the different communication modes and thus reject at least half of the transmitted data. The invention substantially reduces the amount of discarded data (those that are encoded and decoded in different communication modes e.g. using different operators) in quantum key distribution without compromising security by using significantly different probabilities for the different communication modes. Data is separated into various sets according to the actual operators used in the encoding and decoding process and the error rate for each set is determined individually. The invention increases the key distribution rate of the BB84 key distribution scheme proposed by Bennett and Brassard in 1984. Using the invention, the key distribution rate increases with the number of quantum signals transmitted and can be doubled asymptotically.
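The efficiency gain claimed in these patent records comes from biasing the basis choice: when both parties pick the same basis with probability p² + (1-p)², sifting discards less data as the bias grows. A toy classical simulation of the sifting step (no eavesdropper or error estimation; parameters are illustrative):

```python
import random

def bb84_sift(n_signals, p_rect, seed=0):
    """Simulate BB84 sifting with a biased basis choice: both parties
    pick the rectilinear basis with probability p_rect; only rounds
    where the two choices coincide survive sifting."""
    rng = random.Random(seed)
    kept = 0
    for _ in range(n_signals):
        alice_rect = rng.random() < p_rect
        bob_rect = rng.random() < p_rect
        if alice_rect == bob_rect:
            kept += 1
    return kept / n_signals

balanced = bb84_sift(100_000, 0.5)  # ~50% of signals survive sifting
biased = bb84_sift(100_000, 0.9)    # ~82% survive; tends to 100% as bias grows
assert biased > balanced
```

This illustrates only the rate claim; as the records note, security with a biased choice requires estimating the error rate of each basis-pair set separately rather than pooling all sifted data.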
Implications of acceleration environments on scaling materials processing in space to production
NASA Technical Reports Server (NTRS)
Demel, Ken
1990-01-01
Some considerations regarding materials processing in space are covered from a commercial perspective. Key areas include power, proprietary data, operational requirements (including logistics), and also the center of gravity location, and control of that location with respect to materials processing payloads.
Logging cuts the functional importance of invertebrates in tropical rainforest
Ewers, Robert M.; Boyle, Michael J. W.; Gleave, Rosalind A.; Plowman, Nichola S.; Benedick, Suzan; Bernard, Henry; Bishop, Tom R.; Bakhtiar, Effendi Y.; Chey, Vun Khen; Chung, Arthur Y. C.; Davies, Richard G.; Edwards, David P.; Eggleton, Paul; Fayle, Tom M.; Hardwick, Stephen R.; Homathevi, Rahman; Kitching, Roger L.; Khoo, Min Sheng; Luke, Sarah H.; March, Joshua J.; Nilus, Reuben; Pfeifer, Marion; Rao, Sri V.; Sharp, Adam C.; Snaddon, Jake L.; Stork, Nigel E.; Struebig, Matthew J.; Wearn, Oliver R.; Yusah, Kalsum M.; Turner, Edgar C.
2015-01-01
Invertebrates are dominant species in primary tropical rainforests, where their abundance and diversity contributes to the functioning and resilience of these globally important ecosystems. However, more than one-third of tropical forests have been logged, with dramatic impacts on rainforest biodiversity that may disrupt key ecosystem processes. We find that the contribution of invertebrates to three ecosystem processes operating at three trophic levels (litter decomposition, seed predation and removal, and invertebrate predation) is reduced by up to one-half following logging. These changes are associated with decreased abundance of key functional groups of termites, ants, beetles and earthworms, and an increase in the abundance of small mammals, amphibians and insectivorous birds in logged relative to primary forest. Our results suggest that ecosystem processes themselves have considerable resilience to logging, but the consistent decline of invertebrate functional importance is indicative of a human-induced shift in how these ecological processes operate in tropical rainforests. PMID:25865801
Logging cuts the functional importance of invertebrates in tropical rainforest.
Ewers, Robert M; Boyle, Michael J W; Gleave, Rosalind A; Plowman, Nichola S; Benedick, Suzan; Bernard, Henry; Bishop, Tom R; Bakhtiar, Effendi Y; Chey, Vun Khen; Chung, Arthur Y C; Davies, Richard G; Edwards, David P; Eggleton, Paul; Fayle, Tom M; Hardwick, Stephen R; Homathevi, Rahman; Kitching, Roger L; Khoo, Min Sheng; Luke, Sarah H; March, Joshua J; Nilus, Reuben; Pfeifer, Marion; Rao, Sri V; Sharp, Adam C; Snaddon, Jake L; Stork, Nigel E; Struebig, Matthew J; Wearn, Oliver R; Yusah, Kalsum M; Turner, Edgar C
2015-04-13
Invertebrates are dominant species in primary tropical rainforests, where their abundance and diversity contributes to the functioning and resilience of these globally important ecosystems. However, more than one-third of tropical forests have been logged, with dramatic impacts on rainforest biodiversity that may disrupt key ecosystem processes. We find that the contribution of invertebrates to three ecosystem processes operating at three trophic levels (litter decomposition, seed predation and removal, and invertebrate predation) is reduced by up to one-half following logging. These changes are associated with decreased abundance of key functional groups of termites, ants, beetles and earthworms, and an increase in the abundance of small mammals, amphibians and insectivorous birds in logged relative to primary forest. Our results suggest that ecosystem processes themselves have considerable resilience to logging, but the consistent decline of invertebrate functional importance is indicative of a human-induced shift in how these ecological processes operate in tropical rainforests.
Ward, Michael J.; Chang, Anna Marie; Pines, Jesse M.; Jouriles, Nick; Yealy, Donald M.
2016-01-01
The Consensus Conference on "Advancing Research in Emergency Department (ED) Operations and Its Impact on Patient Care," hosted by the ED Operations Study Group (EDOSG), convened to craft a framework for future investigations in this important but understudied area. The EDOSG is a research consortium dedicated to promoting evidence-based clinical practice in Emergency Medicine. The consensus process format was a modified version of the NIH Model for Consensus Conference Development. Recommendations provide an action plan for how to improve ED operations study design, create a facilitating research environment, identify data measures of value for process and outcomes research, and disseminate new knowledge in this area. Specifically, we called for eight key initiatives: 1) the development of universal measures for ED patient care processes; 2) attention to patient outcomes, in addition to process efficiency and best practice compliance; 3) the promotion of multi-site clinical operations studies to create more generalizable knowledge; 4) encouraging the use of mixed methods to understand the social community and human behavior factors that influence ED operations; 5) the creation of robust ED operations research registries to drive stronger evidence-based research; 6) prioritizing key clinical questions with the input of patients, clinicians, medical leadership, emergency medicine organizations, payers, and other government stakeholders; 7) more consistently defining the functional components of the ED care system, including observation units, fast tracks, waiting rooms, laboratories and radiology sub-units; and 8) maximizing multidisciplinary knowledge dissemination via emergency medicine, public health, general medicine, operations research and nontraditional publications. PMID:26014365
Yiadom, Maame Yaa A B; Ward, Michael J; Chang, Anna Marie; Pines, Jesse M; Jouriles, Nick; Yealy, Donald M
2015-06-01
The consensus conference on "Advancing Research in Emergency Department (ED) Operations and Its Impact on Patient Care," hosted by The ED Operations Study Group (EDOSG), convened to craft a framework for future investigations in this important but understudied area. The EDOSG is a research consortium dedicated to promoting evidence-based clinical practice in emergency medicine. The consensus process format was a modified version of the NIH Model for Consensus Conference Development. Recommendations provide an action plan for how to improve ED operations study design, create a facilitating research environment, identify data measures of value for process and outcomes research, and disseminate new knowledge in this area. Specifically, we call for eight key initiatives: 1) the development of universal measures for ED patient care processes; 2) attention to patient outcomes, in addition to process efficiency and best practice compliance; 3) the promotion of multisite clinical operations studies to create more generalizable knowledge; 4) encouraging the use of mixed methods to understand the social community and human behavior factors that influence ED operations; 5) the creation of robust ED operations research registries to drive stronger evidence-based research; 6) prioritizing key clinical questions with the input of patients, clinicians, medical leadership, emergency medicine organizations, payers, and other government stakeholders; 7) more consistently defining the functional components of the ED care system, including observation units, fast tracks, waiting rooms, laboratories, and radiology subunits; and 8) maximizing multidisciplinary knowledge dissemination via emergency medicine, public health, general medicine, operations research, and nontraditional publications. © 2015 by the Society for Academic Emergency Medicine.
Encryption for Remote Control via Internet or Intranet
NASA Technical Reports Server (NTRS)
Lineberger, Lewis
2005-01-01
A data-communication protocol has been devised to enable secure, reliable remote control of processes and equipment via a collision-based network, while using minimal bandwidth and computation. The network could be the Internet or an intranet. Control is made secure by use of both a password and a dynamic key, which is sent transparently to a remote user by the controlled computer (that is, the computer, located at the site of the equipment or process to be controlled, that exerts direct control over the process). The protocol functions in the presence of network latency, overcomes errors caused by missed dynamic keys, and defeats attempts by unauthorized remote users to gain control. The protocol is not suitable for real-time control, but is well suited for applications in which control latencies up to about 0.5 second are acceptable. The encryption scheme involves the use of both a dynamic and a private key, without any additional overhead that would degrade performance. The dynamic key is embedded in the equipment- or process-monitor data packets sent out by the controlled computer: in other words, the dynamic key is a subset of the data in each such data packet. The controlled computer maintains a history of the last 3 to 5 data packets for use in decrypting incoming control commands. In addition, the controlled computer records a private key (password) that is given to the remote computer. The encrypted incoming command is permuted by both the dynamic and the private key. A person who records the command data in a given packet for hostile purposes cannot use that packet after the dynamic key expires (typically within 3 seconds). Even a person in possession of an unauthorized copy of the command/remote-display software cannot use that software in the absence of the password. The use of a dynamic key embedded in the outgoing data makes the central-processing-unit overhead very small.
The use of a National Instruments DataSocket(TradeMark) (or equivalent) protocol or the User Datagram Protocol makes it possible to obtain reasonably short response times: Typical response times in event-driven control, using packets sized ~300 bytes, are <0.2 second for commands issued from locations anywhere on Earth. The protocol requires that control commands represent absolute values of controlled parameters (e.g., a specified temperature), as distinguished from changes in values of controlled parameters (e.g., a specified increment of temperature). Each command is issued three or more times to ensure delivery in crowded networks. The use of absolute-value commands prevents additional (redundant) commands from causing trouble. Because a remote controlling computer receives "talkback" in the form of data packets from the controlled computer, typically within a time interval of ≤1 s, the controlling computer can re-issue a command if a network failure has occurred. The controlled computer, the process or equipment that it controls, and any human operator(s) at the site of the controlled equipment or process should be equipped with safety measures to prevent damage to equipment or injury to humans. These features could be a combination of software, external hardware, and intervention by the human operator(s). The protocol is not fail-safe, but by adopting these safety measures as part of the protocol, one makes the protocol a robust means of controlling remote processes and equipment by use of typical office computers via intranets and/or the Internet.
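The dynamic-key scheme described above can be sketched roughly as follows. This is an illustrative assumption-laden toy, not the article's actual implementation: the key names, the SHA-256 keystream derivation, and the `SET` command prefix are all invented for the sketch; only the ideas of embedding a short-lived dynamic key in outgoing packets, keeping a short key history, and requiring both keys come from the abstract.

```python
import hashlib

# Illustrative sketch (not the article's implementation): the controlled
# computer embeds a short-lived dynamic key in each outgoing monitor packet
# and keeps a history of recent keys; an incoming command is accepted only if
# it decrypts sensibly under the shared private key plus a recent dynamic key.

PRIVATE_KEY = b"shared-password"   # assumed pre-shared secret (the password)

def keystream(dynamic_key, length):
    """Derive a keystream from the private and dynamic keys (toy KDF)."""
    stream = b""
    counter = 0
    while len(stream) < length:
        stream += hashlib.sha256(PRIVATE_KEY + dynamic_key + bytes([counter])).digest()
        counter += 1
    return stream[:length]

def encrypt(command, dynamic_key):
    return bytes(c ^ k for c, k in zip(command, keystream(dynamic_key, len(command))))

decrypt = encrypt  # an XOR stream cipher is its own inverse

# history of the last few dynamic keys broadcast in monitor packets
recent_keys = [b"key-001", b"key-002", b"key-003"]

def accept(ciphertext):
    """Try the key history, newest first, to tolerate a missed dynamic key."""
    for dk in reversed(recent_keys):
        plain = decrypt(ciphertext, dk)
        if plain.startswith(b"SET "):   # absolute-value commands, e.g. b"SET temp=20"
            return plain
    return None                         # stale or forged packets are rejected

cmd = encrypt(b"SET temp=20", b"key-002")   # remote user replies with a recent key
```

A command encrypted under an expired key (say `b"key-000"`) decrypts to garbage under every key in the history and is rejected, which mirrors the article's claim that a recorded packet becomes useless once its dynamic key expires.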
The PLATO IV Communications System.
ERIC Educational Resources Information Center
Sherwood, Bruce Arne; Stifle, Jack
The PLATO IV computer-based educational system contains its own communications hardware and software for operating plasma-panel graphics terminals. Key echoing is performed by the central processing unit: every key pressed at a terminal passes through the entire system before anything appears on the terminal's screen. Each terminal is guaranteed…
The Macro Dynamics of Weapon System Acquisition: Shaping Early Decisions to Get Better Outcomes
2012-05-17
defects and rework • Design tools and processes • Lack of feedback to key design and SE processes • Lack of quantified risk and uncertainty at key... Tools for Rapid Exploration of the Physical Design Space • Coupling Operability, Interoperability, and Physical Feasibility Analyses – a Game Changer... Interoperability • Training • Quantified Margins and Uncertainties at Each Critical Decision Point • M&S • RDT&E • A Continuum of Tools Underpinned with
CTEPP STANDARD OPERATING PROCEDURE FOR PROCESSING COMPLETED DATA FORMS (SOP-4.10)
This SOP describes the methods for processing completed data forms. Key components of the SOP include (1) field editing, (2) data form Chain-of-Custody, (3) data processing verification, (4) coding, (5) data entry, (6) programming checks, (7) preparation of data dictionaries, cod...
An Asymmetric Image Encryption Based on Phase Truncated Hybrid Transform
NASA Astrophysics Data System (ADS)
Khurana, Mehak; Singh, Hukum
2017-09-01
To enhance the security of the system and protect it from attackers, this paper proposes a new asymmetric cryptosystem based on a hybrid approach of Phase Truncated Fourier and Discrete Cosine Transform (PTFDCT), which adds nonlinearity by including cube and cube-root operations in the encryption and decryption paths, respectively. In this cryptosystem, random phase masks are used as encryption keys, the phase masks generated after the cube operation in the encryption process are reserved as decryption keys, and the cube-root operation is required to decrypt the image in the decryption process. The cube and cube-root operations introduced in the encryption and decryption paths make the system resistant to standard attacks. The robustness of the proposed cryptosystem has been analysed and verified on the basis of various parameters by simulating in MATLAB 7.9.0 (R2008a). The experimental results are provided to highlight the effectiveness and suitability of the proposed cryptosystem and prove that the system is secure.
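A stripped-down, Fourier-only sketch of the phase-truncation idea with a cube nonlinearity may help. Note the hedging: the paper's actual scheme is a Fourier/DCT hybrid, and the mask generation, image size, and placement of the cube operation below are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.random((8, 8))                         # stand-in plaintext image

# random phase masks used as encryption keys
R1 = np.exp(2j * np.pi * rng.random((8, 8)))
R2 = np.exp(2j * np.pi * rng.random((8, 8)))

# encryption: phase truncation keeps amplitudes and reserves phases as keys
G1 = np.fft.fft2(img * R1)
A1, P1 = np.abs(G1), np.exp(1j * np.angle(G1))   # P1: reserved decryption key 1
G2 = np.fft.fft2(A1 * R2)
A2, P2 = np.abs(G2), np.exp(1j * np.angle(G2))   # P2: reserved decryption key 2
cipher = A2 ** 3                                 # cube operation adds nonlinearity

# decryption: cube root first, then apply the reserved phase keys
A2r = np.cbrt(cipher)
A1r = np.abs(np.fft.ifft2(A2r * P2))
recovered = np.abs(np.fft.ifft2(A1r * P1))
```

The asymmetry is visible even in this toy: the encryption keys `R1`, `R2` do not suffice for decryption; only the phases `P1`, `P2` truncated away during encryption recover the image.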
NASA Astrophysics Data System (ADS)
Li, Xianye; Meng, Xiangfeng; Yang, Xiulun; Wang, Yurong; Yin, Yongkai; Peng, Xiang; He, Wenqi; Dong, Guoyan; Chen, Hongyi
2018-03-01
A multiple-image encryption method via lifting wavelet transform (LWT) and XOR operation is proposed, which is based on a row scanning compressive ghost imaging scheme. In the encryption process, the scrambling operation is implemented for the sparse images transformed by LWT, then the XOR operation is performed on the scrambled images, and the resulting XOR images are compressed in the row scanning compressive ghost imaging, through which the ciphertext images can be detected by bucket detector arrays. During decryption, the participant who possesses his/her correct key-group, can successfully reconstruct the corresponding plaintext image by measurement key regeneration, compression algorithm reconstruction, XOR operation, sparse images recovery, and inverse LWT (iLWT). Theoretical analysis and numerical simulations validate the feasibility of the proposed method.
Combating Terrorism: A Conceptual Framework for Targeting at the Operational Level
2004-06-17
for Joint Intelligence Preparation of the Battlespace. The key process is the JIPB, which is tried and tested, offering a very logical and clear...Intelligence Preparation of the Battlespace (JIPB) process, as published in Joint Publication 2-01.3, Joint Tactics, Techniques, and Procedures for Joint...Intelligence Preparation of the Battlespace, 24 May 2000, for its application to targeting terrorism at the operational level. The
Anatomy of a Security Operations Center
NASA Technical Reports Server (NTRS)
Wang, John
2010-01-01
Many agencies and corporations are either contemplating or in the process of building a cyber Security Operations Center (SOC). Those agencies that have established SOCs are most likely working on major revisions or enhancements to existing capabilities. As principal developers of the NASA SOC, this presenter's goals are to provide the GFIRST community with examples of some of the key building blocks of an Agency-scale cyber Security Operations Center. This presentation will include the inputs and outputs, the facilities or shell, as well as the internal components and the processes necessary to maintain the SOC's subsistence - in other words, the anatomy of a SOC. Details to be presented include the SOC architecture and its key components: Tier 1 Call Center, data entry, and incident triage; Tier 2 monitoring, incident handling and tracking; Tier 3 computer forensics, malware analysis, and reverse engineering; Incident Management System; Threat Management System; SOC Portal; Log Aggregation and Security Incident Management (SIM) systems; flow monitoring; IDS; etc. Specific processes and methodologies discussed include Incident States and associated Work Elements; the Incident Management Workflow Process; Cyber Threat Risk Assessment methodology; and Incident Taxonomy. The evolution of the cyber Security Operations Center will be discussed, starting from reactive and moving toward proactive. Finally, the resources necessary to establish an Agency-scale SOC as well as the lessons learned in the process of standing up a SOC will be presented.
34 CFR 636.21 - What selection criteria does the Secretary use to evaluate an application?
Code of Federal Regulations, 2011 CFR
2011-07-01
...) Agencies of local government. (ii) Public and private elementary and secondary schools. (iii) Business... implementation strategy for each key project component activity is— (i) Comprehensive; (ii) Based on a sound... operation; (5) Describe a time-line chart that relates key evaluation processes and benchmarks to other...
34 CFR 636.21 - What selection criteria does the Secretary use to evaluate an application?
Code of Federal Regulations, 2014 CFR
2014-07-01
...) Agencies of local government. (ii) Public and private elementary and secondary schools. (iii) Business... implementation strategy for each key project component activity is— (i) Comprehensive; (ii) Based on a sound... operation; (5) Describe a time-line chart that relates key evaluation processes and benchmarks to other...
34 CFR 636.21 - What selection criteria does the Secretary use to evaluate an application?
Code of Federal Regulations, 2013 CFR
2013-07-01
...) Agencies of local government. (ii) Public and private elementary and secondary schools. (iii) Business... implementation strategy for each key project component activity is— (i) Comprehensive; (ii) Based on a sound... operation; (5) Describe a time-line chart that relates key evaluation processes and benchmarks to other...
34 CFR 636.21 - What selection criteria does the Secretary use to evaluate an application?
Code of Federal Regulations, 2012 CFR
2012-07-01
...) Agencies of local government. (ii) Public and private elementary and secondary schools. (iii) Business... implementation strategy for each key project component activity is— (i) Comprehensive; (ii) Based on a sound... operation; (5) Describe a time-line chart that relates key evaluation processes and benchmarks to other...
Biometrics based key management of double random phase encoding scheme using error control codes
NASA Astrophysics Data System (ADS)
Saini, Nirmala; Sinha, Aloka
2013-08-01
In this paper, an optical security system has been proposed in which the key of the double random phase encoding technique is linked to the biometrics of the user to make it user specific. The error in recognition due to biometric variation is corrected by encoding the key using the BCH code. A user-specific shuffling key is used to increase the separation between the genuine and impostor Hamming distance distributions. This shuffling key is then further secured using RSA public key encryption to enhance the security of the system. An XOR operation is performed between the encoded key and the feature vector obtained from the biometrics. The RSA-encoded shuffling key and the data obtained from the XOR operation are stored in a token. The main advantage of the present technique is that key retrieval is possible only in the simultaneous presence of the token and the biometrics of the user, which not only authenticates the presence of the original input but also secures the key of the system. Computational experiments showed the effectiveness of the proposed technique for key retrieval in the decryption process by using the live biometrics of the user.
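The XOR key-binding step can be sketched as follows. Hedge: the paper uses BCH codes and an optically implemented double random phase encoding; in this toy a 3x repetition code stands in for the BCH code, and the key and feature bit vectors are invented.

```python
# Toy sketch of XOR key binding with error correction: the key is encoded,
# XORed with the enrollment biometric features, and stored on a token; a
# slightly noisy live biometric still recovers the key via majority vote.

def rep_encode(bits):
    """Repeat each key bit three times (stand-in for BCH encoding)."""
    return [b for bit in bits for b in (bit, bit, bit)]

def rep_decode(bits):
    """Majority vote over each triple corrects isolated bit errors."""
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

key = [1, 0, 1, 1, 0, 0, 1, 0]                  # key of the encoding scheme
enrolled = [1, 1, 0, 1, 0, 0, 1, 0] * 3         # toy 24-bit biometric feature vector
token = [c ^ f for c, f in zip(rep_encode(key), enrolled)]  # stored on the token

live = enrolled.copy()
live[5] ^= 1                                    # one bit of biometric noise
recovered = rep_decode([t ^ f for t, f in zip(token, live)])
```

Because `token XOR live` equals the encoded key with the noisy bits flipped, the error-correcting decode absorbs small biometric variation, while neither the token nor the biometric alone reveals the key.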
Evolving safety practices in the setting of modern complex operating room: role of nurses.
Niu, L; Li, H Y; Tang, W; Gong, S; Zhang, L J
2017-01-01
Operating room (OR) nursing previously referred to patient care provided during the intra-operative phase and the service provided within the OR itself. With the expansion of the responsibilities of nurses, OR nursing now includes the pre-operative and post-operative periods; peri-operative nursing is therefore accepted as a nursing process in the OR in the contemporary medical literature. Peri-operative nurses provide care to surgical patients during the entire process of surgery. They have several roles, including those of manager or director, clinical practitioner (scrub nurse, circulating nurse and nurse anesthetist), educator, and researcher. Although utmost priority is placed on ensuring patient safety and well-being, they are also expected to participate in professional organizations, continuing medical education programs and research activities. A Surgical Patient Safety Checklist formulated by the World Health Organization serves as a major guideline for all activities in the OR, and peri-operative nurses are key personnel in its implementation. Communication among the various players of a procedure in the OR is key to a successful patient outcome, and peri-operative nurses have a central role in making it happen. Setting up an OR in military conflict zones or places suffering a widespread natural disaster poses a unique challenge to nursing. This review discusses all aspects of peri-operative nursing and suggests points of improvement in patient care.
Aerobic Digestion. Student Manual. Biological Treatment Process Control.
ERIC Educational Resources Information Center
Klopping, Paul H.
This manual contains the textual material for a single-lesson unit on aerobic sludge digestion. Topic areas addressed include: (1) theory of aerobic digestion; (2) system components; (3) performance factors; (4) indicators of stable operation; and (5) operational problems and their solutions. A list of objectives, glossary of key terms, and…
Ruohonen, Toni; Ennejmy, Mohammed
2013-01-01
Making reliable and justified operational and strategic decisions is a challenging task in the health care domain. So far, decisions have been made based on the experience of managers and staff, or they are evaluated with traditional methods using inadequate data. As a result of this kind of decision-making process, attempts to improve operations have usually failed or led to only local improvements. Health care organizations have a lot of operational data, in addition to clinical data, which is the key element for making reliable and justified decisions. However, it is increasingly difficult to access and make use of these data. In this paper we discuss the possibilities for exploiting operational data in the most efficient way in the decision-making process. We share our vision of the future and propose a conceptual framework for automating the decision-making process.
Benefits to blood banks of a sales and operations planning process.
Keal, Donald A; Hebert, Phil
2010-12-01
A formal sales and operations planning (S&OP) process is a decision-making and communication process that balances supply and demand while integrating all business operational components with customer-focused business plans that link high-level strategic plans to day-to-day operations. Furthermore, S&OP can assist in managing change across the organization, as it provides the opportunity to be proactive in the face of problems and opportunities while establishing a plan for everyone to follow. Some of the key outcomes of a robust S&OP process in blood banking would include: higher customer satisfaction (donors and health care providers), balanced inventory across product lines and customers, more stable production rates and higher productivity, more cooperation across the entire operation, and timely updates to the business plan resulting in better forecasting and fewer surprises that negatively impact the bottom line. © 2010 American Association of Blood Banks.
PCs: Key to the Future. Business Center Provides Sound Skills and Good Attitudes.
ERIC Educational Resources Information Center
Pay, Renee W.
1991-01-01
The Advanced Computing/Management Training Program at Jordan Technical Center (Sandy, Utah) simulates an automated office to teach five sets of skills: computer architecture and operating systems, word processing, data processing, communications skills, and management principles. (SK)
Rule-Based Expert Systems in the Command Estimate: An Operational Perspective
1990-06-01
control measures. 5. Prepare COA statement(s) and sketch(es). The key inputs for developing courses of action are the DFD process of IPB, data stores...mission, or a change of information provides new direction to this process for that particular operation." Formal scientific analysis of the command...30 5. Delivery of outside news. This feature contributes to the commander's insatiable need for current information. Artificial intelligence and rule
A review of aircraft turnaround operations and simulations
NASA Astrophysics Data System (ADS)
Schmidt, Michael
2017-07-01
The ground operational processes are the connecting element between aircraft en-route operations and airport infrastructure. An efficient aircraft turnaround is an essential component of airline success, especially for regional and short-haul operations. It is imperative that advancements in ground operations, specifically process reliability and passenger comfort, are developed while dealing with increasing passenger traffic in the coming years. This paper provides an introduction to aircraft ground operations, focusing on the aircraft turnaround and passenger processes. Furthermore, key challenges for current aircraft operators, such as airport capacity constraints, schedule disruptions and increasing cost pressure, are highlighted. A review of the studies and conceptual work conducted in this field shows pathways for potential process improvements. Promising approaches attempt to reduce apron traffic and to parallelize passenger processes and taxiing. The application of boarding strategies and novel cabin layouts focusing on aisle, door and seat are options to shorten the boarding process inside the cabin. A summary of existing modeling and simulation frameworks gives an insight into state-of-the-art assessment capabilities for advanced concepts. They are the prerequisite for a holistic assessment during the early stages of the preliminary aircraft design process and for identifying benefits and drawbacks for all involved stakeholders.
Ready-to-Use Simulation: Demystifying Statistical Process Control
ERIC Educational Resources Information Center
Sumukadas, Narendar; Fairfield-Sonn, James W.; Morgan, Sandra
2005-01-01
Business students are typically introduced to the concept of process management in their introductory course on operations management. A very important learning outcome here is an appreciation that the management of processes is a key to the management of quality. Some of the related concepts are qualitative, such as strategic and behavioral…
The use of Merging and Aggregation Operators for MRDB Data Feeding
NASA Astrophysics Data System (ADS)
Kozioł, Krystian; Lupa, Michał
2013-12-01
This paper presents the application of two generalization operators, merging and displacement, in the process of automatically feeding a multiresolution database of topographic objects from large-scale databases (1:500 to 1:5000). An ordered collection of objects forms a development layer that, in the process of generalization, is subjected to merging and displacement in order to maintain recognizability at the reduced scale of the map. The solution to the above problem is the algorithms described in this work, which use a standard recognition of drawings (Chrobak 2010), independent of the user. The digital cartographic generalization process is a set of consecutive operators, in which merging and aggregation play a key role; their proper operation has a significant impact on the qualitative assessment of the generalized data.
Joynt, Gavin M; Loo, Shi; Taylor, Bruce L; Margalit, Gila; Christian, Michael D; Sandrock, Christian; Danis, Marion; Leoniv, Yuval; Sprung, Charles L
2010-04-01
To provide recommendations and standard operating procedures (SOPs) for intensive care unit (ICU) and hospital preparations for an influenza pandemic or mass disaster with a specific focus on enhancing coordination and collaboration between the ICU and other key stakeholders. Based on a literature review and expert opinion, a Delphi process was used to define the essential topics including coordination and collaboration. Key recommendations include: (1) establish an Incident Management System with Emergency Executive Control Groups at facility, local, regional/state or national levels to exercise authority and direction over resource use and communications; (2) develop a system of communication, coordination and collaboration between the ICU and key interface departments within the hospital; (3) identify key functions or processes requiring coordination and collaboration, the most important of these being manpower and resource utilization (surge capacity) and re-allocation of personnel, equipment and physical space; (4) develop processes to allow smooth inter-departmental patient transfers; (5) creating systems and guidelines is not sufficient; it is important to: (a) identify the roles and responsibilities of key individuals necessary for the implementation of the guidelines; (b) ensure that these individuals are adequately trained and prepared to perform their roles; (c) ensure adequate equipment to allow key coordination and collaboration activities; (d) ensure an adequate physical environment to allow staff to properly implement guidelines; (6) trigger events for determining a crisis should be defined. Judicious planning and adoption of protocols for coordination and collaboration with interface units are necessary to optimize outcomes during a pandemic.
Abraham, Sushil; Bain, David; Bowers, John; Larivee, Victor; Leira, Francisco; Xie, Jasmina
2015-01-01
The technology transfer of biological products is a complex process requiring control of multiple unit operations and parameters to ensure product quality and process performance. To achieve product commercialization, the technology transfer sending unit must successfully transfer knowledge about both the product and the process to the receiving unit. A key strategy for maximizing successful scale-up and transfer efforts is the effective use of engineering and shake-down runs to confirm operational performance and product quality prior to embarking on good manufacturing practice runs such as process performance qualification runs. We consider the key factors in deciding whether to perform shake-down or engineering runs. We also present industry benchmarking results on how engineering runs are used in drug substance technology transfers, alongside the main themes and best practices that have emerged. Our goal is to provide companies with a framework for ensuring "right first time" technology transfers with effective deployment of resources within increasingly aggressive timeline constraints. © PDA, Inc. 2015.
Research on the Mean Logistic Delay Time of the Development Phase
NASA Astrophysics Data System (ADS)
Na, Hou; Yi, Li; Wang, Yi-Gang; Liu, Jun-jie; Bo, Zhang; Lv, Xue-Zhi
MLDT is a key parameter affecting operational availability through equipment design, operation and support management. In the operation process, the questions are how to strengthen support management, rationally lay out support resources, and provide the support resources needed for equipment maintenance, in order to avoid or reduce delays and ensure that MLDT satisfies the operational availability (Ao) requirement. How to coordinate this with the RMS (reliability, maintainability and supportability) of the equipment remains an urgent open question.
Key-value store with internal key-value storage interface
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bent, John M.; Faibish, Sorin; Ting, Dennis P. J.
A key-value store is provided having one or more key-value storage interfaces. A key-value store on at least one compute node comprises a memory for storing a plurality of key-value pairs; and an abstract storage interface comprising a software interface module that communicates with at least one persistent storage device providing a key-value interface for persistent storage of one or more of the plurality of key-value pairs, wherein the software interface module provides the one or more key-value pairs to the at least one persistent storage device in a key-value format. The abstract storage interface optionally processes one or more batch operations on the plurality of key-value pairs. A distributed embodiment for a partitioned key-value store is also provided.
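A minimal sketch of such an abstract storage interface follows. The class and method names are invented for illustration and are not the record's actual API; the sketch only captures the idea of an in-memory store forwarding pairs, in key-value format, to any backend that exposes a key-value interface, with an optional batch operation.

```python
# Minimal sketch: an in-memory key-value store with an abstract interface
# to a pluggable persistent backend (names here are illustrative).

class DictBackend:
    """Stand-in for a persistent storage device with a key-value interface."""
    def __init__(self):
        self.data = {}
    def put(self, key, value):
        self.data[key] = value
    def get(self, key):
        return self.data[key]

class KVStore:
    def __init__(self, backend):
        self.mem = {}              # in-memory key-value pairs
        self.backend = backend     # abstract storage interface
    def put(self, key, value, persist=False):
        self.mem[key] = value
        if persist:
            self.backend.put(key, value)   # forwarded in key-value format
    def get(self, key):
        if key in self.mem:
            return self.mem[key]
        return self.backend.get(key)       # fall back to persistent storage
    def batch_put(self, pairs, persist=False):
        for key, value in pairs:           # optional batch operation
            self.put(key, value, persist)

store = KVStore(DictBackend())
store.batch_put([("a", 1), ("b", 2)], persist=True)
```

Because the store talks to the backend only through `put`/`get`, a device-specific driver can be swapped in without changing the store, which is the point of the abstract interface.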
Cost, capability, and risk for planetary operations
NASA Technical Reports Server (NTRS)
Mclaughlin, William I.; Deutsch, Marie J.; Miller, Lanny J.; Wolff, Donna M.; Zawacki, Steven J.
1992-01-01
The three key factors for flight projects - cost, capability, and risk - are examined with respect to their interplay, the uplink process, cost drivers, and risk factors. Scientific objectives are translated into a computer program during the uplink process, and examples are given relating to the Voyager Interstellar Mission, Galileo, and the Comet Rendezvous Asteroid Flyby. The development of a multimission sequence system based on these uplinks is described with reference to specific subsystems such as the pointer and the sequence generator. Operational cost drivers include mission, flight-system, and ground-system complexity, uplink traffic, and work force. Operational risks are listed in terms of the mission operations, the environment, and the mission facilities. The uplink process can be analyzed in terms of software development, and spacecraft operability is shown to be an important factor from the initial stages of spacecraft development.
Barbagallo, Simone; Corradi, Luca; de Ville de Goyet, Jean; Iannucci, Marina; Porro, Ivan; Rosso, Nicola; Tanfani, Elena; Testi, Angela
2015-05-17
The Operating Room (OR) is a key resource of all major hospitals, but it also accounts for up to 40% of resource costs. Improving cost effectiveness while maintaining quality of care is a universal objective. These goals imply optimizing the planning and scheduling of the activities involved. This is highly challenging due to the inherently variable and unpredictable nature of surgery. A Business Process Modeling Notation (BPMN 2.0) was used for the representation of the "OR Process" (defined as the sequence of all of the elementary steps between "patient ready for surgery" and "patient operated upon") as a general pathway ("path"). The path was then standardized as much as possible while keeping all of the key elements that allow one to address or define the other steps of planning, as well as the inherent and wide variability in terms of patient specificity. The path was used to schedule OR activity, room-by-room and day-by-day, feeding the process from a "waiting list database" and using a mathematical optimization model with the objective of producing an optimized plan. The OR process was defined with special attention paid to flows, timing and resource involvement. Standardization involved a dynamic operation and defined an expected operating time for each operation. The optimization model has been implemented and tested on real clinical data. A comparison of the results with the real data shows that the optimization model allows about 30% more patients to be scheduled than in actual practice, and better exploits OR efficiency, increasing the average operating room utilization rate by up to 20%. The optimization of OR activity planning is essential in order to manage the hospital's waiting list. Optimal planning is facilitated by defining the operation as a standard pathway where all variables are taken into account. By allowing precise scheduling, it feeds the process of planning and, further upstream, the management of the waiting list in an interactive and bidirectional dynamic process.
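As a toy illustration of the planning step: the paper uses a formal mathematical optimization model over the BPMN-derived path, whereas the greedy heuristic below, with its invented session names, durations and 480-minute capacity, only shows the shape of the problem (filling OR sessions from a waiting list using each operation's expected operating time).

```python
# Greedily pack waiting-list operations, ordered by expected operating time,
# into OR sessions with a fixed capacity of minutes per session.

def schedule(waiting_list, sessions, capacity_min=480):
    plan = {s: [] for s in sessions}
    load = {s: 0 for s in sessions}
    for patient, minutes in sorted(waiting_list, key=lambda x: x[1]):
        for s in sessions:
            if load[s] + minutes <= capacity_min:
                plan[s].append(patient)
                load[s] += minutes
                break                      # patient scheduled; move to the next
    return plan

waiting = [("p1", 120), ("p2", 300), ("p3", 90), ("p4", 240), ("p5", 200)]
plan = schedule(waiting, ["OR1-Mon", "OR2-Mon"])
```

An exact optimization model would replace the greedy loop with, say, an integer program maximizing scheduled patients subject to the same capacity constraints; the data flow (waiting list in, room-by-room, day-by-day plan out) stays the same.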
Space system operations and support cost analysis using Markov chains
NASA Technical Reports Server (NTRS)
Unal, Resit; Dean, Edwin B.; Moore, Arlene A.; Fairbairn, Robert E.
1990-01-01
This paper evaluates the use of Markov chains in probabilistic life cycle cost analysis and suggests further uses of the process as a design aid tool. A methodology is developed for estimating operations and support cost and expected life for reusable space transportation systems. Application of the methodology is demonstrated for the case of a hypothetical space transportation vehicle. A sensitivity analysis is carried out to explore the effects of uncertainty in key model inputs.
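The absorbing-chain computation behind such an operations-and-support cost analysis can be sketched as follows. The states, transition probabilities and per-period costs are invented for illustration (the paper's actual model and numbers are not reproduced here); the fundamental matrix N = (I − Q)⁻¹ gives the expected number of visits to each transient state before absorption (retirement).

```python
import numpy as np

# Absorbing Markov chain sketch: transient states 0 = operational and
# 1 = in maintenance; the implicit absorbing state is vehicle retirement.
Q = np.array([[0.90, 0.08],       # rows: from-state; columns: to-state
              [0.70, 0.20]])      # remaining probability mass absorbs (retires)
cost = np.array([1.0, 5.0])       # O&S cost per period spent in each state

N = np.linalg.inv(np.eye(2) - Q)  # fundamental matrix: expected state visits
expected_cost = N @ cost          # expected O&S cost-to-retirement per start state
expected_life = N.sum(axis=1)     # expected number of periods before retirement
```

Sensitivity analysis then amounts to perturbing entries of `Q` or `cost` and recomputing `expected_cost`, which is why the chain formulation is attractive as a design aid.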
Health sector operational planning and budgeting processes in Kenya—“never the twain shall meet”
Molyneux, Sassy; Goodman, Catherine
2015-01-01
Summary Operational planning is considered an important tool for translating government policies and strategic objectives into day-to-day management activities. However, developing countries suffer from persistent misalignment between policy, planning and budgeting. The Medium Term Expenditure Framework (MTEF) was introduced to address this misalignment. Kenya adopted the MTEF in the early 2000s, and in 2005, the Ministry of Health adopted the Annual Operational Plan process to adapt the MTEF to the health sector. This study assessed the degree to which the health sector Annual Operational Plan process in Kenya has achieved alignment between planning and budgeting at the national level, using document reviews, participant observation and key informant interviews. We found that the Kenyan health sector was far from achieving planning and budgeting alignment. Several factors contributed to this problem, including weak Ministry of Health stewardship and institutionalized separation between planning and budgeting processes; a rapidly changing planning and budgeting environment; lack of reliable data to inform target setting; and poor participation by key stakeholders in the process, including a top-down approach to target setting. We conclude that alignment is unlikely to be achieved without consideration of the specific institutional contexts and the power relationships between stakeholders. In particular, there is a need for institutional integration of the planning and budgeting processes into a common cycle and framework with common reporting lines, and for improved data and local-level input to inform appropriate and realistic target setting. © 2015 The Authors. International Journal of Health Planning and Management published by John Wiley & Sons, Ltd. PMID:25783862
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook, J.M.; Nieman, L.D.
In 1977 Solomon Associates, Inc. issued its first study of refining in the US, entitled Comparative Performance Analysis for Fuel Product Refineries, most commonly referred to as the Solomon Study, or the Fuels Study. In late 1993, both the Water and Waste Water Management and Petroleum Divisions of Nalco Chemical Company came to the same conclusion: they must have a better understanding of the Solomon Study process and have some input to this system of measurement. The authors first approached Solomon Associates with the idea that a specific study should be done of specialty chemicals used in the refinery. They felt that this would result in two studies, one for water treatment applications and one for process. The water treatment study came first and was completed in 1993 with the United States Petroleum Refineries Water Treatment Performance Analysis for Operating Year 1993. The process study, entitled United States Petroleum Refinery Process Treatment Performance Analysis for Operating Years 1994-95, will be issued in the 2nd quarter of this year by Nalco/Exxon Energy Chemicals, L.P., which includes the combined resources of the former Petroleum Division of Nalco Chemical Company (including the petroleum-related portions of most of its overseas companies) and the petroleum-related specialty chemical operations of Exxon Chemical on a global basis. What follows is a recap of the process study focus, some examples of output, and comment on both the linkage to key refinery operating indicators and the perception of the effect of such measurement on the supplier relationship of the future.
NASA Astrophysics Data System (ADS)
Hayakawa, Hitoshi; Ogawa, Makoto; Shibata, Tadashi
2005-04-01
A very large scale integrated circuit (VLSI) architecture for a multiple-instruction-stream multiple-data-stream (MIMD) associative processor has been proposed. The processor employs an architecture that enables seamless switching from associative operations to arithmetic operations. The MIMD element is convertible to a regular central processing unit (CPU) while maintaining its high performance as an associative processor. Therefore, the MIMD associative processor can perform not only on-chip perception, i.e., searching for the vector most similar to an input vector throughout the on-chip cache memory, but also arithmetic and logic operations similar to those in ordinary CPUs, both simultaneously in parallel processing. Three key technologies have been developed to generate the MIMD element: associative-operation-and-arithmetic-operation switchable calculation units, a versatile register control scheme within the MIMD element for flexible operations, and a short instruction set for minimizing the memory size for program storage. Key circuit blocks were designed and fabricated using 0.18 μm complementary metal-oxide-semiconductor (CMOS) technology. As a result, the full-featured MIMD element is estimated to be 3 mm², showing the feasibility of an 8-parallel-MIMD-element associative processor in a single chip of 5 mm × 5 mm.
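The associative operation described above, searching on-chip memory for the stored vector most similar to an input, reduces to a winner-take-all distance search. The vectors and the Manhattan metric below are illustrative stand-ins for whatever similarity measure the silicon actually computes:

```python
# Stored "cache memory" vectors and an input vector; values are arbitrary.
memory = [[3, 1, 4], [1, 5, 9], [2, 6, 5]]
query = [2, 5, 8]

def manhattan(a, b):
    """Illustrative distance metric between two equal-length vectors."""
    return sum(abs(x - y) for x, y in zip(a, b))

# Winner-take-all search: the core associative operation, which the chip
# performs in parallel across its MIMD elements rather than sequentially.
best_index = min(range(len(memory)), key=lambda i: manhattan(memory[i], query))
best_match = memory[best_index]
```

The point of the architecture is that this search and ordinary arithmetic run on the same convertible calculation units, so the sequential loop above becomes a single parallel operation in hardware.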
PILOT PLANT STUDY OF CONVERSION OF COAL TO LOW SULFUR FUEL
The report gives results of a program to develop, on bench and pilot scales, operating conditions for the key step in the IGT process to desulfurize coal by thermal and chemical treatment. This process, to date, uses the 'sulfur-getter' concept. (A sulfur-getter is a material tha...
ERIC Educational Resources Information Center
Edwards, Frances
2012-01-01
Increasingly school change processes are being facilitated through the formation and operation of groups of teachers working together for improved student outcomes. These groupings are variously referred to as networks, networked learning communities, communities of practice, professional learning communities, learning circles or clusters. The…
Siochi, R
2012-06-01
The purpose of this work was to develop a quality initiative discovery framework using process improvement techniques, software tools and operating principles. Process deviations are entered into a radiotherapy incident reporting database. Supervisors use an in-house Event Analysis System (EASy) to discuss incidents with staff. Major incidents are analyzed with an in-house Fault Tree Analysis (FTA). A meta-analysis is performed using association, text mining, key word clustering, and differential frequency analysis. A key operating principle encourages the creation of forcing functions via rapid application development. 504 events have been logged this past year. The results of the key word analysis indicate that the root cause for the top-ranked key words was miscommunication. This was also the root cause found from association analysis, where 24% of the time that an event involved a physician it also involved a nurse. Differential frequency analysis revealed that sharp peaks at week 27 were followed by 3 major incidents, two of which were dose related. The peak was largely due to the front desk, which caused distractions in other areas. The analysis led to many process improvement (PI) projects, but there is still a major systematic issue with the use of forms. The solution we identified is to implement Smart Forms to perform error checking and interlocking. Our first initiative replaced our daily QA checklist with a form that uses custom validation routines, preventing therapists from proceeding with treatments until out-of-tolerance conditions are corrected. PITSTOP has increased the number of quality initiatives in our department, and we have discovered or confirmed common underlying causes of a variety of seemingly unrelated errors. It has motivated the replacement of all forms with smart forms. © 2012 American Association of Physicists in Medicine.
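The association analysis mentioned in the abstract (e.g. the finding that 24% of physician events also involved a nurse) amounts to conditional co-occurrence counting over the event log; the log entries below are invented for illustration:

```python
from collections import Counter

# Invented incident descriptions; the real database holds radiotherapy
# process deviations logged by staff.
events = [
    "physician order unclear nurse paged",
    "physician delayed approval",
    "front desk distraction during checkin",
    "nurse flagged dose discrepancy",
    "physician and nurse miscommunication on schedule",
]

# Key word frequencies across events (each word counted once per event).
freq = Counter(word for e in events for word in set(e.split()))

# Association analysis: P(nurse involved | physician involved).
physician_events = [e for e in events if "physician" in e]
both = [e for e in physician_events if "nurse" in e]
association = len(both) / len(physician_events)
```

Ranking `freq` surfaces the dominant key words, and conditional fractions like `association` are what link seemingly unrelated events back to a common root cause such as miscommunication.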
Method for routing events from key strokes in a multi-processing computer system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rhodes, D.A.; Rustici, E.; Carter, K.H.
1990-01-23
The patent describes a method of routing user input in a computer system which concurrently runs a plurality of processes. It comprises: generating keycodes representative of keys typed by a user; distinguishing generated keycodes by looking up each keycode in a routing table which assigns each possible keycode to an individual assigned process of the plurality of processes, one of which processes being a supervisory process; then, sending each keycode to its assigned process until a keycode assigned to the supervisory process is received; sending keycodes received subsequent to the keycode assigned to the supervisory process to a buffer; next, providing additional keycodes to the supervisory process from the buffer until the supervisory process has completed operation; and sending keycodes stored in the buffer to processes assigned therewith after the supervisory process has completed operation.
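The claimed routing method can be paraphrased in a short sketch; the routing table, process names, and keycodes here are hypothetical, chosen only to exercise the claim's steps:

```python
from collections import deque

# Hypothetical routing table: keycode -> assigned process.  "supervisor"
# plays the role of the claim's supervisory process.
routing_table = {ord('a'): 'editor', ord('b'): 'shell', 27: 'supervisor'}

delivered = []          # (process, keycode) pairs actually delivered
buffer = deque()        # keycodes typed while the supervisor is active
supervisor_active = False

def route(keycode):
    global supervisor_active
    if supervisor_active:
        buffer.append(keycode)          # queue input until supervisor finishes
    elif routing_table.get(keycode) == 'supervisor':
        supervisor_active = True
        delivered.append(('supervisor', keycode))
    else:
        delivered.append((routing_table.get(keycode, 'default'), keycode))

def supervisor_done():
    """Supervisor completed operation: drain buffered keycodes."""
    global supervisor_active
    supervisor_active = False
    while buffer and not supervisor_active:
        route(buffer.popleft())

for k in [ord('a'), 27, ord('b')]:      # 'b' arrives while supervisor runs
    route(k)
assert buffer[0] == ord('b')            # buffered, not yet delivered
supervisor_done()                        # buffer drains to assigned processes
```

The buffer is what keeps ordinary processes from receiving keystrokes out of order while the supervisory process holds the input focus.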
Salehi, Mojtaba; Bahreininejad, Ardeshir
2011-08-01
Optimization of process planning is considered as the key technology for computer-aided process planning which is a rather complex and difficult procedure. A good process plan of a part is built up based on two elements: (1) the optimized sequence of the operations of the part; and (2) the optimized selection of the machine, cutting tool and Tool Access Direction (TAD) for each operation. In the present work, the process planning is divided into preliminary planning, and secondary/detailed planning. In the preliminary stage, based on the analysis of order and clustering constraints as a compulsive constraint aggregation in operation sequencing and using an intelligent searching strategy, the feasible sequences are generated. Then, in the detailed planning stage, using the genetic algorithm which prunes the initial feasible sequences, the optimized operation sequence and the optimized selection of the machine, cutting tool and TAD for each operation based on optimization constraints as an additive constraint aggregation are obtained. The main contribution of this work is the optimization of sequence of the operations of the part, and optimization of machine selection, cutting tool and TAD for each operation using the intelligent search and genetic algorithm simultaneously.
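The two-stage idea, generating feasible sequences under the compulsive (precedence) constraints and then optimizing an additive criterion evolutionarily, can be sketched as follows. The part, machines, and constraints are invented, and a bare-bones steady-state loop stands in for the paper's genetic algorithm:

```python
import random

random.seed(1)

# Hypothetical part: each operation runs on one machine, and some
# operations must precede others (the "compulsive" constraints).
machine = {'drill1': 'M1', 'drill2': 'M1', 'mill': 'M2', 'bore': 'M2', 'tap': 'M1'}
precedence = [('drill1', 'tap'), ('mill', 'bore')]
ops = list(machine)

def feasible(seq):
    """True if the sequence respects every precedence constraint."""
    pos = {op: i for i, op in enumerate(seq)}
    return all(pos[a] < pos[b] for a, b in precedence)

def cost(seq):
    """Additive optimization criterion: number of machine changes."""
    return sum(machine[x] != machine[y] for x, y in zip(seq, seq[1:]))

# Stage 1 (preliminary planning): generate feasible sequences.
pop = []
while len(pop) < 20:
    s = random.sample(ops, len(ops))
    if feasible(s):
        pop.append(s)

# Stage 2 (detailed planning): a bare-bones steady-state evolutionary
# loop with tournament selection and swap mutation.
for _ in range(200):
    parent = min(random.sample(pop, 3), key=cost)
    child = parent[:]
    i, j = random.sample(range(len(ops)), 2)
    child[i], child[j] = child[j], child[i]
    if feasible(child):                      # keep only feasible offspring
        pop.append(child)
        pop.remove(max(pop, key=cost))       # discard the current worst

best = min(pop, key=cost)
```

Splitting the work this way mirrors the paper's design: the hard precedence constraints prune the search space up front, so the evolutionary stage only ever compares candidates that are already manufacturable.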
Small Interactive Image Processing System (SMIPS) system description
NASA Technical Reports Server (NTRS)
Moik, J. G.
1973-01-01
The Small Interactive Image Processing System (SMIPS) operates under control of the IBM OS/MVT operating system and uses an IBM 2250 Model 1 display unit as its interactive graphic device. The input language, in the form of character strings or attentions from keys and the light pen, is interpreted and causes processing of built-in image processing functions as well as execution of a variable number of application programs kept on a private disk file. A description of design considerations is given, and the characteristics, structure and logic flow of SMIPS are summarized. Data management and graphic programming techniques used for the interactive manipulation and display of digital pictures are also discussed.
KSC ground operations planning for Space Station
NASA Technical Reports Server (NTRS)
Lyon, J. R.; Revesz, W., Jr.
1993-01-01
At the Kennedy Space Center (KSC) in Florida, processing facilities are being built and activated to support the processing, checkout, and launch of Space Station elements. The generic capability of these facilities will be utilized to support resupply missions for payloads, life support services, and propellants for the 30-year life of the program. Special Ground Support Equipment (GSE) is being designed for Space Station hardware special handling requirements, and a Test, Checkout, and Monitoring System (TCMS) is under development to verify that the flight elements are ready for launch. The facilities and equipment used at KSC, along with the testing required to accomplish the mission, are described in detail to provide an understanding of the complexity of operations at the launch site. Assessments of hardware processing flows through KSC are being conducted to minimize the processing flow times for each hardware element. Baseline operations plans and the changes made to improve operations and reduce costs are described, recognizing that efficient ground operations are a major key to success of the Space Station.
Manufacturing Methods and Technology Program Automatic In-Process Microcircuit Evaluation.
1980-10-01
methods of controlling the AIME system are with the computer and associated interface (CPU control), and with controls located on the front panels... Sync and Blanking signals. When the AIME system is being operated by the front panel controls, the computer does not influence the system operation. SU... the color video monitor display. The operator controls these parameters by 1) depressing the appropriate key on the keyboard, 2) observing on the
Three-pass protocol scheme for bitmap image security by using vernam cipher algorithm
NASA Astrophysics Data System (ADS)
Rachmawati, D.; Budiman, M. A.; Aulya, L.
2018-02-01
Confidentiality, integrity, and efficiency are crucial aspects of data security. Among digital data, image data is especially prone to abuses such as duplication and modification. One data security technique is cryptography. The security of the Vernam Cipher cryptography algorithm depends heavily on the key exchange process: if the key is leaked, the security of the algorithm collapses. Therefore, a method that minimizes key leakage during the exchange of messages is required. The method used here is known as the Three-Pass Protocol. This protocol enables message delivery without any key exchange, so messages can reach the receiver safely without fear of key leakage. The system is built using the Java programming language. The test materials are images of size 200×200, 300×300, 500×500, 800×800 and 1000×1000 pixels. The experiments showed that the Vernam Cipher algorithm in the Three-Pass Protocol scheme could restore the original image.
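The mechanics of the Three-Pass Protocol with a Vernam (XOR) cipher can be sketched in a few lines; this is Python rather than the paper's Java, and the byte string stands in for bitmap pixel data:

```python
import secrets

def xor(data, key):
    """Vernam-style XOR of two equal-length byte strings."""
    return bytes(a ^ b for a, b in zip(data, key))

message = bytes(range(16))   # stand-in for bitmap pixel data

# Each party keeps a private random key; no key is ever transmitted.
key_a = secrets.token_bytes(len(message))   # sender's key
key_b = secrets.token_bytes(len(message))   # receiver's key

pass1 = xor(message, key_a)   # pass 1: sender -> receiver
pass2 = xor(pass1, key_b)     # pass 2: receiver -> sender (adds own key)
pass3 = xor(pass2, key_a)     # pass 3: sender removes own key
recovered = xor(pass3, key_b) # receiver removes own key

assert recovered == message   # the image is restored exactly
```

The scheme works because XOR is commutative and self-inverse, so the two keys can be removed in either order. Note the well-known caveat that with XOR specifically, the three transmissions XORed together equal the message itself, which is why other commutative ciphers are often preferred for this protocol.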
Artifact-Based Transformation of IBM Global Financing
NASA Astrophysics Data System (ADS)
Chao, Tian; Cohn, David; Flatgard, Adrian; Hahn, Sandy; Linehan, Mark; Nandi, Prabir; Nigam, Anil; Pinel, Florian; Vergo, John; Wu, Frederick Y.
IBM Global Financing (IGF) is transforming its business using the Business Artifact Method, an innovative business process modeling technique that identifies key business artifacts and traces their life cycles as they are processed by the business. IGF is a complex, global business operation with many business design challenges. The Business Artifact Method is a fundamental shift in how to conceptualize, design and implement business operations. The Business Artifact Method was extended to solve the problem of designing a global standard for a complex, end-to-end process while supporting local geographic variations. Prior to employing the Business Artifact method, process decomposition, Lean and Six Sigma methods were each employed on different parts of the financing operation. Although they provided critical input to the final operational model, they proved insufficient for designing a complete, integrated, standard operation. The artifact method resulted in a business operations model that was at the right level of granularity for the problem at hand. A fully functional rapid prototype was created early in the engagement, which facilitated an improved understanding of the redesigned operations model. The resulting business operations model is being used as the basis for all aspects of business transformation in IBM Global Financing.
Robertson, Erin L; Liber, Karsten
2007-11-01
The main objectives of this in situ study were to evaluate the usefulness of an in situ bioassay to determine if downstream water bodies at the Key Lake and Rabbit Lake uranium operations (Saskatchewan, Canada) were toxic to Hyalella azteca and, if toxicity was observed, to differentiate between the contribution of surface water and sediment contamination to in situ toxicity. These objectives were achieved by performing 4-d in situ bioassays with laboratory-reared H. azteca confined in specially designed, paired, surface water and sediment exposure chambers. Results from the in situ bioassays revealed significant mortality, relative to the respective reference site, at the exposure sites at both Key Lake (p = 0.001) and Rabbit Lake (p = 0.001). No statistical differences were found between survival in surface water and sediment exposure chambers at either Key Lake (p = 0.232) or Rabbit Lake (p = 0.072). This suggests that surface water (the common feature of both types of exposure chambers) was the primary cause of in situ mortality of H. azteca at both operations, although this relationship was stronger at Key Lake. At Key Lake, the primary cause of aquatic toxicity to H. azteca did not appear to be correlated with the variables measured in this study, but most likely with a pulse of organic mill-process chemicals released during the time of the in situ study, a transient event that was caused by a problem with the mill's solvent extraction process. The suspected cause of in situ toxicity to H. azteca at Rabbit Lake was high levels of uranium in surface water, sediment, and pore water.
NASA Astrophysics Data System (ADS)
Duda, James L.; Mulligan, Joseph; Valenti, James; Wenkel, Michael
2005-01-01
A key feature of the National Polar-orbiting Operational Environmental Satellite System (NPOESS) is the Northrop Grumman Space Technology patent-pending innovative data routing and retrieval architecture called SafetyNet™. The SafetyNet™ ground system architecture, combined with the Interface Data Processing Segment (IDPS), will provide low data latency and high data availability to NPOESS customers. NPOESS will cut the time between observation and delivery by a factor of four compared with today's space-based weather systems, the Defense Meteorological Satellite Program (DMSP) and NOAA's Polar-orbiting Operational Environmental Satellites (POES). SafetyNet™ will be a key element of the NPOESS architecture, delivering near real-time data over commercial telecommunications networks. Scattered around the globe, the 15 unmanned ground receptors are linked by fiber-optic systems to four central data processing centers in the U.S. known as Weather Centrals. The National Environmental Satellite, Data and Information Service; the Air Force Weather Agency; the Fleet Numerical Meteorology and Oceanography Center; and the Naval Oceanographic Office operate the Centrals. In addition, this ground system architecture will have unused capacity attendant with an infrastructure that can accommodate additional users.
Welch, Shari J; Stone-Griffith, Suzanne; Asplin, Brent; Davidson, Steven J; Augustine, James; Schuur, Jeremiah D
2011-05-01
The public, payers, hospitals, and Centers for Medicare and Medicaid Services (CMS) are demanding that emergency departments (EDs) measure and improve performance, but this cannot be done unless we define the terms used in ED operations. On February 24, 2010, 32 stakeholders from 13 professional organizations met in Salt Lake City, Utah, to standardize ED operations metrics and definitions, which are presented in this consensus paper. Emergency medicine (EM) experts attending the Second Performance Measures and Benchmarking Summit reviewed, expanded, and updated key definitions for ED operations. Prior to the meeting, participants were provided with the definitions created at the first summit in 2006 and relevant documents from other organizations, and asked to identify gaps and limitations in the original work. Those responses were used to devise a plan to revise and update the definitions. At the summit, attendees discussed and debated key terminology, and workgroups were created to draft a more comprehensive document. These results have been crafted into two reference documents: one covering metrics, and the operations dictionary presented here. The ED Operations Dictionary defines ED spaces, processes, patient populations, and new ED roles. Common definitions of key terms will improve the ability to compare ED operations research and practice and provide a common language for frontline practitioners, managers, and researchers. © 2011 by the Society for Academic Emergency Medicine.
Recent developments in membrane-based separations in biotechnology processes: review.
Rathore, A S; Shirke, A
2011-01-01
Membrane-based separations are the most ubiquitous unit operations in biotech processes. There are several key reasons for this. First, they can be used with a large variety of applications including clarification, concentration, buffer exchange, purification, and sterilization. Second, they are available in a variety of formats, such as depth filtration, ultrafiltration, diafiltration, nanofiltration, reverse osmosis, and microfiltration. Third, they are simple to operate and are generally robust toward normal variations in feed material and operating parameters. Fourth, membrane-based separations typically require lower capital cost when compared to other processing options. As a result of these advantages, a typical biotech process has anywhere from 10 to 20 membrane-based separation steps. In this article we review the major developments that have occurred on this topic with a focus on developments in the last 5 years.
Gnoth, S; Jenzsch, M; Simutis, R; Lübbert, A
2007-10-31
The Process Analytical Technology (PAT) initiative of the FDA is a reaction to the increasing discrepancy between what is currently possible in the supervision and control of pharmaceutical production processes and what is actually applied in industrial manufacturing. For many years, rigid approval practices based on standard operating procedures more or less inhibited adapting production reactors to the state of the art. Now PAT paves the way for continuous process and product improvements through improved process supervision based on knowledge-based data analysis, "Quality-by-Design" concepts, and, finally, feedback control. Examples of up-to-date implementations of this concept are presented. They are taken from one key group of processes in recombinant pharmaceutical protein manufacturing: the cultivation of genetically modified Escherichia coli bacteria.
Novel secret key generation techniques using memristor devices
NASA Astrophysics Data System (ADS)
Abunahla, Heba; Shehada, Dina; Yeun, Chan Yeob; Mohammad, Baker; Jaoude, Maguy Abi
2016-02-01
This paper proposes novel secret key generation techniques using memristor devices. The approach depends on using the initial profile of a memristor as a master key. In addition, session keys are generated using the master key and other specified parameters. In contrast to existing memristor-based security approaches, the proposed development is cost-effective and power-efficient, since the operation can be achieved with a single device rather than a crossbar structure. An algorithm is suggested and demonstrated using a physics-based MATLAB model. It is shown that the generated keys can have dynamic size, which provides perfect security. Moreover, the proposed encryption and decryption technique using the memristor-based generated keys outperforms Triple Data Encryption Standard (3DES) and Advanced Encryption Standard (AES) in terms of processing time. This paper is enriched by providing characterization results of a fabricated microscale Al/TiO2/Al memristor prototype in order to prove the concept of the proposed approach and study the impacts of process variations. The work proposed in this paper is a milestone towards memristor-based System-on-Chip (SoC) security.
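The master-key/session-key relationship described above can be imitated in software. The device profile, session parameters, and hash-based derivation below are assumptions made purely for illustration; the paper derives its keys from the physics of an actual memristor model rather than from a hash:

```python
import hashlib

# Stand-in for a measured initial memristor profile (e.g. quantized
# resistance readings); in the paper this comes from the physical device.
profile = [11.2, 10.9, 11.5, 10.7, 11.1]
master_key = hashlib.sha256(repr(profile).encode()).digest()

def session_key(master, session_id, length=32):
    """Derive a session key of arbitrary ("dynamic") size from the
    master key plus a per-session parameter, by hash chaining."""
    out = b''
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(master + session_id + bytes([counter])).digest()
        counter += 1
    return out[:length]

k1 = session_key(master_key, b'session-1', 16)
k2 = session_key(master_key, b'session-2', 48)
```

The sketch captures the two properties the abstract emphasizes: session keys of freely chosen size, all rooted in a single per-device master secret.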
Ausserhofer, Dietmar; Rakic, Severin; Novo, Ahmed; Dropic, Emira; Fisekovic, Eldin; Sredic, Ana; Van Malderen, Greet
2016-06-01
We explored how selected 'positive deviant' healthcare facilities in Bosnia and Herzegovina approach the continuous development, adaptation, implementation, monitoring and evaluation of nursing-related standard operating procedures. Standardized nursing care is internationally recognized as a critical element of safe, high-quality health care; yet very little research has examined one of its key instruments: nursing-related standard operating procedures. Despite variability in Bosnia and Herzegovina's healthcare and nursing care quality, we assumed that some healthcare facilities would have developed effective strategies to elevate nursing quality and safety through the use of standard operating procedures. Guided by the 'positive deviance' approach, we used a multiple-case study design to examine a criterion sample of four facilities (two primary healthcare centres and two hospitals), collecting data via focus groups and individual interviews. In each studied facility, certification/accreditation processes were crucial to the initiation of continuous development, adaptation, implementation, monitoring and evaluation of nursing-related standard operating procedures. In one hospital and one primary healthcare centre, nurses working in advanced roles (i.e. quality coordinators) were responsible for developing and implementing nursing-related standard operating procedures. Across the four studied institutions, we identified a consistent approach to processes related to standard operating procedures. The certification/accreditation process is enabling necessary changes in institutions' organizational cultures, empowering nurses to take on advanced roles in improving the safety and quality of nursing care. Standardizing nursing procedures is key to improving the safety and quality of nursing care.
Nursing and health policy measures are needed in Bosnia and Herzegovina to establish a functioning institutional framework, including regulatory bodies, educational systems for developing nurses' capacities, and the inclusion of nursing-related standard operating procedures in certification/accreditation standards. © 2016 International Council of Nurses.
Quantum optical circulator controlled by a single chirally coupled atom
NASA Astrophysics Data System (ADS)
Scheucher, Michael; Hilico, Adèle; Will, Elisa; Volz, Jürgen; Rauschenbeutel, Arno
2016-12-01
Integrated nonreciprocal optical components, which have an inherent asymmetry between their forward and backward propagation direction, are key for routing signals in photonic circuits. Here, we demonstrate a fiber-integrated quantum optical circulator operated by a single atom. Its nonreciprocal behavior arises from the chiral interaction between the atom and the transversally confined light. We demonstrate that the internal quantum state of the atom controls the operation direction of the circulator and that it features a strongly nonlinear response at the single-photon level. This enables, for example, photon number-dependent routing and novel quantum simulation protocols. Furthermore, such a circulator can in principle be prepared in a coherent superposition of its operational states and may become a key element for quantum information processing in scalable integrated optical circuits.
ARM Operations and Engineering Procedure Mobile Facility Site Startup
DOE Office of Scientific and Technical Information (OSTI.GOV)
Voyles, Jimmy W
2015-05-01
This procedure exists to define the key milestones, necessary steps, and process rules required to commission and operate an Atmospheric Radiation Measurement (ARM) Mobile Facility (AMF), with a specific focus toward on-time product delivery to the ARM Data Archive. The overall objective is to have the physical infrastructure, networking and communications, and instrument calibration, grooming, and alignment (CG&A) completed with data products available from the ARM Data Archive by the Operational Start Date milestone.
Bashashati, Ali; Fatourechi, Mehrdad; Ward, Rabab K; Birch, Gary E
2007-06-01
Brain-computer interfaces (BCIs) aim at providing a non-muscular channel for sending commands to the external world using the electroencephalographic activity or other electrophysiological measures of the brain function. An essential factor in the successful operation of BCI systems is the methods used to process the brain signals. In the BCI literature, however, there is no comprehensive review of the signal processing techniques used. This work presents the first such comprehensive survey of all BCI designs using electrical signal recordings published prior to January 2006. Detailed results from this survey are presented and discussed. The following key research questions are addressed: (1) what are the key signal processing components of a BCI, (2) what signal processing algorithms have been used in BCIs and (3) which signal processing techniques have received more attention?
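The survey's first research question concerns the key signal processing components of a BCI; a typical decomposition is preprocessing, feature extraction, and classification. The deliberately toy pipeline below illustrates that decomposition only; the synthetic 10 Hz rhythm, variance feature, and fixed threshold are invented choices, not methods from any surveyed design:

```python
import math
import random

random.seed(0)
fs = 128  # samples per one-second epoch

def epoch(has_rhythm):
    """Synthetic one-channel epoch; class 1 carries a 10 Hz rhythm in noise."""
    return [(math.sin(2 * math.pi * 10 * t / fs) if has_rhythm else 0.0)
            + random.gauss(0, 0.5) for t in range(fs)]

def preprocess(x):
    """Component 1: preprocessing (here, just baseline/mean removal)."""
    m = sum(x) / len(x)
    return [v - m for v in x]

def feature(x):
    """Component 2: feature extraction (signal variance as a power proxy)."""
    return sum(v * v for v in x) / len(x)

def classify(x, threshold=0.4):
    """Component 3: classification (a fixed threshold on the feature)."""
    return feature(preprocess(x)) > threshold

hits = sum(classify(epoch(True)) for _ in range(50))
rejections = sum(not classify(epoch(False)) for _ in range(50))
accuracy = (hits + rejections) / 100
```

Real BCI designs substitute far richer algorithms at each stage (spatial filters, spectral features, trained classifiers), but the three-stage skeleton is what the surveyed systems have in common.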
2009-12-01
between two parties, such as a buyer and a seller that is defined by an agreement about their respective rights and responsibilities” (Garrett, 2007, p... capabilities” (p. 270). As stated in the definition, the CMMM can be used to analyze an organization from the buyer or seller’s perspective. In our... study, USSOCOM is analyzed from the buyers’ perspective. The CMMM utilizes six key process areas when analyzing an organization from the buyer’s
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-30
... (RFHM), Ignition Node Module (IGNM), Engine Control Module, Body Controller Module, Sentry Key... disable engine operation and immobilize the vehicle after two seconds of running. This process is also...
Harvey, Jasmine; Avery, Anthony J; Ashcroft, Darren; Boyd, Matthew; Phipps, Denham L; Barber, Nicholas
2015-01-01
Identifying risk is an important facet of a safety practice in an organization. To identify risk, all components within a system of operation should be considered. In clinical safety practice, a team of people, technologies, procedures and protocols, management structure and environment have been identified as key components in a system of operation. To explore risks in relation to prescription dispensing in community pharmacies by taking into account relationships between key components that relate to the dispensing process. Fifteen community pharmacies in England with varied characteristics were identified, and data were collected using non-participant observations, shadowing and interviews. Approximately 360 hours of observations and 38 interviews were conducted by the team. Observation field notes from each pharmacy were written into case studies. Overall, 52,500 words from 15 case studies and interview transcripts were analyzed using thematic and line-by-line analyses. Validation techniques included multiple data collectors co-authoring each case study for consensus, review of case studies by members of the wider team including academic and practicing community pharmacists, and patient safety experts and two presentations (internally and externally) to review and discuss findings. Risks identified were related to relationships between people and other key components in dispensing. This included how different levels of staff communicated internally and externally, followed procedures, interacted with technical systems, worked with management, and engaged with the environment. In a dispensing journey, the following categories were identified which show how risks are inextricably linked through relationships between human components and other key components: 1) dispensing with divided attention; 2) dispensing under pressure; 3) dispensing in a restricted space or environment; and, 4) managing external influences. 
To identify and evaluate risks effectively, an approach that includes understanding relationships between key components in dispensing is required. Since teams of people in community pharmacies are a key dispensing component, and therefore part of the operational process, it is important to note how they relate to other components in the environment within which they operate. Pharmacies can take the opportunity to reflect on the organization of their systems and review in particular how they can improve on the four key categories identified. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
Informal Learning Processes in a Worker Co-Operative. NALL Working Paper.
ERIC Educational Resources Information Center
Quarter, Jack; Midha, Harish
A study was conducted to understand the informal learning processes of the members of a worker natural foods store cooperative, The Big Carrot, in Toronto. Eight members with central roles in the natural foods retailer were interviewed. In addition, key documents and other writings on the cooperative were examined. The data indicate that members…
Assessment of Spacecraft Operational Status Using Electro-Optical Predictive Techniques
2010-09-01
panel appendages, may require enhanced preflight characterization processes to support monitoring by passive, remote, nonimaging optical sensors...observing and characterizing key spacecraft features. The simulation results are based on electro-optical signatures apparent to nonimaging sensors, along...and communication equipment, may require enhanced preflight characterization processes to support monitoring by passive, remote, nonimaging optical
ERIC Educational Resources Information Center
McAliney, Peter J.
2009-01-01
This article presents a process for valuing a portfolio of learning assets used by line executives across industries to value traditional business assets. Embedded within the context of enterprise risk management, this strategic asset allocation process is presented step by step, providing readers the operational considerations to implement this…
Carbothermal Production of Magnesium: Csiro's Magsonic™ Process
NASA Astrophysics Data System (ADS)
Prentice, Leon H.; Nagle, Michael W.; Barton, Timothy R. D.; Tassios, Steven; Kuan, Benny T.; Witt, Peter J.; Constanti-Carey, Keri K.
Carbothermal production has been recognized as conceptually the simplest and cleanest route to magnesium metal, but has suffered from technical challenges of development and scale-up. Work by CSIRO has now successfully demonstrated the technology using supersonic quenching of magnesium vapor (the MagSonic™ Process). Key barriers to process development have been overcome: the experimental program has achieved sustained operation, no nozzle blockage, minimal reversion, and safe handling of pyrophoric powders. The laboratory equipment has been operated at industrially relevant magnesium vapor concentrations (>25% Mg) for multiple runs with no blockage. Novel computational fluid dynamics (CFD) modeling of the shock quenching and metal vapor condensation has informed nozzle design and is supported by experimental data. Reversion below 10% has been demonstrated, and magnesium successfully purified (>99.9%) from the collected powder. Safe operating procedures have been developed and demonstrated, minimizing the risk of powder explosion. The MagSonic™ Process is now ready to progress to significantly larger scale and continuous operation.
From Prime to Extended Mission: Evolution of the MER Tactical Uplink Process
NASA Technical Reports Server (NTRS)
Mishkin, Andrew H.; Laubach, Sharon
2006-01-01
To support a 90-day surface mission for two robotic rovers, the Mars Exploration Rover mission designed and implemented an intensive tactical operations process, enabling daily commanding of each rover. Using a combination of new processes, custom software tools, a Mars-time staffing schedule, and seven-day-a-week operations, the MER team was able to compress the traditional weeks-long command-turnaround for a deep space robotic mission to about 18 hours. However, the pace of this process was never intended to be continued indefinitely. Even before the end of the three-month prime mission, MER operations began evolving towards greater sustainability. A combination of continued software tool development, increasing team experience, and availability of reusable sequences first reduced the mean process duration to approximately 11 hours. The number of workshifts required to perform the process dropped, and the team returned to a modified 'Earth-time' schedule. Additional process and tool adaptation eventually provided the option of planning multiple Martian days of activity within a single workshift, making 5-day-a-week operations possible. The vast majority of the science team returned to their home institutions, continuing to participate fully in the tactical operations process remotely. MER has continued to operate for over two Earth-years as many of its key personnel have moved on to other projects, the operations team and budget have shrunk, and the rovers have begun to exhibit symptoms of aging.
NASA Astrophysics Data System (ADS)
Acevedo, Romina; Orihuela, Nuris; Blanco, Rafael; Varela, Francisco; Camacho, Enrique; Urbina, Marianela; Aponte, Luis Gabriel; Vallenilla, Leopoldo; Acuña, Liana; Becerra, Roberto; Tabare, Terepaima; Recaredo, Erica
2009-12-01
Built in cooperation with the P.R. of China and launched on October 29, 2008, VENESAT-1 (Simón Bolívar Satellite) is the first telecommunications satellite of the Bolivarian Republic of Venezuela. It operates in the C band (covering Central America, the Caribbean region and most of South America), the Ku band (Bolivia, Cuba, Dominican Republic, Haiti, Paraguay, Uruguay, Venezuela) and the Ka band (Venezuela). The launch of VENESAT-1 represents the starting point for Venezuela as an active player in the field of space science and technology. In order to fulfill mission requirements and to guarantee the satellite's health, local professionals must provide continuous monitoring, orbit calculation, maneuver preparation and execution, data preparation and processing, as well as database management at the VENESAT-1 Ground Segment, which includes both a primary and a backup site. In summary, data processing and real-time data management are part of the daily activities performed by the personnel at the ground segment. Using published and unpublished information, this paper presents how human resource organization can enhance space information acquisition and processing, by analyzing the proposed organizational structure for the VENESAT-1 Ground Segment. We have found that the proposed units within the organizational structure reflect 3 key issues for mission management: Satellite Operations, Ground Operations, and Site Maintenance. The proposed organization is simple (3 hierarchical levels and 7 units), and communication channels seem efficient in terms of facilitating information acquisition, processing, storage, flow and exchange.
Furthermore, the proposal includes a manual containing a full description of personnel responsibilities and profiles, which efficiently allocates the management and operation of key satellite-operation software, such as the Real-time Data Transaction Software (RDTS), Data Management Software (DMS), and Carrier Spectrum Monitoring Software (CSM), among the different organizational units. Throughout this process, international cooperation has played a key role in the consolidation of the country's space capabilities, especially through the continuous and arduous exchange of information, documentation and expertise between Chinese and Venezuelan personnel at the ground stations. Based on the principles of technology transfer and human training, since 1999 the Bolivarian Republic of Venezuela has shown an increasing interest in developing local space capabilities for peaceful purposes. According to the analysis we have performed, the proposed organizational structure of the VENESAT-1 ground segment will allow the country to face the challenges imposed by the operation of complex technologies. By enhancing human resource organization, this proposal will help to fulfill mission requirements and to facilitate the safe access, processing and storage of satellite data across the organization, during both nominal and contingency situations.
Architectures Toward Reusable Science Data Systems
NASA Astrophysics Data System (ADS)
Moses, J. F.
2014-12-01
Science Data Systems (SDS) comprise an important class of data processing systems that support product generation from remote sensors and in-situ observations. These systems enable research into new science data products, replication of experiments and verification of results. NASA has been building ground systems for satellite data processing since the first Earth observing satellites launched and is continuing development of systems to support NASA science research, NOAA's weather satellites and USGS's Earth observing satellite operations. The basic data processing workflows and scenarios continue to be valid for remote sensor observations research as well as for the complex multi-instrument operational satellite data systems being built today. System functions such as ingest, product generation and distribution need to be configured and performed in a consistent and repeatable way with an emphasis on scalability. This paper will examine the key architectural elements of several NASA satellite data processing systems currently in operation and under development that make them suitable for scaling and reuse. Examples of architectural elements that have become attractive include virtual machine environments, standard data product formats, metadata content and file naming, workflow and job management frameworks, data acquisition, search, and distribution protocols. By highlighting key elements and implementation experience the goal is to recognize architectures that will outlast their original application and be readily adaptable for new applications. Concepts and principles are explored that lead to sound guidance for SDS developers and strategists.
Architectures Toward Reusable Science Data Systems
NASA Technical Reports Server (NTRS)
Moses, John
2015-01-01
Science Data Systems (SDS) comprise an important class of data processing systems that support product generation from remote sensors and in-situ observations. These systems enable research into new science data products, replication of experiments and verification of results. NASA has been building systems for satellite data processing since the first Earth observing satellites launched and is continuing development of systems to support NASA science research and NOAAs Earth observing satellite operations. The basic data processing workflows and scenarios continue to be valid for remote sensor observations research as well as for the complex multi-instrument operational satellite data systems being built today. System functions such as ingest, product generation and distribution need to be configured and performed in a consistent and repeatable way with an emphasis on scalability. This paper will examine the key architectural elements of several NASA satellite data processing systems currently in operation and under development that make them suitable for scaling and reuse. Examples of architectural elements that have become attractive include virtual machine environments, standard data product formats, metadata content and file naming, workflow and job management frameworks, data acquisition, search, and distribution protocols. By highlighting key elements and implementation experience we expect to find architectures that will outlast their original application and be readily adaptable for new applications. Concepts and principles are explored that lead to sound guidance for SDS developers and strategists.
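As an illustrative sketch only (the names and structure below are assumptions, not NASA code), the reuse pattern the paper describes — generic ingest, product-generation, and distribution steps parameterized by mission-specific configuration such as file-naming conventions and product algorithms — might look like:

```python
# Hypothetical sketch of a reusable science-data-system (SDS) workflow:
# the pipeline stages stay fixed while a configuration record supplies the
# mission-specific pieces (naming convention, product algorithm).

from dataclasses import dataclass
from typing import Callable

@dataclass
class MissionConfig:
    name: str
    file_pattern: str                         # mission-specific naming convention
    product_maker: Callable[[bytes], bytes]   # mission-specific science algorithm

def ingest(raw: bytes, cfg: MissionConfig) -> dict:
    """Wrap raw sensor data in a granule record tagged with its mission."""
    return {"mission": cfg.name, "payload": raw}

def generate_product(granule: dict, cfg: MissionConfig) -> dict:
    """Apply the mission's product algorithm and naming convention."""
    granule["product"] = cfg.product_maker(granule["payload"])
    granule["filename"] = cfg.file_pattern.format(mission=granule["mission"])
    return granule

def distribute(granule: dict) -> str:
    """Stand-in for a search/distribution endpoint: publish and return the name."""
    return granule["filename"]

def run_pipeline(raw: bytes, cfg: MissionConfig) -> str:
    """Same ingest/product/distribution flow, reused across missions."""
    return distribute(generate_product(ingest(raw, cfg), cfg))
```

Swapping in a new `MissionConfig` is the only change needed to reuse the pipeline for another mission, which is the scaling property the paper's architectural elements aim for.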
Laparoscopic repair of inguinal hernia in adults
Yang, Xue-Fei
2016-01-01
Laparoscopic repair of inguinal hernia is minimally invasive and has confirmed effectiveness. The procedures include intraperitoneal onlay mesh (IPOM) repair, transabdominal preperitoneal (TAPP) repair and total extraperitoneal (TEP) repair. These procedures differ completely from open operations in their anatomic point of view, process and technical key points. The technical details of these operations are discussed in this article, as well as treatment strategies for some special conditions. PMID:27867954
2016-12-02
Quantum Computing , University of Waterloo, Waterloo ON, N2L 3G1, Canada (Dated: December 1, 2016) Continuous variable (CV) quantum key distribution (QKD...Networking with QUantum operationally-Secure Technology for Maritime Deployment (CONQUEST) Contract Period of Performance: 2 September 2016 – 1 September...this letter or have any other questions. Sincerely, Raytheon BBN Technologies Kathryn Carson Program Manager Quantum Information Processing
ITO-based evolutionary algorithm to solve traveling salesman problem
NASA Astrophysics Data System (ADS)
Dong, Wenyong; Sheng, Kang; Yang, Chuanhua; Yi, Yunfei
2014-03-01
In this paper, an ITO algorithm inspired by the ITO stochastic process is proposed for the Traveling Salesman Problem (TSP). Many meta-heuristic methods have so far been successfully applied to the TSP; however, ITO, as a member of this family, still requires demonstration on the TSP. Starting from the design of the key operators, which include the move operator and the wave operator, the ITO-based method for the TSP is presented. Moreover, the performance of the ITO algorithm under different parameter sets and the maintenance of population diversity information are also studied.
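The abstract does not define the ITO move and wave operators, so as a purely illustrative stand-in, here is a classic 2-opt edge-exchange "move" for TSP tours embedded in a simple stochastic local search; nothing below is taken from the ITO algorithm itself:

```python
import random

# Illustrative sketch: a 2-opt edge-exchange move (a common TSP move operator,
# assumed here, not the paper's actual operator) driven by random sampling.

def two_opt_move(tour, i, j):
    """Reverse the segment tour[i..j], exchanging two edges of the closed tour."""
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

def tour_length(tour, dist):
    """Total length of the closed tour under distance matrix dist."""
    return sum(dist[tour[k]][tour[(k + 1) % len(tour)]] for k in range(len(tour)))

def stochastic_two_opt(tour, dist, iters=2000, seed=0):
    """Greedily accept any randomly sampled 2-opt move that shortens the tour."""
    rng = random.Random(seed)
    best, best_len = tour[:], tour_length(tour, dist)
    for _ in range(iters):
        i, j = sorted(rng.sample(range(len(tour)), 2))
        cand = two_opt_move(best, i, j)
        cand_len = tour_length(cand, dist)
        if cand_len < best_len:
            best, best_len = cand, cand_len
    return best, best_len
```

On four cities at the corners of a unit square, the search recovers the optimal perimeter tour of length 4 from a crossed starting tour.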
Patel, Darshan C; Lyu, Yaqi Fara; Gandarilla, Jorge; Doherty, Steve
2018-04-03
In-process sampling and analysis is an important aspect of monitoring kinetic profiles and impurity formation or rejection, both in development and during commercial manufacturing. In pharmaceutical process development, the technology of choice for a substantial portion of this analysis is high-performance liquid chromatography (HPLC). Traditionally, the sample extraction and preparation for reaction characterization have been performed manually. This can be time consuming, laborious, and impractical for long processes. Depending on the complexity of the sample preparation, there can be variability introduced by different analysts, and in some cases, the integrity of the sample can be compromised during handling. While there are commercial instruments available for on-line monitoring with HPLC, they lack capabilities in many key areas. Some do not provide integration of the sampling and analysis, while others afford limited flexibility in sample preparation. The current offerings provide a limited number of unit operations available for sample processing and no option for workflow customizability. This work describes development of a microfluidic automated program (MAP) which fully automates the sample extraction, manipulation, and on-line LC analysis. The flexible system is controlled using an intuitive Microsoft Excel based user interface. The autonomous system is capable of unattended reaction monitoring that allows flexible unit operations and workflow customization to enable complex operations and on-line sample preparation. The automated system is shown to offer advantages over manual approaches in key areas while providing consistent and reproducible in-process data. Copyright © 2017 Elsevier B.V. All rights reserved.
Stoller, Marco; Ochando-Pulido, Javier Miguel; Field, Robert
2017-07-14
In the last decades, membrane processes have gained a significant share of the market for wastewater purification. Although the product (i.e., purified water) is not of high added value, these processes are feasible both technically and from an economic point of view, provided the flux is relatively high and that membrane fouling is strongly inhibited. By controlling membrane fouling, the membrane may work for years without service, thus dramatically reducing operating costs and the need for membrane substitution. There is tension between operating at high permeate fluxes, which enhances fouling but reduces capital costs, and operating at lower fluxes, which increases capital costs. Operating batch membrane processes introduces further difficulties, since the feed reaching the membrane changes as a function of the recovery value. This paper is concerned with the operation of such a process. Membrane process designers should avoid membrane fouling by operating membranes away from the permeate flux point where severe fouling is triggered. The design and operation of membrane purification plants is a difficult task, and the ability to precisely describe the evolution of the fouling phenomenon as a function of the operating conditions is a key to success. Many works have reported on the control of fouling by operating below the boundary flux. On the other hand, only a few works have successfully sought to exploit super-boundary operating conditions; most super-boundary operations are reported to have led to process failures. In this work, both sub- and super-boundary operating conditions for a batch nanofiltration membrane process used for olive mill wastewater treatment were investigated. A model to identify a priori the point of transition from sub-boundary to super-boundary operation during a batch run was developed; this provides membrane designers with a helpful tool for avoiding process failures.
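One common formulation of this boundary-flux behavior (a sketch in the spirit of the boundary-flux literature, with illustrative symbols, not equations quoted from this paper) writes the fouling rate of the membrane permeability \(P_m\) piecewise in the permeate flux \(J_p\):

```latex
% Sub-boundary operation: slow, semi-reversible fouling
\frac{dP_m}{dt} = -\alpha, \qquad J_p \le J_b
% Super-boundary operation: fouling accelerates with the excess flux
\frac{dP_m}{dt} = -\alpha - \beta\,(J_p - J_b), \qquad J_p > J_b
```

Here \(J_b\) is the boundary flux, \(\alpha\) a low sub-boundary fouling rate, and \(\beta\) a super-boundary fouling coefficient. In a batch process \(J_b\) itself drifts as the feed concentrates with increasing recovery, which is why the transition point must be identified a priori rather than assumed fixed.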
NASA Technical Reports Server (NTRS)
Sauerwein, Timothy
1989-01-01
The human factors design process used in developing a shuttle orbiter aft flight deck workstation testbed is described. In developing an operator workstation to control various laboratory telerobots, strong elements of human factors engineering and ergonomics are integrated into the design process. The integration of human factors is performed by incorporating user feedback at key stages in the project life cycle. An operator-centered design approach helps ensure that the system users are working with the system designer in the design and operation of the system. The design methodology is presented along with the results of the design and the solutions regarding human factors design principles.
Historical data and analysis for the first five years of KSC STS payload processing
NASA Technical Reports Server (NTRS)
Ragusa, J. M.
1986-01-01
General and specific quantitative and qualitative results were identified from a study of actual operational experience while processing 186 science, applications, and commercial payloads for the first 5 years of Space Transportation System (STS) operations at the National Aeronautics and Space Administration's (NASA) John F. Kennedy Space Center (KSC). All non-Department of Defense payloads from STS-2 through STS-33 were part of the study. Historical data and cumulative program experiences from key personnel were used extensively. Emphasis was placed on various program planning activities and events that affected KSC processing, payload experiences and improvements, payload hardware condition after arrival, services to customers, and the impact of STS operations and delays. From these initial considerations, operational drivers were identified, data for selected processing parameters collected and analyzed, processing criteria and options determined, and STS payload results and conclusions reached. The study showed a significant reduction in time and effort needed by STS customers and KSC to process a wide variety of payload configurations. Also of significance is the fact that even the simplest payloads required more processing resources than were initially assumed. The success to date of payload integration, testing, and mission operations, however, indicates the soundness of the approach taken and the methods used.
The status of membrane bioreactor technology.
Judd, Simon
2008-02-01
In this article, the current status of membrane bioreactor (MBR) technology for wastewater treatment is reviewed. Fundamental facets of the MBR process and membrane and process configurations are outlined and the advantages and disadvantages over conventional suspended growth-based biotreatment are briefly identified. Key process design and operating parameters are defined and their significance explained. The inter-relationships between these parameters are identified and their implications discussed, with particular reference to impacts on membrane surface fouling and channel clogging. In addition, current understanding of membrane surface fouling and identification of candidate foulants is appraised. Although much interest in this technology exists and its penetration of the market will probably increase significantly, there remains a lack of understanding of key process constraints such as membrane channel clogging, and of the science of membrane cleaning.
Strategic planning for hotel operations: The Ritz-Carlton Hotel Company (Part II).
Shriver, S J
1993-01-01
The Ritz-Carlton Hotel Company won the Malcolm Baldrige National Quality Award in 1992. One key to its success is its strategic planning process. In this second part of a two-part article, Stephen Shriver concludes his review of the Ritz-Carlton's approach to strategic planning. Shriver begins by outlining some key steps in plan development and goes on to describe how the Ritz-Carlton disseminates, implements, and evaluates the plan.
Operations Analysis of the 2nd Generation Reusable Launch Vehicle
NASA Technical Reports Server (NTRS)
Noneman, Steven R.; Smith, C. A. (Technical Monitor)
2002-01-01
The Space Launch Initiative (SLI) program is developing a second-generation reusable launch vehicle. The program goals include lowering the risk of loss of crew to 1 in 10,000 and reducing annual operations cost to one third of the cost of the Space Shuttle. The SLI missions include NASA, military and commercial satellite launches and crew and cargo launches to the space station. The SLI operations analyses provide an assessment of the operational support and infrastructure needed to operate candidate system architectures. Measures of the operability are estimated (i.e. system dependability, responsiveness, and efficiency). Operations analysis is used to determine the impact of specific technologies on operations. A conceptual path to reducing annual operations costs by two thirds is based on key design characteristics, such as reusability, and improved processes lowering labor costs. New operations risks can be expected to emerge. They can be mitigated with effective risk management with careful identification, assignment, tracking, and closure. SLI design characteristics such as nearly full reusability, high reliability, advanced automation, and lowered maintenance and servicing coupled with improved processes are contributors to operability and large operating cost reductions.
NASA Astrophysics Data System (ADS)
Heckmann, G.; Route, G.
2009-12-01
The National Oceanic and Atmospheric Administration (NOAA), Department of Defense (DoD), and National Aeronautics and Space Administration (NASA) are jointly acquiring the next-generation weather and environmental satellite system: the National Polar-orbiting Operational Environmental Satellite System (NPOESS). NPOESS replaces the current Polar-orbiting Operational Environmental Satellites (POES) managed by NOAA and the Defense Meteorological Satellite Program (DMSP) managed by the DoD. The NPOESS satellites carry a suite of sensors that collect meteorological, oceanographic, climatological, and solar-geophysical observations of the earth, atmosphere, and space. The ground data processing segment for NPOESS is the Interface Data Processing Segment (IDPS), developed by Raytheon Intelligence and Information Systems. The IDPS processes NPOESS satellite data to provide environmental data products (also known as Environmental Data Records, or EDRs) to NOAA and DoD processing centers operated by the United States government. The IDPS will process EDRs beginning with the NPOESS Preparatory Project (NPP) and continuing through the lifetime of the NPOESS system. IDPS also provides the software and requirements for the Field Terminal Segment (FTS). NPOESS provides support to deployed field terminals by providing mission data in the Low Rate and High Rate downlinks (LRD/HRD), mission support data needed to generate EDRs, and decryption keys needed to decrypt mission data during Selective Data Encryption (SDE). Mission support data consists of globally relevant data, geographically constrained data, and two-line element sets. NPOESS provides these mission support data via the Internet-accessible Mission Support Data Server and the HRD/LRD downlinks. This presentation will illustrate and describe the NPOESS capabilities in support of Field Terminal users.
This discussion will include the mission support data available to Field Terminal users; the content of the direct broadcast HRD and LRD downlinks, identifying the differences between them, including the variability of the LRD downlink; and NPOESS management and distribution of decryption keys to approved field terminals using a Public Key Infrastructure (PKI), the AES standard with 256-bit encryption, and elliptic curve cryptography.
Highly accurate and fast optical penetration-based silkworm gender separation system
NASA Astrophysics Data System (ADS)
Kamtongdee, Chakkrit; Sumriddetchkajorn, Sarun; Chanhorm, Sataporn
2015-07-01
Based on our research work over the last five years, this paper highlights our innovative optical sensing system that can identify and separate silkworm gender, highly suitable for the sericulture industry. The key idea relies on our proposed optical penetration concepts, which, once combined with simple image processing operations, lead to high accuracy in identifying silkworm gender. Inside the system, there are electronic and mechanical parts that assist in controlling the overall system operation, processing the optical signal, and separating the female from the male silkworm pupae. With current system performance, we achieve an accuracy of more than 95% in identifying the gender of silkworm pupae, with an average system operational speed of 30 silkworm pupae/minute. Three of our systems are already in operation at Thailand's Queen Sirikit Sericulture Centers.
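As a hypothetical sketch of the kind of simple image-processing step involved: the abstract gives neither the actual optical-penetration criterion, the threshold value, nor which gender transmits more light, so all three are assumptions made up for this illustration.

```python
# Hypothetical illustration only: classify a pupa from its transmitted-light
# image by a simple mean-intensity threshold. The threshold and the mapping
# of intensity to gender are assumptions, not values from the paper.

def mean_transmitted_intensity(image):
    """Mean transmitted-light intensity over a 2D grid of values in [0, 1]."""
    flat = [px for row in image for px in row]
    return sum(flat) / len(flat)

def classify_pupa(image, threshold=0.5):
    """Threshold the mean transmitted intensity to label a pupa's gender."""
    return "female" if mean_transmitted_intensity(image) > threshold else "male"
```

The point of the sketch is only that, once optical penetration yields a usable intensity signal, the downstream image-processing decision can be this simple.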
Scientific Hybrid Realtiy Environments (SHyRE): Bringing Field Work into the Laboratory
NASA Technical Reports Server (NTRS)
Miller, M. J.; Graff, T.; Young, K.; Coan, D.; Whelley, P.; Richardson, J.; Knudson, C.; Bleacher, J.; Garry, W. B.; Delgado, F.;
2018-01-01
The use of analog environments in preparing for future planetary surface exploration is key to ensuring that we both understand the processes shaping other planetary surfaces and develop the technology, systems, and concepts of operations necessary to operate in these geologic environments. While conducting fieldwork and testing technology in relevant terrestrial field environments is crucial to this development, operational testing often requires a time-intensive iterative process that is hampered by the rigorous conditions (e.g., terrain, weather, location) found in most field environments. Additionally, field deployments can be costly and must be scheduled months in advance, limiting the testing opportunities required to investigate and compare science operational concepts to only once or twice per year.
Cima, Robert R; Brown, Michael J; Hebl, James R; Moore, Robin; Rogers, James C; Kollengode, Anantha; Amstutz, Gwendolyn J; Weisbrod, Cheryl A; Narr, Bradly J; Deschamps, Claude
2011-07-01
Operating rooms (ORs) are resource-intensive and costly hospital units. Maximizing OR efficiency is essential to maintaining an economically viable institution. OR efficiency projects often focus on a limited number of ORs or cases. Efforts across an entire OR suite have not been reported. Lean and Six Sigma methodologies were developed in the manufacturing industry to increase efficiency by eliminating non-value-added steps. We applied Lean and Six Sigma methodologies across an entire surgical suite to improve efficiency. A multidisciplinary surgical process improvement team constructed a value stream map of the entire surgical process from the decision for surgery to discharge. Each process step was analyzed in 3 domains, i.e., personnel, information processed, and time. Multidisciplinary teams addressed 5 work streams to increase value at each step: minimizing volume variation; streamlining the preoperative process; reducing nonoperative time; eliminating redundant information; and promoting employee engagement. Process improvements were implemented sequentially in surgical specialties. Key performance metrics were collected before and after implementation. Across 3 surgical specialties, process redesign resulted in substantial improvements in on-time starts and reduction in number of cases past 5 pm. Substantial gains were achieved in nonoperative time, staff overtime, and ORs saved. These changes resulted in substantial increases in margin/OR/day. Use of Lean and Six Sigma methodologies increased OR efficiency and financial performance across an entire operating suite. Process mapping, leadership support, staff engagement, and sharing performance metrics are keys to enhancing OR efficiency. The performance gains were substantial, sustainable, positive financially, and transferable to other specialties. Copyright © 2011 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
Statistical process control: separating signal from noise in emergency department operations.
Pimentel, Laura; Barrueto, Fermin
2015-05-01
Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine. Copyright © 2015 Elsevier Inc. All rights reserved.
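A minimal sketch of the individuals/moving-range (I-MR) control-chart arithmetic the authors recommend — the ED data below are made up for illustration, but the constants 2.66 (3/d2 for subgroups of two) and 3.267 (D4) are the standard I-MR chart factors:

```python
# Illustrative I-MR control-chart calculation (not code from the article).
# Individuals chart limits: mean ± 2.66 * average moving range.
# Moving-range chart upper limit: 3.267 * average moving range.

def imr_limits(data):
    """Return (mean, LCL, UCL, MR-chart UCL) for sequential measurements."""
    mean = sum(data) / len(data)
    # Moving ranges: absolute differences between consecutive points.
    mrs = [abs(b - a) for a, b in zip(data, data[1:])]
    mr_bar = sum(mrs) / len(mrs)
    ucl = mean + 2.66 * mr_bar     # individuals chart upper control limit
    lcl = mean - 2.66 * mr_bar     # individuals chart lower control limit
    mr_ucl = 3.267 * mr_bar        # moving-range chart upper control limit
    return mean, lcl, ucl, mr_ucl

def special_cause(data):
    """Flag points outside the control limits (simplest Shewhart rule)."""
    mean, lcl, ucl, _ = imr_limits(data)
    return [x for x in data if x < lcl or x > ucl]

# Example: daily median door-to-doctor times in minutes (made-up numbers).
times = [42, 38, 45, 40, 44, 39, 41, 43, 72, 40]
print(special_cause(times))  # the 72-minute day falls above the UCL of 71
```

Points outside the limits indicate special cause variation (signal) requiring action; points within them reflect common cause variation (noise) in a stable process.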
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baroi, Chinmoy; Gaffney, Anne M.; Fushimi, Rebecca
Olefins, or unsaturated hydrocarbons, play a vital role as feedstock for many industrially significant processes. Ethylene is the simplest olefin and a key raw material for consumer products. Oxidative dehydrogenation (ODH) is one of the most promising new routes for ethylene production and can offer a significant advantage in energy efficiency over the conventional steam pyrolysis process. This study is focused on ODH chemistry using mixed metal oxide MoVTeNbOx catalysts, generally referred to as M1 for the key phase known to be active for dehydrogenation. Using performance results from the patent literature, a series of process simulations were conducted to evaluate the effect of feed composition on operating costs, profitability, and process safety. The key results of this study indicate that the ODH reaction can be made safer and more profitable without use of an inert diluent, and furthermore by replacing O2 with CO2 as the oxidant. Modifications of the M1 catalyst composition to accommodate these changes are discussed.
Metrology: Calibration and measurement processes guidelines
NASA Technical Reports Server (NTRS)
Castrup, Howard T.; Eicke, Woodward G.; Hayes, Jerry L.; Mark, Alexander; Martin, Robert E.; Taylor, James L.
1994-01-01
The guide is intended as a resource to aid engineers and systems contracts in the design, implementation, and operation of metrology, calibration, and measurement systems, and to assist NASA personnel in the uniform evaluation of such systems supplied or operated by contractors. Methodologies and techniques acceptable in fulfilling metrology quality requirements for NASA programs are outlined. The measurement process is covered from a high level through more detailed discussions of key elements within the process, Emphasis is given to the flowdown of project requirements to measurement system requirements, then through the activities that will provide measurements with defined quality. In addition, innovations and techniques for error analysis, development of statistical measurement process control, optimization of calibration recall systems, and evaluation of measurement uncertainty are presented.
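The uncertainty evaluation mentioned above commonly combines independent error components in root-sum-square fashion (the GUM-style approach). A minimal sketch, assuming uncorrelated components and our own function names, not text from the guide:

```python
import math

def combined_standard_uncertainty(components):
    """Root-sum-square combination of independent standard uncertainties."""
    return math.sqrt(sum(u * u for u in components))

def expanded_uncertainty(components, k=2):
    """Expanded uncertainty U = k * u_c; k = 2 gives roughly 95% coverage."""
    return k * combined_standard_uncertainty(components)
```

For example, combining standard uncertainties of 3 and 4 units gives a combined standard uncertainty of 5 and an expanded uncertainty (k = 2) of 10.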
Center for Navy Business Excellence: A Catalyst for Business Transformation
2005-12-01
selecting priority strategic initiatives and milestones. Key participants in the strategic planning process are Global Marketing and Sales Group...the Global Marketing Operations Group. This Group works to ensure current, best research practices are tested and utilized; establishing research
Community engagement as conflict prevention: Understanding the social license to operate
NASA Astrophysics Data System (ADS)
Knih, Dejana
This thesis examines community engagement as a form of conflict prevention in order to obtain the social license to operate (SLO) in Alberta's oil and gas industry. It does this by answering the question: what are the key elements of the Social License to Operate and how can these elements be applied to community engagement/consultation in a way that prevents conflicts in Alberta's oil and gas industry? The underlying assumption of this thesis is that building good relationships and working collaboratively functions as a form of conflict prevention and that this in turn leads to the SLO. This thesis outlines the key features of both successful community engagement and of the SLO, to provide a guideline for what is needed to obtain the SLO. Data was collected from semi-structured interviews and through a literature review. The data analysis concluded that there are direct parallels between the key elements of effective community engagement and the key elements of the SLO as identified in the interviews. These parallels are: knowing the community, addressing community needs, corporate social responsibility, relationship building, follow through and evidence for what has been done, executive buy-in, excellent communication, and open dialogue, all within a process which is principled (there is trust, understanding, transparency and respect), inclusive, dynamic, flexible, ongoing, and long-term. Moreover, the key elements of effective community engagement and of the SLO identified in the interviews also overlapped with those found in the literature review, with only one exception. The literature review explicitly named early involvement as a key element of both effective community engagement and the SLO, whereas the interview participants only explicitly indicated it as a key factor of community engagement and implied it to be a key element of the SLO.
A Socio-technical Approach for Transient SME Alliances
NASA Astrophysics Data System (ADS)
Rezgui, Yacine
The paper discusses technical requirements to promote the adoption of alliance modes of operation by SMEs in the construction sector. These requirements have provided a basis for specifying a set of functionality to support the collaboration and cooperation needs of SMEs. While service-oriented architectures and semantic web services provide the middleware technology to implement the identified functionality, a number of key technical limitations have been identified, including lack of support for the dynamic and non-functional characteristics of SME alliances' distributed business processes, lack of execution monitoring functionality to manage running business processes, and lack of support for the semantic reasoning needed to enable SME business process service composition. The paper examines these issues and provides key directions for supporting SME alliances effectively.
Pascual, Carlos; Luján, Marcos; Mora, José Ramón; Chiva, Vicente; Gamarra, Manuela
2015-01-01
The implementation of total quality management models in clinical departments is best suited to the ISO 9004:2009 model. An essential part of implementing these models is the establishment of processes and their stabilization. There are four types of processes: key, management, support, and operative (clinical). Management processes have four parts: a process stabilization form, a process procedures form, a medical activities cost estimation form, and a process flow chart. In this paper we detail the creation of an essential process in a surgical department: the management of the surgical waiting list.
NASA Astrophysics Data System (ADS)
Wang, Jun; Li, Shi-Yu; Jiang, Feng; Wu, Ke; Liu, Guang-Li; Lu, Hui; Chen, Guang-Hao
2015-09-01
The oxic-settling-anaerobic (OSA) process is known as a cost-effective way to reduce excess sludge production through a simple upgrade of the conventional activated sludge (CAS) process. A low oxidation-reduction potential (ORP) level is the key factor driving sludge decay and lysis in the sludge holding tank of the OSA process. However, ORP control by nitrogen purging or chemical dosing in the OSA process induces extra expense and complicates operation. Hence, in this study, a sludge holding tank using gravity thickening was applied to the OSA process to reduce excess sludge production without any ORP control. Results showed that the modified OSA process not only reduced excess sludge production effectively but also improved sludge settleability without affecting treatment capacity. The reduction of excess sludge production in the modified OSA process resulted from interactions among many factors. The key element of the process was the gravity thickening sludge holding tank.
Wavefront attributes in anisotropic media
NASA Astrophysics Data System (ADS)
Vanelle, C.; Abakumov, I.; Gajewski, D.
2018-07-01
Surface-measured wavefront attributes are the key ingredient to multiparameter methods, which are nowadays standard tools in seismic data processing. However, most operators are restricted to application to isotropic media. Whereas application of an isotropic operator will still lead to satisfactory stack results, further processing steps that interpret isotropic stacking parameters in terms of wavefront attributes will lead to erroneous results if anisotropy is present but not accounted for. In this paper, we derive relationships between the stacking parameters and anisotropic wavefront attributes that allow us to apply the common reflection surface type operator to 3-D media with arbitrary anisotropy for the zero-offset and finite-offset configurations including converted waves. The operator itself is expressed in terms of wavefront attributes that are measured in the acquisition surface, that is, no model assumptions are made. Numerical results confirm that the accuracy of the new anisotropic operator is of the same magnitude as that of its isotropic counterpart.
Matching relations for optimal entanglement concentration and purification
Kong, Fan-Zhen; Xia, Hui-Zhi; Yang, Ming; Yang, Qing; Cao, Zhuo-Liang
2016-01-01
The bilateral controlled NOT (CNOT) operation plays a key role in the standard entanglement purification process, but the CNOT operation may not be the optimal joint operation in the sense that the output entanglement is maximized. In this paper, the CNOT operations in both the Schmidt-projection based entanglement concentration and the entanglement purification schemes are replaced with a general joint unitary operation, and the optimal matching relations between the entangling power of the joint unitary operation and the non-maximally entangled channel are found for optimizing the entanglement increment or the output entanglement. The result is somewhat counter-intuitive for entanglement concentration. The output entanglement is maximized when the entangling power of the joint unitary operation and the quantum channel satisfy a certain relation. There exists a variety of joint operations with non-maximal entangling power that can induce a maximal output entanglement, which will greatly broaden the set of potential joint operations in entanglement concentration. In addition, the entanglement increment in the purification process is maximized only by joint unitary operations (including CNOT) with maximal entangling power. PMID:27189800
Defining competency-based evaluation objectives in family medicine
Lawrence, Kathrine; Allen, Tim; Brailovsky, Carlos; Crichton, Tom; Bethune, Cheri; Donoff, Michel; Laughlin, Tom; Wetmore, Stephen; Carpentier, Marie-Pierre; Visser, Shaun
2011-01-01
Abstract Objective To develop key features for priority topics previously identified by the College of Family Physicians of Canada that, together with skill dimensions and phases of the clinical encounter, broadly describe competence in family medicine. Design Modified nominal group methodology, which was used to develop key features for each priority topic through an iterative process. Setting The College of Family Physicians of Canada. Participants An expert group of 7 family physicians and 1 educational consultant, all of whom had experience in assessing competence in family medicine. Group members represented the Canadian family medicine context with respect to region, sex, language, community type, and experience. Methods The group used a modified Delphi process to derive a detailed operational definition of competence, using multiple iterations until consensus was achieved for the items under discussion. The group met 3 to 4 times a year from 2000 to 2007. Main findings The group analyzed 99 topics and generated 773 key features. There were 2 to 20 (average 7.8) key features per topic; 63% of the key features focused on the diagnostic phase of the clinical encounter. Conclusion This project expands previous descriptions of the process of generating key features for assessment, and removes this process from the context of written examinations. A key-features analysis of topics focuses on higher-order cognitive processes of clinical competence. The project did not define all the skill dimensions of competence to the same degree, but it clearly identified those requiring further definition. This work generates part of a discipline-specific, competency-based definition of family medicine for assessment purposes. It limits the domain for assessment purposes, which is an advantage for the teaching and assessment of learners. A validation study on the content of this work would ensure that it truly reflects competence in family medicine. PMID:21998245
A New Color Image Encryption Scheme Using CML and a Fractional-Order Chaotic System
Wu, Xiangjun; Li, Yang; Kurths, Jürgen
2015-01-01
The chaos-based image cryptosystems have been widely investigated in recent years to provide real-time encryption and transmission. In this paper, a novel color image encryption algorithm by using coupled-map lattices (CML) and a fractional-order chaotic system is proposed to enhance the security and robustness of the encryption algorithms with a permutation-diffusion structure. To make the encryption procedure more confusing and complex, an image division-shuffling process is put forward, where the plain-image is first divided into four sub-images, and then the position of the pixels in the whole image is shuffled. In order to generate initial conditions and parameters of two chaotic systems, a 280-bit long external secret key is employed. The key space analysis, various statistical analysis, information entropy analysis, differential analysis and key sensitivity analysis are introduced to test the security of the new image encryption algorithm. The cryptosystem speed is analyzed and tested as well. Experimental results confirm that, in comparison to other image encryption schemes, the new algorithm has higher security and is fast for practical image encryption. Moreover, an extensive tolerance analysis of some common image processing operations such as noise adding, cropping, JPEG compression, rotation, brightening and darkening, has been performed on the proposed image encryption technique. Corresponding results reveal that the proposed image encryption method has good robustness against some image processing operations and geometric attacks. PMID:25826602
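The permutation-diffusion structure described above can be illustrated with a toy cipher. This sketch uses a single logistic map as a stand-in for the paper's CML and fractional-order systems; the function names, default parameters, and flat 1-D pixel list are our simplifications, and the scheme is far too weak for real use:

```python
def logistic_stream(x0, r, n):
    """Generate n values of the logistic map x -> r*x*(1-x)."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(x)
    return xs

def encrypt(pixels, x0=0.3141, r=3.99):
    n = len(pixels)
    # Permutation stage: shuffle pixel positions by ranking a chaotic sequence
    seq = logistic_stream(x0, r, n)
    perm = sorted(range(n), key=lambda i: seq[i])
    shuffled = [pixels[perm[i]] for i in range(n)]
    # Diffusion stage: XOR each byte with a keystream byte and the previous cipher byte
    keystream = [int(v * 256) % 256 for v in logistic_stream(x0 / 2, r, n)]
    cipher, prev = [], 0
    for p, k in zip(shuffled, keystream):
        c = p ^ k ^ prev
        cipher.append(c)
        prev = c
    return cipher, perm

def decrypt(cipher, perm, x0=0.3141, r=3.99):
    n = len(cipher)
    keystream = [int(v * 256) % 256 for v in logistic_stream(x0 / 2, r, n)]
    shuffled, prev = [], 0
    for c, k in zip(cipher, keystream):
        shuffled.append(c ^ k ^ prev)  # undo diffusion
        prev = c
    pixels = [0] * n
    for i in range(n):                 # undo permutation
        pixels[perm[i]] = shuffled[i]
    return pixels
```

The chained XOR in the diffusion stage is what makes the cipher sensitive to single-pixel changes in the plain image, the property the differential analysis in the paper tests.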
High Capacity Single Table Performance Design Using Partitioning in Oracle or PostgreSQL
2012-03-01
Indicators (KPIs)...Conclusion...List of Symbols, Abbreviations, and Acronyms...Figure 7. Time to seek and return one record. 4. Additional Key Performance Indicators (KPIs). In addition to pure response time, there are other...key performance indicators...ASM Automatic Storage Management...CPU central processing unit...I/O input/output...KPIs key performance indicators...OS operating system
Forscher, Emily C.; Zheng, Yan; Ke, Zijun; Folstein, Jonathan; Li, Wen
2016-01-01
Emotion perception is known to involve multiple operations and waves of analysis, but the specific nature of these processes remains poorly understood. Combining psychophysical testing and neurometric analysis of event-related potentials (ERPs) in a fear detection task with parametrically varied fear intensities (N=45), we sought to elucidate key processes in fear perception. Building on psychophysics marking fear perception thresholds, our neurometric model fitting identified several putative operations and stages. Four key processes arose in sequence following face presentation: fear-neutral categorization (P1 at 100 ms), fear detection (P300 at 320 ms), valuation (early subcomponent of the late positive potential/LPP at 400-500 ms), and conscious awareness (late subcomponent of the LPP at 500-600 ms). Furthermore, within-subject brain-behavior association suggests that initial emotion categorization was mandatory and detached from behavior, whereas valuation and conscious awareness directly impacted behavioral outcome (explaining 17% and 31% of the total variance, respectively). The current study thus reveals the chronometry of fear perception, ascribing psychological meaning to distinct underlying processes. The combination of early categorization and late valuation of fear reconciles conflicting (categorical versus dimensional) emotion accounts, lending support to a hybrid model. Importantly, future research could specifically interrogate these psychological processes in various behaviors and psychopathologies (e.g., anxiety and depression). PMID:27546075
High density circuit technology, part 3
NASA Technical Reports Server (NTRS)
Wade, T. E.
1982-01-01
Dry processing - both etching and deposition - and present/future trends in semiconductor technology are discussed. In addition to a description of the basic apparatus, terminology, advantages, glow discharge phenomena, gas-surface chemistries, and key operational parameters for both dry etching and plasma deposition processes, a comprehensive survey of dry processing equipment (via vendor listing) is also included. The following topics are also discussed: fine-line photolithography, low-temperature processing, packaging for dense VLSI die, the role of integrated optics, and VLSI and technology innovations.
Industry Training: Causes and Consequences.
ERIC Educational Resources Information Center
Smith, Andrew; Freeland, Brett
Research on Australian organizations in five industry sectors--building and construction, food processing, electronics manufacturing, retailing, and finance and banking--has identified these three key drivers of enterprise training: workplace change, quality assurance, and new technology. Operation of the training drivers is moderated by a range…
Preliminary techno-economic analysis of these processes will be undertaken, utilizing the literature and including key supporting data and proof-of-principle experiments. The emphasis on low-cost bioreactors and operation greatly enhances the economic feasibility and practica...
Partitioned key-value store with atomic memory operations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bent, John M.; Faibish, Sorin; Grider, Gary
A partitioned key-value store is provided that supports atomic memory operations. A server performs a memory operation in a partitioned key-value store by receiving a request from an application for at least one atomic memory operation, the atomic memory operation comprising a memory address identifier; and, in response to the atomic memory operation, performing one or more of (i) reading a client-side memory location identified by the memory address identifier and storing one or more key-value pairs from the client-side memory location in a local key-value store of the server; and (ii) obtaining one or more key-value pairs from the local key-value store of the server and writing the obtained one or more key-value pairs into the client-side memory location identified by the memory address identifier. The server can perform functions obtained from a client-side memory location and return a result to the client using one or more of the atomic memory operations.
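A toy single-process sketch of the partitioning and atomic read-modify-write ideas; the class and method names are our own, and a dictionary-per-partition model only gestures at the patented design (client-side memory, server-side function shipping):

```python
import threading

class PartitionedKVStore:
    """Toy partitioned key-value store with per-partition atomic updates."""

    def __init__(self, num_partitions=4):
        self.parts = [dict() for _ in range(num_partitions)]
        self.locks = [threading.Lock() for _ in range(num_partitions)]

    def _index(self, key):
        # Route each key to one partition (one "server" in the real design)
        return hash(key) % len(self.parts)

    def put(self, key, value):
        i = self._index(key)
        with self.locks[i]:
            self.parts[i][key] = value

    def get(self, key, default=None):
        i = self._index(key)
        with self.locks[i]:
            return self.parts[i].get(key, default)

    def atomic_update(self, key, fn, default=0):
        """Apply fn to the current value under the partition lock and
        store the result, emulating a server-executed atomic operation."""
        i = self._index(key)
        with self.locks[i]:
            new = fn(self.parts[i].get(key, default))
            self.parts[i][key] = new
            return new
```

Holding only the affected partition's lock lets unrelated keys proceed concurrently, which is the main scalability benefit of partitioning.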
Room temperature continuous wave operation of quantum cascade laser at λ ~ 9.4 μm
NASA Astrophysics Data System (ADS)
Hou, Chuncai; Zhao, Yue; Zhang, Jinchuan; Zhai, Shenqiang; Zhuo, Ning; Liu, Junqi; Wang, Lijun; Liu, Shuman; Liu, Fengqi; Wang, Zhanguo
2018-03-01
Continuous wave (CW) operation of long wave infrared (LWIR) quantum cascade lasers (QCLs) is achieved up to a temperature of 303 K. For room temperature CW operation, the wafer with 35 stages was processed into buried heterostructure lasers. For a 2-mm-long and 10-μm-wide laser with high-reflectivity (HR) coating on the rear facet, CW output power of 45 mW at 283 K and 9 mW at 303 K is obtained. The lasing wavelength is around 9.4 μm, located in the LWIR spectral range. Project supported by the National Key Research And Development Program (No. 2016YFB0402303), the National Natural Science Foundation of China (Nos. 61435014, 61627822, 61574136, 61774146, 61674144, 61404131), the Key Projects of Chinese Academy of Sciences (Nos. ZDRW-XH-2016-4, QYZDJ-SSW-JSC027), and the Beijing Natural Science Foundation (No. 4162060, 4172060).
NASA Astrophysics Data System (ADS)
Miharja, M.; Priadi, Y. N.
2018-05-01
Promoting better public transport is a key strategy for coping with urban transport problems, which are mostly caused by heavy private vehicle usage. Better public transport service quality focuses not only on a single public transport mode but also on service integration between modes. Fragmented inter-modal public transport service leads to longer trip chains and average travel times, causing public transport to fail to compete with private vehicles. This paper examines the optimization of operation system integration between Trans Jakarta Bus, the main public transport mode, and Kopaja Bus, the feeder public transport service in Jakarta. Using a scoring-interview method combined with standard parameters for operation system integration, this paper identifies the key factors that determine the success of integrating the two public transport operation systems. The study found that some key integration parameters, such as the cancellation of the “system setoran”, passengers getting on and off at official stop points, and systematic payment, positively contribute to better service integration. However, some parameters, such as the fine system, time and changing point reliability, and information system reliability, are among those which need improvement. These findings are very useful for the authority in setting the right strategy to improve operation system integration between the Trans Jakarta and Kopaja Bus services.
3D-additive manufactured optical mount
NASA Astrophysics Data System (ADS)
Mammini, Paul V.; Ciscel, David; Wooten, John
2015-09-01
The Area Defense Anti-Munitions (ADAM) is a low cost and effective high power laser weapon system. It's designed to address and negate important threats such as short-range rockets, UAVs, and small boats. Many critical optical components operate in the system. The optics and mounts must accommodate thermal and mechanical stresses, plus maintain an exceptional wave front during operation. Lockheed Martin Space Systems Company (LMSSC) developed, designed, and currently operates ADAM. This paper covers the design and development of a key monolithic, flexured, titanium mirror mount that was manufactured by CalRAM using additive processes.
Strategic planning for hotel operations: The Ritz-Carlton Hotel Company (Part I).
Shriver, S J
1993-01-01
The Ritz-Carlton Hotel Company won the Malcolm Baldrige National Quality Award in 1992. One key to its success is its strategic planning process. This two-part article reviews the Ritz-Carlton's approach to strategic planning. In particular, it describes (1) the role of senior leadership in the planning process and (2) the specific activities that are associated with plan development and implementation.
The Landsat Data Continuity Mission Operational Land Imager (OLI) Radiometric Calibration
NASA Technical Reports Server (NTRS)
Markham, Brian L.; Dabney, Philip W.; Murphy-Morris, Jeanine E.; Knight, Edward J.; Kvaran, Geir; Barsi, Julia A.
2010-01-01
The Operational Land Imager (OLI) on the Landsat Data Continuity Mission (LDCM) has a comprehensive radiometric characterization and calibration program beginning with the instrument design and extending through integration and test, on-orbit operations, and science data processing. Key instrument design features for radiometric calibration include dual solar diffusers and multi-lamped on-board calibrators. The radiometric calibration transfer procedure from NIST standards has multiple checks on the radiometric scale throughout the process and uses a heliostat as part of the transfer to orbit of the radiometric calibration. On-orbit lunar imaging will be used to track the instrument's stability, and side-slither maneuvers will be used in addition to the solar diffuser to flat-field across the thousands of detectors per band. A Calibration Validation Team is continuously involved in the process from design to operations. This team uses an Image Assessment System (IAS), part of the ground system, to characterize and calibrate the on-orbit data.
Performance analysis of Supply Chain Management with Supply Chain Operation reference model
NASA Astrophysics Data System (ADS)
Hasibuan, Abdurrozzaq; Arfah, Mahrani; Parinduri, Luthfi; Hernawati, Tri; Suliawati; Harahap, Bonar; Rahmah Sibuea, Siti; Krianto Sulaiman, Oris; purwadi, Adi
2018-04-01
This research was conducted at PT. Shamrock Manufacturing Corpora; the company is required to think creatively and implement a competitive strategy by producing goods and services that are higher quality and cheaper. It is therefore necessary to measure Supply Chain Management performance in order to improve competitiveness, and the company is required to optimize its production output to meet the export quality standard. This research begins with the creation of initial dimensions based on the Supply Chain Management processes, i.e., Plan, Source, Make, Delivery, and Return, with a hierarchy based on the Supply Chain Operations Reference model: Reliability, Responsiveness, Agility, Cost, and Asset. Key Performance Indicator identification becomes the benchmark in performance measurement, whereas Snorm De Boer normalization serves to equalize Key Performance Indicator values. An Analytical Hierarchy Process is used to assist in determining priority criteria. Measurement of Supply Chain Management performance at PT. Shamrock Manufacturing Corpora shows that Responsiveness (0.649) has a higher weight (priority) than the other alternatives. The result of the performance analysis using the Supply Chain Operations Reference model at PT. Shamrock Manufacturing Corpora looks good, because a score between 50 and 100 is classified as good.
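The Snorm De Boer normalization mentioned above maps each KPI onto a common 0-100 scale so that heterogeneous indicators can be aggregated. A minimal sketch; the larger-is-better versus smaller-is-better split follows the usual formulation, not details given in the abstract:

```python
def snorm(score, smin, smax, larger_is_better=True):
    """Snorm De Boer normalization onto a 0-100 scale.

    smin/smax are the worst and best achievable (or observed) KPI values.
    """
    if larger_is_better:
        return (score - smin) / (smax - smin) * 100
    return (smax - score) / (smax - smin) * 100
```

For example, an on-time-delivery KPI of 75% against a 50-100% range normalizes to 50, while a lead time of 5 days against a 0-10 day range (smaller is better) also normalizes to 50; both can then be weighted and summed into the overall score.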
Fault-tolerant composite Householder reflection
NASA Astrophysics Data System (ADS)
Torosov, Boyan T.; Kyoseva, Elica; Vitanov, Nikolay V.
2015-07-01
We propose a fault-tolerant implementation of the quantum Householder reflection, which is a key operation in various quantum algorithms, quantum-state engineering, generation of arbitrary unitaries, and entanglement characterization. We construct this operation using the modular approach of composite pulses and a relation between the Householder reflection and the quantum phase gate. The proposed implementation is highly insensitive to variations in the experimental parameters, which makes it suitable for high-fidelity quantum information processing.
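The generalized quantum Householder reflection has the closed form M(v; phi) = I + (e^{i*phi} - 1)|v><v|, which reduces to the standard reflection I - 2|v><v| at phi = pi. A small numerical sketch of the ideal operation (the composite-pulse construction itself is not shown, and the function name is our own):

```python
import numpy as np

def householder(v, phi=np.pi):
    """Generalized Householder reflection M = I + (e^{i*phi} - 1)|v><v|."""
    v = np.asarray(v, dtype=complex)
    v = v / np.linalg.norm(v)          # |v> must be normalized
    return np.eye(len(v)) + (np.exp(1j * phi) - 1.0) * np.outer(v, v.conj())
```

Since |v><v| is a projector, M M† = I for any phi, so the operator is unitary; at phi = pi it flips the sign of the |v> component and leaves the orthogonal complement untouched.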
Proceedings of the 4th Conference on Aerospace Materials, Processes, and Environmental Technology
NASA Technical Reports Server (NTRS)
Griffin, D. E. (Editor); Stanley, D. C. (Editor)
2001-01-01
The next millennium challenges us to produce innovative materials, processes, manufacturing, and environmental technologies that meet low-cost aerospace transportation needs while maintaining US leadership. The pursuit of advanced aerospace materials, manufacturing processes, and environmental technologies supports the development of safer, operational, next-generation, reusable, and expendable aeronautical and space vehicle systems. The Aerospace Materials, Processes, and Environmental Technology Conference (AMPET) provided a forum for manufacturing, environmental, materials, and processes engineers, scientists, and managers to describe, review, and critically assess advances in these key technology areas.
Configuration Management, Capacity Planning Decision Support, Modeling and Simulation
1988-12-01
flow includes both top-down and bottom-up requirements. The flow also includes hardware, software, and transfer acquisition, installation, operation, management, and upgrade as required. Satisfaction of a user's needs and requirements is a difficult and detailed process. The key assumptions at this
76 FR 11433 - Federal Transition To Secure Hash Algorithm (SHA)-256
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-02
... generating digital signatures. Current information systems, Web servers, applications and workstation operating systems were designed to process, and use SHA-1 generated signatures. National Institute of... cryptographic keys, and more robust algorithms by December 2013. Government systems may begin to encounter...
NASA Astrophysics Data System (ADS)
Li, Xiaoying; Zhu, Qinghua
2017-01-01
The question of how to evaluate a company's green practice has recently become a key strategic consideration in food service supply chain management. This paper proposes a novel hybrid model that combines the fuzzy Decision Making Trial and Evaluation Laboratory (DEMATEL) and Analytic Network Process (ANP) methods. The model develops green restaurant criteria and demonstrates the complicated relations among the various criteria, helping food service operations better analyze the real-world situation and determine the weight of each criterion. The analysis of the evaluation of green practices will help food service operations be clear about the key measures of green practice needed to improve supply chain management.
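The DEMATEL core step can be sketched as follows. This shows the crisp (non-fuzzy) total-relation matrix T = N(I - N)^{-1}, omitting the fuzzification and the ANP weighting the paper adds; the sample matrix is hypothetical:

```python
import numpy as np

def dematel_total_relation(D):
    """DEMATEL: normalize the direct-relation matrix D by its largest
    row sum, then compute the total-relation matrix T = N (I - N)^{-1},
    i.e. the sum of direct and all indirect influences N + N^2 + ..."""
    D = np.asarray(D, dtype=float)
    N = D / D.sum(axis=1).max()
    return N @ np.linalg.inv(np.eye(len(D)) - N)

def prominence_and_relation(T):
    """r + c ranks a criterion's overall importance; r - c tells whether
    it is a net cause (positive) or net effect (negative)."""
    r, c = T.sum(axis=1), T.sum(axis=0)
    return r + c, r - c
```

Criteria with high prominence and positive relation values are the "key measures" a manager would target first.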
Practical Use of Operation Data in the Process Industry
NASA Astrophysics Data System (ADS)
Kano, Manabu
This paper aims to reveal real problems in the process industry and introduce recent developments to solve such problems from the viewpoint of effective use of operation data. Two topics are discussed: virtual sensors and process control. First, in order to clarify the present state and problems, part of our recent questionnaire survey of process control is quoted. It is emphasized that maintenance is a key issue not only for soft-sensors but also for controllers. Then, new techniques are explained. The first is correlation-based just-in-time modeling (CoJIT), which can realize higher prediction performance than conventional methods and simplify model maintenance. The second is extended fictitious reference iterative tuning (E-FRIT), which can realize data-driven PID control parameter tuning without process modeling. The great usefulness of these techniques is demonstrated through their industrial applications.
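Just-in-time modeling builds a soft-sensor model on demand from historical samples similar to the query. The sketch below selects neighbors by Euclidean distance and fits a local linear model, whereas CoJIT selects samples by correlation, so treat this as a generic JIT illustration under our own names and data, not the paper's method:

```python
import numpy as np

def jit_predict(X, y, x_query, k=6):
    """Just-in-time (lazy) prediction: pick the k historical samples
    nearest to x_query, fit a local linear model, and predict once."""
    d = np.linalg.norm(X - x_query, axis=1)
    idx = np.argsort(d)[:k]
    A = np.c_[X[idx], np.ones(k)]              # local design matrix with intercept
    coef, *_ = np.linalg.lstsq(A, y[idx], rcond=None)
    return float(np.r_[x_query, 1.0] @ coef)
```

Because the model is rebuilt at every query from the most relevant data, there is no global model to retrain when the process drifts, which is the maintenance advantage the survey highlights.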
Xu, Zhihao; Li, Jason; Zhou, Joe X
2012-01-01
Aggregate removal is one of the most important aspects in monoclonal antibody (mAb) purification. Cation-exchange chromatography (CEX), a widely used polishing step in mAb purification, is able to clear both process-related impurities and product-related impurities. In this study, with the implementation of quality by design (QbD), a process development approach for robust removal of aggregates using CEX is described. First, resin screening studies were performed and a suitable CEX resin was chosen because of its relatively better selectivity and higher dynamic binding capacity. Second, a pH-conductivity hybrid gradient elution method for the CEX was established, and the risk assessment for the process was carried out. Third, a process characterization study was used to evaluate the impact of the potentially important process parameters on the process performance with respect to aggregate removal. Accordingly, a process design space was established. Aggregate level in load is the critical parameter. Its operating range is set at 0-3% and the acceptable range is set at 0-5%. Equilibration buffer is the key parameter. Its operating range is set at 40 ± 5 mM acetate, pH 5.0 ± 0.1, and acceptable range is set at 40 ± 10 mM acetate, pH 5.0 ± 0.2. Elution buffer, load mass, and gradient elution volume are non-key parameters; their operating ranges and acceptable ranges are equally set at 250 ± 10 mM acetate, pH 6.0 ± 0.2, 45 ± 10 g/L resin, and 10 ± 20% CV respectively. Finally, the process was scaled up 80 times and the impurities removal profiles were revealed. Three scaled-up runs showed that the size-exclusion chromatography (SEC) purity of the CEX pool was 99.8% or above and the step yield was above 92%, thereby proving that the process is both consistent and robust.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ladd-Lively, Jennifer L
2014-01-01
The objective of this work was to determine the feasibility of using on-line multivariate statistical process control (MSPC) for safeguards applications in natural uranium conversion plants. Multivariate statistical process control is commonly used throughout industry for the detection of faults. For safeguards applications in uranium conversion plants, faults could include the diversion of intermediate products such as uranium dioxide, uranium tetrafluoride, and uranium hexafluoride. This study was limited to a natural uranium conversion plant (NUCP) processing 100 metric tons of uranium (MTU) per year and using the wet solvent extraction method for the purification of uranium ore concentrate. A key component in the multivariate statistical methodology is the Principal Component Analysis (PCA) approach for the analysis of data, development of the base case model, and evaluation of future operations. The PCA approach was implemented through singular value decomposition of the data matrix, where the data matrix represents normal operation of the plant. Component mole balances were used to model each of the process units in the NUCP; however, the approach could be applied to any data set. The monitoring framework developed in this research could be used to determine whether or not a diversion of material has occurred at an NUCP as part of an International Atomic Energy Agency (IAEA) safeguards system. The approach can be used to identify the key monitoring locations, as well as locations where monitoring is unimportant, and detection limits at the key monitoring locations can also be established. Several fault scenarios were developed to test the monitoring framework after the base case (normal operating conditions) PCA model was established. In all of the scenarios, the monitoring framework was able to detect the fault. Overall, this study was successful at meeting the stated objective.
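The PCA-via-SVD monitoring idea can be sketched as follows. This is a generic textbook MSPC detector on synthetic data, with an assumed component count and crude comparisons, not the actual NUCP safeguards model:

```python
import numpy as np

def fit_pca_monitor(X_normal, n_comp=2):
    """Build a PCA monitoring model from normal-operation data via SVD
    and return a function computing Hotelling T^2 and SPE for new samples."""
    mu, sd = X_normal.mean(0), X_normal.std(0)
    Z = (X_normal - mu) / sd
    _, s, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_comp].T                          # retained loadings
    lam = (s[:n_comp] ** 2) / (len(Z) - 1)     # retained component variances

    def stats(x):
        z = (x - mu) / sd
        t = z @ P                              # scores in the PCA subspace
        T2 = float(np.sum(t ** 2 / lam))       # Hotelling T^2
        SPE = float(np.sum((z - t @ P.T) ** 2))  # squared prediction error
        return T2, SPE

    return stats

rng = np.random.default_rng(1)
normal = rng.normal(size=(500, 4))             # stand-in for mole-balance data
stats = fit_pca_monitor(normal)
T2_ok, SPE_ok = stats(rng.normal(size=4))
T2_f, SPE_f = stats(np.array([8.0, -8.0, 8.0, -8.0]))  # gross diversion-like shift
print(T2_f > T2_ok or SPE_f > SPE_ok)  # → True
```

T² flags deviations within the retained subspace; SPE flags deviations the model cannot explain. In practice both are compared against statistically derived control limits rather than a single healthy sample.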
Using pattern enumeration to accelerate process development and ramp yield
NASA Astrophysics Data System (ADS)
Zhuang, Linda; Pang, Jenny; Xu, Jessy; Tsai, Mengfeng; Wang, Amy; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh; Ding, Hua
2016-03-01
During a new technology node's process setup phase, foundries do not initially have enough product chip designs to conduct exhaustive process development. Different operational teams use manually designed, simple test keys to set up their process flows and recipes. When the very first version of the design rule manual (DRM) is ready, foundries enter the process development phase, where new experimental design data is manually created based on these design rules. However, these IP/test keys contain very uniform or simple design structures. Such designs normally do not contain critical design structures or process-unfriendly design patterns that pass design rule checks but are found to be less manufacturable. A method is therefore desired to generate, at the development stage, exhaustive test patterns allowed by the design rules in order to verify the gap between design rules and process. This paper presents a novel method for generating test key patterns which contain known problematic patterns as well as any constructs which designers could possibly draw based on current design rules. The enumerated test key patterns will contain the most critical design structures allowed by any particular design rule. A layout profiling method is used to perform design chip analysis in order to find potential weak points on new incoming products so the fab can take preemptive action to avoid yield loss. This is achieved by comparing different products and leveraging the knowledge learned from previously manufactured chips to find possible yield detractors.
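Rule-driven pattern enumeration can be illustrated with a one-dimensional toy: enumerate every binary track pattern whose runs satisfy a minimum width/space rule. Real DRM-driven enumeration operates on 2D layout with many interacting rules, so this sketch only conveys the combinatorial idea:

```python
from itertools import product

def legal_patterns(n, min_run=2):
    """Enumerate binary 1D patterns of length n whose runs of 0s and 1s
    are all at least min_run long - a toy stand-in for minimum
    width/space design rules."""
    out = []
    for bits in product([0, 1], repeat=n):
        runs, cur = [], 1
        for a, b in zip(bits, bits[1:]):
            if a == b:
                cur += 1
            else:
                runs.append(cur)
                cur = 1
        runs.append(cur)
        if all(r >= min_run for r in runs):
            out.append(bits)
    return out

# All length-6 patterns compliant with a min width/space of 2 tracks
print(len(legal_patterns(6)))  # → 10
```

Even this toy shows why exhaustive enumeration beats hand-drawn test keys: the legal set includes boundary cases (e.g. back-to-back minimum runs) that a manually drawn structure would likely omit.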
Yuen, Yeo Tze; Sharratt, Paul N; Jie, Bu
2016-11-01
Numerous carbon dioxide mineralization (CM) processes have been proposed to overcome the slow rate of natural weathering of silicate minerals. Ten of these proposals are mentioned in this article. The proposals are described in terms of the four major areas of CM process design: pre-treatment, purification, carbonation, and reagent recycling operations. Any known specifics based on probable or representative operating and reaction conditions are listed, and a basic analysis of the strengths and shortcomings associated with the individual process designs is given in this article. The processes typically employ physical or chemical pseudo-catalytic methods to enhance the rate of carbon dioxide mineralization; however, each method has its own associated advantages and problems. To examine the feasibility of a CM process, three key aspects should be included in the evaluation criteria: energy use, operational considerations, and product value and economics. Recommendations regarding the optimal level of emphasis and implementation of measures to control these aspects are given; these will depend very much on the desired process objectives. Ultimately, a mix-and-match approach to process design might be required to produce viable and economic CM process proposals.
NASA Astrophysics Data System (ADS)
Ruiz-Cárcel, C.; Jaramillo, V. H.; Mba, D.; Ottewill, J. R.; Cao, Y.
2016-01-01
The detection and diagnosis of faults in industrial processes is a very active field of research due to the reduction in maintenance costs achieved by the implementation of process monitoring algorithms such as Principal Component Analysis, Partial Least Squares or more recently Canonical Variate Analysis (CVA). Typically the condition of rotating machinery is monitored separately using vibration analysis or other specific techniques. Conventional vibration-based condition monitoring techniques are based on the tracking of key features observed in the measured signal. Typically steady-state loading conditions are required to ensure consistency between measurements. In this paper, a technique based on merging process and vibration data is proposed with the objective of improving the detection of mechanical faults in industrial systems working under variable operating conditions. The capabilities of CVA for detection and diagnosis of faults were tested using experimental data acquired from a compressor test rig where different process faults were introduced. Results suggest that the combination of process and vibration data can effectively improve the detectability of mechanical faults in systems working under variable operating conditions.
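A minimal sketch of CVA-style monitoring on merged data follows, using synthetic process and vibration channels driven by a shared source. It is deliberately simplified relative to the paper (fixed lag order, a T² index on the retained canonical states only, and no formal control limits):

```python
import numpy as np

def inv_sqrt(S, eps=1e-8):
    """Inverse matrix square root via eigendecomposition (eigenvalue floor eps)."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(1.0 / np.sqrt(np.clip(w, eps, None))) @ V.T

def fit_cva_monitor(Y, lags=2, n_state=2):
    """Train a minimal CVA-style monitor on (time, channels) data with
    process and vibration features stacked column-wise."""
    mu, sd = Y.mean(0), Y.std(0)
    Z = (Y - mu) / sd
    T = len(Z)
    idx = np.arange(lags, T - lags + 1)
    P = np.hstack([Z[idx - 1 - i] for i in range(lags)])  # past vectors
    F = np.hstack([Z[idx + i] for i in range(lags)])      # future vectors
    n = len(idx)
    Spp, Sff, Spf = P.T @ P / n, F.T @ F / n, P.T @ F / n
    # Canonical directions from the whitened past-future cross-covariance
    U, _, _ = np.linalg.svd(inv_sqrt(Spp) @ Spf @ inv_sqrt(Sff))
    J = U[:, :n_state].T @ inv_sqrt(Spp)  # maps a past vector to canonical states

    def t2_scores(Ynew):
        Zn = (Ynew - mu) / sd
        i2 = np.arange(lags, len(Zn) + 1)
        Pn = np.hstack([Zn[i2 - 1 - i] for i in range(lags)])
        states = Pn @ J.T
        return (states ** 2).sum(axis=1)  # Hotelling-style T^2 per time step

    return t2_scores

# Toy rig: two "process" and two "vibration" channels sharing a common source
rng = np.random.default_rng(2)
base = rng.normal(size=(1000, 1))
Y = np.hstack([base + 0.1 * rng.normal(size=(1000, 2)),
               0.5 * base + 0.1 * rng.normal(size=(1000, 2))])
monitor = fit_cva_monitor(Y)
healthy = monitor(Y[:200])
faulty = monitor(Y[:200] * 3.0)  # amplified oscillations, e.g. a degrading bearing
print(faulty.mean() > healthy.mean())  # → True
```

Because the states summarize the joint dynamics of all channels, the same index responds to mechanical signatures in the vibration channels and to process deviations, without requiring steady-state loading.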
SPIDER: A Framework for Understanding Driver Distraction.
Strayer, David L; Fisher, Donald L
2016-02-01
The objective was to identify key cognitive processes that are impaired when drivers divert attention from driving. Driver distraction is increasingly recognized as a significant source of injuries and fatalities on the roadway. A "SPIDER" model is developed that identifies key cognitive processes that are impaired when drivers divert attention from driving. SPIDER is an acronym standing for scanning, predicting, identifying, decision making, and executing a response. When drivers engage in secondary activities unrelated to the task of driving, SPIDER-related processes are impaired, situation awareness is degraded, and the ability to safely operate a motor vehicle may be compromised. The pattern of interference helps to illuminate the sources of driver distraction and may help guide the integration of new technology into the automobile. © 2015, Human Factors and Ergonomics Society.
Investigation Of In-Line Monitoring Options At H Canyon/HB Line For Plutonium Oxide Production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sexton, L.
2015-10-14
H Canyon and HB Line have a production goal of 1 MT per year of plutonium oxide feedstock for the MOX facility by FY17 (AFS-2 mission). In order to meet this goal, steps will need to be taken to improve processing efficiency. One concept for achieving this goal is to implement in-line process monitoring at key measurement points within the facilities. In-line monitoring during operations has the potential to increase throughput and efficiency while reducing costs associated with laboratory sample analysis. In the work reported here, we mapped the plutonium oxide process, identified key measurement points, investigated alternate technologies that could be used for in-line analysis, and initiated a throughput benefit analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gillin, Charmel; Lipscomb, Brian
This project aimed at supporting one key component of a major multi-step undertaking on the part of the CSKT: the acquisition of the Kerr Hydroelectric project and its subsequent operation as a wholesale power generation facility. This project provided support to kick-start the development of the organizational structure to acquire and operate the facility by acquiring critical expertise necessary for the acquisition, funding in part two key personnel for the first two years of the four-year organizational development process. These individuals provided the Tribes with expert knowledge in the highly specialized areas of resource balancing, power marketing, and hydro-engineering; essential prerequisites to the Tribes' ability to build an organization for the operation of the Kerr Project and to securing financial backing for the acquisition. Goals achieved: • Establishing an efficient and economic conveyance process and transition plans • Establishing an efficient and effective Tribal wholesale power generation corporation to manage the plant, balance the resources, and market the power from the Kerr Project. The success of this project, which is essential to the Tribes' acquisition of the Kerr Hydroelectric facility, helps to address poverty and unemployment among Tribal members by generating a number of highly skilled, specialized, high-paying Tribal member jobs and providing a stream of income from power sales that will be used for Tribal economic development. Objectives achieved: The project supported the Power Plant Operations and Maintenance engineer and power marketing coordinator positions. These are key, in part, to the Tribes' successful acquisition and operation of the facility because they will enable the Tribes to gain the very specialized expertise required to operate a large wholesale power generation facility.
Specific objectives include: Objective 1: Hire a power marketing coordinator to develop and coordinate the appropriate power marketing strategy for the sale of power generated by the operation of Kerr Dam. Objective 2: Hire a staff engineer.
ERIC Educational Resources Information Center
Tulsa Public Schools, OK.
A task force sought to determine the information needs of the Tulsa Public Schools (Oklahoma) by studying goals of the school district, identifying all processes necessary for operation of the school system, and conducting interviews with 48 key members. A detailed information systems matrix was constructed to show the interrelationships between…
The development of health care data warehouses to support data mining.
Lyman, Jason A; Scully, Kenneth; Harrison, James H
2008-03-01
Clinical data warehouses offer tremendous benefits as a foundation for data mining. By serving as a source for comprehensive clinical and demographic information on large patient populations, they streamline knowledge discovery efforts by providing standard and efficient mechanisms to replace time-consuming and expensive original data collection, organization, and processing. Building effective data warehouses requires knowledge of and attention to key issues in database design, data acquisition and processing, and data access and security. In this article, the authors provide an operational and technical definition of data warehouses, present examples of data mining projects enabled by existing data warehouses, and describe key issues and challenges related to warehouse development and implementation.
NASA Technical Reports Server (NTRS)
Lauer, H. V. Jr.; Ming, D. W.; Sutter, B.; Mahaffy, P. R.
2010-01-01
The Mars Science Laboratory (MSL) is scheduled for launch in 2011. The science objectives for MSL are to assess the past or present biological potential, to characterize the geology, and to investigate other planetary processes that influence habitability at the landing site. The Sample Analysis at Mars (SAM) is a key instrument on the MSL payload that will explore the potential habitability of the landing site [1]. In addition to searching for organic compounds, SAM will have the capability to characterize evolved gases as a function of increasing temperature and provide information on the mineralogy of volatile-bearing phases such as carbonates, sulfates, phyllosilicates, and Fe-oxyhydroxides. The operating conditions in the SAM ovens will be maintained at 30 mb pressure with a He carrier gas flowing at 1 sccm. We have previously characterized the thermal and evolved gas behaviors of volatile-bearing species under reduced pressure conditions that simulated the operating conditions of the Thermal and Evolved Gas Analyzer (TEGA) that was onboard the 2007 Mars Phoenix Scout Mission [e.g., 2-8]. TEGA ovens operated at 12 mb pressure with a N2 carrier gas flowing at 0.04 sccm. Another key difference between SAM and TEGA is that TEGA was able to perform differential scanning calorimetry whereas SAM only has a pyrolysis oven. The operating conditions for TEGA and SAM differ in several key parameters, including operating pressure (12 vs. 30 mb), carrier gas (N2 vs. He), and carrier gas flow rate (0.04 vs. 1 sccm). The objectives of this study are to characterize the thermal and evolved gas analysis of calcite under SAM operating conditions and then compare it to calcite thermal and evolved gas analysis under TEGA operating conditions.
Landsat-5 bumper-mode geometric correction
Storey, James C.; Choate, Michael J.
2004-01-01
The Landsat-5 Thematic Mapper (TM) scan mirror was switched from its primary operating mode to a backup mode in early 2002 in order to overcome internal synchronization problems arising from long-term wear of the scan mirror mechanism. The backup bumper mode of operation removes the constraints on scan start and stop angles enforced in the primary scan angle monitor operating mode, requiring additional geometric calibration effort to monitor the active scan angles. It also eliminates scan timing telemetry used to correct the TM scan geometry. These differences require changes to the geometric correction algorithms used to process TM data. A mathematical model of the scan mirror's behavior when operating in bumper mode was developed. This model includes a set of key timing parameters that characterize the time-varying behavior of the scan mirror bumpers. To simplify the implementation of the bumper-mode model, the bumper timing parameters were recast in terms of the calibration and telemetry data items used to process normal TM imagery. The resulting geometric performance, evaluated over 18 months of bumper-mode operations, though slightly reduced from that achievable in the primary operating mode, is still within the Landsat specifications when the data are processed with the most up-to-date calibration parameters.
NASA Astrophysics Data System (ADS)
Ireland, Gareth; North, Matthew R.; Petropoulos, George P.; Srivastava, Prashant K.; Hodges, Crona
2015-04-01
Acquiring accurate information on the spatio-temporal variability of soil moisture content (SM) and evapotranspiration (ET) is of key importance for extending our understanding of the Earth system's physical processes, and is also required in a wide range of multi-disciplinary research studies and applications. Earth Observation (EO) technology provides an economically feasible way to derive continuous spatio-temporal estimates of key parameters characterising land surface interactions, including ET and SM. Such information is of key value to practitioners, decision makers and scientists alike. The PREMIER-EO project, recently funded by High Performance Computing Wales (HPCW), is a research initiative directed towards a better understanding of EO technology's present ability to deliver operational estimates of surface fluxes and SM. Moreover, the project aims to address knowledge gaps related to the operational estimation of such parameters, and thus to contribute to ongoing global efforts to enhance the accuracy of those products. In this presentation we introduce the PREMIER-EO project, providing a detailed overview of the research aims and objectives for the 1-year duration of the project's implementation. Subsequently, we present the initial results of the work carried out herein, in particular an all-inclusive and robust evaluation of the accuracy of existing operational ET and SM products across different ecosystems globally. The research outcomes of this project, once completed, will be an important contribution towards addressing the knowledge gaps related to the operational estimation of ET and SM. The project's results will also support ongoing global efforts towards the operational development of related products using technologically advanced EO instruments which were launched recently or are planned for launch in the next 1-2 years.
Key Words: PREMIER-EO, HPC Wales, Soil Moisture, Evapotranspiration, Earth Observation
High speed and adaptable error correction for megabit/s rate quantum key distribution.
Dixon, A R; Sato, H
2014-12-02
Quantum Key Distribution is moving from its theoretical foundation of unconditional security to rapidly approaching real world installations. A significant part of this move is the orders of magnitude increases in the rate at which secure key bits are distributed. However, these advances have mostly been confined to the physical hardware stage of QKD, with software post-processing often being unable to support the high raw bit rates. In a complete implementation this leads to a bottleneck limiting the final secure key rate of the system unnecessarily. Here we report details of equally high rate error correction which is further adaptable to maximise the secure key rate under a range of different operating conditions. The error correction is implemented both in CPU and GPU using a bi-directional LDPC approach and can provide 90-94% of the ideal secure key rate over all fibre distances from 0-80 km.
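Why reconciliation efficiency matters for the final key rate can be seen from the standard asymptotic BB84-style key fraction. This is a textbook relation with an assumed efficiency value, not the rate equation of the reported system:

```python
import math

def h2(e):
    """Binary entropy in bits."""
    return 0.0 if e in (0.0, 1.0) else -e * math.log2(e) - (1 - e) * math.log2(1 - e)

def secure_fraction(qber, efficiency=1.1):
    """Asymptotic BB84-style secure key fraction: r = 1 - f*h2(e) - h2(e).

    f*h2(e) bits per sifted bit leak during error correction (f is the
    reconciliation efficiency; f = 1 is the Shannon limit), and a further
    h2(e) is sacrificed to privacy amplification. The f = 1.1 figure is
    an assumed, typical value."""
    return max(0.0, 1.0 - efficiency * h2(qber) - h2(qber))

# At 2% QBER, roughly 70% of sifted bits survive as secure key
print(round(secure_fraction(0.02), 3))  # → 0.703
```

Bringing the error-correction efficiency closer to 1 directly raises the secure fraction, which is why high-throughput, near-Shannon LDPC decoding matters as raw bit rates grow.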
Optimizing Processes to Minimize Risk
NASA Technical Reports Server (NTRS)
Loyd, David
2017-01-01
NASA, like other hazardous industries, has suffered catastrophic losses. Human error will likely never be completely eliminated as a factor in our failures. When you can't eliminate risk, focus on mitigating the worst consequences and on recovering operations. Bolstering processes to emphasize the role of integration and problem solving is key to success. Building an effective Safety Culture bolsters skill-based performance that minimizes risk and encourages successful engagement.
Computer-Mediated Group Processes in Distributed Command and Control Systems
1988-06-01
Linville, Michael J. Liebhaber, and Richard W. Obermayer (Vreuls Corporation); Jon J. Fallesen (Army Research Institute). ...control staffs who will operate in a computer-mediated environment. The Army Research Institute has initiated research to examine selected issues... computer-mediated group processes is needed. Procedure: The identification and selection of key research issues followed a three-step procedure. Previous
Process Modeling and Dynamic Simulation for EAST Helium Refrigerator
NASA Astrophysics Data System (ADS)
Lu, Xiaofei; Fu, Peng; Zhuang, Ming; Qiu, Lilong; Hu, Liangbing
2016-06-01
In this paper, process modeling and dynamic simulation for the EAST helium refrigerator have been completed. The cryogenic process model is described and the main components are customized in detail. The process model is controlled by the PLC simulator, and real-time communication between the process model and the controllers is achieved by a customized interface. Validation of the process model has been confirmed against EAST experimental data during the 300-80 K cool-down process. Simulation results indicate that this process simulator reproduces the dynamic behaviors of the EAST helium refrigerator very well for long-pulsed plasma discharge operation. The cryogenic process simulator based on the control architecture is available for operation optimization and control design of EAST cryogenic systems to cope with long-pulsed heat loads in the future. Supported by the National Natural Science Foundation of China (No. 51306195) and the Key Laboratory of Cryogenics, Technical Institute of Physics and Chemistry, CAS (No. CRYO201408).
NASA Technical Reports Server (NTRS)
1994-01-01
This manual presents a series of recommended techniques that can increase overall operational effectiveness of both flight and ground based NASA systems. It provides a set of tools that minimizes risk associated with: (1) restoring failed functions (both ground and flight based); (2) conducting complex and highly visible maintenance operations; and (3) sustaining a technical capability to support the NASA mission using aging equipment or facilities. It considers (1) program management - key elements of an effective maintainability effort; (2) design and development - techniques that have benefited previous programs; (3) analysis and test - quantitative and qualitative analysis processes and testing techniques; and (4) operations and operational design techniques that address NASA field experience. This document is a valuable resource for continuous improvement ideas in executing the systems development process in accordance with the NASA 'better, faster, smaller, and cheaper' goal without compromising safety.
NASA Astrophysics Data System (ADS)
Guzman, J. C.; Bennett, T.
2008-08-01
The Convergent Radio Astronomy Demonstrator (CONRAD) is a collaboration between the computing teams of two SKA pathfinder instruments, MeerKAT (South Africa) and ASKAP (Australia). Our goal is to produce the required common software to operate, process and store the data from the two instruments. Both instruments are synthesis arrays composed of a large number of antennas (40 - 100) operating at centimeter wavelengths with wide-field capabilities. Key challenges are the processing of high volume of data in real-time as well as the remote mode of operations. Here we present the software architecture for CONRAD. Our design approach is to maximize the use of open solutions and third-party software widely deployed in commercial applications, such as SNMP and LDAP, and to utilize modern web-based technologies for the user interfaces, such as AJAX.
Ku-band signal design study. [space shuttle orbiter data processing network
NASA Technical Reports Server (NTRS)
Rubin, I.
1978-01-01
Analytical tools, methods, and techniques for assessing the design and performance of the space shuttle orbiter data processing system (DPS) are provided. The computer data processing network is evaluated in the key areas of queueing behavior, synchronization, and network reliability. The structure of the data processing network is described, as well as the system operating principles and the network configuration. The characteristics of the computer systems are indicated. System reliability measures are defined and studied. System and network invulnerability measures are computed. Communication path and network failure analysis techniques are included.
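The series/parallel composition underlying path and network reliability measures can be sketched briefly. The component reliabilities below are hypothetical illustration values, not shuttle DPS figures:

```python
def series(ps):
    """Reliability of components in series: all must work."""
    r = 1.0
    for p in ps:
        r *= p
    return r

def parallel(ps):
    """Reliability of redundant (parallel) components: at least one works."""
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

# Two redundant communication paths, each a processor-bus chain in series
path = series([0.95, 0.99])          # hypothetical processor and bus reliabilities
print(round(parallel([path, path]), 4))  # → 0.9965
```

The same composition rules extend to path-failure analysis: a network invulnerability measure can be built by composing the reliabilities along every processor-to-device path.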
Delta clipper lessons learned for increased operability in reusable space vehicles
NASA Astrophysics Data System (ADS)
Charette, Ray O.; Steinmeyer, Don A.; Smiljanic, Ray R.
1998-01-01
Important lessons were learned from the design, development, and test (DD&T), and operation of the Delta Clipper Experimental (DC-X/XA) Reusable Launch Vehicle (RLV) which apply to increased operability for the operational Reusable Space Vehicles (RSVs). Boeing maintains a continuous process improvement program that provides the opportunity to ``institutionalize'' the results from projects such as Delta Clipper for application to product improvement in future programs. During the design phase, operations and supportability (O&S) were emphasized to ensure aircraft-like operations, traceable to an operational RSV. The operations personnel, flight, and ground crew and crew chief were actively involved in the design, manufacture, and checkout of the systems. Changes and additions to capability were implemented as they evolved from knowledge gained in each phase of development. This paper presents key lessons learned with respect to design and implementation of flight systems, propulsion, airframe, hydraulics, avionics, and ground operations. Information was obtained from discussions with personnel associated with this program concerning their experience and lessons learned. Additionally, field process records and operations timelines were evaluated for applicability to RSVs. The DC-X program pursued reusability in all aspects of the design, a unique approach in rocket system development.
2003-10-31
The NASA News Center, seen here, is the hub of news operations for the media, providing information and contacts about Space Shuttle processing and other activities around KSC. News Center staff also conduct media tours, escorting journalists and photo/videographers to key sites such as the launch pads and Vehicle Assembly Building as needed.
ERIC Educational Resources Information Center
Coupet, Jason
2017-01-01
Historically Black Colleges and Universities (HBCUs), a set of US higher education institutions historically tasked with educating African-American students, receive both state and federal funding. However, state governments often assert operational control through the political process, potentially influencing how key resources are used. Do these…
PHYSICAL AND OPTICAL PROPERTIES OF STEAM-EXPLODED LASER-PRINTED PAPER
Laser-printed paper was pulped by the steam-explosion process. A full-factorial experimental design was applied to determine the effects of key operating variables on the properties of steam-exploded pulp. The variables were addition level for pulping chemicals (NaOH and/or Na2SO...
Human Mars Surface Science Operations
NASA Technical Reports Server (NTRS)
Bobskill, Marianne R.; Lupisella, Mark L.
2014-01-01
Human missions to the surface of Mars will have challenging science operations. This paper will explore some of those challenges, based on science operations considerations as part of more general operational concepts being developed by NASA's Human Spaceflight Architecture (HAT) Mars Destination Operations Team (DOT). The HAT Mars DOT has been developing comprehensive surface operations concepts with an initial emphasis on a multi-phased mission that includes a 500-day surface stay. This paper will address crew science activities, operational details and potential architectural and system implications in the areas of (a) traverse planning and execution, (b) sample acquisition and sample handling, (c) in-situ science analysis, and (d) planetary protection. Three cross-cutting themes will also be explored in this paper: (a) contamination control, (b) low-latency telerobotic science, and (c) crew autonomy. The present traverses under consideration are based on the report, Planning for the Scientific Exploration of Mars by Humans1, by the Mars Exploration Planning and Analysis Group (MEPAG) Human Exploration of Mars-Science Analysis Group (HEM-SAG). The traverses are ambitious and the role of science in those traverses is a key component that will be discussed in this paper. The process of obtaining, handling, and analyzing samples will be an important part of ensuring acceptable science return. Meeting planetary protection protocols will be a key challenge and this paper will explore operational strategies and system designs to meet the challenges of planetary protection, particularly with respect to the exploration of "special regions." A significant challenge for Mars surface science operations with crew is preserving science sample integrity in what will likely be an uncertain environment. 
Crewed mission surface assets -- such as habitats, spacesuits, and pressurized rovers -- could be a significant source of contamination due to venting, out-gassing, and cleanliness levels associated with crew presence. Low-latency telerobotic science operations have the potential to address a number of contamination control and planetary protection issues and will be explored in this paper. Crew autonomy is another key cross-cutting challenge for Mars surface science operations, because the communications delay between Earth and Mars could be as high as 20 minutes one way, likely requiring the crew to perform many science tasks without direct, timely intervention from ground support on Earth. Striking the operational balance between crew autonomy and Earth support will be a key challenge that this paper will address.
Concentrating Solar Power Central Receiver Panel Component Fabrication and Testing FINAL REPORT
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDowell, Michael W; Miner, Kris
The objective of this project is to complete a design of an advanced concentrated solar panel, demonstrate the manufacturability of key components, and then confirm the operation of the key components under prototypic solar flux conditions. This work is an important step in reducing the levelized cost of energy (LCOE) from a central receiver solar power plant. The key technical risk in building larger power towers is building the larger receiver systems. Therefore, this technology project includes the design of an advanced molten salt prototypic sub-scale receiver panel that can be utilized in a large receiver system, followed by the fabrication and testing of key components of the receiver design that will be used to validate the design. This project shall have a significant impact on solar thermal power plant design. Receiver panels of suitable size for utility scale plants are a key element of a solar power tower plant. Many subtle and complex manufacturing processes are involved in producing a reliable, robust receiver panel. Given the substantial size difference between receiver panels manufactured in the past and those needed for large plant designs, the manufacture and demonstration of prototype receiver panel components with representative features of a full-sized panel will be important to improving the build process for commercial success. Given the thermal flux limitations of the test facility, the panel components could not be rendered full size. Significant changes occurred in the project's technical strategies from project initiation to the accomplishments described herein. The initial strategy was to define cost improvements for the receiver, then design, build, and test a scale prototype receiver, on sun, with a molten salt heat transport system. DOE had committed to constructing a molten salt heat transport loop to support receiver testing at the top of the NSTTF tower. Because of funding constraints this did not happen.
A subsequent plan to test the scale prototype receiver, off sun but at temperature, in a molten salt loop at ground level adjacent to the tower also had to be abandoned. Thus, no test facility existed for a molten salt receiver test. As a result, PWR completed the prototype receiver design and then fabricated key components for testing instead of fabricating the complete prototype receiver. A number of innovative design ideas were developed, and key features of the receiver panel were identified. This evaluation included input from Solar 2, the personal experience of people working on these programs, and meetings with Sandia. Key components of the receiver design and key processes used to fabricate a receiver were selected for further evaluation. The document "Test Plan, Concentrated Solar Power Receiver In Cooperation with the Department of Energy and Sandia National Laboratory" was written to define the scope of the testing to be completed and to provide details on the hardware, instrumentation, and data acquisition. It contains a list of test objectives, a test matrix, and an associated test box showing the operating points to be tested. The test objectives were to: (1) demonstrate low-cost manufacturability; (2) demonstrate the robustness of two different tube base materials; (3) collect temperature data during on-sun operation; (4) demonstrate long-term repeated daily operation of heat shields; (5) complete pinhole tube weld repairs; and (6) anchor thermal models. This report discusses the tests performed, the results, and the implications for design improvements and LCOE reduction.
Forscher, Emily C; Zheng, Yan; Ke, Zijun; Folstein, Jonathan; Li, Wen
2016-10-01
Emotion perception is known to involve multiple operations and waves of analysis, but the specific nature of these processes remains poorly understood. Combining psychophysical testing and neurometric analysis of event-related potentials (ERPs) in a fear detection task with parametrically varied fear intensities (N=45), we sought to elucidate key processes in fear perception. Building on psychophysics marking fear perception thresholds, our neurometric model fitting identified several putative operations and stages, with four key processes arising in sequence following face presentation: fear-neutral categorization (P1 at 100 ms), fear detection (P300 at 320 ms), valuation (early subcomponent of the late positive potential/LPP at 400-500 ms), and conscious awareness (late subcomponent of the LPP at 500-600 ms). Furthermore, within-subject brain-behavior association suggests that initial emotion categorization was mandatory and detached from behavior, whereas valuation and conscious awareness directly impacted behavioral outcome (explaining 17% and 31% of the total variance, respectively). The current study thus reveals the chronometry of fear perception, ascribing psychological meaning to distinct underlying processes. The combination of early categorization and late valuation of fear reconciles conflicting (categorical versus dimensional) emotion accounts, lending support to a hybrid model. Importantly, future research could specifically interrogate these psychological processes in various behaviors and psychopathologies (e.g., anxiety and depression).
NASA Astrophysics Data System (ADS)
Bingsheng, Xu
2017-04-01
Considering the large quantities of wastewater generated by iron and steel enterprises in China, this paper examines common methods for evaluating the integrated wastewater treatment effect of such enterprises. Based on survey results on environmental protection performance, technological economy, resource and energy consumption, and services and management, an indicator system for evaluating the operation effect of integrated wastewater treatment facilities is set up. By discussing standards and industrial policies inside and outside China, 27 key secondary indicators are further defined on the basis of an investigation of the main equipment and key processes for wastewater treatment, so as to determine the method for setting the key quantitative and qualitative indicators of the evaluation indicator system. The work is also expected to satisfy the basic requirements of reasonable resource allocation, environmental protection, and sustainable economic development; further improve the integrated wastewater treatment effect of iron and steel enterprises; and reduce the emission of hazardous substances and environmental impact.
Supplementary motor area as key structure for domain-general sequence processing: A unified account.
Cona, Giorgia; Semenza, Carlo
2017-01-01
The Supplementary Motor Area (SMA) is considered an anatomically and functionally heterogeneous region and is implicated in several functions. We propose that the SMA plays a crucial role in domain-general sequence processing, contributing to the integration of sequential elements into higher-order representations regardless of the nature of those elements (e.g., motor, temporal, spatial, numerical, linguistic). This review emphasizes the domain-general involvement of the SMA, as this region has been found to support sequence operations in a variety of cognitive domains that, albeit different, share an inherent sequential structure. These include action, time and spatial processing, numerical cognition, music and language processing, and working memory. In this light, we reviewed and synthesized recent neuroimaging, stimulation, and electrophysiological studies in order to compare and reconcile the distinct sources of data by proposing a unifying account of the role of the SMA. We also discuss the differential contributions of the pre-SMA and SMA-proper in sequence operations, and possible neural mechanisms by which such operations are executed.
Measurement-based reliability/performability models
NASA Technical Reports Server (NTRS)
Hsueh, Mei-Chen
1987-01-01
Measurement-based models built from real error data collected on a multiprocessor system are described, along with model development from the raw error data to the estimation of cumulative reward. A workload/reliability model is developed from low-level error and resource-usage data collected on an IBM 3081 system during normal operation in order to evaluate the resource usage/error/recovery process in a large mainframe system; thus, both normal and erroneous behavior of the system are modeled. The results provide an understanding of the different types of errors and recovery processes. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A sensitivity analysis is performed to investigate the significance of using a semi-Markov process, as opposed to a Markov process, to model the measured system.
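The semi-Markov point above is easy to make concrete: in a semi-Markov model the holding time in each state may follow an arbitrary distribution, whereas a Markov model forces it to be exponential. Below is a minimal sketch with a hypothetical two-state (normal/error) system and assumed Weibull holding-time parameters, not the paper's measured model:

```python
import random

# Hypothetical two-state system ("normal" operation and an "error" state);
# the state names and Weibull parameters are assumptions for illustration.
TRANSITIONS = {"normal": "error", "error": "normal"}
HOLDING = {
    "normal": (0.7, 100.0),  # (Weibull shape, scale): heavy-tailed busy periods
    "error":  (1.5, 2.0),    # recovery times cluster near the scale value
}

def simulate(n_transitions, seed=0):
    """Simulate the semi-Markov chain; return total time and per-state time."""
    rng = random.Random(seed)
    state, clock = "normal", 0.0
    time_in = {"normal": 0.0, "error": 0.0}
    for _ in range(n_transitions):
        shape, scale = HOLDING[state]
        dwell = rng.weibullvariate(scale, shape)  # non-exponential holding time
        time_in[state] += dwell
        clock += dwell
        state = TRANSITIONS[state]
    return clock, time_in
```

With the shape parameters above, busy periods are heavy-tailed while recovery times are not, a behavior that no single exponential rate could capture; availability can then be estimated as `time_in["normal"] / clock`.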
Taking advantage of ground data systems attributes to achieve quality results in testing software
NASA Technical Reports Server (NTRS)
Sigman, Clayton B.; Koslosky, John T.; Hageman, Barbara H.
1994-01-01
During the software development life cycle, basic testing starts with the development team. At the end of the development process, an acceptance test is performed to ensure that the deliverable is acceptable to the user. Ideally, the delivery is an operational product with zero defects; in practice, that goal is approached to varying degrees rather than achieved. With the emphasis on building low-cost ground support systems while maintaining a quality product, a key element in the test process is simulator capability. This paper reviews the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) test tool that is used in the acceptance test process for unmanned satellite operations control centers. The TASS is designed to support the development, test, and operational environments of the Goddard Space Flight Center (GSFC) operations control centers. The TASS uses the same basic architecture as the operations control center. This architecture is characterized by its use of distributed processing, industry standards, commercial off-the-shelf (COTS) hardware and software components, and reusable software. The TASS uses much of the same TPOCC architecture and reusable software that the operations control center developer uses. The TASS also makes use of reusable simulator software in the mission-specific versions of the TASS. Very little new software needs to be developed, mainly mission-specific telemetry communication and command processing software. By taking advantage of the ground data system attributes, successful software reuse for operational systems provides the opportunity to extend the reuse concept into the test area. Consistency in test approach is a major step toward achieving quality results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mariani, R.D.; Benedict, R.W.; Lell, R.M.
1996-05-01
As part of the termination activities of Experimental Breeder Reactor II (EBR-II) at Argonne National Laboratory (ANL) West, the spent metallic fuel from EBR-II will be treated in the fuel cycle facility (FCF). A key component of the spent-fuel treatment process in the FCF is the electrorefiner (ER), in which the actinide metals are separated from the active metal fission products and the reactive bond sodium. In the electrorefining process, the metal fuel is anodically dissolved into a high-temperature molten salt, and refined uranium or uranium/plutonium products are deposited at cathodes. The criticality safety strategy and analysis for the ANL-West FCF ER is summarized. The FCF ER operations and processes formed the basis for evaluating criticality safety and control during actinide metal fuel refining. To show criticality safety for the FCF ER, the reference operating conditions for the ER had to be defined. Normal operating envelopes (NOEs) were then defined to bracket the important operating conditions. To keep the operating conditions within their NOEs, process controls were identified that can be used to regulate the actinide forms and content within the ER. A series of operational checks was developed for each operation to verify the extent or success of that operation. The criticality analysis considered the ER operating conditions at their NOE values as the point of departure for credible and incredible failure modes. As a result of the analysis, FCF ER operations were found to be safe with respect to criticality.
Key Performance Indicators in Radiology: You Can't Manage What You Can't Measure.
Harvey, H Benjamin; Hassanzadeh, Elmira; Aran, Shima; Rosenthal, Daniel I; Thrall, James H; Abujudeh, Hani H
2016-01-01
Quality assurance (QA) is a fundamental component of every successful radiology operation. A radiology QA program must be able to efficiently and effectively monitor and respond to quality problems. However, as radiology QA has expanded into the depths of radiology operations, the task of defining and measuring quality has become more difficult. Key performance indicators (KPIs) are highly valuable data points and measurement tools that can be used to monitor and evaluate the quality of services provided by a radiology operation. As such, KPIs empower a radiology QA program to bridge normative understandings of health care quality with on-the-ground quality management. This review introduces the importance of KPIs in health care QA, a framework for structuring KPIs, a method to identify and tailor KPIs, and strategies to analyze and communicate KPI data that would drive process improvement. Adopting a KPI-driven QA program is both good for patient care and allows a radiology operation to demonstrate measurable value to other health care stakeholders.
Realtime Decision Making on EO-1 Using Onboard Science Analysis
NASA Technical Reports Server (NTRS)
Sherwood, Robert; Chien, Steve; Davies, Ashley; Mandl, Dan; Frye, Stu
2004-01-01
Recent autonomy experiments conducted on Earth Observing 1 (EO-1) have used the Autonomous Sciencecraft Experiment (ASE) flight software to classify key features in hyperspectral images captured by EO-1. This analysis is performed by the software onboard EO-1 and is then used to modify the operational plan without interaction from the ground. This paper outlines the overall operations concept and provides details and examples of the onboard science processing, science analysis, and replanning.
Biogas Production: Microbiology and Technology.
Schnürer, Anna
Biogas, containing energy-rich methane, is produced by microbial decomposition of organic material under anaerobic conditions. Under controlled conditions, this process can be used for the production of energy and a nutrient-rich residue suitable for use as a fertilising agent. The biogas can be used for production of heat, electricity or vehicle fuel. Different substrates can be used in the process and, depending on substrate character, various reactor technologies are available. The microbiological process leading to methane production is complex and involves many different types of microorganisms, often operating in close relationships because of the limited amount of energy available for growth. The microbial community structure is shaped by the incoming material, but also by operating parameters such as process temperature. Factors leading to an imbalance in the microbial community can result in process instability or even complete process failure. To ensure stable operation, different key parameters, such as levels of degradation intermediates and gas quality, are often monitored. Despite the fact that the anaerobic digestion process has long been used for industrial production of biogas, many questions still need to be resolved to achieve optimal management and gas yields and to exploit the great energy and nutrient potential available in waste material. This chapter discusses the different aspects that need to be taken into consideration to achieve optimal degradation and gas production, with particular focus on operation management and microbiology.
Penloglou, Giannis; Chatzidoukas, Christos; Kiparissides, Costas
2012-01-01
The microbial production of polyhydroxybutyrate (PHB) is a complex process in which the final quantity and quality of the PHB depend on a large number of process operating variables. Consequently, the design and optimal dynamic operation of a microbial process for the efficient production of PHB with tailor-made molecular properties is an extremely interesting problem. The present study investigates how key process operating variables (i.e., nutritional and aeration conditions) affect the biomass production rate, the PHB accumulation in the cells, and the associated molecular weight distribution. A combined metabolic/polymerization/macroscopic modelling approach, relating process performance and product quality to the process variables, was developed and validated using an extensive series of experiments and measurements. The model predicts the dynamic evolution of biomass growth, polymer accumulation, consumption of carbon and nitrogen sources, and the average molecular weights of the PHB in a bioreactor under batch and fed-batch operating conditions. The proposed integrated model was used for model-based optimization of the production of PHB with tailor-made molecular properties in Azohydromonas lata bacteria. The process optimization led to high intracellular PHB accumulation (up to 0.95 g PHB per g of dry cell weight) and the production of different grades (i.e., different molecular weight distributions) of PHB.
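As a hedged illustration of the kind of macroscopic model the abstract describes (the kinetics, parameter values, and function name below are invented for the sketch, not taken from the study), a logistic biomass term coupled to a growth-associated storage term can be integrated with a simple Euler step:

```python
# Illustrative batch kinetics: logistic biomass growth plus growth-associated
# PHB storage, integrated with a forward-Euler step. All parameter values
# are assumptions for the sketch, not the study's fitted model.

def simulate_batch(hours=48.0, dt=0.1):
    mu_max, x_max = 0.25, 10.0  # max growth rate (1/h), biomass capacity (g/L)
    k_phb = 0.08                # PHB formation rate (g PHB / g biomass / h)
    x, phb, t = 0.1, 0.0, 0.0   # initial biomass (g/L), polymer (g/L), time (h)
    while t < hours:
        x += mu_max * x * (1.0 - x / x_max) * dt  # logistic biomass growth
        phb += k_phb * x * dt                     # polymer accumulation
        t += dt
    return x, phb
```

A real model of this kind would add substrate (carbon/nitrogen) balances and molecular-weight moment equations, which is what makes the model-based optimization in the study possible.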
State Recognition of Bone Drilling Based on Acoustic Emission in Pedicle Screw Operation.
Guan, Fengqing; Sun, Yu; Qi, Xiaozhi; Hu, Ying; Yu, Gang; Zhang, Jianwei
2018-05-09
Pedicle drilling is an important step in pedicle screw fixation, and the most significant challenge in this operation is determining a key point in the transition region between cancellous and inner cortical bone. The purpose of this paper is to find a method for recognizing that key point. After acquiring acoustic emission (AE) signals during the drilling process, we propose a novel frequency-distribution-based algorithm (FDB) to analyze the AE signals in the frequency domain after certain preprocessing steps. We then select a specific frequency band of the signal for standard operations and choose a fitting function to fit the obtained sequence. Characteristics of the fitting function are extracted as outputs for identification of the different bone layers. For comparison, results obtained by detecting force signals and by direct measurement are also given. The results obtained from the AE signals are distinguishable for different bone layers and are more accurate and precise. The outputs of the algorithm are trained and identified by a neural network, and the recognition rate reaches 84.2%. The proposed method is shown to be efficient and can be used for bone layer identification in pedicle screw fixation.
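A minimal sketch of a frequency-distribution-based feature pipeline in the spirit of the FDB algorithm (the band limits, polynomial fitting function, and sampling rate below are assumptions; the abstract does not specify them):

```python
import numpy as np

# Sketch: magnitude spectrum of one AE frame -> one frequency band ->
# normalization -> low-order polynomial fit; the fit coefficients serve as
# the features. Band limits, polynomial order, and sampling rate are assumed.

def fdb_features(frame, fs, band=(50e3, 200e3), order=3):
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    mag = spectrum[in_band]
    mag = mag / (mag.max() + 1e-12)       # scale the band to [0, 1]
    x = np.linspace(0.0, 1.0, mag.size)   # normalized frequency axis
    return np.polyfit(x, mag, order)      # fitting-function parameters
```

The returned coefficients would then be the inputs to a classifier such as the neural network mentioned in the abstract.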
Offset quadrature communications with decision-feedback carrier synchronization
NASA Technical Reports Server (NTRS)
Simon, M. K.; Smith, J. G.
1974-01-01
To accommodate a quadrature amplitude-shift-keyed (QASK) signal, Simon and Smith (1974) modified the decision-feedback loop that tracks a quadrature phase-shift-keyed (QPSK) signal. The investigation reported here considers approaches to modifying the loops so that offset QASK signals can be tracked, with attention to the special case of offset QPSK. The development of the stochastic integro-differential equation of operation for a decision-feedback offset QASK loop is discussed, along with the probability density function of the phase-error process.
Study of CFB Simulation Model with Coincidence at Multi-Working Condition
NASA Astrophysics Data System (ADS)
Wang, Z.; He, F.; Yang, Z. W.; Li, Z.; Ni, W. D.
A two-stage simulation model of a circulating fluidized bed (CFB) was developed. To make the model results match design or real operating values at specified working conditions while retaining real-time calculation capability, only the main key processes were taken into account, and the dominant factors were further abstracted from these key processes. The simulation results showed sound agreement across multiple working conditions and confirmed the advantage of the two-stage model over the original single-stage simulation model. The combustion-supporting effect of secondary air was investigated using the two-stage model. This model provides a solid platform for investigating the pant-leg structured CFB furnace now under design for a supercritical power plant.
Sensors systems for the automation of operations in the ship repair industry.
Navarro, Pedro Javier; Muro, Juan Suardíaz; Alcover, Pedro María; Fernández-Isla, Carlos
2013-09-13
Hull cleaning before repainting is a key operation in the maintenance of ships. For years, a method to improve this operation has been sought through the robotization of techniques such as grit blasting and ultra-high-pressure water jetting. Despite this, it remains standard practice in shipyards for this process to be carried out manually, because the robotized systems developed so far are too expensive to be widely accepted by shipyards. We have chosen to apply a more conservative and realistic approach to this problem, which has resulted in the development of several solutions designed with different degrees of automation and operating range. These solutions use mostly elements already available in many shipyards, so the installation of additional machinery in the workplace would not be necessary. This paper describes the evolutionary development of sensor systems for the automation of the preparation of ship hull surfaces before painting. This evolution has given rise to the development of new technologies for coating removal.
Sequence and batch language programs and alarm-related "C" programs for the 242-A MCS. Revision 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berger, J.F.
1995-03-01
A Distributive Process Control system was purchased by Project B-534, "242-A Evaporator/Crystallizer Upgrades". This control system, called the Monitor and Control System (MCS), was installed in the 242-A Evaporator located in the 200 East Area. The purpose of the MCS is to monitor and control the Evaporator and to monitor a number of alarms and other signals from various Tank Farm facilities. Applications software for the MCS was developed by the Waste Treatment Systems Engineering (WTSE) group of Westinghouse. The standard displays and alarm scheme provide for control and monitoring, but do not directly indicate the signal location or depict the overall process. To do this, WTSE developed a second alarm scheme which uses special programs, annunciator keys, and process graphics. The special programs are written in two languages: Sequence and Batch Language (SABL), and the "C" language. The WTSE-developed alarm scheme works as follows: SABL relates signals and alarms to the annunciator keys, called SKID keys. When an alarm occurs, a SABL program causes a SKID key to flash, and if the alarm is of yellow or white priority, a "C" program turns on an audible horn (the D/3 system uses a different audible horn for red-priority alarms). The horn and flashing key draw the attention of the operator.
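The alarm flow described above can be sketched directly; the function and action names below are illustrative placeholders, not identifiers from the MCS software:

```python
# Sketch of the alarm logic described above: every alarm flashes its SKID
# key; yellow/white priorities sound the "C"-program horn, while red
# priorities use the D/3 system's separate horn.

def handle_alarm(priority):
    """Return the list of actions the alarm scheme takes for one alarm."""
    actions = ["flash_skid_key"]          # SABL flashes the SKID key for any alarm
    if priority in ("yellow", "white"):
        actions.append("c_program_horn")  # the "C" program sounds its horn
    elif priority == "red":
        actions.append("d3_horn")         # the D/3 system uses its own horn
    return actions
```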
Evolving Systems: An Outcome of Fondest Hopes and Wildest Dreams
NASA Technical Reports Server (NTRS)
Frost, Susan A.; Balas, Mark J.
2012-01-01
New theory is presented for evolving systems: autonomously controlled subsystems that self-assemble into a new evolved system with a higher purpose. Evolving systems of aerospace structures often require additional control while assembling to maintain stability during the entire evolution process. This is the concept of Adaptive Key Component Control, which operates through one specific component to maintain stability during the evolution. In addition, this control must often overcome persistent disturbances that occur while the evolution is in progress. Theoretical results are presented for Adaptive Key Component Control with persistent disturbance rejection, and an illustrative example demonstrates the Adaptive Key Component controller on a system composed of rigid-body and flexible-body modes.
Cellular and Synaptic Properties of Local Inhibitory Circuits.
Hull, Court
2017-05-01
Inhibitory interneurons play a key role in sculpting the information processed by neural circuits. Despite the wide range of physiologically and morphologically distinct types of interneurons that have been identified, common principles have emerged that have shed light on how synaptic inhibition operates, both mechanistically and functionally, across cell types and circuits. This introduction summarizes how electrophysiological approaches have been used to illuminate these key principles, including basic interneuron circuit motifs, the functional properties of inhibitory synapses, and the main roles for synaptic inhibition in regulating neural circuit function. It also highlights how some key electrophysiological methods and experiments have advanced our understanding of inhibitory synapse function.
Natural language processing to ascertain two key variables from operative reports in ophthalmology.
Liu, Liyan; Shorstein, Neal H; Amsden, Laura B; Herrinton, Lisa J
2017-04-01
Antibiotic prophylaxis is critical to ophthalmology and other surgical specialties. We performed natural language processing (NLP) of 743,838 operative notes recorded for 315,246 surgeries to ascertain two variables needed to study the comparative effectiveness of antibiotic prophylaxis in cataract surgery. The first key variable was an exposure variable, intracameral antibiotic injection. The second was an intraoperative complication, posterior capsular rupture (PCR), which functioned as a potential confounder. To help other researchers use NLP in their settings, we describe our NLP protocol and lessons learned. For each of the two variables, we used SAS Text Miner and other SAS text-processing modules with a training set of 10,000 (1.3%) operative notes to develop a lexicon. The lexica identified misspellings, abbreviations, and negations, and linked words into concepts (e.g. "antibiotic" linked with "injection"). We validated the NLP tools by iteratively drawing random samples of 2,000 (0.3%) notes, with replacement. The NLP tools identified approximately 60,000 intracameral antibiotic injections and 3,500 cases of PCR. The positive and negative predictive values for intracameral antibiotic injection exceeded 99%; for the intraoperative complication, they exceeded 94%. NLP was a valid and feasible method for obtaining critical variables needed for a research study of surgical safety. These NLP tools were intended for use in the study sample; use with external datasets, or with future datasets in our own setting, would require further testing.
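A toy sketch of lexicon-style concept linking with a negation window, in the spirit of the approach described (the patterns, misspellings, and window sizes are invented for illustration; the study used SAS Text Miner, not Python regular expressions):

```python
import re

# Link "antibiotic" (plus an assumed misspelling and abbreviation) with an
# "inject..." token into one concept, and suppress matches preceded by a
# negation cue within a short window of text.
CONCEPT = re.compile(r"\b(antibiotic|antibotic|abx)\b.{0,40}?\binject\w*", re.I)
NEGATION = re.compile(r"\b(no|not|without|denies)\b", re.I)

def has_intracameral_injection(note):
    """Return True if the note asserts (not negates) the injection concept."""
    m = CONCEPT.search(note)
    if not m:
        return False
    window = note[max(0, m.start() - 30):m.start()]  # text just before the match
    return not NEGATION.search(window)
```

A production lexicon would handle far more variants and scoped negation; this only shows why negation handling is needed to keep the predictive values high.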
Biofiltration: Fundamentals, design and operations principles and applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swanson, W.J.; Loehr, R.C.
1997-06-01
Biofiltration is a biological air pollution control technology for volatile organic compounds (VOCs). This paper summarizes the fundamentals, design and operation, and applications of the process. Biofiltration has been demonstrated to be an effective technology for VOCs from many industries, and large and full-scale systems are in use in Europe and the US. With proper design and operation, VOC removal efficiencies of 95-99% have been achieved. Important parameters for design and performance are empty-bed contact time, gas surface loading, mass loading, elimination capacity, and removal efficiency. Key design and operation factors include chemical and media properties, moisture, pH, temperature, nutrient availability, gas pretreatment, and variations in loading.
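The performance parameters listed above have standard definitions and can be computed directly; a worked sketch with assumed example values (not from any specific installation):

```python
# Standard definitions of the listed biofilter parameters; the example
# numbers used below are assumed for illustration.

def biofilter_metrics(bed_volume_m3, gas_flow_m3_h, c_in_g_m3, c_out_g_m3):
    ebct_s = bed_volume_m3 / gas_flow_m3_h * 3600.0             # empty-bed contact time (s)
    mass_loading = gas_flow_m3_h * c_in_g_m3 / bed_volume_m3    # g VOC / m^3 bed / h
    elimination = gas_flow_m3_h * (c_in_g_m3 - c_out_g_m3) / bed_volume_m3
    removal_pct = (c_in_g_m3 - c_out_g_m3) / c_in_g_m3 * 100.0  # removal efficiency (%)
    return ebct_s, mass_loading, elimination, removal_pct
```

For a 50 m³ bed treating 3000 m³/h at 1.0 g/m³ inlet and 0.03 g/m³ outlet, this gives an EBCT of 60 s, a mass loading of 60 g/m³/h, an elimination capacity of about 58.2 g/m³/h, and 97% removal, within the efficiency range quoted above.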
Adapting New Space System Designs into Existing Ground Infrastructure
NASA Technical Reports Server (NTRS)
Delgado, Hector N.; McCleskey, Carey M.
2008-01-01
As routine space operations extend beyond Earth orbit, the ability of ground infrastructures to take on new launch vehicle systems and a more complex suite of spacecraft and payloads has become a new challenge. The U.S. Vision for Space Exploration and its Constellation Program provide opportunities for the space operations community to meet this challenge. Presently, as new flight and ground systems add to the overall ground-based and space-based capabilities of NASA and its international partners, specific choices are being made as to what to abandon, what to retain, and what to build new. The total ground and space-based infrastructure must support long-term, sustainable operation after it is constructed, deployed, and activated. This paper addresses key areas of engineering concern during conceptual design, development, and routine operations, with a particular focus on: (1) legacy system reusability; (2) system supportability attributes and operations characteristics; (3) ground systems design trades and criteria; and (4) a technology application survey. Each key area weighs the merits of infrastructure reusability in terms of engineering analysis methods and techniques; top-level facility, systems, and equipment design criteria; and suggested methods for making the operational system attributes (the "-ilities") highly visible to design teams and decision-makers throughout the design process.
NASA Technical Reports Server (NTRS)
Lehtonen, Kenneth
1994-01-01
The National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) International Solar-Terrestrial Physics (ISTP) Program is committed to the development of a comprehensive, multi-mission ground data system which will support a variety of national and international scientific missions in an effort to study the flow of energy from the sun through the Earth-space environment, known as the geospace. A major component of the ISTP ground data system is an ISTP-dedicated Central Data Handling Facility (CDHF). Acquisition, development, and operation of the ISTP CDHF were delegated by the ISTP Project Office within the Flight Projects Directorate to the Information Processing Division (IPD) within the Mission Operations and Data Systems Directorate (MO&DSD). The ISTP CDHF supports the receipt, storage, and electronic access of the full complement of ISTP Level-zero science data; serves as the linchpin for the centralized processing and long-term storage of all key parameters generated either by the ISTP CDHF itself or received from external, ISTP Program-approved sources; and provides the required networking and 'science-friendly' interfaces for the ISTP investigators. Once connected to the ISTP CDHF, investigators can browse the online catalog of key parameters from their remote processing facilities for the immediate electronic receipt of selected key parameters using the NASA Science Internet (NSI), managed by NASA's Ames Research Center. The purpose of this paper is twofold: (1) to describe how the ISTP CDHF was successfully implemented and operated to support initially the Japanese Geomagnetic Tail (GEOTAIL) mission and correlative science investigations, and (2) to describe how the ISTP CDHF has been enhanced to support ongoing as well as future ISTP missions.
Emphasis will be placed on how various project management approaches were undertaken that proved to be highly effective in delivering an operational ISTP CDHF to the Project on schedule and within budget. Examples to be discussed include: the development of superior teams; the use of Defect Causal Analysis (DCA) concepts to improve the software development process in a pilot Total Quality Management (TQM) initiative; and the implementation of a robust architecture that will be able to support the anticipated growth in the ISTP Program science requirements with only incremental upgrades to the baseline system. Further examples include the use of automated data management software and the implementation of Government and/or industry standards, whenever possible, into the hardware and software development life-cycle. Finally, the paper will also report on several new technologies (for example, the installation of a Fiber Data Distribution Interface network) that were successfully employed.
Development of Medical Technology for Contingency Response to Marrow Toxic Agents
1. Contingency Preparedness: Collect information from transplant centers, build awareness of the Transplant Center Contingency Planning Committee and...Matched Donors: Increase operational efficiencies that accelerate the search process and increase patient access are key to preparedness in a contingency ...Transplantation: Create a platform that facilitates multicenter collaboration and data management.
Strengthening the revenue cycle: a 4-step method for optimizing payment.
Clark, Jonathan J
2008-10-01
Four steps for enhancing the revenue cycle to ensure optimal payment are: *Establish key performance indicator dashboards in each department that compare current with targeted performance; *Create proper organizational structures for each department; *Ensure that high-performing leaders are hired in all management and supervisory positions; *Implement efficient processes in underperforming operations.
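The first step above, departmental dashboards that compare current with targeted performance, can be sketched as a minimal key performance indicator (KPI) variance report. This is an illustrative sketch only; the department names, metrics, and targets are hypothetical and not from the article.

```python
# Minimal KPI dashboard sketch: compare current vs. target per department.
# All department names and figures below are illustrative assumptions.

def kpi_variance(metrics):
    """Return {department: (current, target, percent_of_target)}."""
    report = {}
    for dept, (current, target) in metrics.items():
        report[dept] = (current, target, round(100.0 * current / target, 1))
    return report

metrics = {
    "Registration": (92.0, 98.0),   # e.g., % of accounts verified before service
    "Billing":      (85.0, 95.0),   # e.g., % of claims submitted on time
}

for dept, (cur, tgt, pct) in kpi_variance(metrics).items():
    flag = "OK" if pct >= 100 else "BELOW TARGET"
    print(f"{dept}: {cur} vs {tgt} ({pct}% of target) {flag}")
```

A real dashboard would pull these figures from the billing system on a schedule, but the comparison logic is the same.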
NASA System Engineering Design Process
NASA Technical Reports Server (NTRS)
Roman, Jose
2011-01-01
This slide presentation reviews NASA's use of systems engineering for the complete life cycle of a project. Systems engineering is a methodical, disciplined approach for the design, realization, technical management, operations, and retirement of a system. Each phase of a NASA project is terminated with a Key decision point (KDP), which is supported by major reviews.
Plastic Solar Cells: A Multidisciplinary Field to Construct Chemical Concepts from Current Research
ERIC Educational Resources Information Center
Gomez, Rafael; Segura, Jose L.
2007-01-01
Examples of plastic solar-cell technology to illustrate core concepts in chemistry are presented. The principles of operations of a plastic solar cell could be used to introduce key concepts, which are fundamentally important to understand photosynthesis and the basic process that govern most novel optoelectronic devices.
McKinnon, Adam D; Ozanne-Smith, Joan; Pope, Rodney
2009-05-01
Injury prevention guided by robust injury surveillance systems (ISS's) can effectively reduce military injury rates, but ISS's depend on human interaction. This study examined experiences and requirements of key users of the Australian Defence Force (ADF) ISS to determine whether the operation of the ISS was optimal, whether there were any shortcomings, and if so, how these shortcomings might be addressed. Semistructured interviews were conducted with 18 Australian Defence Department participants located throughout Australia. Grounded theory methods were used to analyze data by developing an understanding of processes and social phenomena related to injury surveillance systems within the military context. Interviews were recorded and professionally transcribed and information contained in the transcripts was analyzed using NVivo. Key themes relating to the components of an injury surveillance system were identified from the analysis. A range of processes and sociocultural factors influence the utility of military ISS's. These are discussed in detail and should be considered in the future design and operation of military ISS's to facilitate optimal outcomes for injury prevention.
NASA Space Technology Draft Roadmap Area 13: Ground and Launch Systems Processing
NASA Technical Reports Server (NTRS)
Clements, Greg
2011-01-01
This slide presentation reviews the technology development roadmap for the area of ground and launch systems processing. The scope of this technology area includes: (1) Assembly, integration, and processing of the launch vehicle, spacecraft, and payload hardware (2) Supply chain management (3) Transportation of hardware to the launch site (4) Transportation to and operations at the launch pad (5) Launch processing infrastructure and its ability to support future operations (6) Range, personnel, and facility safety capabilities (7) Launch and landing weather (8) Environmental impact mitigations for ground and launch operations (9) Launch control center operations and infrastructure (10) Mission integration and planning (11) Mission training for both ground and flight crew personnel (12) Mission control center operations and infrastructure (13) Telemetry and command processing and archiving (14) Recovery operations for flight crews, flight hardware, and returned samples. This technology roadmap also identifies ground, launch and mission technologies that will: (1) Dramatically transform future space operations, with significant improvement in life-cycle costs (2) Improve the quality of life on earth, while exploring in co-existence with the environment (3) Increase reliability and mission availability using low/zero maintenance materials and systems, comprehensive capabilities to ascertain and forecast system health/configuration, data integration, and the use of advanced/expert software systems (4) Enhance methods to assess safety and mission risk posture, which would allow for timely and better decision making. Several key technologies are identified, with a couple of slides devoted to one of these technologies (i.e., corrosion detection and prevention). 
Development of these technologies can enhance life on Earth and have a major impact on how we access space, eventually making commercial space access routine and bringing improvements in areas such as construction, manufacturing, and weather forecasting, examples of how these process improvements touch our daily lives.
Six Sigma in healthcare delivery.
Liberatore, Matthew J
2013-01-01
The purpose of this paper is to conduct a comprehensive review and assessment of the extant Six Sigma healthcare literature, focusing on: application, process changes initiated and outcomes, including improvements in process metrics, cost and revenue. Data were obtained from an extensive literature search. Healthcare Six Sigma applications were categorized by functional area and department, key process metric, cost savings and revenue generation (if any) and other key implementation characteristics. Several inpatient care areas have seen most applications, including admission, discharge, medication administration, operating room (OR), cardiac and intensive care. About 42.1 percent of the applications have error rate as their driving metric, with the remainder focusing on process time (38 percent) and productivity (18.9 percent). While 67 percent had initial improvement in the key process metric, only 10 percent reported sustained improvement. Only 28 percent reported cost savings and 8 percent offered revenue enhancement. These results do not favorably assess Six Sigma's overall effectiveness and the value it offers healthcare. Results are based on reported applications. Future research can include directly surveying healthcare organizations to provide additional data for assessment. Future application should emphasize obtaining improvements that lead to significant and sustainable value. Healthcare staff can use the results to target promising areas. This article comprehensively assesses Six Sigma healthcare applications and impact.
Simultaneous Visualization of Different Utility Networks for Disaster Management
NASA Astrophysics Data System (ADS)
Semm, S.; Becker, T.; Kolbe, T. H.
2012-07-01
Cartographic visualizations of crises are used to create a Common Operational Picture (COP) and reinforce Situational Awareness by presenting and representing relevant information. As nearly all crises affect geospatial entities, geo-data representations have to support location-specific decision-making throughout the crisis. Since operators' attention span and working memory are limiting factors in acquiring and interpreting information, the cartographic presentation has to support individuals in coordinating their activities and in handling highly dynamic situations. The Situational Awareness of operators, in conjunction with a COP, is a key aspect of the decision-making process and essential for reaching appropriate decisions. Utility networks are among the most complex and most needed systems within a city. The visualization of utility infrastructure in crisis situations is addressed in this paper. The paper provides a conceptual approach for simplifying, aggregating, and visualizing multiple utility networks and their components to meet the requirements of the decision-making process and to support Situational Awareness.
NASA Astrophysics Data System (ADS)
Malago`, M.; Mucchi, E.; Dalpiaz, G.
2016-03-01
Heavy duty wheels are used in applications such as automatic vehicles and are mainly composed of a polyurethane tread glued to a cast iron hub. In the manufacturing process, the application of adhesive between tread and hub is a critical assembly phase, since it is performed entirely by an operator and contamination of the bond area may occur. Furthermore, the presence of rust on the hub surface can further degrade adhesion at the interface, reducing the operating life. In this scenario, a quality control procedure for fault detection, to be used at the end of the manufacturing process, has been developed. This procedure is based on vibration processing techniques and takes advantage of the results of a lumped parameter model. Indicators based on cyclostationarity can be considered key parameters for adoption in a monitoring test station at the end of the production line because of their nondeterministic character.
SAR processing in the cloud for oil detection in the Arctic
NASA Astrophysics Data System (ADS)
Garron, J.; Stoner, C.; Meyer, F. J.
2016-12-01
A new world of opportunity is being thawed from the ice of the Arctic, driven by decreased persistent Arctic sea-ice cover and increases in shipping, tourism, and natural resource development. Tools that can automatically monitor key sea ice characteristics and potential oil spills are essential for safe passage in these changing waters. Synthetic aperture radar (SAR) data can be used to discriminate sea ice types and oil on the ocean surface, and also for feature tracking. Additionally, SAR can image the Earth through the night and in most weather conditions. SAR data are volumetrically large and require significant computing power to manipulate. Algorithms designed to identify key environmental features, like oil spills, in SAR imagery require secondary processing and are computationally intensive, which can functionally limit their application in a real-time setting. Cloud processing is designed to manage big data and big data processing jobs by means of small cycles of off-site computation, eliminating up-front hardware costs. Pairing SAR data with cloud processing has allowed us to create and solidify a processing pipeline for SAR data products in the cloud to compare the efficiency and effectiveness of operational algorithms when run using an Alaska Satellite Facility (ASF) defined Amazon Machine Image (AMI). The products created from this secondary processing were compared to determine which algorithm was most accurate in Arctic feature identification and what operational conditions were required to produce the results on the ASF-defined AMI. Results will be used to inform a series of recommendations to oil-spill response data managers and SAR users interested in expanding their analytical computing power.
Near net shape processing: A necessity for advanced materials applications
NASA Technical Reports Server (NTRS)
Kuhn, Howard A.
1993-01-01
High quality discrete parts are the backbone of successful operation of equipment used in transportation, communication, construction, manufacturing, and appliances. Traditional shapemaking for discrete parts is carried out predominantly by machining, or removing unwanted material to produce the desired shape. As the cost and complexity of modern materials escalates, coupled with the expense and environmental hazards associated with handling of scrap, it is increasingly important to develop near net shape processes for these materials. Such processes involve casting of liquid materials, consolidation of powder materials, or deformation processing of simple solid shapes into the desired shape. Frequently, several of these operations may be used in sequence to produce a finished part. The processes for near net shape forming may be applied to any type of material, including metals, polymers, ceramics, and their composites. The ability to produce shapes is the key to implementation of laboratory developments in materials science into real world applications. This seminar presents an overview of near net shapemaking processes, some application examples, current developments, and future research opportunities.
A Framework for WWW Query Processing
NASA Technical Reports Server (NTRS)
Wu, Binghui Helen; Wharton, Stephen (Technical Monitor)
2000-01-01
Query processing is the most common operation in a DBMS. Sophisticated query processing has been mainly targeted at a single enterprise environment providing centralized control over data and metadata. Submitting queries by anonymous users on the web is different in such a way that load balancing or DBMS' accessing control becomes the key issue. This paper provides a solution by introducing a framework for WWW query processing. The success of this framework lies in the utilization of query optimization techniques and the ontological approach. This methodology has proved to be cost effective at the NASA Goddard Space Flight Center Distributed Active Archive Center (GDAAC).
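The load-balancing concern the abstract raises for anonymous web queries can be illustrated with a minimal round-robin dispatcher. This is a generic sketch of the concept, not the GDAAC implementation; the server names and the dispatch interface are assumptions.

```python
import itertools

class RoundRobinDispatcher:
    """Spread incoming anonymous web queries across backend servers in turn."""

    def __init__(self, servers):
        self._cycle = itertools.cycle(servers)  # endless rotation over servers

    def dispatch(self, query):
        """Pick the next server in rotation for this query."""
        server = next(self._cycle)
        return server, query  # a real system would forward the query here

lb = RoundRobinDispatcher(["db1", "db2", "db3"])
assignments = [lb.dispatch(q)[0] for q in ["q1", "q2", "q3", "q4"]]
print(assignments)  # → ['db1', 'db2', 'db3', 'db1']
```

Round-robin is the simplest policy; a production balancer would also weight servers by load, which is closer to the optimization the paper describes.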
Lunar resource recovery: A definition of requirements
NASA Technical Reports Server (NTRS)
Elsworth, D.; Kohler, J. L.; Alexander, S. S.
1992-01-01
The capability to locate, mine, and process the natural resources of the Moon will be an essential requirement for lunar base development and operation. The list of materials that will be necessary is extensive and ranges from oxygen and hydrogen for fuel and life support to process tailings for emplacement over habitats. Despite the resources need, little is known about methodologies that might be suitable for utilizing lunar resources. This paper examines some of the requirements and constraints for resource recovery and identifies key areas of research needed to locate, mine, and process extraterrestrial natural resources.
NASA Technical Reports Server (NTRS)
Hussey, K. J.; Hall, J. R.; Mortensen, R. A.
1986-01-01
Image processing methods and software used to animate nonimaging remotely sensed data on cloud cover are described. Three FORTRAN programs were written in the VICAR2/TAE image processing domain to perform 3D perspective rendering, to interactively select parameters controlling the projection, and to interpolate parameter sets for animation images between key frames. Operation of the 3D programs and transferring the images to film is automated using executive control language and custom hardware to link the computer and camera.
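The key-frame interpolation step described above (the original tools were FORTRAN programs in the VICAR2/TAE environment; this Python sketch only mirrors the idea) amounts to linearly blending each projection parameter between two key frames. The parameter names here are hypothetical examples.

```python
def interpolate_params(key_a, key_b, n_between):
    """Linearly interpolate every projection parameter between two key frames,
    returning the n_between intermediate parameter sets (keys excluded)."""
    frames = []
    for i in range(1, n_between + 1):
        t = i / (n_between + 1)  # fraction of the way from key_a to key_b
        frames.append({k: key_a[k] + t * (key_b[k] - key_a[k]) for k in key_a})
    return frames

# Hypothetical 3D perspective parameters: camera azimuth and altitude (degrees).
key_a = {"azimuth": 0.0, "altitude": 30.0}
key_b = {"azimuth": 90.0, "altitude": 60.0}
print(interpolate_params(key_a, key_b, 2))
```

Each intermediate frame is then rendered with its interpolated parameter set, producing a smooth camera move between the two key frames.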
1994-05-01
LOGISTICS MANAGEMENT INSTITUTE An Approach for Meeting Customer Standards Under Executive Order 12862 Summary Executive Order 12862, Setting...search Centers all operate and manage wind tunnels for both NASA and indus- try customers . Nonetheless, a separate wind-tunnel process should be...could include the man- ager of the process, selected members of the manager’s staff, a key customer , and a survey expert. The manager and staff would
High-energy solar flare observations at the Y2K maximum
NASA Astrophysics Data System (ADS)
Emslie, A. Gordon
2000-04-01
Solar flares afford an opportunity to observe processes associated with the acceleration and propagation of high-energy particles at a level of detail not accessible in any other astrophysical source. I will review some key results from previous high-energy solar flare observations, including those from the Compton Gamma-Ray Observatory, and the problems that they pose for our understanding of energy release and particle acceleration processes in the astrophysical environment. I will then discuss a program of high-energy observations to be carried out during the upcoming 2000-2001 solar maximum that is aimed at addressing and resolving these issues. A key element in this observational program is the High Energy Solar Spectroscopic Imager (HESSI) spacecraft, which will provide imaging spectroscopic observations with spatial, temporal, and energy resolutions commensurate with the physical processes believed to be operating, and will in addition provide the first true gamma-ray spectroscopy of an astrophysical source.
Two dimensional radial gas flows in atmospheric pressure plasma-enhanced chemical vapor deposition
NASA Astrophysics Data System (ADS)
Kim, Gwihyun; Park, Seran; Shin, Hyunsu; Song, Seungho; Oh, Hoon-Jung; Ko, Dae Hong; Choi, Jung-Il; Baik, Seung Jae
2017-12-01
Atmospheric pressure (AP) operation of plasma-enhanced chemical vapor deposition (PECVD) is one of the promising concepts for high quality and low cost processing. Atmospheric plasma discharge requires a narrow gap configuration, which gives rise to an inherent feature of AP PECVD: two dimensional radial gas flows, which induce radial variations in mass transport and in substrate temperature. The opposite trends of these variations are the key consideration in the development of a uniform deposition process. Another inherent feature of AP PECVD is confined plasma discharge, from which the volume power density concept is derived as a key parameter for the control of deposition rate. We investigated deposition rate as a function of volume power density, gas flux, source gas partial pressure, hydrogen partial pressure, plasma source frequency, and substrate temperature, and derived a design guideline for deposition tool and process development in terms of deposition rate and uniformity.
Wiler, Jennifer L; Welch, Shari; Pines, Jesse; Schuur, Jeremiah; Jouriles, Nick; Stone-Griffith, Suzanne
2015-05-01
The objective was to review and update key definitions and metrics for emergency department (ED) performance and operations. Forty-five emergency medicine leaders convened for the Third Performance Measures and Benchmarking Summit held in Las Vegas, February 21-22, 2014. Prior to arrival, attendees were assigned to workgroups to review, revise, and update the definitions and vocabulary being used to communicate about ED performance and operations. They were provided with the prior definitions of those consensus summits that were published in 2006 and 2010. Other published definitions from key stakeholders in emergency medicine and health care were also reviewed and circulated. At the summit, key terminology and metrics were discussed and debated. Workgroups communicated online, via teleconference, and finally in a face-to-face meeting to reach consensus regarding their recommendations. Recommendations were then posted and open to a 30-day comment period. Participants then reanalyzed the recommendations, and modifications were made based on consensus. A comprehensive dictionary of ED terminology related to ED performance and operation was developed. This article includes definitions of operating characteristics and internal and external factors relevant to the stratification and categorization of EDs. Time stamps, time intervals, and measures of utilization were defined. Definitions of processes and staffing measures are also presented. Definitions were harmonized with performance measures put forth by the Centers for Medicare and Medicaid Services (CMS) for consistency. Standardized definitions are necessary to improve the comparability of EDs nationally for operations research and practice. More importantly, clear precise definitions describing ED operations are needed for incentive-based pay-for-performance models like those developed by CMS. This document provides a common language for front-line practitioners, managers, health policymakers, and researchers. 
© 2015 by the Society for Academic Emergency Medicine.
Notes from the field: the economic value chain in disease management organizations.
Fetterolf, Donald
2006-12-01
The disease management (DM) "value chain" is composed of a linear series of steps that include operational milestones in the development of knowledge, each stage evolving from the preceding one. As an adaptation of Michael Porter's "value chain" model, the process flow in DM moves along the following path: (1) data/information technology, (2) information generation, (3) analysis, (4) assessment/recommendations, (5) actionable customer plan, and (6) program assessment/reassessment. Each of these stages is managed as a major line of product operations within a DM company or health plan. Metrics around each of the key production variables create benchmark milestones, ongoing management insight into program effectiveness, and potential drivers for activity-based cost accounting pricing models. The value chain process must remain robust from early entry of data and information into the system, through the final presentation and recommendations for our clients if the program is to be effective. For individuals involved in the evaluation or review of DM programs, this framework is an excellent method to visualize the key components and sequence in the process. The value chain model is an excellent way to establish the value of a formal DM program and to create a consultancy relationship with a client involved in purchasing these complex services.
Achieving Operability via the Mission System Paradigm
NASA Technical Reports Server (NTRS)
Hammer, Fred J.; Kahr, Joseph R.
2006-01-01
In the past, flight and ground systems have been developed largely independently, with the flight system taking the lead and dominating the development process. Operability issues have been addressed poorly in planning, requirements, design, I&T, and system-contracting activities. In many cases, as documented in lessons learned, this has resulted in significant avoidable increases in cost and risk. With complex missions and systems, operability is being recognized as an important end-to-end design issue. Nevertheless, lessons learned and operability concepts remain, in many cases, poorly understood and sporadically applied. A key to effective application of operability concepts is adopting a 'mission system' paradigm. In this paradigm, flight and ground systems are treated, from an engineering and management perspective, as inter-related elements of a larger mission system. The mission system consists of flight hardware, flight software, telecom services, ground data system, testbeds, flight teams, science teams, flight operations processes, procedures, and facilities. The system is designed in functional layers, which span flight and ground. It is designed in response to project-level requirements, mission design, and an operations concept, and is developed incrementally, with early and frequent integration of flight and ground components.
NASA Astrophysics Data System (ADS)
Black, Stephen T.; Eshleman, Wally
1997-01-01
This paper describes the VentureStar™ SSTO RLV and X-33 operations concepts. Applications of advanced technologies, automated ground support systems, and advanced aircraft and launch vehicle lessons learned have been integrated to develop the streamlined vehicle and mission processing concept necessary to meet the goals of a commercial SSTO RLV. These concepts will be validated by the X-33 flight test program, where financial and technical risk mitigation are required. The X-33 flight test program fully demonstrates the vehicle performance, technology, and efficient ground operations at the lowest possible cost. The Skunk Works' test program approach and the test site's proximity to the production plant are key. The X-33 integrated flight and ground test program incrementally expands the knowledge base of the overall system, allowing minimum-risk progression to the next flight test program milestone. Subsequent X-33 turnaround processing flows will be performed with an aircraft operations philosophy. The differences will be based on research and development, component reliability, and flight test requirements.
Maintenance & construction operations user service : an addendum to the ITS program plan
DOT National Transportation Integrated Search
2001-01-26
The Maintenance and Construction Operations User Service describes the need for integrating key activities. Generally, key Maintenance and Construction Operations (MCO) activities include monitoring, operating, maintaining, improving, and managing th...
Chapter 3. Coordination and collaboration with interface units
Joynt, Gavin M.; Loo, Shi; Taylor, Bruce L.; Margalit, Gila; Christian, Michael D.; Sandrock, Christian; Danis, Marion; Leoniv, Yuval
2016-01-01
Purpose To provide recommendations and standard operating procedures (SOPs) for intensive care unit (ICU) and hospital preparations for an influenza pandemic or mass disaster with a specific focus on enhancing coordination and collaboration between the ICU and other key stakeholders. Methods Based on a literature review and expert opinion, a Delphi process was used to define the essential topics including coordination and collaboration. Results Key recommendations include: (1) establish an Incident Management System with Emergency Executive Control Groups at facility, local, regional/state or national levels to exercise authority and direction over resource use and communications; (2) develop a system of communication, coordination and collaboration between the ICU and key interface departments within the hospital; (3) identify key functions or processes requiring coordination and collaboration, the most important of these being manpower and resources utilization (surge capacity) and re-allocation of personnel, equipment and physical space; (4) develop processes to allow smooth inter-departmental patient transfers; (5) creating systems and guidelines is not sufficient; it is important to: (a) identify the roles and responsibilities of key individuals necessary for the implementation of the guidelines; (b) ensure that these individuals are adequately trained and prepared to perform their roles; (c) ensure adequate equipment to allow key coordination and collaboration activities; (d) ensure an adequate physical environment to allow staff to properly implement guidelines; (6) trigger events for determining a crisis should be defined. Conclusions Judicious planning and adoption of protocols for coordination and collaboration with interface units are necessary to optimize outcomes during a pandemic. PMID:20213418
Building an outpatient imaging center: A case study at genesis healthcare system, part 2.
Yanci, Jim
2006-01-01
In the second of 2 parts, this article will focus on process improvement projects utilizing a case study at Genesis HealthCare System located in Zanesville, OH. Operational efficiency is a key step in developing a freestanding diagnostic imaging center. The process improvement projects began with an Expert Improvement Session (EIS) on the scheduling process. An EIS session is a facilitated meeting that can last anywhere from 3 hours to 2 days. Its intention is to take a group of people involved with the problem or operational process and work to understand current failures or breakdowns in the process. Recommendations are jointly developed to overcome any current deficiencies, and a work plan is structured to create ownership over the changes. A total of 11 EIS sessions occurred over the course of this project, covering 5 sections: Scheduling/telephone call process, Pre-registration, Verification/pre-certification, MRI throughput, CT throughput. Following is a single example of a project focused on the process improvement efforts. All of the process improvement projects utilized a quasi methodology of "DMAIC" (Define, Measure, Analyze, Improve, and Control).
Solving L-L Extraction Problems with Excel Spreadsheet
ERIC Educational Resources Information Center
Teppaitoon, Wittaya
2016-01-01
This work aims to demonstrate the use of Excel spreadsheets for solving L-L extraction problems. The key to solving the problems successfully is to be able to determine a tie line on the ternary diagram where the calculation must be carried out. This enables the reader to analyze the extraction process starting with a simple operation, the…
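Once the tie line through the feed point has been determined on the ternary diagram, the phase split follows from the inverse lever-arm rule, which is exactly the arithmetic an Excel cell formula would perform. The sketch below shows that calculation in Python; the compositions are hypothetical, not taken from the article.

```python
import math

def lever_rule(feed, extract, raffinate, feed_mass):
    """Split feed_mass between the extract and raffinate phases lying on one
    tie line. Points are (x, y) compositions on the ternary diagram; the
    inverse lever-arm rule gives extract fraction = |F-R| / |E-R|."""
    dist = lambda p, q: math.hypot(p[0] - q[0], p[1] - q[1])
    frac_extract = dist(feed, raffinate) / dist(extract, raffinate)
    return frac_extract * feed_mass, (1 - frac_extract) * feed_mass

# Hypothetical tie line: the feed lies midway, so the split is 50/50.
E, R = lever_rule(feed=(0.3, 0.2), extract=(0.5, 0.3),
                  raffinate=(0.1, 0.1), feed_mass=100.0)
print(round(E, 1), round(R, 1))  # → 50.0 50.0
```

In a spreadsheet the same formula would sit in a cell referencing the tie-line endpoint coordinates, which is what makes Excel a convenient teaching tool here.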
ERIC Educational Resources Information Center
Mirman, Daniel; Estes, Katharine Graf; Magnuson, James S.
2010-01-01
Statistical learning mechanisms play an important role in theories of language acquisition and processing. Recurrent neural network models have provided important insights into how these mechanisms might operate. We examined whether such networks capture two key findings in human statistical learning. In Simulation 1, a simple recurrent network…
Thinking Science: A Way to Change Teacher Practice in Order to Raise Students' Ability to Think
ERIC Educational Resources Information Center
Hueppauff, Sonia
2016-01-01
This article describes key facets of the Cognitive Acceleration through Science Education (CASE), a curriculum that emerged in the United Kingdom, enabling teachers to accelerate the process of cognitive development so that more students could attain the higher-order thinking skills (formal operational thinking) required (Lecky, 2012). CASE, also…
Minimization of operational impacts on spectrophotometer color measurements for cotton
USDA-ARS?s Scientific Manuscript database
A key cotton quality and processing property that is gaining increasing importance is the color of the cotton. Cotton fiber in the U.S. is classified for color using the Uster® High Volume Instrument (HVI), using the parameters Rd and +b. Rd and +b are specific to cotton fiber and are not typical ...
After-Action Reviews - who conducts them?
Anne E. Black; Kathleen Sutcliffe; Michelle Barton
2009-01-01
Reflecting on the links between intentions and outcomes is a key practice of a learning organization (Garvin 2000). The After-Action Review (AAR) is a formal reflection process intended to assist groups in capturing lessons learned from a task. AARs typically ask four questions regarding fire-response operations: (1) what did we set out to do, (2) what actually...
Thermoelectric Energy Conversion: Future Directions and Technology Development Needs
NASA Technical Reports Server (NTRS)
Fleurial, Jean-Pierre
2007-01-01
This viewgraph presentation reviews the process of thermoelectric energy conversion along with key technology needs and challenges. The topics include: 1) The Case for Thermoelectrics; 2) Advances in Thermoelectrics: Investment Needed; 3) Current U.S. Investment (FY07); 4) Increasing Thermoelectric Materials Conversion Efficiency Key Science Needs and Challenges; 5) Developing Advanced TE Components & Systems Key Technology Needs and Challenges; 6) Thermoelectrics; 7) 200W Class Lightweight Portable Thermoelectric Generator; 8) Hybrid Absorption Cooling/TE Power Cogeneration System; 9) Major Opportunities in Energy Industry; 10) Automobile Waste Heat Recovery; 11) Thermoelectrics at JPL; 12) Recent Advances at JPL in Thermoelectric Converter Component Technologies; 13) Thermoelectrics Background on Power Generation and Cooling Operational Modes; 14) Thermoelectric Power Generation; and 15) Thermoelectric Cooling.
Optical multiple-image hiding based on interference and grating modulation
NASA Astrophysics Data System (ADS)
He, Wenqi; Peng, Xiang; Meng, Xiangfeng
2012-07-01
We present a method for multiple-image hiding on the basis of interference-based encryption architecture and grating modulation. By using a modified phase retrieval algorithm, we can separately hide a number of secret images into one arbitrarily preselected host image associated with a set of phase-only masks (POMs), which are regarded as secret keys. Thereafter, a grating modulation operation is introduced to multiplex and store the different POMs into a single key mask, which is then assigned to the authorized users in privacy. For recovery, after an appropriate demultiplexing process, one can reconstruct the distributions of all the secret keys and then recover the corresponding hidden images with suppressed crosstalk. Computer simulation results are presented to validate the feasibility of our approach.
Development of Airport Surface Required Navigation Performance (RNP)
NASA Technical Reports Server (NTRS)
Cassell, Rick; Smith, Alex; Hicok, Dan
1999-01-01
The U.S. and international aviation communities have adopted the Required Navigation Performance (RNP) process for defining aircraft performance when operating in the en-route, approach, and landing phases of flight. RNP consists primarily of the following key parameters: accuracy, integrity, continuity, and availability. The processes and analytical techniques employed to define en-route, approach, and landing RNP have been applied in the development of RNP for the airport surface. To validate the proposed RNP requirements, several methods were used. Operational and flight demonstration data were analyzed for conformance with the proposed requirements, as were several aircraft flight simulation studies. The pilot failure risk component was analyzed through several hypothetical scenarios. Additional simulator studies are recommended to better quantify crew reactions to failures, as well as additional simulator and field testing to validate achieved accuracy performance. This research was performed in support of the NASA Low Visibility Landing and Surface Operations Programs.
Chung, Ji-Woo; Kim, Kyung-Min; Yoon, Tae-Ung; Kim, Seung-Ik; Jung, Tae-Sung; Han, Sang-Sup; Bae, Youn-Sang
2017-12-22
A novel power partial-discard (PPD) strategy was developed as a variant of the partial-discard (PD) operation to further improve the separation performance of the simulated moving bed (SMB) process. Whereas conventional PD uses fixed discard flow rates, the PPD operation varies the flow rates of the discard streams by introducing a new variable, the discard amount (DA), in addition to the previously reported variable, discard length (DL). The PPD operations showed significantly improved purities in spite of losses in recoveries. Remarkably, the PPD operation could provide higher purity for a given recovery, or higher recovery for a given purity, than the PD operation. The two variables, DA and DL, played a key role in achieving the desired purity and recovery. PPD operations will be useful for attaining high-purity products with reasonable recoveries. Copyright © 2017 Elsevier B.V. All rights reserved.
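The relationship between the two PPD variables can be sketched in a few lines. This is a hypothetical illustration, assuming a simple linearly decreasing discard profile; the function names, the triangular profile, and the numbers are assumptions for the sketch, not the authors' equations:

```python
# Hypothetical sketch of conventional partial-discard (PD) vs. power
# partial-discard (PPD) discard-stream scheduling.

def pd_discard_flow(step, dl, flow):
    """Conventional PD: a fixed discard flow for the first `dl` steps."""
    return flow if step < dl else 0.0

def ppd_discard_flow(step, dl, da):
    """PPD: vary the discard flow so the total discarded amount equals `da`
    over the discard length `dl` (here: a linearly decreasing profile)."""
    if step >= dl:
        return 0.0
    weight = dl - step            # triangular weight: large early, small late
    total = dl * (dl + 1) / 2     # normalization so the weights sum to 1
    return da * weight / total

# the varying profile still integrates to the chosen discard amount DA
total_ppd = sum(ppd_discard_flow(k, 5, 10.0) for k in range(8))
print(round(total_ppd, 6))
```

The sketch only shows the bookkeeping: DL fixes how long discarding lasts, DA fixes how much is discarded in total, and the flow rate is what varies between the two.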
NASA Astrophysics Data System (ADS)
Zender, J.; Berghmans, D.; Bloomfield, D. S.; Cabanas Parada, C.; Dammasch, I.; De Groof, A.; D'Huys, E.; Dominique, M.; Gallagher, P.; Giordanengo, B.; Higgins, P. A.; Hochedez, J.-F.; Yalim, M. S.; Nicula, B.; Pylyser, E.; Sanchez-Duarte, L.; Schwehm, G.; Seaton, D. B.; Stanger, A.; Stegen, K.; Willems, S.
2013-08-01
The PROBA2 Science Centre (P2SC) is a small-scale science operations centre supporting the Sun observation instruments onboard PROBA2: the EUV imager Sun Watcher using APS detectors and image Processing (SWAP) and the Large-Yield Radiometer (LYRA). PROBA2 is one of ESA's small, low-cost Projects for Onboard Autonomy (PROBA) and part of ESA's In-Orbit Technology Demonstration Programme. The P2SC is hosted at the Royal Observatory of Belgium, co-located with both Principal Investigator teams. The P2SC tasks cover science planning, instrument commanding, instrument monitoring, data processing, support of outreach activities, and distribution of science data products. PROBA missions aim for a high degree of autonomy at mission and system level, including the science operations centre. The autonomy and flexibility of the P2SC are achieved through a set of web-based interfaces that allow the operators as well as the instrument teams to monitor the status of operations quasi-continuously and react quickly to solar events. In addition, several new concepts are implemented at instrument, spacecraft, and ground-segment levels, allowing a high degree of flexibility in the operations of the instruments. This article explains the key concepts of the P2SC, emphasising the automation and the flexibility achieved in the commanding as well as the data-processing chain.
Operational experience of the OC-OTEC experiments at NELH
DOE Office of Scientific and Technical Information (OSTI.GOV)
Link, H
1989-02-01
The Solar Energy Research Institute, under funding and program direction from the US Department of Energy, has been operating a small-scale test apparatus to investigate key components of open-cycle ocean thermal energy conversion (OC-OTEC). The apparatus started operations in October 1987 and continues to provide valuable information on heat- and mass-transfer processes in evaporators and condensers, gas sorption processes as seawater is depressurized and repressurized, and control and instrumentation characteristics of open-cycle systems. Although other test facilities have been used to study some of these interactions, this is the largest apparatus of its kind to use seawater since Georges Claude's efforts in 1926. The information obtained from experiments conducted in this apparatus is being used to design a larger scale experiment in which a positive net power production is expected to be demonstrated for the first time with OC-OTEC. This paper describes the apparatus, the major tests conducted during its first 18 months of operation, and the experience gained in OC-OTEC system operation. 13 refs., 8 figs.
Improving security of the ping-pong protocol
NASA Astrophysics Data System (ADS)
Zawadzki, Piotr
2013-01-01
A security layer for the asymptotically secure ping-pong protocol is proposed and analyzed in this paper. The improvement exploits the inevitable errors introduced by eavesdropping in the control and message modes. Its role is similar to that of the privacy amplification algorithms known from quantum key distribution schemes. Messages are processed in blocks, which guarantees that an eavesdropper faces a computationally infeasible problem as long as the system parameters are within reasonable limits. The additional information preprocessing introduced does not require quantum memory registers, and confidential communication is possible without prior key agreement or a shared secret.
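The block-wise preprocessing idea can be sketched as follows. This is a minimal illustration in the spirit of privacy amplification, assuming SHA-256 as the compressing function and a 16-byte block size; the paper's actual layer is defined over the ping-pong protocol itself, not over classical bytes:

```python
# Sketch: split the raw message into fixed-size blocks and mix each block
# through a hash before use, so partial knowledge of a block gives an
# eavesdropper no usable information about the processed output.
import hashlib

def amplify(raw: bytes, block_size: int = 16) -> bytes:
    out = b""
    for i in range(0, len(raw), block_size):
        block = raw[i:i + block_size]
        # compress each block; predicting the digest requires the full block
        out += hashlib.sha256(block).digest()[:block_size]
    return out

processed = amplify(b"attack at dawn, attack at dusk")
print(len(processed))
```

Note this runs entirely on classical data and needs no memory of previous blocks, mirroring the paper's point that no quantum memory registers are required.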
One-click scanning of large-size documents using mobile phone camera
NASA Astrophysics Data System (ADS)
Liu, Sijiang; Jiang, Bo; Yang, Yuanjie
2016-07-01
Currently, mobile apps for document scanning do not provide convenient operations for large-size documents. In this paper, we present a one-click scanning approach for large-size documents using a mobile phone camera. After capturing a continuous video of the document, our approach automatically extracts several key frames by optical flow analysis. Based on these key frames, a mobile GPU-based image stitching method is then used to generate a complete document image with high detail. No extra manual intervention is required in the process, and experimental results show that our app performs well, demonstrating its convenience and practicability for daily use.
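The key-frame selection step can be sketched as a motion-accumulation loop. This is an assumed simplification: in the paper the per-frame motion comes from optical flow analysis, while here the motion values and the threshold are made-up numbers:

```python
# Sketch: accumulate inter-frame camera motion and emit a key frame whenever
# the accumulated displacement passes a threshold, so consecutive key frames
# overlap enough for stitching.

def select_key_frames(motions, threshold):
    """motions[i] = estimated displacement between frame i and i+1."""
    keys = [0]               # always keep the first frame
    accumulated = 0.0
    for i, m in enumerate(motions, start=1):
        accumulated += m
        if accumulated >= threshold:
            keys.append(i)   # enough new content seen: mark a key frame
            accumulated = 0.0
    return keys

print(select_key_frames([0.2, 0.3, 0.6, 0.1, 0.5, 0.7], 1.0))
```

The selected indices would then feed the GPU stitching stage; everything between key frames is discarded.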
Unravelling Some of the Key Transformations in the Hydrothermal Liquefaction of Lignin.
Lui, Matthew Y; Chan, Bun; Yuen, Alexander K L; Masters, Anthony F; Montoya, Alejandro; Maschmeyer, Thomas
2017-05-22
Using both experimental and computational methods, focusing on intermediates and model compounds, some of the main features of the reaction mechanisms that operate during the hydrothermal processing of lignin were elucidated. Key reaction pathways and their connection to different structural features of lignin were proposed. Under neutral conditions, subcritical water was demonstrated to act as a bifunctional acid/base catalyst for the dissection of lignin structures. In a complex web of mutually dependent interactions, guaiacyl units within lignin were shown to significantly affect overall lignin reactivity. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Design techniques for low-voltage analog integrated circuits
NASA Astrophysics Data System (ADS)
Rakús, Matej; Stopjaková, Viera; Arbet, Daniel
2017-08-01
In this paper, a review and analysis of different design techniques for (ultra) low-voltage integrated circuits (ICs) are performed. This analysis shows that the most suitable methods for low-voltage analog IC design in a standard CMOS process include techniques using bulk-driven MOS transistors, dynamic-threshold MOS transistors, and MOS transistors operating in the weak or moderate inversion regions. The main advantage of such techniques is that no modification of the standard CMOS structure or process is needed. Basic circuit building blocks such as differential amplifiers or current mirrors designed using these approaches are able to operate with a power supply voltage of 600 mV (or even lower), which is the key feature for integrated systems in modern portable applications.
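The reason weak-inversion operation tolerates low supply voltages can be seen from the standard subthreshold drain-current expression, I_D = I_0 exp((V_GS - V_TH)/(n V_T)). The parameter values below (I_0, n, V_TH) are illustrative assumptions, not figures from the paper:

```python
# Sketch of the weak-inversion (subthreshold) MOS drain-current model:
# useful bias currents flow at gate voltages below the nominal threshold,
# which is what makes 600 mV supplies workable.
import math

def subthreshold_id(vgs, vth=0.5, i0=1e-7, n=1.5, vt=0.026):
    """Drain current (A) in weak inversion; V_GS and V_TH in volts."""
    return i0 * math.exp((vgs - vth) / (n * vt))

# with only 400 mV of gate drive, well under threshold, the device still
# conducts a usable (nanoamp-scale) current
print(subthreshold_id(0.4))
```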
Minimization In Digital Design As A Meta-Planning Problem
NASA Astrophysics Data System (ADS)
Ho, William P. C.; Wu, Jung-Gen
1987-05-01
In our model-based expert system for automatic digital system design, we formalize the design process into three sub-processes: compiling high-level behavioral specifications into primitive behavioral operations, grouping primitive operations into behavioral functions, and grouping functions into modules. Consideration of design minimization explicitly controls decision-making in the last two sub-processes. Design minimization, a key task in the automatic design of digital systems, is complicated by the high degree of interaction among the time sequence and content of design decisions. In this paper, we present an AI approach which directly addresses these interactions and their consequences by modeling the minimization problem as a planning problem, and the management of design decision-making as a meta-planning problem.
2013-06-01
Kobu, 2007) Gunasekaran and Kobu also presented six observations as they relate to these key performance indicators (KPI), as follows: 1...Internal business process (50% of the KPI) and customers (50% of the KPI) play a significant role in SC environments. This implies that internal business...process PMs have significant impact on the operational performance. 2. The most widely used PM is financial performance (38% of the KPI). This
Random ambience using high fidelity images
NASA Astrophysics Data System (ADS)
Abu, Nur Azman; Sahib, Shahrin
2011-06-01
Most secure communication nowadays mandates true random keys as an input. These operations are mostly designed and taken care of by the developers of the cryptosystem. Due to the nature of confidential crypto development today, pseudorandom keys are typically designed and still preferred by the developers of the cryptosystem. However, pseudorandom keys are predictable, periodic, and repeatable; hence they carry minimal entropy. True random keys are believed to be generated only via hardware random number generators. Careful statistical analysis is still required to have any confidence that the process and apparatus generate numbers sufficiently random for cryptographic use. In this research, each moment in life is considered unique in itself. The random key is unique to the moment it is generated by the user, whenever he or she needs random keys in practical secure communication. An ambience of high-fidelity digital images is tested for randomness according to the NIST Statistical Test Suite. A recommendation on generating simple random cryptographic keys live at 4 megabits per second is also reported.
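The randomness screening step can be illustrated with the simplest member of the NIST suite, the frequency (monobit) test. The toy pixel values and the LSB extraction below are assumptions for the sketch; the study applies the full suite to image-derived bitstreams:

```python
# NIST SP 800-22 frequency (monobit) test: convert bits to +/-1, form the
# partial sum, and compute a p-value; a p-value >= 0.01 is the suite's usual
# pass criterion for this test.
import math

def monobit_p_value(bits):
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))

# toy "image": take the least significant bit of each pixel value
pixels = [17, 203, 42, 88, 129, 7, 254, 61, 90, 33, 76, 149, 200, 95, 18, 111]
bits = [p & 1 for p in pixels]
p = monobit_p_value(bits)
print(p >= 0.01)
```

A real screening would run all the suite's tests over millions of bits; a single 16-bit monobit check is only meant to show the mechanics.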
Spitzer Space Telescope Sequencing Operations Software, Strategies, and Lessons Learned
NASA Technical Reports Server (NTRS)
Bliss, David A.
2006-01-01
The Space Infrared Telescope Facility (SIRTF) was launched in August 2003 and renamed the Spitzer Space Telescope in 2004. Two years of observing the universe in the wavelength range from 3 to 180 microns have yielded enormous scientific discoveries. Since this magnificent observatory has a limited lifetime, maximizing science viewing efficiency (i.e., maximizing time spent executing activities directly related to science observations) was the key operational objective. The strategy employed for maximizing science viewing efficiency was to optimize spacecraft flexibility, adaptability, and use of observation time. The selected approach involved implementation of a multi-engine sequencing architecture coupled with nondeterministic spacecraft and science execution times. This approach, though effective, added much complexity to uplink operations and sequence development. The Jet Propulsion Laboratory (JPL) manages Spitzer's operations. As part of the uplink process, Spitzer's Mission Sequence Team (MST) was tasked with processing observatory inputs from the Spitzer Science Center (SSC) into efficiently integrated, constraint-checked, and modeled review and command products which accommodated the complexity of nondeterministic spacecraft and science event executions without increasing operations costs. The MST developed processes and scripts, and participated in the adaptation of multi-mission core software, to enable rapid processing of complex sequences. The MST was also tasked with developing a Downlink Keyword File (DKF) which could instruct Deep Space Network (DSN) stations on how and when to configure themselves to receive Spitzer science data. As MST and uplink operations developed, important lessons were learned that should be applied to future missions, especially those which employ command-intensive operations via a multi-engine sequence architecture.
NASA's Earth Observing System (EOS): Delivering on the Dream, Today and Tomorrow
NASA Technical Reports Server (NTRS)
Kelly, Angelita C.; Johnson, Patricia; Case, Warren F.
2010-01-01
This paper describes the successful operations of NASA's Earth Observing System (EOS) satellites over the past 10 years and the plans for the future. Excellent operations performance has been a key factor in the overall success of EOS. The EOS Program was conceived in the 1980s and began to take shape in the early 1990s. EOS consists of a series of satellites that study the Earth as an interrelated system. It began with the launch of Terra in December 1999, followed by Aqua in May 2002 and Aura in July 2004. A key EOS goal is to provide a long-term continuous data set to enable the science community to develop a better understanding of land, ocean, and atmospheric processes and their interactions. EOS has produced unprecedented amounts of data which are used all over the world free of charge. Mission operations have resulted in data recovery for Terra, Aqua, and Aura that has consistently exceeded mission requirements. The paper describes the ground systems and organizations that control the EOS satellites, capture the raw data, and distribute the processed science data sets. The paper further describes how operations have evolved since 1999. Examples of this evolution include (a) the implementation of new mission safety requirements for orbital debris monitoring; (b) technology upgrades to keep facilities at the state of the art; (c) enhancements to meet changing security requirements; and (d) operations management of the two international Earth Observing constellations of 11 satellites known as the "Morning Constellation" and the "A-Train". The paper concludes with a view into the future based on the latest spacecraft status, lifetime projections, and mission plans.
InSAR data for monitoring land subsidence: time to think big
NASA Astrophysics Data System (ADS)
Ferretti, A.; Colombo, D.; Fumagalli, A.; Novali, F.; Rucci, A.
2015-11-01
Satellite interferometric synthetic aperture radar (InSAR) data have proven effective and valuable in the analysis of urban subsidence phenomena based on multi-temporal radar images. Results obtained by processing data acquired by different radar sensors have shown the potential of InSAR and highlighted the key points for an operational use of this technology, namely: (1) regular acquisition over large areas of interferometric data stacks; (2) use of advanced processing algorithms, capable of estimating and removing atmospheric disturbances; (3) access to significant processing power for a regular update of the information over large areas. In this paper, we show how the operational potential of InSAR has been realized thanks to the recent advances in InSAR processing algorithms, the advent of cloud computing, and the launch of new satellite platforms specifically designed for InSAR analyses (e.g. Sentinel-1A operated by ESA and ALOS-2 operated by JAXA). The processing of thousands of SAR scenes to cover an entire nation has been performed successfully in Italy in a project financed by the Italian Ministry of the Environment. The challenge for the future is to move from the historical analysis of SAR scenes already acquired in digital archives to a near real-time monitoring program where up-to-date deformation data are routinely provided to final users and decision makers.
Cruse, Damian; Wilding, Edward L
2011-06-01
In a pair of recent studies, frontally distributed event-related potential (ERP) indices of two distinct post-retrieval processes were identified. It has been proposed that one of these processes operates over any kinds of task relevant information in service of task demands, while the other operates selectively over recovered contextual (episodic) information. The experiment described here was designed to test this account, by requiring retrieval of different kinds of contextual information to that required in previous relevant studies. Participants heard words spoken in either a male or female voice at study and ERPs were acquired at test where all words were presented visually. Half of the test words had been spoken at study. Participants first made an old/new judgment, distinguishing via key press between studied and unstudied words. For words judged 'old', participants indicated the voice in which the word had been spoken at study, and their confidence (high/low) in the voice judgment. There was evidence for only one of the two frontal old/new effects that had been identified in the previous studies. One possibility is that the ERP effect in previous studies that was tied specifically to recollection reflects processes operating over only some kinds of contextual information. An alternative is that the index reflects processes that are engaged primarily when there are few contextual features that distinguish between studied stimuli. Copyright © 2011 Elsevier Ltd. All rights reserved.
Comprehension of Spacecraft Telemetry Using Hierarchical Specifications of Behavior
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Joshi, Rajeev
2014-01-01
A key challenge in operating remote spacecraft is that ground operators must rely on the limited visibility available through spacecraft telemetry in order to assess spacecraft health and operational status. We describe a tool for processing spacecraft telemetry that allows ground operators to impose structure on received telemetry in order to achieve a better comprehension of system state. A key element of our approach is the design of a domain-specific language that allows operators to express models of expected system behavior using partial specifications. The language allows behavior specifications with data fields, similar to other recent runtime verification systems. What is notable about our approach is the ability to develop hierarchical specifications of behavior. The language is implemented as an internal DSL in the Scala programming language that synthesizes rules from patterns of specification behavior. The rules are automatically applied to received telemetry and the inferred behaviors are available to ground operators using a visualization interface that makes it easier to understand and track spacecraft state. We describe initial results from applying our tool to telemetry received from the Curiosity rover currently roving the surface of Mars, where the visualizations are being used to trend subsystem behaviors, in order to identify potential problems before they happen. However, the technology is completely general and can be applied to any system that generates telemetry such as event logs.
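The rule-synthesis idea can be sketched with a toy sequence rule evaluated over a telemetry stream. The event names and the rule format below are invented for illustration; the actual tool is an internal DSL in Scala with richer patterns and data fields:

```python
# Sketch: a rule synthesized from a partial behavior specification watches a
# telemetry stream for an expected event sequence, ignoring unrelated events,
# and reports when the expected behavior has completed.

class SequenceRule:
    """Expect `events` to occur in order; other events are ignored."""
    def __init__(self, name, events):
        self.name, self.events, self.pos = name, events, 0

    def feed(self, event):
        if self.pos < len(self.events) and event == self.events[self.pos]:
            self.pos += 1
        return self.pos == len(self.events)   # True once behavior completed

rule = SequenceRule("downlink", ["CMD_DISPATCH", "RADIO_ON", "DATA_SENT", "RADIO_OFF"])
telemetry = ["BOOT", "CMD_DISPATCH", "HEARTBEAT", "RADIO_ON", "DATA_SENT", "RADIO_OFF"]
done = [rule.feed(e) for e in telemetry]
print(done[-1])
```

A hierarchy arises when completed behaviors like this one become events fed into higher-level rules, which is the layering the paper's DSL supports.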
Towards a comprehensive greenhouse gas emissions inventory for biosolids.
Alvarez-Gaitan, J P; Short, Michael D; Lundie, Sven; Stuetz, Richard
2016-06-01
Effective handling and treatment of the solids fraction from advanced wastewater treatment operations carries a substantial burden for water utilities relative to the total economic and environmental impacts from modern day wastewater treatment. While good process-level data for a range of wastewater treatment operations are becoming more readily available, there remains a dearth of high quality operational data for solids line processes in particular. This study seeks to address this data gap by presenting a suite of high quality, process-level life cycle inventory data covering a range of solids line wastewater treatment processes, extending from primary treatment through to biosolids reuse in agriculture. Within the study, the impacts of secondary treatment technology and key parameters such as sludge retention time, activated sludge age and primary-to-waste activated sludge ratio (PS:WAS) on the life cycle inventory data of solids processing trains for five model wastewater treatment plant configurations are presented. BioWin(®) models are calibrated with real operational plant data and estimated electricity consumption values were reconciled against overall plant energy consumption. The concept of "representative crop" is also introduced in order to reduce the uncertainty associated with nitrous oxide emissions and soil carbon sequestration offsets under biosolids land application scenarios. Results indicate that both the treatment plant biogas electricity offset and the soil carbon sequestration offset from land-applied biosolids, represent the main greenhouse gas mitigation opportunities. In contrast, fertiliser offsets are of relatively minor importance in terms of the overall life cycle emissions impacts. 
Results also show that fugitive methane emissions at the plant, as well as nitrous oxide emissions both at the plant and following agricultural application of biosolids, are significant contributors to the overall greenhouse gas balance and combined are higher than emissions associated with transportation. Sensitivity analyses for key parameters including digester PS:WAS and sludge retention time, and assumed biosolids nitrogen content and agricultural availability also provide additional robustness and comprehensiveness to our inventory data and will facilitate more customised user analyses. Copyright © 2016 Elsevier Ltd. All rights reserved.
Mashup Model and Verification Using Mashup Processing Network
NASA Astrophysics Data System (ADS)
Zahoor, Ehtesham; Perrin, Olivier; Godart, Claude
Mashups are lightweight Web applications that aggregate data from different Web services, built using ad-hoc composition and not concerned with long-term stability and robustness. In this paper we present a pattern-based approach, called the Mashup Processing Network (MPN). The idea is based on the Event Processing Network and is intended to facilitate the creation, modeling, and verification of mashups. MPN provides a view of how different actors interact in mashup development, namely the producer, the consumer, the mashup processing agent, and the communication channels. It also supports modeling transformations and validations of data, and offers validation of both functional and non-functional requirements, such as reliable messaging and security, that are key issues within the enterprise context. We have enriched the model with a set of processing operations, categorized into data composition, transformation, and validation. These processing operations can be seen as a set of patterns for facilitating the mashup development process. MPN also paves the way for realizing a Mashup Oriented Architecture, where mashups along with services are used as building blocks for application development.
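The three operation categories can be pictured as a minimal producer-to-consumer pipeline. The operation names and the data shape are illustrative assumptions for the sketch, not the paper's formal model:

```python
# Sketch of the MPN processing categories chained between producer and consumer.

def compose(feeds):
    """Composition: aggregate items from several source feeds."""
    return [item for feed in feeds for item in feed]

def transform(items):
    """Transformation: normalize each item's representation."""
    return [{"title": i["title"].strip().title()} for i in items]

def validate(items):
    """Validation: enforce a requirement (here: no empty titles)."""
    return [i for i in items if i["title"]]

feed_a = [{"title": "  quantum key distillation "}]
feed_b = [{"title": "aerobic digestion"}, {"title": ""}]
result = validate(transform(compose([feed_a, feed_b])))
print([i["title"] for i in result])
```

In MPN terms, each function plays the role of a processing agent and the Python call chain stands in for the communication channels.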
Information gathering, management and transferring for geospatial intelligence
NASA Astrophysics Data System (ADS)
Nunes, Paulo; Correia, Anacleto; Teodoro, M. Filomena
2017-07-01
Information is a key asset in modern organizational operations. The success of joint and combined operations with partner organizations depends on an accurate flow of information and knowledge concerning the theatre of operations: provision of resources, evolution of the environment, location of markets, and where and when events occurred. Now as in the past, modern operations cannot be conceived without maps and geo-spatial information (GI). Information and knowledge management is fundamental to the success of organizational decisions in an uncertain environment. Georeferenced information management is a knowledge management process: it begins with raw data and ends with generated knowledge. GI and intelligence systems allow us to integrate all other forms of intelligence and can be a main platform to process and display geo-spatial, time-referenced events. Combining explicit knowledge with people's know-how to generate a continuous learning cycle that supports real-time decisions mitigates the fog of everyday competition and provides knowledge supremacy. Extending the preliminary analysis done in [1], this work applies exploratory factor analysis to a questionnaire about GI and intelligence management in an organization, allowing the identification of future lines of action to improve information sharing and the exploitation of the full potential of this important resource.
Development of an operational manual for a consultation-liaison psychiatry service.
Wand, Anne Pf; Sharma, Swapnil; Carpenter, Lindsay J; Gatsi, Mike
2018-02-01
Consultation-liaison psychiatry (CLP) services sit between mental health and the general hospital, and risk being poorly understood by both systems. The aim of this study was to develop an operational manual for a CLP service, which defined functions and governance. The CLP literature was reviewed with a focus on descriptions of CLP roles, organisational processes, quality measures and service development. The CLP team held service planning meetings and met with members of the mental health and hospital executives. Site visits and collaboration with other CLP services occurred in defining the roles of the CLP service and organisational governance. A CLP operational document was developed, including a description of the service, its functions, staff roles and governance. Procedural information such as the CLP timetable, referral process, triage and assessment, documentation, activity recording, quality assurance and relevant policies were outlined. The development of a dedicated operational manual for CLP clarified the roles, functions and governance of CLP within the general hospital and mental health systems. The development process facilitated the engagement of key clinicians and administrators of these systems, the determination of quality improvement targets and greater transparency and accountability.
Enzyme reactor design under thermal inactivation.
Illanes, Andrés; Wilson, Lorena
2003-01-01
Temperature is a very relevant variable for any bioprocess. Temperature optimization of bioreactor operation is a key aspect for process economics. This is especially true for enzyme-catalyzed processes, because enzymes are complex, unstable catalysts whose technological potential relies on their operational stability. Enzyme reactor design is presented with a special emphasis on the effect of thermal inactivation. Enzyme thermal inactivation is a very complex process from a mechanistic point of view. However, for the purpose of enzyme reactor design, it has been oversimplified frequently, considering one-stage first-order kinetics of inactivation and data gathered under nonreactive conditions that poorly represent the actual conditions within the reactor. More complex mechanisms are frequent, especially in the case of immobilized enzymes, and most important is the effect of catalytic modulators (substrates and products) on enzyme stability under operation conditions. This review focuses primarily on reactor design and operation under modulated thermal inactivation. It also presents a scheme for bioreactor temperature optimization, based on validated temperature-explicit functions for all the kinetic and inactivation parameters involved. More conventional enzyme reactor design is presented merely as a background for the purpose of highlighting the need for a deeper insight into enzyme inactivation for proper bioreactor design.
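The "oversimplified" design model the review critiques, one-stage first-order inactivation with an Arrhenius rate constant, can be written down directly: e(t) = e0 exp(-kd(T) t). The pre-exponential factor and activation energy below are illustrative assumptions, not values from the paper:

```python
# Sketch of first-order enzyme thermal inactivation with Arrhenius
# temperature dependence; real reactor design, as the review argues, must
# account for modulated (substrate/product-dependent) inactivation.
import math

R = 8.314  # gas constant, J/(mol K)

def kd(temp_k, a=1.0e12, ea=1.0e5):
    """Inactivation rate constant (1/h) via Arrhenius (assumed A and Ea)."""
    return a * math.exp(-ea / (R * temp_k))

def residual_activity(temp_k, hours):
    """Fraction of initial enzyme activity left after `hours` of operation."""
    return math.exp(-kd(temp_k) * hours)

# higher temperature -> faster inactivation -> less activity remaining,
# which is the trade-off against faster catalysis that drives optimization
print(residual_activity(323.15, 10) > residual_activity(333.15, 10))
```

Temperature optimization then balances this activity loss against the Arrhenius speed-up of the catalyzed reaction itself.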
The advanced linked extended reconnaissance and targeting technology demonstration project
NASA Astrophysics Data System (ADS)
Cruickshank, James; de Villers, Yves; Maheux, Jean; Edwards, Mark; Gains, David; Rea, Terry; Banbury, Simon; Gauthier, Michelle
2007-06-01
The Advanced Linked Extended Reconnaissance & Targeting (ALERT) Technology Demonstration (TD) project is addressing key operational needs of the future Canadian Army's Surveillance and Reconnaissance forces by fusing multi-sensor and tactical data, developing automated processes, and integrating beyond line-of-sight sensing. We discuss concepts for displaying and fusing multi-sensor and tactical data within an Enhanced Operator Control Station (EOCS). The sensor data can originate from the Coyote's own visible-band and IR cameras, laser rangefinder, and ground-surveillance radar, as well as beyond line-of-sight systems such as a mini-UAV and unattended ground sensors. The authors address technical issues associated with the use of fully digital IR and day video cameras and discuss video-rate image processing developed to assist the operator to recognize poorly visible targets. Automatic target detection and recognition algorithms processing both IR and visible-band images have been investigated to draw the operator's attention to possible targets. The machine generated information display requirements are presented with the human factors engineering aspects of the user interface in this complex environment, with a view to establishing user trust in the automation. The paper concludes with a summary of achievements to date and steps to project completion.
Nitrogen cycling models and their application to forest harvesting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, D.W.; Dale, V.H.
1986-01-01
The characterization of forest nitrogen- (N-) cycling processes by several N-cycling models (FORCYTE, NITCOMP, FORTNITE, and LINKAGES) is briefly reviewed and evaluated against current knowledge of N cycling in forests. Some important processes (e.g., translocation within trees, N dynamics in decaying leaf litter) appear to be well characterized, whereas others (e.g., N mineralization from soil organic matter, N fixation, N dynamics in decaying wood, nitrification, and nitrate leaching) are poorly characterized, primarily because of a lack of knowledge rather than an oversight by model developers. It is remarkable how well the forest models do work in the absence of data on some key processes. For those systems in which the poorly understood processes could cause major changes in N availability or productivity, the accuracy of model predictions should be examined. However, the development of N-cycling models represents a major step beyond the much simpler, classic conceptual models of forest nutrient cycling developed by early investigators. The new generation of computer models will surely improve as research reveals how key nutrient-cycling processes operate.
A three-level support method for smooth switching of the micro-grid operation mode
NASA Astrophysics Data System (ADS)
Zong, Yuanyang; Gong, Dongliang; Zhang, Jianzhou; Liu, Bin; Wang, Yun
2018-01-01
Smooth switching of a micro-grid between the grid-connected operation mode and the off-grid operation mode is one of the key technologies for ensuring that it runs flexibly and efficiently. The basic control strategy and the switching principle of the micro-grid are analyzed in this paper. The causes of voltage and frequency fluctuations during the switching process are analyzed from the perspectives of power balance and control strategy, and the operation-mode switching strategy is improved accordingly. From three aspects, the tracking of the controller's current inner-loop reference signal, the optimization of the voltage outer-loop control strategy, and micro-grid energy-balance management, a three-level strategy for smooth switching of the micro-grid operation mode is proposed. Finally, simulation shows that the proposed control strategy makes the switching process smooth and stable and effectively reduces the voltage and frequency fluctuations.
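A minimal sketch of the first of the three levels, slewing the off-grid voltage reference toward the measured grid voltage so the mode switch produces no step change, might look as follows. The gains, tolerances, and controller structure here are illustrative assumptions, not the paper's actual design:

```python
import math

def presync_reference(grid_v_amp, grid_phase, ref_v_amp, ref_phase,
                      k_amp=0.05, k_phase=0.05):
    """One pre-synchronization step: move the off-grid voltage reference
    toward the measured grid voltage before switching modes.
    Gains and structure are illustrative assumptions."""
    ref_v_amp += k_amp * (grid_v_amp - ref_v_amp)
    # wrap the phase error into [-pi, pi) before correcting
    err = (grid_phase - ref_phase + math.pi) % (2.0 * math.pi) - math.pi
    ref_phase += k_phase * err
    return ref_v_amp, ref_phase

# iterate until amplitude and phase are inside a switching tolerance
amp, ph = 300.0, 0.5          # current off-grid reference (V peak, rad)
for _ in range(200):
    amp, ph = presync_reference(311.0, 0.0, amp, ph)
```

Once both errors are inside the tolerance, the switch can be issued without the voltage jump the abstract describes.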
Using AUTORAD for Cassini File Uplinks: Incorporating Automated Commanding into Mission Operations
NASA Technical Reports Server (NTRS)
Goo, Sherwin
2014-01-01
As the Cassini spacecraft embarked on the Solstice Mission in October 2010, the flight operations team faced a significant challenge in planning and executing the continuing tour of the Saturnian system. Faced with budget cuts that reduced the science and engineering staff by over a third, new and streamlined processes had to be developed to allow the Cassini mission to maintain a high level of science data return with fewer available resources while still minimizing risk. Automation was deemed an important key to enabling mission operations with a reduced workforce, and the Cassini flight team made this goal a priority for the Solstice Mission. The operations team adopted a utility called AUTORAD, which gives the flight operations team the ability to program selected command files for radiation up to seven days in advance and helps minimize the need for off-shift support that could deplete available staffing during the prime shift hours. This paper will describe how AUTORAD is being utilized by the Cassini flight operations team and the processes that were developed or modified to ensure that proper oversight and verification are maintained in the generation and execution of radiated command files.
Zhang, Hang; Xu, Qingyan; Liu, Baicheng
2014-01-01
The rapid development of numerical modeling techniques has led to more accurate results in modeling metal solidification processes. In this study, the cellular automaton-finite difference (CA-FD) method was used to simulate the directional solidification (DS) process of single crystal (SX) superalloy blade samples. Experiments were carried out to validate the simulation results. Meanwhile, an intelligent model based on fuzzy control theory was built to optimize the complicated DS process. Several key parameters, such as the mushy zone width and the temperature difference at the cast-mold interface, were chosen as the input variables. The input variables were passed through the multivariable fuzzy rule to obtain the output adjustment of the withdrawal rate (v), a key technological parameter. The multivariable fuzzy rule was built based on structural features of the casting (such as the section area), the delay time of the temperature-change response to changes in v, and the professional experience of the operator. The fuzzy control model, coupled with the CA-FD method, could then be used to optimize v in real time during the manufacturing process. The optimized process proved to be more flexible and adaptive, yielding a steady, stray-grain-free DS process. PMID:28788535
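The fuzzy mapping from process observables to a withdrawal-rate correction can be sketched roughly as below. The membership breakpoints, rule set, and output consequents are invented for the sketch and are not the values used in the study:

```python
def ramp(x, lo, hi):
    """Linear membership degree rising from 0 at lo to 1 at hi."""
    return min(1.0, max(0.0, (x - lo) / (hi - lo)))

def adjust_withdrawal_rate(mushy_width_mm, interface_dT_K):
    """Illustrative two-input fuzzy rule returning a withdrawal-rate
    correction dv (mm/min). Breakpoints and consequents are assumptions."""
    wide = ramp(mushy_width_mm, 10.0, 20.0)  # degree of "mushy zone too wide"
    hot = ramp(interface_dT_K, 30.0, 80.0)   # degree of "interface dT too large"
    slow = max(wide, hot)                    # rule 1: either symptom -> reduce v
    fast = min(1.0 - wide, 1.0 - hot)        # rule 2: both healthy -> increase v
    num = slow * (-1.0) + fast * (+0.5)      # weighted dv consequents (mm/min)
    den = slow + fast
    return num / den if den else 0.0
```

A real-time loop would call this each control step and apply the returned dv, exactly the closed loop the abstract couples to the CA-FD simulation.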
Beyond business process redesign: redefining Baxter's business network.
Short, J E; Venkatraman, N
1992-01-01
Business process redesign has focused almost exclusively on improving the firm's internal operations. Although internal efficiency and effectiveness are important objectives, the authors argue that business network redesign--reconceptualizing the role of the firm and its key business processes in the larger business network--is of greater strategic importance. To support their argument, they analyze the evolution of Baxter's ASAP system, one of the most publicized but inadequately understood strategic information systems of the 1980s. They conclude by examining whether ASAP's early successes have positioned the firm well for the changing hospital supplies marketplace of the 1990s.
Study on the Preliminary Design of ARGO-M Operation System
NASA Astrophysics Data System (ADS)
Seo, Yoon-Kyung; Lim, Hyung-Chul; Rew, Dong-Young; Jo, Jung Hyun; Park, Jong-Uk; Park, Eun-Seo; Park, Jang-Hyun
2010-12-01
Korea Astronomy and Space Science Institute has been developing a mobile satellite laser ranging system named the accurate ranging system for geodetic observation-mobile (ARGO-M). The preliminary design of the ARGO-M operation system (AOS), one of the ARGO-M subsystems, was completed in 2009. The preliminary design results are applied in the following development phase by performing the detailed design, with analysis of the pre-defined requirements and of the derived specifications. This paper addresses the preliminary design of the whole AOS. The design results for the operation and control part, a key part of the operation system, are described in detail. Analysis results for the interface between the operation-supporting hardware and the control computer, which are necessary for defining the requirements of the operation-supporting hardware, are summarized. The results of this study are expected to be used in the critical design phase to finalize the design process.
NASA Technical Reports Server (NTRS)
Campbell, R. H.; Essick, R. B.; Grass, J.; Johnston, G.; Kenny, K.; Russo, V.
1986-01-01
The EOS project is investigating the design and construction of a family of real-time distributed embedded operating systems for reliable, distributed aerospace applications. Using the real-time programming techniques developed in co-operation with NASA in earlier research, the project staff is building a kernel for a multiple processor networked system. The first six months of the grant included a study of scheduling in an object-oriented system, the design philosophy of the kernel, and the architectural overview of the operating system. In this report, the operating system and kernel concepts are described. An environment for the experiments has been built and several of the key concepts of the system have been prototyped. The kernel and operating system is intended to support future experimental studies in multiprocessing, load-balancing, routing, software fault-tolerance, distributed data base design, and real-time processing.
International Ultraviolet Explorer Observatory operations
NASA Technical Reports Server (NTRS)
1985-01-01
This volume contains the final report for the International Ultraviolet Explorer (IUE) Observatory Operations contract. The fundamental operational objective of the IUE program is to translate competitively selected observing programs into IUE observations, to reduce these observations into meaningful scientific data, and then to present these data to the Guest Observer in a form amenable to the pursuit of scientific research. The IUE Observatory is the key to this objective since it is the central control and support facility for all science operations functions within the IUE Project. In carrying out the operation of this facility, a number of complex functions were provided, beginning with telescope scheduling and operation, proceeding to data processing, and ending with data distribution and scientific data analysis. In support of these critical-path functions, a number of other significant activities were also provided, including scientific instrument calibration, systems analysis, and software support. Routine activities have been summarized briefly whenever possible.
Gilna, Paul; Lynd, Lee R.; Mohnen, Debra; ...
2017-11-30
The DOE BioEnergy Science Center has operated as a virtual center with multiple partners for a decade targeting overcoming biomass recalcitrance. BESC has redefined biomass recalcitrance from an observable phenotype to a better understood and manipulatable fundamental and operational property. These manipulations are then the result of deeper biological understanding and can be combined with other advanced biotechnology improvements in biomass conversion to improve bioenergy processes and markets. This article provides an overview of key accomplishments in overcoming recalcitrance via better plants, better microbes, and better tools and combinations. Finally, we present a perspective on the aspects of successful center operation.
β-Decay Studies of r-Process Nuclei Using the Advanced Implantation Detector Array (AIDA)
NASA Astrophysics Data System (ADS)
Griffin, C. J.; Davinson, T.; Estrade, A.; Braga, D.; Burrows, I.; Coleman-Smith, P. J.; Grahn, T.; Grant, A.; Harkness-Brennan, L. J.; Kiss, G.; Kogimtzis, M.; Lazarus, I. H.; Letts, S. C.; Liu, Z.; Lorusso, G.; Matsui, K.; Nishimura, S.; Page, R. D.; Prydderch, M.; Phong, V. H.; Pucknell, V. F. E.; Rinta-Antila, S.; Roberts, O. J.; Seddon, D. A.; Simpson, J.; Thomas, S. L.; Woods, P. J.
Thought to produce around half of all isotopes heavier than iron, the r-process is a key mechanism for nucleosynthesis. However, a complete description of the r-process is still lacking and many unknowns remain. Experimental determination of β-decay half-lives and β-delayed neutron emission probabilities along the r-process path would help to facilitate a greater understanding of this process. The Advanced Implantation Detector Array (AIDA) represents the latest generation of silicon implantation detectors for β-decay studies with fast radioactive ion beams. Preliminary results from commissioning experiments demonstrate successful operation of AIDA, and analysis of the data obtained during the first official AIDA experiments is now underway.
NASA Operational Environment Team (NOET) - NASA's key to environmental technology
NASA Technical Reports Server (NTRS)
Cook, Beth
1993-01-01
NOET is a NASA-wide team which supports the research and development community by sharing information both in person and via a computerized network, assisting in specification and standard revisions, developing cleaner propulsion systems, and exploring environmentally compliant alternatives to current processes. NOET's structure, dissemination of materials, electronic information, EPA compliance, specifications and standards, and environmental research and development are discussed.
Kevin M. Potter
2018-01-01
As a pervasive disturbance agent operating at many spatial and temporal scales, wildland fire is a key abiotic factor affecting forest health both positively and negatively. In some ecosystems, for example, wildland fires have been essential for regulating processes that maintain forest health (Lundquist and others 2011). Wildland fire is an important ecological...
The Internal Medicine of the 21st century: Organizational and operational standards.
Casariego-Vales, E; Zapatero-Gaviria, A; Elola-Somoza, F J
2017-12-01
The Spanish Society of Internal Medicine has developed a consensus document on the standards and recommendations that they consider essential to the organisation of internal medicine units for conducting their activities efficiently and with high quality. We defined 3 groups of key processes: the care of acutely ill adult patients, the comprehensive care of complex chronic patients and the examination of a patient with a difficult diagnosis and no organ-specific disease. As support processes, we identified the structure and operation of the Internal Medicine units. As strategic processes, we identified training and research. The main subprocesses are structured below, and we established the standards and recommendations for each of them. Lastly, we proposed resulting workloads. The prepared standards must be reviewed within a maximum of 4 years. Copyright © 2017 Elsevier España, S.L.U. and Sociedad Española de Medicina Interna (SEMI). All rights reserved.
Impact of Operating Context on the Use of Structure in Air Traffic Controller Cognitive Processes
NASA Technical Reports Server (NTRS)
Davison, Hayley J.; Histon, Jonathan M.; Ragnarsdottir, Margret Dora; Major, Laura M.; Hansman, R. John
2004-01-01
This paper investigates the influence of structure on air traffic controllers' cognitive processes in the TRACON, En Route, and Oceanic environments. Radar data and voice command analyses were conducted to support hypotheses generated through observations and interviews conducted at the various facilities. Three general types of structure-based abstractions (standard flows, groupings, and critical points) have been identified as being used in each context, though the details of their application varied in accordance with the constraints of the particular operational environment. Projection emerged as a key cognitive process aided by the structure-based abstractions, and there appears to be a significant difference between how time-based and spatial-based projection are performed by controllers. It is recommended that consideration be given to the value the structure-based abstractions provide to the controller, and that consistency be maintained between the type (time or spatial) of information support provided to the controller.
The Standard Autonomous File Server, A Customized, Off-the-Shelf Success Story
NASA Technical Reports Server (NTRS)
Semancik, Susan K.; Conger, Annette M.; Obenschain, Arthur F. (Technical Monitor)
2001-01-01
The Standard Autonomous File Server (SAFS), which includes both off-the-shelf hardware and software, uses an improved automated file transfer process to provide quicker, more reliable, prioritized file distribution for customers of near real-time data without interfering with the assets involved in the acquisition and processing of the data. It operates as a stand-alone solution, monitoring itself and providing an automated fail-over process to enhance reliability. This paper describes the unique problems and lessons learned both during the COTS selection and integration into SAFS and during the system's first year of operation in support of NASA's satellite ground network. COTS was the key factor in allowing the two-person development team to deploy systems in less than a year, meeting the required launch schedule. The SAFS system has been so successful that it is becoming a NASA standard resource, leading to its nomination for NASA's Software of the Year Award in 1999.
STS-114: Discovery Tanking Operations for Launch
NASA Technical Reports Server (NTRS)
2005-01-01
Jessica Rye from NASA Public Affairs is the narrator for the tanking operations for the launch of the Space Shuttle Discovery. She presents a video of the arrival and processing of the new external tank at the Kennedy Space Center. The external tank is also shown entering the Vehicle Assembly Building (VAB). The external tank underwent new processing resulting from its redesign including inspection of the bipod heater and the external separation camera. The changes to the external tank include: 1) Electric heaters to protect from icing; and 2) Liquid Oxygen feed line bellows to carry fuel from the external tank to the Orbiter. Footage of the external tank processing facility at NASA's Michoud Assembly Facility in New Orleans, La. prior to its arrival at Kennedy Space Center is shown and a video of the three key modifications to the external tank including the bipod, flange and bellows are shown.
NASA Astrophysics Data System (ADS)
Tobiska, W.; Knipp, D. J.; Burke, W. J.; Bouwer, D.; Bailey, J. J.; Hagan, M. P.; Didkovsky, L. V.; Garrett, H. B.; Bowman, B. R.; Gannon, J. L.; Atwell, W.; Blake, J. B.; Crain, W.; Rice, D.; Schunk, R. W.; Fulgham, J.; Bell, D.; Gersey, B.; Wilkins, R.; Fuschino, R.; Flynn, C.; Cecil, K.; Mertens, C. J.; Xu, X.; Crowley, G.; Reynolds, A.; Azeem, S. I.; Wiley, S.; Holland, M.; Malone, K.
2013-12-01
Space weather's effects upon the near-Earth environment are due to dynamic changes in the energy transfer processes from the Sun's photons, particles, and fields. Of the space environment domains that are affected by space weather, the magnetosphere, thermosphere, and even troposphere are key regions that are affected. Space Environment Technologies (SET) has developed and is producing innovative space weather applications. Key operational systems for providing timely information about the effects of space weather on these domains are SET's Magnetosphere Alert and Prediction System (MAPS), LEO Alert and Prediction System (LAPS), and Automated Radiation Measurements for Aviation Safety (ARMAS) system. MAPS provides a forecast Dst index out to 6 days through the data-driven, redundant data stream Anemomilos algorithm. Anemomilos uses observational proxies for the magnitude, location, and velocity of solar ejecta events. This forecast index is used by satellite operations to characterize upcoming geomagnetic storms, for example. LAPS is the SET fully redundant operational system providing recent history, current epoch, and forecast solar and geomagnetic indices for use in operational versions of the JB2008 thermospheric density model. The thermospheric densities produced by that system, driven by the LAPS data, are forecast to 72-hours to provide the global mass densities for satellite operators. ARMAS is a project that has successfully demonstrated the operation of a micro dosimeter on aircraft to capture the real-time radiation environment due to Galactic Cosmic Rays and Solar Energetic Particles. The dose and dose-rates are captured on aircraft, downlinked in real-time via the Iridium satellites, processed on the ground, incorporated into the most recent NAIRAS global radiation climatology data runs, and made available to end users via the web and smart phone apps. ARMAS provides the 'weather' of the radiation environment to improve air-crew and passenger safety. 
Many of the data products from MAPS, LAPS, and ARMAS are available on the SpaceWx smartphone app for iPhone, iPad, iPod, and Android professional users and public space weather education. We describe recent forecasting advances for moving the space weather information from these automated systems into operational, derivative products for communications, aviation, and satellite operations uses.
Information systems and human error in the lab.
Bissell, Michael G
2004-01-01
Health system costs in clinical laboratories are incurred daily due to human error. Indeed, a major impetus for automating clinical laboratories has always been the opportunity automation presents to simultaneously reduce cost and improve quality of operations by decreasing human error. But merely automating these processes is not enough: to the extent that the introduction of these systems results in operators having less practice in dealing with unexpected events or becoming deskilled in problem-solving, new kinds of error will likely appear. Clinical laboratories could potentially benefit by integrating findings on human error from modern behavioral science into their operations. Fully understanding human error requires a deep understanding of human information processing and cognition. Predicting and preventing negative consequences requires application of this understanding to laboratory operations. Although the occurrence of a particular error at a particular instant cannot be absolutely prevented, human error rates can be reduced. The following principles are key: an understanding of the process of learning in relation to error; understanding the origin of errors, since this knowledge can be used to reduce their occurrence; optimal systems should be forgiving to the operator by absorbing errors, at least for a time; although much is known by industrial psychologists about how to write operating procedures and instructions in ways that reduce the probability of error, this expertise is hardly ever put to use in the laboratory; and a feedback mechanism must be designed into the system that enables the operator to recognize in real time that an error has occurred.
NASA Astrophysics Data System (ADS)
Becker, T.; König, G.
2015-10-01
Cartographic visualizations of crises are used to create a Common Operational Picture (COP) and enforce Situational Awareness by presenting relevant information to the involved actors. As nearly all crises affect geospatial entities, geo-data representations have to support location-specific analysis throughout the decision-making process. Meaningful cartographic presentation is needed for coordinating the activities of crisis managers in a highly dynamic situation, since operators' attention span and their spatial memories are limiting factors during the perception and interpretation process. Situational Awareness of operators, in conjunction with a COP, is a key aspect of the decision-making process and essential for making well thought-out and appropriate decisions. Although utility networks are among the most complex and most frequently required systems in the urban environment, meaningful cartographic presentations of multiple utility networks with respect to disaster management do not exist. Therefore, an optimized visualization of utility infrastructure for emergency response procedures is proposed. The article describes a conceptual approach for simplifying, aggregating, and visualizing multiple utility networks and their components to meet the requirements of the decision-making process and to support Situational Awareness.
NASA Astrophysics Data System (ADS)
Wang, Hongfeng; Fu, Yaping; Huang, Min; Wang, Junwei
2016-03-01
Operation process design is one of the key issues in the manufacturing and service sectors. As a typical operation process, scheduling with consideration of a deteriorating effect has been widely studied; however, the current literature has addressed only a single function requirement and has rarely considered the multiple function requirements that are critical for a real-world scheduling process. In this article, two function requirements are involved in the design of a scheduling process with consideration of the deteriorating effect and are formulated as the two objectives of a mathematical programming model. A novel multiobjective evolutionary algorithm is proposed to solve this model, combining three strategies: a multiple-population scheme, a rule-based local search method, and an elitist preserve strategy. To validate the proposed model and algorithm, a series of randomly generated instances are tested, and the experimental results indicate that the model is effective and the proposed algorithm achieves satisfactory performance, outperforming other state-of-the-art multiobjective evolutionary algorithms, such as nondominated sorting genetic algorithm II and the multiobjective evolutionary algorithm based on decomposition, on all the test instances.
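Two core ingredients of such a formulation, a completion-time model with a deteriorating effect and Pareto dominance over two objectives, can be sketched as follows. The linear deterioration model, the job data, and the choice of second objective are illustrative assumptions; the paper's actual method is a full multiobjective evolutionary algorithm:

```python
from itertools import permutations

def completion_times(order, base, alpha=0.05):
    """Schedule jobs in the given order with a linear deteriorating effect:
    actual processing time = base * (1 + alpha * start_time)."""
    t, done = 0.0, {}
    for j in order:
        t += base[j] * (1.0 + alpha * t)
        done[j] = t
    return done

def dominates(f, g):
    """Pareto dominance when both objectives are minimized."""
    return all(a <= b for a, b in zip(f, g)) and any(a < b for a, b in zip(f, g))

base = [4.0, 2.0, 6.0, 3.0]   # invented base processing times

def objectives(order):
    c = completion_times(order, base)
    return (max(c.values()), c[0])  # makespan and completion time of job 0

# brute-force the Pareto front for this tiny instance; an evolutionary
# algorithm replaces this enumeration for realistic problem sizes
orders = list(permutations(range(len(base))))
front = [o for o in orders
         if not any(dominates(objectives(p), objectives(o)) for p in orders)]
```

An algorithm like NSGA-II keeps exactly such a nondominated set as its elite population instead of enumerating all orders.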
Development of an alternate pathway for materials destined for disposition to WIPP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ayers, Georgette Y; Mckerley, Bill; Veazey, Gerald W
2010-01-01
The Los Alamos National Laboratory currently has an inventory of process residues that may be viable candidates for disposition to the Waste Isolation Pilot Plant (WIPP) located at Carlsbad, New Mexico. A recent 'Attractiveness Level D' exemption allows for the discard of specified intractable materials regardless of the percent plutonium. However, the limits with respect to drum loadings must be met. Cementation is a key component of the aqueous nitrate flowsheet and serves as a 'bleed-off' stream for impurities separated from the plutonium during processing operations. The main 'feed' to the cementation operations are the 'bottoms' from the evaporation process. In the majority of cases, the cemented bottoms contain less than the allowed amount per drum for WIPP acceptance. This project would expand the route to WIPP for items that have no defined disposition path, are difficult to process, have been through multiple passes, have no current recovery operations available to recover the plutonium, and are amenable to cementation. This initial work will provide the foundation for a full-scale disposition pathway of the candidate materials. Once the pathway has been expanded and a cementation matrix developed, routine discard activities will be initiated.
NASA Astrophysics Data System (ADS)
Kalluri, S. N.; Haman, B.; Vititoe, D.
2014-12-01
The ground system under development for the Geostationary Operational Environmental Satellite-R (GOES-R) series of weather satellites has completed a key milestone in implementing the science algorithms that process raw sensor data into higher-level products in preparation for launch. Real-time observations from GOES-R are expected to make significant contributions to Earth and space weather prediction, and there are stringent requirements to produce weather products at very low latency to meet NOAA's operational needs. Simulated test data from all six GOES-R sensors are being processed by the system to test and verify performance of the fielded system. Early results show that the system development is on track to meet functional and performance requirements for processing science data. Comparison of science products generated by the ground system from simulated data with those generated by the algorithm developers shows close agreement, which demonstrates that the algorithms are implemented correctly. Successful delivery of products to AWIPS and the Product Distribution and Access (PDA) system from the core system demonstrates that the external interfaces are working.
Analytical control test plan and microbiological methods for the water recovery test
NASA Technical Reports Server (NTRS)
Traweek, M. S. (Editor); Tatara, J. D. (Editor)
1994-01-01
Qualitative and quantitative laboratory results are important to the decision-making process. In some cases, they may represent the only basis for deciding between two or more given options or processes. Therefore, it is essential that the handling of laboratory samples and the analytical operations employed are performed with a deliberate level of conscientious effort. Reporting erroneous results can lead to faulty interpretations and result in misinformed decisions. This document provides analytical control specifications which will govern future test procedures related to all Water Recovery Test (WRT) Phase 3 activities to be conducted at the National Aeronautics and Space Administration/Marshall Space Flight Center (NASA/MSFC). This document addresses the process which will be used to verify analytical data generated throughout the test period; identifies the responsibilities of key personnel and participating laboratories and the chains of communication to be followed; and ensures that approved methodology and procedures are used during WRT activities. This document does not outline specifics, but provides a minimum guideline by which sampling protocols, analysis methodologies, test site operations, and laboratory operations should be developed.
An innovative approach to capability-based emergency operations planning
Keim, Mark E
2013-01-01
This paper describes the innovative use of information technology to assist disaster planners with an easily accessible method for writing and improving evidence-based emergency operations plans. The process is used to identify all key objectives of the emergency response according to the capabilities of the institution, community, or society. The approach then uses a standardized, objective-based format, along with a consensus-based method, for drafting capability-based operational-level plans. This information is then integrated within a relational database to allow ease of access and enhanced functionality to search, sort, and filter an emergency operations plan according to user need and technological capacity. This integrated approach is offered as an effective option for integrating best practices of planning with the efficiency, scalability, and flexibility of modern information and communication technology. PMID:28228987
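The relational storage described above, one row per capability-based objective so that plans can be searched, sorted, and filtered, might be sketched like this. The schema and sample rows are invented for illustration, not taken from the paper:

```python
import sqlite3

# Each row is one capability-based response objective; queries then give the
# search/sort/filter functionality the abstract describes.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE objective (
    id INTEGER PRIMARY KEY,
    capability TEXT,   -- e.g. 'mass care', 'medical surge' (invented examples)
    phase TEXT,        -- 'response' or 'recovery'
    priority INTEGER)""")
con.executemany(
    "INSERT INTO objective(capability, phase, priority) VALUES (?, ?, ?)",
    [("mass care", "response", 1),
     ("medical surge", "response", 2),
     ("debris removal", "recovery", 3)])

# filter to response-phase objectives, sorted by priority
rows = con.execute("SELECT capability FROM objective "
                   "WHERE phase = 'response' ORDER BY priority").fetchall()
```

Any relational engine works the same way; SQLite is used only because it ships with Python.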
NASA Technical Reports Server (NTRS)
Huyse, Luc; Bushnell, Dennis M. (Technical Monitor)
2001-01-01
Free-form shape optimization of airfoils poses unexpected difficulties. Practical experience has indicated that a deterministic optimization for discrete operating conditions can result in dramatically inferior performance when the actual operating conditions differ from the - somewhat arbitrary - design values used for the optimization. Extensions to multi-point optimization have proven unable to adequately remedy this problem of "localized optimization" near the sampled operating conditions. This paper presents an intrinsically statistical approach and demonstrates how the shortcomings of multi-point optimization with respect to "localized optimization" can be overcome. The practical examples also reveal how the relative likelihood of each of the operating conditions is automatically taken into consideration during the optimization process. This is a key advantage over the use of multi-point methods.
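The statistical idea, optimizing a likelihood-weighted average of performance over the whole operating range rather than a handful of design points, can be illustrated with a toy one-dimensional stand-in for the aerodynamic analysis. The quadratic loss and the Gaussian condition distribution are assumptions made for this sketch only:

```python
import math

def expected_loss(design, conditions, weights, loss):
    """Likelihood-weighted average performance over the operating range."""
    return sum(w * loss(design, m) for m, w in zip(conditions, weights))

# toy loss: the design performs best exactly at the condition it is tuned for
loss = lambda d, m: (d - m) ** 2

# operating condition (e.g. a Mach number) ~ N(0.70, 0.05), discretized
ms = [0.55 + 0.01 * i for i in range(31)]
raw = [math.exp(-0.5 * ((m - 0.70) / 0.05) ** 2) for m in ms]
ws = [w / sum(raw) for w in raw]

# the statistically robust optimum sits where the conditions are most likely,
# not at a few arbitrarily sampled multi-point values
best = min((0.55 + 0.001 * k for k in range(301)),
           key=lambda d: expected_loss(d, ms, ws, loss))
```

With a multimodal likelihood, the same objective automatically balances the modes by their probability, which is the advantage over equal-weight multi-point sampling.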
NASA Astrophysics Data System (ADS)
Zhao, Qi; Liu, Yunchao; Yuan, Xiao; Chitambar, Eric; Ma, Xiongfeng
2018-02-01
Manipulation and quantification of quantum resources are fundamental problems in quantum physics. In the asymptotic limit, coherence distillation and dilution have been proposed by manipulating infinite identical copies of states. In the nonasymptotic setting, finite data-size effects emerge, and the practically relevant problem of coherence manipulation using finite resources has been left open. This Letter establishes the one-shot theory of coherence dilution, which involves converting maximally coherent states into an arbitrary quantum state using maximally incoherent operations, dephasing-covariant incoherent operations, incoherent operations, or strictly incoherent operations. We introduce several coherence monotones with concrete operational interpretations that estimate the one-shot coherence cost: the minimum amount of maximally coherent states needed for faithful coherence dilution. Furthermore, we derive the asymptotic coherence dilution results with maximally incoherent operations, incoherent operations, and strictly incoherent operations as special cases. Our result can be applied in the analyses of quantum information processing tasks that exploit coherence as resources, such as quantum key distribution and random number generation.
NASA Astrophysics Data System (ADS)
Biset, S.; Nieto Deglioumini, L.; Basualdo, M.; Garcia, V. M.; Serra, M.
The aim of this work is to determine a good preliminary plantwide control structure for the process of hydrogen production from bioethanol for use in a proton exchange membrane (PEM) fuel cell, using only steady-state information. The objective is to keep the process at its optimal operating point, that is, performing energy integration to achieve maximum efficiency. Ethanol, produced from renewable feedstocks, feeds a fuel processor based on steam reforming, followed by high- and low-temperature shift reactors and preferential oxidation, which are coupled to a polymeric fuel cell. Applying steady-state simulation techniques and thermodynamic models, the performance of the complete system with two different control structures has been evaluated for the most typical perturbations. A sensitivity analysis of the key process variables, together with the rigorous operability requirements of the fuel cell, is taken into account in defining an acceptable plantwide control structure. This is the first work showing an alternative control structure applied to this kind of process.
NASA Technical Reports Server (NTRS)
Siegfried, D. E.
1982-01-01
A quartz hollow tube cathode was used to determine the operating conditions within a mercury orificed hollow cathode. Insert temperature profiles, cathode current distributions, plasma property profiles, and internal pressure versus mass flow rate results are summarized and used in a phenomenological model which qualitatively describes the electron emission and plasma production processes taking place within the cathode. By defining an idealized ion production region within which most of the plasma processes are concentrated, this model is expressed analytically as a simple set of equations which relate cathode dimensions and specifiable operating conditions, such as mass flow rate and discharge current, to such important parameters as emission surface temperature and internal plasma properties. Key aspects of the model are examined.
NASA Astrophysics Data System (ADS)
Koon, Phillip L.; Greene, Scott
2002-07-01
Our aerospace customers are demanding that we drastically reduce the cost of operating and supporting our products. Our space customer in particular is looking for the next generation of reusable launch vehicle systems to support more aircraft-like operation. Achieving this goal requires more than an evolution in materials, processes, and systems; what is required is a paradigm shift in the design of the launch vehicles and of the processing systems that support them. This paper describes the Automated Informed Maintenance System (AIM) we are developing for NASA's Space Launch Initiative (SLI) Second Generation Reusable Launch Vehicle (RLV). Our system includes an Integrated Health Management (IHM) system for the launch vehicles and ground support systems, which features model-based diagnostics and prognostics. Health management data are used by our AIM decision support and process aids to automatically plan maintenance, generate work orders, and schedule maintenance activities along with the resources required to execute these processes. Our system will automate ground processing for a spaceport handling multiple RLVs executing multiple missions. To accomplish this task we are applying the latest web-based distributed computing technologies and application development techniques.
Investigating influences on current community pharmacy practice at micro, meso, and macro levels.
Hermansyah, Andi; Sainsbury, Erica; Krass, Ines
The nature of Australian community pharmacy is continually evolving, raising the need to explore the current situation in order to understand the potential impact of any changes. Although community pharmacy has the potential to play a greater role in health care, it is currently not meeting this potential. To investigate the nature of the contemporary practice of community pharmacy in Australia and examine potential missed opportunities for role expansion in health care, in-depth semi-structured interviews with a wide range of key stakeholders within and beyond community pharmacy circles were conducted. Interviews were audio-recorded, transcribed verbatim, and analyzed for emerging themes. Twenty-seven key informants across the eastern half of Australia were interviewed between December 2014 and August 2015. Several key elements of the current situation, representing the social, economic, and policy context of community pharmacy, were identified. These elements operate interdependently, influence the micro, meso, and macro levels of community pharmacy operation, and are changing in the current climate. Community pharmacy has untapped potential in primary health care, but it has been slow to change to meet the opportunities available in the current situation. As the current situation is complex, interrelated, and dynamic, with often unintended and unpredictable consequences, this paper suggests that policy makers consider the micro, meso, and macro levels of community pharmacy operation when making significant policy changes. The framework proposed in this study can be a helpful tool for analyzing the processes operating at these three levels and their influences on practice. Copyright © 2016 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mariani, R.D.; Benedict, R.W.; Lell, R.M.
1993-09-01
The Integral Fast Reactor being developed by Argonne National Laboratory (ANL) combines the advantages of metal-fueled, liquid-metal-cooled reactors and a closed fuel cycle. Presently, the Fuel Cycle Facility (FCF) at ANL-West in Idaho Falls, Idaho is being modified to recycle spent metallic fuel from Experimental Breeder Reactor II as part of a demonstration project sponsored by the Department of Energy. A key component of the FCF is the electrorefiner (ER), in which the actinides are separated from the fission products. In the electrorefining process, the metal fuel is anodically dissolved into a high-temperature molten salt and refined uranium or uranium/plutonium products are deposited at cathodes. In this report, the criticality safety strategy for the FCF ER is summarized. FCF ER operations and processes formed the basis for evaluating criticality safety and control during actinide metal fuel refining. In order to show criticality safety for the FCF ER, the reference operating conditions for the ER had to be defined. Normal operating envelopes (NOEs) were then defined to bracket the important operating conditions. To keep the operating conditions within their NOEs, process controls were identified that can be used to regulate the actinide forms and content within the ER. A series of operational checks was developed for each operation that will verify the extent or success of the operation. The criticality analysis considered the ER operating conditions at their NOE values as the point of departure for credible and incredible failure modes. As a result of the analysis, FCF ER operations were found to be safe with respect to criticality.
A key to success: optimizing the planning process
NASA Astrophysics Data System (ADS)
Turk, Huseyin; Karakaya, Kamil
2014-05-01
With the adoption of the NATO Strategic Concept Document in 2010, important changes in the perception of threat and the management of crises were introduced. This new concept, named the "Comprehensive Approach," covers precautions taken before a crisis, measures applied during a crisis, and reconstruction in the post-intervention phase. NATO will be interested not only in political and military options, but also in the social, economic, and informational aspects of a crisis, and will take part in all phases of a conflict. Conflicts occurring outside the borders of NATO nations, together with terrorism, are perceived as sources of threat to peace and stability. In addition to conventional threats, cyber attacks that threaten network-supported communication systems and applications that depend on access to space, which is used in many fields of life, as well as hostile electronic warfare capabilities, have been added to the threat list. When a military option is under consideration, a harder planning phase awaits NATO's decision makers, who struggle to keep peace and security. The operation planning process based on the comprehensive approach contains these steps: situational awareness of the battlefield, evaluation of the military intervention options, orientation, development of an operation plan, review of the plan, and transition. To be successful in a theater that is constantly changing with technological advances, accurate and timely planning is essential, so the time spent on planning can be cited as one of the biggest problems. In addition, sustaining situational awareness, which is important for the whole operation planning process, technical command and control hitches, the human factor, and the inability to determine the opponent's center of gravity in asymmetric threat situations can be described as some of the difficulties in operation planning.
In this study, a possible air operation planning process is analyzed according to the comprehensive approach, and the difficulties of planning are identified. Consequently, to optimize the decision-making process of an air operation, a planning process is defined within a virtual command and control structure.
Rosenberg-Lee, Miriam; Chang, Ting Ting; Young, Christina B; Wu, Sarah; Menon, Vinod
2011-01-01
Although lesion studies over the past several decades have focused on functional dissociations in posterior parietal cortex (PPC) during arithmetic, no consistent view has emerged of its differential involvement in addition, subtraction, multiplication, and division. To circumvent problems with poor anatomical localization, we examined functional overlap and dissociations in cytoarchitectonically-defined subdivisions of the intraparietal sulcus (IPS), superior parietal lobule (SPL) and angular gyrus (AG), across these four operations. Compared to a number identification control task, all operations except addition, showed a consistent profile of left posterior IPS activation and deactivation in the right posterior AG. Multiplication and subtraction differed significantly in right, but not left, IPS and AG activity, challenging the view that the left AG differentially subserves retrieval during multiplication. Although addition and multiplication both rely on retrieval, multiplication evoked significantly greater activation in right posterior IPS, as well as the prefrontal cortex, lingual and fusiform gyri, demonstrating that addition and multiplication engage different brain processes. Comparison of PPC responses to the two pairs of inverse operations: division vs. multiplication and subtraction vs. addition revealed greater activation of left lateral SPL during division, suggesting that processing inverse relations is operation specific. Our findings demonstrate that individual IPS, SPL and AG subdivisions are differentially modulated by the four arithmetic operations and they point to significant functional heterogeneity and individual differences in activation and deactivation within the PPC. Critically, these effects are related to retrieval, calculation and inversion, the three key cognitive processes that are differentially engaged by arithmetic operations. 
Our findings point to distributed representation of these processes in the human PPC and also help explain why lesion and previous imaging studies have yielded inconsistent findings. PMID:21616086
Rosenberg-Lee, Miriam; Chang, Ting Ting; Young, Christina B; Wu, Sarah; Menon, Vinod
2011-07-01
Although lesion studies over the past several decades have focused on functional dissociations in posterior parietal cortex (PPC) during arithmetic, no consistent view has emerged of its differential involvement in addition, subtraction, multiplication, and division. To circumvent problems with poor anatomical localization, we examined functional overlap and dissociations in cytoarchitectonically defined subdivisions of the intraparietal sulcus (IPS), superior parietal lobule (SPL) and angular gyrus (AG), across these four operations. Compared to a number identification control task, all operations except addition, showed a consistent profile of left posterior IPS activation and deactivation in the right posterior AG. Multiplication and subtraction differed significantly in right, but not left, IPS and AG activity, challenging the view that the left AG differentially subserves retrieval during multiplication. Although addition and multiplication both rely on retrieval, multiplication evoked significantly greater activation in right posterior IPS, as well as the prefrontal cortex, lingual and fusiform gyri, demonstrating that addition and multiplication engage different brain processes. Comparison of PPC responses to the two pairs of inverse operations: division versus multiplication and subtraction versus addition revealed greater activation of left lateral SPL during division, suggesting that processing inverse relations is operation specific. Our findings demonstrate that individual IPS, SPL and AG subdivisions are differentially modulated by the four arithmetic operations and they point to significant functional heterogeneity and individual differences in activation and deactivation within the PPC. Critically, these effects are related to retrieval, calculation and inversion, the three key cognitive processes that are differentially engaged by arithmetic operations. 
Our findings point to distributed representation of these processes in the human PPC and also help explain why lesion and previous imaging studies have yielded inconsistent findings. Copyright © 2011 Elsevier Ltd. All rights reserved.
St. Michael's Improvement Program - A Collaborative Approach to Sustainable Cost Savings.
Trafford, Anne; Jane, Danielle
2017-01-01
In response to a challenging financial environment and increasing patient demand, St. Michael's Hospital needed to find long-term sustainable solutions to continue to provide high-quality patient care and invest in key priorities. By conducting operational reviews in focused areas, the hospital achieved $7.4 million of in-year savings in the first year and identified standardizations, process efficiencies, and direct cost savings that positioned it for success in future funding models. Initiatives were grounded in evidence and relied heavily on effective execution by leadership, front-line staff, and physicians. As other organizations face similar challenges, this journey can provide key learnings.
Mechanics analysis of the multi-point-load process for the thin film solar cell
NASA Astrophysics Data System (ADS)
Wang, Zhiming; Wei, Guangpu; Gong, Zhengbang
2008-02-01
The main element of the thin film solar cell is silicon. Because of the special mechanical characteristics of silicon, the method of applying pressure to the thin film solar cell and the value of that pressure are key problems that must be solved during manufacturing. This paper describes the special mechanical characteristics of silicon and discusses the overall test method, the value of the pressure applied to the thin film solar cell, and the elements and method of loading, analyzed with the ANSYS finite element package. From this theoretical analysis, key conclusions for actual operation were obtained; these results are of great significance for industry.
The emerging role of lysosomes in copper homeostasis.
Polishchuk, Elena V; Polishchuk, Roman S
2016-09-01
The lysosomal system operates as a focal point where a number of important physiological processes such as endocytosis, autophagy and nutrient sensing converge. One of the key functions of lysosomes consists of regulating the metabolism/homeostasis of metals. Metal-containing components are carried to the lysosome through incoming membrane flows, while numerous transporters allow metal ions to move across the lysosome membrane. These properties enable lysosomes to direct metal fluxes to the sites where metal ions are either used by cellular components or sequestered. Copper belongs to a group of metals that are essential for the activity of vitally important enzymes, although it is toxic when in excess. Thus, copper uptake, supply and intracellular compartmentalization have to be tightly regulated. An increasing number of publications have indicated that these processes involve lysosomes. Here we review studies that reveal the expanding role of the lysosomal system as a hub for the control of Cu homeostasis and for the regulation of key Cu-dependent processes in health and disease.
NASA Technical Reports Server (NTRS)
Patterson, Linda P.
2001-01-01
The International Space Station (ISS) has an operational mission and profile that makes it a Logistics and Maintenance (L&M) support challenge different from previous programs. It is permanently manned, assembled on orbit, and multi-national. With this technical and operational challenge, a unique approach is needed to support the hardware and crew. The key is the integration of on-orbit and ground analysis, supply, maintenance, and crew training into a coherent functional process that supports ISS goals and objectives. To integrate all the necessary aspects of hardware and personnel to support on-orbit maintenance, a myriad of products and processes must be created and coordinated, such that the right resources are in the right place at the right time to ensure continued ISS functionality. This paper will familiarize the audience with ISS On-Orbit Maintenance (OOM) concepts and capabilities for different maintenance tasks and discuss some of the logic behind their selection. It will also identify the operational maintenance support responsibility split between the U.S. and the various International Partners (IPs).
Martín-Gamboa, Mario; Iribarren, Diego; Susmozas, Ana; Dufour, Javier
2016-08-01
A novel approach is developed to evaluate quantitatively the influence of operational inefficiency in biomass production on the life-cycle performance of hydrogen from biomass gasification. Vine-growers and process simulation are used as key sources of inventory data. The life cycle assessment of biohydrogen according to current agricultural practices for biomass production is performed, as well as that of target biohydrogen according to agricultural practices optimised through data envelopment analysis. Only 20% of the vineyards assessed operate efficiently, and the benchmarked reduction percentages of operational inputs range from 45% to 73% in the average vineyard. The fulfilment of operational benchmarks avoiding irregular agricultural practices is concluded to improve significantly the environmental profile of biohydrogen (e.g., impact reductions above 40% for eco-toxicity and global warming). Finally, it is shown that this type of bioenergy system can be an excellent replacement for conventional hydrogen in terms of global warming and non-renewable energy demand. Copyright © 2016 Elsevier Ltd. All rights reserved.
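The benchmarking step described above rests on data envelopment analysis (DEA). In the simplest single-input, single-output case, the input-oriented efficiency score reduces to each unit's output/input ratio normalized by the best ratio in the sample; the following sketch uses hypothetical vineyard figures, not data from the study:

```python
# Minimal single-input/single-output DEA-style efficiency scores.
# A score of 1.0 means the unit lies on the efficient frontier;
# lower scores indicate benchmarked excess input use.

def dea_ratio_efficiency(inputs, outputs):
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical data: water use (input) vs. grape yield (output) per vineyard.
water = [100.0, 150.0, 120.0]
grapes = [50.0, 60.0, 60.0]
scores = dea_ratio_efficiency(water, grapes)
# Vineyards 1 and 3 define the frontier; vineyard 2 scores 0.8.
```

In the general multi-input, multi-output case the score is obtained by solving a small linear program per unit, but the normalization idea is the same.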
Catalyst system and process for benzyl ether fragmentation and coal liquefaction
Zoeller, Joseph Robert
1998-04-28
Dibenzyl ether can be readily cleaved to form primarily benzaldehyde and toluene as products, along with minor amounts of bibenzyl and benzyl benzoate, in the presence of a catalyst system comprising a Group 6 metal, preferably molybdenum, a salt, and an organic halide. Although useful synthetically for the cleavage of benzyl ethers, this cleavage also represents a key model reaction for the liquefaction of coal; thus this catalyst system and process should be useful in coal liquefaction with the advantage of operating at significantly lower temperatures and pressures.
Decision support systems and the healthcare strategic planning process: a case study.
Lundquist, D L; Norris, R M
1991-01-01
The repertoire of applications that comprises health-care decision support systems (DSS) includes analyses of clinical, financial, and operational activities. As a whole, these applications facilitate developing comprehensive and interrelated business and medical models that support the complex decisions required to successfully manage today's health-care organizations. Kennestone Regional Health Care System's use of DSS to facilitate strategic planning has precipitated marked changes in the organization's method of determining capital allocations. This case study discusses Kennestone's use of DSS in the strategic planning process, including profiles of key DSS modeling components.
Instructions included? Make safety training part of medical device procurement process.
Keller, James P
2010-04-01
Before hospitals embrace new technologies, it's important that medical personnel agree on how best to use them. Likewise, hospitals must provide the support to operate these sophisticated devices safely. With this in mind, it's wise for hospitals to include medical device training in the procurement process. Moreover, purchasing professionals can play a key role in helping to increase the amount of user training for medical devices and systems. What steps should you take to help ensure that new medical devices are implemented safely? Here are some tips.
Tuck, Melissa K; Chan, Daniel W; Chia, David; Godwin, Andrew K; Grizzle, William E; Krueger, Karl E; Rom, William; Sanda, Martin; Sorbara, Lynn; Stass, Sanford; Wang, Wendy; Brenner, Dean E
2009-01-01
Specimen collection is an integral component of clinical research. Specimens from subjects with various stages of cancers or other conditions, as well as those without disease, are critical tools in the hunt for biomarkers, predictors, or tests that will detect serious diseases earlier or more readily than currently possible. Analytic methodologies evolve quickly. Access to high-quality specimens, collected and handled in standardized ways that minimize potential bias or confounding factors, is key to the "bench to bedside" aim of translational research. It is essential that standard operating procedures, "the how" of creating the repositories, be defined prospectively when designing clinical trials. Small differences in the processing or handling of a specimen can have dramatic effects on analytical reliability and reproducibility, especially when multiplex methods are used. A representative working group, the Standard Operating Procedures Internal Working Group (SOPIWG), comprising members from across the Early Detection Research Network (EDRN), was formed to develop standard operating procedures (SOPs) for the various types of specimens collected and managed for our biomarker discovery and validation work. This report presents our consensus on SOPs for the collection, processing, handling, and storage of serum and plasma for biomarker discovery and validation.
Instrumentation status of the low-b magnet systems at the Large Hadron Collider (LHC)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Darve, C.; /Fermilab; Balle, C.
2011-05-01
The low-β magnet systems are located in the Large Hadron Collider (LHC) insertion regions around the four interaction points. They are the key elements in the beam focusing/defocusing process, allowing proton collisions at luminosities up to 10^34 cm^-2 s^-1. These systems are a contribution of the US-LHC Accelerator project. The systems are mainly composed of the quadrupole magnets (triplets), the separation dipoles, and their respective electrical feed-boxes (DFBX). The low-β magnet systems operate in an environment of extreme radiation, high-gradient magnetic field, and high heat load to the cryogenic system due to beam dynamic effects. Because of this severe environment, the robustness of the diagnostics is paramount for the operation of the triplets. The hardware commissioning phase of the LHC was completed in February 2010. For the sake of safer and more user-friendly operation, several consolidations and instrumentation modifications were implemented during this commissioning phase. This paper presents the instrumentation used to optimize the engineering process and operation of the final focusing/defocusing quadrupole magnets for the first years of operation.
Software Assurance Curriculum Project Volume 1: Master of Software Assurance Reference Curriculum
2010-08-01
activity by providing a check on the relevance and currency of the process used to develop the MSwA2010 curriculum content. Figure 2 is an expansion of...random oracle model, symmetric crypto primitives, modes of operations, asymmetric crypto primitives (Chapter 5) [16] Detailed design...encryption, public key encryption, digital signatures, message authentication codes, crypto protocols, cryptanalysis, and further detailed crypto
Massive Multiplayer Online Gaming: A Research Framework for Military Training and Education
2005-03-01
those required by a military transforming itself to operating under the concept of network centric warfare. The technologies and practice...learning. Simulations are popular in other business situations and management processes. Data files, video clips, and flowcharts might help learners...on nature of these environments is another key motivator. According to Randy Hinrich, Microsoft Research Group Research Manager for Learning
Quantum Dialogue with Authentication Based on Bell States
NASA Astrophysics Data System (ADS)
Shen, Dongsu; Ma, Wenping; Yin, Xunru; Li, Xiaoping
2013-06-01
We propose an authenticated quantum dialogue protocol based on a shared private quantum entangled channel. In this protocol, the EPR pairs are randomly prepared in one of the four Bell states for communication. By performing one of four Pauli operations on the shared EPR pairs to encode their shared authentication key and secret message, two legitimate users can implement mutual identity authentication and quantum dialogue without help from a third-party authenticator. Furthermore, because the EPR pairs used for secure communication also implement authentication, and the whole authentication process is included in the direct secure communication process, no additional particles are required to realize authentication in this protocol. The updated authentication key provides the counterparts with a new authentication key for the next round of authentication and direct communication. Compared with other secure communication with authentication protocols, this one is more secure and efficient owing to the combination of authentication and direct communication. Security analysis shows that it is secure against the eavesdropping attack, the impersonation attack, and the man-in-the-middle (MITM) attack.
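The encoding step can be illustrated numerically: applying one of the four Pauli operations to one half of a shared |Φ+⟩ pair yields the four mutually orthogonal Bell states, so each pair can carry two classical bits of key/message. This is a generic superdense-coding-style sketch, not the paper's exact conventions:

```python
import numpy as np

# Single-qubit Pauli operators.
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
Y = 1j * X @ Z

# Shared EPR pair |Phi+> = (|00> + |11>) / sqrt(2).
phi_plus = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

# Alice applies one of the four Paulis to her qubit (the first tensor factor).
encoded = [np.kron(P, I) @ phi_plus for P in (I, X, Y, Z)]

# Gram matrix of overlaps: the four encoded states are pairwise orthogonal,
# hence perfectly distinguishable by a joint Bell measurement.
gram = np.abs(np.array([[e1.conj() @ e2 for e2 in encoded] for e1 in encoded]))
```

The resulting Gram matrix is the 4x4 identity, confirming that two classical bits per pair can be recovered by the receiver.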
Nottingham Health Science Biobank: a sustainable bioresource.
Matharoo-Ball, Balwir; Thomson, Brian J
2014-10-01
Nottingham Health Science Biobank (NHSB) was established in 2011 by a 3-year "pump priming" grant from the United Kingdom National Institute of Health Research. Before biobanking operations began, NHSB commissioned a financial report on the full costs of biobanking and worked with key stakeholders and external consultants to develop a business plan with the aim of achieving financial and operational sustainability. The plan included scanning published information, telephone interviews with commercial companies, Freedom of Information requests, dialogue with prospective customers, and a market analysis of global trends in the use of human tissue samples in research. Our financial report provided a comprehensive and structured costing template for biobanking and confirmed the absolute requirement to ensure cost-efficient processes, careful staff utilization, and maximization of sample turnover. Together with our external consultants, we developed a business model responsive to global interest in healthcare, founded on: i) identification of key therapeutic areas that mapped to the strengths of the NHSB; ii) a systematic approach to identifying companies operating in these therapy areas; iii) engagement with noncommercial stakeholders to agree on strategically aligned sample collection, with the aim of ensuring the value of our tissue resource. By adopting this systematic approach to business modelling, the NHSB achieved sustainability after less than 3 years of operation.
Structural design/margin assessment
NASA Technical Reports Server (NTRS)
Ryan, R. S.
1993-01-01
Determining structural design inputs and the structural margins following design completion is one of the major activities in space exploration. The end result is a statement of these margins as stability, safety factors on ultimate and yield stresses, fracture limits (fracture control), fatigue lifetime, reuse criteria, operational criteria and procedures, stability factors, deflections, clearance, handling criteria, etc. The process is normally called a load cycle and is time consuming, very complex, and involves much more than structures. The key to successful structural design is the proper implementation of the process. It depends on many factors: leadership and management of the process, adequate analysis and testing tools, data basing, communications, people skills, and training. This process and the various factors involved are discussed.
Renewable Energy Zone (REZ) Transmission Planning Process: A Guidebook for Practitioners
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Nathan; Flores-Espino, Francisco; Hurlbut, David J.
Achieving clean energy goals may require new investments in transmission, especially if planners anticipate economic growth and increased demand for electricity. The renewable energy zone (REZ) transmission planning process can help policymakers ensure their infrastructure investments achieve national goals in the most economical manner. Policymakers, planners, and system operators around the world have used variations of the REZ process to chart the expansion of their transmission networks and overcome the barriers of traditional transmission planning. This guidebook seeks to help power system planners, key decision makers, and stakeholders understand and use the REZ transmission planning process to integrate transmission expansion planning and renewable energy generation planning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hagans, K.G.; Clough, R.E.
2000-04-25
An optical key system comprises a battery-operated optical key and an isolated lock that derives both its operating power and unlock signals from the correct optical key. A light emitting diode or laser diode is included within the optical key and is connected to transmit a bit-serial password. The key user physically enters either the code-to-transmit directly, or an index to a pseudorandom number code, in the key. Such personal identification numbers can be retained permanently or be ephemeral. When a send button is pressed, the key transmits a beam of light modulated with the password information. The modulated beam of light is received by a corresponding optical lock with a photovoltaic cell that produces enough power from the beam of light to operate password-screen digital logic. In one application, an acceptable password allows a two-watt power laser diode to pump ignition and timing information over a fiberoptic cable into a sealed engine compartment. The receipt of a good password allows the fuel pump, spark, and starter systems each to operate. Therefore, bypassing the lock mechanism, as is now routine with automobile thieves, is pointless because the engine is so thoroughly disabled.
Ochando-Pulido, J M; Hodaifa, G; Victor-Ortega, M D; Rodriguez-Vives, S; Martinez-Ferez, A
2013-12-15
Production of olive oil generates large amounts of heavily polluted effluents characterized by a highly variable degree of contamination, which considerably complicates treatment. In this work, batch membrane processes in series comprising ultrafiltration (UF), nanofiltration (NF) and reverse osmosis (RO) are used to purify the effluents exiting both the two-phase and three-phase extraction processes to a grade compatible with discharge into municipal sewer systems in Spain and Italy. One main problem in applying this technology to wastewater management, however, is membrane fouling. In recent years, the threshold flux theory was introduced as a key tool for understanding fouling problems, and threshold flux measurements can give valuable information for optimal membrane process design and operation. In the present manuscript, a mathematical treatment of threshold flux conditions for membrane operation is presented, together with proper pretreatment processes such as pH-T flocculation and UV/TiO2 photocatalysis with ferromagnetic-core nanoparticles to reduce membrane fouling. Both pretreatments influence the organic matter content as well as the particle size distribution of the solutes remaining in the wastewater stream, leading, when properly applied, to reduced fouling and higher rejection and recovery values, thus enhancing the economic feasibility of the process. Copyright © 2013 Elsevier B.V. All rights reserved.
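The threshold-flux idea the abstract relies on can be stated very compactly: below a threshold flux the fouling rate is low and roughly constant, while above it the rate grows with the excess flux. The following minimal sketch encodes that piecewise behavior; the coefficients and the linear form above the threshold are illustrative assumptions, not the paper's fitted model.

```python
# Hedged sketch of the threshold-flux concept: below a threshold flux J_th
# the fouling rate (here, the rate of transmembrane-pressure rise) is low
# and roughly constant; above J_th it grows with the excess flux.
# base_rate and k are hypothetical fitted coefficients.

def fouling_rate(flux: float, j_threshold: float,
                 base_rate: float = 0.01, k: float = 0.5) -> float:
    """Rate of TMP rise (bar/h) at a given permeate flux (L/m2/h)."""
    if flux <= j_threshold:
        return base_rate          # low, flux-independent fouling
    return base_rate + k * (flux - j_threshold)  # excess-flux penalty

# Operating just below J_th keeps fouling near the low baseline:
low = fouling_rate(18.0, j_threshold=20.0)   # baseline rate
high = fouling_rate(25.0, j_threshold=20.0)  # well above threshold
```

Measuring where this knee sits for a given pretreated effluent is what gives the "valuable information regarding optimal membrane process design and operation" mentioned above.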
Amino acids and autophagy: cross-talk and co-operation to control cellular homeostasis.
Carroll, Bernadette; Korolchuk, Viktor I; Sarkar, Sovan
2015-10-01
Maintenance of amino acid homeostasis is important for healthy cellular function, metabolism and growth. Intracellular amino acid concentrations are dynamic; the high demand for protein synthesis must be met with constant dietary intake, followed by cellular influx, utilization and recycling of nutrients. Autophagy is a catabolic process via which superfluous or damaged proteins and organelles are delivered to the lysosome and degraded to release free amino acids into the cytoplasm. Furthermore, autophagy is specifically activated in response to amino acid starvation via two key signaling cascades: the mammalian target of rapamycin (mTOR) complex 1 (mTORC1) and the general control nonderepressible 2 (GCN2) pathways. These pathways are key regulators of the integration between anabolic (amino acid depleting) and catabolic (such as autophagy, which is amino acid replenishing) processes to ensure intracellular amino acid homeostasis. Here, we discuss the key roles that amino acids, along with energy (ATP, glucose) and oxygen, play in cellular growth and proliferation. We further explore how sophisticated methods are employed by cells to sense intracellular amino acid concentrations, how amino acids can act as a switch to dictate the temporal and spatial activation of anabolic and catabolic processes and how autophagy contributes to the replenishment of free amino acids, all to ensure cell survival. Relevance of these molecular processes to cellular and organismal physiology and pathology is also discussed.
The Long and Arduous Road to CRAC
Vig, Monika; Kinet, Jean-Pierre
2007-01-01
Store-operated calcium (SOC) entry is the major route of calcium influx in non-excitable cells, especially immune cells. The best characterized store-operated current, ICRAC, is carried by calcium release activated calcium (CRAC) channels. The existence of the phenomenon of store-operated calcium influx was proposed almost two decades ago. However, in spite of rigorous research by many laboratories, the identity of the key molecules participating in the process long remained a mystery. Over the years, many different approaches have been adopted by countless researchers to identify the molecular players in this fundamental process. Along the way many crucial discoveries have been made, some of which are summarized here. The last couple of years have seen significant breakthroughs in the field: the identification of STIM1 as the store Ca2+ sensor and of CRACM1 (Orai1) as the pore-forming subunit of the CRAC channel. The field is now actively engaged in deciphering the gating mechanism of CRAC channels. We summarize here the latest progress in this direction. PMID:17517435
NORM Management in the Oil & Gas Industry
NASA Astrophysics Data System (ADS)
Cowie, Michael; Mously, Khalid; Fageeha, Osama; Nassar, Rafat
2008-08-01
It has been established that Naturally Occurring Radioactive Materials (NORM) accumulate at various locations along the oil/gas production process. Components such as wellheads, separation vessels, pumps, and other processing equipment can become NORM contaminated, and NORM can accumulate in sludge and other waste media. Improper handling and disposal of NORM-contaminated equipment and waste can create a potential radiation hazard to workers and the environment. Saudi Aramco's Environmental Protection Department initiated a program to identify the extent, form and level of NORM contamination associated with the company's operations. Once identified, the challenge of managing operations with a NORM hazard was addressed in a manner that gave due consideration to worker and environmental protection as well as operational efficiency and productivity. The benefits of shared knowledge, practice and experience across the oil & gas industry are seen as key to the establishment of common guidance on NORM management. This paper outlines Saudi Aramco's experience in the development of a NORM management strategy and its goal of establishing common guidance throughout the oil and gas industry.
Liquid rocket booster integration study. Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
1988-01-01
The impacts of introducing liquid rocket booster engines (LRB) into the Space Transportation System (STS)/Kennedy Space Center (KSC) launch environment are identified and evaluated. Proposed ground systems configurations are presented along with a launch site requirements summary. Prelaunch processing scenarios are described and the required facility modifications and new facility requirements are analyzed. Flight vehicle design recommendations to enhance launch processing are discussed. Processing approaches to integrate LRB with existing STS launch operations are evaluated. The key features and significance of launch site transition to a new STS configuration in parallel with ongoing launch activities are enumerated. This volume is the executive summary of the five volume series.
Liquid rocket booster integration study. Volume 5, part 1: Appendices
NASA Technical Reports Server (NTRS)
1988-01-01
The impacts of introducing liquid rocket booster engines (LRB) into the Space Transportation System (STS)/Kennedy Space Center (KSC) launch environment are identified and evaluated. Proposed ground systems configurations are presented along with a launch site requirements summary. Prelaunch processing scenarios are described and the required facility modifications and new facility requirements are analyzed. Flight vehicle design recommendations to enhance launch processing are discussed. Processing approaches to integrate LRB with existing STS launch operations are evaluated. The key features and significance of launch site transition to a new STS configuration in parallel with ongoing launch activities are enumerated. This volume is the appendices of the five volume series.
Liquid Rocket Booster Integration Study. Volume 2: Study synopsis
NASA Technical Reports Server (NTRS)
1988-01-01
The impacts of introducing liquid rocket booster engines (LRB) into the Space Transportation System (STS)/Kennedy Space Center (KSC) launch environment are identified and evaluated. Proposed ground systems configurations are presented along with a launch site requirements summary. Prelaunch processing scenarios are described and the required facility modifications and new facility requirements are analyzed. Flight vehicle design recommendations to enhance launch processing are discussed. Processing approaches to integrate LRB with existing STS launch operations are evaluated. The key features and significance of launch site transition to a new STS configuration in parallel with ongoing launch activities are enumerated. This volume is the study summary of the five volume series.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stickel, Jonathan J.; Adhikari, Birendra; Sievers, David A.
Converting abundant lignocellulosic biomass to sugars as fungible precursors to fuels and chemicals has the potential to diversify the supply chain for those products, but further process improvements are needed to achieve economic viability. In the current work, process intensification of the key enzymatic hydrolysis unit operation is demonstrated by means of a membrane reactor system that was operated continuously. Lignocellulosic biomass (pretreated corn stover) and buffered enzyme solution were fed to a continuously stirred-tank reactor, and clarified sugar solution was withdrawn via a commercial tubular ultrafiltration membrane. The membrane permeance decline and membrane cleaning efficacy were studied and did not vary significantly when increasing fraction insoluble solids (FIS) from 2.5% to 5%. Continuous enzymatic hydrolysis was successfully operated for more than 80 h. A model for the reactor system was able to predict dynamic behavior that was in reasonable agreement with experimental results. The modeled technical performance of anticipated commercial batch and continuous enzymatic hydrolysis processes was compared and showed that continuous operation would provide at least twice the volumetric productivity for the conditions studied. Further improvements are anticipated by better membrane selection and by increasing FIS.
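The claimed factor-of-two advantage of continuous operation is easy to rationalize with a back-of-the-envelope productivity comparison: a batch reactor pays fill/drain/clean downtime on every cycle, while a steady-state membrane reactor does not. All numbers in this sketch are hypothetical, not the paper's values.

```python
# Back-of-the-envelope comparison of batch vs continuous volumetric
# productivity for enzymatic hydrolysis. Every number here is a
# hypothetical illustration, not data from the study.

def batch_productivity(sugar_g_per_l: float, reaction_h: float,
                       turnaround_h: float) -> float:
    """g sugar per L of reactor per hour, counting fill/drain/clean downtime."""
    return sugar_g_per_l / (reaction_h + turnaround_h)

def continuous_productivity(feed_sugar_g_per_l: float,
                            residence_h: float) -> float:
    """Steady-state CSTR with membrane sugar withdrawal: no inter-batch downtime."""
    return feed_sugar_g_per_l / residence_h

batch = batch_productivity(80.0, reaction_h=72.0, turnaround_h=12.0)
cont = continuous_productivity(80.0, residence_h=36.0)
# With a shorter effective residence time and no turnaround, the continuous
# mode in this toy example exceeds twice the batch productivity.
```

The real comparison in the paper comes from a dynamic reactor model, but the structural advantage (no downtime, steady withdrawal of product through the membrane) is the same.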
An Integrated Assessment of Location-Dependent Scaling for Microalgae Biofuel Production Facilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coleman, Andre M.; Abodeely, Jared; Skaggs, Richard
Successful development of a large-scale microalgae-based biofuels industry requires comprehensive analysis and understanding of the feedstock supply chain—from facility siting/design through processing/upgrading of the feedstock to a fuel product. The evolution from pilot-scale production facilities to energy-scale operations presents many multi-disciplinary challenges, including a sustainable supply of water and nutrients, operational and infrastructure logistics, and economic competitiveness with petroleum-based fuels. These challenges are addressed in part by applying the Integrated Assessment Framework (IAF)—an integrated multi-scale modeling, analysis, and data management suite—to address key issues in developing and operating an open-pond facility by analyzing how variability and uncertainty in space and time affect algal feedstock production rates, and determining the site-specific “optimum” facility scale to minimize capital and operational expenses. This approach explicitly and systematically assesses the interdependence of biofuel production potential, associated resource requirements, and production system design trade-offs. The IAF was applied to a set of sites previously identified as having the potential to cumulatively produce 5 billion gallons/year in the southeastern U.S., and results indicate costs can be reduced by selecting the most effective processing technology pathway and scaling downstream processing capabilities to fit site-specific growing conditions, available resources, and algal strains.
Hukari, Sirja; Hermann, Ludwig; Nättorp, Anders
2016-01-15
The present paper is based on an analysis of the EU legislation regulating phosphorus recovery and recycling from the wastewater stream, in particular as fertiliser. To recover phosphorus, operators need to deal with market regulations and health and environmental protection laws. Often, several permits and lengthy authorisation processes are required, both for the installation (e.g. environmental impact assessment) and for the recovered phosphorus itself (e.g. End-of-Waste, REACH). Exemptions to certain registration processes for recoverers are in place but rarely applied, so national solutions are often needed. Emerging recovery and recycling sectors are affected by legislation in different ways: wastewater treatment plants are obliged to remove phosphorus but may also recover it in low quantities for operational reasons, so permit processes allowing recovery and recycling operations alongside water purification should be rationalised. In contrast, the fertiliser industry relies on legal quality requirements, which ensure its market reputation. For start-ups, raw-material sourcing and the related legislation will be key. Phosphorus recycling is governed by fragmented decision-making in regional administrations, and active regulatory support, such as a recycling obligation or subsidies, is lacking. Harmonisation of legislation, inclusion of recycled phosphorus in existing fertiliser regulations and support of new operators would speed up market penetration of novel technologies, reduce phosphorus losses and safeguard European quality standards.
Racy, Emmanuel; Le Norcy, Elvire
2017-12-01
The announcement consultation is one of the key stages in an orthodontic and surgical process. The aim of this consultation is not only to establish an aesthetic and orthodontic diagnosis but also a fine psychological analysis of the patient and his or her family before proposing a treatment plan. Integrative medical therapies, a recent evolution of medicine within the framework of the doctor-patient relationship, have shown the positive impact of a good relationship on treatment success. The preliminary collection of information on the patient's psyche is now part of the treatment guidelines and has a positive impact on treatment adherence, management of pre- and post-operative care or, more simply, surgery acceptance. Therefore, a systematized patient record process including a global medical assessment of the patient, and not only an orthodontic and cephalometric diagnosis, is a key factor for the treatment outcome. © EDP Sciences, SFODF, 2017.
Microwave and continuous flow technologies in drug discovery.
Sadler, Sara; Moeller, Alexander R; Jones, Graham B
2012-12-01
Microwave and continuous flow microreactors have become mainstream heating sources in contemporary pharmaceutical company laboratories. Such technologies will continue to benefit from design and engineering improvements, and now play a key role in the drug discovery process. The authors review the applications of flow- and microwave-mediated heating in library, combinatorial, solid-phase, metal-assisted, and protein chemistries. Additionally, the authors provide a description of the combination of microwave and continuous flow platforms, with applications in the preparation of radiopharmaceuticals and in drug candidate development. Literature reviewed is chiefly 2000 - 2012, plus key citations from earlier reports. With the advent of microwave irradiation, reactions that normally took days to complete can now be performed in a matter of minutes. Coupled with the introduction of continuous flow microreactors, pharmaceutical companies have an easy way to improve the greenness and efficiency of many synthetic operations. The combined force of these technologies offers the potential to revolutionize discovery and manufacturing processes.
Heavy Vehicle Crash Characteristics in Oman 2009–2011
Al-Bulushi, Islam; Edwards, Jason; Davey, Jeremy; Armstrong, Kerry; Al-Reesi, Hamed; Al-Shamsi, Khalid
2015-01-01
In recent years, Oman has seen a shift in the burden of diseases towards road accidents. The main objective of this paper, therefore, is to describe key characteristics of heavy vehicle crashes in Oman and identify the key driving behaviours that influence fatality risks. Crash data from January 2009 to December 2011 were examined and it was found that, of the 22,543 traffic accidents that occurred within this timeframe, 3,114 involved heavy vehicles. While the majority of these crashes were attributed to driver behaviours, a small proportion was attributed to other factors. The results of the study indicate that there is a need for a more thorough crash investigation process in Oman. Future research should explore the reporting processes used by the Royal Oman Police, cultural influences on heavy vehicle operations in Oman and improvements to the current licensing system. PMID:26052451
A Quality Function Deployment Method Applied to Highly Reusable Space Transportation
NASA Technical Reports Server (NTRS)
Zapata, Edgar
2016-01-01
This paper will describe a Quality Function Deployment (QFD) currently in work, the goal of which is to add definition and insight to the development of long-term Highly Reusable Space Transportation (HRST). The objective here is twofold: first, to describe the process, the actual QFD experience as applied to the HRST study; second, to describe the preliminary results of this process, in particular the assessment of possible directions for future pursuit, such as promising candidate technologies or approaches that may finally open the space frontier. The iterative and synergistic nature of QFD provides opportunities in the process for the discovery of what is key in so far as it is useful, what is not, and what is merely true. Key observations on the QFD process will be presented. The importance of a customer definition, as well as the similarity of the process of developing a technology portfolio to product development, will be shown. Also, the relation of identified cost and operating drivers to future space vehicle designs that are robust to an uncertain future will be discussed. The results of this HRST evaluation in particular will be preliminary, given the somewhat long-term (or perhaps not?) nature of the task being considered.
NASA Astrophysics Data System (ADS)
Zhang, Miao; Tong, Xiaojun
2017-07-01
This paper proposes a joint image encryption and compression scheme based on a new hyperchaotic system and the curvelet transform. A new five-dimensional hyperchaotic system based on the Rabinovich system is presented. By means of the proposed hyperchaotic system, a new pseudorandom key stream generator is constructed. The algorithm adopts a diffusion and confusion structure to perform encryption, based on the key stream generator and the proposed hyperchaotic system. The key sequence used for image encryption is related to the plaintext. By means of the second-generation curvelet transform, run-length coding, and Huffman coding, the image data are compressed. Compression and encryption are performed jointly in a single process. The security test results indicate the proposed methods have high security and good compression performance.
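The core mechanism (a chaotic key stream whose seed depends on the plaintext) can be illustrated with a much simpler stand-in. The sketch below uses a one-dimensional logistic map instead of the paper's five-dimensional hyperchaotic Rabinovich system, and a transmitted checksum to make the key sequence plaintext-dependent; every parameter and design choice here is an illustrative assumption.

```python
# Toy illustration of a chaos-based stream cipher with a plaintext-dependent
# key sequence. A logistic map stands in for the paper's five-dimensional
# hyperchaotic Rabinovich system; parameters are hypothetical.

def keystream(seed: float, n: int, r: float = 3.99) -> bytes:
    """Iterate the logistic map x <- r*x*(1-x); quantize each state to a byte."""
    x = seed
    out = bytearray()
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) & 0xFF)
    return bytes(out)

def encrypt(plaintext: bytes, key: float):
    """XOR confusion step; the seed is perturbed by a plaintext checksum so
    the key sequence depends on the plain text, as in the abstract."""
    checksum = sum(plaintext) & 0xFFFF
    seed = (key + checksum / 65536.0) % 1.0 or 0.5
    ks = keystream(seed, len(plaintext))
    return checksum, bytes(p ^ k for p, k in zip(plaintext, ks))

def decrypt(checksum: int, ciphertext: bytes, key: float) -> bytes:
    """The receiver rebuilds the same seed from the transmitted checksum."""
    seed = (key + checksum / 65536.0) % 1.0 or 0.5
    ks = keystream(seed, len(ciphertext))
    return bytes(c ^ k for c, k in zip(ciphertext, ks))
```

The real scheme adds a diffusion stage and operates on the curvelet-compressed image data; this sketch only shows how a chaotic orbit can serve as a plaintext-sensitive XOR key stream.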
A Novel Real-Time Reference Key Frame Scan Matching Method.
Mohamed, Haytham; Moussa, Adel; Elhabiby, Mohamed; El-Sheimy, Naser; Sesay, Abu
2017-05-07
Unmanned aerial vehicles represent an effective technology for indoor search and rescue operations. Typically, most indoor mission environments are unknown, unstructured, and/or dynamic. Navigation of UAVs in such environments is addressed by a simultaneous localization and mapping (SLAM) approach using either local or global methods. Both suffer from accumulated errors and high processing time due to the iterative nature of the scan matching method. Moreover, point-to-point scan matching is prone to outlier associations. This paper proposes a low-cost novel method for 2D real-time scan matching based on a reference key frame (RKF). RKF is a hybrid scan matching technique combining feature-to-feature and point-to-point approaches. The algorithm aims at mitigating error accumulation using the key frame technique, which is inspired by the video streaming broadcast process. It falls back on the iterative closest point (ICP) algorithm when linear features are lacking, as is typical of unstructured environments, and switches back to the RKF once linear features are detected. To validate and evaluate the algorithm, its mapping performance and time consumption are compared with various algorithms in static and dynamic environments. The algorithm exhibits promising navigation and mapping results and very short computational time, which indicates the potential of the new algorithm for real-time systems.
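The point-to-point branch the paper falls back on has a well-known closed-form inner step: given already-associated 2D point pairs, the least-squares rigid transform (rotation plus translation) can be recovered directly. The sketch below shows only that inner step; the nearest-neighbor association loop, outlier rejection, and the feature-to-feature RKF logic are omitted.

```python
import math

# Closed-form point-to-point alignment step at the core of ICP-style 2D
# scan matching: given corresponding point pairs, recover the rigid
# transform (theta, tx, ty) that best maps src onto dst in least squares.

def align_2d(src, dst):
    """Rigid transform mapping src points onto dst points (known pairs)."""
    n = len(src)
    csx = sum(p[0] for p in src) / n
    csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n
    cdy = sum(p[1] for p in dst) / n
    # Cross-covariance sums of the centered point sets.
    sxx = sxy = syx = syy = 0.0
    for (ax, ay), (bx, by) in zip(src, dst):
        ax -= csx; ay -= csy
        bx -= cdx; by -= cdy
        sxx += ax * bx; sxy += ax * by
        syx += ay * bx; syy += ay * by
    theta = math.atan2(sxy - syx, sxx + syy)   # optimal rotation angle
    c, s = math.cos(theta), math.sin(theta)
    tx = cdx - (c * csx - s * csy)             # translation from centroids
    ty = cdy - (s * csx + c * csy)
    return theta, tx, ty
```

Full ICP iterates this step with re-associated correspondences until convergence, which is exactly the iterative cost (and outlier sensitivity) the RKF method is designed to reduce.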
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peeler, D.; Edwards, T.
High-level waste (HLW) throughput (i.e., the amount of waste processed per unit of time) is primarily a function of two critical parameters: waste loading (WL) and melt rate. For the Defense Waste Processing Facility (DWPF), increasing HLW throughput would significantly reduce the overall mission life cycle costs for the Department of Energy (DOE). Significant increases in waste throughput have been achieved at DWPF since initial radioactive operations began in 1996. Key technical and operational initiatives that supported increased waste throughput included improvements in facility attainment, the Chemical Processing Cell (CPC) flowsheet, process control models and frit formulations. As a result of these key initiatives, DWPF increased WLs from a nominal 28% for Sludge Batch 2 (SB2) to ~34 to 38% for SB3 through SB6 while maintaining or slightly improving canister fill times. Although considerable improvements in waste throughput have been obtained, future contractual waste loading targets are nominally 40%, while canister production rates are also expected to increase (to a rate of 325 to 400 canisters per year). Although the implementation of bubblers has made a positive impact on increasing melt rate for recent sludge batches targeting WLs in the mid-30s, higher WLs will ultimately make the feeds to DWPF more challenging to process. Savannah River Remediation (SRR) recently requested the Savannah River National Laboratory (SRNL) to perform a paper study assessment using future sludge projections to evaluate whether the current Process Composition Control System (PCCS) algorithms would provide projected operating windows that allow future contractual WL targets to be met. More specifically, the objective of this study was to evaluate future sludge batch projections (based on Revision 16 of the HLW Systems Plan) with respect to projected operating windows using current PCCS models and associated constraints.
Based on the assessments, the waste loading interval over which a glass system (i.e., a projected sludge composition with a candidate frit) is predicted to be acceptable can be defined (i.e., the projected operating window), which provides insight into the ability to meet future contractual WL obligations. In this study, future contractual WL obligations are assumed to be 40%, which is the goal after all flowsheet enhancements have been implemented to support DWPF operations. For a system to be considered acceptable, candidate frits must be identified that provide access to at least 40% WL while accounting for potential variation in the sludge resulting from differences in batch-to-batch transfers into the Sludge Receipt and Adjustment Tank (SRAT) and/or analytical uncertainties. In more general terms, this study will assess whether or not the current glass formulation strategy (based on the use of the Nominal and Variation Stage assessments) and current PCCS models will allow access to the compositional regions required to target higher WLs for future operations. Some of the key questions to be considered in this study include: (1) If higher WLs are attainable with current process control models, are the models valid in these compositional regions? If the higher WL glass regions are outside current model development or validation ranges, is there existing data that could be used to demonstrate model applicability (or lack thereof)? If not, experimental data may be required to revise current models or serve as validation data with the existing models. (2) Are there compositional trends in frit space that are required by the PCCS models to obtain access to these higher WLs? If so, are there potential issues with the compositions of the associated frits (e.g., limitations on the B2O3 and/or Li2O concentrations) as they are compared to model development/validation ranges or to the term 'borosilicate' glass?
(3) If limitations on the frit compositional range are realized, what is the impact of these restrictions on other glass properties, such as the ability to suppress nepheline formation or influence melt rate? The model-based assessments being performed assume that the process control models are applicable over the glass compositional regions being evaluated. Although the glass compositional region of interest is ultimately defined by the specific frit, sludge, and WL interval used, there is no prescreening of these compositional regions with respect to the model development or validation ranges, which is consistent with current DWPF operations.
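The "projected operating window" concept reduces to a simple scan: blend a sludge and a frit composition over a range of waste loadings and keep the loadings whose blended glass satisfies every property constraint. The sketch below encodes that scan with purely hypothetical oxide vectors and limits; the real PCCS constraints are property models, not simple oxide bounds.

```python
# Hedged sketch of a projected-operating-window scan. The compositions and
# constraint limits are hypothetical stand-ins for real oxide vectors and
# the PCCS property models; only the scanning logic is illustrated.

def blend(sludge: dict, frit: dict, wl: float) -> dict:
    """Glass composition (mass fractions) at waste loading wl."""
    oxides = set(sludge) | set(frit)
    return {ox: wl * sludge.get(ox, 0.0) + (1.0 - wl) * frit.get(ox, 0.0)
            for ox in oxides}

def operating_window(sludge, frit, constraints):
    """(min, max) WL in the 20-60% scan passing every constraint, or None."""
    ok = [wl / 100.0 for wl in range(20, 61)
          if all(lo <= blend(sludge, frit, wl / 100.0).get(ox, 0.0) <= hi
                 for ox, (lo, hi) in constraints.items())]
    return (min(ok), max(ok)) if ok else None

# Hypothetical example: the window closes from above because the sludge's
# Al2O3 grows with WL, and from below because the frit's B2O3 dilutes away.
sludge = {"Fe2O3": 0.30, "Al2O3": 0.20, "B2O3": 0.00}
frit = {"SiO2": 0.70, "B2O3": 0.20, "Li2O": 0.10}
constraints = {"B2O3": (0.05, 0.20), "Al2O3": (0.00, 0.10)}
window = operating_window(sludge, frit, constraints)
```

A frit "provides access" to a 40% WL target, in this toy framing, exactly when 0.40 falls inside the returned window.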
78 FR 79061 - Noise Exposure Map Notice; Key West International Airport, Key West, FL
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-27
..., Flight Track Utilization by Aircraft Category for East Flow Operations; Table 4-3, Flight Track Utilization by Aircraft Category for West Flow Operations; Table 4-4, 2013 Air Carrier Flight Operations; Table 4-5, 2013 Commuter and Air Taxi Flight Operations; Table 4-6, 2013 Average Daily Engine Run-Up...
Toshiba TDF-500 High Resolution Viewing And Analysis System
NASA Astrophysics Data System (ADS)
Roberts, Barry; Kakegawa, M.; Nishikawa, M.; Oikawa, D.
1988-06-01
A high resolution, operator-interactive, medical viewing and analysis system has been developed by Toshiba and Bio-Imaging Research. This system provides many advanced features including high resolution displays, a very large image memory and advanced image processing capability. In particular, the system provides CRT frame buffers capable of update in one frame period, an array processor capable of image processing at operator-interactive speeds, and a memory system capable of updating multiple frame buffers at frame rates whilst supporting multiple array processors. The display system provides 1024 x 1536 display resolution at 40Hz frame and 80Hz field rates. In particular, the ability to provide whole or partial update of the screen at the scanning rate is a key feature. This allows multiple viewports or windows in the display buffer with both fixed and cine capability. To support image processing features such as windowing, pan, zoom, minification, filtering, ROI analysis, multiplanar and 3D reconstruction, a high performance CPU is integrated into the system. This CPU is an array processor capable of up to 400 million instructions per second. To support the multiple viewers' and array processors' instantaneous high memory bandwidth requirement, an ultra-fast memory system is used. This memory system has a bandwidth capability of 400MB/sec and a total capacity of 256MB. This bandwidth is more than adequate to support several high resolution CRTs and also the fast processing unit. This fully integrated approach allows effective real-time image processing. The integrated design of the viewing system, memory system and array processor is key to the imaging system, and it is the intention of this paper to describe its architecture.
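The bandwidth claim is easy to sanity-check: a 1024 x 1536 display refreshed at 40 frames/s needs roughly 63 MB/s at one byte per pixel, so a 400 MB/s memory system can indeed feed several such displays with headroom for the array processor. The pixel depth is an assumption here; the abstract does not state it.

```python
# Quick arithmetic check of the 400 MB/s claim, assuming 8-bit pixels
# (the pixel depth is not stated in the abstract).

def display_bandwidth_mb(width: int, height: int, fps: int,
                         bytes_per_pixel: int = 1) -> float:
    """Refresh bandwidth in MB/s for one display."""
    return width * height * fps * bytes_per_pixel / 1e6

bw = display_bandwidth_mb(1024, 1536, 40)  # ~62.9 MB/s per display
displays_supported = int(400 / bw)         # full-rate displays on 400 MB/s
```

At deeper pixel formats (e.g. 16-bit grayscale) the per-display figure doubles, which still leaves room for more than one display plus processing traffic.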
NASA Astrophysics Data System (ADS)
Haeffelin, Martial
2016-04-01
Radiation fog formation is largely influenced by the chemical composition, size and number concentration of cloud condensation nuclei and by heating/cooling and drying/moistening processes in a shallow mixing layer near the surface. Once a fog water layer is formed, its development and dissipation become predominantly controlled by radiative cooling/heating, turbulent mixing, sedimentation and deposition. Key processes occur in the atmospheric surface layer, directly in contact with the soil and vegetation, and throughout the atmospheric column. Recent publications provide detailed descriptions of these processes for idealized cases using very high-resolution models and proper representation of microphysical processes. Studying these processes in real fog situations requires atmospheric profiling capabilities to monitor the temporal evolution of key parameters at several heights (surface, inside the fog, fog top, free troposphere). This could be done with in-situ sensors flown on tethered balloons or drones during dedicated intensive field campaigns. In addition, Backscatter Lidars, Doppler Lidars, Microwave Radiometers and Cloud Doppler Radars can provide more continuous, yet precise, monitoring of key parameters throughout the fog life cycle. The presentation will describe how Backscatter Lidars can be used to study the height and kinetics of aerosol activation into fog droplets. Next we will show the potential of Cloud Doppler Radar measurements to characterize the temporal evolution of droplet size, liquid water content, sedimentation and deposition. Contributions from Doppler Lidars and Microwave Radiometers will be discussed. This presentation will conclude on the potential of Lidar and Radar remote sensing measurements to support operational fog nowcasting.
Control and manipulation of antiferromagnetic skyrmions in racetrack
NASA Astrophysics Data System (ADS)
Xia, Haiyan; Jin, Chendong; Song, Chengkun; Wang, Jinshuai; Wang, Jianbo; Liu, Qingfang
2017-12-01
Controllable manipulation of magnetic skyrmions is essential for next-generation spintronic devices. Here, the duplication and merging of skyrmions, as well as logical AND and OR functions, are designed in antiferromagnetic (AFM) materials with cusp or smooth Y-junction structures. The operational times are on the order of tens of picoseconds, enabling ultrafast information processing. A key factor for successful operation is the relatively complex Y-junction structure, through which domain walls propagate in a controlled manner without significant risk of pinning, vanishing, or unwanted depinning of existing domain walls, or of the nucleation of new domain walls. The motion of a multi-bit carrier, namely an AFM skyrmion chain in a racetrack, is also investigated. These micromagnetic simulations may contribute to future AFM skyrmion-based spintronic devices, such as nanotrack memories, logic gates, and other information-processing elements.
NASA Astrophysics Data System (ADS)
Nakai, S.; Yamanaka, M.; Kitagawa, Y.; Fujita, K.; Heya, M.; Mima, K.; Izawa, Y.; Nakatsuka, M.; Murakami, M.; Ueda, K.; Sasaki, T.; Mori, Y.; Kanabe, T.; Hiruma, T.; Kan, H.; Kawashima, T.
2006-06-01
The typical specifications of the laser driver for a commercial IFE power plant are (1) total energy (MJ/pulse) with a tailored 20-40 ns pulse, (2) repetitive operation (~10 Hz), and (3) efficiency (~10%) with sufficient robustness and low cost. The key elements of the DPSSL driver technology are under development with HALNA. The HALNA 10 (High Average-power Laser for Nuclear-fusion Application) demonstrated 10 J × 10 Hz operation, and the HALNA 100 (100 J × 10 Hz) is now under construction. Using these high-average-power, high-intensity lasers, new industrial applications are being pursued. The collaborative process with industry, both for the development of high-power lasers and for their industrial applications, is effective and essential in the development of the laser driver for an IFE power plant.
Numerical Investigation of Novel Oxygen Blast Furnace Ironmaking Processes
NASA Astrophysics Data System (ADS)
Li, Zhaoyang; Kuang, Shibo; Yu, Aibing; Gao, Jianjun; Qi, Yuanhong; Yan, Dingliu; Li, Yuntao; Mao, Xiaoming
2018-04-01
The oxygen blast furnace (OBF) ironmaking process has the potential to realize "zero carbon footprint" production, but suffers from a "thermal shortage" problem. This paper presents three novel OBF processes, featured by belly injection of reformed coke oven gas, burden hot-charge operation, and their combination, respectively. These processes were studied by a multifluid process model. The applicability of the model was confirmed by comparing the numerical results against the measured key performance indicators of an experimental OBF operated with or without injection of reformed coke oven gas. Then, these different OBF processes, together with a pure OBF, were numerically examined in terms of in-furnace states and global performance, assuming that the burden quality can be maintained during hot-charge operation. The numerical results show that, under the present conditions, belly injection and hot charge, as auxiliary measures, are useful for reducing the fuel rate and increasing the productivity of OBFs, but in different manners. Hot charge should be more suitable for OBFs of different sizes because it improves the thermochemical states throughout the dry zone rather than within a narrow region, as in the case of belly injection. The simultaneous application of belly injection and hot charge leads to the best process performance while allowing a lower hot-charge temperature to achieve the same carbon consumption and hot metal temperature as hot charge alone. This feature will be practically beneficial in the application of hot-charge operation. In addition, a systematic study of hot-charge temperature reveals that optimal hot-charge temperatures can be identified according to the utilization efficiency of the sensible heat of the hot burden.
Eisenstein, Eric L; Diener, Lawrence W; Nahm, Meredith; Weinfurt, Kevin P
2011-12-01
New technologies may be required to integrate the National Institutes of Health's Patient Reported Outcome Management Information System (PROMIS) into multi-center clinical trials. To better understand this need, we identified likely PROMIS reporting formats, developed a multi-center clinical trial process model, and identified gaps between current capabilities and those necessary for PROMIS. These results were evaluated by key trial constituencies. Issues reported by principal investigators fell into two categories: acceptance by key regulators and the scientific community, and usability for researchers and clinicians. Issues reported by the coordinating center, participating sites, and study subjects were those faced when integrating new technologies into existing clinical trial systems. We then defined elements of a PROMIS Tool Kit required for integrating PROMIS into a multi-center clinical trial environment. The requirements identified in this study serve as a framework for future investigators in the design, development, implementation, and operation of PROMIS Tool Kit technologies.
Chaotic Image Encryption Algorithm Based on Bit Permutation and Dynamic DNA Encoding.
Zhang, Xuncai; Han, Feng; Niu, Ying
2017-01-01
Exploiting the sensitivity of chaos to initial conditions and its pseudorandomness, combined with the inherent and unique information-processing ability of the DNA molecule's spatial configurations, a novel image encryption algorithm based on bit permutation and dynamic DNA encoding is proposed here. The algorithm first uses Keccak to calculate the hash value of a given DNA sequence as the initial value of a chaotic map; second, it uses a chaotic sequence to scramble the image pixel locations, and a butterfly network is used to implement the bit permutation. Then, the image is dynamically coded into a DNA matrix, and an algebraic operation is performed with the DNA sequence to realize the substitution of the pixels, which further improves the security of the encryption. Finally, the confusion and diffusion properties of the algorithm are further enhanced by operations on the DNA sequence and by ciphertext feedback. The results of the experiment and security analysis show that the algorithm not only has a large key space and strong sensitivity to the key but can also effectively resist attacks such as statistical analysis and exhaustive search.
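The hash-seeded scrambling stage described in this record can be sketched in a few lines; this is a much-simplified illustration, not the paper's algorithm: SHA3-256 stands in for Keccak, and the butterfly-network bit permutation, DNA coding, and diffusion stages are omitted:

```python
import hashlib
import numpy as np

def logistic_sequence(x0, n, r=3.99):
    """Chaotic logistic-map sequence seeded from a hash-derived x0."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        xs[i] = x
    return xs

def scramble(image, dna_seed):
    """Permute pixel positions using a chaos-derived permutation."""
    # Hash the DNA seed (SHA3-256 stands in for Keccak) to get x0 in (0, 1).
    digest = hashlib.sha3_256(dna_seed).digest()
    x0 = (int.from_bytes(digest[:8], "big") % (2**53 - 1) + 1) / 2**53
    flat = image.ravel()
    order = np.argsort(logistic_sequence(x0, flat.size))  # permutation
    return flat[order].reshape(image.shape), order

def unscramble(scrambled, order):
    """Invert the permutation to recover the original pixel layout."""
    flat = np.empty(scrambled.size, dtype=scrambled.dtype)
    flat[order] = scrambled.ravel()
    return flat.reshape(scrambled.shape)

img = np.arange(16, dtype=np.uint8).reshape(4, 4)
enc, order = scramble(img, b"ATCGGATC")
assert np.array_equal(unscramble(enc, order), img)  # lossless round trip
```

Because scrambling only permutes positions, the pixel histogram is unchanged; the paper's DNA substitution and ciphertext-feedback stages are what alter pixel values and provide diffusion.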
McCoy, Ryan J; O'Brien, Fergal J
2012-12-01
Tissue engineering approaches to developing functional substitutes are often highly complex, multivariate systems where many aspects of the biomaterials, bio-regulatory factors or cell sources may be controlled in an effort to enhance tissue formation. Furthermore, success is based on multiple performance criteria reflecting both the quantity and quality of the tissue produced. Managing the trade-offs between different performance criteria is a challenge. A "windows of operation" tool that graphically represents feasible operating spaces to achieve user-defined levels of performance has previously been described by researchers in the bio-processing industry. This paper demonstrates the value of "windows of operation" to the tissue engineering field using a perfusion-scaffold bioreactor system as a case study. In our laboratory, perfusion bioreactor systems are utilized in the context of bone tissue engineering to enhance the osteogenic differentiation of cell-seeded scaffolds. A key challenge of such perfusion bioreactor systems is to maximize the induction of osteogenesis while minimizing cell detachment from the scaffold. Two key operating variables that influence these performance criteria are the mean scaffold pore size and flow-rate. Using cyclooxygenase-2 and osteopontin gene expression levels as surrogate indicators of osteogenesis, we employed the "windows of operation" methodology to rapidly identify feasible operating ranges for the mean scaffold pore size and flow-rate that achieved user-defined levels of performance for cell detachment and differentiation. Incorporation of such tools into the tissue engineer's armory will hopefully yield a greater understanding of the highly complex systems used and aid decision making in future translation of products from the benchtop to the marketplace. Copyright © 2012 Wiley Periodicals, Inc.
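Computationally, a "windows of operation" analysis reduces to a feasibility mask over a grid of operating variables; a minimal sketch, in which the two response surfaces and thresholds below are invented for illustration and are not the paper's data:

```python
import numpy as np

# Grid of the two operating variables from the case study.
pore_sizes = np.linspace(100, 400, 61)   # mean scaffold pore size, um
flow_rates = np.linspace(0.1, 5.0, 50)   # perfusion flow rate, mL/min
P, F = np.meshgrid(pore_sizes, flow_rates)

# Invented response surfaces standing in for measured performance criteria.
osteogenesis = F * (P / 400.0)   # proxy: rises with shear stimulus
detachment = F**2 / P            # proxy: rises quickly with flow rate

# User-defined performance thresholds carve out the feasible window.
window = (osteogenesis >= 1.0) & (detachment <= 0.05)
print(f"{window.mean():.0%} of the operating grid is feasible")
```

Plotting `window` as a filled contour over pore size and flow rate reproduces the kind of graphical operating-space map the tool provides, and makes the trade-off between the two criteria directly visible.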
Reducing intraoperative red blood cell unit wastage in a large academic medical center.
Whitney, Gina M; Woods, Marcella C; France, Daniel J; Austin, Thomas M; Deegan, Robert J; Paroskie, Allison; Booth, Garrett S; Young, Pampee P; Dmochowski, Roger R; Sandberg, Warren S; Pilla, Michael A
2015-11-01
The wastage of red blood cell (RBC) units within the operative setting results in significant direct costs to health care organizations. Previous education-based efforts to reduce wastage were unsuccessful at our institution. We hypothesized that a quality and process improvement approach would result in sustained reductions in intraoperative RBC wastage in a large academic medical center. Utilizing a failure mode and effects analysis supplemented with time and temperature data, key drivers of perioperative RBC wastage were identified and targeted for process improvement. Multiple contributing factors, including improper storage and transport and a lack of accurate, locally relevant RBC wastage event data, were identified as significant contributors to ongoing intraoperative RBC unit wastage. Testing and implementation of improvements to the process of transport and storage of RBC units occurred in liver transplant and adult cardiac surgical areas due to their history of disproportionately high RBC wastage rates. Process interventions targeting local drivers of RBC wastage resulted in a significant reduction in RBC wastage (p < 0.0001; adjusted odds ratio, 0.24; 95% confidence interval, 0.15-0.39), despite an increase in operative case volume over the period of the study. The studied process interventions were then introduced incrementally in the remainder of the perioperative areas. These results show that a multidisciplinary team focused on the process of blood product ordering, transport, and storage was able to significantly reduce operative RBC wastage and its associated costs using quality and process improvement methods. © 2015 AABB.
ERIC Educational Resources Information Center
Mirzeoglu, Ayse Dilsad
2014-01-01
This study is related to one of the teaching models, peer teaching which is used in physical education courses. The fundamental feature of peer teaching is defined "to structure a learning environment in which some students assume and carry out many of the key operations of instruction to assist other students in the learning process".…
NASA Technical Reports Server (NTRS)
Schultz, Chris; Carey, Larry; Schultz, Elise V.; Stano, Geoffrey; Gatlin, Patrick N.; Kozlowski, Danielle M.; Blakeslee, Rich J.; Goodman, Steve
2013-01-01
Key points this analysis will address: (1) What is physically going on in the cloud when there is a jump in lightning (updraft variations, ice fluxes)? (2) How do these processes fit in with severe storm conceptual models? (3) What would this information provide to an end user? The analysis relates the lightning jump algorithm (LJA) to radar observations, such as changes in reflectivity, MESH, and VIL, based on multi-Doppler derived physical relationships.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olsen, Tim; Preus, Robert
Site assessment for small wind energy systems is one of the key factors in the successful installation, operation, and performance of a small wind turbine. A proper site assessment is a demanding process that includes wind resource assessment and the evaluation of site characteristics. These guidelines address many of the relevant parts of a site assessment, with an emphasis on assessing the wind resource using methods other than on-site data collection, and on creating a small wind site assessment report.
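When on-site data collection is ruled out, a common first step is to extrapolate a nearby reference measurement to hub height with a power-law shear profile and estimate the wind power density; a sketch with illustrative values (the shear exponent, heights, and speeds below are assumptions, not figures from the guidelines):

```python
def speed_at_height(v_ref, h_ref, h_hub, alpha=0.20):
    """Power-law wind shear: extrapolate a reference speed to hub height.
    alpha ~0.14-0.25 depending on terrain roughness (assumed here)."""
    return v_ref * (h_hub / h_ref) ** alpha

def power_density(v, rho=1.225):
    """Wind power density in W/m^2; rho is sea-level air density, kg/m^3."""
    return 0.5 * rho * v**3

# Illustrative: 5 m/s measured at 10 m, turbine hub at 30 m.
v_hub = speed_at_height(v_ref=5.0, h_ref=10.0, h_hub=30.0)
print(f"estimated hub-height speed: {v_hub:.2f} m/s")
print(f"estimated power density:    {power_density(v_hub):.0f} W/m^2")
```

Because power scales with the cube of wind speed, even the modest speed gain from a taller tower translates into a substantially higher power density, which is why height extrapolation matters so much in small wind site assessment.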
Reducing waste and errors: piloting lean principles at Intermountain Healthcare.
Jimmerson, Cindy; Weber, Dorothy; Sobek, Durward K
2005-05-01
The Toyota Production System (TPS), based on industrial engineering principles and operational innovations, is used to achieve waste reduction and efficiency while increasing product quality. Several key tools and principles, adapted to health care, have proved effective in improving hospital operations. Value Stream Maps (VSMs), which represent the key people, material, and information flows required to deliver a product or service, distinguish between value-adding and non-value-adding steps. The one-page Problem-Solving A3 Report guides staff through a rigorous and systematic problem-solving process. In a pilot project at Intermountain Healthcare, participants made many improvements, ranging from simple changes implemented immediately (for example, heart monitor paper not being available when a patient presented with a dysrhythmia) to larger projects involving patient or information flow issues across multiple departments. Most of the improvements required little or no investment and reduced significant amounts of wasted time for front-line workers. In one unit, turnaround time for pathologist reports from an anatomical pathology lab was reduced from five days to two. TPS principles and tools are applicable to an endless variety of processes and work settings in health care and can be used to address critical challenges such as medical errors, escalating costs, and staffing shortages.
Belval, Richard; Alamir, Ab; Corte, Christopher; DiValentino, Justin; Fernandes, James; Frerking, Stuart; Jenkins, Derek; Rogers, George; Sanville-Ross, Mary; Sledziona, Cindy; Taylor, Paul
2012-12-01
Boehringer Ingelheim's Automated Liquids Processing System (ALPS) in Ridgefield, Connecticut, was built to accommodate all compound solution-based operations following dissolution in neat DMSO. Process analysis resulted in the design of two nearly identical conveyor-based subsystems, each capable of executing 1400 × 384-well plate or punch tube replicates per batch. The two parallel-positioned subsystems can execute independently or, alternatively, as a unified system for more complex or higher throughput processes. Primary ALPS functions include creation of high-throughput screening plates, concentration-response plates, and reformatted master stock plates (e.g., 384-well plates from 96-well plates). Integrated operations include centrifugation, unsealing/piercing, broadcast diluent addition, barcode print/application, compound transfer/mix via disposable pipette tips, and plate sealing. Key ALPS features include instrument pooling for increased capacity or fail-over situations, programming constructs to associate one source plate with an array of replicate plates, and stacked collation of completed plates. Due to the hygroscopic nature of DMSO, ALPS was designed to operate within a 10% relative humidity environment. The activities described are the collaborative efforts in specification, build, delivery, and acceptance testing between Boehringer Ingelheim Pharmaceuticals, Inc. and the automation integration vendor, Thermo Scientific Laboratory Automation (Burlington, ON, Canada).
NASA Astrophysics Data System (ADS)
Lopez, Bernhard; Whyborn, Nicholas D.; Guniat, Serge; Hernandez, Octavio; Gairing, Stefan
2016-07-01
The Atacama Large Millimeter/submillimeter Array (ALMA) is a joint project between astronomical organizations in Europe, North America, and East Asia, in collaboration with the Republic of Chile. ALMA consists of 54 twelve-meter antennas and 12 seven-meter antennas operating as an aperture synthesis array in the (sub)millimeter wavelength range. Since the inauguration of the observatory in March 2013, there has been a continuous effort to establish solid operations processes for effective and efficient management of technical and administrative tasks on site. A key aspect has been centralized maintenance and operations planning: input is collected from science stakeholders, the computerized maintenance management system (CMMS), and the technical teams spread around the world; this information is then analyzed and consolidated based on the established maintenance strategy, the observatory long-term plan, and short-term priorities. This paper presents the high-level process that has been developed for the planning and scheduling of planned and unplanned maintenance tasks, and for site operations such as the telescope array reconfiguration campaigns. We focus on the centralized planning approach by presenting its genesis and its current implementation for observatory operations, including related planning products, and we explore the next steps necessary to fully achieve a comprehensive centralized planning approach for ALMA in steady-state operations.
Present and future free-space quantum key distribution
NASA Astrophysics Data System (ADS)
Nordholt, Jane E.; Hughes, Richard J.; Morgan, George L.; Peterson, C. Glen; Wipf, Christopher C.
2002-04-01
Free-space quantum key distribution (QKD), more popularly known as quantum cryptography, uses single-photon free-space optical communications to distribute the secret keys required for secure communications. At Los Alamos National Laboratory we have demonstrated a fully automated system that is capable of operation at any time of day over a horizontal range of several kilometers. This has proven that the technology is capable of operation from a spacecraft to the ground, opening up the possibility of QKD between any group of users anywhere on Earth. This system, the prototyping of a new system for use on a spacecraft, and the techniques required for worldwide quantum key distribution will be described. The operational parameters and performance of a system designed to operate between low Earth orbit (LEO) and the ground will also be discussed.
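The key-sifting step at the heart of such QKD systems (the BB84 protocol) is easy to sketch classically; this is an idealized toy model assuming a noiseless channel and no eavesdropper, not the Los Alamos implementation:

```python
import random

def bb84_sift(n_pulses, seed=42):
    """Idealized BB84 sifting: keep only the bits where sender and
    receiver happened to choose the same measurement basis (~50%)."""
    rng = random.Random(seed)
    alice_bits = [rng.randint(0, 1) for _ in range(n_pulses)]
    alice_bases = [rng.randint(0, 1) for _ in range(n_pulses)]
    bob_bases = [rng.randint(0, 1) for _ in range(n_pulses)]
    # With matching bases (and no noise or eavesdropper), Bob reads
    # Alice's bit exactly; mismatched-basis results are discarded.
    return [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases)
            if a == b]

key = bb84_sift(1000)
print(f"sifted key: {len(key)} bits from 1000 pulses")
```

In a real free-space link the sifted key is further shrunk by channel loss, error correction, and privacy amplification, which is why the single-photon link budget dominates the achievable key rate from LEO.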
The Doubting System 1: Evidence for automatic substitution sensitivity.
Johnson, Eric D; Tubau, Elisabet; De Neys, Wim
2016-02-01
A long prevailing view of human reasoning suggests severe limits on our ability to adhere to simple logical or mathematical prescriptions. A key position assumes these failures arise from insufficient monitoring of rapidly produced intuitions. These faulty intuitions are thought to arise from a proposed substitution process, by which reasoners unknowingly interpret more difficult questions as easier ones. Recent work, however, suggests that reasoners are not blind to this substitution process, but in fact detect that their erroneous responses are not warranted. Using the popular bat-and-ball problem, we investigated whether this substitution sensitivity arises out of an automatic System 1 process or whether it depends on the operation of an executive resource demanding System 2 process. Results showed that accuracy on the bat-and-ball problem clearly declined under cognitive load. However, both reduced response confidence and increased response latencies indicated that biased reasoners remained sensitive to their faulty responses under load. Results suggest that a crucial substitution monitoring process is not only successfully engaged, but that it automatically operates as an autonomous System 1 process. By signaling its doubt along with a biased intuition, it appears System 1 is "smarter" than traditionally assumed.
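The bat-and-ball problem makes the substitution concrete: "a bat and a ball cost $1.10; the bat costs $1.00 more than the ball" invites the intuitive but wrong answer of 10 cents, because reasoners substitute the easier question "what is 1.10 minus 1.00?". Solving the two stated constraints directly:

```python
# bat + ball = 1.10  and  bat = ball + 1.00
# => (ball + 1.00) + ball = 1.10  =>  2 * ball = 0.10
ball = (1.10 - 1.00) / 2
bat = ball + 1.00
print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")  # 5 cents, not the intuitive 10
assert abs((bat + ball) - 1.10) < 1e-9          # constraints satisfied
```

The intuitive "10 cents" answer fails the first constraint (0.10 + 1.10 = 1.20), which is exactly the check the substitution-monitoring process described above appears to run automatically.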
Operator agency in process intervention: tampering versus application of tacit knowledge
NASA Astrophysics Data System (ADS)
Van Gestel, P.; Pons, D. J.; Pulakanam, V.
2015-09-01
Statistical process control (SPC) theory takes a negative view of adjustment of process settings, which is termed tampering. In contrast, quality and lean programmes actively encourage operators toward acts of intervention and personal agency in the improvement of production outcomes. This creates a conflict that requires operator judgement: how does one differentiate between unnecessary tampering and needful intervention? A further difficulty is that operators apply tacit knowledge to such judgements. There is a need to determine where in a given production process operators are applying tacit knowledge, and whether this is hindering or aiding quality outcomes. The work involved the conjoint application of systems engineering, statistics, and knowledge management principles in the context of a case study. Systems engineering was used to create a functional model of a real plant. Actual plant data were analysed with the statistical methods of ANOVA, feature selection, and link analysis. This identified the variables to which the output quality was most sensitive. These key variables were mapped back to the functional model. Fieldwork was then directed to those areas to prospect for operator judgement activities. A natural conversational approach was used to determine where and how operators were applying judgement, in contrast to the interrogative approach of conventional knowledge management. Data are presented for a case study of a meat rendering plant. The results identify specific areas where operators' tacit knowledge and mental models contribute to quality outcomes and untangle the motivations behind their agency. Also evident is how novice and expert operators apply their knowledge differently. Novices were focussed on meeting throughput objectives, and their incomplete understanding of the plant characteristics led them to inadvertently sacrifice quality in the pursuit of productivity in certain situations.
Operators' responses to the plant are affected by their individual mental models of the plant, which differ between operators and have variable validity. Their behaviour is also affected by differing interpretations of how their personal agency should be applied to the achievement of production objectives. The methodology developed here is an integration of systems engineering, statistical analysis, and knowledge management. It shows how to determine where in a given production process operator intervention is occurring, how it affects quality outcomes, and what tacit knowledge operators are using. It thereby assists continuous quality improvement processes in a different way to SPC. A second contribution is the provision of a novel methodology for knowledge management, one that circumvents the usual codification barriers to knowledge management.
NASA Technical Reports Server (NTRS)
Soeder, James F.; Pinero, Luis; Scheidegger, Robert; Dunning, John; Birchenough, Art
2012-01-01
NASA's Evolutionary Xenon Thruster (NEXT) project is developing an advanced ion propulsion system for future NASA solar system exploration missions. A critical element of the propulsion system is the Power Processing Unit (PPU), which supplies regulated power to the key components of the thruster. The PPU contains six different power supplies: the beam, discharge, discharge heater, neutralizer, neutralizer heater, and accelerator supplies. The beam supply is the largest and processes over 93% of the power. The NEXT PPU had been operated for more than 200 hours and experienced a series of three capacitor failures in the beam supply. The capacitors are in the same, nominally non-critical location: the input filter capacitor to a full-wave switching inverter. The three failures occurred after about 20, 30, and 135 hours of operation. This paper provides background on the NEXT PPU and the capacitor failures. It discusses the failure investigation approach, the beam supply power switching topology and its operating modes, capacitor characteristics, and circuit testing. Finally, it identifies the root cause of the failures as the unusual confluence of the circuit switching frequency, the physical layout of the power circuits, and the characteristics of the capacitor.
Extending the performance of KrF laser for microlithography by using novel F2 control technology
NASA Astrophysics Data System (ADS)
Zambon, Paolo; Gong, Mengxiong; Carlesi, Jason; Padmabandu, Gunasiri G.; Binder, Mike; Swanson, Ken; Das, Palash P.
2000-07-01
Exposure tools for 248 nm lithography have reached a level of maturity comparable to those based on i-line. With this maturity comes a concomitant requirement from process engineers for greater flexibility from the laser, usually pertaining to energy, spectral width, and repetition rate. By utilizing a combination of laser parameters, process engineers are often able to optimize throughput, reduce cost of operation, or achieve greater process margin. Hitherto, such flexibility of laser operation was possible only via significant changes to various laser modules. During our investigation, we found that the key laser property impacting the aforementioned parameters is its F2 concentration. By monitoring and controlling the laser's slope efficiency, its F2 concentration may be precisely controlled. Thus a laser may be tuned to operate under specifications as diverse as 7 mJ with Δλ (FWHM) < 0.3 pm, or 10 mJ with Δλ (FWHM) < 0.6 pm, and still meet the host of requirements necessary for lithography. We discuss this new F2 control technique and highlight some laser performance parameters.
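The control idea can be sketched as a simple feedback loop: estimate slope efficiency from output measurements and inject F2 whenever it drops below target. The controller gain, the toy plant response, and the consumption rate below are all invented for illustration; they are not the paper's values:

```python
def f2_control_step(slope_eff, target=0.9, gain=0.05):
    """One step of a proportional controller on slope efficiency:
    returns the F2 injection amount (arbitrary units, illustrative)."""
    error = target - slope_eff
    return max(0.0, gain * error)   # inject only, never extract

# Toy plant: slope efficiency rises with F2 concentration and the
# concentration decays slowly as F2 is consumed during discharges.
conc, slope_eff = 1.0, 0.80
for shot in range(50):
    conc += f2_control_step(slope_eff)
    conc *= 0.999                       # F2 consumption per shot (assumed)
    slope_eff = min(1.0, 0.8 * conc)    # invented plant response
print(f"slope efficiency after 50 shots: {slope_eff:.3f}")
```

The loop settles where injection balances consumption, which mirrors the paper's premise: holding slope efficiency at a setpoint is an indirect but precise way of holding F2 concentration.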
Methods to speed up the gain recovery of an SOA
NASA Astrophysics Data System (ADS)
Wang, Zhi; Wang, Yongjun; Meng, Qingwen; Zhao, Rui
2008-01-01
Semiconductor optical amplifiers (SOAs) are employed in all-optical networking and all-optical signal processing due to their excellent nonlinearity and high speed. The gain recovery time is the key parameter describing the response speed of an SOA. The relationship between the gain dynamics and a few operating parameters is obtained in this article. Simple formulas and simulations are presented, from which several methods to improve the response speed of the SOA follow: lengthening the active region, reducing its cross-sectional area, increasing the injection current, increasing the probe power, or operating with a CW holding beam.
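The gain dynamics discussed here are commonly captured by the standard SOA rate equation dg/dt = (g0 − g)/τc − g·P/Esat, where τc is the carrier lifetime and Esat the saturation energy. A sketch integrating it to estimate the recovery time; the parameter values are illustrative, not taken from this article:

```python
def gain_recovery(g0=5.0, tau_c=200e-12, E_sat=1e-12, P=0.0,
                  g_init=1.0, dt=1e-12, t_max=1e-9):
    """Euler-integrate dg/dt = (g0 - g)/tau_c - g*P/E_sat and return the
    time for the gain to recover to 90% of its small-signal value g0."""
    g, t = g_init, 0.0
    while t < t_max:
        if g >= 0.9 * g0:
            return t
        g += dt * ((g0 - g) / tau_c - g * P / E_sat)
        t += dt
    return None  # did not recover within t_max

t_rec = gain_recovery()
print(f"90% gain recovery in {t_rec * 1e12:.0f} ps")
```

Rerunning with a shorter carrier lifetime, e.g. `gain_recovery(tau_c=100e-12)`, roughly halves the recovery time, which is the mechanism behind the article's suggestion that higher injection current (which shortens the effective carrier lifetime) speeds up the SOA.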
Sexy splicing: regulatory interplays governing sex determination from Drosophila to mammals.
Lalli, Enzo; Ohe, Kenji; Latorre, Elisa; Bianchi, Marco E; Sassone-Corsi, Paolo
2003-02-01
A remarkable array of strategies is used to produce sexual differentiation in different species. Complex gene hierarchies govern sex determination pathways, as exemplified by the classic D. melanogaster paradigm, where an interplay of transcriptional, splicing, and translational mechanisms operates. Molecular studies support the hypothesis that genetic sex determination pathways evolved in reverse order, from downstream to upstream genes in the cascade. The recent identification of a role for the key regulatory factors SRY and WT1(+KTS) in pre-mRNA splicing indicates that important steps in the mammalian sex determination process are likely to operate at the post-transcriptional level.
Laboratory Information Systems.
Henricks, Walter H
2015-06-01
Laboratory information systems (LISs) supply mission-critical capabilities for the vast array of information-processing needs of modern laboratories. LIS architectures include mainframe, client-server, and thin client configurations. The LIS database software manages a laboratory's data. LIS dictionaries are database tables that a laboratory uses to tailor an LIS to the unique needs of that laboratory. Anatomic pathology LIS (APLIS) functions play key roles throughout the pathology workflow, and laboratories rely on LIS management reports to monitor operations. This article describes the structure and functions of APLISs, with emphasis on their roles in laboratory operations and their relevance to pathologists. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Edler, H. G.
1978-01-01
Potential organizational options for a solar power satellite system (SPS) were investigated. Selection and evaluation criteria were determined to include timeliness, reliability, and adequacy to contribute meaningfully to the U.S. supply; political feasibility (both national and international); and cost effectiveness (including environmental and other external costs). Based on these criteria, four organizational alternatives appeared to offer reasonable promise as potential options for SPS. A large number of key issues emerged as factors that would influence the final selection process. Among these issues were a variety having to do with international law, international institutions, environmental controls, economics, operational flexibility, congressional policies, commercial-versus-governmental ownership, national dedication, and national and operational strategic issues.
Maulik, Pallab K; Kallakuri, Sudha; Devarapalli, Siddhardha
2018-01-01
Background: There are large gaps in the delivery of mental health care in low- and middle-income countries such as India, and the problems are even more acute in rural settings owing to lack of resources, remoteness, and poor infrastructure, among other factors. The Systematic Medical Appraisal Referral and Treatment (SMART) Mental Health Project was conceived as a mental health services delivery model using technology-based solutions for rural India. This paper reports on the operational strategies used to facilitate the implementation of the intervention. Method: Key components of the SMART Mental Health Project included an anti-stigma campaign; training of primary health workers in screening, diagnosing, and managing stress, depression, and increased suicide risk; task sharing of responsibilities in delivering care; and mobile-technology-based electronic decision support systems to support delivery of algorithm-based care for these disorders. The intervention was conducted in 42 villages across two sites in the state of Andhra Pradesh in south India. A pre-post mixed-methods evaluation was done, and operational challenges are reported in this paper. Results: Quantitative and qualitative results from the evaluation at one site covering about 5000 adults showed that the intervention was feasible and acceptable, and initial results indicated that it increased access to mental health care and reduced depression and anxiety symptoms. A number of strategies were initiated in response to operational challenges to ensure smoother conduct of the project and allowed it to be delivered as envisaged. Conclusions: The operational strategies initiated for this project were successful in ensuring delivery of the intervention. These strategies, coupled with other more systematic processes, have helped the researchers understand the key processes that need to be in place to design a more robust study that could eventually be scaled up.
NASA Astrophysics Data System (ADS)
Stegen, J.; Scheibe, T. D.; Chen, X.; Huang, M.; Arntzen, E.; Garayburu-Caruso, V. A.; Graham, E.; Johnson, T. C.; Strickland, C. E.
2017-12-01
The installation and operation of dams have myriad influences on ecosystems, from direct effects on hydrographs to indirect effects on marine biogeochemistry and terrestrial food webs. With > 50,000 existing and > 3,700 planned large dams worldwide, there is a pressing need for a holistic understanding of dam impacts. Such understanding is likely to reveal unrecognized opportunities to modify dam operations towards beneficial outcomes. One of the most dramatic influences of daily dam operations is the creation of `artificial intertidal zones' that emerge from short-term increases and decreases in discharge driven by hydroelectric power demands, a practice known as hydropeaking. There is a long history of studying the influences of hydropeaking on macrofauna such as fish and invertebrates, but only recently has significant attention been paid to the hydrobiogeochemical effects of hydropeaking. Our aim here is to develop an integrated conceptual model of the hydrobiogeochemical influences of hydropeaking. To do so, we reviewed available literature focusing on hydrologic and/or biogeochemical influences of hydropeaking. Results from these studies were collated into a single conceptual model that integrates key physical (e.g., sediment transport, hydromorphology) and biological (e.g., timescale of microbiome response) processes. This conceptual model highlights non-intuitive impacts of hydropeaking, the presence of critical thresholds, and strong interactions among processes. When examined individually these features suggest context dependency, but when viewed through an integrated conceptual model, common themes emerge. We will further discuss a critical next step, which is the local-to-regional-to-global evaluation of this conceptual model, to enable multiscale understanding. We specifically propose a global `hydropeaking network' of researchers using common methods, data standards, and analysis techniques to quantify the hydrobiogeochemical effects of hydropeaking across biomes.
We will conclude with a prospective discussion of key science questions that emerge from the conceptual model and that can only be answered through a global, synchronized effort. Such an effort has the potential to strongly influence dam operations towards improved health of river corridor ecosystems from local to global scales.
NASA Astrophysics Data System (ADS)
Katchasuwanmanee, Kanet; Cheng, Kai; Bateman, Richard
2016-09-01
As energy efficiency is one of the key essentials of sustainability, developing an energy- and resource-efficient manufacturing system is among the great challenges facing industry today. Meanwhile, advanced technological innovation has created more complex manufacturing systems involving a large variety of processes and machines serving different functions. To extend the limited knowledge on energy-efficient scheduling, the research presented in this paper models the production schedule at the operation-process level by balancing reduced energy consumption in production, production work flow (productivity), and quality. An innovative systematic approach to manufacturing energy-resource efficiency is proposed with virtual simulation as a predictive-modelling enabler, providing real-time manufacturing monitoring, virtual displays, decision-making support, and consequently a multidimensional correlation analysis of the interdependent relationships among energy consumption, work flow, and quality errors. The regression analysis results demonstrate positive relationships between work flow and quality errors and between work flow and energy consumption. When production scheduling is controlled by optimizing work flow, quality errors, and overall energy consumption, energy-resource efficiency can be achieved in production. Together, this multidimensional modelling and analysis approach yields optimal conditions for production scheduling in the manufacturing system by taking account of production quality, energy consumption, and resource efficiency, which can lead to key competitive advantages and sustainable system operations in industry.
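The regression analysis mentioned above can be sketched with a minimal ordinary-least-squares fit; the data points here are hypothetical stand-ins, not the paper's measurements.

```python
# Minimal ordinary-least-squares sketch of the kind of correlation analysis
# described above: fit energy = b0 + b1 * workflow and inspect the sign of b1.

def ols(xs, ys):
    """Return (intercept, slope) of the least-squares line through (xs, ys)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b1 = sxy / sxx
    b0 = my - b1 * mx
    return b0, b1

workflow = [10, 20, 30, 40, 50]          # parts per hour (hypothetical)
energy   = [5.1, 9.8, 15.2, 19.9, 25.3]  # kWh (hypothetical)
b0, b1 = ols(workflow, energy)           # b1 > 0: higher flow, more energy
```

A positive slope b1 corresponds to the positive work-flow/energy relationship the paper reports; in practice one would also test its statistical significance.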
Collective Framework and Performance Optimizations to Open MPI for Cray XT Platforms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ladd, Joshua S; Gorentla Venkata, Manjunath; Shamis, Pavel
2011-01-01
The performance and scalability of collective operations play a key role in the performance and scalability of many scientific applications. Within the Open MPI code base we have developed a general-purpose hierarchical collective operations framework called Cheetah, and applied it at large scale on the Oak Ridge Leadership Computing Facility's (OLCF) Jaguar platform, obtaining better performance and scalability than the native MPI implementation. This paper discusses Cheetah's design and implementation, and optimizations to the framework for Cray XT5 platforms. Our results show that Cheetah's Broadcast and Barrier perform better than the native MPI implementation. For medium data, Cheetah's Broadcast outperforms the native MPI implementation by 93% at 49,152 processes. For small and large data, it outperforms the native MPI implementation by 10% and 9%, respectively, at 24,576 processes. Cheetah's Barrier performs 10% better than the native MPI implementation at 12,288 processes.
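The hierarchical idea behind such a framework can be sketched without MPI: group ranks by node, broadcast to one leader per node, then fan out locally. This is an illustrative simulation of the message pattern only, not Cheetah's actual implementation.

```python
# Illustrative two-level broadcast in the spirit of a hierarchical collective
# framework: the root sends to one leader per node, and each leader forwards
# within its node. This keeps most traffic on fast intra-node links.
# Simplifying assumption: the root rank is itself a node leader.

def hierarchical_bcast(value, ranks_per_node, n_nodes, root=0):
    """Return {rank: value} plus the list of (src, dst) messages sent."""
    messages = []
    received = {root: value}
    leaders = [node * ranks_per_node for node in range(n_nodes)]
    # inter-node phase: root -> every other node leader
    for leader in leaders:
        if leader != root:
            messages.append((root, leader))
            received[leader] = value
    # intra-node phase: each leader -> its node-local ranks
    for leader in leaders:
        for r in range(leader, leader + ranks_per_node):
            if r not in received:
                messages.append((leader, r))
                received[r] = value
    return received, messages

received, msgs = hierarchical_bcast("payload", ranks_per_node=4, n_nodes=3)
```

With 3 nodes of 4 ranks, only 2 of the 11 messages cross the (slow) inter-node network; a flat broadcast from the root would put all 11 on it.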
The Standard Autonomous File Server, a Customized, Off-the-Shelf Success Story
NASA Technical Reports Server (NTRS)
Semancik, Susan K.; Conger, Annette M.; Obenschain, Arthur F. (Technical Monitor)
2001-01-01
The Standard Autonomous File Server (SAFS), which includes both off-the-shelf hardware and software, uses an improved automated file transfer process to provide quicker, more reliable, prioritized file distribution for customers of near-real-time data without interfering with the assets involved in the acquisition and processing of the data. It operates as a stand-alone solution, monitoring itself and providing an automated fail-over process to enhance reliability. This paper describes the unique problems and lessons learned both during the COTS selection and integration into SAFS and during the system's first year of operation in support of NASA's satellite ground network. COTS was the key factor in allowing the two-person development team to deploy systems in less than a year, meeting the required launch schedule. The SAFS system has been so successful that it is becoming a NASA standard resource, leading to its nomination for NASA's Software of the Year Award in 1999.
A Gaussian Processes Technique for Short-term Load Forecasting with Considerations of Uncertainty
NASA Astrophysics Data System (ADS)
Ohmi, Masataro; Mori, Hiroyuki
In this paper, an efficient method based on Gaussian Processes is proposed for short-term load forecasting. Short-term load forecasting plays a key role in smooth power system operation, including economic load dispatching and unit commitment. Recently, the deregulated and competitive power market has increased the degree of uncertainty, making better prediction results even more important for saving cost. A key requirement is that the power system operator needs the upper and lower bounds of the predicted load to deal with this uncertainty, in addition to more accurate predicted values. The proposed method is based on the Bayes model, in which the output is expressed as a distribution rather than a point. To realize the model efficiently, this paper employs Gaussian Processes, which combine the Bayes linear model with a kernel machine to obtain the distribution of the predicted value. The proposed method is successfully applied to real data for daily maximum load forecasting.
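The core mechanics, a Gaussian Process returning a predictive mean plus upper and lower bounds, can be sketched in a few dozen lines. This is a generic GP regression example with invented load data, not the authors' model or tuning.

```python
import math

# Generic GP regression sketch: an RBF kernel, a handful of past load
# observations, and a prediction expressed as mean +/- 2 sigma, which is
# exactly the kind of upper/lower bound a system operator can act on.

def rbf(x1, x2, length=1.0, variance=1.0):
    return variance * math.exp(-0.5 * (x1 - x2) ** 2 / length ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting for small systems."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def gp_predict(xs, ys, x_star, noise=1e-4):
    """Predictive mean and variance at x_star given observations (xs, ys)."""
    K = [[rbf(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(xs)] for i, a in enumerate(xs)]
    k_star = [rbf(x, x_star) for x in xs]
    alpha = solve(K, ys)                      # K^-1 y
    mean = sum(k * a for k, a in zip(k_star, alpha))
    v = solve(K, k_star)                      # K^-1 k_star
    var = rbf(x_star, x_star) - sum(k * vi for k, vi in zip(k_star, v))
    return mean, max(var, 0.0)

hours = [0.0, 1.0, 2.0, 3.0, 4.0]
load = [310.0, 305.0, 320.0, 360.0, 410.0]    # hypothetical MW
mean, var = gp_predict(hours, load, 4.5)
lower, upper = mean - 2 * var ** 0.5, mean + 2 * var ** 0.5
```

The interval [lower, upper] widens as the query point moves away from the observations, which is how the GP expresses growing forecast uncertainty.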
Bayramzadeh, Sara; Joseph, Anjali; Allison, David; Shultz, Jonas; Abernathy, James
2018-07-01
This paper describes the process and tools developed as part of a multidisciplinary, collaborative, simulation-based approach for iterative design and evaluation of operating room (OR) prototypes. Full-scale physical mock-ups of healthcare spaces offer an opportunity to actively communicate with and engage multidisciplinary stakeholders in the design process. While mock-ups are increasingly used in healthcare facility design projects, they are rarely evaluated in a manner that supports active user feedback and engagement. Researchers and architecture students worked closely with clinicians and architects to develop OR design prototypes and engaged clinical end-users in simulated scenarios. An evaluation toolkit was developed to compare design prototypes. The mock-up evaluation helped the team make key decisions about room size, the location of the OR table, intra-room zoning, and door locations. Structured, simulation-based mock-up evaluations conducted during the design process can help stakeholders visualize their future workspace and provide active feedback. Copyright © 2018 Elsevier Ltd. All rights reserved.
Office of River Protection Advanced Low-Activity Waste Glass Research and Development Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kruger, A. A.; Peeler, D. K.; Kim, D. S.
2015-11-23
The U.S. Department of Energy Office of River Protection (ORP) has initiated and leads an integrated Advanced Waste Glass (AWG) program to increase the loading of Hanford tank wastes in glass while meeting melter lifetime expectancies and process, regulatory, and product performance requirements. The integrated ORP program is focused on providing a technical, science-based foundation for making key decisions regarding the successful operation of the Hanford Tank Waste Treatment and Immobilization Plant (WTP) facilities in the context of an optimized River Protection Project (RPP) flowsheet. The fundamental data stemming from this program will support development of advanced glass formulations, key product performance and process control models, and tactical processing strategies to ensure safe and successful operations for both the low-activity waste (LAW) and high-level waste vitrification facilities. These activities will be conducted with the objective of improving the overall RPP mission by enhancing flexibility and reducing cost and schedule.
Effect of process parameters on greenhouse gas generation by wastewater treatment plants.
Yerushalmi, L; Shahabadi, M Bani; Haghighat, F
2011-05-01
The effect of key process parameters on greenhouse gas (GHG) emission by wastewater treatment plants was evaluated, and the governing parameters with the greatest effects on the overall on-site and off-site GHG emissions were identified. This evaluation used aerobic, anaerobic, and hybrid anaerobic/aerobic treatment systems treating food processing industry wastewater. The operating temperature of the anaerobic sludge digester had the largest effect on GHG generation in the aerobic treatment system: total GHG emissions of 2694 kg CO2e/d increased by 72.5% as the digester temperature rose from 20 to 40 degrees C. The operating temperature of the anaerobic reactor was the dominant controlling parameter in the anaerobic and hybrid treatment systems. Raising the anaerobic reactor's temperature from 25 to 40 degrees C increased total GHG emissions from baselines of 5822 and 6617 kg CO2e/d by 105.6 and 96.5% in the anaerobic and hybrid treatment systems, respectively.
Yurko, Joseph P.; Buongiorno, Jacopo; Youngblood, Robert
2015-05-28
System codes for simulation of safety performance of nuclear plants may contain parameters whose values are not known very accurately. New information from tests or operating experience is incorporated into safety codes by a process known as calibration, which reduces uncertainty in the output of the code and thereby improves its support for decision-making. The work reported here implements several improvements on classic calibration techniques afforded by modern analysis techniques. The key innovation has come from the development of code surrogate model (or code emulator) construction and prediction algorithms. Use of a fast emulator makes the calibration processes used here with Markov Chain Monte Carlo (MCMC) sampling feasible. This study uses Gaussian Process (GP) based emulators, which have been used previously to emulate computer codes in the nuclear field. The present work describes the formulation of an emulator that incorporates GPs into a factor analysis-type or pattern recognition-type model. This “function factorization” Gaussian Process (FFGP) model allows overcoming limitations present in standard GP emulators, thereby improving both the accuracy and speed of the emulator-based calibration process. Calibration of a friction-factor example using a Method of Manufactured Solutions is performed to illustrate key properties of the FFGP-based process.
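The general calibration workflow the paper builds on, a cheap emulator driving Metropolis-Hastings MCMC, can be sketched as follows. This is a schematic of the classic approach, not the FFGP method; the quadratic surrogate, noise level, and data are all invented.

```python
import math, random

# Schematic emulator-based calibration: a cheap surrogate stands in for the
# expensive system code, and Metropolis-Hastings sampling recovers the
# posterior of a friction-factor-like parameter theta from noisy data.

random.seed(0)

def surrogate(theta, v):
    """Cheap stand-in for the code: pressure drop ~ theta * v^2."""
    return theta * v * v

# synthetic "measurements" generated with theta_true = 0.02
theta_true, sigma = 0.02, 0.05
vs = [1.0, 2.0, 3.0, 4.0, 5.0]
data = [surrogate(theta_true, v) + random.gauss(0.0, sigma) for v in vs]

def log_like(theta):
    return -0.5 * sum(((d - surrogate(theta, v)) / sigma) ** 2
                      for v, d in zip(vs, data))

def metropolis(n=20000, step=0.002, theta0=0.05):
    """Random-walk Metropolis-Hastings; returns the second half of the chain."""
    theta, ll = theta0, log_like(theta0)
    samples = []
    for _ in range(n):
        prop = theta + random.gauss(0.0, step)
        llp = log_like(prop)
        if llp - ll > math.log(random.random()):   # accept/reject
            theta, ll = prop, llp
        samples.append(theta)
    return samples[n // 2:]                        # discard burn-in

post = metropolis()
theta_hat = sum(post) / len(post)
```

Because every MCMC step calls only the surrogate, the chain is cheap to run; the paper's point is that a better emulator (FFGP) makes this feasible for expensive codes without sacrificing accuracy.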
NASA Astrophysics Data System (ADS)
Tobiska, W. Kent
Space weather’s effects upon the near-Earth environment are due to dynamic changes in the energy transfer processes from the Sun’s photons, particles, and fields. Among the space environment domains affected by space weather, the magnetosphere, thermosphere, and even the troposphere are key regions. Space Environment Technologies (SET) has developed and is producing innovative space weather applications. Key operational systems for providing timely information about the effects of space weather on these domains are SET’s Magnetosphere Alert and Prediction System (MAPS), LEO Alert and Prediction System (LAPS), and Automated Radiation Measurements for Aviation Safety (ARMAS) system. MAPS provides a forecast Dst index out to 6 days through the data-driven, redundant-data-stream Anemomilos algorithm. Anemomilos uses observational proxies for the magnitude, location, and velocity of solar ejecta events. This forecast index is used by satellite operators, for example, to characterize upcoming geomagnetic storms. In addition, an ENLIL/Rice Dst prediction out to several days has also been developed and will be described. LAPS is SET’s fully redundant operational system providing recent-history, current-epoch, and forecast solar and geomagnetic indices for use in operational versions of the JB2008 thermospheric density model. The thermospheric densities produced by that system, driven by the LAPS data, are forecast out to 72 hours to provide global mass densities for satellite operators. ARMAS is a project that has successfully demonstrated the operation of a micro dosimeter on aircraft to capture the real-time radiation environment due to Galactic Cosmic Rays and Solar Energetic Particles. The dose and dose rates are captured on aircraft, downlinked in real time via the Iridium satellites, processed on the ground, incorporated into the most recent NAIRAS global radiation climatology data runs, and made available to end users via the web and smartphone apps. ARMAS provides the “weather” of the radiation environment to improve air-crew and passenger safety. Many of the data products from MAPS, LAPS, and ARMAS are available on the SpaceWx smartphone app for iPhone, iPad, iPod, and Android, serving professional users and public space weather education. We describe recent forecasting advances for moving the space weather information from these automated systems into operational, derivative products for communications, aviation, and satellite operations uses.
Method for encryption and transmission of digital keying data
Mniszewski, Susan M.; Springer, Edward A.; Brenner, David P.
1988-01-01
A method for the encryption, transmission, and subsequent decryption of digital keying data. The method utilizes the Data Encryption Standard and is implemented by means of a pair of apparatus, each of which is selectable to operate as either a master unit or a remote unit. Each unit contains a set of key encryption keys indexed by a common indexing system. The master unit operates upon command from the remote unit to generate a data encryption key and encrypt it using a preselected key encryption key. The encrypted data encryption key and an index designator are then downloaded to the remote unit, where the data encryption key is decrypted for subsequent use in the encryption and transmission of data. Downloading the encrypted data encryption key enables frequent key changes without requiring manual entry or storage of keys at the remote unit.
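The indexed key-encryption-key exchange described above can be sketched as follows. This is a hypothetical illustration only: a toy SHA-256-derived XOR stream stands in for DES (which the actual method uses), and the KEK table and message format are invented.

```python
import hashlib, os

# Schematic of an indexed key-encryption-key (KEK) scheme. Both units hold
# the same KEK table and exchange only an index plus the wrapped key; the
# data encryption key (DEK) never travels in the clear.

KEK_TABLE = {i: hashlib.sha256(b"shared-kek-%d" % i).digest() for i in range(4)}

def xor_cipher(key, data):
    """Toy stream cipher (stand-in for DES): XOR with a SHA-256 keystream."""
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(data):
        stream += hashlib.sha256(stream).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

def master_download(index):
    """Master unit: generate a fresh data key, wrap it under KEK[index]."""
    dek = os.urandom(16)                      # data encryption key
    wrapped = xor_cipher(KEK_TABLE[index], dek)
    return dek, (index, wrapped)              # (index, wrapped) is transmitted

def remote_receive(message):
    """Remote unit: look up the KEK by index and unwrap the data key."""
    index, wrapped = message
    return xor_cipher(KEK_TABLE[index], wrapped)

dek, message = master_download(index=2)
assert remote_receive(message) == dek         # both units now share the DEK
```

Because only the index and the wrapped key cross the channel, the DEK can be rotated on every session without manual key entry at the remote unit, which is the operational benefit the abstract highlights.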
Integrated Main Propulsion System Performance Reconstruction Process/Models
NASA Technical Reports Server (NTRS)
Lopez, Eduardo; Elliott, Katie; Snell, Steven; Evans, Michael
2013-01-01
The Integrated Main Propulsion System (MPS) Performance Reconstruction process provides the MPS post-flight data files needed for post-flight reporting to project integration management and key customers to verify flight performance. This process/model was used as the baseline for the currently ongoing Space Launch System (SLS) work. The process utilizes several methodologies, including multiple software programs, to model integrated propulsion system performance through space shuttle ascent. It is used to evaluate integrated propulsion systems, including propellant tanks, feed systems, rocket engine, and pressurization systems performance throughout ascent based on flight pressure and temperature data. The latest revision incorporates new methods based on main engine power balance model updates to model higher mixture-ratio operation at lower engine power levels.
Using near infrared spectroscopy and heart rate variability to detect mental overload.
Durantin, G; Gagnon, J-F; Tremblay, S; Dehais, F
2014-02-01
Mental workload is a key factor influencing the occurrence of human error, especially during piloting and remotely operated vehicle (ROV) operations, where safety depends on the ability of pilots to act appropriately. In particular, excessively high or low mental workload can lead operators to neglect critical information. The objective of the present study is to investigate the potential of functional near infrared spectroscopy (fNIRS) - a non-invasive method of measuring prefrontal cortex activity - in combination with measurements of heart rate variability (HRV), to predict mental workload during a simulated piloting task, with particular regard to task engagement and disengagement. Twelve volunteers performed a computer-based piloting task in which they were asked to follow a dynamic target with their aircraft, a task designed to replicate key cognitive demands associated with real life ROV operating tasks. In order to cover a wide range of mental workload levels, task difficulty was manipulated in terms of processing load and difficulty of control - two critical sources of workload associated with piloting and remotely operating a vehicle. Results show that both fNIRS and HRV are sensitive to different levels of mental workload; notably, lower prefrontal activation as well as a lower LF/HF ratio at the highest level of difficulty, suggest that these measures are suitable for mental overload detection. Moreover, these latter measurements point toward the existence of a quadratic model of mental workload. Copyright © 2013 Elsevier B.V. All rights reserved.
Recce NG: from Recce sensor to image intelligence (IMINT)
NASA Astrophysics Data System (ADS)
Larroque, Serge
2001-12-01
Recce NG (Reconnaissance New Generation) is presented as a complete, optimized tactical reconnaissance system. Based on a new-generation pod integrating high-resolution dual-band sensors, the system has been designed around the operational lessons learned from recent peacekeeping operations in Bosnia and Kosovo. The technical solutions retained as component modules of a full IMINT acquisition system take advantage of the state of the art in the following key technologies: an advanced mission planning system for long-range stand-off manned reconnaissance and aircraft and/or pod tasking, operating sophisticated back-up software tools, high-resolution 3D geo data, and an improved, combat-proven MMI to reduce planning delays; mature dual-band sensor technology to achieve the day-and-night reconnaissance mission, including advanced automatic operational functions such as azimuth and roll tracking; low-risk pod integration and carrier avionics, controls, and displays upgrades, to save time in operational turnover and maintenance; a high-rate imagery downlink for real-time or near-real-time transmission, fully compatible with STANAG 7085 requirements; and an advanced, combat-proven, NATO-interoperable (STANAG 7023) IMINT exploitation ground segment integrating high-value software tools for accurate location, improved radiometric image processing, and an open link to C4ISR systems. The choice of an industrial prime contractor mastering all of the listed key products and technologies across the full system is mandatory for successful delivery in terms of low cost, risk, and schedule.
Wang, Zuowei; Xia, Siqing; Xu, Xiaoyin; Wang, Chenhui
2016-02-01
In this study, a one-dimensional multispecies model (ODMSM) was used to simulate NO3(-)-N and ClO4(-) reduction performance in two kinds of H2-based membrane-aeration biofilm reactors (H2-MBfR) under different operating conditions (e.g., NO3(-)-N/ClO4(-) loading rates, H2 partial pressure). Before the simulation, we conducted a sensitivity analysis of key parameters that fluctuate under different environmental conditions, and then used the experimental data to calibrate the more sensitive parameters μ1 and μ2 (the maximum specific growth rates of denitrification bacteria and perchlorate reduction bacteria) in the two H2-MBfRs; the difference in these two key parameters' values between the two reactor types may result from the different carbon sources fed to the reactors. The simulation results for six operating conditions (four in H2-MBfR 1 and two in H2-MBfR 2) confirmed the applicability of the model, and the variation of removal tendencies under different operating conditions was well reproduced. Moreover, the suitability of operating parameters (H2 partial pressure, etc.) could be judged, especially under high nutrient loading rates. To a certain degree, the model can provide theoretical guidance for determining operating parameters under specific conditions in practical applications.
Maintenance-free operation of WDM quantum key distribution system through a field fiber over 30 days
NASA Astrophysics Data System (ADS)
Yoshino, Ken-ichiro; Ochi, Takao; Fujiwara, Mikio; Sasaki, Masahide; Tajima, Akio
2013-12-01
Maintenance-free wavelength-division-multiplexed quantum key distribution was achieved for 30 days over a 22-km field fiber. Using polarization-independent interferometers and stabilization techniques, we attained a quantum bit error rate as low as 1.70% and a key rate as high as 229.8 kbps, accumulating a record 595.6 Gbits of total secure key over the uninterrupted operation period.
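As a sanity check on the reported figures, the standard asymptotic relation between QBER and secret key fraction can be applied; this is a generic textbook bound, not the system's actual post-processing.

```python
import math

# Back-of-envelope check: with QBER e, the asymptotic secret fraction of a
# BB84-style protocol after error correction and privacy amplification is at
# most r = 1 - 2*h(e), where h is the binary entropy function.

def h(p):
    """Binary entropy in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

qber = 0.017                       # the reported 1.70 %
secret_fraction = 1 - 2 * h(qber)  # ~0.75: a low QBER leaves most bits secret
# sifted-key rate needed to sustain the reported 229.8 kbps of secure key
sifted_needed_kbps = 229.8 / secret_fraction
```

At 1.70% QBER roughly three quarters of the sifted key survives post-processing under this bound, which is consistent with a system sustaining hundreds of kbps of secure key.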
Daylight operation of a free space, entanglement-based quantum key distribution system
NASA Astrophysics Data System (ADS)
Peloso, Matthew P.; Gerhardt, Ilja; Ho, Caleb; Lamas-Linares, Antía; Kurtsiefer, Christian
2009-04-01
Many quantum key distribution (QKD) implementations using a free space transmission path are restricted to operation at night time in order to distinguish the signal photons used for a secure key establishment from the background light. Here, we present a lean entanglement-based QKD system overcoming that limitation. By implementing spectral, spatial and temporal filtering techniques, we establish a secure key continuously over several days under varying light and weather conditions.
Korasa, Klemen; Vrečer, Franc
2018-01-01
Over the last two decades, regulatory agencies have demanded better understanding of pharmaceutical products and processes through new technological approaches such as process analytical technology (PAT). Process analysers are a key PAT tool enabling effective process monitoring, and thus improved process control, in medicinal product manufacturing. Process analysers applicable to pharmaceutical coating unit operations are comprehensively described in the present article. The review focuses on monitoring of solid oral dosage forms during film coating in the two most commonly used coating systems, i.e. pan and fluid bed coaters. A brief theoretical background and a critical overview of process analysers used for real-time or near-real-time (in-, on-, at-line) monitoring of critical quality attributes of film-coated dosage forms are presented. Besides well-recognized spectroscopic methods (NIR and Raman spectroscopy), other techniques that have made a significant breakthrough in recent years are discussed (terahertz pulsed imaging (TPI), chord length distribution (CLD) analysis, and image analysis). The last part of the review is dedicated to novel techniques with high potential to become valuable PAT tools in the future (optical coherence tomography (OCT), acoustic emission (AE), microwave resonance (MR), and laser-induced breakdown spectroscopy (LIBS)). Copyright © 2017 Elsevier B.V. All rights reserved.
Atmosphere Revitalization Technology Development for Crewed Space Exploration
NASA Technical Reports Server (NTRS)
Perry, Jay L.; Carrasquillo, Robyn L.; Harris, Danny W.
2006-01-01
As space exploration objectives extend human presence beyond low Earth orbit, solutions to the technological challenges of supporting human life in the hostile space environment must build upon experience gained during past and present crewed space exploration programs. These programs, and the cabin atmosphere revitalization process technologies and systems developed for them, represent the National Aeronautics and Space Administration's (NASA's) past and present operational knowledge base for maintaining a safe, comfortable environment for the crew. The contributions of these programs to NASA's technological and operational working knowledge base, as well as key strengths and weaknesses to be overcome, are discussed. Areas of technological development that address challenges inherent in the Vision for Space Exploration (VSE) are presented, and a plan for their development employing unit operations principles is summarized.
Halladay, Jacqueline R; Donahue, Katrina E; Sleath, Betsy; Reuland, Dan; Black, Adina; Mitchell, C Madeline; Breland, Carol E; Coyne-Beasley, Tamera; Mottus, Kathleen; Watson, Sable Noelle; Lewis, Virginia; Wynn, Mysha; Corbie-Smith, Giselle
2017-01-01
Engaging stakeholders in research carries the promise of enhancing research relevance and transparency and speeding the translation of findings into practice. By describing the context and functional aspects of stakeholder groups such as community advisory boards (CABs), we can help others learn from these experiences and operationalize their own CABs. Our objective is to describe our experiences with diverse CABs affiliated with our community engagement group within our institution's Clinical Translational Sciences Award (CTSA). We identify key contextual elements that are important to administering CABs. A group of investigators, staff, and community members engaged in a 6-month collaboration to describe their experiences of working with six research CABs, identifying the key contextual domains that illustrate how CABs are developed and sustained. Two lead authors, with experience in working with CABs and in identifying contextual domains in other work, led a team of 13 through the process. Additionally, we devised a list of key tips to consider when establishing CABs. The final domains include (1) aligned missions among stakeholders, (2) resources/support, (3) defined operational processes/shared power, (4) well-described member roles, and (5) understanding and mitigating challenges. The tips are a set of actions that support the domains. Identifying key contextual domains was relatively easy, despite differences in the respective CABs' conditions of focus, overall missions, and patient demographics represented. By contextualizing these five domains, other research and community partners can take an informed approach to CAB planning and engaged research.
NASA Astrophysics Data System (ADS)
Cordova, Martin; Serio, Andrew; Meza, Francisco; Arriagada, Gustavo; Swett, Hector; Ball, Jesse; Collins, Paul; Masuda, Neal; Fuentes, Javier
2016-07-01
In 2014 Gemini Observatory started the base facility operations (BFO) project. The project's goal was to provide the ability to operate the two Gemini telescopes from their base facilities (respectively Hilo, HI at Gemini North, and La Serena, Chile at Gemini South). BFO was identified as a key project for Gemini's transition program, as it created an opportunity to reduce operational costs. In November 2015, the Gemini North telescope started operating from the base facility in Hilo, Hawaii. In order to provide the remote operator the tools to work from the base, many of the activities that were normally performed by the night staff at the summit were replaced with new systems and tools. This paper describes some of the key systems and tools implemented for environmental monitoring, and the design used in the implementation at the Gemini North telescope.
Artificial Intelligent Platform as Decision Tool for Asset Management, Operations and Maintenance.
2018-01-04
An Artificial Intelligence (AI) system has been developed and implemented for water, wastewater and reuse plants to improve management of sensors, short- and long-term maintenance plans, and asset and investment management plans. It is based on an integrated approach to capture data from different computer systems and files. It adds a layer of intelligence to the data and serves as a repository of key current and future operations and maintenance conditions that a plant needs to be aware of. With this information, it is able to simulate the configuration of processes and assets for those conditions to improve or optimize operations, maintenance and asset management, using the IViewOps (Intelligent View of Operations) model. Based on optimization through model runs, it is able to create output files that can feed data to other systems and inform the staff of optimal solutions to the conditions experienced or anticipated in the future.
Rendezvous, proximity operations and capture quality function deployment report
NASA Technical Reports Server (NTRS)
Lamkin, Stephen L. (Editor)
1991-01-01
Rendezvous, Proximity Operations, and Capture (RPOC) is a mission operations area which is extremely important to present and future space initiatives and must be well planned and coordinated. To support this, a study team was formed to identify a specific plan of action using the Quality Function Deployment (QFD) process. This team was composed of members from a wide spectrum of engineering and operations organizations which are involved in the RPOC technology area. The key to this study's success is an understanding of the needs of potential programmatic customers and the technology base available for system implementation. To this end, the study team conducted interviews with a variety of near-term and future programmatic customers and technology development sponsors. The QFD activity led to a thorough understanding of the needs of these customers in the RPOC area, as well as the relative importance of these needs.
Functional description of a command and control language tutor
NASA Technical Reports Server (NTRS)
Elke, David R.; Seamster, Thomas L.; Truszkowski, Walter
1990-01-01
The status of an ongoing project to explore the application of Intelligent Tutoring System (ITS) technology to NASA command and control languages is described. The primary objective of the current phase of the project is to develop a user interface for an ITS to assist NASA control center personnel in learning Systems Test and Operations Language (STOL). Although this ITS will be developed for Gamma Ray Observatory operators, it will be designed with sufficient flexibility so that its modules may serve as an ITS for other control languages such as the User Interface Language (UIL). The focus of this phase is to develop at least one other form of STOL representation to complement the operational STOL interface. Such an alternative representation would be adaptively employed during the tutoring session to facilitate the learning process. This is a key feature of this ITS which distinguishes it from a simulator that is only capable of representing the operational environment.
Ventre, Kathleen M; Barry, James S; Davis, Deborah; Baiamonte, Veronica L; Wentworth, Allen C; Pietras, Michele; Coughlin, Liza; Barley, Gwyn
2014-04-01
Relocating obstetric (OB) services to a children's hospital imposes demands on facility operations, which must be met to ensure quality care and a satisfactory patient experience. We used in situ simulations to prospectively and iteratively evaluate operational readiness of a children's hospital-based OB unit before it opened for patient care. This project took place at a 314-bed, university-affiliated children's hospital. We developed 3 full-scale simulation scenarios depicting a concurrent maternal and neonatal emergency. One scenario began with a standardized patient experiencing admission; the mannequin portrayed a mother during delivery. We ran all 3 scenarios on 2 dates scheduled several weeks apart. We ran 2 of the scenarios on a third day to verify the reliability of key processes. During the simulations, content experts completed equipment checklists, and participants identified latent safety hazards. Each simulation involved a unique combination of scheduled participants who were supplemented by providers from responding ancillary services. The simulations involved 133 scheduled participants representing OB, neonatology, and anesthesiology. We exposed and addressed operational deficiencies involving equipment availability, staffing, interprofessional communication, and systems issues such as transfusion protocol failures and electronic order entry challenges. Process changes between simulation days 1 to 3 decreased the elapsed time between transfusion protocol activation and blood arrival to the operating room and labor/delivery/recovery/postpartum setting. In situ simulations identified multiple operational deficiencies on the OB unit, allowing us to take corrective action before its opening. This project may guide other children's hospitals regarding care processes likely to require significant focus and possible modification to accommodate an OB service.
Biometrics encryption combining palmprint with two-layer error correction codes
NASA Astrophysics Data System (ADS)
Li, Hengjian; Qiu, Jian; Dong, Jiwen; Feng, Guang
2017-07-01
To bridge the gap between the fuzziness of biometrics and the exactitude of cryptography, a novel biometrics encryption method is proposed that combines palmprints with two-layer error correction codes. Firstly, the randomly generated original keys are encoded with convolutional and cyclic two-layer coding. The first layer uses a convolutional code to correct burst errors; the second layer uses a cyclic code to correct random errors. Then, the palmprint features are extracted from the palmprint images and fused with the encoded keys by an XOR operation. The resulting information is stored in a smart card. Finally, to extract the original keys, the information in the smart card is XORed with the user's palmprint features and then decoded with the convolutional and cyclic two-layer code. The experimental results and security analysis show that the method can recover the original keys completely. The proposed method is more secure than a single password factor and has higher accuracy than a single biometric factor.
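The enroll/recover flow described in this abstract is a form of fuzzy commitment: key → ECC codeword → XOR with biometric features → store; recovery reverses the XOR with a fresh reading and lets the decoder absorb the biometric noise. A minimal sketch of that flow is below, with a 3x repetition code standing in for the paper's convolutional/cyclic two-layer code; the bit lengths and feature vectors are illustrative assumptions, not the paper's actual parameters.

```python
# Illustrative fuzzy-commitment sketch. A 3x repetition code stands in for the
# paper's convolutional + cyclic two-layer code; palmprint feature vectors are
# modeled as plain bit lists. All names and parameters are hypothetical.

def encode(key_bits, reps=3):
    """Encode each key bit by repeating it (toy stand-in for ECC encoding)."""
    return [b for b in key_bits for _ in range(reps)]

def decode(code_bits, reps=3):
    """Majority-vote decode, correcting up to (reps - 1) // 2 errors per symbol."""
    out = []
    for i in range(0, len(code_bits), reps):
        chunk = code_bits[i:i + reps]
        out.append(1 if sum(chunk) > reps // 2 else 0)
    return out

def enroll(key_bits, palm_bits, reps=3):
    """Store (codeword XOR palmprint features) on the smart card."""
    codeword = encode(key_bits, reps)
    return [c ^ p for c, p in zip(codeword, palm_bits)]

def recover(card_bits, palm_bits, reps=3):
    """XOR a fresh (possibly noisy) palmprint reading with the card, then decode."""
    noisy_codeword = [c ^ p for c, p in zip(card_bits, palm_bits)]
    return decode(noisy_codeword, reps)

key = [1, 0, 1, 1]
palm = [0, 1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0]   # enrollment reading (12 bits)
card = enroll(key, palm)
noisy = palm[:]
noisy[4] ^= 1                                  # one bit flips at verification
assert recover(card, noisy) == key             # ECC absorbs the biometric noise
```

A real implementation would replace the repetition code with the convolutional (burst-error) and cyclic (random-error) layers the paper describes; the XOR fusion and recovery steps are unchanged.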
Calleja, Jesus Maria Garcia; Zhao, Jinkou; Reddy, Amala; Seguy, Nicole
2014-01-01
Problem: Size estimates of key populations at higher risk of HIV exposure are recognized as critical for understanding the trajectory of the HIV epidemic and planning and monitoring an effective response, especially for countries with concentrated and low epidemics such as those in Asia. Context: To help countries estimate population sizes of key populations, global guidelines were updated in 2011 to reflect new technical developments and recent field experiences in applying these methods. Action: In September 2013, a meeting of programme managers and experts experienced with population size estimates (PSE) for key populations was held for 13 Asian countries. This article summarizes the key results presented, shares practical lessons learnt and reviews the methodological approaches from implementing PSE in 13 countries. Lessons learnt: It is important to build capacity to collect, analyse and use PSE data; establish a technical review group; and implement a transparent, well documented process. Countries should adapt global PSE guidelines and maintain operational definitions that are more relevant and useable for country programmes. Development of methods for non-venue-based key populations requires more investment and collaborative efforts between countries and among partners. PMID:25320676
Liquid rocket booster integration study. Volume 3: Study products. Part 2: Sections 8-19
NASA Technical Reports Server (NTRS)
1988-01-01
The impacts of introducing liquid rocket booster engines (LRB) into the Space Transportation System (STS)/Kennedy Space Center (KSC) launch environment are identified and evaluated. Proposed ground systems configurations are presented along with a launch site requirements summary. Prelaunch processing scenarios are described and the required facility modifications and new facility requirements are analyzed. Flight vehicle design recommendations to enhance launch processing are discussed. Processing approaches to integrate LRB with existing STS launch operations are evaluated. The key features and significance of launch site transition to a new STS configuration in parallel with ongoing launch activities are enumerated. This volume is part two of the study products section of the five volume series.
Psychiatric nurses' beliefs, attitudes, and perceived barriers about medical emergency teams.
Herisko, Camellia; Puskar, Kathryn; Mitchell, Ann M
2013-10-01
A literature review of nurses' attitudes, beliefs, and barriers regarding the medical emergency team (MET) process is limited to medical hospitals. How psychiatric nurses view the MET process and their prior experiences with METs are important because they are often the ones assessing the need for, and then calling, the MET. This article examines psychiatric nurses' attitudes, beliefs, and barriers toward the MET process in a 310-bed psychiatric hospital that is part of an urban academic medical center. Through the use of key informant interviews, nurses were asked for their feedback and input regarding the current MET practices. The results may be useful in improving the current operating system.
Liquid rocket booster integration study. Volume 3, part 1: Study products
NASA Technical Reports Server (NTRS)
1988-01-01
The impacts of introducing liquid rocket booster engines (LRB) into the Space Transportation System (STS)/Kennedy Space Center (KSC) launch environment are identified and evaluated. Proposed ground systems configurations are presented along with a launch site requirements summary. Prelaunch processing scenarios are described and the required facility modifications and new facility requirements are analyzed. Flight vehicle design recommendations to enhance launch processing are discussed. Processing approaches to integrate LRB with existing STS launch operations are evaluated. The key features and significance of launch site transition to a new STS configuration in parallel with ongoing launch activities are enumerated. This volume is part one of the study products section of the five volume series.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-03
... via the U.S. Postal Service to Naval Facilities Engineering Command Southeast, NAS Key West Air... the project Web site ( http://www.keywesteis.com ). All statements, oral or written, submitted during... Engineering Command Southeast, NAS Key West Air Operations EIS Project Manager, P.O. Box 30, Building 903, NAS...
Summary of Key Operating Statistics: Data Collected from the 2009 Annual Institutional Report
ERIC Educational Resources Information Center
Accrediting Council for Independent Colleges and Schools, 2010
2010-01-01
The Accrediting Council for Independent Colleges and Schools (ACICS) provides the Summary of Key Operating Statistics (KOS) as an annual review of the performance and key measurements of the more than 800 private post-secondary institutions we accredit. This edition of the KOS contains information based on the 2009 Annual Institutional Reports…
Developing a comprehensive training curriculum for integrated predictive maintenance
NASA Astrophysics Data System (ADS)
Wurzbach, Richard N.
2002-03-01
On-line equipment condition monitoring is a critical component of the world-class production and safety histories of many successful nuclear plant operators. From addressing availability and operability concerns of nuclear safety-related equipment to increasing profitability through support system reliability and reduced maintenance costs, Predictive Maintenance programs have increasingly become a vital contribution to the maintenance and operation decisions of nuclear facilities. In recent years, significant advancements have been made in the quality and portability of many of the instruments being used, and software improvements have been made as well. However, the single most influential component of the success of these programs is the impact of a trained and experienced team of personnel putting this technology to work. Changes in the nature of the power generation industry, brought on by competition, mergers, and acquisitions, have taken the historically stable personnel environment of power generation and created a very dynamic situation. As a result, many facilities have seen a significant turnover in personnel in key positions, including predictive maintenance personnel. It has become the challenge for many nuclear operators to maintain the consistent contribution of quality data and information from predictive maintenance that has become important in the overall equipment decision process. These challenges can be met through the implementation of quality training for predictive maintenance personnel and regular updating and re-certification of key technology holders. The use of data management tools and services aids in the sharing of information across sites within an operating company, and with experts who can contribute value-added data management and analysis. The overall effectiveness of predictive maintenance programs can be improved through the incorporation of newly developed comprehensive technology training courses. These courses address the use of key technologies such as vibration analysis, infrared thermography, and oil analysis not as singular entities, but as a toolbox resource from which to address overall equipment and plant reliability in a structured program and decision environment.
NASA Astrophysics Data System (ADS)
Rafiee, Seyed Ehsan; Sadeghiazad, M. M.
2016-06-01
Air separators provide safe, clean, and appropriate air flow to engines and are widely used in vehicles with large engines such as ships and submarines. In this operational study, the separation process inside a Ranque-Hilsch vortex tube cleaning (cooling) system is investigated to analyze the impact of the operating gas type on the vortex tube performance; the operating gases used are air, nitrogen, oxygen, carbon dioxide and nitrogen dioxide. The computational fluid dynamic model used is equipped with a three-dimensional structure, and the steady-state condition is applied during computations. The standard k-ɛ turbulence model is employed to resolve nonlinear flow equations, and various key parameters, such as hot and cold exhaust thermal drops, and power separation rates, are described numerically. The results show that nitrogen dioxide creates the greatest separation power out of all gases tested, and the numerical results are validated by good agreement with available experimental data. In addition, a comparison is made between the use of two different boundary conditions, the pressure-far-field and the pressure-outlet, when analyzing complex turbulent flows inside the air separators. Results present a comprehensive and practical solution for use in future numerical studies.
Web Monitoring of EOS Front-End Ground Operations, Science Downlinks and Level 0 Processing
NASA Technical Reports Server (NTRS)
Cordier, Guy R.; Wilkinson, Chris; McLemore, Bruce
2008-01-01
This paper addresses the efforts undertaken and the technology deployed to aggregate and distribute the metadata characterizing the real-time operations associated with NASA Earth Observing Systems (EOS) high-rate front-end systems and the science data collected at multiple ground stations and forwarded to the Goddard Space Flight Center for level 0 processing. Station operators, mission project management personnel, spacecraft flight operations personnel and data end-users for various EOS missions can retrieve the information at any time from any location having access to the internet. The users are distributed and the EOS systems are distributed but the centralized metadata accessed via an external web server provide an effective global and detailed view of the enterprise-wide events as they are happening. The data-driven architecture and the implementation of applied middleware technology, open source database, open source monitoring tools, and external web server converge nicely to fulfill the various needs of the enterprise. The timeliness and content of the information provided are key to making timely and correct decisions which reduce project risk and enhance overall customer satisfaction. The authors discuss security measures employed to limit access of data to authorized users only.
Li, Jiabao; Rui, Junpeng; Yao, Minjie; Zhang, Shiheng; Yan, Xuefeng; Wang, Yuanpeng; Yan, Zhiying; Li, Xiangzhen
2015-01-01
The microbial-mediated anaerobic digestion (AD) process represents an efficient biological process for the treatment of organic waste along with biogas harvest. Currently, the key factors structuring bacterial communities and the potential core and unique bacterial populations in manure anaerobic digesters are not completely elucidated yet. In this study, we collected sludge samples from 20 full-scale anaerobic digesters treating cattle or swine manure, and investigated the variations of bacterial community compositions using high-throughput 16S rRNA amplicon sequencing. Clustering and correlation analysis suggested that substrate type and free ammonia (FA) play key roles in determining the bacterial community structure. The COD:NH4+-N (C:N) ratio of substrate and FA were the most important available operational parameters correlating to the bacterial communities in cattle and swine manure digesters, respectively. The bacterial populations in all of the digesters were dominated by phylum Firmicutes, followed by Bacteroidetes, Proteobacteria and Chloroflexi. Increased FA content selected Firmicutes, suggesting that they probably play more important roles under high FA content. Syntrophic metabolism by Proteobacteria, Chloroflexi, Synergistetes and Planctomycetes are likely inhibited when FA content is high. Despite the different manure substrates, operational conditions and geographical locations of digesters, core bacterial communities were identified. The core communities were best characterized by phylum Firmicutes, wherein Clostridium predominated overwhelmingly. Substrate-unique and abundant communities may reflect the properties of manure substrate and operational conditions. These findings extend our current understanding of the bacterial assembly in full-scale manure anaerobic digesters.
Keyter, Andrea; Gouws, Joey; Salek, Sam; Walker, Stuart
2018-01-01
The aims of this study were to assess the regulatory review process in South Africa from 2015 to 2017 and identify the key milestones and timelines; to evaluate the effectiveness of measures to ensure consistency, transparency, timeliness, and predictability in the review process; and to provide recommendations for enhanced regulatory practices. A questionnaire was completed by the Medicines Control Council (MCC) to describe the organization of the authority, record key milestones and timelines in the review process and identify good review practices (GRevPs). Currently, the MCC conducts a full assessment of quality, efficacy, and safety data in the review of all applications. The overall regulatory median approval time decreased by 14% in 2017 (1411 calendar days) compared with that of 2016, despite a 27% increase in the number of applications. However, the MCC has no target for the overall approval time of new active substance applications and no targets for key review milestones. Guidelines, standard operating procedures, and review templates are in place, while the formal implementation of GRevPs and the application of an electronic document management system are planned for the near future. As the MCC transitions to the newly established South African Health Products Regulatory Authority, it will be crucial for the authority to recognize the opportunities for an enhanced regulatory review and to consider models such as abridged assessment, which encompass elements of risk stratification and reliance. It is hoped that resource constraints may then be alleviated and capacity developed to meet target timelines.
Galileo mission planning for Low Gain Antenna based operations
NASA Technical Reports Server (NTRS)
Gershman, R.; Buxbaum, K. L.; Ludwinski, J. M.; Paczkowski, B. G.
1994-01-01
The Galileo mission operations concept is undergoing substantial redesign, necessitated by the deployment failure of the High Gain Antenna while the spacecraft is on its way to Jupiter. The new design applies state-of-the-art technology and processes to increase the telemetry rate available through the Low Gain Antenna and to increase the information density of the telemetry. This paper describes the mission planning process being developed as part of this redesign. Principal topics include a brief description of the new mission concept and anticipated science return (these have been covered more extensively in earlier papers), identification of key drivers on the mission planning process, a description of the process and its implementation schedule, a discussion of the application of automated mission planning tools to the process, and a status report on mission planning work to date. Galileo enhancements include extensive reprogramming of on-board computers and substantial hardware and software upgrades for the Deep Space Network (DSN). The principal mode of operation will be onboard recording of science data followed by extended playback periods. A variety of techniques will be used to compress and edit the data both before recording and during playback. A highly-compressed real-time science data stream will also be important. The telemetry rate will be increased using advanced coding techniques and advanced receivers. Galileo mission planning for orbital operations now involves partitioning of several scarce resources. Particularly difficult are the division of the telemetry among the many users (eleven instruments, radio science, engineering monitoring, and navigation) and the allocation of space on the tape recorder at each of the ten satellite encounters. The planning process is complicated by uncertainty in the forecast performance of the DSN modifications and the non-deterministic nature of the new data compression schemes.
Key mission planning steps include quantifying the resources or capabilities to be allocated, prioritizing science observations and estimating resource needs for each, working inter- and intra-orbit trades of these resources among the Project elements, and planning real-time science activity. The first major mission planning activity, a high-level, orbit-by-orbit allocation of resources among science objectives, has already been completed, and results are illustrated in the paper. To make efficient use of limited resources, Galileo mission planning will rely on automated mission planning tools capable of dealing with interactions among time-varying downlink capability, real-time science and engineering data transmission, and playback of recorded data. A new generic mission planning tool is being adapted for this purpose.
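The core difficulty the abstract describes is dividing a fixed downlink budget among many competing users. A toy priority-driven allocator below illustrates the flavor of such a trade; the user names, priorities, and rates are hypothetical illustrations, not actual Galileo allocations, and the real planning tools handle time-varying capability rather than a single static budget.

```python
# Illustrative sketch of priority-driven allocation of a fixed downlink budget
# among competing users, in the spirit of the telemetry trades described above.
# User names, priorities, and requested rates are hypothetical.

def allocate(budget_bps, requests):
    """Grant requests in descending priority order until the budget is spent.

    requests: list of (user, priority, requested_bps); higher priority wins.
    Returns {user: granted_bps}.
    """
    grants = {}
    remaining = budget_bps
    for user, _priority, want in sorted(requests, key=lambda r: -r[1]):
        grant = min(want, remaining)   # partial grants once budget runs low
        grants[user] = grant
        remaining -= grant
    return grants

requests = [("SSI", 3, 60), ("NIMS", 2, 40),
            ("engineering", 5, 10), ("radio_science", 1, 30)]
grants = allocate(100, requests)
# The highest-priority user (engineering) is fully served first; the
# lowest-priority request absorbs whatever shortfall remains.
```

In practice an automated planner would iterate such allocations across orbits and re-run them as DSN performance forecasts change, which is why the abstract stresses tool support over hand trades.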
Galileo mission planning for Low Gain Antenna based operations
NASA Astrophysics Data System (ADS)
Gershman, R.; Buxbaum, K. L.; Ludwinski, J. M.; Paczkowski, B. G.
1994-11-01
The Galileo mission operations concept is undergoing substantial redesign, necessitated by the deployment failure of the High Gain Antenna while the spacecraft is on its way to Jupiter. The new design applies state-of-the-art technology and processes to increase the telemetry rate available through the Low Gain Antenna and to increase the information density of the telemetry. This paper describes the mission planning process being developed as part of this redesign. Principal topics include a brief description of the new mission concept and anticipated science return (these have been covered more extensively in earlier papers), identification of key drivers on the mission planning process, a description of the process and its implementation schedule, a discussion of the application of automated mission planning tools to the process, and a status report on mission planning work to date. Galileo enhancements include extensive reprogramming of on-board computers and substantial hardware and software upgrades for the Deep Space Network (DSN). The principal mode of operation will be onboard recording of science data followed by extended playback periods. A variety of techniques will be used to compress and edit the data both before recording and during playback. A highly-compressed real-time science data stream will also be important. The telemetry rate will be increased using advanced coding techniques and advanced receivers. Galileo mission planning for orbital operations now involves partitioning of several scarce resources. Particularly difficult are the division of the telemetry among the many users (eleven instruments, radio science, engineering monitoring, and navigation) and the allocation of space on the tape recorder at each of the ten satellite encounters. The planning process is complicated by uncertainty in the forecast performance of the DSN modifications and the non-deterministic nature of the new data compression schemes.
Key mission planning steps include quantifying the resources or capabilities to be allocated, prioritizing science observations and estimating resource needs for each, working inter- and intra-orbit trades of these resources among the Project elements, and planning real-time science activity. The first major mission planning activity, a high-level, orbit-by-orbit allocation of resources among science objectives, has already been completed, and results are illustrated in the paper. To make efficient use of limited resources, Galileo mission planning will rely on automated mission planning tools capable of dealing with interactions among time-varying downlink capability, real-time science and engineering data transmission, and playback of recorded data. A new generic mission planning tool is being adapted for this purpose.
Understanding knowledge transfer in an ergonomics intervention at a poultry processing plant.
Antle, David M; MacKinnon, Scott N; Molgaard, John; Vézina, Nicole; Parent, Robert; Bornstein, Stephen; Leclerc, Louise
2011-01-01
This case study reviews the knowledge transfer (KT) process of implementing a knife sharpening and steeling program in a poultry processing plant via a participatory ergonomics intervention. This ergonomics intervention required stakeholder participation at the company level to move a 'train-the-trainer' program, developed in Québec, Canada, into action on the plant's deboning line. Communications and exchanges with key stakeholders, as well as changes in steeling and production behaviours, were recorded. The intervention was assumed to be at least partially successful because positive changes in work operations occurred. Ergonomic-related changes such as those documented have been cited in the academic literature as beneficial to worker health. However, several components cited in the literature as associated with a successful participatory ergonomics intervention were not attained during the project. A Dynamic Knowledge Transfer Model was used to identify KT issues that impacted the success of the train-the-trainer program. A debriefing analysis reveals that the failure to consider key participatory ergonomics factors necessary for success was related to capacity deficits in the knowledge dissemination strategy.
O'Connor, Nick; Paton, Michael
2008-04-01
A framework developed to promote the understanding and application of clinical governance principles in an area mental health service is described. The framework is operationalized through systems, processes, roles and responsibilities. The development of an explicit and operationalizable framework for clinical governance arose from the authors' experiences in leading and managing mental health services. There is a particular emphasis on improvement of quality of care and patient safety. The framework is informed by recent developments in thinking about clinical governance, including key documents from Australia and the United Kingdom. The operational nature of the framework allows for key components of clinical governance to be described explicitly, communicated effectively, and continually tested and improved. Further consideration and assessment of the value of differing approaches to this task are required. For example, a general, illustrative approach to raise clinician awareness can be contrasted with prescriptive and specified approaches which progressively encompass the many functions and processes of a mental health service. Mental health clinicians and managers can be guided by a framework that will ensure safe, high quality and continually improving processes of care.
Study of the Influence of Key Process Parameters on Furfural Production.
Fele Žilnik, Ljudmila; Grilc, Viktor; Mirt, Ivan; Cerovečki, Željko
2016-01-01
The present work reports the influence of key process variables on furfural formation from leached chestnut-wood chips in a pressurized reactor. The effects of temperature, pressure, type and concentration of the catalyst solution, steam flow rate (stripping module), moisture content of the wood particles, and geometric characteristics such as reactor size and type, particle size and bed height were considered systematically. Only a one-stage process was considered. Lab-scale and pilot-scale studies were performed. The results of the non-catalysed laboratory experiments were compared with an actual non-catalysed (auto-catalysed) industrial process and with experiments at the pilot scale, the latter giving a 28% higher furfural yield than the others. Application of sulphuric acid as catalyst, in an amount of 0.03-0.05 g (100% H2SO4) per g d.m. (dry material), enables higher furfural production at lower steam temperature and pressure in a shorter reaction time. Pilot-scale catalysed experiments revealed very good furfural-formation performance under less severe operating conditions, with a maximum furfural yield of as much as 88% of the theoretical value.
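The "percent of theoretical" yield figures can be made concrete: theoretical furfural is conventionally referenced to the pentosan content of the feed, with each anhydroxylose unit (C5H8O4, 132.1 g/mol) giving at most one furfural molecule (C5H4O2, 96.1 g/mol). The feed numbers in the sketch below are placeholders, not the paper's chestnut-wood data.

```python
# Stoichiometric ceiling: grams of furfural obtainable per gram of pentosan.
PENTOSAN_TO_FURFURAL = 96.1 / 132.1   # about 0.727 g/g

def yield_of_theoretical(furfural_g, dry_feed_g, pentosan_frac):
    """Furfural yield as a percentage of the stoichiometric maximum.

    furfural_g    -- furfural actually recovered (g)
    dry_feed_g    -- dry mass of feed (g)
    pentosan_frac -- pentosan mass fraction of the dry feed (assumed known)
    """
    theoretical = dry_feed_g * pentosan_frac * PENTOSAN_TO_FURFURAL
    return 100.0 * furfural_g / theoretical
```

For example, a feed of 1000 g dry matter at 20% pentosan caps out near 145 g of furfural, so recovering about 128 g corresponds to the 88%-of-theoretical figure reported for the catalysed pilot runs.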
Finding simplicity in complexity: modelling post-fire hydrogeomorphic processes and risks
NASA Astrophysics Data System (ADS)
Sheridan, Gary; Langhans, Christoph; Lane, Patrick; Nyman, Petter
2017-04-01
Post-fire runoff and erosion can shape landscapes, destroy infrastructure, and result in the loss of human life. However, even within seemingly similar geographic regions, post-fire hydro-geomorphic responses vary from almost no response through to catastrophic flash floods and debris flows. Why is there so much variability, and how can we predict areas at risk? This presentation describes the research journey taken by the post-fire research group at The University of Melbourne to answer this question for the south-east Australian uplands. Key steps along the way have included identifying the dominant erosion processes (and their forcings), and the key system properties controlling the rates of these dominant processes. The high degree of complexity in the interactions between the forcings, the system properties, and the erosion processes necessitated the development of a simplified conceptual representation of the post-fire hydrogeomorphic system that was conducive to modelling and simulation. Spatially mappable metrics (and proxies) for key system forcings and properties were then required to parameterize and drive the model. Each step in this journey has depended on new research, as well as ongoing feedback from land and water management agencies tasked with implementing these risk models and interpreting the results. These models are now embedded within agencies and used for strategic risk assessments, for tactical response during fires, and for post-fire remediation and risk planning. Reflecting on the successes and failures along the way provides some more general insights into the process of developing research-based models for operational use by land and water management agencies.
Analyzing AQP Data to Improve Electronic Flight Bag (EFB) Operations and Training
NASA Technical Reports Server (NTRS)
Seamster, Thomas L.; Kanki, Barbara
2010-01-01
Key points include:
- Initiate data collection and analysis early in the implementation process.
- Use data to identify procedural and training refinements.
- Use a de-identified system to analyze longitudinal data.
- Use longitudinal I/E data to improve their standardization.
- Identify above-average pilots and crews and use their performance to specify best practices.
- Analyze below-average crew performance data to isolate problems with training, evaluator standardization and pilot proficiency.
Review of Vibration-Based Helicopters Health and Usage Monitoring Methods
2001-04-05
FM4, NA4, NA4*, NB4 and NB4* (Polyshchuk et al., 1998). The Wigner-Ville distribution (WVD) is a joint time-frequency signal analysis. The WVD is one...signal processing methodologies that are of relevance to vibration-based damage detection (e.g., Wavelet Transform and Wigner-Ville distribution) will be...operation cost, reduce maintenance flights, and increase flight safety. Key Words: HUMS; Wavelet Transform; Wigner-Ville distribution; O&S; Machinery
2016-07-14
and (2) track the implementation of these recommendations and measure the effectiveness of the actions it has taken to address them. We briefed the... effectiveness of actions taken, we reviewed key documents, including the reports of the nuclear enterprise reviews, Strategic Command’s action plan, DOD...Federal Government—including assessing and responding to risk, using and effectively communicating quality information, and performing monitoring
Boussinesq Modeling for Inlets, Harbors & Structures (Bouss-2D)
2014-10-27
...work applications. It may be used from deep to shallow water to simulate the nonlinear wave processes of interest in the open coast, nearshore zone...design, and operation of coastal navigation and flooding projects. It provides key engineering estimates for coastal and hydraulic engineering practice
Yuan, Yi-Ping; Zhai, Hua-Qiang; Guo, Zhao-Juan; Zhang, Tian; Kong, Li-Ting; Jia, Xiao-Yu; Tian, Wei-Lan; Li, Rui
2016-05-01
This study collects Li Shizhen's experience in Aconiti Lateralis Radix Praeparata identification and clinical application, and compares and analyzes national physician master Jin Shiyuan's practical operation and theoretical knowledge, which is beneficial for the inheritance and improvement of Aconiti Lateralis Radix Praeparata clinical dispensing technology. In the analysis process, CNKI, Wanfang and other databases were searched with "Aconiti Lateralis Radix Praeparata", "Li Shizhen", "pharmacological method state theory", "Jin Shiyuan" and "Chinese medicine dispensing technology" as the key words. In addition, Treatise on Febrile Disease, Compendium of Materia Medica, Chinese Pharmacopoeia (2015 edition), Notes to Medical Professions (Yi Zong Shuo Yue), and other medicine books were consulted to summarize the processing methods and decoction dosage of Aconiti Lateralis Radix Praeparata in both ancient and modern medicine. In consideration of technical research and practical operation, Li Shizhen's description of Aconiti Lateralis Radix Praeparata and Professor Jin Shiyuan's research on Aconiti Lateralis Radix Praeparata dispensing technology were analyzed and collected. Li Shizhen recorded the nature identification and clinical application of Aconiti Lateralis Radix Praeparata using pharmacological method state theory in Compendium of Materia Medica. National physician master Jin Shiyuan carries forward the essence of Li Shizhen's pharmaceutical academic thought with his own proficient knowledge structure in medicine, providing scientific pharmaceutical services for the clinical application of Aconiti Lateralis Radix Praeparata. Professor Jin Shiyuan put forward the dispensing technology for the first time, including nature identification technology, clinical processing technology, clinical decocting technology, prescription coping technology, and class specifications of Aconiti Lateralis Radix Praeparata.
In this paper, Aconiti Lateralis Radix Praeparata was used as an example to analyze the key dispensing technology of traditional Chinese medicine, which can then be applied to other commonly used Chinese medicines in the future. Copyright© by the Chinese Pharmaceutical Association.
Varughese, Anna M; Hagerman, Nancy; Townsend, Mari E
2013-07-01
The anesthesia preoperative screening and evaluation of a patient prior to surgery is a critical element in the safe and effective delivery of anesthesia care. In this era of increased focus on cost containment, many anesthesia practices are looking for ways to maximize productivity while maintaining the quality of the preoperative evaluation process by harnessing and optimizing all available resources. We sought to develop a nurse practitioner-assisted preoperative anesthesia screening process using quality improvement methods, with the goal of maintaining the quality of the screening process while redirecting anesthesiologists' time to the provision of non-operating room (OR) anesthesia. The nurse practitioner (NP) time (approximately 10 h per week) directed to this project was gained as a result of an earlier resource utilization improvement project within the Department of Anesthesia. The goal of this improvement project was to increase the proportion of patient anesthesia screens conducted by NPs to 50% within 6 months. After discussion with key stakeholders of the process, a multidisciplinary improvement team identified a set of operational factors (key drivers) believed to be important to the success of the preoperative anesthesia screening process. These included the development of dedicated NP time for daily screening, NP competency and confidence with the screening process, effective mentoring by anesthesiologists, standardization of the screening process, and communication with stakeholders of the process, that is, surgeons.
These key drivers focused the development of several interventions: (i) NP education in the preoperative anesthesia screening and consultation process through a series of didactic lectures conducted by anesthesiologists, and NPs shadowing an anesthesiologist during the screening process; (ii) anesthesiologist mentoring and assessment of NP screenings using a dual screening process, whereby both the anesthesiologist and the NP conducted the screening independently and the results were compared and discussed; (iii) examination and readjustment of NP schedules to provide time for daily screening while preserving other responsibilities; and (iv) standardization through the development of guidelines for the preoperative screening process. Measures recorded included the percentage of patient anesthesia screens conducted by NPs, the percentage of dual screens with MD and NP agreement on the screening decision, and the average times taken for the anesthesiologist and NP screening processes. After implementation of these interventions, the percentage of successful NP-assisted anesthesia consultation screenings increased from 0% to 65% over a period of 6 months. The anesthesiologists' time redirected to non-OR anesthesia averaged at least 8 h a week. The percentage of dual screens with agreement on the screening decision was 96% (goal >95%). The overall average time taken for an NP screen was 8.2 min vs 4.5 min for an anesthesiologist screen. The overall average operating room delays and cancelations for cases on the day of surgery remained the same. By applying quality improvement methods, we identified key drivers for the institution of an NP-assisted preoperative screening process and successfully implemented this process while redirecting anesthesiologists' time to the provision of non-OR anesthesia. This project was instrumental in improving the matching of provider skills with clinical need while maintaining superior outcomes at the lowest possible cost.
© 2013 John Wiley & Sons Ltd.
The role of health informatics in clinical audit: part of the problem or key to the solution?
Georgiou, Andrew; Pearson, Michael
2002-05-01
The concepts of quality assurance (for which clinical audit is an essential part), evaluation and clinical governance each depend on the ability to derive and record measurements that describe clinical performance. Rapid IT developments have raised many new possibilities for managing health care. They have allowed for easier collection and processing of data in greater quantities. These developments have encouraged the growth of quality assurance as a key feature of health care delivery. In the past most of the emphasis has been on hospital information systems designed predominantly for the administration of patients and the management of financial performance. Large, hi-tech information system capacity does not guarantee quality information. The task of producing information that can be confidently used to monitor the quality of clinical care requires attention to key aspects of the design and operation of the audit. The Myocardial Infarction National Audit Project (MINAP) utilizes an IT-based system to collect and process data on large numbers of patients and make them readily available to contributing hospitals. The project shows that IT systems that employ rigorous health informatics methodologies can do much to improve the monitoring and provision of health care.
Tsai, Yuan-Cheng; Cheng, Yu-Tien
2012-01-01
With the transformation of its population structure and economic environment, Taiwan is rapidly becoming an aging society. There is a growing need for elderly products, and therefore the operation of web shops that sell elderly products is important. In an era which values performance management, searching for key performance indicators (KPIs) helps to reveal whether the goals of a web shop are achieved. In the current study, the researchers adopted the constructs of the Balanced Scorecard (BSC) to evaluate web shop performance. Additionally, the Delphi method, along with questionnaires, was used to develop 29 indicators. Finally, the decision making trial and evaluation laboratory (DEMATEL) method assisted in identifying the level of importance of the constructs, in which "internal process" ranked highest, followed by "learning and growth", "customer", and "financial". "Internal process" was the key construct that impacted other factors, while "customer" was an important construct affected by other factors. By understanding the influences and relationships among the constructs, enterprises can conduct additional monitoring and management to achieve functions of prevention, continuous improvement, and innovation in order to shape their core competence. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
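The DEMATEL computation referred to above can be sketched in a few lines. The 4x4 direct-influence matrix below is invented for illustration (it is not the study's questionnaire data); the core step is the total-relation matrix T = N(I - N)^-1, from which each construct's prominence (D + R) and net cause/effect role (D - R) are read off.

```python
import numpy as np

# Hypothetical direct-influence matrix among the four BSC constructs
# (financial, customer, internal process, learning & growth); values
# are illustrative expert ratings on a 0-3 scale, zero on the diagonal.
A = np.array([
    [0.0, 1.0, 1.0, 2.0],
    [2.0, 0.0, 1.0, 1.0],
    [3.0, 3.0, 0.0, 2.0],
    [2.0, 2.0, 3.0, 0.0],
])

# Normalize by the largest row sum, then compute the total-relation
# matrix T = N (I - N)^-1, the central DEMATEL step.
N = A / A.sum(axis=1).max()
T = N @ np.linalg.inv(np.eye(4) - N)

D = T.sum(axis=1)      # influence each construct exerts
R = T.sum(axis=0)      # influence each construct receives
prominence = D + R     # overall importance of the construct
relation = D - R       # net cause (positive) or effect (negative)
```

Ranking the constructs by `prominence` and inspecting the sign of `relation` reproduces the kind of cause/effect grouping the abstract reports (e.g. "internal process" as a key causal construct, "customer" as a net effect).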
Michael-Kordatou, I; Karaolia, P; Fatta-Kassinos, D
2018-02-01
An upsurge in the study of antibiotic resistance in the environment has been observed in the last decade. Nowadays, it is becoming increasingly clear that urban wastewater is a key source of antibiotic resistance determinants, i.e. antibiotic-resistant bacteria and antibiotic resistance genes (ARB&ARGs). Urban wastewater reuse has arisen as an important component of water resources management in the European Union and worldwide to address prolonged water scarcity issues. In particular, biological wastewater treatment processes (i.e. conventional activated sludge), which are widely applied in urban wastewater treatment plants, have been shown to provide an ideal environment for the evolution and spread of antibiotic resistance. The ability of advanced chemical oxidation processes (AOPs), e.g. light-driven oxidation in the presence of H2O2, ozonation, and homogeneous and heterogeneous photocatalysis, to inactivate ARB and remove ARGs in wastewater effluents has not yet been evaluated through a systematic and integrated approach. Consequently, this review seeks to provide an extensive and critical appraisal of the efficiency of these processes in inactivating ARB and removing ARGs in wastewater effluents, based on recent available scientific literature. It tries to elucidate how the key operating conditions may affect process efficiency, while pinpointing potential areas for further research and major knowledge gaps which need to be addressed. This review also aims at shedding light on the main oxidative damage pathways involved in the inactivation of ARB and removal of ARGs by these processes. In general, the lack and/or heterogeneity of the available scientific data, as well as the different methodological approaches applied in the various studies, make it difficult to accurately evaluate the efficiency of the processes applied.
Besides the operating conditions, the variable behavior observed for the various examined genetic constituents of the microbial community may be driven by the distinct oxidative damage mechanisms in place during the application of each treatment technology. For example, various studies have shown that the majority of cellular damage by advanced chemical oxidation may occur on the cell wall and membrane structures of the targeted bacteria, leaving the internal components of the cells relatively intact or able to repair damage. As a result, further in-depth mechanistic studies are required to establish the optimum operating conditions under which oxidative mechanisms target internal cell components, such as genetic material and ribosomal structures, more intensively, thus conferring permanent damage and/or death and preventing potential post-treatment re-growth. Copyright © 2017 Elsevier Ltd. All rights reserved.
Biobanking sustainability--experiences of the Australian Breast Cancer Tissue Bank (ABCTB).
Carpenter, Jane E; Clarke, Christine L
2014-12-01
Sustainability of biorepositories is a key issue globally. This article is a description of the different strategies and mechanisms used by the Australian Breast Cancer Tissue Bank (ABCTB) in developing and operating the resource since its inception in 2005. ABCTB operates according to a hub and spoke model, with a central management hub that is responsible for overall management of the resource including financial, ethical, and legal processes, researcher applications for material, clinical follow-up, information/database activities, and security. A centralized processing laboratory also operates from the hub site where DNA and RNA extractions are performed, digital imaging of stained tumor sections occurs, and specimens are assembled for dispatch for research projects. ABCTB collection sites where donors are identified, consent obtained, and specimens collected and processed for initial storage are located across Australia. Each of the activities of the resource requires financial support and different sources of revenue, some of which are allocated to a specific function of the ABCTB. Different models are in use at different collection centers where local variations may exist and local financial support may sometimes be obtained. There is also significant in-kind support by clinics and diagnostic and research facilities that house the various activities of the resource. However, long-term financial commitment to ensure the survival of the resource is not in place, and forward planning of operations remains challenging under these circumstances.
CFD-aided modelling of activated sludge systems - A critical review.
Karpinska, Anna M; Bridgeman, John
2016-01-01
Nowadays, one of the major challenges in the wastewater sector is the successful design and reliable operation of treatment processes, which must guarantee high treatment efficiencies to comply with effluent quality criteria while keeping investment and operating costs as low as possible. Although conceptual design and process control of activated sludge (AS) plants are key to ensuring these goals, they are still based on general empirical guidelines and operators' experience, often dominated by rules of thumb. This review paper discusses the rationale behind the use of Computational Fluid Dynamics (CFD) to model aeration, facilitating enhancement of treatment efficiency and reduction of energy input. Several single- and multiphase approaches commonly used in CFD studies of aeration tank operation are comprehensively described, whilst the shortcomings of the modelling assumptions imposed to evaluate mixing and mass transfer in AS tanks are identified and discussed. Examples and methods of coupling CFD data with biokinetics, accounting for the actual flow field and its impact on oxygen mass transfer and the yield of the biological processes occurring in the aeration tanks, are also critically discussed. Finally, modelling issues which remain unaddressed (e.g. coupling of the AS tank with the secondary clarifier and the use of population balance models to simulate bubbly flow or flocculation of the activated sludge) are also identified and discussed. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
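In its simplest lumped form, the oxygen-transfer side of the aeration models reviewed above reduces to the classical mass-balance ODE dC/dt = kLa (Cs - C), where kLa is the volumetric mass-transfer coefficient that CFD studies try to resolve spatially. The coefficients in the sketch below are illustrative, not taken from any of the reviewed studies.

```python
def dissolved_oxygen(kla_per_h, c_sat, c0, hours, dt=0.001):
    """Euler integration of dC/dt = kLa * (Cs - C).

    kla_per_h -- lumped volumetric mass-transfer coefficient (1/h)
    c_sat     -- oxygen saturation concentration (mg/L)
    c0        -- initial dissolved-oxygen concentration (mg/L)
    hours     -- integration horizon (h)
    """
    c, t = c0, 0.0
    while t < hours:
        c += dt * kla_per_h * (c_sat - c)   # transfer driven by the deficit
        t += dt
    return c
```

With, say, kLa = 4 per hour, Cs = 9 mg/L and C0 = 2 mg/L, the tank approaches saturation within a few hours; the CFD-biokinetics coupling discussed in the review essentially replaces the single lumped kLa with a field that varies with the resolved bubble flow.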
Rolling scheduling of electric power system with wind power based on improved NNIA algorithm
NASA Astrophysics Data System (ADS)
Xu, Q. S.; Luo, C. J.; Yang, D. J.; Fan, Y. H.; Sang, Z. X.; Lei, H.
2017-11-01
This paper puts forth a rolling modification strategy for day-ahead scheduling of an electric power system with wind power, which takes the unit operation cost increment and the grid's curtailed wind power as the two modification functions. Additionally, an improved Nondominated Neighbor Immune Algorithm (NNIA) is proposed for the solution. The proposed rolling scheduling model further reduces the system's operating cost in the intra-day generation process, enhances the system's capacity to accommodate wind power, and modifies key transmission-section power flows in a rolling manner to satisfy the grid's security constraints. The improved NNIA defines an antibody preference relation model based on the equal incremental rate, regulation deviation constraints, and the maximum and minimum technical outputs of units. The model noticeably guides the direction of antibody evolution, significantly speeds up convergence to the final solution, and enhances local search capability.
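The "equal incremental rate" criterion used in the antibody preference model is the classical economic-dispatch optimality condition: at the optimum, all units not pinned at a limit run at the same marginal cost. A bisection on that common marginal cost, with made-up quadratic cost curves a + b*P + c*P^2 (so marginal cost b + 2cP), can be sketched as follows; the unit data are invented for illustration.

```python
# (b, c, Pmin, Pmax) per unit; marginal cost of output P is b + 2*c*P.
units = [
    (20.0, 0.05, 10.0, 100.0),
    (18.0, 0.08, 10.0, 80.0),
    (22.0, 0.04, 10.0, 120.0),
]

def dispatch(demand, lo=0.0, hi=100.0, iters=60):
    """Bisect on the common marginal cost lambda until outputs meet demand."""
    for _ in range(iters):
        lam = 0.5 * (lo + hi)
        # Each unit's output at marginal cost lam, clipped to its limits.
        P = [min(max((lam - b) / (2 * c), pmin), pmax)
             for b, c, pmin, pmax in units]
        if sum(P) < demand:
            lo = lam     # marginal cost too low: produce more
        else:
            hi = lam     # marginal cost too high: produce less
    return P
```

In the paper's setting this condition ranks antibodies (candidate schedules) rather than being solved directly, but the dispatch sketch shows the underlying criterion the preference relation encodes.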
Extraction of Volatiles from Regolith or Soil on Mars, the Moon, and Asteroids
NASA Technical Reports Server (NTRS)
Linne, Diane; Kleinhenz, Julie; Trunek, Andrew; Hoffman, Stephen; Collins, Jacob
2017-01-01
NASA's Advanced Exploration Systems ISRU Technology Project is evaluating concepts to extract water from all resource types. Near-term objectives:
- Produce high-fidelity mass, power, and volume estimates for mining and processing systems
- Identify critical challenges for development focus
- Begin demonstration of component and subsystem technologies in relevant environments
Several processor types:
- Closed processors: either partially or completely sealed during processing
- Open-air processors: operate at Mars ambient conditions
- In-situ processors: extract product directly without excavation of the raw resource
Design features:
- Elimination of sweep gas reduces dust particles in the water condensate
- Pressure maintained by the height of soil in the hopper
A model was developed to evaluate key design parameters:
- Geometry: conveyor diameter, screw diameter, shaft diameter, flight spacing and pitch
- Operational: screw speed vs. screw length (residence time)
- Thermal: heat flux, heat transfer to soil
Testing to demonstrate feasibility and performance:
- Agglomeration, clogging
- Pressure rise, forced flow to condenser
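The "screw speed vs. screw length (residence time)" design parameter follows from a simple auger relation: each shaft revolution advances material by roughly one flight pitch, assuming no slip or backmixing (an optimistic idealization). A minimal sketch with made-up dimensions:

```python
def residence_time_s(screw_length_m, pitch_m, speed_rpm):
    """Idealized soil residence time in a screw conveyor.

    Assumes one flight pitch of axial advance per revolution,
    i.e. axial speed = pitch * rotational speed.
    """
    axial_speed = pitch_m * speed_rpm / 60.0   # m/s
    return screw_length_m / axial_speed

# Illustrative numbers: 1 m screw, 5 cm pitch, 2 rpm -> about 600 s
# of heated residence time for the soil.
t = residence_time_s(1.0, 0.05, 2.0)
```

The trade captured in the bullet above is visible directly: halving the screw speed or doubling the screw length doubles the time the soil spends over the heaters.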
Estevez, Claudio; Kailas, Aravind
2012-01-01
Millimeter-wave technology shows high potential for future wireless personal area networks, reaching over 1 Gbps transmission using simple modulation techniques. Current specifications divide the spectrum into easily separable spectrum ranges. These low requirements open a research area in time and space multiplexing techniques for millimeter waves. In this work, a process-stacking multiplexing access algorithm is designed for single-channel operation. The concept is intuitive, but its implementation is not trivial. The key to stacking single-channel events is to operate while simultaneously obtaining and handling a-posteriori time-frame information on scheduled events. This information is used to shift a global time pointer that the wireless access point manages and uses to synchronize all serviced nodes. The performance of the proposed multiplexing access technique is lower-bounded by the performance of legacy TDMA and can significantly improve the effective throughput. The work is validated by simulation results.
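The global-time-pointer idea can be illustrated with a toy scheduler. The class below is a hypothetical sketch, not the paper's algorithm: transmission grants are stacked back-to-back on the single channel, and a-posteriori reports of actual finish times shift the pointer so later grants stay contiguous (reclaiming time when a node finishes early).

```python
from dataclasses import dataclass, field

@dataclass
class SingleChannelScheduler:
    """Toy model of stacking events on one channel via a global time pointer."""
    pointer: float = 0.0                       # global time pointer (s)
    schedule: list = field(default_factory=list)

    def grant(self, node: str, requested: float) -> tuple:
        """Reserve the next slot immediately after all prior events."""
        start, end = self.pointer, self.pointer + requested
        self.schedule.append((node, start, end))
        self.pointer = end
        return start, end

    def report_actual(self, index: int, actual_end: float) -> None:
        """A-posteriori correction: when the most recent event finishes
        early (or late), shift the pointer so the next grant stacks
        directly behind the actual end of the channel activity."""
        node, start, _ = self.schedule[index]
        self.schedule[index] = (node, start, actual_end)
        if index == len(self.schedule) - 1:
            self.pointer = actual_end
```

A node that returns its slot early lets the access point hand the reclaimed time to the next node, which is the intuition behind the claim that the scheme is lower-bounded by fixed-slot TDMA.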
Reference dosimeter system of the IAEA
NASA Astrophysics Data System (ADS)
Mehta, Kishor; Girzikowsky, Reinhard
1995-09-01
Quality assurance programmes must be in operation at radiation processing facilities to satisfy national and international standards. Since dosimetry has a vital function in these QA programmes, it is imperative that the dosimetry systems in use at these facilities be well calibrated, with traceability to a Primary Standard Dosimetry Laboratory. As a service to the Member States, the International Atomic Energy Agency operates the International Dose Assurance Service (IDAS) to assist in this process. The transfer standard dosimetry system used for this service is based on ESR spectrometry. The paper describes the activities undertaken at the IAEA Dosimetry Laboratory to establish the QA programme for its reference dosimetry system. There are four key elements of such a programme: a quality assurance manual; calibration that is traceable to a Primary Standard Dosimetry Laboratory; a clear and detailed statement of uncertainty in the dose measurement; and periodic quality audits.
MODIS information, data and control system (MIDACS) operations concepts
NASA Technical Reports Server (NTRS)
Han, D.; Salomonson, V.; Ormsby, J.; Ardanuy, P.; Mckay, A.; Hoyt, D.; Jaffin, S.; Vallette, B.; Sharts, B.; Folta, D.
1988-01-01
The MODIS Information, Data, and Control System (MIDACS) Operations Concepts Document provides a basis for mutual understanding between the users and the designers of the MIDACS, including the requirements, operating environment, external interfaces, and development plan. In defining the concepts and scope of the system, the document describes how the MIDACS will operate as an element of the Earth Observing System (EOS) within the EosDIS environment. This version supersedes an earlier preliminary draft. The individual operations concepts for planning and scheduling, control and monitoring, data acquisition and processing, calibration and validation, data archive and distribution, and user access do not yet fully represent the requirements of the data system needed to achieve the scientific objectives of the MODIS instruments and science teams. The teams are not yet formed; however, it is possible to develop the operations concepts based on the present concept of EosDIS, the level 1 and level 2 Functional Requirements Documents, and interviews and meetings with key members of the scientific community. The operations concepts were exercised through the application of representative scenarios.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Jong Suk; Chen, Jun; Garcia, Humberto E.
2016-06-17
An RO (reverse osmosis) desalination plant is proposed as an effective FLR (flexible load resource) to be integrated into HES (hybrid energy systems) to support various types of ancillary services to the electric grid under variable operating conditions. To study the dynamic (transient) behavior of such a system, among the various unit operations within HES, special attention is given here to the detailed dynamic modeling and control design of the RO desalination process with a spiral-wound membrane module. The model incorporates key physical phenomena that have been investigated individually into a dynamic integrated model framework. In particular, the solution-diffusion model modified with the concentration polarization theory is applied to predict RO performance over a large range of operating conditions. Simulation results involving several case studies suggest that an RO desalination plant, acting as an FLR, can provide operational flexibility to participate in energy management at the utility scale by dynamically optimizing the use of excess electrical energy. Here, the incorporation of an additional commodity (fresh water) produced from an FLR allows a broader range of HES operations for maximizing overall system performance and profitability. For the purpose of assessing the incorporation of health assessment into process operations, an online condition monitoring approach for RO membrane fouling supervision is addressed in the case study presented.
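The solution-diffusion model with film-theory concentration polarization that the paper applies can be sketched as a fixed-point iteration: polarization raises the wall concentration, which raises the osmotic pressure and lowers the water flux Jw = A(dP - dPi). The permeability and mass-transfer coefficients below are illustrative placeholders, not the paper's calibrated values.

```python
import math

A_W = 3.0e-12   # water permeability, m/(s*Pa) (assumed)
K_MT = 4.0e-5   # film mass-transfer coefficient, m/s (assumed)
I_VH = 2        # van 't Hoff factor for a fully dissociated 1:1 salt
R, T = 8.314, 298.15

def osmotic_pressure(c_molar):
    """Dilute-solution osmotic pressure pi = i c R T, in Pa (c in mol/L)."""
    return I_VH * (c_molar * 1000.0) * R * T   # mol/L -> mol/m^3

def water_flux(dP, c_bulk, iters=50):
    """Fixed-point iteration coupling solution-diffusion with
    film-theory polarization: c_wall = c_bulk * exp(Jw / k)."""
    Jw = A_W * dP                              # first guess: no polarization
    for _ in range(iters):
        c_wall = c_bulk * math.exp(Jw / K_MT)  # concentration at the membrane
        Jw = A_W * (dP - osmotic_pressure(c_wall))
    return Jw
```

At a seawater-like feed (0.5 mol/L, 50 bar), the iteration settles on a flux noticeably below the polarization-free value, which is exactly the performance penalty the dynamic model must track as operating conditions swing with the grid's flexibility requests.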
GRAS NRT Precise Orbit Determination: Operational Experience
NASA Technical Reports Server (NTRS)
MartinezFadrique, Francisco M.; Mate, Alberto Agueda; Rodriquez-Portugal, Francisco Sancho
2007-01-01
EUMETSAT launched the meteorological satellite MetOp-A in October 2006; it is the first of the three satellites that constitute the EUMETSAT Polar System (EPS) space segment. This satellite carries a challenging and innovative instrument, the GNSS Receiver for Atmospheric Sounding (GRAS). The goal of the GRAS instrument is to support the production of atmospheric profiles of temperature and humidity with high accuracy, in an operational context, based on the bending of GPS signals traversing the atmosphere during the so-called occultation periods. One of the key aspects of GRAS data processing is the need to describe the satellite motion and GPS receiver clock behaviour with high accuracy and within very strict timeliness limitations. In addition to these severe requirements, the GRAS Product Processing Facility (PPF) must be integrated into the EPS core ground segment, which introduces additional complexity from the data integration and operational procedure points of view. This paper describes in detail the rationale and conclusions derived from the selection and implementation of the algorithms that meet the final orbit determination requirements (0.1 mm/s in velocity and 1 ns in receiver clock error at 1 Hz). It then describes the operational approach and draws ideas and conclusions from the operational experience.
Comprehensive assessment of the L-lysine production process from fermentation of sugarcane molasses.
Anaya-Reza, Omar; Lopez-Arenas, Teresa
2017-07-01
L-Lysine is an essential amino acid that can be produced by chemical processes from fossil raw materials, as well as by microbial fermentation, the latter being a more efficient and environmentally friendly procedure. In this work, the production process of L-lysine-HCl is studied using a systematic approach based on modeling and simulation, which supports decision making in the early stage of process design. The study considers two analysis stages: first, the dynamic analysis of the fermentation reactor, where the conversion of sugars from sugarcane molasses to L-lysine is carried out with a strain of Corynebacterium glutamicum. In this stage, the operation mode (either batch or fed-batch) and the operating conditions of the fermentation reactor are defined so as to maximize the technical criteria. The second analysis stage addresses the industrial production process of L-lysine-HCl, including the fermentation reactor, upstream processing, and downstream processing. In this stage, the influence of key parameters on the overall process performance is scrutinized through the evaluation of several technical, economic, and environmental criteria, to determine a profitable and sustainable design of the L-lysine production process. The main results show how the operating conditions, process design, and selection of evaluation criteria can influence the conceptual design. The best plant design shows maximum product yield (0.31 g L-lysine/g glucose) and productivity (1.99 g/L/h), achieving a 26.5% return on investment (ROI) with a payback period (PBP) of 3.8 years, decreasing water and energy consumption, and with a low potential environmental impact (PEI) index.
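As a quick consistency check on the two profitability criteria reported above: if ROI is taken as annual net profit over total capital investment, the simple payback period is roughly the reciprocal of the ROI. The capital figure below is an assumed placeholder, not a value from the study.

```python
# Simple-payback relation between the two reported criteria.
total_capital = 50e6      # USD, illustrative placeholder
roi_percent = 26.5        # return on investment reported in the abstract

annual_profit = total_capital * roi_percent / 100.0
payback_years = total_capital / annual_profit   # equals 100 / roi_percent

print(round(payback_years, 1))  # prints 3.8
```

The result (about 3.8 years, independent of the assumed capital) matches the PBP the abstract reports alongside the 26.5% ROI, suggesting the study's figures are consistent with the simple-payback definition.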
Energy Efficiency Model for Induction Furnace
NASA Astrophysics Data System (ADS)
Dey, Asit Kr
2018-01-01
In this paper, a solar induction furnace unit is designed to provide an alternative to the existing AC-powered heating process, supervised through a supervisory control and data acquisition (SCADA) system. The unit can be connected directly to the DC system without any internal conversion inside the device. The performance of the new system is compared with the existing one in terms of power consumption and losses. The work also investigates energy savings, system improvements, and a process control model for a foundry induction furnace heating framework supplied by PV solar power. The results are analysed over the long run in terms of energy savings and the integrated process system. A data-acquisition-based solar foundry plant is an extremely multifaceted system that can be run over an almost innumerable range of operating conditions, each characterized by a specific energy consumption. Determining ideal operating conditions is a key challenge that requires the latest automation technologies, each one contributing not only to the acquisition, processing, storage, retrieval and visualization of data, but also to the implementation of automatic control strategies that can expand the achievable envelope in terms of melting process, safety and energy efficiency.
IMAGE 100: The interactive multispectral image processing system
NASA Technical Reports Server (NTRS)
Schaller, E. S.; Towles, R. W.
1975-01-01
The need for rapid, cost-effective extraction of useful information from vast quantities of multispectral imagery available from aircraft or spacecraft has resulted in the design, implementation and application of a state-of-the-art processing system known as IMAGE 100. Operating on the general principle that all objects or materials possess unique spectral characteristics or signatures, the system uses this signature uniqueness to identify similar features in an image by simultaneously analyzing signatures in multiple frequency bands. Pseudo-colors, or themes, are assigned to features having identical spectral characteristics. These themes are displayed on a color CRT, and may be recorded on tape, film, or other media. The system was designed to incorporate key features such as interactive operation, user-oriented displays and controls, and rapid-response machine processing. Owing to these features, the user can readily control and/or modify the analysis process based on his knowledge of the input imagery. Effective use can be made of conventional photographic interpretation skills and state-of-the-art machine analysis techniques in the extraction of useful information from multispectral imagery. This approach results in highly accurate multitheme classification of imagery in seconds or minutes rather than the hours often involved in processing using other means.
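The signature-uniqueness principle described above can be illustrated with a toy nearest-signature classifier: each pixel is assigned the theme whose training signature is closest in multi-band space. The band values are invented, and this is not the IMAGE 100 implementation:

```python
import math

# Toy multispectral classifier (illustrative only): assign each pixel
# to the training signature nearest in band space, the general
# principle the IMAGE 100 abstract describes.
SIGNATURES = {            # hypothetical mean reflectance per band
    "water":      (10, 12, 5, 3),
    "vegetation": (20, 35, 15, 60),
    "soil":       (40, 38, 35, 30),
}

def classify(pixel):
    """Return the theme whose signature is closest (Euclidean) to pixel."""
    return min(SIGNATURES, key=lambda name: math.dist(pixel, SIGNATURES[name]))

print(classify((22, 33, 14, 58)))  # -> vegetation
```

A real system would train the signatures interactively from user-selected regions and classify all bands simultaneously in hardware, as the abstract notes.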
Continuous flow technology vs. the batch-by-batch approach to produce pharmaceutical compounds.
Cole, Kevin P; Johnson, Martin D
2018-01-01
For the manufacture of small molecule drugs, many pharmaceutical innovator companies have recently invested in continuous processing, which can offer significant technical and economic advantages over traditional batch methodology. This Expert Review will describe the reasons for this interest as well as many considerations and challenges that exist today concerning continuous manufacturing. Areas covered: Continuous processing is defined and many reasons for its adoption are described. The current state of continuous drug substance manufacturing within the pharmaceutical industry is summarized. Current key challenges to the implementation of continuous manufacturing are highlighted, and an outlook is provided regarding the prospects for continuous processing within the industry. Expert commentary: Continuous processing at Lilly has been a journey that started with the need for increased safety and capability. Over twelve years, the original small, dedicated group has grown to more than 100 Lilly employees across discovery, development, quality, manufacturing, and regulatory, working on continuous drug substance process design. Recently we have focused on linked continuous unit operations for the purpose of all-at-once pharmaceutical manufacturing, but the technical and business drivers that existed from the very beginning for stand-alone continuous unit operations in hybrid processes have persisted, which merits investment in both approaches.
Nathoo, Jeeten; Randall, Dyllon Garth
2016-01-01
Membrane distillation (MD) could be applicable in zero liquid discharge applications, because MD operates at high salinity ranges that are generally outside the scope of reverse osmosis (RO), although this requires proper management of precipitating salts to avoid membrane fouling. One way of managing these salts is with MD crystallisation (MDC). This paper focuses on the applicability of MDC for the treatment of mining wastewater by thermodynamically modelling the aqueous chemistry of the process at different temperatures. The paper is based on the typical brine generated from an RO process in the South African coal mining industry and investigates the effect that water recovery and operating temperature have on which salts are predicted to crystallise out, the sequence in which they crystallise, and their purities as a function of water recovery. The study confirmed the efficacy of thermodynamic modelling as a tool for investigating and predicting the crystallisation aspects of the MDC process. The key finding from this work was that, for an MDC process, a purer product can be obtained at higher operating temperatures and recoveries because of the inverse solubility of calcium sulphate.
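The water-recovery axis of such a study rests on a simple mass balance: if the membrane rejects all salt, recovering a fraction r of the water concentrates the remaining brine by a factor 1/(1 − r). A minimal sketch of that relationship (illustrative only, not the thermodynamic model used in the paper):

```python
# Brine concentration factor vs water recovery, assuming complete salt
# rejection by the membrane (a simplification; real MDC also loses
# solutes to the crystallising salts).
def concentration_factor(recovery):
    if not 0.0 <= recovery < 1.0:
        raise ValueError("recovery must be in [0, 1)")
    return 1.0 / (1.0 - recovery)

for r in (0.5, 0.8, 0.9):
    print(f"recovery {r:.0%} -> brine concentrated {concentration_factor(r):.0f}x")
```

This is why high recoveries push the brine past saturation for sparingly soluble salts such as calcium sulphate, making the crystallisation sequence the central design question.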
GPU: the biggest key processor for AI and parallel processing
NASA Astrophysics Data System (ADS)
Baji, Toru
2017-07-01
Two types of processors exist in the market: the conventional CPU and the graphics processing unit (GPU). A typical CPU is composed of 1 to 8 cores, while a GPU has thousands of cores. CPUs are good for sequential processing, while GPUs excel at accelerating software with heavy parallel workloads. The GPU was initially dedicated to 3D graphics. However, from 2006, when GPUs adopted general-purpose cores, it was recognized that this architecture could serve as a general-purpose massively parallel processor. NVIDIA developed a software framework, the Compute Unified Device Architecture (CUDA), that makes it possible to easily program the GPU for these applications. With CUDA, GPUs came into wide use in workstations and supercomputers. Recently, two key technologies are highlighted in the industry: artificial intelligence (AI) and autonomous driving cars. AI requires massive parallel operations to train many-layered neural networks; with a CPU alone, it was impossible to finish the training in a practical time, whereas the latest multi-GPU system with the P100 makes it possible to finish the training in a few hours. For autonomous driving cars, TOPS-class performance is required to implement perception, localization and path-planning processing, and again an SoC with an integrated GPU will play a key role there. In this paper, the evolution of the GPU, one of the biggest commercial devices requiring state-of-the-art fabrication technology, is introduced, together with an overview of the key GPU-demanding applications described above.
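The contrast drawn above between a few CPU cores and thousands of GPU cores is captured by Amdahl's law, a standard result not taken from this paper: the attainable speedup is limited by the serial fraction of the workload, which is why massive core counts pay off only for heavily parallel tasks such as neural-network training.

```python
# Amdahl's law (textbook result): speedup from n cores given the
# fraction of the workload that parallelizes.
def amdahl_speedup(parallel_fraction, n_cores):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

print(round(amdahl_speedup(0.95, 8), 1))     # 8-core CPU    -> 5.9
print(round(amdahl_speedup(0.95, 4096), 1))  # GPU-scale     -> 19.9
```

Even at 95% parallel work, thousands of cores buy only ~20x; workloads like dense-matrix training, which are nearly 100% parallel, are where GPUs dominate.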
STS-56, RSRM-031, 360L031 KSC processing configuration and data report
NASA Technical Reports Server (NTRS)
1993-01-01
KSC Processing Configuration and Data Report is being provided as a historical document and as an enhancement to future RSRM manufacturing and processing operations. The following sections provide information on segment receipt, aft booster build-up, booster assembly, and closeout for STS-56, RSRM flight set 360L031. Section 2.0 contains a summary of RSRM-031 processing. Section 3.0 discusses any significant problems or special issues that require special attention. Sections 4.0 through 6.0 contain narrative descriptions of all key events, including any related processing problems. Appendix A provides engineering specifications and changes. A list and matrix of all problem reports (PR's) pertinent to this flight set is provided in Appendix B. The matrix was provided by the Thiokol LSS Quality Engineering office. Copies of the PR's generated during the processing of RSRM-031 will be provided upon request. Appendix C contains the motor set status matrix, which provides milestone dates for the RSRM-031 flow. Section 7.0 provides recommendations, if any, for the improvement of flight hardware processing. Section 8.0 contains data sheets that provide flight hardware parts and consumables information installed during the booster build-up and stacking operations by location, lot/serial number, expiration and cure dates/times, and installation dates.
STS-51, RSRM-033, 360T033 KSC processing configuration and data report
NASA Technical Reports Server (NTRS)
Hillard, Robert C.
1993-01-01
KSC Processing Configuration and Data Report is being provided as a historical document and as an enhancement to future RSRM manufacturing and processing operations. The following sections provide information on segment receipt, aft booster build up, motor assembly, and closeout for STS-51, RSRM flight set 360T033. Section 2.0 contains a summary of RSRM-033 processing. Section 3.0 discusses any significant problems or special issues that require special attention. Sections 4.0 through 6.0 contain narrative descriptions of all key events, including any related processing problems. Appendix A provides engineering specifications and changes. A list and matrix of all problem reports (PR's) pertinent to this flight set is provided in Appendix B. The matrix was provided by the Thiokol LSS Quality Engineering office. Copies of the PR's generated during the processing of RSRM-033 will be provided upon request. Appendix C contains the motor set status matrix, which provides milestone dates for the RSRM-033 flow. Section 7.0 provides recommendations for the improvement of flight hardware processing. Section 8.0 contains data sheets that provide flight hardware parts and consumable information installed during the booster build-up and stacking operations by location, lot/serial number, expiration and cure dates/times, and installation dates.
Influence of temperature on the single-stage ATAD process predicted by a thermal equilibrium model.
Cheng, Jiehong; Zhu, Jun; Kong, Feng; Zhang, Chunyong
2015-06-01
Autothermal thermophilic aerobic digestion (ATAD) is a promising biological process that produces an effluent satisfying the Class A requirements on pathogen control and land application. The thermophilic temperature in an ATAD reactor is one of the critical factors affecting satisfactory operation of the ATAD process. This paper established a thermal equilibrium model to predict the effect of variables on the auto-rising temperature in an ATAD system. The results showed that reactors with volumes smaller than 10 m³ could not achieve temperatures higher than 45 °C at an ambient temperature of -5 °C, and that for small reactors the reactor volume played a key role in promoting the auto-rising temperature in winter. The thermophilic temperature achieved in small ATAD reactors did not depend entirely on the heat released by biological activity while degrading organic matter in the sludge, but was also related to the ambient temperature. Ratios of surface area to effective volume of less than 2.0 had little impact on the auto-rising temperature of an ATAD reactor, and the influence of ambient temperature on the auto-rising reactor temperature decreased with increasing reactor volume. High oxygen transfer efficiency had a significant influence on the internal temperature rise in an ATAD system, indicating that improving the oxygen transfer efficiency of the aeration devices is a key factor in achieving a higher removal rate of volatile solids (VS) during ATAD operation. Compared with aeration using cold air, hot air had a significant effect on maintaining the internal temperature (usually 4-5 °C higher).
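A minimal sketch of the kind of steady-state energy balance such a model rests on, assuming heat release proportional to reactor volume and wall losses proportional to surface area; all coefficients below are hypothetical, not values from the paper:

```python
# Toy steady-state balance: biological heat release (per unit volume)
# equals losses through the reactor surface, so the equilibrium
# temperature rises with volume for a fixed loss coefficient.
def steady_state_temp(t_ambient, q_bio, volume, u_loss, area):
    """Temperature where q_bio*V (W) balances u_loss*A*(T - T_amb) (W).

    q_bio  : volumetric heat release, W/m^3 (hypothetical)
    u_loss : overall heat-loss coefficient, W/(m^2 K) (hypothetical)
    """
    return t_ambient + q_bio * volume / (u_loss * area)

# Same ambient (-5 C) and kinetics; a 5 m^3 vs a 100 m^3 reactor.
small = steady_state_temp(-5.0, 150.0, 5.0, 2.0, 18.0)
large = steady_state_temp(-5.0, 150.0, 100.0, 2.0, 130.0)
print(round(small, 1), round(large, 1))  # -> 15.8 52.7
```

With these invented numbers the small reactor equilibrates well below the 45 °C thermophilic threshold while the larger one exceeds it, mirroring the volume effect the abstract reports for winter operation.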
A Novel Real-Time Reference Key Frame Scan Matching Method
Mohamed, Haytham; Moussa, Adel; Elhabiby, Mohamed; El-Sheimy, Naser; Sesay, Abu
2017-01-01
Unmanned aerial vehicles represent an effective technology for indoor search and rescue operations. Typically, most indoor mission environments are unknown, unstructured, and/or dynamic. Navigation of UAVs in such environments is addressed by the simultaneous localization and mapping (SLAM) approach, using either local or global methods. Both suffer from accumulated errors and high processing time due to the iterative nature of the scan matching method; moreover, point-to-point scan matching is prone to outlier associations. This paper proposes a low-cost novel method for 2D real-time scan matching based on a reference key frame (RKF). RKF is a hybrid scan matching technique comprising feature-to-feature and point-to-point approaches. The algorithm aims to mitigate error accumulation using the key frame technique, which is inspired by the video streaming broadcast process. It relies on the iterative closest point algorithm when linear features are lacking, as is typical of unstructured environments, and switches back to the RKF once linear features are detected. To validate and evaluate the algorithm, its mapping performance and time consumption are compared with various algorithms in static and dynamic environments. The algorithm exhibits promising navigation and mapping results with very short computation times, indicating its potential for use in real-time systems. PMID:28481285
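A toy sketch of the switching idea, not the authors' implementation: fall back to point-to-point nearest-neighbour association (the core step of ICP) when no linear features are available, otherwise use the feature-based reference-key-frame match.

```python
import math

# Toy hybrid matcher (illustrative): feature branch when lines are
# detected, nearest-neighbour point association otherwise.
def nearest_neighbour_pairs(scan, reference):
    """Associate each scan point with its closest reference point."""
    return [min(reference, key=lambda r: math.dist(p, r)) for p in scan]

def match(scan, reference, line_features_detected):
    if line_features_detected:
        return "RKF feature-to-feature match"   # placeholder branch
    return nearest_neighbour_pairs(scan, reference)

ref = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
scan = [(0.1, 0.0), (1.9, 0.1)]
print(match(scan, ref, line_features_detected=False))
# -> [(0.0, 0.0), (2.0, 0.0)]
```

The nearest-neighbour step is what makes plain ICP both iterative (repeat association, estimate transform, re-associate) and sensitive to outliers, which is the weakness the RKF feature branch mitigates.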
NASA Technical Reports Server (NTRS)
Buden, D.
1991-01-01
Topics dealing with nuclear safety are addressed which include the following: general safety requirements; safety design requirements; terrestrial safety; SP-100 Flight System key safety requirements; potential mission accidents and hazards; key safety features; ground operations; launch operations; flight operations; disposal; safety concerns; licensing; the nuclear engine for rocket vehicle application (NERVA) design philosophy; the NERVA flight safety program; and the NERVA safety plan.
14 CFR 437.31 - Verification of operating area containment and key flight-safety event limitations.
Code of Federal Regulations, 2010 CFR
2010-01-01
...(a) to contain its reusable suborbital rocket's instantaneous impact point within an operating area... limits on the ability of the reusable suborbital rocket to leave the operating area; or (2) Abort... requirements of § 437.59 to conduct any key flight-safety event so that the reusable suborbital rocket's...
14 CFR 437.31 - Verification of operating area containment and key flight-safety event limitations.
Code of Federal Regulations, 2013 CFR
2013-01-01
...(a) to contain its reusable suborbital rocket's instantaneous impact point within an operating area... limits on the ability of the reusable suborbital rocket to leave the operating area; or (2) Abort... requirements of § 437.59 to conduct any key flight-safety event so that the reusable suborbital rocket's...
14 CFR 437.31 - Verification of operating area containment and key flight-safety event limitations.
Code of Federal Regulations, 2012 CFR
2012-01-01
...(a) to contain its reusable suborbital rocket's instantaneous impact point within an operating area... limits on the ability of the reusable suborbital rocket to leave the operating area; or (2) Abort... requirements of § 437.59 to conduct any key flight-safety event so that the reusable suborbital rocket's...
14 CFR 437.31 - Verification of operating area containment and key flight-safety event limitations.
Code of Federal Regulations, 2011 CFR
2011-01-01
...(a) to contain its reusable suborbital rocket's instantaneous impact point within an operating area... limits on the ability of the reusable suborbital rocket to leave the operating area; or (2) Abort... requirements of § 437.59 to conduct any key flight-safety event so that the reusable suborbital rocket's...
14 CFR 437.31 - Verification of operating area containment and key flight-safety event limitations.
Code of Federal Regulations, 2014 CFR
2014-01-01
...(a) to contain its reusable suborbital rocket's instantaneous impact point within an operating area... limits on the ability of the reusable suborbital rocket to leave the operating area; or (2) Abort... requirements of § 437.59 to conduct any key flight-safety event so that the reusable suborbital rocket's...
Research on the EDM Technology for Micro-holes at Complex Spatial Locations
NASA Astrophysics Data System (ADS)
Y Liu, J.; Guo, J. M.; Sun, D. J.; Cai, Y. H.; Ding, L. T.; Jiang, H.
2017-12-01
To meet the demands of machining micro-holes at complex spatial locations, several key technical problems are addressed: development of the micro-electrical discharge machining (micro-EDM) power supply system, design of the host structure, and the machining process technology. By developing the low-voltage power supply circuit, high-voltage circuit, micro and precision machining circuit and clearance detection system, a narrow-pulse, high-frequency six-axis EDM power supply system is developed to meet the demands of micro-hole discharge machining. By combining CAD structure design, CAE simulation analysis, modal testing, ODS (Operational Deflection Shapes) testing and theoretical analysis, the host construction and key axes of the machine tool are optimized to meet the positioning demands of the micro-holes, and a special deionized water filtration system is developed to keep the machining process stable. The machining equipment and processing techniques developed in this paper are verified by developing the micro-hole processing flow and testing it on the real machine tool. The final test results show that the efficient micro-EDM pulse power supply system, machine tool host system, deionized water filtration system and processing method developed in this paper meet the demands of machining micro-holes at complex spatial locations.
24 CFR 3280.105 - Exit facilities; exterior doors.
Code of Federal Regulations, 2013 CFR
2013-04-01
... opening. (3) Each swinging exterior door other than screen or storm doors shall have a key-operated lock... the use of a key for operation from the inside. (4) All exterior doors, including storm and screen...
24 CFR 3280.105 - Exit facilities; exterior doors.
Code of Federal Regulations, 2012 CFR
2012-04-01
... opening. (3) Each swinging exterior door other than screen or storm doors shall have a key-operated lock... the use of a key for operation from the inside. (4) All exterior doors, including storm and screen...
24 CFR 3280.105 - Exit facilities; exterior doors.
Code of Federal Regulations, 2010 CFR
2010-04-01
... opening. (3) Each swinging exterior door other than screen or storm doors shall have a key-operated lock... the use of a key for operation from the inside. (4) All exterior doors, including storm and screen...
24 CFR 3280.105 - Exit facilities; exterior doors.
Code of Federal Regulations, 2011 CFR
2011-04-01
... opening. (3) Each swinging exterior door other than screen or storm doors shall have a key-operated lock... the use of a key for operation from the inside. (4) All exterior doors, including storm and screen...
Soós, Reka; Whiteman, Andrew D; Wilson, David C; Briciu, Cosmin; Nürnberger, Sofia; Oelz, Barbara; Gunsilius, Ellen; Schwehn, Ekkehard
2017-08-01
This is the second of two papers reporting the results of a major study considering 'operator models' for municipal solid waste management (MSWM) in emerging and developing countries. Part A documents the evidence base, while Part B presents a four-step decision support system for selecting an appropriate operator model in a particular local situation. Step 1 focuses on understanding local problems and framework conditions; Step 2 on formulating and prioritising local objectives; and Step 3 on assessing capacities and conditions, and thus identifying strengths and weaknesses, which underpin selection of the operator model. Step 4A addresses three generic questions, including public versus private operation, inter-municipal co-operation and integration of services. For steps 1-4A, checklists have been developed as decision support tools. Step 4B helps choose locally appropriate models from an evidence-based set of 42 common operator models (coms); decision support tools here are a detailed catalogue of the coms, setting out advantages and disadvantages of each, and a decision-making flowchart. The decision-making process is iterative, repeating steps 2-4 as required. The advantages of a more formal process include avoiding pre-selection of a particular com known to and favoured by one decision maker, and also its assistance in identifying the possible weaknesses and aspects to consider in the selection and design of operator models. To make the best of whichever operator models are selected, key issues which need to be addressed include the capacity of the public authority as 'client', management in general and financial management in particular.
Voigt, Wieland; Hoellthaler, Josef; Magnani, Tiziana; Corrao, Vito; Valdagni, Riccardo
2014-01-01
Multidisciplinary care of prostate cancer is increasingly offered in specialised cancer centres. It requires the optimisation of medical and operational processes and the integration of the different medical and non-medical stakeholders. The aim was to develop a standardised operational process assessment tool, based on the capability maturity model integration (CMMI), able to support the implementation of multidisciplinary care and improve process quality and efficiency. Information for model development was derived from medical experts, clinical guidelines, best-practice elements of renowned cancer centres, and the scientific literature. Data were organised in a hierarchically structured model consisting of 5 categories, 30 key process areas, 172 requirements, and more than 1500 criteria. Compliance with requirements was assessed through structured on-site surveys covering all relevant clinical and management processes; comparison with best-practice standards allowed improvements to be recommended. 'Act On Oncology' (AoO) was applied in a pilot study on a prostate cancer unit in Europe. Several best-practice elements, such as multidisciplinary clinics or advanced organisational measures for patient scheduling, were observed. Substantial opportunities were found in other areas such as centre management and infrastructure. As first improvements, the evaluated centre administration described and formalised the organisation of the prostate cancer unit, with defined personnel assignments and clinical activities, and a formal agreement is being worked out to provide structured access to First-Aid Posts. In the pilot study, the AoO approach proved feasible for identifying opportunities for process improvement, and measures were derived that might increase operational process quality and efficiency.
Dynamic analysis of process reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shadle, L.J.; Lawson, L.O.; Noel, S.D.
1995-06-01
The approach and methodology of conducting a dynamic analysis is presented in this poster session in order to describe how this type of analysis can be used to evaluate the operation and control of process reactors. Dynamic analysis of the PyGas{trademark} gasification process is used to illustrate the utility of this approach. PyGas{trademark} is the gasifier being developed for the Gasification Product Improvement Facility (GPIF) by Jacobs-Siffine Engineering and Riley Stoker. In the first step of the analysis, process models are used to calculate the steady-state conditions and associated sensitivities for the process. For the PyGas{trademark} gasifier, the process models are non-linear mechanistic models of the jetting fluidized-bed pyrolyzer and the fixed-bed gasifier. These process sensitivities are key input, in the form of gain parameters or transfer functions, to the dynamic engineering models.
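The "gain parameters or transfer functions" handed to the dynamic models can be illustrated with the generic first-order lag, K(1 − e^(−t/τ)); the gain and time constant below are hypothetical, not PyGas values:

```python
import math

# Generic first-order step response: a steady-state sensitivity (gain K)
# and a time constant tau summarise how the process output moves after a
# unit step in an input. Values here are hypothetical.
def first_order_step(gain, tau, t):
    return gain * (1.0 - math.exp(-t / tau))

K, tau = 2.5, 30.0   # hypothetical gain and time constant (s)
for t in (30.0, 90.0, 300.0):
    print(f"t={t:5.0f}s  response={first_order_step(K, tau, t):.2f}")
```

After one time constant the output has covered ~63% of the gain, and after ten it has effectively reached the steady-state sensitivity K, which is exactly the quantity the steady-state models supply.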
Robotic operation of the Observatorio Astrofísico de Javalambre
NASA Astrophysics Data System (ADS)
Yanes-Díaz, A.; Antón, J. L.; Rueda-Teruel, S.; Guillén-Civera, L.; Bello, R.; Jiménez-Mejías, D.; Chueca, S.; Lasso-Cabrera, N. M.; Suárez, O.; Rueda-Teruel, F.; Cenarro, A. J.; Cristóbal-Hornillos, D.; Marín-Franch, A.; Luis-Simoes, R.; López-Alegre, G.; Rodríguez-Hernández, M. A. C.; Moles, M.; Ederoclite, A.; Varela, J.; Vázquez Ramió, H.; Díaz-Martí, M. C.; Iglesias-Marzoa, R.; Maicas, N.; Lamadrid, J. L.; López-Sainz, A.; Hernández-Fuertes, J.; Valdivielso, L.
2015-05-01
The Observatorio Astrofísico de Javalambre (OAJ) is a new astronomical facility located at the Sierra de Javalambre (Teruel, Spain) whose primary role will be to conduct all-sky astronomical surveys with two unprecedented telescopes of unusually large fields of view: the JST/T250, a 2.55 m telescope with a 3 deg field of view, and the JAST/T80, an 83 cm telescope with a 2 deg field of view. The CEFCA engineering team has designed the OAJ control system as a global concept to manage, monitor, control and maintain all the observatory systems, including not only astronomical subsystems but also infrastructure and other facilities. Three main factors have been considered in the design of a global control system for the robotic OAJ: quality, reliability and efficiency. We propose a Control Integrated Architecture (CIA) design with Overall Equipment Effectiveness (OEE) as a key performance indicator, in order to improve operation processes, minimize resources and achieve large cost reductions while maintaining quality requirements. Here we present the OAJ robotic control strategy for achieving maximum quality and efficiency in the observatory surveys, processes and operations, giving practical examples of our approach.
Nguyen, D Duc; Ngo, H Hao; Guo, W; Nguyen, T Thanh; Chang, Soon W; Jang, A; Yoon, Yong S
2016-09-01
This paper evaluated a novel pilot-scale electrocoagulation (EC) system for improving total phosphorus (TP) removal from municipal wastewater. The EC system was operated in continuous and batch mode under differing conditions (e.g. flow rate, initial concentration, electrolysis time, conductivity, voltage) to evaluate phosphorus removal and the corresponding electrical energy consumption. The results demonstrated that the EC system could effectively remove phosphorus to meet current stringent discharge standards of less than 0.2 mg/L within 2 to 5 min, a target achieved across all ranges of initial TP concentrations studied. It was also found that an increase in solution conductivity, voltage, or electrolysis time correlated with improved TP removal efficiency and reduced specific energy consumption. Based on these results, some key economic considerations, such as operating costs, cost-effectiveness, product manufacturing feasibility, facility design and retrofitting, and program implementation, are also discussed. In conclusion, this EC process can be highly efficient, relatively simple, easily managed, and cost-effective for wastewater treatment.
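Specific energy consumption, the quantity behind the operating-cost discussion, follows from standard electrical bookkeeping; the operating point below is hypothetical, not a measurement from the pilot plant:

```python
# Specific energy consumption of an electrochemical batch treatment:
# energy = V * I * t, normalised by treated volume. All numbers here
# are hypothetical, chosen only to show the unit conversions.
def specific_energy_kwh_per_m3(voltage_v, current_a, time_min, volume_l):
    """Electrical energy per unit of treated wastewater, kWh/m^3."""
    energy_kwh = voltage_v * current_a * (time_min / 60.0) / 1000.0
    volume_m3 = volume_l / 1000.0
    return energy_kwh / volume_m3

# e.g. 20 V, 5 A, 3 min of electrolysis on a 50 L batch
print(round(specific_energy_kwh_per_m3(20.0, 5.0, 3.0, 50.0), 3))  # -> 0.1
```

This also makes the abstract's trend plausible: raising conductivity lets the same current flow at lower voltage, cutting V·I·t per cubic metre treated.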
NASA Technical Reports Server (NTRS)
Huber, W. C.
1986-01-01
Voice synthesizer tells what key is about to be depressed. Verbal feedback useful for blind operators or where dim light prevents sighted operator from seeing keyboard. Also used where operator is busy observing other things while keying data into control system. Used as training aid for touch typing, and to train blind operators to use both standard and braille keyboards. Concept adapted to such equipment as typewriters, computers, calculators, telephones, cash registers, and on/off controls.
Military Review: Operation Desert Shield/Desert Storm
1991-09-01
... areas of responsibility and objectives to influence the outcome of the battle ... where he could best interact with key operational commanders ... Air component commander for air support, the Saudi Joint Forces Command ... Key to successful operational command was the interaction of the two command ... available news people in the field ... the operational security/troop safety problem ...
Development of an advanced uncooled 10-Gb DFB laser for volume manufacture
NASA Astrophysics Data System (ADS)
Burns, Gordon; Charles, Paul M.
2003-03-01
Optical communication systems operating at 10Gbit/s, such as 10Gigabit Ethernet, are becoming more and more important in Local Area Networks (LAN) and Metropolitan Area Networks (MAN). This market requires optical transceivers of low cost, size and power consumption, which drives a need for uncooled DFB lasers directly modulated at 10Gbit/s. This paper describes the development of a state-of-the-art uncooled high-speed DFB laser capable of being manufactured in high volume at the low cost demanded by the GbE market. The DFB laser was designed by developing technological building blocks within the 'conventional' InGaAsP materials system, using existing well-proven manufacturing process modules wherever possible and limiting the design risk to a few key areas where innovation was required. The temperature and speed performance of the InGaAsP SMQW active layer system was carefully optimized and then coupled with a low-parasitic lateral confinement system. Using concurrent engineering, the new processes were demonstrated to have acceptable process capability within a manufacturing fabrication environment, proving their ability to support high-volume manufacturing requirements. The fabricated DFB laser was shown to operate at 100C chip temperature with an open eye at 10Gbit/s operation (with an extinction ratio >5dB). Up to 90C operation, this DFB shows threshold currents as low as 29mA and optical power as high as 13mW, and it meets the 10Gb scaled Ethernet mask with an extinction ratio >6dB. It was found that the high-temperature dynamic behavior of these lasers could not be fully predicted from static test data. A production test strategy was therefore followed in which equipment was designed to fully test devices/subassemblies at 100C and up to 20Gbit/s at key points in the product build. This facilitated rapid optimisation of product yields upon manufacturing ramp-up and minimisation of product costs. This state-of-the-art laser has now been transferred into volume manufacture.
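The extinction ratios quoted for the eye diagrams use the standard definition, the ratio of the optical "1" and "0" power levels in dB; a quick sketch with hypothetical power levels:

```python
import math

# Standard extinction-ratio definition for a directly modulated laser:
# 10*log10(P1/P0), with P1 and P0 the "1" and "0" optical power levels.
# The example power levels are hypothetical.
def extinction_ratio_db(p1_mw, p0_mw):
    return 10.0 * math.log10(p1_mw / p0_mw)

print(round(extinction_ratio_db(4.0, 1.0), 1))  # 4 mW vs 1 mW -> 6.0 dB
```

A 4:1 power ratio thus already clears the >5 dB eye-mask figure mentioned in the abstract.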
Evolving Tale of TCPs: New Paradigms and Old Lacunae
Dhaka, Namrata; Bhardwaj, Vasudha; Sharma, Manoj K.; Sharma, Rita
2017-01-01
Teosinte Branched1/Cycloidea/Proliferating cell factors (TCP) genes are key mediators of genetic innovations underlying morphological novelties, stress adaptation, and evolution of immune response in plants. They have a remarkable ability to integrate and translate diverse endogenous, and environmental signals with high fidelity. Compilation of studies, aimed at elucidating the mechanism of TCP functions, shows that it takes an amalgamation and interplay of several different factors, regulatory processes and pathways, instead of individual components, to achieve the incredible functional diversity and specificity, demonstrated by TCP proteins. Through this minireview, we provide a brief description of key structural features and molecular components, known so far, that operate this conglomerate, and highlight the important conceptual challenges and lacunae in TCP research. PMID:28421104
Scarani, Valerio; Renner, Renato
2008-05-23
We derive a bound for the security of quantum key distribution with finite resources under one-way postprocessing, based on a definition of security that is composable and has an operational meaning. While our proof relies on the assumption of collective attacks, unconditional security follows immediately for standard protocols such as Bennett-Brassard 1984 and the six-state protocol. For single-qubit implementations of such protocols, we find that the secret key rate becomes positive when at least N ≈ 10^5 signals are exchanged and processed. For any other discrete-variable protocol, unconditional security can be obtained using the exponential de Finetti theorem, but the additional overhead leads to very pessimistic estimates.
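The asymptotic benchmark that such finite-size rates approach is the textbook one-way key rate for BB84 under collective attacks (the Shor-Preskill bound); a minimal sketch of that benchmark, not the paper's finite-key formula:

```python
from math import log2

def h(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bb84_asymptotic_rate(q):
    """Asymptotic one-way secret key rate for BB84 at QBER q
    (Shor-Preskill bound): r = 1 - 2*h(q), floored at zero."""
    return max(0.0, 1.0 - 2.0 * h(q))

# the rate vanishes near the well-known ~11% QBER threshold
print(bb84_asymptotic_rate(0.05))  # positive
print(bb84_asymptotic_rate(0.12))  # 0.0
```

The finite-size rates derived in the paper converge to this curve only as the number of exchanged signals N grows large.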
Design and Applications of Rapid Image Tile Producing Software Based on Mosaic Dataset
NASA Astrophysics Data System (ADS)
Zha, Z.; Huang, W.; Wang, C.; Tang, D.; Zhu, L.
2018-04-01
Map tile technology is widely used in web geographic information services, and efficient production of map tiles is a key technology for the rapid serving of images on the web. In this paper, rapid tile-production software for image data based on a mosaic dataset is designed, and the tile-production workflow is presented. Key technologies such as cluster processing, map representation, tile checking, tile conversion, and in-memory compression are discussed. The software was implemented and tested with actual image data; the results show a high degree of automation, effectively reducing the number of I/O operations and improving tile-production efficiency while significantly reducing manual operations.
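The indexing scheme underlying such tile pyramids is standard: each zoom level z partitions the Web Mercator projection into 2^z by 2^z tiles. A minimal sketch of the lon/lat-to-tile-index mapping (illustrative only; the paper's software is not public):

```python
import math

def lonlat_to_tile(lon, lat, zoom):
    """Convert WGS84 lon/lat to XYZ (slippy-map) tile indices
    in the standard Web Mercator tiling scheme."""
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

# example: a point near Beijing (116.4E, 39.9N) at zoom level 10
print(lonlat_to_tile(116.4, 39.9, 10))
```

A cluster-based producer can then shard the (z, x, y) key space across workers, which is what makes tile generation embarrassingly parallel.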
Sahmel, J; Devlin, K; Burns, A; Ferracini, T; Ground, M; Paustenbach, D
2013-01-01
Benzene, a known carcinogen, can be generated as a by-product during the use of petroleum-based raw materials in chemical manufacturing. The aim of this study was to analyze a large data set of benzene air concentration measurements collected over nearly 40 years during routine employee exposure monitoring at a petrochemical manufacturing facility. The facility used ethane, propane, and natural gas as raw materials in the production of common commercial materials such as polyethylene, polypropylene, waxes, adhesives, alcohols, and aldehydes. In total, 3607 benzene air samples were collected at the facility from 1962 to 1999. Of these, 2359 were long-term (>1 h) personal exposure samples collected during routine operations between 1974 and 1999. These samples were analyzed by division, department, and job title to establish employee benzene exposures in different areas of the facility over time. Sampling data were also analyzed by key events over time, including changes in the occupational exposure limits (OELs) for benzene and key equipment process changes at the facility. Although mean benzene concentrations varied by operation, in nearly all cases measured benzene quantities were below the OEL in place at the time (10 ppm for 1974-1986 and 1 ppm for 1987-1999). Decreases in mean benzene air concentrations were also found when data were evaluated according to 7- to 10-yr periods following key equipment process changes. Further, an evaluation of mortality rates for a retrospective employee cohort (n = 3938) demonstrated that the average personal benzene exposures at this facility (0.89 ppm for 1974-1986 and 0.125 ppm for 1987-1999) did not result in increased standardized mortality ratios (SMRs) for diseases or malignancies of the lymphatic system.
The robust nature of this data set provides comprehensive exposure information that may be useful for assessing human benzene exposures at similar facilities. The data also provide a basis for comparable measured exposure levels and the potential for adverse health effects. These data may also prove beneficial for comparing relative exposure potential for production versus nonproduction operations and the relationship between area and personal breathing zone samples.
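The SMR metric used in the cohort evaluation is the ratio of observed deaths to the deaths expected if the cohort experienced reference-population rates; a minimal sketch with hypothetical numbers (not the study's data):

```python
def smr(observed_deaths, person_years_by_stratum, reference_rates_by_stratum):
    """Standardized mortality ratio: observed deaths divided by the
    deaths expected under stratum-specific reference rates."""
    expected = sum(py * rate for py, rate in
                   zip(person_years_by_stratum, reference_rates_by_stratum))
    return observed_deaths / expected

# hypothetical cohort with two age strata
py = [50_000, 30_000]     # person-years of follow-up per stratum
ref = [0.0002, 0.0010]    # reference deaths per person-year per stratum
print(smr(35, py, ref))   # expected = 10 + 30 = 40, so SMR = 0.875
```

An SMR below 1.0, as reported for this cohort's lymphatic diseases, indicates fewer deaths than expected from the reference rates.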
NASA Astrophysics Data System (ADS)
Wan, Jiangping; Jones, James D.
2013-11-01
The Warfield version of systems science supports a wide variety of application areas and is useful to practitioners who use the work program of complexity (WPOC) tool. In this article, WPOC is applied to information technology service management (ITSM) for managing the complexity of projects. In discussing the application of WPOC to ITSM, we discuss several steps of WPOC. The discovery step of WPOC consists of a description process and a diagnosis process. During the description process, 52 risk factors are identified, which are then narrowed to 20 key risk factors; all of this is done through interviews and surveys. Root risk factors (the most basic risk factors) consist of 11 kinds of common 'mindbugs' selected from an interpretive structural model, achieved through empirical analysis of 25 kinds of mindbugs. (A lesser aim of this research is to affirm that these mindbugs, developed from a Western mindset, have corresponding relevance in a completely different culture: the People's Republic of China.) During the diagnosis process, the relationships among the root risk factors in the implementation of the ITSM project are identified. The resolution step of WPOC consists of a design process and an implementation process. During the design process, issues related to the ITSM application are compared to both e-Government operation and maintenance and software process improvement; the ITSM knowledge support structure is also designed at this time. During the implementation process, 10 keys to the successful implementation of ITSM projects are identified.
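The interpretive structural model mentioned above is built from a reachability matrix, the transitive closure of a binary direct-influence matrix among factors. A minimal sketch on a toy three-factor example (hypothetical data, not the article's 20 risk factors):

```python
def reachability(adj):
    """Boolean transitive closure (Warshall's algorithm) of a
    direct-influence matrix, including self-reachability, as used to
    build the reachability matrix in interpretive structural modeling."""
    n = len(adj)
    r = [[bool(adj[i][j]) or i == j for j in range(n)] for i in range(n)]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                r[i][j] = r[i][j] or (r[i][k] and r[k][j])
    return r

# toy direct-influence matrix: factor 0 drives 1, factor 1 drives 2
adj = [[0, 1, 0],
       [0, 0, 1],
       [0, 0, 0]]
closure = reachability(adj)
print(closure[0][2])  # True: factor 0 reaches factor 2 via factor 1
```

Level partitioning of the closure then yields the hierarchy from which root (most basic) factors are read off.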
Resource quality of a symmetry-protected topologically ordered phase for quantum computation.
Miller, Jacob; Miyake, Akimasa
2015-03-27
We investigate entanglement naturally present in the 1D topologically ordered phase protected with the on-site symmetry group of an octahedron as a potential resource for teleportation-based quantum computation. We show that, as long as certain characteristic lengths are finite, all its ground states have the capability to implement any unit-fidelity one-qubit gate operation asymptotically as a key computational building block. This feature is intrinsic to the entire phase, in that perfect gate fidelity coincides with perfect string order parameters under a state-insensitive renormalization procedure. Our approach may pave the way toward a novel program to classify quantum many-body systems based on their operational use for quantum information processing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anton, David
The proposed project built on the foundation of several years of intensive and ground-breaking R&D work at Cellana's Kona Demonstration Facility (KDF). Phycological and engineering solutions were provided to tackle key cultivation issues and technical barriers limiting algal biomass productivity, identified through work conducted outdoors at industrial (1 acre) scale. The objectives of this project were to significantly improve algal biomass productivity and reduce operational cost in a seawater-based system, using results obtained from two top-performing algal strains as the baseline, while technically advancing and, more importantly, integrating the various unit operations involved in algal biomass production, processing, and refining.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vanvick, T.W.
The Logan Generating Plant (formerly Keystone Cogeneration Project) is a 230 MW (gross) pulverized coal cogeneration facility located on the Delaware River in Logan Township, New Jersey, off Route 130. Owned and operated by U.S. Generating Company, the plant was built by Bechtel Corporation, which provided engineering, procurement, construction, and startup services. Power from the plant is furnished to Atlantic Electric, and approximately 50,000 pounds of process steam per hour is provided to Monsanto's adjacent facility. U.S. Generating Company is committed to operating plants with close attention to the environment and has developed a specific Environmental Mission Statement. This paper addresses some of the key environmental features at the Logan Generating Plant.
Development Of Simulation Model For Fluid Catalytic Cracking
NASA Astrophysics Data System (ADS)
Ghosh, Sobhan
2010-10-01
Fluid Catalytic Cracking (FCC) is the most widely used secondary conversion process in the refining industry for producing gasoline, olefins, and middle distillate from heavier petroleum fractions. There are more than 500 units in the world, with a total processing capacity of about 17 to 20% of the crude capacity. FCC catalyst is the most heavily consumed catalyst in the process industry. On one hand, FCC is quite flexible with respect to its ability to process a wide variety of crudes with a flexible product yield pattern; on the other hand, the interdependence of the major operating parameters makes the process extremely complex. An operating unit is self-balancing, and some fluctuations in the independent parameters are automatically adjusted by changes in the temperatures and flow rates at different sections. However, a good simulation model is very useful to the refiner to get the best out of the process: to select the best catalyst and to cope with day-to-day changes in feed quality and in the demand for the different products of the FCC unit. In addition, a good model is of great help in designing the process units and peripherals. A simple empirical model is often adequate to monitor day-to-day operations, but it is of no use in handling other problems such as catalyst selection or design/modification of the plant. For this, a rigorous kinetics-based model is required. Considering the complexity of the process, with a large number of chemical species undergoing many parallel and consecutive reactions, it is virtually impossible to develop a simulation model based on the full set of kinetic parameters. The most common approach is to settle for a semi-empirical model. We shall take up the key issues in developing an FCC model and the contribution of such models to the optimum operation of the plant.
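A classic semi-empirical starting point of the kind described is Weekman's three-lump kinetic model (gas oil cracks to gasoline and to light gas plus coke; gasoline overcracks), with second-order cracking of the feed. A sketch with illustrative rate constants, not fitted values:

```python
def three_lump_step(y, k1, k2, k3, dt):
    """One explicit-Euler step of the three-lump FCC model:
    gas oil (A) cracks to gasoline (B) and to light gas + coke (C);
    gasoline overcracks to C. Feed cracking is second order."""
    A, B, C = y
    dA = -(k1 + k3) * A * A          # gas oil consumed (2nd order)
    dB = k1 * A * A - k2 * B         # gasoline formed, then overcracked
    dC = k3 * A * A + k2 * B         # light gas + coke accumulate
    return (A + dA * dt, B + dB * dt, C + dC * dt)

# illustrative constants; weight fractions; 5 s of residence time
y = (1.0, 0.0, 0.0)
k1, k2, k3, dt = 0.6, 0.1, 0.2, 0.01
for _ in range(500):
    y = three_lump_step(y, k1, k2, k3, dt)
print(y)  # gasoline yield passes through a maximum as severity rises
```

Mass is conserved by construction (the three rate terms sum to zero), which is a quick sanity check when extending the lumping scheme.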
Medical imaging and registration in computer assisted surgery.
Simon, D A; Lavallée, S
1998-09-01
Imaging, sensing, and computing technologies that are being introduced to aid in the planning and execution of surgical procedures are providing orthopaedic surgeons with a powerful new set of tools for improving clinical accuracy, reliability, and patient outcomes while reducing costs and operating times. Current computer assisted surgery systems typically include a measurement process for collecting patient specific medical data, a decision making process for generating a surgical plan, a registration process for aligning the surgical plan to the patient, and an action process for accurately achieving the goals specified in the plan. Some of the key concepts in computer assisted surgery applied to orthopaedics are outlined, with a focus on the basic framework and underlying technologies. In addition, technical challenges and future trends in the field are discussed.
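The registration process mentioned above is commonly solved, for corresponding fiducial points, by least-squares rigid alignment via the SVD (the Arun/Kabsch method). A minimal sketch under that standard formulation, not any specific surgical system's implementation:

```python
import numpy as np

def rigid_register(P, Q):
    """Least-squares rotation R and translation t mapping point set P
    onto Q (rows are corresponding 3D points), via SVD (Kabsch/Arun)."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                  # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t

# recover a known transform from noiseless synthetic fiducial points
rng = np.random.default_rng(0)
P = rng.random((6, 3))
angle = 0.3
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
Q = P @ R_true.T + np.array([1.0, -2.0, 0.5])
R, t = rigid_register(P, Q)
print(np.allclose(R, R_true), np.allclose(P @ R.T + t, Q))
```

In practice the point pairs come from tracked probes or surface digitization, and the residual after alignment is reported as registration error.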
JWST Operations and the Phase I and II Process
NASA Astrophysics Data System (ADS)
Beck, Tracy L.
2010-07-01
The JWST operations and Phase I and Phase II process will build upon our knowledge of the current system in use for HST. The primary observing overheads associated with JWST observations, both direct and indirect, are summarized. While some key operations constraints for JWST may cause deviations from the HST model for proposal planning, the overall interface to JWST planning will use the APT and will appear similar to the HST interface. The requirement is to have a proposal planning model similar to HST's, where proposals submitted to the TAC must have at least the minimum amount of information necessary to assess the strength of the science. However, a goal of the JWST planning process is to have the submitted Phase I proposal in executable form, and as complete as possible, for many programs. JWST will have significant constraints on the spacecraft pointing and orientation, so it is beneficial for the planning process to have these scheduling constraints on programs defined as early as possible. The guide field of JWST is also much smaller than the HST guide field, so searches for available guide stars for JWST science programs must be done at the Phase I deadline. The long range observing plan (LRP) for each JWST cycle will be generated initially from the TAC-accepted programs at the Phase I deadline, and the LRP will be refined after the Phase II deadline when all scheduling constraints are defined.
Eggenreich, Britta; Rajamanickam, Vignesh; Wurm, David Johannes; Fricke, Jens; Herwig, Christoph; Spadiut, Oliver
2017-08-01
Cell disruption is a key unit operation to make valuable, intracellular target products accessible for further downstream unit operations. Independent of the applied cell disruption method, each cell disruption process must be evaluated with respect to disruption efficiency and potential product loss. Current state-of-the-art methods, like measuring the total amount of released protein and plating-out assays, are usually time-delayed and involve manual intervention, making them error-prone. An automated method to monitor cell disruption efficiency at-line is not available to date. In the current study we implemented a methodology, which we had originally developed to monitor E. coli cell integrity during bioreactor cultivations, to automatically monitor and evaluate cell disruption of a recombinant E. coli strain by high-pressure homogenization. We compared our tool with a library of state-of-the-art methods, analyzed the effect of freezing the biomass before high-pressure homogenization, and finally investigated this unit operation in more detail by a multivariate approach. The combination of HPLC and automated data analysis constitutes a valuable, novel tool to monitor and evaluate cell disruption processes, applicable in both upstream (USP) and downstream processing (DSP): it can be implemented at-line, gives results within minutes after sampling, and does not need manual intervention.
Probabilistic Risk Assessment for Decision Making During Spacecraft Operations
NASA Technical Reports Server (NTRS)
Meshkat, Leila
2009-01-01
Decisions made during the operational phase of a space mission often have significant and immediate consequences. Without the explicit consideration of the risks involved and their representation in a solid model, it is very likely that these risks are not considered systematically in trade studies. Wrong decisions during the operational phase of a space mission can lead to immediate system failure, whereas correct decisions can help recover the system even from faulty conditions. A problem of special interest is the determination of the system fault protection strategies upon the occurrence of faults within the system. Decisions regarding the fault protection strategy also rely heavily on a correct understanding of the state of the system and an integrated risk model that represents the various possible scenarios and their respective likelihoods. Probabilistic Risk Assessment (PRA) modeling is applicable to the full lifecycle of a space mission project, from concept development to preliminary design, detailed design, development, and operations. The benefits and utilities of the model, however, depend on the phase of the mission for which it is used, because of the difference in the key strategic decisions that support each mission phase. The focus of this paper is on describing the particular methods used for PRA modeling during the operational phase of a spacecraft by gleaning insight from recently conducted case studies on two operational Mars orbiters. During operations, the key decisions relate to the commands sent to the spacecraft for any kind of diagnostics, anomaly resolution, trajectory changes, or planning. Often, faults and failures occur in parts of the spacecraft but are contained or mitigated before they can cause serious damage. The failure behavior of the system during operations provides valuable data for updating and adjusting the related PRA models that are built primarily based on historical failure data.
The PRA models, in turn, provide insight into the effect of various faults or failures on the risk and failure drivers of the system and the likelihood of possible end case scenarios, thereby facilitating the decision making process during operations. This paper describes the process of adjusting PRA models based on observed spacecraft data, on one hand, and utilizing the models for insight into the future system behavior on the other hand. While PRA models are typically used as a decision aid during the design phase of a space mission, we advocate adjusting them based on the observed behavior of the spacecraft and utilizing them for decision support during the operations phase.
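One common way to adjust a PRA failure-rate estimate from observed operational data is a conjugate Bayesian update: a gamma prior on a Poisson failure rate. A minimal sketch with hypothetical numbers, not the missions' actual data or the authors' specific method:

```python
def update_failure_rate(alpha, beta, failures, exposure_hours):
    """Bayesian update of a Poisson failure rate with a Gamma(alpha, beta)
    prior (beta in hours): the posterior is Gamma(alpha + failures,
    beta + exposure_hours). Returns the posterior mean in failures/hour."""
    a_post = alpha + failures
    b_post = beta + exposure_hours
    return a_post / b_post

# hypothetical: prior mean 1e-4 failures/h (alpha=1, beta=10000 h),
# then 2 faults observed over 30000 h of flight operations
prior_mean = 1 / 10_000
post_mean = update_failure_rate(1, 10_000, 2, 30_000)
print(prior_mean, post_mean)  # posterior mean = 3/40000 = 7.5e-05
```

The posterior mean blends the historical prior with the observed in-flight record, which is exactly the adjustment loop the paper advocates for operations-phase PRA.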
Optical penetration-based silkworm pupa gender sensor structure.
Sumriddetchkajorn, Sarun; Kamtongdee, Chakkrit
2012-02-01
This paper proposes and experimentally demonstrates, for what is believed to be the first time, a highly sought-after optical structure for highly accurate identification of silkworm pupa gender. The key idea is to exploit a long-wavelength optical beam in the red or near-infrared spectrum that can effectively and safely penetrate the body of a silkworm pupa. Simple image processing operations (thresholding, blob filtering, and image inversion) are then applied to eliminate unwanted image noise and at the same time highlight the gender gland. An experimental proof of concept using three 636 nm wavelength light emitting diodes, a two-dimensional web camera, an 8 bit microcontroller board, and a notebook computer shows a very high 95.6% total accuracy in identifying the gender of 45 silkworm pupae, with a measured identification time of 96.6 ms. Other key features include low cost, low component count, and ease of implementation and control.
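The thresholding and inversion steps of such a pipeline reduce to simple array operations (blob filtering would normally add a connected-component pass, omitted here); the threshold value below is illustrative, not the authors' parameter:

```python
import numpy as np

def threshold_and_invert(gray, thresh):
    """Binarize an 8-bit grayscale image at `thresh`, then invert,
    so dark penetration features become foreground (255)."""
    binary = np.where(gray > thresh, 255, 0).astype(np.uint8)
    return 255 - binary

# tiny 2x2 example: dark pixels (10, 30) become foreground after inversion
img = np.array([[10, 200],
                [250, 30]], dtype=np.uint8)
out = threshold_and_invert(img, 128)
print(out)  # [[255   0]
            #  [  0 255]]
```

In the sensor described, this kind of binarized, inverted image is what exposes the gland region for the final gender decision.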
Manufacture of poly(methyl methacrylate) microspheres using membrane emulsification
Bux, Jaiyana; Manga, Mohamed S.; Hunter, Timothy N.
2016-01-01
Accurate control of particle size at relatively narrow polydispersity remains a key challenge in the production of synthetic polymer particles at scale. A cross-flow membrane emulsification (XME) technique was used here in the preparation of poly(methyl methacrylate) microspheres at a 1–10 l h⁻¹ scale, to demonstrate its application for such a manufacturing challenge. XME technology has previously been shown to provide good control over emulsion droplet sizes with careful choice of the operating conditions. We demonstrate here that, for an appropriate formulation, equivalent control can be gained for a precursor emulsion in a batch suspension polymerization process. We report here the influence of key parameters on the emulsification process; we also demonstrate the close correlation in size between the precursor emulsion and the final polymer particles. Two types of polymer particle were produced in this work: a solid microsphere and an oil-filled matrix microcapsule. This article is part of the themed issue ‘Soft interfacial materials: from fundamentals to formulation’. PMID:27298430
Key variables analysis of a novel continuous biodrying process for drying mixed sludge.
Navaee-Ardeh, Shahram; Bertrand, François; Stuart, Paul R
2010-05-01
A novel continuous biodrying process has been developed whose goal is to increase the dry solids content of sludge to economic levels, rendering it suitable for safe and economic combustion in a biomass boiler. The sludge drying rates are enhanced by the metabolic bioheat produced in the matrix of mixed sludge. The goal of this study was to systematically analyze the continuous biodrying reactor. A variable analysis found that the outlet relative humidity profile was the key variable in the biodrying reactor. The influence of different outlet relative humidity profiles was then evaluated using a biodrying efficiency index. It was found that by maintaining the air outlet relative humidity profile at 85/85/96/96% in the four compartments of the reactor, the highest biodrying efficiency index can be achieved while economic dry solids levels (>45% w/w) are guaranteed.
Auditing radiation sterilization facilities
NASA Astrophysics Data System (ADS)
Beck, Jeffrey A.
The diversity of radiation sterilization systems available today places renewed emphasis on the need for thorough Quality Assurance audits of these facilities. Evaluating compliance with Good Manufacturing Practices is an obvious requirement, but an effective audit must also evaluate installation and performance qualification programs (validation), and process control and monitoring procedures in detail. The present paper describes general standards that radiation sterilization operations should meet in each of these key areas, and provides basic guidance for conducting QA audits of these facilities.
Homeland security challenges in nursing practice.
Boatright, Connie; McGlown, K Joanne
2005-09-01
Nurses need a comprehensive knowledge of doctrine, laws, regulations, programs, and processes that build the operational framework for health care preparedness. Key components of this knowledge base reside in the areas of the evolution of homeland security and of laws, mandates, and compliance and regulatory issues affecting health care organizations. This article addresses primary components in both of these areas, after first assessing the status of nursing's involvement in homeland security as portrayed in the professional literature.
Review of Findings for Human Performance Contribution to Risk in Operating Events
2002-03-01
and loss of DC power. Key to this event was failure to control setpoints on safety-related equipment and failure to maintain the load tap changer... Therefore, "to optimize task execution at the job site, it is important to align organizational processes and values." Effective team skills are an... reactor was blocked and the water level rapidly dropped to the automatic low-level scram setpoint. Human Performance Issues: Control rods were fully
Visual Analysis of Air Traffic Data
NASA Technical Reports Server (NTRS)
Albrecht, George Hans; Pang, Alex
2012-01-01
In this paper, we present visual analysis tools to help study the impact of policy changes on air traffic congestion. The tools support visualization of time-varying air traffic density over an area of interest using different time granularity. We use this visual analysis platform to investigate how changing the aircraft separation volume can reduce congestion while maintaining key safety requirements. The same platform can also be used as a decision aid for processing requests for unmanned aerial vehicle operations.
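Time-varying traffic density of the kind visualized here can be sketched as binning aircraft positions into a 2D grid for each time slice; the data below are hypothetical, not NASA's tooling or datasets:

```python
import numpy as np

def density_grid(lons, lats, bounds, nx=64, ny=64):
    """Count aircraft positions in an nx-by-ny grid over
    bounds = (lon_min, lon_max, lat_min, lat_max)."""
    lon_min, lon_max, lat_min, lat_max = bounds
    grid, _, _ = np.histogram2d(
        lons, lats, bins=[nx, ny],
        range=[[lon_min, lon_max], [lat_min, lat_max]])
    return grid

# three hypothetical aircraft, two sharing a grid cell
lons = np.array([-122.0, -122.0, -121.0])
lats = np.array([37.0, 37.0, 38.0])
grid = density_grid(lons, lats, (-123.0, -120.0, 36.0, 39.0), nx=3, ny=3)
print(int(grid.sum()), int(grid.max()))  # 3 positions binned, peak count 2
```

Rendering one such grid per time window, at a chosen granularity, gives exactly the kind of time-varying density view the tools support.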
Cos, Oriol; Ramón, Ramón; Montesinos, José Luis; Valero, Francisco
2006-01-01
The methylotrophic yeast Pichia pastoris has been widely reported as a suitable expression system for heterologous protein production. The use of different phenotypes under the PAOX promoter, other alternative promoters, culture media, and operational strategies has been reported, with the objective of maximizing either yield or productivity of the heterologous protein, but also of obtaining a reproducible product batch to batch for a robust process in the final industrial application. Medium composition, growth kinetics, fermentation operational strategies from fed-batch to continuous cultures using different phenotypes with the most common PAOX promoter and other novel promoters (GAP, FLD, ICL), the use of mixed substrates, on-line monitoring of the key fermentation parameters (methanol), and control algorithms applied to the bioprocess are reviewed and discussed in detail. PMID:16600031
NASA Astrophysics Data System (ADS)
Geiger, B.; Carrer, D.; Meurey, C.; Roujean, J.-L.
2006-08-01
The Satellite Application Facility for Land Surface Analysis, hosted by the Portuguese Meteorological Institute in Lisbon, generates and distributes value-added satellite products for numerical weather prediction and environmental applications in near-real time. Within the project consortium, Météo-France is responsible for the land surface albedo and down-welling short-wave radiation flux products. Since the beginning of the year 2005, Meteosat Second Generation data have been routinely processed by the Land-SAF operational system. In general the validation studies carried out so far show a good consistency with in-situ observations and equivalent products derived from other satellites. After one year of operations, a summary of the product characteristics and performances is given. Key words: Surface Albedo; Down-welling Radiation; Land-SAF.
Graph State-Based Quantum Group Authentication Scheme
NASA Astrophysics Data System (ADS)
Liao, Longxia; Peng, Xiaoqi; Shi, Jinjing; Guo, Ying
2017-02-01
Motivated by the elegant structure of the graph state, we design an ingenious quantum group authentication scheme, implemented by performing appropriate operations on the graph state, which can solve the problem of multi-user authentication. Three entities are included: the group authentication server (GAS) as verifier, multiple users as provers, and the trusted third party Trent. GAS and Trent assist the users in completing the authentication process: GAS is responsible for registering all the users, while Trent prepares graph states. All users requesting authentication encode their authentication keys onto the graph state by performing Pauli operators. It is demonstrated that a novel authentication scheme can be achieved with the flexible use of the graph state, which can synchronously authenticate a large number of users while provable security is definitely guaranteed.
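Encoding classical key bits by Pauli operators, as the users do on the graph state, can be illustrated on a single qubit; this sketch shows only the X^a Z^b encoding step, not the full graph-state protocol:

```python
import numpy as np

# single-qubit Pauli operators
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def encode_bits(state, a, b):
    """Encode two classical key bits (a, b) on a single-qubit state
    vector by applying X^a Z^b."""
    op = np.linalg.matrix_power(X, a) @ np.linalg.matrix_power(Z, b)
    return op @ state

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
# (a, b) = (1, 1): X Z |+> = -|->, i.e. (-1, 1)/sqrt(2)
print(encode_bits(plus, 1, 1))
```

On an n-qubit graph state the same Pauli bytes are applied locally per user, and the verifier's measurement statistics reveal whether the encoded keys match the registered ones.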
Ittenbach, Richard F; Baker, Cynthia L; Corsmo, Jeremy J
2014-05-01
Standard operating procedures (SOPs) were once considered the province of the pharmaceutical industry but are now viewed as a key component of quality assurance programs. To address variability and increase the rigor of clinical data management (CDM) operations, the Cincinnati Children's Hospital Medical Center (CCHMC) decided to create CDM SOPs. In response to this challenge, and as part of a broader institutional initiative, the CCHMC leadership established an executive steering committee to oversee the development and implementation of CDM SOPs. This resulted in the creation of a quality assurance review process with three review panels: an SOP development team (16 clinical data managers and technical staff members), a faculty review panel (8 senior faculty and administrators), and an expert advisory panel (3 national CDM experts). This innovative, tiered review process helped ensure that the new SOPs would be created and implemented in accord with good CDM practices and standards. Twelve fully vetted, institutionally endorsed SOPs and one CDM template resulted from the intensive, iterative 10-month process (December 2011 to early October 2012). Phased implementation, which incorporated the CDM SOPs into the existing audit process for certain types of clinical research studies, was on schedule at the time of this writing. Once CCHMC researchers have had the opportunity to use the SOPs over time and across a broad range of research settings and conditions, the SOPs will be revisited and revalidated.
Study on the application of mobile internet cloud computing platform
NASA Astrophysics Data System (ADS)
Gong, Songchun; Fu, Songyin; Chen, Zheng
2012-04-01
The rapid development of computer technology has promoted the application of cloud computing platforms, which replace traditional resource service models and meet users' needs for different resources. Cloud computing offers advantages in many respects: it reduces the difficulty of operating the system and makes it easy for users to search, acquire, and process resources. Accordingly, the author takes the management of digital libraries as the research focus of this paper and analyzes the key technologies of the mobile internet cloud computing platform in operation. The popularization of computer technology has driven the creation of digital library models, whose core idea is to strengthen the management of library resource information through computers and to construct a high-performance query and search platform that allows users to access the necessary information resources at any time. Cloud computing, in turn, distributes computations across a large number of distributed computers and thereby implements connected service among multiple machines. Digital libraries, as a typical representative of cloud computing applications, can therefore be used to analyze the key technologies of cloud computing.
Gordon, G T; McCann, B P
2015-01-01
This paper describes the basis of a stakeholder-based sustainable optimisation indicator (SOI) system to be developed for small-to-medium sized activated sludge (AS) wastewater treatment plants (WwTPs) in the Republic of Ireland (ROI). Key technical publications relating to best practice plant operation, performance audits and optimisation, and indicator and benchmarking systems for wastewater services are identified. Optimisation studies were developed at a number of Irish AS WwTPs and key findings are presented. A national AS WwTP manager/operator survey was carried out to verify the applied operational findings and identify the key operator stakeholder requirements for this proposed SOI system. It was found that most plants require more consistent operational data-based decision-making, monitoring and communication structures to facilitate optimised, sustainable and continuous performance improvement. The applied optimisation and stakeholder consultation phases form the basis of the proposed stakeholder-based SOI system. This system will allow for continuous monitoring and rating of plant performance, facilitate optimised operation and encourage the prioritisation of performance improvement through tracking key operational metrics. Plant optimisation has become a major focus due to the transfer of all ROI water services to a national water utility from individual local authorities and the implementation of the EU Water Framework Directive.
Zeroing in on Number and Operations, Grades 7-8: Key Ideas and Common Misconceptions
ERIC Educational Resources Information Center
Collins, Anne; Dacey, Linda
2010-01-01
"The Zeroing in on Number and Operations" series, which aligns with the Common Core State Standards and the NCTM Standards and Focal Points, features easy-to-use tools for teaching key concepts in number and operations and for addressing common misconceptions. Sharing the insights they've gained in decades of mathematics teaching and research,…
Zeroing in on Number and Operations, Grades 3-4: Key Ideas and Common Misconceptions
ERIC Educational Resources Information Center
Dacey, Linda; Collins, Anne
2010-01-01
"The Zeroing in on Number and Operations" series, which aligns with the Common Core State Standards and the NCTM Standards and Focal Points, features easy-to-use tools for teaching key concepts in number and operations and for addressing common misconceptions. Sharing the insights they've gained in decades of mathematics teaching and research,…
Zeroing in on Number and Operations, Grades 5-6: Key Ideas and Common Misconceptions
ERIC Educational Resources Information Center
Collins, Anne; Dacey, Linda
2010-01-01
"The Zeroing in on Number and Operations" series, which aligns with the Common Core State Standards and the NCTM Standards and Focal Points, features easy-to-use tools for teaching key concepts in number and operations and for addressing common misconceptions. Sharing the insights they've gained through decades of mathematics teaching and research,…
A Novel Image Encryption Algorithm Based on DNA Subsequence Operation
Zhang, Qiang; Xue, Xianglian; Wei, Xiaopeng
2012-01-01
We present a novel image encryption algorithm based on DNA subsequence operations. Unlike traditional DNA encryption methods, our algorithm does not use complex biological operations; it simply applies DNA subsequence operations (such as elongation, truncation, and deletion), combined with the logistic chaotic map, to scramble the locations and values of the pixel points of the image. The experimental results and security analysis show that the proposed algorithm is easy to implement, achieves a good encryption effect, has a large secret-key space and strong sensitivity to the secret key, and resists exhaustive and statistical attacks. PMID:23093912
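The permutation stage of such a scheme can be sketched briefly. The snippet below is an illustrative reconstruction, not the authors' exact algorithm: a logistic chaotic map seeded by a secret key (x0, mu) generates a sequence whose rank order permutes pixel locations. The paper additionally alters pixel values with DNA subsequence operations, which this toy omits.

```python
# Toy chaotic pixel scrambling: the secret key is the logistic-map seed
# (x0, mu). Sorting the chaotic values yields a key-dependent permutation.

def logistic_sequence(x0, mu, n, burn_in=100):
    """Iterate the logistic map x <- mu*x*(1-x), discarding transients."""
    x = x0
    for _ in range(burn_in):
        x = mu * x * (1.0 - x)
    seq = []
    for _ in range(n):
        x = mu * x * (1.0 - x)
        seq.append(x)
    return seq

def permutation_from_key(n, x0, mu):
    """Derive a pixel permutation by rank-ordering the chaotic values."""
    chaos = logistic_sequence(x0, mu, n)
    return sorted(range(n), key=lambda i: chaos[i])

def scramble(pixels, x0=0.3141592, mu=3.9999):
    order = permutation_from_key(len(pixels), x0, mu)
    return [pixels[i] for i in order]

def unscramble(scrambled, x0=0.3141592, mu=3.9999):
    order = permutation_from_key(len(scrambled), x0, mu)
    out = [None] * len(scrambled)
    for dst, src in enumerate(order):
        out[src] = scrambled[dst]
    return out

pixels = list(range(16))            # a tiny 4x4 "image", flattened
cipher = scramble(pixels)
assert unscramble(cipher) == pixels  # same key recovers the image
```

Sensitivity to the key follows from the chaotic dynamics: a tiny change in x0 or mu yields an unrelated permutation, which is what makes exhaustive attack on the key space impractical.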
Operations management tools to be applied for textile
NASA Astrophysics Data System (ADS)
Maralcan, A.; Ilhan, I.
2017-10-01
In this paper, basic concepts of process analysis such as flow time, inventory, bottleneck, labour cost and utilization are illustrated first. The effect of the bottleneck on business results is especially emphasized. In the next section, tools for productivity measurement, the KPI (Key Performance Indicator) tree, OEE (Overall Equipment Effectiveness) and takt time, are introduced and exemplified. A KPI tree is a diagram on which we can visualize all the variables of an operation that drive financial results through cost and profit. OEE is a tool to measure the potential extra capacity of a piece of equipment or an employee. Takt time is a tool to determine the process flow rate according to customer demand. The KPI tree is studied across the whole process, while OEE is exemplified for a stenter frame machine, which is the most important machine (and usually the bottleneck) and the most expensive investment in a finishing plant. Takt time is exemplified for the quality control department. Finally, the quality tools six sigma, control charts and jidoka are introduced. Six sigma is a tool to measure process capability and thereby the probability of a defect. A control chart is a powerful tool to monitor the process. The idea of jidoka (detect, stop and alert) is about alerting people that there is a problem in the process.
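The OEE and takt-time formulas are standard one-liners and can be computed directly. The plant figures below are hypothetical, chosen only to echo the stenter-frame and quality-control examples in the abstract.

```python
def oee(availability, performance, quality):
    """Overall Equipment Effectiveness: product of the three loss factors."""
    return availability * performance * quality

def takt_time(available_time_min, customer_demand_units):
    """Takt time: the pace of production needed to match customer demand."""
    return available_time_min / customer_demand_units

# Example: a stenter frame runs 400 of 480 planned minutes (availability),
# at 90% of its rated speed (performance), with 2% of output defective.
a = 400 / 480
p = 0.90
q = 0.98
print(f"OEE = {oee(a, p, q):.1%}")   # 73.5%

# Quality control must inspect 960 pieces in an 8-hour (480 min) shift:
print(f"Takt = {takt_time(480, 960):.2f} min/piece")   # 0.50 min/piece
```

An OEE well below 100% on the bottleneck machine quantifies the "hidden factory": extra capacity that can be recovered without new investment.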
Why advanced computing? The key to space-based operations
NASA Astrophysics Data System (ADS)
Phister, Paul W., Jr.; Plonisch, Igor; Mineo, Jack
2000-11-01
The 'what is the requirement?' aspect of advanced computing and how it relates to and supports Air Force space-based operations is a key issue. In support of the Air Force Space Command's five major mission areas (space control, force enhancement, force applications, space support and mission support), two-fifths of the requirements have associated stringent computing/size implications. The Air Force Research Laboratory's 'migration to space' concept will eventually shift Science and Technology (S&T) dollars from predominantly airborne systems to airborne-and-space related S&T areas. One challenging 'space' area is in the development of sophisticated on-board computing processes for the next generation smaller, cheaper satellite systems. These new space systems (called microsats or nanosats) could be as small as a softball, yet perform functions that are currently being done by large, vulnerable ground-based assets. The Joint Battlespace Infosphere (JBI) concept will be used to manage the overall process of space applications coupled with advancements in computing. The JBI can be defined as a globally interoperable information 'space' which aggregates, integrates, fuses, and intelligently disseminates all relevant battlespace knowledge to support effective decision-making at all echelons of a Joint Task Force (JTF). This paper explores a single theme -- on-board processing is the best avenue to take advantage of advancements in high-performance computing, high-density memories, communications, and re-programmable architecture technologies. The goal is to break away from 'no changes after launch' design to a more flexible design environment that can take advantage of changing space requirements and needs while the space vehicle is 'on orbit.'
A chronic generalized bi-directional brain-machine interface.
Rouse, A G; Stanslaski, S R; Cong, P; Jensen, R M; Afshar, P; Ullestad, D; Gupta, R; Molnar, G F; Moran, D W; Denison, T J
2011-06-01
A bi-directional neural interface (NI) system was designed and prototyped by incorporating a novel neural recording and processing subsystem into a commercial neural stimulator architecture. The NI system prototype leverages the system infrastructure from an existing neurostimulator to ensure reliable operation in a chronic implantation environment. In addition to providing predicate therapy capabilities, the device adds key elements to facilitate chronic research, such as four channels of electrocorticogram/local field potential amplification and spectral analysis, a three-axis accelerometer, algorithm processing, event-based data logging, and wireless telemetry for data uploads and algorithm/configuration updates. The custom-integrated micropower sensor and interface circuits facilitate extended operation in a power-limited device. The prototype underwent significant verification testing to ensure reliability, and meets the requirements for a class CF instrument per IEC-60601 protocols. The ability of the device system to process and aid in classifying brain states was preclinically validated using an in vivo non-human primate model for brain control of a computer cursor (i.e. brain-machine interface or BMI). The primate BMI model was chosen for its ability to quantitatively measure signal decoding performance from brain activity that is similar in both amplitude and spectral content to other biomarkers used to detect disease states (e.g. Parkinson's disease). A key goal of this research prototype is to help broaden the clinical scope and acceptance of NI techniques, particularly real-time brain state detection. These techniques have the potential to be generalized beyond motor prosthesis, and are being explored for unmet needs in other neurological conditions such as movement disorders, stroke and epilepsy.
Trust Threshold Based Public Key Management in Mobile Ad Hoc Networks
2016-03-05
should operate in a self-organized way. Capkun et al. [15] proposed a certificate-based self-organized public key management for MANETs by removing...period allows a node that started with ignorance to interact with other nodes that do not reach T_th... Table 2: Attack behavior for operations. Operation Attack...section, we discuss the core operations of CTPKM as illustrated by Fig. 1. Each mobile entity is able to communicate with other entities using public
Srai, Jagjit Singh; Badman, Clive; Krumme, Markus; Futran, Mauricio; Johnston, Craig
2015-03-01
This paper examines the opportunities and challenges facing the pharmaceutical industry in moving to a primarily "continuous processing"-based supply chain. The current predominantly "large batch" and centralized manufacturing system designed for the "blockbuster" drug has driven a slow-paced, inventory-heavy operating model that is increasingly regarded as inflexible and unsustainable. Indeed, new markets and the rapidly evolving technology landscape will drive more product variety, shorter product life-cycles, and smaller drug volumes, which will exacerbate an already unsustainable economic model. Future supply chains will be required to enhance affordability and availability for patients and healthcare providers alike despite the increased product complexity. In this more challenging supply scenario, we examine the potential for a more pull-driven, near real-time demand-based supply chain, utilizing continuous processing where appropriate as a key element of a more "flow-through" operating model. In this discussion paper on future supply chain models underpinned by developments in the continuous manufacture of pharmaceuticals, the paper recognizes that although current batch operational performance in pharma is far from optimal and not necessarily an appropriate end-state benchmark for batch technology, the adoption of continuous supply chain operating models underpinned by continuous production processing, as full or hybrid solutions in selected product supply chains, can support industry transformations to deliver right-first-time quality at substantially lower inventory profiles. © 2015 The Authors. Journal of Pharmaceutical Sciences published by Wiley Periodicals, Inc. and the American Pharmacists Association.
A summary and integration of research concerning single pilot IFR operational problems
NASA Technical Reports Server (NTRS)
Chapman, G. C.
1983-01-01
A review of seven research studies pertaining to Single Pilot IFR (SPIFR) operations was performed. Two studies were based on questionnaire surveys, two on National Transportation Safety Board (NTSB) reports, two on Aviation Safety Reporting System (ASRS) incident reports, and one used event analysis and statistics to forecast problems. The results obtained in each study were extracted and integrated. Results were synthesized and key issues pertaining to SPIFR operational problems were identified. The research recommended by the studies that addresses the key issues is catalogued for each key issue.
Continuous operation of four-state continuous-variable quantum key distribution system
NASA Astrophysics Data System (ADS)
Matsubara, Takuto; Ono, Motoharu; Oguri, Yusuke; Ichikawa, Tsubasa; Hirano, Takuya; Kasai, Kenta; Matsumoto, Ryutaroh; Tsurumaru, Toyohiro
2016-10-01
We report on the development of continuous-variable quantum key distribution (CV-QKD) system that are based on discrete quadrature amplitude modulation (QAM) and homodyne detection of coherent states of light. We use a pulsed light source whose wavelength is 1550 nm and repetition rate is 10 MHz. The CV-QKD system can continuously generate secret key which is secure against entangling cloner attack. Key generation rate is 50 kbps when the quantum channel is a 10 km optical fiber. The CV-QKD system we have developed utilizes the four-state and post-selection protocol [T. Hirano, et al., Phys. Rev. A 68, 042331 (2003).]; Alice randomly sends one of four states {|+/-α⟩,|+/-𝑖α⟩}, and Bob randomly performs x- or p- measurement by homodyne detection. A commercially available balanced receiver is used to realize shot-noise-limited pulsed homodyne detection. GPU cards are used to accelerate the software-based post-processing. We use a non-binary LDPC code for error correction (reverse reconciliation) and the Toeplitz matrix multiplication for privacy amplification.
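The four-state post-selection protocol lends itself to a quick Monte Carlo sketch. The model below is a deliberately simplified toy (ideal lossless channel, shot-noise standard deviation of 1/2 in one common convention, an arbitrary post-selection threshold), not the authors' system:

```python
import math
import random

def simulate_cvqkd(n_pulses=20000, alpha=1.0, threshold=0.8, seed=1):
    """Toy Monte Carlo of the four-state post-selection protocol.
    Alice sends one of {|+a>, |-a>, |+ia>, |-ia>}; Bob homodynes x or p.
    Only matched-basis pulses with large |outcome| are kept (post-selection)."""
    rng = random.Random(seed)
    sifted, errors = 0, 0
    for _ in range(n_pulses):
        k = rng.randrange(4)          # Alice's choice: phase k * pi/2
        basis = rng.randrange(2)      # Bob: 0 = x-quadrature, 1 = p-quadrature
        phase = k * math.pi / 2
        # Mean measured quadrature for coherent amplitude alpha * e^{i*phase}
        mean = alpha * (math.cos(phase) if basis == 0 else math.sin(phase))
        outcome = rng.gauss(mean, 0.5)   # shot noise, std = 1/2
        # Sift: keep only pulses where Bob's basis matches Alice's encoding
        if (basis == 0 and k in (0, 2)) or (basis == 1 and k in (1, 3)):
            if abs(outcome) > threshold:   # post-selection of large outcomes
                sifted += 1
                bob_bit = 0 if outcome > 0 else 1
                alice_bit = 0 if k in (0, 1) else 1
                if bob_bit != alice_bit:
                    errors += 1
    return sifted, errors

sifted, errors = simulate_cvqkd()
print(f"sifted bits: {sifted}, error rate: {errors / sifted:.2%}")
```

Raising the post-selection threshold trades sifted-key rate for a lower error rate, which is the lever the real protocol uses to keep the error-correction and privacy-amplification overhead manageable.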
Human Mars EDL Pathfinder Study: Assessment of Technology Development Gaps and Mitigations
NASA Technical Reports Server (NTRS)
Lillard, Randolph; Olejniczak, Joe; Polsgrove, Tara; Cianciolo, Alice Dwyer; Munk, Michelle; Whetsel, Charles; Drake, Bret
2017-01-01
This paper presents the results of a NASA-initiated Agency-wide assessment to better characterize the risks and potential mitigation approaches associated with landing human-class Entry, Descent, and Landing (EDL) systems on Mars. Due to the criticality and long-lead nature of advancing EDL techniques, it is necessary to determine an appropriate strategy to improve the capability to land large payloads. A key focus of this study was to understand the principal EDL risks and to determine what "must" be tested at Mars. This process identified the various risks and potential risk mitigation strategies, along with the key near-term technology development efforts required and the environments in which those technology demonstrations are best suited. The study identified key risks along with the advantages of each entry technology. In addition, it was found that, provided the EDL concept of operations (con ops) minimizes large-scale transition events, there is no technology requirement for a Mars precursor demonstration. Instead, NASA should take a direct path to a human-scale lander.
Virtual Mission Operations Center -Explicit Access to Small Satellites by a Net Enabled User Base
NASA Astrophysics Data System (ADS)
Miller, E.; Medina, O.; Paulsen, P.; Hopkins, J.; Long, C.; Holloman, K.
2008-08-01
The Office of Naval Research (ONR), the Office of the Secretary of Defense (OSD), the Operationally Responsive Space Office (ORS), and the National Aeronautics and Space Administration (NASA) are funding the development and integration of key technologies and new processes that will allow users across the breadth of operations the ability to access, task, retrieve, and collaborate with data from various sensors, including small satellites, via the Internet and the SIPRNet. The Virtual Mission Operations Center (VMOC) facilitates the dynamic apportionment of space assets, allows scalable mission management of multiple types of sensors, and provides access for non-space-savvy users through an intuitive collaborative web site. These key technologies are being used as experimentation pathfinders for the DoD's Operationally Responsive Space (ORS) initiative and NASA's Sensor Web. The ORS initiative seeks to provide space assets that can be rapidly tailored to meet a commander's intelligence or communication needs. For the DoD and NASA the VMOC provides ready and scalable access to space-based assets. To the commercial space sector the VMOC may provide an analog to the innovative fractional-ownership approach represented by FlexJet. This paper delves into the technology, integration, and applicability of the VMOC to the DoD, NASA, and commercial sectors.
Advanced Stirling Convertor Testing at GRC
NASA Technical Reports Server (NTRS)
Schifer, Nick; Oriti, Salvatore M.
2013-01-01
NASA Glenn Research Center (GRC) has been supporting development of the Advanced Stirling Radioisotope Generator (ASRG) since 2006. A key element of the ASRG project is providing life, reliability, and performance testing of the Advanced Stirling Convertor (ASC). The latest version of the ASC, deemed ASC-E3, is of a design identical to the forthcoming flight convertors. The first pair of ASC-E3 units was delivered in December 2012. GRC has begun the process of adding these units to the catalog of ongoing Stirling convertor operation. This process includes performance verification, which examines the data from various tests to validate the convertor's performance against the product specification.
Karim, Abdool Z
2009-01-01
The regional processing centre at Sunnybrook Health Sciences Centre recently faced the substantial challenge of increasing cleaning capacity to meet the current workload and anticipated future demand without increasing its operating budget. The solution, upgrading its cleaning and decontamination system to a highly automated system, met both objectives. An analysis of the impact of the change found that the new system provided additional benefits, including improved productivity and cleaning quality; decreased costs; reduced water, electricity and chemical use; improved worker safety and morale; and decreased overtime. Investing in innovative technology improved key departmental outcomes while meeting institutional environmental and cost savings objectives.
Causse, Mickaël; Sénard, Jean-Michel; Démonet, Jean François; Pastor, Josette
2010-06-01
The paper deals with the links between physiological measurements and cognitive and emotional functioning. Because the operator is a key agent in charge of complex systems, the definition of metrics able to predict his or her performance is a great challenge. Measurement of the physiological state is a very promising approach, but an acute understanding is required; in particular, few studies compare autonomic nervous system reactivity according to specific cognitive processes during task performance, and task-related psychological stress is often ignored. We compared physiological parameters recorded in 24 healthy subjects facing two neuropsychological tasks: a dynamic task that requires problem solving in a world that continually evolves over time, and a logical task representative of the cognitive processes performed by operators facing everyday problem solving. Results showed that the mean pupil diameter change was higher during the dynamic task; conversely, the heart rate was more elevated during the logical task. Finally, systolic blood pressure seemed to be strongly sensitive to psychological stress. Better accounting for the precise influence of a given cognitive activity, and for both workload and task-induced psychological stress during task performance, is a promising way to monitor operators in complex working situations and to detect mental overload or stress factors conducive to error.
Risk-based analysis and decision making in multi-disciplinary environments
NASA Technical Reports Server (NTRS)
Feather, Martin S.; Cornford, Steven L.; Moran, Kelly
2003-01-01
A risk-based decision-making process, conceived of and developed at JPL and NASA, has been used to help plan and guide novel technology applications for use on spacecraft. These applications exemplify key challenges inherent in the multi-disciplinary design of novel technologies deployed in mission-critical settings. 1) Cross-disciplinary concerns are numerous (e.g., spacecraft involve navigation, propulsion, telecommunications). These concerns are cross-coupled and interact in multiple ways (e.g., electromagnetic interference, heat transfer). 2) Time and budget pressures constrain development, and operational resources constrain the resulting system (e.g., mass, volume, power). 3) Spacecraft are critical systems that must operate correctly the first time in only partially understood environments, with no chance for repair. 4) Past experience provides only a partial guide: new mission concepts are enhanced and enabled by new technologies for which past experience is lacking. The decision-making process rests on quantitative assessments of the relationships between three classes of information: objectives (the things the system is to accomplish and constraints on its operation and development), risks (whose occurrence detracts from objectives), and mitigations (options for reducing the likelihood and/or severity of risks). The process successfully guides experts to pool their knowledge, using custom-built software to support information gathering and decision-making.
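The objectives/risks/mitigations calculus can be illustrated with a toy model. The numbers and structure below are hypothetical (this is not JPL's actual tool): each risk detracts from weighted objectives in proportion to likelihood times impact, and each mitigation scales a risk's likelihood down at some cost.

```python
objectives = {"imaging": 0.6, "telemetry": 0.4}   # relative weights

risks = {   # name -> (likelihood, impact fraction on each objective)
    "thermal_drift": (0.30, {"imaging": 0.5}),
    "radio_dropout": (0.20, {"telemetry": 0.8}),
}

mitigations = {  # name -> (cost, per-risk likelihood reduction factor)
    "heater_redesign": (5.0, {"thermal_drift": 0.4}),
    "redundant_radio": (8.0, {"radio_dropout": 0.25}),
}

def expected_attainment(selected):
    """Weighted objective attainment after applying selected mitigations."""
    total = 0.0
    for obj, weight in objectives.items():
        loss = 0.0
        for name, (likelihood, impacts) in risks.items():
            for m in selected:
                likelihood *= mitigations[m][1].get(name, 1.0)
            loss += likelihood * impacts.get(obj, 0.0)
        total += weight * max(0.0, 1.0 - loss)
    return total

# Rank mitigations by "leverage": attainment gained per unit cost.
base = expected_attainment([])
for m in mitigations:
    gain = expected_attainment([m]) - base
    cost = mitigations[m][0]
    print(f"{m}: +{gain:.3f} attainment for cost {cost} (leverage {gain / cost:.4f})")
```

Ranking mitigations by gain per unit cost is the core of how such a process helps experts decide where constrained development resources buy the most risk reduction.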
Formal Analysis of Key Integrity in PKCS#11
NASA Astrophysics Data System (ADS)
Falcone, Andrea; Focardi, Riccardo
PKCS#11 is a standard API to cryptographic devices such as smartcards, hardware security modules and USB crypto-tokens. Though widely adopted, this API has been shown to be prone to attacks in which a malicious user gains access to the sensitive keys stored in the devices. In 2008, Delaune, Kremer and Steel proposed a model to formally reason about this kind of attack. We extend this model to also describe flaws that are based on integrity violations of the stored keys. In particular, we consider scenarios in which a malicious overwriting of keys might fool honest users into using the attacker's own keys while performing sensitive operations. We further enrich the model with a trusted-key mechanism ensuring that only controlled, non-tampered keys are used in cryptographic operations, and we show how this modified API prevents the above-mentioned key-replacement attacks.
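The key-replacement flaw and the trusted-key defense can be illustrated with a toy token model. This is illustrative only: real PKCS#11 operations (e.g. C_SetAttributeValue, C_WrapKey) and attribute handling differ substantially, and the XOR "cipher" here is a stand-in.

```python
# Toy model: a token maps handles to key values. If an attacker can
# overwrite the value behind a handle, an honest user encrypting
# "under key h1" actually uses the attacker's key. A trusted-key
# check refuses operations with non-vetted keys.

class Token:
    def __init__(self):
        self.store = {}   # handle -> (key_bytes, trusted_flag)

    def create_key(self, handle, key, trusted=False):
        self.store[handle] = (key, trusted)

    def overwrite(self, handle, key):
        """Models the integrity violation: attacker replaces a stored key."""
        self.store[handle] = (key, False)   # overwritten keys lose trust

    def encrypt(self, handle, data, require_trusted=False):
        key, trusted = self.store[handle]
        if require_trusted and not trusted:
            raise PermissionError("untrusted key refused")
        return bytes(d ^ k for d, k in zip(data, key))  # toy XOR cipher

token = Token()
token.create_key("h1", b"\x10\x20\x30\x40", trusted=True)
honest = token.encrypt("h1", b"\x01\x02\x03\x04")

token.overwrite("h1", b"\xff\xff\xff\xff")       # attacker's key
hijacked = token.encrypt("h1", b"\x01\x02\x03\x04")
assert hijacked != honest    # the user unknowingly used the attacker's key

# With the trusted-key mechanism, the same operation is refused instead:
try:
    token.encrypt("h1", b"\x01\x02\x03\x04", require_trusted=True)
except PermissionError as e:
    print("blocked:", e)
```

The point of the trusted-key mechanism is exactly this last branch: an integrity violation can still happen, but it can no longer silently divert sensitive operations onto the attacker's key.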
Surface acoustic wave devices for sensor applications
NASA Astrophysics Data System (ADS)
Bo, Liu; Xiao, Chen; Hualin, Cai; Mohammad, Mohammad Ali; Xiangguang, Tian; Luqi, Tao; Yi, Yang; Tianling, Ren
2016-02-01
Surface acoustic wave (SAW) devices have been widely used in different fields and will continue to be of great importance in the foreseeable future. These devices are compact, cost efficient, easy to fabricate, and have a high performance, among other advantages. SAW devices can work as filters, signal processing units, sensors and actuators. They can even work without batteries and operate under harsh environments. In this review, the operating principles of SAW sensors, including temperature sensors, pressure sensors, humidity sensors and biosensors, will be discussed. Several examples and related issues will be presented. Technological trends and future developments will also be discussed. Project supported by the National Natural Science Foundation of China (Nos. 60936002, 61025021, 61434001, 61574083), the State Key Development Program for Basic Research of China (No. 2015CB352100), the National Key Project of Science and Technology (No. 2011ZX02403-002) and the Special Fund for Agroscientific Research in the Public Interest of China (No. 201303107). M.A.M is additionally supported by the Postdoctoral Fellowship (PDF) program of the Natural Sciences and Engineering Research Council (NSERC) of Canada and the China Postdoctoral Science Foundation (CPSF).
Packet communications in satellites with multiple-beam antennas and signal processing
NASA Technical Reports Server (NTRS)
Davies, R.; Chethik, F.; Penick, M.
1980-01-01
A communication satellite with a multiple-beam antenna and onboard signal processing is considered for use in a 'message-switched' data relay system. The signal processor may incorporate demodulation, routing, storage, and remodulation of the data. A system user model is established and key functional elements of the signal processing are identified. With the throughput and delay requirements as the controlled variables, the hardware complexity, operational discipline, occupied bandwidth, and overall user end-to-end cost are estimated for (1) random-access packet switching and (2) reservation-access packet switching. Other aspects of this network (e.g., the adaptability to channel-switched traffic requirements) are examined. For the given requirements and constraints, the reservation system appears to be the most attractive protocol.
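The throughput trade between the two access disciplines can be illustrated with textbook approximations, not the paper's own estimates: slotted-ALOHA random access has expected throughput S = G·e^(−G) (Poisson offered load G), peaking at 1/e, while an idealized reservation scheme approaches full capacity minus its signalling overhead.

```python
import math

def slotted_aloha_throughput(G):
    """Expected fraction of successful slots at offered load G (Poisson arrivals)."""
    return G * math.exp(-G)

def reservation_throughput(overhead_fraction):
    """Idealized reservation access: full capacity minus signalling overhead."""
    return 1.0 - overhead_fraction

# Sweep the offered load to find the random-access peak.
peak = max(slotted_aloha_throughput(g / 100) for g in range(1, 301))
print(f"slotted ALOHA peak throughput ~ {peak:.3f}")        # ~1/e = 0.368
print(f"reservation (10% overhead)    = {reservation_throughput(0.10):.3f}")
```

The roughly threefold throughput gap (at the cost of reservation-request delay) is the kind of trade that makes the reservation protocol attractive once throughput requirements dominate.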
NASA Technical Reports Server (NTRS)
Chiaramonte, Francis P.; Joshi, Jitendra A.
2004-01-01
This workshop was designed to bring the experts from the Advanced Human Support Technologies communities together to identify the most pressing and fruitful areas of research where success hinges on collaborative research between the two communities. Thus an effort was made to bring together experts in both advanced human support technologies and microgravity fluids, transport and reaction processes. Expertise was drawn from academia, national laboratories, and the federal government. The intent was to bring about a thorough exchange of ideas and develop recommendations to address the significant open design and operation issues for human support systems that are affected by fluid physics, transport and reaction processes. This report provides a summary of key discussions, findings, and recommendations.
Mokhtari Azar, Akbar; Ghadirpour Jelogir, Ali; Nabi Bidhendi, Gholam Reza; Zaredar, Narges
2011-04-01
Without doubt, the operator is one of the main foundations of a wastewater treatment plant: by identifying inadequacies, the operator serves as a key element of the plant. Several methods are used for wastewater treatment, all of which require considerable cost, yet investments in treatment facilities pay off only when the expected efficiency of the treatment plant is obtained, and with an experienced operator this goal is more easily achieved. In this research, the wastewater of an urban community carrying moderately concentrated, dilute, and highly concentrated pollution was treated using surface- and deep-aeration treatment methods. Sampling of these pilots was performed from winter 2008 to summer 2009. The results indicate that all analyzed parameters were reduced using activated sludge with surface aeration; however, with activated sludge and deep aeration combined with proper operator practice, more pollutants could be eliminated. Hence, the presence of an operator in wastewater treatment plants is a basic principle for achieving the intended efficiency. A wastewater treatment system is not intelligent in itself; it is the operator who can organize even an inefficient system through continuous presence. The converse is also true: despite the various units and an appropriate design, a wastewater treatment plant cannot be expected to be highly efficient without an operator. In places frequently affected by shocks of organic and hydraulic loads, a compensating (equalization) tank is important to buffer the wastewater treatment process. Finally, with regard to microbial parameters, a disinfection unit is very useful.
Implicit false-belief processing in the human brain.
Schneider, Dana; Slaughter, Virginia P; Becker, Stefanie I; Dux, Paul E
2014-11-01
Eye-movement patterns in 'Sally-Anne' tasks reflect humans' ability to implicitly process the mental states of others, particularly false-beliefs - a key theory of mind (ToM) operation. It has recently been proposed that an efficient ToM system, which operates in the absence of awareness (implicit ToM, iToM), subserves the analysis of belief-like states. This contrasts to consciously available belief processing, performed by the explicit ToM system (eToM). The frontal, temporal and parietal cortices are engaged when humans explicitly 'mentalize' about others' beliefs. However, the neural underpinnings of implicit false-belief processing and the extent to which they draw on networks involved in explicit general-belief processing are unknown. Here, participants watched 'Sally-Anne' movies while fMRI and eye-tracking measures were acquired simultaneously. Participants displayed eye-movements consistent with implicit false-belief processing. After independently localizing the brain areas involved in explicit general-belief processing, only the left anterior superior temporal sulcus and precuneus revealed greater blood-oxygen-level-dependent activity for false- relative to true-belief trials in our iToM paradigm. No such difference was found for the right temporal-parietal junction despite significant activity in this area. These findings fractionate brain regions that are associated with explicit general ToM reasoning and false-belief processing in the absence of awareness. Copyright © 2014 Elsevier Inc. All rights reserved.
Secret Key Crypto Implementations
NASA Astrophysics Data System (ADS)
Bertoni, Guido Marco; Melzani, Filippo
This chapter presents the algorithm selected in 2001 as the Advanced Encryption Standard. This algorithm is the basis for implementing security and privacy based on symmetric-key solutions in almost all new applications. Secret-key algorithms are used in combination with modes of operation to provide different security properties. The most widely used modes of operation are presented in this chapter. Finally, an overview of the different techniques of software and hardware implementation is given.
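The role of a mode of operation can be sketched with counter (CTR) mode, one of the commonly used modes: it turns a keyed block primitive into a stream cipher. To keep this toy self-contained, HMAC-SHA256 stands in for the AES block encryption; a real system would use a vetted AES implementation instead.

```python
import hashlib
import hmac

BLOCK = 32  # HMAC-SHA256 output size in bytes

def keystream_block(key: bytes, nonce: bytes, counter: int) -> bytes:
    """Keystream block = PRF_key(nonce || counter); AES-CTR uses AES here."""
    msg = nonce + counter.to_bytes(8, "big")
    return hmac.new(key, msg, hashlib.sha256).digest()

def ctr_xcrypt(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Encrypt or decrypt: in CTR mode the operation is its own inverse."""
    out = bytearray()
    for i in range(0, len(data), BLOCK):
        ks = keystream_block(key, nonce, i // BLOCK)
        chunk = data[i:i + BLOCK]
        out.extend(b ^ k for b, k in zip(chunk, ks))
    return bytes(out)

key, nonce = b"0" * 32, b"unique-nonce"
ct = ctr_xcrypt(key, nonce, b"attack at dawn")
assert ctr_xcrypt(key, nonce, ct) == b"attack at dawn"   # same call decrypts
```

The sketch also shows why the mode, not the block cipher, carries the security property: reusing a (key, nonce) pair reuses the keystream, which is catastrophic in CTR mode regardless of how strong the underlying primitive is.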
USGS GNSS Applications to Earthquake Disaster Response and Hazard Mitigation
NASA Astrophysics Data System (ADS)
Hudnut, K. W.; Murray, J. R.; Minson, S. E.
2015-12-01
Rapid characterization of earthquake rupture is important during a disaster because it establishes which fault ruptured and the extent and amount of fault slip. These key parameters, in turn, can augment in situ seismic sensors for identifying disruption to lifelines as well as localized damage along the fault break. Differential GNSS station positioning, along with imagery differencing, are important methods for augmenting seismic sensors. During response to recent earthquakes (1989 Loma Prieta, 1992 Landers, 1994 Northridge, 1999 Hector Mine, 2010 El Mayor - Cucapah, 2012 Brawley Swarm and 2014 South Napa earthquakes), GNSS co-seismic and post-seismic observations proved to be essential for rapid earthquake source characterization. Often, we find that GNSS results indicate key aspects of the earthquake source that would not have been known in the absence of GNSS data. Seismic, geologic, and imagery data alone, without GNSS, would miss important details of the earthquake source. That is, GNSS results provide important additional insight into the earthquake source properties, which in turn help understand the relationship between shaking and damage patterns. GNSS also adds to understanding of the distribution of slip along strike and with depth on a fault, which can help determine possible lifeline damage due to fault offset, as well as the vertical deformation and tilt that are vitally important for gravitationally driven water systems. The GNSS processing work flow that took more than one week 25 years ago now takes less than one second. Formerly, portable receivers needed to be set up at a site, operated for many hours, then data retrieved, processed and modeled by a series of manual steps. The establishment of continuously telemetered, continuously operating high-rate GNSS stations and the robust automation of all aspects of data retrieval and processing, has led to sub-second overall system latency. 
Within the past few years, the final challenges of standardization and adaptation to the existing framework of the ShakeAlert earthquake early warning system have been met, such that real-time GNSS processing and input to ShakeAlert is now routine and in use. Ongoing adaptation and testing of algorithms remain the last step towards fully operational incorporation of GNSS into ShakeAlert by USGS and its partners.
Acadia National Park ITS field operational test : key informant interviews
DOT National Transportation Integrated Search
2003-03-01
This document reflects the ideas and opinions of a group of key informants and stakeholders involved in the Field Operational Test of ITS components in and around Acadia National Park from 1999 through 2002. The stakeholders were involved in the plan...
Integrated vehicle-based safety systems : heavy-truck field operational test key findings report.
DOT National Transportation Integrated Search
2010-08-01
This document presents key findings from the heavy-truck field operational test conducted as : part of the Integrated Vehicle-Based Safety Systems program. These findings are the result of : analyses performed by the University of Michigan Transporta...
Integrated vehicle-based safety systems light-vehicle field operational test key findings report.
DOT National Transportation Integrated Search
2011-01-01
This document presents key findings from the light-vehicle field operational test conducted as part of the Integrated Vehicle-Based Safety Systems program. These findings are the result of analyses performed by the University of Michigan Transportati...
Joint Space Operations Center (JSpOC) Mission System (JMS)
NASA Astrophysics Data System (ADS)
Morton, M.; Roberts, T.
2011-09-01
US space capabilities benefit the economy, national security, international relationships, scientific discovery, and our quality of life. Meeting these space responsibilities is challenging not only because the space domain is increasingly congested, contested, and competitive, but also because legacy space situational awareness (SSA) systems are approaching end of life and cannot provide the needed breadth of SSA and command and control (C2) of space forces in this domain. JMS will provide the capabilities to effectively employ space forces in this challenging domain. Requirements for JMS were developed based on regular, ongoing engagement with the warfighter. The use of DoD Architecture Framework (DoDAF) products facilitated requirements scoping and understanding and transferred directly to defining and documenting the requirements in the approved Capability Development Document (CDD). As part of the risk reduction efforts, the Electronic Systems Center (ESC) JMS System Program Office (SPO) fielded JMS Capability Package (CP) 0, which includes an initial service oriented architecture (SOA) and user defined operational picture (UDOP) along with force status, sensor management, and analysis tools. Development efforts are planned to leverage and integrate prototypes and other research projects from the Defense Advanced Research Projects Agency, the Air Force Research Laboratory, the Space Innovation and Development Center, and MIT Lincoln Laboratory.
JMS provides a number of benefits to the space community: a reduction in operational “transaction time” to accomplish key activities and processes; the ability to process the increased volume of metric observations from new sensors (e.g., SBSS, SST, Space Fence) as well as owner/operator ephemerides, thus enhancing the high-accuracy near-real-time catalog; and greater automation of SSA data sharing, supporting collaboration with government, civil, commercial, and foreign entities. Continued success of JMS depends on continued support from across the space community. Key activities where community participation is essential include C2 SSA Community of Interest (COI) development and refinement, creative strategies for faster, better, cheaper development, and defining the next set of capabilities.
A definition of high-level decisions in the engineering of systems
NASA Astrophysics Data System (ADS)
Powell, Robert Anthony
The systems engineer's role requires that he or she be proactive and guide the program manager and customers through their decisions to enhance the effectiveness of system development, producing faster, better, and cheaper systems. The present lack of coverage in the literature on what these decisions are and how they relate to each other may be a contributing factor to the high rate of failure among system projects. At the onset of the system development process, decisions play an integral role in the design of a system that meets stakeholders' needs. This is apparent during the design and qualification of both the Development System and the Operational System. The performance, cost, and schedule of the Development System affect the performance of the Operational System and are affected by decisions that influence the physical elements of the Development System. The performance, cost, and schedule of the Operational System are affected by decisions that influence the physical elements of the Operational System. Traditionally, product and process have been designed using know-how and trial and error. However, the empiricism of engineers and program managers is limited, which can lead, and has led, to costly mistakes. To date, very little research has explored the decisions made in the engineering of a system. In government, literature exists on procurement processes for major system development; but in general, literature on the decisions, how they relate to each other, and the key information requirements within each of the two systems and across them is not readily available. This research aims to improve the processes inherent in the engineering of systems. Its primary focus is on Department of Defense (DoD) military systems, specifically aerospace systems, though the results may generalize more broadly.
The result of this research is a process tool, a Decision System Model, which can be used by systems engineers to guide the program manager and their customers through the decisions about concurrently designing and qualifying both the Development and Operational systems.
NASA Astrophysics Data System (ADS)
Denaro, Simona; Dinh, Quang; Bizzi, Simone; Bernardi, Dario; Pavan, Sara; Castelletti, Andrea; Schippa, Leonardo; Soncini-Sessa, Rodolfo
2013-04-01
Water management through dams and reservoirs is necessary worldwide to support key human activities, ranging from hydropower production to water allocation and flood risk mitigation. Reservoir operations are commonly planned to maximize these objectives. However, reservoirs strongly influence river geomorphic processes, causing sediment deficit downstream and altering the flow regime, which often leads to river bed incision: variations of river cross sections over a few years can notably affect hydropower production, flood mitigation, water supply strategies, and the eco-hydrological processes of the freshwater ecosystem. The river Po (a major Italian river) has experienced severe bed incision in recent decades. As a result, infrastructure stability has been negatively affected, the capacity to divert water has decreased, and navigation, fishing, and tourism are suffering economic damages, not to mention the impact on the environment. Our case study analyzes the management of the Isola Serafini hydropower plant, located on the main Po river course. The plant has a major impact on the geomorphic river processes downstream, affecting sediment supply and connectivity (stopping sediment upstream of the dam) and transport capacity (altering the flow regime). The current operation policy aims at maximizing hydropower production, neglecting the effects in terms of geomorphic processes. An improved policy should also consider controlling downstream river bed incision. The aim of this research is to find a suitable modeling framework to identify an operating policy for the Isola Serafini reservoir that provides an optimal trade-off between two conflicting objectives: hydropower production and river bed incision downstream. A multi-objective simulation-based optimization framework is adopted. The operating policy is parameterized as a piecewise linear function and its parameters are optimized using an interactive response surface approach.
Global and local response surfaces are comparatively assessed. Preliminary results show that a range of potentially interesting trade-off policies exists, able to better control river bed incision downstream without significantly decreasing hydropower production.
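The simulation-based approach described in the abstract can be illustrated with a toy model: a two-segment piecewise-linear release policy is simulated against synthetic inflows, and a coarse parameter scan traces the hydropower/incision trade-off. All quantities (turbine capacity, incision threshold, inflow statistics) are invented for illustration and are not Po river or Isola Serafini values; the actual study uses an interactive response surface method rather than this grid scan.

```python
import numpy as np

def simulate(theta, inflows, s0=500.0, s_max=1000.0):
    """Simulate a reservoir under a two-segment piecewise-linear release policy.
    theta = (storage breakpoint, low-storage slope, high-storage slope)."""
    b, k_lo, k_hi = theta
    s, power, incision = s0, 0.0, 0.0
    for q in inflows:
        release = k_lo * s if s <= b else k_lo * b + k_hi * (s - b)
        release = min(release, s + q)          # cannot release more than available
        s = min(s + q - release, s_max)        # mass balance with spill cap
        power += min(release, 60.0)            # hydropower proxy: turbine capacity 60
        incision += max(release - 80.0, 0.0)   # incision proxy: flow above transport threshold
    return power, incision

rng = np.random.default_rng(1)
inflows = rng.gamma(shape=4.0, scale=10.0, size=365)   # synthetic daily inflows
# Coarse scan of one policy parameter to trace the two-objective trade-off.
candidates = [(400.0, k, 0.15) for k in (0.02, 0.05, 0.08, 0.12)]
results = [simulate(th, inflows) for th in candidates]
for th, (p, inc) in zip(candidates, results):
    print(f"k_lo={th[1]:.2f}: power={p:.0f}, incision={inc:.0f}")
```

Plotting the (power, incision) pairs would reveal the Pareto front that the interactive optimization explores.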
Multi-party quantum key agreement with five-qubit brown states
NASA Astrophysics Data System (ADS)
Cai, Tao; Jiang, Min; Cao, Gang
2018-05-01
In this paper, we propose a multi-party quantum key agreement protocol based on five-qubit brown states and single-qubit measurements. Our multi-party protocol ensures that each participant contributes equally to the agreement key. Each party performs three single-qubit unitary operations on three qubits of each brown state. Finally, by measuring the brown states and decoding the measurement results, all participants can negotiate a shared secret key without exchanging classical bits among them. Our security analysis demonstrates that the protocol can resist both outsider and participant attacks. Compared with other schemes, it also achieves higher information efficiency. In terms of physical operation, it requires only single-qubit measurements, which reduces the hardware requirements on each participant and offers better operating flexibility.
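The "equal contribution" property of the agreed key can be illustrated classically: if every party contributes an independent random string and the key is the XOR of all contributions, the key changes whenever any single contribution changes, so no proper subset of parties can determine it alone. The sketch below is this classical analogy only; it does not implement the quantum brown-state protocol or provide its security guarantees.

```python
import secrets

def agree_key(contributions):
    """Combine per-party random bitstrings into one shared key by XOR.
    Flipping any bit of any single contribution flips that bit of the key,
    so every party contributes equally to the final key."""
    key = bytes(len(contributions[0]))
    for c in contributions:
        key = bytes(a ^ b for a, b in zip(key, c))
    return key

parties = [secrets.token_bytes(16) for _ in range(3)]  # three participants
key = agree_key(parties)
print(key.hex())
```

In the quantum protocol, the per-party randomness is carried by the unitary operations each participant applies to its brown-state qubits rather than by transmitted classical strings.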
Hassanain, Mazen; Zamakhshary, Mohammed; Farhat, Ghada; Al-Badr, Ahmed
2017-04-01
The objective of this study was to assess whether an intervention on process efficiency using the Lean methodology leads to improved utilization of the operating room (OR), as measured by key performance metrics of OR efficiency. A quasi-experimental design was used to test the impact of the intervention by comparing pre-intervention and post-intervention data on five key performance indicators. The ORs of 12 hospitals were selected across regions of the Kingdom of Saudi Arabia (KSA). The participants were patients treated at these hospitals during the study period. The intervention comprised the following: (i) creation of visual dashboards that enable starting the first case on time; (ii) use of computerized surgical list management; (iii) optimization of time allocation; (iv) development of an operating model with policies and procedures for the pre-anesthesia clinic; and (v) creation of a governance structure with policies and procedures for day surgeries. The main outcome measures were: on-time start for the first case, room turnover times, percent of overrun cases, average weekly procedure volume, and OR utilization. The hospitals exhibited statistically significant improvements in the following performance metrics: on-time start for the first case, room turnover times, and percent of overrun cases. No statistically significant difference in OR utilization or average weekly procedure volume was detected. The implementation of a Lean-based intervention targeting process efficiency, applied in ORs across various KSA hospitals, produced encouraging results on some metrics at some sites, suggesting that the approach has the potential to deliver significant benefit in the future. Copyright © 2016 John Wiley & Sons, Ltd.
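The key performance indicators named above can be computed from simple case timestamps. The sketch below applies hypothetical definitions (a 5-minute on-time grace period, an assumed 8-hour staffed block) to an invented one-room case log; the study's exact KPI definitions may differ.

```python
from datetime import datetime, timedelta

# Hypothetical one-room, one-day case log: (scheduled start, actual start, end).
cases = [
    (datetime(2016, 3, 1, 7, 30), datetime(2016, 3, 1, 7, 42), datetime(2016, 3, 1, 9, 10)),
    (datetime(2016, 3, 1, 9, 30), datetime(2016, 3, 1, 9, 55), datetime(2016, 3, 1, 11, 20)),
    (datetime(2016, 3, 1, 11, 40), datetime(2016, 3, 1, 12, 0), datetime(2016, 3, 1, 13, 45)),
]

# On-time first-case start, with an assumed 5-minute grace period.
first_case_on_time = cases[0][1] <= cases[0][0] + timedelta(minutes=5)
# Turnover: minutes from one case ending to the next case entering the room.
turnovers = [(cases[i + 1][1] - cases[i][2]).total_seconds() / 60 for i in range(len(cases) - 1)]
# Utilization: in-room minutes divided by the staffed block (assumed 8 hours).
in_room = sum((end - start).total_seconds() / 60 for _, start, end in cases)
utilization = in_room / (8 * 60.0)

print(first_case_on_time, turnovers, round(utilization, 2))
```

Aggregating these per-room values over weeks, before and after the intervention, yields the pre/post comparison the study reports.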
Europa Explorer Operational Scenarios Development
NASA Technical Reports Server (NTRS)
Lock, Robert E.; Pappalardo, Robert T.; Clark, Karla B.
2008-01-01
In 2007, NASA conducted four advanced mission concept studies for outer planets targets: Europa, Ganymede, Titan, and Enceladus. The studies were conducted in close cooperation with the planetary science community. Of the four, the Europa Explorer Concept Study focused on refining mission options, science trades, and implementation details for a potential flagship mission to Europa in the 2015 timeframe. A science definition team (SDT) was appointed by NASA to guide the study. A JPL-led engineering team worked closely with the science team to address three major focus areas: (1) credible cost estimates; (2) rationale and logical discussion of radiation risk and mitigation approaches; and (3) better definition and exploration of the science operational scenario trade space. This paper addresses the methods and results of the collaborative process used to develop Europa Explorer operations scenarios. Working in concert with the SDT, and in parallel with the SDT's development of a science value matrix, key mission capabilities and constraints were challenged by the science and engineering members of the team. Science goals were advanced and options were considered for observation scenarios. Data collection and return strategies were tested via simulation, and mission performance was estimated and balanced against flight and ground system resources and science priorities. The key to this successful collaboration was a concurrent development environment in which all stakeholders could rapidly assess the feasibility of strategies for their success in the full system context. Issues of science and instrument compatibility, system constraints, and mission opportunities were treated analytically and objectively, leading to complementary strategies for observation and data return. Current plans are that this approach, as part of the systems engineering process, will continue as the Europa Explorer Concept Study moves toward becoming a development project.
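The data collection and return strategies tested via simulation amount to checking that onboard recorder use stays within bounds as observations fill it and downlink passes drain it. The toy per-orbit balance below uses invented numbers, not Europa Explorer values.

```python
# Toy per-orbit data balance for an observation scenario: instruments fill an
# onboard recorder each orbit and a downlink pass drains it.
def buffer_history(collect_gbits, downlink_gbits, capacity_gbits):
    buf, history, dropped = 0.0, [], 0.0
    for collected, downlinked in zip(collect_gbits, downlink_gbits):
        buf += collected
        if buf > capacity_gbits:            # recorder full: excess data is lost
            dropped += buf - capacity_gbits
            buf = capacity_gbits
        buf = max(buf - downlinked, 0.0)    # drain during the downlink pass
        history.append(buf)
    return history, dropped

collect = [6.0, 8.0, 5.0, 9.0, 4.0]      # Gbit generated per orbit (illustrative)
downlink = [7.0, 7.0, 7.0, 7.0, 7.0]     # Gbit returned per orbit (illustrative)
hist, lost = buffer_history(collect, downlink, capacity_gbits=16.0)
print(hist, lost)
```

Running such a balance across candidate observation plans quickly flags scenarios whose data volume exceeds what the downlink and recorder can support.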
NASA Technical Reports Server (NTRS)
Kyle, R. G.
1972-01-01
Information transfer between the operator and computer-generated display systems is an area where the human factors engineer finds little useful design data relating human performance to system effectiveness. This study used a computer-driven, cathode-ray-tube graphic display to quantify human response speed in a sequential information processing task. The performance criterion was response time to the sixteen cell elements of a square matrix display. A stimulus signal instruction specified selected cell locations by both row and column identification. A number code from one to four, with equal probability, was assigned at random to the sixteen cells of the matrix, and each code correspondingly required one of four matched keyed-response alternatives. The display format corresponded to a sequence of diagnostic system maintenance events that enabled the operator to verify prime system status, engage backup redundancy for failed subsystem components, and exercise alternate decision-making judgements. The experimental task bypassed the skilled decision-making element and computer processing time in order to determine a lower bound on the basic response speed for the given stimulus/response hardware arrangement.