Miniaturized Power Processing Unit Study: A Cubesat Electric Propulsion Technology Enabler Project
NASA Technical Reports Server (NTRS)
Ghassemieh, Shakib M.
2014-01-01
This study evaluates High Voltage Power Processing Unit (PPU) technology and the driving requirements necessary to enable Microfluidic Electric Propulsion technology research and development by NASA and university partners. This study provides an overview of state-of-the-art PPU technology, with recommendations for technology demonstration projects and missions for NASA to pursue.
USDA-ARS's Scientific Manuscript database
The Cotton Chemistry and Utilization Research Unit is part of the Agricultural Research Service, the U.S. Department of Agriculture's chief scientific in-house research agency. The Research Unit develops new processes, applications, and product-enabling technologies that facilitate the expanded use ...
A Case Study of Enabling Factors in the Technology Integration Change Process
ERIC Educational Resources Information Center
Hsu, Pi-Sui; Sharma, Priya
2008-01-01
The purpose of this qualitative case study was to analyze enabling factors in the technology integration change process in a multi-section science methods course, SCIED 408 (pseudonym), from 1997 to 2003 at a large northeastern university in the United States. We used two major data collection methods, in-depth interviewing and document reviews.…
High Power Silicon Carbide (SiC) Power Processing Unit Development
NASA Technical Reports Server (NTRS)
Scheidegger, Robert J.; Santiago, Walter; Bozak, Karin E.; Pinero, Luis R.; Birchenough, Arthur G.
2015-01-01
NASA GRC successfully designed, built, and tested a technology-push power processing unit for electric propulsion applications that utilizes high-voltage silicon carbide (SiC) technology. The development specifically addresses the need for high-power electronics to enable electric propulsion systems at the hundreds-of-kilowatts scale. This unit demonstrated how high voltage combined with superior semiconductor components results in exceptional converter performance.
Jeong, Kyeong-Min; Kim, Hee-Seung; Hong, Sung-In; Lee, Sung-Keun; Jo, Na-Young; Kim, Yong-Soo; Lim, Hong-Gi; Park, Jae-Hyeung
2012-10-08
Speed enhancement of integral-imaging-based incoherent Fourier hologram capture using a graphics processing unit is reported. The integral-imaging-based method enables exact hologram capture of real, existing three-dimensional objects under regular incoherent illumination. In our implementation, we apply a parallel computation scheme using the graphics processing unit, accelerating the processing speed. Using the enhanced speed of hologram capture, we also implement a pseudo-real-time hologram capture and optical reconstruction system. The overall operation speed is measured to be 1 frame per second.
Technical options for processing additional light tight oil volumes within the United States
2015-01-01
This report examines technical options for processing additional LTO volumes within the United States. Domestic processing of additional LTO would enable an increase in petroleum product exports from the United States, already the world’s largest net exporter of petroleum products. Unlike crude oil, products are not subject to export limitations or licensing requirements. While this is one possible approach to absorbing higher domestic LTO production in the absence of a relaxation of current limitations on crude exports, domestic LTO would have to be priced at a level required to encourage additional LTO runs at existing refinery units, debottlenecking, or possible additions of processing capacity.
FINAL REPORT FOR VERIFICATION OF THE METAL FINISHING FACILITY POLLUTION PREVENTION TOOL (MFFPPT)
The United States Environmental Protection Agency (USEPA) has prepared a computer process simulation package for the metal finishing industry that enables users to predict process outputs based upon process inputs and other operating conditions. This report documents the developm...
Challenges in Educational Modelling: Expressiveness of IMS Learning Design
ERIC Educational Resources Information Center
Caeiro-Rodriguez, Manuel; Anido-Rifon, Luis; Llamas-Nistal, Martin
2010-01-01
Educational Modelling Languages (EMLs) have been proposed to enable the authoring of models of "learning units" (e.g., courses, lessons, lab practices, seminars) covering the broad variety of pedagogical approaches. In addition, some EMLs have been proposed as computational languages that support the processing of learning unit models by…
A GPU-Based Wide-Band Radio Spectrometer
NASA Astrophysics Data System (ADS)
Chennamangalam, Jayanth; Scott, Simon; Jones, Glenn; Chen, Hong; Ford, John; Kepley, Amanda; Lorimer, D. R.; Nie, Jun; Prestage, Richard; Roshi, D. Anish; Wagner, Mark; Werthimer, Dan
2014-12-01
The graphics processing unit has become an integral part of astronomical instrumentation, enabling high-performance online data reduction and accelerated online signal processing. In this paper, we describe a wide-band reconfigurable spectrometer built using an off-the-shelf graphics processing unit card. This spectrometer, when configured as a polyphase filter bank, supports a dual-polarisation bandwidth of up to 1.1 GHz (or a single-polarisation bandwidth of up to 2.2 GHz) on the latest generation of graphics processing units. On the other hand, when configured as a direct fast Fourier transform, the spectrometer supports a dual-polarisation bandwidth of up to 1.4 GHz (or a single-polarisation bandwidth of up to 2.8 GHz).
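The polyphase filter bank configuration described above can be illustrated with a minimal CPU sketch, NumPy standing in for the GPU kernels; the channel count, tap count, and window choice are illustrative, not the spectrometer's actual parameters. The input stream is weighted by a sinc-windowed prototype filter, the taps are summed, and an FFT yields the channelized spectrum:

```python
import numpy as np

def pfb_spectrometer(x, n_chan=8, n_taps=4):
    """Channelize a real time series with a simple polyphase filter bank.

    Each spectrum is formed from n_taps blocks of n_chan samples: the
    blocks are weighted by a sinc-windowed prototype filter, summed
    tap-wise, and transformed with an FFT.
    """
    m = n_chan * n_taps
    win = np.hamming(m) * np.sinc(np.arange(m) / n_chan - n_taps / 2)
    n_spectra = len(x) // m
    blocks = x[: n_spectra * m].reshape(n_spectra, n_taps, n_chan)
    summed = (blocks * win.reshape(n_taps, n_chan)).sum(axis=1)
    return np.fft.rfft(summed, axis=1)   # one spectrum per row

# A pure tone at a channel center should concentrate in one PFB channel.
fs = 64.0
t = np.arange(4096) / fs
tone = np.cos(2 * np.pi * 16.0 * t)      # 16 Hz = 2 * (fs / n_chan)
spec = np.abs(pfb_spectrometer(tone, n_chan=8)) ** 2
print(int(spec.mean(axis=0).argmax()))   # the tone lands in channel 2
```

A production spectrometer advances the filter by n_chan samples per output spectrum for full time resolution; this non-overlapping version keeps the sketch short while preserving the weighted-sum-then-FFT structure.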
Before Studying in the Humanities, What Do Students Need?
ERIC Educational Resources Information Center
Zemits, Birut; Hodson, Linda
2016-01-01
What enables success for students studying in the humanities can be a contested space, dependent not only on the view taken of the content and purpose of specific subjects but also on the nature of teaching and learning. This paper examines the process of redeveloping an elective unit in a Tertiary Enabling Programme to prepare students for study…
Microchannel Distillation of JP-8 Jet Fuel for Sulfur Content Reduction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, Feng; Stenkamp, Victoria S.; TeGrotenhuis, Ward E.
2006-09-16
In microchannel-based distillation processes, thin vapor and liquid films are contacted in small channels where mass transfer is diffusion-limited. The microchannel architecture enables improvements in distillation processes: a shorter height equivalent to a theoretical plate (HETP), and therefore a more compact distillation unit, can be achieved. A microchannel distillation unit was used to produce a light fraction of JP-8 fuel with reduced sulfur content for use as feed to produce fuel-cell-grade hydrogen. The HETP of the microchannel unit is discussed, as well as the effects of process conditions such as feed temperature, flow rate, and reflux ratio.
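HETP relates column height to separation performance: a unit of height Z that achieves N theoretical plates has HETP = Z/N, so a shorter HETP means a more compact column for the same separation. As a hedged illustration with entirely hypothetical numbers (not the JP-8 data from this study), the stage count for a binary split at total reflux can be estimated with the classical Fenske equation:

```python
import math

def fenske_min_stages(x_dist, x_bot, alpha):
    """Minimum number of theoretical stages at total reflux (Fenske)."""
    return math.log((x_dist / (1 - x_dist)) * ((1 - x_bot) / x_bot)) / math.log(alpha)

def hetp(column_height_m, n_stages):
    """Height equivalent to a theoretical plate, in meters."""
    return column_height_m / n_stages

# Hypothetical binary split with relative volatility 2.5 in a 0.30 m unit:
n = fenske_min_stages(x_dist=0.95, x_bot=0.05, alpha=2.5)
print(round(n, 2), "stages,", round(hetp(0.30, n) * 100, 2), "cm HETP")
```

A conventional packed column with an HETP of tens of centimeters would need a correspondingly taller bed for the same stage count, which is the compactness argument the abstract makes.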
High performance hybrid functional Petri net simulations of biological pathway models on CUDA.
Chalkidis, Georgios; Nagasaki, Masao; Miyano, Satoru
2011-01-01
Hybrid functional Petri nets are a widespread tool for representing and simulating biological models. Due to their potential for providing virtual drug-testing environments, biological simulations have a growing impact on pharmaceutical research. Continuous research advancements in biology and medicine lead to exponentially increasing simulation times, thus raising the demand for performance acceleration through efficient and inexpensive parallel computation solutions. Recent developments in the field of general-purpose computation on graphics processing units (GPGPU) have enabled the scientific community to port a variety of compute-intensive algorithms onto the graphics processing unit (GPU). This work presents the first scheme for mapping biological hybrid functional Petri net models, which can handle both discrete and continuous entities, onto compute unified device architecture (CUDA) enabled GPUs. GPU-accelerated simulations are observed to run up to 18 times faster than sequential implementations. Simulating cell boundary formation by Delta-Notch signaling on a CUDA-enabled GPU results in a speedup of approximately 7x for a model containing 1,600 cells.
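A hybrid functional Petri net combines continuous places, updated by rate equations, with discrete transitions that fire when a condition on the marking is met. The toy model below is a minimal sketch of that hybrid stepping scheme with invented species and rate constants; it is not the Delta-Notch model from the paper, and NumPy stands in for the CUDA kernels:

```python
import numpy as np

def hfpn_step(state, dt):
    """One Euler step of a toy hybrid functional Petri net.

    state = [mRNA, protein, gene_on]; the first two are continuous
    places, the last is a discrete place (0/1). All rates are invented.
    """
    mrna, prot, gene_on = state
    d_mrna = 1.0 * gene_on - 0.1 * mrna    # transcription - decay
    d_prot = 0.5 * mrna - 0.05 * prot      # translation - decay
    mrna += d_mrna * dt
    prot += d_prot * dt
    if prot > 5.0:                         # discrete transition fires:
        gene_on = 0.0                      # negative feedback shuts the gene off
    return np.array([mrna, prot, gene_on])

# Step a single cell; many cells would be stacked as array rows, which
# is the data-parallel structure a GPU mapping exploits.
state = np.array([0.0, 0.0, 1.0])
for _ in range(400):                       # 20 time units at dt = 0.05
    state = hfpn_step(state, dt=0.05)
print(state[2])                            # gene has switched off by now
```

The key point of the mapping in the paper is that the continuous updates of all places across all cells are independent within a step, so they vectorize naturally, while the discrete firing conditions are checked per entity each step.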
Shi, Yulin; Veidenbaum, Alexander V; Nicolau, Alex; Xu, Xiangmin
2015-01-15
Modern neuroscience research demands computing power. Neural circuit mapping studies such as those using laser scanning photostimulation (LSPS) produce large amounts of data and require intensive computation for post hoc processing and analysis. Here we report on the design and implementation of a cost-effective desktop computer system for accelerated experimental data processing with recent GPU computing technology. A new version of Matlab software with GPU-enabled functions is used to develop programs that run on Nvidia GPUs to harness their parallel computing power. We evaluated both the central processing unit (CPU) and GPU-enabled computational performance of our system in benchmark testing and practical applications. The experimental results show that GPU-CPU co-processing of simulated data and actual LSPS experimental data clearly outperformed the multi-core CPU, with up to a 22× speedup depending on computational tasks. Further, we present a comparison of numerical accuracy between GPU and CPU computation to verify the precision of GPU computation. In addition, we show how GPUs can be effectively adapted to improve the performance of commercial image processing software such as Adobe Photoshop. To the best of our knowledge, this is the first demonstration of GPU application in neural circuit mapping and electrophysiology-based data processing. Together, GPU-enabled computation enhances our ability to process large-scale data sets derived from neural circuit mapping studies, allowing for increased processing speeds while retaining data precision.
Matrix decomposition graphics processing unit solver for Poisson image editing
NASA Astrophysics Data System (ADS)
Lei, Zhao; Wei, Li
2012-10-01
In recent years, gradient-domain methods have been widely discussed in the image processing field, including seamless cloning and image stitching. These algorithms are commonly carried out by solving a large sparse linear system: the Poisson equation. However, solving the Poisson equation is a computation- and memory-intensive task, which makes it unsuitable for real-time image editing. A new matrix decomposition graphics processing unit (GPU) solver (MDGS) is proposed to address the problem. A matrix decomposition method is used to distribute the work among GPU threads, so that MDGS takes full advantage of the computing power of current GPUs. Additionally, MDGS is a hybrid solver (combining both direct and iterative techniques) and has a two-level architecture. These features enable MDGS to generate solutions identical to those of common Poisson methods and to achieve a high convergence rate in most cases. This approach is advantageous in terms of parallelizability, low memory consumption, and breadth of application, enabling real-time image processing.
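Gradient-domain cloning solves the Poisson equation with the source image's Laplacian as the right-hand side and the target image as the Dirichlet boundary. The sketch below uses plain Jacobi iteration on a single channel as a reference for the underlying problem (MDGS itself is a hybrid direct/iterative GPU solver; the test image here is invented):

```python
import numpy as np

def poisson_blend(target, source, mask, n_iter=2000):
    """Jacobi iterations for single-channel gradient-domain cloning.

    Inside the mask the solution matches the source's Laplacian; the
    target supplies the Dirichlet boundary values everywhere else.
    """
    inside = mask.astype(bool)
    lap = (np.roll(source, 1, 0) + np.roll(source, -1, 0) +
           np.roll(source, 1, 1) + np.roll(source, -1, 1) - 4.0 * source)
    u = target.astype(float).copy()
    u[inside] = source[inside]            # initial guess inside the mask
    for _ in range(n_iter):
        nb = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
              np.roll(u, 1, 1) + np.roll(u, -1, 1))
        u[inside] = (nb[inside] - lap[inside]) / 4.0
    return u

# Constant source cloned into a linear-ramp target: the source Laplacian
# is zero, so the interior relaxes to the harmonic ramp exactly.
target = np.tile(np.linspace(0.0, 1.0, 16), (16, 1))
source = np.full((16, 16), 0.5)
mask = np.zeros((16, 16), bool)
mask[4:12, 4:12] = True
out = poisson_blend(target, source, mask)
print(float(np.abs(out - target).max()) < 1e-3)   # True after convergence
```

Jacobi's per-pixel independence is what makes the problem GPU-friendly, but its slow convergence is exactly why a hybrid solver like MDGS mixes in direct techniques.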
Advanced I&C for Fault-Tolerant Supervisory Control of Small Modular Reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, Daniel G.
In this research, we have developed a supervisory control approach to enable automated control of SMRs. By design, the supervisory control system has a hierarchical, interconnected, adaptive control architecture. A considerable advantage of this architecture is that it allows subsystems to communicate at different or finer granularity, facilitates monitoring of processes at the modular and plant levels, and enables supervisory control. We have investigated the deployment of automation, monitoring, and data collection technologies to enable operation of multiple SMRs. Each unit's controller collects and transfers information from local loops and optimizes that unit's parameters. Information is passed from each SMR unit controller to the supervisory controller, which supervises the actions of the SMR units and manages plant processes. The information processed at the supervisory level provides operators the information needed for reactor, unit, and plant operation. In conjunction with the supervisory effort, we have investigated techniques for fault-tolerant networks, over which information is transmitted between local loops and the supervisory controller to maintain a safe level of operational normalcy in the presence of anomalies. The fault-tolerance of the supervisory control architecture, the network that supports it, and the impact of fault-tolerance on multi-unit SMR plant control have been a second focus of this research. To this end, we have investigated the deployment of advanced automation, monitoring, and data collection and communications technologies to enable operation of multiple SMRs. We have created a fault-tolerant multi-unit SMR supervisory controller that collects and transfers information from local loops, supervises their actions, and adaptively optimizes the controller parameters. The goal of this research has been to develop the methodologies and procedures for fault-tolerant supervisory control of small modular reactors.
To achieve this goal, we have identified the following objectives, which form an ordered approach to the research: I) development of a supervisory digital I&C system; II) fault-tolerance of the supervisory control architecture; III) automated decision making and online monitoring.
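The hierarchy described above, local unit controllers optimizing their own loops under a supervisor that dispatches plant-level objectives, can be caricatured in a few lines. This is a hypothetical toy (proportional control, equal-share dispatch), not the I&C system developed in the project:

```python
class UnitController:
    """Local loop for one SMR unit: simple proportional power control."""
    def __init__(self, setpoint_mw):
        self.setpoint = setpoint_mw
        self.power = 0.0

    def step(self):
        self.power += 0.5 * (self.setpoint - self.power)   # P-control
        return self.power

class Supervisor:
    """Allocates a plant-level demand across units and monitors them."""
    def __init__(self, units):
        self.units = units

    def dispatch(self, plant_demand_mw):
        share = plant_demand_mw / len(self.units)
        for u in self.units:
            u.setpoint = share          # supervisory setpoint update

    def plant_power(self):
        return sum(u.step() for u in self.units)

plant = Supervisor([UnitController(0.0) for _ in range(4)])
plant.dispatch(200.0)                   # 4 units converge to 50 MW each
total = 0.0
for _ in range(20):
    total = plant.plant_power()
print(round(total, 1))
```

The structural point is the information flow: local loops track their own setpoints while the supervisor only exchanges aggregate quantities, which is also the granularity at which a fault-tolerant network would carry the traffic.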
Integrated Process Modeling-A Process Validation Life Cycle Companion.
Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph
2017-10-17
During the regulatory-requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, we furthermore demonstrate its capability in risk and criticality assessment. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the lifecycle concept. These applications are shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows anticipating out-of-specification (OOS) events, identifying critical process parameters, and taking risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
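The Monte Carlo idea behind an integrated process model can be sketched as follows: sample process parameters from their observed variation, propagate each sample through response models of the stacked unit operations, and count the fraction of simulated batches that fall out of specification. The unit-operation responses and specification limit below are invented for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

def fermentation(temp_c):
    """Hypothetical response surface: titer falls off away from 37 C."""
    return 10.0 - 0.5 * (temp_c - 37.0) ** 2

def chromatography(titer, ph):
    """Hypothetical response surface: purity depends on load and pH."""
    return 99.0 - 0.2 * titer - 1.5 * np.abs(ph - 7.0)

# Sample process-parameter variation and propagate it through the two
# stacked unit operations, then estimate the out-of-specification rate.
n = 100_000
temp = rng.normal(37.0, 0.5, n)   # fermentation temperature, C
ph = rng.normal(7.0, 0.1, n)      # chromatography buffer pH
purity = chromatography(fermentation(temp), ph)
p_oos = float(np.mean(purity < 96.5))
print(f"estimated P(OOS) = {p_oos:.4f}")
```

Because the second unit operation consumes the first one's output, interactions between PPs of different unit operations are captured automatically, which is the point of modeling the chain as a whole rather than each step in isolation.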
78 FR 41025 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-09
... that import seed for cleaning or processing, to enter into compliance agreements with APHIS. This... other information activities to enable the importation of seeds for cleaning and processing so that they...: Imported Seed and Screening. OMB Control Number: 0579-0124. Summary of Collection: The United States...
High Temperature Boost (HTB) Power Processing Unit (PPU) Formulation Study
NASA Technical Reports Server (NTRS)
Chen, Yuan; Bradley, Arthur T.; Iannello, Christopher J.; Carr, Gregory A.; Mojarradi, Mohammad M.; Hunter, Don J.; DelCastillo, Linda; Stell, Christopher B.
2013-01-01
This technical memorandum summarizes the Formulation Study conducted during fiscal year 2012 on the High Temperature Boost (HTB) Power Processing Unit (PPU). The effort is authorized and supported by the Game Changing Technology Division, NASA Office of the Chief Technologist. NASA center participation during the formulation includes LaRC, KSC, and JPL. The Formulation Study continues into fiscal year 2013. The formulation study has focused on the power processing unit. The team has proposed a modular, power-scalable, new-technology-enabled High Temperature Boost (HTB) PPU, which offers a 5-10X improvement in PPU specific power/mass and over 30% savings in in-space solar electric system mass.
Prompting Children to Reason Proportionally: Processing Discrete Units as Continuous Amounts
ERIC Educational Resources Information Center
Boyer, Ty W.; Levine, Susan C.
2015-01-01
Recent studies reveal that children can solve proportional reasoning problems presented with continuous amounts that enable intuitive strategies by around 6 years of age but have difficulties with problems presented with discrete units that tend to elicit explicit count-and-match strategies until at least 10 years of age. The current study tests…
Using Sensor Web Processes and Protocols to Assimilate Satellite Data into a Forecast Model
NASA Technical Reports Server (NTRS)
Goodman, H. Michael; Conover, Helen; Zavodsky, Bradley; Maskey, Manil; Jedlovec, Gary; Regner, Kathryn; Li, Xiang; Lu, Jessica; Botts, Mike; Berthiau, Gregoire
2008-01-01
The goal of the Sensor Management Applied Research Technologies (SMART) On-Demand Modeling project is to develop and demonstrate the readiness of the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) capabilities to integrate both space-based Earth observations and forecast model output into new data acquisition and assimilation strategies. The project is developing sensor web-enabled processing plans to assimilate Atmospheric Infrared Sounder (AIRS) satellite temperature and moisture retrievals into a regional Weather Research and Forecasting (WRF) model over the southeastern United States.
The histone shuffle: histone chaperones in an energetic dance
Das, Chandrima; Tyler, Jessica K.; Churchill, Mair E.A.
2014-01-01
Our genetic information is tightly packaged into a rather ingenious nucleoprotein complex called chromatin in a manner that enables it to be rapidly accessed during genomic processes. Formation of the nucleosome, which is the fundamental unit of chromatin, occurs via a stepwise process that is reversed to enable the disassembly of nucleosomes. Histone chaperone proteins have prominent roles in facilitating these processes as well as in replacing old histones with new canonical histones or histone variants during the process of histone exchange. Recent structural, biophysical and biochemical studies have begun to shed light on the molecular mechanisms whereby histone chaperones promote chromatin assembly, disassembly and histone exchange to facilitate DNA replication, repair and transcription. PMID:20444609
A Hydrogen Containment Process for Nuclear Thermal Engine Ground testing
NASA Technical Reports Server (NTRS)
Wang, Ten-See; Stewart, Eric; Canabal, Francisco
2016-01-01
The objective of this study is to propose a new total hydrogen containment process to enable the testing required for NTP engine development. This H2 removal process comprises two unit operations: an oxygen-rich burner and a shell-and-tube heat exchanger. The new process is demonstrated by simulation of the steady-state operation of the engine firing at nominal conditions.
Wireless structural monitoring for homeland security applications
NASA Astrophysics Data System (ADS)
Kiremidjian, Garo K.; Kiremidjian, Anne S.; Lynch, Jerome P.
2004-07-01
This paper addresses the development of a robust, low-cost, low power, and high performance autonomous wireless monitoring system for civil assets such as large facilities, new construction, bridges, dams, commercial buildings, etc. The role of the system is to identify the onset, development, location and severity of structural vulnerability and damage. The proposed system represents an enabling infrastructure for addressing structural vulnerabilities specifically associated with homeland security. The system concept is based on dense networks of "intelligent" wireless sensing units. The fundamental properties of a wireless sensing unit include: (a) interfaces to multiple sensors for measuring structural and environmental data (such as acceleration, displacements, pressure, strain, material degradation, temperature, gas agents, biological agents, humidity, corrosion, etc.); (b) processing of sensor data with embedded algorithms for assessing damage and environmental conditions; (c) peer-to-peer wireless communications for information exchange among units (thus enabling joint "intelligent" processing coordination) and storage of data and processed information in servers for information fusion; (d) ultra low power operation; (e) cost-effectiveness and compact size through the use of low-cost small-size off-the-shelf components. An integral component of the overall system concept is a decision support environment for interpretation and dissemination of information to various decision makers.
Genewein, U; Jakob, M; Bingisser, R; Burla, S; Heberer, M
2009-02-01
Mission and organization of emergency units were analysed to understand the underlying principles and concepts. The recent literature (2000-2007) on organizational structures and functional concepts of clinical emergency units was reviewed. An organizational portfolio based on the criteria of specialization (presence of medical specialists in the emergency unit) and integration (integration of the emergency unit into the hospital structure) was established. The resulting organizational archetypes were comparatively assessed based on established efficiency criteria (efficiency of resource utilization, process efficiency, market efficiency). Clinical emergency units differ with regard to autonomy (within the hospital structure), range of services, and service depth (horizontal and vertical integration). The specialization-integration portfolio enabled the definition of typical organizational patterns (so-called archetypes): profit centres primarily driven by economic objectives, service centres operating on the basis of agreements with the hospital board, functional clinical units integrated into medical specialty units (e.g., surgery, gynaecology), and modular organizations characterized by small emergency teams that call specialists immediately after triage and initial diagnostics. There is no "one size fits all" concept for the organization of clinical emergency units. Instead, a number of well-characterized organizational concepts are available, enabling a rational choice based on a hospital's mission and demand.
Tankam, Patrice; Santhanam, Anand P.; Lee, Kye-Sung; Won, Jungeun; Canavesi, Cristina; Rolland, Jannick P.
2014-01-01
Gabor-domain optical coherence microscopy (GD-OCM) is a volumetric high-resolution technique capable of acquiring three-dimensional (3-D) skin images with histological resolution. Real-time image processing is needed to enable GD-OCM imaging in a clinical setting. We present a parallelized and scalable multi-graphics processing unit (GPU) computing framework for real-time GD-OCM image processing. A parallelized control mechanism was developed to individually assign computation tasks to each of the GPUs. For each GPU, the optimal number of amplitude-scans (A-scans) to be processed in parallel was selected to maximize GPU memory usage and core throughput. We investigated five computing architectures for computational speed-up in processing 1000×1000 A-scans. The proposed parallelized multi-GPU computing framework enables processing at a computational speed faster than the GD-OCM image acquisition, thereby facilitating high-speed GD-OCM imaging in a clinical setting. Using two parallelized GPUs, the image processing of a 1×1×0.6 mm3 skin sample was performed in about 13 s, and the performance was benchmarked at 6.5 s with four GPUs. This work thus demonstrates that 3-D GD-OCM data may be displayed in real-time to the examiner using parallelized GPU processing. PMID:24695868
H-theorem and Maxwell demon in quantum physics
NASA Astrophysics Data System (ADS)
Kirsanov, N. S.; Lebedev, A. V.; Sadovskyy, I. A.; Suslov, M. V.; Vinokur, V. M.; Blatter, G.; Lesovik, G. B.
2018-02-01
The Second Law of Thermodynamics states that the temporal evolution of an isolated system occurs with non-diminishing entropy. In the quantum realm, this holds for energy-isolated systems whose evolution is described by a so-called unital quantum channel. The entropy of a system evolving in a non-unital quantum channel can, in principle, decrease. We formulate a general criterion of unitality for the evolution of a quantum system, enabling a simple and rigorous approach for finding and identifying processes accompanied by decreasing entropy in energy-isolated systems. We discuss two examples illustrating our findings: the quantum Maxwell demon and a heating-cooling process within a two-qubit system.
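For context, the textbook notion of unitality invoked here can be written out; this is the standard Kraus-representation definition and the known entropy inequality for unital channels, not the paper's specific criterion:

```latex
% A quantum channel in Kraus form, trace preserving:
\Phi(\rho) = \sum_k A_k \rho A_k^{\dagger},
\qquad \sum_k A_k^{\dagger} A_k = \mathbb{1}.
% The channel is unital iff it also preserves the identity, and then
% von Neumann entropy cannot decrease:
\Phi(\mathbb{1}) = \sum_k A_k A_k^{\dagger} = \mathbb{1}
\;\Longrightarrow\;
S\bigl(\Phi(\rho)\bigr) \ge S(\rho) \quad \text{for all states } \rho.
```

A non-unital channel, one with \(\Phi(\mathbb{1}) \neq \mathbb{1}\), escapes this inequality, which is the loophole the Maxwell-demon example exploits.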
Sequential microfluidic droplet processing for rapid DNA extraction.
Pan, Xiaoyan; Zeng, Shaojiang; Zhang, Qingquan; Lin, Bingcheng; Qin, Jianhua
2011-11-01
This work describes a novel droplet-based microfluidic device, which enables sequential droplet processing for rapid DNA extraction. The microdevice consists of a droplet generation unit, two reagent addition units and three droplet splitting units. The loading/washing/elution steps required for DNA extraction were carried out by sequential microfluidic droplet processing. The movement of superparamagnetic beads, which were used as extraction supports, was controlled with magnetic field. The microdevice could generate about 100 droplets per min, and it took about 1 min for each droplet to perform the whole extraction process. The extraction efficiency was measured to be 46% for λ-DNA, and the extracted DNA could be used in subsequent genetic analysis such as PCR, demonstrating the potential of the device for fast DNA extraction.
NASA Astrophysics Data System (ADS)
Kang, Sungil; Roh, Annah; Nam, Bodam; Hong, Hyunki
2011-12-01
This paper presents a novel vision system for people detection using an omnidirectional camera mounted on a mobile robot. In order to determine regions of interest (ROI), we compute a dense optical flow map using graphics processing units, which enable us to examine compliance with the ego-motion of the robot in a dynamic environment. Shape-based classification algorithms are employed to sort ROIs into human beings and nonhumans. The experimental results show that the proposed system detects people more precisely than previous methods.
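The ego-motion compliance test described above amounts to comparing the measured flow field against the flow predicted from the robot's own motion and flagging regions where the residual is large. A minimal synthetic sketch follows (uniform predicted ego-flow and an invented moving region; the paper's GPU flow computation and omnidirectional geometry are not reproduced):

```python
import numpy as np

def motion_rois(flow, ego_flow, thresh=1.0):
    """Flag pixels whose measured flow deviates from the ego-motion
    prediction; connected deviating regions become candidate ROIs."""
    residual = np.linalg.norm(flow - ego_flow, axis=-1)
    return residual > thresh

# Synthetic example: the robot's ego-motion predicts uniform rightward
# flow; an independently moving person adds an upward component.
h, w = 32, 32
ego = np.zeros((h, w, 2))
ego[..., 0] = 2.0                    # predicted: 2 px/frame to the right
flow = ego.copy()
flow[10:20, 12:22, 1] += 3.0         # independent motion in a 10x10 patch
roi = motion_rois(flow, ego)
print(int(roi.sum()))                # 100 pixels flagged as non-compliant
```

The flagged mask would then be passed to the shape-based classifier to decide which ROIs are human.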
Centrifugal microfluidic platforms: advanced unit operations and applications.
Strohmeier, O; Keller, M; Schwemmer, F; Zehnle, S; Mark, D; von Stetten, F; Zengerle, R; Paust, N
2015-10-07
Centrifugal microfluidics has evolved into a mature technology. Several major diagnostic companies either have products on the market or are currently evaluating centrifugal microfluidics for product development. The fields of application are widespread and include clinical chemistry, immunodiagnostics and protein analysis, cell handling, molecular diagnostics, as well as food, water, and soil analysis. Nevertheless, new fluidic functions and applications that expand the possibilities of centrifugal microfluidics are being introduced at a high pace. In this review, we first present an up-to-date comprehensive overview of centrifugal microfluidic unit operations. Then, we introduce the term "process chain" to review how these unit operations can be combined for the automation of laboratory workflows. Such aggregation of basic functionalities enables efficient fluidic design at a higher level of integration. Furthermore, we analyze how novel, ground-breaking unit operations may foster the integration of more complex applications. Among these are the storage of pneumatic energy to realize complex switching sequences or to pump liquids radially inward, as well as the complete pre-storage and release of reagents. In this context, centrifugal microfluidics provides major advantages over other microfluidic actuation principles: the pulse-free inertial liquid propulsion provided by centrifugal microfluidics allows for closed fluidic systems that are free of any interfaces to external pumps. Processed volumes are easily scalable from nanoliters to milliliters. Volume forces can be adjusted by rotation and thus, even for very small volumes, surface forces may easily be overcome in the centrifugal gravity field which enables the efficient separation of nanoliter volumes from channels, chambers or sensor matrixes as well as the removal of any disturbing bubbles. 
In summary, centrifugal microfluidics takes advantage of a comprehensive set of fluidic unit operations such as liquid transport, metering, mixing and valving. The available unit operations cover the entire range of automated liquid handling requirements and enable efficient miniaturization, parallelization, and integration of assays.
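The scaling argument above — volume forces adjustable by rotation versus fixed surface forces — can be illustrated with a back-of-envelope comparison. All numbers below (water-like liquid, spin speed, valve geometry, contact angle) are illustrative assumptions, not values from the review:

```python
import math

def centrifugal_pressure(rho, rpm, r_inner, r_outer):
    """Pressure generated by a liquid plug spinning between two radii (Pa)."""
    omega = 2 * math.pi * rpm / 60.0
    return 0.5 * rho * omega**2 * (r_outer**2 - r_inner**2)

def capillary_pressure(gamma, contact_angle_deg, channel_radius):
    """Young-Laplace burst pressure of a hydrophobic capillary valve (Pa)."""
    theta = math.radians(contact_angle_deg)
    return -2 * gamma * math.cos(theta) / channel_radius

# Hypothetical case: water plug between 20 mm and 30 mm radius, disc at 1500 rpm
p_spin = centrifugal_pressure(1000.0, 1500, 0.020, 0.030)
# Hypothetical hydrophobic valve: 100 um diameter channel, contact angle 120 deg
p_burst = capillary_pressure(0.072, 120, 50e-6)
print(f"centrifugal: {p_spin:.0f} Pa, burst: {p_burst:.0f} Pa, valve opens: {p_spin > p_burst}")
```

At this spin speed the centrifugal pressure (~6 kPa) comfortably exceeds the valve's burst pressure (~1.4 kPa), and lowering the rotation frequency reverses the inequality — which is exactly how rotationally actuated valving works.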
W. T. Zakrzewski; M. Penner; D. W. MacFarlane
2007-01-01
As part of the Canada-United States Great Lakes Stem Profile Modelling Project, established to support the local timber production process and to enable cross-border comparisons of timber volumes, here we present results of fitting Zakrzewski's (1999) stem profile model for red pine (Pinus resinosa Ait.) growing in Michigan, United States, and...
Continuous Manufacturing of Recombinant Therapeutic Proteins: Upstream and Downstream Technologies.
Patil, Rohan; Walther, Jason
2017-03-07
Continuous biomanufacturing of recombinant therapeutic proteins offers several potential advantages over conventional batch processing, including reduced cost of goods, more flexible and responsive manufacturing facilities, and improved and consistent product quality. Although continuous approaches to various upstream and downstream unit operations have been considered and studied for decades, in recent years interest and application have accelerated. Researchers have achieved increasingly higher levels of process intensification, and have also begun to integrate different continuous unit operations into larger, holistically continuous processes. This review first discusses approaches for continuous cell culture, with a focus on perfusion-enabling cell separation technologies including gravitational, centrifugal, and acoustic settling, as well as filtration-based techniques. We follow with a review of various continuous downstream unit operations, covering categories such as clarification, chromatography, formulation, and viral inactivation and filtration. The review ends by summarizing case studies of integrated and continuous processing as reported in the literature.
Kesavachandran, C; Rastogi, S K; Das, Mohan; Khan, Asif M
2006-07-01
Workers in information technology (IT)-enabled services like business process outsourcing and call centers working with visual display units are reported to have various health and psycho-social disorders. Evidence from previously published studies in peer-reviewed journals and internet sources was examined to explore health disorders and psycho-social problems among personnel employed in IT-based services, for a systematic review on the topic. In addition, the authors executed a questionnaire-based pilot study. The available literature and the pilot study both suggest health disorders and psychosocial problems among workers in business process outsourcing. The details are discussed in the review.
Processing-in-Memory Enabled Graphics Processors for 3D Rendering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, Chenhao; Song, Shuaiwen; Wang, Jing
2017-02-06
The performance of 3D rendering on a Graphics Processing Unit (GPU), which converts a 3D vector stream into a 2D frame with 3D image effects, significantly impacts users' gaming experience on modern computer systems. Due to the high texture throughput in 3D rendering, main memory bandwidth becomes a critical obstacle to improving overall rendering performance. 3D stacked memory systems such as the Hybrid Memory Cube (HMC) provide opportunities to significantly overcome the memory wall by directly connecting logic controllers to DRAM dies. Based on the observation that texel fetches significantly impact off-chip memory traffic, we propose two architectural designs to enable Processing-In-Memory based GPUs for efficient 3D rendering.
Malba, V.
1998-11-10
A manufacturable process for fabricating electrical interconnects which extend from a top surface of an integrated circuit chip to a sidewall of the chip using laser pantography to pattern three dimensional interconnects. The electrical interconnects may be of an L-connect or L-shaped type. The process implements three dimensional (3D) stacking by moving the conventional bond or interface pads on a chip to the sidewall of the chip. Implementation of the process includes: (1) holding individual chips for batch processing, (2) depositing a dielectric passivation layer on the top and sidewalls of the chips, (3) opening vias in the dielectric, (4) forming the interconnects by laser pantography, and (5) removing the chips from the holding means. The process enables low cost manufacturing of chips with bond pads on the sidewalls, which enables stacking for increased performance, reduced space, and higher functionality per unit volume. 3 figs.
SMIF capability at Intel Mask Operation improves yield
NASA Astrophysics Data System (ADS)
Dam, Thuc H.; Pekny, Matt; Millino, Jim; Luu, Gibson; Melwani, Nitesh; Venkatramani, Aparna; Tavassoli, Malahat
2003-08-01
At Intel Mask Operations (IMO), Standard Mechanical Interface (SMIF) processing has been employed to reduce environmental particle contamination from manual handling-related activities. SMIF handling entailed the utilization of automated robotic transfers of photoblanks/reticles between SMIF pods, whereas conventional handling utilized manual pick transfers of masks between SMIF pods with intermediate storage in Toppan compacts. The SMIF-enabled units in IMO's process line included: (1) coater, (2) exposure, (3) developer, (4) dry etcher, and (5) inspection. Each unit is equipped with an automated I/O port, an environmentally enclosed processing chamber, and SMIF pods. Yield metrics were utilized to demonstrate the effectiveness and advantages of SMIF processing compared to manual processing. The areas of focus in this paper were blank resist coating, binary front-end reticle processing and 2nd level PSM reticle processing. Results obtained from the investigation showed yield improvements in these areas.
An efficient start-up circuitry for de-energized ultra-low power energy harvesting systems
NASA Astrophysics Data System (ADS)
Hörmann, Leander B.; Berger, Achim; Salzburger, Lukas; Priller, Peter; Springer, Andreas
2015-05-01
Cyber-physical systems often include small wireless devices to measure physical quantities or control a technical process. These devices need a self-sufficient power supply because no wired infrastructure is available. Their operational time can be extended by energy harvesting systems. However, the convertible power is often limited and discontinuous, which necessitates an energy storage unit. If this unit (and thus the whole system) is de-energized, the start-up process may take a significant amount of time because of an inefficient energy harvesting process. Therefore, this paper presents a system which enables a safe and fast start-up from the de-energized state.
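As a rough illustration of why start-up from a de-energized state can be slow, one can estimate the time needed to charge the storage element to its turn-on voltage from the harvested power. The capacitance, voltage, power, and efficiency figures below are hypothetical, not taken from the paper:

```python
def startup_time_s(capacitance_f, v_on, p_harvest_w, efficiency=0.5):
    """Seconds to charge a capacitor from 0 V to the turn-on voltage v_on,
    assuming a constant net harvested power and a fixed start-up efficiency."""
    energy_needed_j = 0.5 * capacitance_f * v_on**2
    return energy_needed_j / (p_harvest_w * efficiency)

# Hypothetical node: 100 uF storage, 3.3 V turn-on, 1 mW harvested,
# 50% conversion efficiency during the inefficient start-up phase
t = startup_time_s(100e-6, 3.3, 1e-3)
print(f"start-up in ~{t:.2f} s")
```

Halving the start-up efficiency doubles this time, which is why a dedicated, efficient start-up circuit pays off when the harvested power is already marginal.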
Model of environmental life cycle assessment for coal mining operations.
Burchart-Korol, Dorota; Fugiel, Agata; Czaplicka-Kolarz, Krystyna; Turek, Marian
2016-08-15
This paper presents a novel approach to environmental assessment of coal mining operations, which enables assessment of the factors that are both directly and indirectly affecting the environment and are associated with the production of raw materials and energy used in processes. The primary novelty of the paper is the development of a computational environmental life cycle assessment (LCA) model for coal mining operations and the application of the model for coal mining operations in Poland. The LCA model enables the assessment of environmental indicators for all identified unit processes in hard coal mines with the life cycle approach. The proposed model enables the assessment of greenhouse gas emissions (GHGs) based on the IPCC method and the assessment of damage categories, such as human health, ecosystems and resources based on the ReCiPe method. The model enables the assessment of GHGs for hard coal mining operations in three time frames: 20, 100 and 500 years. The model was used to evaluate the coal mines in Poland. It was demonstrated that the largest environmental impacts in damage categories were associated with the use of fossil fuels, methane emissions and the use of electricity, processing of wastes, heat, and steel supports. It was concluded that an environmental assessment of coal mining operations, apart from direct influence from processing waste, methane emissions and drainage water, should include the use of electricity, heat and steel, particularly for steel supports. Because the model allows the comparison of environmental impact assessment for various unit processes, it can be used for all hard coal mines, not only in Poland but also in the world. This development is an important step forward in the study of the impacts of fossil fuels on the environment with the potential to mitigate the impact of the coal industry on the environment. Copyright © 2016 Elsevier B.V. All rights reserved.
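The three reporting time frames matter because the global warming potential (GWP) of methane falls sharply with horizon length. A minimal sketch of horizon-dependent CO2-equivalent accounting, using illustrative IPCC AR4 GWP factors for methane and a hypothetical mine inventory (not figures from the study):

```python
# Illustrative GWP factors for methane relative to CO2 (IPCC AR4 values;
# treat these as assumptions, newer assessment reports revise them)
GWP_CH4 = {20: 72.0, 100: 25.0, 500: 7.6}

def co2_equivalent(co2_t, ch4_t, horizon_years):
    """Total GHG emissions in tonnes CO2e for a given time horizon."""
    return co2_t + ch4_t * GWP_CH4[horizon_years]

# Hypothetical mine inventory: 1000 t CO2 (energy use), 50 t fugitive CH4
for h in (20, 100, 500):
    print(f"{h:>3}-year horizon: {co2_equivalent(1000.0, 50.0, h):.0f} t CO2e")
```

The same physical emissions thus appear three times larger on a 20-year horizon than on a 500-year one, which is why methane-heavy operations such as coal mines are sensitive to the chosen time frame.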
Jurado, Marisa; Algora, Manuel; Garcia-Sanchez, Félix; Vico, Santiago; Rodriguez, Eva; Perez, Sonia; Barbolla, Luz
2012-01-01
Background The Community Transfusion Centre in Madrid currently processes whole blood using a conventional procedure (Compomat, Fresenius) followed by automated processing of buffy coats with the OrbiSac system (CaridianBCT). The Atreus 3C system (CaridianBCT) automates the production of red blood cells, plasma and an interim platelet unit from a whole blood unit. Interim platelet units are pooled to produce a transfusable platelet unit. In this study the Atreus 3C system was evaluated and compared to the routine method with regard to product quality and operational value. Materials and methods Over a 5-week period 810 whole blood units were processed using the Atreus 3C system. The attributes of the automated process were compared to those of the routine method by assessing productivity, space, equipment and staffing requirements. The data obtained were evaluated in order to estimate the impact of implementing the Atreus 3C system in the routine setting of the blood centre. Yield and in vitro quality of the final blood components processed with the two systems were evaluated and compared. Results The Atreus 3C system enabled higher throughput while requiring less space and employee time by decreasing the amount of equipment and processing time per unit of whole blood processed. Whole blood units processed on the Atreus 3C system gave a higher platelet yield, a similar amount of red blood cells and a smaller volume of plasma. Discussion These results support the conclusion that the Atreus 3C system produces blood components meeting quality requirements while providing a high operational efficiency. Implementation of the Atreus 3C system could result in a large organisational improvement. PMID:22044958
An Exchange-Only Qubit in Isotopically Enriched 28Si
NASA Astrophysics Data System (ADS)
Gyure, Mark
2015-03-01
We demonstrate coherent manipulation and universal control of a qubit composed of a triple quantum dot implemented in an isotopically enhanced Si/SiGe heterostructure, which requires no local AC or DC magnetic fields for operation. Strong control over tunnel rates is enabled by a dopantless, accumulation-only device design, and an integrated measurement dot enables single-shot measurement. Reduction of magnetic noise is achieved via isotopic purification of the silicon quantum well. We demonstrate universal control using composite pulses and employ these pulses for spin-echo-type sequences to measure both magnetic noise and charge noise. The noise measured is sufficiently low to enable the long pulse sequences required for exchange-only quantum information processing. Sponsored by United States Department of Defense. The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressly or implied, of the United States Department of Defense or the U.S. Government. Approved for public release, distribution unlimited.
Hansen, M.C.; Egorov, Alexey; Roy, David P.; Potapov, P.; Ju, J.; Turubanova, S.; Kommareddy, I.; Loveland, Thomas R.
2011-01-01
Vegetation Continuous Field (VCF) layers of 30 m percent tree cover, bare ground, other vegetation and probability of water were derived for the conterminous United States (CONUS) using Landsat 7 Enhanced Thematic Mapper Plus (ETM+) data sets from the Web-Enabled Landsat Data (WELD) project. Turnkey approaches to land cover characterization were enabled due to the systematic WELD Landsat processing, including conversion of digital numbers to calibrated top of atmosphere reflectance and brightness temperature, cloud masking, reprojection into a continental map projection and temporal compositing. Annual, seasonal and monthly WELD composites for 2008 were used as spectral inputs to a bagged regression and classification tree procedure using a large training data set derived from very high spatial resolution imagery and available ancillary data. The results illustrate the ability to perform Landsat land cover characterizations at continental scales that are internally consistent while retaining local spatial and thematic detail.
Single String Integration Test of the High Voltage Hall Accelerator System
NASA Technical Reports Server (NTRS)
Kamhawi, Hani; Haag, Thomas W.; Huang, Wensheng; Pinero, Luis; Peterson, Todd; Shastry, Rohit
2013-01-01
HiVHAc Task Objectives:
- Develop and demonstrate low-power, long-life Hall thruster technology to enable cost-effective EP for Discovery-class missions.
- Advance the TRL of potential power processing units and xenon feed systems to integrate with the HiVHAc thruster.
Leadership and Logistics Meeting the Army’s Expeditionary Requirements of Today and 2025
2016-02-16
pushback. Many will blame the new system if they see a drop in readiness numbers. Leader involvement in this process will be critical. They will...unit would push back and leaders would need to reinforce the change process. A top-down vision with innovators from below enabled the change. By...include leading Army organizations in supply chain management, system design and development, business process improvement, and Lean Six Sigma. Mr
Simulation Based Exploration of Critical Zone Dynamics in Intensively Managed Landscapes
NASA Astrophysics Data System (ADS)
Kumar, P.
2017-12-01
The advent of high-resolution measurements of topographic and (vertical) vegetation features using aerial LiDAR is enabling us to resolve micro-scale (~1 m) landscape structural characteristics over large areas. Availability of hyperspectral measurements is further augmenting these LiDAR data by enabling the biogeochemical characterization of vegetation and soils at unprecedented spatial resolutions (~1-10 m). Such data have opened up novel opportunities for modeling Critical Zone processes and exploring questions that were not possible before. We show how an integrated 3-D model at 1 m grid resolution can enable us to resolve micro-topographic and ecological dynamics and their control on hydrologic and biogeochemical processes over large areas. We address the computational challenge of such detailed modeling by exploiting hybrid CPU and GPU computing technologies. We show results of moisture, biogeochemical, and vegetation dynamics from studies in the Critical Zone Observatory for Intensively Managed Landscapes (IMLCZO) in the Midwestern United States.
NASA Astrophysics Data System (ADS)
Repcheck, Randall J.
2010-09-01
The United States Federal Aviation Administration’s Office of Commercial Space Transportation(AST) authorizes the launch and reentry of expendable and reusable launch vehicles and the operation of launch and reentry sites by United States citizens or within the United States. It authorizes these activities consistent with public health and safety, the safety of property, and the national security and foreign policy interests of the United States. In addition to its safety role, AST has the role to encourage, facilitate, and promote commercial space launches and reentries by the private sector. AST’s promotional role includes, among other things, the development of information of interest to industry, the sharing of information of interest through a variety of methods, and serving as an advocate for Commercial Space Transportation within the United States government. This dual safety and promotion role is viewed by some as conflicting. AST views these two roles as complementary, and important for the current state of commercial space transportation. This paper discusses how maintaining a sound safety decision-making process, maintaining a strong safety culture, and taking steps to avoid complacency can together enable safe and successful commercial space transportation.
Developing Modular and Adaptable Courseware Using TeachML.
ERIC Educational Resources Information Center
Wehner, Frank; Lorz, Alexander
This paper presents the use of an XML grammar for two complementary projects--CHAMELEON (Cooperative Hypermedia Adaptive MultimEdia Learning Objects) and EIT (Enabling Informal Teamwork). Areas of applications are modular courseware documents and the collaborative authoring process of didactical units. A number of requirements for a suitable…
Peer Observation of Teaching: Reflections of an Early Career Academic
ERIC Educational Resources Information Center
Eri, Rajaraman
2014-01-01
Peer observation of teaching (POT) is a reciprocal process in which a peer observes another's teaching (classroom, virtual, online, or even teaching resources such as unit outlines and assignments). Peers then provide constructive feedback that enables teaching professional development through the mirror of critical reflection by both the observer…
The rise of biosimilars: How they got here and where they are going.
Patel, Dhiren; Gillis, Colin; Naggar, Joseph; Mistry, Amee; Mantzoros, Christos S
2017-10-01
Biosimilars have become a subject of great interest in the past few years. The European Union and the United States are seeing an increasing number of biosimilar applications and approvals. The development of a biosimilar is significantly more complex and costly than a small molecule generic product. In the European Union, there has been a wider use of these medications compared to the United States. More biosimilars are gaining approval in the United States, and these products will likely alter the healthcare system in highly impactful ways. Understanding the regulatory process, the risks, and benefits will enable clinicians to be prepared and maximize the utility of these medications when they enter the market. This article introduces the concept of a biosimilar, discusses the regulatory process in the United States, and reviews the risks and benefits of these products. Copyright © 2017 Elsevier Inc. All rights reserved.
High Performance GPU-Based Fourier Volume Rendering.
Abdellah, Marwan; Eldeib, Ayman; Sharawi, Amr
2015-01-01
Fourier volume rendering (FVR) is a significant visualization technique that has been used widely in digital radiography. As a result of its O(N^2 log N) time complexity, it provides a faster alternative to spatial-domain volume rendering algorithms, which are O(N^3) computationally complex. Relying on the Fourier projection-slice theorem, this technique operates on the spectral representation of a 3D volume instead of processing its spatial representation to generate attenuation-only projections that look like X-ray radiographs. Due to the rapid evolution of its underlying architecture, the graphics processing unit (GPU) became an attractive, capable platform that can deliver immense raw computational power compared to the central processing unit (CPU) on a per-dollar basis. The introduction of the compute unified device architecture (CUDA) technology enables embarrassingly parallel algorithms to run efficiently on CUDA-capable GPU architectures. In this work, a high-performance GPU-accelerated implementation of the FVR pipeline on CUDA-enabled GPUs is presented. This proposed implementation can achieve a speed-up of 117x compared to a single-threaded hybrid implementation that uses the CPU and GPU together by taking advantage of executing the rendering pipeline entirely on recent GPU architectures.
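The projection-slice theorem at the heart of FVR states that the Fourier transform of a projection equals a central slice of the volume's full spectrum. A minimal NumPy sketch, using a 2D array in place of a 3D volume for brevity (the 3D case is the same idea with a 2D central slice):

```python
import numpy as np

rng = np.random.default_rng(0)
volume = rng.random((32, 32))          # 2D "volume" for brevity

# Spatial-domain projection: integrate along axis 0 (an X-ray-like view)
proj_spatial = volume.sum(axis=0)

# Fourier route: the zero-frequency slice of the 2D spectrum,
# inverse-transformed, reproduces the projection
spectrum = np.fft.fft2(volume)
proj_fourier = np.fft.ifft(spectrum[0, :]).real

assert np.allclose(proj_spatial, proj_fourier)
print("projection-slice theorem verified")
```

The speed advantage follows: once the O(N^2 log N) (3D: O(N^3 log N)) transform is precomputed, each new view costs only a slice extraction plus a lower-dimensional inverse FFT.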
Low-Energy, Low-Cost Production of Ethylene by Low- Temperature Oxidative Coupling of Methane
DOE Office of Scientific and Technical Information (OSTI.GOV)
Radaelli, Guido; Chachra, Gaurav; Jonnavittula, Divya
In this project, we develop a catalytic process technology for distributed small-scale production of ethylene by oxidative coupling of methane at low temperatures using an advanced catalyst. The Low Temperature Oxidative Coupling of Methane (LT-OCM) catalyst system is enabled by a novel chemical catalyst and process pioneered by Siluria, at private expense, over the last six years. Herein, we develop the LT-OCM catalyst system for distributed small-scale production of ethylene by identifying and addressing necessary process schemes, unit operations and process parameters that limit the economic viability and mass penetration of this technology to manufacture ethylene at small scales. The output of this program is process concepts for small-scale LT-OCM catalyst based ethylene production, lab-scale verification of the novel unit operations adopted in the proposed concept, and an analysis to validate the feasibility of the proposed concepts.
An Intercultural Community - Input Process for Curriculum Development.
ERIC Educational Resources Information Center
Leonard, Deni
A program to bring about community involvement in the development of curriculum for public schools was implemented in Seattle in 1974-75 by the United Indians of All Tribes Foundation. The program follows a 12-step procedure that begins with selecting community representatives who will learn curriculum planning skills enabling them to make…
Improving Organizational Learning: Defining Units of Learning from Social Tools
ERIC Educational Resources Information Center
Menolli, André Luís Andrade; Reinehr, Sheila; Malucelli, Andreia
2013-01-01
New technologies, such as social networks, wikis, blogs and other social tools, enable collaborative work and are important facilitators of the social learning process. Many companies are using these types of tools as substitutes for their intranets, especially software development companies. However, the content generated by these tools in many…
Miao, Yipu; Merz, Kenneth M
2015-04-14
We present an efficient implementation of ab initio self-consistent field (SCF) energy and gradient calculations that run on Compute Unified Device Architecture (CUDA) enabled graphical processing units (GPUs) using recurrence relations. We first discuss the machine-generated code that calculates the electron-repulsion integrals (ERIs) for different ERI types. Next we describe the porting of the SCF gradient calculation to GPUs, which results in an acceleration of the computation of the first-order derivatives of the ERIs. However, only s, p, and d ERIs and s and p derivatives could be executed simultaneously on GPUs using the current version of CUDA and generation of NVidia GPUs using a previously described algorithm [Miao and Merz J. Chem. Theory Comput. 2013, 9, 965-976]. Hence, we developed an algorithm to compute f type ERIs and d type ERI derivatives on GPUs. Our benchmarks show that GPU-enabled ERI and ERI derivative computation yielded speedups of 10-18 times relative to traditional CPU execution. An accuracy analysis using double-precision calculations demonstrates that the overall accuracy is satisfactory for most applications.
Moore, Jenny; Crozier, Kenda; Kite, Katharine
2012-01-01
The National Health Service in the United Kingdom is committed to a process of reform centred on quality care and innovative practice. Central to this process is the need for research capacity building within the workforce. The aim of this study was to develop an infrastructure for research capacity building within one National Health Service Foundation Trust. Using an Action Research methodology, sixteen individuals were purposefully selected from a population of nurses and midwives to participate in the study. This nonprobability sampling method enabled the researchers to select participants on the basis of who would be most informative about existing research capacity building structures and processes within the Trust. Data were collected in the form of semi-structured individual interviews with each participant. The main findings were that research activity was not embedded in the culture of the organisation, and that initiating and undertaking change was a complex process. As a result, a range of structures and processes which were considered necessary to enable the Trust to move forward in developing capacity and capability for research were developed and implemented. This paper reports the first two stages of this process, namely the findings from the pre-step and an outline of how these findings were used to create an infrastructure to support research capacity building within one NHS Foundation Trust Hospital in the United Kingdom. Copyright © 2011 Elsevier Ltd. All rights reserved.
Software Framework for Advanced Power Plant Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
John Widmann; Sorin Munteanu; Aseem Jain
2010-08-01
This report summarizes the work accomplished during the Phase II development effort of the Advanced Process Engineering Co-Simulator (APECS). The objective of the project is to develop the tools to efficiently combine high-fidelity computational fluid dynamics (CFD) models with process modeling software. During the course of the project, a robust integration controller was developed that can be used in any CAPE-OPEN compliant process modeling environment. The controller mediates the exchange of information between the process modeling software and the CFD software. Several approaches to reducing the time disparity between CFD simulations and process modeling have been investigated and implemented. These include enabling the CFD models to be run on a remote cluster and enabling multiple CFD models to be run simultaneously. Furthermore, computationally fast reduced-order models (ROMs) have been developed that can be 'trained' using the results from CFD simulations and then used directly within flowsheets. Unit operation models (both CFD and ROMs) can be uploaded to a model database and shared between multiple users.
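The ROM idea can be sketched in a few lines: sample an expensive simulator at a handful of operating points, fit a cheap surrogate, and evaluate the surrogate inside the flowsheet. The toy "CFD" function and all numbers below are assumptions for illustration, not APECS internals:

```python
import numpy as np

def expensive_cfd(inlet_temp):
    # Stand-in for a CFD run: conversion vs. inlet temperature (assumed form)
    return 1.0 - np.exp(-inlet_temp / 500.0)

# A handful of "CFD" training runs across the operating range
train_T = np.linspace(300.0, 900.0, 7)
train_y = expensive_cfd(train_T)

# The reduced-order model: a cubic polynomial surrogate
rom = np.polynomial.Polynomial.fit(train_T, train_y, deg=3)

# Flowsheet query: the ROM answers instantly and stays close to the full model
T_query = 650.0
err = abs(rom(T_query) - expensive_cfd(T_query))
print(f"surrogate error at {T_query} K: {err:.2e}")
```

Real ROMs use richer bases (response surfaces, Kriging, neural networks) and multiple inputs, but the workflow — train offline on CFD results, evaluate online in the flowsheet — is the same.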
Larcombe, Wendy
2012-04-01
Jurisdictions in the United States, United Kingdom, and Australia now have laws that enable preventive detention of post-sentence sex offenders based on an assessment of the offender's likely recidivism. Measures of recidivism, or risk assessments, rely on the criminal justice process to produce the "pool" of sex offenders studied. This article argues that recidivism research needs to be placed in the context of attrition studies that document the disproportionate and patterned attrition of sexual offenses and sexual offenders from the criminal justice process. Understanding the common biases that affect criminal prosecution of sex offenses would improve sexual violence prevention policies.
NASA Technical Reports Server (NTRS)
Gorospe, George E., Jr.; Daigle, Matthew J.; Sankararaman, Shankar; Kulkarni, Chetan S.; Ng, Eley
2017-01-01
Prognostic methods enable operators and maintainers to predict the future performance of critical systems. However, these methods can be computationally expensive and may need to be performed each time new information about the system becomes available. In light of these computational requirements, we have investigated the application of graphics processing units (GPUs) as a computational platform for real-time prognostics. Recent advances in GPU technology have reduced cost and increased the computational capability of these highly parallel processing units, making them more attractive for the deployment of prognostic software. We present a survey of model-based prognostic algorithms with considerations for leveraging the parallel architecture of the GPU, and a case study of GPU-accelerated battery prognostics with computational performance results.
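One reason prognostics maps well onto GPUs is that Monte Carlo uncertainty propagation is embarrassingly parallel: each sample trajectory evolves independently. The sketch below uses NumPy vectorization as a stand-in for one-thread-per-sample GPU execution; the linear capacity-fade model and its parameters are illustrative assumptions, not the paper's battery model:

```python
import numpy as np

rng = np.random.default_rng(42)
n_samples = 10_000

capacity = np.full(n_samples, 1.0)            # normalized initial capacity
# Per-cycle capacity fade, sampled once per trajectory (assumed distribution;
# clipped so every trajectory eventually reaches end of life)
fade = np.clip(rng.normal(2e-4, 2e-5, n_samples), 1e-4, None)
eol_threshold = 0.8                           # end of life at 80% capacity

cycles = np.zeros(n_samples)
alive = capacity > eol_threshold
while alive.any():                            # all samples advance in lockstep
    capacity[alive] -= fade[alive]
    cycles[alive] += 1
    alive = capacity > eol_threshold

rul_median = float(np.median(cycles))
print(f"median predicted remaining useful life: {rul_median:.0f} cycles")
```

On a GPU the per-sample update becomes a kernel and the median a parallel reduction; the algorithmic structure is unchanged.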
Advancing perinatal patient safety through application of safety science principles using health IT.
Webb, Jennifer; Sorensen, Asta; Sommerness, Samantha; Lasater, Beth; Mistry, Kamila; Kahwati, Leila
2017-12-19
The use of health information technology (IT) has been shown to promote patient safety in Labor and Delivery (L&D) units. The use of health IT to apply safety science principles (e.g., standardization) to L&D unit processes may further advance perinatal safety. Semi-structured interviews were conducted with L&D units participating in the Agency for Healthcare Research and Quality's (AHRQ's) Safety Program for Perinatal Care (SPPC) to assess units' experience with program implementation. Analysis of interview transcripts was used to characterize the process and experience of using health IT for applying safety science principles to L&D unit processes. Forty-six L&D units from 10 states completed participation in SPPC program implementation; thirty-two (70%) reported the use of health IT as an enabling strategy for their local implementation. Health IT was used to improve standardization of processes, use of independent checks, and to facilitate learning from defects. L&D units standardized care processes through use of electronic health record (EHR)-based order sets and use of smart pumps and other technology to improve medication safety. Units also standardized EHR documentation, particularly related to electronic fetal monitoring (EFM) and shoulder dystocia. Cognitive aids and tools were integrated into EHR and care workflows to create independent checks such as checklists, risk assessments, and communication handoff tools. Units also used data from EHRs to monitor processes of care to learn from defects. Units experienced several challenges incorporating health IT, including obtaining organization approval, working with their busy IT departments, and retrieving standardized data from health IT systems. Use of health IT played an integral part in the planning and implementation of SPPC for participating L&D units. 
Use of health IT is an encouraging approach for incorporating safety science principles into care to improve perinatal safety and should be incorporated into materials to facilitate the implementation of perinatal safety initiatives.
Fodi, Tamas; Didaskalou, Christos; Kupai, Jozsef; Balogh, Gyorgy T; Huszthy, Peter; Szekely, Gyorgy
2017-09-11
Solvent usage in the pharmaceutical sector accounts for as much as 90 % of the overall mass during manufacturing processes. Consequently, solvent consumption poses significant costs and environmental burdens. Continuous processing, in particular continuous-flow reactors, has great potential for the sustainable production of pharmaceuticals, but subsequent downstream processing remains challenging. Separation processes for concentrating and purifying chemicals can account for as much as 80 % of the total manufacturing costs. In this work, a nanofiltration unit was coupled to a continuous-flow reactor for in situ solvent and reagent recycling. The nanofiltration unit is straightforward to implement and simple to control during continuous operation. The hybrid process operated continuously over six weeks, recycling about 90 % of the solvent and reagent. Consequently, the E-factor and the carbon footprint were reduced by 91 % and 19 %, respectively. Moreover, the nanofiltration unit led to a solution of the product eleven times more concentrated than the reaction mixture and increased the purity from 52.4 % to 91.5 %. The boundaries for process conditions were investigated to facilitate implementation of the methodology by the pharmaceutical sector. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
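The E-factor cited above (mass of waste generated per mass of product) is a simple ratio, and the effect of solvent recycling on it can be illustrated with a short calculation. The masses below are hypothetical round numbers chosen for illustration, not data from the study.

```python
# E-factor = total waste mass / mass of product (Sheldon's green-chemistry metric).
# Illustrative, hypothetical masses (kg per batch), not figures from the study.

def e_factor(total_input_kg, product_kg):
    """Mass of waste generated per kg of product."""
    return (total_input_kg - product_kg) / product_kg

# Without recycling: all solvent and reagent leave the process as waste.
baseline = e_factor(total_input_kg=100.0, product_kg=2.0)

# With ~90% of the solvent/reagent recycled, only ~10% counts as waste.
recycled_waste = (100.0 - 2.0) * 0.10
improved = recycled_waste / 2.0

reduction = 1 - improved / baseline
print(f"E-factor reduced by {reduction:.0%}")
```

With these toy numbers the E-factor drops from 49 to about 4.9, a 90 % reduction, which is the same order of improvement the authors report.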
Zhu, Xiang; Zhang, Dianwen
2013-01-01
We present a fast, accurate, and robust parallel Levenberg-Marquardt minimization optimizer, GPU-LMFit, implemented on a graphics processing unit (GPU) for high-performance, scalable parallel model fitting. GPU-LMFit can provide a dramatic speed-up in massive model fitting analyses, enabling real-time automated pixel-wise parametric imaging microscopy. We demonstrate the performance of GPU-LMFit for applications in super-resolution localization microscopy and fluorescence lifetime imaging microscopy. PMID:24130785
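The Levenberg-Marquardt update at the heart of such a fitter can be sketched for the simplest case, a single-parameter exponential decay. GPU-LMFit runs many independent fits of this kind in parallel on GPU threads; the serial pure-Python sketch below (all names are ours) only illustrates the damped Gauss-Newton update rule.

```python
import math

def lm_fit_decay(xs, ys, k=1.0, lam=1e-3, iters=50):
    """Fit y = exp(-k*x) with a one-parameter Levenberg-Marquardt loop."""
    def sse(kv):  # sum of squared residuals
        return sum((y - math.exp(-kv * x)) ** 2 for x, y in zip(xs, ys))
    for _ in range(iters):
        r = [y - math.exp(-k * x) for x, y in zip(xs, ys)]   # residuals
        J = [x * math.exp(-k * x) for x in xs]               # dr/dk
        jtj = sum(j * j for j in J)
        jtr = sum(j * ri for j, ri in zip(J, r))
        trial = k - jtr / (jtj + lam)     # damped Gauss-Newton step
        if sse(trial) < sse(k):
            k, lam = trial, lam * 0.5     # accept: relax damping
        else:
            lam *= 10.0                   # reject: increase damping
    return k

xs = [0.1 * i for i in range(1, 20)]
ys = [math.exp(-2.0 * x) for x in xs]     # noiseless data, true k = 2
k = lm_fit_decay(xs, ys)
```

The damping parameter `lam` interpolates between gradient descent (large `lam`) and Gauss-Newton (small `lam`), which is what makes the method robust far from the optimum yet fast near it.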
NASA Astrophysics Data System (ADS)
Pizette, Patrick; Govender, Nicolin; Wilke, Daniel N.; Abriak, Nor-Edine
2017-06-01
The use of the Discrete Element Method (DEM) for industrial civil engineering applications is currently limited by the computational demands when large numbers of particles are considered. The graphics processing unit (GPU), with its highly parallelized hardware architecture, shows potential to enable the solution of civil engineering problems using discrete granular approaches. We demonstrate in this study the practical utility of a validated GPU-enabled DEM modeling environment for simulating industrial-scale granular problems. As an illustration, the flow discharge of storage silos using 8 and 17 million particles is considered. DEM simulations have been performed to investigate the influence of particle size (equivalent size for the 20/40-mesh gravel) and induced shear stress for two hopper shapes. The preliminary results indicate that the shape of the hopper significantly influences the discharge rates for the same material. Specifically, this work shows that GPU-enabled DEM modeling environments can model industrial-scale problems on a single portable computer within a day for 30 seconds of process time.
Accelerated design of bioconversion processes using automated microscale processing techniques.
Lye, Gary J; Ayazi-Shamlou, Parviz; Baganz, Frank; Dalby, Paul A; Woodley, John M
2003-01-01
Microscale processing techniques are rapidly emerging as a means to increase the speed of bioprocess design and reduce material requirements. Automation of these techniques can reduce labour intensity and enable a wider range of process variables to be examined. This article examines recent research on various individual microscale unit operations including microbial fermentation, bioconversion and product recovery techniques. It also explores the potential of automated whole process sequences operated in microwell formats. The power of the whole process approach is illustrated by reference to a particular bioconversion, namely the Baeyer-Villiger oxidation of bicyclo[3.2.0]hept-2-en-6-one for the production of optically pure lactones.
Application of agent-based system for bioprocess description and process improvement.
Gao, Ying; Kipling, Katie; Glassey, Jarka; Willis, Mark; Montague, Gary; Zhou, Yuhong; Titchener-Hooker, Nigel J
2010-01-01
Modeling plays an important role in bioprocess development for design and scale-up. Predictive models can also be used in biopharmaceutical manufacturing to assist decision-making either to maintain process consistency or to identify optimal operating conditions. To predict the whole bioprocess performance, the strong interactions present in a processing sequence must be adequately modeled. Traditionally, bioprocess modeling considers process units separately, which makes it difficult to capture the interactions between units. In this work, a systematic framework is developed to analyze the bioprocesses based on a whole process understanding and considering the interactions between process operations. An agent-based approach is adopted to provide a flexible infrastructure for the necessary integration of process models. This enables the prediction of overall process behavior, which can then be applied during process development or once manufacturing has commenced, in both cases leading to the capacity for fast evaluation of process improvement options. The multi-agent system comprises a process knowledge base, process models, and a group of functional agents. In this system, agent components co-operate with each other in performing their tasks. These include the description of the whole process behavior, evaluating process operating conditions, monitoring of the operating processes, predicting critical process performance, and providing guidance to decision-making when coping with process deviations. During process development, the system can be used to evaluate the design space for process operation. During manufacture, the system can be applied to identify abnormal process operation events and then to provide suggestions as to how best to cope with the deviations. In all cases, the function of the system is to ensure an efficient manufacturing process. 
The implementation of the agent-based approach is illustrated via selected application scenarios, which demonstrate how such a framework may enable the better integration of process operations by providing a plant-wide process description to facilitate process improvement. Copyright 2009 American Institute of Chemical Engineers
NASA Technical Reports Server (NTRS)
Hall, William A.
1990-01-01
Slave microprocessors in a multimicroprocessor computing system contain modified circuit cards programmed via a bus connecting the master processor with the slave microprocessors. This arrangement enables interactive, microprocessor-based, single-loop control and confers the ability to load and run programs from the master/slave bus, without need for a microprocessor development station. Tristate buffers latch all data and status information. The slave central processing unit is never connected directly to the bus.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oxberry, Geoffrey
Google Test MPI Listener is a plugin for the Google Test C++ unit testing library that organizes the test output of software using both the MPI parallel programming model and Google Test. Typically, such output is arbitrarily ordered and disorganized, making test results difficult to interpret. This plugin organizes output in MPI rank order, enabling easy interpretation of test results.
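The core idea, buffering each rank's output and re-emitting it sorted by rank, can be sketched independently of the plugin itself. The Python analogue below is illustrative only; the actual plugin is a C++ Google Test event listener, and all names here are ours.

```python
# Python analogue of the plugin's core idea: test output arriving from MPI
# ranks in arbitrary interleaved order is buffered and re-emitted by rank.

def order_by_rank(records):
    """records: iterable of (rank, line) pairs in arrival order."""
    by_rank = {}
    for rank, line in records:
        by_rank.setdefault(rank, []).append(line)
    out = []
    for rank in sorted(by_rank):          # emit in MPI rank order
        out.append(f"--- rank {rank} ---")
        out.extend(by_rank[rank])
    return out

# Output from three ranks arriving out of order:
arrived = [(2, "[ OK ] SumTest"), (0, "[ OK ] SumTest"), (1, "[FAIL] SumTest")]
lines = order_by_rank(arrived)
print("\n".join(lines))
```

Grouping before printing is what turns an interleaved, nondeterministic log into a deterministic, per-rank report.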
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-30
... products designed to meet new customer needs for access to postage. In addition, changes within the United... opportunities for PES providers to propose new concepts, methods, and processes to enable customers to print pre... support the USPS PES Test and Evaluation Program (the ``Program''). The intent is for the volumes to fully...
Thinking Science: A Way to Change Teacher Practice in Order to Raise Students' Ability to Think
ERIC Educational Resources Information Center
Hueppauff, Sonia
2016-01-01
This article describes key facets of the Cognitive Acceleration through Science Education (CASE), a curriculum that emerged in the United Kingdom, enabling teachers to accelerate the process of cognitive development so that more students could attain the higher-order thinking skills (formal operational thinking) required (Lecky, 2012). CASE, also…
Preparing Science Specific Mentors: A Look at One Successful Georgia Program.
ERIC Educational Resources Information Center
Upson, Leslie; Koballa, Thomas; Gerber, Brian
The state of Georgia has developed the Teacher Support Specialist Program to assist prospective mentors as they begin the process of preparing to provide support and guidance to those new to the profession. Successful completion of this program for either staff development units or college credit enables Georgia teachers to add the teacher support…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, S. L.
1998-08-25
Fluid Catalytic Cracking (FCC) technology is the most important process used by the refinery industry to convert crude oil to valuable lighter products such as gasoline. Process development is generally very time consuming, especially when a small pilot unit is being scaled up to a large commercial unit, because of the lack of information to aid in the design of scaled-up units. Such information can now be obtained by analysis based on pilot-scale measurements and computer simulation that includes the controlling physics of the FCC system. A computational fluid dynamics (CFD) code, ICRKFLO, has been developed at Argonne National Laboratory (ANL) and has been successfully applied to the simulation of catalytic petroleum cracking risers. It employs hybrid hydrodynamic-chemical kinetic coupling techniques, enabling the analysis of an FCC unit with complex chemical reaction sets containing tens or hundreds of subspecies. The code has been continuously validated against pilot-scale experimental data. It is now being used to investigate the effects of scaled-up FCC units. Among FCC operating conditions, the feed injection conditions are found to have a strong impact on the product yields of scaled-up FCC units. The feed injection conditions appear to affect flow and heat transfer patterns, and the interaction of hydrodynamics and cracking kinetics causes the product yields to change accordingly.
Body area network--a key infrastructure element for patient-centered telemedicine.
Norgall, Thomas; Schmidt, Robert; von der Grün, Thomas
2004-01-01
The Body Area Network (BAN) extends the range of existing wireless network technologies with an ultra-low range, ultra-low power network solution optimized for long-term or continuous healthcare applications. It enables wireless radio communication between several miniaturized, intelligent Body Sensor (or actor) Units (BSU) and a single Body Central Unit (BCU) worn on the human body. A separate wireless transmission link from the BCU to a network access point, using different technology, provides online access to BAN components via the usual network infrastructure. The BAN network protocol maintains dynamic ad-hoc network configuration scenarios and co-existence of multiple networks. BAN is expected to become a basic infrastructure element for electronic health services: by integrating patient-attached sensors, mobile actor units, and distributed information and data processing systems, the range of medical workflow can be extended to include applications such as wireless multi-parameter patient monitoring and therapy support. Beyond clinical use and professional disease management environments, private personal health assistance scenarios (without financial reimbursement by health agencies / insurance companies) enable a wide range of applications and services in future pervasive computing and networking environments.
NASA Astrophysics Data System (ADS)
Leung, Nelson; Abdelhafez, Mohamed; Koch, Jens; Schuster, David
2017-04-01
We implement a quantum optimal control algorithm based on automatic differentiation and harness the acceleration afforded by graphics processing units (GPUs). Automatic differentiation allows us to specify advanced optimization criteria and incorporate them in the optimization process with ease. We show that the use of GPUs can speed up calculations by more than an order of magnitude. Our strategy facilitates efficient numerical simulations on affordable desktop computers and exploration of a host of optimization constraints and system parameters relevant to real-life experiments. We demonstrate optimization of quantum evolution based on fine-grained evaluation of performance at each intermediate time step, thus enabling more intricate control on the evolution path, suppression of departures from the truncated model subspace, as well as minimization of the physical time needed to perform high-fidelity state preparation and unitary gates.
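The essence of gradient-based pulse optimization is climbing a fidelity landscape with respect to control parameters. A toy single-angle version is sketched below; the paper differentiates through the full time evolution with automatic differentiation on GPUs, whereas this sketch substitutes a finite-difference gradient purely to keep the illustration self-contained (all names are ours).

```python
import math

# Toy pulse optimization: maximize |<target|U(theta)|psi0>|^2 for a single
# rotation angle. With psi0 = |0>, U(theta) a real rotation, and target = |1>,
# the fidelity reduces to sin(theta)^2, maximized at theta = pi/2.

def fidelity(theta):
    return math.sin(theta) ** 2

def optimize(theta=0.3, lr=0.5, steps=200, h=1e-6):
    for _ in range(steps):
        # finite-difference gradient stands in for automatic differentiation
        grad = (fidelity(theta + h) - fidelity(theta - h)) / (2 * h)
        theta += lr * grad               # gradient ascent on fidelity
    return theta

theta = optimize()
```

The same loop structure carries over to realistic problems; what changes is that `fidelity` becomes a simulated time evolution over many control amplitudes, which is where GPU-accelerated automatic differentiation pays off.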
GR712RC- Dual-Core Processor- Product Status
NASA Astrophysics Data System (ADS)
Sturesson, Fredrik; Habinc, Sandi; Gaisler, Jiri
2012-08-01
The GR712RC System-on-Chip (SoC) is a dual core LEON3FT system suitable for advanced high reliability space avionics. Fault tolerance features from Aeroflex Gaisler’s GRLIB IP library and an implementation using the Ramon Chips RadSafe cell library enable superior radiation hardness. The GR712RC device has been designed to provide high processing power by including two LEON3FT 32-bit SPARC V8 processors, each with its own high-performance IEEE 754 compliant floating-point unit and SPARC reference memory management unit. This high processing power is combined with a large number of serial interfaces, ranging from high-speed links for data transfers to low-speed control buses for commanding and status acquisition.
Added value in health care with six sigma.
Lenaz, Maria P
2004-06-01
Six sigma is the structured application of the tools and techniques of quality management on a project basis that can enable organizations to achieve superior performance and strategic business results. The Greek character sigma is used as a statistical term that measures how much a process varies from perfection, based on the number of defects per million units. Health care organizations using this model proceed from the lower levels of quality performance to the highest level, in which the process is nearly error free.
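The defects-per-million metric maps to a sigma level through the normal distribution, conventionally including a 1.5-sigma long-term shift. The following is a worked illustration of that standard arithmetic, not material from the article itself.

```python
from statistics import NormalDist

# Convert defects per million opportunities (DPMO) to a sigma level,
# using the customary 1.5-sigma long-term shift of six-sigma practice.

def sigma_level(dpmo):
    yield_fraction = 1 - dpmo / 1_000_000   # fraction of defect-free units
    return NormalDist().inv_cdf(yield_fraction) + 1.5

# "Six sigma" quality corresponds to 3.4 defects per million opportunities.
level = sigma_level(3.4)
print(f"{level:.2f}")  # ~6.00
```

Running the same function on, say, 66,807 DPMO gives roughly 3.0, the level at which many processes operate before improvement projects begin.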
Fischer-Baum, Simon; Englebretson, Robert
2016-08-01
Reading relies on the recognition of units larger than single letters and smaller than whole words. Previous research has linked sublexical structures in reading to properties of the visual system, specifically the parallel processing of letters that the visual system enables. But whether the visual system is essential for this to happen, or whether the recognition of sublexical structures may emerge by other means, is an open question. To address this question, we investigate braille, a writing system that relies exclusively on the tactile rather than the visual modality. We provide experimental evidence demonstrating that adult readers of (English) braille are sensitive to sublexical units. Contrary to prior assumptions in the braille research literature, we find strong evidence that braille readers do indeed access sublexical structure, namely the processing of multi-cell contractions as single orthographic units and the recognition of morphemes within morphologically-complex words. Therefore, we conclude that the recognition of sublexical structure is not exclusively tied to the visual system. However, our findings also suggest that there are aspects of morphological processing on which braille and print readers differ, and that these differences may, crucially, be related to reading using the tactile rather than the visual sensory modality. Copyright © 2016 Elsevier B.V. All rights reserved.
2006-02-18
KENNEDY SPACE CENTER, FLA. - In NASA Kennedy Space Center's Orbiter Processing Facility bay 3, United Space Alliance shuttle technicians remove the hard cover from a window on Space Shuttle Discovery to enable STS-121 crew members to inspect the window from the cockpit. Launch of Space Shuttle Discovery on mission STS-121, the second return-to-flight mission, is scheduled no earlier than May.
NASA Astrophysics Data System (ADS)
Parra, Pablo; da Silva, Antonio; Polo, Óscar R.; Sánchez, Sebastián
2018-02-01
In this day and age, successful embedded critical software needs agile and continuous development and testing procedures. This paper presents the overall testing and code coverage metrics obtained during the unit testing procedure carried out to verify the correctness of the boot software that will run in the Instrument Control Unit (ICU) of the Energetic Particle Detector (EPD) on board Solar Orbiter. The ICU boot software is a critical part of the project, so its verification must be addressed at an early development stage; any test case missed in this process may affect the quality of the overall on-board software. According to the European Cooperation for Space Standardization (ECSS) standards, testing this kind of critical software must cover 100% of the source code statement and decision paths. This leads to the complete testing of fault tolerance and recovery mechanisms that have to resolve every possible memory corruption or communication error brought about by the space environment. The introduced procedure enables fault injection from the beginning of the development process and makes it possible to fulfill the demanding code coverage requirements on the boot software.
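Fault injection of the kind described, forcing the error-recovery branch so that coverage reaches it, can be sketched with standard mocking facilities. The boot routine and every name below are invented for illustration; this is not the ICU flight software, which is not written in Python.

```python
from unittest import mock

# Sketch of fault injection in a unit test: a hypothetical boot routine
# retries a corrupted memory read from a redundant copy, and the test
# injects the corruption so the recovery branch is executed and covered.

def load_image(read_word, checksum_ok):
    """Read a boot-image word; fall back to the redundant copy on bad checksum."""
    word = read_word(primary=True)
    if checksum_ok(word):
        return word
    return read_word(primary=False)      # recovery path exercised by the fault

# Inject the fault: primary read returns corrupted data, redundant copy is good.
reads = mock.Mock(side_effect=[0xDEAD, 0xBEEF])
result = load_image(lambda primary: reads(), lambda w: w == 0xBEEF)
print(hex(result))  # prints 0xbeef (recovery branch taken)
```

Without the injected fault the `return read_word(primary=False)` statement would never execute, leaving a decision path uncovered, which is exactly what 100 % statement and decision coverage forbids.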
Massively parallel processor computer
NASA Technical Reports Server (NTRS)
Fung, L. W. (Inventor)
1983-01-01
An apparatus for processing multidimensional data with strong spatial characteristics, such as raw image data, characterized by a large number of parallel data streams in an ordered array is described. It comprises a large number (e.g., 16,384 in a 128 x 128 array) of parallel processing elements operating simultaneously and independently on single bit slices of a corresponding array of incoming data streams under control of a single set of instructions. Each of the processing elements comprises a bidirectional data bus in communication with a register for storing single bit slices together with a random access memory unit and associated circuitry, including a binary counter/shift register device, for performing logical and arithmetical computations on the bit slices, and an I/O unit for interfacing the bidirectional data bus with the data stream source. The massively parallel processor architecture enables very high speed processing of large amounts of ordered parallel data, including spatial translation by shifting or sliding of bits vertically or horizontally to neighboring processing elements.
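The "sliding" of bits to neighboring processing elements can be modeled abstractly: every element of a grid simultaneously passes its bit one position over. The sketch below is a toy software model of that data movement, not a description of the MPP's actual hardware, and the function name is ours.

```python
# Toy model of a SIMD array "slide": every processing element holds one bit
# and passes it to its eastern neighbor in lockstep; the west edge fills with 0.

def shift_east(grid):
    """All PEs pass their bit one column east simultaneously."""
    return [[0] + row[:-1] for row in grid]

# A tiny 3x3 "image" of single-bit pixels:
image = [
    [1, 0, 0],
    [0, 1, 0],
    [0, 0, 1],
]
shifted = shift_east(image)
```

In the real machine this happens for all 16,384 elements in one instruction cycle; combined with vertical shifts it gives each processing element access to its neighbors' data for spatial operations.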
NASA Astrophysics Data System (ADS)
Min, Jae-Hong; Gelo, Nikolas J.; Jo, Hongki
2016-04-01
The newly developed smartphone application, named RINO, developed in this study allows measuring absolute dynamic displacements and processing them in real time using state-of-the-art smartphone technologies, such as a high-performance graphics processing unit (GPU), in addition to an already powerful CPU and memory, an embedded high-speed/high-resolution camera, and open-source computer vision libraries. A carefully designed color-patterned target and a user-adjustable crop filter enable accurate and fast image processing, allowing up to 240 fps for complete displacement calculation and real-time display. The performance of the developed smartphone application is experimentally validated, showing accuracy comparable with that of a conventional laser displacement sensor.
GPU MrBayes V3.1: MrBayes on Graphics Processing Units for Protein Sequence Data.
Pang, Shuai; Stones, Rebecca J; Ren, Ming-Ming; Liu, Xiao-Guang; Wang, Gang; Xia, Hong-ju; Wu, Hao-Yang; Liu, Yang; Xie, Qiang
2015-09-01
We present a modified GPU (graphics processing unit) version of MrBayes, called ta(MC)³ (GPU MrBayes V3.1), for Bayesian phylogenetic inference on protein data sets. Our main contributions are 1) utilizing 64-bit variables, thereby enabling ta(MC)³ to process larger data sets than MrBayes; and 2) using Kahan summation to improve accuracy, convergence rates, and consequently runtime. Versus the current fastest software, we achieve a speedup of up to around 2.5 (and up to around 90 vs. serial MrBayes), and more on multi-GPU hardware. GPU MrBayes V3.1 is available from http://sourceforge.net/projects/mrbayes-gpu/. © The Author 2015. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
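Kahan (compensated) summation, the accuracy technique named above, is short enough to show in full. The sketch below demonstrates the algorithm itself in plain Python; GPU MrBayes applies it to likelihood accumulations on the GPU.

```python
# Kahan compensated summation: a running carry term recovers the low-order
# bits that plain floating-point addition discards at each step.

def kahan_sum(values):
    total, carry = 0.0, 0.0
    for x in values:
        y = x - carry            # re-inject previously lost low-order bits
        t = total + y
        carry = (t - total) - y  # what this addition just lost
        total = t
    return total

vals = [0.1] * 1000
naive = sum(vals)                # accumulates rounding error term by term
kahan = kahan_sum(vals)          # error stays at machine-precision level
```

The error of the compensated sum is bounded independently of the number of terms, whereas the naive sum's error grows with the length of the sequence, which matters when millions of per-site likelihood terms are accumulated.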
Dale, Simeon; Levi, Christopher; Ward, Jeanette; Grimshaw, Jeremy M; Jammali-Blasi, Asmara; D'Este, Catherine; Griffiths, Rhonda; Quinn, Clare; Evans, Malcolm; Cadilhac, Dominique; Cheung, N Wah; Middleton, Sandy
2015-02-01
The Quality in Acute Stroke Care (QASC) trial evaluated systematic implementation of clinical treatment protocols to manage fever, sugar, and swallow (FeSS protocols) in acute stroke care. This cluster-randomised controlled trial was conducted in 19 stroke units in Australia. To describe perceived barriers and enablers preimplementation to the introduction of the FeSS protocols and, postimplementation, to determine which of these barriers eventuated as actual barriers. Preimplementation: Workshops were held at the intervention stroke units (n = 10). The first workshop involved senior clinicians who identified perceived barriers and enablers to implementation of the protocols, the second workshop involved bedside clinicians. Postimplementation, an online survey with stroke champions from intervention sites was conducted. A total of 111 clinicians attended the preimplementation workshops, identifying 22 barriers covering four main themes: (a) need for new policies, (b) limited workforce (capacity), (c) lack of equipment, and (d) education and logistics of training staff. Preimplementation enablers identified were: support by clinical champions, medical staff, nursing management and allied health staff; easy adaptation of current protocols, care-plans, and local policies; and presence of specialist stroke unit staff. Postimplementation, only five of the 22 barriers identified preimplementation were reported as actual barriers to adoption of the FeSS protocols, namely, no previous use of insulin infusions; hyperglycaemic protocols could not be commenced without written orders; medical staff reluctance to use the ASSIST swallowing screening tool; poor level of engagement of medical staff; and doctors' unawareness of the trial. The process of identifying barriers and enablers preimplementation allowed staff to take ownership and to address barriers and plan for change. 
As only five of the 22 barriers identified preimplementation were reported to be actual barriers at completion of the trial, this suggests that barriers are often overcome whilst some are only ever perceived rather than actual barriers. © 2015 Sigma Theta Tau International.
Graphics Processing Unit (GPU) Acceleration of the Goddard Earth Observing System Atmospheric Model
NASA Technical Reports Server (NTRS)
Putnam, Williama
2011-01-01
The Goddard Earth Observing System 5 (GEOS-5) is the atmospheric model used by the Global Modeling and Assimilation Office (GMAO) for a variety of applications, from long-term climate prediction at relatively coarse resolution, to data assimilation and numerical weather prediction, to very high-resolution cloud-resolving simulations. GEOS-5 is being ported to a graphics processing unit (GPU) cluster at the NASA Center for Climate Simulation (NCCS). By utilizing GPU co-processor technology, we expect to increase the throughput of GEOS-5 by at least an order of magnitude, and accelerate the process of scientific exploration across all scales of global modeling, including: the large-scale, high-end application of non-hydrostatic, global, cloud-resolving modeling at 10- to 1-kilometer (km) global resolutions; intermediate-resolution seasonal climate and weather prediction at 50- to 25-km on small clusters of GPUs; and long-range, coarse-resolution climate modeling, enabled on a small box of GPUs for the individual researcher. After being ported to the GPU cluster, the primary physics components and the dynamical core of GEOS-5 have demonstrated a potential speedup of 15-40 times over conventional processor cores. Performance improvements of this magnitude reduce the required scalability of 1-km, global, cloud-resolving models from an unfathomable 6 million cores to an attainable 200,000 GPU-enabled cores.
NASA Astrophysics Data System (ADS)
Kukkonen, S.; Kostama, V.-P.
2018-05-01
The availability of very high-resolution images has made it possible to extend crater size-frequency distribution studies to small, deca/hectometer-scale craters. This has enabled the dating of small and young surface units, as well as of recent, short-duration and small-scale geologic processes that have occurred on the units. Usually, however, the higher the spatial resolution of space images, the smaller the area they cover. Thus the use of single, very high-resolution images in crater count age determination may be debatable if the images do not cover the studied region entirely. Here we compare the crater count results for the floor of the Harmakhis Vallis outflow channel obtained from images of the Context Camera (CTX) and the High Resolution Imaging Science Experiment (HiRISE) aboard the Mars Reconnaissance Orbiter (MRO). The CTX images enable crater counts for entire units on the Harmakhis Vallis main valley, whereas the coverage of the higher-resolution HiRISE images is limited and thus these images can only be used to date small parts of the units. Our case study shows that the crater count data based on small impact craters and small surface areas mainly correspond with the crater count data based on larger craters and more extensive counting areas on the same unit. If differences between the results were found, they could usually be explained by the regional geology. Usually, these differences appeared when at least one cratering model age was missing from either of the crater datasets. On the other hand, we found only a few cases in which the cratering model ages were completely different. We conclude that crater counts using small impact craters on small counting areas provide useful information about the geological processes which have modified the surface.
However, it is important to remember that all the crater counts results obtained from a specific counting area always primarily represent the results from the counting area-not the whole unit. On the other hand, together with crater count results from extensive counting areas and lower-resolution images, crater counts on small counting areas but by using very high-resolution images is a very valuable tool for obtaining unique additional information about the local processes on the surface units.
Army Logistician. Volume 38, Issue 4, July-August 2006
2006-08-01
Effective joint logistics depends on clear roles, accountabilities, and relationships among the global players within the joint logistics... well-understood roles and accountabilities of the players involved in those processes, and shared JFC metrics shape this enabler. Domain-wide...
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCaskey, Alexander J.
There is a lack of state-of-the-art HPC simulation tools for simulating general quantum computing. Furthermore, there are no real software tools that integrate current quantum computers into existing classical HPC workflows. This product, the Quantum Virtual Machine (QVM), solves this problem by providing an extensible framework for pluggable virtual, or physical, quantum processing units (QPUs). It enables the execution of low-level quantum assembly codes and returns the results of such executions.
Plowden, K O; Wenger, A F
2001-01-01
African Americans are facing a serious health crisis. They are disproportionately affected by most chronic illnesses. The disparity among ethnic groups as it relates to health and illness is related to psychosocial and biological factors within the African American culture. African Americans are sometimes reluctant to participate in studies. This article discusses the process of creating a caring community when conducting research within an African American community, based on the authors' experience with two faith communities in a southern metropolitan area in the United States. The process is identified as unknowing, reflection, presence, and knowing. The process is based on Leininger's theory of culture care diversity and universality and her stranger-to-friend enabler. When the theory and method are used, the investigator moves from being a stranger within the community to a trusted friend and begins to collect rich and valuable data for analysis from the informants' point of view.
Willmott, Jon R.; Mims, Forrest M.; Parisi, Alfio V.
2018-01-01
Smartphones are playing an increasing role in the sciences, owing to the ubiquitous proliferation of these devices, their relatively low cost, increasing processing power, and their suitability for integrated data acquisition and processing in a ‘lab in a phone’ capacity. There is furthermore the potential to deploy these units as nodes within Internet of Things architectures, enabling massive networked data capture. Hitherto, considerable attention has been focused on imaging applications of these devices. However, within just the last few years, another possibility has emerged: to use smartphones as a means of capturing spectra, mostly by coupling various classes of fore-optics to these units, with data capture achieved using the smartphone camera. These highly novel approaches have the potential to become widely adopted across a broad range of scientific (e.g., biomedical, chemical, and agricultural) application areas. In this review, we detail the exciting recent development of smartphone spectrometer hardware, in addition to covering applications to which these units have been deployed hitherto. The paper also points forward to the potentially highly influential impacts that such units could have on the sciences in the coming decades. PMID:29342899
Sustaining a culture of practice development in an acute adolescent inpatient mental health unit.
Vella, Natalie; Page, Laura; Edwards, Clair; Wand, Timothy
2014-08-01
It is recognized that facilitating change in workplace culture is a significant challenge in healthcare service delivery. Practice development strategies and principles provide a framework for initiating and sustaining programs focused on enhancing patient-centered care by concentrating on the therapeutic attributes of nursing. However, little literature exists on explicating "what worked" in practice development programs. This paper details the processes, people, resources, and relationships that enabled the successful implementation, and led to the sustainability, of a practice development program employed in an acute adolescent mental health unit in Sydney, Australia. Following an external review of the unit, a meeting of key stakeholders was convened and subsequently an advisory panel formed to address specific issues facing nursing staff. This process resulted in the development of an educational package and adoption of the tidal model as the framework for mental health nursing practice in the unit. Clinical reasoning sessions and journal article presentations were incorporated to consolidate and maintain the change in nursing care. A planned, structured, and inclusive practice development program has transformed the nursing culture and vastly improved the care provided to adolescents presenting in acute states of distress to this mental health unit. © 2014 Wiley Periodicals, Inc.
Two schemes for rapid generation of digital video holograms using PC cluster
NASA Astrophysics Data System (ADS)
Park, Hanhoon; Song, Joongseok; Kim, Changseob; Park, Jong-Il
2017-12-01
Computer-generated holography (CGH), which is a process of generating digital holograms, is computationally expensive. Recently, several methods/systems of parallelizing the process using graphic processing units (GPUs) have been proposed. Indeed, use of multiple GPUs or a personal computer (PC) cluster (each PC with GPUs) enabled great improvements in the process speed. However, extant literature has less often explored systems involving rapid generation of multiple digital holograms and specialized systems for rapid generation of a digital video hologram. This study proposes a system that uses a PC cluster and is able to more efficiently generate a video hologram. The proposed system is designed to simultaneously generate multiple frames and accelerate the generation by parallelizing the CGH computations across a number of frames, as opposed to separately generating each individual frame while parallelizing the CGH computations within each frame. The proposed system also enables the subprocesses for generating each frame to execute in parallel through multithreading. With these two schemes, the proposed system significantly reduced the data communication time for generating a digital hologram when compared with that of the state-of-the-art system.
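The frame-level parallelization scheme described above can be sketched as follows. This is a minimal illustration that uses a simplified point-source CGH model and Python's `multiprocessing` in place of a GPU-equipped PC cluster; all function names, the optical parameters, and the phase-only hologram model are assumptions for the example, not the paper's system:

```python
from multiprocessing import Pool

import numpy as np


def generate_frame_hologram(args):
    """Compute one frame's hologram: superpose point-source phase
    contributions on the hologram plane (simplified Fresnel model)."""
    points, shape, wavelength, pitch = args
    k = 2 * np.pi / wavelength
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]]
    field = np.zeros(shape, dtype=complex)
    for px, py, pz, amp in points:
        r = np.sqrt((xs * pitch - px) ** 2 + (ys * pitch - py) ** 2 + pz ** 2)
        field += amp * np.exp(1j * k * r) / r
    return np.angle(field)  # phase-only hologram for this frame


def generate_video_hologram(frames, shape=(64, 64),
                            wavelength=633e-9, pitch=10e-6, workers=4):
    # Frame-level parallelism: each worker generates whole frames,
    # rather than splitting one frame's CGH computation across workers.
    tasks = [(pts, shape, wavelength, pitch) for pts in frames]
    with Pool(workers) as pool:
        return pool.map(generate_frame_hologram, tasks)
```

The design choice mirrors the paper's first scheme: distributing whole frames to workers avoids per-frame synchronization and reduces inter-node data communication relative to splitting each frame.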
Influence of White and Gray Matter Connections on Endogenous Human Cortical Oscillations
Hawasli, Ammar H.; Kim, DoHyun; Ledbetter, Noah M.; Dahiya, Sonika; Barbour, Dennis L.; Leuthardt, Eric C.
2016-01-01
Brain oscillations reflect changes in electrical potentials summated across neuronal populations. Low- and high-frequency rhythms have different modulation patterns. Slower rhythms are spatially broad, while faster rhythms are more local. From this observation, we hypothesized that low- and high-frequency oscillations reflect white- and gray-matter communications, respectively, and that synchronization of low-frequency phase with high-frequency amplitude represents a mechanism enabling distributed brain networks to coordinate local processing. Testing this common understanding, we selectively disrupted white or gray matter connections to human cortex while recording surface field potentials. Counter to our original hypotheses, we found that cortex consists of independent oscillatory-units (IOUs) that maintain their own complex endogenous rhythm structure. IOUs are differentially modulated by white and gray matter connections. White-matter connections maintain topographical anatomic heterogeneity (i.e., separable processing in cortical space) and gray-matter connections segregate cortical synchronization patterns (i.e., separable temporal processing through phase-power coupling). Modulation of distinct oscillatory modules enables the functional diversity necessary for complex processing in the human brain. PMID:27445767
NASA Astrophysics Data System (ADS)
Sewell, Stephen
This thesis introduces a software framework that effectively utilizes low-cost commercially available Graphic Processing Units (GPUs) to simulate complex scientific plasma phenomena that are modeled using the Particle-In-Cell (PIC) paradigm. The software framework that was developed conforms to the Compute Unified Device Architecture (CUDA), a standard for general purpose graphic processing that was introduced by NVIDIA Corporation. This framework has been verified for correctness and applied to advance the state of understanding of the electromagnetic aspects of the development of the Aurora Borealis and Aurora Australis. For each phase of the PIC methodology, this research has identified one or more methods to exploit the problem's natural parallelism and effectively map it for execution on the graphic processing unit and its host processor. The sources of overhead that can reduce the effectiveness of parallelization for each of these methods have also been identified. One of the novel aspects of this research was the utilization of particle sorting during the grid interpolation phase. The final representation resulted in simulations that executed about 38 times faster than simulations that were run on a single-core general-purpose processing system. The scalability of this framework to larger problem sizes and future generation systems has also been investigated.
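The particle-sorting idea used during the grid interpolation phase can be illustrated with a minimal NumPy sketch rather than the thesis's CUDA implementation. Sorting particles by cell index makes writes to each grid cell contiguous in memory, which is what makes the deposition step amenable to efficient parallel execution on a GPU; the 1D nearest-grid-point scheme and function name below are illustrative assumptions:

```python
import numpy as np


def deposit_charge_sorted(positions, charges, grid_size, dx):
    """Nearest-grid-point charge deposition with particles pre-sorted
    by cell index, so contributions to each grid cell are contiguous."""
    cells = np.clip((positions / dx).astype(int), 0, grid_size - 1)
    order = np.argsort(cells, kind='stable')  # sort particles by cell
    cells, charges = cells[order], charges[order]
    rho = np.zeros(grid_size)
    np.add.at(rho, cells, charges)  # scatter-add charge into the grid
    return rho
```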
Coordinated traffic incident management using the I-Net embedded sensor architecture
NASA Astrophysics Data System (ADS)
Dudziak, Martin J.
1999-01-01
The I-Net intelligent embedded sensor architecture enables the reconfigurable construction of wide-area remote sensing and data collection networks employing diverse processing and data acquisition modules communicating over thin-server/thin-client protocols. Adapted initially for operation using mobile remotely-piloted vehicle platforms such as the small helicopter robots Hornet and Ascend-I, the I-Net architecture lends itself to a critical problem in the management of both spontaneous and planned traffic congestion and rerouting over major interstate thoroughfares such as the I-95 Corridor. Pre-programmed flight plans and ad hoc operator-assisted navigation of the lightweight helicopter, using an auto-pilot and gyroscopic stabilization augmentation units, allow daytime or nighttime over-the-horizon flights of the unit to collect and transmit real-time video imagery that may be stored or transmitted to other locations. With on-board GPS and ground-based pattern recognition capabilities to augment the standard video collection process, this approach enables traffic management and emergency response teams to plan and assist in real time in the adjustment of traffic flows in high-density or congested areas or during dangerous road conditions such as ice, snow, and hurricane storms. The I-Net architecture allows for integration of land-based and roadside sensors within a comprehensive automated traffic management system, with communications to and from an airborne or other platform to devices in the network other than human-operated desktop computers, thereby allowing more rapid assimilation and response for critical data. Experiments have been conducted using several modified platforms and standard video and still photographic equipment.
Current research and development is focused upon modification of the modular instrumentation units in order to accommodate faster loading and reloading of equipment onto the RPV, extension of the I-Net architecture to enable RPV-to-RPV signaling and control, and refinement of safety and emergency mechanisms to handle RPV mechanical failure during flight.
An end-to-end communications architecture for condition-based maintenance applications
NASA Astrophysics Data System (ADS)
Kroculick, Joseph
2014-06-01
This paper explores challenges in implementing an end-to-end communications architecture for Condition-Based Maintenance Plus (CBM+) data transmission which aligns with the Army's Network Modernization Strategy. The Army's Network Modernization strategy is based on rolling out network capabilities which connect the smallest unit and Soldier level to enterprise systems. CBM+ is a continuous improvement initiative over the life cycle of a weapon system or equipment to improve the reliability and maintenance effectiveness of Department of Defense (DoD) systems. CBM+ depends on the collection, processing and transport of large volumes of data. An important capability that enables CBM+ is an end-to-end network architecture that enables data to be uploaded from the platform at the tactical level to enterprise data analysis tools. To connect end-to-end maintenance processes in the Army's supply chain, a CBM+ network capability can be developed from available network capabilities.
Enabling technologies built on a sonochemical platform: challenges and opportunities.
Cintas, Pedro; Tagliapietra, Silvia; Caporaso, Marina; Tabasso, Silvia; Cravotto, Giancarlo
2015-07-01
Scientific and technological progress now occurs at the interface between two or more scientific and technical disciplines, and chemistry is intertwined with almost all scientific domains. Complementary and synergistic effects have been found in the overlap between sonochemistry and other enabling technologies such as mechanochemistry, microwave chemistry and flow chemistry. Although their nature and effects are intrinsically different, these techniques share the ability to significantly activate most chemical processes and produce peculiar phenomena. These studies offer a comprehensive overview of sonochemistry, provide a better understanding of correlated phenomena (mechanochemical effects, hot spots, etc.), and pave the way for emerging applications that unite these techniques in hybrid reactors. Copyright © 2014 Elsevier B.V. All rights reserved.
Mathematical Modeling Of Life-Support Systems
NASA Technical Reports Server (NTRS)
Seshan, Panchalam K.; Ganapathi, Balasubramanian; Jan, Darrell L.; Ferrall, Joseph F.; Rohatgi, Naresh K.
1994-01-01
Generic hierarchical model of life-support system developed to facilitate comparisons of options in design of system. Model represents combinations of interdependent subsystems supporting microbes, plants, fish, and land animals (including humans). Generic model enables rapid configuration of variety of specific life support component models for tradeoff studies culminating in single system design. Enables rapid evaluation of effects of substituting alternate technologies and even entire groups of technologies and subsystems. Used to synthesize and analyze life-support systems ranging from relatively simple, nonregenerative units like aquariums to complex closed-loop systems aboard submarines or spacecraft. Model, called Generic Modular Flow Schematic (GMFS), coded in such chemical-process-simulation languages as Aspen Plus and expressed as three-dimensional spreadsheet.
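The generic modular composition idea can be sketched as follows. This is an illustrative Python analogue, not the actual GMFS code (which was expressed in chemical-process-simulation languages such as Aspen Plus and in spreadsheets); the unit names and daily flow values are invented for the example:

```python
class Unit:
    """A life-support subsystem with per-day mass inputs/outputs (kg)."""
    def __init__(self, name, consumes, produces):
        self.name, self.consumes, self.produces = name, consumes, produces


def net_flows(units):
    """Aggregate the net daily mass balance over all connected units;
    a species near zero indicates a closed loop for that species."""
    balance = {}
    for u in units:
        for species, kg in u.produces.items():
            balance[species] = balance.get(species, 0.0) + kg
        for species, kg in u.consumes.items():
            balance[species] = balance.get(species, 0.0) - kg
    return balance


# Illustrative values only, not actual GMFS data.
crew = Unit('crew', {'O2': 0.84, 'food': 0.62}, {'CO2': 1.0})
plants = Unit('plants', {'CO2': 1.0}, {'O2': 0.84, 'food': 0.62})
```

Swapping a `Unit` for an alternate-technology version and re-running `net_flows` mimics the rapid substitution trade studies the abstract describes.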
Brazhnik, Olga; Jones, John F.
2007-01-01
Producing reliable information is the ultimate goal of data processing. The ocean of data created with the advances of science and technologies calls for integration of data coming from heterogeneous sources that are diverse in their purposes, business rules, underlying models and enabling technologies. Reference models, Semantic Web, standards, ontology, and other technologies enable fast and efficient merging of heterogeneous data, while the reliability of produced information is largely defined by how well the data represent the reality. In this paper we initiate a framework for assessing the informational value of data that includes data dimensions; aligning data quality with business practices; identifying authoritative sources and integration keys; merging models; uniting updates of varying frequency and overlapping or gapped data sets. PMID:17071142
ERIC Educational Resources Information Center
Jansson, Anders B.
2011-01-01
This article focuses on the learning that is enabled while a primary school child makes a story using multimodal software. This child is diagnosed with autism. The aim is to use a cultural-historical framework to carry out an in-depth analysis of a process of learning with action as a unit of analysis. The article is based on a collaborative…
NASA Astrophysics Data System (ADS)
Rhodes, Russel E.; Byrd, Raymond J.
1998-01-01
This paper presents a "back of the envelope" technique for fast, timely, on-the-spot assessment of affordability (profitability) of commercial space transportation architectural concepts. The tool presented here is not intended to replace conventional, detailed costing methodology. The process described enables "quick look" estimations and assumptions to effectively determine whether an initial concept (with its attendant cost estimating line items) provides focus for major leapfrog improvement. The Cost Charts Users Guide provides a generic sample tutorial, building an approximate understanding of the basic launch system cost factors and their representative magnitudes. This process will enable the user to develop a net "cost (and price) per payload-mass unit to orbit" incorporating a variety of significant cost drivers, supplemental to basic vehicle cost estimates. If acquisition cost and recurring cost factors (as a function of cost per payload-mass unit to orbit) do not meet the predetermined system-profitability goal, the concept in question will be clearly seen as non-competitive. Multiple analytical approaches, and applications of a variety of interrelated assumptions, can be examined in a quick (on-the-spot) cost approximation analysis as this tool has inherent flexibility. The technique will allow determination of concept conformance to system objectives.
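The "quick look" net cost-per-payload-mass estimate might be sketched as below. The formula (amortized acquisition plus recurring cost, divided by payload mass) and all parameter names are illustrative assumptions, not the Cost Charts tool itself:

```python
def cost_per_kg_to_orbit(acquisition_cost, flights_amortized,
                         recurring_cost_per_flight, payload_kg):
    """Quick-look net cost per payload-mass unit to orbit:
    acquisition cost amortized over a flight count, plus the
    recurring cost per flight, divided by payload mass."""
    per_flight = acquisition_cost / flights_amortized + recurring_cost_per_flight
    return per_flight / payload_kg


# Example: $500M acquisition over 100 flights, $10M recurring,
# 10,000 kg payload -> (5M + 10M) / 10,000 = $1,500 per kg.
estimate = cost_per_kg_to_orbit(500e6, 100, 10e6, 10_000)
```

Comparing `estimate` against a predetermined profitability threshold is the kind of on-the-spot screen the abstract describes.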
A workload model and measures for computer performance evaluation
NASA Technical Reports Server (NTRS)
Kerner, H.; Kuemmerle, K.
1972-01-01
A generalized workload definition is presented which constructs measurable workloads of unit size from workload elements, called elementary processes. An elementary process makes almost exclusive use of one of the processors, CPU, I/O processor, etc., and is measured by the cost of its execution. Various kinds of user programs can be simulated by quantitative composition of elementary processes into a type. The character of the type is defined by the weights of its elementary processes and its structure by the amount and sequence of transitions between its elementary processes. A set of types is batched to a mix. Mixes of identical cost are considered as equivalent amounts of workload. These formalized descriptions of workloads allow investigators to compare the results of different studies quantitatively. Since workloads of different composition are assigned a unit of cost, these descriptions enable determination of cost effectiveness of different workloads on a machine. Subsequently performance parameters such as throughput rate, gain factor, internal and external delay factors are defined and used to demonstrate the effects of various workload attributes on the performance of a selected large scale computer system.
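The composition scheme above (elementary processes weighted into types, types batched into mixes, mixes compared by cost) can be sketched as follows; the cost figures and type definitions are invented for illustration:

```python
# Elementary processes are measured by the cost of their execution;
# a "type" is a weighted composition of them; a "mix" batches types.
# Mixes of identical cost are equivalent amounts of workload.
ELEMENTARY_COSTS = {'cpu': 2.0, 'io': 5.0}  # illustrative cost units


def type_cost(weights):
    """Cost of one instance of a type, given its elementary-process weights."""
    return sum(ELEMENTARY_COSTS[p] * w for p, w in weights.items())


def mix_cost(mix):
    """Total cost of a mix: a list of (type_weights, count) pairs."""
    return sum(type_cost(weights) * n for weights, n in mix)


cpu_bound = {'cpu': 4, 'io': 1}  # character defined by the weights
io_bound = {'cpu': 1, 'io': 2}
```

Two mixes built from different types but equal in `mix_cost` would, under this model, represent equivalent workload amounts for cost-effectiveness comparisons.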
CAD Services: an Industry Standard Interface for Mechanical CAD Interoperability
NASA Technical Reports Server (NTRS)
Claus, Russell; Weitzer, Ilan
2002-01-01
Most organizations seek to design and develop new products in increasingly shorter time periods. At the same time, increased performance demands require a team-based multidisciplinary design process that may span several organizations. One approach to meet these demands is to use 'Geometry Centric' design. In this approach, design engineers team their efforts through one unified representation of the design that is usually captured in a CAD system. Standards-based interfaces are critical to provide uniform, simple, distributed services that enable the 'Geometry Centric' design approach. This paper describes an industry-wide effort, under the Object Management Group's (OMG) Manufacturing Domain Task Force, to define interfaces that enable the interoperability of CAD, Computer Aided Manufacturing (CAM), and Computer Aided Engineering (CAE) tools. This critical link enabling 'Geometry Centric' design is called CAD Services V1.0. This paper discusses the features of this standard and its proposed application.
Industrial biomanufacturing: The future of chemical production.
Clomburg, James M; Crumbley, Anna M; Gonzalez, Ramon
2017-01-06
The current model for industrial chemical manufacturing employs large-scale megafacilities that benefit from economies of unit scale. However, this strategy faces environmental, geographical, political, and economic challenges associated with energy and manufacturing demands. We review how exploiting biological processes for manufacturing (i.e., industrial biomanufacturing) addresses these concerns while also supporting and benefiting from economies of unit number. Key to this approach is the inherent small scale and capital efficiency of bioprocesses and the ability of engineered biocatalysts to produce designer products at high carbon and energy efficiency with adjustable output, at high selectivity, and under mild process conditions. The biological conversion of single-carbon compounds represents a test bed to establish this paradigm, enabling rapid, mobile, and widespread deployment, access to remote and distributed resources, and adaptation to new and changing markets. Copyright © 2017, American Association for the Advancement of Science.
Brückner, Hans-Peter; Spindeldreier, Christian; Blume, Holger
2013-01-01
A common approach to high-accuracy sensor fusion based on 9D inertial measurement unit data is Kalman filtering. State-of-the-art floating-point filter algorithms differ in their computational complexity; nevertheless, real-time operation on a low-power microcontroller at high sampling rates is not possible. This work presents algorithmic modifications to reduce the computational demands of a two-step minimum-order Kalman filter. Furthermore, the required bit-width of a fixed-point filter version is explored. For evaluation, real-world data captured using an Xsens MTx inertial sensor are used. Changes in computational latency and orientation estimation accuracy due to the proposed algorithmic modifications and fixed-point number representation are evaluated in detail on a variety of processing platforms, enabling on-board processing on wearable sensor platforms.
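The fixed-point bit-width exploration can be illustrated with a minimal sketch of Q-format arithmetic of the kind a microcontroller filter would use. The Q15 format, helper names, and error measure below are illustrative assumptions, not the authors' implementation:

```python
def to_fixed(x, frac_bits=15):
    """Quantize a float in [-1, 1) to a signed Q-format integer."""
    return int(round(x * (1 << frac_bits)))


def fixed_mul(a, b, frac_bits=15):
    """Fixed-point multiply with rescaling, as applied per filter term."""
    return (a * b) >> frac_bits


def fixed_point_error(x, y, frac_bits=15):
    """Error of one fixed-point multiply versus the float product;
    sweeping frac_bits estimates the bit-width a filter step needs."""
    fx, fy = to_fixed(x, frac_bits), to_fixed(y, frac_bits)
    approx = fixed_mul(fx, fy, frac_bits) / (1 << frac_bits)
    return abs(approx - x * y)
```

Repeating such error measurements across the filter's multiply-accumulate chains for decreasing `frac_bits` is one way to locate the smallest bit-width that keeps orientation accuracy acceptable.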
Architectural design of heterogeneous metallic nanocrystals--principles and processes.
Yu, Yue; Zhang, Qingbo; Yao, Qiaofeng; Xie, Jianping; Lee, Jim Yang
2014-12-16
CONSPECTUS: Heterogeneous metal nanocrystals (HMNCs) are a natural extension of simple metal nanocrystals (NCs), but as a research topic, they have been much less explored until recently. HMNCs are formed by integrating metal NCs of different compositions into a common entity, similar to the way atoms are bonded to form molecules. HMNCs can be built to exhibit an unprecedented architectural diversity and complexity by programming the arrangement of the NC building blocks ("unit NCs"). The architectural engineering of HMNCs involves the design and fabrication of the architecture-determining elements (ADEs), i.e., unit NCs with precise control of shape and size, and their relative positions in the design. Similar to molecular engineering, where structural diversity is used to create more property variations for application explorations, the architectural engineering of HMNCs can similarly increase the utility of metal NCs by offering a suite of properties to support multifunctionality in applications. The architectural engineering of HMNCs calls for processes and operations that can execute the design. Some enabling technologies already exist in the form of classical micro- and macroscale fabrication techniques, such as masking and etching. These processes, when used singly or in combination, are fully capable of fabricating nanoscopic objects. What is needed is a detailed understanding of the engineering control of ADEs and the translation of these principles into actual processes. For simplicity of execution, these processes should be integrated into a common reaction system and yet retain independence of control. The key to architectural diversity is therefore the independent controllability of each ADE in the design blueprint. The right chemical tools must be applied under the right circumstances in order to achieve the desired outcome. 
In this Account, after a short illustration of the infinite possibility of combining different ADEs to create HMNC design variations, we introduce the fabrication processes for each ADE, which enable shape, size, and location control of the unit NCs in a particular HMNC design. The principles of these processes are discussed and illustrated with examples. We then discuss how these processes may be integrated into a common reaction system while retaining the independence of individual processes. The principles for the independent control of each ADE are discussed in detail to lay the foundation for the selection of the chemical reaction system and its operating space.
Adaptive MCMC in Bayesian phylogenetics: an application to analyzing partitioned data in BEAST.
Baele, Guy; Lemey, Philippe; Rambaut, Andrew; Suchard, Marc A
2017-06-15
Advances in sequencing technology continue to deliver increasingly large molecular sequence datasets that are often heavily partitioned in order to accurately model the underlying evolutionary processes. In phylogenetic analyses, partitioning strategies involve estimating conditionally independent models of molecular evolution for different genes and different positions within those genes, requiring a large number of evolutionary parameters that have to be estimated, leading to an increased computational burden for such analyses. The past two decades have also seen the rise of multi-core processors, in both the central processing unit (CPU) and graphics processing unit (GPU) markets, enabling massively parallel computations that are not yet fully exploited by many software packages for multipartite analyses. We here propose a Markov chain Monte Carlo (MCMC) approach using an adaptive multivariate transition kernel to estimate in parallel a large number of parameters, split across partitioned data, by exploiting multi-core processing. Across several real-world examples, we demonstrate that our approach enables the estimation of these multipartite parameters more efficiently than standard approaches that typically use a mixture of univariate transition kernels. In one case, when estimating the relative rate parameter of the non-coding partition in a heterochronous dataset, MCMC integration efficiency improves by > 14-fold. Our implementation is part of the BEAST code base, a widely used open source software package to perform Bayesian phylogenetic inference. guy.baele@kuleuven.be. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
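An adaptive multivariate transition kernel is in the spirit of adaptive Metropolis, where the proposal covariance is learned from the chain's own history. The following is a minimal single-threaded sketch under that assumption, not BEAST's implementation; the tuning constants (the 2.38²/d scaling, the covariance ridge, the adaptation start) are conventional choices made for illustration:

```python
import numpy as np


def adaptive_metropolis(log_post, x0, n_iter=2000, adapt_start=200, seed=0):
    """Adaptive Metropolis: a multivariate Gaussian proposal whose
    covariance is estimated from the chain history, updating all
    parameters jointly instead of one univariate move at a time."""
    rng = np.random.default_rng(seed)
    d = len(x0)
    samples = [np.asarray(x0, float)]
    x, lp = samples[0], log_post(samples[0])
    cov = np.eye(d) * 0.1  # initial proposal covariance
    for i in range(n_iter):
        if i > adapt_start:  # learn the proposal from the chain so far
            cov = (np.cov(np.array(samples).T) * (2.38 ** 2 / d)
                   + 1e-6 * np.eye(d))
        prop = rng.multivariate_normal(x, cov)
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:  # Metropolis acceptance
            x, lp = prop, lp_prop
        samples.append(x)
    return np.array(samples)
```

In a partitioned analysis, the components of `x` would be the per-partition parameters (e.g., relative rates), so correlated parameters are proposed jointly rather than via a mixture of univariate kernels.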
Interactive brain shift compensation using GPU based programming
NASA Astrophysics Data System (ADS)
van der Steen, Sander; Noordmans, Herke Jan; Verdaasdonk, Rudolf
2009-02-01
Processing large image files or real-time video streams requires intense computational power. Driven by the gaming industry, the processing power of graphics processing units (GPUs) has increased significantly. With pixel shader model 4.0, the GPU can be used for image processing 10x faster than the CPU. Dedicated software was developed to deform 3D MR and CT image sets for real-time brain shift correction during navigated neurosurgery, using landmarks or cortical surface traces defined by the navigation pointer. Feedback was given using orthogonal slices and an interactively ray-traced 3D brain image. GPU-based programming enables real-time processing of high-definition image datasets, and various applications can be developed in medicine, optics and image sciences.
Boedeker, Berthold; Goldstein, Adam; Mahajan, Ekta
2017-11-04
The availability and use of pre-sterilized disposables has greatly changed the methods used in biopharmaceuticals development and production, particularly from mammalian cell culture. Nowadays, almost all process steps from cell expansion, fermentation, cell removal, and purification to formulation and storage of drug substances can be carried out in disposables, although there are still limitations with single-use technologies, particularly in the areas of pretesting and quality control of disposables, bag and connections standardization and qualification, extractables and leachables (E/L) validation, and dependency on individual vendors. The current status of single-use technologies is summarized for all process unit operations using a standard mAb process as an example. In addition, current pros and cons of using disposables are addressed in a comparative way, including quality control and E/L validation.The continuing progress in developing single-use technologies has an important impact on manufacturing facilities, resulting in much faster, less expensive and simpler plant design, start-up, and operation, because cell culture process steps are no longer performed in hard-piped unit operations. This leads to simpler operations in a lab-like environment. Overall it enriches the current landscape of available facilities from standard hard-piped to hard-piped/disposables hybrid to completely single-use-based production plants using the current segregation and containment concept. 
At the top, disposables in combination with completely and functionally closed systems facilitate a new, revolutionary design of ballroom facilities without or with much less segregation, which enables us to perform good manufacturing practice manufacturing of different products simultaneously in unclassified but controlled areas. Finally, single-use processing in lab-like shell facilities is a big enabler of transferring and establishing production in emergent countries, and this is described in more detail in 7.
Methane and Hydrogen Production from Anaerobic Fermentation of Municipal Solid Wastes
NASA Astrophysics Data System (ADS)
Kobayashi, Takuro; Lee, Dong-Yeol; Xu, Kaiqin; Li, Yu-You; Inamori, Yuhei
Methane and hydrogen production was investigated in batch experiments of thermophilic methane and hydrogen fermentation, using domestic garbage and food processing waste classified by fat/carbohydrate balance as a base material. Methane production per unit of VS added was significantly positively correlated with fat content and negatively correlated with carbohydrate content in the substrate, and the average value of the methane production per unit of VS added from fat-rich materials was twice as large as that from carbohydrate-rich materials. By contrast, hydrogen production per unit of VS added was significantly positively correlated with carbohydrate content and negatively correlated with fat content. Principal component analysis using the results obtained in this study enables an evaluation of substrates for methane and hydrogen fermentation based on nutrient composition.
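The principal component analysis step can be sketched as follows; the substrate composition rows and the fat/carbohydrate/protein breakdown are invented for illustration, not the study's data:

```python
import numpy as np


def pca_scores(X, n_components=2):
    """Project substrate-composition rows onto their principal components."""
    Xc = X - X.mean(axis=0)  # center each nutrient column
    # SVD of the centered matrix yields the principal axes in Vt
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T


# Illustrative rows: [fat, carbohydrate, protein] fractions per substrate.
substrates = np.array([
    [0.60, 0.20, 0.20],  # fat-rich (the kind favoring methane yield)
    [0.10, 0.70, 0.20],  # carbohydrate-rich (favoring hydrogen yield)
    [0.35, 0.45, 0.20],
])
scores = pca_scores(substrates)
```

Plotting such scores places fat-rich and carbohydrate-rich substrates at opposite ends of the leading component, which is the kind of composition-based screening the abstract describes.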
Hirabayashi, Satoshi; Nowak, David J
2016-08-01
Trees remove air pollutants through dry deposition processes depending upon forest structure, meteorology, and air quality that vary across space and time. Employing nationally available forest, weather, air pollution and human population data for 2010, computer simulations were performed for deciduous and evergreen trees with varying leaf area index for rural and urban areas in every county in the conterminous United States. The results populated a national database of annual air pollutant removal, concentration changes, and reductions in adverse health incidences and costs for NO2, O3, PM2.5 and SO2. The developed database enabled a first order approximation of air quality and associated human health benefits provided by trees with any forest configurations anywhere in the conterminous United States over time. A comprehensive national database of tree effects on air quality and human health in the United States was developed. Copyright © 2016 Elsevier Ltd. All rights reserved.
Towards surgeon-authored VR training: the scene-development cycle.
Dindar, Saleh; Nguyen, Thien; Peters, Jörg
2016-01-01
Enabling surgeon-educators to themselves create virtual reality (VR) training units promises greater variety, specialization, and relevance of the units. This paper describes a software bridge that semi-automates the scene-generation cycle, a key bottleneck in authoring, modeling, and developing VR units. Augmenting an open source modeling environment with physical behavior attachment and collision specifications yields single-click testing of the full force-feedback enabled anatomical scene.
Distributive Distillation Enabled by Microchannel Process Technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arora, Ravi
The application of microchannel technology for distributive distillation was studied to achieve the Grand Challenge goals of 25% energy savings and 10% return on investment. In Task 1, a detailed study was conducted and two distillation systems were identified that would meet the Grand Challenge goals if microchannel distillation technology were used. Material and heat balance calculations were performed to develop process flow sheet designs for the two distillation systems in Task 2. The process designs focused on two methods of integrating the microchannel technology: 1) integrating microchannel distillation into an existing conventional column, and 2) microchannel distillation for new plants. A design concept for a modular microchannel distillation unit was developed in Task 3. In Task 4, Ultrasonic Additive Machining (UAM) was evaluated as a manufacturing method for microchannel distillation units. However, it was found that significant development work would be required to establish process parameters for using UAM in commercial distillation manufacturing. Two alternate manufacturing methods were explored. Both manufacturing approaches were experimentally tested to confirm their validity. The conceptual design of the microchannel distillation unit (Task 3) was combined with the manufacturing methods developed in Task 4 and the flowsheet designs in Task 2 to estimate the cost of the microchannel distillation unit, and this was compared to a conventional distillation column. The best results were for a methanol-water separation unit for use in a biodiesel facility. For this application, microchannel distillation was found to be more cost effective than the conventional system and capable of meeting the DOE Grand Challenge performance requirements.
A Research Planning Assessment for Applications of Artificial Intelligence in Manufacturing.
1986-01-01
to apply, based on their needs. PROJECT DESCRIPTION/APPROACH: The project will apply AI in …of this project are essential for the practical implementation of AI-based approaches to improving unit processes. This work will enable advances in … July 1985 to 1 August 1985. The authors wish to thank all workshop participants for their contributions to this effort. In particular, we wish to
Defense Acquisition Research Journal. Volume 21, Number 3, Issue 70
2014-07-01
the science of administration. New York, NY: Columbia University Institute of Public Administration. Hasik, J. (2004). Dream teams and brilliant eyes... advantage, the United States needs to develop a process that enables the lucid and rapid production of mission-tailored platforms that do not rely solely on... weapons does not require relying on the springboard of new technology; it just demands lucid and incisive thinking. However, this is not a strong point
Improving Intelligence Integration Amongst the Intelligence Community
2014-06-13
numerous attacks because of the work of the men and women in the IC. Success in the IC goes unrecognized while failure becomes public knowledge... AUTHOR(S): Michael D. Norton, MAJ, USA ...the process. His lifelong dedication to research and teaching enabled me the freedom to work the project knowing he was a simple call away. Dr. House
Study on photochemical analysis system (VLES) for EUV lithography
NASA Astrophysics Data System (ADS)
Sekiguchi, A.; Kono, Y.; Kadoi, M.; Minami, Y.; Kozawa, T.; Tagawa, S.; Gustafson, D.; Blackborow, P.
2007-03-01
A system for photochemical analysis of EUV lithography processes has been developed. The system consists of three units: (1) an exposure unit that uses the Z-Pinch (Energetiq Tech.) EUV light source (DPP) to carry out a flood exposure, (2) a measurement system, RDA (Litho Tech Japan), for the development rate of photoresists, and (3) a simulation unit that utilizes PROLITH (KLA-Tencor) to calculate resist profiles and process latitude using the measured development rate data. With this system, preliminary evaluation of the performance of EUV lithography can be performed without any lithography tool (stepper and scanner system) capable of imaging and alignment. Profiles for a 32 nm line-and-space pattern are simulated for the EUV resist (Posi-2 resist by TOK) by using VLES, which has sensitivity at the 13.5 nm wavelength. The simulation successfully predicts the resist behavior. Thus it is confirmed that the system enables efficient evaluation of the performance of EUV lithography processes.
Design of video interface conversion system based on FPGA
NASA Astrophysics Data System (ADS)
Zhao, Heng; Wang, Xiang-jun
2014-11-01
This paper presents an FPGA-based video interface conversion system that enables inter-conversion between digital and analog video. A Cyclone IV series EP4CE22F17C chip from Altera Corporation is used as the main video processing chip, and a single-chip microcontroller serves as the information interaction control unit between the FPGA and the PC. The system is able to encode/decode messages from the PC. Technologies including video decoding/encoding circuits, the bus communication protocol, data stream de-interleaving and de-interlacing, color space conversion, and the Camera Link timing generator module of the FPGA are introduced. The system converts the Composite Video Broadcast Signal (CVBS) from a CCD camera into Low Voltage Differential Signaling (LVDS), which is collected by the video processing unit through a Camera Link interface. The processed video signals are then input to the system output board and displayed on the monitor. The current experiment shows that the system achieves high-quality video conversion with a minimal board size.
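The color space conversion stage mentioned in the abstract can be illustrated in software. The sketch below is Python rather than HDL, purely for illustration; the paper does not state which standard its FPGA module implements, so full-range ITU-R BT.601 coefficients are assumed here.

```python
import numpy as np

# Full-range ITU-R BT.601 RGB -> YCbCr matrix (an assumption; the paper
# does not specify which standard its FPGA conversion module uses).
M = np.array([[ 0.299,     0.587,     0.114    ],
              [-0.168736, -0.331264,  0.5      ],
              [ 0.5,      -0.418688, -0.081312 ]])

def rgb_to_ycbcr(rgb):
    """Convert an (..., 3) RGB array (0-255) to YCbCr (0-255)."""
    rgb = np.asarray(rgb, dtype=np.float64)
    ycc = rgb @ M.T
    ycc[..., 1:] += 128.0          # offset the chroma channels
    return np.clip(np.round(ycc), 0, 255).astype(np.uint8)

# Pure white maps to full luma and neutral chroma.
print(rgb_to_ycbcr([255, 255, 255]))   # -> [255 128 128]
```

An FPGA implementation would realize the same matrix with fixed-point multipliers, but the arithmetic is identical.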
Accelerating image recognition on mobile devices using GPGPU
NASA Astrophysics Data System (ADS)
Bordallo López, Miguel; Nykänen, Henri; Hannuksela, Jari; Silvén, Olli; Vehviläinen, Markku
2011-01-01
The future multi-modal user interfaces of battery-powered mobile devices are expected to require computationally costly image analysis techniques. Graphics Processing Units are well suited to parallel processing, and the addition of programmable stages and high-precision arithmetic provides opportunities to implement complete, energy-efficient algorithms. The first mobile graphics accelerators with programmable pipelines are now available, enabling GPGPU implementations of several image processing algorithms. In this context, we consider a face tracking approach that uses efficient gray-scale invariant texture features and boosting. The solution is based on Local Binary Pattern (LBP) features and uses the GPU in the pre-processing and feature extraction phases. We have implemented a series of image processing techniques in the shader language of OpenGL ES 2.0, compiled them for a mobile graphics processing unit, and performed tests on a mobile application processor platform (OMAP3530). In our contribution, we describe the challenges of designing on a mobile platform, present the performance achieved, and provide measurement results for the actual power consumption in comparison to using the CPU (ARM) on the same platform.
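The LBP feature extraction that the paper offloads to the GPU can be sketched on the CPU for clarity. The following minimal NumPy version is an illustration, not the authors' shader code; the bit ordering and the >= comparison convention are common choices, not taken from the paper.

```python
import numpy as np

def lbp8(img):
    """Basic 8-neighbour Local Binary Pattern for a 2-D grayscale array.
    Each interior pixel gets an 8-bit code: one bit per neighbour, set
    when that neighbour is >= the centre pixel (a common convention)."""
    img = np.asarray(img, dtype=np.int32)
    c = img[1:-1, 1:-1]                      # interior (centre) pixels
    # Neighbour offsets, clockwise starting at the top-left corner.
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(offs):
        nb = img[1 + dy:img.shape[0] - 1 + dy,
                 1 + dx:img.shape[1] - 1 + dx]
        code |= (nb >= c).astype(np.int32) << bit
    return code
```

On the GPU each pixel's code is computed independently, which is what makes the operator a natural fit for a fragment shader.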
Workflow for Criticality Assessment Applied in Biopharmaceutical Process Validation Stage 1.
Zahel, Thomas; Marschall, Lukas; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Mueller, Eric M; Murphy, Patrick; Natschläger, Thomas; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph
2017-10-12
Identification of critical process parameters that impact product quality is a central task during regulatory requested process validation. Commonly, this is done via design of experiments and identification of parameters that significantly impact product quality (rejection of the null hypothesis that the effect equals zero). However, parameters that show a large uncertainty, and that might push product quality beyond an undesirable limit critical to the product, may be missed. This can happen during the evaluation of experiments when the residual (un-modelled) variance in the experiments is larger than expected a priori. Estimating this risk is the task of the presented novel retrospective power analysis permutation test. The test is evaluated using a data set from two unit operations established during characterization of an industrial biopharmaceutical process. The results show that, for one unit operation, the observed variance in the experiments is much larger than expected a priori, resulting in low power levels for all non-significant parameters. Moreover, we present a workflow for mitigating the risk associated with overlooked parameter effects. This enables a statistically sound identification of critical process parameters. The developed workflow will substantially support industry in delivering constant product quality, reducing process variance, and increasing patient safety.
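As a rough illustration of why larger-than-expected residual variance erodes power, the sketch below estimates power by Monte Carlo simulation for a single two-level factor, using a permutation test on the difference in means. The function names, sample sizes, and test formulation are illustrative assumptions; the published workflow's retrospective procedure differs in detail.

```python
import numpy as np

rng = np.random.default_rng(0)

def perm_pvalue(a, b, n_perm=200):
    """Two-sided permutation p-value for a difference in group means."""
    obs = abs(a.mean() - b.mean())
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        count += abs(pooled[:len(a)].mean() - pooled[len(a):].mean()) >= obs
    return (count + 1) / (n_perm + 1)

def retrospective_power(effect, resid_sd, n, alpha=0.05, n_sim=300):
    """Monte Carlo power to detect an effect of size `effect`, given the
    residual SD actually observed in the experiments (a generic sketch;
    the paper's permutation procedure differs in detail)."""
    hits = 0
    for _ in range(n_sim):
        lo = rng.normal(0.0, resid_sd, n)      # factor at low level
        hi = rng.normal(effect, resid_sd, n)   # factor at high level
        hits += perm_pvalue(lo, hi) < alpha
    return hits / n_sim

print(retrospective_power(1.0, 0.5, 6))   # high power: effect stands out
print(retrospective_power(1.0, 2.0, 6))   # low power: variance swamps effect
```

The second call mirrors the situation described in the abstract: the same effect size becomes nearly undetectable once residual variance is four times larger than planned for.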
Quantitation of Staphylococcus aureus in Seawater Using CHROMagar™ SA
Pombo, David; Hui, Jennifer; Kurano, Michelle; Bankowski, Matthew J; Seifried, Steven E
2010-01-01
A microbiological algorithm has been developed to analyze beach water samples for the determination of viable colony forming units (CFU) of Staphylococcus aureus (S. aureus). Membrane filtration enumeration of S. aureus from recreational beach waters using the chromogenic media CHROMagar™SA alone yields a positive predictive value (PPV) of 70%. Presumptive CHROMagar™SA colonies were confirmed as S. aureus by 24-hour tube coagulase test. Combined, these two tests yield a PPV of 100%. This algorithm enables accurate quantitation of S. aureus in seawater in 72 hours and could support risk-prediction processes for recreational waters. A more rapid protocol, utilizing a 4-hour tube coagulase confirmatory test, enables a 48-hour turnaround time with a modest false negative rate of less than 10%. PMID:20222490
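The positive predictive value arithmetic behind the algorithm is simple enough to make explicit. The counts below are illustrative only, chosen to reproduce the reported 70% figure, not data from the study.

```python
def ppv(tp, fp):
    """Positive predictive value: fraction of test positives that are true."""
    return tp / (tp + fp)

# Illustrative counts (not from the study): if 70 of 100 presumptive
# CHROMagar SA colonies are true S. aureus, the media alone gives PPV 70%.
print(ppv(70, 30))   # 0.7

# Confirming each presumptive colony with the tube coagulase test removes
# the false positives, which is how the combined algorithm reaches 100%.
print(ppv(70, 0))    # 1.0
```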
Embedded Volttron specification - benchmarking small footprint compute device for Volttron
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanyal, Jibonananda; Fugate, David L.; Woodworth, Ken
An embedded system is a small footprint computing unit that typically serves a specific purpose closely associated with measurements and control of hardware devices. These units are designed for reasonable durability and operations in a wide range of operating conditions. Some embedded systems support real-time operations and can demonstrate high levels of reliability. Many have failsafe mechanisms built to handle graceful shutdown of the device in exception conditions. The available memory, processing power, and network connectivity of these devices are limited due to the nature of their specific-purpose design and intended application. Industry practice is to carefully design the software for the available hardware capability to suit desired deployment needs. Volttron is an open source agent development and deployment platform designed to enable researchers to interact with devices and appliances without having to write drivers themselves. Hosting Volttron on small footprint embeddable devices enables its demonstration for embedded use. This report details the steps required and the experience in setting up and running Volttron applications on three small footprint devices: the Intel Next Unit of Computing (NUC), the Raspberry Pi 2, and the BeagleBone Black. In addition, the report also details preliminary investigation of the execution performance of Volttron on these devices.
Biodesign process and culture to enable pediatric medical technology innovation.
Wall, James; Wynne, Elizabeth; Krummel, Thomas
2015-06-01
Innovation is the process through which new scientific discoveries are developed and promoted from bench to bedside. In an effort to encourage young entrepreneurs in this area, Stanford Biodesign developed a medical device innovation training program focused on need-based innovation. The program focuses on teaching systematic evaluation of healthcare needs, invention, and concept development. This process can be applied to any field of medicine, including Pediatric Surgery. Similar training programs have gained traction throughout the United States and beyond. Equally important to process in the success of these programs is an institutional culture that supports transformative thinking. Key components of this culture include risk tolerance, patience, encouragement of creativity, management of conflict, and networking effects. Copyright © 2015 Elsevier Inc. All rights reserved.
Image processing applications: From particle physics to society
NASA Astrophysics Data System (ADS)
Sotiropoulou, C.-L.; Luciano, P.; Gkaitatzis, S.; Citraro, S.; Giannetti, P.; Dell'Orso, M.
2017-01-01
We present an embedded system for extremely efficient real-time pattern recognition execution, enabling technological advancements with both scientific and social impact. It is a compact, fast, low-consumption processing unit (PU) based on a combination of Field Programmable Gate Arrays (FPGAs) and a full-custom associative memory chip. The PU has been developed for real-time tracking in particle physics experiments, but delivers flexible features for potential application in a wide range of fields. It has been proposed for accelerated pattern matching execution in Magnetic Resonance Fingerprinting (biomedical applications), for real-time detection of space debris trails in astronomical images (space applications), and for brain emulation in image processing (cognitive image processing). We illustrate the potential of the PU for these new applications.
Persistent Thalamic Sound Processing Despite Profound Cochlear Denervation.
Chambers, Anna R; Salazar, Juan J; Polley, Daniel B
2016-01-01
Neurons at higher stages of sensory processing can partially compensate for a sudden drop in peripheral input through a homeostatic plasticity process that increases the gain on weak afferent inputs. Even after a profound unilateral auditory neuropathy where >95% of afferent synapses between auditory nerve fibers and inner hair cells have been eliminated with ouabain, central gain can restore cortical processing and perceptual detection of basic sounds delivered to the denervated ear. In this model of profound auditory neuropathy, auditory cortex (ACtx) processing and perception recover despite the absence of an auditory brainstem response (ABR) or brainstem acoustic reflexes, and only a partial recovery of sound processing at the level of the inferior colliculus (IC), an auditory midbrain nucleus. In this study, we induced a profound cochlear neuropathy with ouabain and asked whether central gain enabled a compensatory plasticity in the auditory thalamus comparable to the full recovery of function previously observed in the ACtx, the partial recovery observed in the IC, or something different entirely. Unilateral ouabain treatment in adult mice effectively eliminated the ABR, yet robust sound-evoked activity persisted in a minority of units recorded from the contralateral medial geniculate body (MGB) of awake mice. Sound-driven MGB units could decode moderate and high-intensity sounds with accuracies comparable to sham-treated control mice, but low-intensity classification was near chance. Pure tone receptive fields and synchronization to broadband pulse trains also persisted, albeit with significantly reduced quality and precision, respectively. MGB decoding of temporally modulated pulse trains and speech tokens was greatly impaired in ouabain-treated mice. Taken together, the absence of an ABR belied persistent auditory processing at the level of the MGB that was likely enabled by increased central gain.
Compensatory plasticity at the level of the auditory thalamus was less robust overall than previous observations in cortex or midbrain. Hierarchical differences in compensatory plasticity following sensorineural hearing loss may reflect differences in GABA circuit organization within the MGB, as compared to the ACtx or IC.
Asymmetric neighborhood functions accelerate ordering process of self-organizing maps
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ota, Kaiichiro; Aoki, Takaaki; Kurata, Koji
2011-02-15
A self-organizing map (SOM) algorithm can generate a topographic map from a high-dimensional stimulus space to a low-dimensional array of units. Because a topographic map preserves neighborhood relationships between the stimuli, the SOM can be applied to certain types of information processing such as data visualization. During the learning process, however, topological defects frequently emerge in the map. The presence of defects tends to drastically slow down the formation of a globally ordered topographic map. To remove such topological defects, it has been reported that an asymmetric neighborhood function is effective, but only in the simple case of mapping one-dimensional stimuli to a chain of units. In this paper, we demonstrate that even when high-dimensional stimuli are used, the asymmetric neighborhood function is effective for both artificial and real-world data. Our results suggest that applying the asymmetric neighborhood function to the SOM algorithm improves the reliability of the algorithm. In addition, it enables processing of complicated, high-dimensional data by using this algorithm.
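A minimal sketch of the idea, assuming a 1-D chain of units and a Gaussian neighbourhood whose centre is shifted away from the winning unit; the parameter names and the exact form of the asymmetry are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def train_som(data, n_units=20, epochs=2000, lr=0.1, sigma=3.0, shift=1.0):
    """1-D SOM with an asymmetric Gaussian neighbourhood.
    `shift` displaces the neighbourhood centre away from the winner,
    which is the asymmetry reported to help dissolve topological
    defects (illustrative parameterization)."""
    dim = data.shape[1]
    w = rng.random((n_units, dim))           # weight vector per unit
    idx = np.arange(n_units)
    for _ in range(epochs):
        x = data[rng.integers(len(data))]    # random stimulus
        win = np.argmin(np.linalg.norm(w - x, axis=1))  # best-matching unit
        # Asymmetric neighbourhood: Gaussian centred at win + shift,
        # not at the winner itself.
        h = np.exp(-((idx - (win + shift)) ** 2) / (2 * sigma ** 2))
        w += lr * h[:, None] * (x - w)       # pull units toward stimulus
    return w

# Map random 2-D stimuli onto a chain of 20 units.
data = rng.random((500, 2))
weights = train_som(data)
```

Setting `shift=0` recovers the standard symmetric SOM update, so the asymmetry is a one-parameter modification of the classic algorithm.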
A real-time GNSS-R system based on software-defined radio and graphics processing units
NASA Astrophysics Data System (ADS)
Hobiger, Thomas; Amagai, Jun; Aida, Masanori; Narita, Hideki
2012-04-01
Reflected signals of the Global Navigation Satellite System (GNSS) from the sea or land surface can be utilized to deduce and monitor physical and geophysical parameters of the reflecting area. Unlike most other remote sensing techniques, GNSS-Reflectometry (GNSS-R) operates as a passive radar that takes advantage of the increasing number of navigation satellites broadcasting L-band signals. To date, most GNSS-R receiver architectures have been based on dedicated hardware solutions. Software-defined radio (SDR) technology has advanced in recent years and now enables signal processing in real time, which makes it an ideal candidate for the realization of a flexible GNSS-R system. Additionally, modern commodity graphics cards, which offer massively parallel computing performance, can handle the whole signal processing chain without interfering with the PC's CPU. This paper therefore describes a GNSS-R system developed on the principles of software-defined radio supported by General Purpose Graphics Processing Units (GPGPUs), and presents results from initial field tests that confirm the anticipated capability of the system.
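The core per-code-period operation such a receiver parallelizes is correlating received samples against a replica PRN code; the FFT turns this into an O(N log N) task that maps well onto GPUs. Below is a NumPy stand-in for that kernel (illustrative only; the actual system runs it on GPGPUs, and the 1023-chip code length is borrowed from GPS C/A for the toy example).

```python
import numpy as np

rng = np.random.default_rng(2)

def circular_correlate(received, replica):
    """Circular cross-correlation via FFT - the O(N log N) kernel an
    SDR/GPU GNSS receiver evaluates once per code period (NumPy stands
    in here for the GPGPU implementation described in the paper)."""
    R = np.fft.fft(received)
    C = np.fft.fft(replica)
    return np.fft.ifft(R * np.conj(C)).real

# Toy example: a +/-1 pseudo-random code, delayed by 100 samples and
# buried in noise; the correlation peak recovers the delay.
code = rng.choice([-1.0, 1.0], size=1023)
rx = np.roll(code, 100) + rng.normal(0.0, 0.5, 1023)
peak = int(np.argmax(circular_correlate(rx, code)))
print(peak)   # 100 -> recovered code delay
```

In a GNSS-R instrument the same correlation is run for both the direct and the reflected signal path; the delay (and Doppler) offsets between the two peaks carry the geophysical information.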
NASA Astrophysics Data System (ADS)
Kim, Jeongnim; Baczewski, Andrew D.; Beaudet, Todd D.; Benali, Anouar; Chandler Bennett, M.; Berrill, Mark A.; Blunt, Nick S.; Josué Landinez Borda, Edgar; Casula, Michele; Ceperley, David M.; Chiesa, Simone; Clark, Bryan K.; Clay, Raymond C., III; Delaney, Kris T.; Dewing, Mark; Esler, Kenneth P.; Hao, Hongxia; Heinonen, Olle; Kent, Paul R. C.; Krogel, Jaron T.; Kylänpää, Ilkka; Li, Ying Wai; Lopez, M. Graham; Luo, Ye; Malone, Fionn D.; Martin, Richard M.; Mathuriya, Amrita; McMinis, Jeremy; Melton, Cody A.; Mitas, Lubos; Morales, Miguel A.; Neuscamman, Eric; Parker, William D.; Pineda Flores, Sergio D.; Romero, Nichols A.; Rubenstein, Brenda M.; Shea, Jacqueline A. R.; Shin, Hyeondeok; Shulenburger, Luke; Tillack, Andreas F.; Townsend, Joshua P.; Tubman, Norm M.; Van Der Goetz, Brett; Vincent, Jordan E.; ChangMo Yang, D.; Yang, Yubo; Zhang, Shuai; Zhao, Luning
2018-05-01
QMCPACK is an open source quantum Monte Carlo package for ab initio electronic structure calculations. It supports calculations of metallic and insulating solids, molecules, atoms, and some model Hamiltonians. Implemented real space quantum Monte Carlo algorithms include variational, diffusion, and reptation Monte Carlo. QMCPACK uses Slater–Jastrow type trial wavefunctions in conjunction with a sophisticated optimizer capable of optimizing tens of thousands of parameters. The orbital space auxiliary-field quantum Monte Carlo method is also implemented, enabling cross validation between different highly accurate methods. The code is specifically optimized for calculations with large numbers of electrons on the latest high performance computing architectures, including multicore central processing unit and graphical processing unit systems. We detail the program’s capabilities, outline its structure, and give examples of its use in current research calculations. The package is available at http://qmcpack.org.
Daugherty, Elizabeth L; Rubinson, Lewis
2011-11-01
In recent years, healthcare disaster planning has grown from its early place as an occasional consideration within the manuals of emergency medical services and emergency department managers to a rapidly growing field, which considers continuity of function, surge capability, and process changes across the spectrum of healthcare delivery. A detailed examination of critical care disaster planning was undertaken in 2007 by the Task Force for Mass Critical Care of the American College of Chest Physicians Critical Care Collaborative Initiative. We summarize the Task Force recommendations and available updated information to answer a fundamental question for critical care disaster planners: What is a prepared intensive care unit and how do I ensure my unit's readiness? Database searches and review of relevant published literature. Preparedness is essential for successful response, but because intensive care units face many competing priorities, without defining "preparedness for what," the task can seem overwhelming. Intensive care unit disaster planners should, therefore, along with the entire hospital, participate in a hospital or regionwide planning process to 1) identify critical care response vulnerabilities; and 2) clarify the hazards for which their community is most at risk. The process should inform a comprehensive written preparedness plan targeting the most worrisome scenarios and including specific guidance on 1) optimal use of space, equipment, and staffing for delivery of critical care to significantly increased patient volumes; 2) allocation of resources for provision of essential critical care services under conditions of absolute scarcity; 3) intensive care unit evacuation; and 4) redundant internal communication systems and means for timely data collection. Critical care disaster planners have a complex, challenging task. 
Experienced planners will agree that no disaster response is perfect, but careful planning will enable the prepared intensive care unit to respond effectively in times of crisis.
Use telecommunications for real-time process control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zilberman, I.; Bigman, J.; Sela, I.
1996-05-01
Process operators desire real-time, accurate information to monitor and control product streams and to optimize unit operations. The challenge is how to cost-effectively install sophisticated analytical equipment in harsh environments such as process areas and maintain system reliability. Incorporating telecommunications technology with near-infrared (NIR) spectroscopy may be the bridge that helps operations achieve their online control goals. Coupling communications fiber optics with NIR analyzers enables the probe and sampling system to remain in the field while the crucial analytical equipment is remotely located in a general purpose area without specialized protection provisions. The case histories show how two refineries used NIR spectroscopy online to track octane levels for reformate streams.
Room-temperature current blockade in atomically defined single-cluster junctions
NASA Astrophysics Data System (ADS)
Lovat, Giacomo; Choi, Bonnie; Paley, Daniel W.; Steigerwald, Michael L.; Venkataraman, Latha; Roy, Xavier
2017-11-01
Fabricating nanoscopic devices capable of manipulating and processing single units of charge is an essential step towards creating functional devices where quantum effects dominate transport characteristics. The archetypal single-electron transistor comprises a small conducting or semiconducting island separated from two metallic reservoirs by insulating barriers. By enabling the transfer of a well-defined number of charge carriers between the island and the reservoirs, such a device may enable discrete single-electron operations. Here, we describe a single-molecule junction comprising a redox-active, atomically precise cobalt chalcogenide cluster wired between two nanoscopic electrodes. We observe current blockade at room temperature in thousands of single-cluster junctions. Below a threshold voltage, charge transfer across the junction is suppressed. The device is turned on when the temporary occupation of the core states by a transiting carrier is energetically enabled, resulting in a sequential tunnelling process and an increase in current by a factor of ∼600. We perform in situ and ex situ cyclic voltammetry as well as density functional theory calculations to unveil a two-step process mediated by an orbital localized on the core of the cluster in which charge carriers reside before tunnelling to the collector reservoir. As the bias window of the junction is opened wide enough to include one of the cluster frontier orbitals, the current blockade is lifted and charge carriers can tunnel sequentially across the junction.
Computer-supported weight-based drug infusion concentrations in the neonatal intensive care unit.
Giannone, Gay
2005-01-01
This article addresses the development of a computerized provider order entry (CPOE)-embedded solution for weight-based neonatal drug infusion, developed during a hospital-wide transition from a legacy CPOE system to a customized application of a neonatal CPOE product. The importance of accurate fluid management in the neonate is reviewed. The process of tailoring the system, which eventually resulted in the successful development of a computer application enabling weight-based medication infusion calculation for neonates within the CPOE information system, is explored. In addition, the article provides guidelines on how to customize a vendor solution for hospitals with a neonatal intensive care unit.
Korasa, Klemen; Vrečer, Franc
2018-01-01
Over the last two decades, regulatory agencies have demanded better understanding of pharmaceutical products and processes, prompting the implementation of new technological approaches such as process analytical technology (PAT). Process analysers are a key PAT tool that enables effective process monitoring, and thus improved process control, of medicinal product manufacturing. Process analysers applicable in pharmaceutical coating unit operations are comprehensively described in the present article. The review is focused on monitoring of solid oral dosage forms during film coating in the two most commonly used coating systems, i.e. pan and fluid bed coaters. A brief theoretical background and a critical overview of process analysers used for real-time or near real-time (in-, on-, at-line) monitoring of critical quality attributes of film coated dosage forms are presented. Besides well recognized spectroscopic methods (NIR and Raman spectroscopy), other techniques that have made a significant breakthrough in recent years are discussed (terahertz pulsed imaging (TPI), chord length distribution (CLD) analysis, and image analysis). The last part of the review is dedicated to novel techniques with high potential to become valuable PAT tools in the future (optical coherence tomography (OCT), acoustic emission (AE), microwave resonance (MR), and laser induced breakdown spectroscopy (LIBS)). Copyright © 2017 Elsevier B.V. All rights reserved.
Advancing microwave technology for dehydration processing of biologics.
Cellemme, Stephanie L; Van Vorst, Matthew; Paramore, Elisha; Elliott, Gloria D
2013-10-01
Our prior work has shown that microwave processing can be effective as a method for dehydrating cell-based suspensions in preparation for anhydrous storage, yielding homogenous samples with predictable and reproducible drying times. In the current work an optimized microwave-based drying process was developed that expands upon this previous proof-of-concept. Utilization of a commercial microwave (CEM SAM 255, Matthews, NC) enabled continuous drying at variable low power settings. A new turntable was manufactured from Ultra High Molecular Weight Polyethylene (UHMW-PE; Grainger, Lake Forest, IL) to provide for drying of up to 12 samples at a time. The new process enabled rapid and simultaneous drying of multiple samples in containment devices suitable for long-term storage and aseptic rehydration of the sample. To determine sample repeatability and consistency of drying within the microwave cavity, a concentration series of aqueous trehalose solutions was dried for specific intervals and water content assessed using Karl Fischer titration at the end of each processing period. Samples were dried on Whatman S-14 conjugate release filters (Whatman, Maidstone, UK), a glass fiber membrane used currently in clinical laboratories. The filters were cut to size for use in a 13 mm Swinnex(®) syringe filter holder (Millipore(™), Billerica, MA). Samples of 40 μL volume could be dehydrated to the equilibrium moisture content by continuous processing at 20% power with excellent sample-to-sample repeatability. The microwave-assisted procedure enabled high throughput, repeatable drying of multiple samples, in a manner easily adaptable for drying a wide array of biological samples. Depending on the tolerance for sample heating, the drying time can be altered by changing the power level of the microwave unit.
Tuning and synthesis of metallic nanostructures by mechanical compression
Fan, Hongyou; Li, Binsong
2015-11-17
The present invention provides a pressure-induced phase transformation process to engineer metal nanoparticle architectures and to fabricate new nanostructured materials. The reversible changes of the nanoparticle unit cell dimension under pressure allow precise control over interparticle separation in 2D or 3D nanoparticle assemblies, offering unique robustness for interrogation of both quantum and classical coupling interactions. Irreversible changes above a threshold pressure of about 8 GPa enable new nanostructures, such as nanorods, nanowires, or nanosheets.
2010-03-24
integrity violations, sexual harassment issues, hazing, theft, substance abuse, and those involving both physical and moral courage that may need to be...overwhelming adversity. The goal is to build upon the moral muscle memory of the Marine, to enable him/her to make the right decision for the right...such as those pertaining to Marine Corps policy and organizational values. Regarding policy, recruits learn the specific regulations about sexual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daniel Curtis; Charles Forsberg; Humberto Garcia
2015-05-01
We propose the development of Nuclear Renewable Oil Shale Systems (NROSS) in northern Europe, China, and the western United States to provide large supplies of flexible, dispatchable, very-low-carbon electricity and fossil fuel production with reduced CO2 emissions. NROSS are a class of large hybrid energy systems in which base-load nuclear reactors provide the primary energy used to produce shale oil from kerogen deposits and simultaneously provide flexible, dispatchable, very-low-carbon electricity to the grid. Kerogen is solid organic matter trapped in sedimentary shale, and large reserves of this resource, called oil shale, are found in northern Europe, China, and the western United States. NROSS couples electricity generation and transportation fuel production in a single operation, reduces lifecycle carbon emissions from the fuel produced, improves revenue for the nuclear plant, and enables a major shift toward a very-low-carbon electricity grid. NROSS will require a significant development effort in the United States, where kerogen resources have never been developed on a large scale. In Europe, however, nuclear plants have been used for process heat delivery (district heating), and kerogen use is familiar in certain countries. Europe, China, and the United States all have the opportunity to use large scale NROSS development to enable major growth in renewable generation and either substantially reduce or eliminate their dependence on foreign fossil fuel supplies, accelerating their transitions to cleaner, more efficient, and more reliable energy systems.
A social-level macro-governance mode for collaborative manufacturing processes
NASA Astrophysics Data System (ADS)
Gao, Ji; Lv, Hexin; Jin, Zhiyong; Xu, Ping
2017-08-01
This paper proposes a social-level macro-governance mode to replace the prevailing centralized control of CoM (Collaborative Manufacturing) processes, and bases this mode on three standalone yet complementary technologies: social-level CoM process norms, a CoM process supervision system, and rational agents acting as brokers for enterprises. It is the close coupling of these technologies that effectively removes the uncontrollability obstacle faced by cross-management-domain CoM processes. As a result, this mode enables CoM applications to be implemented by uniting the centralized control each CoM partner exerts over its own CoM activities, and therefore provides a new distributed CoM process control mode that pushes forward the convenient development and large-scale deployment of SME-oriented CoM applications.
NASA Astrophysics Data System (ADS)
Johnson, M.-V. V.; Norfleet, M. L.; Atwood, J. D.; Behrman, K. D.; Kiniry, J. R.; Arnold, J. G.; White, M. J.; Williams, J.
2015-07-01
The Conservation Effects Assessment Project (CEAP) was initiated to quantify the impacts of agricultural conservation practices at the watershed, regional, and national scales across the United States. Representative cropland acres in all major U.S. watersheds were surveyed in 2003-2006 as part of the seminal CEAP Cropland National Assessment. Two process-based models, the Agricultural Policy Environmental eXtender (APEX) and the Soil and Water Assessment Tool (SWAT), were applied to the survey data to provide a quantitative assessment of current conservation practice impacts, establish a benchmark against which future conservation trends and efforts could be measured, and identify outstanding conservation concerns. The flexibility of these models and the unprecedented amount of data on current conservation practices across the country enabled Cropland CEAP to meet its Congressional mandate of quantifying the value of current conservation practices. It also enabled scientifically grounded exploration of a variety of conservation scenarios, empowering CEAP not only to inform on past successes and additional needs, but also to provide a decision support tool to help guide future policy development and conservation practice decision making. The CEAP effort will repeat the national survey in 2015-2016, enabling CEAP to provide analyses of emergent conservation trends, outstanding needs, and potential costs and benefits of pursuing various treatment scenarios for all agricultural watersheds across the United States.
Morrison, Cecily; Jones, Matthew; Jones, Rachel; Vuylsteke, Alain
2013-04-10
Current policies encourage healthcare institutions to acquire clinical information systems (CIS) so that captured data can be used for secondary purposes, including clinical process improvement. Such policies do not account for the extra work required to repurpose data for uses other than direct clinical care, making their implementation problematic. This paper aims to analyze the strategies employed by clinical units to use data effectively for both direct clinical care and clinical process improvement. Ethnographic methods were employed. A total of 54 contextual interviews with health professionals spanning various disciplines and 18 hours of observation were carried out in 5 intensive care units in England using an advanced CIS. Case studies of how the extra work was achieved in each unit were derived from the data and then compared. We found that extra work is required to repurpose CIS data for clinical process improvement. Health professionals must enter data not required for clinical care and manipulation of this data into a machine-readable form is often necessary. Ambiguity over who should be responsible for this extra work hindered CIS data usage for clinical process improvement. We describe 11 strategies employed by units to accommodate this extra work, distributing it across roles. Seven of these motivated data entry by health professionals and four addressed the machine readability of data. Many of the strategies relied heavily on the skill and leadership of local clinical customizers. To realize the expected clinical process improvements by the use of CIS data, clinical leaders and policy makers need to recognize and support the redistribution of the extra work that is involved in data repurposing. Adequate time, funding, and appropriate motivation are needed to enable units to acquire and deliver the necessary skills in CIS customization.
Scully, John R
2015-01-01
Recent advances in characterization tools, computational capabilities, and theories have created opportunities for advancement in understanding of solid-fluid interfaces at the nanoscale in corroding metallic systems. The Faraday Discussion on Corrosion Chemistry in 2015 highlighted some of the current needs, gaps and opportunities in corrosion science. Themes were organized into several hierarchical categories that provide an organizational framework for corrosion. Opportunities to develop fundamental physical and chemical data which will enable further progress in thermodynamic and kinetic modelling of corrosion were discussed. These will enable new and better understanding of unit processes that govern corrosion at the nanoscale. Additional topics discussed included scales, films and oxides, fluid-surface and molecular-surface interactions, selected topics in corrosion science and engineering as well as corrosion control. Corrosion science and engineering topics included complex alloy dissolution, local corrosion, and modelling of specific corrosion processes that are made up of collections of temporally and spatially varying unit processes such as oxidation, ion transport, and competitive adsorption. Corrosion control and mitigation topics covered some new insights on coatings and inhibitors. Further advances in operando or in situ experimental characterization strategies at the nanoscale combined with computational modelling will enhance progress in the field, especially if coupling across length and time scales can be achieved incorporating the various phenomena encountered in corrosion. Readers are encouraged not only to use this ad hoc organizational scheme to guide their immersion into the current opportunities in corrosion chemistry, but also to find value in the information presented in their own ways.
Campbell, Denise J; Brown, Fiona G; Craig, Jonathan C; Gallagher, Martin P; Johnson, David W; Kirkland, Geoffrey S; Kumar, Subramanian K; Lim, Wai H; Ranganathan, Dwarakanathan; Saweirs, Walaa; Sud, Kamal; Toussaint, Nigel D; Walker, Rowan G; Williams, Lesley A; Yehia, Maha; Mudge, David W
2016-04-01
Existing Australasian and international guidelines outline antibiotic and antifungal measures to prevent the development of treatment-related infection in peritoneal dialysis (PD) patients. Practice patterns and rates of PD-related infection vary widely across renal units in Australia and New Zealand and are known to vary significantly from guideline recommendations, resulting in PD technique survival rates that are lower than those achieved in many other countries. The aim of this study was to determine if there is an association between current practice and PD-related infection outcomes and to identify the barriers and enablers to good clinical practice. This is a multicentre network study involving eight PD units in Australia and New Zealand, with a focus on adherence to guideline recommendations on antimicrobial prophylaxis in PD patients. Current practice was established by asking the PD unit heads to respond to a short survey about practice/protocols/policies and a 'process map' was constructed following a face-to-face interview with the primary PD nurse at each unit. The perceived barriers/enablers to adherence to the relevant guideline recommendations were obtained from the completion of 'cause and effect' diagrams by the nephrologist and PD nurse at each unit. Data on PD-related infections were obtained for the period 1 January 2011 to 31 December 2011. Perceived barriers that may result in reduced adherence to guideline recommendations included lack of knowledge, procedural lapses, lack of a centralized patient database, patients with non-English speaking background, professional concern about antibiotic resistance, medication cost and the inability of nephrologists and infectious diseases staff to reach consensus on unit protocols. The definitions of PD-related infections used by some units varied from those recommended by the International Society for Peritoneal Dialysis, particularly with exit-site infection (ESI). 
Wide variations were observed in the rates of ESI (0.06-0.53 episodes per patient-year) and peritonitis (0.31-0.86 episodes per patient-year). Despite the existence of strongly evidence-based guideline recommendations, adherence to these recommendations varied widely between PD units, which might contribute to the widely varying PD-related infection rates. Although individual patient characteristics may account for some of this variability, inconsistencies in the processes of care to prevent infection in PD patients also play a role. © The Author 2015. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
Designed cell consortia as fragrance-programmable analog-to-digital converters.
Müller, Marius; Ausländer, Simon; Spinnler, Andrea; Ausländer, David; Sikorski, Julian; Folcher, Marc; Fussenegger, Martin
2017-03-01
Synthetic biology advances the rational engineering of mammalian cells to achieve cell-based therapy goals. Synthetic gene networks have nearly reached the complexity of digital electronic circuits and enable single cells to perform programmable arithmetic calculations or to provide dynamic remote control of transgenes through electromagnetic waves. We designed a synthetic multilayered gaseous-fragrance-programmable analog-to-digital converter (ADC) allowing for remote control of digital gene expression with 2-bit AND-, OR- and NOR-gate logic in synchronized cell consortia. The ADC consists of multiple sampling-and-quantization modules sensing analog gaseous fragrance inputs; a gas-to-liquid transducer converting fragrance intensity into diffusible cell-to-cell signaling compounds; a digitization unit with a genetic amplifier circuit to improve the signal-to-noise ratio; and recombinase-based digital expression switches enabling 2-bit processing of logic gates. Synthetic ADCs that can remotely control cellular activities with digital precision may enable the development of novel biosensors and may provide bioelectronic interfaces synchronizing analog metabolic pathways with digital electronics.
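The signal path the authors describe for their biological ADC — analog inputs sampled and quantized, then processed with 2-bit gate logic — has a direct electronic analogy that can be sketched in conventional code. The thresholds and gate wiring below are hypothetical illustrations of the ADC idea only, not a representation of the paper's genetic circuits:

```python
def quantize(intensity, thresholds=(0.25, 0.5, 0.75)):
    """Quantize an analog input in [0, 1] into a 2-bit level (0-3)
    by counting how many thresholds it exceeds (threshold values
    are hypothetical, for illustration)."""
    return sum(intensity > t for t in thresholds)

def two_bit_gates(a, b):
    """Apply 2-bit AND-, OR- and NOR-gate logic to two digitized inputs,
    treating any nonzero level as logic-high."""
    bit_a, bit_b = a > 0, b > 0
    return {
        "AND": bit_a and bit_b,
        "OR": bit_a or bit_b,
        "NOR": not (bit_a or bit_b),
    }

# Two fragrance channels with different analog intensities
ch1, ch2 = quantize(0.7), quantize(0.2)
print(ch1, ch2)              # quantized 2-bit levels
print(two_bit_gates(ch1, ch2))
```

In the paper this chain is realized biologically (fragrance sensing, a gas-to-liquid transducer, an amplifier to improve signal-to-noise, and recombinase switches as the digital stage); the sketch only mirrors the quantize-then-gate data flow.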
Menting, Roel; Ng, Dennis K P; Röder, Beate; Ermilov, Eugeny A
2012-11-14
Porphyrins, phthalocyanines and subphthalocyanines are three attractive classes of chromophores with intriguing properties making them suitable for the design of artificial photosynthetic systems. The assembly of these components by a supramolecular approach is of particular interest as it provides a facile means to build multi-chromophoric arrays with various architectures and tuneable photophysical properties. In this paper, we show the formation of mixed host-guest supramolecular complexes that consist of a β-cyclodextrin-conjugated subphthalocyanine, a tetrasulfonated porphyrin and a series of silicon(IV) phthalocyanines substituted axially with two β-cyclodextrins via different spacers. We found that the three components form supramolecular complexes held by host-guest interactions in aqueous solution. Upon excitation of the subphthalocyanine part of the complex, the excitation energy is delivered to the phthalocyanine unit via excitation energy transfer and the porphyrin chromophore acts as an energy transfer bridge enabling this process. It was shown that photo-induced charge transfer also takes place. A sequential electron transfer process from the porphyrin unit to the phthalocyanine moiety and subsequently from the subphthalocyanine moiety to the porphyrin unit takes place, and the probability of this process is controlled by the linker between β-cyclodextrin and phthalocyanine. The lifetime of the charge-separated state was found to be 1.7 ns by transient absorption spectroscopy.
Geological Sequestration of CO2 by Hydrous Carbonate Formation with Reclaimed Slag
DOE Office of Scientific and Technical Information (OSTI.GOV)
Von L. Richards; Kent Peaslee; Jeffrey Smith
The concept of this project is to develop a process that improves the kinetics of the hydrous carbonate formation reaction, enabling steelmakers to directly remove CO2 from their furnace exhaust gas. It is proposed to bring the furnace exhaust stream containing CO2 in contact with reclaimed steelmaking slag in a reactor whose environment is maintained near the unit activity of water, resulting in the production of carbonates. The CO2 emissions from the plant would be reduced by the amount sequestered in the formation of carbonates. The main raw materials for the process are furnace exhaust gases and specially prepared slag.
SNMG: a social-level norm-based methodology for macro-governing service collaboration processes
NASA Astrophysics Data System (ADS)
Gao, Ji; Lv, Hexin; Jin, Zhiyong; Xu, Ping
2017-08-01
To adapt to the increasingly open character of collaborations between enterprises, this paper proposes a Social-level Norm-based methodology for Macro-Governing service collaboration processes, called SNMG, which regulates and controls the socially visible macro-behaviors of the individuals participating in collaborations. SNMG not only effectively removes the uncontrollability hindrance faced by open social activities, but also enables cross-management-domain collaborations to be implemented by uniting the centralized controls that social individuals exercise over their respective social activities. This paper thereby provides a new system construction mode to promote the development and large-scale deployment of service collaborations.
Robot Electronics Architecture
NASA Technical Reports Server (NTRS)
Garrett, Michael; Magnone, Lee; Aghazarian, Hrand; Baumgartner, Eric; Kennedy, Brett
2008-01-01
An electronics architecture has been developed to enable the rapid construction and testing of prototypes of robotic systems. This architecture is designed to be a research vehicle of great stability, reliability, and versatility. A system according to this architecture can easily be reconfigured (including expanded or contracted) to satisfy a variety of needs with respect to input, output, processing of data, sensing, actuation, and power. The architecture affords a variety of expandable input/output options that enable ready integration of instruments, actuators, sensors, and other devices as independent modular units. The separation of different electrical functions onto independent circuit boards facilitates the development of corresponding simple and modular software interfaces. As a result, both hardware and software can be made to expand or contract in modular fashion while expending a minimum of time and effort.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lail, Marty
The project aimed to advance RTI’s non-aqueous amine solvent technology by improving the solvent to reduce volatility, demonstrating long-term continuous operation at lab- (0.5 liters solvent) and bench-scale (~120 liters solvent), showing low reboiler heat duty measured during bench-scale testing, evaluating degradation products, building a rate-based process model, and evaluating the techno-economic performance of the process. The project team (RTI, SINTEF, Linde Engineering) and the technology performed well in each area of advancement. The modifications incorporated throughout the project enabled the attainment of target absorber and regenerator conditions for the process. Reboiler duties below 2,000 kJt/kg CO2 were observed in a bench-scale test unit operated at RTI.
Heintz, Søren; Börner, Tim; Ringborg, Rolf H; Rehn, Gustav; Grey, Carl; Nordblad, Mathias; Krühne, Ulrich; Gernaey, Krist V; Adlercreutz, Patrick; Woodley, John M
2017-03-01
An experimental platform based on scaled-down unit operations combined in a plug-and-play manner enables easy and highly flexible testing of advanced biocatalytic process options such as in situ product removal (ISPR) process strategies. In such a platform, it is possible to compartmentalize different process steps while operating it as a combined system, giving the possibility to test and characterize the performance of novel process concepts and biocatalysts with minimal influence of inhibitory products. Here the capabilities of performing process development by applying scaled-down unit operations are highlighted through a case study investigating the asymmetric synthesis of 1-methyl-3-phenylpropylamine (MPPA) using ω-transaminase, an enzyme in the sub-family of amino transferases (ATAs). An on-line HPLC system was applied to avoid manual sample handling and to semi-automatically characterize ω-transaminases in a scaled-down packed-bed reactor (PBR) module, showing MPPA as a strong inhibitor. To overcome the inhibition, a two-step liquid-liquid extraction (LLE) ISPR concept was tested using scaled-down unit operations combined in a plug-and-play manner. Through the tested ISPR concept, it was possible to continuously feed the main substrate benzylacetone (BA) and extract the main product MPPA throughout the reaction, thereby overcoming the challenges of low substrate solubility and product inhibition. The tested ISPR concept achieved a product concentration of 26.5 g MPPA·L⁻¹, a purity of up to 70% (g MPPA per g total), and a recovery of about 80% (mol/mol) of MPPA in 20 h, with the possibility to increase the concentration, purity, and recovery further. Biotechnol. Bioeng. 2017;114: 600-609. © 2016 Wiley Periodicals, Inc.
Fracturing And Liquid CONvection
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-02-29
FALCON has been developed to enable simulation of the tightly coupled fluid-rock behavior in hydrothermal and engineered geothermal system (EGS) reservoirs, targeting the dynamics of fracture stimulation, fluid flow, rock deformation, and heat transport in a single integrated code, with the ultimate goal of providing a tool that can be used to test the viability of EGS in the United States and worldwide. Reliable reservoir performance predictions of EGS systems require accurate and robust modeling of the coupled thermal-hydrological-mechanical processes.
The Role of the Mexican Urban Household in Decisions about Migration to the United States.
1980-01-01
Table 3 shows. And if it is true, the preoccupation of U.S. American authorities with the numbers of migrants who plan to and do take effective...study is to enable INDECO to rationalize the process of urban planning in the secondary cities of Mexico, where the government hopes to concentrate a...Survey-Evaluation of Urban Dwelling Environments (Baldwin 1974), developed by the School of Architecture and Planning of the Massachusetts
Panthere V2: Multipurpose Simulation Software for 3D Dose Rate Calculations
NASA Astrophysics Data System (ADS)
Penessot, Gaël; Bavoil, Éléonore; Wertz, Laurent; Malouch, Fadhel; Visonneau, Thierry; Dubost, Julien
2017-09-01
PANTHERE is multipurpose radiation protection software developed by EDF to calculate gamma dose rates in complex 3D environments. PANTHERE plays a key role in the EDF ALARA process, enabling dose rates to be predicted and operations in high-radiation environments to be organized and optimized. PANTHERE is also used for nuclear waste characterization, transport of nuclear materials, etc. It is used in most of the EDF engineering units and by their design service providers and industrial partners.
Barriers and Enablers to Enacting Child and Youth Related Injury Prevention Legislation in Canada
Rothman, Linda; Pike, Ian; Belton, Kathy; Olsen, Lise; Fuselli, Pam; Macpherson, Alison
2016-01-01
Injury prevention policy is crucial for the safety of Canada’s children; however legislation is not adopted uniformly across the country. This study aimed to identify key barriers and enablers to enacting injury prevention legislation. Purposive snowball sampling identified individuals involved in injury prevention throughout Canada. An online survey asked respondents to identify policies that were relevant to them, and whether legislation existed in their province. Respondents rated the importance of barriers or enablers using a 5-point Likert-type scale and included open-ended comments. Fifty-seven respondents identified the most common injury topics: bicycle helmets (44, 77%), cell phone-distracted driving (36, 63%), booster seats (28, 49%), ski helmets (24, 42%), and graduated driver’s licensing (21, 37%). The top enablers were research/surveillance, managerial/political support and professional group consultation, with much variability between injury topics. Open-ended comments emphasized the importance of a united opinion as an enabler; barriers included the cost of protective equipment and inadequate enforcement of legislation. The results highlighted the importance of strategies that include research, management and community collaboration, and suggested that injury prevention topics be addressed individually, as information may be lost if topics are considered together. Findings can inform the process of turning injury prevention evidence into action. PMID:27399745
Microelectromechanical Systems
NASA Technical Reports Server (NTRS)
Gabriel, Kaigham J.
1995-01-01
Micro-electromechanical systems (MEMS) is an enabling technology that merges computation and communication with sensing and actuation to change the way people and machines interact with the physical world. MEMS is a manufacturing technology that will impact widespread applications including: miniature inertial measurement units for competent munitions and personal navigation; distributed unattended sensors; mass data storage devices; miniature analytical instruments; embedded pressure sensors; non-invasive biomedical sensors; fiber-optic components and networks; distributed aerodynamic control; and on-demand structural strength. The long term goal of ARPA's MEMS program is to merge information processing with sensing and actuation to realize new systems and strategies for both perceiving and controlling systems, processes, and the environment. The MEMS program has three major thrusts: advanced devices and processes, system design, and infrastructure.
The United Nations programme on space applications: priority thematic areas
NASA Astrophysics Data System (ADS)
Haubold, H.
The Third United Nations Conference on the Exploration and Peaceful Uses of Outer Space (UNISPACE III) was held in 1999 with efforts to identify worldwide benefits of developing space science and technology, particularly in the developing nations. One of the main vehicles to implement recommendations of UNISPACE III is the United Nations Programme on Space Applications of the Office for Outer Space Affairs at UN Headquarters in Vienna. Following a process of prioritization by Member States, the Programme focuses its activities on (i) knowledge-based themes such as space law and basic space science, (ii) application-based themes such as disaster management, natural resources management, environmental monitoring, and tele-health, and (iii) enabling technologies such as remote sensing satellites, communications satellites, global navigation satellite systems, and small satellites. Current activities of the Programme will be reviewed. Further information is available at http://www.oosa.unvienna.org/sapidx.html
Next Generation X-Ray Optics: High-Resolution, Light-Weight, and Low-Cost
NASA Technical Reports Server (NTRS)
Zhang, William W.
2012-01-01
X-ray telescopes are essential to the future of x-ray astronomy. In this talk I will describe a comprehensive program to advance the technology for x-ray telescopes well beyond the state of the art represented by the three currently operating missions: Chandra, XMM-Newton, and Suzaku. This program will address the three key issues in making an x-ray telescope: (1) angular resolution, (2) effective area per unit mass, and (3) cost per unit effective area. The objectives of this technology program are (1) in the near term, to enable Explorer-class x-ray missions and an IXO-type mission, and (2) in the long term, to enable a flagship x-ray mission with sub-arcsecond angular resolution and multi-square-meter effective area, at an affordable cost. We pursue two approaches concurrently, emphasizing the first approach in the near term (2-5 years) and the second in the long term (4-10 years). The first approach is precision slumping of borosilicate glass sheets. By design and choice at the outset, this technique makes lightweight and low-cost mirrors. The development program will continue to improve angular resolution, to enable the production of 5-arcsecond x-ray telescopes, to support Explorer-class missions and one or more missions to supersede the original IXO mission. The second approach is precision polishing and light-weighting of single-crystal silicon mirrors. This approach benefits from two recent commercial developments: (1) the inexpensive and abundant availability of large blocks of monocrystalline silicon, and (2) revolutionary advances in deterministic, precision polishing of mirrors. By design and choice at the outset, this technique is capable of producing lightweight mirrors with sub-arcsecond angular resolution. The development program will increase the efficiency and reduce the cost of the polishing and the light-weighting processes, to enable the production of lightweight sub-arcsecond x-ray telescopes. 
Concurrent with the fabrication of lightweight mirror segments is the continued development and perfection of alignment and integration techniques, for incorporating individual mirror segments into a precision mirror assembly. Recently, we have been developing a technique called edge-bonding, which has achieved an accuracy sufficient to enable 10-arcsecond x-ray telescopes. Currently, we are investigating and improving the long-term alignment stability of so-bonded mirrors. Next, we shall refine this process to enable 5-arcsecond x-ray telescopes. This technology development program includes all elements to demonstrate progress toward TRL-6: metrology; x-ray performance tests; coupled structural, thermal, and optical performance analysis, and environmental testing.
Next Generation X-Ray Optics: High-Resolution, Light-Weight, and Low-Cost
NASA Technical Reports Server (NTRS)
Zhang, William W.
2011-01-01
X-ray telescopes are essential to the future of x-ray astronomy. This paper describes a comprehensive program to advance the technology for x-ray telescopes well beyond the state of the art represented by the three currently operating missions: Chandra, XMM-Newton, and Suzaku. This program will address the three key issues in making an x-ray telescope: (1) angular resolution, (2) effective area per unit mass, and (3) cost per unit effective area. The objectives of this technology program are (1) in the near term, to enable Explorer-class x-ray missions and an IXO-type mission, and (2) in the long term, to enable a flagship x-ray mission with sub-arcsecond angular resolution and multi-square-meter effective area, at an affordable cost. We pursue two approaches concurrently, emphasizing the first approach in the near term (2-5 years) and the second in the long term (4-10 years). The first approach is precision slumping of borosilicate glass sheets. By design and choice at the outset, this technique makes lightweight and low-cost mirrors. The development program will continue to improve angular resolution, to enable the production of 5-arcsecond x-ray telescopes, to support Explorer-class missions and one or more missions to supersede the original IXO mission. The second approach is precision polishing and light-weighting of single-crystal silicon mirrors. This approach benefits from two recent commercial developments: (1) the inexpensive and abundant availability of large blocks of monocrystalline silicon, and (2) revolutionary advances in deterministic, precision polishing of mirrors. By design and choice at the outset, this technique is capable of producing lightweight mirrors with sub-arcsecond angular resolution. The development program will increase the efficiency and reduce the cost of the polishing and the light-weighting processes, to enable the production of lightweight sub-arcsecond x-ray telescopes. 
Concurrent with the fabrication of lightweight mirror segments is the continued development and perfection of alignment and integration techniques, for incorporating individual mirror segments into a precision mirror assembly. Recently, we have been developing a technique called edge-bonding, which has achieved an accuracy sufficient to enable 10-arcsecond x-ray telescopes. Currently, we are investigating and improving the long-term alignment stability of so-bonded mirrors. Next, we shall refine this process to enable 5-arcsecond x-ray telescopes. This technology development program includes all elements to demonstrate progress toward TRL-6: metrology; x-ray performance tests; coupled structural, thermal, and optical performance analysis, and environmental testing.
Configuration of electro-optic fire source detection system
NASA Astrophysics Data System (ADS)
Fabian, Ram Z.; Steiner, Zeev; Hofman, Nir
2007-04-01
The recent fighting activities in various parts of the world have highlighted the need for accurate fire source detection on one hand and a fast "sensor-to-shooter cycle" on the other. Both needs can be met by the SPOTLITE system, which dramatically enhances the capability to rapidly engage a hostile fire source with a minimum of casualties to friendly forces and innocent bystanders. The modular system design enables each customer's specific requirements to be met and provides excellent future growth and upgrade potential. The design and build of a fire source detection system is governed by sets of requirements issued by the operators. These can be translated into the following design criteria: I) Long-range, fast, and accurate fire source detection capability. II) Detection and classification capability for different threats. III) Threat investigation capability. IV) Fire source data distribution capability (location, direction, video image, voice). V) Man-portability. In order to meet these design criteria, an optimized concept was presented and exercised for the SPOTLITE system. Three major modular components were defined: I) Electro-Optical Unit, including FLIR camera, CCD camera, laser range finder, and marker. II) Electronic Unit, including the system computer and electronics. III) Controller Station Unit, including the HMI of the system. This article discusses the definition and optimization processes for the system's components, and also shows how SPOTLITE's designers successfully managed to introduce excellent solutions for other system parameters.
Lopez-Iturri, Peio; Aguirre, Erik; Trigo, Jesús Daniel; Astrain, José Javier; Azpilicueta, Leyre; Serrano, Luis; Villadangos, Jesús; Falcone, Francisco
2018-01-29
In the context of hospital management and operation, Intensive Care Units (ICU) are among the most challenging in terms of time responsiveness and criticality, in which adequate resource management and signal processing play a key role in overall system performance. In this work, a context-aware Intensive Care Unit is implemented and analyzed to provide scalable signal acquisition capabilities, as well as to provide tracking and access control. Wireless channel analysis is performed by means of hybrid optimized 3D Ray Launching deterministic simulation to assess potential interference impact as well as to provide required coverage/capacity thresholds for the employed transceivers. Wireless system operation within the ICU scenario, considering conventional transceiver operation, is feasible in terms of quality of service for the complete scenario. Extensive measurements of overall interference levels have also been carried out, enabling subsequent adequate coverage/capacity estimations for a set of Zigbee-based nodes. Real system operation has been tested with ad-hoc designed Zigbee wireless motes, employing lightweight communication protocols to minimize energy and bandwidth usage. An ICU information gathering application and software architecture for Visitor Access Control has been implemented, providing monitoring of the boxes' external doors and the identification of visitors via an RFID system. The results enable a solution providing ICU access control and tracking capabilities not previously exploited, providing a step forward in the implementation of a Smart Health framework.
NASA Astrophysics Data System (ADS)
Merwade, V.; Ruddell, B. L.; Fox, S.; Iverson, E. A. R.
2014-12-01
With access to emerging datasets and computational tools, there is a need to bring these capabilities into hydrology classrooms. However, developing curriculum modules that use data and models to augment classroom teaching is hindered by a steep technology learning curve, rapid technology turnover, and the lack of an organized community cyberinfrastructure (CI) for the dissemination, publication, and sharing of the latest tools and curriculum material for hydrology and geoscience education. The objective of this project is to overcome some of these limitations by developing a cyber-enabled collaborative environment for publishing, sharing, and adopting data- and modeling-driven curriculum modules in hydrology and geoscience classrooms. The CI is based on Carleton College's Science Education Resource Center (SERC) Content Management System. Building on its existing community authoring capabilities, the system is being extended to allow assembly of new teaching activities by drawing on a collection of interchangeable building blocks, each of which represents a step in the modeling process. Currently the system hosts more than 30 modules or steps, which can be combined to create multiple learning units. Two specific units, Unit Hydrograph and Rational Method, have been used in undergraduate hydrology classrooms at Purdue University and Arizona State University. The structure of the CI and the lessons learned from its implementation, including preliminary results from student assessments of learning, will be presented.
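The Rational Method unit mentioned above centers on the textbook peak-discharge formula Q = C·i·A. A minimal sketch of the kind of computation such a learning module exercises (the catchment values below are made up for illustration):

```python
def rational_method_peak_flow(c, i, a):
    """Peak discharge by the Rational Method, Q = C * i * A.
    c: dimensionless runoff coefficient (0-1),
    i: rainfall intensity in inches/hour,
    a: drainage area in acres.
    With these units Q comes out in cubic feet per second to within
    about 1% (1 acre-inch/hour ~ 1.008 cfs)."""
    return c * i * a

# Example: mostly paved catchment (C ~ 0.9), 2 in/hr design storm, 10 acres
q = rational_method_peak_flow(0.9, 2.0, 10.0)
print(q)  # → 18.0 (approximately 18 cfs)
```

A module built from interchangeable steps might wrap parameter selection, the computation, and result interpretation as separate building blocks, as described for the CI above.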
A Decentralized Framework for Multi-Agent Robotic Systems
2018-01-01
Over the past few years, decentralization of multi-agent robotic systems has become an important research area. These systems do not depend on a central control unit, which enables the control and assignment of distributed, asynchronous and robust tasks. However, in some cases, the network communication process between robotic agents is overlooked, and this creates a dependency for each agent to maintain a permanent link with nearby units to be able to fulfill its goals. This article describes a communication framework, where each agent in the system can leave the network or accept new connections, sending its information based on the transfer history of all nodes in the network. To this end, each agent needs to comply with four processes to participate in the system, plus a fifth process for data transfer to the nearest nodes that is based on Received Signal Strength Indicator (RSSI) and data history. To validate this framework, we use differential robotic agents and a monitoring agent to generate a topological map of an environment with the presence of obstacles. PMID:29389849
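The transfer rule described above, sending data to nearby nodes based on Received Signal Strength Indicator (RSSI) and transfer history, can be sketched as a scoring function. The weights, normalization, and field names below are assumptions for illustration, not the article's actual protocol.

```python
def choose_next_hop(neighbors, history, w_rssi=0.7, w_hist=0.3):
    """neighbors: {node_id: rssi_dbm}; history: {node_id: successful_transfers}.
    Normalizes both terms and returns the best-scoring reachable node."""
    if not neighbors:
        return None  # agent has left the network or has no links
    max_hist = max(history.get(n, 0) for n in neighbors) or 1
    def score(n):
        # RSSI in dBm is negative; map roughly [-100, -30] dBm -> [0, 1]
        rssi_norm = (neighbors[n] + 100.0) / 70.0
        hist_norm = history.get(n, 0) / max_hist
        return w_rssi * rssi_norm + w_hist * hist_norm
    return max(neighbors, key=score)

links = {"r2": -55.0, "r3": -80.0}   # r2 is closer (stronger signal)
past = {"r2": 1, "r3": 9}            # r3 has a better transfer record
print(choose_next_hop(links, past))
```

With these weights the history term outweighs the weaker signal, so the historically reliable node is chosen; tuning `w_rssi`/`w_hist` shifts that trade-off.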
Katouda, Michio; Naruse, Akira; Hirano, Yukihiko; Nakajima, Takahito
2016-11-15
A new parallel algorithm, and its implementation, for the RI-MP2 energy calculation utilizing petaflop-class many-core supercomputers is presented. Two improvements over the previous algorithm (J. Chem. Theory Comput. 2013, 9, 5373) have been made: (1) a dual-level hierarchical parallelization scheme that enables the use of more than 10,000 Message Passing Interface (MPI) processes and (2) a new data communication scheme that reduces network communication overhead. A multi-node, multi-GPU implementation of the present algorithm is presented for calculations on central processing unit (CPU)/graphics processing unit (GPU) hybrid supercomputers. Benchmark results of the new algorithm and its implementation using the K computer (CPU cluster) and TSUBAME 2.5 (CPU/GPU hybrid system) demonstrate high efficiency. A peak performance of 3.1 PFLOPS is attained using 80,199 nodes of the K computer; the peak performance of the multi-node, multi-GPU implementation is 514 TFLOPS using 1349 nodes and 4047 GPUs of TSUBAME 2.5. © 2016 Wiley Periodicals, Inc.
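The dual-level hierarchical scheme above amounts to mapping a flat rank space onto inter-node groups and intra-node workers, then block-distributing work units over the groups. This sketch shows that index arithmetic only; the function names and the distribution of orbital pairs are illustrative assumptions, not the paper's code.

```python
def hierarchy(rank, ranks_per_node):
    """Map a flat MPI-style rank to (group, local) coordinates."""
    return rank // ranks_per_node, rank % ranks_per_node

def my_pairs(n_pairs, group, n_groups):
    """Contiguous block of work-unit indices owned by this group,
    with the remainder spread over the first `extra` groups."""
    base, extra = divmod(n_pairs, n_groups)
    start = group * base + min(group, extra)
    return list(range(start, start + base + (1 if group < extra else 0)))

# 8 ranks, 4 per node -> 2 groups; 10 work units split 5/5
g, l = hierarchy(6, 4)
print(g, l, my_pairs(10, g, 2))
```

The point of the two levels is that inter-group communication (expensive, over the network) and intra-group communication (cheap, within a node) can use different schemes, which is how the communication overhead reduction is achieved.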
NASA Astrophysics Data System (ADS)
Vitásek, Stanislav; Matějka, Petr
2017-09-01
The article deals with problematic parts of the automated processing of quantity takeoff (QTO) from data generated in a BIM model. It focuses on models of road constructions and uses volumes and dimensions of excavation work to create an estimate of construction costs. The article uses a case study and explorative methods to discuss the possibilities and problems of transferring data from a model to a price system of construction production when such a transfer is used for price estimates of construction works. Current QTOs and price tenders are made with 2D documents; this process is becoming obsolete because more modern tools can be used. The BIM phenomenon enables partial automation in processing the volumes and dimensions of construction units and matching the data to units in a given price scheme. The price of construction can therefore be estimated and structured without lengthy and often imprecise manual calculations. The use of BIM for QTO is highly dependent on local market budgeting systems, so a proper push/pull strategy is required, along with proper requirements specification, a compatible pricing database and suitable software.
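The matching step described above, taking model quantities and looking them up in a unit-price scheme, is conceptually simple; the practical difficulty is what to do with items that have no match. A minimal sketch, with invented placeholder item codes and prices rather than the price system the article references:

```python
PRICE_DB = {  # price code -> (unit, price per unit); hypothetical values
    "EXC-01": ("m3", 12.50),   # excavation
    "FILL-02": ("m3", 8.00),   # backfill
}

def estimate(takeoff):
    """takeoff: list of (price_code, quantity) from the BIM QTO.
    Returns (total, priced line items, unmatched codes); unmatched
    codes are reported rather than silently priced."""
    total, lines, unmatched = 0.0, [], []
    for code, qty in takeoff:
        if code not in PRICE_DB:
            unmatched.append(code)
            continue
        unit, rate = PRICE_DB[code]
        cost = qty * rate
        lines.append((code, qty, unit, cost))
        total += cost
    return total, lines, unmatched

total, lines, missing = estimate([("EXC-01", 120.0), ("ROAD-99", 1.0)])
print(total, missing)
```

Surfacing the unmatched list is the part that matters for the data-transfer problems the article discusses: a gap between model classification and the pricing database should halt the estimate, not vanish into it.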
Kilogram-scale prexasertib monolactate monohydrate synthesis under continuous-flow CGMP conditions.
Cole, Kevin P; Groh, Jennifer McClary; Johnson, Martin D; Burcham, Christopher L; Campbell, Bradley M; Diseroad, William D; Heller, Michael R; Howell, John R; Kallman, Neil J; Koenig, Thomas M; May, Scott A; Miller, Richard D; Mitchell, David; Myers, David P; Myers, Steven S; Phillips, Joseph L; Polster, Christopher S; White, Timothy D; Cashman, Jim; Hurley, Declan; Moylan, Robert; Sheehan, Paul; Spencer, Richard D; Desmond, Kenneth; Desmond, Paul; Gowran, Olivia
2017-06-16
Advances in drug potency and tailored therapeutics are promoting pharmaceutical manufacturing to transition from a traditional batch paradigm to more flexible continuous processing. Here we report the development of a multistep continuous-flow CGMP (current good manufacturing practices) process that produced 24 kilograms of prexasertib monolactate monohydrate suitable for use in human clinical trials. Eight continuous unit operations were conducted to produce the target at roughly 3 kilograms per day using small continuous reactors, extractors, evaporators, crystallizers, and filters in laboratory fume hoods. Success was enabled by advances in chemistry, engineering, analytical science, process modeling, and equipment design. Substantial technical and business drivers were identified, which merited the continuous process. The continuous process afforded improved performance and safety relative to batch processes and also improved containment of a highly potent compound. Copyright © 2017, American Association for the Advancement of Science.
A Polymer Visualization System with Accurate Heating and Cooling Control and High-Speed Imaging
Wong, Anson; Guo, Yanting; Park, Chul B.; Zhou, Nan Q.
2015-01-01
A visualization system to observe crystal and bubble formation in polymers under high temperature and pressure has been developed. Using this system, polymer can be subjected to a programmable thermal treatment to simulate the process in high pressure differential scanning calorimetry (HPDSC). With a high-temperature/high-pressure view-cell unit, this system enables in situ observation of crystal formation in semi-crystalline polymers to complement thermal analyses with HPDSC. The high-speed recording capability of the camera not only allows detailed recording of crystal formation, it also enables in situ capture of plastic foaming processes with a high temporal resolution. To demonstrate the system’s capability, crystal formation and foaming processes of polypropylene/carbon dioxide systems were examined. It was observed that crystals nucleated and grew into spherulites, and they grew at faster rates as temperature decreased. This observation agrees with the crystallinity measurement obtained with the HPDSC. Cell nucleation first occurred at crystals’ boundaries due to CO2 exclusion from crystal growth fronts. Subsequently, cells were nucleated around the existing ones due to tensile stresses generated in the constrained amorphous regions between networks of crystals. PMID:25915031
On-chip skin color detection using a triple-well CMOS process
NASA Astrophysics Data System (ADS)
Boussaid, Farid; Chai, Douglas; Bouzerdoum, Abdesselam
2004-03-01
In this paper, a current-mode VLSI architecture enabling skin detection on read-out, without the need for any on-chip memory elements, is proposed. An important feature of the proposed architecture is that it removes the need for demosaicing. Color separation is achieved using the strong wavelength dependence of the absorption coefficient in silicon: this dependence causes a very shallow absorption of blue light and enables red light to penetrate deeply into silicon. A triple-well process, allowing a P-well to be placed inside an N-well, is chosen to fabricate three vertically integrated photodiodes acting as the RGB color detector for each pixel. Pixels of an input RGB image are classified as skin or non-skin using a statistical skin color model chosen to offer an acceptable trade-off between skin detection performance and implementation complexity. A single processing unit is used to classify all pixels of the input RGB image, which results in reduced mismatch and an increased pixel fill factor. Furthermore, the proposed current-mode architecture is programmable, allowing external control of all classifier parameters to compensate for mismatch and changing lighting conditions.
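A common low-complexity skin-color rule of the kind such a trade-off classifier might use is fixed bounds in the CbCr chrominance plane. In this sketch the RGB-to-YCbCr weights follow the ITU-R BT.601 convention and the Cb/Cr bounds are widely used illustrative values, not the authors' actual trained parameters.

```python
def rgb_to_cbcr(r, g, b):
    """Chrominance of an 8-bit RGB pixel (BT.601 full-range weights)."""
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return cb, cr

def is_skin(r, g, b, cb_lo=77, cb_hi=127, cr_lo=133, cr_hi=173):
    """Rectangular skin gate in the CbCr plane; bounds are assumptions."""
    cb, cr = rgb_to_cbcr(r, g, b)
    return cb_lo <= cb <= cb_hi and cr_lo <= cr <= cr_hi

print(is_skin(200, 140, 110))  # warm flesh tone
print(is_skin(30, 90, 200))    # saturated blue
```

The four bounds are exactly the kind of externally programmable classifier parameters the abstract mentions: shifting them compensates for sensor mismatch or a change in illumination without touching the per-pixel datapath.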
Electromagnetic microforging apparatus for low-cost fabrication of molds for microlens arrays
NASA Astrophysics Data System (ADS)
Pribošek, Jaka; Diaci, Janez
2015-06-01
This study addresses the problem of low-cost microlens fabrication and outlines the development of a novel microforging apparatus for microlens mold fabrication. The apparatus consists of an electromagnetic impact tool which strikes a piston with a hardened steel ball into a workpiece. The impact creates a spherical indentation which serves as a lens cavity. The microforging apparatus is controlled by a microprocessor control unit communicating with a personal computer and enables on-the-fly variation of electromagnetic excitation to control the microforging process. We studied the effects of process parameters on the diameter of the fabricated lens cavities inspected by a custom automatic image processing algorithm. Different microforging regimes are analyzed and discussed. The surface quality of fabricated cavities has been inspected by confocal microscopy and the influence of fill factor on sphericity error has been studied. The proposed microforging method enables the fabrication of molds with 100% fill factor, surface roughness as low as Ra 0.15 µm and sphericity error lower than 0.5 µm. The fabricated microlens arrays exhibit nearly diffraction-limited performance, offering a wide range of possible applications. We believe this study provides access to microoptical technology for smaller optical and computer vision laboratories.
Solar Sails: Sneaking up on Interstellar Travel
NASA Astrophysics Data System (ADS)
Johnson, L.
Throughout the world, government agencies, universities and private companies are developing solar sail propulsion systems to more efficiently explore the solar system and to enable science and exploration missions that are simply impossible to accomplish by any other means. Solar sail technology is rapidly advancing to support these demonstrations and missions and, in the process, is incrementally advancing one of the few approaches allowed by physics that may one day take humanity to the stars. Continuous solar pressure provides solar sails with propellantless thrust, potentially enabling them to propel a spacecraft to tremendous speeds, theoretically much faster than any present-day propulsion system. The next generation of sails will enable us to take our first real steps beyond the edge of the solar system, sending spacecraft out to distances of 1000 Astronomical Units or more. In the farther term, the descendants of these first- and second-generation sails will augment their thrust by using high power lasers and enable travel to nearby stellar systems with flight times of less than 500 years, a tremendous improvement over what is possible with conventional chemical rockets. By fielding these first solar sail systems, we are sneaking up on a capability to reach the stars.
ERIC Educational Resources Information Center
Sonu, Debbie; Benson, Jeremy
2016-01-01
This paper argues that normative conceptions of the child, as a natural quasi-human being in need of guidance, enable current school reforms in the United States to directly link the child to neoliberal aims and objectives. In using Foucault's concept of governmentality and disciplinary power, we first present how the child is constructed as a…
Prompting children to reason proportionally: Processing discrete units as continuous amounts.
Boyer, Ty W; Levine, Susan C
2015-05-01
Recent studies reveal that children can solve proportional reasoning problems presented with continuous amounts that enable intuitive strategies by around 6 years of age but have difficulties with problems presented with discrete units that tend to elicit explicit count-and-match strategies until at least 10 years of age. The current study tests whether performance on discrete unit problems might be improved by prompting intuitive reasoning with continuous-format problems. Participants were kindergarten, second-grade, and fourth-grade students (N = 194) assigned to either an experimental condition, where they were given continuous amount proportion problems before discrete unit proportion problems, or a control condition, where they were given all discrete unit problems. Results of a three-way mixed-model analysis of variance examining school grade, experimental condition, and block of trials indicated that fourth-grade students in the experimental condition outperformed those in the control condition on discrete unit problems in the second half of the experiment, but kindergarten and second-grade students did not differ by condition. This suggests that older children can be prompted to use intuitive strategies to reason proportionally. © 2015 APA, all rights reserved.
Kovarik, Ales; Dadejova, Martina; Lim, Yoong K.; Chase, Mark W.; Clarkson, James J.; Knapp, Sandra; Leitch, Andrew R.
2008-01-01
Background The evolution and biology of rDNA have interested biologists for many years, in part, because of two intriguing processes: (1) nucleolar dominance and (2) sequence homogenization. We review patterns of evolution in rDNA in the angiosperm genus Nicotiana to determine consequences of allopolyploidy on these processes. Scope Allopolyploid species of Nicotiana are ideal for studying rDNA evolution because phylogenetic reconstruction of DNA sequences has revealed patterns of species divergence and their parents. From these studies we also know that polyploids formed over widely different timeframes (thousands to millions of years), enabling comparative and temporal studies of rDNA structure, activity and chromosomal distribution. In addition studies on synthetic polyploids enable the consequences of de novo polyploidy on rDNA activity to be determined. Conclusions We propose that rDNA epigenetic expression patterns established even in F1 hybrids have a material influence on the likely patterns of divergence of rDNA. It is the active rDNA units that are vulnerable to homogenization, which probably acts to reduce mutational load across the active array. Those rDNA units that are epigenetically silenced may be less vulnerable to sequence homogenization. Selection cannot act on these silenced genes, and they are likely to accumulate mutations and eventually be eliminated from the genome. It is likely that whole silenced arrays will be deleted in polyploids of 1 million years of age and older. PMID:18310159
Electromagnetic reprogrammable coding-metasurface holograms.
Li, Lianlin; Jun Cui, Tie; Ji, Wei; Liu, Shuo; Ding, Jun; Wan, Xiang; Bo Li, Yun; Jiang, Menghua; Qiu, Cheng-Wei; Zhang, Shuang
2017-08-04
Metasurfaces have enabled a plethora of emerging functions within an ultrathin dimension, paving the way towards flat and highly integrated photonic devices. Despite the rapid progress in this area, the simultaneous realization of reconfigurability, high efficiency, and full control over the phase and amplitude of scattered light remains a great challenge. Here, we tackle this challenge by introducing the concept of a reprogrammable hologram based on 1-bit coding metasurfaces. The state of each unit cell of the coding metasurface can be switched between '1' and '0' by electrically controlling the loaded diodes. Our proof-of-concept experiments show that multiple desired holographic images can be realized in real time with only a single coding metasurface. The proposed reprogrammable hologram may be a key to enabling future intelligent devices with reconfigurable and programmable functionalities, which may lead to advances in a variety of applications such as microscopy, display, security, data storage, and information processing.
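The 1-bit coding step above means each unit cell can realize only two states, '0' (0 rad) and '1' (π rad), so a continuous target phase profile must be quantized to the nearer state. This sketch shows the coding step only; the hologram synthesis that produces the target phases (e.g. an iterative retrieval algorithm) is outside its scope.

```python
import math

def one_bit_code(phase_map):
    """phase_map: 2D list of target phases in radians -> 2D list of '0'/'1'.
    A cell is '1' when pi is closer to the target phase (mod 2*pi) than 0."""
    def code(phi):
        phi = phi % (2 * math.pi)
        d0 = min(phi, 2 * math.pi - phi)   # angular distance to 0
        d1 = abs(phi - math.pi)            # angular distance to pi
        return "1" if d1 < d0 else "0"
    return [[code(p) for p in row] for row in phase_map]

target = [[0.1, 3.0], [4.0, 6.0]]  # hypothetical target phases (rad)
print(one_bit_code(target))
```

The resulting bit matrix is exactly what the diode control electronics would write to the metasurface; reprogramming the hologram is then just writing a different matrix.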
Analysis of impact of general-purpose graphics processor units in supersonic flow modeling
NASA Astrophysics Data System (ADS)
Emelyanov, V. N.; Karpenko, A. G.; Kozelkov, A. S.; Teterina, I. V.; Volkov, K. N.; Yalozo, A. V.
2017-06-01
Computational methods are widely used in the prediction of complex flowfields associated with off-normal situations in aerospace engineering. Modern graphics processing units (GPUs) provide architectures and new programming models that make it possible to harness their large processing power and to design computational fluid dynamics (CFD) simulations with both high performance and low cost. Possibilities for the use of GPUs in the simulation of external and internal flows on unstructured meshes are discussed. The finite volume method is applied to solve the three-dimensional unsteady compressible Euler and Navier-Stokes equations on unstructured meshes with high-resolution numerical schemes. CUDA technology is used for the programming implementation of the parallel computational algorithms. Solutions of some benchmark test cases on GPUs are reported, and the computed results are compared with experimental and computational data. Approaches to optimization of the CFD code related to the use of different types of memory are considered, and the speedup of solutions on GPUs with respect to solutions on a central processing unit (CPU) is compared. Performance measurements show that the numerical schemes developed achieve a 20-50x speedup on GPU hardware compared to the CPU reference implementation. The results obtained provide a promising perspective for designing a GPU-based software framework for applications in CFD.
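The solver above is a 3D unstructured finite-volume Euler/Navier-Stokes code; as a minimal stand-in for the same update structure, this sketch applies the finite-volume step u_i^{n+1} = u_i^n - (dt/dx)(F_{i+1/2} - F_{i-1/2}) to 1D linear advection with a first-order upwind flux. It illustrates the per-cell update a GPU kernel would parallelize, not the paper's code.

```python
def upwind_step(u, a, dt, dx):
    """One explicit step of 1D advection u_t + a u_x = 0 (a > 0),
    periodic boundaries; upwind interface flux F_{i-1/2} = a*u[i-1]."""
    n = len(u)
    flux = [a * u[i - 1] for i in range(n)]   # flux at left face of cell i
    return [u[i] - dt / dx * (flux[(i + 1) % n] - flux[i]) for i in range(n)]

u = [1.0, 0.0, 0.0, 0.0]
u = upwind_step(u, a=1.0, dt=0.5, dx=1.0)  # CFL number = 0.5
print(u)
```

Every cell's update reads only its two face fluxes, which is why this loop maps so naturally onto one-thread-per-cell CUDA kernels; the memory-type optimizations the abstract mentions concern where the `u` and `flux` arrays live on the device.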
Kirensky, L V; Terskov, I A; Gitelson, I I; Lisovsky, G M; Kovrov, B G; Okladnikov, Y N
1968-01-01
According to the opinion of many researchers, a culture of microalgae may serve as a regenerator of the atmosphere in the cabin of a spaceship. To use microalgae for this purpose, it was necessary to have an automatic unit ensuring a highly productive cultivation process. Such a unit, containing a minimum of equipment, enables the cultivation of algae to be carried on for an unlimited time without a drop in productivity. The unit meeting these requirements (the cultivator) was developed by the authors and is described in the presentation. The stability of the microalgal photosynthetic system is characterized by the fact that after 70% repression of biosynthesis by ultraviolet radiation, full regeneration of the productivity level takes place within 24 hours. In our experiments the system functioned at the stable estimated productivity for many days (up to two months without interruption). During this period, no biological obstacles to permanent performance or to further prolongation of its life were found. As to productivity, stability and control, the described biotechnological method may prove useful as a link in a closed ecosystem.
Coordinating an Autonomous Earth-Observing Sensorweb
NASA Technical Reports Server (NTRS)
Sherwood, Robert; Cichy, Benjamin; Tran, Daniel; Chien, Steve; Rabideau, Gregg; Davies, Ashley; Castano, Rebecca; Frye, Stuart; Mandl, Dan; Shulman, Seth
2006-01-01
A system of software has been developed to coordinate the operation of an autonomous Earth-observing sensorweb. Sensorwebs are collections of sensor units scattered over large regions to gather data on spatial and temporal patterns of physical, chemical, or biological phenomena in those regions. Each sensor unit is a node in a data-gathering/data-communication network that spans a region of interest. In this case, the region is the entire Earth, and the sensorweb includes multiple terrestrial and spaceborne sensor units. In addition to acquiring data for scientific study, the sensorweb is required to give timely notice of volcanic eruptions, floods, and other hazardous natural events. In keeping with the inherently modular nature of the sensor, communication, and data-processing hardware, the software features a flexible, modular architecture that facilitates expansion of the network, customization of the conditions that trigger alarms of hazardous natural events, and customization of responses to alarms. The software facilitates access to multiple sources of data on an event of scientific interest, enables coordinated use of multiple sensors in rapid reaction to detection of an event, and facilitates the tracking of spacecraft operations, including tracking of the acquisition, processing, and downlinking of requested data.
Kim, Jeongnim; Baczewski, Andrew T.; Beaudet, Todd D.; ...
2018-04-19
QMCPACK is an open source quantum Monte Carlo package for ab-initio electronic structure calculations. It supports calculations of metallic and insulating solids, molecules, atoms, and some model Hamiltonians. Implemented real space quantum Monte Carlo algorithms include variational, diffusion, and reptation Monte Carlo. QMCPACK uses Slater-Jastrow type trial wave functions in conjunction with a sophisticated optimizer capable of optimizing tens of thousands of parameters. The orbital space auxiliary field quantum Monte Carlo method is also implemented, enabling cross validation between different highly accurate methods. The code is specifically optimized for calculations with large numbers of electrons on the latest high performance computing architectures, including multicore central processing unit (CPU) and graphical processing unit (GPU) systems. We detail the program's capabilities, outline its structure, and give examples of its use in current research calculations. The package is available at http://www.qmcpack.org.
Maurer, S A; Kussmann, J; Ochsenfeld, C
2014-08-07
We present a low-prefactor, cubically scaling scaled-opposite-spin second-order Møller-Plesset perturbation theory (SOS-MP2) method which is highly suitable for massively parallel architectures like graphics processing units (GPUs). The scaling is reduced from O(N⁵) to O(N³) by a reformulation of the MP2 expression in the atomic orbital basis via Laplace transformation and the resolution-of-the-identity (RI) approximation of the integrals, in combination with efficient sparse algebra for the 3-center integral transformation. In contrast to previous works that employ GPUs for post-Hartree-Fock calculations, we do not simply employ GPU-based linear algebra libraries to accelerate the conventional algorithm. Instead, our reformulation allows us to replace the rate-determining contraction step with a modified J-engine algorithm that has been proven to be highly efficient on GPUs. Thus, our SOS-MP2 scheme enables us to treat large molecular systems in an accurate and efficient manner on a single GPU server.
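The Laplace transformation mentioned above rests on the identity 1/x = ∫₀^∞ exp(-xt) dt, which turns the orbital-energy denominators of MP2 into sums of separable exponentials. As a numerical illustration, the sketch below evaluates that integral with a plain trapezoidal rule; production codes instead use a handful of optimized quadrature points.

```python
import math

def inv_by_laplace(x, t_max=40.0, steps=4000):
    """Approximate 1/x (x > 0) as a trapezoidal quadrature of exp(-x*t)
    over [0, t_max]; the tail beyond t_max is exponentially small."""
    h = t_max / steps
    total = 0.5 * (1.0 + math.exp(-x * t_max))  # endpoint terms
    for k in range(1, steps):
        total += math.exp(-x * k * h)
    return total * h

for delta in (1.0, 2.5, 10.0):
    print(delta, inv_by_laplace(delta))
```

Once the denominator is a sum of exponentials, each quadrature point factorizes over orbital indices, which is what permits the atomic-orbital reformulation and the O(N³) contraction scheme.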
Nelson, Kurtis; Steinwand, Daniel R.
2015-01-01
Annual disturbance maps are produced by the LANDFIRE program across the conterminous United States (CONUS). Existing LANDFIRE disturbance data from 1999 to 2010 are available and current efforts will produce disturbance data through 2012. A tiling and compositing approach was developed to produce bi-annual images optimized for change detection. A tiled grid of 10,000 × 10,000 30 m pixels was defined for CONUS and adjusted to consolidate smaller tiles along national borders, resulting in 98 non-overlapping tiles. Data from Landsat-5, -7, and -8 were re-projected to the tile extents, masked to remove clouds, shadows, water, and snow/ice, then composited using a cosine similarity approach. The resultant images were used in a change detection algorithm to determine areas of vegetation change. This approach enabled more efficient processing compared to using single Landsat scenes, by taking advantage of overlap between adjacent paths, and allowed an automated system to be developed for the entire process.
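The cosine-similarity compositing above can be sketched per pixel: among a pixel's masked, cloud-free observations, keep the spectrum whose direction best matches a reference (here the band-wise median). This is a simplified reading of the approach, not the LANDFIRE production code.

```python
import math

def cosine(a, b):
    """Cosine similarity of two band vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def composite(observations):
    """observations: list of band vectors for one pixel (already masked).
    Reference is the per-band median; return the most similar observation."""
    bands = len(observations[0])
    ref = [sorted(o[i] for o in observations)[len(observations) // 2]
           for i in range(bands)]
    return max(observations, key=lambda o: cosine(o, ref))

obs = [[0.1, 0.2, 0.3],      # clear
       [0.12, 0.22, 0.31],   # clear
       [0.9, 0.9, 0.9]]      # residual cloud: bright, spectrally flat
print(composite(obs))
```

Because cosine similarity compares spectral shape rather than brightness, a spectrally flat residual cloud scores poorly against the vegetated median and is rejected even when the cloud mask missed it.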
A Brief Introduction to the Theory of Friction Stir Welding
NASA Technical Reports Server (NTRS)
Nunes, Arthur C., Jr.
2008-01-01
Friction stir welding (FSW) is a solid state welding process invented in 1991 at The Welding Institute in the United Kingdom. A weld is made in the FSW process by translating a rotating pin along a weld seam so as to stir the sides of the seam together. FSW avoids deleterious effects inherent in melting and is already an important welding process for the aerospace industry, where welds of optimal quality are demanded. The structure of welds determines weld properties. The structure of friction stir welds is determined by the flow field in the weld metal in the vicinity of the weld tool. A simple kinematic model of the FSW flow field developed at Marshall Space Flight Center, which enables the basic features of FSW microstructure to be understood and related to weld process parameters and tool design, is explained.
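A toy version of such a kinematic flow model: near the pin the metal is carried in rigid rotation with the tool, far away it streams past at the welding speed, with a smooth blend between the two regimes. The exponential blend and all parameter values are illustrative assumptions, not the MSFC model itself.

```python
import math

def weld_velocity(x, y, omega=30.0, v_weld=0.002, r_pin=0.003):
    """Velocity (vx, vy) of weld metal at point (x, y), pin axis at origin.
    omega [rad/s] tool rotation, v_weld [m/s] welding speed, r_pin [m]."""
    r = math.hypot(x, y)
    w = math.exp(-max(r - r_pin, 0.0) / r_pin)  # 1 at the pin, -> 0 far away
    rot_vx, rot_vy = -omega * y, omega * x      # rigid rotation with the tool
    # Blend rotation near the tool with the parent-metal stream far away
    return (w * rot_vx + (1 - w) * (-v_weld), w * rot_vy)

print(weld_velocity(0.003, 0.0))  # on the pin surface: rotation dominates
print(weld_velocity(0.05, 0.0))   # far field: the parent-metal stream
```

Even this crude superposition reproduces the qualitative feature that matters for microstructure: material close to the pin makes fast circuits of the tool while material a few pin radii out barely deviates from straight-line flow.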
Membrane separation systems---A research and development needs assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, R.W.; Cussler, E.L.; Eykamp, W.
1990-04-01
Industrial separation processes consume a significant portion of the energy used in the United States. A 1986 survey by the Office of Industrial Programs estimated that about 4.2 quads of energy are expended annually on distillation, drying and evaporation operations. This survey also concluded that over 0.8 quads of energy could be saved in the chemical, petroleum and food industries alone if these industries adopted membrane separation systems more widely. Membrane separation systems offer significant advantages over existing separation processes. In addition to consuming less energy than conventional processes, membrane systems are compact and modular, enabling easy retrofit to existing industrial processes. The present study was commissioned by the Department of Energy, Office of Program Analysis, to identify and prioritize membrane research needs in light of DOE's mission. Each report will be individually cataloged.
DeRose, Yoko S.; Gligorich, Keith M.; Wang, Guoying; Georgelas, Ann; Bowman, Paulette; Courdy, Samir J.; Welm, Alana L.; Welm, Bryan E.
2013-01-01
Research models that replicate the diverse genetic and molecular landscape of breast cancer are critical for developing the next generation therapeutic entities that can target specific cancer subtypes. Patient-derived tumorgrafts, generated by transplanting primary human tumor samples into immune-compromised mice, are a valuable method to model the clinical diversity of breast cancer in mice, and are a potential resource in personalized medicine. Primary tumorgrafts also enable in vivo testing of therapeutics and make possible the use of patient cancer tissue for in vitro screens. Described in this unit are a variety of protocols including tissue collection, biospecimen tracking, tissue processing, transplantation, and 3-dimensional culturing of xenografted tissue, that enable use of bona fide uncultured human tissue in designing and validating cancer therapies. PMID:23456611
Thermal activation of dislocations in large scale obstacle bypass
NASA Astrophysics Data System (ADS)
Sobie, Cameron; Capolungo, Laurent; McDowell, David L.; Martinez, Enrique
2017-08-01
Dislocation dynamics simulations have been used extensively to predict hardening caused by dislocation-obstacle interactions, including irradiation defect hardening in the athermal case. Incorporating the role of thermal energy on these interactions is possible with a framework provided by harmonic transition state theory (HTST) enabling direct access to thermally activated reaction rates using the Arrhenius equation, including rates of dislocation-obstacle bypass processes. Moving beyond unit dislocation-defect reactions to a representative environment containing a large number of defects requires coarse-graining the activation energy barriers of a population of obstacles into an effective energy barrier that accurately represents the large scale collective process. The work presented here investigates the relationship between unit dislocation-defect bypass processes and the distribution of activation energy barriers calculated for ensemble bypass processes. A significant difference between these cases is observed, which is attributed to the inherent cooperative nature of dislocation bypass processes. In addition to the dislocation-defect interaction, the morphology of the dislocation segments pinned to the defects play an important role on the activation energies for bypass. A phenomenological model for activation energy stress dependence is shown to describe well the effect of a distribution of activation energies, and a probabilistic activation energy model incorporating the stress distribution in a material is presented.
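Harmonic transition state theory gives each bypass event an Arrhenius rate k_i = ν·exp(-E_i/kT). One standard way to coarse-grain a population of barriers into a single effective barrier is rate-averaging, E_eff = -kT·ln⟨exp(-E_i/kT)⟩, which is weighted toward the lowest barriers, consistent with the cooperative-bypass effect discussed above. The barrier values and prefactor below are made-up illustrations, not the paper's data.

```python
import math

KB_EV = 8.617333262e-5  # Boltzmann constant, eV/K

def arrhenius(nu, e_ev, t_k):
    """HTST/Arrhenius rate for attempt frequency nu and barrier e_ev."""
    return nu * math.exp(-e_ev / (KB_EV * t_k))

def effective_barrier(barriers_ev, t_k):
    """Rate-averaged effective barrier of a population of obstacles."""
    kt = KB_EV * t_k
    mean_boltz = sum(math.exp(-e / kt) for e in barriers_ev) / len(barriers_ev)
    return -kt * math.log(mean_boltz)

barriers = [0.5, 0.8, 1.1]  # eV, hypothetical obstacle population
e_eff = effective_barrier(barriers, t_k=600.0)
print(e_eff)
```

Note that E_eff lands only slightly above the weakest barrier rather than near the arithmetic mean, one quantitative expression of the difference the text reports between unit-process barriers and the ensemble distribution.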
Woods, J
2001-01-01
The third generation cardiac institute will build on the successes of the past in structuring the service line, re-organizing to assimilate specialist interests, and re-positioning to expand cardiac services into cardiovascular services. To meet the challenges of an increasingly competitive marketplace and complex delivery system, the focus for this new model will shift away from improved structures, and toward improved processes. This shift will require a sound methodology for statistically measuring and sustaining process changes related to the optimization of cardiovascular care. In recent years, GE Medical Systems has successfully applied Six Sigma methodologies to enable cardiac centers to control key clinical and market development processes through its DMADV, DMAIC and Change Acceleration processes. Data indicates Six Sigma is having a positive impact within organizations across the United States, and when appropriately implemented, this approach can serve as a solid foundation for building the next generation cardiac institute.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallarno, George; Rogers, James H; Maxwell, Don E
The high computational capability of graphics processing units (GPUs) is enabling and driving the scientific discovery process at large scale. The world's second-fastest supercomputer for open science, Titan, has more than 18,000 GPUs that computational scientists use to perform scientific simulations and data analysis. Understanding of GPU reliability characteristics, however, is still in its nascent stage, since GPUs have only recently been deployed at large scale. This paper presents a detailed study of GPU errors and their impact on system operations and applications, describing experiences with the 18,688 GPUs on the Titan supercomputer as well as lessons learned in the process of efficient operation of GPUs at scale. These experiences are helpful to HPC sites which already have large-scale GPU clusters or plan to deploy GPUs in the future.
NASA Technical Reports Server (NTRS)
Mejzak, R. S.
1980-01-01
The distributed processing concept is defined in terms of control primitives, variables, and structures and their use in performing a decomposed discrete Fourier transform (DFT) application function. The design assumes interprocessor communications to be anonymous. In this scheme, all processors can access an entire common database by employing control primitives. Access to selected areas within the common database is random, enforced by a hardware lock, and determined by task and subtask pointers. This enables the number of processors in the configuration to be varied without any modifications to the control structure. Decompositional elements of the DFT application function in terms of tasks and subtasks are also described. The experimental hardware configuration consists of IMSAI 8080 chassis, which are independent 8-bit microcomputer units. These chassis are linked together to form a multiple-processing system by means of a shared memory facility. This facility consists of hardware which provides a bus structure enabling up to six microcomputers to be interconnected, along with polling and arbitration logic so that only one processor has access to shared memory at any one time.
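The control scheme above, with any number of processors claiming subtasks through lock-guarded pointers in a common database, has a direct threading analogue: a shared task pointer advanced under a mutex, so the worker count can change without altering the control structure. A sketch of that idea, not the original IMSAI implementation:

```python
import threading

class SharedTaskPointer:
    """Shared subtask pointer; the mutex stands in for the hardware lock."""
    def __init__(self, n_tasks):
        self.next_task = 0
        self.n_tasks = n_tasks
        self.lock = threading.Lock()

    def claim(self):
        """Atomically claim the next subtask index, or None when done."""
        with self.lock:
            if self.next_task >= self.n_tasks:
                return None
            task = self.next_task
            self.next_task += 1
            return task

def worker(pointer, done):
    while True:
        task = pointer.claim()
        if task is None:
            return
        done.append(task)  # place for the actual DFT subtask computation

pointer, done = SharedTaskPointer(16), []
threads = [threading.Thread(target=worker, args=(pointer, done))
           for _ in range(3)]  # worker count is arbitrary, as in the paper
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(done))
```

Because claiming goes through the lock, each subtask is executed exactly once no matter how many workers run, which is the property that lets the processor count vary freely.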
THOR Field and Wave Processor - FWP
NASA Astrophysics Data System (ADS)
Soucek, Jan; Rothkaehl, Hanna; Balikhin, Michael; Zaslavsky, Arnaud; Nakamura, Rumi; Khotyaintsev, Yuri; Uhlir, Ludek; Lan, Radek; Yearby, Keith; Morawski, Marek; Winkler, Marek
2016-04-01
If selected, Turbulence Heating ObserveR (THOR) will become the first mission ever flown in space dedicated to plasma turbulence. The Fields and Waves Processor (FWP) is an integrated electronics unit for all electromagnetic field measurements performed by THOR. FWP will interface with all fields sensors: the electric field antennas of the EFI instrument, the MAG fluxgate magnetometer and the search-coil magnetometer (SCM), and will perform data digitization and on-board processing. The FWP box will house multiple data acquisition sub-units and signal analyzers, all sharing a common power supply and data processing unit and thus a single data and power interface to the spacecraft. Integrating all the electromagnetic field measurements in a single unit will improve the consistency of field measurements and the accuracy of time synchronization. The feasibility of making highly sensitive electric and magnetic field measurements in space has been demonstrated by Cluster (among other spacecraft), and THOR instrumentation complemented by a thorough electromagnetic cleanliness program will further improve on this heritage. Taking advantage of the capabilities of modern electronics and the large telemetry bandwidth of THOR, FWP will provide simultaneous synchronized waveform and spectral data products at high time resolution from the numerous THOR sensors. FWP will also implement a plasma resonance sounder and a digital plasma quasi-thermal noise analyzer designed to provide high-cadence measurements of plasma density and temperature complementary to data from particle instruments. FWP will be interfaced with the particle instrument data processing unit (PPU) via a dedicated digital link, which will enable on-board correlation between waves and particles, quantifying the transfer of energy between them.
The FWP instrument shall be designed and built by an international consortium of scientific institutes from the Czech Republic, Poland, France, the UK, Sweden and Austria.
NASP - Enabling new space launch options
NASA Astrophysics Data System (ADS)
Froning, David; Gaubatz, William; Mathews, George
1990-10-01
Successful NASP developments in the United States are bringing about the possibility of effective, fully reusable vehicles for transport of people and cargo between earth and space. These developments include: extension of airbreathing propulsion to a much higher speed; densification of propellants for greater energy per unit volume of mass; structures with much greater strength-to-weight at high temperatures; computational advancements that enable more optimal design and integration of airframes, engines and controls; and advances in avionics, robotics, artificial intelligence and automation that enable accomplishment of earth-to-orbit (ETO) operations with much less manpower support and cost. This paper describes the relative magnitude of improvement that these developments may provide.
NASP - Enabling new space launch options
NASA Technical Reports Server (NTRS)
Froning, David; Gaubatz, William; Mathews, George
1990-01-01
Successful NASP developments in the United States are bringing about the possibility of effective, fully reusable vehicles for transport of people and cargo between earth and space. These developments include: extension of airbreathing propulsion to a much higher speed; densification of propellants for greater energy per unit volume of mass; structures with much greater strength-to-weight at high temperatures; computational advancements that enable more optimal design and integration of airframes, engines and controls; and advances in avionics, robotics, artificial intelligence and automation that enable accomplishment of earth-to-orbit (ETO) operations with much less manpower support and cost. This paper describes the relative magnitude of improvement that these developments may provide.
How do strategic decisions and operative practices affect operating room productivity?
Peltokorpi, Antti
2011-12-01
Surgical operating rooms are cost-intensive parts of health service production. Managing operating units efficiently is essential when hospitals and healthcare systems aim to maximize health outcomes with limited resources. Previous research about operating room management has focused on studying the effect of management practices and decisions on efficiency, mainly by utilizing a modeling approach or before-after analysis in a single hospital case. The purpose of this research is to analyze the synergic effect of strategic decisions and operative management practices on operating room productivity and to use a multiple case study method enabling statistical hypothesis testing with empirical data. Eleven hypotheses proposing connections between the use of strategic and operative practices and productivity were tested in a multi-hospital study that included 26 units. The results indicate that operative practices, such as personnel management, case scheduling and performance measurement, affect productivity more remarkably than do strategic decisions that relate to, e.g., units' size, scope or academic status. Units with different strategic positions should apply different operative practices: focused hospital units benefit most from sophisticated case scheduling and parallel processing, whereas central and ambulatory units should apply flexible working hours, incentives and multi-skilled personnel. Operating units should be more active in applying management practices which are adequate for their strategic orientation.
Neumann, Cedric; Ramotowski, Robert; Genessay, Thibault
2011-05-13
Forensic examinations of ink have been performed since the beginning of the 20th century. Since the 1960s, the International Ink Library, maintained by the United States Secret Service, has supported those analyses. Until 2009, the search and identification of inks were essentially performed manually. This paper describes the results of a project designed to improve ink samples' analytical and search processes. The project focused on the development of improved standardization procedures to ensure the best possible reproducibility between analyses run on different HPTLC plates. The successful implementation of this new calibration method enabled the development of mathematical algorithms and of a software package to complement the existing ink library. Copyright © 2010 Elsevier B.V. All rights reserved.
Habitat Demonstration Unit (HDU) Pressurized Excursion Module (PEM) Systems Integration Strategy
NASA Technical Reports Server (NTRS)
Gill, Tracy; Merbitz, Jerad; Kennedy, Kriss; Tri, Terry; Toups, Larry; Howe, A. Scott
2011-01-01
The Habitat Demonstration Unit (HDU) project team constructed an analog prototype lunar surface laboratory called the Pressurized Excursion Module (PEM). The prototype unit subsystems were integrated in a short amount of time, utilizing a rapid prototyping approach that brought together over 20 habitation-related technologies from a variety of NASA centers. This paper describes the system integration strategies and lessons learned that allowed the PEM to be brought from paper design to working field prototype by a multi-center team. The system integration process was based on a rapid prototyping approach, facilitated by tailored design review and test and integration processes. The use of collaboration tools, both electronic tools and documentation, enabled a geographically distributed team to take a paper concept to an operational prototype in approximately one year. One of the major tools used in the integration strategy was a coordinated effort to accurately model all the subsystems using computer-aided design (CAD), so that conflicts were identified before physical components came together. A deliberate effort was made following the deployment of the HDU PEM for field operations to collect lessons learned, to facilitate process improvement and inform the design of future flight or analog versions of habitat systems. Significant items within those lessons learned were limitations of the CAD integration approach and the impact of shell design on the flexibility of placing systems within the HDU shell.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCorkle, D.; Yang, C.; Jordan, T.
2007-06-01
Modeling and simulation tools are becoming pervasive in the process engineering practice of designing advanced power generation facilities. These tools enable engineers to explore many what-if scenarios before cutting metal or constructing a pilot-scale facility. While such tools enable investigation of crucial plant design aspects, typical commercial process simulation tools such as Aspen Plus®, gPROMS®, and HYSYS® still do not explore some plant design information, including computational fluid dynamics (CFD) models for complex thermal and fluid flow phenomena, economics models for policy decisions, operational data after the plant is constructed, and as-built information for use in as-designed models. Software tools must be created that allow disparate sources of information to be integrated if environments are to be constructed where process simulation information can be accessed. At the Department of Energy's (DOE) National Energy Technology Laboratory (NETL), the Advanced Process Engineering Co-Simulator (APECS) has been developed as an integrated software suite that combines process simulation (e.g., Aspen Plus) and high-fidelity equipment simulation (e.g., Fluent® CFD), together with advanced analysis capabilities including case studies, sensitivity analysis, stochastic simulation for risk/uncertainty analysis, and multi-objective optimization. In this paper, we discuss the initial phases of integrating APECS with the immersive and interactive virtual engineering software, VE-Suite, developed at Iowa State University and Ames Laboratory. VE-Suite utilizes the ActiveX (OLE Automation) controls in Aspen Plus wrapped by the CASI library developed by Reaction Engineering International to run the process simulation and query for unit operation results.
This integration permits any application that uses the VE-Open interface to integrate with APECS co-simulations, enabling construction of the comprehensive virtual engineering environment needed for the rapid engineering of advanced power generation facilities.
NASA Technical Reports Server (NTRS)
Brophy, John R. (Inventor)
1993-01-01
Apparatus and methods for large-area, high-power ion engines comprise dividing a single engine into a combination of smaller discharge chambers (or segments) configured to operate as a single large-area engine. This segmented ion thruster (SIT) approach enables the development of 100-kW class argon ion engines for operation at a specific impulse of 10,000 s. A combination of six 30-cm diameter ion chambers operating as a single engine can process over 100 kW. Such a segmented ion engine can be operated from a single power processor unit.
Bioproducts to Enable Biofuels Workshop Summary Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bailey, Andrea; Leong, G. Jeremy; Fitzgerald, Nichole
2015-12-01
This report summarizes the results of a public workshop sponsored by DOE/EERE in Westminster, Colorado, on July 16, 2015. The views and opinions of the workshop attendees, as summarized in this document, do not necessarily reflect those of the United States government or any agency thereof, nor do their employees make any warranty, expressed or implied, or assume any liability or responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, or represent that its use would not infringe upon privately owned rights.
Next generation Er:YAG fractional ablative laser
NASA Astrophysics Data System (ADS)
Heinrich, A.; Vizhanyo, A.; Krammer, P.; Summer, S.; Gross, S.; Bragagna, T.; Böhler, C.
2011-03-01
Pantec Biosolutions AG presents a portable fractional ablative laser system based on a miniaturized diode-pumped Er:YAG laser. The system can operate at repetition rates up to 500 Hz and has an incorporated beam deflection unit. It is smaller, lighter, and more cost-efficient than systems based on lamp-pumped Er:YAG lasers, and it incorporates skin layer detection to guarantee precise control of the microporation process. The pulse parameters enable a variety of applications in dermatology and in general medicine, as demonstrated by first results on transdermal drug delivery of FSH (follicle stimulating hormone).
Amplifying Electrochemical Indicators
NASA Technical Reports Server (NTRS)
Fan, Wenhong; Li, Jun; Han, Jie
2004-01-01
Dendrimeric reporter compounds have been invented for use in sensing and amplifying electrochemical signals from molecular recognition events that involve many chemical and biological entities. These reporter compounds can be formulated to target specific molecules or molecular recognition events. They can also be formulated to be, variously, hydrophilic or amphiphilic so that they are suitable for use at interfaces between (1) aqueous solutions and (2) electrodes connected to external signal-processing electronic circuits. The invention of these reporter compounds is expected to enable the development of highly miniaturized, low-power-consumption, relatively inexpensive, mass-producible sensor units for diverse applications.
Model-Based Verification and Validation of Spacecraft Avionics
NASA Technical Reports Server (NTRS)
Khan, Mohammed Omair
2012-01-01
Our simulation was able to mimic the results of 30 tests on the actual hardware. This shows that simulations have the potential to enable early design validation - well before actual hardware exists. Although simulations focused around data processing procedures at subsystem and device level, they can also be applied to system level analysis to simulate mission scenarios and consumable tracking (e.g. power, propellant, etc.). Simulation engine plug-in developments are continually improving the product, but handling time for time-sensitive operations (like those of the remote engineering unit and bus controller) can be cumbersome.
Using Multi-Core Systems for Rover Autonomy
NASA Technical Reports Server (NTRS)
Clement, Brad; Estlin, Tara; Bornstein, Benjamin; Springer, Paul; Anderson, Robert C.
2010-01-01
Task objectives are: (1) Develop and demonstrate key capabilities for rover long-range science operations using multi-core computing: (a) adapt three rover technologies to execute on a state-of-the-art (SOA) multi-core processor, (b) illustrate the performance improvements achieved, (c) demonstrate the adapted capabilities with rover hardware. (2) Target three high-level autonomy technologies: (a) two for onboard data analysis, (b) one for onboard command sequencing/planning. (3) Technologies identified as enabling for future missions. (4) Benefits will be measured along several metrics: (a) execution time / power requirements, (b) number of data products processed per unit time, (c) solution quality.
NASA Astrophysics Data System (ADS)
Loisel, Guillaume
2016-10-01
Emission from accretion-powered objects accounts for a large fraction of all photons in the universe and is a powerful diagnostic for their behavior and structure. Quantitative interpretation of spectral emission from these objects requires a spectral synthesis model for photoionized plasma, since the ionizing luminosity is so large that photon-driven atomic processes dominate over collisions. This is a quandary because laboratory experiments capable of testing the spectral emission models are non-existent. The models must predict the photoionized charge state distribution, the photon emission processes, and the radiation transport influence on the observed emission. We have used a decade of research at the Z facility to achieve the first simultaneous measurements of emission and absorption from photoionized plasmas. The extraordinary spectra are reproducible to within +/-2%, and the E/dE ~ 500 spectral resolution has enabled unprecedented tests of atomic structure calculations. The absorption spectra enable determination of plasma density, temperature, and charge state distribution. The emission spectra then enable tests of spectral emission models. The emission has been measured from plasmas of varying size to elucidate the radiation transport effects. This combination of measurements will provide strong constraints on models used in astrophysics. Sandia is a multi-program laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under contract DE-AC04-94AL85000.
Zhang, Fengjiao; Hu, Yunbin; Schuettfort, Torben; Di, Chong-an; Gao, Xike; McNeill, Christopher R; Thomsen, Lars; Mannsfeld, Stefan C B; Yuan, Wei; Sirringhaus, Henning; Zhu, Daoben
2013-02-13
Substituted side chains are fundamental units in solution-processable organic semiconductors, needed to achieve a balance of close intermolecular stacking, high crystallinity, and good compatibility with different wet techniques. Based on four air-stable solution-processed naphthalene diimides fused with 2-(1,3-dithiol-2-ylidene)malononitrile groups (NDI-DTYM2) that bear branched alkyl chains with varied side-chain length and different branching positions, we have carried out systematic studies on the relationship between film microstructure and charge transport in their organic thin-film transistors (OTFTs). In particular, synchrotron measurements (grazing incidence X-ray diffraction and near-edge X-ray absorption fine structure) are combined with device optimization studies to probe the interplay between molecular structure, molecular packing, and OTFT mobility. It is found that the side-chain length has a moderate influence on thin-film microstructure but leads to only limited changes in OTFT performance. In contrast, the position of the branching point results in subtle yet critical changes in molecular packing and leads to dramatic differences in electron mobility, ranging from ~0.001 to >3.0 cm(2) V(-1) s(-1). Incorporating a NDI-DTYM2 core with three-branched N-alkyl substituents of C(11,6) results in a dense in-plane molecular packing with a unit cell area of 127 Å(2), larger domain sizes of up to 1000 × 3000 nm(2), and an electron mobility of up to 3.50 cm(2) V(-1) s(-1), which is an unprecedented value for ambient-stable n-channel solution-processed OTFTs reported to date. These results demonstrate that variation of the alkyl chain branching point is a powerful strategy for tuning molecular packing to enable high charge transport mobilities.
NASA Astrophysics Data System (ADS)
García-Flores, Agustín.; Paz-Gallardo, Abel; Plaza, Antonio; Li, Jun
2016-10-01
This paper describes a new web platform dedicated to the classification of satellite images called Hypergim. The current implementation of this platform enables users to perform classification of satellite images from any part of the world thanks to the worldwide maps provided by Google Maps. To perform this classification, Hypergim uses unsupervised algorithms like Isodata and K-means. Here, we present an extension of the original platform in which we adapt Hypergim to use supervised algorithms to improve the classification results. This involves a significant modification of the user interface, providing the user with a way to obtain samples of classes present in the images to use in the training phase of the classification process. Another main goal of this development is to improve the runtime of the image classification process. To achieve this goal, we use a parallel implementation of the Random Forest classification algorithm, a modification of the well-known CURFIL software package. The use of this type of algorithm for image classification is widespread today thanks to its precision and ease of training. The implementation of Random Forest was developed using the CUDA platform, which enables us to exploit the potential of several models of NVIDIA graphics processing units, using them to execute general-purpose computing tasks such as image classification algorithms. As well as CUDA, we use other parallel libraries such as Intel Boost, taking advantage of the multithreading capabilities of modern CPUs. To ensure the best possible results, the platform is deployed in a cluster of commodity graphics processing units (GPUs), so that multiple users can use the tool concurrently. The experimental results indicate that this new algorithm significantly outperforms the previous unsupervised algorithms implemented in Hypergim, in both runtime and precision of the resulting classification of the images.
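For reference, the unsupervised baseline mentioned above (K-means, alongside Isodata) can be sketched minimally over pixel feature vectors. This is an illustrative sketch only, with naive initialisation; it is not code from Hypergim or CURFIL:

```python
def kmeans(pixels, k, iters=20):
    """Minimal K-means over pixel feature vectors (e.g. per-band values).
    A sketch of the unsupervised baseline the platform started from."""
    # naive initialisation: first k pixels (adequate for a sketch)
    centers = [pixels[i] for i in range(k)]

    def nearest(p):
        # index of the closest centre by squared Euclidean distance
        return min(range(k),
                   key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))

    for _ in range(iters):
        # assignment step: each pixel joins its nearest centre's cluster
        clusters = [[] for _ in range(k)]
        for p in pixels:
            clusters[nearest(p)].append(p)
        # update step: move each centre to the mean of its cluster
        for j, cl in enumerate(clusters):
            if cl:
                centers[j] = tuple(sum(d) / len(cl) for d in zip(*cl))
    return centers, [nearest(p) for p in pixels]
```

The supervised Random Forest approach replaces the assignment/update loop with trees trained on the user-supplied class samples, which is where the precision gain reported above comes from.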
Aguirre, Erik
2018-01-01
In the context of hospital management and operation, Intensive Care Units (ICU) are among the most challenging in terms of time responsiveness and criticality, and adequate resource management and signal processing play a key role in overall system performance. In this work, a context-aware Intensive Care Unit is implemented and analyzed to provide scalable signal acquisition capabilities, as well as tracking and access control. Wireless channel analysis is performed by means of hybrid optimized 3D Ray Launching deterministic simulation to assess potential interference impact as well as to provide required coverage/capacity thresholds for the employed transceivers. Wireless system operation within the ICU scenario, considering conventional transceiver operation, is feasible in terms of quality of service for the complete scenario. Extensive measurements of overall interference levels have also been carried out, enabling subsequent adequate coverage/capacity estimations for a set of Zigbee-based nodes. Real system operation has been tested with ad-hoc designed Zigbee wireless motes, employing lightweight communication protocols to minimize energy and bandwidth usage. An ICU information gathering application and software architecture for visitor access control has been implemented, providing monitoring of the boxes' external doors and the identification of visitors via an RFID system. The results enable a solution providing ICU access control and tracking capabilities not previously exploited, a step forward in the implementation of a Smart Health framework. PMID:29382148
NASA Astrophysics Data System (ADS)
Weissbrod, T.; Perath, I.
A systematic study of the Precambrian and Paleozoic-Mesozoic clastic sequences (Nubian Sandstone) in Israel and Sinai, and a comparative analysis of their stratigraphy in neighbouring countries, has shown that besides the conventional criteria of subdivision (lithology, field appearance, photogeological features, fossil content), additional criteria can be applied which, singly or in mutual conjunction, enable the recognition of widespread units and boundaries. These criteria show lateral constancy and recurrence of a similar vertical sequence over great distances, and are therefore acceptable for the identification of synchronous, region-wide sedimentary units (and consequently, major unconformities). Once the units are established, they also enable the identification of detached (not in situ) samples, samples from isolated or discontinuous outcrops, borehole material, or archive material. The following rock properties were tested and found to be useful in stratigraphic interpretation throughout large distribution areas of the clastic sequence: Landscape, which is basically the response of a particular textural-chemical aggregate to atmospheric weathering. Characteristic outcrop features (styles of roundness or massivity, fissuring or foliation, slope profile, bedding) express a basic uniformity of these platform-type clastics. Colors are often stratigraphically constant over hundreds of kilometers, through various climates and topographies, and express some intrinsic unity of the rock bodies. Grain size and sorting, when cross-plotted, enable differentiation of existing units; the method requires the analysis of representative numbers of samples. Vertical trends of median grain size and sorting show reversals, typically across unconformities. Feldspar content diminishes from 15-50% in Precambrian-Paleozoic rocks to a mere 5% or less in Mesozoic sandstones, a distinctive region-wide time trend. Dominance of certain feldspar types characterizes Precambrian and Paleozoic units.
Clay minerals, though subordinate, characterize certain units. Illite is usually the dominant clay mineral in the Precambrian-Paleozoic sediments, showing different degrees of crystallization in different units. Kaolinite is the main, often the only, clay mineral in Mesozoic units. Heavy minerals, whose species spectra reflect the parent rock and provenance terrain and whose differential response to degradation points to the sedimentary history of the deposit, show certain vertical regularities, such as the abrupt disappearance of species or whole assemblages at certain levels, indicating unconformities. Trace metals, which in places reach ore concentrations (e.g. copper), are often extensive, though of well-defined vertical distribution. They express the adsorptive capacity of specific widespread lithologies, enabling the discrimination of units. Even though each of these criteria is not always diagnostic by itself, they may, in conjunction with one or more other criteria, amount to a petrographic fingerprint that enables fairly accurate identification of the age interval of the unit and its relation both to the regional and the local stratigraphic sequence.
Medical image processing on the GPU - past, present and future.
Eklund, Anders; Dufort, Paul; Forsberg, Daniel; LaConte, Stephen M
2013-12-01
Graphics processing units (GPUs) are used today in a wide range of applications, mainly because they can dramatically accelerate parallel computing, are affordable and energy efficient. In the field of medical imaging, GPUs are in some cases crucial for enabling practical use of computationally demanding algorithms. This review presents the past and present work on GPU accelerated medical image processing, and is meant to serve as an overview and introduction to existing GPU implementations. The review covers GPU acceleration of basic image processing operations (filtering, interpolation, histogram estimation and distance transforms), the most commonly used algorithms in medical imaging (image registration, image segmentation and image denoising) and algorithms that are specific to individual modalities (CT, PET, SPECT, MRI, fMRI, DTI, ultrasound, optical imaging and microscopy). The review ends by highlighting some future possibilities and challenges. Copyright © 2013 Elsevier B.V. All rights reserved.
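As an illustration of the "basic image processing operations" the review covers, consider a naive 2D convolution filter: every output pixel is computed independently, which is precisely why such filters map well onto GPU threads. The pure-Python sketch below is illustrative only and is not tied to any implementation surveyed in the review:

```python
def convolve2d(image, kernel):
    """Naive 2D convolution with zero padding -- the archetypal
    data-parallel image operation. Each out[y][x] is independent of
    the others, so on a GPU each would map to its own thread."""
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = len(image), len(image[0])
    ch, cw = kh // 2, kw // 2          # kernel centre offsets
    out = [[0.0] * ow for _ in range(oh)]
    for y in range(oh):
        for x in range(ow):
            acc = 0.0
            for i in range(kh):
                for j in range(kw):
                    yy, xx = y + i - ch, x + j - cw
                    if 0 <= yy < oh and 0 <= xx < ow:  # zero padding
                        acc += image[yy][xx] * kernel[i][j]
            out[y][x] = acc
    return out
```

Replacing the two outer loops with a per-pixel GPU kernel is the essence of the acceleration the review describes for filtering.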
Qualitative and quantitative interpretation of SEM image using digital image processing.
Saladra, Dawid; Kopernik, Magdalena
2016-10-01
The aim of this study is the improvement of qualitative and quantitative analysis of scanning electron microscope micrographs through the development of a computer program which enables automatic crack analysis of scanning electron microscopy (SEM) micrographs. Micromechanical tests of pneumatic ventricular assist devices result in a large number of micrographs; therefore, the analysis must be automatic. Tests for athrombogenic titanium nitride/gold coatings deposited on polymeric substrates (Bionate II) are performed. These tests include microshear, microtension and fatigue analysis. Anisotropic surface defects observed in the SEM micrographs require support for qualitative and quantitative interpretation. Improvement of qualitative analysis of scanning electron microscope images was achieved by a set of computational tools that includes binarization, simplified expanding, expanding, simple image statistic thresholding, the Laplacian 1 and Laplacian 2 filters, Otsu thresholding and reverse binarization. Several modifications of known image processing techniques and combinations of selected techniques were applied. The introduced quantitative analysis of digital scanning electron microscope images enables computation of stereological parameters such as area, crack angle, crack length, and total crack length per unit area. This study also compares the functionality of the developed computer program with existing digital image processing applications. The described pre- and postprocessing may be helpful in scanning electron microscopy and transmission electron microscopy surface investigations. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
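One of the binarization tools listed above, Otsu thresholding, picks the grey level that maximises the between-class variance of the image histogram. The sketch below (the function names are ours, not the paper's program) shows that step together with the binarization that precedes crack measurement:

```python
def otsu_threshold(pixels, levels=256):
    """Otsu's method: return the grey level that maximises the
    between-class variance of the intensity histogram."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w_b = sum_b = 0
    for t in range(levels):
        w_b += hist[t]                    # background pixel count
        if w_b == 0:
            continue
        w_f = total - w_b                 # foreground pixel count
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b                 # background mean intensity
        m_f = (total_sum - sum_b) / w_f   # foreground mean intensity
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def binarize(image, t):
    """Threshold a 2D grey image into a 0/1 defect mask; summing the
    mask and dividing by its area gives a crack-area-per-unit-area figure."""
    return [[1 if p > t else 0 for p in row] for row in image]
```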
Cardinal, Bradley J; Lee, Jong-Young; Kim, Young-Ho; Lee, Hyo; Li, Kin-Kit; Si, Qi
2009-01-01
Examine behavioral, demographic, psychosocial, and sociocultural concomitants of the stages of change for physical activity behavior among college students in South Korea (n = 221) and the United States (n = 166). Measures obtained in this cross-sectional study included age; body mass index; nationality; gender; exercise behavior; processes of change; decisional balance; self-efficacy; stage of change; and predisposing, reinforcing, and enabling factors. The amount of variance explained for stage of change by the transtheoretical model constructs (i.e., decisional balance, processes of change, self-efficacy) ranged from 11% to 29% (all p < .001), whereas the predisposing (2%; p = .052), reinforcing (3%; p = .06), and enabling (5%; p < .001) factors were not as important. In multivariate ordinal logistic regression analysis, gender (odds ratio [OR] = 3.3; p < .001), gender by nationality interaction (OR = .27; p < .01), weekly exercise behavior (OR = 1.04; p < .001), and behavioral processes of change (OR = 1.12; p < .001) were each significant concomitants of the stages of change. In terms of physical activity behavior, South Korean women were more likely than South Korean men to be in the early stages, whereas American men were slightly more likely to be in the early stages than American women when all the concomitants were accounted for. Among the psychosocial stage of change concomitants, only the behavioral processes of change were found to be important.
On localization attacks against cloud infrastructure
NASA Astrophysics Data System (ADS)
Ge, Linqiang; Yu, Wei; Sistani, Mohammad Ali
2013-05-01
One of the key characteristics of cloud computing is the device and location independence that enables users to access systems regardless of their location. Because cloud computing relies heavily on resource sharing, it is vulnerable to cyber attacks. In this paper, we investigate a localization attack that enables the adversary to leverage central processing unit (CPU) resources to localize the physical server used by victims. By increasing and reducing CPU usage through a malicious virtual machine (VM), the response time from the victim VM will increase and decrease correspondingly. In this way, by embedding a probing signal into the CPU usage and correlating the same pattern in the response time from the victim VM, the adversary can find the location of the victim VM. To determine attack accuracy, we investigate features in both the time and frequency domains. We conduct both theoretical and experimental studies to demonstrate the effectiveness of such an attack.
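The matching step described above, correlating an embedded probe pattern with observed response times, can be sketched as a simple correlation test. This is a hedged illustration of the idea only, not the authors' implementation, and the decision threshold is an assumption:

```python
def normalized_correlation(probe, response):
    """Pearson correlation between the embedded CPU-usage probe pattern
    and the victim VM's observed response times."""
    n = len(probe)
    mp = sum(probe) / n
    mr = sum(response) / n
    cov = sum((p - mp) * (r - mr) for p, r in zip(probe, response))
    sp = sum((p - mp) ** 2 for p in probe) ** 0.5
    sr = sum((r - mr) ** 2 for r in response) ** 0.5
    return cov / (sp * sr)

def co_resident(probe, response, threshold=0.8):
    """Declare the victim co-located with the malicious VM if the probe
    pattern shows up strongly in its response times (illustrative threshold)."""
    return normalized_correlation(probe, response) > threshold
```

If the victim shares the physical server, its response times rise and fall with the probe, driving the correlation toward 1; an unrelated server yields correlation near 0.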
Light-operated machines based on threaded molecular structures.
Credi, Alberto; Silvi, Serena; Venturi, Margherita
2014-01-01
Rotaxanes and related species represent the most common implementation of the concept of artificial molecular machines, because the supramolecular nature of the interactions between the components and their interlocked architecture allow precise control over the position and movement of the molecular units. The use of light to power artificial molecular machines is particularly valuable because it can play the dual role of "writing" and "reading" the system. Moreover, light-driven machines can operate without accumulation of waste products, and photons are the ideal inputs to enable autonomous operation mechanisms. In appropriately designed molecular machines, light can be used to control not only the stability of the system, which affects the relative position of the molecular components, but also the kinetics of the mechanical processes, thereby enabling control over the direction of the movements. This step forward is necessary in order to make the leap from molecular machines to molecular motors.
NASA Astrophysics Data System (ADS)
Emmerman, Philip J.
2005-05-01
Teams of robots or mixed teams of warfighters and robots on reconnaissance and other missions can benefit greatly from a local fusion station. A local fusion station is defined here as a small mobile processor with interfaces to enable the ingestion of multiple heterogeneous sensor data and information streams, including blue force tracking data. These data streams are fused and integrated with contextual information (terrain features, weather, maps, dynamic background features, etc.), and displayed or processed to provide real time situational awareness to the robot controller or to the robots themselves. These blue and red force fusion applications remove redundancies, lessen ambiguities, correlate, aggregate, and integrate sensor information with context such as high resolution terrain. Applications such as safety, team behavior, asset control, training, pattern analysis, etc. can be generated or enhanced by these fusion stations. This local fusion station should also enable the interaction between these local units and a global information world.
Enabling Optical Network Test Bed for 5G Tests
NASA Astrophysics Data System (ADS)
Giuntini, Marco; Grazioso, Paolo; Matera, Francesco; Valenti, Alessandro; Attanasio, Vincenzo; Di Bartolo, Silvia; Nastri, Emanuele
2017-03-01
In this work, we show some experimental approaches concerning optical network design dedicated to 5G infrastructures. In particular, we show some implementations of network slicing based on Carrier Ethernet forwarding, which will be very suitable in the context of 5G heterogeneous networks, especially looking at services for vertical enterprises. We also show how to adopt a central unit (orchestrator) to automatically manage such logical paths according to quality-of-service requirements, which can be monitored at the user location. We also illustrate how novel all-optical processes, such as the ones based on all-optical wavelength conversion, can be used for multicasting, enabling development of TV broadcasting based on 4G-5G terminals. These managing and forwarding techniques, operating on optical links, are tested in a wireless environment on Wi-Fi cells and emulating LTE and WiMAX systems by means of the NS-3 code.
Ungulate management in national parks of the United States and Canada
Demarais, S.; Cornicelli, L.; Kahn, R.; Merrill, E.; Miller, C.; Peek, J.M.; Porter, W.F.; Sargeant, G.A.
2012-01-01
Enabling legislation—that which gives appropriate officials the authority to implement or enforce the law—impacts management of ungulates in national parks of Canada and the United States (U.S.). The initial focus of such legislation in both countries centered on preserving natural and culturally significant areas for posterity. Although this objective remains primary, philosophies and practices have changed. A Canadian vision for ungulate management emerged during the latter half of the 20th century to protect and maintain or restore the ecological integrity of representative samples of the country’s 39 distinct landscapes, and to include provisions for traditional hunting and fishing practices representative of past cultural impacts on the environment. The current ungulate management approach in the U.S. relies on natural (ecological) processes, as long as normal conditions are promoted and there is no impairment of natural resources. Emphasizing natural processes as the basis has been a challenge because ecosystem dynamics are complex and management is multi-jurisdictional. Additionally, natural regulation typically will not prevent ungulates from reaching and sustaining densities that are incompatible with preservation or restoration of native flora and fauna, natural processes, or historical landscapes.
Large three-dimensional photonic crystals based on monocrystalline liquid crystal blue phases.
Chen, Chun-Wei; Hou, Chien-Tsung; Li, Cheng-Chang; Jau, Hung-Chang; Wang, Chun-Ta; Hong, Ching-Lang; Guo, Duan-Yi; Wang, Cheng-Yu; Chiang, Sheng-Ping; Bunning, Timothy J; Khoo, Iam-Choon; Lin, Tsung-Hsien
2017-09-28
Although there have been intense efforts to fabricate large three-dimensional photonic crystals in order to realize their full potential, the technologies developed so far are still beset with various material processing and cost issues. Conventional top-down fabrications are costly and time-consuming, whereas natural self-assembly and bottom-up fabrications often result in high defect density and limited dimensions. Here we report the fabrication of extraordinarily large monocrystalline photonic crystals by controlling the self-assembly processes which occur in unique phases of liquid crystals that exhibit three-dimensional photonic-crystalline properties called liquid-crystal blue phases. In particular, we have developed a gradient-temperature technique that enables three-dimensional photonic crystals to grow to lateral dimensions of ~1 cm (~30,000 unit cells) and thickness of ~100 μm (~300 unit cells). These giant single crystals exhibit extraordinarily sharp photonic bandgaps with high reflectivity, long-range periodicity in all dimensions and well-defined lattice orientation. Conventional fabrication approaches for large-size three-dimensional photonic crystals are problematic. By properly controlling the self-assembly processes, the authors report the fabrication of monocrystalline blue phase liquid crystals that exhibit three-dimensional photonic-crystalline properties.
Neurokernel: An Open Source Platform for Emulating the Fruit Fly Brain
2016-01-01
We have developed an open software platform called Neurokernel for collaborative development of comprehensive models of the brain of the fruit fly Drosophila melanogaster and their execution and testing on multiple Graphics Processing Units (GPUs). Neurokernel provides a programming model that capitalizes upon the structural organization of the fly brain into a fixed number of functional modules to distinguish between these modules’ local information processing capabilities and the connectivity patterns that link them. By defining mandatory communication interfaces that specify how data is transmitted between models of each of these modules regardless of their internal design, Neurokernel explicitly enables multiple researchers to collaboratively model the fruit fly’s entire brain by integration of their independently developed models of its constituent processing units. We demonstrate the power of Neurokernel’s model integration by combining independently developed models of the retina and lamina neuropils in the fly’s visual system and by demonstrating their neuroinformation processing capability. We also illustrate Neurokernel’s ability to take advantage of direct GPU-to-GPU data transfers with benchmarks that demonstrate scaling of Neurokernel’s communication performance both over the number of interface ports exposed by an emulation’s constituent modules and the total number of modules in an emulation. PMID:26751378
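The core idea of mandatory port interfaces between otherwise opaque modules can be illustrated with a toy executor. This is a hypothetical sketch of the pattern, not the actual Neurokernel API; the class names, port names, and update rule are invented for illustration:

```python
# Toy sketch of module integration via declared ports (not Neurokernel's API):
# each module exposes named input/output ports, and an executor moves data
# between connected ports regardless of the modules' internal designs.

class Module:
    def __init__(self, name, in_ports, out_ports):
        self.name = name
        self.inputs = {p: 0.0 for p in in_ports}
        self.out_ports = out_ports

    def step(self):
        """Internal design is opaque; only the port interface is mandated.
        Here: emit the summed inputs plus one, on every output port."""
        total = sum(self.inputs.values())
        return {p: total + 1.0 for p in self.out_ports}

def run(modules, connections, steps):
    """connections maps (src_module, src_port) -> (dst_module, dst_port)."""
    for _ in range(steps):
        # All modules step on a common clock, then data moves between ports.
        outputs = {m.name: m.step() for m in modules}
        for (sm, sp), (dm, dp) in connections.items():
            dst = next(m for m in modules if m.name == dm)
            dst.inputs[dp] = outputs[sm][sp]

# Two independently authored modules (named after the paper's neuropils)
# integrate purely through their declared ports.
retina = Module("retina", in_ports=[], out_ports=["R1"])
lamina = Module("lamina", in_ports=["R1"], out_ports=["L1"])
run([retina, lamina], {("retina", "R1"): ("lamina", "R1")}, steps=3)
```

The point of the design is that either module's internals can be replaced without touching the executor, as long as the port contract is honored.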
Benefits and Challenges of Linking Green Infrastructure and Highway Planning in the United States
NASA Astrophysics Data System (ADS)
Marcucci, Daniel J.; Jordan, Lauren M.
2013-01-01
Landscape-level green infrastructure creates a network of natural and semi-natural areas that protects and enhances ecosystem services, regenerative capacities, and ecological dynamism over long timeframes. It can also enhance quality of life and certain economic activity. Highways create a network for moving goods and services efficiently, enabling commerce, and improving mobility. A fundamentally profound conflict exists between transportation planning and green infrastructure planning because they both seek to create connected, functioning networks across the same landscapes and regions, but transportation networks, especially in the form of highways, fragment and disconnect green infrastructure networks. A key opportunity has emerged in the United States during the last ten years with the promotion of measures to link transportation and environmental concerns. In this article we examined the potential benefits and challenges of linking landscape-level green infrastructure planning and implementation with integrated transportation planning and highway project development in the United States policy context. This was done by establishing a conceptual model that identified logical flow lines from planning to implementation as well as the potential interconnectors between green infrastructure and highway infrastructure. We analyzed the relationship of these activities through literature review, policy analysis, and a case study of a suburban Maryland, USA landscape. We found that regionally developed and adopted green infrastructure plans can be instrumental in creating more responsive regional transportation plans and streamlining the project environmental review process while improving outcomes by enabling more targeted mitigation. In order for benefits to occur, however, landscape-scale green infrastructure assessments and plans must be in place before integrated transportation planning and highway project development occurs.
It is in the transportation community's interests to actively facilitate green infrastructure planning because it creates a more predictable environmental review context. On the other hand, for landscape-level green infrastructure, transportation planning and development is much more established and better funded and can provide a means of supporting green infrastructure planning and implementation, thereby enhancing conservation of ecological function.
NASA Technical Reports Server (NTRS)
Scoggins, J. R.; Smith, O. E.
1973-01-01
A tabulation is given of rawinsonde data for NASA's first Atmospheric Variability Experiment (AVE 1) conducted during the period February 19-22, 1964. Methods of data handling and processing, and estimates of error magnitudes, are also given. Data taken on the AVE 1 project in 1964 enabled an analysis of a large sector of the eastern United States on a fine-resolution time scale. This experiment was run in February 1964, and data were collected as a wave developed in the East Gulf on a frontal system which extended through the eastern part of the United States. The primary objective of AVE 1 was to investigate the variability of parameters in space and over time intervals of three hours, and to integrate the results into NASA programs which require this type of information. The results presented are those from one approach, and represent only a portion of the total research effort that can be accomplished.
Kinetic Roughening Transition and Energetics of Tetragonal Lysozyme Crystal Growth
NASA Technical Reports Server (NTRS)
Gorti, Sridhar; Forsythe, Elizabeth L.; Pusey, Marc L.
2004-01-01
Interpretation of lysozyme crystal growth rates using well-established physical theories enabled the discovery of a phenomenon possibly indicative of kinetic roughening. For example, lysozyme crystals grown above a critical supersaturation sigma (where sigma = ln(c/c_eq), c is the protein concentration, and c_eq is the solubility concentration) exhibit microscopically rough surfaces due to the continuous addition of growth units anywhere on the surface of a crystal. The rate of crystal growth, V_c, for the continuous growth process is determined by the continuous flux of macromolecules onto a unit area of the crystal surface, a, from a distance, xi, per unit time due to diffusion, and by a probability of attachment onto the crystal surface. Based upon the models applied, the energetics of lysozyme crystal growth was determined. The magnitudes of the energy barriers of crystal growth for both the (110) and (101) faces of tetragonal lysozyme crystals are compared. Finally, evidence supportive of the kinetic roughening hypothesis is presented.
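The driving force defined in the abstract is straightforward to compute. The concentration values and the critical-supersaturation threshold below are assumed for illustration and are not taken from the study:

```python
import math

def supersaturation(c, c_eq):
    """sigma = ln(c / c_eq), the thermodynamic driving force for growth."""
    return math.log(c / c_eq)

# Illustrative (assumed) concentrations in mg/ml, not values from the study.
c, c_eq = 50.0, 5.0
sigma = supersaturation(c, c_eq)

# Above some critical sigma, growth proceeds by continuous addition of
# units anywhere on the surface (a kinetically rough surface); below it,
# growth proceeds layer by layer. The threshold here is purely illustrative.
sigma_critical = 1.6
rough = sigma > sigma_critical
```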
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-26
... rehabilitation (VR) unit personnel in program areas essential to the effective management of the unit's program of VR services and in skill areas that will enable personnel to improve their ability to provide VR services leading to employment outcomes for individuals with disabilities. The State VR Unit In- Service...
Unit Mastery Learning in an Introductory Geography Course
ERIC Educational Resources Information Center
Healy, John R.; Stephenson, Larry K.
1975-01-01
The unit mastery learning system is a method of individualized, self-paced learning which, through repeatable testing, enables students to attain a mastery of the content of one unit before proceeding to the next in the program. This article describes the unit mastery learning system and its application in an introductory geography course at Hilo…
A compact high-resolution 3-D imaging spectrometer for discovering Oases on Mars
Ge, J.; Ren, D.; Lunine, J.I.; Brown, R.H.; Yelle, R.V.; Soderblom, L.A.; ,
2002-01-01
A new design is presented for a very lightweight, very high-throughput reflectance spectrometer enabled by two new technologies under development. These technologies are integral field unit optics, to enable simultaneous imaging and spectroscopy at high spatial resolution with an infrared (IR) array, and silicon grisms, to enable compact, high-resolution spectroscopy.
28 CFR 51.15 - Enabling legislation and contingent or nonuniform requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) PROCEDURES FOR THE ADMINISTRATION OF SECTION 5 OF THE VOTING RIGHTS ACT OF 1965, AS AMENDED General... legislation (1) that enables or permits the State or its political subunits to institute a voting change or (2) that requires or enables the State or its political sub-units to institute a voting change upon some...
28 CFR 51.15 - Enabling legislation and contingent or nonuniform requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
...) PROCEDURES FOR THE ADMINISTRATION OF SECTION 5 OF THE VOTING RIGHTS ACT OF 1965, AS AMENDED General... legislation (1) that enables or permits the State or its political subunits to institute a voting change or (2) that requires or enables the State or its political sub-units to institute a voting change upon some...
Recalculation of dose for each fraction of treatment on TomoTherapy.
Thomas, Simon J; Romanchikova, Marina; Harrison, Karl; Parker, Michael A; Bates, Amy M; Scaife, Jessica E; Sutcliffe, Michael P F; Burnet, Neil G
2016-01-01
The VoxTox study, linking delivered dose to toxicity, requires recalculation of typically 20-37 fractions per patient for nearly 2000 patients. This requires a non-interactive interface permitting batch calculation on multiple computers. Data are extracted from the TomoTherapy® archive and processed using the computational task-management system GANGA. Doses are calculated for each fraction of radiotherapy using the daily megavoltage (MV) CT images. The calculated dose cube is saved as a Digital Imaging and Communications in Medicine (DICOM) RTDOSE object, which can then be read by utilities that calculate dose-volume histograms or dose surface maps. The rectum is delineated on daily MV images using an implementation of the Chan-Vese algorithm. On a cluster of up to 117 central processing units, dose cubes for all fractions of 151 patients took 12 days to calculate. Outlining the rectum on all slices and fractions for 151 patients took 7 h. We also present results of the Hounsfield unit (HU) calibration of TomoTherapy MV images, measured over an 8-year period, showing that the HU calibration has become less variable over time, with no large changes observed after 2011. We have developed a system for automatic dose recalculation of TomoTherapy dose distributions. This does not tie up the clinically needed planning system but can be run on a cluster of independent machines, enabling recalculation of delivered dose without user intervention. The use of a task-management system for automation of dose calculation and outlining enables work to be scaled up to the level required for large studies.
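The downstream step of turning a recalculated dose cube into a dose-volume histogram can be sketched with NumPy alone. The array shapes, the spherical "structure" mask, and the dose range below are toy assumptions, not the study's actual geometry:

```python
import numpy as np

def cumulative_dvh(dose, mask, bin_width=0.5):
    """Cumulative dose-volume histogram for the voxels selected by mask.

    Returns (dose_bins, volume_fraction): for each dose level d,
    the fraction of the structure receiving at least d Gy.
    """
    d = dose[mask]
    edges = np.arange(0.0, d.max() + bin_width, bin_width)
    # Fraction of structure voxels with dose >= each dose level.
    frac = np.array([(d >= e).mean() for e in edges])
    return edges, frac

# Toy 3-D dose cube and a spherical structure mask (assumed geometry,
# standing in for a delineated rectum on a daily MV image).
rng = np.random.default_rng(1)
dose = rng.uniform(0.0, 2.0, size=(20, 20, 20))
zz, yy, xx = np.ogrid[:20, :20, :20]
mask = (zz - 10) ** 2 + (yy - 10) ** 2 + (xx - 10) ** 2 <= 25

bins, vol = cumulative_dvh(dose, mask, bin_width=0.25)
# vol starts at 1.0 (every voxel receives >= 0 Gy) and is non-increasing.
```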
Qualification and Reliability for MEMS and IC Packages
NASA Technical Reports Server (NTRS)
Ghaffarian, Reza
2004-01-01
Advanced IC electronic packages are moving toward miniaturization through two distinct approaches, front-end and back-end processes, each with its own challenges. Moving more of the back-end process to the front end, e.g., microelectromechanical systems (MEMS) wafer-level packaging (WLP), enables reduced size and cost. Use of direct flip-chip die is the most efficient approach if and when the issues of known good die and board assembly are resolved. Wafer-level packaging solves the known-good-die issue by enabling package-level test, but it has its own limitations, e.g., I/O count, additional cost, and reliability. From the back-end approach, system-in-a-package (SIAP/SIP) development is a response to an increasing demand for package and die integration of different functions into one unit to reduce size and cost and improve functionality. MEMS add another challenging dimension to electronic packaging because they include moving mechanical elements. Conventional qualification and reliability approaches must in most cases be modified and expanded in order to detect new, unknown failures. This paper reviews four standards, already released or in development, that specifically address the qualification and reliability of assembled packages. Exposure to thermal cycles, monotonic bend testing, mechanical shock, and drop are covered in these specifications. Finally, mechanical and thermal-cycle qualification data generated for a MEMS accelerometer are presented. The MEMS device was an element of an inertial measurement unit (IMU) qualified for NASA's Mars Exploration Rovers (MER), Spirit and Opportunity, which are currently roving the Martian surface.
Enabling Housing Cooperatives: policy lessons from Sweden, India and the United States.
Ganapati, Sukumar
2010-01-01
Housing cooperatives became active in urban areas in Sweden, India and the United States during the interwar period. Yet, after the second world war, while housing cooperatives grew phenomenally nationwide in Sweden and India, they did not do so in the United States. This article makes a comparative institutional analysis of the evolution of housing cooperatives in these three countries. The analysis reveals that housing cooperatives' relationship with the state and the consequent support structures explain the divergent evolution. Although the relationships between cooperatives and the state evolved over time, they can be characterized as embedded autonomy, overembeddedness and disembeddedness in Sweden, India and the United States respectively. Whereas the consequent support structures for housing cooperatives became well developed in Sweden and India, such structures have been weak in the United States. The article highlights the need for embedded autonomy and the need for supportive structures to enable the growth of housing cooperatives.
Gas-Liquid Processing in Microchannels
DOE Office of Scientific and Technical Information (OSTI.GOV)
TeGrotenhuis, Ward E.; Stenkamp, Victoria S.; Twitchell, Alvin
Processing gases and liquids together in microchannels having at least one dimension <1 mm has unique advantages for rapid heat and mass transfer. One approach for managing the two phases is to use porous structures as wicks within microchannels to segregate the liquid phase from the gas phase. Gas-liquid processing is accomplished by providing a gas flow path and inducing flow of the liquid phase through or along the wick under an induced pressure gradient. A variety of unit operations are enabled, including phase separation, partial condensation, absorption, desorption, and distillation. Results are reported of an investigation of microchannel phase separation in a transparent, single-channel device. Next, heat exchange is integrated with the microchannel wick approach to create a partial condenser that also separates the condensate. Finally, the scale-up to a multi-channel phase separator is described.
Suciu, George; Suciu, Victor; Martian, Alexandru; Craciunescu, Razvan; Vulpe, Alexandru; Marcu, Ioana; Halunga, Simona; Fratu, Octavian
2015-11-01
Big data storage and processing are considered as one of the main applications for cloud computing systems. Furthermore, the development of the Internet of Things (IoT) paradigm has advanced the research on Machine to Machine (M2M) communications and enabled novel tele-monitoring architectures for E-Health applications. However, there is a need for converging current decentralized cloud systems, general software for processing big data and IoT systems. The purpose of this paper is to analyze existing components and methods of securely integrating big data processing with cloud M2M systems based on Remote Telemetry Units (RTUs) and to propose a converged E-Health architecture built on Exalead CloudView, a search based application. Finally, we discuss the main findings of the proposed implementation and future directions.
NASA Technical Reports Server (NTRS)
Stoutemyer, D. R.
1977-01-01
The computer algebra language MACSYMA enables the programmer to include symbolic physical units in computer calculations, and it features automatic detection of dimensionally inhomogeneous formulas and conversion of inconsistent units in a dimensionally homogeneous formula. Several examples illustrate these features.
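The dimensional-inhomogeneity check described above can be sketched in a few lines by representing each dimension as a vector of base-unit exponents. This is a minimal illustration of the idea, not MACSYMA's actual mechanism:

```python
# Represent a dimension as a dict of base-unit exponents, e.g. m/s**2.
def dim(**exponents):
    return {u: e for u, e in exponents.items() if e != 0}

def mul(a, b):
    """Dimension of a product: exponents of each base unit add."""
    out = dict(a)
    for u, e in b.items():
        out[u] = out.get(u, 0) + e
    return {u: e for u, e in out.items() if e != 0}

def check_sum(*dims):
    """All terms of a sum must share one dimension; otherwise the formula
    is dimensionally inhomogeneous (what MACSYMA flags automatically)."""
    if any(d != dims[0] for d in dims[1:]):
        raise ValueError("dimensionally inhomogeneous formula")
    return dims[0]

metre = dim(m=1)
second = dim(s=1)
velocity = mul(metre, dim(s=-1))   # m/s
accel = mul(velocity, dim(s=-1))   # m/s**2

# s = v*t + (1/2)*a*t**2: both terms reduce to metres, so this passes;
# adding, say, metres to seconds would raise.
length = check_sum(mul(velocity, second), mul(mul(accel, second), second))
```

Unit conversion within a homogeneous formula would additionally carry a scale factor per unit, which this sketch omits.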
Wittmann, Marc
2011-01-01
It has been suggested that perception and action can be understood as evolving in temporal epochs or sequential processing units. Successive events are fused into units forming a unitary experience or “psychological present.” Studies have identified several temporal integration levels on different time scales which are fundamental for our understanding of behavior and subjective experience. In recent literature concerning the philosophy and neuroscience of consciousness these separate temporal processing levels are not always precisely distinguished. Therefore, empirical evidence from psychophysics and neuropsychology on these distinct temporal processing levels is presented and discussed within philosophical conceptualizations of time experience. On an elementary level, one can identify a functional moment, a basic temporal building block of perception in the range of milliseconds that defines simultaneity and succession. Below a certain threshold temporal order is not perceived, individual events are processed as co-temporal. On a second level, an experienced moment, which is based on temporal integration of up to a few seconds, has been reported in many qualitatively different experiments in perception and action. It has been suggested that this segmental processing mechanism creates temporal windows that provide a logistical basis for conscious representation and the experience of nowness. On a third level of integration, continuity of experience is enabled by working memory in the range of multiple seconds allowing the maintenance of cognitive operations and emotional feelings, leading to mental presence, a temporal window of an individual’s experienced presence. PMID:22022310
In Silico Dynamics: computer simulation in a Virtual Embryo
Abstract: Utilizing cell biological information to predict higher order biological processes is a significant challenge in predictive toxicology. This is especially true for highly dynamical systems such as the embryo, where morphogenesis, growth and differentiation require precisely orchestrated interactions between diverse cell populations. In patterning the embryo, genetic signals set up spatial information that cells then translate into a coordinated biological response. This can be modeled as ‘biowiring diagrams’ representing genetic signals and responses. Because the hallmark of multicellular organization resides in the ability of cells to interact with one another via well-conserved signaling pathways, multiscale computational (in silico) models that enable these interactions provide a platform to translate cellular-molecular perturbations into higher order predictions. Just as ‘the Cell’ is the fundamental unit of biology, so too should it be the computational unit (‘Agent’) for modeling embryogenesis. As such, we constructed multicellular agent-based models (ABM) with ‘CompuCell3D’ (www.compucell3d.org) to simulate kinematics of complex cell signaling networks and enable critical tissue events for use in predictive toxicology. Seeding the ABMs with HTS/HCS data from ToxCast demonstrated the potential to predict, quantitatively, the higher order impacts of chemical disruption at the cellular or biochemical level.
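The "biowiring" idea of cells translating a spatial signal into a coordinated response can be illustrated with a toy agent-based model. This is not CompuCell3D; the 1-D geometry, diffusion scheme, thresholds, and fate names are all invented for illustration:

```python
import numpy as np

# Toy agent-based sketch: cell-agents on a 1-D line of tissue read a
# diffusing morphogen gradient and adopt a fate by threshold -- a minimal
# "biowiring" rule mapping local signal level to cellular response.

def diffuse(signal, steps, rate=0.2, source_value=1.0):
    """Explicit finite-difference diffusion with a fixed source at cell 0."""
    s = signal.copy()
    for _ in range(steps):
        s[0] = source_value
        s[1:-1] += rate * (s[2:] - 2 * s[1:-1] + s[:-2])
    return s

def cell_fates(signal, high=0.5, low=0.1):
    """Each agent thresholds its local signal into one of three fates."""
    fates = np.full(signal.size, "posterior", dtype=object)
    fates[signal >= low] = "middle"
    fates[signal >= high] = "anterior"
    return fates

signal = diffuse(np.zeros(50), steps=500)
fates = cell_fates(signal)
# Fates form ordered bands: "anterior" near the source, "posterior" far away.
```

Perturbing such a model, e.g. lowering the source strength to mimic a chemical lesion, shifts the fate boundaries, which is the kind of higher-order prediction the abstract describes.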
Implementation of evidence-based stroke care: enablers, barriers, and the role of facilitators
Purvis, Tara; Moss, Karen; Denisenko, Sonia; Bladin, Chris; Cadilhac, Dominique A
2014-01-01
A stroke care strategy was developed in 2007 to improve stroke services in Victoria, Australia. Eight stroke network facilitators (SNFs) were appointed in selected hospitals to enable the establishment of stroke units, develop thrombolysis services, and implement protocols. We aimed to explain the main issues being faced by clinicians in providing evidence-based stroke care, and to determine if the appointment of an SNF was perceived as an acceptable strategy to improve stroke care. Face-to-face semistructured interviews were used in a qualitative research design. Interview transcripts were verified by respondents prior to coding. Two researchers conducted thematic analysis of major themes and subthemes. Overall, 84 hospital staff participated in 33 interviews during 2008. The common factors found to impact on stroke care included staff and equipment availability, location of care, inconsistent use of clinical pathways, and professional beliefs. Other barriers included limited access to specialist clinicians and workload demands. The establishment of dedicated stroke units was considered essential to improve the quality of care. The SNF role was valued for identifying gaps in care and providing capacity to change clinical processes. This is the first large, qualitative multicenter study to describe issues associated with delivering high-quality stroke care and the potential benefits of SNFs to facilitate these improvements. PMID:25246799
NASA Technical Reports Server (NTRS)
Young, Kelsey E.; Evans, C. A.; Hodges, K. V.
2012-01-01
While traditional geologic mapping includes the examination of structural relationships between rock units in the field, more advanced technology now enables us to simultaneously collect and combine analytical datasets with field observations. Information about tectonomagmatic processes can be gleaned from these combined data products. Historically, construction of multi-layered field maps that include sample data has been accomplished serially (first map and collect samples, analyze samples, combine data, and finally, readjust maps and conclusions about geologic history based on combined data sets). New instruments that can be used in the field, such as a handheld X-ray fluorescence (XRF) unit, are now available. Targeted use of such instruments enables geologists to collect preliminary geochemical data while in the field so that they can optimize scientific data return from each field traverse. Our study tests the application of this technology and projects the benefits gained by real-time geochemical data in the field. The integrated data set produces a richer geologic map and facilitates a stronger contextual picture for field geologists when collecting field observations and samples for future laboratory work. Real-time geochemical data on samples also provide valuable insight regarding sampling decisions by the field geologist.
NBC detection in air and water
NASA Technical Reports Server (NTRS)
Hartley, Frank T.; Smith, Steven J.; McMurtry, Gary M.
2003-01-01
This work is part of a Navy STTR project to develop a system capable of 'real-time' detection and quantification of nuclear, biological and chemical (NBC) warfare agents, and of related industrial chemicals including NBC agent synthesis by-products, in water and in air immediately above the water's surface. The project uses JPL's Soft Ionization Membrane (SIM) technology, which totally ionizes molecules without fragmentation (a process that can markedly improve the sensitivity and specificity of molecular composition identification), and JPL's Rotating Field Mass Spectrometer (RFMS) technology, which has a dynamic mass range large enough to enable detection of nuclear materials as well as biological and chemical agents. The Navy project integrates these JPL technologies into the Remote Environmental Monitoring UnitS (REMUS), an autonomous underwater vehicle (AUV). It is anticipated that the REMUS AUV will be capable of 'real-time' detection and quantification of NBC warfare agents.
[Knowledge transfer to prevent falls in a cardiovascular setting].
Malouin-Benoit, Marie-Christine; Cossette, Sylvie
2012-01-01
The objective of the clinical project was to plan and deploy a knowledge translation approach to prevent falls among elderly patients hospitalized in a unit of cardiovascular medicine. A combination of education strategies built around interactive workshops enabled the implementation of a screening tool and of an up-to-date preventive intervention guide. Twenty-four workshops were conducted in all three work shifts and an implementation follow-up was made. The participation rate was 93% of the unit's active staff. The increased use of prevention tools and of an intervention guide to prevent falls suggests an increased level of awareness as a result of the project. The staff expressed their satisfaction on having been consulted and involved early in the implementation process. Moreover, the flexible schedule and focus on a bilateral sharing of knowledge through brief interactive workshops were appreciated.
NASA Astrophysics Data System (ADS)
Pothof, F.; Bonini, L.; Lanzilotto, M.; Livi, A.; Fogassi, L.; Orban, G. A.; Paul, O.; Ruther, P.
2016-08-01
Objective. Drug resistant focal epilepsy can be treated by resecting the epileptic focus, requiring a precise focus localisation using stereoelectroencephalography (SEEG) probes. As commercial SEEG probes offer only a limited spatial resolution, probes of higher channel count and design freedom enabling the incorporation of macro- and microelectrodes would help increase spatial resolution and thus open new perspectives for investigating mechanisms underlying focal epilepsy and its treatment. This work describes a new fabrication process for SEEG probes with materials and dimensions similar to clinical probes, enabling the recording of single-neuron activity at high spatial resolution. Approach. Polyimide is used as a biocompatible flexible substrate into which platinum electrodes and leads are integrated with a minimal feature size of 5 μm. The polyimide foils are rolled into the cylindrical probe shape at a diameter of 0.8 mm. The resulting probe features match those of clinically approved devices. Tests in saline solution confirmed the probe stability and functionality. Probes were implanted into the brain of one monkey (Macaca mulatta), trained to perform different motor tasks. Suitable configurations including up to 128 electrode sites allow the recording of task-related neuronal signals. Main results. Probes with 32 and 64 electrode sites were implanted in the posterior parietal cortex. Local field potentials and multi-unit activity were recorded as early as one hour after implantation. Stable single-unit activity was achieved for up to 26 days after implantation of a 64-channel probe. All recorded signals showed modulation during task execution. Significance. With the novel probes it is possible to record stable biologically relevant data over a time span exceeding the usual time needed for epileptic focus localisation in human patients.
This is the first time that single units are recorded along cylindrical polyimide probes chronically implanted 22 mm deep into the brain of a monkey, which suggests the potential usefulness of this probe for human applications.
[Assessing program sustainability in public health organizations: a tool-kit application in Haiti].
Ridde, V; Pluye, P; Queuille, L
2006-10-01
Public health stakeholders are concerned about program sustainability, yet they usually conceive of it in primarily financial terms, partly because few frameworks are operationally and theoretically sound enough to evaluate program sustainability as a whole. The present paper describes the application of one assessment framework to evaluate the sustainability level and process of a Nutritional Care Unit managed by a Swiss humanitarian agency to fight severe child malnutrition in a Haitian area. The managing agency is committed to transferring this Unit back into the structure of a local public hospital. The evaluation was performed within the sustainability framework proposed in a former article. Data were collected with a combination of tools: semi-structured interviews (n=33, medical and support staff from the agency and the hospital), participatory observation, and document review. Data concerned the four characteristics of organizational routines (memory, adaptation, values and rules) enabling assessment of the level of sustainability. In addition, data were related to three types of events distinguishing routinization processes from implementation processes: specific events of routinization, routinization-implementation joint events, and specific events of implementation. Data analysis was thematic, and results were validated by actors through a feedback session and written comments. The current level of sustainability of the Nutritional Care Unit within the hospital is weak: weak memory, high adaptation, weak sharing of values and rules. This may be explained by the sustainability process and the absence of specific routinization events. The relevance of such processes is reasonable, although it has been strongly challenged in the troubled Haitian context: riots have been widespread over recent years, creating difficulties for the hospital. 
This experience suggests the proposed framework and sustainability assessment tools are useful when the context permits scrutiny of program sustainability.
[The role of a specialised risk analysis group in the Veterinary Services of a developing country].
Urbina-Amarís, M E
2003-08-01
Since the World Trade Organization (WTO) Agreement on the Application of Sanitary and Phytosanitary Measures was established, risk analysis in trade, and ultimately in Veterinary and Animal Health Services, has become strategically important. Irrespective of their concept (discipline, approach, method, process), all types of risk analysis in trade involve four phases: risk identification, risk assessment, risk management, and risk information or communication. All veterinarians involved in a risk analysis unit must have in-depth knowledge of statistics and the epidemiology of transmissible diseases, as well as a basic knowledge of veterinary science, economics, mathematics, data processing and social communication, to enable them to work with professionals in these disciplines. Many developing countries do not have enough well-qualified professionals in these areas to support a risk analysis unit. This will need to be rectified by seeking strategic alliances with other public or private sectors that will provide the required support to run the unit properly. Due to the special nature of its risk analysis functions, its role in supporting decision-making, and the criteria of independence and transparency that are so crucial to its operations, the hierarchical position of the risk analysis unit should be close to the top management of the Veterinary Service. Due to the shortage of personnel in developing countries with the required training and scientific and technical qualifications, countries with organisations responsible for both animal and plant health protection would be advised to set up integrated plant and animal risk analysis units. In addition, these units could take charge of all activities relating to WTO agreements and regional agreements on animal and plant health management.
Pre- and post-drill comparison of the Mount Elbert gas hydrate prospect, Alaska North Slope
Lee, M.W.; Agena, W.F.; Collett, T.S.; Inks, T.L.
2011-01-01
In 2006, the United States Geological Survey (USGS) completed a detailed analysis and interpretation of available 2-D and 3-D seismic data, along with seismic modeling and correlation with specially processed downhole well log data, for identifying potential gas hydrate accumulations on the North Slope of Alaska. A methodology was developed for identifying sub-permafrost gas hydrate prospects within the gas hydrate stability zone in the Milne Point area. The study revealed a total of 14 gas hydrate prospects in this area. In order to validate the gas hydrate prospecting protocol of the USGS and to acquire critical reservoir data needed to develop a longer-term production testing program, a stratigraphic test well was drilled at the Mount Elbert prospect in the Milne Point area in early 2007. The drilling confirmed the presence of two prominent gas-hydrate-bearing units in the Mount Elbert prospect, and high quality well logs and core data were acquired. The post-drill results indicate pre-drill predictions of the reservoir thickness and the gas-hydrate saturations based on seismic and existing well data were 90% accurate for the upper unit (hydrate unit D) and 70% accurate for the lower unit (hydrate unit C), confirming the validity of the USGS approach to gas hydrate prospecting. The Mount Elbert prospect is the first gas hydrate accumulation on the North Slope of Alaska identified primarily on the basis of seismic attribute analysis and specially processed downhole log data. Post-drill well log data enabled a better constraint of the elastic model and the development of an improved approach to gas hydrate prospecting using seismic attributes. © 2009.
DOE Office of Scientific and Technical Information (OSTI.GOV)
REN, GANG; LIU, JINXIN; LI, HONGCHANG
A closed-loop proportional-integral (PI) control software is provided for fully mechanically controlled automated electron microscopic tomography. The software is developed based on Gatan DigitalMicrograph and is compatible with the Zeiss LIBRA 120 transmission electron microscope, but it can be extended to other TEM instruments with modification. The software consists of a graphical user interface, a digital PI controller, an image analyzing unit, and other drive units (i.e., an image acquisition unit and a goniometer drive unit). During a tomography data collection process, the image analyzing unit analyzes both the accumulated shift and the defocus value of the latest acquired image, and provides the results to the digital PI controller. The digital PI controller compares the results with the preset values and determines the optimum adjustments of the goniometer. The goniometer drive unit adjusts the spatial position of the specimen according to the instructions given by the digital PI controller for the next tilt angle and image acquisition. The goniometer drive unit achieves high-precision positioning by using a backlash elimination method. The major benefits of the software are: 1) the goniometer drive unit keeps pre-aligned/optimized beam conditions unchanged and achieves position tracking solely through mechanical control; 2) the image analyzing unit relies only on historical data and therefore does not require additional images/exposures; 3) the PI controller enables the system to dynamically track the imaging target with extremely low system error.
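The tracking loop described above follows the standard discrete PI form: the measured image shift is the feedback signal, and the controller output becomes the goniometer correction. A minimal sketch of that idea (all names and gains are hypothetical; this is not the actual DigitalMicrograph script):

```python
class PIController:
    """Discrete proportional-integral controller for position tracking."""

    def __init__(self, kp, ki, setpoint=0.0):
        self.kp = kp            # proportional gain
        self.ki = ki            # integral gain
        self.setpoint = setpoint
        self.integral = 0.0     # accumulated error

    def update(self, measured, dt=1.0):
        """Return the stage correction for the latest measured shift."""
        error = self.setpoint - measured
        self.integral += error * dt
        return self.kp * error + self.ki * self.integral


# Example: drive an accumulated image shift back toward zero over tilt steps.
pi = PIController(kp=0.6, ki=0.1)
shift = 5.0  # nm, measured from the latest acquired image
for _ in range(20):
    correction = pi.update(shift)
    shift += correction * 0.8   # toy plant: stage applies ~80% of the command
```

After a handful of tilt steps the residual shift settles near zero, which is the "extremely low system error" behavior the abstract refers to.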
Leontjevas, Ruslan; Gerritsen, Debby L; Koopmans, Raymond T C M; Smalbrugge, Martin; Vernooij-Dassen, Myrra J F J
2012-06-01
A multidisciplinary, evidence-based care program to improve the management of depression in nursing home residents was implemented and tested using a stepped-wedge design in 23 nursing homes (NHs): "Act in case of Depression" (AiD). Before effect analyses, the aim was to evaluate AiD process data on sampling quality (recruitment and randomization, reach) and intervention quality (relevance and feasibility, extent to which AiD was performed), which can be used for understanding internal and external validity. In this article, a model is presented that divides process evaluation data into first- and second-order process data. Qualitative and quantitative data based on personal files of residents, interviews of nursing home professionals, and a research database were analyzed according to the following process evaluation components: sampling quality and intervention quality. The setting was the nursing home. The pattern of residents' informed consent rates differed for dementia special care units and somatic units during the study. The nursing home staff was satisfied with the AiD program and reported that the program was feasible and relevant. With the exception of the first screening step (nursing staff members using a short observer-based depression scale), AiD components were not performed fully by NH staff as prescribed in the AiD protocol. Although NH staff found the program relevant and feasible and was satisfied with the program content, individual AiD components may have different feasibility. The results on sampling quality implied that statistical analyses of AiD effectiveness should account for the type of unit, whereas the findings on intervention quality implied that, next to the type of unit, analyses should account for the extent to which individual AiD program components were performed. In general, our first-order process data evaluation confirmed the internal and external validity of the AiD trial, and this evaluation enabled further statistical fine-tuning.
The importance of evaluating the first-order process data before executing statistical effect analyses is thus underlined. Copyright © 2012 American Medical Directors Association, Inc. Published by Elsevier Inc. All rights reserved.
Dragas, Jelena; Viswam, Vijay; Shadmani, Amir; Chen, Yihui; Bounik, Raziyeh; Stettler, Alexander; Radivojevic, Milos; Geissler, Sydney; Obien, Marie; Müller, Jan; Hierlemann, Andreas
2017-06-01
Biological cells are characterized by highly complex phenomena and processes that are, to a great extent, interdependent. To gain detailed insights, devices designed to study cellular phenomena need to enable tracking and manipulation of multiple cell parameters in parallel; they have to provide high signal quality and high spatiotemporal resolution. To this end, we have developed a CMOS-based microelectrode array system that integrates six measurement and stimulation functions, the largest number to date. Moreover, the system features the largest active electrode array area to date (4.48 × 2.43 mm²) to accommodate 59,760 electrodes, while its power consumption, noise characteristics, and spatial resolution (13.5 μm electrode pitch) are comparable to the best state-of-the-art devices. The system includes: 2,048 action-potential (AP, bandwidth: 300 Hz to 10 kHz) recording units, 32 local-field-potential (LFP, bandwidth: 1 Hz to 300 Hz) recording units, 32 current recording units, 32 impedance measurement units, and 28 neurotransmitter detection units, in addition to the 16 dual-mode voltage-only or current/voltage-controlled stimulation units. The electrode array architecture is based on a switch matrix, which allows for connecting any measurement/stimulation unit to any electrode in the array and for performing different measurement/stimulation functions in parallel.
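The switch-matrix architecture amounts to a crossbar routing table: any of the measurement/stimulation units can be assigned to any of the 59,760 electrodes. A toy model of that routing idea (the electrode count comes from the abstract; the class, unit names, and unconstrained routing are illustrative simplifications of the real hardware):

```python
class SwitchMatrix:
    """Toy crossbar: route any measurement/stimulation unit to any electrode."""

    def __init__(self, n_electrodes=59760):
        self.n_electrodes = n_electrodes
        self.routes = {}  # electrode index -> assigned unit name

    def connect(self, electrode, unit):
        """Assign a unit to an electrode site."""
        if not 0 <= electrode < self.n_electrodes:
            raise ValueError("electrode index out of range")
        self.routes[electrode] = unit


# Different functions can run in parallel on neighboring electrodes:
sm = SwitchMatrix()
sm.connect(12345, "AP-recording-0")   # spike band on one electrode
sm.connect(12346, "impedance-7")      # impedance measurement on its neighbor
```

The real device imposes routing constraints that this dictionary model ignores, but the key property is the same: the mapping from units to electrodes is reconfigurable rather than fixed.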
2017-06-01
Enabling Rapid Integration of Combined Arms Teams into a Brigade Combat Team Organizational Structure. Technical Report 1356, covering May 2012 - May 2014. Topics include organizational structure, fixed vs. mobile forward operating base (FOB) synchronization, prior preparation, and unit capabilities.
Fast simulation of Proton Induced X-Ray Emission Tomography using CUDA
NASA Astrophysics Data System (ADS)
Beasley, D. G.; Marques, A. C.; Alves, L. C.; da Silva, R. C.
2013-07-01
A new 3D Proton Induced X-Ray Emission Tomography (PIXE-T) and Scanning Transmission Ion Microscopy Tomography (STIM-T) simulation software has been developed in Java and uses NVIDIA™ Common Unified Device Architecture (CUDA) to calculate the X-ray attenuation for large detector areas. A challenge with PIXE-T is to get sufficient counts while retaining a small beam spot size. Therefore a high geometric efficiency is required. However, as the detector solid angle increases the calculations required for accurate reconstruction of the data increase substantially. To overcome this limitation, the CUDA parallel computing platform was used which enables general purpose programming of NVIDIA graphics processing units (GPUs) to perform computations traditionally handled by the central processing unit (CPU). For simulation performance evaluation, the results of a CPU- and a CUDA-based simulation of a phantom are presented. Furthermore, a comparison with the simulation code in the PIXE-Tomography reconstruction software DISRA (A. Sakellariou, D.N. Jamieson, G.J.F. Legge, 2001) is also shown. Compared to a CPU implementation, the CUDA based simulation is approximately 30× faster.
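The speedup comes from the fact that the X-ray attenuation can be evaluated independently for every (voxel, detector-element) pair, which maps naturally onto GPU threads. A language-agnostic sketch of that data-parallel core, with NumPy standing in for the CUDA kernel (the geometry and attenuation coefficient are invented for illustration; the actual software is written in Java with CUDA):

```python
import numpy as np

def attenuation_factors(path_lengths, mu):
    """Beer-Lambert attenuation exp(-mu * L) for every
    (voxel, detector-element) path length at once."""
    return np.exp(-mu * path_lengths)

# Example: a 64x64 voxel slice seen by 1024 detector elements.
rng = np.random.default_rng(0)
paths = rng.uniform(0.0, 0.05, size=(64 * 64, 1024))  # path lengths, cm
mu = 120.0  # linear attenuation coefficient, 1/cm (illustrative value)
att = attenuation_factors(paths, mu)
# Each element is a transmission fraction in (0, 1].
```

On a GPU, each thread would evaluate one (voxel, detector-element) pair, which is why enlarging the detector solid angle, and hence the number of pairs, favors the CUDA implementation.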
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maurer, S. A.; Kussmann, J.; Ochsenfeld, C., E-mail: Christian.Ochsenfeld@cup.uni-muenchen.de
2014-08-07
We present a low-prefactor, cubically scaling scaled-opposite-spin second-order Møller-Plesset perturbation theory (SOS-MP2) method which is highly suitable for massively parallel architectures like graphics processing units (GPUs). The scaling is reduced from O(N^5) to O(N^3) by a reformulation of the MP2 expression in the atomic orbital basis via Laplace transformation and the resolution-of-the-identity (RI) approximation of the integrals, in combination with efficient sparse algebra for the 3-center integral transformation. In contrast to previous works that employ GPUs for post-Hartree-Fock calculations, we do not simply employ GPU-based linear algebra libraries to accelerate the conventional algorithm. Instead, our reformulation allows us to replace the rate-determining contraction step with a modified J-engine algorithm, which has been proven to be highly efficient on GPUs. Thus, our SOS-MP2 scheme enables us to treat large molecular systems in an accurate and efficient manner on a single GPU server.
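The O(N^5) to O(N^3) reduction rests on the standard Laplace-transform identity that removes the orbital-energy denominator from the MP2 energy expression. With the (positive) denominator x = ε_a + ε_b − ε_i − ε_j, a short numerical quadrature replaces the exact integral:

```latex
\frac{1}{\varepsilon_a + \varepsilon_b - \varepsilon_i - \varepsilon_j}
  = \int_0^\infty e^{-(\varepsilon_a + \varepsilon_b - \varepsilon_i - \varepsilon_j)\,t}\,\mathrm{d}t
  \approx \sum_{\alpha=1}^{\tau} \omega_\alpha\,
    e^{-(\varepsilon_a + \varepsilon_b - \varepsilon_i - \varepsilon_j)\,t_\alpha}
```

Because the exponential factorizes into per-orbital terms, the energy can be recast in the atomic-orbital basis, where sparsity and the RI approximation make the cubically scaling, GPU-friendly contraction possible.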
Patching. Restitching business portfolios in dynamic markets.
Eisenhardt, K M; Brown, S L
1999-01-01
In turbulent markets, businesses and opportunities are constantly falling out of alignment. New technologies and emerging markets create fresh opportunities. Converging markets produce more. And of course, some markets fade. In this landscape of continuous flux, it's more important to build corporate-level strategic processes that enable dynamic repositioning than it is to build any particular defensible position. That's why smart corporate strategists use patching, a process of mapping and remapping business units to create a shifting mix of highly focused, tightly aligned businesses that can respond to changing market opportunities. Patching is not just another name for reorganizing; patchers have a distinctive mindset. Traditional managers see structure as stable; patching managers believe structure is inherently temporary. Traditional managers set corporate strategy first, but patching managers keep the organization focused on the right set of business opportunities and let strategy emerge from individual businesses. Although the focus of patching is flexibility, the process itself follows a pattern. Patching changes are usually small in scale and made frequently. Patching should be done quickly; the emphasis is on getting the patch about right and fixing problems later. Patches should have a test drive before they're formalized but then be tightly scripted after they've been announced. And patching won't work without the right infrastructure: modular business units, fine-grained and complete unit-level metrics, and companywide compensation parity. The authors illustrate how patching works and point out some common stumbling blocks.
Recent progress in continuous and semi-continuous processing of solid oral dosage forms: a review.
Teżyk, Michał; Milanowski, Bartłomiej; Ernst, Andrzej; Lulek, Janina
2016-08-01
Continuous processing is an innovative production concept well known and successfully used in other industries for many years. The modern pharmaceutical industry is facing the challenge of transition from a traditional manufacturing approach based on batch-wise production to a continuous manufacturing model. The aim of this article is to present technological progress in manufacturing based on continuous and semi-continuous processing of solid oral dosage forms. Single unit processes that offer an alternative processing pathway to batch-wise technology, or that with some modification may run continuously and thus switch seamlessly to continuous manufacturing, are briefly presented. Furthermore, the concept of semi-continuous processing is discussed. Subsequently, more sophisticated production systems created by coupling single unit processes and comprising all the steps of production, from powder to final dosage form, are reviewed. Finally, attempts at an end-to-end production approach, meaning the linking of continuous synthesis of API from intermediates with the production of the final dosage form, are described. A growing number of scientific articles show an increasing interest in changing the approach to the production of pharmaceuticals in recent years. Numerous scientific publications are a source of information on the progress of knowledge and achievements of continuous processing. These works often deal with how to modify or replace unit processes in order to enable switching them seamlessly to continuous processing. A growing number of research papers concentrate on integrated continuous manufacturing lines in which the production concept of "from powder to tablet" is realized. 
Four main domains are under investigation: the influence of process parameters on the properties of intermediates or final dosage forms, the implementation of process analytical tools, control and management systems responsible for maintaining continuous material flow through the whole manufacturing process, and the development of new computational methods to assess or simulate these new manufacturing techniques. The attempt to connect the primary and secondary production steps proves that development of continuously operating lines is possible. A mind-set change is needed to be able to face, and fully assess, the advantages and disadvantages of switching from batch to continuous mode production.
SIRU development. Volume 3: Software description and program documentation
NASA Technical Reports Server (NTRS)
Oehrle, J.
1973-01-01
The development and initial evaluation of a strapdown inertial reference unit (SIRU) system are discussed. The SIRU configuration is a modular inertial subsystem with hardware and software features that achieve fault-tolerant operational capabilities. The SIRU redundant hardware design is formulated around a six-gyro and six-accelerometer instrument module package. The six-axis array provides redundant independent sensing, and the symmetry enables the formulation of an optimal software redundant data processing structure with self-contained fault detection and isolation (FDI) capabilities. The basic SIRU software coding system used in the DDP-516 computer is documented.
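The core of redundant-sensor FDI is that six single-axis measurements of a 3-D rate vector are over-determined: any consistent set of measurements lies in the 3-D column space of the sensor geometry matrix, and the residual orthogonal to it (the parity vector) flags a failed instrument. A minimal sketch of that idea (the geometry below is a made-up six-axis arrangement, not the SIRU's actual dodecahedral axis layout):

```python
import numpy as np

# Hypothetical six-axis geometry: each row is a unit measurement direction.
H = np.array([
    [1, 0, 0], [0, 1, 0], [0, 0, 1],
    [1, 1, 0], [0, 1, 1], [1, 0, 1],
], dtype=float)
H /= np.linalg.norm(H, axis=1, keepdims=True)

def estimate_and_residual(z):
    """Least-squares body-rate estimate from 6 measurements,
    plus the parity residual norm used for fault detection."""
    x, *_ = np.linalg.lstsq(H, z, rcond=None)
    residual = z - H @ x          # component outside the column space of H
    return x, np.linalg.norm(residual)

omega = np.array([0.1, -0.2, 0.05])   # true body rate
z = H @ omega                          # healthy measurements: residual ~ 0
_, r_ok = estimate_and_residual(z)

z_fault = z.copy()
z_fault[3] += 0.5                      # hard failure on the fourth axis
_, r_bad = estimate_and_residual(z_fault)
# A residual far above the noise floor flags a failed instrument;
# comparing parity-equation subsets then isolates which one.
```

The symmetric axis arrangement in the real SIRU makes every sensor equally observable in the parity space, which is what makes the isolation step well conditioned.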
2003-12-01
The report covers a heating and cooling device; a multiple-tip STM, including the novel nanomanipulator MM3, a four-tip STM assembly, vibration analysis of the eddy-current damping system, and an active vibration damping system; and first results, including UHV-SEM imaging and relaxation of the STM unit with and without eddy-current damping.
Cost effective modular unit for cleaning oil and gas field waste water
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zinberg, M.B.; Nenasheva, M.N.; Gafarov, N.A.
1996-12-31
Problems of environmental control involving conservation of water resources are vital for the development of giant oil and gas condensate fields near the Caspian Sea (Russia), a region characterized by water shortages. One of the urgent tasks of the oil production industry is to use all field waste water, consisting of underground, processing and rain water. It was necessary to construct new, highly effective equipment for local waste water treatment. We now have at our disposal a technology and equipment that meet the requirements for treated water quality. We have installed a modular unit with a capacity of 100 m³ per day to clean waste water of oil products, suspended matter and other organic pollutants at the Orenburg oil and gas condensate field, Russia. The unit provides full treatment of produced water and comprises a settling tank with an adhesive facility, a number of sorption filters, Trofactor bioreactors and a disinfecting facility. The equipment is fitted into three boxes measuring 9 x 3.2 x 2.7 in each. The simple design saves money, time and space. The sorption filters and bioreactors, as well as the Trofactor process, are part of the know-how. In constructing the unit we applied well-known methods of settling and sorption. The mechanical cleaning process proceeds in the following sequence: (1) gravitational separation in a settling tank, where floated film oil products are continuously collected and sediment is periodically removed; (2) treatment of the settled water in sorption filters of a special kind.
28 CFR 0.111B - Witness Security Program.
Code of Federal Regulations, 2014 CFR
2014-07-01
... United States Marshals Service § 0.111B Witness Security Program. (a) In connection with the protection... potential witness, the Director of the United States Marshals Service and officers of the United States Marshals Service designated by the Director may: (1) Provide suitable documents to enable the person to...
28 CFR 0.111B - Witness Security Program.
Code of Federal Regulations, 2013 CFR
2013-07-01
... United States Marshals Service § 0.111B Witness Security Program. (a) In connection with the protection... potential witness, the Director of the United States Marshals Service and officers of the United States Marshals Service designated by the Director may: (1) Provide suitable documents to enable the person to...
NASA Technical Reports Server (NTRS)
Taylor, Leslie A.
1993-01-01
Technical innovations have converged with the exploding market demand for mobile telecommunications to create the impetus for low-earth orbit (LEO) communications satellite systems. The so-called 'Little LEO's' propose use of VHF and UHF spectrum to provide position - location and data messaging services. The so-called 'Big LEO's' propose to utilize the RDSS bands to provide voice and data services. In the United States, several applications were filed with the U.S. Federal Communications Commission (FCC) to construct and operate these mobile satellite systems. To enable the prompt introduction of such new technology services, the FCC is using innovative approaches to process the applications. Traditionally, when the FCC is faced with 'mutually exclusive' applications, e.g. a grant of one would preclude a grant of the others, it uses selection mechanisms such as comparative hearings or lotteries. In the case of the LEO systems, the FCC has sought to avoid these time-consuming approaches by using negotiated rulemakings. The FCC's objective is to enable the multiple applicants and other interested parties to agree on technical and service rules which will enable the grant of all qualified applications. With regard to the VHF/UHF systems, the Advisory Committee submitted a consensus report to the FCC. The process for the systems operating in the bands above 1 GHz involved more parties and more issues but still provided the FCC useful technical information to guide the adoption of rules for the new mobile satellite service.
NASA Astrophysics Data System (ADS)
Cherala, Anshuman; Sreenivasan, S. V.
2018-12-01
Complex nanoshaped structures (nanoshape structures here are defined as shapes enabled by sharp corners with radius of curvature <5 nm) have been shown to enable emerging nanoscale applications in energy, electronics, optics, and medicine. Fabricating such nanoshapes at high throughput is well beyond the capabilities of advanced optical lithography. While the highest-resolution e-beam processes (Gaussian beam tools with non-chemically amplified resists) can achieve <5 nm resolution, they do so only at very low throughput. Large-area e-beam processes, needed for photomasks and imprint templates, are limited to 18 nm half-pitch lines and spaces and 20 nm half-pitch hole patterns. Using nanoimprint lithography, we have previously demonstrated the ability to fabricate precise diamond-like nanoshapes with 3 nm radius corners over large areas. An exemplary shaped silicon nanowire ultracapacitor device was fabricated with these nanoshaped structures at a half-pitch of 100 nm. The device significantly exceeded standard nanowire capacitor performance (by 90%) due to the relative increase in surface area per unit projected area enabled by the nanoshape. Going beyond the previous work, in this paper we explore the scaling of these nanoshaped structures to 10 nm half-pitch and below. At these scales a new "shape retention" resolution limit is observed due to polymer relaxation in imprint resists, which cannot be predicted with a linear elastic continuum model. An all-atom molecular dynamics model of the nanoshape structure was developed here to study this shape retention phenomenon and accurately predict the polymer relaxation. The atomistic framework is an essential modeling and design tool for extending the capability of imprint lithography to sub-10 nm nanoshapes. It has been used here to propose process refinements that maximize shape retention and to design template assist features (design for nanoshape retention) to achieve targeted nanoshapes.
NASA Astrophysics Data System (ADS)
Brossier, J. F.; Stephan, K.; Jaumann, R.; Le Mouelic, S.; Brown, R. H.
2015-12-01
Since the equatorial regions of Titan have been fully observed by the Visible and Infrared Mapping Spectrometer (VIMS) [1], analysis of false-color composites allows three main units to be distinguished: bright, bluish, and brownish units [2-4]. This distinction can be enhanced by using ratios of VIMS channels, which emphasize subtle differences in the spectral behavior of the units, especially at short wavelengths (below 2 µm). The VIMS bluish unit is mostly enriched in water-ice particles, consisting of particles eroded from the high-standing water-ice substrate and deposited on the lowlands by fluvial/pluvial processes [5] and impacts [6]. This spectral unit is mainly located at the edges of the large bright plateaus, and is hence considered a transition zone to the VIMS brownish unit corresponding to the Radar dune fields [7]. These brownish dunes consist of atmospheric aerosols, named tholins [4], contaminated with water-ice particles. High-resolution VIMS observations (less than 1 km per pixel) show local transition zones between the bright material and the brownish dunes, suggesting weathering and erosional processes (e.g., Bohai Sinus and the Huygens landing site). The spectral variations in the bluish unit might be due to variations in physical properties related to erosional processes occurring on the bright plateaus [5,8], such as particle size and the degree of mixing with tholins. Our approach enables a better understanding of the distribution of water-ice grains, in terms of particle size and mixing with tholins, at local and global scales. References: [1] Brown, R. H. et al. (2005) SSR. [2] Barnes, J. W. et al. (2007) Icarus, 186 (1). [3] Soderblom, L. A. et al. (2007) PSS, 55 (13). [4] Langhans, M. H. et al. (2011) PSS, 60. [5] Jaumann, R. et al. (2008) Icarus, 197. [6] Le Mouelic, S. et al. (2008) JGR, 113 (E04003). [7] Rodriguez, S. et al. (2013) Icarus. [8] Jaumann, R. et al. (2009) LPSC.
Exploring a Community's Heritage through a Collaborative Unit of Study
ERIC Educational Resources Information Center
Bobetsky, Victor V.
2005-01-01
This article presents a model of an effective unit of study in which music played a vital role. The unit of study was created and implemented in a New York City middle school, and students examined an African American community in the borough of Brooklyn. The unit enabled students to explore the history, heritage, and culture of a local community…
[Criteria of quality of structure in rehabilitation units with inpatient treatment].
Klein, K; Farin, E; Jäckel, W H; Blatt, O; Schliehe, F
2004-04-01
The structure of a rehabilitation unit is an important feature of the quality of care. Adequate and qualitatively good structures provide the basis for appropriate therapy offers and treatment and, ultimately, better health outcomes for patients. Structural quality is generally documented without any evaluation of its individual aspects; the definition of standards is the prerequisite for such an evaluation. The project presented here aimed to define relevant structural standards for rehabilitation units providing inpatient treatment for musculoskeletal, cardiac, neurological, gastroenterological, oncological, pneumological and dermatological diseases. A distinction was made between basal criteria, which every inpatient rehabilitation unit must fulfil, and criteria important for the well-targeted assignment of patients with specific needs ("assignment criteria"). Beyond documenting structural attributes, the structural quality of a rehabilitation unit can thus be described both individually and in comparison with other units. Relevant structural criteria were defined in expert meetings by means of a modified Delphi technique with five survey rounds. Overall, 199 basal and assignment criteria were defined. All criteria can be assigned to two domains: general structural characteristics (general characteristics and equipment of rooms; medical/technical equipment; therapy, education, care; staff) and process-related structures (conceptual frames; internal quality management; internal communication and personnel development). The structural standards are applicable to units for musculoskeletal, cardiac, neurological, oncological, gastroenterological, dermatological and pneumological rehabilitation financed by the two main providers of rehabilitation, the statutory pension insurance scheme and the statutory health insurance scheme.
The definition of structural standards, agreed by experts in a formal consensus process, provides comprehensive and concrete requirements for German inpatient medical rehabilitation units. If the two main providers of rehabilitation both use these standards, this can be regarded as a milestone on the path to a unified quality management programme. The results enable units to analyse their weak points not only individually but also in comparison with other units, while contributing to the optimization of the structural quality of rehabilitation units.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meyer, Howard
2010-11-30
This project met the objective to further the development of an integrated multi-contaminant removal process in which H2S, NH3, HCl and heavy metals including Hg, As, Se and Cd present in coal-derived syngas can be removed to specified levels in a single, integrated process step. The process supports the mission and goals of the Department of Energy's Gasification Technologies Program, namely to enhance the performance of gasification systems, thus enabling U.S. industry to improve the competitiveness of gasification-based processes. The gasification program will reduce equipment costs, improve process environmental performance, and increase process reliability and flexibility. Two sulfur conversion concepts were tested in the laboratory under this project: the solvent-based, high-pressure University of California Sulfur Recovery Process High Pressure (UCSRP-HP) and the catalytic-based, direct oxidation (DO) section of the CrystaSulf-DO process. Each process required a polishing unit to meet the ultra-clean sulfur content goals of <50 ppbv (parts per billion by volume), as may be necessary for fuel cells or chemical production applications. UCSRP-HP was also tested for the removal of trace, non-sulfur contaminants, including ammonia, hydrogen chloride, and heavy metals. A bench-scale unit was commissioned and limited testing was performed with simulated syngas. Aspen-Plus®-based computer simulation models were prepared, and the economics of the UCSRP-HP and CrystaSulf-DO processes were evaluated for a nominal 500 MWe, coal-based IGCC power plant with carbon capture. This report covers the progress on UCSRP-HP technology development and the CrystaSulf-DO technology.
Grigg, Celia P; Tracy, Sally K; Schmied, Virginia; Daellenbach, Rea; Kensington, Mary
2015-06-01
to explore women's birthplace decision-making and identify the factors which enable women to plan to give birth in a freestanding midwifery-led primary level maternity unit rather than in an obstetric-led tertiary level maternity hospital in New Zealand. a mixed methods prospective cohort design. data from eight focus groups (37 women) and a six-week postpartum survey (571 women, 82%) were analysed using thematic analysis and descriptive statistics. The qualitative data from the focus groups and survey were the primary data sources and were integrated at the analysis stage; the secondary qualitative and quantitative data were integrated at the interpretation stage. Christchurch, New Zealand, with one tertiary maternity hospital and four primary level maternity units (2010-2012). well ('low-risk') pregnant women booked to give birth in one of the primary units or the tertiary hospital. All women received midwifery continuity of care, regardless of their intended or actual birthplace. five core themes were identified: the birth process, women's self-belief in their ability to give birth, midwives, the health system and birthplace. 'Confidence' was identified as the overarching concept influencing the themes. Women who chose to give birth in a primary maternity unit appeared to differ markedly in their beliefs regarding their optimal birthplace compared to women who chose to give birth in a tertiary maternity hospital. The women who planned a primary maternity unit birth expressed confidence in the birth process, their ability to give birth, their midwife, the maternity system and/or the primary unit itself. The women planning to give birth in a tertiary hospital did not express confidence in the birth process, their ability to give birth, the system for transfers and/or the primary unit as a birthplace, although they did express confidence in their midwife.
birthplace is a profoundly important aspect of women's experience of childbirth. Birthplace decision-making is complex, in common with many other aspects of childbirth. A multiplicity of factors needs to converge in order for all those involved to gain the confidence required to plan what, in this context, might be considered a 'countercultural' decision to give birth at a midwife-led primary maternity unit. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Wang, Xiaohui; Couwenhoven, Mary E.; Foos, David H.; Doran, James; Yankelevitz, David F.; Henschke, Claudia I.
2008-03-01
An image-processing method has been developed to improve the visibility of tube and catheter features in portable chest x-ray (CXR) images captured in the intensive care unit (ICU). The method is based on a multi-frequency approach, wherein the input image is decomposed into different spatial frequency bands, and those bands that contain the tube and catheter signals are individually enhanced by nonlinear boosting functions. Using a random sampling strategy, 50 cases were retrospectively selected for the study from a large database of portable CXR images collected from multiple institutions over a two-year period. All images used in the study were captured using photo-stimulable, storage phosphor computed radiography (CR) systems. Each image was processed in two ways: with default image-processing parameters such as those used in clinical settings (control), and with the new tube and catheter enhancement algorithm (test). Three board-certified radiologists participated in a reader study to assess differences in both detection-confidence performance and diagnostic efficiency between the control and test images. Images were evaluated on a diagnostic-quality, 3-megapixel monochrome monitor. Two scenarios were studied: a baseline scenario representative of today's workflow (a single control image presented with window/level adjustments enabled) vs. a test scenario (a control/test image pair presented with a toggle enabled and the window/level settings disabled). The radiologists were asked to read the images in each scenario as they normally would for clinical diagnosis. Trend analysis indicates that the test scenario offers improved reading efficiency while providing detection capability as good as or better than the baseline scenario.
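The multi-frequency enhancement idea described above can be sketched as follows. This is a minimal illustration, assuming a simple difference-of-blurs band decomposition and a tanh boosting function; the paper's actual decomposition filters and boost curves are not specified here, and all function names are ours.

```python
import numpy as np

def enhance_bands(image, n_bands=4, boost=2.0):
    """Split an image into spatial-frequency bands and amplify the detail
    bands that carry thin linear features (e.g. tubes and catheters).
    Band count and boost function are illustrative assumptions."""
    def blur(img, k):
        # separable box blur repeated k times, a cheap Gaussian stand-in
        out = img.astype(float)
        for _ in range(k):
            out = (np.roll(out, 1, 0) + out + np.roll(out, -1, 0)) / 3.0
            out = (np.roll(out, 1, 1) + out + np.roll(out, -1, 1)) / 3.0
        return out

    # progressively smoother versions of the image
    levels = [image.astype(float)] + [blur(image, 2 ** i) for i in range(1, n_bands)]
    # band-pass images: differences between adjacent smoothing levels
    bands = [levels[i] - levels[i + 1] for i in range(n_bands - 1)]
    base = levels[-1]  # low-frequency residual
    enhanced = base
    for b in bands:
        # nonlinear boost: amplify small-amplitude detail, saturate large
        enhanced = enhanced + boost * np.tanh(b)
    return enhanced
```

In a real pipeline only the bands whose scale matches the tube/catheter width would be boosted, and the boost curve would be tuned against over-enhancement of noise.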
Raveis, Victoria H; Conway, Laurie J; Uchida, Mayuko; Pogorzelska-Maziarz, Monika; Larson, Elaine L; Stone, Patricia W
2014-04-01
Health-care-associated infections (HAIs) remain a major patient safety problem even as policy and programmatic efforts designed to reduce HAIs have increased. Although information on implementing effective infection control (IC) efforts has steadily grown, knowledge gaps remain regarding the organizational elements that improve bedside practice and accommodate variations in clinical care settings. We conducted in-depth, semistructured interviews in 11 hospitals across the United States with a range of hospital personnel involved in IC (n = 116). We examined the collective nature of IC and the organizational elements that can enable disparate groups to work together to prevent HAIs. Our content analysis of participants' narratives yielded a rich description of the organizational process of implementing adherence to IC. Findings document the dynamic, fluid, interactional, and reactive nature of this process. Three themes emerged: implementing adherence efforts institution-wide, promoting an institutional culture to sustain adherence, and contending with opposition to the IC mandate.
NASA Astrophysics Data System (ADS)
Varzi, Alberto; Passerini, Stefano
2015-12-01
Potato starch (PS), a natural polymer obtainable from non-edible sources, is evaluated for the first time as an alternative water-processable binder for Electrochemical Double-Layer Capacitor (EDLC) electrodes. Morphological and electrochemical properties of activated carbon (AC)-based electrodes are investigated and compared to those achieved with the state-of-the-art aqueous binder (CMC, i.e., Na-carboxymethyl cellulose). The results suggest substantial benefits of PS, particularly for the electrode fabrication process. Owing to its amylopectin content (a moderately branched polysaccharide), PS displays only minimal shrinkage upon drying, resulting in rather homogeneous electrodes that do not present the dramatic surface cracking observed with CMC. Furthermore, owing to the smaller volume of water required for processing, much higher active material loading per unit area can be achieved. This is reflected in improvements of up to 60% in areal capacitance.
Frequency domain zero padding for accurate autofocusing based on digital holography
NASA Astrophysics Data System (ADS)
Shin, Jun Geun; Kim, Ju Wan; Eom, Tae Joong; Lee, Byeong Ha
2018-01-01
The numerical refocusing feature of digital holography enables the reconstruction of a well-focused image from a digital hologram captured at an arbitrary out-of-focus plane, without supervision by end users. In general, however, the autofocusing process for obtaining a highly focused image incurs a considerable computational cost. In this study, to reconstruct a better-focused image, we propose a zero-padding technique implemented in the frequency domain. Zero padding in the frequency domain enhances the visibility, or numerical resolution, of the image, which allows the degree of focus to be measured more accurately. A coarse-to-fine search algorithm is used to reduce the computing load, and a graphics processing unit (GPU) is employed to accelerate the process. The performance of the proposed scheme is evaluated in simulation and experiment, and the possibility of obtaining a well-refocused image with enhanced accuracy and speed is demonstrated.
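The frequency-domain zero-padding step can be sketched with NumPy as below. This is a sketch of the general technique, not the authors' implementation: the padding factor, scaling convention, and focus metric are our assumptions.

```python
import numpy as np

def upsample_fft(img, factor=2):
    """Zero-pad the centered 2-D spectrum so the inverse transform is
    sampled on a grid `factor` times finer (sinc interpolation)."""
    F = np.fft.fftshift(np.fft.fft2(img))
    h, w = F.shape
    H, W = h * factor, w * factor
    Fp = np.zeros((H, W), dtype=complex)
    # place the original spectrum in the center of the larger array
    Fp[(H - h) // 2:(H + h) // 2, (W - w) // 2:(W + w) // 2] = F
    # rescale so mean intensity is preserved after padding
    return np.fft.ifft2(np.fft.ifftshift(Fp)) * factor ** 2

def gradient_energy(img):
    """Simple focus metric: total squared gradient of the image magnitude.
    A coarse-to-fine search would maximize this over propagation distance."""
    a = np.abs(img)
    gy, gx = np.gradient(a)
    return float(np.sum(gx ** 2 + gy ** 2))
```

The upsampled image makes small changes in the focus metric easier to resolve, which is the effect the paper exploits when scoring candidate reconstruction distances.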
Electrically Conductive and Protective Coating for Planar SOFC Stacks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choi, Jung-Pyung; Stevenson, Jeffry W.
Ferritic stainless steels are preferred interconnect materials for intermediate-temperature SOFCs because of their oxidation resistance, high formability, and low cost. However, their protective oxide layer produces Cr-containing volatile species at SOFC operating temperatures and conditions, which can cause cathode poisoning. Electrically conductive spinel coatings have been developed to prevent cathode poisoning and to maintain an electrically conductive pathway through SOFC stacks. However, this coating is not compatible with the formation of stable, hermetic seals between the interconnect frame component and the ceramic cell. Thus, a new aluminizing process has been developed by PNNL to enable durable sealing, prevent Cr evaporation, and maintain electrical insulation between stack repeat units. Hence, two different types of coating are needed for stable operation of SOFC stacks. This paper focuses on the electrically conductive coating process; an advanced coating process compatible with the non-electrically conductive coating is also discussed.
Advanced Stirling Radioisotope Generator Engineering Unit 2 (ASRG EU2) Final Assembly
NASA Technical Reports Server (NTRS)
Oriti, Salvatore M.
2015-01-01
NASA Glenn Research Center (GRC) has recently completed the assembly of a unique Stirling generator test article for laboratory experimentation. Under the Advanced Stirling Radioisotope Generator (ASRG) flight development contract, NASA GRC initiated a task to design and fabricate a flight-like generator for in-house testing. This test article was given the name ASRG Engineering Unit 2 (EU2) as it was effectively the second engineering unit to be built within the ASRG project. The intent of the test article was to duplicate Lockheed Martin's qualification unit ASRG design as much as possible to enable system-level tests not previously possible at GRC. After the cancellation of the ASRG flight development project, the decision was made to continue the EU2 build, and make use of a portion of the hardware from the flight development project. GRC and Lockheed Martin engineers collaborated to develop assembly procedures, leveraging the valuable knowledge gathered by Lockheed Martin during the ASRG development contract. The ASRG EU2 was then assembled per these procedures at GRC with Lockheed Martin engineers on site. The assembly was completed in August 2014. This paper details the components that were used for the assembly, and the assembly process itself.
Munir, Samina K; Kay, Stephen
2005-08-01
A multi-site study, conducted in two English and two Danish intensive care units, investigates the complexity of work processes in intensive care and the implications of this complexity for information management with regard to clinical information systems. Data were collected via observations, shadowing of clinical staff, interviews, and questionnaires. The construction of role activity diagrams enabled the capture of critical care work processes. Upon analysing these diagrams, it was found that intensive care work processes consist of 'simplified complexity'; these processes change with the introduction of information systems for the everyday use and management of all clinical information. The prevailing notion of complexity surrounding critical care clinical work processes was refuted and found to be misleading; in reality, it is not the work processes that cause the complexity; rather, the complexity is rooted in the way clinical information is used and managed. This study emphasises that the potential for clinical information systems that integrate all clinical information requirements is not only immense but also very plausible.
Direct Scaling of Leaf-Resolving Biophysical Models from Leaves to Canopies
NASA Astrophysics Data System (ADS)
Bailey, B.; Mahaffee, W.; Hernandez Ochoa, M.
2017-12-01
Recent advances in the development of biophysical models and high-performance computing have enabled rapid increases in the level of detail that can be represented by simulations of plant systems. However, increasingly detailed models typically require increasingly detailed inputs, which can be a challenge to accurately specify. In this work, we explore the use of terrestrial LiDAR scanning data to accurately specify geometric inputs for high-resolution biophysical models that enables direct up-scaling of leaf-level biophysical processes. Terrestrial LiDAR scans generate "clouds" of millions of points that map out the geometric structure of the area of interest. However, points alone are often not particularly useful in generating geometric model inputs, as additional data processing techniques are required to provide necessary information regarding vegetation structure. A new method was developed that directly reconstructs as many leaves as possible that are in view of the LiDAR instrument, and uses a statistical backfilling technique to ensure that the overall leaf area and orientation distribution matches that of the actual vegetation being measured. This detailed structural data is used to provide inputs for leaf-resolving models of radiation, microclimate, evapotranspiration, and photosynthesis. Model complexity is afforded by utilizing graphics processing units (GPUs), which allows for simulations that resolve scales ranging from leaves to canopies. The model system was used to explore how heterogeneity in canopy architecture at various scales affects scaling of biophysical processes from leaves to canopies.
The design and development of transonic multistage compressors
NASA Technical Reports Server (NTRS)
Ball, C. L.; Steinke, R. J.; Newman, F. A.
1988-01-01
The development of the transonic multistage compressor is reviewed. Changing trends in design and performance parameters are noted. These changes are related to advances in compressor aerodynamics, computational fluid mechanics and other enabling technologies. The parameters normally given to the designer and those that need to be established during the design process are identified. Criteria and procedures used in the selection of these parameters are presented. The selection of tip speed, aerodynamic loading, flowpath geometry, incidence and deviation angles, blade/vane geometry, blade/vane solidity, stage reaction, aerodynamic blockage, inlet flow per unit annulus area, stage/overall velocity ratio, and aerodynamic losses are considered. Trends in these parameters both spanwise and axially through the machine are highlighted. The effects of flow mixing and methods for accounting for the mixing in the design process are discussed.
Neuronal and oscillatory activity during reward processing in the human ventral striatum.
Lega, Bradley C; Kahana, Michael J; Jaggi, Jurg; Baltuch, Gordon H; Zaghloul, Kareem
2011-11-16
Accumulated evidence from animal studies implicates the ventral striatum in the processing of reward information. Recently, deep brain stimulation (DBS) surgery has enabled researchers to analyze neurophysiological recordings from humans engaged in reward tasks. We present data recorded from the human ventral striatum during deep brain stimulation surgery as a participant played a video game coupled to the receipt of visual reward images. To our knowledge, we identify the first instances of reward-sensitive single unit activity in the human ventral striatum. Local field potential data suggest that alpha oscillations are sensitive to positive feedback, whereas beta oscillations exhibit significantly higher power during unrewarded trials. We report evidence of alpha-gamma cross-frequency coupling that differentiates between positive and negative feedback. © 2011 Wolters Kluwer Health | Lippincott Williams & Wilkins.
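A common way to quantify the alpha-gamma cross-frequency coupling reported above is a phase-amplitude modulation index. The sketch below, using crude FFT brick-wall filters and a mean-vector-length measure, illustrates that class of analysis; it is not the authors' pipeline, and all parameters and names are our assumptions.

```python
import numpy as np

def bandpass(x, fs, lo, hi):
    """Crude FFT brick-wall band-pass filter (illustrative only; real
    analyses use proper FIR filters)."""
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(len(x), 1.0 / fs)
    X[(f < lo) | (f > hi)] = 0.0
    return np.fft.irfft(X, n=len(x))

def analytic(s):
    """FFT-based analytic signal (same idea as scipy.signal.hilbert)."""
    n = len(s)
    S = np.fft.fft(s)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0  # double positive frequencies
    if n % 2 == 0:
        h[n // 2] = 1.0      # Nyquist bin kept once for even lengths
    return np.fft.ifft(S * h)

def phase_amp_coupling(x, fs, phase_band=(8, 12), amp_band=(30, 80)):
    """Mean-vector-length modulation index between the phase of a slow
    band (alpha) and the amplitude envelope of a fast band (gamma)."""
    phase = np.angle(analytic(bandpass(x, fs, *phase_band)))
    amp = np.abs(analytic(bandpass(x, fs, *amp_band)))
    return float(np.abs(np.mean(amp * np.exp(1j * phase))) / np.mean(amp))
```

On a synthetic signal whose gamma envelope follows the alpha phase, this index is large; for an unmodulated gamma carrier it is near zero.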
NASA Astrophysics Data System (ADS)
Koike, Hiroki; Ohsawa, Takashi; Miura, Sadahiko; Honjo, Hiroaki; Ikeda, Shoji; Hanyu, Takahiro; Ohno, Hideo; Endoh, Tetsuo
2015-04-01
A spintronic-based power-gated micro-processing unit (MPU) is proposed. It includes a power control circuit activated by a newly supported power-off instruction for the deep-sleep mode. These mechanisms enable the power-off procedure for the MPU to be executed appropriately. A test chip was designed and fabricated using a 90 nm CMOS and an additional 100 nm MTJ process, and was operated successfully. A guideline for the energy-reduction benefits of this MPU is presented, based on estimates derived from measurements of the test chip. The results show that operation energy can be reduced to 1/28 when the operation duty is 10%, provided there is a sufficient number of idle clock cycles.
The productivity limit of manufacturing blood cell therapy in scalable stirred bioreactors
Bayley, Rachel; Ahmed, Forhad; Glen, Katie; McCall, Mark; Stacey, Adrian
2017-01-01
Manufacture of red blood cells (RBCs) from progenitors has been proposed as a method to reduce reliance on donors. Such a process would need to be extremely efficient for economic viability given a relatively low-value product and a high (2 × 10¹²) cell dose. Therefore, the aim of these studies was to define the productivity of an industry standard stirred-tank bioreactor and determine engineering limitations of commercial red blood cell production. Cord blood derived CD34+ cells were cultured under erythroid differentiation conditions in a stirred micro-bioreactor (Ambr™). Enucleated cells of 80% purity could be created under optimal physical conditions: pH 7.5, 50% oxygen, without gas-sparging (which damaged cells) and with mechanical agitation (which directly increased enucleation). O2 consumption was low (~5 × 10⁻⁸ µg/cell·h), theoretically enabling erythroblast densities in excess of 5 × 10⁸/ml in commercial bioreactors and sub-10 l/unit production volumes. The bioreactor process achieved a 24% and 42% reduction in media volume and culture time, respectively, relative to unoptimized flask processing. However, media exchange limited productivity to 1 unit of erythroblasts per 500 l of media. Systematic replacement of media constituents, as well as screening for inhibitory levels of ammonia, lactate and key cytokines, did not identify a reason for this limitation. We conclude that the properties of erythroblasts are such that the conventional constraints on cell manufacturing efficiency, such as mass transfer and metabolic demand, should not prevent high intensity production; furthermore, this could be achieved in industry standard equipment. However, identification and removal of an inhibitory mediator is required to enable these economies to be realized. Copyright © 2016 The Authors. Journal of Tissue Engineering and Regenerative Medicine Published by John Wiley & Sons Ltd. PMID:27696710
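The abstract's scaling claims can be sanity-checked with simple arithmetic. The constants below come directly from the text; the calculation and names are ours.

```python
# Back-of-envelope check of the figures quoted in the abstract.

CELL_DOSE = 2e12     # cells per transfusion unit of RBCs (from the abstract)
MAX_DENSITY = 5e8    # claimed achievable erythroblasts per ml

def volume_per_unit_litres(dose=CELL_DOSE, density=MAX_DENSITY):
    """Culture volume needed to grow one unit at the claimed density."""
    return dose / density / 1000.0  # ml -> litres

def o2_demand_mg_per_l_h(per_cell_ug_h=5e-8, density=MAX_DENSITY):
    """Bulk O2 demand at the claimed density; µg/(ml·h) equals mg/(l·h),
    so no unit factor is needed."""
    return per_cell_ug_h * density

print(volume_per_unit_litres())  # 4.0 litres, consistent with "sub-10 l/unit"
print(o2_demand_mg_per_l_h())    # ~25 mg O2 per litre per hour
```

The 4 l theoretical volume contrasts sharply with the observed 500 l of media per unit, which is the gap the authors attribute to an unidentified inhibitory mediator rather than to mass transfer or metabolic demand.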
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurnik, Charles W.; Keeling, Josh; Bruchs, Doug
Refrigerator recycling programs are designed to save energy by removing operable, albeit less efficient, refrigerators from service. By offering free pickup, providing incentives, and disseminating information about the operating cost of less efficient refrigerators, these programs encourage consumers to: (1) limit the use of secondary refrigerators; (2) relinquish refrigerators previously used as primary units when they are replaced (rather than keeping the existing refrigerator as a secondary unit); and (3) prevent the continued use of less efficient refrigerators in another household through a direct transfer (giving the unit away or selling it) or indirect transfer (resale on the used appliance market). Commonly implemented by third-party contractors (who collect and decommission participating appliances), these programs generate energy savings through the retirement of inefficient appliances. The decommissioning process captures environmentally harmful refrigerants and foam, and enables recycling of the plastic, metal, and wiring components.
Emerging Definition of Next-Generation of Aeronautical Communications
NASA Technical Reports Server (NTRS)
Kerczewski, Robert J.
2006-01-01
Aviation continues to experience rapid growth. In regions such as the United States and Europe air traffic congestion is constraining operations, leading to major new efforts to develop methodologies and infrastructures to enable continued aviation growth through transformational air traffic management systems. Such a transformation requires better communications linking airborne and ground-based elements. Technologies for next-generation communications, the required capacities, frequency spectrum of operation, network interconnectivity, and global interoperability are now receiving increased attention. A number of major planning and development efforts have taken place or are in process now to define the transformed airspace of the future. These activities include government and industry led efforts in the United States and Europe, and by international organizations. This paper will review the features, approaches, and activities of several representative planning and development efforts, and identify the emerging global consensus on requirements of next generation aeronautical communications systems for air traffic control.
A wearable biofeedback control system based body area network for freestyle swimming.
Rui Li; Zibo Cai; WeeSit Lee; Lai, Daniel T H
2016-08-01
Wearable posture measurement units are capable of enabling real-time performance evaluation and providing feedback to end users. This paper presents a wearable feedback prototype designed for freestyle swimming, with a focus on trunk rotation measurement. The system consists of a nine-degree-of-freedom inertial sensor built into a central data collection and processing unit, and two vibration motors for delivering real-time feedback. These devices form a fundamental body area network (BAN). In the experimental setup, four recreational swimmers were asked to complete two sets of 4 x 25 m freestyle swimming, without and with feedback, respectively. Results showed that the real-time biofeedback mechanism improved swimmers' kinematic performance, with an average 4.5% reduction in session time. Swimmers can gradually adapt to feedback signals, and the biofeedback control system can be employed in swimmers' daily training for fitness maintenance.
A novel ultrasonic phased array inspection system to NDT for offshore platform structures
NASA Astrophysics Data System (ADS)
Wang, Hua; Shan, Baohua; Wang, Xin; Ou, Jinping
2007-01-01
A novel ultrasonic phased array detection system is developed for nondestructive testing (NDT). The purpose of the system is to acquire data in real time from a 64-element ultrasonic phased array transducer and to enable real-time processing of the acquired data. The system is composed of five main parts: a master unit, a main board, eight transmit/receive units, a 64-element transducer, and an external PC. The system can be used with 64-element transducers, exciting 32 elements and receiving and sampling echo signals from 32 elements simultaneously at 62.5 MHz with 8-bit precision. The external PC serves as the user interface, showing real-time images, and controls the overall operation of the system through a USB serial link. The use of the Universal Serial Bus (USB) improves transfer speed and reduces hardware interface complexity. The system software is written in Visual C++.NET and is platform independent.
Cardiac Care Assistance using Self Configured Sensor Network—a Remote Patient Monitoring System
NASA Astrophysics Data System (ADS)
Sarma Dhulipala, V. R.; Kanagachidambaresan, G. R.
2014-04-01
Pervasive health care systems monitor patients remotely, in real time, without disturbing their normal day-to-day activities. Wearable physiological sensors required to monitor significant physiological parameters of the patients are connected to a Body Central Unit (BCU). The Body Sensor Network (BSN) updates data in real time and is designed to transmit alerts against abnormalities, enabling quick response by medical units in case of an emergency. BSN helps monitor patients without requiring constant attention to the subject, and helps reduce the stress and strain caused by the hospital environment. In this paper, mathematical models for the heartbeat signal, the electrocardiograph (ECG) signal, and the pulse rate are introduced. These signals are compared, and their percentage RMS differences and fast Fourier transforms (PRD-FFT) are processed. In the context of cardiac arrest, alert messages for these parameters and first aid for post-surgical operations have been suggested.
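The percentage root-mean-square difference (PRD) used to compare such signals has a standard definition: the RMS error between the reference and test signals, normalized by the RMS of the reference, expressed as a percentage. A minimal sketch (function name assumed) is:

```python
import math

def prd(reference, test):
    """Percentage RMS difference between a reference signal and a test signal.

    PRD = 100 * sqrt( sum((r_i - t_i)^2) / sum(r_i^2) )
    Identical signals give 0; a test signal of all zeros gives 100.
    """
    num = sum((r - t) ** 2 for r, t in zip(reference, test))
    den = sum(r ** 2 for r in reference)
    return 100.0 * math.sqrt(num / den)
```

Lower PRD values indicate that the modeled signal tracks the measured one more closely.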
[Care practices for neonates while setting up a neonatal unit in a university hospital].
Pedron, Cecília Drebes; Bonilha, Ana Lúcia de Lourenzi
2008-12-01
The hospitalization process makes neonates vulnerable to several care practices. The aim of this study was to learn about the care practices adopted by health professionals while setting up a neonatal unit at the Hospital de Clínicas of Porto Alegre, Rio Grande do Sul, Brazil. This is a qualitative study based on the New History Theory. Data were collected from October 2006 to January 2007. Fifteen health professionals responsible for the project and/or its implementation from 1972 to 1984 provided information. The thematic data analysis highlighted the health professionals' concern with making good use of technological advances, as well as with unifying scientifically based conducts. In addition, they tried to establish routines enabling neonates' parents to stay at the bedside during the whole hospitalization period. Finally, it was inferred that the main objective of these practices was to increase the survival of neonates.
Designing perturbative metamaterials from discrete models.
Matlack, Kathryn H; Serra-Garcia, Marc; Palermo, Antonio; Huber, Sebastian D; Daraio, Chiara
2018-04-01
Identifying material geometries that lead to metamaterials with desired functionalities presents a challenge for the field. Discrete, or reduced-order, models provide a concise description of complex phenomena, such as negative refraction, or topological surface states; therefore, the combination of geometric building blocks to replicate discrete models presenting the desired features represents a promising approach. However, there is no reliable way to solve such an inverse problem. Here, we introduce 'perturbative metamaterials', a class of metamaterials consisting of weakly interacting unit cells. The weak interaction allows us to associate each element of the discrete model with individual geometric features of the metamaterial, thereby enabling a systematic design process. We demonstrate our approach by designing two-dimensional elastic metamaterials that realize Veselago lenses, zero-dispersion bands and topological surface phonons. While our selected examples are within the mechanical domain, the same design principle can be applied to acoustic, thermal and photonic metamaterials composed of weakly interacting unit cells.
Permafrost and Subsurface Ice in the Solar System
NASA Technical Reports Server (NTRS)
Anderson, D. M.
1985-01-01
The properties and behavior of planetary permafrost are discussed with reference to the ability of such surfaces to sustain loads characteristic of spacecraft landings and planetary bases. In most occurrences, water ice is in close proximity to, or in contact with, finely divided silicate mineral matter. When ice contacts silicate mineral surfaces, a liquid-like transition zone is created. Its thickness ranges from several hundred Angstrom units at temperatures near 0 degrees C to about three Angstrom units at -150 degrees C. When soluble substances are present, the resulting brine enlarges the interfacial zone. When clays are involved, although the interfacial zone may be small, its extent is large. The unfrozen interfacial water may amount to 100% or more by weight at a temperature of -5 degrees C. The presence of this interfacial unfrozen water confers plasticity to permafrost, enabling it to exhibit creep at all imposed levels of stress. Nucleation processes and load-bearing capacity are examined.
Dynamic and Contextual Information in HMM Modeling for Handwritten Word Recognition.
Bianne-Bernard, Anne-Laure; Menasri, Farès; Al-Hajj Mohamad, Rami; Mokbel, Chafic; Kermorvant, Christopher; Likforman-Sulem, Laurence
2011-10-01
This study aims at building an efficient word recognition system resulting from the combination of three handwriting recognizers. The main component of this combined system is an HMM-based recognizer that considers dynamic and contextual information for better modeling of writing units. For modeling the contextual units, a state-tying process based on decision-tree clustering is introduced. Decision trees are built according to a set of expert-based questions on how characters are written. Questions are divided into global questions, yielding larger clusters, and precise questions, yielding smaller ones. This clustering enables us to reduce the total number of models and Gaussian densities by a factor of 10. We then apply this modeling to the recognition of handwritten words. Experiments are conducted on three publicly available databases based on Latin or Arabic scripts: Rimes, IAM, and OpenHart. The results obtained show that contextual information embedded with dynamic modeling significantly improves recognition.
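Decision-tree state tying of this kind can be pictured as recursive yes/no splitting of context-dependent units, with broad questions applied before precise ones. The greedy partition below is a simplified illustration under assumed names; real systems decide each split by likelihood gain on the training data, which is omitted here:

```python
def tie_states(units, questions):
    """Simplified sketch of question-driven state tying.

    units     : context-dependent writing units, e.g. (char, left, right) tuples
    questions : predicates on a unit, ordered from global to precise
    Returns a partition of the units; each resulting cluster would share
    one tied HMM state, shrinking the total number of models.
    """
    clusters = [list(units)]
    for question in questions:
        next_clusters = []
        for cluster in clusters:
            yes = [u for u in cluster if question(u)]
            no = [u for u in cluster if not question(u)]
            # split only when both sides keep enough training examples
            if len(yes) >= 2 and len(no) >= 2:
                next_clusters.extend([yes, no])
            else:
                next_clusters.append(cluster)
        clusters = next_clusters
    return clusters
```

Because contexts sharing the same answers end up in one cluster, unseen character contexts can still be assigned a tied model at recognition time.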
Fundamental Dimensions and Essential Elements of Exemplary Local Extension Units
ERIC Educational Resources Information Center
Terry, Bryan D.; Osborne, Edward
2015-01-01
Collaborative efforts between federal, state, and local government agencies enable local Extension units to deliver a high level of educational opportunities to local citizens. These units represent land-grant institutions by delivering non-formal education that aims to address local, regional, and state concerns. The purpose of this study was to…
Careers (A Course of Study). Unit III: Do It Right!
ERIC Educational Resources Information Center
Turley, Kay
Designed to enable the special needs student to comprehend and complete job application forms, this set of activities on job application vocabulary, neatness, and following directions is the third unit in a nine-unit secondary level careers course intended to provide handicapped students with the knowledge and tools necessary to succeed in the…
The Modular Modeling System (MMS): A toolbox for water- and environmental-resources management
Leavesley, G.H.; Markstrom, S.L.; Viger, R.J.; Hay, L.E.; ,
2005-01-01
The increasing complexity of water- and environmental-resource problems requires modeling approaches that incorporate knowledge from a broad range of scientific and software disciplines. To address this need, the U.S. Geological Survey (USGS) has developed the Modular Modeling System (MMS). MMS is an integrated system of computer software for model development, integration, and application. Its modular design allows a high level of flexibility and adaptability, enabling modelers to incorporate their own software into a rich array of built-in models and modeling tools. These include individual process models, tightly coupled models, loosely coupled models, and fully integrated decision support systems. A geographic information system (GIS) interface, the USGS GIS Weasel, has been integrated with MMS to enable spatial delineation and characterization of basin and ecosystem features, and to provide objective parameter-estimation methods for models using available digital data. MMS provides optimization and sensitivity-analysis tools to analyze model parameters and evaluate the extent to which uncertainty in model parameters affects uncertainty in simulation results. MMS has been coupled with the Bureau of Reclamation's object-oriented reservoir and river-system modeling framework, RiverWare, to develop models that evaluate and apply optimal resource-allocation and management strategies to complex, operational decisions on multipurpose reservoir systems and watersheds. This decision support system approach has been developed, tested, and implemented in the Gunnison, Yakima, San Joaquin, Rio Grande, and Truckee River basins of the western United States. MMS is currently being coupled with the U.S. Forest Service model SIMulating Patterns and Processes at Landscape Scales (SIMPPLLE) to assess the effects of alternative vegetation-management strategies on a variety of hydrological and ecological responses. Initial development and testing of the MMS-SIMPPLLE integration is being conducted on the Colorado Plateau region of the western United States.
FAST: framework for heterogeneous medical image computing and visualization.
Smistad, Erik; Bozorgi, Mohammadmehdi; Lindseth, Frank
2015-11-01
Computer systems are becoming increasingly heterogeneous in the sense that they consist of different processors, such as multi-core CPUs and graphics processing units. As the amount of medical image data increases, it is crucial to exploit the computational power of these processors. However, this is currently difficult due to several factors, such as driver errors, processor differences, and the need for low-level memory handling. This paper presents a novel FrAmework for heterogeneouS medical image compuTing and visualization (FAST). The framework aims to make it easier to simultaneously process and visualize medical images efficiently on heterogeneous systems. FAST uses common image processing programming paradigms and hides the details of memory handling from the user, while enabling the use of all processors and cores on a system. The framework is open-source, cross-platform, and available online. Code examples and performance measurements are presented to show the simplicity and efficiency of FAST. The results are compared to the Insight Toolkit (ITK) and the Visualization Toolkit (VTK), and show that the presented framework is faster, with up to 20 times speedup on several common medical imaging algorithms. FAST enables efficient medical image computing and visualization on heterogeneous systems. Code examples and performance evaluations have demonstrated that the toolkit is both easy to use and performs better than existing frameworks, such as ITK and VTK.
NASA Astrophysics Data System (ADS)
Göll, S.; Samsun, R. C.; Peters, R.
Fuel-cell-based auxiliary power units can help to reduce fuel consumption and emissions in transportation. For this application, the combination of solid oxide fuel cells (SOFCs) with upstream fuel processing by autothermal reforming (ATR) is seen as a highly favorable configuration. Notwithstanding the necessity to improve each single component, an optimized architecture of the fuel cell system as a whole must be achieved. To enable model-based analyses, a system-level approach is proposed in which the fuel cell system is modeled as a multi-stage thermo-chemical process using the "flowsheeting" environment PRO/II™. Therein, the SOFC stack and the ATR are characterized entirely by corresponding thermodynamic processes together with global performance parameters. The developed model is then used to achieve an optimal system layout by comparing different system architectures. A system with anode and cathode off-gas recycling was identified to have the highest electric system efficiency. Taking this system as a basis, the potential for further performance enhancement was evaluated by varying four parameters characterizing different system components. Using methods from the design and analysis of experiments, the effects of these parameters and of their interactions were quantified, leading to an overall optimized system with encouraging performance data.
Magnesium Front End Research and Development: A Canada-China-USA Collaboration
NASA Astrophysics Data System (ADS)
Luo, Alan A.; Nyberg, Eric A.; Sadayappan, Kumar; Shi, Wenfang
The Magnesium Front End Research & Development (MFERD) project is an effort jointly sponsored by the United States Department of Energy, the United States Automotive Materials Partnership (USAMP), the Chinese Ministry of Science and Technology, and Natural Resources Canada (NRCan) to demonstrate the technical and economic feasibility of a magnesium-intensive automotive front end body structure offering improved fuel economy and performance benefits in a multi-material automotive structure. The project examines novel magnesium automotive body applications and processes, beyond conventional die castings, including wrought components (sheet or extrusions) and high-integrity body castings. This paper outlines the scope of work and organization for the collaborative (tri-country) task teams. The project has the goals of developing key enabling technologies and a knowledge base for increased magnesium automotive body applications. The MFERD project began in early 2007 by initiating R&D in the following areas: crashworthiness, NVH, fatigue and durability, corrosion and surface finishing, extrusion and forming, sheet and forming, high-integrity body casting, as well as joining and assembly. The MFERD project is also linked to the Integrated Computational Materials Engineering (ICME) project, which will investigate the processing/structure/properties relations for various magnesium alloys and manufacturing processes utilizing advanced computer-aided engineering and modeling tools.
Heterogeneous scalable framework for multiphase flows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morris, Karla Vanessa
2013-09-01
Two categories of challenges confront the developer of computational spray models: those related to the computation and those related to the physics. Regarding the computation, the trend towards heterogeneous, multi- and many-core platforms will require considerable re-engineering of codes written for current supercomputing platforms. Regarding the physics, accurate methods for transferring mass, momentum and energy from the dispersed phase onto the carrier fluid grid have so far eluded modelers. Significant challenges also lie at the intersection between these two categories. To be competitive, any physics model must be expressible in a parallel algorithm that performs well on evolving computer platforms. This work created an application based on a software architecture in which the physics and software concerns are separated in a way that adds flexibility to both. The developed spray-tracking package includes an application programming interface (API) that abstracts away the platform-dependent parallelization concerns, enabling the scientific programmer to write serial code that the API resolves into parallel processes and threads of execution. The project also developed the infrastructure required to provide similar APIs to other applications. The API allows object-oriented Fortran applications to interact directly with Trilinos to support memory management of distributed objects on central processing unit (CPU) and graphics processing unit (GPU) nodes for applications using C++.
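The separation of concerns described here, where the physics programmer writes serial code that the API resolves into parallel execution, can be illustrated with a toy Python wrapper. This is only an analogue of the idea, not the project's actual Fortran/Trilinos interface:

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_map(serial_kernel, particles, workers=4):
    """Toy analogue of a parallelization-hiding API (names assumed).

    The caller supplies a serial per-particle kernel and never touches
    threads, processes, or memory placement; the wrapper resolves the
    serial code into concurrent execution behind a single call.
    """
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(serial_kernel, particles))
```

The design choice is the same as in the abstract: the physics (the kernel) can evolve independently of the execution strategy (the wrapper), which can later be retargeted to new platforms without touching the science code.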
NASA Astrophysics Data System (ADS)
Silva Fernandez, Marta A.
The purpose of this cross-national study was to gain a more comprehensive understanding of doctoral students in the United States and Chile and of how their decisions to pursue a career in the life sciences occurred throughout their lives. I interviewed 15 doctoral students from the Seven Lakes University (Chile) and 15 students from the West Coast University (US), using a life history approach. Analyses revealed that the degree of flexibility in the schooling system and the degree of individualism and collectivism of the social groups in which the students were learning science seemed to influence the informants' vocational decisions through three interrelated processes: (1) deciding the informants' degree of interest and ability in science through the opportunity to choose science classes and activities: the highly tracked Chilean system socializes students to science at an early age, while the more flexible school system in the US enabled the interviewees to gradually decide about pursuing their interest in science; (2) experiencing science as a collective learning process for the Chilean informants and an individualistic learning process for the US students; (3) perceiving science differently at each life stage for both groups of interviewees, including playing science, studying science, doing science, working in science, and practicing science in their doctoral programs.
Use of Dynamic Models and Operational Architecture to Solve Complex Navy Challenges
NASA Technical Reports Server (NTRS)
Grande, Darby; Black, J. Todd; Freeman, Jared; Sorber, Tim; Serfaty, Daniel
2010-01-01
The United States Navy established 8 Maritime Operations Centers (MOCs) to enhance the command and control of forces at the operational level of warfare. Each MOC is a headquarters manned by qualified joint operational-level staffs and enabled by globally interoperable C4I systems. To assess and refine MOC staffing, equipment, and schedules, a dynamic software model was developed. The model leverages pre-existing operational process architecture, joint military task lists that define activities and their precedence relations, and Navy documents that specify manning and roles per activity. The software model serves as a "computational wind tunnel" in which to test a MOC on a mission and to refine its structure, staffing, processes, and schedules. More generally, the model supports resource allocation decisions concerning Doctrine, Organization, Training, Materiel, Leadership, Personnel and Facilities (DOTMLPF) at MOCs around the world. A rapid prototyping effort produced this software in less than five months, using an integrated process team consisting of MOC military and civilian staff, modeling experts, and software developers. The work reported here was conducted for Commander, United States Fleet Forces Command in Norfolk, Virginia, code N5-0LW (Operational Level of War), which facilitates the identification, consolidation, and prioritization of MOC capabilities requirements, and the implementation and delivery of MOC solutions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sauter, Nicholas K., E-mail: nksauter@lbl.gov; Hattne, Johan; Grosse-Kunstleve, Ralf W.
The Computational Crystallography Toolbox (cctbx) is a flexible software platform that has been used to develop high-throughput crystal-screening tools for both synchrotron sources and X-ray free-electron lasers. Plans for data-processing and visualization applications are discussed, and the benefits and limitations of using graphics processing units are evaluated. Current pixel-array detectors produce diffraction images at extreme data rates (of up to 2 TB/h) that make severe demands on computational resources. New multiprocessing frameworks are required to achieve rapid data analysis, as it is important to be able to inspect the data quickly in order to guide the experiment in real time. By utilizing readily available web-serving tools that interact with the Python scripting language, it was possible to implement a high-throughput Bragg-spot analyzer (cctbx.spotfinder) that is presently in use at numerous synchrotron-radiation beamlines. Similarly, Python interoperability enabled the production of a new data-reduction package (cctbx.xfel) for serial femtosecond crystallography experiments at the Linac Coherent Light Source (LCLS). Future data-reduction efforts will need to focus on specialized problems such as the treatment of diffraction spots on interleaved lattices arising from multi-crystal specimens. In these challenging cases, accurate modeling of close-lying Bragg spots could benefit from the high-performance computing capabilities of graphics processing units.
Patel, Darshan C; Lyu, Yaqi Fara; Gandarilla, Jorge; Doherty, Steve
2018-04-03
In-process sampling and analysis is an important aspect of monitoring kinetic profiles and impurity formation or rejection, both in development and during commercial manufacturing. In pharmaceutical process development, the technology of choice for a substantial portion of this analysis is high-performance liquid chromatography (HPLC). Traditionally, sample extraction and preparation for reaction characterization have been performed manually. This can be time consuming, laborious, and impractical for long processes. Depending on the complexity of the sample preparation, variability can be introduced by different analysts, and in some cases the integrity of the sample can be compromised during handling. While commercial instruments are available for on-line monitoring with HPLC, they lack capabilities in many key areas: some do not integrate the sampling and analysis, while others afford limited flexibility in sample preparation. The current offerings provide a limited number of unit operations for sample processing and no option for workflow customization. This work describes the development of a microfluidic automated program (MAP) that fully automates sample extraction, manipulation, and on-line LC analysis. The flexible system is controlled using an intuitive Microsoft Excel-based user interface. The autonomous system is capable of unattended reaction monitoring, with flexible unit operations and workflow customization enabling complex operations and on-line sample preparation. The automated system is shown to offer advantages over manual approaches in key areas while providing consistent and reproducible in-process data. Copyright © 2017 Elsevier B.V. All rights reserved.
A Binaural Grouping Model for Predicting Speech Intelligibility in Multitalker Environments
Colburn, H. Steven
2016-01-01
Spatially separating speech maskers from target speech often leads to a large intelligibility improvement. Modeling this phenomenon has long been of interest to binaural-hearing researchers for uncovering brain mechanisms and for improving signal-processing algorithms in hearing-assistive devices. Much of the previous binaural modeling work focused on the unmasking enabled by binaural cues at the periphery, and little quantitative modeling has been directed toward the grouping or source-separation benefits of binaural processing. In this article, we propose a binaural model that focuses on grouping, specifically on the selection of time-frequency units that are dominated by signals from the direction of the target. The proposed model uses Equalization-Cancellation (EC) processing with a binary decision rule to estimate a time-frequency binary mask. EC processing is carried out to cancel the target signal and the energy change between the EC input and output is used as a feature that reflects target dominance in each time-frequency unit. The processing in the proposed model requires little computational resources and is straightforward to implement. In combination with the Coherence-based Speech Intelligibility Index, the model is applied to predict the speech intelligibility data measured by Marrone et al. The predicted speech reception threshold matches the pattern of the measured data well, even though the predicted intelligibility improvements relative to the colocated condition are larger than some of the measured data, which may reflect the lack of internal noise in this initial version of the model. PMID:27698261
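The EC-based mask estimation can be sketched per time-frequency unit: equalize the two ear signals toward the target direction, cancel by subtraction, and threshold the resulting energy drop to decide target dominance. The numpy sketch below uses assumed variable names and an assumed 6 dB threshold; the published model's exact parameters and internal-noise handling differ:

```python
import numpy as np

def ec_binary_mask(left_tf, right_tf, freqs, target_itd, threshold_db=6.0):
    """Sketch of Equalization-Cancellation mask estimation (parameters assumed).

    left_tf, right_tf : complex T-F representations, shape (n_freq, n_frames)
    freqs             : centre frequency of each row, in Hz
    target_itd        : interaural time difference of the target, in seconds
    Returns a boolean mask marking target-dominated time-frequency units.
    """
    # Equalize: phase-align the target component across ears per frequency band.
    phase = np.exp(-2j * np.pi * freqs[:, None] * target_itd)
    aligned_right = right_tf * phase
    # Cancel: subtract; a target-dominated unit loses most of its energy.
    residual = left_tf - aligned_right
    in_energy = np.abs(left_tf) ** 2 + np.abs(aligned_right) ** 2
    out_energy = np.abs(residual) ** 2
    # Energy drop (dB) is the target-dominance feature; threshold it to a mask.
    drop_db = 10.0 * np.log10((in_energy + 1e-12) / (out_energy + 1e-12))
    return drop_db > threshold_db
```

Units passing the mask would then be retained when computing the intelligibility prediction, while masker-dominated units are discarded.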
Evaluation of the quality of the teaching-learning process in undergraduate courses in Nursing.
González-Chordá, Víctor Manuel; Maciá-Soler, María Loreto
2015-01-01
to identify aspects for improvement of the quality of the teaching-learning process through the analysis of tools that evaluated the acquisition of skills by undergraduate students of Nursing. Prospective longitudinal study conducted in a population of 60 second-year Nursing students based on registration data, from which quality indicators that evaluate the acquisition of skills were obtained, with descriptive and inferential analysis. Nine items and nine learning activities included in the assessment tools did not reach the established quality indicators (p<0.05). There are statistically significant differences depending on the hospital and clinical practice unit (p<0.05). The analysis of the evaluation tools used in the subject "Nursing Care in Welfare Processes" of the analyzed university undergraduate course enabled the detection of areas for improvement in the teaching-learning process. The challenge of education in nursing is to reach the best clinical research and educational results, in order to provide improvements to the quality of education and health care.
Software for Preprocessing Data from Rocket-Engine Tests
NASA Technical Reports Server (NTRS)
Cheng, Chiu-Fu
2004-01-01
Three computer programs have been written to preprocess digitized outputs of sensors during rocket-engine tests at Stennis Space Center (SSC). The programs apply exclusively to the SSC E test-stand complex and utilize the SSC file format. The programs are the following: (1) Engineering Units Generator (EUGEN) converts sensor-output-measurement data to engineering units. The inputs to EUGEN are raw binary test-data files, which include the voltage data, a list identifying the data channels, and time codes. EUGEN effects conversion by use of a file that contains calibration coefficients for each channel. (2) QUICKLOOK enables immediate viewing of a few selected channels of data, in contradistinction to viewing only after post-test processing (which can take 30 minutes to several hours depending on the number of channels and other test parameters) of data from all channels. QUICKLOOK converts the selected data into a form in which they can be plotted in engineering units by use of Winplot (a free graphing program written by Rick Paris). (3) EUPLOT provides a quick means for looking at data files generated by EUGEN without the necessity of relying on the PV-WAVE based plotting software.
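The per-channel conversion EUGEN performs can be sketched as applying calibration coefficients to raw voltages. The polynomial form and all names below are assumptions for illustration; the abstract only states that a per-channel coefficient file is used:

```python
def to_engineering_units(raw_volts, coeffs):
    """Apply per-channel calibration coefficients to one raw voltage sample.

    Assumes a polynomial calibration: c0 + c1*v + c2*v**2 + ...
    """
    return sum(c * raw_volts ** k for k, c in enumerate(coeffs))

def convert_channels(samples, calibration):
    """Convert a whole test record channel by channel.

    samples     : {channel_name: [volts, ...]}
    calibration : {channel_name: [c0, c1, ...]} as read from a coefficient file
    """
    return {ch: [to_engineering_units(v, calibration[ch]) for v in vals]
            for ch, vals in samples.items()}
```

A tool like QUICKLOOK could then plot a selected channel's converted list directly, without waiting for full post-test processing.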
Han, Wuxiao; He, Haoxuan; Zhang, Linlin; Dong, Chuanyi; Zeng, Hui; Dai, Yitong; Xing, Lili; Zhang, Yan; Xue, Xinyu
2017-09-06
The emerging multifunctional flexible electronic skin for establishing body-electric interaction can enable real-time monitoring of personal health status as a new personalized medicine technique. A key difficulty in the device design is the flexible power supply. Here a self-powered wearable noninvasive electronic skin for perspiration analysis has been realized on the basis of a piezo-biosensing unit matrix of enzyme/ZnO nanoarrays. The electronic skin can detect lactate, glucose, uric acid, and urea in the perspiration, and no outside electrical power supply or battery is used in the biosensing process. The piezoelectric impulse of the piezo-biosensing units serves as both the power supply and the data biosensor. The working mechanism can be ascribed to the piezoelectric-enzymatic-reaction coupling effect of enzyme/ZnO nanowires. The electronic skin can continuously monitor, in real time, the physiological state of a runner by analyzing the perspiration on the skin. This approach can promote the development of a new type of body-electric, self-powered biosensing electronic skin.
Pharmaceutical quality by design: product and process development, understanding, and control.
Yu, Lawrence X
2008-04-01
The purpose of this paper is to discuss pharmaceutical Quality by Design (QbD) and describe how it can be used to ensure pharmaceutical quality. QbD is described and some of its elements identified. Process parameters and quality attributes were identified for each unit operation during the manufacture of solid oral dosage forms. The use of QbD was contrasted with the evaluation of product quality by testing alone. QbD is a systematic approach to pharmaceutical development: it means designing and developing formulations and manufacturing processes to ensure predefined product quality. QbD elements include: defining the target product quality profile; designing the product and manufacturing processes; identifying critical quality attributes, process parameters, and sources of variability; and controlling manufacturing processes to produce consistent quality over time. Under QbD, pharmaceutical quality is assured by understanding and controlling formulation and manufacturing variables, while product testing confirms the product quality. Implementation of QbD will enable transformation of the chemistry, manufacturing, and controls (CMC) review of abbreviated new drug applications (ANDAs) into a science-based pharmaceutical quality assessment.
Distribution and interplay of geologic processes on Titan from Cassini radar data
Lopes, R.M.C.; Stofan, E.R.; Peckyno, R.; Radebaugh, J.; Mitchell, K.L.; Mitri, Giuseppe; Wood, C.A.; Kirk, R.L.; Wall, S.D.; Lunine, J.I.; Hayes, A.; Lorenz, R.; Farr, Tom; Wye, L.; Craig, J.; Ollerenshaw, R.J.; Janssen, M.; LeGall, A.; Paganelli, F.; West, R.; Stiles, B.; Callahan, P.; Anderson, Y.; Valora, P.; Soderblom, L.
2010-01-01
The Cassini Titan Radar Mapper is providing an unprecedented view of Titan's surface geology. Here we use Synthetic Aperture Radar (SAR) image swaths (Ta-T30) obtained from October 2004 to December 2007 to infer the geologic processes that have shaped Titan's surface. These SAR swaths cover about 20% of the surface, at a spatial resolution ranging from ~350 m to ~2 km. The SAR data are distributed over a wide latitudinal and longitudinal range, enabling some conclusions to be drawn about the global distribution of processes. They reveal a geologically complex surface that has been modified by all the major geologic processes seen on Earth - volcanism, tectonism, impact cratering, and erosion and deposition by fluvial and aeolian activity. In this paper, we map geomorphological units from SAR data and analyze their areal distribution and relative ages of modification in order to infer the geologic evolution of Titan's surface. We find that dunes and hummocky and mountainous terrains are more widespread than lakes, putative cryovolcanic features, mottled plains, and craters and crateriform structures that may be due to impact. Undifferentiated plains are the largest areal unit; their origin is uncertain. In terms of latitudinal distribution, dunes and hummocky and mountainous terrains are located mostly at low latitudes (less than 30°), with no dunes being present above 60°. Channels formed by fluvial activity are present at all latitudes, but lakes are at high latitudes only. Crateriform structures that may have been formed by impact appear to be uniformly distributed with latitude, but the well-preserved impact craters are all located at low latitudes, possibly indicating that more resurfacing has occurred at higher latitudes. Cryovolcanic features are not ubiquitous, and are mostly located between 30° and 60° north. 
We examine temporal relationships between units wherever possible, and conclude that aeolian and fluvial/pluvial/lacustrine processes are the most recent, while tectonic processes that led to the formation of mountains and Xanadu are likely the most ancient. © 2009 Elsevier Inc.
Network-Capable Application Process and Wireless Intelligent Sensors for ISHM
NASA Technical Reports Server (NTRS)
Figueroa, Fernando; Morris, Jon; Turowski, Mark; Wang, Ray
2011-01-01
Intelligent sensor technology and systems are increasingly attractive as frameworks for intelligent rocket test facilities with embedded intelligent sensor elements, distributed data acquisition elements, and onboard data acquisition elements. Networked intelligent processors enable users and systems integrators to automatically configure their measurement automation systems for analog sensors. NASA and leading sensor vendors are working together to apply the IEEE 1451 standard for adding plug-and-play capabilities to wireless analog transducers through the use of a Transducer Electronic Data Sheet (TEDS), in order to simplify sensor setup, use, and maintenance, to automatically obtain calibration data, and to eliminate manual data entry and error. A TEDS contains the critical information needed by an instrument or measurement system to identify, characterize, interface with, and properly use the signal from an analog sensor. A TEDS is deployed for a sensor in one of two ways. First, the TEDS can reside in embedded, nonvolatile memory (typically flash memory) within the intelligent processor. Second, a virtual TEDS can exist as a separate file, downloadable from the Internet. The virtual TEDS concept extends the benefits of the standardized TEDS to legacy sensors and applications where embedded memory is not available. An HTML-based user interface provides a visual tool for interacting with the distributed sensors associated with a TEDS, automating the sensor management process. Implementing and deploying the IEEE 1451.1-based Network-Capable Application Process (NCAP) can support intelligent processes in Integrated Systems Health Management (ISHM) for the purposes of monitoring, detection of anomalies, diagnosis of their causes, prediction of future anomalies, mitigation to maintain operability, and integrated operator awareness of system health. It can also support local data collection and storage. 
This invention enables wide-area sensing, employing numerous globally distributed sensing devices that observe the physical world through the existing sensor network. The innovation enables distributed storage, distributed processing, distributed intelligence, and the availability of DIaK (Data, Information, and Knowledge) to any element as needed. It also enables the simultaneous execution of multiple processes, and it represents models that contribute to determining the condition and health of each element in the system. The NCAP (intelligent process) can configure data-collection and filtering processes in reaction to sensed data, deciding when and how to adapt collection and processing based on sophisticated analysis of data derived from multiple sensors. The user can view the sensing-device network as a single unit that supports a high-level query language; each query can operate over data collected from across the global sensor network, just as a search query encompasses millions of Web pages. The sensor web can preserve ubiquitous information access between the querier and the queried data. Pervasive monitoring of the physical world raises significant data and privacy concerns; this innovation enables different authorities to control portions of the sensing infrastructure, and sensor-service authors may wish to compose services across authority boundaries.
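The two TEDS deployment paths described above (embedded memory first, downloadable virtual TEDS as a fallback for legacy sensors) can be sketched as a small lookup routine. This is a hypothetical illustration, not the binary IEEE 1451 TEDS layout; the JSON field names are invented for the example.

```python
# Hedged sketch of TEDS lookup: prefer the TEDS stored in the sensor's
# embedded nonvolatile memory, then fall back to a "virtual TEDS" file.
import json

def load_teds(embedded_memory=None, virtual_teds_path=None):
    """Return a TEDS record as a dict from embedded memory or a virtual TEDS file."""
    if embedded_memory:                  # TEDS resides on the sensor itself
        return json.loads(embedded_memory)
    if virtual_teds_path:                # legacy sensor: separate downloadable file
        with open(virtual_teds_path) as f:
            return json.load(f)
    raise LookupError("no TEDS available for this sensor")

# Example record with illustrative calibration fields:
embedded = '{"manufacturer_id": 17, "model": 42, "cal_slope": 50.0, "cal_offset": 2.0}'
teds = load_teds(embedded_memory=embedded)
print(1.0 * teds["cal_slope"] + teds["cal_offset"])  # scale a 1.0 V reading: 52.0
```

A measurement system would use such a record to identify and characterize the sensor automatically, eliminating manual entry of calibration data.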
Horn, Jacqueline; Friess, Wolfgang
2018-01-01
The collapse temperature (Tc) and the glass transition temperature of freeze-concentrated solutions (Tg') as well as the crystallization behavior of excipients are important physicochemical characteristics which guide cycle development in freeze-drying. The most frequently used methods to determine these values are differential scanning calorimetry (DSC) and freeze-drying microscopy (FDM). The objective of this study was to evaluate the optical fiber system (OFS) unit as an alternative tool for the analysis of Tc, Tg' and crystallization events. The OFS unit was also tested as a potential online monitoring tool during freeze-drying. Freeze/thawing and freeze-drying experiments of sucrose, trehalose, stachyose, mannitol, and highly concentrated IgG1 and lysozyme solutions were carried out and monitored by the OFS. Comparative analyses were performed by DSC and FDM. OFS and FDM results correlated well. The crystallization behavior of mannitol could be monitored by the OFS during freeze/thawing as it can be by DSC. Online monitoring of freeze-drying runs detected collapse of amorphous saccharide matrices. The OFS unit enabled the analysis of both Tc and crystallization processes, which is usually carried out by FDM and DSC. The OFS can hence be used as a novel measuring device. Additionally, detection of these events during lyophilization facilitates online monitoring. The OFS is thus a beneficial new tool for the development and monitoring of freeze-drying processes. PMID:29435445
Carey, William A; Colby, Christopher E
2013-02-01
In 1999, the Accreditation Council for Graduate Medical Education identified 6 general competencies in which all residents must receive training. In the decade since these requirements went into effect, practice-based learning and improvement (PBLI) and systems-based practice (SBP) have proven to be the most challenging competencies to teach and assess. Because PBLI and SBP both are related to quality improvement (QI) principles and processes, we developed a QI-based curriculum to teach these competencies to our fellows. This experiential curriculum engaged our fellows in our neonatal intensive care unit's (NICU's) structured QI process. After identifying specific patient outcomes in need of improvement, our fellows applied validated QI methods to develop evidence-based treatment protocols for our neonatal intensive care unit. These projects led to immediate and meaningful improvements in patient care and also afforded our fellows various means by which to demonstrate their competence in PBLI and SBP. Our use of portfolios enabled us to document our fellows' performance in these competencies quite easily and comprehensively. Given the clinical and educational structures common to most intensive care unit-based training programs, we believe that a QI-based curriculum such as ours could be adapted by others to teach and assess PBLI and SBP. Copyright © 2013 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Powell, James; Maise, George; Paniagua, John; Borowski, Stanley
2003-01-01
Nuclear thermal propulsion (NTP) enables unique new robotic planetary science missions that are impossible with chemical or nuclear electric propulsion systems. A compact and ultra-lightweight bi-modal nuclear engine, termed MITEE-B (MIniature ReacTor EnginE - Bi-Modal), can deliver thousands of kilograms of propulsive thrust when it operates in the NTP mode, and many kilowatts of continuous electric power when it operates in the electric generation mode. The high-thrust NTP mode enables spacecraft to land on and take off from the surface of a planet or moon, to hop to multiple widely separated sites on the surface, and to fly virtually unlimited distances in planetary atmospheres. The continuous electric generation mode enables a spacecraft to replenish its propellant by processing in-situ resources, to provide power for controls, instruments, and communications while in space and on the surface, and to operate electric propulsion units. Six examples of unique and important missions enabled by the MITEE-B engine are described: (1) Pluto lander and sample return; (2) Europa lander and ocean explorer; (3) Mars hopper; (4) Jupiter atmospheric flyer; (5) SunBurn hypervelocity spacecraft; and (6) He3 mining from Uranus. Many additional important missions are enabled by MITEE-B, and a strong technology base for it already exists. With a vigorous development program, it could be ready for initial robotic science and exploration missions by 2010 AD. Potential mission benefits include much shorter in-space times, reduced IMLEO requirements, and replenishment of supplies from in-situ resources.
Scientific innovation's two Valleys of Death: how blood and tissue banks can help to bridge the gap.
Thompson, Sean D A
2014-12-01
Most biomedical basic research in the United States takes place at universities and research institutes and is funded by federal grants. Basic research is awarded billions of federal dollars every year, enabling new discoveries and greater understanding of the fundamental science that makes new innovations and therapies possible. However, when basic research yields an invention of practical use and the research evolves from basic to applied, the playing field changes. Pre-technology-licensing federal dollars all but disappear, and innovations rely predominantly on private funding to support the full path from bench to bedside. It is along this path that the scientific advance faces two Valleys of Death. These sometimes insurmountable development stages are the product of the innovation's inherent financial, business and investment risks. Well-planned and well-executed in vivo studies using quality biological materials to demonstrate proof of concept are often the key to bridging these gaps, and blood and tissue banks offer unique services and resources to enable this process.
Liang, Li; Oline, Stefan N; Kirk, Justin C; Schmitt, Lukas Ian; Komorowski, Robert W; Remondes, Miguel; Halassa, Michael M
2017-01-01
Independently adjustable multielectrode arrays are routinely used to interrogate neuronal circuit function, enabling chronic in vivo monitoring of neuronal ensembles in freely behaving animals at single-cell, single-spike resolution. Despite the importance of this approach, its widespread use is limited by highly specialized design and fabrication methods. To address this, we have developed a Scalable, Lightweight, Integrated and Quick-to-assemble multielectrode array platform. This platform additionally integrates optical fibers with independently adjustable electrodes to allow simultaneous single-unit recordings and circuit-specific optogenetic targeting and/or manipulation. Current designs scale from 2 to 32 microdrives yet weigh only 1-3 g, light enough for small animals. Here, we describe the design process, from intent in computer-aided design, through parameter testing by finite element analysis and experiment, to implementation in various applications across mice and rats. Combined, our methods may expand the utility of multielectrode recordings and their continued integration with other tools enabling functional dissection of intact neural circuits.
Supporting those who work and learn: A phenomenological research study.
Thurgate, Claire
2018-02-01
With a shift in the United Kingdom's National Health Service toward organisational learning and the local introduction of the Assistant Practitioner role to support the nursing workforce, there was a broad need to understand the lived experiences of those who work and learn. Hermeneutic phenomenology was the chosen methodology. A purposive sample of eight trainee assistant practitioners, four matrons, seven mentors and the practice development nurse participated in conversational interviews at intermittent points in the journey. A stepped process of analysis produced three over-arching super-ordinate themes, which indicated that the transition to assistant practitioner is non-linear and complex, necessitating a change in knowledge and behaviour, and that the workplace culture must enable learning and role development. This paper focuses on supporting the journey, which encompassed learning at university and learning in the workplace. Participants' stories demonstrated that the presence of knowledgeable mentors and a learning culture enabled new roles to be supported. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.
Platform Architecture for Decentralized Positioning Systems.
Kasmi, Zakaria; Norrdine, Abdelmoumen; Blankenbach, Jörg
2017-04-26
A platform architecture for positioning systems is essential for the realization of a flexible localization system that interacts with other systems and supports various positioning technologies and algorithms. Decentralized processing of a position enables pushing application-level knowledge into the mobile station and avoids communication with a central unit such as a server or base station. In addition, calculating the position on low-cost, resource-constrained devices is a challenge due to limited computing power, storage capacity, and power supply. We therefore propose a platform architecture that enables the design of a system offering reusability of components, extensibility (e.g., with other positioning technologies) and interoperability. Furthermore, the position is computed on a low-cost device such as a microcontroller, which simultaneously performs additional tasks such as data collection or preprocessing based on an operating system. The platform architecture is designed, implemented and evaluated on the basis of two positioning systems: a field-strength system and a time-of-arrival-based positioning system.
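The decentralized idea above — a mobile station computing its own fix from local measurements, with no server round-trip — can be sketched with a minimal time-of-arrival example. This is not the paper's platform; it is a generic linearized trilateration in 2D, with illustrative anchor coordinates, simple enough to run on a microcontroller-class device.

```python
# Hedged sketch: a 2D position fix from three fixed anchors using
# time-of-arrival distances, solved locally via a linearized 2x2 system.
import math

def trilaterate_2d(anchors, distances):
    """Solve for (x, y) from three anchors by subtracting range equations
    and applying Cramer's rule to the resulting linear system."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = distances
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Mobile station at (1, 1); anchors at (0,0), (4,0), (0,4)
anchors = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]
dists = [math.sqrt(2.0), math.sqrt(10.0), math.sqrt(10.0)]
print(trilaterate_2d(anchors, dists))
```

A field-strength variant would first map received signal strength to distance via a path-loss model, then reuse the same solver.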
Exhaust Nozzle Materials Development for the High Speed Civil Transport
NASA Technical Reports Server (NTRS)
Grady, J. E.
1999-01-01
The United States has embarked on a national effort to develop the technology necessary to produce a Mach 2.4 High Speed Civil Transport (HSCT) for entry into service by the year 2005. The viability of this aircraft is contingent upon its meeting both economic and environmental requirements. Two engine components have been identified as critical to the environmental acceptability of the HSCT. These include a combustor with significantly lower emissions than are feasible with current technology, and a lightweight exhaust nozzle that meets community noise standards. The Enabling Propulsion Materials (EPM) program will develop the advanced structural materials, materials fabrication processes, structural analysis and life prediction tools for the HSCT combustor and low noise exhaust nozzle. This is being accomplished through the coordinated efforts of the NASA Lewis Research Center, General Electric Aircraft Engines and Pratt & Whitney. The mission of the EPM Exhaust Nozzle Team is to develop and demonstrate this technology by the year 1999 to enable its timely incorporation into HSCT propulsion systems.
Implementing Extreme Programming in Distributed Software Project Teams: Strategies and Challenges
NASA Astrophysics Data System (ADS)
Maruping, Likoebe M.
Agile software development methods and distributed forms of organizing teamwork are two team process innovations that are gaining prominence in today's demanding software development environment. Individually, each of these innovations has yielded gains in the practice of software development. Agile methods have enabled software project teams to meet the challenges of an ever turbulent business environment through enhanced flexibility and responsiveness to emergent customer needs. Distributed software project teams have enabled organizations to access highly specialized expertise across geographic locations. Although much progress has been made in understanding how to more effectively manage agile development teams and how to manage distributed software development teams, managers have little guidance on how to leverage these two potent innovations in combination. In this chapter, I outline some of the strategies and challenges associated with implementing agile methods in distributed software project teams. These are discussed in the context of a study of a large-scale software project in the United States that lasted four months.
Real-Time Compressive Sensing MRI Reconstruction Using GPU Computing and Split Bregman Methods
Smith, David S.; Gore, John C.; Yankeelov, Thomas E.; Welch, E. Brian
2012-01-01
Compressive sensing (CS) has been shown to enable dramatic acceleration of MRI acquisition in some applications. Being an iterative reconstruction technique, CS MRI reconstructions can be more time-consuming than traditional inverse Fourier reconstruction. We have accelerated our CS MRI reconstruction by factors of up to 27 by using a split Bregman solver combined with a graphics processing unit (GPU) computing platform. The increases in speed we find are similar to those we measure for matrix multiplication on this platform, suggesting that the split Bregman methods parallelize efficiently. We demonstrate that the combination of the rapid convergence of the split Bregman algorithm and the massively parallel strategy of GPU computing can enable real-time CS reconstruction of even acquisition data matrices of dimension 4096² or more, depending on available GPU VRAM. Reconstruction of two-dimensional data matrices of dimension 1024² and smaller took ~0.3 s or less, showing that this platform also provides very fast iterative reconstruction for small-to-moderate size images. PMID:22481908
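One reason split Bregman maps so well onto a GPU is that its sparsity-enforcing step is a purely elementwise "shrink" (soft-thresholding) operation, which is embarrassingly parallel. The NumPy sketch below shows that step in isolation; it is a generic illustration of the operator, not the authors' GPU implementation.

```python
# The split Bregman iteration alternates a quadratic (Fourier-domain) update
# with this elementwise soft-thresholding step, which parallelizes trivially.
import numpy as np

def shrink(x, threshold):
    """Soft-thresholding: sign(x) * max(|x| - threshold, 0), elementwise."""
    return np.sign(x) * np.maximum(np.abs(x) - threshold, 0.0)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(shrink(x, 1.0))  # [-1. -0.  0.  0.  1.]
```

On a GPU the same expression becomes one fused elementwise kernel over the whole image, which is why its throughput tracks that of matrix multiplication rather than being latency-bound.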
Pavlopoulos, Nicholas G.; Dubose, Jeffrey T.; Hartnett, Erin D.; ...
2016-07-26
We report a versatile synthetic approach to colloidal polymers containing dipolar Au@Co core@shell nanoparticles (NPs) in the backbone, along with semiconductor CdSe@CdS nanorod (NR) or tetrapod (TP) side-chain groups. A seven-step colloidal total synthesis enabled well-defined colloidal comonomers composed of a dipolar Au@CoNP attached to a single CdSe@CdS NR or TP, where magnetic dipolar associations between Au@CoNP units promoted the formation of colloidal co- or terpolymers. The key step in this synthesis was the ability to photodeposit a single AuNP tip onto a CdSe@CdS NR or TP, which enables selective seeding of a dipolar CoNP onto the AuNP seed. In conclusion, we show that variation of the AuNP size directly controlled the size and dipolar character of the CoNP tip, where the size modulation of the Au and Au@CoNP tips is analogous to control of comonomer reactivity ratios in classical copolymerization processes.
Filho, Fernando Jorge C Magalhães; Sobrinho, Teodorico Alves; Steffen, Jorge L; Arias, Carlos A; Paulo, Paula L
2018-05-12
Constructed wetland systems demand preliminary and primary treatment to remove solids present in greywater (GW) in order to avoid or reduce clogging. This paper assesses the hydraulic and hydrological behavior of an improved constructed wetland system that has a built-in anaerobic digestion chamber (AnC); GW is distributed to the evapotranspiration and treatment tank (CEvaT), combined with a subsurface horizontal-flow constructed wetland (SSHF-CW). The results show that both the plants present in the units and the AnC improve hydraulic and volumetric efficiency, decrease short-circuiting and improve mixing conditions in the system. Moreover, the hydraulic conductivity measured on-site indicates that the presence of plants in the system and the flow distribution pattern provided by the AnC might reduce clogging in the SSHF-CW. It is observed that rainfall enables salt elimination, thus increasing evapotranspiration (ET), which promotes effluent reduction and enables the system to achieve zero discharge when reuse is unfeasible.
Mobile GPU-based implementation of automatic analysis method for long-term ECG.
Fan, Xiaomao; Yao, Qihang; Li, Ye; Chen, Runge; Cai, Yunpeng
2018-05-03
Long-term electrocardiogram (ECG) is one of the important diagnostic assistant approaches in capturing intermittent cardiac arrhythmias. The combination of miniaturized wearable Holter monitors and healthcare platforms enables people to have their cardiac condition monitored at home. The high computational burden created by concurrent processing of numerous Holter recordings poses a serious challenge to the healthcare platform. An alternative solution is to shift the analysis tasks from healthcare platforms to the mobile computing devices. However, long-term ECG data processing is quite time consuming due to the limited computation power of the mobile central processing unit (CPU). This paper proposes a novel parallel automatic ECG analysis algorithm which exploits the mobile graphics processing unit (GPU) to reduce the response time for processing long-term ECG data. By studying the architecture of the sequential automatic ECG analysis algorithm, we parallelized the time-consuming parts and reorganized the entire pipeline in the parallel algorithm to fully utilize the heterogeneous computing resources of CPU and GPU. The experimental results showed that the average executing time of the proposed algorithm on a clinical long-term ECG dataset (duration 23.0 ± 1.0 h per signal) is 1.215 ± 0.140 s, which achieved an average speedup of 5.81 ± 0.39× without compromising analysis accuracy, compared with the sequential algorithm. Meanwhile, the battery energy consumption of the automatic ECG analysis algorithm was reduced by 64.16%. Excluding energy consumption from data loading, 79.44% of the energy consumption could be saved, which alleviated the problem of limited battery working hours for mobile devices. 
The reduction of response time and battery energy consumption in ECG analysis not only brings a better quality of experience to Holter users, but also makes it possible to use mobile devices as ECG terminals for healthcare professionals such as physicians and health advisers, enabling them to inspect patient ECG recordings on-site efficiently without the need for a high-quality wide-area network environment.
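The parallelization strategy above relies on long-term ECG analysis being data-parallel across fixed-length windows. The sketch below shows that structure only: a toy threshold-crossing "beat counter" stands in for the real QRS detector, and a worker pool stands in for the GPU kernels; all names are illustrative, not the paper's implementation.

```python
# Hedged sketch of the data-parallel pipeline: split the signal into windows,
# fan the per-window detector out to parallel workers, and combine the results.
from concurrent.futures import ThreadPoolExecutor

def count_crossings(window, threshold=0.5):
    """Toy detector: count upward crossings of `threshold` within one window."""
    return sum(1 for a, b in zip(window, window[1:]) if a < threshold <= b)

def analyze_parallel(signal, window_len=4, workers=2):
    """Map the detector over windows in parallel and sum the counts."""
    windows = [signal[i:i + window_len] for i in range(0, len(signal), window_len)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(count_crossings, windows))

sig = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
print(analyze_parallel(sig))  # 4
```

Because each window is independent, the same decomposition applies whether the workers are CPU threads, processes, or GPU thread blocks; the per-window work is what the paper offloads to the mobile GPU.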
Unpacking the enabling factors for hand, cord and birth-surface hygiene in Zanzibar maternity units
Gon, Giorgia; Ali, Said M; Towriss, Catriona; Kahabuka, Catherine; Ali, Ali O; Cavill, Sue; Dahoma, Mohammed; Faulkner, Sally; Haji, Haji S; Kabole, Ibrahim; Morrison, Emma; Said, Rukaiya M; Tajo, Amour; Velleman, Yael; Woodd, Susannah L; Graham, Wendy J
2017-01-01
Abstract Recent national surveys in the United Republic of Tanzania have revealed poor standards of hygiene at birth in facilities. As more women opt for institutional delivery, improving basic hygiene becomes an essential part of preventative strategies for reducing puerperal and newborn sepsis. Our collaborative research in Zanzibar provides an in-depth picture of the state of hygiene on maternity wards to inform action. Hygiene was assessed in 2014 across all 37 facilities with a maternity unit in Zanzibar. We used a mixed methods approach, including structured and semi-structured interviews, and environmental microbiology. Data were analysed according to the WHO ‘cleans’ framework, focusing on the fundamental practices for prevention of newborn and maternal sepsis. For each ‘clean’ we explored the following enabling factors: knowledge, infrastructure (including equipment), staffing levels and policies. Composite indices were constructed for the enabling factors of the ‘cleans’ from the quantitative data: clean hands, cord cutting, and birth surface. Results from the qualitative tools were used to complement this information. Only 49% of facilities had the ‘infrastructural’ requirements to enable ‘clean hands’, with the availability of constant running water particularly lacking. Less than half (46%) of facilities met the ‘knowledge’ requirements for ensuring a ‘clean delivery surface’; six out of seven facilities had birthing surfaces that tested positive for multiple potential pathogens. Almost two thirds of facilities met the ‘infrastructure (equipment)’ requirement for ‘clean cord’; however, disposable cord clamps were frequently out of stock, which often resulted in the use of non-sterile thread made of fabric. This mixed methods approach, and the analytical framework based on the WHO ‘cleans’ and their enabling factors, yielded practical information of direct relevance to action at local and ministerial levels.
The same approach could be applied to collect and analyse data on infection prevention from maternity units in other contexts. PMID:28931118
Careers (A Course of Study). Unit VIII: Budgeting--What to Do With Your Paycheck.
ERIC Educational Resources Information Center
Turley, Kay
Designed to enable special needs students to understand the basics of personal income management, normal living expenses, banking, and comparison shopping, this set of activities on budgeting is the eighth unit in a nine-unit secondary level career course intended to provide handicapped students with the knowledge and tools necessary to succeed in…
NASA Astrophysics Data System (ADS)
Stein, Stefan; Wedler, Jonathan; Rhein, Sebastian; Schmidt, Michael; Körner, Carolin; Michaelis, Alexander; Gebhardt, Sylvia
The application of piezoelectric transducers to structural body parts of machines or vehicles enables the combination of passive mechanical components with sensor and actuator functions in one single structure. According to Herold et al. [1] and Staeves [2], this approach offers significant potential for smart lightweight construction. To obtain the highest yield, the piezoelectric transducers need to be integrated into the flux of forces (load path) of load-bearing structures. Application in a downstream process reduces yield and process efficiency during manufacturing and operation, due to the necessity of a subsequent process step of sensor/actuator application. The die casting process offers the possibility of integrating piezoelectric transducers into metal structures. Aluminum castings are particularly favorable due to their high quality and feasibility for high unit production at low cost (Brunhuber [3], Nogowizin [4]). Such molded aluminum parts with integrated piezoelectric transducers enable functions like active vibration damping, structural health monitoring or energy harvesting, opening significant possibilities for weight reduction, an increasingly important driving force in the automotive and aerospace industries (Klein [5], Siebenpfeiffer [6]) under increasingly stringent environmental protection laws. Within the scope of those developments, this paper focuses on the entire process chain enabling the generation of lightweight metal structures with sensor and actuator function, starting from the manufacturing of piezoelectric modules, through electrical and mechanical bonding, to the integration of such modules into aluminum (Al) matrices by die casting. To achieve this challenging goal, piezoceramic sensor/actuator modules, so-called LTCC/PZT modules (LPM), were developed, since ceramic-based piezoelectric modules are more likely to withstand the thermal stress of about 700 °C introduced by the casting process (Flössel et al. [7]).
The modules are made of low temperature cofired ceramic (LTCC) tapes with an embedded lead zirconate titanate (PZT) plate and are manufactured using multilayer technology. To join conducting copper (Cu) wires to the electrode structure of the LPM, a novel laser drop-on-demand wire bonding method (LDB) is applied, based on the melting of a spherical CuSn12 braze preform with a liquidus temperature Tliquid of 989.9 °C (Deutsches Kupfer-Institut Düsseldorf [8]), which provides sufficient thermal stability for a subsequent casting process.
Enabling Lead Free Interconnects in DoD Weapon Systems
2017-09-28
FINAL PRESENTATION: Enabling Lead-Free Interconnects in DoD Weapon Systems, ESTCP Project WP-201573-T2, September 2017, Dr. Stephan Meschter, BAE...the results of the SERDP lead-free projects to various stakeholders and to enable standardization. The work products and transferred data must not be
[Maturity Levels of Quality and Risk Management at the University Hospital Schleswig-Holstein].
Jussli-Melchers, Jill; Hilbert, Carsten; Jahnke, Iris; Wehkamp, Kai; Rogge, Annette; Freitag-Wolf, Sandra; Kahla-Witzsch, Heike A; Scholz, Jens; Petzina, Rainer
2018-05-16
Quality and risk management in hospitals are not only required by law but also for an optimal patient-centered and process-optimized patient care. To evaluate the maturity levels of quality and risk management at the University Hospital Schleswig-Holstein (UKSH), a structured analytical tool was developed for easy and efficient application. Four criteria concerning quality management - quality assurance (QS), critical incident reporting system (CIRS), complaint management (BM) and process management (PM) - were evaluated with a structured questionnaire. Self-assessment and external assessment were performed to classify the maturity levels at the UKSH (location Kiel and Lübeck). Every quality item was graded into four categories from "A" (fully implemented) to "D" (not implemented at all). First of all, an external assessment was initiated by the head of the department of quality and risk management. Thereafter, a self-assessment was performed by 46 clinical units of the UKSH. Discrepancies were resolved in a collegial dialogue. Based on these data, overall maturity levels were obtained for every clinical unit. The overall maturity level "A" was reached by three out of 46 (6.5%) clinical units. No unit was graded with maturity level "D". 50% out of all units reached level "B" and 43.5% level "C". The distribution of the four different quality criteria revealed a good implementation of complaint management (maturity levels "A" and "B" in 78.3%), whereas the levels for CIRS were "C" and "D" in 73.9%. Quality assurance and process management showed quite similar distributions for the levels of maturity "B" and "C" (87% QS; 91% PM). The structured analytical tool revealed maturity levels of 46 clinical units of the UKSH and defined the maturity levels of four relevant quality criteria (QS, CIRS, BM, PM). As a consequence, extensive procedures were implemented to raise the standard of quality and risk management. 
In the future, maturity levels will be reevaluated every two years. This qualitative maturity level model enables precise statements to be made, simply and efficiently, about the presence, manifestation and development of quality and risk management. © Georg Thieme Verlag KG Stuttgart · New York.
Nanosatellite Launch Adapter System (NLAS)
NASA Technical Reports Server (NTRS)
Chartres, James; Cappuccio, Gelsomina
2015-01-01
The Nanosatellite Launch Adapter System (NLAS) was developed to increase access to space while simplifying the integration process of miniature satellites, called nanosats or CubeSats, onto launch vehicles. A standard CubeSat measures about 10 cm on each side and is referred to as a 1-unit (1U) CubeSat. A single NLAS provides the capability to deploy 24U of CubeSats. The system is designed to accommodate satellites of 1U, 1.5U, 2U, 3U and 6U sizes for deployment into orbit. The NLAS may be configured for use on different launch vehicles. The system also enables flight demonstrations of new technologies in the space environment.
Addressing hypertext design and conversion issues
NASA Technical Reports Server (NTRS)
Glusko, Robert J.
1990-01-01
Hypertext is a network of information units connected by relational links. A hypertext system is a configuration of hardware and software that presents a hypertext to users and allows them to manage and access the information it contains. Hypertext is also a user interface concept that closely supports the ways people use printed information. Hypertext concepts encourage modularity and the elimination of redundancy in databases because information can be stored only once yet viewed in any appropriate context. Hypertext has attracted so much attention because it is an enabling technology: workstations and personal computers finally provide enough local processing power for hypertext user interfaces.
[3,3]-Sigmatropic rearrangements: recent applications in the total synthesis of natural products†
Ilardi, Elizabeth A.; Stivala, Craig E.
2014-01-01
Among the fundamental chemical transformations in organic synthesis, the [3,3]-sigmatropic rearrangement occupies a unique position as a powerful, reliable, and well-defined method for the stereoselective construction of carbon–carbon or carbon–heteroatom bonds. While many other reactions can unite two subunits and create a new bond, the strengths of sigmatropic rearrangements derive from their ability to enable structural reorganization with unmatched build-up of complexity. Recent applications that illustrate [3,3]-sigmatropic processes as a key concept in the synthesis of complex natural products are described in this tutorial review, covering literature from about 2001 through early 2009. PMID:19847347
Automated batch characterization of inkjet-printed elastomer lenses using a LEGO platform.
Sung, Yu-Lung; Garan, Jacob; Nguyen, Hoang; Hu, Zhenyu; Shih, Wei-Chuan
2017-09-10
Small, self-adhesive, inkjet-printed elastomer lenses have enabled smartphone cameras to image and resolve microscopic objects. However, the performance of different lenses within a batch is affected by hard-to-control environmental variables. We present a cost-effective platform to perform automated batch characterization of 300 lens units simultaneously for quality inspection. The system was designed and configured with LEGO bricks, 3D printed parts, and a digital camera. The scheme presented here may become the basis of a high-throughput, in-line inspection tool for quality control purposes and can also be employed for optimization of the manufacturing process.
Fast micromagnetic simulations on GPU: recent advances made with mumax3
NASA Astrophysics Data System (ADS)
Leliaert, J.; Dvornik, M.; Mulkers, J.; De Clercq, J.; Milošević, M. V.; Van Waeyenberge, B.
2018-03-01
In the last twenty years, numerical modeling has become an indispensable part of magnetism research. It has become a standard tool for both the exploration of new systems and the interpretation of experimental data. In the last five years, the capabilities of micromagnetic modeling have dramatically increased due to the deployment of graphics processing units (GPUs), which have sped up calculations by a factor of up to 200. This has enabled many studies which were previously unfeasible. In this topical review, we give an overview of this modeling approach and show how it has contributed to the forefront of current magnetism research.
NASA Technical Reports Server (NTRS)
Perry, Jay L.; Abney, Morgan B.; Frederick, Kenneth R.; Greenwood, Zachary W.; Kayatin, Matthew J.; Newton, Robert L.; Parrish, Keith J.; Roman, Monsi C.; Takada, Kevin C.; Miller, Lee A.;
2013-01-01
A subsystem architecture derived from the International Space Station's (ISS) Atmosphere Revitalization Subsystem (ARS) has been functionally demonstrated. This ISS-derived architecture features re-arranged unit operations for trace contaminant control and carbon dioxide removal functions, a methane purification component as a precursor to enhance resource recovery over ISS capability, operational modifications to a water electrolysis-based oxygen generation assembly, and an alternative major atmospheric constituent monitoring concept. Results from this functional demonstration are summarized and compared to the performance observed during ground-based testing conducted on an ISS-like subsystem architecture. Considerations for further subsystem architecture and process technology development are discussed.
Wholesale bakeries: A small-business guide. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
Commercial baking is thought to be one of the world's oldest industries, with evidence of commercial bakeries dating back to the Egyptians. In the late 19th century, technological innovations such as the development of "tame" yeast and the mechanization of bread kneading enabled mass production of baked goods. As a result, larger "wholesale" baking facilities began to replace smaller local bakeries. Today, there are over 3,000 wholesale bakeries across the United States. This report is intended to provide information on the bakery business from the perspective of processes, issues, and challenges faced, including the energy consumption of electrically driven equipment.
NASA Astrophysics Data System (ADS)
Olschanowsky, C.; Flores, A. N.; FitzGerald, K.; Masarik, M. T.; Rudisill, W. J.; Aguayo, M.
2017-12-01
Dynamic models of the spatiotemporal evolution of water, energy, and nutrient cycling are important tools to assess impacts of climate and other environmental changes on ecohydrologic systems. These models require spatiotemporally varying environmental forcings like precipitation, temperature, humidity, windspeed, and solar radiation. These input data originate from a variety of sources, including global and regional weather and climate models, global and regional reanalysis products, and geostatistically interpolated surface observations. Data translation measures, often subsetting in space and/or time and transforming and converting variable units, represent a seemingly mundane but critical step in the application workflows. Translation steps can introduce errors, misrepresent data, slow execution time, and interrupt data provenance. We leverage a workflow that subsets a large regional dataset derived from the Weather Research and Forecasting (WRF) model and prepares inputs to the Parflow integrated hydrologic model to demonstrate the impact of translation tool software quality on scientific workflow results and performance. We propose that such workflows will benefit from a community approved collection of data transformation components. The components should be self-contained composable units of code. This design pattern enables automated parallelization and software verification, improving performance and reliability. Ensuring that individual translation components are self-contained and target minute tasks increases reliability. The small code size of each component enables effective unit and regression testing. The components can be automatically composed for efficient execution. An efficient data translation framework should be written to minimize data movement. Composing components within a single streaming process reduces data movement.
Each component will typically have a low arithmetic intensity, meaning that it requires about the same number of bytes to be read as the number of computations it performs. When several components' executions are coordinated the overall arithmetic intensity increases, leading to increased efficiency.
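The composition pattern the abstract argues for, self-contained components chained within a single streaming process, can be sketched with ordinary Python generators. The function names, toy forcing data, and the mm/h-to-m/s conversion are illustrative assumptions, not part of the authors' framework:

```python
# Illustrative sketch of self-contained, composable translation components
# (invented names; not the authors' framework). Each generator does one
# minute task; composing them keeps data streaming through a single process.

def subset_time(records, start, end):
    """Subset in time: keep (t, value) records with start <= t < end."""
    for t, value in records:
        if start <= t < end:
            yield t, value

def convert_units(records, factor):
    """Convert variable units by a constant factor."""
    for t, value in records:
        yield t, value * factor

def compose(source, *stages):
    """Chain translation components into one streaming pipeline."""
    stream = source
    for stage in stages:
        stream = stage(stream)
    return stream

# Toy hourly forcing: (hour, precipitation in mm/h).
forcing = [(0, 3.6), (1, 7.2), (2, 1.8), (3, 0.0)]
pipeline = compose(
    iter(forcing),
    lambda s: subset_time(s, 1, 3),
    lambda s: convert_units(s, 1.0 / 3.6e6),  # mm/h -> m/s
)
result = list(pipeline)  # each record flows through both stages in one pass
```

Because each stage is a generator, records flow through the whole chain one at a time; nothing is materialized between stages, which is the minimal-data-movement property described above, and each tiny stage can be unit-tested in isolation.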
Physician Enabling Skills Questionnaire
Hudon, Catherine; Lambert, Mireille; Almirall, José
2015-01-01
Abstract Objective To evaluate the reliability and validity of the newly developed Physician Enabling Skills Questionnaire (PESQ) by assessing its internal consistency, test-retest reliability, concurrent validity with patient-centred care, and predictive validity with patient activation and patient enablement. Design Validation study. Setting Saguenay, Que. Participants One hundred patients with at least 1 chronic disease who presented in a waiting room of a regional health centre family medicine unit. Main outcome measures Family physicians’ enabling skills, measured with the PESQ at 2 points in time (ie, while in the waiting room at the family medicine unit and 2 weeks later through a mail survey); patient-centred care, assessed with the Patient Perception of Patient-Centredness instrument; patient activation, assessed with the Patient Activation Measure; and patient enablement, assessed with the Patient Enablement Instrument. Results The internal consistency of the 6 subscales of the PESQ was adequate (Cronbach α = .69 to .92). The test-retest reliability was very good (r = 0.90; 95% CI 0.84 to 0.93). Concurrent validity with the Patient Perception of Patient-Centredness instrument was good (r = −0.67; 95% CI −0.78 to −0.53; P < .001). The PESQ accounts for 11% of the total variance with the Patient Activation Measure (r2 = 0.11; P = .002) and 19% of the variance with the Patient Enablement Instrument (r2 = 0.19; P < .001). Conclusion The newly developed PESQ presents good psychometric properties, allowing for its use in practice and research. PMID:26889507
High-Performance Data Analysis Tools for Sun-Earth Connection Missions
NASA Technical Reports Server (NTRS)
Messmer, Peter
2011-01-01
The data analysis tool of choice for many Sun-Earth Connection missions is the Interactive Data Language (IDL) by ITT VIS. The increasing amount of data produced by these missions and the increasing complexity of image processing algorithms require access to higher computing power. Parallel computing is a cost-effective way to increase the speed of computation, but algorithms oftentimes have to be modified to take advantage of parallel systems. Enhancing IDL to work on clusters gives scientists access to increased performance in a familiar programming environment. The goal of this project was to enable IDL applications to benefit from both computing clusters as well as graphics processing units (GPUs) for accelerating data analysis tasks. The tool suite developed in this project enables scientists now to solve demanding data analysis problems in IDL that previously required specialized software, and it allows them to be solved orders of magnitude faster than on conventional PCs. The tool suite consists of three components: (1) TaskDL, a software tool that simplifies the creation and management of task farms, collections of tasks that can be processed independently and require only small amounts of data communication; (2) mpiDL, a tool that allows IDL developers to use the Message Passing Interface (MPI) inside IDL for problems that require large amounts of data to be exchanged among multiple processors; and (3) GPULib, a tool that simplifies the use of GPUs as mathematical coprocessors from within IDL. mpiDL is unique in its support for the full MPI standard and its support of a broad range of MPI implementations. GPULib is unique in enabling users to take advantage of an inexpensive piece of hardware, possibly already installed in their computer, and achieve orders of magnitude faster execution time for numerically complex algorithms. TaskDL enables the simple setup and management of task farms on compute clusters.
The products developed in this project have the potential to interact, so one can build a cluster of PCs, each equipped with a GPU, and use mpiDL to communicate between the nodes and GPULib to accelerate the computations on each node.
A Framework for Comprehensive Health Terminology Systems in the United States
Chute, Christopher G.; Cohn, Simon P.; Campbell, James R.
1998-01-01
Health care in the United States has become an information-intensive industry, yet electronic health records represent patient data inconsistently for lack of clinical data standards. Classifications that have achieved common acceptance, such as the ICD-9-CM or ICD, aggregate heterogeneous patients into broad categories, which preclude their practical use in decision support, development of refined guidelines, or detailed comparison of patient outcomes or benchmarks. This document proposes a framework for the integration and maturation of clinical terminologies that would have practical applications in patient care, process management, outcome analysis, and decision support. Arising from the two working groups within the standards community—the ANSI (American National Standards Institute) Healthcare Informatics Standards Board Working Group and the Computer-based Patient Records Institute Working Group on Codes and Structures—it outlines policies regarding 1) functional characteristics of practical terminologies, 2) terminology models that can broaden their applications and contribute to their sustainability, 3) maintenance attributes that will enable terminologies to keep pace with rapidly changing health care knowledge and process, and 4) administrative issues that would facilitate their accessibility, adoption, and application to improve the quality and efficiency of American health care. PMID:9824798
NASA Astrophysics Data System (ADS)
Derkachov, G.; Jakubczyk, T.; Jakubczyk, D.; Archer, J.; Woźniak, M.
2017-07-01
Utilising the Compute Unified Device Architecture (CUDA) platform for Graphics Processing Units (GPUs) enables significant reduction of computation time at a moderate cost, by means of parallel computing. In the paper [Jakubczyk et al., Opto-Electron. Rev., 2016] we reported using the GPU for Mie scattering inverse problem solving (up to 800-fold speed-up). Here we report the development of two subroutines utilising the GPU at data preprocessing stages for the inversion procedure: (i) a subroutine, based on ray tracing, for finding the spherical aberration correction function; (ii) a subroutine performing the conversion of an image to a 1D distribution of light intensity versus azimuth angle (i.e. a scattering diagram), fed from a movie-reading CPU subroutine running in parallel. All subroutines are incorporated in the PikeReader application, which we make available in a GitHub repository. PikeReader returns a sequence of intensity distributions versus a common azimuth angle vector, corresponding to the recorded movie. We obtained an overall ∼400-fold speed-up of calculations at data preprocessing stages using CUDA code running on the GPU in comparison to single-thread MATLAB-only code running on the CPU.
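The second subroutine's task, collapsing a 2-D scattering image to intensity versus azimuth angle, can be sketched on the CPU with NumPy histogramming. The function name, binning choice, and centre convention are assumptions for illustration, not PikeReader's actual code:

```python
import numpy as np

def scattering_diagram(image, center, n_bins=360):
    """Collapse a 2-D image to mean intensity vs azimuth angle.

    A CPU sketch of the kind of reduction described above (illustrative
    names and binning; the GPU version parallelizes this per pixel).
    """
    h, w = image.shape
    y, x = np.mgrid[0:h, 0:w]
    phi = np.arctan2(y - center[0], x - center[1])          # -pi .. pi
    bins = ((phi + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins
    total = np.bincount(bins.ravel(), weights=image.ravel(), minlength=n_bins)
    count = np.bincount(bins.ravel(), minlength=n_bins)
    return total / np.maximum(count, 1)                     # mean per angular bin

# A uniform image must give a flat scattering diagram (1.0 in occupied bins).
img = np.ones((64, 64))
diagram = scattering_diagram(img, center=(32, 32))
```

Each pixel's contribution is independent, so on a GPU every thread can compute its pixel's azimuth bin and accumulate into the histogram atomically, which is what makes the reported speed-up plausible.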
Delayed Slater determinant update algorithms for high efficiency quantum Monte Carlo.
McDaniel, T; D'Azevedo, E F; Li, Y W; Wong, K; Kent, P R C
2017-11-07
Within ab initio Quantum Monte Carlo simulations, the leading numerical cost for large systems is the computation of the values of the Slater determinants in the trial wavefunction. Each Monte Carlo step requires finding the determinant of a dense matrix. This is most commonly iteratively evaluated using a rank-1 Sherman-Morrison updating scheme to avoid repeated explicit calculation of the inverse. The overall computational cost is, therefore, formally cubic in the number of electrons or matrix size. To improve the numerical efficiency of this procedure, we propose a novel multiple rank delayed update scheme. This strategy enables probability evaluation with an application of accepted moves to the matrices delayed until after a predetermined number of moves, K. The accepted events are then applied to the matrices en bloc with enhanced arithmetic intensity and computational efficiency via matrix-matrix operations instead of matrix-vector operations. This procedure does not change the underlying Monte Carlo sampling or its statistical efficiency. For calculations on large systems and algorithms such as diffusion Monte Carlo, where the acceptance ratio is high, order of magnitude improvements in the update time can be obtained on both multi-core central processing units and graphical processing units.
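The contrast between the immediate rank-1 Sherman-Morrison update and the delayed en-bloc application can be sketched in NumPy for the simple case of plain row replacements. The 8×8 matrix and the accepted rows are hypothetical stand-ins; this is not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
n, K = 8, 3
A = rng.standard_normal((n, n))          # stand-in for a Slater matrix
Ainv = np.linalg.inv(A)

# K accepted moves: distinct rows r_k replaced by new vectors v_k, i.e.
# A_new = A + sum_k e_{r_k} u_k^T with u_k^T = v_k^T - (row r_k of A).
rows = [0, 3, 5]
V = rng.standard_normal((K, n))
U = V - A[rows]                          # K x n, row k holds u_k^T

# Conventional scheme: one rank-1 Sherman-Morrison update per accepted
# move, applied immediately (matrix-vector work each time).
Ainv_seq = Ainv.copy()
for r, u in zip(rows, U):
    col = Ainv_seq[:, r].copy()          # Ainv e_r
    row_vec = u @ Ainv_seq               # u^T Ainv
    Ainv_seq -= np.outer(col, row_vec) / (1.0 + row_vec[r])

# Delayed scheme: accumulate the K moves, then apply them en bloc via the
# Woodbury identity -- a few matrix-matrix products with higher arithmetic
# intensity instead of K matrix-vector corrections.
E = np.zeros((n, K))
E[rows, np.arange(K)] = 1.0              # E = [e_{r_1} ... e_{r_K}]
S = np.eye(K) + U @ Ainv @ E             # K x K capacitance matrix
Ainv_blk = Ainv - Ainv @ E @ np.linalg.solve(S, U @ Ainv)
```

Both paths yield the inverse of the matrix with rows 0, 3 and 5 replaced; the sampling is unchanged, only the linear algebra is batched into BLAS-3 operations, which is where the order-of-magnitude update-time improvements come from.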
Automatic oscillator frequency control system
NASA Technical Reports Server (NTRS)
Smith, S. F. (Inventor)
1985-01-01
A frequency control system makes an initial correction of the frequency of its own timing circuit after comparison against a frequency of known accuracy and then sequentially checks and corrects the frequencies of several voltage controlled local oscillator circuits. The timing circuit initiates the machine cycles of a central processing unit which applies a frequency index to an input register in a modulo-sum frequency divider stage and enables a multiplexer to clock an accumulator register in the divider stage with a cyclical signal derived from the oscillator circuit being checked. Upon expiration of the interval, the processing unit compares the remainder held as the contents of the accumulator against a stored zero error constant and applies an appropriate correction word to a correction stage to shift the frequency of the oscillator being checked. A signal from the accumulator register may be used to drive a phase plane ROM and, with periodic shifts in the applied frequency index, to provide frequency shift keying of the resultant output signal. Interposition of a phase adder between the accumulator register and phase plane ROM permits phase shift keying of the output signal by periodic variation in the value of a phase index applied to one input of the phase adder.
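The accumulator-and-ROM signal path described in the abstract can be modelled in a few lines of Python. This is a behavioural toy, not the patented circuit; the 16-bit accumulator width, 256-entry ROM, and index values are arbitrary choices for illustration:

```python
import math

# Toy model of the modulo-sum accumulator feeding a phase-plane ROM.
# A frequency index advances the accumulator each clock; a phase index is
# added before the ROM lookup. Switching freq_index per clock gives FSK;
# switching phase_index gives PSK.
N = 16                       # accumulator width in bits (assumed)
MOD = 1 << N
ROM = [math.sin(2 * math.pi * k / 256) for k in range(256)]  # phase-plane ROM

def synthesize(freq_indices, phase_indices):
    """One output sample per clock; indices may change per clock."""
    acc, out = 0, []
    for fidx, pidx in zip(freq_indices, phase_indices):
        acc = (acc + fidx) % MOD                 # modulo-sum accumulation
        addr = ((acc + pidx) % MOD) >> (N - 8)   # phase adder feeds ROM address
        out.append(ROM[addr])
    return out

# FSK example: four clocks at one frequency index, four at double the index.
samples = synthesize([1024] * 4 + [2048] * 4, [0] * 8)
```

Doubling the frequency index doubles the rate at which the accumulator sweeps the ROM, i.e. doubles the output frequency, which is exactly the frequency-shift-keying mechanism the abstract describes; varying the phase index instead shifts the ROM address without changing the sweep rate, giving phase-shift keying.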
GPUmotif: An Ultra-Fast and Energy-Efficient Motif Analysis Program Using Graphics Processing Units
Zandevakili, Pooya; Hu, Ming; Qin, Zhaohui
2012-01-01
Computational detection of TF binding patterns has become an indispensable tool in functional genomics research. With the rapid advance of new sequencing technologies, large amounts of protein-DNA interaction data have been produced. Analyzing this data can provide substantial insight into the mechanisms of transcriptional regulation. However, the massive amount of sequence data presents daunting challenges. In our previous work, we have developed a novel algorithm called Hybrid Motif Sampler (HMS) that enables more scalable and accurate motif analysis. Despite much improvement, HMS is still time-consuming due to the requirement to calculate matching probabilities position-by-position. Using the NVIDIA CUDA toolkit, we developed a graphics processing unit (GPU)-accelerated motif analysis program named GPUmotif. We proposed a “fragmentation” technique to hide data transfer time between memories. Performance comparison studies showed that commonly-used model-based motif scan and de novo motif finding procedures such as HMS can be dramatically accelerated when running GPUmotif on NVIDIA graphics cards. As a result, energy consumption can also be greatly reduced when running motif analysis using GPUmotif. The GPUmotif program is freely available at http://sourceforge.net/projects/gpumotif/ PMID:22662128
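The position-by-position matching probability calculation that dominates the run time is, at its core, one independent score per start position, which is why it maps naturally onto GPU threads. A minimal CPU sketch with an invented toy position weight matrix (not GPUmotif's model or code):

```python
import numpy as np

# Illustrative position-weight-matrix scan (invented toy values, not
# GPUmotif's): score the motif's match at every start position.
BASES = {"A": 0, "C": 1, "G": 2, "T": 3}

def scan(seq, pwm):
    """Return the motif log-likelihood at each start position of seq."""
    w = pwm.shape[1]
    idx = np.array([BASES[b] for b in seq])
    logp = np.log(pwm)
    scores = np.empty(len(seq) - w + 1)
    for i in range(len(scores)):       # each i is independent -> one GPU thread
        scores[i] = logp[idx[i:i + w], np.arange(w)].sum()
    return scores

# Rows are P(base | motif position) for A, C, G, T; columns are positions.
pwm = np.array([[0.7, 0.1, 0.1],
                [0.1, 0.7, 0.1],
                [0.1, 0.1, 0.7],
                [0.1, 0.1, 0.1]])
scores = scan("TTACGTT", pwm)
best = int(np.argmax(scores))          # strongest match starts at "ACG"
```

Because no position's score depends on any other, the loop body parallelizes trivially; the remaining GPU cost is moving the sequence data, which is what the “fragmentation” technique described above hides behind computation.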
Alyahya, Mohammad
2012-02-01
Organizational structure is built through dynamic processes which blend historical forces and management decisions, as part of a broader process of constructing organizational memory (OM). OM is considered to be one of the main competences leading to an organization's success. This study focuses on the impact of the Quality and Outcomes Framework (QOF), a pay-for-performance scheme, on general practitioner (GP) practices in the UK. The study is based on semistructured interviews with four GP practices in the north of England involving 39 informants. The findings show that the way practices assigned different functions to specialized units, divisions or departments reflects the degree of specialization in their organizational structures. More specialized unit arrangements, such as an IT division, particular chronic disease clinics or competence-based job distributions, enhanced procedural memory development by enabling the regular use of knowledge in a specific context, which led to competence building. In turn, such competence at particular functions or jobs made it possible for the practices to achieve their goals more efficiently. This study concludes that organizational structure contributed strongly to the enhancement of OM, which in turn led to better organizational competence.
Multilevel Summation of Electrostatic Potentials Using Graphics Processing Units*
Hardy, David J.; Stone, John E.; Schulten, Klaus
2009-01-01
Physical and engineering practicalities involved in microprocessor design have resulted in flat performance growth for traditional single-core microprocessors. The urgent need for continuing increases in the performance of scientific applications requires the use of many-core processors and accelerators such as graphics processing units (GPUs). This paper discusses GPU acceleration of the multilevel summation method for computing electrostatic potentials and forces for a system of charged atoms, which is a problem of paramount importance in biomolecular modeling applications. We present and test a new GPU algorithm for the long-range part of the potentials that computes a cutoff pair potential between lattice points, essentially convolving a fixed 3-D lattice of “weights” over all sub-cubes of a much larger lattice. The implementation exploits the different memory subsystems provided on the GPU to stream optimally sized data sets through the multiprocessors. We demonstrate for the full multilevel summation calculation speedups of up to 26 using a single GPU and 46 using multiple GPUs, enabling the computation of a high-resolution map of the electrostatic potential for a system of 1.5 million atoms in under 12 seconds. PMID:20161132
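The long-range kernel described above is, structurally, a small fixed stencil of weights convolved over a charge lattice. A NumPy sketch with invented sizes and a toy smoothed-1/r weight function (not the paper's actual kernel, and without its GPU memory streaming):

```python
import numpy as np

# Sketch of the long-range lattice computation: a fixed, small 3-D lattice
# of weights (the softened pair potential between lattice points) convolved
# over every sub-cube of a larger charge lattice. Sizes and the weight
# function are illustrative assumptions.

def lattice_potential(charge, weights):
    """Direct stencil sum: potential[i] = sum_d weights[d] * charge[i + d - c]."""
    n = charge.shape[0]
    w = weights.shape[0]
    c = w // 2
    pot = np.zeros_like(charge)
    padded = np.pad(charge, c)
    for dx in range(w):                  # the GPU version streams these tiles
        for dy in range(w):              # through shared memory instead
            for dz in range(w):
                pot += weights[dx, dy, dz] * padded[dx:dx + n,
                                                    dy:dy + n,
                                                    dz:dz + n]
    return pot

charge = np.zeros((8, 8, 8))
charge[4, 4, 4] = 1.0                    # single unit charge on the lattice
weights = np.fromfunction(               # toy smoothed-1/r weights, 5^3 stencil
    lambda x, y, z: 1.0 / (1.0 + np.sqrt((x - 2)**2 + (y - 2)**2 + (z - 2)**2)),
    (5, 5, 5))
pot = lattice_potential(charge, weights)
```

Every output point applies the same weight stencil to its neighbourhood, so the work is uniform and data-parallel; the optimization problem the paper addresses is feeding those overlapping neighbourhoods through the GPU memory subsystems efficiently.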
GPUmotif: an ultra-fast and energy-efficient motif analysis program using graphics processing units.
Zandevakili, Pooya; Hu, Ming; Qin, Zhaohui
2012-01-01
Computational detection of TF binding patterns has become an indispensable tool in functional genomics research. With the rapid advance of new sequencing technologies, large amounts of protein-DNA interaction data have been produced. Analyzing this data can provide substantial insight into the mechanisms of transcriptional regulation. However, the massive amount of sequence data presents daunting challenges. In our previous work, we have developed a novel algorithm called Hybrid Motif Sampler (HMS) that enables more scalable and accurate motif analysis. Despite much improvement, HMS is still time-consuming due to the requirement to calculate matching probabilities position-by-position. Using the NVIDIA CUDA toolkit, we developed a graphics processing unit (GPU)-accelerated motif analysis program named GPUmotif. We proposed a "fragmentation" technique to hide data transfer time between memories. Performance comparison studies showed that commonly-used model-based motif scan and de novo motif finding procedures such as HMS can be dramatically accelerated when running GPUmotif on NVIDIA graphics cards. As a result, energy consumption can also be greatly reduced when running motif analysis using GPUmotif. The GPUmotif program is freely available at http://sourceforge.net/projects/gpumotif/
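The per-position matching-probability computation that makes HMS time-consuming (and that GPUmotif parallelizes) is, at its core, a position weight matrix scan. A minimal sketch, with hypothetical names and a uniform background model assumed for simplicity:

```python
import numpy as np

def pwm_scan(seq, pwm, background=0.25):
    """Score every position of a DNA sequence against a position weight
    matrix (PWM) of shape (4, motif_width), rows ordered A, C, G, T.
    Returns the log-odds score of each window vs. a uniform background."""
    idx = {"A": 0, "C": 1, "G": 2, "T": 3}
    width = pwm.shape[1]
    scores = []
    for start in range(len(seq) - width + 1):
        # sum of per-column log-odds over the window
        s = sum(np.log(pwm[idx[seq[start + j]], j] / background)
                for j in range(width))
        scores.append(s)
    return scores
```

Each window's score is independent of the others, which is why the scan maps naturally onto thousands of GPU threads.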
Changing Landscapes: Integrated Teaching Units.
ERIC Educational Resources Information Center
Whitty, Helen, Ed.
This collection of teaching units developed in Australia arises from "Special Forever: An Environmental Communications Project," which enables students to develop critical awareness of their local environments and communicate this awareness effectively. The project also aims to encourage school-based action in support of the environment…
2016-05-01
Biometrics in Support of Operations Biometrics-at-Sea: Business Rules for South Florida United States...Intelligence Activities Biometrics-Enabled Intelligence USCG Biometrics-at-Sea: Business Rules for...Defense Biometrics United States Intelligence Activities Active Army,
CFDP for Interplanetary Overlay Network
NASA Technical Reports Server (NTRS)
Burleigh, Scott C.
2011-01-01
The CCSDS (Consultative Committee for Space Data Systems) File Delivery Protocol for Interplanetary Overlay Network (CFDP-ION) is an implementation of CFDP that uses ION's DTN (delay tolerant networking) implementation as its UT (unit-data transfer) layer. Because the DTN protocols effect automatic, reliable transmission via multiple relays, CFDP-ION need only satisfy the requirements for Class 1 ("unacknowledged") CFDP. This keeps the implementation small, but without loss of capability. This innovation minimizes processing resources by using zero-copy objects for file data transmission. It runs without modification in VxWorks, Linux, Solaris, and OS X. As such, this innovation can be used without modification in both flight and ground systems. Integration with DTN enables the CFDP implementation itself to be very simple; therefore, very small. Use of ION infrastructure minimizes consumption of storage and processing resources while maximizing safety.
Magnetospheric Multiscale (MMS)
2014-05-09
Electrical technicians work diligently to build the connector harnessing for the Command and Data Handling (C&DH) unit, (black box with two red handles) that is installed on spacecraft Deck for MMS #4. Learn more about MMS at www.nasa.gov/mms Credit NASA/Goddard The Magnetospheric Multiscale, or MMS, will study how the sun and the Earth's magnetic fields connect and disconnect, an explosive process that can accelerate particles through space to nearly the speed of light. This process is called magnetic reconnection and can occur throughout all space. NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.
Communication elements supporting patient safety in psychiatric inpatient care.
Kanerva, A; Kivinen, T; Lammintakanen, J
2015-06-01
Communication is important for safe and quality health care. The study provides needed insight on the communication elements that support patient safety from the psychiatric care view. Fluent information transfer between the health care professionals and care units is important for care planning and maintaining practices. Information should be documented and implemented accordingly. Communication should take place in an open communication culture that enables discussion, provides opportunities for debriefing, and lets the entire staff feel they are heard. For effective communication, it is also important that staff themselves actively collect the essential information needed in patient care. In mental health nursing, it is important to pay attention to all elements of communication and to develop processes concerning communication in multidisciplinary teams and across unit boundaries. The study aims to describe which communication elements support patient safety in psychiatric inpatient care from the viewpoint of the nursing staff. Communication is an essential part of care and one of the core competencies of psychiatric care. It enables safe and quality patient care. Errors in health care are often connected with poor communication. The data were gathered from semi-structured interviews in which 26 nurses were asked to describe the elements that constitute patient safety in psychiatric inpatient care. The data were analysed inductively from the viewpoint of communication. The descriptions connected with communication formed a main category of communication elements that support patient safety; this main category was made up of three subcategories: fluent information transfer, open communication culture and being active in information collecting.
Fluent information transfer consists of the practical implementation of communication; open communication culture is connected with the cultural issues of communication; and being active in information collecting is related to a nurse's personal working style, which affects communication. It is important to pay attention to all the three areas and use this knowledge in developing patient safety practices and strategies where communication aspect and culture are noted and developed. In mental health nursing, it is important to develop processes concerning communication in multidisciplinary teams and across unit boundaries. © 2015 John Wiley & Sons Ltd.
Cascaded Quadruple Active Bridge Structures for Multilevel DC to Three-Phase AC Conversion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Brian B; Achanta, Prasanta K; Maksimovic, Dragan
This paper introduces a multilevel architecture composed of interconnected dc to three-phase ac converter units. To enable series connected operation, each converter unit contains a quadruple active bridge (QAB) converter that provides isolation between the dc side and each of the three ac sides. Since each converter unit transfers dc-side power as constant balanced three-phase power on the ac side, this implies instantaneous input-output power balance and allows elimination of bulk capacitive energy storage. In addition to minimizing required capacitance, the proposed approach simultaneously enables simplified dc-link controllers amenable to decentralized implementation, supports bidirectional power transfer, and exhibits a modular structure to enhance scalability. Isolation provided by the QAB allows a wide range of electrical configurations among multiple units in various dc-ac, ac-dc or ac-ac applications. In this paper, the focus is on series connections on the ac side to emphasize multilevel operation, and the approach is experimentally validated in a dc-ac system containing two cascaded converter units.
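The claim that constant balanced three-phase power permits eliminating bulk dc-link capacitance follows from a textbook identity (sketched here for context; this derivation is not taken from the paper): the instantaneous power of a balanced three-phase load has no pulsating component.

```latex
% Balanced three-phase voltages and currents:
%   v_k(t) = V\cos(\omega t - 2\pi k/3),\quad
%   i_k(t) = I\cos(\omega t - 2\pi k/3 - \varphi),\quad k = 0,1,2
p(t) = \sum_{k=0}^{2} V\cos\!\left(\omega t - \tfrac{2\pi k}{3}\right)
       I\cos\!\left(\omega t - \tfrac{2\pi k}{3} - \varphi\right)
     = \frac{3VI}{2}\cos\varphi
```

Because the double-frequency terms cancel across the three phases, $p(t)$ is constant, so the dc side never has to buffer pulsating energy.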
Careers (A Course of Study). Unit IV: Applying for the Job.
ERIC Educational Resources Information Center
Turley, Kay
Designed to enable special needs students to write resumes and complete application forms with employable accuracy, this set of activities on applying for a job is the fourth unit in a nine-unit secondary level careers course intended to provide handicapped students with the knowledge and tools necessary to succeed in the world of work. Chapter 1…
Careers (A Course of Study). Unit VI: Interviewing for the Job.
ERIC Educational Resources Information Center
Turley, Kay
Designed to enable special needs students to arrange, complete, and follow up a job interview, this set of activities on job interviews is the sixth unit in a nine-unit secondary level careers course intended to provide handicapped students with the knowledge and tools necessary to succeed in the world of work. The eight activities in the first…
Lake Erie...A Day in the Life of a Fish.
ERIC Educational Resources Information Center
Canning, Maureen; Dunlevy, Margie
This elementary school teaching unit was developed as a part of a series of units that deal with Lake Erie. This unit was developed to enable children to: (1) examine a moving fish; (2) conduct experiments with a live fish; (3) understand the swimming habits of fish; (4) learn how fish breathe; (5) recognize different methods of fish protection…
ERIC Educational Resources Information Center
Canning, Maureen; Dunlevy, Margie
This elementary school teaching unit was developed as a part of a series of teaching units that deal with Lake Erie. This unit was developed to enable children to: (1) identify the Great Lakes and pick out Lake Erie on a map; (2) demonstrate knowledge of Lake Erie's origin and geography; (3) list some uses of Lake Erie; and (4) give examples of…
Lake Erie...Build a Fish to Scale!
ERIC Educational Resources Information Center
Canning, Maureen; Dunlevy, Margie
This elementary school teaching unit was developed as a part of a series of teaching units that deal with Lake Erie. This unit was developed to enable children to: (1) name the different parts of a fish; (2) assemble a fish using overlapping overheads to reinforce fish parts; (3) build a fish to scale using jumbo fish puzzle parts; (4) classify…
Quality Assurance in Dietetic Services Workshop for the Dietetic Assistant.
ERIC Educational Resources Information Center
Oklahoma State Dept. of Vocational and Technical Education, Stillwater. Curriculum and Instructional Materials Center.
This workshop guide is a unit of study for teaching dietetic assistants to work with quality control in a nursing home or hospital. The objective of the unit is to enable the students to develop and expand a dietetic services administrative and clinical quality assurance program in their own institutions. Following the unit objective, the unit…
Evaluating the Cost-Effectiveness of Instructional Programs.
ERIC Educational Resources Information Center
Alkin, Marvin C.
A model of cost-effectiveness is outlined which enables consideration of some non-financial, as well as financial, elements of educational systems at school or district levels. The model enables the decision-maker to compare educational outcomes of different units, to assess the impact of alternative levels of financial input, and to select…
NASA Astrophysics Data System (ADS)
Grobman, Warren D.
2002-07-01
Dramatically increasing mask set costs, long-loop design-fabrication iterations, and lithography of unprecedented complexity and cost threaten to disrupt time-accepted IC industry progression as described by Moore's Law. Practical and cost-effective IC manufacturing below the 100nm technology node presents significant and unique new challenges spanning multiple disciplines and overlapping traditionally separable components of the design-through-chip manufacturing flow. Lithographic and other process complexity is compounded by design, mask, and infrastructure technologies, which do not sufficiently account for increasingly stringent and complex manufacturing issues. Deep subwavelength and atomic-scale process and device physics effects increasingly invade and impact the design flow strongly at a time when the pressures for increased design productivity are escalating at a superlinear rate. Productivity gaps, both upstream in design and downstream in fabrication, are anticipated by many to increase due to dramatic increases in inherent complexity of the design-to-chip equation. Furthermore, the cost of lithographic equipment is increasing at an aggressive compound growth rate so large that we can no longer economically derive the benefit of the increased number of circuits per unit area unless we extend the life of lithographic equipment for more generations, and deeper into the subwavelength regime. Do these trends unambiguously lead to the conclusion that we need a revolution in design and design-process integration to enable the sub-100nm nodes? Or is such a premise similar to other well-known predictions of technology brick walls that never came true?
A Rinsing Effluent Evaporator for Dismantling Operations - 13271
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rives, Rachel; Asou-Pothet, Marielle; Chambon, Frederic
2013-07-01
Between 1958 and 1997, the UP1 plant at Marcoule - located in the south of France - reprocessed and recycled nearly 20,000 MT of used fuel from special defense applications reactors, as well as fuel from the first generation of electricity generating reactors in France (natural uranium fuel, CO₂-cooled, graphite-moderated). Decommissioning and Dismantling of the UP1 plant and its associated units started in 1998. Since 2005, the UP1 facility has been operated by AREVA as the Marcoule Management and Operation contractor for French Atomic Energy Commission (CEA). An important part of this decommissioning program deals with the vitrification facility of Marcoule. This facility includes 20 tanks devoted to interim storage of highly active solutions, prior to vitrification. In 2006, a rinsing program was defined as part of the tank cleanup strategy. The main objective of the rinsing phases was to decrease activity in order to limit the volume of 'long-life active' waste produced during the decommissioning operations, so the tanks can be dismantled without the need of remote operations. To enable this rinsing program, and anticipating large volumes of generated effluent, the construction of an evaporation unit proved to be essential. The main objective of this unit was to concentrate the effluent produced during tank rinsing operations by a factor of approximately 10, prior to it being treated by vitrification. The evaporator design phase was launched in September 2006. The main challenge for the Project team was the installation of this new unit within a nuclear facility still in operation and in existing compartments not initially designed for this purpose. Cold operating tests were completed in 2008, and in May 2009, the final connections to the process were activated to start the hot test phase. During the first hot test operations performed on the first batches of clean-up effluent, the evaporator had a major operating problem.
Extremely large quantities of foam were produced, affecting the evaporator operation, and creating the risk of a reduction in its capacity and throughput performance. A task force of AREVA process, operations, and safety experts from Marcoule and the La Hague reprocessing complex was assembled. New operating parameters were defined and tested to improve the process. Since then, the evaporator has performed very satisfactorily. The foam buildup phenomenon has been brought under complete control. All the different types of effluents produced during cleanup operations have been concentrated, and the results obtained in terms of quality and throughput, have ensured a consistent supply to the vitrification unit. The evaporator was operated until the end of April 2012, and enabled the production of 500 cubic meters of very high activity effluent, concentrating the fission products rinsed from the storage tanks. The evaporator will now be deactivated and decommissioned, with the first rinsing and cleanup operations scheduled to begin in 2014. (authors)
A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations
NASA Astrophysics Data System (ADS)
Demir, I.; Agliamzanov, R.
2014-12-01
Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments to utilize the computing power of the millions of computers on the Internet, and use them towards running large scale environmental simulations and models to serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications, and utilize the power of Graphics Processing Units (GPU). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models, and run them on volunteer computers. Website owners can easily enable their sites so that visitors can volunteer their computing resources to run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational sizes. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform to enable large scale hydrological simulations and model runs in an open and integrated environment.
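The dispatch side of such a platform — splitting a model domain into small spatial tiles and queuing them for volunteer nodes — can be sketched in a few lines. This is a hypothetical illustration (the paper's actual framework is JavaScript with a relational-database queue; names here are invented):

```python
from collections import deque

def make_tiles(nx, ny, tile):
    """Split an nx-by-ny model domain into tile-by-tile work units,
    each small enough for a browser client to compute quickly."""
    q = deque()
    for x in range(0, nx, tile):
        for y in range(0, ny, tile):
            q.append({"x0": x, "y0": y,
                      "x1": min(x + tile, nx), "y1": min(y + tile, ny),
                      "status": "pending"})
    return q

def next_task(queue):
    """Hand the next pending tile to a volunteer node."""
    for task in queue:
        if task["status"] == "pending":
            task["status"] = "assigned"
            return task
    return None
```

A production queue would also track timeouts and reassign tiles whose volunteers disappear, which is where the relational database comes in.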
Kaliman, Ilya A; Krylov, Anna I
2017-04-30
A new hardware-agnostic contraction algorithm for tensors of arbitrary symmetry and sparsity is presented. The algorithm is implemented as a stand-alone open-source code libxm. This code is also integrated with general tensor library libtensor and with the Q-Chem quantum-chemistry package. An overview of the algorithm, its implementation, and benchmarks are presented. Similarly to other tensor software, the algorithm exploits efficient matrix multiplication libraries and assumes that tensors are stored in a block-tensor form. The distinguishing features of the algorithm are: (i) efficient repackaging of the individual blocks into large matrices and back, which affords efficient graphics processing unit (GPU)-enabled calculations without modifications of higher-level codes; (ii) fully asynchronous data transfer between disk storage and fast memory. The algorithm enables canonical all-electron coupled-cluster and equation-of-motion coupled-cluster calculations with single and double substitutions (CCSD and EOM-CCSD) with over 1000 basis functions on a single quad-GPU machine. We show that the algorithm exhibits predicted theoretical scaling for canonical CCSD calculations, O(N^6), irrespective of the data size on disk. © 2017 Wiley Periodicals, Inc.
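The "repackaging" idea in feature (i) can be illustrated with a toy example: gather small stored blocks into one large dense matrix so a single large GEMM replaces many tiny ones. The block layout and names below are illustrative only, not libxm's actual data structures:

```python
import numpy as np

def contract_blocked(A_blocks, B_blocks, nblk, bs):
    """Toy sketch of block repackaging: A_blocks and B_blocks map block
    indices (i, j) to bs-by-bs arrays; absent blocks are zero (sparsity).
    The blocks are assembled into large matrices and contracted with one
    big matrix multiplication, which is what lets BLAS/GPU GEMM run at
    full efficiency."""
    A = np.zeros((nblk * bs, nblk * bs))
    B = np.zeros((nblk * bs, nblk * bs))
    for (i, j), blk in A_blocks.items():
        A[i*bs:(i+1)*bs, j*bs:(j+1)*bs] = blk
    for (i, j), blk in B_blocks.items():
        B[i*bs:(i+1)*bs, j*bs:(j+1)*bs] = blk
    return A @ B   # one large GEMM instead of many tiny ones
```

The real code streams blocks from disk asynchronously and scatters the result back into block form; this sketch only shows the gather-multiply step.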
77 FR 70136 - Privacy Act of 1974, System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-23
.... It is USAID's core financial management system and accounting system of record. Phoenix enables USAID... of 1974, System of Records AGENCY: United States Agency for International Development. ACTION: Altered system of records. SUMMARY: The United States Agency for International Development (USAID) is...
Carroll, Katherine
2014-06-01
When mothers of preterm infants are unable to produce sufficient volumes of breastmilk, neonatologists in many Western countries prescribe pasteurized donor breastmilk. Breastmilk has a paradoxical presence in the neonatal intensive care unit: while it has therapeutic properties, it also has the potential to transmit disease. National health authorities and local neonatal intensive care unit policies each delimit the safety of donor milk by focusing on the presence or absence of pathogens. It is in this light that breastmilk from the human milk bank is both sought and legitimated to minimize safety concerns. This research uses data arising from an ethnographic study of two human milk banks and two neonatal intensive care units in the United States, and 73 interviews with milk donors, neonatal intensive care unit parents and clinicians. The primary research question framing the study was 'What are the underlying processes and practices that have enabled donor milk to be endorsed as a safe and legitimate feeding option in neonatal intensive care units?' This study is framed using three key principles of Latour's 'new critique', namely, adding to reality rather than debunking it, getting closer to data rather than turning away from fact and creating arenas in which to assemble. As a result, conceptions of donor milk's safety are expanded. This case study of donor milk demonstrates how Latour's new critique can inform science and technology studies approaches to the study of safety in health care.
Purdon, Patrick L.; Millan, Hernan; Fuller, Peter L.; Bonmassar, Giorgio
2008-01-01
Simultaneous recording of electrophysiology and functional magnetic resonance imaging (fMRI) is a technique of growing importance in neuroscience. Rapidly evolving clinical and scientific requirements have created a need for hardware and software that can be customized for specific applications. Hardware may require customization to enable a variety of recording types (e.g., electroencephalogram, local field potentials, or multi-unit activity) while meeting the stringent and costly requirements of MRI safety and compatibility. Real-time signal processing tools are an enabling technology for studies of learning, attention, sleep, epilepsy, neurofeedback, and neuropharmacology, yet real-time signal processing tools are difficult to develop. We describe an open source system for simultaneous electrophysiology and fMRI featuring low-noise (< 0.6 µV p-p input noise), electromagnetic compatibility for MRI (tested up to 7 Tesla), and user-programmable real-time signal processing. The hardware distribution provides the complete specifications required to build an MRI-compatible electrophysiological data acquisition system, including circuit schematics, printed circuit board (PCB) layouts, Gerber files for PCB fabrication and robotic assembly, a bill of materials with part numbers, data sheets, and vendor information, and test procedures. The software facilitates rapid implementation of real-time signal processing algorithms. This system has been used in human EEG/fMRI studies at 3 and 7 Tesla examining the auditory system, visual system, sleep physiology, and anesthesia, as well as in intracranial electrophysiological studies of the non-human primate visual system during 3 Tesla fMRI, and in human hyperbaric physiology studies at depths of up to 300 feet below sea level. PMID:18761038
Purdon, Patrick L; Millan, Hernan; Fuller, Peter L; Bonmassar, Giorgio
2008-11-15
Simultaneous recording of electrophysiology and functional magnetic resonance imaging (fMRI) is a technique of growing importance in neuroscience. Rapidly evolving clinical and scientific requirements have created a need for hardware and software that can be customized for specific applications. Hardware may require customization to enable a variety of recording types (e.g., electroencephalogram, local field potentials, or multi-unit activity) while meeting the stringent and costly requirements of MRI safety and compatibility. Real-time signal processing tools are an enabling technology for studies of learning, attention, sleep, epilepsy, neurofeedback, and neuropharmacology, yet real-time signal processing tools are difficult to develop. We describe an open-source system for simultaneous electrophysiology and fMRI featuring low-noise (<0.6 µV p-p input noise), electromagnetic compatibility for MRI (tested up to 7T), and user-programmable real-time signal processing. The hardware distribution provides the complete specifications required to build an MRI-compatible electrophysiological data acquisition system, including circuit schematics, printed circuit board (PCB) layouts, Gerber files for PCB fabrication and robotic assembly, a bill of materials with part numbers, data sheets, and vendor information, and test procedures. The software facilitates rapid implementation of real-time signal processing algorithms. This system has been used in human EEG/fMRI studies at 3 and 7T examining the auditory system, visual system, sleep physiology, and anesthesia, as well as in intracranial electrophysiological studies of the non-human primate visual system during 3T fMRI, and in human hyperbaric physiology studies at depths of up to 300 feet below sea level.
McLaughlin, William A; Chen, Ken; Hou, Tingjun; Wang, Wei
2007-01-01
Background: Protein domains coordinate to perform multifaceted cellular functions, and domain combinations serve as the functional building blocks of the cell. The available methods to identify functional domain combinations are limited in their scope, e.g. to the identification of combinations falling within individual proteins or within specific regions in a translated genome. Further effort is needed to identify groups of domains that span across two or more proteins and are linked by a cooperative function. Such functional domain combinations can be useful for protein annotation. Results: Using a new computational method, we have identified 114 groups of domains, referred to as domain assembly units (DASSEM units), in the proteome of budding yeast Saccharomyces cerevisiae. The units participate in many important cellular processes such as transcription regulation, translation initiation, and mRNA splicing. Within the units the domains were found to function in a cooperative manner; and each domain contributed to a different aspect of the unit's overall function. The member domains of DASSEM units were found to be significantly enriched among proteins contained in transcription modules, defined as genes sharing similar expression profiles and presumably similar functions. The observation further confirmed the functional coherence of DASSEM units. The functional linkages of units were found in both functionally characterized and uncharacterized proteins, which enabled the assessment of protein function based on domain composition. Conclusion: A new computational method was developed to identify groups of domains that are linked by a common function in the proteome of Saccharomyces cerevisiae. These groups can either lie within individual proteins or span across different proteins. We propose that the functional linkages among the domains within the DASSEM units can be used as a non-homology based tool to annotate uncharacterized proteins. PMID:17937820
Skogheim, Gry; Hanssen, Tove A
2015-12-01
In some economically developed countries, women's choice of birth care and birth place is encouraged. The aim of this study was to explore and describe the experiences of midwives who started working in alongside/free-standing midwifery units (AMU/FMU) and their experiences with labour care in this setting. A qualitative explorative design using a phenomenographic approach was used. Semi-structured interviews were conducted with ten strategically sampled midwives working in midwifery units. The analysis revealed the following five categories of experiences noted by the midwives: mixed emotions and de-learning obstetric unit habits, revitalising midwifery philosophy, alertness and preparedness, presence and patience, and coping with time. Starting to work in an AMU/FMU can be a distressing period for a midwife. First, it may require de-learning the medical approach to birth, and, second, it may entail a revitalisation (and re-learning) of birth care that promotes physiological birth. Midwifery, particularly in FMUs, requires an especially careful assessment of the labouring process, the ability to be foresighted, and capability in emergencies. The autonomy of midwives may be constrained also in AMUs/FMUs. However, working in these settings is also viewed as experiencing "the art of midwifery" and enables revitalisation of the midwifery philosophy. Copyright © 2015 Elsevier B.V. All rights reserved.
Body Area Network BAN--a key infrastructure element for patient-centered medical applications.
Schmidt, Robert; Norgall, Thomas; Mörsdorf, Joachim; Bernhard, Josef; von der Grün, Thomas
2002-01-01
The Body Area Network (BAN) concept enables wireless communication between several miniaturized, intelligent Body Sensor (or actor) Units (BSU) and a single Body Central Unit (BCU) worn at the human body. A separate wireless transmission link from the BCU to a network access point--using different technology--provides for online access to BAN data via usual network infrastructure. BAN is expected to become a basic infrastructure element for service-based electronic health assistance: By integrating patient-attached sensors and control of mobile dedicated actor units, the range of medical workflow can be extended by wireless patient monitoring and therapy support. Beyond clinical use, professional disease management environments, and private personal health assistance scenarios (without financial reimbursement by health agencies/insurance companies), BAN enables a wide range of health care applications and related services.
Partitioning in parallel processing of production systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oflazer, K.
1987-01-01
This thesis presents research on certain issues related to parallel processing of production systems. It first presents a parallel production system interpreter that has been implemented on a four-processor multiprocessor. This parallel interpreter is based on Forgy's OPS5 interpreter and exploits production-level parallelism in production systems. Runs on the multiprocessor system indicate that it is possible to obtain speed-up of around 1.7 in the match computation for certain production systems when productions are split into three sets that are processed in parallel. The next issue addressed is that of partitioning a set of rules to processors in a parallel interpreter with production-level parallelism, and the extent of additional improvement in performance. The partitioning problem is formulated and an algorithm for approximate solutions is presented. The thesis next presents a parallel processing scheme for OPS5 production systems that allows some redundancy in the match computation. This redundancy enables the processing of a production to be divided into units of medium granularity each of which can be processed in parallel. Subsequently, a parallel processor architecture for implementing the parallel processing algorithm is presented.
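One common approximate approach to the rule-partitioning problem sketched above is a greedy load-balancing heuristic: assign each production, in decreasing order of estimated match cost, to the currently least-loaded processor. This is a generic sketch of that idea, not the thesis's actual algorithm:

```python
def partition_productions(match_costs, n_procs):
    """Greedy (longest-processing-time-first) partitioning: match_costs
    maps each production name to an estimated match cost; returns a
    rule-to-processor assignment and the resulting per-processor loads."""
    loads = [0.0] * n_procs
    assignment = {}
    # place expensive productions first for a better balance
    for rule, cost in sorted(match_costs.items(),
                             key=lambda kv: kv[1], reverse=True):
        p = loads.index(min(loads))   # least-loaded processor so far
        assignment[rule] = p
        loads[p] += cost
    return assignment, loads
```

In practice the cost estimates would come from profiling the match phase, and a good partition is what determines how much of the theoretical production-level parallelism is actually realized.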
NASA Technical Reports Server (NTRS)
Tri, Terry O.; Kennedy, Kriss J.; Toups, Larry; Gill, Tracy R.; Howe, A. Scott
2011-01-01
This paper describes the construction, assembly, subsystem integration, transportation, and field testing operations associated with the Habitat Demonstration Unit (HDU) Pressurized Excursion Module (PEM) and discusses lessons learned. In a one-year period beginning summer 2009, a tightly scheduled design-develop-build process was utilized by a small NASA "tiger team" to produce the functional HDU-PEM prototype in time to participate in the 2010 Desert Research and Technology Studies (Desert RATS) field campaign. The process required the coordination of multiple teams, subcontractors, facility management and safety staff. It also required a well-choreographed material handling and transportation process to deliver the finished product from the NASA-Johnson Space Center facilities to the remote Arizona desert locations of the field test. Significant findings of this paper include the team's greater understanding of the HDU-PEM's many integration issues and the in-field training the team acquired which will enable the implementation of the next-generation of improvements and development of high-fidelity field operations in a harsh environment. The Desert RATS analog environment is being promoted by NASA as an efficient means to design, build, and integrate multiple technologies in a mission architecture context, with the eventual goal of evolving the technologies into robust flight hardware systems. The HDU-PEM in-field demonstration at Desert RATS 2010 provided a validation process for the integration team, which has already begun to retool for the 2011 field tests that require an adapted architecture.
User's manual for computer program BASEPLOT
Sanders, Curtis L.
2002-01-01
The checking and reviewing of daily records of streamflow within the U.S. Geological Survey is traditionally accomplished by hand-plotting and mentally collating tables of data. The process is time-consuming, difficult to standardize, and subject to errors in computation, data entry, and logic. In addition, the presentation of flow data on the internet requires more timely and accurate computation of daily flow records. BASEPLOT was developed for checking and review of primary streamflow records within the U.S. Geological Survey. Use of BASEPLOT enables users to (1) provide efficiencies during the record checking and review process, (2) improve quality control, (3) achieve uniformity of checking and review techniques of simple stage-discharge relations, and (4) provide a tool for teaching streamflow computation techniques. The BASEPLOT program produces tables of quality-control checks and plots of rating curves and discharge measurements; variable shift (V-shift) diagrams; and V-shifts converted to stage-discharge plots, using data stored in the U.S. Geological Survey Automatic Data Processing System database. In addition, the program plots unit-value hydrographs that show unit-value stages, shifts, and datum corrections; input shifts, datum corrections, and effective dates; discharge measurements; effective dates for rating tables; and numeric quality-control checks. Checklist/tutorial forms are provided for reviewers to ensure completeness of review and standardize the review process. The program was written for the U.S. Geological Survey SUN computer using the Statistical Analysis System (SAS) software produced by SAS Institute, Incorporated.
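One of the quality-control checks this kind of record review automates can be illustrated by comparing each discharge measurement against the stage-discharge rating and flagging large departures. This is a hedged sketch, assuming a simple linear-interpolation rating; the 5% tolerance and the function names are hypothetical, not taken from BASEPLOT.

```python
def rating_discharge(stage, rating):
    """Interpolate discharge from a stage-discharge rating table.

    `rating` is a list of (stage, discharge) pairs sorted by stage.
    """
    for (s0, q0), (s1, q1) in zip(rating, rating[1:]):
        if s0 <= stage <= s1:
            return q0 + (q1 - q0) * (stage - s0) / (s1 - s0)
    raise ValueError("stage outside rating table")

def percent_difference(measured_q, rated_q):
    """Percent departure of a discharge measurement from the rating."""
    return 100.0 * (measured_q - rated_q) / rated_q

def flag_measurements(measurements, rating, tolerance=5.0):
    """Flag (stage, discharge) measurements departing more than
    `tolerance` percent from the rating curve."""
    flagged = []
    for stage, q in measurements:
        pd = percent_difference(q, rating_discharge(stage, rating))
        if abs(pd) > tolerance:
            flagged.append((stage, q, round(pd, 1)))
    return flagged
```

Measurements that fall outside the tolerance are candidates for a V-shift or a rating revision.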
Method and apparatus to debug an integrated circuit chip via synchronous clock stop and scan
Bellofatto, Ralph E [Ridgefield, CT; Ellavsky, Matthew R [Rochester, MN; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Gooding, Thomas M [Rochester, MN; Haring, Rudolf A [Cortlandt Manor, NY; Hehenberger, Lance G [Leander, TX; Ohmacht, Martin [Yorktown Heights, NY
2012-03-20
An apparatus and method for evaluating a state of an electronic or integrated circuit (IC), each IC including one or more processor elements for controlling operations of IC sub-units, and each IC supporting multiple frequency clock domains. The method comprises: generating a synchronized set of enable signals in correspondence with one or more IC sub-units for starting operation of one or more IC sub-units according to a determined timing configuration; counting, in response to one signal of the synchronized set of enable signals, a number of main processor IC clock cycles; and, upon attaining a desired clock cycle number, generating a stop signal for each unique frequency clock domain to synchronously stop a functional clock for each respective frequency clock domain; and, upon synchronously stopping all on-chip functional clocks on all frequency clock domains in a deterministic fashion, scanning out data values at a desired IC chip state. The apparatus and methodology enable construction of a cycle-by-cycle view of any part of the state of a running IC chip, using a combination of on-chip circuitry and software.
Code of Federal Regulations, 2013 CFR
2013-10-01
... was established. (g) Conservation System Unit (CSU) means any unit in Alaska of the National Park... established as defined by the enabling legislation for the area. (m) Related structures and facilities means... Interior. (p) Transportation or utility system (TUS) means any of the systems listed in paragraphs (p) (1...
Code of Federal Regulations, 2014 CFR
2014-10-01
... was established. (g) Conservation System Unit (CSU) means any unit in Alaska of the National Park... established as defined by the enabling legislation for the area. (m) Related structures and facilities means... Interior. (p) Transportation or utility system (TUS) means any of the systems listed in paragraphs (p) (1...
Code of Federal Regulations, 2011 CFR
2011-10-01
... was established. (g) Conservation System Unit (CSU) means any unit in Alaska of the National Park... established as defined by the enabling legislation for the area. (m) Related structures and facilities means... Interior. (p) Transportation or utility system (TUS) means any of the systems listed in paragraphs (p) (1...
Young, John A.; Mahan, Carolyn G.; Forder, Melissa
2017-01-01
Many eastern forest communities depend on fire for regeneration or are enhanced by fire as a restoration practice. However, the use of prescribed fire in the mesic forested environments and the densely populated regions of the eastern United States has been limited. The objective of our research was to develop a science-based approach to prioritizing the use of prescribed fire in appropriate forest types in the eastern United States based on a set of desired management outcomes. Through a process of expert elicitation and data analysis, we assessed and integrated recent vegetation community mapping results along with other available spatial data layers into a spatial prioritization tool for prescribed fire planning at Shenandoah National Park (Virginia, USA). The integration of vegetation spatial data allowed for development of per-pixel priority rankings and exclusion areas enabling precise targeting of fire management activities on the ground, as well as a park-wide ranking of fire planning compartments. We demonstrate the use and evaluation of this approach through implementation and monitoring of a prescribed burn and show that progress is being made toward desired conditions. Integration of spatial data into the fire planning process has served as a collaborative tool for the implementation of prescribed fire projects, which assures projects will be planned in the most appropriate areas to meet objectives that are supported by current science.
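A per-pixel priority ranking of the kind described, combining weighted spatial data layers and exclusion areas, might look like the following sketch. The layers, weights, and min-max normalization are illustrative assumptions, not the tool's actual method.

```python
def normalize(layer):
    """Scale a 2-D layer (list of rows) of values to the 0-1 range."""
    vals = [v for row in layer for v in row]
    lo, hi = min(vals), max(vals)
    rng = (hi - lo) or 1.0          # avoid division by zero on flat layers
    return [[(v - lo) / rng for v in row] for row in layer]

def priority_surface(layers, weights, exclusion):
    """Per-pixel weighted overlay of normalized data layers, with
    excluded pixels (True in `exclusion`) forced to zero priority."""
    norm = [normalize(l) for l in layers]
    rows, cols = len(exclusion), len(exclusion[0])
    out = []
    for r in range(rows):
        row = []
        for c in range(cols):
            p = sum(w * n[r][c] for w, n in zip(weights, norm))
            row.append(0.0 if exclusion[r][c] else p)
        out.append(row)
    return out
```

Summing the resulting surface over each fire planning compartment would then give the park-wide compartment ranking the abstract mentions.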
Accelerated rescaling of single Monte Carlo simulation runs with the Graphics Processing Unit (GPU).
Yang, Owen; Choi, Bernard
2013-01-01
To interpret fiber-based and camera-based measurements of remitted light from biological tissues, researchers typically use analytical models, such as the diffusion approximation to light transport theory, or stochastic models, such as Monte Carlo modeling. To achieve rapid (ideally real-time) measurement of tissue optical properties, especially in clinical situations, there is a critical need to accelerate Monte Carlo simulation runs. In this manuscript, we report on our approach using the Graphics Processing Unit (GPU) to accelerate rescaling of single Monte Carlo runs to rapidly calculate diffuse reflectance values for different sets of tissue optical properties. We selected MATLAB to enable non-specialists in C and CUDA-based programming to use the generated open-source code. We developed a software package with four abstraction layers. To calculate a set of diffuse reflectance values from a simulated tissue with homogeneous optical properties, our rescaling GPU-based approach achieves a reduction in computation time of several orders of magnitude as compared to other GPU-based approaches. Specifically, our GPU-based approach generated a diffuse reflectance value in 0.08 ms. The transfer time from CPU to GPU memory currently is a limiting factor with GPU-based calculations. However, for calculation of multiple diffuse reflectance values, our GPU-based approach can still lead to processing that is ~3400 times faster than other GPU-based approaches.
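The rescaling idea rests on Beer-Lambert reweighting: a single baseline run records each detected photon's path length, and the reflectance for a new absorption coefficient is obtained by reweighting those photons rather than re-simulating. The CPU sketch below shows only the principle (the paper's contribution is performing this on the GPU); normalizing by detected rather than launched photons is a simplification introduced here.

```python
import math

def rescale_reflectance(path_lengths, base_weights, mu_a):
    """Rescale one baseline Monte Carlo run to a new absorption coefficient.

    path_lengths: total tissue path length of each detected photon in the
                  baseline (absorption-free) run, in cm
    base_weights: detected weight of each photon in the baseline run
    mu_a:         new absorption coefficient, in 1/cm

    Beer-Lambert reweighting: w' = w * exp(-mu_a * L).  Returns the mean
    rescaled weight (a real code normalizes by photons launched).
    """
    weights = [w * math.exp(-mu_a * L)
               for w, L in zip(base_weights, path_lengths)]
    return sum(weights) / len(weights)
```

Because the per-photon reweighting is independent, it maps naturally onto GPU threads, which is what makes the reported sub-millisecond evaluation plausible.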
Visualization and recommendation of large image collections toward effective sensemaking
NASA Astrophysics Data System (ADS)
Gu, Yi; Wang, Chaoli; Nemiroff, Robert; Kao, David; Parra, Denis
2016-03-01
In our daily lives, images are among the most commonly found data which we need to handle. We present iGraph, a graph-based approach for visual analytics of large image collections and their associated text information. Given such a collection, we compute the similarity between images, the distance between texts, and the connection between image and text to construct iGraph, a compound graph representation which encodes the underlying relationships among these images and texts. To enable effective visual navigation and comprehension of iGraph with tens of thousands of nodes and hundreds of millions of edges, we present a progressive solution that offers collection overview, node comparison, and visual recommendation. Our solution not only allows users to explore the entire collection with representative images and keywords but also supports detailed comparison for understanding and intuitive guidance for navigation. The visual exploration of iGraph is further enhanced with the implementation of bubble sets to highlight group memberships of nodes, suggestion of abnormal keywords or time periods based on text outlier detection, and comparison of four different recommendation solutions. For performance speedup, multiple graphics processing units and central processing units are utilized for processing and visualization in parallel. We experiment with two image collections and leverage a cluster driving a display wall of nearly 50 million pixels. We show the effectiveness of our approach by demonstrating experimental results and conducting a user study.
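The graph-construction step, computing similarities between nodes and keeping sufficiently strong ones as edges, can be sketched with plain cosine similarity. The feature vectors, the 0.8 threshold, and the function names are illustrative assumptions; iGraph's actual image-image, text-text, and image-text measures differ.

```python
import math

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def build_similarity_edges(features, threshold=0.8):
    """Connect node pairs (images or keywords) whose feature vectors
    exceed a cosine-similarity threshold -- a simplified stand-in for
    the compound-graph edges encoded in iGraph."""
    edges = []
    for i in range(len(features)):
        for j in range(i + 1, len(features)):
            s = cosine(features[i], features[j])
            if s >= threshold:
                edges.append((i, j, s))
    return edges
```

At the scale quoted in the abstract (tens of thousands of nodes, hundreds of millions of edges) this all-pairs loop is exactly the part that is parallelized across GPUs and CPUs.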
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lebarbier Dagel, Vanessa M.; Li, J.; Taylor, Charles E.
This collaborative joint research project is in the area of advanced gasification and conversion, within the Chinese Academy of Sciences (CAS)-National Energy Technology Laboratory (NETL)-Pacific Northwest National Laboratory (PNNL) Memorandum of Understanding. The goal for this subtask is the development of advanced syngas conversion technologies. Two areas of investigation were evaluated. Sorption-Enhanced Synthetic Natural Gas Production from Syngas: The conversion of synthetic gas (syngas) to synthetic natural gas (SNG) is typically catalyzed by nickel catalysts at moderate temperatures (275 to 325 °C). The reaction is highly exothermic and substantial heat is liberated, which can lead to process thermal imbalance and destruction of the catalyst. As a result, conversion per pass is typically limited, and substantial syngas recycle is employed. Commercial methanation catalysts and processes have been developed by Haldor Topsoe, and in some reports, they have indicated that there is a need and opportunity for thermally more robust methanation catalysts to allow for higher per-pass conversion in methanation units. The SNG process requires a syngas feed with a higher H2/CO ratio than typically produced from gasification processes. Therefore, the water-gas shift reaction (WGS) will be required to tailor the H2/CO ratio. Integration with CO2 separation could potentially eliminate the need for a separate WGS unit, thereby integrating WGS, methanation, and CO2 capture into one single unit operation and, consequently, leading to improved process efficiency. The SNG process also has the benefit of producing a product stream with high CO2 concentrations, which makes CO2 separation more readily achievable. The use of either adsorbents or membranes that selectively separate the CO2 from the H2 and CO would shift the methanation reaction (by driving WGS for hydrogen production) and greatly improve the overall efficiency and economics of the process.
The scope of this activity was to develop methods and enabling materials for syngas conversion to SNG with ready CO2 separation. Suitable methanation catalyst and CO2 sorbent materials were developed. Successful proof-of-concept for the combined reaction-sorption process was demonstrated, which culminated in a research publication. With successful demonstration, a decision was made to switch focus to an area of fuels research of more interest to all three research institutions (CAS-NETL-PNNL). Syngas-to-Hydrocarbon Fuels through Higher Alcohol Intermediates: There are two types of processes in syngas conversion to fuels that are attracting R&D interest: 1) syngas conversion to mixed alcohols; and 2) syngas conversion to gasoline via the methanol-to-gasoline process developed by Exxon-Mobil in the 1970s. The focus of this task was to develop a one-step conversion technology by effectively incorporating both processes, which is expected to reduce the capital and operational cost associated with the conversion of coal-derived syngas to liquid fuels. It should be noted that this work did not further study the classic Fischer-Tropsch reaction pathway. Rather, we focused on studies of unique catalyst pathways that involve direct liquid fuel synthesis enabled by oxygenated intermediates. Recent advances in higher alcohol synthesis, including the novel catalytic composite materials recently developed by CAS using base metal catalysts, were used.
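The integration argument can be made explicit with the standard stoichiometry (not quoted from the report): in-situ CO2 removal by the sorbent pulls both equilibria to the right, combining water-gas shift and methanation in a single unit.

```latex
\begin{align*}
\text{methanation:}\quad & \mathrm{CO} + 3\,\mathrm{H_2} \;\rightarrow\; \mathrm{CH_4} + \mathrm{H_2O} \\
\text{water-gas shift:}\quad & \mathrm{CO} + \mathrm{H_2O} \;\rightarrow\; \mathrm{CO_2} + \mathrm{H_2} \\
\text{net (sum):}\quad & 2\,\mathrm{CO} + 2\,\mathrm{H_2} \;\rightarrow\; \mathrm{CH_4} + \mathrm{CO_2}
\end{align*}
```

The net reaction shows why a CO-rich (low H2/CO) gasifier syngas can be methanated directly once the co-produced CO2 is captured in situ.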
Analysis and Development of a Web-Enabled Planning and Scheduling Database Application
2013-09-01
establishes an entity-relationship diagram for the desired process, constructs an operable database using MySQL, and provides a web-enabled interface for the population of... Keywords: development, design, process, reengineering, MySQL, structured query language, SQL, myPHPadmin.
Method and system for enabling real-time speckle processing using hardware platforms
NASA Technical Reports Server (NTRS)
Ortiz, Fernando E. (Inventor); Kelmelis, Eric (Inventor); Durbano, James P. (Inventor); Curt, Peterson F. (Inventor)
2012-01-01
An accelerator for the speckle atmospheric compensation algorithm may enable real-time speckle processing of video feeds, allowing the speckle algorithm to be applied in numerous real-time applications. The accelerator may be implemented in various forms, including hardware, software, and/or machine-readable media.
Bio-Fuel Production Assisted with High Temperature Steam Electrolysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grant Hawkes; James O'Brien; Michael McKellar
2012-06-01
Two hybrid energy processes that enable production of synthetic liquid fuels that are compatible with the existing conventional liquid transportation fuels infrastructure are presented. Using biomass as a renewable carbon source, and supplemental hydrogen from high-temperature steam electrolysis (HTSE), these two hybrid energy processes have the potential to provide a significant alternative petroleum source that could reduce dependence on imported oil. The first process combines a hydropyrolysis unit with hydrogen addition from HTSE. Non-food biomass is pyrolyzed and converted to pyrolysis oil. The pyrolysis oil is upgraded with hydrogen addition from HTSE. This addition of hydrogen deoxygenates the pyrolysis oil and increases the pH to a level tolerable for transportation. The final product is synthetic crude that could then be transported to a refinery and input into the already used transportation fuel infrastructure. The second process is named Bio-Syntrolysis. The Bio-Syntrolysis process combines hydrogen from HTSE with CO from an oxygen-blown biomass gasifier that yields syngas to be used as a feedstock for synthesis of liquid synthetic crude. Conversion of syngas to liquid synthetic crude, using a biomass-based carbon source, expands the application of renewable energy beyond the grid to include transportation fuels. It can also contribute to grid stability associated with non-dispatchable power generation. The use of supplemental hydrogen from HTSE enables greater than 90% utilization of the biomass carbon content, which is about 2.5 times higher than the carbon utilization associated with traditional cellulosic ethanol production. If the electrical power source needed for HTSE is based on nuclear or renewable energy, the process is carbon neutral. INL has demonstrated improved biomass processing prior to gasification. Recyclable biomass in the form of crop residue or energy crops would serve as the feedstock for this process.
A process model of syngas production using high temperature electrolysis and biomass gasification is presented. Process heat from the biomass gasifier is used to heat steam for hydrogen production via the high temperature steam electrolysis process. Oxygen produced from the electrolysis process is used to control the oxidation rate in the oxygen-blown biomass gasifier.
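The carbon-utilization claim can be illustrated with a simplified mass balance: any hydrogen shortfall made up by the water-gas shift sacrifices syngas carbon as CO2, while external HTSE hydrogen avoids that loss. The function and the numbers in the example are illustrative assumptions, not figures from the INL study.

```python
def carbon_utilization(co, h2, target_ratio, external_h2=0.0):
    """Fraction of syngas carbon that reaches the fuel product.

    To reach H2/CO = target_ratio, any hydrogen shortfall not covered by
    external (e.g. HTSE) hydrogen is made up by shifting CO to CO2 via the
    water-gas shift, losing that carbon.  Simplified mass balance: mol units,
    complete conversion, no other losses.
    """
    h2 += external_h2
    shortfall = target_ratio * co - h2
    if shortfall <= 0:
        return 1.0                      # enough hydrogen, no carbon sacrificed
    # Shift x mol of CO so that (h2 + x) / (co - x) = target_ratio:
    shifted = shortfall / (1.0 + target_ratio)
    return (co - shifted) / co
```

With equimolar CO and H2 and a target H2/CO of 2, a third of the carbon is lost to the shift; supplying the missing hydrogen externally restores utilization to 100%, which is the qualitative point behind the >90% figure.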
Quantum Computation Using Optically Coupled Quantum Dot Arrays
NASA Technical Reports Server (NTRS)
Pradhan, Prabhakar; Anantram, M. P.; Wang, K. L.; Roychowhury, V. P.; Saini, Subhash (Technical Monitor)
1998-01-01
A solid state model for quantum computation has potential advantages in terms of the ease of fabrication, characterization, and integration. The fundamental requirements for a quantum computer involve the realization of basic processing units (qubits), and a scheme for controlled switching and coupling among the qubits, which enables one to perform controlled operations on qubits. We propose a model for quantum computation based on optically coupled quantum dot arrays, which is computationally similar to the atomic model proposed by Cirac and Zoller. In this model, individual qubits are comprised of two coupled quantum dots, and an array of these basic units is placed in an optical cavity. Switching among the states of the individual units is done by controlled laser pulses via near field interaction using the NSOM technology. Controlled rotations involving two or more qubits are performed via common cavity mode photon. We have calculated critical times, including the spontaneous emission and switching times, and show that they are comparable to the best times projected for other proposed models of quantum computation. We have also shown the feasibility of accessing individual quantum dots using the NSOM technology by calculating the photon density at the tip, and estimating the power necessary to perform the basic controlled operations. We are currently in the process of estimating the decoherence times for this system; however, we have formulated initial arguments which seem to indicate that the decoherence times will be comparable, if not longer, than many other proposed models.
Jasso, Guillermina
2011-01-01
Migration and stratification are increasingly intertwined. One day soon it will be impossible to understand one without the other. Both focus on life chances. Stratification is about differential life chances - who gets what and why - and migration is about improving life chances - getting more of the good things of life. To examine the interconnections of migration and stratification, we address a mix of old and new questions, carrying out analyses newly enabled by a unique new data set on recent legal immigrants to the United States (the New Immigrant Survey). We look at immigrant processing and lost documents, depression due to the visa process, presentation of self, the race-ethnic composition of an immigrant cohort (made possible by the data for the first time since 1961), black immigration from Africa and the Americas, skin-color diversity among couples formed by U.S. citizen sponsors and immigrant spouses, and English fluency among children age 8–12 and their immigrant parents. We find, inter alia, that children of previously illegal parents are especially more likely to be fluent in English, that native-born U.S. citizen women tend to marry darker, that immigrant applicants who go through the visa process while already in the United States are more likely to have their documents lost and to suffer visa depression, and that immigration, by introducing accomplished black immigrants from Africa (notably via the visa lottery), threatens to overturn racial and skin color associations with skill. Our analyses show the mutual embeddedness of migration and stratification in the unfolding of the immigrants' and their children's life chances and the impacts on the stratification structure of the United States. PMID:26321771
NASA Astrophysics Data System (ADS)
Zhan, Yan; Hou, Guiting; Kusky, Timothy; Gregg, Patricia M.
2016-03-01
The New Madrid Seismic Zone (NMSZ) in the Midwestern United States was the site of several major M 6.8-8 earthquakes in 1811-1812, and remains seismically active. Although this region has been investigated extensively, the ultimate controls on earthquake initiation and the duration of the seismicity remain unclear. In this study, we develop a finite element model for the Central United States to conduct a series of numerical experiments with the goal of determining the impact of heterogeneity in the upper crust, the lower crust, and the mantle on earthquake nucleation and rupture processes. Regional seismic tomography data (CITE) are utilized to infer the viscosity structure of the lithosphere which provide an important input to the numerical models. Results indicate that when differential stresses build in the Central United States, the stresses accumulating beneath the Reelfoot Rift in the NMSZ are highly concentrated, whereas the stresses below the geologically similar Midcontinent Rift System are comparatively low. The numerical observations coincide with the observed distribution of seismicity throughout the region. By comparing the numerical results with three reference models, we argue that an extensive mantle low velocity zone beneath the NMSZ produces differential stress localization in the layers above. Furthermore, the relatively strong crust in this region, exhibited by high seismic velocities, enables the elevated stress to extend to the base of the ancient rift system, reactivating fossil rifting faults and therefore triggering earthquakes. These results show that, if boundary displacements are significant, the NMSZ is able to localize tectonic stresses, which may be released when faults close to failure are triggered by external processes such as melting of the Laurentide ice sheet or rapid river incision.
Bunch, K J; Allin, B; Jolly, M; Hardie, T; Knight, M
2018-05-16
To develop a core metric set to monitor the quality of maternity care. Delphi process followed by a face-to-face consensus meeting. English maternity units. Three representative expert panels: service designers, providers and users. Maternity care metrics judged important by participants. Participants were asked to complete a two-phase Delphi process, scoring metrics from existing local maternity dashboards. A consensus meeting discussed the results and re-scored the metrics. In all, 125 distinct metrics across six domains were identified from existing dashboards. Following the consensus meeting, 14 metrics met the inclusion criteria for the final core set: smoking rate at booking; rate of birth without intervention; caesarean section delivery rate in Robson group 1 women; caesarean section delivery rate in Robson group 2 women; caesarean section delivery rate in Robson group 5 women; third- and fourth-degree tear rate among women delivering vaginally; rate of postpartum haemorrhage of ≥1500 ml; rate of successful vaginal birth after a single previous caesarean section; smoking rate at delivery; proportion of babies born at term with an Apgar score <7 at 5 minutes; proportion of babies born at term admitted to the neonatal intensive care unit; proportion of babies readmitted to hospital at <30 days of age; breastfeeding initiation rate; and breastfeeding rate at 6-8 weeks. Core outcome set methodology can be used to incorporate the views of key stakeholders in developing a core metric set to monitor the quality of care in maternity units, thus enabling improvement. Achieving consensus on core metrics for monitoring the quality of maternity care. © 2018 The Authors. BJOG: An International Journal of Obstetrics and Gynaecology published by John Wiley & Sons Ltd on behalf of Royal College of Obstetricians and Gynaecologists.
Wilson and the United States Entry into the Great War.
ERIC Educational Resources Information Center
Stark, Matthew J.
2002-01-01
Presents a lesson plan that enables students to learn how to analyze primary sources, while they also learn why the United States entered into World War I. States that this lesson can be used as an introduction to World War I. Includes handouts that feature primary materials. (CMK)
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-02
... communications (email, eforms) and position services. This configuration is enabled through the Iridium Short... relevant features of the enhanced mobile transmitting unit (E-MTU) VMS and communications service providers... communications service providers (including specifications), please contact the VMS Support Center at phone (888...
37 CFR 1.72 - Title and abstract.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 37 Patents, Trademarks, and Copyrights 1 2011-07-01 2011-07-01 false Title and abstract. 1.72 Section 1.72 Patents, Trademarks, and Copyrights UNITED STATES PATENT AND TRADEMARK OFFICE, DEPARTMENT OF... enable the United States Patent and Trademark Office and the public generally to determine quickly from a...
Rehabilitation Technology for the Blind in the United States.
ERIC Educational Resources Information Center
Jacobson, William H.
Research in the United States and abroad has led to advances in rehabilitation technology that enables blind and visually impaired persons to compete with sighted persons for employment. Relatively inexpensive devices such as pocket calculators, transistor radios, cassette recorders, and digital watches have become aids for the blind; some…
French I Supplementary Reader (For A-LM One, 1961, Units 9-14).
ERIC Educational Resources Information Center
Scott, Linda; Booth, Alice
Supplementary readings intended for use with the 1961 edition of the "A-LM" French 1 course are compiled in this text. They are specifically designed to accompany Units 9-14. It is suggested that the recombination narratives enable students to become more capable of independent reading. (RL)
Robertson, Dale M.
1998-01-01
The variability in water quality throughout the WMIC Study Unit during base-flow conditions could be described very well by subdividing the area into Relatively Homogeneous Units and sampling a few streams with drainage basins completely within these homogeneous units. This subdivision and sampling scheme enabled the differences in water quality to be directly related to the differences in the environmental characteristics that exist throughout the Study Unit.
A Cooling System for Impermeable Clothing
Gleeson, J. P.; Pisani, J. F.
1967-01-01
A self-contained conditioning unit for use with impermeable protective clothing is described. The pack-mounted unit weighing 10 lb. (4.5 kg) will enable a wearer to work for approximately one hour at temperatures in the zone of evaporative regulation. At 40.6 °C (105 °F), the temperature at which the unit was tested, the heat load imposed by the complete assembly of suit, conditioning unit, and ducting is only slightly higher than that imposed by the wearing of shorts. PMID:6028716
Cepeda-Carrión, Gabriel; Cegarra-Navarro, Juan Gabriel; Martínez-Caro, Eva; Eldridge, Stephen
2011-10-01
With the passing of time, knowledge like other resources can become obsolete. Thus, people in a healthcare system need to update their knowledge in order to keep pace with the ongoing changes in their operational environment. Information technology continually provides a great amount of new knowledge which can lead to healthcare professionals becoming overloaded with knowledge. This overloading can be alleviated by a process of unlearning which enables the professional to retain just the relevant and critical knowledge required to improve the quality of service provided by them. This paper shows some of the tools and methods that Hospital-in-the-Home Units (HHUs) have used to update the physician-patient knowledge and the technology knowledge of the HHUs' personnel. A survey study was carried out in the HHU in Spanish health system in 2010. Fifty-five doctors and 62 nurses belonging to 44 HHUs. None. Three hypotheses are presented and supported, which suggest that technology and physician-patient knowledge is related to the unlearning context and the unlearning context impacts positively on the quality of health services provided. The key benefits of the unlearning context for the quality of service provided in HHUs are clear: it enables them to identify and replace poor practices and also avoids the reinvention of the wheel (e.g.: by minimizing unnecessary work caused by the use of poor methods) and it reduces costs through better productivity and efficiency (improving services to patients).
Multi-gigabit optical interconnects for next-generation on-board digital equipment
NASA Astrophysics Data System (ADS)
Venet, Norbert; Favaro, Henri; Sotom, Michel; Maignan, Michel; Berthon, Jacques
2017-11-01
Parallel optical interconnects are experimentally assessed as a technology that may offer the high-throughput data communication capabilities required by next-generation on-board digital processing units. An optical backplane interconnect was breadboarded, on the basis of a digital transparent processor that provides flexible connectivity and variable bandwidth in telecom missions with multi-beam antenna coverage. The unit selected for the demonstration required that more than tens of Gbit/s be supported by the backplane. The demonstration made use of commercial parallel optical link modules at 850 nm wavelength, with 12 channels running at up to 2.5 Gbit/s. A flexible optical fibre circuit was developed so as to route board-to-board connections. It was plugged into the optical transmitter and receiver modules through 12-fibre MPO connectors. BER below 10^-14 and optical link budgets in excess of 12 dB were measured, which would make it possible to integrate broadcasting. Integration of the optical backplane interconnect was successfully demonstrated by validating the overall digital processor functionality.
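The quoted figures fix the backplane's raw capacity directly: 12 channels at 2.5 Gbit/s give 30 Gbit/s per module, consistent with the "more than tens of Gbit/s" requirement. A trivial arithmetic sketch; the loss values in the margin example are hypothetical, not measured figures from the breadboard.

```python
def backplane_capacity(channels=12, rate_gbps=2.5, modules=1):
    """Aggregate raw throughput of a parallel optical interconnect, Gbit/s."""
    return channels * rate_gbps * modules

def link_margin(budget_db, losses_db):
    """Optical power margin left after summing connector/fibre losses, dB."""
    return budget_db - sum(losses_db)
```

A positive margin after connector and routing losses is what leaves headroom for extra splits, e.g. the broadcasting the authors mention.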
Lobato, L C S; Chernicharo, C A L; Pujatti, F J P; Martins, O M; Melo, G C B; Recio, A A R
2013-01-01
A small unit of cogeneration of energy and heat was tested at the Centre for Research and Training on Sanitation UFMG/COPASA - CePTS, located at the Arrudas Sewage Treatment Plant, in Belo Horizonte, Minas Gerais, Brazil. The unit consisted of an engine power generator adapted to run on biogas, a thermal dryer prototype and other peripherals (compressor, biogas storage tank, air blower, etc.). The heat from engine power generator exhaust gases was directed towards the thermal dryer prototype to dry the sludge and disinfect it. The results showed that the experimental apparatus is self-sufficient in electricity, even producing a surplus, available for other uses. The tests of drying and disinfection of sludge lasted 7 h, leading to an increase in solids content from 4 to 8% (50% reduction in sludge volume). Although the drying of sludge was not possible (only thickening was achieved), the disinfection process proved very effective, enabling the complete inactivation of helminth eggs.
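The reported thickening result follows from a solids mass balance: if the solids mass is conserved, doubling the total-solids content from 4% to 8% halves the sludge volume. A sketch under those assumptions (constant density, no solids loss); the function names are introduced here for illustration.

```python
def dried_volume(initial_volume, ts_in, ts_out):
    """Sludge volume after drying/thickening, assuming the solids mass is
    conserved and density stays roughly constant:
        V_out = V_in * TS_in / TS_out
    TS values are fractions (0.04 = 4% total solids)."""
    return initial_volume * ts_in / ts_out

def volume_reduction(ts_in, ts_out):
    """Fractional volume reduction for a given rise in solids content."""
    return 1.0 - ts_in / ts_out
```

Going from 4% to 8% solids gives a 50% volume reduction, matching the figure in the abstract.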
Lu, Ying-Hao; Lee, Li-Yao; Chen, Ying-Lan; Cheng, Hsing-I; Tsai, Wen-Tsung; Kuo, Chen-Chun; Chen, Chung-Yu; Huang, Yaw-Bin
2017-01-01
We selected iOS as the App operating system, Objective-C as the programming language, and Oracle as the database to develop an App for inspecting controlled substances in patient care units. Using a web-enabled smartphone, pharmacist inspection can be performed on site and the inspection results can be recorded directly into the HIS through the Internet, so human error in data transcription is minimized and work efficiency and data processing are improved. This system is not only faster and more convenient than the conventional paperwork, but also provides data security and accuracy. In addition, several features increase inspection quality: (1) verification of drug appearance, (2) a foolproof mechanism to avoid input errors or omissions, (3) automatic data conversion without human judgment, (4) an online alarm for expiry dates, and (5) instant inspection results showing unmet items. This study has successfully turned paper-based medication inspection into inspection using a web-based mobile device.
Fully Roll-to-Roll Gravure Printable Wireless (13.56 MHz) Sensor-Signage Tags for Smart Packaging
NASA Astrophysics Data System (ADS)
Kang, Hwiwon; Park, Hyejin; Park, Yongsu; Jung, Minhoon; Kim, Byung Chul; Wallace, Gordon; Cho, Gyoujin
2014-06-01
Integration of sensing capabilities with interactive signage through wireless communication is enabling the development of smart packaging, wherein wireless (13.56 MHz) power transmission is used to interlock the smart packaging with a wireless (13.56 MHz) reader or a smartphone. Assembly of the necessary componentry for smart packaging on plastic or paper foils is limited by the manufacturing costs of Si-based technologies. Here, the issue of manufacturing cost for smart packaging has been obviated by materials that allow R2R (roll-to-roll) gravure to be employed in combination with R2R coating processes. R2R gravure was used to print the wireless power transmission device, called a rectenna (antenna, diode and capacitor), and a humidity sensor on poly(ethylene terephthalate) (PET) films, while electrochromic signage units were fabricated by R2R coating. The signage units were laminated with the R2R gravure-printed rectenna and sensor to complete the prototype smart packaging.
Green Application for Space Power
NASA Technical Reports Server (NTRS)
Robinson, Joel
2015-01-01
Most space vehicle auxiliary power units (APUs) use hydrazine propellant for generating power. Hydrazine is a toxic, hazardous fuel that requires special safety equipment and processes for handling and loading. In recent years, two less-toxic "green" propellants have been developed that could be used in APUs. The Swedish government, in concert with the Swedish Space Corporation, has developed a propellant based on ammonium dinitramide (LMP-103S) that was flown on the Prisma spacecraft in 2010. The United States Air Force (USAF) has been developing a propellant based on hydroxylammonium nitrate (AFM315E) that is scheduled to fly on the Green Propellant Infusion Mission in the spring of 2016 to demonstrate apogee and reaction control thrusters. However, no one else in the Agency is currently pursuing green propellants for application to APUs. Per the TA-01 Launch Propulsion Roadmap, the Space Technology Mission Directorate had identified the need for a green-propellant APU by 2015. This is our motivation for continuing these activities.
Zineh, Issam; Pacanowski, Michael A
2011-08-01
Pharmacogenomics is the study of how genetic variations influence responses to drugs, diagnostics, or biologic agents. The field of pharmacogenomics has significant potential to enhance drug development and aid in making regulatory decisions. The United States Food and Drug Administration (FDA) has supported pharmacogenomics for nearly a decade by providing regulatory advice and reviewing applications, with the intent of discovering and applying genetic determinants of treatment effects. The FDA will continue to develop policies and processes centered on genomics and individualized therapeutics to guide rational drug development. It will also continue to inform the public of clinically relevant pharmacogenomic issues through various mechanisms of communication, such as drug labeling. In this review, we provide a perspective on several pharmacogenomic activities at the FDA. In addition, we attempt to clarify what we believe are several misperceptions regarding the FDA's pharmacogenomic initiatives. We hope this perspective provides a window into some ways in which the FDA is enabling individualized therapeutics through its mission-critical activities.
NASA Astrophysics Data System (ADS)
Lang, Norbert; Hempel, Frank; Strämke, Siegfried; Röpcke, Jürgen
2011-08-01
In situ measurements are reported giving insight into the plasma-chemical conversion of the precursor BCl3 in industrial applications of boriding plasmas. For online monitoring of its ground-state concentration, quantum cascade laser absorption spectroscopy (QCLAS) in the mid-infrared spectral range was applied in a plasma-assisted chemical vapor deposition (PACVD) reactor. A compact quantum cascade laser measurement and control system (Q-MACS) was developed to allow a flexible and completely dust-sealed optical coupling to the reactor chamber of an industrial plasma surface modification system. The process under study was a pulsed DC plasma with periodically injected BCl3 at 200 Pa. Synchronization of the Q-MACS with the process control unit enabled insight into individual process cycles with a sensitivity of 10^-6 cm^-1·Hz^-1/2. Different fragmentation rates of the precursor were found during an individual process cycle. The detected BCl3 concentrations were on the order of 10^14 molecules·cm^-3. The reported results of in situ monitoring with QCLAS demonstrate the potential for effective optimization procedures in industrial PACVD processes.
Weather satellite picture receiving stations, APT digital scan converter
NASA Technical Reports Server (NTRS)
Vermillion, C. H.; Kamowski, J. C.
1975-01-01
The automatic picture transmission digital scan converter is used at ground stations to convert signals received from scanning radiometers to data compatible with ground equipment designed to receive signals from vidicons aboard operational meteorological satellites. Information necessary to understand the circuit theory, functional operation, general construction and calibration of the converter is provided. Brief and detailed descriptions of each of the individual circuits are included, accompanied by a schematic diagram contained at the end of each circuit description. Listings of integral parts and testing equipment required as well as an overall wiring diagram are included. This unit will enable the user to readily accept and process weather photographs from the operational meteorological satellites.
Solvent properties of hydrazine in the preparation of metal chalcogenide bulk materials and films.
Yuan, Min; Mitzi, David B
2009-08-21
A combination of unique solvent properties of hydrazine enables the direct dissolution of a range of metal chalcogenides at ambient temperature, rendering this an extraordinarily simple and soft synthetic approach to prepare new metal chalcogenide-based materials. The extended metal chalcogenide parent framework is broken up during this process, and the resulting metal chalcogenide building units are re-organized into network structures (from 0D to 3D) based upon their interactions with the hydrazine/hydrazinium moieties. This Perspective will review recent crystal and materials chemistry developments within this family of compounds and will briefly discuss the utility of this approach in metal chalcogenide thin-film deposition.
Stussman, Barbara J; Bethell, Christina D; Gray, Caroline; Nahin, Richard L
2013-11-23
The 2002, 2007, and 2012 complementary medicine questionnaires fielded on the National Health Interview Survey provide the most comprehensive data on complementary medicine available for the United States. They filled the void for large-scale, nationally representative, publicly available datasets on the out-of-pocket costs, prevalence, and reasons for use of complementary medicine in the U.S. Despite their wide use, this is the first article describing the multi-faceted and largely qualitative processes undertaken to develop the surveys. We hope this in-depth description enables policy makers and researchers to better judge the content validity and utility of the questionnaires and their resultant publications.
Programmed release triggered by osmotic gradients in multicomponent vesicles
NASA Astrophysics Data System (ADS)
Dong, Ruo-Yu; Jang, Hyun-Sook; Granick, Steve
Polymersomes, a good candidate for encapsulation and delivery of active ingredients, can be constructed with interconnected multiple compartments. These so-called multisomes on the one hand enable the spatial separation of incompatible contents or processes, and on the other hand provide an efficient route for inter-compartment communication via the semipermeable interface membrane. Here we show that by establishing osmotic imbalances between different compartments, synergistic morphology changes of the multisomes can be observed. By further adjusting the osmotic gradients and the arrangement of compartments, we can realize a cascade rupture of these individual units, which may be a new step towards controlled mixing and timed sequences of chemical reactions.
NASA Astrophysics Data System (ADS)
Levit, Creon; Gazis, P.
2006-06-01
The graphics processing units (GPUs) built into all professional desktop and laptop computers currently on the market are capable of transforming, filtering, and rendering hundreds of millions of points per second. We present a prototype open-source cross-platform (Windows, Linux, Apple OS X) application which leverages some of the power latent in the GPU to enable smooth interactive exploration and analysis of large high-dimensional data using a variety of classical and recent techniques. The targeted application area is the interactive analysis of complex, multivariate space science and astrophysics data sets, with dimensionalities that may surpass 100 and sample sizes that may exceed 10^6-10^8.
Maretzki, Audrey N
2007-01-01
With funding provided by the Center for Higher Education of the United States Agency for International Development, The Pennsylvania State University and Tuskegee University collaborated with the University of Nairobi in establishing women's NutriBusiness Cooperatives in the Rift Valley and Central Provinces of Kenya. Between 1992 and 1999, the cooperatives were established, facilities and equipment were supplied and extensive participatory training was provided by university-affiliated investigators and project staff. This initiative enabled approximately 2500 rural Kenyan women farmers to add value to their crops by processing and locally marketing nutritious, convenient, culturally-appropriate weaning food mixes. Implementation of the NutriBusiness model is described and challenges of cultural engagement are highlighted.
The gputools package enables GPU computing in R.
Buckner, Joshua; Wilson, Justin; Seligman, Mark; Athey, Brian; Watson, Stanley; Meng, Fan
2010-01-01
By default, the R statistical environment does not make use of parallelism. Researchers may resort to expensive solutions such as cluster hardware for large analysis tasks. Graphics processing units (GPUs) provide an inexpensive and computationally powerful alternative. Using R and the CUDA toolkit from Nvidia, we have implemented several functions commonly used in microarray gene expression analysis for GPU-equipped computers. R users can take advantage of the better performance provided by an Nvidia GPU. The package is available from CRAN, the R project's repository of packages, at http://cran.r-project.org/web/packages/gputools More information about our gputools R package is available at http://brainarray.mbni.med.umich.edu/brainarray/Rgpgpu
Software for Preprocessing Data From Rocket-Engine Tests
NASA Technical Reports Server (NTRS)
Cheng, Chiu-Fu
2002-01-01
Three computer programs have been written to preprocess digitized sensor outputs from rocket-engine tests at Stennis Space Center (SSC). The programs apply exclusively to the SSC "E" test-stand complex and utilize the SSC file format. The programs are the following: 1) Engineering Units Generator (EUGEN) converts sensor-output-measurement data to engineering units. The inputs to EUGEN are raw binary test-data files, which include the voltage data, a list identifying the data channels, and time codes. EUGEN effects the conversion by use of a file that contains calibration coefficients for each channel; 2) QUICKLOOK enables immediate viewing of a few selected channels of data, in contradistinction to viewing only after post-test processing (which can take 30 minutes to several hours, depending on the number of channels and other test parameters) of data from all channels. QUICKLOOK converts the selected data into a form in which they can be plotted in engineering units by use of Winplot (a free graphing program written by Rick Paris); and 3) EUPLOT provides a quick means of looking at data files generated by EUGEN without relying on the PV-WAVE-based plotting software.
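The per-channel conversion step that EUGEN performs can be sketched as follows. This is an illustrative reconstruction, not the actual SSC software: the channel names, the polynomial form of the calibration, and the coefficient values are all assumptions.

```python
# Hypothetical sketch of converting raw sensor voltages to engineering units
# using per-channel calibration coefficients (a quadratic polynomial here).
# Channel names and coefficients are illustrative, not from the SSC file format.

def to_engineering_units(voltages, coeffs):
    """Apply the calibration c0 + c1*v + c2*v^2 to each voltage sample."""
    c0, c1, c2 = coeffs
    return [c0 + c1 * v + c2 * v * v for v in voltages]

# One (c0, c1, c2) triple per data channel, as a calibration file might supply
calibration = {
    "chamber_pressure": (0.0, 500.0, 0.0),   # volts -> psi (illustrative)
    "fuel_temp": (-50.0, 25.0, 0.1),         # volts -> deg F (illustrative)
}

raw = {"chamber_pressure": [0.0, 1.2, 2.4], "fuel_temp": [1.0, 2.0]}
engineering = {ch: to_engineering_units(v, calibration[ch])
               for ch, v in raw.items()}
```

The real program also handles the binary file layout and time codes, which are omitted here.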
Coupled Gravity and Elevation Measurements of Ice Sheet Mass Change
NASA Technical Reports Server (NTRS)
Jezek, K. C.
2005-01-01
We measured surface gravity and position at ten locations in two glaciological measurement networks on the south-central Greenland Ice Sheet during June 2004. Six of the individual sites of the first network had been occupied the previous year. At the repeat sites we were able to measure annual accumulation rate and surface displacement by referencing measurements to aluminum poles left in the firn the previous year. We occupied 4 additional sites at a second measurement network for the first time since initial observations were made there in 1981. At each individual site, we operated a GPS unit for 90 minutes; the unit was operated simultaneously with a base station unit in Sondrestrom Fjord so as to enable differential post-processing of the data. We installed an aluminum accumulation-rate pole at each site. The base section of the pole also served as the mount for the GPS antenna. A new Scintrex gravimeter was used at each site and relative gravity measurements were tied to the network of absolute gravity stations in Sondrestrom. We measured snow physical properties in two shallow pits. This report summarizes our observations and data analysis.
Maharlou, Hamidreza; Niakan Kalhori, Sharareh R; Shahbazi, Shahrbanoo; Ravangard, Ramin
2018-04-01
Accurate prediction of patients' length of stay is highly important. This study compared the performance of artificial neural network and adaptive neuro-fuzzy system algorithms in predicting patients' length of stay in intensive care units (ICUs) after cardiac surgery. A cross-sectional, analytical, applied study was conducted. The required data were collected from 311 cardiac patients admitted to intensive care units after surgery at three hospitals in Shiraz, Iran, through a non-random convenience sampling method during the second quarter of 2016. Following the initial processing of influential factors, models were created and evaluated. The results showed that the adaptive neuro-fuzzy algorithm (with mean squared error [MSE] = 7 and R = 0.88) resulted in a more precise model than the artificial neural network (with MSE = 21 and R = 0.60). The adaptive neuro-fuzzy algorithm produces a more accurate model because, as a hybrid algorithm, it applies both the capabilities of a neural network architecture and experts' knowledge. It identifies nonlinear components, yielding remarkable results for predicting the length of stay, a useful output to support ICU management, enabling higher-quality administration and cost reduction.
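The two evaluation metrics quoted above, MSE and the correlation coefficient R, can be computed as follows. The length-of-stay values in the example are illustrative, not the study's data.

```python
import math

def mse(y_true, y_pred):
    """Mean squared error between observed and predicted values."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def pearson_r(x, y):
    """Pearson correlation coefficient R between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative ICU length-of-stay data (days): observed vs. model predictions
los_actual = [3, 5, 4, 7, 6]
los_model = [3.5, 4.5, 4.0, 6.5, 6.5]
print(mse(los_actual, los_model), pearson_r(los_actual, los_model))
```

A lower MSE and an R closer to 1 indicate the more precise model, which is the basis on which the neuro-fuzzy algorithm was judged superior.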
NASA Astrophysics Data System (ADS)
Angiboust, Samuel; Hyppolito, Thais; Glodny, Johannes; Cambeses, Aitor; Monié, Patrick; Garcia-Casco, Antonio; Calderon, Mauricio; Juliani, Caetano
2017-04-01
The Diego de Almagro Island preserves one of the rare remnants of the Mesozoic Chilean paleo-accretionary wedge. This complex, formed by MOR-basalts interleaved with metasedimentary rocks, comprises three major tectonic units with distinct P-T-t paths: the HP granulite (Lazaro unit), the garnet amphibolite (GA) and the blueschist (BS) units. HP granulite-facies metamorphic conditions in the Lazaro Unit are attested by Grt-Cpx-Zo-Prg assemblages associated with trondhjemitic leucosomes (c. 1.3 GPa, 750°C). U-Pb SHRIMP dating of zircon metamorphic rims yields a homogeneous age population of 162 ± 2 Ma for this HT event, in agreement with Sm-Nd dating of peritectic garnet (163 ± 2 Ma and 163 ± 18 Ma). In situ white mica Ar-Ar dating and multi-mineral Rb-Sr dating of LT mylonites (c. 450°C) along the base of the Lazaro Unit reveals partial resetting of HT assemblages during deformation between 115 and 72 Ma. GA unit rocks, structurally below the Lazaro unit, locally preserve eclogite facies parageneses (c. 570°C, 1.7 GPa) that underwent a pervasive stage of amphibolitization during decompression down to 1.3 GPa. U-Pb dating of zircon metamorphic rims and Rb-Sr dating indicate that amphibolitization in GA unit took place at 125-120 Ma. GA unit rocks have been also lately overprinted by another HP-LT assemblage as shown by Si-richer phengite rims and small blue amphibole overgrowths. Conversely, the underlying BS unit does not show strong amphibolite facies overprint as seen in GA and Lazaro units and exhibits slightly cooler peak metamorphic conditions (c. 520°C, 1.7 GPa). Rb-Sr and Ar-Ar dating of these blueschists yield deformation ages between 80 and 70 Ma, i.e. 50 Ma younger than the overlying rocks from the GA unit, and 90 Ma younger than Lazaro unit HP-granulites. This new report sheds light on the formation of the youngest and deepest HP rocks exposed along the Chilean subduction margin. 
The Diego de Almagro Island represents a unique window onto long-term tectonic processes rooted below the base of the accretionary wedge (c. 40-50 km). The exceptionally long residence time of the earlier accreted material (almost 100 Ma) enables the recording of multiple thermal-gradient fluctuations and highlights the variability of the subduction interface thermal structure over tens of millions of years.
Software to Facilitate Remote Sensing Data Access for Disease Early Warning Systems
Liu, Yi; Hu, Jiameng; Snell-Feikema, Isaiah; VanBemmel, Michael S.; Lamsal, Aashis; Wimberly, Michael C.
2015-01-01
Satellite remote sensing produces an abundance of environmental data that can be used in the study of human health. To support the development of early warning systems for mosquito-borne diseases, we developed an open-source, client-based software application to enable the Epidemiological Applications of Spatial Technologies (EASTWeb). Two major design decisions were full automation of the discovery, retrieval, and processing of remote sensing data from multiple sources, and making the system easily modifiable in response to changes in data availability and user needs. Key innovations that helped to achieve these goals were the implementation of a software framework for data downloading and the design of a scheduler that tracks the complex dependencies among multiple data processing tasks and makes the system resilient to external errors. EASTWeb has been successfully applied to support forecasting of West Nile virus outbreaks in the United States and malaria epidemics in the Ethiopian highlands. PMID:26644779
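One plausible way to implement the dependency-tracking scheduler described above is topological ordering (Kahn's algorithm), so each processing task runs only after its prerequisites. This is a generic sketch, not EASTWeb's actual design; the task names are hypothetical.

```python
from collections import deque

def run_in_dependency_order(tasks, deps):
    """Run tasks so every task follows its prerequisites (Kahn's algorithm).
    tasks: {name: callable}; deps: {name: [prerequisite names]}."""
    indegree = {t: len(deps.get(t, [])) for t in tasks}
    dependents = {t: [] for t in tasks}
    for t, reqs in deps.items():
        for r in reqs:
            dependents[r].append(t)
    ready = deque(t for t, d in indegree.items() if d == 0)
    order = []
    while ready:
        t = ready.popleft()
        tasks[t]()          # execute the task itself
        order.append(t)
        for u in dependents[t]:
            indegree[u] -= 1
            if indegree[u] == 0:
                ready.append(u)
    if len(order) != len(tasks):
        raise RuntimeError("cyclic dependency among tasks")
    return order

# Hypothetical remote-sensing pipeline: download, then reproject, then summarize
log = []
order = run_in_dependency_order(
    {"download": lambda: log.append("download"),
     "reproject": lambda: log.append("reproject"),
     "summarize": lambda: log.append("summarize")},
    {"reproject": ["download"], "summarize": ["reproject"]},
)
```

A production scheduler would additionally retry failed tasks, which is how resilience to external errors (e.g., a data server outage) can be achieved.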
Describing the clinical reasoning process: application of a model of enablement to a pediatric case.
Furze, Jennifer; Nelson, Kelly; O'Hare, Megan; Ortner, Amanda; Threlkeld, A Joseph; Jensen, Gail M
2013-04-01
Clinical reasoning is a core tenet of physical therapy practice leading to optimal patient care. The purpose of this case was to describe the outcomes, subjective experience, and reflective clinical reasoning process for a child with cerebral palsy using the International Classification of Functioning, Disability, and Health (ICF) model. The ICF framework was applied to a 9-year-old boy with spastic triplegic cerebral palsy to capture the interwoven factors present in this case. Interventions in the pool occurred twice weekly for 1 h over a 10-week period. Immediately post-intervention and at 4 months post-intervention, the child had made functional and meaningful gains. The family unit also developed an enjoyment of exercising together. Each individual family member described psychological, emotional, or physical health improvements. Reflection using the ICF model as a framework for discussing clinical reasoning can highlight important factors contributing to effective patient management.
Aerodynamic optimization of supersonic compressor cascade using differential evolution on GPU
NASA Astrophysics Data System (ADS)
Aissa, Mohamed Hasanine; Verstraete, Tom; Vuik, Cornelis
2016-06-01
Differential Evolution (DE) is a powerful stochastic optimization method. Compared to gradient-based algorithms, DE is able to avoid local minima but requires more function evaluations. In turbomachinery applications, function evaluations are performed with time-consuming CFD simulations, which results in a long, unaffordable design cycle. Modern high-performance computing systems, especially graphics processing units (GPUs), are able to alleviate this inconvenience by accelerating the design evaluation itself. In this work we present a validated CFD solver running on GPUs, able to accelerate the design evaluation and thus the entire design process. An achieved speedup of 20x to 30x enabled the DE algorithm to run on a high-end computer instead of a costly large cluster. The GPU-enhanced DE was used to optimize the aerodynamics of a supersonic compressor cascade, achieving an aerodynamic loss reduction of 20%.
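A minimal sketch of the classic DE/rand/1/bin scheme referred to above, with a cheap analytic objective standing in for the costly CFD evaluation. The control parameters are generic textbook defaults, not the study's settings.

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           gens=200, seed=1):
    """Minimal DE/rand/1/bin; f stands in for the expensive CFD evaluation."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    cost = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            # Mutate: combine three distinct random vectors (rand/1)
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jr = rng.randrange(dim)  # force at least one mutated component
            trial = [
                min(max(pop[a][j] + F * (pop[b][j] - pop[c][j]),
                        bounds[j][0]), bounds[j][1])
                if (rng.random() < CR or j == jr) else pop[i][j]
                for j in range(dim)
            ]
            tc = f(trial)
            if tc < cost[i]:  # greedy selection keeps the better vector
                pop[i], cost[i] = trial, tc
    best = min(range(pop_size), key=cost.__getitem__)
    return pop[best], cost[best]

# Sphere function as a stand-in objective; minimum is 0 at the origin
x, fx = differential_evolution(lambda v: sum(t * t for t in v), [(-5, 5)] * 3)
```

In the study, each call to `f` would be a GPU-accelerated CFD run, which is why a 20x to 30x per-evaluation speedup translates directly into a shorter design cycle.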
Analyzing the requirements for mass production of small wind turbine generators
NASA Astrophysics Data System (ADS)
Anuskiewicz, T.; Asmussen, J.; Frankenfield, O.
Mass producibility of small wind turbine generators is discussed, with the aim of giving manufacturers design and cost data for profitable production operations. A 15 kW wind turbine generator for production in annual volumes from 1,000 to 50,000 units is considered. Methodology for costing the systems effectively is explained. The process-estimate sequence followed is outlined, with emphasis on the process-estimate sheets compiled for each component and subsystem. These data enabled analysts to develop cost breakdown profiles crucial to manufacturing decision-making. The appraisal also led to various design recommendations, including replacement of aluminum towers with cost-effective carbon steel towers. Extensive cost information is supplied in tables covering subassemblies, capital requirements, and levelized energy costs. The physical layout of the plant is depicted to guide manufacturers in taking advantage of the growing business opportunity now offered in conjunction with the national need for energy development.
Yonamine, Yusuke; Cervantes-Salguero, Keitel; Minami, Kosuke; Kawamata, Ibuki; Nakanishi, Waka; Hill, Jonathan P; Murata, Satoshi; Ariga, Katsuhiko
2016-05-14
In this study, a Langmuir-Blodgett (LB) system has been utilized for regulating the polymerization of a DNA origami structure at the air-water interface as a two-dimensionally confined medium, which enables dynamic condensation of DNA origami units through variation of the film area at the macroscopic level (ca. 10-100 cm^2). DNA origami sheets were conjugated with a cationic lipid (dioctadecyldimethylammonium bromide, 2C18N+) by electrostatic interaction, and the corresponding LB film was prepared. By applying dynamic pressure variation through compression-expansion processes, the lipid-modified DNA origami sheets underwent anisotropic polymerization, forming a one-dimensionally assembled belt-shaped structure of high aspect ratio, although the thickness of the polymerized DNA origami was maintained at the unimolecular level. This approach opens up a new field of mechanical induction of the self-assembly of DNA origami structures.
Optimal superadiabatic population transfer and gates by dynamical phase corrections
NASA Astrophysics Data System (ADS)
Vepsäläinen, A.; Danilin, S.; Paraoanu, G. S.
2018-04-01
In many quantum technologies adiabatic processes are used for coherent quantum state operations, offering inherent robustness to errors in the control parameters. The main limitation is the long operation time resulting from the requirement of adiabaticity. The superadiabatic method allows for faster operation, by applying counterdiabatic driving that corrects for excitations resulting from the violation of the adiabatic condition. In this article we show how to construct the counterdiabatic Hamiltonian in a system with forbidden transitions by using two-photon processes and how to correct for the resulting time-dependent ac-Stark shifts in order to enable population transfer with unit fidelity. We further demonstrate that superadiabatic stimulated Raman passage can realize a robust unitary NOT-gate between the ground state and the second excited state of a three-level system. The results can be readily applied to a three-level transmon with the ladder energy level structure.
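For reference, the counterdiabatic term mentioned above has a standard general form (this is the textbook expression; the article's contribution is realizing an equivalent drive via two-photon processes when the direct transition is forbidden, and correcting the resulting ac-Stark shifts):

```latex
% Counterdiabatic Hamiltonian for instantaneous eigenstates |n(t)> of H_0(t)
H_{\mathrm{CD}}(t) = i\hbar \sum_n \Big( |\partial_t n\rangle\langle n|
    - \langle n | \partial_t n \rangle \, |n\rangle\langle n| \Big)
```

Adding $H_{\mathrm{CD}}(t)$ to $H_0(t)$ cancels the non-adiabatic excitations exactly, which is what permits population transfer with unit fidelity at speeds violating the adiabatic condition.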
Thio-amide functionalized polymers via polymerization or post-polymerization modification
NASA Astrophysics Data System (ADS)
Ozcam, Ali; Henke, Adam; Stibingerova, Iva; Srogl, Jiri; Genzer, Jan
2011-03-01
Decreasing supplies of fresh water and an increasing population necessitate the development of advanced water-cleaning technologies that would facilitate the removal of water pollutants. Amongst the worst such contaminants are heavy metals and cyanides, infamous for their high toxicity. To assist water purification processes, we aim to synthesize functionalized macromolecules that contribute to decontamination by scavenging detrimental chemicals. Epitomizing this role, the thio-amide unit features remarkable chemical flexibility that facilitates reversible catch-and-release of ions, with this behavior controlled by subtle redox changes in the environment. The chemical tunability of the thio-amide moiety enables synthesis of thio-amide-based monomers and post-polymerization modification agents. Two distinct synthetic pathways, polymerization and post-polymerization modification, have been exploited, leading to functional thioamide-based macromolecules: thioamide monomers were copolymerized with N-isopropylacrylamide, and post-polymerization modifications of poly(dimethylaminoethyl methacrylate) and poly(propargyl methacrylate) were accomplished via quaternization and ``click'' reactions, respectively.
Addressing the ethical, legal, and social issues raised by voting by persons with dementia.
Karlawish, Jason H; Bonnie, Richard J; Appelbaum, Paul S; Lyketsos, Constantine; James, Bryan; Knopman, David; Patusky, Christopher; Kane, Rosalie A; Karlan, Pamela S
2004-09-15
This article addresses an emerging policy problem in the United States: participation in the electoral process by citizens with dementia. At present, health care professionals, family caregivers, and long-term care staff lack adequate guidance to decide whether individuals with dementia should be precluded from or assisted in casting a ballot. Voting by persons with dementia raises a series of important questions about the autonomy of individuals with dementia, the integrity of the electoral process, and the prevention of fraud. Three subsidiary issues warrant special attention: development of a method to assess capacity to vote; identification of appropriate kinds of assistance to enable persons with cognitive impairment to vote; and formulation of uniform and workable policies for voting in long-term care settings. In some instances, extrapolation from existing policies and research permits reasonable recommendations to guide policy and practice. However, in other instances, additional research is necessary.
NASA Astrophysics Data System (ADS)
Gesing, Adam J.; Das, Subodh K.
2017-02-01
With United States Department of Energy Advanced Research Projects Agency funding, experimental proof of concept was demonstrated for the RE-12TM electrorefining process, which extracts a desired amount of Mg from molten secondary Al alloys recycled from scrap. The key enabling technology for this process was the selection of a suitable electrolyte composition and operating temperature. The selection was made using the FactSage thermodynamic modeling software together with its light-metal, molten-salt, and oxide thermodynamic databases. Modeling allowed prediction of the chemical equilibria and of the impurity contents in the anode and cathode products and in the electrolyte. FactSage also provided data on the physical properties of the electrolyte and the molten metal phases, including electrical conductivity and density. Further modeling permitted selection of electrode and cell construction materials chemically compatible with the combination of molten metals and the electrolyte.
Progress on 241Am Production for Use in Radioisotope Power Systems
NASA Astrophysics Data System (ADS)
Baker, S. R.; Bell, K. J.; Brown, J.; Carrigan, C.; Carrott, M. J.; Gregson, C.; Clough, M.; Maher, C. J.; Mason, C.; Rhodes, C. J.; Rice, T. G.; Sarsfield, M. J.; Stephenson, K.; Taylor, R. J.; Tinsley, T. P.; Woodhead, D. A.; Wiss, T.
2014-08-01
Electrical power sources used in outer planet missions are a key enabling technology for data acquisition and communications. These power sources generate electricity via thermoelectric conversion of the thermal energy released by alpha decay of the radioisotope 238Pu. Production of 238Pu requires specialist facilities, including a nuclear reactor and reprocessing plants, that are expensive to build and operate, so a more economical alternative is attractive to the industry. Within Europe, 241Am is a feasible alternative to 238Pu that can provide a heat source for radioisotope thermoelectric generators (RTGs) and radioisotope heating units (RHUs). As a daughter product of 241Pu decay, 241Am is present at levels of thousands of kilograms within the UK civil plutonium stockpile. A chemical separation process is required to extract the 241Am in a pure form, and this paper describes such a process, successfully developed to the proof-of-concept stage.
Aerodynamic optimization of supersonic compressor cascade using differential evolution on GPU
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aissa, Mohamed Hasanine; Verstraete, Tom; Vuik, Cornelis
Differential Evolution (DE) is a powerful stochastic optimization method. Compared to gradient-based algorithms, DE is able to avoid local minima but at the same time requires more function evaluations. In turbomachinery applications, function evaluations are performed with time-consuming CFD simulations, which results in a long, unaffordable design cycle. Modern High Performance Computing systems, especially Graphics Processing Units (GPUs), can alleviate this inconvenience by accelerating the design evaluation itself. In this work we present a validated CFD solver running on GPUs, able to accelerate the design evaluation and thus the entire design process. An achieved speedup of 20x to 30x enabled the DE algorithm to run on a high-end computer instead of a costly large cluster. The GPU-enhanced DE was used to optimize the aerodynamics of a supersonic compressor cascade, achieving an aerodynamic loss reduction of 20%.
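The DE loop that drives such design studies can be sketched compactly. Below is a minimal, generic DE/rand/1/bin implementation in Python; the toy objective stands in for the expensive CFD evaluation and is not the authors' solver, and all parameter values are illustrative:

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=100, seed=0):
    """Minimal DE/rand/1/bin sketch. Each generation evaluates the
    objective once per population member, which is why expensive CFD
    evaluations dominate the design-cycle time."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    cost = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: donor vector built from three distinct members.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            donor = [pop[a][d] + F * (pop[b][d] - pop[c][d]) for d in range(dim)]
            # Binomial crossover, clipped to the search bounds.
            trial = [min(max(donor[d] if rng.random() < CR else pop[i][d],
                             bounds[d][0]), bounds[d][1]) for d in range(dim)]
            trial_cost = f(trial)  # one expensive evaluation per trial
            if trial_cost <= cost[i]:  # greedy selection
                pop[i], cost[i] = trial, trial_cost
    best = min(range(pop_size), key=cost.__getitem__)
    return pop[best], cost[best]

# Toy objective standing in for a CFD loss evaluation (hypothetical).
sphere = lambda x: sum(v * v for v in x)
x_best, f_best = differential_evolution(sphere, [(-5, 5)] * 3)
```

On a GPU, the per-member evaluations of each generation are the natural unit of parallel work, which is what makes the 20x-30x solver speedup translate directly into design-cycle time.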
[Legal aspects and the treatment procedure of gender dysphoria in Hungary].
Kórász, Krisztián
2015-07-26
The legal process of gender transition in Hungary had previously been more developed than in most European countries, as the law enabled transsexual people to change their name and gender before or without medical treatment, which was unique at the time. Over the years, however, many European countries developed legal frameworks and accepted international standards of care for the treatment of gender dysphoria that Hungary did not follow. Currently in Hungary there is no consistent legal framework for gender transition, no official regulation or guideline on the transition process, no institution obliged to accommodate the process, and no nominated specialist in the state health care system whose remit includes dealing with transsexual patients. The information on gender transition options available both to professionals and to patients is limited and incoherent. This paper reviews the legal aspects and the clinical management of gender dysphoria in Hungary. Some issues regarding Hungarian practice, and possible solutions based on examples from the United Kingdom, are addressed.
Roy, Abhishek; Klinefelter, Alicia; Yahya, Farah B; Chen, Xing; Gonzalez-Guerrero, Luisa Patricia; Lukas, Christopher J; Kamakshi, Divya Akella; Boley, James; Craig, Kyle; Faisal, Muhammad; Oh, Seunghyun; Roberts, Nathan E; Shakhsheer, Yousef; Shrivastava, Aatmesh; Vasudevan, Dilip P; Wentzloff, David D; Calhoun, Benton H
2015-12-01
This paper presents a batteryless system-on-chip (SoC) that operates off energy harvested from indoor solar cells and/or thermoelectric generators (TEGs) on the body. Fabricated in a commercial 0.13 μm process, this SoC sensing platform consists of an integrated energy harvesting and power management unit (EH-PMU) with maximum power point tracking, multiple sensing modalities, a programmable core, a low-power microcontroller with several hardware accelerators to enable energy-efficient digital signal processing, ultra-low-power (ULP) asymmetric radios for wireless transmission, and a 100 nW wake-up radio. The EH-PMU achieves a peak end-to-end efficiency of 75% delivering power to a 100 μA load. In an example motion detection application, the SoC reads data from an accelerometer through SPI, processes it, and sends it over the radio. The SPI and digital processing consume only 2.27 μW, while the integrated radio consumes 4.18 μW when transmitting at 187.5 kbps, for a total of 6.45 μW.
Evaluation of the quality of the teaching-learning process in undergraduate courses in Nursing 1
González-Chordá, Víctor Manuel; Maciá-Soler, María Loreto
2015-01-01
Abstract Objective: to identify aspects for improving the quality of the teaching-learning process through the analysis of tools that evaluated the acquisition of skills by undergraduate Nursing students. Method: prospective longitudinal study conducted in a population of 60 second-year Nursing students based on registration data, from which quality indicators that evaluate the acquisition of skills were obtained, with descriptive and inferential analysis. Results: nine items were identified, along with nine learning activities included in the assessment tools, that did not reach the established quality indicators (p<0.05). There are statistically significant differences depending on the hospital and clinical practices unit (p<0.05). Conclusion: the analysis of the evaluation tools used in the subject "Nursing Care in Welfare Processes" of the analyzed university undergraduate course enabled the detection of areas for improvement in the teaching-learning process. The challenge of education in nursing is to reach the best clinical research and educational results, in order to improve the quality of education and health care. PMID:26444173
Human Spaceflight Safety for the Next Generation on Orbital Space Systems
NASA Technical Reports Server (NTRS)
Mango, Edward J.
2011-01-01
The National Aeronautics and Space Administration (NASA) Commercial Crew Program (CCP) has been chartered to facilitate the development of a United States (U.S.) commercial crew space transportation capability with the goal of achieving safe, reliable, and cost effective access to and from low Earth orbit (LEO) and the International Space Station (ISS) as soon as possible. Once the capability is matured and is available to the Government and other customers, NASA expects to purchase commercial services to meet its ISS crew rotation and emergency return objectives. The primary role of the CCP is to enable and ensure safe human spaceflight and processes for the next generation of earth orbital space systems. The architecture of the Program delineates the process for investment performance in safe orbital systems, Crew Transportation System (CTS) certification, and CTS Flight Readiness. A series of six technical documents build up the architecture to address the top-level CTS requirements and standards. They include Design Reference Missions, with the near term focus on ISS crew services, Certification and Service Requirements, Technical Management Processes, and Technical and Operations Standards Evaluation Processes.
A Smart Modeling Framework for Integrating BMI-enabled Models as Web Services
NASA Astrophysics Data System (ADS)
Jiang, P.; Elag, M.; Kumar, P.; Peckham, S. D.; Liu, R.; Marini, L.; Hsu, L.
2015-12-01
Service-oriented computing provides an opportunity to couple web service models using semantic web technology. Through this approach, models that are exposed as web services can be conserved in their own local environment, making it easy for modelers to maintain and update them. In integrated modeling, the service-oriented loose-coupling approach requires (1) a set of models as web services, (2) model metadata describing the external features of a model (e.g., variable name, unit, computational grid) and (3) a model integration framework. We present the architecture of coupling web service models that are self-describing by utilizing a smart modeling framework. We expose models that are encapsulated with CSDMS (Community Surface Dynamics Modeling System) Basic Model Interfaces (BMI) as web services. The BMI-enabled models are self-describing, uncovering their metadata through BMI functions. After a BMI-enabled model is serviced, a client can initialize, execute and retrieve the meta-information of the model by calling its BMI functions over the web. Furthermore, a revised version of EMELI (Peckham, 2015), an Experimental Modeling Environment for Linking and Interoperability, is chosen as the framework for coupling BMI-enabled web service models. EMELI allows users to combine a set of component models into a complex model by standardizing the model interface using BMI, and provides a set of utilities smoothing the integration process (e.g., temporal interpolation). We modify the original EMELI so that the revised framework is able to initialize, execute and find the dependencies of BMI-enabled web service models. Using the revised EMELI, an example is presented of integrating a set of topoflow model components that are BMI-enabled and exposed as web services. Reference: Peckham, S.D.
(2014) EMELI 1.0: An experimental smart modeling framework for automatic coupling of self-describing models, Proceedings of HIC 2014, 11th International Conf. on Hydroinformatics, New York, NY.
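The BMI pattern described above pairs control functions with metadata getters so a framework can drive and interrogate a model it knows nothing about. The sketch below is an illustrative, hypothetical component in that style: the method names follow the published BMI convention, but the toy physics and the variable are invented:

```python
class BmiLikeModel:
    """Illustrative component exposing a BMI-style interface: control
    functions drive the run, metadata functions make it self-describing."""

    def initialize(self, config=None):
        self.time = 0.0
        self.dt = 1.0
        self.state = {"water_depth": 0.0}

    def update(self):
        # Placeholder physics: accumulate 0.5 units per time step.
        self.state["water_depth"] += 0.5
        self.time += self.dt

    def finalize(self):
        self.state = {}

    # Metadata getters let a framework (e.g., EMELI) discover variables
    # and couple components without knowing their internals.
    def get_output_var_names(self):
        return ["water_depth"]

    def get_var_units(self, name):
        return {"water_depth": "m"}[name]

    def get_value(self, name):
        return self.state[name]

# Framework-side use: drive the component purely through its interface.
model = BmiLikeModel()
model.initialize()
for _ in range(4):
    model.update()
depth = model.get_value("water_depth")
```

Exposing exactly this call surface over HTTP is what turns such a component into the self-describing web service model the abstract describes.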
Universal electronics for miniature and automated chemical assays.
Urban, Pawel L
2015-02-21
This minireview discusses universal electronic modules (generic programmable units) and their use by analytical chemists to construct inexpensive, miniature or automated devices. Recently, open-source platforms have gained considerable popularity among tech-savvy chemists because their implementation often does not require expert knowledge and investment of funds. Thus, chemistry students and researchers can easily start implementing them after a few hours of reading tutorials and trial-and-error. Single-board microcontrollers and micro-computers such as Arduino, Teensy, Raspberry Pi or BeagleBone enable collecting experimental data with high precision as well as efficient control of electric potentials and actuation of mechanical systems. They are readily programmed using high-level languages, such as C, C++, JavaScript or Python. They can also be coupled with mobile consumer electronics, including smartphones as well as teleinformatic networks. More demanding analytical tasks require fast signal processing. Field-programmable gate arrays enable efficient and inexpensive prototyping of high-performance analytical platforms, thus becoming increasingly popular among analytical chemists. This minireview discusses the advantages and drawbacks of universal electronic modules, considering their application in prototyping and manufacture of intelligent analytical instrumentation.
van Oordt, Thomas; Barb, Yannick; Smetana, Jan; Zengerle, Roland; von Stetten, Felix
2013-08-07
Stick-packaging of goods in tubular-shaped composite-foil pouches has become a popular technology for food and drug packaging. We miniaturized stick-packaging for use in lab-on-a-chip (LOAC) systems to pre-store liquid and dry reagents in a volume range of 80-500 μl and release them on demand. An integrated frangible seal enables the pressure-controlled release of reagents and simplifies the layout of LOAC systems, making the package a functional microfluidic release unit. The frangible seal can be adjusted to defined burst pressures ranging from 20 to 140 kPa. The ultrasonic welding process applied allows the packaging of temperature-sensitive reagents. Stick-packs have been successfully validated in recovery tests (99% (STDV = 1%) of 250 μl of pre-stored liquid released), long-term storage tests (loss of only <0.5% over a simulated 2 years) and air transport simulation tests. The developed technology enables the storage of a combination of liquid and dry reagents. It is a scalable technology suitable for rapid prototyping and low-cost mass production.
Problem Solving. Workplace Strategies for Thoughtful Change.
ERIC Educational Resources Information Center
Diller, Janelle; Moore, Rita
This learning module is designed to enable participants to look at problems from a variety of perspectives, to apply a basic problem-solving strategy, to implement a plan of action, and to identify problems that are of particular importance to their workplace. The module includes units for six class sessions. Each unit includes the following…
Geriatric Nutrition Workshop for the Dietetic Assistant.
ERIC Educational Resources Information Center
Oklahoma State Dept. of Vocational and Technical Education, Stillwater. Curriculum and Instructional Materials Center.
This workshop guide is a unit of study for teaching dietetic assistants to work with elderly persons. The objective of the unit is to enable the students to apply knowledge of the physiological and psychological effects of aging in providing nutritional care to the elderly in independent living and nursing home situations. Following the unit…
Citizenship and Human Rights Education: A Comparison of Textbooks in Turkey and the United States
ERIC Educational Resources Information Center
Karaman-Kepenekci, Yasemin
2005-01-01
Textbooks are major educational tools for students. A United Nations Educational, Scientific, and Cultural Organization (UNESCO) project titled "Basic Learning Material" claims that textbooks provide the main resource for teachers, enabling them to animate the curricula and giving life to the subjects taught in the classroom. As Power…
Going Global: Science Issues for the Junior High.
ERIC Educational Resources Information Center
Cronkhite, Louella; And Others
This book contains a unit on science and global education that is designed to enable students to gain a practical understanding of the world they live in and the confidence to take appropriate action as responsible global citizens. This unit emphasizes cooperative learning that is experiential and participatory. Teachers and students are…
Smoking and Health. A Guide for School Action Grades 1-12.
ERIC Educational Resources Information Center
Ohio State Dept. of Health, Columbus.
Enabling teachers to present a detailed unit on smoking is an objective of this curriculum guide. It organizes information which, if made a relevant part of the student's experience, attempts to help him effectively resist the pressures to begin smoking. Seventeen units, arranged in sequential order, cover cardiovascular and respiratory systems,…
The Work of the Television Journalist.
ERIC Educational Resources Information Center
Tyrrell, Robert
This book describes the various functions of the television journalist--in the United States and Great Britain--and supplies knowledge enabling members of a television team to work successfully as a unit. Separate chapters are devoted to discussions of (1) the world of television journalism, (2) writing for television, (3) the role of the…
Schoolchildren and Drugs in 1987.
ERIC Educational Resources Information Center
Balding, John
Since 1983 the Health Education Authority Schools Education Unit has been providing a survey service to schools throughout the United Kingdom. The service enables a school to survey the health behavior of boys and girls at different ages. The purpose is to make the planning of programs in Health and Social Education in the schools more realistic.…
Experiments on Learning by Back Propagation.
ERIC Educational Resources Information Center
Plaut, David C.; And Others
This paper describes further research on a learning procedure for layered networks of deterministic, neuron-like units, described by Rumelhart et al. The units, the way they are connected, the learning procedure, and the extension to iterative networks are presented. In one experiment, a network learns a set of filters, enabling it to discriminate…
Pressure Swing Adsorption in the Unit Operations Laboratory
ERIC Educational Resources Information Center
Ganley, Jason
2018-01-01
This paper describes a student laboratory in the Unit Operations Laboratory at the Colorado School of Mines: air separation by pressure swing adsorption. The flexibility of the system enables students to study the production of enriched nitrogen or oxygen streams. Automatic data acquisition permits the study of cycle steps and performance.…
Competency-Based Curriculum Guide for Introduction to Business. Grades 9-12. Bulletin No. 1729.
ERIC Educational Resources Information Center
Louisiana State Dept. of Education, Baton Rouge. Div. of Vocational Education.
This curriculum guide is intended to assist business teachers in enabling students to develop consumer-business and socioeconomic competencies necessary for success in this competitive free enterprise system. Introductory materials include suggested teacher activities. Materials for 12 units are provided. Each unit contains these components: time…
Ancient India: The Asiatic Ethiopians.
ERIC Educational Resources Information Center
Scott, Carolyn McPherson
This curriculum unit was developed by a participant in the 1993 Fulbright-Hays Program "India: Continuity and Change." The unit attempts to place India in the "picture frame" of the ancient world as a part of a whole, not as a separate entity. Reading materials enable students to draw broader general conclusions based on the…
Nutrition and the Growing Population. Environmental Education Curriculum. Revised.
ERIC Educational Resources Information Center
Topeka Public Schools, KS.
This unit attempts to respond to the increasing problems of malnutrition in the United States seemingly related to rising market prices, low quality foods attracting the consumer dollar and the shrinking number of students studying nutrition in our schools. It is designed to enable secondary school students to evaluate food selections, understand…
Design of voice coil motor dynamic focusing unit for a laser scanner
NASA Astrophysics Data System (ADS)
Lee, Moon G.; Kim, Gaeun; Lee, Chan-Woo; Lee, Soo-Hun; Jeon, Yongho
2014-04-01
Laser scanning systems have been used for material processing tasks such as welding, cutting, marking, and drilling. However, applications have been limited by the small range of motion and slow speed of the focusing unit, which carries the focusing optics. To overcome these limitations, a dynamic focusing system with a long travel range and high speed is needed. In this study, a dynamic focusing unit for a laser scanning system with a voice coil motor (VCM) mechanism is proposed to enable fast speed and a wide focusing range. The VCM has finer precision and higher speed than conventional step motors and a longer travel range than earlier lead zirconate titanate (PZT) actuators. The system has a hollow configuration to provide a laser beam path. This also makes it compact and transmission-free and gives it low inertia. The VCM's magnetics are modeled using a permeance model. Its design parameters are determined by optimization using the Broyden-Fletcher-Goldfarb-Shanno method and a sequential quadratic programming algorithm. After the VCM is designed, the dynamic focusing unit is fabricated and assembled. The permeance model is verified by a magnetic finite element method simulation tool, Maxwell 2D and 3D, and by measurement data from a gauss meter. The performance is verified experimentally. The results show a resolution of 0.2 μm and travel range of 16 mm. These are better than those of conventional focusing systems; therefore, this focusing unit can be applied to laser scanning systems for good machining capability.
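The design-parameter selection described above can be illustrated with a toy constrained optimization: minimize a mock coil mass subject to a minimum Lorentz-force requirement. Everything below is fabricated for illustration; a naive penalty method with coordinate descent stands in for the authors' permeance model solved with BFGS/SQP:

```python
import math

def coil_mass(turns, radius_mm):
    # Hypothetical scaling: mass grows with wire length (turns * radius).
    return 0.002 * turns * radius_mm

def lorentz_force(turns, radius_mm, current=1.0, B=0.5):
    # F = B * I * L with wire length L = 2*pi*r*N (radius in meters).
    return B * current * 2 * math.pi * (radius_mm / 1000.0) * turns

def penalized(params, f_min=2.0, weight=1e3):
    # Quadratic penalty for violating the minimum-force constraint.
    turns, radius = params
    violation = max(0.0, f_min - lorentz_force(turns, radius))
    return coil_mass(turns, radius) + weight * violation ** 2

def coordinate_descent(x0, step=1.0, iters=200):
    # Simple derivative-free descent, shrinking the step as it settles.
    x = list(x0)
    for _ in range(iters):
        for d in range(len(x)):
            for cand in (x[d] - step, x[d] + step):
                trial = list(x)
                trial[d] = max(cand, 1.0)  # keep parameters positive
                if penalized(trial) < penalized(x):
                    x = trial
        step *= 0.98
    return x

best = coordinate_descent([100.0, 10.0])  # initial (turns, radius_mm)
```

The optimizer trades coil mass against the force constraint until the force sits just above the required minimum, which is the qualitative behavior one expects from the constrained BFGS/SQP design step in the paper.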
NASA Astrophysics Data System (ADS)
Weng, M. H.; Clark, D. T.; Wright, S. N.; Gordon, D. L.; Duncan, M. A.; Kirkham, S. J.; Idris, M. I.; Chan, H. K.; Young, R. A. R.; Ramsay, E. P.; Wright, N. G.; Horsfall, A. B.
2017-05-01
A high manufacturing readiness level silicon carbide (SiC) CMOS technology is presented. The unique process flow enables the monolithic integration of pMOS and nMOS transistors with passive circuit elements capable of operation at temperatures of 300 °C and beyond. Critical to this functionality is the behaviour of the gate dielectric and data for high temperature capacitance-voltage measurements are reported for SiO2/4H-SiC (n and p type) MOS structures. In addition, a summary of the long term reliability for a range of structures including contact chains to both n-type and p-type SiC, as well as simple logic circuits is presented, showing function after 2000 h at 300 °C. Circuit data is also presented for the performance of digital logic devices, a 4 to 1 analogue multiplexer and a configurable timer operating over a wide temperature range. A high temperature micro-oven system has been utilised to enable the high temperature testing and stressing of units assembled in ceramic dual in line packages, including a high temperature small form-factor SiC based bridge leg power module prototype, operated for over 1000 h at 300 °C. The data presented show that SiC CMOS is a key enabling technology in high temperature integrated circuit design. In particular it provides the ability to realise sensor interface circuits capable of operating above 300 °C, accommodate shifts in key parameters enabling deployment in applications including automotive, aerospace and deep well drilling.
NASA Astrophysics Data System (ADS)
Negrut, Dan; Lamb, David; Gorsich, David
2011-06-01
This paper describes a software infrastructure made up of tools and libraries designed to assist developers in implementing computational dynamics applications running on heterogeneous and distributed computing environments. Together, these tools and libraries compose a so called Heterogeneous Computing Template (HCT). The heterogeneous and distributed computing hardware infrastructure is assumed herein to be made up of a combination of CPUs and Graphics Processing Units (GPUs). The computational dynamics applications targeted to execute on such a hardware topology include many-body dynamics, smoothed-particle hydrodynamics (SPH) fluid simulation, and fluid-solid interaction analysis. The underlying theme of the solution approach embraced by HCT is that of partitioning the domain of interest into a number of subdomains that are each managed by a separate core/accelerator (CPU/GPU) pair. Four components at the core of HCT enable the envisioned distributed computing approach to large-scale dynamical system simulation: (a) the ability to partition the problem according to the one-to-one mapping; i.e., spatial subdivision, discussed above (pre-processing); (b) a protocol for passing data between any two co-processors; (c) algorithms for element proximity computation; and (d) the ability to carry out post-processing in a distributed fashion. In this contribution the components (a) and (b) of the HCT are demonstrated via the example of the Discrete Element Method (DEM) for rigid body dynamics with friction and contact. The collision detection task required in frictional-contact dynamics (task (c) above) is shown to benefit from a two-order-of-magnitude efficiency gain on the GPU when compared to traditional sequential implementations. Note: Reference herein to any specific commercial products, process, or service by trade name, trademark, manufacturer, or otherwise, does not imply its endorsement, recommendation, or favoring by the United States Army.
The views and opinions of authors expressed herein do not necessarily state or reflect those of the United States Army, and shall not be used for advertising or product endorsement purposes.
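Component (a) of the HCT, the one-to-one mapping of spatial subdomains to core/accelerator pairs, can be sketched in one dimension: bin each body into the slab that owns it, then flag bodies near slab edges whose data must be exchanged with the neighboring co-processor (component (b)). The slab count, halo distance, and positions below are illustrative, not taken from the paper:

```python
def assign_subdomains(positions, domain=(0.0, 100.0), n_sub=4):
    """Partition a 1-D domain into equal slabs and map each body to the
    slab (i.e., the CPU/GPU pair) that owns it."""
    lo, hi = domain
    width = (hi - lo) / n_sub
    owner = {}
    for i, x in enumerate(positions):
        owner[i] = min(int((x - lo) / width), n_sub - 1)  # clamp right edge
    return owner

def boundary_bodies(positions, owner, domain=(0.0, 100.0), n_sub=4, halo=2.0):
    """Bodies within `halo` of a slab edge must be communicated to the
    neighboring co-processor (the HCT data-passing protocol)."""
    lo, hi = domain
    width = (hi - lo) / n_sub
    out = set()
    for i, x in enumerate(positions):
        edge_left = lo + owner[i] * width
        edge_right = edge_left + width
        if (x - edge_left < halo and owner[i] > 0) or \
           (edge_right - x < halo and owner[i] < n_sub - 1):
            out.add(i)
    return out

pos = [5.0, 24.5, 26.0, 50.0, 99.0]
own = assign_subdomains(pos)
halo_set = boundary_bodies(pos, own)
```

In the full DEM setting the same bookkeeping happens in 3-D each time step, and the halo set is exactly the traffic that the inter-co-processor protocol must carry.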
Decentralizing Centralized Control: Reorienting a Fundamental Tenet for Resilient Air Operation
2008-05-22
Recent fielding of the GBU-39, Small Diameter Bomb (SDB), enables a…
Careers (A Course of Study). Unit VII: Now that You've Got the Job--How Do You Keep It?
ERIC Educational Resources Information Center
Turley, Kay
Designed to enable special needs students to learn how to hold a job and how to change jobs when necessary, this set of activities is the seventh unit in a nine-unit secondary level course intended to provide handicapped students with the knowledge and tools necessary to succeed in the world of work. In the first chapter entitled "How Do I Keep My…
Chen, Shuo; Bi, Xiaoping; Sun, Lijie; Gao, Jin; Huang, Peng; Fan, Xianqun; You, Zhengwei; Wang, Yadong
2016-08-17
Biodegradable and biocompatible elastomers (bioelastomers) could resemble the mechanical properties of extracellular matrix and soft tissues and, thus, are very useful for many biomedical applications. Despite significant advances, tunable bioelastomers with easy processing, facile biofunctionalization, and the ability to withstand a mechanically dynamic environment have remained elusive. Here, we reported new dynamic hydrogen-bond cross-linked PSeD-U bioelastomers possessing the aforementioned features by grafting 2-ureido-4[1H]-pyrimidinones (UPy) units with strong self-complementary quadruple hydrogen bonds to poly(sebacoyl diglyceride) (PSeD), a refined version of a widely used bioelastomer poly(glycerol sebacate) (PGS). PSeD-U polymers exhibited stronger mechanical strength than their counterparts of chemically cross-linked PSeD and tunable elasticity by simply varying the content of UPy units. In addition to the good biocompatibility and biodegradability as seen in PSeD, PSeD-U showed fast self-healing (within 30 min) at mild conditions (60 °C) and could be readily processed at moderate temperature (90-100 °C) or with use of solvent casting at room temperature. Furthermore, the free hydroxyl groups of PSeD-U enabled facile functionalization, which was demonstrated by the modification of PSeD-U film with FITC as a model functional molecule.
Follicular Unit Extraction Hair Transplantation with Micromotor: Eight Years Experience.
Ors, Safvet; Ozkose, Mehmet; Ors, Sevgi
2015-08-01
Follicular unit extraction (FUE) has been performed for over a decade. Our experience with patients who underwent hair transplantation using only the FUE method is included in this study. A total of 1000 patients had hair transplantation using the FUE method between 2005 and 2014 in our clinic. A manual punch was used in 32 patients and a micromotor in 968 patients for graft harvesting. During the time that the manual punch was used for graft harvesting, 1000-2000 grafts were transplanted in one session of 6-8 h. Following micromotor use, the average graft count increased to 2500 while the operation time remained unchanged. Graft harvesting was difficult in 11.1 %, easy in 52.2 %, and very easy in 36.7 % of our patients. The main purpose of hair transplantation is to restore lost hair. During the process, obtaining a natural appearance and adequate hair density is important. In the FUE method, grafts can be taken without changing their natural structure, there is no need for magnification, and the grafts can be transplanted directly without any further processing. Because there is no suture in the FUE method, patients do not experience incision-site problems or scar formation. The FUE method enables us to achieve a natural appearance with less morbidity.
NASA Astrophysics Data System (ADS)
Hayakawa, Hitoshi; Ogawa, Makoto; Shibata, Tadashi
2005-04-01
A very large scale integrated circuit (VLSI) architecture for a multiple-instruction-stream multiple-data-stream (MIMD) associative processor has been proposed. The processor employs an architecture that enables seamless switching from associative operations to arithmetic operations. The MIMD element is convertible to a regular central processing unit (CPU) while maintaining its high performance as an associative processor. Therefore, the MIMD associative processor can perform not only on-chip perception, i.e., searching for the vector most similar to an input vector throughout the on-chip cache memory, but also arithmetic and logic operations similar to those in ordinary CPUs, both simultaneously in parallel processing. Three key technologies have been developed to generate the MIMD element: associative-operation-and-arithmetic-operation switchable calculation units, a versatile register control scheme within the MIMD element for flexible operations, and a short instruction set for minimizing the memory size for program storage. Key circuit blocks were designed and fabricated using 0.18 μm complementary metal-oxide-semiconductor (CMOS) technology. As a result, the full-featured MIMD element is estimated to be 3 mm2, showing the feasibility of an 8-parallel-MIMD-element associative processor in a single chip of 5 mm× 5 mm.
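The core associative operation described above, searching on-chip memory for the stored vector most similar to an input, can be sketched as a brute-force scan. Manhattan distance is assumed here purely for illustration; the abstract does not specify the chip's similarity metric:

```python
def associative_search(memory, query):
    """Return the index and distance of the stored vector most similar
    to `query`. The associative processor performs this search across
    all stored vectors in parallel; here it is a sequential scan."""
    def dist(v):
        return sum(abs(a - b) for a, b in zip(v, query))
    best = min(range(len(memory)), key=lambda i: dist(memory[i]))
    return best, dist(memory[best])

# Each row stands in for a vector held in the on-chip cache memory.
memory = [
    [0, 0, 0, 0],
    [4, 4, 4, 4],
    [1, 0, 1, 0],
]
idx, d = associative_search(memory, [1, 1, 1, 0])
```

The MIMD element's contribution is that each of these distance evaluations, plus the final minimum selection, can also be replaced by ordinary CPU-style arithmetic when the workload demands it.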
Acceleration of spiking neural network based pattern recognition on NVIDIA graphics processors.
Han, Bing; Taha, Tarek M
2010-04-01
There is currently a strong push in the research community to develop biological-scale implementations of neuron-based vision models. Systems at this scale are computationally demanding and generally utilize more accurate neuron models, such as the Izhikevich and the Hodgkin-Huxley models, rather than the more popular integrate-and-fire model. We examine the feasibility of using graphics processing units (GPUs) to accelerate a spiking neural network based character recognition network to enable such large-scale systems. Two versions of the network, utilizing the Izhikevich and Hodgkin-Huxley models respectively, are implemented. Three NVIDIA general-purpose (GP) GPU platforms are examined: the GeForce 9800 GX2, the Tesla C1060, and the Tesla S1070. Our results show that the GPGPUs can provide significant speedup over conventional processors. In particular, the fastest GPGPU utilized, the Tesla S1070, provided speedups of 5.6 and 84.4 over highly optimized implementations on the fastest central processing unit (CPU) tested, a quad-core 2.67 GHz Xeon processor, for the Izhikevich and the Hodgkin-Huxley models, respectively. The CPU implementation utilized all four cores and the vector data parallelism offered by the processor. The results indicate that GPUs are well suited for this application domain.
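The Izhikevich model mentioned above consists of two coupled update equations, which makes its data-parallel structure easy to see. Below is a minimal NumPy sketch of one Euler step for a whole population; the parameters are the standard regular-spiking values from Izhikevich's 2003 paper and may differ from those used in this study.

```python
import numpy as np

def izhikevich_step(v, u, I, a=0.02, b=0.2, c=-65.0, d=8.0, dt=1.0):
    """One Euler step of the Izhikevich neuron model for a whole
    population at once -- the same data-parallel structure that maps
    each neuron onto a GPU thread."""
    fired = v >= 30.0                      # spike threshold
    v = np.where(fired, c, v)              # reset membrane potential after a spike
    u = np.where(fired, u + d, u)          # reset recovery variable after a spike
    dv = 0.04 * v**2 + 5.0 * v + 140.0 - u + I
    du = a * (b * v - u)
    return v + dt * dv, u + dt * du, fired

# 1000 neurons updated in one vectorized call
v = np.full(1000, -65.0)
u = 0.2 * v
v, u, fired = izhikevich_step(v, u, I=np.full(1000, 10.0))
```

Every neuron executes the identical arithmetic with no data dependencies between neurons, which is why the paper's GPU implementations scale so well for this model.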
Fiehn, Oliver
2016-01-01
Gas chromatography-mass spectrometry (GC-MS)-based metabolomics is ideal for identifying and quantitating small molecular metabolites (<650 daltons), including small acids, alcohols, hydroxyl acids, amino acids, sugars, fatty acids, sterols, catecholamines, drugs, and toxins, often using chemical derivatization to make these compounds volatile enough for gas chromatography. This unit shows how GC-MS-based metabolomics easily allows integrating targeted assays for absolute quantification of specific metabolites with untargeted metabolomics to discover novel compounds. Complemented by database annotations using large spectral libraries and validated, standardized operating procedures, GC-MS can identify and semi-quantify over 200 compounds per study in human body fluid samples (e.g., plasma, urine, or stool). Deconvolution software enables detection of more than 300 additional unidentified signals that can be annotated through accurate mass instruments with appropriate data processing workflows, similar to untargeted profiling by liquid chromatography-MS (LC-MS). Hence, GC-MS is a mature technology that uses not only classic detectors (‘quadrupole’) but also target mass spectrometers (‘triple quadrupole’) and accurate mass instruments (‘quadrupole-time of flight’). This unit covers the following aspects of GC-MS-based metabolomics: (i) sample preparation from mammalian samples, (ii) acquisition of data, (iii) quality control, and (iv) data processing. PMID:27038389
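The targeted, absolute-quantification side of such a workflow typically rests on a calibration curve of detector response against known standard concentrations. A minimal sketch, using hypothetical calibration data rather than values from this study:

```python
import numpy as np

# Hypothetical calibration data for one target metabolite: known standard
# concentrations (uM) and GC-MS peak areas normalized to an internal standard.
conc = np.array([1.0, 5.0, 10.0, 50.0, 100.0])
area_ratio = np.array([0.11, 0.52, 1.05, 5.1, 9.9])

# Ordinary least-squares fit: area_ratio = slope * conc + intercept
slope, intercept = np.polyfit(conc, area_ratio, 1)

def quantify(sample_area_ratio):
    """Absolute concentration of an unknown sample, read off the calibration line."""
    return (sample_area_ratio - intercept) / slope

c = quantify(2.0)  # concentration corresponding to an area ratio of 2.0
```

Real assays add weighting, limit-of-quantification checks, and per-batch recalibration, but the inversion of a fitted response curve is the core step.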
Enabling Large Focal Plane Arrays Through Mosaic Hybridization
NASA Technical Reports Server (NTRS)
Miller, Timothy M.; Jhabvala, Christine A.; Leong, Edward; Costen, Nicholas P.; Sharp, Elmer; Adachi, Tomoko; Benford, Dominic
2012-01-01
We have demonstrated advances in mosaic hybridization that will enable very large format far-infrared detectors. Specifically we have produced electrical detector models via mosaic hybridization yielding superconducting circuit paths by hybridizing separately fabricated sub-units onto a single detector unit. The detector model was made on a 100mm diameter wafer while four model readout quadrant chips were made from a separate 100mm wafer. The individually fabricated parts were hybridized using a flip-chip bonder to assemble the detector-readout stack. Once all of the hybridized readouts were in place, a single, large and thick silicon substrate was placed on the stack and attached with permanent epoxy to provide strength and a Coefficient of Thermal Expansion match to the silicon components underneath. Wirebond pads on the readout chips connect circuits to warm readout electronics; and were used to validate the successful superconducting electrical interconnection of the model mosaic-hybrid detector. This demonstration is directly scalable to 150 mm diameter wafers, enabling pixel areas over ten times the area currently available.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bell, Alexis T.; Alger, Monty M.; Flytzani-Stephanopoulos, Maria
A decade ago, the U.S. chemical industry was in decline. Of the more than 40 chemical manufacturing plants being built worldwide in the mid-2000s with more than $1 billion in capitalization, none were under construction in the United States. Today, as a result of abundant domestic supplies of affordable natural gas and natural gas liquids resulting from the dramatic rise in shale gas production, the U.S. chemical industry has gone from the world’s highest-cost producer in 2005 to among the lowest-cost producers today. The low cost and increased supply of natural gas and natural gas liquids provides an opportunity to discover and develop new catalysts and processes to enable the direct conversion of natural gas and natural gas liquids into value-added chemicals with a lower carbon footprint. The economic implications of developing advanced technologies to utilize and process natural gas and natural gas liquids for chemical production could be significant, as commodity, intermediate, and fine chemicals represent a higher-economic-value use of shale gas compared with its use as a fuel. To better understand the opportunities for catalysis research in an era of shifting feedstocks for chemical production and to identify the gaps in the current research portfolio, the National Academies of Sciences, Engineering, and Medicine conducted an interactive, multidisciplinary workshop in March 2016. The goal of this workshop was to identify advances in catalysis that can enable the United States to fully realize the potential of the shale gas revolution for the U.S. chemical industry and, as a result, to help target the efforts of U.S. researchers and funding agencies on those areas of science and technology development that are most critical to achieving these advances. This publication summarizes the presentations and discussions from the workshop.
Improvement of Vivarium Biodecontamination through Data-acquisition Systems and Automation.
Devan, Shakthi Rk; Vasu, Suresh; Mallikarjuna, Yogesha; Ponraj, Ramkumar; Kamath, Gireesh; Poosala, Suresh
2018-03-01
Biodecontamination is important for eliminating pathogens at research animal facilities, thereby preventing contamination within barrier systems. We enhanced our facility's standard biodecontamination method to replace the traditional foggers, and the new system was used effectively after creating bypass ducts in HVAC units so that individual rooms could be isolated. The entire system was controlled by in-house developed supervisory control and data-acquisition software that supported multiple cycles of decontamination by equipment that had different decontamination capacities, operated in parallel, and used different agents, including H2O2 vapor and ClO2 gas. The process was validated according to facility mapping, and effectiveness was assessed by using biologic (Geobacillus stearothermophilus) and chemical indicator strips, which were positioned before decontamination, and by sampling contact plates after the completion of each cycle. The biologic indicators showed a 6-log reduction in microbial counts after successful decontamination cycles for both agents, and the process was found to be compatible with clean-room panels and with materials commonly used in a vivarium, such as racks, cages, trolleys, cage-changing stations, biosafety cabinets, refrigerators, and other equipment in both procedure and animal rooms. In conclusion, the automated process enabled users to perform effective decontamination through multiple cycles with real-time documentation and provided additional capability to deal with potential outbreaks. Enabling software integration of automation improved quality-control systems in our vivarium.
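The 6-log reduction criterion reported above has a simple quantitative meaning: the surviving population is one millionth of the starting population. A one-line check, assuming a typical biological indicator carrying about 10^6 spores:

```python
import math

def log_reduction(initial_count, surviving_count):
    """Log10 reduction in viable spores; a 6-log reduction means the
    surviving population is one millionth of the starting population."""
    return math.log10(initial_count / surviving_count)

# A biological indicator strip typically carries ~10^6 G. stearothermophilus
# spores; no growth after the cycle is scored here as < 1 survivor.
reduction = log_reduction(1e6, 1)
```

In practice, a no-growth result demonstrates at least this reduction rather than an exact value, since surviving counts below one cannot be observed directly.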
NASA Astrophysics Data System (ADS)
Spychala, Y. T.; Hodgson, D. M.; Flint, S. S.; Mountney, N. P.
2015-06-01
Intraslope lobe deposits provide a process record of the infill of accommodation on submarine slopes and their recognition enables the accurate reconstruction of the stratigraphic evolution of submarine slope systems. Extensive exposures of discrete sand-prone packages in Units D/E and E, Fort Brown Formation, Karoo Basin, South Africa, permit analysis of the sedimentology and stacking patterns of three intraslope lobe complexes and their palaeogeographic reconstruction via bed-scale analysis and physical correlation of key stratal surfaces. The sand-prone packages comprise tabular, aggradationally to slightly compensationally stacked lobe deposits with constituent facies associations that can be attributed to lobe axis, lobe off-axis, lobe-fringe and distal lobe-fringe environments. Locally, intraslope lobe deposits are incised by low aspect ratio channels that mark basinward progradation of the deepwater system. The origin of accommodation on the slope for lobe deposition is interpreted to be due to differential compaction or healing of scars from mass wasting processes. The stacking patterns and sedimentary facies arrangement identified in this study are distinct from those of more commonly recognized basin-floor lobe deposits, thereby enabling the establishment of recognition criteria for intraslope lobe deposits in other less well exposed and studied fine-grained systems. Compared to basin floor lobes, intraslope lobes are smaller in volume, influenced by higher degrees of confinement, and tend to show aggradational stacking patterns.
Diagenetic gypsum related to sulfur deposits in evaporites (Libros Gypsum, Miocene, NE Spain)
NASA Astrophysics Data System (ADS)
Ortí, Federico; Rosell, Laura; Anadón, Pere
2010-07-01
The Libros Gypsum is the thickest evaporite unit of the Miocene infill of the Teruel Basin in NE Spain. During the deposition of this unit, intense bacterial sulfate-reducing (BSR) activity in the lake depocenter generated a native sulfur deposit. Diagenetic gypsum resulted from subsequent sulfur oxidation. The different processes involved in these transformations were first investigated by Anadón et al. (1992). The present paper is concerned with this diagenetic gypsum from the stratigraphic, petrographic, isotopic and genetic points of view. Diagenetic gypsum occurs mainly as continuous or discontinuous layers, individual levels or lenses, irregular masses, nodules and micronodules, and veins. Its main textures are coarse-crystalline anhedral and fine-grained (alabastrine), both of which can replace any former lithology (carbonate, gypsum, and sulfur). The following sequence of processes and mineral/textural transformations is deduced: primary gypsum deposition — BSR and biodiagenetic carbonate/H2S production — growth of native sulfur — growth of diagenetic gypsum — partial recrystallization of the diagenetic gypsum textures. The gypsification of the native sulfur generated two types of banded structures in the diagenetic gypsum: (1) concentric structures of centripetal growth, and (2) expansive, roughly concentric structures. In the first type, the gypsification operated from the outer boundaries towards the inner parts. In the second type, part of the carbonate hosting the sulfur was also gypsified (replaced/cemented). In the diagenetic gypsum, the δ34S values are in agreement with a native sulfur and H2S provenance. The δ18O sulfate values, however, enable us to differentiate two main groups of values: one with positive values and the other with negative values.
In the group of positive values, interstitial (evaporated) solutions participated in the sulfur oxidation; this process presumably occurred in a first oxidation stage during shallow-to-deeper burial of the Libros Gypsum unit. In the group of negative values, however, only meteoric waters participated in the oxidation, which presumably occurred in a second oxidation stage during the final exhumation of the unit. A third group of values is characterized by very high sulfur and oxygen values, suggesting that BSR residual solutions also participated in the oxidation processes locally. During the two oxidation stages, both the textural characteristics and the isotopic composition of the diagenetic gypsum indicate that gypsification operated as a multistage process.
Constructing Uniformity: the Standardization of International Electromagnetic Measures, 1860-1912
NASA Astrophysics Data System (ADS)
Lagerstrom, Larry Randles
Metrology gained much attention from electrical scientists and practitioners in the nineteenth and early twentieth centuries. Spurred by the expanding telegraph industries, they considered the construction and acceptance of a universal system of electromagnetic measures essential for the growth of science and technology. The task was not easy. Scientists and practitioners, having different concerns and needs, often found themselves at odds. National rivalries further obstructed the attainment of uniform measures. Under the auspices of a series of international electrical congresses and conferences between 1881 and 1908, the system-builders succeeded in establishing an international system of practical electrical units and standards--the ohm, volt, ampere, coulomb, farad, joule, and watt--based on the centimeter-gram-second (CGS) system of measures. They had less success, however, with practical magnetic units. They had designed the system of electrical units to meet the needs of telegraphy. But the rise of the technologies of electrical power in the late nineteenth century made it difficult to define magnetic units that were both practical for the new technologies and coherent with the existing system of units. The international congress, as an institution, also gave them trouble. It lacked authority and stability and, in some cases, hindered the development of the system of units. More credit for the success of the system-builders must go, paradoxically, to the national physical laboratories that arose in Germany, France, Great Britain, and the United States circa 1900. They enabled the standardization of international electromagnetic measures by narrowing the community of system-builders to a small circle of elite experts. This historical process illustrates important aspects of the ways and means of standardization, of the technical and social construction of uniformity.
Shorebird Migration Patterns in Response to Climate Change: A Modeling Approach
NASA Technical Reports Server (NTRS)
Smith, James A.
2010-01-01
The availability of satellite remote sensing observations at multiple spatial and temporal scales, coupled with advances in climate modeling and information technologies, offers new opportunities for the application of mechanistic models to predict how continental-scale bird migration patterns may change in response to environmental change. In earlier studies, we explored the phenotypic plasticity of a migratory population of Pectoral sandpipers by simulating the movement patterns of an ensemble of 10,000 individual birds in response to changes in stopover locations as an indicator of the impacts of wetland loss and inter-annual variability on the fitness of migratory shorebirds. We used an individual-based, biophysical migration model, driven by remotely sensed land surface data, climate data, and biological field data. Mean stop-over durations and stop-over frequency with latitude predicted from our model for nominal cases were consistent with results reported in the literature and available field data. In this study, we take advantage of new computing capabilities enabled by recent GP-GPU computing paradigms (general-purpose computing on graphics processing units) and commodity hardware. Several aspects of our individual-based (agent modeling) approach lend themselves well to GP-GPU computing. We have been able to allocate compute-intensive tasks to the graphics processing units, and now simulate ensembles of 400,000 birds at varying spatial resolutions along the central North American flyway. We are incorporating additional, species-specific, mechanistic processes to better reflect the processes underlying bird phenotypic plasticity responses to different climate change scenarios in the central U.S.
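An individual-based update of this kind maps well onto GPU threads because every bird follows the same element-wise rule. The sketch below illustrates that pattern with a toy energy/stopover rule; it is not the paper's actual biophysical model, and all state variables and thresholds are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_birds = 400_000

# Hypothetical state vectors: latitude (deg) and energy reserve (0..1).
lat = np.full(n_birds, 30.0)
energy = rng.uniform(0.4, 1.0, n_birds)

def migration_step(lat, energy, flight_cost=0.3, refuel_rate=0.15, hop_deg=2.0):
    """One daily time step for the whole ensemble at once.

    Birds with enough reserve fly one hop north and pay flight_cost; the
    rest stay at the stopover and refuel. This branch-free, element-wise
    update is exactly the pattern that maps onto GPU threads.
    """
    flies = energy >= flight_cost
    lat = np.where(flies, lat + hop_deg, lat)
    energy = np.where(flies, energy - flight_cost,
                      np.minimum(energy + refuel_rate, 1.0))
    return lat, energy

lat, energy = migration_step(lat, energy)
```

Replacing NumPy arrays with device arrays (e.g. in CUDA or JAX) turns the same per-bird rule into a kernel over hundreds of thousands of agents, which is the scaling path the study describes.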
Competitive Electricity Market Regulation in the United States: A Primer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flores-Espino, Francisco; Tian, Tian; Chernyakhovskiy, Ilya
The electricity system in the United States is a complex mechanism where different technologies, jurisdictions and regulatory designs interact. Today, two major models for electricity commercialization operate in the United States. One is the regulated monopoly model, in which vertically integrated electricity providers are regulated by state commissions. The other is the competitive model, in which power producers can openly access transmission infrastructure and participate in wholesale electricity markets. This paper describes the origins, evolution, and current status of the regulations that enable competitive markets in the United States.
Sharma, Bharati; Ramani, K.V.; Mavalankar, Dileep; Kanguru, Lovney; Hussein, Julia
2015-01-01
Background: Infections acquired during childbirth are a common cause of maternal and perinatal mortality and morbidity. Changing provider behaviour and organisational settings within the health system is key to reducing the spread of infection. Objective: To explore the opinions of health personnel on health system factors related to infection control and their perceptions of change in a sample of hospital maternity units. Design: An organisational change process called ‘appreciative inquiry’ (AI) was introduced in three maternity units of hospitals in Gujarat, India. AI is a change process that builds on recognition of positive actions, behaviours, and attitudes. In-depth interviews were conducted with health personnel to elicit information on the environment within which they work, including physical and organisational factors, motivation, awareness, practices, perceptions of their role, and other health system factors related to infection control activities. Data were obtained from three hospitals which implemented AI and another three not involved in the intervention. Results: Challenges which emerged included management processes (e.g. decision-making and problem-solving modalities), human resource shortages, and physical infrastructure (e.g. space, water, and electricity supplies). AI was perceived as having a positive influence on infection control practices. Respondents also said that management processes improved, although some hospitals had already undergone an accreditation process which could have influenced the changes described. Participants reported that team relationships had been strengthened due to AI. Conclusion: Technical knowledge is often emphasised in health care settings and less attention is paid to factors such as team relationships, leadership, and problem solving. AI can contribute to improving infection control by catalysing and creating forums for team building, shared decision making and problem solving in an enabling environment.
PMID:26119249
DDGIPS: a general image processing system in robot vision
NASA Astrophysics Data System (ADS)
Tian, Yuan; Ying, Jun; Ye, Xiuqing; Gu, Weikang
2000-10-01
Real-time image processing is the key task in robot vision. Because of hardware limitations, many algorithm-oriented firmware systems were designed in the past, but their architectures were not flexible enough to serve as multi-algorithm development systems. The rapid development of microelectronics has produced many high-performance DSP chips and high-density FPGA chips, making it possible to construct a more flexible architecture for a real-time image processing system. In this paper, a Double DSP General Image Processing System (DDGIPS) is presented. We construct a two-DSP-based FPGA-computational system with two TMS320C6201s. The TMS320C6x devices are fixed-point processors based on an advanced VLIW CPU, which has eight functional units, including two multipliers and six arithmetic logic units. These features make the C6x a good candidate for a general-purpose system. In our system, the two TMS320C6201s each have a local memory space, and they also share a system memory space which enables them to intercommunicate and exchange data efficiently. At the same time, they can be directly interconnected in a star-shaped architecture. All of this is under the control of an FPGA group. As the core of the system, the FPGA plays a very important role: it takes charge of DSP control, DSP communication, memory space access arbitration, and the communication between the system and the host machine. By reconfiguring the FPGA, all of the interconnections between the two DSPs or between a DSP and the FPGA can be changed. In this way, users can easily rebuild the real-time image processing system according to the data stream and the task of the application, gaining great flexibility.
NASA Astrophysics Data System (ADS)
Vucnik, Matevz; Robinson, Johanna; Smolnikar, Miha; Kocman, David; Horvat, Milena; Mohorcic, Mihael
2015-04-01
Key words: portable air quality sensor, CITI-SENSE, participatory monitoring, VESNA-AQ. The emergence of low-cost, easy-to-use portable air quality sensor units is opening new possibilities for individuals to assess their exposure to air pollutants at a specific place and time, and to share this information via the Internet. Such portable sensor units are being used in an ongoing citizen science project called CITI-SENSE, which enables citizens to measure and share the data. Through the creation of 'citizens' observatories', the project aims to empower citizens to contribute to and participate in environmental governance, enabling them to support and influence community and societal priorities as well as associated decision making. An air quality measurement system based on the VESNA sensor platform was primarily designed within the project for use as a portable sensor unit in selected pilot cities (Belgrade, Ljubljana and Vienna) for monitoring outdoor exposure to pollutants. However, functionally the same unit with a different set of sensors could be used, for example, as an indoor platform. The version designed for the pilot studies was equipped with the following sensors: NO2, O3, CO, temperature, relative humidity, pressure, and an accelerometer. The personal sensor unit is battery powered and housed in a plastic box. The VESNA-based air quality (AQ) monitoring system comprises the VESNA-AQ portable sensor unit, a smartphone app and the remote server. The personal sensor unit supports wireless connection to an Android smartphone via built-in Wi-Fi. The smartphone in turn also serves as the communication gateway towards the remote server using any of the available data connections. Besides the gateway functionality, the role of the smartphone is to enrich data coming from the personal sensor unit with the GPS location, timestamps and user-defined context.
This, together with an accelerometer, enables users to better estimate their exposure in relation to physical activities, time and location. The end user can monitor the measured parameters through a smartphone application. The smartphone app implements a custom-developed LCSP (Lightweight Client Server Protocol), which is used to send requests to the VESNA-AQ unit and to exchange information. When the data is obtained from the VESNA-AQ unit, the mobile application visualizes the data. It also has an option to forward the data to the remote server in a custom JSON structure over an HTTP POST request. The server stores the data in the database and in parallel translates the data to WFS and forwards it to the main CITI-SENSE platform over WFS-T in a common XML format over an HTTP POST request. From there the data can be accessed through the Internet and visualised in different forms and web applications developed by the CITI-SENSE project. In the course of the project, the collected data will be made publicly available, enabling the citizens to participate in environmental governance. Acknowledgements: CITI-SENSE is a Collaborative Project partly funded by the EU FP7-ENV-2012 under grant agreement no 308524 (www.citi-sense.eu).
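The gateway step, forwarding an enriched reading to the server as JSON over HTTP POST, can be sketched as below. The URL, field names, and device identifier are illustrative, not the project's actual schema or the LCSP protocol.

```python
import json
from urllib import request

def post_reading(server_url, reading):
    """Forward one sensor reading to a remote server as JSON over HTTP
    POST, mirroring the gateway role the smartphone app plays.
    server_url and the reading schema are illustrative assumptions."""
    body = json.dumps(reading).encode("utf-8")
    req = request.Request(server_url, data=body,
                          headers={"Content-Type": "application/json"},
                          method="POST")
    with request.urlopen(req) as resp:
        return resp.status

reading = {
    "unit_id": "vesna-aq-017",        # hypothetical device identifier
    "timestamp": "2015-04-01T12:00:00Z",
    "lat": 46.05, "lon": 14.51,       # GPS position added by the phone
    "no2_ppb": 18.4, "o3_ppb": 31.2,  # sensor values
}
# post_reading("https://example.org/api/readings", reading)  # hypothetical endpoint
```

The same serialized structure could then be translated server-side into the WFS/XML format the abstract describes for the main platform.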
Protecting the United States Against Information Warfare
2000-04-01
Information Systems Protection, An Invitation to a Dialogue, 2. ECommerce, "Business to Business to Consumer, Transaction Enabled Internet Solutions," n.d.; available from <http://www.cplus.net/ecommerce/estats.html>; Internet; accessed 7 February 2000. 7 Clinton, Defending America's Cyberspace... ECommerce. "Business to Business to Consumer, Transaction Enabled Internet Solutions." n.d. Available from <http://www.cplus.net/ecommerce>
syris: a flexible and efficient framework for X-ray imaging experiments simulation.
Faragó, Tomáš; Mikulík, Petr; Ershov, Alexey; Vogelgesang, Matthias; Hänschke, Daniel; Baumbach, Tilo
2017-11-01
An open-source framework for conducting a broad range of virtual X-ray imaging experiments, syris, is presented. The simulated wavefield created by a source propagates through an arbitrary number of objects until it reaches a detector. The objects in the light path and the source are time-dependent, which enables simulations of dynamic experiments, e.g. four-dimensional time-resolved tomography and laminography. The high-level interface of syris is written in Python and its modularity makes the framework very flexible. The computationally demanding parts behind this interface are implemented in OpenCL, which enables fast calculations on modern graphics processing units. The combination of flexibility and speed opens new possibilities for studying novel imaging methods and for systematic searches of optimal combinations of measurement conditions and data processing parameters. This can help to increase the success rate and efficiency of use of valuable synchrotron beam time. To demonstrate the capabilities of the framework, various experiments have been simulated and compared with real data. To show the use case of measurement and data processing parameter optimization based on simulation, a virtual counterpart of a high-speed radiography experiment was created and the simulated data were used to select a suitable motion estimation algorithm; one of its parameters was optimized in order to achieve the best motion estimation accuracy when applied to the real data. syris was also used to simulate tomographic data sets under various imaging conditions which impact the tomographic reconstruction accuracy, and it is shown how the accuracy may guide the selection of imaging conditions for particular use cases.
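Free-space propagation of a sampled wavefield, the basic building block of simulators of this kind, is commonly implemented with the angular spectrum method. The following is a textbook NumPy sketch of that principle, not the syris API.

```python
import numpy as np

def propagate(field, wavelength, dx, z):
    """Free-space propagation of a sampled complex wavefield over distance z
    using the angular spectrum method (a textbook sketch; GPU frameworks run
    the same FFT/multiply/IFFT pattern in OpenCL or CUDA)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)                  # spatial frequencies
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * FX)**2 - (wavelength * FY)**2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z)                       # free-space transfer function
    H[arg < 0] = 0.0                              # suppress evanescent waves
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Sanity check: a uniform plane wave propagates without change in intensity.
n = 64
field = np.ones((n, n), dtype=complex)
out = propagate(field, wavelength=1e-10, dx=1e-6, z=0.1)
```

Chaining such propagations between object transmission functions along the beam path reproduces the source-to-detector pipeline the abstract describes.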
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Guanjing; Granderson, J.; Brambley, Michael R.
2015-07-01
In the United States, small commercial buildings represent 51% of total floor space of all commercial buildings and consume nearly 3 quadrillion Btu (3.2 quintillion joule) of site energy annually, presenting an enormous opportunity for energy savings. Retro-commissioning (RCx), the process through which professional energy service providers identify and correct operational problems, has proven to be a cost-effective means to achieve median energy savings of 16%. However, retro-commissioning is not typically conducted at scale throughout the commercial stock. Very few small commercial buildings are retro-commissioned because utility expenses are relatively modest, margins are tighter, and capital for improvements is limited. In addition, small buildings do not have in-house staff with the expertise to identify improvement opportunities. In response, a turnkey hardware-software solution was developed to enable cost-effective, monitoring-based RCx of small commercial buildings. This highly tailored solution enables non-commissioning providers to identify energy and comfort problems, as well as associated cost impacts and remedies. It also facilitates scale by offering energy service providers the means to streamline their existing processes and reduce costs by more than half. The turnkey RCx sensor suitcase consists of two primary components: a suitcase of sensors for short-term building data collection that guides users through the process of deploying and retrieving their data and a software application that automates analysis of sensor data, identifies problems and generates recommendations. This paper presents the design and testing of prototype models, including descriptions of the hardware design, analysis algorithms, performance testing, and plans for dissemination.
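Automated analysis of short-term sensor data of this kind typically applies rule-based checks. A minimal sketch of one such check, after-hours energy waste, with illustrative data, thresholds, and schedule (none of which come from the product described):

```python
import numpy as np

# Hypothetical 24 h of hourly whole-building power readings from a
# short-term deployment (illustrative values, not the product's data format).
hours = np.arange(24)
power_kw = np.array([2.0] * 6 + [8.0] * 12 + [6.0] * 6)  # unit never fully shuts down

def after_hours_waste(hours, power_kw, occupied=(7, 18), baseline_kw=2.5):
    """Sum energy use outside occupied hours above a plug-load baseline --
    one of the simple rule-based checks an automated RCx analysis can run.
    The occupancy schedule and baseline are assumed inputs."""
    unoccupied = (hours < occupied[0]) | (hours >= occupied[1])
    waste = np.clip(power_kw[unoccupied] - baseline_kw, 0.0, None).sum()
    return waste  # kWh of avoidable after-hours consumption

waste_kwh = after_hours_waste(hours, power_kw)
```

A report generator can then attach a cost estimate and a recommendation (e.g. adjust the HVAC schedule) to each flagged rule, which is the kind of output the abstract attributes to the software component.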
Kessels-Habraken, Marieke; Van der Schaaf, Tjerk; De Jonge, Jan; Rutte, Christel
2010-05-01
Medical errors in health care still occur frequently. Unfortunately, errors cannot be completely prevented and 100% safety can never be achieved. Therefore, in addition to error reduction strategies, health care organisations could also implement strategies that promote timely error detection and correction. Reporting and analysis of so-called near misses - usually defined as incidents without adverse consequences for patients - are necessary to gather information about successful error recovery mechanisms. This study establishes the need for a clearer and more consistent definition of near misses to enable large-scale reporting and analysis in order to obtain such information. Qualitative incident reports and interviews were collected on four units of two Dutch general hospitals. Analysis of the 143 accompanying error handling processes demonstrated that different incident types each provide unique information about error handling. Specifically, error handling processes underlying incidents that did not reach the patient differed significantly from those of incidents that reached the patient, irrespective of harm, because of successful countermeasures that had been taken after error detection. We put forward two possible definitions of near misses and argue that, from a practical point of view, the optimal definition may be contingent on organisational context. Both proposed definitions could yield large-scale reporting of near misses. Subsequent analysis could enable health care organisations to improve the safety and quality of care proactively by (1) eliminating failure factors before real accidents occur, (2) enhancing their ability to intercept errors in time, and (3) improving their safety culture. Copyright 2010 Elsevier Ltd. All rights reserved.
Maximizing coupling-efficiency of high-power diode lasers utilizing hybrid assembly technology
NASA Astrophysics Data System (ADS)
Zontar, D.; Dogan, M.; Fulghum, S.; Müller, T.; Haag, S.; Brecher, C.
2015-03-01
In this paper, we present hybrid assembly technology to maximize coupling efficiency for spatially combined laser systems. High-quality components, such as center-turned focusing units, as well as suitable assembly strategies, are necessary to obtain the highest possible output ratios. Alignment strategies are challenging tasks due to their complexity and sensitivity. Especially in low-volume production, fully automated systems are economically at a disadvantage, as operator experience is often expensive. However, the reproducibility and quality of automatically assembled systems can be superior. Therefore, automated and manual assembly techniques are combined to obtain high coupling efficiency while preserving maximum flexibility. The paper describes the equipment and software necessary to enable hybrid assembly processes. Micromanipulator technology with high step resolution and six degrees of freedom provides a large number of possible evaluation points. Automated algorithms are necessary to speed up data gathering and alignment, so as to efficiently utilize the available granularity for manual assembly processes. Furthermore, an engineering environment is presented to enable rapid prototyping of automation tasks with simultaneous data evaluation. Integration with simulation environments, e.g. Zemax, allows the verification of assembly strategies in advance. Data-driven decision making ensures consistently high quality, documents the assembly process, and provides a basis for further improvement. The hybrid assembly technology has been applied in several applications, achieving efficiencies above 80%, and will be discussed in this paper. High coupling efficiency has been achieved with minimized assembly effort as a result of semi-automated alignment. This paper focuses on hybrid automation for optimizing and attaching turning mirrors and collimation lenses.
Policy Process Editor for P3BM Software
NASA Technical Reports Server (NTRS)
James, Mark; Chang, Hsin-Ping; Chow, Edward T.; Crichton, Gerald A.
2010-01-01
A computer program enables generation, in the form of graphical representations of process flows with embedded natural-language policy statements, input to a suite of policy-, process-, and performance-based management (P3BM) software. This program (1) serves as an interface between users and the Hunter software, which translates the input into machine-readable form; and (2) enables users to initialize and monitor the policy-implementation process. This program provides an intuitive graphical interface for incorporating natural-language policy statements into business-process flow diagrams. Thus, the program enables users who dictate policies to intuitively embed their intended process flows as they state the policies, reducing the likelihood of errors and reducing the time between declaration and execution of policy.
NASA Astrophysics Data System (ADS)
Johansson, Emma; Lindborg, Tobias
2017-04-01
The Arctic region is sensitive to global warming, and permafrost thaw and the release of old carbon are examples of processes that may have a positive feedback effect on the global climate system. Quantification and assumptions about future change are often based on model predictions. Such models require cross-disciplinary data of high quality that are often lacking. Biogeochemical processes in the landscape are highly influenced by the hydrology, which in turn is intimately related to permafrost processes. Thus, a multidisciplinary approach is needed when collecting data and setting up field experiments aimed at increasing the understanding of these processes. Here we summarize and present data collected in GRASP, the Greenland Analogue Surface Project. GRASP is a catchment-scale field study of the periglacial area in the Kangerlussuaq region, West Greenland, focusing on hydrological and biogeochemical processes in the landscape. The site investigations were initiated in 2010 and have since resulted in three separate data sets published in ESSD (Earth System Science Data), focusing on (i) meteorological data and hydrology, (ii) biogeochemistry, and (iii) geometries of sediments and the active layer. The three data sets, which are freely available via the PANGAEA database, enable conceptual and coupled numerical modeling of hydrological and biogeochemical processes. An important strength of the GRASP data is that all data are collected within the same, relatively small, catchment area. This implies that measurements are more easily linked to the correct source area or process. Despite its small size, the catchment includes the major units of the periglacial hydrological system: a lake, a talik, and supra- and subpermafrost aquifers; consequently, biogeochemical processes in each of these units may be studied.
The new data from GRASP are used both to increase knowledge of present-day periglacial hydrology and biogeochemistry and to predict the consequences of future climate change for these processes.
NASA Astrophysics Data System (ADS)
Le, Anh H.; Park, Young W.; Ma, Kevin; Jacobs, Colin; Liu, Brent J.
2010-03-01
Multiple Sclerosis (MS) is a progressive neurological disease affecting myelin pathways in the brain. Multiple lesions in the white matter can cause paralysis and severe motor disabilities in affected patients. To address the inconsistency and user-dependency of manual lesion measurement in MRI, we have proposed a 3-D automated lesion quantification algorithm to enable objective and efficient lesion volume tracking. The computer-aided detection (CAD) of MS, written in MATLAB, utilizes the K-Nearest Neighbors (KNN) method to compute the probability of lesions on a per-voxel basis. Despite the highly optimized image-processing algorithms used in CAD development, MS CAD integration and evaluation in the clinical workflow is technically challenging because the recursive nature of the algorithm demands high computation rates and memory bandwidth. In this paper, we present the development and evaluation of a computing engine on the graphics processing unit (GPU) with MATLAB for segmentation of MS lesions. The paper investigates the utilization of a high-end GPU for parallel computing of KNN in the MATLAB environment to improve algorithm performance. The integration is accomplished using NVIDIA's CUDA development toolkit for MATLAB. The results of this study will validate the practicality and effectiveness of the prototype MS CAD in a clinical setting. The GPU method may allow MS CAD to integrate rapidly into an electronic patient record or any disease-centric health care system.
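The per-voxel KNN step, estimating lesion probability as the fraction of lesion-labeled samples among a voxel's k nearest training neighbors, can be sketched in a few lines. This is a generic illustration of the method, not the authors' MATLAB/CUDA implementation; the feature vectors and choice of k are placeholder assumptions:

```python
import numpy as np

def knn_lesion_probability(train_feats, train_labels, voxel_feats, k=5):
    """Per-voxel lesion probability: the fraction of the k nearest
    training samples (Euclidean distance in feature space) that are
    labeled as lesion (1). Pure-NumPy sketch; on a GPU this distance
    computation is what gets parallelized."""
    # Pairwise squared distances, shape (n_voxels, n_train)
    d2 = ((voxel_feats[:, None, :] - train_feats[None, :, :]) ** 2).sum(-1)
    nearest = np.argsort(d2, axis=1)[:, :k]     # indices of k nearest samples
    return train_labels[nearest].mean(axis=1)   # fraction labeled lesion
```

A brute-force distance matrix like this is memory-hungry for full volumes, which is precisely the bandwidth problem the GPU implementation targets.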
Initial sustainability assessment of tapioca starch production system in Lake Toba area
NASA Astrophysics Data System (ADS)
Situmorang, Asido; Manik, Yosef
2018-04-01
This study aims to explore to what extent the principles of sustainability have been applied in a tapioca industry located in the Lake Toba area and to identify aspects that open opportunities for system improvement. In conducting the assessment, we adopted a life-cycle approach using Mass Flow Analysis methods covering all cassava starch production processes, from fresh cassava root to dry cassava starch. The inventory data were collected from the company in the form of both production records and interviews. From the data analysis, the authors present a linked flow describing the tapioca starch production process, quantified against a functional unit of one marketable pack of tapioca starch weighing 50 kg. In order to produce 50 kg of tapioca, 200 kg of cassava root and 800 kg of water are required. This production efficiency translates to a 25% yield. The system generates 40 kg of cassava peel, 60 kg of pulp, and 850 kg of wastewater. For starch drying, 208.8 MJ of thermal energy is required in the form of heating fuel. The material flow analysis is employed for impact assessment. Several options for improving the operation are proposed, including utilization of the pulp in more valuable co-products, integration of a waste treatment plant so that water recycled from the extraction operation can be reused for washing, and application of a wastewater treatment system that produces biogas as renewable energy, reducing fuel consumption in the dryer unit.
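The reported mass flows can be checked against the functional unit directly. A short sketch using only the figures quoted above:

```python
# Mass balance per functional unit (one 50 kg pack of tapioca starch),
# using the flows reported in the study.
inputs = {"cassava_root_kg": 200, "water_kg": 800}
outputs = {"starch_kg": 50, "peel_kg": 40, "pulp_kg": 60, "wastewater_kg": 850}

# Yield = starch out per cassava root in
yield_fraction = outputs["starch_kg"] / inputs["cassava_root_kg"]  # 0.25

# The balance closes: 1000 kg enters (root + water), 1000 kg leaves
balance_closed = sum(inputs.values()) == sum(outputs.values())
```

The balance closing exactly is a useful sanity check on inventory data before it feeds an impact assessment.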
ERIC Educational Resources Information Center
Peirce, Harry Edgar, Jr.
The purposes of this study were to: develop and measure the effectiveness of instructional units designed to enable young adult farmers to improve their ability to use farm management principles when making decisions, and measure the influence that independent variables have on the young farmer's level of understanding these principles.…
Leadership in Art Education: Taking Action in Schools and Communities
ERIC Educational Resources Information Center
Freedman, Kerry
2011-01-01
One of the traditional privileges for teachers in the United States has been control over the curriculum. Unlike most countries in the world, the United States does not have a national curriculum "per se", enabling teachers to make curriculum decisions that most benefit local students. However, the Elementary and Secondary Act, also known as the…
A Happening? Creative Film-Making Resource Unit.
ERIC Educational Resources Information Center
Daley, Mary E.
To change the classroom trend of promoting competition among children and repressing their feelings, this unit on film making focuses on a creative activity which will enable students to (1) make new things meaningful to them; (2) see purpose and meaning in familiar things; (3) observe and create beauty in life and art; (4) redefine or form their…
CSHE@50: A Reflection and Prospectus on Globalization and Higher Education
ERIC Educational Resources Information Center
King, C. Judson, Ed.; Douglass, John Aubrey, Ed.
2007-01-01
In the spring of 1957, the Center for Studies in Higher Education (CSHE) at the University of California, Berkeley was formally established as an organized research unit, enabled by an initial grant from the Carnegie Corporation and making it the first academic enterprise in the United States focused on higher education policy issues. Since then,…
ERIC Educational Resources Information Center
Siddiqi, Zoveen; Tiro, Jasmin A.; Shuval, Kerem
2011-01-01
Physical inactivity is a leading cause of premature death, disability and numerous chronic diseases. Minority and underserved populations in the United States and worldwide have a higher prevalence of physical inactivity affecting their morbidity and mortality rates. In the United States, African Americans are less physically active and have a…
Daniel Murphy; Carina Wyborn; Laurie Yung; Daniel R. Williams; Cory Cleveland; Lisa Eby; Solomon Dobrowski; Erin Towler
2016-01-01
Current projections of future climate change foretell potentially transformative ecological changes that threaten communities globally. Using two case studies from the United States Intermountain West, this article highlights the ways in which a better articulation between theory and methods in research design can generate proactive applied tools that enable...
3 CFR 8766 - Proclamation 8766 of December 8, 2011. Bill of Rights Day, 2011
Code of Federal Regulations, 2012 CFR
2012-01-01
... Proclamation On December 15, 1791, the United States adopted the Bill of Rights, enshrining in our Constitution... promise of enumerated rights enabled the ratification of the Constitution without fear that a more... vested in me by the Constitution and the laws of the United States, do hereby proclaim December 15, 2011...
The American Influence in Indonesian Teacher Training, 1956-1964
ERIC Educational Resources Information Center
Suwignyo, Agus
2017-01-01
This paper examines United States-Indonesian cooperation in the training of Indonesian teachers during the early decades of the Cold War. Indonesia badly needed teachers but the government's efforts to train new teachers were hampered by the tremendous lack of teachers who could train new teachers. The aid provided by the United States enabled the…
Schloss, Patrick D; Handelsman, Jo
2006-10-01
The recent advent of tools enabling statistical inferences to be drawn from comparisons of microbial communities has allowed the focus of microbial ecology to move from characterizing biodiversity to describing the distribution of that biodiversity. Although statistical tools have been developed to compare community structures across a phylogenetic tree, we lack tools to compare the memberships and structures of two communities at a particular operational taxonomic unit (OTU) definition. Furthermore, current tests of community structure do not indicate the similarity of the communities but only report the probability of a statistical hypothesis. Here we present a computer program, SONS, which implements nonparametric estimators for the fraction and richness of OTUs shared between two communities.
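As a toy illustration of the kind of statistic SONS estimates, the observed fraction of OTUs shared between two communities can be written as follows (SONS itself uses nonparametric estimators that also account for unobserved OTUs, which this sketch does not):

```python
def shared_otu_fraction(comm_a, comm_b):
    """Observed fraction of shared OTUs: |A ∩ B| / |A ∪ B|, where A
    and B are the sets of OTUs with nonzero abundance in each
    community. A naive observed statistic, not a SONS estimator."""
    a = {otu for otu, count in comm_a.items() if count > 0}
    b = {otu for otu, count in comm_b.items() if count > 0}
    return len(a & b) / len(a | b)
```

Because rare OTUs are easily missed by sampling, this observed fraction is biased low, which is exactly why SONS applies nonparametric richness corrections.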
Security Protection on Trust Delegated Data in Public Mobile Networks
NASA Astrophysics Data System (ADS)
Weerasinghe, Dasun; Rajarajan, Muttukrishnan; Rakocevic, Veselin
This paper provides detailed solutions for trust delegation and security protection for medical records in public mobile communication networks. The solutions presented in this paper enable the development of software for mobile devices that can be used by emergency medical units in urgent need of sensitive personal information about unconscious patients. In today's world, technical improvements in mobile communication systems mean that users can expect to have access to data at any time, regardless of their location. This paper presents a token-based procedure for data security at a mobile device and for the delegation of trust between a requesting mobile unit and secure medical data storage. Data security at the mobile device is enabled using an identity-based key generation methodology.
Galvanic reduction of uranium(III) chloride from LiCl-KCl eutectic salt using gadolinium metal
NASA Astrophysics Data System (ADS)
Bagri, Prashant; Zhang, Chao; Simpson, Michael F.
2017-09-01
The drawdown of actinides is an important unit operation to enable the recycling of electrorefiner salt and minimization of waste. A new method for the drawdown of actinide chlorides from LiCl-KCl molten salt has been demonstrated here. Using the galvanic interaction between the Gd/Gd(III) and U/U(III) redox reactions, it is shown that UCl3 concentration in eutectic LiCl-KCl can be reduced from 8.06 wt.% (1.39 mol %) to 0.72 wt.% (0.12 mol %) in about an hour via plating U metal onto a steel basket. This is a simple process for returning actinides to the electrorefiner and minimizing their loss to the salt waste stream.
NASA Technical Reports Server (NTRS)
Chien, Steve; Rabideau, Gregg; Tran, Daniel; Knight, Russell; Chouinard, Caroline; Estlin, Tara; Gaines, Daniel; Clement, Bradley; Barrett, Anthony
2007-01-01
CASPER is designed to perform automated planning of interdependent activities within a system subject to requirements, constraints, and limitations on resources. In contradistinction to the traditional concept of batch planning followed by execution, CASPER implements a concept of continuous planning and replanning in response to unanticipated changes (including failures), integrated with execution. Improvements over other, similar software that have been incorporated into CASPER version 2.0 include an enhanced executable interface to facilitate integration with a wide range of execution software systems and supporting software libraries; features to support execution while reasoning about urgency, importance, and impending deadlines; features that enable accommodation to a wide range of computing environments that include various central processing units and random- access-memory capacities; and improved generic time-server and time-control features.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Woohyun; Katipamula, Srinivas; Lutes, Robert G.
This report describes how the intelligent load control (ILC) algorithm can be implemented to achieve peak demand reduction while minimizing impacts on occupant comfort. The algorithm was designed to minimize additional sensor and configuration requirements, enabling a scalable and cost-effective implementation for both large and small-/medium-sized commercial buildings. The ILC algorithm uses an analytic hierarchy process (AHP) to dynamically prioritize the available curtailable loads based on both quantitative criteria (deviation of zone conditions from set point) and qualitative rules (types of zone). Although the ILC algorithm described in this report was highly tailored to work with rooftop units, it can be generalized for application to other building loads such as variable-air-volume (VAV) boxes and lighting systems.
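The AHP prioritization at the core of ILC can be illustrated with the standard column-normalization approximation to the principal eigenvector of a pairwise-comparison matrix. The 3x3 matrix below is a made-up example, not the report's actual weighting of zone types and set-point deviations:

```python
import numpy as np

def ahp_priorities(pairwise):
    """Approximate AHP priority vector: normalize each column of the
    pairwise-comparison matrix, then average across rows. This is the
    common approximation to the principal eigenvector used to rank
    alternatives (here, curtailable loads)."""
    m = np.asarray(pairwise, dtype=float)
    return (m / m.sum(axis=0)).mean(axis=1)

# Hypothetical comparisons among three loads: load 1 is moderately
# more curtailable than load 2, strongly more than load 3, etc.
priorities = ahp_priorities([[1, 3, 5], [1/3, 1, 3], [1/5, 1/3, 1]])
```

The resulting priorities sum to one and rank the loads, so the controller can curtail the highest-priority loads first during a peak-demand event.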
Golf in the United States: an evolution of accessibility.
Parziale, John R
2014-09-01
Golf affords physical and psychological benefits to persons who are physically challenged. Advances in adaptive technology, changes in golf course design, and rules modifications have enabled persons with neurological, musculoskeletal, and other impairments to play golf at a recreational, elite amateur, or professional level. The Americans with Disabilities Act has been cited in both federal and US Supreme Court rulings that have improved access for physically challenged golfers. Medical specialties, including physiatry, have played an important role in this process. This article reviews the history of golf's improvements in accessibility, and provides clinicians and physically challenged golfers with information that will facilitate participation in the sport. Copyright © 2014 American Academy of Physical Medicine and Rehabilitation. Published by Elsevier Inc. All rights reserved.
Education and training for technicians in photonics-enabled technologies
NASA Astrophysics Data System (ADS)
Hull, Daniel M.; Hull, Darrell M.
2005-10-01
Within a few years after lasers were first made operational in 1960, it became apparent that rapid growth in the applications of this new technology in industry, health care, and other fields would require a new generation of technicians in laser/optics engineering. Technicians are the men and women who work alongside scientists and engineers in bringing their ideas, designs, and processes to fruition. In America, most highly qualified technicians are graduates of associate of applied science (AAS) programs in community and technical colleges (two-year postsecondary institutions). Curricula and educational programs designed to prepare technicians in laser/electro-optics technology (LEOT) emerged in the 1970s; today there are over 15 LEOT programs in the United States producing over 100 LEOT graduates each year.
Integrated analysis of error detection and recovery
NASA Technical Reports Server (NTRS)
Shin, K. G.; Lee, Y. H.
1985-01-01
An integrated modeling and analysis of error detection and recovery is presented. When fault latency and/or error latency exist, the system may suffer from multiple faults or error propagations which seriously deteriorate the fault-tolerant capability. Several detection models that enable analysis of the effect of detection mechanisms on the subsequent error handling operations and the overall system reliability were developed. Following detection of the faulty unit and reconfiguration of the system, the contaminated processes or tasks have to be recovered. The strategies of error recovery employed depend on the detection mechanisms and the available redundancy. Several recovery methods including the rollback recovery are considered. The recovery overhead is evaluated as an index of the capabilities of the detection and reconfiguration mechanisms.
gadfly: A pandas-based Framework for Analyzing GADGET Simulation Data
NASA Astrophysics Data System (ADS)
Hummel, Jacob A.
2016-11-01
We present the first public release (v0.1) of the open-source GADGET Dataframe Library: gadfly. The aim of this package is to leverage the capabilities of the broader Python scientific computing ecosystem by providing tools for analyzing simulation data from the astrophysical simulation codes GADGET and GIZMO using pandas, a thoroughly documented, open-source library providing high-performance, easy-to-use data structures that is quickly becoming the standard for data analysis in Python. gadfly is a framework for analyzing particle-based simulation data stored in the HDF5 format using pandas DataFrames. The package enables efficient memory management; includes utilities for unit handling, coordinate transformations, and parallel batch processing; and provides highly optimized routines for visualizing smoothed-particle hydrodynamics data sets.
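The pandas-based workflow gadfly targets can be illustrated with plain pandas: particle arrays become DataFrame columns, and unit handling reduces to column-wise conversions. The field names and unit system below are illustrative assumptions, not the gadfly API (gadfly reads these arrays from GADGET/GIZMO HDF5 snapshots):

```python
import pandas as pd

# Hypothetical two-particle snapshot; in practice the arrays come
# from an HDF5 file, one DataFrame column per particle field.
snapshot = pd.DataFrame({
    "x": [0.5, 1.0], "y": [0.0, 2.0], "z": [1.5, 0.5],  # code units
    "mass": [1.0, 1.0],
})

# Assumed unit system for this sketch: 1 code length unit = 1 Mpc,
# converted here to kpc with a single vectorized column operation.
CODE_LENGTH_KPC = 1.0e3
snapshot[["x", "y", "z"]] *= CODE_LENGTH_KPC
```

Keeping particles in a DataFrame makes filtering, grouping, and batch processing one-liners, which is the ecosystem leverage the abstract describes.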
Treeby, Bradley E; Tumen, Mustafa; Cox, B T
2011-01-01
A k-space pseudospectral model is developed for the fast full-wave simulation of nonlinear ultrasound propagation through heterogeneous media. The model uses a novel equation of state to account for nonlinearity in addition to power law absorption. The spectral calculation of the spatial gradients enables a significant reduction in the number of required grid nodes compared to finite difference methods. The model is parallelized using a graphical processing unit (GPU) which allows the simulation of individual ultrasound scan lines using a 256 x 256 x 128 voxel grid in less than five minutes. Several numerical examples are given, including the simulation of harmonic ultrasound images and beam patterns using a linear phased array transducer.
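The spectral gradient calculation that lets pseudospectral models use far fewer grid nodes than finite differences amounts to multiplying by ik in Fourier space. A minimal one-dimensional NumPy sketch of that step (the full k-space scheme additionally applies a k-space correction to the time stepping, not shown here):

```python
import numpy as np

def spectral_derivative(u, dx):
    """First spatial derivative of a periodic, real-valued signal via
    the Fourier pseudospectral method: transform, multiply by i*k,
    transform back. Exact (to machine precision) for band-limited
    periodic fields sampled above the Nyquist rate."""
    n = len(u)
    k = 2 * np.pi * np.fft.fftfreq(n, d=dx)   # angular wavenumbers
    return np.real(np.fft.ifft(1j * k * np.fft.fft(u)))
```

For smooth fields the error decays faster than any power of the grid spacing, which is why a 256 x 256 x 128 voxel grid suffices where finite differences would need many more nodes per wavelength.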
Castoldi, Laura; Monticelli, Serena; Senatore, Raffaele; Ielo, Laura; Pace, Vittorio
2018-05-31
The transfer of a reactive nucleophilic CH2X unit into a preformed bond enables the introduction of a fragment featuring the exact and desired degree of functionalization through a single synthetic operation. The instability of metallated α-organometallic species often poses serious questions regarding the practicability of using this conceptually intuitive and simple approach for forming C-C or C-heteroatom bonds. A deep understanding of processes regulating the formation of these nucleophiles is a precious source of inspiration not only for successfully applying theoretically feasible transformations (i.e. determining how to employ a given reagent), but also for designing new reactions which ultimately lead to the introduction of molecular complexity via short experimental sequences.